Sample records for previously developed computer

  1. Approaches to Classroom-Based Computational Science.

    ERIC Educational Resources Information Center

    Guzdial, Mark

    Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…

  2. Image restoration for three-dimensional fluorescence microscopy using an orthonormal basis for efficient representation of depth-variant point-spread functions

    PubMed Central

    Patwary, Nurmohammed; Preza, Chrysanthe

    2015-01-01

    A depth-variant (DV) image restoration algorithm for wide field fluorescence microscopy, using an orthonormal basis decomposition of DV point-spread functions (PSFs), is investigated in this study. The efficient PSF representation is based on a previously developed principal component analysis (PCA), which is computationally intensive. We present an approach developed to reduce the number of DV PSFs required for the PCA computation, thereby making the PCA-based approach computationally tractable for thick samples. Restoration results from both synthetic and experimental images are consistent and show that the proposed algorithm efficiently addresses depth-induced aberration using a small number of principal components. Comparison of the PCA-based algorithm with a previously developed strata-based DV restoration algorithm demonstrates that the proposed method improves accuracy by 50% while reducing processing time by 64% on comparable computational resources. PMID:26504634
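    To illustrate the kind of orthonormal-basis representation this abstract describes, here is a minimal, hypothetical Python/NumPy sketch that builds a PCA basis from a stack of depth-variant PSFs and checks how well a few components reconstruct them. The array shapes, the placeholder random data, and the component count are assumptions for illustration, not values from the paper.

```python
import numpy as np

# Hypothetical stack of depth-variant PSFs, one per depth: shape (n_depths, ny, nx).
psfs = np.random.rand(40, 64, 64)  # placeholder data, not a real PSF set

# Flatten each PSF and centre the set.
X = psfs.reshape(psfs.shape[0], -1)
mean_psf = X.mean(axis=0)
Xc = X - mean_psf

# PCA via SVD: each PSF ~ mean + sum_k coeff_k * component_k.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
n_components = 4
components = Vt[:n_components]      # orthonormal basis images (flattened)
coeffs = Xc @ components.T          # depth-dependent coefficients

# Reconstruct with the truncated basis and report the relative error.
approx = mean_psf + coeffs @ components
err = np.linalg.norm(approx - X) / np.linalg.norm(X)
print(f"relative reconstruction error with {n_components} components: {err:.3f}")
```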

  3. Comparison of Methods for Determining Boundary Layer Edge Conditions for Transition Correlations

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.; Berry, Scott A.; Hollis, Brian R.; Horvath, Thomas J.

    2003-01-01

    Data previously obtained for the X-33 in the NASA Langley Research Center 20-Inch Mach 6 Air Tunnel have been reanalyzed to compare methods for determining boundary layer edge conditions for use in transition correlations. The experimental results were previously obtained utilizing the phosphor thermography technique to monitor the status of the boundary layer downstream of discrete roughness elements via global heat transfer images of the X-33 windward surface. A boundary layer transition correlation was previously developed for this data set using boundary layer edge conditions calculated using an inviscid/integral boundary layer approach. An algorithm was written in the present study to extract boundary layer edge quantities from higher fidelity viscous computational fluid dynamic solutions to develop transition correlations that account for viscous effects on vehicles of arbitrary complexity. The boundary layer transition correlations developed for the X-33 from the viscous solutions are compared to the previous boundary layer transition correlations. It is shown that the boundary layer edge conditions calculated using an inviscid/integral boundary layer approach are significantly different from those extracted from viscous computational fluid dynamic solutions. The present results demonstrate the differences obtained in correlating transition data using different computational methods.

  4. Development of a Personalized Educational Computer Game Based on Students' Learning Styles

    ERIC Educational Resources Information Center

    Hwang, Gwo-Jen; Sung, Han-Yu; Hung, Chun-Ming; Huang, Iwen; Tsai, Chin-Chung

    2012-01-01

    In recent years, many researchers have been engaged in the development of educational computer games; however, previous studies have indicated that, without supportive models that take individual students' learning needs or difficulties into consideration, students might only show temporary interest during the learning process, and their learning…

  5. Developing and validating an instrument for measuring mobile computing self-efficacy.

    PubMed

    Wang, Yi-Shun; Wang, Hsiu-Yuan

    2008-08-01

    IT-related self-efficacy has been found to have a critical influence on system use. However, traditional measures of computer self-efficacy and Internet-related self-efficacy are perceived to be inapplicable in the context of mobile computing and commerce because they are targeted primarily at either desktop computer or wire-based technology contexts. Based on previous research, this study develops and validates a multidimensional instrument for measuring mobile computing self-efficacy (MCSE). This empirically validated instrument will be useful to researchers in developing and testing the theories of mobile user behavior, and to practitioners in assessing the mobile computing self-efficacy of users and promoting the use of mobile commerce systems.

  6. A Computer Interview for Multivariate Monitoring of Psychiatric Outcome.

    ERIC Educational Resources Information Center

    Stevenson, John F.; And Others

    Application of computer technology to psychiatric outcome measurement offers the promise of coping with increasing demands for extensive patient interviews repeated longitudinally. Described is the development of a cost-effective multi-dimensional tracking device to monitor psychiatric functioning, building on a previous local computer interview…

  7. Computer Use within a Play-Based Early Years Curriculum

    ERIC Educational Resources Information Center

    Howard, Justine; Miles, Gareth E.; Rees-Davies, Laura

    2012-01-01

    Early years curricula promote learning through play and in addition emphasise the development of computer literacy. Previous research, however, has described that teachers feel unprepared to integrate Information and Communication Technology (ICT) and play. Also, whereas research has suggested that effective computer use in the early years is…

  8. Students' Technology Use and Its Effects on Peer Relationships, Academic Involvement, and Healthy Lifestyles

    ERIC Educational Resources Information Center

    Lloyd, Jan M.; Dean, Laura A.; Cooper, Diane L.

    2007-01-01

    The purpose of this study was to explore students' technology use and its relationship with their psychosocial development. Previous research explored students' computer use in conjunction with their cognitive development. This study examined the effects of computer use and other technologies, such as instant messaging, handheld gaming devices,…

  9. On finite element methods for the Helmholtz equation

    NASA Technical Reports Server (NTRS)

    Aziz, A. K.; Werschulz, A. G.

    1979-01-01

    The numerical solution of the Helmholtz equation is considered via finite element methods. A two-stage method which gives the same accuracy in the computed gradient as in the computed solution is discussed. Error estimates for the method using a newly developed proof are given, and the computational considerations which show this method to be computationally superior to previous methods are presented.
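    For context, the Helmholtz equation referred to here is the time-harmonic form of the wave equation; the notation below is the standard textbook form, not taken from the paper itself.

```latex
% Helmholtz equation (time-harmonic wave equation), standard strong form:
\nabla^{2} u + k^{2} u = f \quad \text{in } \Omega .
% Finite element methods discretize the equivalent weak form (boundary terms
% dropped, assuming homogeneous Dirichlet or natural boundary conditions):
% find u_h in the finite element space such that
\int_{\Omega} \left( \nabla u_h \cdot \nabla v - k^{2} u_h v \right) dx
  = -\int_{\Omega} f \, v \, dx
\qquad \text{for all test functions } v .
```

    The two-stage method mentioned in the abstract aims to recover the gradient of the computed solution to the same order of accuracy as the solution itself.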

  10. The Effects of Gender on the Attitudes towards the Computer Assisted Instruction: A Meta-Analysis

    ERIC Educational Resources Information Center

    Cam, Sefika Sumeyye; Yarar, Gokhan; Toraman, Cetin; Erdamar, Gurcu Koc

    2016-01-01

    The idea that gender creates a difference in computer usage and computer-assisted instruction dates back to earlier years. At that time, it was thought that areas like engineering, science, and mathematics were for males, so gender created a difference in computer usage. Nevertheless, developing technology and females becoming more…

  11. Developing Computerized Tests for Classroom Teachers: A Pilot Study.

    ERIC Educational Resources Information Center

    Glowacki, Margaret L.; And Others

    Two types of computerized testing have been defined: (1) computer-based testing, using a computer to administer conventional tests in which all examinees take the same set of items; and (2) adaptive tests, in which items are selected for administration by the computer, based on examinee's previous responses. This paper discusses an option for…

  12. The electromagnetic modeling of thin apertures using the finite-difference time-domain technique

    NASA Technical Reports Server (NTRS)

    Demarest, Kenneth R.

    1987-01-01

    A technique which computes transient electromagnetic responses of narrow apertures in complex conducting scatterers was implemented as an extension of previously developed Finite-Difference Time-Domain (FDTD) computer codes. Although these apertures are narrow with respect to the wavelengths contained within the power spectrum of excitation, this technique does not require significantly more computer resources to attain the increased resolution at the apertures. In the report, an analytical technique which utilizes Babinet's principle to model the apertures is developed, and an FDTD computer code which utilizes this technique is described.
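    As generic background on the FDTD machinery such codes build on, the following is a minimal one-dimensional Yee-style leapfrog update in Python (normalized units, Courant factor 0.5). It is a textbook sketch only, not the NASA code or its thin-aperture treatment.

```python
import numpy as np

# Minimal 1-D FDTD sketch in free space with normalized units.
nx, nt = 200, 500
ez = np.zeros(nx)        # electric field samples
hy = np.zeros(nx - 1)    # magnetic field samples, staggered half a cell

for n in range(nt):
    # Update H from the spatial difference (curl) of E.
    hy += 0.5 * (ez[1:] - ez[:-1])
    # Update E from the spatial difference (curl) of H (interior points only).
    ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])
    # Soft Gaussian source injected at the middle of the grid.
    ez[nx // 2] += np.exp(-((n - 60) / 20.0) ** 2)

print("peak field after", nt, "steps:", ez.max())
```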

  13. DIRAC in Large Particle Physics Experiments

    NASA Astrophysics Data System (ADS)

    Stagni, F.; Tsaregorodtsev, A.; Arrabito, L.; Sailer, A.; Hara, T.; Zhang, X.; Consortium, DIRAC

    2017-10-01

    The DIRAC project is developing interware to build and operate distributed computing systems. It provides a development framework and a rich set of services for both Workload and Data Management tasks of large scientific communities. A number of High Energy Physics and Astrophysics collaborations have adopted DIRAC as the base for their computing models. DIRAC was initially developed for the LHCb experiment at LHC, CERN. Later, the Belle II, BES III and CTA experiments as well as the linear collider detector collaborations started using DIRAC for their computing systems. Some of the experiments built their DIRAC-based systems from scratch, others migrated from previous solutions, ad-hoc or based on different middlewares. Adaptation of DIRAC for a particular experiment was enabled through the creation of extensions to meet their specific requirements. Each experiment has a heterogeneous set of computing and storage resources at its disposal that were aggregated through DIRAC into a coherent pool. Users from different experiments can interact with the system in different ways depending on their specific tasks, expertise level and previous experience, using command line tools, python APIs or Web Portals. In this contribution we will summarize the experience of using DIRAC in particle physics collaborations. The problems of migration to DIRAC from previous systems and their solutions will be presented. An overview of specific DIRAC extensions will be given. We hope that this review will be useful for experiments considering an update, or for those designing their computing models.
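    As a hedged illustration of the Python API interaction mentioned in the abstract, the sketch below submits a simple job. The module paths and call names follow the publicly documented DIRAC Job API, but exact signatures and initialization details vary between DIRAC versions, and the job name and executable are hypothetical.

```python
# Sketch based on the documented DIRAC Job API; details may differ by DIRAC version.
from DIRAC.Core.Base import Script
Script.parseCommandLine()  # initialize the DIRAC client environment

from DIRAC.Interfaces.API.Dirac import Dirac
from DIRAC.Interfaces.API.Job import Job

job = Job()
job.setName("example_analysis")        # hypothetical job name
job.setExecutable("run_analysis.sh")   # hypothetical user script shipped with the job

dirac = Dirac()
result = dirac.submitJob(job)
print(result)  # a result dictionary; on success it carries the assigned job ID
```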

  14. Facilitating higher-fidelity simulations of axial compressor instability and other turbomachinery flow conditions

    NASA Astrophysics Data System (ADS)

    Herrick, Gregory Paul

    The quest to accurately capture flow phenomena with length-scales both short and long and to accurately represent complex flow phenomena within disparately sized geometry inspires a need for an efficient, high-fidelity, multi-block structured computational fluid dynamics (CFD) parallel computational scheme. This research presents and demonstrates a more efficient computational method by which to perform multi-block structured CFD parallel computational simulations, thus facilitating higher-fidelity solutions of complicated geometries (due to the inclusion of grids for "small" flow areas which are often merely modeled) and their associated flows. This computational framework offers greater flexibility and user-control in allocating the resource balance between process count and wall-clock computation time. The principal modifications implemented in this revision consist of a "multiple grid block per processing core" software infrastructure and an analytic computation of viscous flux Jacobians. The development of this scheme is largely motivated by the desire to simulate axial compressor stall inception with more complete gridding of the flow passages (including rotor tip clearance regions) than has been previously done while maintaining high computational efficiency (i.e., minimal consumption of computational resources), and thus this paradigm shall be demonstrated with an examination of instability in a transonic axial compressor. However, the paradigm presented herein facilitates CFD simulation of myriad previously impractical geometries and flows and is not limited to detailed analyses of axial compressor flows. While the simulations presented herein were technically possible under the previous structure of the subject software, they were much less computationally efficient and thus not pragmatically feasible; the previous research using this software to perform three-dimensional, full-annulus, time-accurate, unsteady, full-stage (with sliding-interface) simulations of rotating stall inception in axial compressors utilized tip clearance periodic models, while the scheme here is demonstrated by a simulation of axial compressor stall inception utilizing gridded rotor tip clearance regions. As will be discussed, much previous research---experimental, theoretical, and computational---has suggested that understanding clearance flow behavior is critical to understanding stall inception, and previous computational research efforts which have used tip clearance models have begged the question, "What about the clearance flows?". This research begins to address that question.
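    The "multiple grid block per processing core" infrastructure amounts to a load-balancing problem: distribute blocks of unequal size over a fixed number of cores so that per-core work is roughly equal. The Python sketch below shows one common greedy heuristic (largest block first, assigned to the least-loaded core); the block sizes are invented and this is illustrative only, not the scheme implemented in the dissertation's solver.

```python
import heapq

def assign_blocks(block_sizes, n_cores):
    """Greedy longest-processing-time assignment of grid blocks to cores.

    block_sizes: list of per-block cell counts (a proxy for per-block work).
    Returns a list of block-index lists, one per core.
    """
    heap = [(0, core) for core in range(n_cores)]   # (assigned work, core id)
    heapq.heapify(heap)
    assignment = [[] for _ in range(n_cores)]
    # Place the largest blocks first so the loads stay balanced.
    for idx in sorted(range(len(block_sizes)), key=lambda i: -block_sizes[i]):
        load, core = heapq.heappop(heap)
        assignment[core].append(idx)
        heapq.heappush(heap, (load + block_sizes[idx], core))
    return assignment

# Hypothetical block sizes (cells) spread over 3 cores.
print(assign_blocks([120, 80, 60, 40, 40, 20], n_cores=3))
```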

  15. Singularity: Scientific containers for mobility of compute.

    PubMed

    Kurtzer, Gregory M; Sochat, Vanessa; Bauer, Michael W

    2017-01-01

    Here we present Singularity, software developed to bring containers and reproducibility to scientific computing. Using Singularity containers, developers can work in reproducible environments of their choosing and design, and these complete environments can easily be copied and executed on other platforms. Singularity is an open source initiative that harnesses the expertise of system and software engineers and researchers alike, and integrates seamlessly into common workflows for both of these groups. As its primary use case, Singularity brings mobility of computing to both users and HPC centers, providing a secure means to capture and distribute software and compute environments. This ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game changing development for computational science.

  16. Singularity: Scientific containers for mobility of compute

    PubMed Central

    Kurtzer, Gregory M.; Bauer, Michael W.

    2017-01-01

    Here we present Singularity, software developed to bring containers and reproducibility to scientific computing. Using Singularity containers, developers can work in reproducible environments of their choosing and design, and these complete environments can easily be copied and executed on other platforms. Singularity is an open source initiative that harnesses the expertise of system and software engineers and researchers alike, and integrates seamlessly into common workflows for both of these groups. As its primary use case, Singularity brings mobility of computing to both users and HPC centers, providing a secure means to capture and distribute software and compute environments. This ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game changing development for computational science. PMID:28494014
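    To make the mobility-of-compute idea in these two records concrete, the sketch below runs a hypothetical analysis script inside a Singularity container from Python. The CLI form `singularity exec <image> <command>` is the real interface; the image name, script, and data file here are placeholders.

```python
import subprocess

# Hypothetical container image; `singularity exec` runs a command inside it.
image = "analysis_env.sif"
cmd = ["singularity", "exec", image,
       "python", "compute_stats.py", "--input", "data.csv"]

# Copying the same image to another HPC centre and repeating this call is the
# reproducibility/mobility point the abstract makes.
completed = subprocess.run(cmd, capture_output=True, text=True)
print(completed.returncode, completed.stdout)
```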

  17. Development of a computer technique for the prediction of transport aircraft flight profile sonic boom signatures. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Coen, Peter G.

    1991-01-01

    A new computer technique for the analysis of transport aircraft sonic boom signature characteristics was developed. This new technique, based on linear theory methods, combines the previously separate equivalent area and F function development with a signature propagation method using a single geometry description. The new technique was implemented in a stand-alone computer program and was incorporated into an aircraft performance analysis program. Through these implementations, both configuration designers and performance analysts are given new capabilities to rapidly analyze an aircraft's sonic boom characteristics throughout the flight envelope.

  18. The Foundation and Development of Computer Assisted Instruction in the Field of Reading from Its Inception to the Present.

    ERIC Educational Resources Information Center

    Zuberman, Lea K.

    This critical review and evaluation of the literature covers the field of computer assisted instruction (CAI) and reading from its inception to the present day. Seventeen research studies are discussed as well as four surveys of previous research in this field. Major issues addressed include the effectiveness of CAI and computer managed…

  19. The Development, Implementation, and Evaluation of a Computer-Assisted Branched Test for a Program of Individually Prescribed Instruction.

    ERIC Educational Resources Information Center

    Ferguson, Richard L.

    The focus of this study was upon the development and evaluation of a computer-assisted branched test to be used in making instructional decisions for individuals in the program of Individually Prescribed Instruction. A Branched Test is one in which the presentation of test items is contingent upon the previous responses of the examinee. The…

  20. Using NCLab-karel to improve computational thinking skill of junior high school students

    NASA Astrophysics Data System (ADS)

    Kusnendar, J.; Prabawa, H. W.

    2018-05-01

    Increasing human interaction with technology and the increasingly complex development of the digital world make computer science education an interesting theme to study. Previous studies on Computer Literacy and Competency reveal that Indonesian teachers in general have fairly high computational skill, but their use of that skill is limited to a few applications. This results in limited computer-related learning for their students. On the other hand, computer science education is often considered unrelated to real-world solutions. This paper addresses the utilization of NCLab-Karel in shaping computational thinking in students, which is believed to help students learn about technology. Implementation of Karel provides evidence that it is able to increase student interest in studying computational material, especially algorithms. Observations made during the learning process also indicate the growth and development of a computational mindset in students.

  1. Hexagonalization of correlation functions II: two-particle contributions

    NASA Astrophysics Data System (ADS)

    Fleury, Thiago; Komatsu, Shota

    2018-02-01

    In this work, we compute one-loop planar five-point functions in N=4 super-Yang-Mills using integrability. As in the previous work, we decompose the correlation functions into hexagon form factors and glue them using the weight factors which depend on the cross-ratios. The main new ingredient in the computation, as compared to the four-point functions studied in the previous paper, is the two-particle mirror contribution. We develop techniques to evaluate it and find agreement with the perturbative results in all the cases we analyzed. In addition, we consider next-to-extremal four-point functions, which are known to be protected, and show that the sum of one-particle and two-particle contributions at one loop adds up to zero as expected. The tools developed in this work would be useful for computing higher-particle contributions which would be relevant for more complicated quantities such as higher-loop corrections and non-planar correlators.

  2. Teachers' Attitudes to Using iPads or Tablet Computers; Implications for Developing New Skills, Pedagogies and School-Provided Support

    ERIC Educational Resources Information Center

    Young, Keith

    2016-01-01

    This study examined the attitudes of teachers towards using tablet computers, predominantly Apple's iPad, across 22 post primary-schools in Ireland. The study also questions some previous research and assumptions on the educational use of tablet computers. The majority of schools were using devices with students and teachers; the combined size of…

  3. The Cyborg Astrobiologist: testing a novelty detection algorithm on two mobile exploration systems at Rivas Vaciamadrid in Spain and at the Mars Desert Research Station in Utah

    NASA Astrophysics Data System (ADS)

    McGuire, P. C.; Gross, C.; Wendt, L.; Bonnici, A.; Souza-Egipsy, V.; Ormö, J.; Díaz-Martínez, E.; Foing, B. H.; Bose, R.; Walter, S.; Oesker, M.; Ontrup, J.; Haschke, R.; Ritter, H.

    2010-01-01

    In previous work, a platform was developed for testing computer-vision algorithms for robotic planetary exploration. This platform consisted of a digital video camera connected to a wearable computer for real-time processing of images at geological and astrobiological field sites. The real-time processing included image segmentation and the generation of interest points based upon uncommonness in the segmentation maps. Also in previous work, this platform for testing computer-vision algorithms has been ported to a more ergonomic alternative platform, consisting of a phone camera connected via the Global System for Mobile Communications (GSM) network to a remote-server computer. The wearable-computer platform has been tested at geological and astrobiological field sites in Spain (Rivas Vaciamadrid and Riba de Santiuste), and the phone camera has been tested at a geological field site in Malta. In this work, we (i) apply a Hopfield neural-network algorithm for novelty detection based upon colour, (ii) integrate a field-capable digital microscope on the wearable computer platform, (iii) test this novelty detection with the digital microscope at Rivas Vaciamadrid, (iv) develop a Bluetooth communication mode for the phone-camera platform, in order to allow access to a mobile processing computer at the field sites, and (v) test the novelty detection on the Bluetooth-enabled phone camera connected to a netbook computer at the Mars Desert Research Station in Utah. This systems engineering and field testing have together allowed us to develop a real-time computer-vision system that is capable, for example, of identifying lichens as novel within a series of images acquired in semi-arid desert environments. We acquired sequences of images of geologic outcrops in Utah and Spain consisting of various rock types and colours to test this algorithm. The algorithm robustly recognized previously observed units by their colour, while requiring only a single image or a few images to learn colours as familiar, demonstrating its fast learning capability.
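    The core idea here, learn colours quickly as "familiar" and flag anything sufficiently different as "novel", can be illustrated with a much simpler stand-in than the Hopfield network used in the paper. The Python sketch below keeps a memory of observed colours and flags distant ones; the threshold and colour values are made up for illustration and are not the authors' algorithm.

```python
import numpy as np

class ColourNoveltyMemory:
    """Toy novelty detector: remembers colours it has seen and flags distant ones.

    A simplified stand-in for the Hopfield-based detector in the abstract, meant
    only to illustrate 'learn familiar colours fast, flag the rest as novel'.
    """
    def __init__(self, threshold=0.15):
        self.memory = []          # list of RGB vectors in [0, 1]
        self.threshold = threshold

    def is_novel(self, rgb):
        if not self.memory:
            return True
        dists = [np.linalg.norm(np.asarray(rgb, float) - m) for m in self.memory]
        return min(dists) > self.threshold

    def observe(self, rgb):
        rgb = np.asarray(rgb, dtype=float)
        if self.is_novel(rgb):
            self.memory.append(rgb)   # one exposure is enough to become familiar
            return True               # novel at the time it was seen
        return False

mem = ColourNoveltyMemory()
print(mem.observe([0.55, 0.45, 0.35]))   # desert-rock colour -> novel, then learned
print(mem.observe([0.56, 0.44, 0.36]))   # similar colour -> familiar
print(mem.observe([0.30, 0.70, 0.30]))   # lichen-green colour -> novel
```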

  4. To Lease or Not To Lease.

    ERIC Educational Resources Information Center

    Hamilton, William A.

    1998-01-01

    Thanks to previous bond issues, the Walled Lake (Michigan) Schools had a well-defined technology plan featuring staff development, student performance benchmarks, and rooms of outdated computers. After three bond issues failed, the district adopted leasing as an alternative. Their present three-year contract supplies 154 used computers and a…

  5. Efficient conjugate gradient algorithms for computation of the manipulator forward dynamics

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Scheid, Robert E.

    1989-01-01

    The applicability of conjugate gradient algorithms for computation of the manipulator forward dynamics is investigated. The redundancies in the previously proposed conjugate gradient algorithm are analyzed. A new version is developed which, by avoiding these redundancies, achieves a significantly greater efficiency. A preconditioned conjugate gradient algorithm is also presented. A diagonal matrix whose elements are the diagonal elements of the inertia matrix is proposed as the preconditioner. In order to increase the computational efficiency, an algorithm is developed which exploits the synergism between the computation of the diagonal elements of the inertia matrix and that required by the conjugate gradient algorithm.
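    As a reference point for the preconditioning idea in this abstract, here is a generic Jacobi-preconditioned conjugate gradient solver in Python: the diagonal of a symmetric positive-definite matrix plays the role the abstract assigns to the diagonal of the inertia matrix. This is textbook PCG, not the authors' manipulator-dynamics-specific algorithm, and the small test system is made up.

```python
import numpy as np

def preconditioned_cg(A, b, tol=1e-8, max_iter=200):
    """Conjugate gradients with a diagonal (Jacobi) preconditioner for SPD A."""
    M_inv = 1.0 / np.diag(A)          # diagonal preconditioner
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small SPD test system (illustrative only).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(preconditioned_cg(A, b))
```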

  6. Expedited patient-specific assessment of contact stress exposure in the ankle joint following definitive articular fracture reduction.

    PubMed

    Kern, Andrew M; Anderson, Donald D

    2015-09-18

    Acute injury severity, altered joint kinematics, and joint incongruity are three important mechanical factors linked to post-traumatic osteoarthritis (PTOA). Finite element analysis (FEA) was previously used to assess the influence of increased contact stress due to joint incongruity on PTOA development. While promising agreement with PTOA development was seen, the inherent complexities of contact FEA limited the numbers of subjects that could be analyzed. Discrete element analysis (DEA) is a simplified methodology for contact stress computation, which idealizes contact surfaces as a bed of independent linear springs. In this study, DEA was explored as an expedited alternative to FEA contact stress exposure computation. DEA was compared to FEA using results from a previously completed validation study of two cadaveric human ankles, as well as a previous study of post-operative contact stress exposure in 11 patients with tibial plafond fracture. DEA-computed maximum contact stresses were within 19% of those experimentally measured, with 90% of the contact area having computed contact stress values within 1 MPa of those measured. In the 11 fractured ankles, maximum contact stress and contact area differences between DEA and FEA were 0.85 ± 0.64 MPa and 22.5 ± 11.5 mm². As a predictive measure for PTOA development, both DEA and FEA had 100% concordance with presence of OA (KL grade ≥ 2) and >95% concordance with KL grade at 2 years. These results support DEA as a reasonable alternative to FEA for computing contact stress exposures following surgical reduction of a tibial plafond fracture. Copyright © 2015 Elsevier Ltd. All rights reserved.
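    The discrete element analysis idea, idealizing contact as a bed of independent linear springs, can be sketched in a few lines: each engaged spring carries a stress proportional to its penetration. The gap values and spring stiffness below are invented for illustration and are not from the study.

```python
import numpy as np

# Minimal DEA-style sketch: contact idealized as independent linear springs.
# 'gap' is a hypothetical surface separation at each spring site, in mm;
# negative values mean interpenetration, i.e. the spring is engaged.
gap = np.array([0.02, -0.01, -0.03, 0.00, -0.02])   # mm (assumed values)
spring_stiffness = 100.0                             # MPa per mm of penetration (assumed)

penetration = np.clip(-gap, 0.0, None)               # only engaged springs carry load
contact_stress = spring_stiffness * penetration      # MPa at each spring site

print("max contact stress:", contact_stress.max(), "MPa")
print("engaged fraction of sites:", np.count_nonzero(penetration) / gap.size)
```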

  7. 43 CFR 11.64 - Injury determination phase-testing and sampling methods.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...

  8. 43 CFR 11.64 - Injury determination phase-testing and sampling methods.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...

  9. 43 CFR 11.64 - Injury determination phase-testing and sampling methods.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... In developing these objectives, the availability of information from response actions relating to the...), test cases proving the code works, and any alteration of previously documented code made to adapt the... computer code (if any), test cases proving the code works, and any alteration of previously documented code...

  10. A Randomized Field Trial of the Fast ForWord Language Computer-Based Training Program

    ERIC Educational Resources Information Center

    Borman, Geoffrey D.; Benson, James G.; Overman, Laura

    2009-01-01

    This article describes an independent assessment of the Fast ForWord Language computer-based training program developed by Scientific Learning Corporation. Previous laboratory research involving children with language-based learning impairments showed strong effects on their abilities to recognize brief and fast sequences of nonspeech and speech…

  11. Computational Model of the Fathead Minnow Hypothalamic-Pituitary-Gonadal Axis: Incorporating Protein Synthesis in Improving Predictability of Responses to Endocrine Active Chemicals

    EPA Science Inventory

    There is international concern about chemicals that alter endocrine system function in humans and/or wildlife and subsequently cause adverse effects. We previously developed a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minno...

  12. Automatic Grading of 3D Computer Animation Laboratory Assignments

    ERIC Educational Resources Information Center

    Lamberti, Fabrizio; Sanna, Andrea; Paravati, Gianluca; Carlevaris, Gilles

    2014-01-01

    Assessment is a delicate task in the overall teaching process because it may require significant time and may be prone to subjectivity. Subjectivity is especially true for disciplines in which perceptual factors play a key role in the evaluation. In previous decades, computer-based assessment techniques were developed to address the…

  13. Computational Modelling and Children's Expressions of Signal and Noise

    ERIC Educational Resources Information Center

    Ainley, Janet; Pratt, Dave

    2017-01-01

    Previous research has demonstrated how young children can identify the signal in data. In this exploratory study we considered how they might also express meanings for noise when creating computational models using recent developments in software tools. We conducted extended clinical interviews with four groups of 11-year-olds and analysed the…

  14. Mobile-Computing Trends: Lighter, Faster, Smarter

    ERIC Educational Resources Information Center

    Godwin-Jones, Robert

    2008-01-01

    The new era of mobile computing promises greater variety in applications, highly improved usability, and speedier networking. The 3G iPhone from Apple is the poster child for this trend, but there are plenty of other developments that point in this direction. Previous surveys, in LLT, and by researchers at the UK's Open University, have…

  15. Trends in computer hardware and software.

    PubMed

    Frankenfeld, F M

    1993-04-01

    Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.

  16. Probabilistic analysis algorithm for UA slope software program.

    DOT National Transportation Integrated Search

    2013-12-01

    A reliability-based computational algorithm for using a single row of equally spaced drilled shafts to stabilize an unstable slope has been developed in this research. The Monte-Carlo simulation (MCS) technique was used in the previously develop...
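    As a generic illustration of the Monte-Carlo simulation technique mentioned (not the UA Slope algorithm itself), the sketch below estimates a probability of slope failure by sampling a hypothetical lognormal factor of safety and counting realizations below 1.0. The mean and coefficient of variation are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000

# Hypothetical lognormal factor of safety for the stabilized slope
# (mean and coefficient of variation are illustrative, not from the report).
mean_fs, cov_fs = 1.3, 0.20
sigma = np.sqrt(np.log(1 + cov_fs**2))
mu = np.log(mean_fs) - 0.5 * sigma**2
fs = rng.lognormal(mu, sigma, n_trials)

# Failure is defined as factor of safety below 1.0.
prob_failure = np.mean(fs < 1.0)
print(f"estimated probability of failure: {prob_failure:.4f}")
```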

  17. Advanced computer architecture specification for automated weld systems

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1994-01-01

    This report describes the requirements for an advanced automated weld system and the associated computer architecture, and defines the overall system specification from a broad perspective. According to the requirements of welding procedures as they relate to an integrated multiaxis motion control and sensor architecture, the computer system requirements are developed based on a proven multiple-processor architecture with an expandable, distributed-memory, single global bus architecture, containing individual processors which are assigned to specific tasks that support sensor or control processes. The specified architecture is sufficiently flexible to integrate previously developed equipment, be upgradable and allow on-site modifications.

  18. Development and assessment of a chemistry-based computer video game as a learning tool

    NASA Astrophysics Data System (ADS)

    Martinez-Hernandez, Kermin Joel

    The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning experience through gameplay. The project consists of three areas: development, assessment, and implementation. However, the foci of this study were the development and assessment of the computer video game including possible learning outcomes and game design elements. A chemistry-based game using a mixed genre of a single player first-person game embedded with action-adventure and puzzle components was developed to determine if students' level of understanding of chemistry concepts change after gameplay intervention. Three phases have been completed to assess students' understanding of chemistry concepts prior and after gameplay intervention. Two main assessment instruments (pre/post open-ended content survey and individual semi-structured interviews) were used to assess student understanding of concepts. In addition, game design elements were evaluated for future development phases. Preliminary analyses of the interview data suggest that students were able to understand most of the chemistry challenges presented in the game and the game served as a review for previously learned concepts as well as a way to apply such previous knowledge. To guarantee a better understanding of the chemistry concepts, additions such as debriefing and feedback about the content presented in the game seem to be needed. The use of visuals in the game to represent chemical processes, game genre, and game idea appear to be the game design elements that students like the most about the current computer video game.

  19. A Knowledge Engineering Approach to Developing Educational Computer Games for Improving Students' Differentiating Knowledge

    ERIC Educational Resources Information Center

    Hwang, Gwo-Jen; Sung, Han-Yu; Hung, Chun-Ming; Yang, Li-Hsueh; Huang, Iwen

    2013-01-01

    Educational computer games have been recognized as being a promising approach for motivating students to learn. Nevertheless, previous studies have shown that without proper learning strategies or supportive models, the learning achievement of students might not be as good as expected. In this study, a knowledge engineering approach is proposed…

  20. Developing an Educational Computer Game for Migratory Bird Identification Based on a Two-Tier Test Approach

    ERIC Educational Resources Information Center

    Chu, Hui-Chun; Chang, Shao-Chen

    2014-01-01

    Although educational computer games have been recognized as being a promising approach, previous studies have indicated that, without supportive models, students might only show temporary interest during the game-based learning process, and their learning performance is often not as good as expected. Therefore, in this paper, a two-tier test…

  1. Collaborative Learning with Screen-Based Simulation in Health Care Education: An Empirical Study of Collaborative Patterns and Proficiency Development

    ERIC Educational Resources Information Center

    Hall, L. O.; Soderstrom, T.; Ahlqvist, J.; Nilsson, T.

    2011-01-01

    This article is about collaborative learning with educational computer-assisted simulation (ECAS) in health care education. Previous research on training with a radiological virtual reality simulator has indicated positive effects on learning when compared to a more conventional alternative. Drawing upon the field of Computer-Supported…

  2. Computational Thinking in Mathematics Teacher Education

    ERIC Educational Resources Information Center

    Gadanidis, George; Cendros, Rosa; Floyd, Lisa; Namukasa, Immaculate

    2017-01-01

    As computational thinking (CT) is increasing in focus in K-12 education, it is important to consider how teacher education programs may better prepare teacher candidates (TCs). Previous studies have found that TCs do not always have a firm understanding of what CT involves, and they might not have clear ideas about how to develop CT in their…

  3. APPLICATION OF COMPUTER-AIDED TOMOGRAPHY (CAT) AS A POTENTIAL INDICATOR OF MARINE MACROBENTHIC ACTIVITY ALONG POLLUTION GRADIENTS

    EPA Science Inventory

    Sediment cores were imaged using a local hospital CAT scanner. These image data were transferred to a personal computer at our laboratory using specially developed software. Previously, we reported an inverse correlation (r2 = 0.98, P<0.01) between the average sediment x-ray atte...

  4. Planning and Scheduling of Software Manufacturing Projects

    DTIC Science & Technology

    1991-03-01

    based on the previous results in social analysis of computing, operations research in manufacturing, artificial intelligence in manufacturing planning and scheduling, and the traditional approaches to planning in artificial intelligence, and extends the techniques that have been developed by them...

  5. Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective

    PubMed Central

    Gu, Shuo

    2017-01-01

    With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine with in-depth understanding towards pharmacognosy. This paper summarized these studies in the aspects of computational methods, traditional Chinese medicine (TCM) compound databases, and TCM network pharmacology. Furthermore, we chose arachidonic acid metabolic network as a case study to demonstrate the regulatory function of herbal medicine in the treatment of inflammation at network level. Finally, a computational workflow for the network-based TCM study, derived from our previous successful applications, was proposed. PMID:28690664

  6. Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective.

    PubMed

    Gu, Shuo; Pei, Jianfeng

    2017-01-01

    With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine with in-depth understanding towards pharmacognosy. This paper summarized these studies in the aspects of computational methods, traditional Chinese medicine (TCM) compound databases, and TCM network pharmacology. Furthermore, we chose arachidonic acid metabolic network as a case study to demonstrate the regulatory function of herbal medicine in the treatment of inflammation at network level. Finally, a computational workflow for the network-based TCM study, derived from our previous successful applications, was proposed.

  7. Semiannual Report, April 1, 1989 through September 30, 1989 (Institute for Computer Applications in Science and Engineering)

    DTIC Science & Technology

    1990-02-01

    noise. Tobias B. Orloff: Work began on developing a high-quality rendering algorithm based on the radiosity method. The algorithm is similar to previous progressive radiosity algorithms except for the following improvements: 1. At each iteration, vertex radiosities are computed using a modified scan-line approach, thus eliminating the quadratic cost associated with a ray-tracing computation of vertex radiosities. 2. At each iteration the scene is

  8. Parallel, distributed and GPU computing technologies in single-particle electron microscopy

    PubMed Central

    Schmeisser, Martin; Heisen, Burkhard C.; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger

    2009-01-01

    Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today’s technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined. PMID:19564686

  9. Parallel, distributed and GPU computing technologies in single-particle electron microscopy.

    PubMed

    Schmeisser, Martin; Heisen, Burkhard C; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger

    2009-07-01

    Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today's technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined.
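    One of the parallel-processing paradigms discussed in these two records, multicore CPU parallelism, can be illustrated with a trivial Python example that maps a per-particle operation over a pool of worker processes. The image data and the operation are placeholders, not an actual single-particle reconstruction step.

```python
from multiprocessing import Pool
import numpy as np

def score_particle(image):
    # Stand-in for a per-particle operation (e.g. filtering or alignment);
    # real single-particle pipelines do far more work per image.
    return float(np.abs(np.fft.fft2(image)).sum())

if __name__ == "__main__":
    # Placeholder particle images; in practice these come from micrographs.
    particles = [np.random.rand(128, 128) for _ in range(64)]
    with Pool(processes=8) as pool:          # multicore paradigm: one worker per core
        scores = pool.map(score_particle, particles)
    print(len(scores), "particles processed in parallel")
```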

  10. Interacting with a Computer-Simulated Pet: Factors Influencing Children's Humane Attitudes and Empathy

    ERIC Educational Resources Information Center

    Tsai, Yueh-Feng; Kaufman, David

    2014-01-01

    Previous research by Tsai and Kaufman (2010a, 2010b) has suggested that computer-simulated virtual pet dogs can be used as a potential medium to enhance children's development of empathy and humane attitudes toward animals. To gain a deeper understanding of how and why interacting with a virtual pet dog might influence children's social and…

  11. Factors affecting and affected by user acceptance of computer-based nursing documentation: results of a two-year study.

    PubMed

    Ammenwerth, Elske; Mansmann, Ulrich; Iller, Carola; Eichstädter, Ronald

    2003-01-01

    The documentation of the nursing process is an important but often neglected part of clinical documentation. Paper-based systems have been introduced to support nursing process documentation. Frequently, however, problems such as low quality of documentation are reported. It is unclear whether computer-based documentation systems can reduce these problems and which factors influence their acceptance by users. We introduced a computer-based nursing documentation system on four wards of the University Hospitals of Heidelberg and systematically evaluated its preconditions and its effects in a pretest-posttest intervention study. For the analysis of user acceptance, we concentrated on subjective data drawn from questionnaires and interviews. A questionnaire was developed using items from published questionnaires and items that had to be developed for the special purpose of this study. The quantitative results point to two factors influencing the acceptance of a new computer-based documentation system: the previous acceptance of the nursing process and the previous amount of self-confidence when using computers. On one ward, the diverse acceptance scores heavily declined after the introduction of the nursing documentation system. Explorative qualitative analysis on this ward points to further success factors of computer-based nursing documentation systems. Our results can be used to assist the planning and introduction of computer-based nursing documentation systems. They demonstrate the importance of computer experience and acceptance of the nursing process on a ward but also point to other factors such as the fit between nursing workflow and the functionality of a nursing documentation system.

  12. Structural Optimization Methodology for Rotating Disks of Aircraft Engines

    NASA Technical Reports Server (NTRS)

    Armand, Sasan C.

    1995-01-01

    In support of the preliminary evaluation of various engine technologies, a methodology has been developed for structurally designing the rotating disks of an aircraft engine. The structural design methodology, along with a previously derived methodology for predicting low-cycle fatigue life, was implemented in a computer program. An interface computer program was also developed that gathers the required data from a flowpath analysis program (WATE) being used at NASA Lewis. The computer program developed for this study requires minimum interaction with the user, thus allowing engineers with varying backgrounds in aeropropulsion to successfully execute it. The stress analysis portion of the methodology and the computer program were verified by employing the finite element analysis method. The 10th-stage, high-pressure-compressor disk of the Energy Efficient Engine Program (E3) engine was used to verify the stress analysis; the differences between the stresses and displacements obtained from the computer program developed for this study and from the finite element analysis were all below 3 percent for the problem solved. The computer program developed for this study was employed to structurally optimize the rotating disks of the E3 high-pressure compressor. The rotating disks designed by the computer program in this study were approximately 26 percent lighter than calculated from the E3 drawings. The methodology is presented herein.

  13. User's guide for a computer program for calculating the zero-lift wave drag of complex aircraft configurations

    NASA Technical Reports Server (NTRS)

    Craidon, C. B.

    1983-01-01

    A computer program was developed to extend the geometry input capabilities of previous versions of a supersonic zero lift wave drag computer program. The arbitrary geometry input description is flexible enough to describe almost any complex aircraft concept, so that highly accurate wave drag analysis can now be performed because complex geometries can be represented accurately and do not have to be modified to meet the requirements of a restricted input format.

  14. Business Mathematics Curriculum Guide.

    ERIC Educational Resources Information Center

    Ebersole, Benjamin P., Comp.; And Others

    This revised course in business mathematics emphasizes computations needed for problem solving, but greater attention is focused on mathematical principles that were developed in previous grades. In addition, the course aims to develop further an understanding of business principles and practices which can be used in gainful employment and in the…

  15. Exploring the experience of clients with tetraplegia utilizing assistive technology for computer access.

    PubMed

    Folan, Alyce; Barclay, Linda; Cooper, Cathy; Robinson, Merren

    2015-01-01

    Assistive technology for computer access can be used to facilitate people with a spinal cord injury to utilize mainstream computer applications, thereby enabling participation in a variety of meaningful occupations. The aim of this study was to gain an understanding of the experiences of clients with tetraplegia trialing assistive technologies for computer access during different stages in a public rehabilitation service. In order to explore the experiences of clients with tetraplegia trialing assistive technologies for computer use, qualitative methodology was selected. Data were collected from seven participants using semi-structured interviews, which were audio-taped, transcribed and analyzed thematically. Three main themes were identified. These were: getting back into life, assisting in adjusting to injury and learning new skills. The findings from this study demonstrated that people with tetraplegia can be assisted to return to previous life roles or engage in new roles, through developing skills in the use of assistive technology for computer access. Being able to use computers for meaningful activities contributed to the participants gaining an enhanced sense of self-efficacy, and thereby quality of life. Implications for Rehabilitation: Findings from this pilot study indicate that people with tetraplegia can be assisted to return to previous life roles, and develop new roles that have meaning to them through the use of assistive technologies for computer use. Being able to use the internet to socialize, and complete daily tasks, contributed to the participants gaining a sense of control over their lives. Early introduction to assistive technology is important to ensure sufficient time for newly injured people to feel comfortable enough with the assistive technology to use the computers productively by the time of discharge. Further research into this important and expanding area is indicated.

  16. NASA's 3D Flight Computer for Space Applications

    NASA Technical Reports Server (NTRS)

    Alkalai, Leon

    2000-01-01

    The New Millennium Program (NMP) Integrated Product Development Team (IPDT) for Microelectronics Systems was planning to validate a newly developed 3D Flight Computer system on its first deep-space flight, DS1, launched in October 1998. This computer, developed in the 1995-97 time frame, contains many new computer technologies never previously used in deep-space systems. They include: advanced 3D packaging architecture for future low-mass and low-volume avionics systems; high-density 3D packaged chip-stacks for both volatile and non-volatile mass memory: 400 Mbytes of local DRAM memory, and 128 Mbytes of Flash memory; high-bandwidth Peripheral Component Interconnect (PCI) local bus with a bridge to VME; high-bandwidth (20 Mbps) fiber-optic serial bus; and other attributes, such as standard support for Design for Testability (DFT). Even though this computer system was not completed in time for delivery to the DS1 project, it was an important development along a technology roadmap towards highly integrated and highly miniaturized avionics systems for deep-space applications. This continued technology development is now being performed by NASA's Deep Space System Development Program (also known as X2000) and within JPL's Center for Integrated Space Microsystems (CISM).

  17. The magic words: Using computers to uncover mental associations for use in magic trick design.

    PubMed

    Williams, Howard; McOwan, Peter W

    2017-01-01

    The use of computational systems to aid in the design of magic tricks has been previously explored. Here further steps are taken in this direction, introducing the use of computer technology as a natural language data sourcing and processing tool for magic trick design purposes. Crowd sourcing of psychological concepts is investigated; further, the role of human associative memory and its exploitation in magical effects is explored. A new trick is developed and evaluated: a physical card trick partially designed by a computational system configured to search for and explore conceptual spaces readily understood by spectators.

  18. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  19. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  20. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  1. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  2. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  3. An Improved Treatment of External Boundary for Three-Dimensional Flow Computations

    NASA Technical Reports Server (NTRS)

    Tsynkov, Semyon V.; Vatsa, Veer N.

    1997-01-01

    We present an innovative numerical approach for setting highly accurate nonlocal boundary conditions at the external computational boundaries when calculating three-dimensional compressible viscous flows over finite bodies. The approach is based on application of the difference potentials method by V. S. Ryaben'kii and extends our previous technique developed for the two-dimensional case. The new boundary conditions methodology has been successfully combined with the NASA-developed code TLNS3D and used for the analysis of wing-shaped configurations in subsonic and transonic flow regimes. As demonstrated by the computational experiments, the improved external boundary conditions allow one to greatly reduce the size of the computational domain while still maintaining high accuracy of the numerical solution. Moreover, they may provide for a noticeable speedup of convergence of the multigrid iterations.

  4. Auto Mechanics; Methodology. Technical Instruction Manual.

    ERIC Educational Resources Information Center

    Systems Operation Support, Inc., King of Prussia, PA.

    This student instruction manual was written in conformance with selected criteria for programed instruction books as developed previously for various military training courses. The manual was developed as a part of "A Study of the Effectiveness of a Military-Type Computer-Based Instructional System When Used in Civilian High School Courses in…

  5. Numerical Technology for Large-Scale Computational Electromagnetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharpe, R; Champagne, N; White, D

The key bottleneck of implicit computational electromagnetics tools for large complex geometries is the solution of the resulting linear system of equations. The goal of this effort was to research and develop critical numerical technology that alleviates this bottleneck for large-scale computational electromagnetics (CEM). The mathematical operators and numerical formulations used in this arena of CEM yield linear equations that are complex valued, unstructured, and indefinite. Also, simultaneously applying multiple mathematical modeling formulations to different portions of a complex problem (hybrid formulations) results in a mixed structure linear system, further increasing the computational difficulty. Typically, these hybrid linear systems are solved using a direct solution method, which was acceptable for Cray-class machines but does not scale adequately for ASCI-class machines. Additionally, LLNL's previously existing linear solvers were not well suited for the linear systems that are created by hybrid implicit CEM codes. Hence, a new approach was required to make effective use of ASCI-class computing platforms and to enable the next generation design capabilities. Multiple approaches were investigated, including the latest sparse-direct methods developed by our ASCI collaborators. In addition, approaches that combine domain decomposition (or matrix partitioning) with general-purpose iterative methods and special-purpose pre-conditioners were investigated. Special-purpose pre-conditioners that take advantage of the structure of the matrix were adapted and developed based on intimate knowledge of the matrix properties. Finally, new operator formulations were developed that radically improve the conditioning of the resulting linear systems, thus greatly reducing solution time. The goal was to enable the solution of CEM problems that are 10 to 100 times larger than our previous capability.
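    As a hedged illustration of the kind of preconditioned iterative solve discussed above (not the LLNL solvers themselves), the sketch below assembles a small complex-valued sparse system and solves it with GMRES accelerated by an incomplete-LU preconditioner. The matrix is an invented stand-in, and SciPy is assumed for the sparse and solver routines; a special-purpose pre-conditioner of the kind described in the record would replace the generic ILU factorization here.

```python
# Illustrative sketch: a complex-valued sparse system solved with ILU-preconditioned GMRES.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spilu, LinearOperator, gmres

n = 2000
# Hypothetical stand-in for a hybrid CEM system matrix: complex, sparse, tridiagonal.
diag = np.random.randn(n) + 1j * (0.5 + np.random.rand(n))
A = sp.diags([diag, 0.5 * np.ones(n - 1), 0.5 * np.ones(n - 1)], [0, -1, 1], format="csc")
b = np.random.randn(n) + 1j * np.random.randn(n)

ilu = spilu(A, drop_tol=1e-4)                                  # incomplete LU factorization
M = LinearOperator(A.shape, matvec=ilu.solve, dtype=A.dtype)   # preconditioner as an operator

x, info = gmres(A, b, M=M)
print("converged" if info == 0 else f"gmres returned info={info}")
```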

  6. Computational modeling for prediction of the shear stress of three-dimensional isotropic and aligned fiber networks.

    PubMed

    Park, Seungman

    2017-09-01

Interstitial flow (IF) is a creeping flow through the interstitial space of the extracellular matrix (ECM). IF plays a key role in diverse biological functions, such as tissue homeostasis, cell function and behavior. Currently, most studies that have characterized IF have focused on the permeability of the ECM or the shear stress distribution on the cells, but less is known about the prediction of shear stress on individual fibers or fiber networks, despite its significance in the alignment of matrix fibers and cells observed in fibrotic or wound tissues. In this study, I developed a computational model to predict shear stress for differently structured fibrous networks. To generate isotropic models, a random growth algorithm and a second-order orientation tensor were employed. Then, a three-dimensional (3D) solid model was created using computer-aided design (CAD) software for the aligned models (i.e., parallel, perpendicular and cubic models). Subsequently, a tetrahedral unstructured mesh was generated and flow solutions were calculated by solving equations for mass and momentum conservation for all models. From the flow solutions, I estimated permeability using Darcy's law. Average shear stress (ASS) on the fibers was calculated by averaging the wall shear stress of the fibers. By using nonlinear surface fitting of permeability, viscosity, velocity, porosity and ASS, I devised new computational models. Overall, the developed models showed that higher porosity induced higher permeability, as previous empirical and theoretical models have shown. In terms of permeability, the present computational models matched well with previous models, which justifies the computational approach. ASS tended to increase linearly with respect to inlet velocity and dynamic viscosity, whereas permeability remained almost the same. Finally, the developed model accurately predicted the ASS values that had been directly estimated from computational fluid dynamics (CFD). The present computational models will provide new tools for predicting accurate functional properties and designing fibrous porous materials, thereby significantly advancing tissue engineering. Copyright © 2017 Elsevier B.V. All rights reserved.
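    The Darcy's-law step mentioned in the abstract reduces, in its simplest one-dimensional form, to a single formula; the sketch below shows that calculation with invented placeholder values (the velocity, pressure drop and domain size are illustrative, not from the paper).

```python
# Minimal sketch: estimating permeability from a superficial velocity, fluid viscosity,
# and pressure drop across the fibrous domain via Darcy's law.
def darcy_permeability(velocity, viscosity, length, delta_p):
    """k = mu * v * L / dP (Darcy's law for 1D creeping flow)."""
    return viscosity * velocity * length / delta_p

mu = 1e-3        # dynamic viscosity of water, Pa*s
v = 1e-6         # superficial (Darcy) velocity, m/s
L = 100e-6       # domain length, m
dp = 10.0        # pressure drop, Pa

k = darcy_permeability(v, mu, L, dp)
print(f"estimated permeability: {k:.3e} m^2")
```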

  7. A useful approximation for the flat surface impulse response

    NASA Technical Reports Server (NTRS)

    Brown, Gary S.

    1989-01-01

    The flat surface impulse response (FSIR) is a very useful quantity in computing the mean return power for near-nadir-oriented short-pulse radar altimeters. However, for very small antenna beamwidths and relatively large pointing angles, previous analytical descriptions become very difficult to compute accurately. An asymptotic approximation is developed to overcome these computational problems. Since accuracy is of key importance, a condition is developed under which this solution is within 2 percent of the exact answer. The asymptotic solution is shown to be in functional agreement with a conventional clutter power result and gives a 1.25-dB correction to this formula to account properly for the antenna-pattern variation over the illuminated area.

  8. Electrochemical carbon dioxide concentrator subsystem math model. [for manned space station

    NASA Technical Reports Server (NTRS)

    Marshall, R. D.; Carlson, J. N.; Schubert, F. H.

    1974-01-01

A steady state computer simulation model has been developed to describe the performance of a total, six-man, self-contained electrochemical carbon dioxide concentrator subsystem built for the space station prototype. The math model combines expressions describing the performance of the electrochemical depolarized carbon dioxide concentrator cells and modules previously developed with expressions describing the performance of the other major CS-6 components. The model is capable of accurately predicting CS-6 performance over EDC operating ranges, and the computer simulation results agree with experimental data obtained over the prediction range.

  9. Tensor scale-based fuzzy connectedness image segmentation

    NASA Astrophysics Data System (ADS)

    Saha, Punam K.; Udupa, Jayaram K.

    2003-05-01

Tangible solutions to image segmentation are vital in many medical imaging applications. Toward this goal, a framework based on fuzzy connectedness was developed in our laboratory. A fundamental notion called "affinity" - a local fuzzy hanging togetherness relation on voxels - determines the effectiveness of this segmentation framework in real applications. In this paper, we introduce the notion of "tensor scale" - a recently developed local morphometric parameter - in the affinity definition and study its effectiveness. Although our previous notion of "local scale" using the spherical model successfully incorporated local structure size into affinity and resulted in measurable improvements in segmentation results, a major limitation of the previous approach was that it ignored local structural orientation and anisotropy. The current approach of using tensor scale in affinity computation allows an effective utilization of local size, orientation, and anisotropy in a unified manner. Tensor scale is used for computing both the homogeneity- and object-feature-based components of affinity. Preliminary results of the proposed method on several medical images and computer-generated phantoms of realistic shapes are presented. Further extensions of this work are discussed.
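    For readers unfamiliar with affinity functions, the sketch below shows one minimal homogeneity-based affinity between two adjacent voxels; it uses a simple isotropic Gaussian form and deliberately omits the tensor-scale, orientation and object-feature machinery that the paper actually introduces.

```python
# Illustrative homogeneity-based fuzzy affinity between adjacent voxels: close
# intensities yield affinity near 1, dissimilar intensities yield affinity near 0.
import numpy as np

def homogeneity_affinity(f_a, f_b, sigma=10.0):
    """Affinity in [0, 1] based only on the intensity difference (no tensor scale)."""
    return float(np.exp(-((f_a - f_b) ** 2) / (2.0 * sigma ** 2)))

print(homogeneity_affinity(100, 103))   # similar voxels -> high affinity
print(homogeneity_affinity(100, 160))   # dissimilar voxels -> low affinity
```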

  10. Fusing Symbolic and Numerical Diagnostic Computations

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method and the other implementing a symbolic analysis method, into a unified event-based decision analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAMs), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.

  11. Noise Radiation From a Leading-Edge Slat

    NASA Technical Reports Server (NTRS)

    Lockard, David P.; Choudhari, Meelan M.

    2009-01-01

    This paper extends our previous computations of unsteady flow within the slat cove region of a multi-element high-lift airfoil configuration, which showed that both statistical and structural aspects of the experimentally observed unsteady flow behavior can be captured via 3D simulations over a computational domain of narrow spanwise extent. Although such narrow domain simulation can account for the spanwise decorrelation of the slat cove fluctuations, the resulting database cannot be applied towards acoustic predictions of the slat without invoking additional approximations to synthesize the fluctuation field over the rest of the span. This deficiency is partially alleviated in the present work by increasing the spanwise extent of the computational domain from 37.3% of the slat chord to nearly 226% (i.e., 15% of the model span). The simulation database is used to verify consistency with previous computational results and, then, to develop predictions of the far-field noise radiation in conjunction with a frequency-domain Ffowcs-Williams Hawkings solver.

  12. Computer Simulations of Coronary Blood Flow Through a Constriction

    DTIC Science & Technology

    2014-03-01

Building off previous models that have been partially validated with experimental data, this thesis continues to develop computer simulations of coronary blood flow through a constriction, with application to interventional procedures (e.g., stent deployment). Such procedures open the artery and increase blood flow; generally a stent, or mesh wire tube, is permanently inserted in order to scaffold open the artery wall.

  13. 5 CFR 839.1002 - Will OPM compute the lost earnings if my qualifying retirement coverage error was previously...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Will OPM compute the lost earnings if my... compute the lost earnings if my qualifying retirement coverage error was previously corrected and I made... coverage error was previously corrected, OPM will compute the lost earnings on your make-up contributions...

  14. Development of a multiple-parameter nonlinear perturbation procedure for transonic turbomachinery flows: Preliminary application to design/optimization problems

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Elliott, J. P.; Spreiter, J. R.

    1983-01-01

An investigation was conducted to continue the development of perturbation procedures and associated computational codes for rapidly determining approximations to nonlinear flow solutions, with the purpose of establishing a method for minimizing computational requirements associated with parametric design studies of transonic flows in turbomachines. The results reported here concern the extension of the previously developed successful method for single-parameter perturbations to simultaneous multiple-parameter perturbations, and the preliminary application of the multiple-parameter procedure in combination with an optimization method to a blade design/optimization problem. In order to provide as severe a test as possible of the method, attention is focused in particular on transonic flows which are highly supercritical. Flows past both isolated blades and compressor cascades, involving simultaneous changes in both flow and geometric parameters, are considered. Comparisons with the corresponding exact nonlinear solutions display remarkable accuracy and range of validity, in direct correspondence with previous results for single-parameter perturbations.

  15. Assessing the physical loading of wearable computers.

    PubMed

    Knight, James F; Baber, Chris

    2007-03-01

Wearable computers enable workers to interact with computer equipment in situations where previously they could not. Attaching a computer to the body, though, has an unknown physical effect. This paper reports a methodology for addressing this, by assessing postural effects and the effect of added weight. Using the example of arm-mounted computers (AMCs), the paper shows that adopting a posture to interact with an AMC generates fatiguing levels of stress, and that a load of 0.54 kg results in an increased level of stress and an increased rate of fatigue. The paper shows that, due to poor postures adopted when wearing and interacting with computers and the weight of the device attached to the body, one possible outcome of prolonged exposure is the development of musculoskeletal disorders.

  16. Impact of computer use on children's vision.

    PubMed

    Kozeis, N

    2009-10-01

    Today, millions of children use computers on a daily basis. Extensive viewing of the computer screen can lead to eye discomfort, fatigue, blurred vision and headaches, dry eyes and other symptoms of eyestrain. These symptoms may be caused by poor lighting, glare, an improper work station set-up, vision problems of which the person was not previously aware, or a combination of these factors. Children can experience many of the same symptoms related to computer use as adults. However, some unique aspects of how children use computers may make them more susceptible than adults to the development of these problems. In this study, the most common eye symptoms related to computer use in childhood, the possible causes and ways to avoid them are reviewed.

  17. Efficient storage, computation, and exposure of computer-generated holograms by electron-beam lithography.

    PubMed

    Newman, D M; Hawley, R W; Goeckel, D L; Crawford, R D; Abraham, S; Gallagher, N C

    1993-05-10

An efficient storage format was developed for computer-generated holograms for use in electron-beam lithography. This method employs run-length encoding and Lempel-Ziv-Welch compression and succeeds in exposing holograms that were previously infeasible owing to the hologram's tremendous pattern-data file size. These holograms also require significant computation; thus the algorithm was implemented on a parallel computer, which improved performance by 2 orders of magnitude. The decompression algorithm was integrated into the Cambridge electron-beam machine's front-end processor. Although this provides much-needed capability, some hardware enhancements will be required in the future to overcome inadequacies in the current front-end processor that result in a lengthy exposure time.
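    A minimal sketch of the run-length-encoding stage mentioned above is given below; the Lempel-Ziv-Welch stage and all electron-beam specifics are omitted, and the example pattern row is an invented placeholder.

```python
# Run-length encoding: collapse consecutive identical bytes into (value, count) pairs.
def run_length_encode(data: bytes) -> list[tuple[int, int]]:
    """Return a list of (value, count) pairs describing consecutive runs."""
    runs = []
    for byte in data:
        if runs and runs[-1][0] == byte:
            runs[-1] = (byte, runs[-1][1] + 1)
        else:
            runs.append((byte, 1))
    return runs

# A hologram pattern row is typically dominated by long runs of identical values,
# which is exactly where run-length encoding pays off.
row = bytes([0] * 500 + [255] * 20 + [0] * 480)
print(run_length_encode(row))   # [(0, 500), (255, 20), (0, 480)]
```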

  18. Reliable Acquisition of RAM Dumps from Intel-Based Apple Mac Computers over FireWire

    NASA Astrophysics Data System (ADS)

    Gladyshev, Pavel; Almansoori, Afrah

    RAM content acquisition is an important step in live forensic analysis of computer systems. FireWire offers an attractive way to acquire RAM content of Apple Mac computers equipped with a FireWire connection. However, the existing techniques for doing so require substantial knowledge of the target computer configuration and cannot be used reliably on a previously unknown computer in a crime scene. This paper proposes a novel method for acquiring RAM content of Apple Mac computers over FireWire, which automatically discovers necessary information about the target computer and can be used in the crime scene setting. As an application of the developed method, the techniques for recovery of AOL Instant Messenger (AIM) conversation fragments from RAM dumps are also discussed in this paper.

  19. Teaching the Grammar and Syntax of Business German: Problems and Practical Suggestions. Innovative Strategies and Exercises for Computer-Aided Instruction: Word Order Variation.

    ERIC Educational Resources Information Center

    Paulsell, Patricia R.

    A computer program is described that is a substack of the "Business German" HyperCard program previously developed by Paulsell and designed as a tutorial to be used with materials for a business German course on the third year college level. The program consists of six stacks, a central one providing graphics-based information on Germany…

  20. The symbolic computation of series solutions to ordinary differential equations using trees (extended abstract)

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1991-01-01

Algorithms previously developed by the author give formulas which can be used for the efficient symbolic computation of series expansions to solutions of nonlinear systems of ordinary differential equations. As a by-product of this analysis, formulas are derived which relate trees to the coefficients of the series expansions, similar to the work of Leroux and Viennot, and of Lamnabhi, Leroux and Viennot.

  1. Fast sweeping methods for hyperbolic systems of conservation laws at steady state II

    NASA Astrophysics Data System (ADS)

    Engquist, Björn; Froese, Brittany D.; Tsai, Yen-Hsi Richard

    2015-04-01

    The idea of using fast sweeping methods for solving stationary systems of conservation laws has previously been proposed for efficiently computing solutions with sharp shocks. We further develop these methods to allow for a more challenging class of problems including problems with sonic points, shocks originating in the interior of the domain, rarefaction waves, and two-dimensional systems. We show that fast sweeping methods can produce higher-order accuracy. Computational results validate the claims of accuracy, sharp shock curves, and optimal computational efficiency.
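    To convey the sweeping idea on the simplest possible model problem, the sketch below applies alternating Gauss-Seidel sweeps to the one-dimensional eikonal equation; the paper itself treats steady hyperbolic systems of conservation laws, which is considerably more involved than this toy example.

```python
# Fast-sweeping sketch for the 1D eikonal equation |u'(x)| = 1 with u = 0 at both
# ends, whose solution is the distance to the nearest boundary point.
import numpy as np

n, h = 101, 0.01
u = np.full(n, np.inf)
u[0] = u[-1] = 0.0                      # boundary data

for _ in range(2):                      # a couple of passes suffice in 1D
    for i in range(1, n - 1):           # sweep left -> right
        u[i] = min(u[i], min(u[i - 1], u[i + 1]) + h)
    for i in range(n - 2, 0, -1):       # sweep right -> left
        u[i] = min(u[i], min(u[i - 1], u[i + 1]) + h)

print(u.max())                          # ~0.5: distance to the nearest boundary
```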

  2. Cortico-striatal language pathways dynamically adjust for syntactic complexity: A computational study.

    PubMed

    Szalisznyó, Krisztina; Silverstein, David; Teichmann, Marc; Duffau, Hugues; Smits, Anja

    2017-01-01

A growing body of literature supports a key role of fronto-striatal circuits in language perception. It is now known that the striatum plays a role in engaging attentional resources and linguistic rule computation while also serving phonological short-term memory capabilities. The ventral semantic and the dorsal phonological stream dichotomy assumed for spoken language processing also seems to play a role in cortico-striatal perception. Based on recent studies that correlate deep Broca-striatal pathways with complex syntax performance, we used a previously developed computational model of frontal-striatal syntax circuits and hypothesized that different parallel language pathways may contribute separately to canonical and non-canonical sentence comprehension. We modified and further analyzed a thematic role assignment task and the corresponding reservoir computing model of language circuits, as previously developed by Dominey and coworkers. We examined the model's performance under various parameter regimes, by influencing how fast the presented language input decays and altering the temporal dynamics of activated word representations. This enabled us to quantify canonical and non-canonical sentence comprehension abilities. The modeling results suggest that separate cortico-cortical and cortico-striatal circuits may be recruited differently for processing syntactically more difficult and less complicated sentences. Alternatively, a single circuit would need to dynamically and adaptively adjust to syntactic complexity. Copyright © 2016. Published by Elsevier Inc.
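    The sketch below is a generic echo-state-network (reservoir computing) loop, included only to illustrate the class of model referred to above; it assumes nothing about the specific Dominey-style cortico-striatal architecture, and the leak-rate parameter stands in loosely for "how fast the presented language input decays".

```python
# Generic echo state network: a fixed random recurrent "reservoir" is driven by the
# input sequence, and only a linear readout is trained (here by ridge regression).
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res, n_out, T = 3, 100, 2, 500

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))     # scale spectral radius below 1

u = rng.standard_normal((T, n_in))            # stand-in for word-by-word input codes
y_target = rng.standard_normal((T, n_out))    # stand-in for thematic-role targets

x = np.zeros(n_res)
states = np.zeros((T, n_res))
leak = 0.3                                    # leak rate: controls how fast input decays
for t in range(T):
    x = (1 - leak) * x + leak * np.tanh(W_in @ u[t] + W @ x)
    states[t] = x

# Ridge-regression readout (the only trained part of the model).
W_out = np.linalg.solve(states.T @ states + 1e-3 * np.eye(n_res), states.T @ y_target)
print("readout weights:", W_out.shape)
```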

  3. MoPCoM Methodology: Focus on Models of Computation

    NASA Astrophysics Data System (ADS)

    Koudri, Ali; Champeau, Joël; Le Lann, Jean-Christophe; Leilde, Vincent

    Today, developments of Real Time Embedded Systems have to face new challenges. On the one hand, Time-To-Market constraints require a reliable development process allowing quick design space exploration. On the other hand, rapidly developing technology, as stated by Moore's law, requires techniques to handle the resulting productivity gap. In a previous paper, we have presented our Model Based Engineering methodology addressing those issues. In this paper, we make a focus on Models of Computation design and analysis. We illustrate our approach on a Cognitive Radio System development implemented on an FPGA. This work is part of the MoPCoM research project gathering academic and industrial organizations (http://www.mopcom.fr).

  4. Drug Target Optimization in Chronic Myeloid Leukemia Using Innovative Computational Platform

    PubMed Central

    Chuang, Ryan; Hall, Benjamin A.; Benque, David; Cook, Byron; Ishtiaq, Samin; Piterman, Nir; Taylor, Alex; Vardi, Moshe; Koschmieder, Steffen; Gottgens, Berthold; Fisher, Jasmin

    2015-01-01

    Chronic Myeloid Leukemia (CML) represents a paradigm for the wider cancer field. Despite the fact that tyrosine kinase inhibitors have established targeted molecular therapy in CML, patients often face the risk of developing drug resistance, caused by mutations and/or activation of alternative cellular pathways. To optimize drug development, one needs to systematically test all possible combinations of drug targets within the genetic network that regulates the disease. The BioModelAnalyzer (BMA) is a user-friendly computational tool that allows us to do exactly that. We used BMA to build a CML network-model composed of 54 nodes linked by 104 interactions that encapsulates experimental data collected from 160 publications. While previous studies were limited by their focus on a single pathway or cellular process, our executable model allowed us to probe dynamic interactions between multiple pathways and cellular outcomes, suggest new combinatorial therapeutic targets, and highlight previously unexplored sensitivities to Interleukin-3. PMID:25644994

  5. Drug Target Optimization in Chronic Myeloid Leukemia Using Innovative Computational Platform

    NASA Astrophysics Data System (ADS)

    Chuang, Ryan; Hall, Benjamin A.; Benque, David; Cook, Byron; Ishtiaq, Samin; Piterman, Nir; Taylor, Alex; Vardi, Moshe; Koschmieder, Steffen; Gottgens, Berthold; Fisher, Jasmin

    2015-02-01

    Chronic Myeloid Leukemia (CML) represents a paradigm for the wider cancer field. Despite the fact that tyrosine kinase inhibitors have established targeted molecular therapy in CML, patients often face the risk of developing drug resistance, caused by mutations and/or activation of alternative cellular pathways. To optimize drug development, one needs to systematically test all possible combinations of drug targets within the genetic network that regulates the disease. The BioModelAnalyzer (BMA) is a user-friendly computational tool that allows us to do exactly that. We used BMA to build a CML network-model composed of 54 nodes linked by 104 interactions that encapsulates experimental data collected from 160 publications. While previous studies were limited by their focus on a single pathway or cellular process, our executable model allowed us to probe dynamic interactions between multiple pathways and cellular outcomes, suggest new combinatorial therapeutic targets, and highlight previously unexplored sensitivities to Interleukin-3.

  6. Interactive multi-mode blade impact analysis

    NASA Technical Reports Server (NTRS)

    Alexander, A.; Cornell, R. W.

    1978-01-01

    The theoretical methodology used in developing an analysis for the response of turbine engine fan blades subjected to soft-body (bird) impacts is reported, and the computer program developed using this methodology as its basis is described. This computer program is an outgrowth of two programs that were previously developed for the purpose of studying problems of a similar nature (a 3-mode beam impact analysis and a multi-mode beam impact analysis). The present program utilizes an improved missile model that is interactively coupled with blade motion which is more consistent with actual observations. It takes into account local deformation at the impact area, blade camber effects, and the spreading of the impacted missile mass on the blade surface. In addition, it accommodates plate-type mode shapes. The analysis capability in this computer program represents a significant improvement in the development of the methodology for evaluating potential fan blade materials and designs with regard to foreign object impact resistance.

  7. Enhancements to the Image Analysis Tool for Core Punch Experiments and Simulations (vs. 2014)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogden, John Edward; Unal, Cetin

A previous paper (Hogden & Unal, 2012, Image Analysis Tool for Core Punch Experiments and Simulations) described an image processing computer program developed at Los Alamos National Laboratory. This program has proven useful, so development has been continued. In this paper we describe enhancements to the program as of 2014.

  8. Generalized Maintenance Trainer Simulator: Development of Hardware and Software. Final Report.

    ERIC Educational Resources Information Center

    Towne, Douglas M.; Munro, Allen

    A general purpose maintenance trainer, which has the potential to simulate a wide variety of electronic equipments without hardware changes or new computer programs, has been developed and field tested by the Navy. Based on a previous laboratory model, the Generalized Maintenance Trainer Simulator (GMTS) is a relatively low cost trainer that…

  9. The magic words: Using computers to uncover mental associations for use in magic trick design

    PubMed Central

    2017-01-01

    The use of computational systems to aid in the design of magic tricks has been previously explored. Here further steps are taken in this direction, introducing the use of computer technology as a natural language data sourcing and processing tool for magic trick design purposes. Crowd sourcing of psychological concepts is investigated; further, the role of human associative memory and its exploitation in magical effects is explored. A new trick is developed and evaluated: a physical card trick partially designed by a computational system configured to search for and explore conceptual spaces readily understood by spectators. PMID:28792941

  10. Reversible Data Hiding Based on DNA Computing

    PubMed Central

    Xie, Yingjie

    2017-01-01

Biocomputing, especially DNA computing, has undergone great development and is widely used in information security. In this paper, a novel algorithm for reversible data hiding based on DNA computing is proposed. Inspired by histogram modification, a classical algorithm for reversible data hiding, we combine it with DNA computing to realize the algorithm with biological technology. Compared with previous results, our experimental results show a significantly improved ER (embedding rate). Furthermore, the PSNR (peak signal-to-noise ratio) of some test images is also improved. Experimental results show that the method is suitable for protecting the copyright of cover images in DNA-based information security. PMID:28280504
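    The classical histogram-modification embedding that the DNA-based algorithm builds on can be sketched in a few lines; the version below handles only the basic peak/zero-bin shift and embedding, omits the bookkeeping needed for full reversibility when no empty bin exists, and does not include the DNA-computing encoding layer.

```python
# Histogram-shift embedding sketch: shift the bins between the peak and an empty bin
# to free the level next to the peak, then hide one bit per peak-valued pixel.
import numpy as np

def embed(image: np.ndarray, bits):
    hist = np.bincount(image.ravel(), minlength=256)
    peak = int(hist.argmax())                            # most frequent gray level
    zero = int(hist[peak + 1:].argmin()) + peak + 1      # emptiest level above the peak
    out = image.astype(np.int32)
    out[(out > peak) & (out < zero)] += 1                # free the bin next to the peak
    flat = out.ravel()
    carriers = np.flatnonzero(flat == peak)[:len(bits)]
    flat[carriers] += np.asarray(bits, dtype=np.int32)   # embed: 1 -> peak+1, 0 -> peak
    return out.astype(np.uint8), peak, zero

img = np.random.randint(0, 250, (64, 64), dtype=np.uint8)
stego, peak, zero = embed(img, [1, 0, 1, 1, 0])
print(peak, zero, int(np.abs(stego.astype(int) - img.astype(int)).max()))   # max change is 1
```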

  11. High-efficiency multiphoton boson sampling

    NASA Astrophysics Data System (ADS)

    Wang, Hui; He, Yu; Li, Yu-Huai; Su, Zu-En; Li, Bo; Huang, He-Liang; Ding, Xing; Chen, Ming-Cheng; Liu, Chang; Qin, Jian; Li, Jin-Peng; He, Yu-Ming; Schneider, Christian; Kamp, Martin; Peng, Cheng-Zhi; Höfling, Sven; Lu, Chao-Yang; Pan, Jian-Wei

    2017-06-01

    Boson sampling is considered as a strong candidate to demonstrate 'quantum computational supremacy' over classical computers. However, previous proof-of-principle experiments suffered from small photon number and low sampling rates owing to the inefficiencies of the single-photon sources and multiport optical interferometers. Here, we develop two central components for high-performance boson sampling: robust multiphoton interferometers with 99% transmission rate and actively demultiplexed single-photon sources based on a quantum dot-micropillar with simultaneously high efficiency, purity and indistinguishability. We implement and validate three-, four- and five-photon boson sampling, and achieve sampling rates of 4.96 kHz, 151 Hz and 4 Hz, respectively, which are over 24,000 times faster than previous experiments. Our architecture can be scaled up for a larger number of photons and with higher sampling rates to compete with classical computers, and might provide experimental evidence against the extended Church-Turing thesis.

  12. Quasi-static earthquake cycle simulation based on nonlinear viscoelastic finite element analyses

    NASA Astrophysics Data System (ADS)

    Agata, R.; Ichimura, T.; Hyodo, M.; Barbot, S.; Hori, T.

    2017-12-01

    To explain earthquake generation processes, simulation methods of earthquake cycles have been studied. For such simulations, the combination of the rate- and state-dependent friction law at the fault plane and the boundary integral method based on Green's function in an elastic half space is widely used (e.g. Hori 2009; Barbot et al. 2012). In this approach, stress change around the fault plane due to crustal deformation can be computed analytically, while the effects of complex physics such as mantle rheology and gravity are generally not taken into account. To consider such effects, we seek to develop an earthquake cycle simulation combining crustal deformation computation based on the finite element (FE) method with the rate- and state-dependent friction law. Since the drawback of this approach is the computational cost associated with obtaining numerical solutions, we adopt a recently developed fast and scalable FE solver (Ichimura et al. 2016), which assumes use of supercomputers, to solve the problem in a realistic time. As in the previous approach, we solve the governing equations consisting of the rate- and state-dependent friction law. In solving the equations, we compute stress changes along the fault plane due to crustal deformation using FE simulation, instead of computing them by superimposing slip response function as in the previous approach. In stress change computation, we take into account nonlinear viscoelastic deformation in the asthenosphere. In the presentation, we will show simulation results in a normative three-dimensional problem, where a circular-shaped velocity-weakening area is set in a square-shaped fault plane. The results with and without nonlinear viscosity in the asthenosphere will be compared. We also plan to apply the developed code to simulate the post-earthquake deformation of a megathrust earthquake, such as the 2011 Tohoku earthquake. Acknowledgment: The results were obtained using the K computer at the RIKEN (Proposal number hp160221).
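    For reference, a standard (Dieterich-Ruina) form of the rate- and state-dependent friction law with the aging law for the state variable is shown below; the exact variant and parameters used in the study may differ. Here mu is the friction coefficient, V the slip rate, V_0 a reference slip rate, theta the state variable, D_c the characteristic slip distance, and a and b constitutive parameters.

```latex
% Rate- and state-dependent friction (Dieterich--Ruina) with the aging law.
\begin{align}
  \mu &= \mu_0 + a \ln\frac{V}{V_0} + b \ln\frac{V_0\,\theta}{D_c}, \\
  \frac{d\theta}{dt} &= 1 - \frac{V\theta}{D_c}.
\end{align}
```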

  13. Code Modernization of VPIC

    NASA Astrophysics Data System (ADS)

    Bird, Robert; Nystrom, David; Albright, Brian

    2017-10-01

The ability of scientific simulations to effectively deliver performant computation is increasingly being challenged by successive generations of high-performance computing architectures. Code development to support efficient computation on these modern architectures is both expensive and highly complex; if it is approached without due care, it may also not be directly transferable between subsequent hardware generations. Previous works have discussed techniques to support the process of adapting a legacy code for modern hardware generations, but despite breakthroughs in the areas of mini-app development, portable performance, and cache-oblivious algorithms the problem still remains largely unsolved. In this work we demonstrate how a focus on platform-agnostic modern code development can be applied to Particle-in-Cell (PIC) simulations to facilitate effective scientific delivery. This work builds directly on our previous work optimizing VPIC, in which we replaced intrinsics-based vectorization with compiler-generated auto-vectorization to improve the performance and portability of VPIC. In this work we present the use of a specialized SIMD queue for processing some particle operations, and also preview a GPU-capable OpenMP variant of VPIC. Finally, we include lessons learned. Work performed under the auspices of the U.S. Dept. of Energy by Los Alamos National Security, LLC, Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.

  14. Computer-aided design of liposomal drugs: In silico prediction and experimental validation of drug candidates for liposomal remote loading.

    PubMed

    Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram

    2014-01-10

    Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs' structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., J. Control. Release 160 (2012) 147-157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-Nearest Neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used by us in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. © 2013.

  15. Computer-aided design of liposomal drugs: in silico prediction and experimental validation of drug candidates for liposomal remote loading

    PubMed Central

    Cern, Ahuva; Barenholz, Yechezkel; Tropsha, Alexander; Goldblum, Amiram

    2014-01-01

Previously we have developed and statistically validated Quantitative Structure Property Relationship (QSPR) models that correlate drugs' structural, physical and chemical properties as well as experimental conditions with the relative efficiency of remote loading of drugs into liposomes (Cern et al., J. Control. Release 160 (2012) 147-157). Herein, these models have been used to virtually screen a large drug database to identify novel candidate molecules for liposomal drug delivery. Computational hits were considered for experimental validation based on their predicted remote loading efficiency as well as additional considerations such as availability, recommended dose and relevance to the disease. Three compounds were selected for experimental testing which were confirmed to be correctly classified by our previously reported QSPR models developed with Iterative Stochastic Elimination (ISE) and k-nearest neighbors (kNN) approaches. In addition, 10 new molecules with known liposome remote loading efficiency that were not used in QSPR model development were identified in the published literature and employed as an additional model validation set. The external accuracy of the models was found to be as high as 82% or 92%, depending on the model. This study presents the first successful application of QSPR models for the computer-model-driven design of liposomal drugs. PMID:24184343

  16. Orientation-modulated attention effect on visual evoked potential: Application for PIN system using brain-computer interface.

    PubMed

    Wilaiprasitporn, Theerawit; Yagi, Tohru

    2015-01-01

    This research demonstrates the orientation-modulated attention effect on visual evoked potential. We combined this finding with our previous findings about the motion-modulated attention effect and used the result to develop novel visual stimuli for a personal identification number (PIN) application based on a brain-computer interface (BCI) framework. An electroencephalography amplifier with a single electrode channel was sufficient for our application. A computationally inexpensive algorithm and small datasets were used in processing. Seven healthy volunteers participated in experiments to measure offline performance. Mean accuracy was 83.3% at 13.9 bits/min. Encouraged by these results, we plan to continue developing the BCI-based personal identification application toward real-time systems.

  17. Locomotive crashworthiness report : volume 4 : additional freight locomotive calculations

    DOT National Transportation Integrated Search

    1995-07-01

    Previously developed computer models (see volume 1) are used to carry out additional calculations for evaluation of road freight locomotive crashworthiness. The effect of fewer locomotives (as would be expected after transition from DC motor to highe...

  18. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview is provided of several NASA Lewis research efforts that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  19. Global Software Development with Cloud Platforms

    NASA Astrophysics Data System (ADS)

    Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya

Offshore and outsourced distributed software development models and processes are facing challenges, previously unknown, with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design and our first implementation results for three cloud forms - a compute cloud, a storage cloud and a cloud-based software service - in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers storage (block or file-based) services with an on-line virtual storage service, whereas the on-line virtual labs represent a useful cloud service. We note some of the use cases for clouds in GSD, the lessons learned with our prototypes, and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means of supporting an ecosystem of clients, developers and other key stakeholders.

  20. Computational techniques for flows with finite-rate condensation

    NASA Technical Reports Server (NTRS)

    Candler, Graham V.

    1993-01-01

    A computational method to simulate the inviscid two-dimensional flow of a two-phase fluid was developed. This computational technique treats the gas phase and each of a prescribed number of particle sizes as separate fluids which are allowed to interact with one another. Thus, each particle-size class is allowed to move through the fluid at its own velocity at each point in the flow field. Mass, momentum, and energy are exchanged between each particle class and the gas phase. It is assumed that the particles do not collide with one another, so that there is no inter-particle exchange of momentum and energy. However, the particles are allowed to grow, and therefore, they may change from one size class to another. Appropriate rates of mass, momentum, and energy exchange between the gas and particle phases and between the different particle classes were developed. A numerical method was developed for use with this equation set. Several test cases were computed and show qualitative agreement with previous calculations.

  1. A dynamical-systems approach for computing ice-affected streamflow

    USGS Publications Warehouse

    Holtschlag, David J.

    1996-01-01

    A dynamical-systems approach was developed and evaluated for computing ice-affected streamflow. The approach provides for dynamic simulation and parameter estimation of site-specific equations relating ice effects to routinely measured environmental variables. Comparison indicates that results from the dynamical-systems approach ranked higher than results from 11 analytical methods previously investigated on the basis of accuracy and feasibility criteria. Additional research will likely lead to further improvements in the approach.

  2. Novel opportunities for computational biology and sociology in drug discovery☆

    PubMed Central

    Yao, Lixia; Evans, James A.; Rzhetsky, Andrey

    2013-01-01

    Current drug discovery is impossible without sophisticated modeling and computation. In this review we outline previous advances in computational biology and, by tracing the steps involved in pharmaceutical development, explore a range of novel, high-value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy–industry links for scientific and human benefit. Attention to these opportunities could promise punctuated advance and will complement the well-established computational work on which drug discovery currently relies. PMID:20349528

  3. Computational open-channel hydraulics for movable-bed problems

    USGS Publications Warehouse

    Lai, Chintu; ,

    1990-01-01

    As a major branch of computational hydraulics, notable advances have been made in numerical modeling of unsteady open-channel flow since the beginning of the computer age. According to the broader definition and scope of 'computational hydraulics,' the basic concepts and technology of modeling unsteady open-channel flow have been systematically studied previously. As a natural extension, computational open-channel hydraulics for movable-bed problems are addressed in this paper. The introduction of the multimode method of characteristics (MMOC) has made the modeling of this class of unsteady flows both practical and effective. New modeling techniques are developed, thereby shedding light on several aspects of computational hydraulics. Some special features of movable-bed channel-flow simulation are discussed here in the same order as given by the author in the fixed-bed case.

  4. Stability of colloidal gold and determination of the Hamaker constant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demirci, S.; Enuestuen, B.V.; Turkevich, J.

    1978-12-14

    Previous computation of stability factors of colloidal gold from coagulation data was found to be in systematic error due to an underestimation of the particle concentration by electron microscopy. A new experimental technique was developed for determination of this concentration. Stability factors were recalculated from the previous data using the correct concentration. While most of the previously reported conclusions remain unchanged, the absolute rate of fast coagulation is found to agree with that predicted by the theory. A value of the Hamaker constant was determined from the corrected data.

  5. Design analysis and computer-aided performance evaluation of shuttle orbiter electrical power system. Volume 1: Summary

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Studies were conducted to develop appropriate space shuttle electrical power distribution and control (EPDC) subsystem simulation models and to apply the computer simulations to systems analysis of the EPDC. A previously developed software program (SYSTID) was adapted for this purpose. The following objectives were attained: (1) significant enhancement of the SYSTID time domain simulation software, (2) generation of functionally useful shuttle EPDC element models, and (3) illustrative simulation results in the analysis of EPDC performance, under the conditions of fault, current pulse injection due to lightning, and circuit protection sizing and reaction times.

  6. A computing method for sound propagation through a nonuniform jet stream

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Liu, C. H.

    1974-01-01

    The classical formulation of sound propagation through a jet flow was found to be inadequate for computer solutions. Previous investigations selected the phase and amplitude of the acoustic pressure as dependent variables requiring the solution of a system of nonlinear algebraic equations. The nonlinearities complicated both the analysis and the computation. A reformulation of the convective wave equation in terms of a new set of dependent variables is developed with a special emphasis on its suitability for numerical solutions on fast computers. The technique is very attractive because the resulting equations are linear in nonwaving variables. The computer solution to such a linear system of algebraic equations may be obtained by well-defined and direct means which are conservative of computer time and storage space. Typical examples are illustrated and computational results are compared with available numerical and experimental data.

  7. A new method for the prediction of combustion instability

    NASA Astrophysics Data System (ADS)

    Flanagan, Steven Meville

This dissertation presents a new approach to the prediction of combustion instability in solid rocket motors. Previous attempts at developing computational tools to solve this problem have been largely unsuccessful, showing very poor agreement with experimental results and having little or no predictive capability. This is due primarily to deficiencies in the linear stability theory upon which these efforts have been based. Recent advances in linear instability theory by Flandro have demonstrated the importance of including unsteady rotational effects, previously considered negligible. Previous versions of the theory also neglected corrections to the unsteady flow field of the first order in the mean flow Mach number. This research explores the stability implications of extending the solution to include these corrections. Also, the corrected linear stability theory based upon a rotational unsteady flow field extended to first order in mean flow Mach number has been implemented in two computer programs developed for the Macintosh platform. A quasi one-dimensional version of the program has been developed which is based upon an approximate solution to the cavity acoustics problem. The three-dimensional program applies Green's Function Discretization (GFD) to the solution for the acoustic mode shapes and frequency. GFD is a recently developed numerical method for finding fully three-dimensional solutions for this class of problems. The analysis of complex motor geometries, previously a tedious and time-consuming task, has also been greatly simplified through the development of a drawing package designed specifically to facilitate the specification of typical motor geometries. The combination of the drawing package, improved acoustic solutions, and new analysis results in a tool which is capable of producing more accurate and meaningful predictions than have been possible in the past.

  8. Computational techniques for solar wind flows past terrestrial planets: Theory and computer programs

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Chaussee, D. S.; Trudinger, B. C.; Spreiter, J. R.

    1977-01-01

    The interaction of the solar wind with terrestrial planets can be predicted using a computer program based on a single fluid, steady, dissipationless, magnetohydrodynamic model to calculate the axisymmetric, supersonic, super-Alfvenic solar wind flow past both magnetic and nonmagnetic planets. The actual calculations are implemented by an assemblage of computer codes organized into one program. These include finite difference codes which determine the gas-dynamic solution, together with a variety of special purpose output codes for determining and automatically plotting both flow field and magnetic field results. Comparisons are made with previous results, and results are presented for a number of solar wind flows. The computational programs developed are documented and are presented in a general user's manual which is included.

  9. Assessment in health care education - modelling and implementation of a computer supported scoring process.

    PubMed

    Alfredsson, Jayne; Plichart, Patrick; Zary, Nabil

    2012-01-01

Research on computer supported scoring of assessments in health care education has mainly focused on automated scoring. Little attention has been given to how informatics can support the currently predominant human-based grading approach. This paper reports steps taken to develop a model for a computer supported scoring process that focuses on optimizing a task that was previously undertaken without computer support. The model was also implemented in the open source assessment platform TAO in order to study its benefits. The ability to score test takers anonymously, analytics on grader reliability, and a more time-efficient process are examples of the observed benefits. Computer supported scoring will increase the quality of assessment results.

  10. Theoretical studies of Resonance Enhanced Stimulated Raman Scattering (RESRS) of frequency doubled Alexandrite laser wavelength in cesium vapor

    NASA Technical Reports Server (NTRS)

    Lawandy, Nabil M.

    1987-01-01

The third phase of research will focus on the propagation and energy extraction of the pump and SERS beams in a variety of configurations, including oscillator structures. In order to address these questions, a numerical code capable of allowing for saturation and full transverse beam evolution is required. The method proposed is based on a discretized propagation and energy extraction model which uses a Kirchhoff integral propagator coupled to the three-level Raman model already developed. The model will have the resolution required by diffraction limits and will use the previous density matrix results in the adiabatic following limit. Owing to its large computational requirements, such a code must be implemented on a vector array processor. One code on the Cyber is being tested by using previously understood two-level laser models as guidelines for interpreting the results. Two tests were implemented: the evolution of modes in a passive resonator and the evolution of a stable state of the adiabatically eliminated laser equations. These results show mode shapes and diffraction losses for the first case and relaxation oscillations for the second. Finally, in order to clarify the computing methodology used to exploit the Cyber's computational speed, the time required to run both of the computations previously mentioned on the Cyber and on the VAX 730 must be measured. Also included is a short description of the current laser model (CAVITY.FOR) and a flow chart of the test computations.

  11. Incorporating Prototyping and Iteration into Intervention Development: A Case Study of a Dining Hall-Based Intervention

    ERIC Educational Resources Information Center

    McClain, Arianna D.; Hekler, Eric B.; Gardner, Christopher D.

    2013-01-01

    Background: Previous research from the fields of computer science and engineering highlight the importance of an iterative design process (IDP) to create more creative and effective solutions. Objective: This study describes IDP as a new method for developing health behavior interventions and evaluates the effectiveness of a dining hall--based…

  12. The Impact of Recasts on the Development of Primary Stress in a Synchronous Computer-Mediated Environment

    ERIC Educational Resources Information Center

    Parlak, Özgür; Ziegler, Nicole

    2017-01-01

    Although previous research has demonstrated the efficacy of recasts on second language (L2) morphology and lexis (e.g., Li, 2010; Mackey & Goo, 2007), few studies have examined their effect on learners' phonological development (although see Saito, 2015; Saito & Lyster, 2012). The current study investigates the impact of recasts on the…

  13. A method of computer modelling the lithium-ion batteries aging process based on the experimental characteristics

    NASA Astrophysics Data System (ADS)

    Czerepicki, A.; Koniak, M.

    2017-06-01

The paper presents a method of modelling the aging processes of lithium-ion batteries, its implementation as a computer application, and results for battery state estimation. The authors use a previously developed behavioural battery model, which was built from battery operating characteristics obtained by experiment. This model was implemented in the form of a computer program using a database to store the battery characteristics. The battery aging process is a new, extended functionality of the model. The computer simulation algorithm uses real measurements of battery capacity as a function of the number of battery charge and discharge cycles. The simulation takes into account incomplete charge or discharge cycles, which are characteristic of transport powered by electricity. The developed model was used to simulate battery state estimation for different load profiles, obtained by measuring the movement of selected means of transport.
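    One way to realize the described aging functionality is to store the measured capacity-versus-cycles characteristic and accumulate equivalent full cycles from partial cycles; the sketch below follows that idea with invented placeholder measurements (not data from the paper).

```python
# Capacity-fade sketch: interpolate a measured capacity-vs-cycles characteristic at the
# number of equivalent full cycles accumulated from partial (transport-like) cycles.
import numpy as np

cycles_measured = np.array([0, 200, 400, 600, 800, 1000])            # full cycles
capacity_measured = np.array([1.00, 0.97, 0.94, 0.90, 0.86, 0.80])   # relative capacity

def capacity_after(partial_cycle_depths):
    """Each entry is the depth of one partial cycle (e.g. 0.3 = 30% of capacity)."""
    equivalent_full_cycles = float(np.sum(partial_cycle_depths))
    return float(np.interp(equivalent_full_cycles, cycles_measured, capacity_measured))

# A transport-like load profile: many shallow cycles rather than full ones.
profile = np.full(1500, 0.25)          # 1500 quarter-depth cycles = 375 full cycles
print(capacity_after(profile))         # interpolated relative capacity (~0.944)
```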

  14. Development and application of GASP 2.0

    NASA Technical Reports Server (NTRS)

    Mcgrory, W. D.; Huebner, L. D.; Slack, D. C.; Walters, R. W.

    1992-01-01

    GASP 2.0 represents a major new release of the computational fluid dynamics code in wide use by the aerospace community. The authors have spent the last two years analyzing the strengths and weaknesses of the previous version of the finite-rate chemistry, Navier-Stokes solution algorithm. What has resulted is a completely redesigned computer code that offers two to four times the performance of previous versions while requiring as little as one quarter of the memory. In addition to the improvements in efficiency over the original code, Version 2.0 contains many new features. A brief discussion of the improvements made to GASP, and an application using GASP 2.0 that demonstrates some of the new features, are presented.

  15. Application of a Scalable, Parallel, Unstructured-Grid-Based Navier-Stokes Solver

    NASA Technical Reports Server (NTRS)

    Parikh, Paresh

    2001-01-01

    A parallel version of an unstructured-grid based Navier-Stokes solver, USM3Dns, previously developed for efficient operation on a variety of parallel computers, has been enhanced to incorporate upgrades made to the serial version. The resultant parallel code has been extensively tested on a variety of problems of aerospace interest and on two sets of parallel computers to understand and document its characteristics. An innovative grid renumbering construct and use of non-blocking communication are shown to produce superlinear computing performance. Preliminary results from parallelization of a recently introduced "porous surface" boundary condition are also presented.

  16. Investigations of flowfields found in typical combustor geometries

    NASA Technical Reports Server (NTRS)

    Lilley, D. G.

    1982-01-01

    Measurements and computations are being applied to an axisymmetric swirling flow, emerging from swirl vanes at angle phi, entering a large chamber test section via a sudden expansion of various side-wall angles alpha. New features are: the turbulence measurements are being performed on swirling as well as nonswirling flow; and all measurements and computations are also being performed on a confined jet flowfield with realistic downstream blockage. Recent activity falls into three categories: (1) Time-mean flowfield characterization by five-hole pitot probe measurements and by flow visualization; (2) Turbulence measurements by a variety of single- and multi-wire hot-wire probe techniques; and (3) Flowfield computations using the computer code developed during the previous year's research program.

  17. The study on the parallel processing based time series correlation analysis of RBC membrane flickering in quantitative phase imaging

    NASA Astrophysics Data System (ADS)

    Lee, Minsuk; Won, Youngjae; Park, Byungjun; Lee, Seungrag

    2017-02-01

    Not only static but also dynamic characteristics of the red blood cell (RBC) contain useful information for blood diagnosis. Quantitative phase imaging (QPI) can capture sample images with subnanometer-scale depth resolution and millisecond-scale temporal resolution. Many studies have used QPI for RBC diagnosis, and recent work has applied parallel computing algorithms to reduce the processing time of RBC information extraction with QPI; however, previous studies have focused on static parameters, such as cell morphology, or simple dynamic parameters, such as the root mean square (RMS) of the membrane fluctuations. Previously, we presented a practical blood test method using time series correlation analysis of RBC membrane flickering with QPI. However, that method's long computation time has limited its clinical application. In this study, we present an accelerated time series correlation analysis of RBC membrane flickering using a parallel computing algorithm. The method yielded fractal scaling exponents for the surrounding medium and normal RBCs that are consistent with our previous research.
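
    A schematic, assumption-laden sketch of the parallel per-pixel correlation step follows (synthetic phase images, and Python's multiprocessing standing in for whatever parallel framework the authors actually used):

        import numpy as np
        from multiprocessing import Pool

        def autocorrelation(series):
            """Normalized autocorrelation of one pixel's fluctuation time series."""
            x = series - series.mean()
            acf = np.correlate(x, x, mode="full")[len(x) - 1:]
            return acf / acf[0]

        def analyze_frames(frames, processes=4):
            """frames: (T, H, W) stack of phase images; returns one ACF per pixel."""
            t, h, w = frames.shape
            pixel_series = frames.reshape(t, h * w).T       # one row per pixel
            with Pool(processes) as pool:
                return np.array(pool.map(autocorrelation, list(pixel_series)))

        if __name__ == "__main__":
            # Synthetic stand-in for a QPI phase-image time series.
            frames = np.random.default_rng(1).normal(size=(256, 32, 32))
            print(analyze_frames(frames).shape)             # (1024, 256)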

  18. Computational fluid mechanics

    NASA Technical Reports Server (NTRS)

    Hassan, H. A.

    1993-01-01

    Two papers are included in this progress report. In the first, the compressible Navier-Stokes equations have been used to compute leading edge receptivity of boundary layers over parabolic cylinders. Natural receptivity at the leading edge was simulated and Tollmien-Schlichting waves were observed to develop in response to an acoustic disturbance, applied through the farfield boundary conditions. To facilitate comparison with previous work, all computations were carried out at a free stream Mach number of 0.3. The spatial and temporal behavior of the flowfields are calculated through the use of finite volume algorithms and Runge-Kutta integration. The results are dominated by strong decay of the Tollmien-Schlichting wave due to the presence of the mean flow favorable pressure gradient. The effects of numerical dissipation, forcing frequency, and nose radius are studied. The Strouhal number is shown to have the greatest effect on the unsteady results. In the second paper, a transition model for low-speed flows, previously developed by Young et al., which incorporates first-mode (Tollmien-Schlichting) disturbance information from linear stability theory has been extended to high-speed flow by incorporating the effects of second mode disturbances. The transition model is incorporated into a Reynolds-averaged Navier-Stokes solver with a one-equation turbulence model. Results using a variable turbulent Prandtl number approach demonstrate that the current model accurately reproduces available experimental data for first and second-mode dominated transitional flows. The performance of the present model shows significant improvement over previous transition modeling attempts.

  19. Mycotoxin Management Studies by USDA-"Ag Lab" in 2008

    USDA-ARS?s Scientific Manuscript database

    Studies again included several popcorn fields in 2008, in order to continue gathering data for modification of the previously developed management strategies for mycotoxins in field corn (the mycotoxin predictive computer program). Weather conditions were generally good for growing corn, but excess...

  20. Mycotoxin Management Studies by USDA-ARS, NCAUR in 2009

    USDA-ARS?s Scientific Manuscript database

    Studies again included several popcorn fields in 2009 in order to continue gathering data for modification of the previously developed management strategies for mycotoxins in field corn (including the mycotoxin predictive computer program). Without an attempt for optimization, the field corn model ...

  1. Rapid calculation of genomic evaluations for new animals

    USDA-ARS?s Scientific Manuscript database

    A method was developed to calculate preliminary genomic evaluations daily or weekly before the release of official monthly evaluations by processing only newly genotyped animals using estimates of SNP effects from the previous official evaluation. To minimize computing time, reliabilities and genomi...
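
    A hedged toy sketch of the core arithmetic such a preliminary run involves: multiplying the genotypes of newly genotyped animals by SNP-effect estimates carried over from the previous official evaluation. Array shapes, coding, and values are illustrative, not the actual evaluation pipeline.

        import numpy as np

        def preliminary_dgv(new_genotypes, snp_effects, mean_adjustment=0.0):
            """Direct genomic values for newly genotyped animals only.

            new_genotypes : (n_new, n_snp) allele counts coded 0/1/2
            snp_effects   : (n_snp,) estimates from the previous official run
            """
            return new_genotypes @ snp_effects + mean_adjustment

        # Toy example: three new animals, five SNPs.
        effects = np.array([0.10, -0.05, 0.02, 0.00, 0.07])
        geno = np.array([[0, 1, 2, 1, 0],
                         [2, 2, 0, 1, 1],
                         [1, 0, 1, 2, 2]])
        print(preliminary_dgv(geno, effects))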

  2. Domain decomposition for aerodynamic and aeroacoustic analyses, and optimization

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay

    1995-01-01

    The overarching theme was domain decomposition, intended to improve the numerical solution technique for the partial differential equations at hand; in the present study, those that governed either the fluid flow, the aeroacoustic wave propagation, or the sensitivity analysis for a gradient-based optimization. The role of the domain decomposition extended beyond the original impetus of discretizing geometrically complex regions or writing modular software for distributed-hardware computers. It induced function-space decompositions and operator decompositions that offered the valuable property of near independence of the operator evaluation tasks. The objectives centered on the extension and implementation of methodologies either previously developed or concurrently under development: (1) aerodynamic sensitivity analysis with domain decomposition (SADD); (2) computational aeroacoustics of cavities; and (3) dynamic, multibody computational fluid dynamics using unstructured meshes.

  3. Novel opportunities for computational biology and sociology in drug discovery

    PubMed Central

    Yao, Lixia

    2009-01-01

    Drug discovery today is impossible without sophisticated modeling and computation. In this review we touch on previous advances in computational biology and by tracing the steps involved in pharmaceutical development, we explore a range of novel, high value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy-industry ties for scientific and human benefit. Attention to these opportunities could promise punctuated advance, and will complement the well-established computational work on which drug discovery currently relies. PMID:19674801

  4. CFD Analysis of the SBXC Glider Airframe

    DTIC Science & Technology

    2016-06-01

    ...based mathematically on finite element methods. To validate and verify the methodology developed, a mathematical comparison was made with the previous research data... greater than 15 m/s. Subject terms: finite element method, computational fluid dynamics, Y Plus, mesh element quality, aerodynamic data, fluid...

  5. Language and Cognition Interaction Neural Mechanisms

    PubMed Central

    Perlovsky, Leonid

    2011-01-01

    How do language and cognition interact in thinking? Is language just used for communication of completed thoughts, or is it fundamental for thinking? Existing approaches have not led to a computational theory. We develop a hypothesis that language and cognition are two separate but closely interacting mechanisms. Language accumulates cultural wisdom; cognition develops mental representations modeling the surrounding world and adapts cultural knowledge to concrete circumstances of life. Language is acquired from surrounding language “ready-made” and therefore can be acquired early in life. This early acquisition of language in childhood encompasses the entire hierarchy from sounds to words, to phrases, and to highest concepts existing in culture. Cognition is developed from experience. Yet cognition cannot be acquired from experience alone; language is a necessary intermediary, a “teacher.” A mathematical model is developed; it overcomes previous difficulties and leads to a computational theory. This model is consistent with Arbib's “language prewired brain” built on top of the mirror neuron system. It models recent neuroimaging data about cognition that remain unaddressed by other theories. A number of properties of language and cognition are explained, which previously seemed mysterious, including the influence of language grammar on cultural evolution, which may explain specifics of English and Arabic cultures. PMID:21876687

  6. Atmospheric energetics as related to cyclogenesis over the eastern United States. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    West, P. W.

    1973-01-01

    A method is presented to investigate the atmospheric energy budget as related to cyclogenesis. Energy budget equations are developed that are shown to be advantageous because the individual terms represent basic physical processes which produce changes in atmospheric energy, and the equations provide a means to study the interaction of the cyclone with the larger scales of motion. The work presented represents an extension of previous studies because all of the terms of the energy budget equations were evaluated throughout the development period of the cyclone. Computations are carried out over a limited atmospheric volume which encompasses the cyclone, and boundary fluxes of energy that were ignored in most previous studies are evaluated. Two examples of cyclogenesis over the eastern United States were chosen for study. One of the cases (1-4 November, 1966) represented an example of vigorous development, while the development in the other case (5-8 December, 1969) was more modest. Objectively analyzed data were used in the evaluation of the energy budget terms in order to minimize computational errors, and an objective analysis scheme is described that insures that all of the resolution contained in the rawinsonde observations is incorporated in the analyses.

  7. Applications in Data-Intensive Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shah, Anuj R.; Adkins, Joshua N.; Baxter, Douglas J.

    2010-04-01

    This book chapter, to be published in Advances in Computers, Volume 78, in 2010 describes applications of data intensive computing (DIC). This is an invited chapter resulting from a previous publication on DIC. This work summarizes efforts coming out of the PNNL's Data Intensive Computing Initiative. Advances in technology have empowered individuals with the ability to generate digital content with mouse clicks and voice commands. Digital pictures, emails, text messages, home videos, audio, and webpages are common examples of digital content that are generated on a regular basis. Data intensive computing facilitates human understanding of complex problems. Data-intensive applications provide timely and meaningful analytical results in response to exponentially growing data complexity and associated analysis requirements through the development of new classes of software, algorithms, and hardware.

  8. An Algebra-Based Introductory Computational Neuroscience Course with Lab.

    PubMed

    Fink, Christian G

    2017-01-01

    A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.

  9. Motion Planning in a Society of Intelligent Mobile Agents

    NASA Technical Reports Server (NTRS)

    Esterline, Albert C.; Shafto, Michael (Technical Monitor)

    2002-01-01

    The majority of the work on this grant involved formal modeling of human-computer integration. We conceptualize computer resources as a multiagent system so that these resources and human collaborators may be modeled uniformly. In previous work we had used modal logic for this uniform modeling, and we had developed a process-algebraic agent abstraction. In this work, we applied this abstraction (using CSP) in uniformly modeling agents and users, which allowed us to use tools for investigating CSP models. This work revealed the power of process-algebraic handshakes in modeling face-to-face conversation. We also investigated specifications of human-computer systems in the style of algebraic specification. This involved specifying the common knowledge required for coordination and process-algebraic patterns of communication actions intended to establish the common knowledge. We investigated the conditions for agents endowed with perception to gain common knowledge and implemented a prototype neural-network system that allows agents to detect when such conditions hold. The literature on multiagent systems conceptualizes communication actions as speech acts. We implemented a prototype system that infers the deontic effects (obligations, permissions, prohibitions) of speech acts and detects violations of these effects. A prototype distributed system was developed that allows users to collaborate in moving proxy agents; it was designed to exploit handshakes and common knowledge. Finally, in work carried over from a previous NASA ARC grant, about fifteen undergraduates developed and presented projects on multiagent motion planning.

  10. Free-Field Spatialized Aural Cues for Synthetic Environments

    DTIC Science & Technology

    1994-09-01

    ...any of the references previously listed. Other than electronic musicians and a few hobbyists, the Musical Instrument Digital Interface (MIDI)... developed in 1983 and still has a long way to go in improving its capabilities, but the advantages are numerous. An entire musical score can be stored... the same musical file on a computer in one of the various digital sound formats could easily occupy 90 megabytes of disk space.

  11. Computational complexity of algorithms for sequence comparison, short-read assembly and genome alignment.

    PubMed

    Baichoo, Shakuntala; Ouzounis, Christos A

    A multitude of algorithms for sequence comparison, short-read assembly and whole-genome alignment have been developed in the general context of molecular biology, to support technology development for high-throughput sequencing, numerous applications in genome biology and fundamental research on comparative genomics. The computational complexity of these algorithms has been previously reported in original research papers, yet this often neglected property has not been reviewed previously in a systematic manner and for a wider audience. We provide a review of space and time complexity of key sequence analysis algorithms and highlight their properties in a comprehensive manner, in order to identify potential opportunities for further research in algorithm or data structure optimization. The complexity aspect is poised to become pivotal as we will be facing challenges related to the continuous increase of genomic data on unprecedented scales and complexity in the foreseeable future, when robust biological simulation at the cell level and above becomes a reality. Copyright © 2017 Elsevier B.V. All rights reserved.
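
    To make the space/time trade-offs concrete, here is a small, generic example (not taken from the review) of why pairwise global alignment is O(mn) in time: a Needleman-Wunsch score computed with a rolling row, which also shows how working space can be reduced to O(n).

        def global_alignment_score(a, b, match=1, mismatch=-1, gap=-1):
            """Needleman-Wunsch score: O(len(a)*len(b)) time, O(len(b)) space."""
            prev = [j * gap for j in range(len(b) + 1)]
            for i, ca in enumerate(a, start=1):
                curr = [i * gap]
                for j, cb in enumerate(b, start=1):
                    diag = prev[j - 1] + (match if ca == cb else mismatch)
                    curr.append(max(diag, prev[j] + gap, curr[j - 1] + gap))
                prev = curr
            return prev[-1]

        print(global_alignment_score("GATTACA", "GCATGCU"))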

  12. Brief introductory guide to agent-based modeling and an illustration from urban health research.

    PubMed

    Auchincloss, Amy H; Garcia, Leandro Martin Totaro

    2015-11-01

    There is growing interest among urban health researchers in addressing complex problems using conceptual and computation models from the field of complex systems. Agent-based modeling (ABM) is one computational modeling tool that has received a lot of interest. However, many researchers remain unfamiliar with developing and carrying out an ABM, hindering the understanding and application of it. This paper first presents a brief introductory guide to carrying out a simple agent-based model. Then, the method is illustrated by discussing a previously developed agent-based model, which explored inequalities in diet in the context of urban residential segregation.
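
    For readers new to the technique, a minimal toy agent-based loop is sketched below; it is not the published diet/segregation model, only an illustration of the basic agent-environment-step structure, with all agent and store attributes invented for the example.

        import random

        random.seed(42)
        GRID = 20

        # Toy agents (location, income) and food stores (location, healthy-food price).
        agents = [{"xy": (random.randrange(GRID), random.randrange(GRID)),
                   "income": random.uniform(0.2, 1.0)} for _ in range(100)]
        stores = [{"xy": (random.randrange(GRID), random.randrange(GRID)),
                   "price": random.uniform(0.3, 1.0)} for _ in range(8)]

        def dist(a, b):
            return abs(a[0] - b[0]) + abs(a[1] - b[1])

        def move(agent):
            """Random walk on the grid with wrap-around."""
            x, y = agent["xy"]
            agent["xy"] = ((x + random.choice([-1, 0, 1])) % GRID,
                           (y + random.choice([-1, 0, 1])) % GRID)

        def eats_healthy(agent):
            """Cheapest combined travel + price burden, compared to income."""
            cost = min(dist(agent["xy"], s["xy"]) / GRID + s["price"] for s in stores)
            return cost < agent["income"]

        for t in range(10):                         # run ten time steps
            for a in agents:
                move(a)
            healthy = sum(eats_healthy(a) for a in agents)
            print(f"step {t}: {healthy}/{len(agents)} agents ate healthy food")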

  13. Brief introductory guide to agent-based modeling and an illustration from urban health research

    PubMed Central

    Auchincloss, Amy H.; Garcia, Leandro Martin Totaro

    2017-01-01

    There is growing interest among urban health researchers in addressing complex problems using conceptual and computation models from the field of complex systems. Agent-based modeling (ABM) is one computational modeling tool that has received a lot of interest. However, many researchers remain unfamiliar with developing and carrying out an ABM, hindering the understanding and application of it. This paper first presents a brief introductory guide to carrying out a simple agent-based model. Then, the method is illustrated by discussing a previously developed agent-based model, which explored inequalities in diet in the context of urban residential segregation. PMID:26648364

  14. Development of a thermal storage module using modified anhydrous sodium hydroxide

    NASA Technical Reports Server (NTRS)

    Rice, R. E.; Rowny, P. E.

    1980-01-01

    The laboratory scale testing of a modified anhydrous NaOH latent heat storage concept for small solar thermal power systems such as total energy systems utilizing organic Rankine systems is discussed. A diagnostic test on the thermal energy storage module and an investigation of alternative heat transfer fluids and heat exchange concepts are specifically addressed. A previously developed computer simulation model is modified to predict the performance of the module in a solar total energy system environment. In addition, the computer model is expanded to investigate parametrically the incorporation of a second heat exchanger inside the module, which will vaporize and superheat the Rankine cycle power fluid.

  15. Earth-atmosphere system and surface reflectivities in arid regions from Landsat MSS data

    NASA Technical Reports Server (NTRS)

    Otterman, J.; Fraser, R. S.

    1976-01-01

    Previously developed programs for computing atmospheric transmission and scattering of the solar radiation are used to compute the ratios of the earth-atmosphere system (space) directional reflectivities in the nadir direction to the surface Lambertian reflectivity, for the four bands of the Landsat multispectral scanner (MSS). These ratios are presented as graphs for two water vapor levels, as a function of the surface reflectivity, for various sun elevation angles. Space directional reflectivities in the vertical direction are reported for selected arid regions in Asia, Africa, and Central America from the spectral radiance levels measured by the Landsat MSS. From these space reflectivities, surface reflectivities are computed applying the pertinent graphs. These surface reflectivities are used to estimate the surface albedo for the entire solar spectrum. The estimated albedos are in the range 0.34-0.52, higher than the values reported by most previous researchers from space measurements, but are consistent with laboratory and in situ measurements.

  16. Application of Multi-Frequency Modulation (MFM) for High-Speed Data Communications to a Voice Frequency Channel

    DTIC Science & Technology

    1990-06-01

    ...the reader is cautioned that computer programs developed in this research may not have been exercised for all cases of interest. While every effort has been... formats. Previous applications of these encoding formats were on industry-standard computers (PC) over a 16-20 kHz channel. This report discusses the...

  17. Sediment and Hydraulic Measurements with Computed Bed Load on the Missouri River, Sioux City to Hermann, 2014

    DTIC Science & Technology

    2017-05-01

    ...large sand bed river, with seven sites representing increasingly larger flows along the river length. The data set will be very useful for additional... quantity, quality, and types of data that can be obtained for the study of natural phenomena. The study of riverine sedimentation is no exception... detail than in previous years. Additionally, new methodologies have been developed that allow the computation of bed-load transport in large sand bed...

  18. Development of a turbomachinery design optimization procedure using a multiple-parameter nonlinear perturbation method

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.

    1984-01-01

    An investigation was carried out to complete the preliminary development of a combined perturbation/optimization procedure and associated computational code for designing optimized blade-to-blade profiles of turbomachinery blades. The overall purpose of the procedures developed is to provide demonstration of a rapid nonlinear perturbation method for minimizing the computational requirements associated with parametric design studies of turbomachinery flows. The method combines the multiple parameter nonlinear perturbation method, successfully developed in previous phases of this study, with the NASA TSONIC blade-to-blade turbomachinery flow solver, and the COPES-CONMIN optimization procedure into a user's code for designing optimized blade-to-blade surface profiles of turbomachinery blades. Results of several design applications and a documented version of the code together with a user's manual are provided.

  19. Uniform rovibrational collisional N2 bin model for DSMC, with application to atmospheric entry flows

    NASA Astrophysics Data System (ADS)

    Torres, E.; Bondar, Ye. A.; Magin, T. E.

    2016-11-01

    A state-to-state model for internal energy exchange and molecular dissociation allows for high-fidelity DSMC simulations. Elementary reaction cross sections for the N2(v, J) + N system were previously extracted from a quantum-chemical database, originally compiled at NASA Ames Research Center. Due to the high computational cost of simulating the full range of inelastic collision processes (approx. 23 million reactions), a coarse-grain model, called the Uniform RoVibrational Collisional (URVC) bin model, can be used instead. This reduces the original 9390 rovibrational levels of N2 to 10 energy bins. In the present work, this reduced model is used to simulate a 2D flow configuration, which more closely reproduces the conditions of high-speed entry into Earth's atmosphere. For this purpose, the URVC bin model had to be adapted for integration into the "Rarefied Gas Dynamics Analysis System" (RGDAS), a separate high-performance DSMC code capable of handling complex geometries and parallel computations. RGDAS was developed at the Institute of Theoretical and Applied Mechanics in Novosibirsk, Russia, for use by the European Space Agency (ESA) and shares many features with the well-known SMILE code developed by the same group. We show that the reduced mechanism developed previously can be implemented in RGDAS, and the results exhibit nonequilibrium effects consistent with those observed in previous 1D simulations.
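
    The coarse-graining idea can be illustrated with a short, generic binning routine; the toy energies below are random stand-ins, not the NASA Ames database, and the URVC model's actual bin construction and cross-section averaging are more involved than this sketch.

        import numpy as np

        def uniform_energy_bins(level_energies, n_bins=10):
            """Assign each internal-energy level to a uniform-width energy bin
            and return the bin index per level plus each bin's mean energy."""
            e = np.asarray(level_energies)
            edges = np.linspace(e.min(), e.max(), n_bins + 1)
            idx = np.clip(np.digitize(e, edges) - 1, 0, n_bins - 1)
            bin_energy = np.array([e[idx == k].mean() for k in range(n_bins)])
            return idx, bin_energy

        # Toy stand-in for the N2 rovibrational ladder (energies in eV).
        levels = np.sort(np.random.default_rng(0).uniform(0.0, 9.75, size=9390))
        idx, e_bin = uniform_energy_bins(levels, n_bins=10)
        print(e_bin.round(2))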

  20. Computational techniques in gamma-ray skyshine analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George, D.L.

    1988-12-01

    Two computer codes were developed to analyze gamma-ray skyshine, the scattering of gamma photons by air molecules. A review of previous gamma-ray skyshine studies discusses several Monte Carlo codes, programs using a single-scatter model, and the MicroSkyshine program for microcomputers. A benchmark gamma-ray skyshine experiment performed at Kansas State University is also described. A single-scatter numerical model was presented which traces photons from the source to their first scatter, then applies a buildup factor along a direct path from the scattering point to a detector. The FORTRAN code SKY, developed with this model before the present study, was modified to use Gauss quadrature, recent photon attenuation data and a more accurate buildup approximation. The resulting code, SILOGP, computes response from a point photon source on the axis of a silo, with and without concrete shielding over the opening. Another program, WALLGP, was developed using the same model to compute response from a point gamma source behind a perfectly absorbing wall, with and without shielding overhead. 29 refs., 48 figs., 13 tabs.
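
    A heavily simplified sketch of the single-scatter-plus-buildup idea follows; the geometry (a vertical column of scatter points above the source), the air attenuation coefficient, and the Berger buildup coefficients are all placeholder assumptions, not values from SILOGP or WALLGP.

        import numpy as np

        MU = 0.0076   # assumed air attenuation coefficient [1/m], placeholder

        def buildup(mu_r, a=1.0, b=0.05):
            """Berger-form buildup factor with placeholder coefficients."""
            return 1.0 + a * mu_r * np.exp(b * mu_r)

        def skyshine_response(source_height, detector_dist, mu=MU, n_quad=64):
            """Integrate single-scatter contributions along a vertical column of
            scatter points above the source, applying attenuation on the first
            leg and attenuation-with-buildup on the scattered leg."""
            nodes, weights = np.polynomial.legendre.leggauss(n_quad)
            z = 500.0 * (nodes + 1.0)                  # scatter heights, 0-1000 m
            w = 500.0 * weights
            r1 = source_height + z                     # source -> scatter point
            r2 = np.sqrt(detector_dist**2 + z**2)      # scatter point -> detector
            integrand = (np.exp(-mu * r1) / (4 * np.pi * r1**2) * mu
                         * np.exp(-mu * r2) * buildup(mu * r2) / (4 * np.pi * r2**2))
            return float(np.sum(w * integrand))

        print(skyshine_response(source_height=3.0, detector_dist=100.0))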

  1. Vehicle Maneuver Detection with Accelerometer-Based Classification.

    PubMed

    Cervantes-Villanueva, Javier; Carrillo-Zapata, Daniel; Terroso-Saenz, Fernando; Valdes-Vela, Mercedes; Skarmeta, Antonio F

    2016-09-29

    In the mobile computing era, smartphones have become instrumental tools to develop innovative mobile context-aware systems. In that sense, their usage in the vehicular domain eases the development of novel and personal transportation solutions. In this frame, the present work introduces an innovative mechanism to perceive the current kinematic state of a vehicle on the basis of the accelerometer data from a smartphone mounted in the vehicle. Unlike previous proposals, the introduced architecture targets the computational limitations of such devices to carry out the detection process following an incremental approach. For its realization, we have evaluated different classification algorithms to act as agents within the architecture. Finally, our approach has been tested with a real-world dataset collected by means of the ad hoc mobile application developed.
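
    A hedged sketch of the windowed-feature-plus-classifier pattern the abstract describes (synthetic data, an arbitrary window length, and scikit-learn's random forest standing in for whichever classifiers were actually evaluated):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def window_features(acc, window=50):
            """Split a (N, 3) accelerometer stream into fixed windows and extract
            simple per-axis statistics (mean, std, mean absolute jerk)."""
            n = (len(acc) // window) * window
            wins = acc[:n].reshape(-1, window, 3)
            return np.hstack([wins.mean(axis=1), wins.std(axis=1),
                              np.abs(np.diff(wins, axis=1)).mean(axis=1)])

        # Synthetic stand-in data; labels per window (0 = cruising, 1 = braking).
        rng = np.random.default_rng(0)
        acc = rng.normal(size=(5000, 3))
        X = window_features(acc)
        y = rng.integers(0, 2, size=len(X))

        clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
        print(clf.predict(X[:5]))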

  2. Multiscale atomistic simulation of metal-oxygen surface interactions: Methodological development, theoretical investigation, and correlation with experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Judith C.

    The purpose of this grant is to develop the multi-scale theoretical methods to describe the nanoscale oxidation of metal thin films, as the PI (Yang) has extensive previous experience in the experimental elucidation of the initial stages of Cu oxidation by primarily in situ transmission electron microscopy methods. Through the use and development of computational tools at varying length (and time) scales, from atomistic quantum mechanical calculation, force field mesoscale simulations, to large scale Kinetic Monte Carlo (KMC) modeling, the fundamental underpinnings of the initial stages of Cu oxidation have been elucidated. The development of computational modeling tools allows for accelerated materials discovery. The theoretical tools developed from this program impact a wide range of technologies that depend on surface reactions, including corrosion, catalysis, and nanomaterials fabrication.

  3. Robot computer problem solving system

    NASA Technical Reports Server (NTRS)

    Merriam, E. W.; Becker, J. D.

    1973-01-01

    A robot computer problem solving system which represents a robot exploration vehicle in a simulated Mars environment is described. The model exhibits changes and improvements made on a previously designed robot in a city environment. The Martian environment is modeled in Cartesian coordinates; objects are scattered about a plane; arbitrary restrictions on the robot's vision have been removed; and the robot's path contains arbitrary curves. New environmental features, particularly the visual occlusion of objects by other objects, were added to the model. Two different algorithms were developed for computing occlusion. Movement and vision capabilities of the robot were established in the Mars environment, using LISP/FORTRAN interface for computational efficiency. The graphical display program was redesigned to reflect the change to the Mars-like environment.

  4. Acoustic environmental accuracy requirements for response determination

    NASA Technical Reports Server (NTRS)

    Pettitt, M. R.

    1983-01-01

    A general purpose computer program was developed for the prediction of vehicle interior noise. This program, named VIN, has both modal and statistical energy analysis capabilities for structural/acoustic interaction analysis. The analytic models and their computer implementation were verified through simple test cases with well-defined experimental results. The model was also applied in a space shuttle payload bay launch acoustics prediction study. The computer program processes large and small problems with equal efficiency because all arrays are dynamically sized by program input variables at run time. A data base is built and easily accessed for design studies. The data base significantly reduces the computational costs of such studies by allowing the reuse of the still-valid calculated parameters of previous iterations.

  5. Teaching Non-Recursive Binary Searching: Establishing a Conceptual Framework.

    ERIC Educational Resources Information Center

    Magel, E. Terry

    1989-01-01

    Discusses problems associated with teaching non-recursive binary searching in computer language classes, and describes a teacher-directed dialog based on dictionary use that helps students use their previous searching experiences to conceptualize the binary search process. Algorithmic development is discussed and appropriate classroom discussion…

  6. Graded SPSS Exercises.

    ERIC Educational Resources Information Center

    Allen, Mary J.

    The attached materials have been developed for use on the CSU CYBER Computer's Statistical Package for the Social Sciences (SPSSONL). The assignments are graded in difficulty and gradually introduce new commands and require the practice of previously learned commands. The handouts begin with basic instructions for logging on; then XEDIT is taught…

  7. Computational Approaches to Drug Repurposing and Pharmacology

    PubMed Central

    Hodos, Rachel A; Kidd, Brian A; Khader, Shameer; Readhead, Ben P; Dudley, Joel T

    2016-01-01

    Data in the biological, chemical, and clinical domains are accumulating at ever-increasing rates and have the potential to accelerate and inform drug development in new ways. Challenges and opportunities now lie in developing analytic tools to transform these often complex and heterogeneous data into testable hypotheses and actionable insights. This is the aim of computational pharmacology, which uses in silico techniques to better understand and predict how drugs affect biological systems, which can in turn improve clinical use, avoid unwanted side effects, and guide selection and development of better treatments. One exciting application of computational pharmacology is drug repurposing: finding new uses for existing drugs. Already yielding many promising candidates, this strategy has the potential to improve the efficiency of the drug development process and reach patient populations with previously unmet needs such as those with rare diseases. While current techniques in computational pharmacology and drug repurposing often focus on just a single data modality such as gene expression or drug-target interactions, we rationalize that methods such as matrix factorization that can integrate data within and across diverse data types have the potential to improve predictive performance and provide a fuller picture of a drug's pharmacological action. PMID:27080087

  8. Computing in Hydraulic Engineering Education

    NASA Astrophysics Data System (ADS)

    Duan, J. G.

    2011-12-01

    Civil engineers, pioneers of our civilization, are rarely perceived as leaders and innovators in modern society because of lags in technology innovation. This crisis has resulted in the decline of the prestige of the civil engineering profession, reduction of federal funding for deteriorating infrastructure, and problems with attracting the most talented high-school students. Infusing cutting-edge computer technology and stimulating creativity and innovation are therefore the critical challenges for civil engineering education. To better prepare our graduates to innovate, this paper discusses the adoption of a problem-based collaborative learning technique and the integration of civil engineering computing into a traditional civil engineering curriculum. Three interconnected courses: Open Channel Flow, Computational Hydraulics, and Sedimentation Engineering, were developed with emphasis on computational simulations. In Open Channel Flow, the focus is on the principles of free-surface flow and the application of computational models. This prepares students for the second course, Computational Hydraulics, which introduces the fundamental principles of computational hydraulics, including finite difference and finite element methods. This course complements the Open Channel Flow class to provide students with an in-depth understanding of computational methods. The third course, Sedimentation Engineering, covers the fundamentals of sediment transport and river engineering, so students can apply the knowledge and programming skills gained from previous courses to develop computational models for simulating sediment transport. These courses effectively equipped students with important skills and knowledge to complete thesis and dissertation research.

  9. 1-1 in Education: Current Practice, International Comparative Research Evidence and Policy Implications. OECD Education Working Papers, No. 44,

    ERIC Educational Resources Information Center

    Valiente, Oscar

    2010-01-01

    Over the last decade, more and more public and private stakeholders, in developed and developing countries, have been supporting 1:1 initiatives in education (i.e. every child receives her/his own personal computing device). These 1:1 initiatives represent a qualitative move forward from previous educational experiences with ICT, inasmuch as every…

  10. Continued Development of Expert System Tools for NPSS Engine Diagnostics

    NASA Technical Reports Server (NTRS)

    Lewandowski, Henry

    1996-01-01

    The objectives of this grant were to work with previously developed NPSS (Numerical Propulsion System Simulation) tools and enhance their functionality; explore similar AI systems; and work with the High Performance Computing Communication (HPCC) K-12 program. Activities for this reporting period are briefly summarized and a paper addressing the implementation, monitoring and zooming in a distributed jet engine simulation is included as an attachment.

  11. Improved object optimal synthetic description, modeling, learning, and discrimination by GEOGINE computational kernel

    NASA Astrophysics Data System (ADS)

    Fiorini, Rodolfo A.; Dacquino, Gianfranco

    2005-03-01

    GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D Tensor Invariants for n-Dimensional shape/texture optimal synthetic representation, description and learning, was presented at previous conferences. Improved computational algorithms, based on the computational invariant theory of finite groups in Euclidean space, and a demo application are presented. Automatic progressive model generation is discussed. GEOGINE can be used as an efficient computational kernel for fast, reliable application development and delivery, mainly in advanced biomedical engineering, biometrics, intelligent computing, target recognition, content image retrieval, and data mining. Ontology can be regarded as a logical theory accounting for the intended meaning of a formal dictionary, i.e., its ontological commitment to a particular conceptualization of the world object. According to this approach, "n-D Tensor Calculus" can be considered a "Formal Language" to reliably compute optimized "n-Dimensional Tensor Invariants" as specific object "invariant parameter and attribute words" for automated n-Dimensional shape/texture optimal synthetic object description by incremental model generation. The class of those "invariant parameter and attribute words" can be thought of as a specific "Formal Vocabulary" learned from a "Generalized Formal Dictionary" of the "Computational Tensor Invariants" language. Even object chromatic attributes can be effectively and reliably computed from object geometric parameters into robust colour shape invariant characteristics. Any highly sophisticated application needing effective, robust capture and parameterization of object geometric/colour invariant attributes, for reliable automated object learning and discrimination, can benefit deeply from the performance of GEOGINE's progressive automated model generation computational kernel. Main operational advantages over previous, similar approaches are: 1) Progressive Automated Invariant Model Generation, 2) Invariant Minimal Complete Description Set for computational efficiency, 3) Arbitrary Model Precision for robust object description and identification.

  12. Physical and Genetic Mechanisms Guiding the Evolution and Development of Dendritic, Flapping Insect-Wings

    DTIC Science & Technology

    2010-04-12

    ...computer graphics, to music. Map L systems [2, 6, 7] extend the parallel rewriting in L systems to planar graphs with cycles, called maps [8]. The maps... carbon. For ease of reading, in the results below the mass is represented by the percentage carbon laminate coverage. As explained previously, the... previously, that value can be read on the left vertical axis. There are nine designs in the Pareto front. Depending on design goals, any of these individuals...

  13. HGIMDA: Heterogeneous graph inference for miRNA-disease association prediction

    PubMed Central

    Zhang, Xu; You, Zhu-Hong; Huang, Yu-An; Yan, Gui-Ying

    2016-01-01

    Recently, microRNAs (miRNAs) have drawn more and more attentions because accumulating experimental studies have indicated miRNA could play critical roles in multiple biological processes as well as the development and progression of human complex diseases. Using the huge number of known heterogeneous biological datasets to predict potential associations between miRNAs and diseases is an important topic in the field of biology, medicine, and bioinformatics. In this study, considering the limitations in the previous computational methods, we developed the computational model of Heterogeneous Graph Inference for MiRNA-Disease Association prediction (HGIMDA) to uncover potential miRNA-disease associations by integrating miRNA functional similarity, disease semantic similarity, Gaussian interaction profile kernel similarity, and experimentally verified miRNA-disease associations into a heterogeneous graph. HGIMDA obtained AUCs of 0.8781 and 0.8077 based on global and local leave-one-out cross validation, respectively. Furthermore, HGIMDA was applied to three important human cancers for performance evaluation. As a result, 90% (Colon Neoplasms), 88% (Esophageal Neoplasms) and 88% (Kidney Neoplasms) of top 50 predicted miRNAs are confirmed by recent experiment reports. Furthermore, HGIMDA could be effectively applied to new diseases and new miRNAs without any known associations, which overcome the important limitations of many previous computational models. PMID:27533456
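
    The heterogeneous-graph idea can be sketched as an iterative diffusion over the two similarity matrices and the known association matrix; the update rule below is a simplified generic stand-in, not the exact HGIMDA formula, and all matrices are toy random data.

        import numpy as np

        def propagate(adj, sim_mirna, sim_disease, alpha=0.4, n_iter=20):
            """Diffuse known miRNA-disease associations over both similarity
            networks (simplified stand-in for the published inference rule).

            adj         : (n_mirna, n_disease) 0/1 known associations
            sim_mirna   : (n_mirna, n_mirna) row-normalized similarity
            sim_disease : (n_disease, n_disease) row-normalized similarity
            """
            score = adj.astype(float).copy()
            for _ in range(n_iter):
                score = alpha * sim_mirna @ score @ sim_disease + (1 - alpha) * adj
            return score

        # Toy example: four miRNAs by three diseases.
        rng = np.random.default_rng(0)
        A = (rng.random((4, 3)) > 0.6).astype(float)
        Sm = rng.random((4, 4)); Sm /= Sm.sum(axis=1, keepdims=True)
        Sd = rng.random((3, 3)); Sd /= Sd.sum(axis=1, keepdims=True)
        print(np.round(propagate(A, Sm, Sd), 3))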

  14. HGIMDA: Heterogeneous graph inference for miRNA-disease association prediction.

    PubMed

    Chen, Xing; Yan, Chenggang Clarence; Zhang, Xu; You, Zhu-Hong; Huang, Yu-An; Yan, Gui-Ying

    2016-10-04

    Recently, microRNAs (miRNAs) have drawn more and more attentions because accumulating experimental studies have indicated miRNA could play critical roles in multiple biological processes as well as the development and progression of human complex diseases. Using the huge number of known heterogeneous biological datasets to predict potential associations between miRNAs and diseases is an important topic in the field of biology, medicine, and bioinformatics. In this study, considering the limitations in the previous computational methods, we developed the computational model of Heterogeneous Graph Inference for MiRNA-Disease Association prediction (HGIMDA) to uncover potential miRNA-disease associations by integrating miRNA functional similarity, disease semantic similarity, Gaussian interaction profile kernel similarity, and experimentally verified miRNA-disease associations into a heterogeneous graph. HGIMDA obtained AUCs of 0.8781 and 0.8077 based on global and local leave-one-out cross validation, respectively. Furthermore, HGIMDA was applied to three important human cancers for performance evaluation. As a result, 90% (Colon Neoplasms), 88% (Esophageal Neoplasms) and 88% (Kidney Neoplasms) of top 50 predicted miRNAs are confirmed by recent experiment reports. Furthermore, HGIMDA could be effectively applied to new diseases and new miRNAs without any known associations, which overcome the important limitations of many previous computational models.

  15. Virtual photons in imaginary time: Computing exact Casimir forces via standard numerical electromagnetism techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez, Alejandro; Ibanescu, Mihai; Joannopoulos, J. D.

    2007-09-15

    We describe a numerical method to compute Casimir forces in arbitrary geometries, for arbitrary dielectric and metallic materials, with arbitrary accuracy (given sufficient computational resources). Our approach, based on well-established integration of the mean stress tensor evaluated via the fluctuation-dissipation theorem, is designed to directly exploit fast methods developed for classical computational electromagnetism, since it only involves repeated evaluation of the Green's function for imaginary frequencies (equivalently, real frequencies in imaginary time). We develop the approach by systematically examining various formulations of Casimir forces from the previous decades and evaluating them according to their suitability for numerical computation. We illustrate our approach with a simple finite-difference frequency-domain implementation, test it for known geometries such as a cylinder and a plate, and apply it to new geometries. In particular, we show that a pistonlike geometry of two squares sliding between metal walls, in both two and three dimensions with both perfect and realistic metallic materials, exhibits a surprising nonmonotonic ''lateral'' force from the walls.

  16. Automated Measurement of Patient-Specific Tibial Slopes from MRI

    PubMed Central

    Amerinatanzi, Amirhesam; Summers, Rodney K.; Ahmadi, Kaveh; Goel, Vijay K.; Hewett, Timothy E.; Nyman, Edward

    2017-01-01

    Background: Multi-planar proximal tibial slopes may be associated with increased likelihood of osteoarthritis and anterior cruciate ligament injury, due in part to their role in checking the anterior-posterior stability of the knee. Established methods suffer from repeatability limitations and lack computational efficiency for intuitive clinical adoption. The aims of this study were to develop a novel automated approach and to compare the repeatability and computational efficiency of the approach against previously established methods. Methods: Tibial slope geometries were obtained via MRI and measured using an automated Matlab-based approach. Data were compared for repeatability and evaluated for computational efficiency. Results: Mean lateral tibial slope (LTS) for females (7.2°) was greater than for males (1.66°). Mean LTS in the lateral concavity zone was greater for females (7.8° for females, 4.2° for males). Mean medial tibial slope (MTS) for females was greater (9.3° vs. 4.6°). Along the medial concavity zone, female subjects demonstrated greater MTS. Conclusion: The automated method was more repeatable and computationally efficient than previously identified methods and may aid in the clinical assessment of knee injury risk, inform surgical planning, and support implant design efforts. PMID:28952547
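
    A schematic sketch of the measurement itself, under the simplifying assumption that the plateau surface has already been segmented to a set of sagittal-plane points and that the tibial long axis is known (neither step is shown, and this is not the authors' Matlab pipeline):

        import numpy as np

        def tibial_slope_deg(surface_points_yz, axis_vector_yz=(0.0, 1.0)):
            """Angle (degrees) of a line fit to plateau points (anterior-posterior y,
            superior-inferior z) relative to the plane perpendicular to the
            tibial long axis."""
            y, z = surface_points_yz[:, 0], surface_points_yz[:, 1]
            m = np.polyfit(y, z, deg=1)[0]                 # dz/dy of plateau line
            plateau_dir = np.array([1.0, m]) / np.hypot(1.0, m)
            axis = np.asarray(axis_vector_yz, float)
            axis /= np.linalg.norm(axis)
            angle_to_axis = np.degrees(np.arccos(abs(plateau_dir @ axis)))
            return 90.0 - angle_to_axis

        pts = np.array([[0, 0.0], [10, -1.2], [20, -2.6], [30, -3.9]])  # toy mm data
        print(round(tibial_slope_deg(pts), 1))   # ~7.5 degrees posterior slope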

  17. Computer-aided analysis of Skylab scanner data for land use mapping, forestry and water resource applications

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1975-01-01

    Skylab data were obtained over a mountainous test site containing a complex association of cover types and rugged topography. The application of computer-aided analysis techniques to the multispectral scanner data produced a number of significant results. Techniques were developed to digitally overlay topographic data (elevation, slope, and aspect) onto the S-192 MSS data to provide a method for increasing the effectiveness and accuracy of computer-aided analysis techniques for cover type mapping. The S-192 MSS data were analyzed using computer techniques developed at Laboratory for Applications of Remote Sensing (LARS), Purdue University. Land use maps, forest cover type maps, snow cover maps, and area tabulations were obtained and evaluated. These results compared very well with information obtained by conventional techniques. Analysis of the spectral characteristics of Skylab data has conclusively proven the value of the middle infrared portion of the spectrum (about 1.3-3.0 micrometers), a wavelength region not previously available in multispectral satellite data.

  18. CYCLOPS-3 System Research.

    ERIC Educational Resources Information Center

    Marill, Thomas; And Others

    The aim of the CYCLOPS Project research is the development of techniques for allowing computers to perform visual scene analysis, pre-processing of visual imagery, and perceptual learning. Work on scene analysis and learning has previously been described. The present report deals with research on pre-processing and with further work on scene…

  19. Robust network design for multispecies conservation

    Treesearch

    Ronan Le Bras; Bistra Dilkina; Yexiang Xue; Carla P. Gomes; Kevin S. McKelvey; Michael K. Schwartz; Claire A. Montgomery

    2013-01-01

    Our work is motivated by an important network design application in computational sustainability concerning wildlife conservation. In the face of human development and climate change, it is important that conservation plans for protecting landscape connectivity exhibit certain level of robustness. While previous work has focused on conservation strategies that result...

  20. Hierarchical Bayesian Models of Subtask Learning

    ERIC Educational Resources Information Center

    Anglim, Jeromy; Wynton, Sarah K. A.

    2015-01-01

    The current study used Bayesian hierarchical methods to challenge and extend previous work on subtask learning consistency. A general model of individual-level subtask learning was proposed focusing on power and exponential functions with constraints to test for inconsistency. To study subtask learning, we developed a novel computer-based booking…

  1. Learning to Teach Mathematics with Technology: A Survey of Professional Development Needs, Experiences and Impacts

    ERIC Educational Resources Information Center

    Bennison, Anne; Goos, Merrilyn

    2010-01-01

    The potential for digital technologies to enhance students' mathematics learning is widely recognised, and use of computers and graphics calculators is now encouraged or required by secondary school mathematics curriculum documents throughout Australia. However, previous research indicates that effective integration of technology into classroom…

  2. Problem Solving Under Time-Constraints.

    ERIC Educational Resources Information Center

    Richardson, Michael; Hunt, Earl

    A model of how automated and controlled processing can be mixed in computer simulations of problem solving is proposed. It is based on previous work by Hunt and Lansman (1983), who developed a model of problem solving that could reproduce the data obtained with several attention and performance paradigms, extending production-system notation to…

  3. ESTIMATION OF THE RATE OF VOC EMISSIONS FROM SOLVENT-BASED INDOOR COATING MATERIALS BASED ON PRODUCT FORMULATION

    EPA Science Inventory

    Two computational methods are proposed for estimation of the emission rate of volatile organic compounds (VOCs) from solvent-based indoor coating materials based on the knowledge of product formulation. The first method utilizes two previously developed mass transfer models with ...

  4. Teaching Petri Nets Using P3

    ERIC Educational Resources Information Center

    Gasevic, Dragan; Devedzic, Vladan

    2004-01-01

    This paper presents Petri net software tool P3 that is developed for training purposes of the Architecture and organization of computers (AOC) course. The P3 has the following features: graphical modeling interface, interactive simulation by single and parallel (with previous conflict resolution) transition firing, two well-known Petri net…

  5. Dynamic Courseware Generation on the WWW.

    ERIC Educational Resources Information Center

    Vassileva, Julita; Deters, Ralph

    1998-01-01

    The Dynamic Courseware Generator (DCG), which runs on a Web server, was developed for the authoring of adaptive computer-assisted learning courses. It generates an individual course according to the learner's goals and previous knowledge, and dynamically adapts the course according to the learner's success in knowledge acquisition. The tool may be…

  6. Integrating Incremental Learning and Episodic Memory Models of the Hippocampal Region

    ERIC Educational Resources Information Center

    Meeter, M.; Myers, C. E.; Gluck, M. A.

    2005-01-01

    By integrating previous computational models of corticohippocampal function, the authors develop and test a unified theory of the neural substrates of familiarity, recollection, and classical conditioning. This approach integrates models from 2 traditions of hippocampal modeling, those of episodic memory and incremental learning, by drawing on an…

  7. Measurement of information and communication technology experience and attitudes to e-learning of students in the healthcare professions: integrative review.

    PubMed

    Wilkinson, Ann; While, Alison E; Roberts, Julia

    2009-04-01

    This paper is a report of a review to describe and discuss the psychometric properties of instruments used in healthcare education settings measuring experience and attitudes of healthcare students regarding their information and communication technology skills and their use of computers and the Internet for education. Healthcare professionals are expected to be computer and information literate at registration. A previous review of evaluative studies of computer-based learning suggests that methods of measuring learners' attitudes to computers and computer aided learning are problematic. A search of eight health and social science databases located 49 papers, the majority published between 1995 and January 2007, focusing on the experience and attitudes of students in the healthcare professions towards computers and e-learning. An integrative approach was adopted, with narrative description of findings. Criteria for inclusion were quantitative studies using survey tools with samples of healthcare students and concerning computer and information literacy skills, access to computers, experience with computers and use of computers and the Internet for education purposes. Since the 1980s a number of instruments have been developed, mostly in the United States of America, to measure attitudes to computers, anxiety about computer use, information and communication technology skills, satisfaction and more recently attitudes to the Internet and computers for education. The psychometric properties are poorly described. Advances in computers and technology mean that many earlier tools are no longer valid. Measures of the experience and attitudes of healthcare students to the increased use of e-learning require development in line with computer and technology advances.

  8. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    PubMed Central

    Abbasi, Arash; Berry, Jeffrey C.; Callen, Steven T.; Chavez, Leonardo; Doust, Andrew N.; Feldman, Max J.; Gilbert, Kerrigan B.; Hodge, John G.; Hoyer, J. Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony

    2017-01-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning. PMID:29209576
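
    The kind of per-image trait extraction such a toolkit automates can be illustrated with a tiny, generic segmentation step (plain NumPy excess-green thresholding, deliberately not PlantCV's own API, with an invented threshold and a synthetic image):

        import numpy as np

        def plant_pixel_fraction(rgb, threshold=0.1):
            """Segment plant tissue with an excess-green index and report the
            fraction of plant pixels; rgb is an (H, W, 3) array in [0, 1]."""
            r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
            exg = 2 * g - r - b                 # excess-green vegetation index
            mask = exg > threshold
            return mask.mean(), mask

        # Synthetic image: a greenish plant patch on a brownish soil background.
        img = np.full((100, 100, 3), [0.4, 0.3, 0.2])
        img[30:70, 30:70] = [0.2, 0.6, 0.2]
        frac, mask = plant_pixel_fraction(img)
        print(f"plant covers {frac:.0%} of the image")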

  9. PlantCV v2: Image analysis software for high-throughput plant phenotyping.

    PubMed

    Gehan, Malia A; Fahlgren, Noah; Abbasi, Arash; Berry, Jeffrey C; Callen, Steven T; Chavez, Leonardo; Doust, Andrew N; Feldman, Max J; Gilbert, Kerrigan B; Hodge, John G; Hoyer, J Steen; Lin, Andy; Liu, Suxing; Lizárraga, César; Lorence, Argelia; Miller, Michael; Platon, Eric; Tessman, Monica; Sax, Tony

    2017-01-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  10. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gehan, Malia A.; Fahlgren, Noah; Abbasi, Arash

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  11. PlantCV v2: Image analysis software for high-throughput plant phenotyping

    DOE PAGES

    Gehan, Malia A.; Fahlgren, Noah; Abbasi, Arash; ...

    2017-12-01

    Systems for collecting image data in conjunction with computer vision techniques are a powerful tool for increasing the temporal resolution at which plant phenotypes can be measured non-destructively. Computational tools that are flexible and extendable are needed to address the diversity of plant phenotyping problems. We previously described the Plant Computer Vision (PlantCV) software package, which is an image processing toolkit for plant phenotyping analysis. The goal of the PlantCV project is to develop a set of modular, reusable, and repurposable tools for plant image analysis that are open-source and community-developed. Here we present the details and rationale for major developments in the second major release of PlantCV. In addition to overall improvements in the organization of the PlantCV project, new functionality includes a set of new image processing and normalization tools, support for analyzing images that include multiple plants, leaf segmentation, landmark identification tools for morphometrics, and modules for machine learning.

  12. Pressure algorithm for elliptic flow calculations with the PDF method

    NASA Technical Reports Server (NTRS)

    Anand, M. S.; Pope, S. B.; Mongia, H. C.

    1991-01-01

    An algorithm to determine the mean pressure field for elliptic flow calculations with the probability density function (PDF) method is developed and applied. The PDF method is a most promising approach for the computation of turbulent reacting flows. Previous computations of elliptic flows with the method were in conjunction with conventional finite volume based calculations that provided the mean pressure field. The algorithm developed and described here permits the mean pressure field to be determined within the PDF calculations. The PDF method incorporating the pressure algorithm is applied to the flow past a backward-facing step. The results are in good agreement with data for the reattachment length, mean velocities, and turbulence quantities including triple correlations.

  13. METLIN-PC: An applications-program package for problems of mathematical programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pshenichnyi, B.N.; Sobolenko, L.A.; Sosnovskii, A.A.

    1994-05-01

    The METLIN-PC applications-program package (APP) was developed at the V.M. Glushkov Institute of Cybernetics of the Academy of Sciences of Ukraine on IBM PC XT and AT computers. The present version of the package was written in Turbo Pascal and Fortran-77. The METLIN-PC is chiefly designed for the solution of smooth problems of mathematical programming and is a further development of the METLIN prototype, which was created earlier on a BESM-6 computer. The principal property of the previous package is retained - the applications modules employ a single approach based on the linearization method of B.N. Pschenichnyi. Hence the name "METLIN."

  14. Computer-automated tinnitus assessment: noise-band matching, maskability, and residual inhibition.

    PubMed

    Henry, James A; Roberts, Larry E; Ellingson, Roger M; Thielman, Emily J

    2013-06-01

    Psychoacoustic measures of tinnitus typically include loudness and pitch match, minimum masking level (MML), and residual inhibition (RI). We previously developed and documented a computer-automated tinnitus evaluation system (TES) capable of subject-guided loudness and pitch matching. The TES was further developed to conduct computer-aided, subject-guided testing for noise-band matching (NBM), MML, and RI. The purpose of the present study was to document the capability of the upgraded TES to obtain measures of NBM, MML, and RI, and to determine the test-retest reliability of the responses obtained. Three subject-guided, computer-automated testing protocols were developed to conduct NBM. For MML and RI testing, a 2-12 kHz band of noise was used. All testing was repeated during a second session. Subjects meeting study criteria were selected from those who had previously been tested for loudness and pitch matching in our laboratory. A total of 21 subjects completed testing, including seven females and 14 males. The upgraded TES was found to be fairly time efficient. Subjects were generally reliable, both within and between sessions, with respect to the type of stimulus they chose as the best match to their tinnitus. Matching to bandwidth was more variable between measurements, with greater consistency seen for subjects reporting tonal tinnitus or wide-band noisy tinnitus than intermediate types. Between-session repeated MMLs were within 10 dB of each other for all but three of the subjects. Subjects who experienced RI during Session 1 tended to be those who experienced it during Session 2. This study may represent the first time that NBM, MML, and RI audiometric testing results have been obtained entirely through a self-contained, computer-automated system designed specifically for use in the clinic. Future plans include refinements to achieve greater testing efficiency.

  15. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Little, M. M.

    2013-12-01

    NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead, it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore, NASA science computing is a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and it potentially provides an alternative to costly, possibly underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS aims to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate still has to be evaluated by integrating it with real-world operational needs across NASA and developing the associated operational maturity. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications have been demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Sciences Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level, pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective, specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.

  16. Parallel Computational Protein Design.

    PubMed

    Zhou, Yichao; Donald, Bruce R; Zeng, Jianyang

    2017-01-01

    Computational structure-based protein design (CSPD) is an important problem in computational biology, which aims to design or improve a prescribed protein function based on a protein structure template. It provides a practical tool for real-world protein engineering applications. A popular CSPD method that is guaranteed to find the global minimum energy conformation (GMEC) is to combine both dead-end elimination (DEE) and A* tree search algorithms. However, in this framework, the A* search algorithm can run in exponential time in the worst case, which may become the computational bottleneck of a large-scale protein design process. To address this issue, we extend and add a new module to the OSPREY program that was previously developed in the Donald lab (Gainza et al., Methods Enzymol 523:87, 2013) to implement a GPU-based massively parallel A* algorithm for improving the protein design pipeline. By exploiting the modern GPU computational framework and optimizing the computation of the heuristic function for A* search, our new program, called gOSPREY, can provide up to four orders of magnitude speedups in large protein design cases with a small memory overhead compared to the traditional A* search algorithm implementation, while still guaranteeing optimality. In addition, gOSPREY can be configured to run in a bounded-memory mode to tackle problems in which the conformation space is too large and the globally optimal solution could not previously be computed. Furthermore, the GPU-based A* algorithm implemented in the gOSPREY program can be combined with state-of-the-art rotamer pruning algorithms such as iMinDEE (Gainza et al., PLoS Comput Biol 8:e1002335, 2012) and DEEPer (Hallen et al., Proteins 81:18-39, 2013) to also consider continuous backbone and side-chain flexibility.
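
    As context for the A*-with-DEE framework described above, the sketch below is a minimal, generic best-first A* search in pure Python (standard library only). It is not the gOSPREY or OSPREY implementation: in CSPD the states would be partial rotamer assignments and the heuristic an admissible lower bound on the remaining conformation energy, but here both are abstract callables supplied by the caller.

    ```python
    import heapq
    from itertools import count

    def a_star(start, is_goal, successors, heuristic):
        """Generic A* best-first search.

        start      -- initial state (hashable), e.g. an empty partial assignment
        is_goal    -- state -> bool (e.g. all positions assigned)
        successors -- state -> iterable of (next_state, step_cost)
        heuristic  -- admissible lower bound on remaining cost, state -> float
        Returns (path, cost), or (None, inf) if no goal state is reachable.
        """
        tie = count()                       # tie-breaker so unorderable states are never compared
        open_heap = [(heuristic(start), next(tie), 0.0, start, [start])]
        best_g = {start: 0.0}
        while open_heap:
            _, _, g, state, path = heapq.heappop(open_heap)
            if is_goal(state):
                return path, g
            if g > best_g.get(state, float("inf")):
                continue                    # stale queue entry
            for nxt, cost in successors(state):
                g2 = g + cost
                if g2 < best_g.get(nxt, float("inf")):
                    best_g[nxt] = g2
                    heapq.heappush(open_heap, (g2 + heuristic(nxt), next(tie), g2, nxt, path + [nxt]))
        return None, float("inf")
    ```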

  17. Computational and transcriptional evidence for microRNAs in the honey bee genome

    PubMed Central

    Weaver, Daniel B; Anzola, Juan M; Evans, Jay D; Reid, Jeffrey G; Reese, Justin T; Childs, Kevin L; Zdobnov, Evgeny M; Samanta, Manoj P; Miller, Jonathan; Elsik, Christine G

    2007-01-01

    Background Non-coding microRNAs (miRNAs) are key regulators of gene expression in eukaryotes. Insect miRNAs help regulate the levels of proteins involved with development, metabolism, and other life history traits. The recently sequenced honey bee genome provides an opportunity to detect novel miRNAs in both this species and others, and to begin to infer the roles of miRNAs in honey bee development. Results Three independent computational surveys of the assembled honey bee genome identified a total of 65 non-redundant candidate miRNAs, several of which appear to have previously unrecognized orthologs in the Drosophila genome. A subset of these candidate miRNAs were screened for expression by quantitative RT-PCR and/or genome tiling arrays and most predicted miRNAs were confirmed as being expressed in at least one honey bee tissue. Interestingly, the transcript abundance for several known and novel miRNAs displayed caste or age-related differences in honey bees. Genes in proximity to miRNAs in the bee genome are disproportionately associated with the Gene Ontology terms 'physiological process', 'nucleus' and 'response to stress'. Conclusion Computational approaches successfully identified miRNAs in the honey bee and indicated previously unrecognized miRNAs in the well-studied Drosophila melanogaster genome despite the 280 million year distance between these insects. Differentially transcribed miRNAs are likely to be involved in regulating honey bee development, and arguably in the extreme developmental switch between sterile worker bees and highly fertile queens. PMID:17543122

  18. Novel metaheuristic for parameter estimation in nonlinear dynamic biological systems

    PubMed Central

    Rodriguez-Fernandez, Maria; Egea, Jose A; Banga, Julio R

    2006-01-01

    Background We consider the problem of parameter estimation (model calibration) in nonlinear dynamic models of biological systems. Due to the frequent ill-conditioning and multi-modality of many of these problems, traditional local methods usually fail (unless initialized with very good guesses of the parameter vector). In order to surmount these difficulties, global optimization (GO) methods have been suggested as robust alternatives. Currently, deterministic GO methods cannot solve problems of realistic size within this class in reasonable computation times. In contrast, certain types of stochastic GO methods have shown promising results, although the computational cost remains large. Rodriguez-Fernandez and coworkers have presented hybrid stochastic-deterministic GO methods which could reduce computation time by one order of magnitude while guaranteeing robustness. Our goal here was to further reduce the computational effort without losing robustness. Results We have developed a new procedure based on the scatter search methodology for nonlinear optimization of dynamic models of arbitrary (or even unknown) structure (i.e. black-box models). In this contribution, we describe and apply this novel metaheuristic, inspired by recent developments in the field of operations research, to a set of complex identification problems and we make a critical comparison with respect to the previous (above-mentioned) successful methods. Conclusion Robust and efficient methods for parameter estimation are of key importance in systems biology and related areas. The new metaheuristic presented in this paper aims to ensure the proper solution of these problems by adopting a global optimization approach, while keeping the computational effort under reasonable values. This new metaheuristic was applied to a set of three challenging parameter estimation problems of nonlinear dynamic biological systems, outperforming very significantly all the methods previously used for these benchmark problems. PMID:17081289

  19. Novel metaheuristic for parameter estimation in nonlinear dynamic biological systems.

    PubMed

    Rodriguez-Fernandez, Maria; Egea, Jose A; Banga, Julio R

    2006-11-02

    We consider the problem of parameter estimation (model calibration) in nonlinear dynamic models of biological systems. Due to the frequent ill-conditioning and multi-modality of many of these problems, traditional local methods usually fail (unless initialized with very good guesses of the parameter vector). In order to surmount these difficulties, global optimization (GO) methods have been suggested as robust alternatives. Currently, deterministic GO methods cannot solve problems of realistic size within this class in reasonable computation times. In contrast, certain types of stochastic GO methods have shown promising results, although the computational cost remains large. Rodriguez-Fernandez and coworkers have presented hybrid stochastic-deterministic GO methods which could reduce computation time by one order of magnitude while guaranteeing robustness. Our goal here was to further reduce the computational effort without losing robustness. We have developed a new procedure based on the scatter search methodology for nonlinear optimization of dynamic models of arbitrary (or even unknown) structure (i.e. black-box models). In this contribution, we describe and apply this novel metaheuristic, inspired by recent developments in the field of operations research, to a set of complex identification problems and we make a critical comparison with respect to the previous (above-mentioned) successful methods. Robust and efficient methods for parameter estimation are of key importance in systems biology and related areas. The new metaheuristic presented in this paper aims to ensure the proper solution of these problems by adopting a global optimization approach, while keeping the computational effort under reasonable values. This new metaheuristic was applied to a set of three challenging parameter estimation problems of nonlinear dynamic biological systems, outperforming very significantly all the methods previously used for these benchmark problems.
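
    To make the scatter-search pattern referenced in the two records above concrete, here is a deliberately small, illustrative sketch of its main loop (diversify, combine reference solutions, apply a cheap improvement step, update the reference set). It is a toy written for this summary, not the authors' method or code, and the calibration problem at the end uses hypothetical data.

    ```python
    import numpy as np

    def scatter_search(f, bounds, ref_size=10, n_iter=50, seed=0):
        """Tiny scatter-search-style global minimizer (illustrative only).

        f      -- objective, maps a 1-D numpy array to a float
        bounds -- (lower, upper) arrays defining a box-constrained search space
        """
        rng = np.random.default_rng(seed)
        lo, hi = (np.asarray(b, dtype=float) for b in bounds)

        # 1. Diversification: random pool, keep the best points as the reference set.
        pool = rng.uniform(lo, hi, size=(10 * ref_size, lo.size))
        ref = pool[np.argsort([f(x) for x in pool])[:ref_size]]

        for _ in range(n_iter):
            # 2. Combination: (extrapolated) line combinations of reference pairs.
            children = []
            for i in range(ref_size):
                for j in range(i + 1, ref_size):
                    w = rng.uniform(-0.5, 1.5)
                    children.append(np.clip(ref[i] + w * (ref[j] - ref[i]), lo, hi))
            # 3. Improvement: a cheap local perturbation stands in for a real local search.
            children = [np.clip(c + rng.normal(0.0, 0.01 * (hi - lo)), lo, hi) for c in children]
            # 4. Reference-set update: keep the best solutions seen so far.
            candidates = np.vstack([ref] + children)
            ref = candidates[np.argsort([f(x) for x in candidates])[:ref_size]]

        return ref[0], f(ref[0])

    # Toy calibration problem: recover amplitude and rate of y = a * exp(-k * t) from noiseless data.
    t = np.linspace(0.0, 5.0, 30)
    y = 1.2 * np.exp(-0.7 * t)
    sse = lambda p: float(np.sum((p[0] * np.exp(-p[1] * t) - y) ** 2))
    print(scatter_search(sse, (np.zeros(2), np.full(2, 3.0))))
    ```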

  20. Improving Distributed Diagnosis Through Structural Model Decomposition

    NASA Technical Reports Server (NTRS)

    Bregon, Anibal; Daigle, Matthew John; Roychoudhury, Indranil; Biswas, Gautam; Koutsoukos, Xenofon; Pulido, Belarmino

    2011-01-01

    Complex engineering systems require efficient fault diagnosis methodologies, but centralized approaches do not scale well, and this motivates the development of distributed solutions. This work presents an event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, by using the structural model decomposition capabilities provided by Possible Conflicts. We develop a distributed diagnosis algorithm that uses residuals computed by extending Possible Conflicts to build local event-based diagnosers based on global diagnosability analysis. The proposed approach is applied to a multitank system, and results demonstrate an improvement in the design of local diagnosers. Since local diagnosers use only a subset of the residuals, and use subsystem models to compute residuals (instead of the global system model), the local diagnosers are more efficient than previously developed distributed approaches.

  1. Changing computing paradigms towards power efficiency

    PubMed Central

    Klavík, Pavel; Malossi, A. Cristiano I.; Bekas, Costas; Curioni, Alessandro

    2014-01-01

    Power awareness is fast becoming immensely important in computing, ranging from the traditional high-performance computing applications to the new generation of data-centric workloads. In this work, we describe our efforts towards a power-efficient computing paradigm that combines low- and high-precision arithmetic. We showcase our ideas for the widely used kernel of solving systems of linear equations that finds numerous applications in scientific and engineering disciplines as well as in large-scale data analytics, statistics and machine learning. Towards this goal, we developed tools for the seamless power profiling of applications at a fine-grain level. In addition, we verify here previous work on post-FLOPS/W metrics and show that these can shed much more light on the power/energy profile of important applications. PMID:24842033
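
    One common way to combine low- and high-precision arithmetic for linear systems, in the spirit of the kernel discussed above, is iterative refinement: solve in low precision, then correct with residuals accumulated in high precision. The sketch below is a minimal NumPy illustration of that pattern under those assumptions, not the paper's implementation (a real code would factor the low-precision matrix once and reuse the factors), and the test matrix is hypothetical.

    ```python
    import numpy as np

    def mixed_precision_solve(A, b, tol=1e-10, max_iter=20):
        """Iterative refinement: solve in float32, correct with float64 residuals."""
        A32 = A.astype(np.float32)                            # "cheap" low-precision operator
        x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
        for _ in range(max_iter):
            r = b - A @ x                                     # residual in full (float64) precision
            if np.linalg.norm(r) <= tol * np.linalg.norm(b):
                break
            dx = np.linalg.solve(A32, r.astype(np.float32))   # cheap correction solve
            x += dx.astype(np.float64)                        # accumulate in high precision
        return x

    # Well-conditioned test problem (hypothetical data).
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 200)) + 200.0 * np.eye(200)
    b = rng.standard_normal(200)
    x = mixed_precision_solve(A, b)
    print(np.linalg.norm(A @ x - b))
    ```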

  2. The cardiac muscle duplex as a method to study myocardial heterogeneity

    PubMed Central

    Solovyova, O.; Katsnelson, L.B.; Konovalov, P.V.; Kursanov, A.G.; Vikulova, N.A.; Kohl, P.; Markhasin, V.S.

    2014-01-01

    This paper reviews the development and application of paired muscle preparations, called duplex, for the investigation of mechanisms and consequences of intra-myocardial electro-mechanical heterogeneity. We illustrate the utility of the underlying combined experimental and computational approach for conceptual development and integration of basic science insight with clinically relevant settings, using previously published and new data. Directions for further study are identified. PMID:25106702

  3. Mariner Mars 1971 science operational support equipment

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The Mariner Mars 1971 science operational support equipment (SOSE) was developed to support the checkout of the proof test model and flight spacecraft. The test objectives of the SOSE and how these objectives were implemented are discussed. Attention is focused on the computer portion of the SOSE, since incorporation of a computer in ground checkout equipment represents a major departure from the support equipment concepts previously used. A functional description of the major hardware elements contained in the SOSE is also included, along with the operational performance of the SOSE during spacecraft testing.

  4. Design, processing and testing of LSI arrays, hybrid microelectronics task

    NASA Technical Reports Server (NTRS)

    Himmel, R. P.; Stuhlbarg, S. M.; Ravetti, R. G.; Zulueta, P. J.; Rothrock, C. W.

    1979-01-01

    Mathematical cost models previously developed for hybrid microelectronic subsystems were refined and expanded. Rework terms related to substrate fabrication, nonrecurring developmental and manufacturing operations, and prototype production are included. Sample computer programs were written to demonstrate hybrid microelectronic applications of these cost models. Computer programs were generated to calculate and analyze values for the total microelectronics costs. Large-scale integrated (LSI) chips utilizing tape chip carrier technology were studied. The feasibility of interconnecting arrays of LSI chips utilizing tape chip carrier and semiautomatic wire bonding technology was demonstrated.

  5. On singlet s-wave electron-hydrogen scattering.

    NASA Technical Reports Server (NTRS)

    Madan, R. N.

    1973-01-01

    Discussion of various zeroth-order approximations to s-wave scattering of electrons by hydrogen atoms below the first excitation threshold. The formalism previously developed by the author (1967, 1968) is applied to Feshbach operators to derive integro-differential equations, with the optical-potential set equal to zero, for the singlet and triplet cases. Phase shifts of s-wave scattering are computed in the zeroth-order approximation of the Feshbach operator method and in the static-exchange approximation. It is found that the convergence of numerical computations is faster in the former approximation than in the latter.

  6. Trends in micro- and nanoComputed Tomography 2008-2010

    NASA Astrophysics Data System (ADS)

    Stock, S. R.

    2010-09-01

    Trends in micro- and nanoComputed Tomography (CT) from January 2008 through July 2010 are the subject of this brief report, which takes up where a previous report in Developments in X-ray Tomography VI (2008) concluded. First, the number of systems operating world-wide is estimated. The main focus is on what searches of three citation indices (Web of Science, Compendex and PubMed) reveal about the field of micro- and nanoCT. Given the research-field-dependent and disparate terminology used by investigators, searches were on "microtomography", "microCT" and "synchrotron tomography".

  7. CFD Modeling of Free-Piston Stirling Engines

    NASA Technical Reports Server (NTRS)

    Ibrahim, Mounir B.; Zhang, Zhi-Guo; Tew, Roy C., Jr.; Gedeon, David; Simon, Terrence W.

    2001-01-01

    NASA Glenn Research Center (GRC) is funding Cleveland State University (CSU) to develop a reliable Computational Fluid Dynamics (CFD) code that can predict engine performance with the goal of significant improvements in accuracy when compared to one-dimensional (1-D) design code predictions. The funding also includes conducting code validation experiments at both the University of Minnesota (UMN) and CSU. In this paper a brief description of the work-in-progress is provided in the two areas (CFD and Experiments). Also, previous test results are compared with computational data obtained using (1) a 2-D CFD code obtained from Dr. Georg Scheuerer and further developed at CSU and (2) a multidimensional commercial code CFD-ACE+. The test data and computational results are for (1) a gas spring and (2) a single piston/cylinder with attached annular heat exchanger. The comparisons among the codes are discussed. The paper also discusses plans for conducting code validation experiments at CSU and UMN.

  8. A composite computational model of liver glucose homeostasis. I. Building the composite model.

    PubMed

    Hetherington, J; Sumner, T; Seymour, R M; Li, L; Rey, M Varela; Yamaji, S; Saffrey, P; Margoninski, O; Bogle, I D L; Finkelstein, A; Warner, A

    2012-04-07

    A computational model of the glucagon/insulin-driven liver glucohomeostasis function, focusing on the buffering of glucose into glycogen, has been developed. The model exemplifies an 'engineering' approach to modelling in systems biology, and was produced by linking together seven component models of separate aspects of the physiology. The component models use a variety of modelling paradigms and degrees of simplification. Model parameters were determined by an iterative hybrid of fitting to high-scale physiological data, and determination from small-scale in vitro experiments or molecular biological techniques. The component models were not originally designed for inclusion within such a composite model, but were integrated, with modification, using our published modelling software and computational frameworks. This approach facilitates the development of large and complex composite models, although, inevitably, some compromises must be made when composing the individual models. Composite models of this form have not previously been demonstrated.

  9. A reassessment of Galileo radiation exposures in the Jupiter magnetosphere.

    PubMed

    Atwell, William; Townsend, Lawrence; Miller, Thomas; Campbell, Christina

    2005-01-01

    Earlier particle experiments in the 1970s on Pioneer-10 and -11 and Voyager-1 and -2 provided Jupiter flyby particle data, which were used by Divine and Garrett to develop the first Jupiter trapped radiation environment model. This model was used to establish a baseline radiation effects design limit for the Galileo onboard electronics. Recently, Garrett et al. have developed an updated Galileo Interim Radiation Environment (GIRE) model based on Galileo electron data. In this paper, we have used the GIRE model to reassess the computed radiation exposures and dose effects for Galileo. The 34-orbit 'as flown' Galileo trajectory data and the updated GIRE model were used to compute the electron and proton spectra for each of the 34 orbits. The total ionisation doses of electrons and protons have been computed based on a parametric shielding configuration, and these results are compared with previously published results.

  10. Design of a modular digital computer system, CDRL no. D001, final design plan

    NASA Technical Reports Server (NTRS)

    Easton, R. A.

    1975-01-01

    The engineering breadboard implementation for the CDRL no. D001 modular digital computer system developed during design of the logic system was documented. This effort followed the architecture study completed and documented previously, and was intended to verify the concepts of a fault tolerant, automatically reconfigurable, modular version of the computer system conceived during the architecture study. The system has a microprogrammed 32 bit word length, general register architecture and an instruction set consisting of a subset of the IBM System 360 instruction set plus additional fault tolerance firmware. The following areas were covered: breadboard packaging, central control element, central processing element, memory, input/output processor, and maintenance/status panel and electronics.

  11. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; Russell, Samuel S.

    2012-01-01

    Objective: Develop a software application utilizing high performance computing techniques, including general purpose graphics processing units (GPGPUs), for the analysis and visualization of large thermographic data sets. Over the past several years, an increasing effort among scientists and engineers to utilize graphics processing units (GPUs) in a more general-purpose fashion is allowing for previously unobtainable levels of computation by individual workstations. As data sets grow, the methods needed to work with them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU, which yield significant increases in performance. These common computations have high degrees of data parallelism, that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Image processing is one area where GPUs are being used to greatly increase the performance of certain analysis and visualization techniques.
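
    As a small illustration of the data parallelism described above, the snippet below applies the same per-pixel computation independently across a stack of hypothetical thermographic frames; on a GPU each pixel would map to a thread, and NumPy vectorization stands in for that kernel here. The data and the compositing step are assumptions made for the example.

    ```python
    import numpy as np

    # Hypothetical stack of thermographic frames: (n_frames, rows, cols).
    frames = np.random.default_rng(0).normal(300.0, 2.0, size=(100, 240, 320)).astype(np.float32)

    # Data-parallel compositing: each pixel is processed independently of all others.
    background = frames[:10].mean(axis=0)   # reference (pre-excitation) frames
    contrast = frames - background          # element-wise, no cross-pixel dependence
    composite = contrast.max(axis=0)        # per-pixel peak contrast over time
    print(composite.shape, float(composite.mean()))
    ```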

  12. The application of contraction theory to an iterative formulation of electromagnetic scattering

    NASA Technical Reports Server (NTRS)

    Brand, J. C.; Kauffman, J. F.

    1985-01-01

    Contraction theory is applied to an iterative formulation of electromagnetic scattering from periodic structures and a computational method for insuring convergence is developed. A short history of spectral (or k-space) formulation is presented with an emphasis on application to periodic surfaces. To insure a convergent solution of the iterative equation, a process called the contraction corrector method is developed. Convergence properties of previously presented iterative solutions to one-dimensional problems are examined utilizing contraction theory and the general conditions for achieving a convergent solution are explored. The contraction corrector method is then applied to several scattering problems including an infinite grating of thin wires with the solution data compared to previous works.

  13. Microstructure development in Kolmogorov, Johnson-Mehl, and Avrami nucleation and growth kinetics

    NASA Astrophysics Data System (ADS)

    Pineda, Eloi; Crespo, Daniel

    1999-08-01

    A statistical model with the ability to evaluate the microstructure developed in nucleation and growth kinetics is built in the framework of the Kolmogorov, Johnson-Mehl, and Avrami theory. A populational approach is used to compute the observed grain-size distribution. The impingement process which delays grain growth is analyzed, and the effective growth rate of each population is estimated considering the previous grain history. The proposed model is integrated for a wide range of nucleation and growth protocols, including constant nucleation, pre-existing nuclei, and intermittent nucleation with interface or diffusion-controlled grain growth. The results are compared with Monte Carlo simulations, giving quantitative agreement even in cases where previous models fail.
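
    For reference, the mean-field relation that this populational model generalizes is the classical KJMA (Avrami) link between the transformed fraction and the extended volume fraction; for a constant nucleation rate I and constant interface-controlled growth velocity G in three dimensions it gives the familiar Avrami exponent of 4. This is standard background, not the grain-size-resolved result of the paper.

    ```latex
    X(t) = 1 - \exp\!\left[-X_{\mathrm{ext}}(t)\right],
    \qquad
    X_{\mathrm{ext}}(t) = \frac{\pi}{3}\, I\, G^{3}\, t^{4}
    \quad\Longrightarrow\quad
    X(t) = 1 - \exp\!\left(-K\, t^{4}\right).
    ```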

  14. Achieving TASAR Operational Readiness

    NASA Technical Reports Server (NTRS)

    Wing, David J.

    2015-01-01

    NASA has been developing and testing the Traffic Aware Strategic Aircrew Requests (TASAR) concept for aircraft operations featuring a NASA-developed cockpit automation tool, the Traffic Aware Planner (TAP), which computes traffic/hazard-compatible route changes to improve flight efficiency. The TAP technology is anticipated to save fuel and flight time and thereby provide immediate and pervasive benefits to the aircraft operator, as well as improving flight schedule compliance, passenger comfort, and pilot and controller workload. Previous work has indicated the potential for significant benefits for TASAR-equipped aircraft, and a flight trial of the TAP software application in the National Airspace System has demonstrated its technical viability. This paper reviews previous and ongoing activities to prepare TASAR for operational use.

  15. GiveMe Shelter: A People-Centred Design Process for Promoting Independent Inquiry-Led Learning in Engineering

    ERIC Educational Resources Information Center

    Dyer, Mark; Grey, Thomas; Kinnane, Oliver

    2017-01-01

    It has become increasingly common for tasks traditionally carried out by engineers to be undertaken by technicians and technologist with access to sophisticated computers and software that can often perform complex calculations that were previously the responsibility of engineers. Not surprisingly, this development raises serious questions about…

  16. Louisiana's sixth forest survey

    Treesearch

    James F. Rosson

    1993-01-01

    The sixth Louisiana forest survey was completed in November 1991. Previous surveys were done in 1935, 1954, 1964, 1974, and 1984. Information derived from the forest survey has become more and more detailed and comprehensive since the first survey in 1935 as a direct result of enhanced sampling techniques, expanded administrative budgets, and development of computer...

  17. How Readability and Topic Incidence Relate to Performance on Mathematics Story Problems in Computer-Based Curricula

    ERIC Educational Resources Information Center

    Walkington, Candace; Clinton, Virginia; Ritter, Steven N.; Nathan, Mitchell J.

    2015-01-01

    Solving mathematics story problems requires text comprehension skills. However, previous studies have found few connections between traditional measures of text readability and performance on story problems. We hypothesized that recently developed measures of readability and topic incidence measured by text-mining tools may illuminate associations…

  18. easyCBM Norms. 2014 Edition. Technical Report #1409

    ERIC Educational Resources Information Center

    Saven, Jessica L.; Tindal, Gerald; Irvin, P. Shawn; Farley, Dan; Alonzo, Julie

    2014-01-01

    Previous norms for the easyCBM assessment system were computed using scores from all students who took each measure for every grade and benchmark season (fall, winter, and spring). During the 2013-14 school year, new national norms were developed to more accurately (proportionately) represent reading and mathematics performance by two variables:…
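
    Norms of this kind are essentially percentile cut scores computed within each grade-by-season cell. The sketch below shows that computation on hypothetical data; the scores, grades, and percentile choices are assumptions for illustration, not the easyCBM norming sample.

    ```python
    import numpy as np

    # Hypothetical raw scores keyed by (grade, season); real norms would come from
    # the national sample described above, not from random numbers.
    rng = np.random.default_rng(0)
    scores = {("grade 3", "fall"): rng.normal(20, 5, 1000).round(),
              ("grade 3", "winter"): rng.normal(24, 5, 1000).round()}

    def norm_table(raw, percentiles=(10, 25, 50, 75, 90)):
        """Percentile cut scores for one grade-by-season cell."""
        return dict(zip(percentiles, np.percentile(raw, percentiles)))

    for cell, raw in scores.items():
        print(cell, norm_table(raw))
    ```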

  19. Domain-Specific QSAR Models for Identifying Potential Estrogenic Activity of Phenols (FutureTox III)

    EPA Science Inventory

    Computational tools can be used for efficient evaluation of untested chemicals for their ability to disrupt the endocrine system. We have employed previously developed global QSAR models that were trained and validated on the ToxCast/Tox21 ER assay data for virtual screening of a...

  20. Fusion Simulation Project Workshop Report

    NASA Astrophysics Data System (ADS)

    Kritz, Arnold; Keyes, David

    2009-03-01

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved 46 physicists, applied mathematicians and computer scientists, from 21 institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a 3-day workshop in May 2007.

  1. Big Data: An Opportunity for Collaboration with Computer Scientists on Data-Driven Science

    NASA Astrophysics Data System (ADS)

    Baru, C.

    2014-12-01

    Big data technologies are evolving rapidly, driven by the need to manage ever-increasing amounts of historical data; process relentless streams of human and machine-generated data; and integrate data of heterogeneous structure from extremely heterogeneous sources of information. Big data is inherently an application-driven problem. Developing the right technologies requires an understanding of the applications domain. An intriguing aspect of this phenomenon, though, is that the availability of the data itself enables new applications not previously conceived of! In this talk, we will discuss how the big data phenomenon creates an imperative for collaboration among domain scientists (in this case, geoscientists) and computer scientists. Domain scientists provide the application requirements as well as insights about the data involved, while computer scientists help assess whether problems can be solved with currently available technologies or require adaptation of existing technologies and/or development of new technologies. The synergy can create vibrant collaborations potentially leading to new science insights as well as development of new data technologies and systems. The area of interface between geosciences and computer science, also referred to as geoinformatics, is, we believe, a fertile area for interdisciplinary research.

  2. Combining wet and dry research: experience with model development for cardiac mechano-electric structure-function studies

    PubMed Central

    Quinn, T. Alexander; Kohl, Peter

    2013-01-01

    Since the development of the first mathematical cardiac cell model 50 years ago, computational modelling has become an increasingly powerful tool for the analysis of data and for the integration of information related to complex cardiac behaviour. Current models build on decades of iteration between experiment and theory, representing a collective understanding of cardiac function. All models, whether computational, experimental, or conceptual, are simplified representations of reality and, like tools in a toolbox, suitable for specific applications. Their range of applicability can be explored (and expanded) by iterative combination of ‘wet’ and ‘dry’ investigation, where experimental or clinical data are used to first build and then validate computational models (allowing integration of previous findings, quantitative assessment of conceptual models, and projection across relevant spatial and temporal scales), while computational simulations are utilized for plausibility assessment, hypotheses-generation, and prediction (thereby defining further experimental research targets). When implemented effectively, this combined wet/dry research approach can support the development of a more complete and cohesive understanding of integrated biological function. This review illustrates the utility of such an approach, based on recent examples of multi-scale studies of cardiac structure and mechano-electric function. PMID:23334215

  3. Validation of a computer case definition for sudden cardiac death in opioid users.

    PubMed

    Kawai, Vivian K; Murray, Katherine T; Stein, C Michael; Cooper, William O; Graham, David J; Hall, Kathi; Ray, Wayne A

    2012-08-31

    To facilitate the use of automated databases for studies of sudden cardiac death, we previously developed a computerized case definition that had a positive predictive value between 86% and 88%. However, the definition has not been specifically validated for prescription opioid users, for whom out-of-hospital overdose deaths may be difficult to distinguish from sudden cardiac death. We assembled a cohort of persons 30-74 years of age prescribed propoxyphene or hydrocodone who had no life-threatening non-cardiovascular illness, diagnosed drug abuse, residence in a nursing home in the past year, or hospital stay within the past 30 days. Medical records were sought for a sample of 140 cohort deaths within 30 days of a prescription fill meeting the computer case definition. Of the 140 sampled deaths, 81 were adjudicated; 73 (90%) were sudden cardiac deaths. Two deaths had possible opioid overdose; after removing these two the positive predictive value was 88%. These findings are consistent with our previous validation studies and suggest the computer case definition of sudden cardiac death is a useful tool for pharmacoepidemiologic studies of opioid analgesics.
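
    The positive-predictive-value arithmetic reported above can be reproduced directly; the snippet assumes the two possible overdose deaths are removed from the numerator only, which is the reading consistent with the stated 88%.

    ```python
    # Positive predictive value of the computer case definition, from the figures above.
    adjudicated = 81
    sudden_cardiac = 73
    ppv_all = sudden_cardiac / adjudicated                       # ~0.90
    ppv_excluding_overdose = (sudden_cardiac - 2) / adjudicated  # ~0.88
    print(round(ppv_all, 2), round(ppv_excluding_overdose, 2))
    ```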

  4. An automated method to find reaction mechanisms and solve the kinetics in organometallic catalysis.

    PubMed

    Varela, J A; Vázquez, S A; Martínez-Núñez, E

    2017-05-01

    A novel computational method is proposed in this work for use in discovering reaction mechanisms and solving the kinetics of transition metal-catalyzed reactions. The method does not rely on either chemical intuition or assumed a priori mechanisms, and it works in a fully automated fashion. Its core is a procedure, recently developed by one of the authors, that combines accelerated direct dynamics with an efficient geometry-based post-processing algorithm to find transition states (Martinez-Nunez, E., J. Comput. Chem. 2015, 36, 222-234). In the present work, several auxiliary tools have been added to deal with the specific features of transition metal catalytic reactions. As a test case, we chose the cobalt-catalyzed hydroformylation of ethylene because of its well-established mechanism, and the fact that it has already been used in previous automated computational studies. Besides the generally accepted mechanism of Heck and Breslow, several side reactions, such as hydrogenation of the alkene, emerged from our calculations. Additionally, the calculated rate law for the hydroformylation reaction agrees reasonably well with those obtained in previous experimental and theoretical studies.

  5. Geant4 Computing Performance Benchmarking and Monitoring

    DOE PAGES

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; ...

    2015-12-23

    Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, include FAST, IgProf and Open|Speedshop. In conclusion, the scalability of the CPU time and memory performance in multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.

  6. Supplemental computational phantoms to estimate out-of-field absorbed dose in photon radiotherapy

    NASA Astrophysics Data System (ADS)

    Gallagher, Kyle J.; Tannous, Jaad; Nabha, Racile; Feghali, Joelle Ann; Ayoub, Zeina; Jalbout, Wassim; Youssef, Bassem; Taddei, Phillip J.

    2018-01-01

    The purpose of this study was to develop a straightforward method of supplementing patient anatomy and estimating out-of-field absorbed dose for a cohort of pediatric radiotherapy patients with limited recorded anatomy. A cohort of nine children, aged 2-14 years, who received 3D conformal radiotherapy for low-grade localized brain tumors (LBTs), was randomly selected for this study. These patients’ computed tomography simulation image sets covered the cranium only. To approximate their missing anatomy, we supplemented the LBT patients’ image sets with computed tomography images of patients from a previous study who had larger scan extents, were matched by sex, height, and mass, and for whom contours of organs at risk for radiogenic cancer had already been delineated. Rigid fusion was performed between the LBT patients’ data and that of the supplemental computational phantoms using commercial software and in-house codes. In-field dose was calculated with a clinically commissioned treatment planning system, and out-of-field dose was estimated with a previously developed analytical model that was re-fit with parameters based on new measurements for intracranial radiotherapy. Mean doses greater than 1 Gy were found in the red bone marrow, remainder, thyroid, and skin of the patients in this study. Mean organ doses between 150 mGy and 1 Gy were observed in the breast tissue of the girls and lungs of all patients. Distant organs, i.e. prostate, bladder, uterus, and colon, received mean organ doses less than 150 mGy. The mean organ doses of the younger, smaller LBT patients (0-4 years old) were a factor of 2.4 greater than those of the older, larger patients (8-12 years old). Our findings demonstrated the feasibility of a straightforward method of applying supplemental computational phantoms and dose-calculation models to estimate absorbed dose for a set of children of various ages who received radiotherapy and for whom anatomies were largely missing in their original computed tomography simulations.
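
    The out-of-field dose model mentioned above is a specific analytical function re-fit to new measurements; as a generic illustration of that fitting step only, the sketch below fits a simple exponential falloff of dose with distance from the field edge to hypothetical data. Both the functional form and the numbers are assumptions, not the authors' model or measurements.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical out-of-field measurements: distance from the field edge (cm)
    # versus absorbed dose per unit prescribed dose (mGy/Gy).
    distance = np.array([5.0, 10.0, 15.0, 20.0, 30.0, 40.0])
    dose = np.array([40.0, 18.0, 9.0, 4.5, 1.4, 0.5])

    def falloff(d, a, mu):
        """Assumed exponential falloff with distance; not the published model."""
        return a * np.exp(-mu * d)

    (a, mu), _ = curve_fit(falloff, distance, dose, p0=(100.0, 0.1))
    print(f"a = {a:.1f} mGy/Gy, mu = {mu:.3f} 1/cm")
    ```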

  7. A Computing Method for Sound Propagation Through a Nonuniform Jet Stream

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Liu, C. H.

    1974-01-01

    Understanding the principles of jet noise propagation is an essential ingredient of systematic noise reduction research. High speed computer methods offer a unique potential for dealing with complex real life physical systems whereas analytical solutions are restricted to sophisticated idealized models. The classical formulation of sound propagation through a jet flow was found to be inadequate for computer solutions and a more suitable approach was needed. Previous investigations selected the phase and amplitude of the acoustic pressure as dependent variables requiring the solution of a system of nonlinear algebraic equations. The nonlinearities complicated both the analysis and the computation. A reformulation of the convective wave equation in terms of a new set of dependent variables is developed with a special emphasis on its suitability for numerical solutions on fast computers. The technique is very attractive because the resulting equations are linear in nonwaving variables. The computer solution to such a linear system of algebraic equations may be obtained by well-defined and direct means which are conservative of computer time and storage space. Typical examples are illustrated and computational results are compared with available numerical and experimental data.

  8. Design Criteria for Low Profile Flange Calculations

    NASA Technical Reports Server (NTRS)

    Leimbach, K. R.

    1973-01-01

    An analytical method and a design procedure to develop flanged separable pipe connectors are discussed. A previously established algorithm is the basis for calculating low profile flanges. The characteristics and advantages of the low profile flange are analyzed. The use of aluminum, titanium, and plastics for flange materials is described. Mathematical models are developed to show the mechanical properties of various flange configurations. A computer program for determining the structural stability of the flanges is described.

  9. Dual-energy contrast-enhanced spectral mammography (CESM).

    PubMed

    Daniaux, Martin; De Zordo, Tobias; Santner, Wolfram; Amort, Birgit; Koppelstätter, Florian; Jaschke, Werner; Dromain, Clarisse; Oberaigner, Willi; Hubalek, Michael; Marth, Christian

    2015-10-01

    Dual-energy contrast-enhanced mammography is one of the latest developments in breast care. Imaging with contrast agents in breast cancer was already known from previous magnetic resonance imaging and computed tomography studies. However, high costs, limited availability, or high radiation dose led to the development of contrast-enhanced spectral mammography (CESM). We review the current literature, present our experience, discuss the advantages and drawbacks of CESM and look at the future of this innovative technique.

  10. Automatic maintenance payload on board of a Mexican LEO microsatellite

    NASA Astrophysics Data System (ADS)

    Vicente-Vivas, Esaú; García-Nocetti, Fabián; Mendieta-Jiménez, Francisco

    2006-02-01

    A few research institutions in Mexico are working together to finalize the integration of a technological demonstration microsatellite called Satex, aiming at the launch of the first fully domestically designed and manufactured space vehicle. The project is based on technical knowledge gained in previous space experiences, particularly in developing GASCAN automatic experiments for NASA's space shuttle, and in some support obtained from the local team which assembled the México-OSCAR-30 microsatellites. Satex includes three autonomous payloads and a power subsystem, each one with a local microcomputer to provide intelligent and dedicated control. It also contains a flight computer (FC) with a pair of full redundancies. This enables the remote maintenance of processing boards from the ground station. A fourth communications payload depends on the flight computer for control purposes. It was decided to develop a fifth payload for the satellite. It adds value to the available on-board computers and extends the opportunity for a developing country to learn and to generate domestic space technology. Its aim is to provide automatic maintenance capabilities for the most critical on-board computer in order to achieve continuous satellite operations. This paper presents the virtual computer architecture specially developed to provide maintenance capabilities to the flight computer. The architecture is periodically implemented by software with a small number of physical processors (FC processors) and virtual redundancies (payload processors) to emulate a hybrid redundancy computer. Communications among processors are accomplished over a fault-tolerant LAN. This allows a versatile operating behavior in terms of data communication as well as in terms of distributed fault tolerance. Obtained results, payload validation and reliability results are also presented.

  11. Computational Fluid Dynamics at ICMA (Institute for Computational Mathematics and Applications)

    DTIC Science & Technology

    1988-10-18

    Charles A. Hall and Thomas A. Porsching. …of ten ICMA (Institute for Computational Mathematics and Applications) personnel, relating to the general area of computational fluid mechanics…questions raised in the previous subsection. Our previous work in this area concentrated on a study of the differential geometric aspects of the problem

  12. Relativistic, correlation, and polarization effects in two-photon photoionization of Xe

    NASA Astrophysics Data System (ADS)

    Lagutin, B. M.; Petrov, I. D.; Sukhorukov, V. L.; Demekhin, Ph. V.; Knie, A.; Ehresmann, A.

    2017-06-01

    Two-photon ionization of xenon was investigated theoretically for exciting-photon energies from 6.7 to 11.5 eV, which results in the ionization of Xe between the 5p1/2 (13.43 eV) and 5s (23.40 eV) thresholds. We describe the extension of a previously developed computational technique for the inclusion of relativistic effects to calculate energies of intermediate resonance states and cross sections for two-photon ionization. Reasonable consistency of cross sections calculated in length and velocity form was obtained only after considering many-electron correlations. Agreement between calculated and measured resonance energies was found when core polarization was additionally included in the calculations. The presently computed two-photon photoionization cross sections of Xe are compared with the Ar cross sections from our previous work. Photoelectron angular distribution parameters calculated here indicate that intermediate resonances strongly influence the photoelectron angular distribution of Xe.

  13. Development of a Nonequilibrium Radiative Heating Prediction Method for Coupled Flowfield Solutions

    NASA Technical Reports Server (NTRS)

    Hartung, Lin C.

    1991-01-01

    A method for predicting radiative heating and coupling effects in nonequilibrium flow-fields has been developed. The method resolves atomic lines with a minimum number of spectral points, and treats molecular radiation using the smeared band approximation. To further minimize computational time, the calculation is performed on an optimized spectrum, which is computed for each flow condition to enhance spectral resolution. Additional time savings are obtained by performing the radiation calculation on a subgrid optimally selected for accuracy. Representative results from the new method are compared to previous work to demonstrate that the speedup does not cause a loss of accuracy and is sufficient to make coupled solutions practical. The method is found to be a useful tool for studies of nonequilibrium flows.

  14. An Experimental and Computational Analysis of Primary Cilia Deflection Under Fluid Flow

    PubMed Central

    Downs, Matthew E.; Nguyen, An M.; Herzog, Florian A.; Hoey, David A.; Jacobs, Christopher R.

    2013-01-01

    In this work we have developed a novel model of the deflection of primary cilia experiencing fluid flow accounting for phenomena not previously considered. Specifically, we developed a large rotation formulation that accounts for rotation at the base of the cilium, the initial shape of the cilium and fluid drag at high deflection angles. We utilized this model to analyze full three dimensional datasets of primary cilia deflecting under fluid flow acquired with high-speed confocal microscopy. We found a wide variety of previously unreported bending shapes and behaviors. We also analyzed post-flow relaxation patterns. Results from our combined experimental and theoretical approach suggest that the average flexural rigidity of primary cilia might be higher than previously reported (Schwartz et al. 1997). In addition our findings indicate the mechanics of primary cilia are richly varied and mechanisms may exist to alter their mechanical behavior. PMID:22452422

  15. Changing computing paradigms towards power efficiency.

    PubMed

    Klavík, Pavel; Malossi, A Cristiano I; Bekas, Costas; Curioni, Alessandro

    2014-06-28

    Power awareness is fast becoming immensely important in computing, ranging from the traditional high-performance computing applications to the new generation of data-centric workloads. In this work, we describe our efforts towards a power-efficient computing paradigm that combines low- and high-precision arithmetic. We showcase our ideas for the widely used kernel of solving systems of linear equations that finds numerous applications in scientific and engineering disciplines as well as in large-scale data analytics, statistics and machine learning. Towards this goal, we developed tools for the seamless power profiling of applications at a fine-grain level. In addition, we verify here previous work on post-FLOPS/W metrics and show that these can shed much more light on the power/energy profile of important applications.

  16. Software for Building Models of 3D Objects via the Internet

    NASA Technical Reports Server (NTRS)

    Schramer, Tim; Jensen, Jeff

    2003-01-01

    The Virtual EDF Builder (where EDF signifies Electronic Development Fixture) is a computer program that facilitates the use of the Internet for building and displaying digital models of three-dimensional (3D) objects that ordinarily comprise assemblies of solid models created previously by use of computer-aided-design (CAD) programs. The Virtual EDF Builder resides on a Unix-based server computer. It is used in conjunction with a commercially available Web-based plug-in viewer program that runs on a client computer. The Virtual EDF Builder acts as a translator between the viewer program and a database stored on the server. The translation function includes the provision of uniform resource locator (URL) links to other Web-based computer systems and databases. The Virtual EDF builder can be used in two ways: (1) If the client computer is Unix-based, then it can assemble a model locally; the computational load is transferred from the server to the client computer. (2) Alternatively, the server can be made to build the model, in which case the server bears the computational load and the results are downloaded to the client computer or workstation upon completion.

  17. Novel prediction model of renal function after nephrectomy from automated renal volumetry with preoperative multidetector computed tomography (MDCT).

    PubMed

    Isotani, Shuji; Shimoyama, Hirofumi; Yokota, Isao; Noma, Yasuhiro; Kitamura, Kousuke; China, Toshiyuki; Saito, Keisuke; Hisasue, Shin-ichi; Ide, Hisamitsu; Muto, Satoru; Yamaguchi, Raizo; Ukimura, Osamu; Gill, Inderbir S; Horie, Shigeo

    2015-10-01

    The predictive model of postoperative renal function may impact the planning of nephrectomy. The aims were to develop a novel predictive model combining clinical indices with computer volumetry of the preserved renal cortex volume (RCV) on multidetector computed tomography (MDCT), and to prospectively validate the model's performance. A total of 60 patients undergoing radical nephrectomy from 2011 to 2013 participated, comprising a development cohort of 39 patients and an external validation cohort of 21 patients. RCV was calculated by voxel count using software (Vincent, FUJIFILM). Renal function before and after radical nephrectomy was assessed via the estimated glomerular filtration rate (eGFR). Factors affecting postoperative eGFR were examined by regression analysis to develop the novel model for predicting postoperative eGFR with a backward elimination method. The predictive model was externally validated and its performance was compared with that of previously reported models. The postoperative eGFR value was associated with age, preoperative eGFR, preserved renal parenchymal volume (RPV), preserved RCV, % of RPV alteration, and % of RCV alteration (p < 0.01). The variables significantly correlated with %eGFR alteration were %RCV preservation (r = 0.58, p < 0.01) and %RPV preservation (r = 0.54, p < 0.01). We developed our regression model as follows: postoperative eGFR = 57.87 - 0.55(age) - 15.01(body surface area) + 0.30(preoperative eGFR) + 52.92(%RCV preservation). Strong correlation was seen between postoperative eGFR and the calculated estimation model (r = 0.83; p < 0.001). In the external validation cohort (n = 21), our model outperformed the previously reported models. Combining MDCT renal volumetry and clinical indices might yield an important tool for predicting postoperative renal function.
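
    For concreteness, the reported regression can be evaluated directly. The sketch below uses the coefficients quoted in the abstract, treats %RCV preservation as a fraction between 0 and 1 (the abstract does not state the units of this term), and feeds in purely hypothetical patient values.

      # Hedged illustration of the reported regression; coefficients are taken
      # verbatim from the abstract, inputs below are hypothetical.
      def predict_postop_egfr(age, bsa, preop_egfr, rcv_preservation):
          """rcv_preservation: preserved fraction of renal cortex volume (assumed 0-1)."""
          return (57.87
                  - 0.55 * age
                  - 15.01 * bsa
                  + 0.30 * preop_egfr
                  + 52.92 * rcv_preservation)

      # Hypothetical example: 62-year-old, BSA 1.7 m^2, preoperative eGFR 75,
      # 55% of renal cortex volume preserved after nephrectomy.
      print(predict_postop_egfr(age=62, bsa=1.7, preop_egfr=75, rcv_preservation=0.55))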

  18. Monte Carlo treatment planning for molecular targeted radiotherapy within the MINERVA system

    NASA Astrophysics Data System (ADS)

    Lehmann, Joerg; Hartmann Siantar, Christine; Wessol, Daniel E.; Wemple, Charles A.; Nigg, David; Cogliati, Josh; Daly, Tom; Descalle, Marie-Anne; Flickinger, Terry; Pletcher, David; DeNardo, Gerald

    2005-03-01

    The aim of this project is to extend accurate and patient-specific treatment planning to new treatment modalities, such as molecular targeted radiation therapy, incorporating previously crafted and proven Monte Carlo and deterministic computation methods. A flexible software environment is being created that allows planning radiation treatment for these new modalities and combining different forms of radiation treatment with consideration of biological effects. The system uses common input interfaces, medical image sets for definition of patient geometry and dose reporting protocols. Previously, the Idaho National Engineering and Environmental Laboratory (INEEL), Montana State University (MSU) and Lawrence Livermore National Laboratory (LLNL) had accrued experience in the development and application of Monte Carlo based, three-dimensional, computational dosimetry and treatment planning tools for radiotherapy in several specialized areas. In particular, INEEL and MSU have developed computational dosimetry systems for neutron radiotherapy and neutron capture therapy, while LLNL has developed the PEREGRINE computational system for external beam photon-electron therapy. Building on that experience, the INEEL and MSU are developing the MINERVA (modality inclusive environment for radiotherapeutic variable analysis) software system as a general framework for computational dosimetry and treatment planning for a variety of emerging forms of radiotherapy. In collaboration with this development, LLNL has extended its PEREGRINE code to accommodate internal sources for molecular targeted radiotherapy (MTR), and has interfaced it with the plugin architecture of MINERVA. Results from the extended PEREGRINE code have been compared to published data from other codes, and found to be in general agreement (EGS4—2%, MCNP—10%) (Descalle et al 2003 Cancer Biother. Radiopharm. 18 71-9). The code is currently being benchmarked against experimental data. The interpatient variability of the drug pharmacokinetics in MTR can only be properly accounted for by image-based, patient-specific treatment planning, as has been common in external beam radiation therapy for many years. MINERVA offers 3D Monte Carlo-based MTR treatment planning as its first integrated operational capability. The new MINERVA system will ultimately incorporate capabilities for a comprehensive list of radiation therapies. In progress are modules for external beam photon-electron therapy and boron neutron capture therapy (BNCT). Brachytherapy and proton therapy are planned. Through the open application programming interface (API), other groups can add their own modules and share them with the community.

  19. Description of data base management systems activities

    NASA Technical Reports Server (NTRS)

    1983-01-01

    One of the major responsibilities of the JPL Computing and Information Services Office (CISO) is to develop and maintain a JPL plan for providing computing services to the JPL management and administrative community that will lead to improved productivity. The CISO plan to accomplish this objective has been titled 'Management and Administrative Support Systems' (MASS). The MASS plan is based on the continued use of JPL's IBM 3032 computer system for administrative computing and for the MASS functions. The current candidate administrative data base management systems required to support the MASS include ADABASE, Cullinane IDMS, and TOTAL. Previous administrative data base systems have been applied to specific local functions rather than used in a centralized manner with elements common to the many user groups. Limited-capacity data base systems have been installed in microprocessor-based office automation systems in a few Project and Management Offices using Ashton-Tate dBASE II. These experiences, plus other localized in-house DBMS uses, have provided an excellent background for developing user and system requirements for a single DBMS to support the MASS program.

  20. CASKS (Computer Analysis of Storage casKS): A microcomputer based analysis system for storage cask design review. User's manual to Version 1b (including program reference)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, T.F.; Gerhard, M.A.; Trummer, D.J.

    CASKS (Computer Analysis of Storage casKS) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent-fuel storage casks. The bulk of the complete program and this user's manual are based upon the SCANS (Shipping Cask ANalysis System) program previously developed at LLNL. A number of enhancements and improvements were added to the original SCANS program to meet requirements unique to storage casks. CASKS is an easy-to-use system that calculates the global response of storage casks to impact loads, pressure loads and thermal conditions. This provides reviewers with a tool for an independent check on analyses submitted by licensees. CASKS is based on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data are entered through fill-in-the-blank input screens that contain descriptive data requests.

  1. Evaluation of a computer-aided detection algorithm for timely diagnosis of small acute intracranial hemorrhage on computed tomography in a critical care environment

    NASA Astrophysics Data System (ADS)

    Lee, Joon K.; Chan, Tao; Liu, Brent J.; Huang, H. K.

    2009-02-01

    Detection of acute intracranial hemorrhage (AIH) is a primary task in the interpretation of computed tomography (CT) brain scans of patients suffering from acute neurological disturbances or after head trauma. Interpretation can be difficult, especially when the lesion is inconspicuous or the reader is inexperienced. We have previously developed a computer-aided detection (CAD) algorithm to detect small AIH. One hundred and thirty-five small AIH CT studies from the Los Angeles County (LAC) + USC Hospital were identified and matched by age and sex with one hundred and thirty-five normal studies. These cases were then processed using our AIH CAD system to evaluate the efficacy and constraints of the algorithm.

  2. Towards a computational- and algorithmic-level account of concept blending using analogies and amalgams

    NASA Astrophysics Data System (ADS)

    Besold, Tarek R.; Kühnberger, Kai-Uwe; Plaza, Enric

    2017-10-01

    Concept blending - a cognitive process which allows for the combination of certain elements (and their relations) from originally distinct conceptual spaces into a new unified space combining these previously separate elements, and enables reasoning and inference over the combination - is taken as a key element of creative thought and combinatorial creativity. In this article, we summarise our work towards the development of a computational-level and algorithmic-level account of concept blending, combining approaches from computational analogy-making and case-based reasoning (CBR). We present the theoretical background, as well as an algorithmic proposal integrating higher-order anti-unification matching and generalisation from analogy with amalgams from CBR. The feasibility of the approach is then exemplified in two case studies.

  3. Stereolithography: a potential new tool in forensic medicine.

    PubMed

    Dolz, M S; Cina, S J; Smith, R

    2000-06-01

    Stereolithography is a computer-mediated method that can be used to quickly create anatomically correct three-dimensional epoxy and acrylic resin models from various types of medical data. Multiple imaging modalities can be exploited, including computed tomography and magnetic resonance imaging. The technology was first developed and used in 1986 to overcome limitations in previous computer-aided manufacturing/milling techniques. Stereolithography is presently used to accurately reproduce both the external and internal anatomy of body structures. Current medical uses of stereolithography include preoperative planning of orthopedic and maxillofacial surgeries, the fabrication of custom prosthetic devices, and the assessment of the degree of bony and soft-tissue injury caused by trauma. We propose that there is a useful, as yet untapped, potential for this technology in forensic medicine.

  4. Providing Graduated Corrective Feedback in an Intelligent Computer-Assisted Language Learning Environment

    ERIC Educational Resources Information Center

    Ai, Haiyang

    2017-01-01

    Corrective feedback (CF), a response to linguistic errors made by second language (L2) learners, has received extensive scholarly attention in second language acquisition. While much of the previous research in the field has focused on whether CF facilitates or impedes L2 development, few studies have examined the efficacy of gradually modifying…

  5. Satisfaction Clustering Analysis of Distance Education Computer Programming Students: A Sample of Karadeniz Technical University

    ERIC Educational Resources Information Center

    Ozyurt, Hacer

    2014-01-01

    In line with recently developing technology, distance education systems based on information technologies have started to be commonly used within higher education. Student satisfaction is one of the vital aspects of maintaining distance education efficiently and achieving its goals. As a matter of fact, previous studies proved that student…

  6. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads.

    PubMed

    Stone, John E; Hallock, Michael J; Phillips, James C; Peterson, Joseph R; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-05-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers.

  7. Streamlining ICAT development through off-the shelf hypermedia systems

    NASA Technical Reports Server (NTRS)

    Orey, Michael; Trent, Ann; Young, James; Sanders, Michael

    1993-01-01

    This project examined the efficacy of building intelligent computer-assisted training using an off-the-shelf hypermedia package. In addition, we compared this package to an architecture that had been developed in a previous contract and was based on the C programming language. One person developed a tutor in LinkWay (an off-the-shelf hypermedia system) and another developed the same tutor using the ALM C-based architecture. Development times, ease of use, learner preferences, learner options, and learning effectiveness were compared. In all cases, the off-the-shelf package was shown to be superior to the C-based system.

  8. Parallel computing on Unix workstation arrays

    NASA Astrophysics Data System (ADS)

    Reale, F.; Bocchino, F.; Sciortino, S.

    1994-12-01

    We have tested arrays of general-purpose Unix workstations used as MIMD systems for massive parallel computations. In particular, we have solved numerically a demanding test problem with a 2D hydrodynamic code, developed to study astrophysical flows, by executing it on arrays either of DECstations 5000/200 on an Ethernet LAN, or of DECstations 3000/400, equipped with powerful Alpha processors, on an FDDI LAN. The code is appropriate for data-domain decomposition, and we have used a library for parallelization previously developed in our Institute and easily extended to work on Unix workstation arrays by using the PVM software toolset. We have compared the parallel efficiencies obtained on arrays of several processors to those obtained on a dedicated MIMD parallel system, namely a Meiko Computing Surface (CS-1) equipped with Intel i860 processors. We discuss the feasibility of using non-dedicated parallel systems and conclude that the convenience depends essentially on the size of the computational domain as compared to the relative processor power and network bandwidth. We point out that for future perspectives a parallel development of processor and network technology is important, and that the software still offers great opportunities for improvement, especially in terms of latency times in the message-passing protocols. In conditions of significant gain in terms of speedup, such workstation arrays represent a cost-effective approach to massive parallel computations.
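
    The comparison described above rests on the usual definitions of speedup and parallel efficiency from wall-clock timings; the short sketch below shows that computation with hypothetical timing values (it is not data from the study).

      # Speedup and parallel efficiency from wall-clock timings, as used when
      # comparing workstation arrays with a dedicated MIMD machine.
      # All timing values below are hypothetical placeholders.
      def speedup_and_efficiency(t_serial, t_parallel, n_procs):
          s = t_serial / t_parallel
          return s, s / n_procs

      timings = {1: 1200.0, 2: 640.0, 4: 350.0, 8: 210.0}  # seconds (hypothetical)
      for n, t in timings.items():
          s, e = speedup_and_efficiency(timings[1], t, n)
          print(f"{n} workstations: speedup {s:.2f}, efficiency {e:.2f}")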

  9. Development of Efficient Real-Fluid Model in Simulating Liquid Rocket Injector Flows

    NASA Technical Reports Server (NTRS)

    Cheng, Gary; Farmer, Richard

    2003-01-01

    The characteristics of propellant mixing near the injector have a profound effect on liquid rocket engine performance. However, the flow features near the injector of liquid rocket engines are extremely complicated; for example, supercritical-pressure spray, turbulent mixing, and chemical reactions are all present. Previously, a homogeneous spray approach with a real-fluid property model was developed to account for compressibility and evaporation effects, so that the thermodynamic properties of a mixture over a wide range of pressures and temperatures can be properly calculated, including the liquid-phase, gas-phase, two-phase, and dense-fluid regions. The homogeneous spray model demonstrated good success in simulating uni-element shear coaxial injector spray combustion flows. However, the real-fluid model suffered a computational deficiency when applied to a pressure-based computational fluid dynamics (CFD) code. The deficiency is caused by pressure and enthalpy being the independent variables in the solution procedure of a pressure-based code, whereas the real-fluid model utilizes density and temperature as independent variables. The objective of the present research work is to improve the computational efficiency of the real-fluid property model in computing thermal properties. The proposed approach is called an efficient real-fluid model, and the improvement of computational efficiency is achieved by using a combination of a liquid species and a gaseous species to represent a real-fluid species.
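
    The mismatch of independent variables described above is typically resolved by inverting the property relations, i.e. iterating on density and temperature until they reproduce the pressure and enthalpy handed over by the pressure-based solver. The sketch below illustrates such an inversion with a Newton iteration on a deliberately simple ideal-gas stand-in for the property functions; it is not the real-fluid model of the paper.

      # Generic sketch of recovering (rho, T) from (p, h) when the property
      # model is parameterised by density and temperature.  The property
      # functions are ideal-gas stand-ins, not the paper's real-fluid model.
      import numpy as np

      R, CP = 287.0, 1004.5  # stand-in gas constants

      def invert_ph(p, h, rho0=1.0, T0=300.0, tol=1e-10, max_iter=50):
          """Newton iteration for (rho, T) such that p(rho,T)=p and h(T)=h."""
          x = np.array([rho0, T0])
          for _ in range(max_iter):
              rho, T = x
              f = np.array([rho * R * T - p, CP * T - h])   # residuals
              if np.max(np.abs(f)) < tol:
                  break
              J = np.array([[R * T, rho * R],
                            [0.0,   CP     ]])               # Jacobian d f / d(rho, T)
              x = x - np.linalg.solve(J, f)
          return x

      rho, T = invert_ph(p=101325.0, h=CP * 350.0)
      print(rho, T)   # ~1.01 kg/m^3, 350 K for the stand-in model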

  10. Real-time control system for adaptive resonator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flath, L; An, J; Brase, J

    2000-07-24

    Sustained operation of high-average-power solid-state lasers currently requires an adaptive resonator to produce the optimal beam quality. We describe the architecture of a real-time adaptive control system for correcting intra-cavity aberrations in a heat capacity laser. Image data collected from a wavefront sensor are processed and used to control phase with a high-spatial-resolution deformable mirror. Our controller takes advantage of recent developments in low-cost, high-performance processor technology: a desktop-based computational engine and an object-oriented software architecture replace the high-cost rack-mount embedded computers of previous systems.
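
    A typical structure for such a controller is a least-squares reconstructor inside an integrator loop: wavefront-sensor measurements are mapped to deformable-mirror commands through the pseudoinverse of a measured influence matrix. The sketch below illustrates that generic structure with synthetic matrices and an assumed gain; it is not the architecture of the cited system.

      # One generic adaptive-optics control loop: sensor measurements -> mirror
      # commands via the pseudoinverse of an influence matrix.  All matrices,
      # sizes, and the gain are hypothetical placeholders.
      import numpy as np

      rng = np.random.default_rng(1)
      n_meas, n_act = 128, 37
      G = rng.standard_normal((n_meas, n_act))   # influence matrix (measured off-line)
      R = np.linalg.pinv(G)                      # least-squares reconstructor
      gain = 0.3                                 # integrator gain

      aberration = G @ (0.1 * rng.standard_normal(n_act))  # fixed synthetic aberration
      cmd = np.zeros(n_act)
      for _ in range(20):
          residual = aberration - G @ cmd        # what the wavefront sensor sees
          cmd += gain * (R @ residual)           # integrator update toward flat wavefront
      print(np.linalg.norm(residual))            # should approach zero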

  11. Biophysically Inspired Rational Design of Structured Chimeric Substrates for DNAzyme Cascade Engineering

    PubMed Central

    Lakin, Matthew R.; Brown, Carl W.; Horwitz, Eli K.; Fanning, M. Leigh; West, Hannah E.; Stefanovic, Darko; Graves, Steven W.

    2014-01-01

    The development of large-scale molecular computational networks is a promising approach to implementing logical decision making at the nanoscale, analogous to cellular signaling and regulatory cascades. DNA strands with catalytic activity (DNAzymes) are one means of systematically constructing molecular computation networks with inherent signal amplification. Linking multiple DNAzymes into a computational circuit requires the design of substrate molecules that allow a signal to be passed from one DNAzyme to another through programmed biochemical interactions. In this paper, we chronicle an iterative design process guided by biophysical and kinetic constraints on the desired reaction pathways and use the resulting substrate design to implement heterogeneous DNAzyme signaling cascades. A key aspect of our design process is the use of secondary structure in the substrate molecule to sequester a downstream effector sequence prior to cleavage by an upstream DNAzyme. Our goal was to develop a concrete substrate molecule design to achieve efficient signal propagation with maximal activation and minimal leakage. We have previously employed the resulting design to develop high-performance DNAzyme-based signaling systems with applications in pathogen detection and autonomous theranostics. PMID:25347066

  12. Modelling milk production from feed intake in dairy cattle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clarke, D.L.

    1985-05-01

    Predictive models were developed for both Holstein and Jersey cows. Since Holsteins comprised eighty-five percent of the data, the predictive models developed for Holsteins were used for the development of a user-friendly computer model. Predictive models included: milk production (squared multiple correlation .73), natural log (ln) of milk production (.73), four percent fat-corrected milk (.67), ln four percent fat-corrected milk (.68), fat-free milk (.73), ln fat-free milk (.73), dry matter intake (.61), ln dry matter intake (.60), milk fat (.52), and ln milk fat (.56). The predictive models for ln milk production, ln fat-free milk and ln dry matter intake were incorporated into a computer model. The model was written in standard Fortran for use on mainframe or micro-computers. Daily milk production, fat-free milk production, and dry matter intake were predicted on a daily basis, with the previous day's dry matter intake serving as an independent variable in the prediction of the daily milk and fat-free milk production. 21 refs.

  13. Prospective estimation of organ dose in CT under tube current modulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Xiaoyu, E-mail: xt3@duke.edu; Li, Xiang; Segars, W. Paul

    Purpose: Computed tomography (CT) has been widely used worldwide as a tool for medical diagnosis and imaging. However, despite its significant clinical benefits, CT radiation dose at the population level has become a subject of public attention and concern. In this light, optimizing radiation dose has become a core responsibility for the CT community. As a fundamental step to manage and optimize dose, it may be beneficial to have accurate and prospective knowledge about the radiation dose for an individual patient. In this study, the authors developed a framework to prospectively estimate organ dose for chest and abdominopelvic CT exams under tube current modulation (TCM). Methods: The organ dose is mainly dependent on two key factors: patient anatomy and irradiation field. A prediction process was developed to accurately model both factors. To model the anatomical diversity and complexity in the patient population, the authors used a previously developed library of computational phantoms with broad distributions of sizes, ages, and genders. A selected clinical patient, represented by a computational phantom in the study, was optimally matched with another computational phantom in the library to obtain a representation of the patient's anatomy. To model the irradiation field, a previously validated Monte Carlo program was used to model CT scanner systems. The tube current profiles were modeled using a ray-tracing program, as previously reported, that theoretically emulated the variability of modulation profiles from major CT machine manufacturers [Li et al., Phys. Med. Biol. 59, 4525-4548 (2014)]. The prediction of organ dose was achieved using the following process: (1) CTDIvol-normalized organ dose coefficients (h_organ) for fixed tube current were first estimated as the prediction basis for the computational phantoms; (2) each computational phantom, regarded as a clinical patient, was optimally matched with one computational phantom in the library; (3) to account for the effect of the TCM scheme, a weighted organ-specific CTDIvol [denoted (CTDIvol)_organ,weighted] was computed for each organ based on the TCM profile and the anatomy of the "matched" phantom; (4) the organ dose was predicted by multiplying the weighted organ-specific CTDIvol with the organ dose coefficients (h_organ). To quantify the prediction accuracy, each predicted organ dose was compared with the corresponding organ dose simulated from the Monte Carlo program with the TCM profile explicitly modeled. Results: The predicted organ dose showed good agreement with the simulated organ dose across all organs and modulation profiles. The average percentage error in organ dose estimation was generally within 20% across all organs and modulation profiles, except for organs located in the pelvic and shoulder regions. For an average CTDIvol of 10 mGy for a CT exam, the average error at full modulation strength (α = 1) across all organs was 0.91 mGy for chest exams and 0.82 mGy for abdominopelvic exams. Conclusions: This study developed a quantitative model to predict organ dose for clinical chest and abdominopelvic scans. Such information may aid in the design of optimized CT protocols in relation to a targeted level of image quality.
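
    The final prediction step reduces to multiplying an organ dose coefficient by a TCM-weighted organ-specific CTDIvol. The sketch below illustrates that multiplication with entirely hypothetical numbers (coefficient, tube-current profile, and weights); the study's actual coefficients come from the matched phantoms and Monte Carlo simulations.

      # Sketch of the final prediction step: organ dose = h_organ x weighted
      # organ-specific CTDIvol.  All numbers are hypothetical placeholders.
      import numpy as np

      def weighted_ctdi_vol(tube_current_profile, organ_weights, ctdi_per_mAs):
          """Weight each slice's CTDIvol contribution by how strongly that slice
          irradiates the organ (organ_weights sums to 1)."""
          per_slice = ctdi_per_mAs * np.asarray(tube_current_profile)
          return float(np.dot(organ_weights, per_slice))

      h_liver = 1.2                                  # mGy per mGy CTDIvol (hypothetical)
      profile = [180, 220, 260, 240, 200]            # TCM tube current-time per slice (hypothetical)
      weights = [0.05, 0.20, 0.45, 0.25, 0.05]       # liver exposure weights per slice (hypothetical)
      ctdi_per_mAs = 0.045                           # mGy per mA·s (hypothetical)

      dose_liver = h_liver * weighted_ctdi_vol(profile, weights, ctdi_per_mAs)
      print(f"Predicted liver dose: {dose_liver:.2f} mGy")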

  14. Robonaut's Flexible Information Technology Infrastructure

    NASA Technical Reports Server (NTRS)

    Askew, Scott; Bluethmann, William; Alder, Ken; Ambrose, Robert

    2003-01-01

    Robonaut, NASA's humanoid robot, is designed to work as both an astronaut assistant and, in certain situations, an astronaut surrogate. This highly dexterous robot performs complex tasks under telepresence control that could previously only be carried out directly by humans. Currently with 47 degrees of freedom (DOF), Robonaut is a state-of-the-art human-size telemanipulator system. While many of Robonaut's embedded components have been custom designed to meet packaging or environmental requirements, the primary computing systems used in Robonaut are currently commercial-off-the-shelf (COTS) products which have some correlation to flight-qualified computer systems. This loose coupling of information technology (IT) resources allows Robonaut to exploit cost-effective solutions while floating the technology base to take advantage of the rapid pace of IT advances. These IT systems utilize a software development environment which is compatible both with COTS hardware and with flight-proven computing systems, preserving the majority of software development for a flight system. The ability to use highly integrated and flexible COTS software development tools improves productivity while minimizing redesign for a space flight system. Further, the flexibility of Robonaut's software and communication architecture has allowed it to become a widely used distributed development testbed for integrating new capabilities and furthering experimental research.

  15. Architecture Adaptive Computing Environment

    NASA Technical Reports Server (NTRS)

    Dorband, John E.

    2006-01-01

    Architecture Adaptive Computing Environment (aCe) is a software system that includes a language, compiler, and run-time library for parallel computing. aCe was developed to enable programmers to write programs, more easily than was previously possible, for a variety of parallel computing architectures. Heretofore, it has been perceived to be difficult to write parallel programs for parallel computers and more difficult to port the programs to different parallel computing architectures. In contrast, aCe is supportable on all high-performance computing architectures. Currently, it is supported on LINUX clusters. aCe uses parallel programming constructs that facilitate writing of parallel programs. Such constructs were used in single-instruction/multiple-data (SIMD) programming languages of the 1980s, including Parallel Pascal, Parallel Forth, C*, *LISP, and MasPar MPL. In aCe, these constructs are extended and implemented for both SIMD and multiple-instruction/multiple-data (MIMD) architectures. Two new constructs incorporated in aCe are those of (1) scalar and virtual variables and (2) pre-computed paths. The scalar-and-virtual-variables construct increases flexibility in optimizing memory utilization in various architectures. The pre-computed-paths construct enables the compiler to pre-compute part of a communication operation once, rather than computing it every time the communication operation is performed.

  16. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; McDougal, Matthew; Russell, Sam

    2013-01-01

    Objective: To develop a software application utilizing general-purpose graphics processing units (GPUs) for the analysis of large sets of thermographic data. Background: Over the past few years, an increasing effort among scientists and engineers to utilize the GPU in a more general-purpose fashion is allowing for supercomputer-level results at individual workstations. As data sets grow, the methods to work them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU to allow for throughput that was previously reserved for compute clusters. These common computations have high degrees of data parallelism; that is, the same computation is applied to a large set of data, and the result for each element does not depend on other data elements. Signal (image) processing is one area where GPUs are being used to greatly increase the performance of certain algorithms and analysis techniques.
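
    As a concrete picture of the data parallelism described above, the sketch below computes per-pixel temporal statistics over a stack of synthetic thermographic frames; each output pixel is independent of all others, which is exactly the structure that maps one GPU thread per pixel. NumPy is used here only to show the pattern; it is not the application's GPU code.

      # Data-parallel pattern: the same per-pixel computation applied across a
      # stack of frames, with no dependence between pixels.  Synthetic data.
      import numpy as np

      frames = np.random.rand(300, 512, 640).astype(np.float32)  # time x rows x cols

      mean_img = frames.mean(axis=0)                 # per-pixel temporal mean
      peak_img = frames.max(axis=0)                  # per-pixel temporal peak
      contrast = (peak_img - mean_img) / (mean_img + 1e-6)
      print(contrast.shape)                          # (512, 640)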

  17. A Parametric Computational Analysis into Galvanic Coupling Intrabody Communication.

    PubMed

    Callejon, M Amparo; Del Campo, P; Reina-Tosina, Javier; Roa, Laura M

    2017-08-02

    Intrabody communication (IBC) uses human body tissues as the transmission medium for electrical signals to interconnect personal health devices in wireless body area networks. The main goal of this work is to conduct a computational analysis covering some bioelectric issues that have not yet been fully explained, such as the modeling of the skin-electrode impedance, the differences associated with the use of constant-voltage or constant-current excitation modes, and the influence on attenuation of the subject's anthropometric and bioelectric properties. With this aim, a computational finite element model has been developed, allowing the IBC channel attenuation, as well as the electric field and current density through arm tissues, to be computed as a function of these parameters. As a conclusion, this parametric analysis has in turn permitted us to disclose some knowledge about the causes and effects of the above-mentioned issues, thus explaining and complementing previous results reported in the literature.

  18. Software Aids Visualization of Computed Unsteady Flow

    NASA Technical Reports Server (NTRS)

    Kao, David; Kenwright, David

    2003-01-01

    Unsteady Flow Analysis Toolkit (UFAT) is a computer program that synthesizes motions of time-dependent flows represented by very large sets of data generated in computational fluid dynamics simulations. Prior to the development of UFAT, it was necessary to rely on static, single-snapshot depictions of time-dependent flows generated by flow-visualization software designed for steady flows. Whereas it typically takes weeks to analyze the results of a large-scale unsteady-flow simulation by use of steady-flow visualization software, the analysis time is reduced to hours when UFAT is used. UFAT can be used to generate graphical objects of flow-visualization results using multi-block curvilinear grids in the format of a previously developed NASA data-visualization program, PLOT3D. These graphical objects can be rendered using FAST, another popular flow-visualization program developed at NASA. Flow-visualization techniques that can be exploited by use of UFAT include time-dependent tracking of particles, detection of vortex cores, extraction of stream ribbons and surfaces, and tetrahedral decomposition for optimal particle tracking. Unique computational features of UFAT include capabilities for automatic (batch) processing, restart, memory mapping, and parallel processing. These capabilities significantly reduce analysis time and storage requirements, relative to those of prior flow-visualization software. UFAT can be executed on a variety of supercomputers.
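
    Time-dependent particle tracking of the kind listed above integrates dx/dt = u(x, t) through a velocity field sampled in both space and time. The sketch below shows that idea on a simple analytic unsteady field with a midpoint (RK2) integrator; it illustrates the technique only and is not UFAT's implementation.

      # Sketch of time-dependent particle tracking: integrate dx/dt = u(x, t)
      # through an unsteady velocity field.  The field here is analytic; a
      # visualization tool would interpolate it from the simulation grid.
      import numpy as np

      def velocity(x, t):
          """Simple unsteady 2D field: rotation with oscillating strength."""
          s = 1.0 + 0.5 * np.sin(t)
          return np.array([-x[1] * s, x[0] * s])

      def track(x0, t0, t1, dt=0.01):
          """Midpoint (RK2) integration of a single particle path."""
          x, t, path = np.array(x0, float), t0, [np.array(x0, float)]
          while t < t1:
              k1 = velocity(x, t)
              k2 = velocity(x + 0.5 * dt * k1, t + 0.5 * dt)
              x = x + dt * k2
              t += dt
              path.append(x.copy())
          return np.array(path)

      print(track([1.0, 0.0], 0.0, 2.0)[-1])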

  19. Implementation of an Associative Flow Rule Including Hydrostatic Stress Effects Into the High Strain Rate Deformation Analysis of Polymer Matrix Composites

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Roberts, Gary D.; Gilat, Amos

    2003-01-01

    A previously developed analytical formulation has been modified in order to more accurately account for the effects of hydrostatic stresses on the nonlinear, strain rate dependent deformation of polymer matrix composites. State variable constitutive equations originally developed for metals have been modified in order to model the nonlinear, strain rate dependent deformation of polymeric materials. To account for the effects of hydrostatic stresses, which are significant in polymers, the classical J2 plasticity theory definitions of effective stress and effective inelastic strain, along with the equations used to compute the components of the inelastic strain rate tensor, are appropriately modified. To verify the revised formulation, the shear and tensile deformation of two representative polymers are computed across a wide range of strain rates. Results computed using the developed constitutive equations correlate well with experimental data. The polymer constitutive equations are implemented within a strength of materials based micromechanics method to predict the nonlinear, strain rate dependent deformation of polymer matrix composites. The composite mechanics are verified by analyzing the deformation of a representative polymer matrix composite for several fiber orientation angles across a variety of strain rates. The computed values compare well to experimentally obtained results.
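
    For reference, the classical J2 quantities that the formulation starts from are, in the usual notation (the specific hydrostatic-stress modification adopted in the report is not reproduced here):

      % Deviatoric stress, second invariant, and classical effective quantities.
      s_{ij} = \sigma_{ij} - \tfrac{1}{3}\,\sigma_{kk}\,\delta_{ij}, \qquad
      J_2 = \tfrac{1}{2}\, s_{ij} s_{ij}, \qquad
      \sigma_{e} = \sqrt{3 J_2}, \qquad
      \dot{\bar{\varepsilon}}^{I} = \sqrt{\tfrac{2}{3}\,\dot{\varepsilon}^{I}_{ij}\,\dot{\varepsilon}^{I}_{ij}}

    The report's modification augments these definitions with a dependence on the mean (hydrostatic) stress, so that the inelastic flow of the polymer is no longer purely deviatoric; the exact modified forms are given in the report itself.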

  20. cisTEM, user-friendly software for single-particle image processing.

    PubMed

    Grant, Timothy; Rohou, Alexis; Grigorieff, Nikolaus

    2018-03-07

    We have developed new open-source software called cisTEM (computational imaging system for transmission electron microscopy) for the processing of data for high-resolution electron cryo-microscopy and single-particle averaging. cisTEM features a graphical user interface that is used to submit jobs, monitor their progress, and display results. It implements a full processing pipeline including movie processing, image defocus determination, automatic particle picking, 2D classification, ab-initio 3D map generation from random parameters, 3D classification, and high-resolution refinement and reconstruction. Some of these steps implement newly-developed algorithms; others were adapted from previously published algorithms. The software is optimized to enable processing of typical datasets (2000 micrographs, 200k-300k particles) on a high-end, CPU-based workstation in half a day or less, comparable to GPU-accelerated processing. Jobs can also be scheduled on large computer clusters using flexible run profiles that can be adapted for most computing environments. cisTEM is available for download from cistem.org. © 2018, Grant et al.

  1. cisTEM, user-friendly software for single-particle image processing

    PubMed Central

    2018-01-01

    We have developed new open-source software called cisTEM (computational imaging system for transmission electron microscopy) for the processing of data for high-resolution electron cryo-microscopy and single-particle averaging. cisTEM features a graphical user interface that is used to submit jobs, monitor their progress, and display results. It implements a full processing pipeline including movie processing, image defocus determination, automatic particle picking, 2D classification, ab-initio 3D map generation from random parameters, 3D classification, and high-resolution refinement and reconstruction. Some of these steps implement newly-developed algorithms; others were adapted from previously published algorithms. The software is optimized to enable processing of typical datasets (2000 micrographs, 200 k – 300 k particles) on a high-end, CPU-based workstation in half a day or less, comparable to GPU-accelerated processing. Jobs can also be scheduled on large computer clusters using flexible run profiles that can be adapted for most computing environments. cisTEM is available for download from cistem.org. PMID:29513216

  2. Automating the parallel processing of fluid and structural dynamics calculations

    NASA Technical Reports Server (NTRS)

    Arpasi, Dale J.; Cole, Gary L.

    1987-01-01

    The NASA Lewis Research Center is actively involved in the development of expert system technology to assist users in applying parallel processing to computational fluid and structural dynamic analysis. The goal of this effort is to eliminate the necessity for the physical scientist to become a computer scientist in order to use the computer effectively as a research tool. Programming and operating software utilities have previously been developed to solve systems of ordinary nonlinear differential equations on parallel scalar processors. Current efforts are aimed at extending these capabilities to systems of partial differential equations that describe the complex behavior of fluids and structures within aerospace propulsion systems. This paper presents some important considerations in the redesign, in particular the need for algorithms and software utilities that can automatically identify data flow patterns in the application program and partition and allocate calculations to the parallel processors. A library-oriented multiprocessing concept for integrating the hardware and software functions is described.

  3. Where Next for Marine Cloud Brightening Research?

    NASA Astrophysics Data System (ADS)

    Jenkins, A. K. L.; Forster, P.

    2014-12-01

    Realistic estimates of geoengineering effectiveness will be central to informed decision-making on its possible role in addressing climate change. Over the last decade, global-scale computer climate modelling of geoengineering has been developing. While these developments have allowed quantitative estimates of geoengineering effectiveness to be produced, the relative coarseness of the grid of these models (tens of kilometres) means that key practical details of the proposed geoengineering are not always realistically captured. This is particularly true for marine cloud brightening (MCB), where neither the clouds nor the tens-of-metres-scale sea-going implementation vessels can be captured in detail. Previous research using cloud-resolving modelling has shown that neglecting such details may lead to MCB effectiveness being overestimated by up to half. The realism of MCB effectiveness estimates will likely improve with ongoing developments in the understanding and modelling of clouds. We also propose that realism can be increased via more specific improvements (see figure). A readily achievable example would be the reframing of previous MCB effectiveness estimates in light of the cloud-resolving-scale findings. Incorporation of implementation details could also be made, via parameterisation, into future global-scale modelling of MCB. However, as significant unknowns regarding the design of the MCB aerosol production technique remain, resource-intensive cloud-resolving computer modelling of MCB may be premature unless of broader benefit to the wider understanding of clouds. One of the most essential recommendations is for enhanced communication between climate scientists and MCB designers. This would facilitate the identification of potentially important design aspects necessary for realistic computer simulations. Such relationships could be mutually beneficial, with computer modelling potentially informing more efficient designs of the MCB implementation technique. (Acknowledgment) This work is part of the Integrated Assessment of Geoengineering Proposals (IAGP) project, funded by the Engineering and Physical Sciences Research Council and the Natural Environment Research Council (EP/I014721/1).

  4. Standard payload computer for the international space station

    NASA Astrophysics Data System (ADS)

    Knott, Karl; Taylor, Chris; Koenig, Horst; Schlosstein, Uwe

    1999-01-01

    This paper describes the development and application of a Standard PayLoad Computer (SPLC) which is being applied by the majority of ESA payloads accommodated on the International Space Station (ISS). The strategy of adopting of a standard computer leads to a radical rethink in the payload data handling procurement process. Traditionally, this has been based on a proprietary development with repeating costs for qualification, spares, expertise and maintenance for each new payload. Implementations have also tended to be unique with very little opportunity for reuse or utilisation of previous developments. While this may to some extent have been justified for short duration one-off missions, the availability of a standard, long term space infrastructure calls for a quite different approach. To support a large number of concurrent payloads, the ISS implementation relies heavily on standardisation, and this is particularly true in the area of payloads. Physical accommodation, data interfaces, protocols, component quality, operational requirements and maintenance including spares provisioning must all conform to a common set of standards. The data handling system and associated computer used by each payload must also comply with these common requirements, and thus it makes little sense to instigate multiple developments for the same task. The opportunity exists to provide a single computer suitable for all payloads, but with only a one-off development and qualification cost. If this is combined with the benefits of multiple procurement, centralised spares and maintenance, there is potential for great savings to be made by all those concerned in the payload development process. In response to the above drivers, the SPLC is based on the following concepts: • A one-off development and qualification process • A modular computer, configurable according to the payload developer's needs from a list of space-qualified items • An `open system' which may be added to by payload developers • Core software providing a suite of common communications services including a verified protocol implementation required to communicate with the ISS • A standardized ground support equipment and accompanying software development environment • The use of commercial hardware and software standards and products.

  5. Blade loss transient dynamics analysis, volume 2. Task 2: Theoretical and analytical development. Task 3: Experimental verification

    NASA Technical Reports Server (NTRS)

    Gallardo, V. C.; Storace, A. S.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.

    1981-01-01

    The component element method was used to develop a transient dynamic analysis computer program that is essentially based on modal synthesis combined with a central finite-difference numerical integration scheme. The methodology leads to a modular, or building-block, technique that is amenable to computer programming. To verify the analytical method, the turbine engine transient response analysis (TETRA) program was applied to two blade-out test vehicles that had previously been instrumented and tested. Comparison of the time-dependent test data with those predicted by TETRA led to recommendations for refinement or extension of the analytical method to improve its accuracy and overcome its shortcomings. The development of the working equations, their discretization, the numerical solution scheme, the modular concept of engine modelling, the program's logical structure, and some illustrative results are discussed. The blade-loss test vehicles (rig and full engine), the type of measured data, and the engine structural model are described.
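
    The class of integration scheme named above can be illustrated on a single modal equation of motion. The sketch below applies a central-difference update to q'' + 2*zeta*omega*q' + omega^2*q = f(t)/m with illustrative parameters; it shows the scheme only and is not the TETRA implementation.

      # Generic central-difference time stepping of one modal equation of motion.
      # Parameters (frequency, damping, load, step size) are illustrative.
      import numpy as np

      def central_difference(w, zeta, f, dt, n_steps, q0=0.0, v0=0.0, m=1.0):
          q_prev = q0 - dt * v0                 # simplified fictitious previous step
          q = q0
          history = [q0]
          for n in range(n_steps):
              # Acceleration from the equation of motion, with a backward-difference
              # estimate of the velocity term.
              acc = f(n * dt) / m - 2.0 * zeta * w * (q - q_prev) / dt - w**2 * q
              q_next = 2.0 * q - q_prev + dt**2 * acc
              q_prev, q = q, q_next
              history.append(q)
          return np.array(history)

      # Single 20 Hz mode, 2% damping, suddenly applied unit load.
      w = 2.0 * np.pi * 20.0
      resp = central_difference(w, 0.02, lambda t: 1.0, dt=1e-4, n_steps=5000)
      print(resp.max())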

  6. Experience with abstract notation one

    NASA Technical Reports Server (NTRS)

    Harvey, James D.; Weaver, Alfred C.

    1990-01-01

    The development of computer science has produced a vast number of machine architectures, programming languages, and compiler technologies. The cross product of these three characteristics defines the spectrum of previous and present data representation methodologies. With regard to computer networks, the uniqueness of these methodologies presents an obstacle when disparate host environments are to be interconnected. Interoperability within a heterogeneous network relies upon the establishment of data representation commonality. The International Standards Organization (ISO) is currently developing the Abstract Syntax Notation One (ASN.1) standard and the Basic Encoding Rules (BER) standard, which collectively address this problem. When used within the presentation layer of the Open Systems Interconnection reference model, these two standards provide the data representation commonality required to facilitate interoperability. The details of a compiler that was built to automate the use of ASN.1 and BER are described. From this experience, insights into both standards are given and potential problems relating to this development effort are discussed.
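
    BER's core mechanism is a tag-length-value octet encoding. As a small illustration of the machine-independent representation it provides, the sketch below encodes an ASN.1 INTEGER (universal tag 0x02) using two's-complement content octets; it handles only short-form lengths and is purely didactic, unrelated to the compiler described in the record.

      # Minimal BER tag-length-value encoding of an ASN.1 INTEGER (tag 0x02).
      # Short-form lengths only; content is minimal big-endian two's complement.
      def ber_encode_integer(value: int) -> bytes:
          length = max(1, (value.bit_length() + 8) // 8)   # room for the sign bit
          content = value.to_bytes(length, byteorder="big", signed=True)
          return bytes([0x02, len(content)]) + content

      print(ber_encode_integer(5).hex())     # 020105
      print(ber_encode_integer(300).hex())   # 0202012c
      print(ber_encode_integer(-1).hex())    # 0201ff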

  7. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.

    1988-01-01

    A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency-domain method-of-moments code, a time-domain finite-difference code, and a frequency-domain finite-elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort are summarized, including both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).

  8. CMS Distributed Computing Integration in the LHC sustained operations era

    NASA Astrophysics Data System (ADS)

    Grandi, C.; Bockelman, B.; Bonacorsi, D.; Fisk, I.; González Caballero, I.; Farina, F.; Hernández, J. M.; Padhi, S.; Sarkar, S.; Sciabà, A.; Sfiligoi, I.; Spiga, F.; Úbeda García, M.; Van Der Ster, D. C.; Zvada, M.

    2011-12-01

    After many years of preparation, the CMS computing system has reached a situation where stability in operations limits the possibility of introducing innovative features. Nevertheless, it is the same need for stability and smooth operations that requires the introduction of features that were considered non-strategic in previous phases. Examples are: adequate authorization to control and prioritize access to storage and computing resources; improved monitoring to investigate problems and identify bottlenecks in the infrastructure; increased automation to reduce the manpower needed for operations; and an effective process for deploying new releases of the software tools in production. We present the work of the CMS Distributed Computing Integration Activity, which is responsible for providing a liaison between the CMS distributed computing infrastructure and the software providers, both internal and external to CMS. In particular, we describe the introduction of new middleware features during the last 18 months as well as the requirements on Grid and Cloud software developers for the future.

  9. Computational mechanisms underlying cortical responses to the affordance properties of visual scenes

    PubMed Central

    Epstein, Russell A.

    2018-01-01

    Biologically inspired deep convolutional neural networks (CNNs), trained for computer vision tasks, have been found to predict cortical responses with remarkable accuracy. However, the internal operations of these models remain poorly understood, and the factors that account for their success are unknown. Here we develop a set of techniques for using CNNs to gain insights into the computational mechanisms underlying cortical responses. We focused on responses in the occipital place area (OPA), a scene-selective region of dorsal occipitoparietal cortex. In a previous study, we showed that fMRI activation patterns in the OPA contain information about the navigational affordances of scenes; that is, information about where one can and cannot move within the immediate environment. We hypothesized that this affordance information could be extracted using a set of purely feedforward computations. To test this idea, we examined a deep CNN with a feedforward architecture that had been previously trained for scene classification. We found that responses in the CNN to scene images were highly predictive of fMRI responses in the OPA. Moreover, the CNN accounted for the portion of OPA variance relating to the navigational affordances of scenes. The CNN could thus serve as an image-computable candidate model of affordance-related responses in the OPA. We then ran a series of in silico experiments on this model to gain insights into its internal operations. These analyses showed that the computation of affordance-related features relied heavily on visual information at high spatial frequencies and cardinal orientations, both of which have previously been identified as low-level stimulus preferences of scene-selective visual cortex. These computations also exhibited a strong preference for information in the lower visual field, which is consistent with known retinotopic biases in the OPA. Visualizations of feature selectivity within the CNN suggested that affordance-based responses encoded features that define the layout of the spatial environment, such as boundary-defining junctions and large extended surfaces. Together, these results map the sensory functions of the OPA onto a fully quantitative model that provides insights into its visual computations. More broadly, they advance integrative techniques for understanding visual cortex across multiple levels of analysis: from the identification of cortical sensory functions to the modeling of their underlying algorithms. PMID:29684011
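
    Studies of this kind typically relate CNN activations to voxel responses through a regularised linear encoding model, fit on a training set of images and evaluated on held-out images. The sketch below shows that generic setup with synthetic stand-in data and ridge regression; it is not the authors' pipeline, and all sizes are arbitrary.

      # Generic CNN-feature encoding model: map layer activations to voxel
      # responses with ridge regression, then score held-out predictions.
      # All data here are synthetic stand-ins.
      import numpy as np
      from sklearn.linear_model import Ridge
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n_images, n_features, n_voxels = 200, 512, 50
      X = rng.standard_normal((n_images, n_features))        # "CNN layer activations"
      W_true = 0.05 * rng.standard_normal((n_features, n_voxels))
      Y = X @ W_true + 0.5 * rng.standard_normal((n_images, n_voxels))  # "fMRI" responses

      X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)
      model = Ridge(alpha=10.0).fit(X_tr, Y_tr)
      pred = model.predict(X_te)

      # Per-voxel prediction accuracy (Pearson r on held-out images).
      r = [np.corrcoef(pred[:, v], Y_te[:, v])[0, 1] for v in range(n_voxels)]
      print(f"median voxel r = {np.median(r):.2f}")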

  10. Computer-Assisted Second Language Vocabulary Learning in a Paired-Associate Paradigm: A Critical Investigation of Flashcard Software

    ERIC Educational Resources Information Center

    Nakata, Tatsuya

    2011-01-01

    The present study aims to conduct a comprehensive investigation of flashcard software for learning vocabulary in a second language. Nine flashcard programs were analysed using 17 criteria derived from previous studies on flashcard learning as well as paired-associate learning. Results suggest that in general, most programs have been developed in a…

  11. New PDS will predict performance of pallets made with used parts

    Treesearch

    John W. Clarke; Marshall S. White; Philip A. Araman

    2001-01-01

    The Pallet Design System (PDS) is a computer design program developed by Virginia Tech, the National Wooden Pallet & Container Association, and the U.S. Forest Service to quickly and accurately predict the performance of new wood pallets. PDS has been upgraded annually since its original version in 1984. All of the previous upgrades, however, have continued to...

  12. Using Words Instead of Jumbled Characters as Stimuli in Keyboard Training Facilitates Fluent Performance

    ERIC Educational Resources Information Center

    DeFulio, Anthony; Crone-Todd, Darlene E.; Long, Lauren V.; Nuzzo, Paul A.; Silverman, Kenneth

    2011-01-01

    Keyboarding skill is an important target for adult education programs due to the ubiquity of computers in modern work environments. A previous study showed that novice typists learned key locations quickly but that fluency took a relatively long time to develop. In the present study, novice typists achieved fluent performance in nearly half the…

  13. Collaboration, Reflection and Selective Neglect: Campus-Based Marketing Students' Experiences of Using a Virtual Learning Environment

    ERIC Educational Resources Information Center

    Molesworth, Mike

    2004-01-01

    Previous studies have suggested significant benefits to using computer-mediated communication in higher education and the development of the relevant skills may also be important for preparing students for their working careers. This study is a review of the introduction of a virtual learning environment to support a group of 60 campus-based,…

  14. Validation of a computer case definition for sudden cardiac death in opioid users

    PubMed Central

    2012-01-01

    Background To facilitate the use of automated databases for studies of sudden cardiac death, we previously developed a computerized case definition that had a positive predictive value between 86% and 88%. However, the definition has not been specifically validated for prescription opioid users, for whom out-of-hospital overdose deaths may be difficult to distinguish from sudden cardiac death. Findings We assembled a cohort of persons 30-74 years of age prescribed propoxyphene or hydrocodone who had no life-threatening non-cardiovascular illness, diagnosed drug abuse, residence in a nursing home in the past year, or hospital stay within the past 30 days. Medical records were sought for a sample of 140 cohort deaths within 30 days of a prescription fill meeting the computer case definition. Of the 140 sampled deaths, 81 were adjudicated; 73 (90%) were sudden cardiac deaths. Two deaths had possible opioid overdose; after removing these two the positive predictive value was 88%. Conclusions These findings are consistent with our previous validation studies and suggest the computer case definition of sudden cardiac death is a useful tool for pharmacoepidemiologic studies of opioid analgesics. PMID:22938531
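
    The reported positive predictive values follow directly from the adjudicated counts, under the reading (assumed here) that 73 of the 81 adjudicated deaths were confirmed and that the two possible overdose deaths are removed from the numerator:

      # Positive predictive value from the adjudicated counts in the abstract.
      adjudicated = 81
      confirmed = 73
      print(f"{confirmed / adjudicated:.0%}")          # 90%
      print(f"{(confirmed - 2) / adjudicated:.0%}")    # 88% after excluding possible overdoses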

  15. Time-dependent Gutzwiller theory of magnetic excitations in the Hubbard model

    NASA Astrophysics Data System (ADS)

    Seibold, G.; Becca, F.; Rubin, P.; Lorenzana, J.

    2004-04-01

    We use a spin-rotational invariant Gutzwiller energy functional to compute random-phase-approximation-like (RPA) fluctuations on top of the Gutzwiller approximation (GA). The method can be viewed as an extension of the previously developed GA+RPA approach for the charge sector [G. Seibold and J. Lorenzana, Phys. Rev. Lett. 86, 2605 (2001)] with respect to the inclusion of the magnetic excitations. Unlike the charge case, no assumptions about the time evolution of the double occupancy are needed in this case. Interestingly, in a spin-rotational invariant system, we find the correct degeneracy between triplet excitations, showing the consistency of both computations. Since no restrictions are imposed on the symmetry of the underlying saddle-point solution, our approach is suitable for the evaluation of the magnetic susceptibility and dynamical structure factor in strongly correlated inhomogeneous systems. We present a detailed study of the quality of our approach by comparing with exact diagonalization results and show its much higher accuracy compared to the conventional Hartree-Fock+RPA theory. In infinite dimensions, where the GA becomes exact for the Gutzwiller variational energy, we evaluate ferromagnetic and antiferromagnetic instabilities from the transverse magnetic susceptibility. The resulting phase diagram is in complete agreement with previous variational computations.

  16. In Silico Knockout Screening of Plasmodium falciparum Reactions and Prediction of Novel Essential Reactions by Analysing the Metabolic Network

    PubMed Central

    Isewon, Itunuoluwa; Aromolaran, Olufemi; Oladipupo, Olufunke

    2018-01-01

    Malaria is an infectious disease that affects close to half a million individuals every year, and Plasmodium falciparum is a major cause of malaria. The disease could be treated effectively if the essential enzymes of this parasite were specifically targeted. However, the parasite's growing resistance to existing drugs makes the discovery of new drugs imperative. In this study, a novel computational model has been developed that makes the prediction of new, validated antimalarial drug targets cheaper, easier, and faster. We have identified new essential reactions as potential drug targets in the metabolic network of the parasite. Among the top seven (7) predicted essential reactions, four (4) had been previously identified in earlier studies with biological evidence and one (1) with computational evidence. The results from our study were compared with an extensive list of seventy-seven (77) essential reactions with biological evidence from a previous study. We present a list of thirty-one (31) potential drug-target candidates in Plasmodium falciparum, which includes twenty-four (24) new candidates. PMID:29789805
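    The abstract does not spell out how reaction essentiality is scored. A common way to run an in silico knockout screen is single-reaction deletion under flux balance analysis (FBA); the sketch below illustrates that idea on a tiny, hypothetical stoichiometric model using scipy's linear-programming solver. The toy reactions, bounds, and the 1% growth cutoff are assumptions for illustration, not the authors' pipeline or the P. falciparum network.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy stoichiometric matrix S (metabolites x reactions); steady state requires S v = 0.
    # Reactions (hypothetical): R0 uptake, R1 conversion, R2 biomass, R3 bypass of R1.
    S = np.array([
        [1, -1, 0, -1],   # metabolite A
        [0,  1, -1, 1],   # metabolite B
    ])
    bounds = [(0, 10), (0, 10), (0, 10), (0, 10)]
    biomass = 2           # index of the biomass reaction to maximize

    def max_growth(bnds):
        c = np.zeros(S.shape[1]); c[biomass] = -1.0   # linprog minimizes, so negate
        res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bnds, method="highs")
        return -res.fun if res.success else 0.0

    wild_type = max_growth(bounds)
    for j in range(S.shape[1]):
        knockout = list(bounds)
        knockout[j] = (0.0, 0.0)                       # force zero flux through reaction j
        growth = max_growth(knockout)
        essential = growth < 0.01 * wild_type          # assumed essentiality cutoff
        status = "essential" if essential else "dispensable"
        print(f"reaction R{j}: growth {growth:.2f} / {wild_type:.2f} -> {status}")
    ```

    In this toy network the uptake and biomass reactions come out essential, while R1 and R3 back each other up and are individually dispensable, which is the qualitative pattern such screens look for.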

  17. Control of Flow Structure in Square Cross-Sectioned U Bend using Numerical Modeling

    NASA Astrophysics Data System (ADS)

    Yavuz, Mehmet Metin; Guden, Yigitcan

    2014-11-01

    Due to the curvature in U-bends, the flow development involves complex flow structures, including Dean vortices and high levels of turbulence, that are critical considerations for noise and structural failure of the ducts. Computational fluid dynamics (CFD) models are developed using ANSYS Fluent to analyze and to control the flow structure in a square cross-sectioned U-bend with a radius of curvature Rc/D = 0.65. The predictions of velocity profiles at different angular positions of the U-bend are compared against the experimental results available in the literature and the previous numerical studies. The performances of different turbulence models are evaluated to propose the best numerical approach that has high accuracy with reduced computation time. The numerical results of the present study indicate improvements with respect to the previous numerical predictions and very good agreement with the available experimental results. In addition, a flow control technique is utilized to regulate the flow inside the bend. The elimination of Dean vortices, along with a significant reduction in turbulence levels in different cross-flow planes, is successfully achieved when the flow control technique is applied. The project is supported by Meteksan Defense Industries, Inc.

  18. Enhanced Video-Oculography System

    NASA Technical Reports Server (NTRS)

    Moore, Steven T.; MacDougall, Hamish G.

    2009-01-01

    A previously developed video-oculography system has been enhanced for use in measuring vestibulo-ocular reflexes of a human subject in a centrifuge, motor vehicle, or other setting. The system as previously developed included a lightweight digital video camera mounted on goggles. The left eye was illuminated by an infrared light-emitting diode via a dichroic mirror, and the camera captured images of the left eye in infrared light. To extract eye-movement data, the digitized video images were processed by software running in a laptop computer. Eye movements were calibrated by having the subject view a target pattern, fixed with respect to the subject's head, generated by a goggle-mounted laser with a diffraction grating. The system as enhanced includes a second camera for imaging the scene from the subject's perspective, and two inertial measurement units (IMUs) for measuring linear accelerations and rates of rotation for computing head movements. One IMU is mounted on the goggles, the other on the centrifuge or vehicle frame. All eye-movement and head-motion data are time-stamped. In addition, the subject's point of regard is superimposed on each scene image to enable analysis of patterns of gaze in real time.

  19. Computer games: a double-edged sword?

    PubMed

    Sun, De-Lin; Ma, Ning; Bao, Min; Chen, Xang-Chuan; Zhang, Da-Ren

    2008-10-01

    Excessive computer game playing (ECGP) has already become a serious social problem. However, limited data from experimental lab studies are available about the negative consequences of ECGP on players' cognitive characteristics. In the present study, we compared three groups of participants (current ECGP participants, previous ECGP participants, and control participants) on a Multiple Object Tracking (MOT) task. The previous ECGP participants performed significantly better than the control participants, which suggested a facilitation effect of computer games on visuospatial abilities. Moreover, the current ECGP participants performed significantly worse than the previous ECGP participants. This more important finding indicates that ECGP may be related to cognitive deficits. Implications of this study are discussed.

  20. A New Soft Computing Method for K-Harmonic Means Clustering.

    PubMed

    Yeh, Wei-Chang; Jiang, Yunzhi; Chen, Yee-Fen; Chen, Zhe

    2016-01-01

    The K-harmonic means clustering algorithm (KHM) is a new clustering method used to group data such that the sum of the harmonic averages of the distances between each entity and all cluster centroids is minimized. Because it is less sensitive to initialization than K-means (KM), many researchers have recently been attracted to studying KHM. In this study, the proposed iSSO-KHM is based on an improved simplified swarm optimization (iSSO) and integrates a variable neighborhood search (VNS) for KHM clustering. As evidence of the utility of the proposed iSSO-KHM, we present extensive computational results on eight benchmark problems. From the computational results, the comparison appears to support the superiority of the proposed iSSO-KHM over previously developed algorithms for all experiments in the literature.
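    For orientation, the KHM objective described above (the sum over data points of the harmonic average of the distances to all K centroids) can be written down in a few lines. The sketch below only evaluates that objective for a given set of centroids; it is not the proposed iSSO-KHM optimizer, and the power parameter p and the epsilon guard are conventional choices rather than values taken from the paper.

    ```python
    import numpy as np

    def khm_objective(X, centers, p=2.0, eps=1e-12):
        """K-harmonic means objective: sum_i K / sum_k (1 / ||x_i - c_k||^p)."""
        # Pairwise distances between every data point and every centroid.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.maximum(d, eps)                       # avoid division by zero
        K = centers.shape[0]
        harmonic_means = K / np.sum(1.0 / d**p, axis=1)
        return harmonic_means.sum()

    # Tiny usage example comparing two candidate centroid sets on random data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    good = np.array([[-1.0, 0.0], [1.0, 0.0]])
    bad = np.array([[5.0, 5.0], [6.0, 5.0]])
    print(khm_objective(X, good), khm_objective(X, bad))   # the better centroids score lower
    ```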

  1. Direct coal liquefaction baseline design and system analysis. Quarterly report, January--March 1991

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-04-01

    The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  2. Direct coal liquefaction baseline design and system analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-04-01

    The primary objective of the study is to develop a computer model for a base line direct coal liquefaction design based on two stage direct coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a base line design based on previous DOE/PETC results from Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC Staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  3. Gyrofluid Modeling of Turbulent, Kinetic Physics

    NASA Astrophysics Data System (ADS)

    Despain, Kate Marie

    2011-12-01

    Gyrofluid models to describe plasma turbulence combine the advantages of fluid models, such as lower dimensionality and well-developed intuition, with those of gyrokinetics models, such as finite Larmor radius (FLR) effects. This allows gyrofluid models to be more tractable computationally while still capturing much of the physics related to the FLR of the particles. We present a gyrofluid model derived to capture the behavior of slow solar wind turbulence and describe the computer code developed to implement the model. In addition, we describe the modifications we made to a gyrofluid model and code that simulate plasma turbulence in tokamak geometries. Specifically, we describe a nonlinear phase mixing phenomenon, part of the E x B term, that was previously missing from the model. An inherently FLR effect, it plays an important role in predicting turbulent heat flux and diffusivity levels for the plasma. We demonstrate this importance by comparing results from the updated code to studies done previously by gyrofluid and gyrokinetic codes. We further explain what would be necessary to couple the updated gyrofluid code, gryffin, to a turbulent transport code, thus allowing gryffin to play a role in predicting profiles for fusion devices such as ITER and to explore novel fusion configurations. Such a coupling would require the use of Graphical Processing Units (GPUs) to make the modeling process fast enough to be viable. Consequently, we also describe our experience with GPU computing and demonstrate that we are poised to complete a gryffin port to this innovative architecture.

  4. Development of posture-specific computational phantoms using motion capture technology and application to radiation dose-reconstruction for the 1999 Tokai-Mura nuclear criticality accident

    NASA Astrophysics Data System (ADS)

    Vazquez, Justin A.; Caracappa, Peter F.; Xu, X. George

    2014-09-01

    The majority of existing computational phantoms are designed to represent workers in typical standing anatomical postures with fixed arm and leg positions. However, workers found in accident-related scenarios often assume varied postures. This paper describes the development and application of two phantoms with adjusted postures specified by data acquired from a motion capture system to simulate unique human postures found in a 1999 criticality accident that took place at a JCO facility in Tokai-Mura, Japan. In the course of this accident, two workers were fatally exposed to extremely high levels of radiation. Implementation of the emergent techniques discussed produced more accurate and more detailed dose estimates for the two workers than were reported in previous studies. Total-body doses of 6.43 and 26.38 Gy were estimated for the two workers, who assumed a crouching and a standing posture, respectively. Additionally, organ-specific dose estimates were determined, including a 7.93 Gy dose to the thyroid and a 6.11 Gy dose to the stomach for the crouching worker and a 41.71 Gy dose to the liver and a 37.26 Gy dose to the stomach for the standing worker. Implications for the medical prognosis of the workers are discussed, and the results of this study were found to correlate better with the patient outcome than previous estimates, suggesting potential future applications of such methods for improved epidemiological studies involving next-generation computational phantom tools.

  5. Development of posture-specific computational phantoms using motion capture technology and application to radiation dose-reconstruction for the 1999 Tokai-Mura nuclear criticality accident.

    PubMed

    Vazquez, Justin A; Caracappa, Peter F; Xu, X George

    2014-09-21

    The majority of existing computational phantoms are designed to represent workers in typical standing anatomical postures with fixed arm and leg positions. However, workers found in accident-related scenarios often assume varied postures. This paper describes the development and application of two phantoms with adjusted postures specified by data acquired from a motion capture system to simulate unique human postures found in a 1999 criticality accident that took place at a JCO facility in Tokai-Mura, Japan. In the course of this accident, two workers were fatally exposed to extremely high levels of radiation. Implementation of the emergent techniques discussed produced more accurate and more detailed dose estimates for the two workers than were reported in previous studies. Total-body doses of 6.43 and 26.38 Gy were estimated for the two workers, who assumed a crouching and a standing posture, respectively. Additionally, organ-specific dose estimates were determined, including a 7.93 Gy dose to the thyroid and a 6.11 Gy dose to the stomach for the crouching worker and a 41.71 Gy dose to the liver and a 37.26 Gy dose to the stomach for the standing worker. Implications for the medical prognosis of the workers are discussed, and the results of this study were found to correlate better with the patient outcome than previous estimates, suggesting potential future applications of such methods for improved epidemiological studies involving next-generation computational phantom tools.

  6. Computational Models for Calcium-Mediated Astrocyte Functions.

    PubMed

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro , but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop the models. Thus, we would like to emphasize that only via reproducible research are we able to build better computational models for astrocytes, which truly advance science. Our study is the first to characterize in detail the biophysical and biochemical mechanisms that have been modeled for astrocytes.

  7. Computational Models for Calcium-Mediated Astrocyte Functions

    PubMed Central

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop the models. Thus, we would like to emphasize that only via reproducible research are we able to build better computational models for astrocytes, which truly advance science. Our study is the first to characterize in detail the biophysical and biochemical mechanisms that have been modeled for astrocytes. PMID:29670517

  8. Continuous development of schemes for parallel computing of the electrostatics in biological systems: implementation in DelPhi.

    PubMed

    Li, Chuan; Petukh, Marharyta; Li, Lin; Alexov, Emil

    2013-08-15

    Due to the enormous importance of electrostatics in molecular biology, calculating the electrostatic potential and corresponding energies has become a standard computational approach for the study of biomolecules and nano-objects immersed in water and salt phase or other media. However, the electrostatics of large macromolecules and macromolecular complexes, including nano-objects, may not be obtainable via explicit methods and even the standard continuum electrostatics methods may not be applicable due to high computational time and memory requirements. Here, we report further development of the parallelization scheme reported in our previous work (Li, et al., J. Comput. Chem. 2012, 33, 1960) to include parallelization of the molecular surface and energy calculations components of the algorithm. The parallelization scheme utilizes different approaches such as space domain parallelization, algorithmic parallelization, multithreading, and task scheduling, depending on the quantity being calculated. This allows for efficient use of the computing resources of the corresponding computer cluster. The parallelization scheme is implemented in the popular software DelPhi and results in a severalfold speedup. As a demonstration of the efficiency and capability of this methodology, the electrostatic potential and electric field distributions are calculated for the bovine mitochondrial supercomplex, illustrating their complex topology, which cannot be obtained by modeling the supercomplex components alone. Copyright © 2013 Wiley Periodicals, Inc.
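    The abstract names the parallelization strategies (space-domain decomposition, algorithmic parallelization, multithreading, task scheduling) without detailing them. As a generic, hypothetical illustration of the space-domain idea only, the sketch below splits a set of point charges into spatial slabs and evaluates each slab's Coulomb potential contribution in a separate worker process; it is not DelPhi's finite-difference Poisson-Boltzmann solver and the charge layout is synthetic.

    ```python
    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def slab_potential(args):
        """Potential at the probe points due to one slab (subset) of charges."""
        charges, positions, probes = args
        # phi(r) = sum_j q_j / |r - r_j|   (bare Coulomb, units omitted)
        d = np.linalg.norm(probes[:, None, :] - positions[None, :, :], axis=2)
        return (charges[None, :] / d).sum(axis=1)

    def parallel_potential(charges, positions, probes, n_slabs=4):
        # Space-domain decomposition: partition charges by their z coordinate.
        order = np.argsort(positions[:, 2])
        slabs = np.array_split(order, n_slabs)
        tasks = [(charges[s], positions[s], probes) for s in slabs]
        with ProcessPoolExecutor(max_workers=n_slabs) as pool:
            partials = list(pool.map(slab_potential, tasks))
        return np.sum(partials, axis=0)      # slab contributions are additive

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        q = rng.choice([-1.0, 1.0], size=1000)
        xyz = rng.uniform(-10, 10, size=(1000, 3))
        probes = rng.uniform(-10, 10, size=(5, 3))
        print(parallel_potential(q, xyz, probes))
    ```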

  9. P-HS-SFM: a parallel harmony search algorithm for the reproduction of experimental data in the continuous microscopic crowd dynamic models

    NASA Astrophysics Data System (ADS)

    Jaber, Khalid Mohammad; Alia, Osama Moh'd.; Shuaib, Mohammed Mahmod

    2018-03-01

    Finding the optimal parameters that can reproduce experimental data (such as the velocity-density relation and the specific flow rate) is a very important component of the validation and calibration of microscopic crowd dynamic models. Heavy computational demand during parameter search is a known limitation that exists in a previously developed model known as the Harmony Search-Based Social Force Model (HS-SFM). In this paper, a parallel-based mechanism is proposed to reduce the computational time and memory resource utilisation required to find these parameters. More specifically, two MATLAB-based multicore techniques (parfor and create independent jobs) using shared memory are developed by taking advantage of the multithreading capabilities of parallel computing, resulting in a new framework called the Parallel Harmony Search-Based Social Force Model (P-HS-SFM). The experimental results show that the parfor-based P-HS-SFM achieved a better computational time of about 26 h, an efficiency improvement of about 54% and a speedup factor of 2.196 times in comparison with the HS-SFM sequential processor. The performance of the P-HS-SFM using the create independent jobs approach is also comparable to parfor with a computational time of 26.8 h, an efficiency improvement of about 30% and a speedup of 2.137 times.

  10. Computational analysis of high resolution unsteady airloads for rotor aeroacoustics

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Lam, C.-M. Gordon; Wachspress, Daniel A.; Bliss, Donald B.

    1994-01-01

    The study of helicopter aerodynamic loading for acoustics applications requires the application of efficient yet accurate simulations of the velocity field induced by the rotor's vortex wake. This report summarizes work to date on the development of such an analysis, which builds on the Constant Vorticity Contour (CVC) free wake model, previously implemented for the study of vibratory loading in the RotorCRAFT computer code. The present effort has focused on implementation of an airload reconstruction approach that computes high resolution airload solutions of rotor/rotor-wake interactions required for acoustics computations. Supplementary efforts on the development of improved vortex core modeling, unsteady aerodynamic effects, higher spatial resolution of rotor loading, and fast vortex wake implementations have substantially enhanced the capabilities of the resulting software, denoted RotorCRAFT/AA (AeroAcoustics). Results of validation calculations using recently acquired model rotor data show that by employing airload reconstruction it is possible to apply the CVC wake analysis with temporal and spatial resolution suitable for acoustics applications while reducing the computation time required by one to two orders of magnitude relative to that required by direct calculations. Promising correlation with this body of airload and noise data has been obtained for a variety of rotor configurations and operating conditions.

  11. A computational fluid dynamics simulation framework for ventricular catheter design optimization.

    PubMed

    Weisenberg, Sofy H; TerMaath, Stephanie C; Barbier, Charlotte N; Hill, Judith C; Killeffer, James A

    2017-11-10

    OBJECTIVE Cerebrospinal fluid (CSF) shunts are the primary treatment for patients suffering from hydrocephalus. While proven effective in symptom relief, these shunt systems are plagued by high failure rates and often require repeated revision surgeries to replace malfunctioning components. One of the leading causes of CSF shunt failure is obstruction of the ventricular catheter by aggregations of cells, proteins, blood clots, or fronds of choroid plexus that occlude the catheter's small inlet holes or even the full internal catheter lumen. Such obstructions can disrupt CSF diversion out of the ventricular system or impede it entirely. Previous studies have suggested that altering the catheter's fluid dynamics may help to reduce the likelihood of complete ventricular catheter failure caused by obstruction. However, systematic correlation between a ventricular catheter's design parameters and its performance, specifically its likelihood to become occluded, still remains unknown. Therefore, an automated, open-source computational fluid dynamics (CFD) simulation framework was developed for use in the medical community to determine optimized ventricular catheter designs and to rapidly explore parameter influence for a given flow objective. METHODS The computational framework was developed by coupling a 3D CFD solver and an iterative optimization algorithm and was implemented in a high-performance computing environment. The capabilities of the framework were demonstrated by computing an optimized ventricular catheter design that provides uniform flow rates through the catheter's inlet holes, a common design objective in the literature. The baseline computational model was validated using 3D nuclear imaging to provide flow velocities at the inlet holes and through the catheter. RESULTS The optimized catheter design achieved through use of the automated simulation framework improved significantly on previous attempts to reach a uniform inlet flow rate distribution using the standard catheter hole configuration as a baseline. While the standard ventricular catheter design featuring uniform inlet hole diameters and hole spacing has a standard deviation of 14.27% for the inlet flow rates, the optimized design has a standard deviation of 0.30%. CONCLUSIONS This customizable framework, paired with high-performance computing, provides a rapid method of design testing to solve complex flow problems. While a relatively simplified ventricular catheter model was used to demonstrate the framework, the computational approach is applicable to any baseline catheter model, and it is easily adapted to optimize catheters for the unique needs of different patients as well as for other fluid-based medical devices.
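    The paper couples a 3D CFD solver with an iterative optimizer; neither component is reproduced here. As a hedged sketch of the coupling pattern only, the example below replaces the CFD solve with a crude analytic surrogate (per-hole flow taken proportional to hole diameter to the fourth power, attenuated along the catheter) and searches for hole diameters that minimize the relative spread of inlet flow rates. The surrogate model, hole count, and numbers are all assumptions for illustration.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    N_HOLES = 8

    def surrogate_flows(diameters):
        """Stand-in for the CFD solve: crude per-hole flow estimate (NOT a real flow model).
        Holes nearer the catheter tip are assumed to see less driving pressure."""
        position_factor = np.linspace(1.0, 0.6, N_HOLES)   # assumed attenuation along the lumen
        return position_factor * diameters**4               # orifice-like d^4 scaling (assumed)

    def objective(diameters):
        flows = surrogate_flows(diameters)
        return np.std(flows) / np.mean(flows)                # relative spread of inlet flow rates

    x0 = np.full(N_HOLES, 0.5)                               # uniform initial hole diameters (notional)
    result = minimize(objective, x0, method="Nelder-Mead",
                      options={"xatol": 1e-6, "fatol": 1e-8, "maxiter": 5000})

    print("optimized diameters:", np.round(result.x, 3))
    print("relative flow spread:", objective(result.x))
    ```

    In the actual framework each objective evaluation would launch a full CFD run on the cluster, which is why the authors emphasize high-performance computing; the optimizer-around-solver structure is the same.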

  12. Informed public choices for low-carbon electricity portfolios using a computer decision tool.

    PubMed

    Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger

    2014-04-01

    Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/ .
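    The abstract describes the tool's behavior (the user adjusts a generation mix; the tool keeps the portfolio feasible and updates projected emissions and cost) without giving its internals. A minimal, hypothetical sketch of that update loop is below; the technology list, emission factors, and costs are placeholder values for illustration only, not the data used in the study or the downloadable tool.

    ```python
    # Placeholder technology data (illustrative numbers only, NOT from the study):
    # emissions in tCO2/MWh, cost in $/MWh.
    TECH = {
        "energy efficiency": {"emissions": 0.00, "cost": 30.0},
        "wind":              {"emissions": 0.00, "cost": 45.0},
        "nuclear":           {"emissions": 0.00, "cost": 90.0},
        "coal with CCS":     {"emissions": 0.10, "cost": 100.0},
        "natural gas":       {"emissions": 0.40, "cost": 60.0},
    }

    def evaluate_portfolio(shares, demand_mwh=1_000_000):
        """Return projected emissions and average cost for a generation mix.
        `shares` maps technology name -> fraction of demand served."""
        if abs(sum(shares.values()) - 1.0) > 1e-6:
            raise ValueError("portfolio must serve 100% of demand")   # feasibility constraint
        emissions = sum(TECH[t]["emissions"] * f * demand_mwh for t, f in shares.items())
        avg_cost = sum(TECH[t]["cost"] * f for t, f in shares.items())
        return emissions, avg_cost

    mix = {"energy efficiency": 0.15, "wind": 0.25, "nuclear": 0.20,
           "coal with CCS": 0.15, "natural gas": 0.25}
    tons, cost = evaluate_portfolio(mix)
    print(f"projected emissions: {tons:,.0f} tCO2, average cost: ${cost:.0f}/MWh")
    ```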

  13. MMA-EoS: A Computational Framework for Mineralogical Thermodynamics

    NASA Astrophysics Data System (ADS)

    Chust, T. C.; Steinle-Neumann, G.; Dolejš, D.; Schuberth, B. S. A.; Bunge, H.-P.

    2017-12-01

    We present a newly developed software framework, MMA-EoS, that evaluates phase equilibria and thermodynamic properties of multicomponent systems by Gibbs energy minimization, with application to mantle petrology. The code is versatile in terms of the equation-of-state and mixing properties and allows for the computation of properties of single phases, solution phases, and multiphase aggregates. Currently, the open program distribution contains equation-of-state formulations widely used, that is, Caloric-Murnaghan, Caloric-Modified-Tait, and Birch-Murnaghan-Mie-Grüneisen-Debye models, with published databases included. Through its modular design and easily scripted database, MMA-EoS can readily be extended with new formulations of equations-of-state and changes or extensions to thermodynamic data sets. We demonstrate the application of the program by reproducing and comparing physical properties of mantle phases and assemblages with previously published work and experimental data, successively increasing complexity, up to computing phase equilibria of six-component compositions. Chemically complex systems allow us to trace the budget of minor chemical components in order to explore whether they lead to the formation of new phases or extend stability fields of existing ones. Self-consistently computed thermophysical properties for a homogeneous mantle and a mechanical mixture of slab lithologies show no discernible differences that require a heterogeneous mantle structure as has been suggested previously. Such examples illustrate how thermodynamics of mantle mineralogy can advance the study of Earth's interior.
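    The framework's central operation is Gibbs energy minimization over phase amounts subject to bulk-composition constraints. The toy sketch below minimizes a two-phase, ideal-mixing Gibbs function with a simple mass balance; the free-energy expressions and numbers are hypothetical stand-ins, not the Caloric-Murnaghan, Caloric-Modified-Tait, or Birch-Murnaghan-Mie-Grüneisen-Debye models shipped with MMA-EoS.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    R, T = 8.314, 1500.0          # gas constant (J/mol/K), temperature (K)
    TOTAL = {"A": 1.0, "B": 1.0}  # bulk composition in moles (hypothetical)

    # Standard-state chemical potentials (J/mol) of components A and B in two phases.
    # Purely illustrative numbers, not a real thermodynamic database.
    MU0 = {"alpha": {"A": -50_000.0, "B": -20_000.0},
           "beta":  {"A": -30_000.0, "B": -45_000.0}}

    def phase_gibbs(nA, nB, phase):
        """Ideal-solution Gibbs energy of one phase containing nA, nB moles."""
        n = nA + nB
        if n <= 0:
            return 0.0
        xA, xB = nA / n, nB / n
        mixing = R * T * (nA * np.log(max(xA, 1e-12)) + nB * np.log(max(xB, 1e-12)))
        return nA * MU0[phase]["A"] + nB * MU0[phase]["B"] + mixing

    def total_gibbs(v):
        nA_alpha, nB_alpha = v
        # Mass balance: whatever is not in alpha sits in beta.
        nA_beta, nB_beta = TOTAL["A"] - nA_alpha, TOTAL["B"] - nB_alpha
        return phase_gibbs(nA_alpha, nB_alpha, "alpha") + phase_gibbs(nA_beta, nB_beta, "beta")

    res = minimize(total_gibbs, x0=[0.5, 0.5],
                   bounds=[(0.0, TOTAL["A"]), (0.0, TOTAL["B"])], method="L-BFGS-B")
    print("moles in alpha (A, B):", np.round(res.x, 3), " G_min =", round(res.fun, 1), "J")
    ```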

  14. Terascale Computing in Accelerator Science and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ko, Kwok

    2002-08-21

    We have entered the age of "terascale" scientific computing. Processors and system architecture both continue to evolve; hundred-teraFLOP computers are expected in the next few years, and petaFLOP computers toward the end of this decade are conceivable. This ever-increasing power to solve previously intractable numerical problems benefits almost every field of science and engineering and is revolutionizing some of them, notably including accelerator physics and technology. At existing accelerators, it will help us optimize performance, expand operational parameter envelopes, and increase reliability. Design decisions for next-generation machines will be informed by unprecedented comprehensive and accurate modeling, as well as computer-aided engineering; all this will increase the likelihood that even their most advanced subsystems can be commissioned on time, within budget, and up to specifications. Advanced computing is also vital to developing new means of acceleration and exploring the behavior of beams under extreme conditions. With continued progress it will someday become reasonable to speak of a complete numerical model of all phenomena important to a particular accelerator.

  15. An adaptive model approach for quantitative wrist rigidity evaluation during deep brain stimulation surgery.

    PubMed

    Assis, Sofia; Costa, Pedro; Rosas, Maria Jose; Vaz, Rui; Silva Cunha, Joao Paulo

    2016-08-01

    Intraoperative evaluation of the efficacy of Deep Brain Stimulation includes evaluation of the effect on rigidity. A subjective semi-quantitative scale is used, dependent on the examiner's perception and experience. A system was proposed previously, aiming to tackle this subjectivity, using quantitative data and providing real-time feedback of the computed rigidity reduction, hence supporting the physician's decision. This system comprised a gyroscope-based motion sensor in a textile band, placed on the patient's hand, which communicated its measurements to a laptop. The latter computed a signal descriptor from the angular velocity of the hand during wrist flexion in DBS surgery. The first approach relied on using a general rigidity reduction model, regardless of the initial severity of the symptom. Thus, to enhance the performance of the previously presented system, we aimed to develop models for high and low baseline rigidity, according to the examiner's assessment before any stimulation. This would allow a more patient-oriented approach. Additionally, usability was improved by having in situ processing on a smartphone instead of a computer. Such a system has been shown to be reliable, presenting an accuracy of 82.0% and a mean error of 3.4%. Relative to previous results, the performance was similar, further supporting the importance of considering the cogwheel rigidity to better infer about the reduction in rigidity. Overall, we present a simple, wearable, mobile system, suitable for intraoperative conditions during DBS, supporting a physician in decision-making when setting stimulation parameters.
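    The abstract specifies only that a descriptor is computed from the hand's angular velocity during passive wrist flexion and mapped to a rigidity-reduction estimate with severity-specific models. The sketch below is a loose illustration under those assumptions: a made-up descriptor (mean absolute angular velocity during movement) and placeholder linear models per baseline-severity group. Neither the descriptor nor the coefficients come from the paper.

    ```python
    import numpy as np

    # Placeholder per-group linear models (slope, intercept) mapping the descriptor to an
    # estimated rigidity reduction in percent. Illustrative values only, not from the study.
    MODELS = {"high_baseline": (0.9, -12.0), "low_baseline": (0.6, -5.0)}

    def rigidity_descriptor(gyro_z, threshold=5.0):
        """Mean absolute angular velocity (deg/s) over samples where the wrist is moving.
        `gyro_z` is the flexion-axis gyroscope signal; `threshold` gates out rest periods."""
        moving = np.abs(gyro_z) > threshold
        return float(np.mean(np.abs(gyro_z[moving]))) if moving.any() else 0.0

    def estimated_reduction(gyro_z, baseline_group):
        slope, intercept = MODELS[baseline_group]
        return slope * rigidity_descriptor(gyro_z) + intercept

    # Usage with a synthetic gyroscope trace (bursts of flexion between rest periods).
    t = np.arange(0, 4, 0.01)
    signal = 80 * np.sin(2 * np.pi * 1.0 * t) * ((t % 2) < 1)
    print(f"estimated rigidity reduction: {estimated_reduction(signal, 'high_baseline'):.1f}%")
    ```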

  16. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads

    PubMed Central

    Stone, John E.; Hallock, Michael J.; Phillips, James C.; Peterson, Joseph R.; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-01-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers. PMID:27516922

  17. Image Algebra Matlab language version 2.3 for image processing and compression research

    NASA Astrophysics Data System (ADS)

    Schmalz, Mark S.; Ritter, Gerhard X.; Hayden, Eric

    2010-08-01

    Image algebra is a rigorous, concise notation that unifies linear and nonlinear mathematics in the image domain. Image algebra was developed under DARPA and US Air Force sponsorship at University of Florida for over 15 years beginning in 1984. Image algebra has been implemented in a variety of programming languages designed specifically to support the development of image processing and computer vision algorithms and software. The University of Florida has been associated with development of the languages FORTRAN, Ada, Lisp, and C++. The latter implementation involved a class library, iac++, that supported image algebra programming in C++. Since image processing and computer vision are generally performed with operands that are array-based, the Matlab™ programming language is ideal for implementing the common subset of image algebra. Objects include sets and set operations, images and operations on images, as well as templates and image-template convolution operations. This implementation, called Image Algebra Matlab (IAM), has been found to be useful for research in data, image, and video compression, as described herein. Due to the widespread acceptance of the Matlab programming language in the computing community, IAM offers exciting possibilities for supporting a large group of users. The control over an object's computational resources provided to the algorithm designer by Matlab means that IAM programs can employ versatile representations for the operands and operations of the algebra, which are supported by the underlying libraries written in Matlab. In a previous publication, we showed how the functionality of IAC++ could be carried forth into a Matlab implementation, and provided practical details of a prototype implementation called IAM Version 1. In this paper, we further elaborate the purpose and structure of image algebra, then present a maturing implementation of Image Algebra Matlab called IAM Version 2.3, which extends the previous implementation of IAM to include polymorphic operations over different point sets, as well as recursive convolution operations and functional composition. We also show how image algebra and IAM can be employed in image processing and compression research, as well as algorithm development and analysis.
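    Image algebra's core operands named above (images, templates, and image-template convolution) map naturally onto array operations. Since IAM itself is a Matlab library, the snippet below is only a language-neutral illustration in Python/NumPy of what a translation-invariant image-template convolution does; it is not IAM code.

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    # An "image" is an array of values over a point set; a "template" assigns a small
    # weight mask to each point. For translation-invariant templates, image-template
    # convolution reduces to ordinary discrete convolution.
    image = np.arange(25, dtype=float).reshape(5, 5)
    template = np.array([[0.0,  1.0, 0.0],
                         [1.0, -4.0, 1.0],
                         [0.0,  1.0, 0.0]])     # discrete Laplacian mask

    result = convolve(image, template, mode="nearest")
    print(result)
    ```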

  18. Development of on-line monitoring system for Nuclear Power Plant (NPP) using neuro-expert, noise analysis, and modified neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Subekti, M.; Center for Development of Reactor Safety Technology, National Nuclear Energy Agency of Indonesia, Puspiptek Complex BO.80, Serpong-Tangerang, 15340; Ohno, T.

    2006-07-01

    The neuro-expert has been utilized in previous monitoring-system research on the Pressurized Water Reactor (PWR). The research improved the monitoring system by utilizing the neuro-expert, conventional noise analysis, and modified neural networks for capability extension. The parallel application of these methods required a distributed computer-network architecture for performing real-time tasks. The research aimed to improve the previous monitoring system, which could detect sensor degradation, and to perform a monitoring demonstration in the High Temperature Engineering Test Reactor (HTTR). The monitoring system under development, based on methods that have been tested using data from an online PWR simulator as well as RSG-GAS (a 30 MW research reactor in Indonesia), will be applied in the HTTR for more complex monitoring. (authors)

  19. A unified algorithm for predicting partition coefficients for PBPK modeling of drugs and environmental chemicals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peyret, Thomas; Poulin, Patrick; Krishnan, Kannan, E-mail: kannan.krishnan@umontreal.ca

    The algorithms in the literature that aim to predict the tissue:blood PC (P_tb) for environmental chemicals and the tissue:plasma PC based on total (K_p) or unbound concentration (K_pu) for drugs differ in their consideration of binding to hemoglobin, plasma proteins, and charged phospholipids. The objective of the present study was to develop a unified algorithm such that P_tb, K_p, and K_pu for both drugs and environmental chemicals could be predicted. The development of the unified algorithm was accomplished by integrating all mechanistic algorithms previously published to compute the PCs. Furthermore, the algorithm was structured in such a way as to facilitate predictions of the distribution of organic compounds at the macro (i.e., whole tissue) and micro (i.e., cells and fluids) levels. The resulting unified algorithm was applied to compute the rat P_tb, K_p, or K_pu of muscle (n = 174), liver (n = 139), and adipose tissue (n = 141) for acidic, neutral, zwitterionic, and basic drugs as well as ketones, acetate esters, alcohols, aliphatic hydrocarbons, aromatic hydrocarbons, and ethers. The unified algorithm adequately reproduced the values predicted previously by the published algorithms for a total of 142 drugs and chemicals. The sensitivity analysis demonstrated the relative importance of the various compound properties reflective of specific mechanistic determinants relevant to prediction of PC values of drugs and environmental chemicals. Overall, the present unified algorithm uniquely facilitates the computation of macro- and micro-level PCs for developing organ and cellular-level PBPK models for both chemicals and drugs.

  20. [Effectivity of an occlusion-supporting PC-based visual training programme by horizontal drifting sinus gratings in children with amblyopia].

    PubMed

    Bau, V; Rose, K; Pollack, K; Spoerl, E; Pillunat, L E

    2012-10-01

    The studies of Kämpf et al. suggested that a computer-based stimulation therapy using drifting sinus gratings is efficacious in patients with anisometropic and/or strabismic amblyopia but provided no clear evidence. This is the first trial with amblyopic patients without previous treatment at the beginning of amblyopia therapy. A prospective, randomised, single-blinded, placebo-controlled study of n = 15 patients with anisometropic and/or strabismic amblyopia without previous treatment was performed. The patients were between 4 and 10 years of age, mean 6.3 years (± 2.0), all after full correction of refractive errors and refractive adaptation. Stimulation therapy was performed 5 times a week over 4 weeks, in sessions of 2 × 20 min, during which a drifting sinus grating of constant spatial and temporal frequency was combined with computer games (n = 8). The control group had only computer games with a neutral background (n = 7). In both groups, patching was done only during stimulation sessions. The stimulation and control groups did not differ with respect to age, gender, cause of amblyopia, baseline visual acuity, or time of wearing glasses. There was no significant difference in the development of visual acuity over the stimulation period between the stimulation and control groups. Stimulation therapy with drifting sinus gratings did not improve the development of visual acuity in the first phase of amblyopia treatment combined with minimal occlusion therapy. Accordingly, the stimulation therapy is not adequate to replace sufficient occlusion therapy. Whether this therapy could support patching therapy and improve acuity development in later therapy phases cannot be determined from this trial. Georg Thieme Verlag KG Stuttgart · New York.

  1. The development of expertise using an intelligent computer-aided training system

    NASA Technical Reports Server (NTRS)

    Johnson, Debra Steele

    1991-01-01

    An initial examination was conducted of an Intelligent Tutoring System (ITS) developed for use in industry. The ITS, developed by NASA, simulated a satellite deployment task. More specifically, the PD (Payload Assist Module Deployment)/ICAT (Intelligent Computer Aided Training) System simulated a nominal Payload Assist Module (PAM) deployment. The development of expertise on this task was examined using three Flight Dynamics Officer (FDO) candidates who had no previous experience with this task. The results indicated that performance improved rapidly until Trial 5, followed by more gradual improvements through Trial 12. The performance dimensions measured included performance speed, actions completed, errors, help required, and display fields checked. Suggestions for further refining the software and for deciding when to expose trainees to more difficult task scenarios are discussed. Further, the results provide an initial demonstration of the effectiveness of the PD/ICAT system in training the nominal PAM deployment task and indicate the potential benefits of using ITSs for training other FDO tasks.

  2. Issues and recommendations associated with distributed computation and data management systems for the space sciences

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The primary purpose of the report is to explore management approaches and technology developments for computation and data management systems designed to meet future needs in the space sciences. The report builds on work presented in previous solar-terrestrial and planetary reports, broadening the outlook to all of the space sciences and considering policy aspects related to coordination between data centers, missions, and ongoing research activities, because it is perceived that the rapid growth of data and the wide geographic distribution of relevant facilities will present especially troublesome problems for data archiving, distribution, and analysis.

  3. Reasoning by analogy as an aid to heuristic theorem proving.

    NASA Technical Reports Server (NTRS)

    Kling, R. E.

    1972-01-01

    When heuristic problem-solving programs are faced with large data bases that contain numbers of facts far in excess of those needed to solve any particular problem, their performance rapidly deteriorates. In this paper, the correspondence between a new unsolved problem and a previously solved analogous problem is computed and invoked to tailor large data bases to manageable sizes. This paper outlines the design of an algorithm for generating and exploiting analogies between theorems posed to a resolution-logic system. These algorithms are believed to be the first computationally feasible development of reasoning by analogy to be applied to heuristic theorem proving.

  4. Probability density of tunneled carrier states near heterojunctions calculated numerically by the scattering method.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wampler, William R.; Myers, Samuel M.; Modine, Normand A.

    2017-09-01

    The energy-dependent probability density of tunneled carrier states for arbitrarily specified longitudinal potential-energy profiles in planar bipolar devices is numerically computed using the scattering method. Results agree accurately with a previous treatment based on solution of the localized eigenvalue problem, where computation times are much greater. These developments enable quantitative treatment of tunneling-assisted recombination in irradiated heterojunction bipolar transistors, where band offsets may enhance the tunneling effect by orders of magnitude. The calculations also reveal the density of non-tunneled carrier states in spatially varying potentials, and thereby test the common approximation of uniform-bulk values for such densities.

  5. Approaches for scalable modeling and emulation of cyber systems : LDRD final report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, Jackson R.; Minnich, Ronald G.; Armstrong, Robert C.

    2009-09-01

    The goal of this research was to combine theoretical and computational approaches to better understand the potential emergent behaviors of large-scale cyber systems, such as networks of ~10^6 computers. The scale and sophistication of modern computer software, hardware, and deployed networked systems have significantly exceeded the computational research community's ability to understand, model, and predict current and future behaviors. This predictive understanding, however, is critical to the development of new approaches for proactively designing new systems or enhancing existing systems with robustness to current and future cyber threats, including distributed malware such as botnets. We have developed preliminary theoretical and modeling capabilities that can ultimately answer questions such as: How would we reboot the Internet if it were taken down? Can we change network protocols to make them more secure without disrupting existing Internet connectivity and traffic flow? We have begun to address these issues by developing new capabilities for understanding and modeling Internet systems at scale. Specifically, we have addressed the need for scalable network simulation by carrying out emulations of a network with ~10^6 virtualized operating system instances on a high-performance computing cluster - a 'virtual Internet'. We have also explored mappings between previously studied emergent behaviors of complex systems and their potential cyber counterparts. Our results provide foundational capabilities for further research toward understanding the effects of complexity in cyber systems, to allow anticipating and thwarting hackers.

  6. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    PubMed Central

    2010-01-01

    Background The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Results Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/. PMID:20482791

  7. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments.

    PubMed

    Ma, Jingming; Dykes, Carrie; Wu, Tao; Huang, Yangxin; Demeter, Lisa; Wu, Hulin

    2010-05-18

    The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
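    The vFitness abstracts state that relative fitness is estimated by linear regression over all time points of the growth-competition assay rather than by a two-point calculation, but they do not reproduce the model. A common formulation, shown here as an assumption rather than the tool's actual equations, is to regress the log ratio of mutant to reference counts on time; the slope estimates the difference in net growth rates. In this simple ratio form a common per-passage dilution cancels; vFitness's handling of dilution factors and measurement error is more involved.

    ```python
    import numpy as np

    def relative_fitness(times, mutant_counts, reference_counts):
        """Slope of ln(mutant/reference) versus time, fit by ordinary least squares
        over ALL time points (rather than only the first and last observation)."""
        y = np.log(np.asarray(mutant_counts, float) / np.asarray(reference_counts, float))
        t = np.asarray(times, float)
        slope, intercept = np.polyfit(t, y, 1)
        return slope, intercept

    # Synthetic competition assay: the mutant grows ~0.10/day faster than the reference.
    days = np.array([0, 1, 2, 3, 4, 5])
    noise = 1 + 0.05 * np.random.default_rng(2).normal(size=days.size)
    mut = 1e4 * np.exp(0.60 * days) * noise
    ref = 1e4 * np.exp(0.50 * days)
    d, _ = relative_fitness(days, mut, ref)
    print(f"estimated fitness difference: {d:.3f} per day (true value 0.10)")
    ```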

  8. RAID v2.0: an updated resource of RNA-associated interactions across organisms

    PubMed Central

    Yi, Ying; Zhao, Yue; Li, Chunhua; Zhang, Lin; Huang, Huiying; Li, Yana; Liu, Lanlan; Hou, Ping; Cui, Tianyu; Tan, Puwen; Hu, Yongfei; Zhang, Ting; Huang, Yan; Li, Xiaobo; Yu, Jia; Wang, Dong

    2017-01-01

    With the development of biotechnologies and computational prediction algorithms, the number of experimentally determined and computationally predicted RNA-associated interactions has grown rapidly in recent years. However, diverse RNA-associated interactions are scattered over a wide variety of resources and organisms, whereas a fully comprehensive view of diverse RNA-associated interactions is still not available for any species. Hence, we have updated the RAID database to version 2.0 (RAID v2.0, www.rna-society.org/raid/) by integrating experimentally determined and computationally predicted interactions, curated manually from the literature and from other database resources, under one common framework. The new developments in RAID v2.0 include (i) an over 850-fold increase in RNA-associated interactions compared to the previous version; (ii) numerous resources integrated with experimental or computational prediction evidence for each RNA-associated interaction; (iii) a reliability assessment for each RNA-associated interaction based on an integrative confidence score; and (iv) an increase of species coverage to 60. Consequently, RAID v2.0 recruits more than 5.27 million RNA-associated interactions, including more than 4 million RNA–RNA interactions and more than 1.2 million RNA–protein interactions, referring to nearly 130 000 RNA/protein symbols across 60 species. PMID:27899615
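    The database description mentions an integrative confidence score built from experimental and computational-prediction evidence without detailing the scoring. As a purely illustrative stand-in, the sketch below combines per-source reliabilities with a noisy-OR rule, a common way to integrate independent evidence; both the source weights and the rule itself are assumptions, not RAID's actual formula.

    ```python
    # Assumed per-source reliabilities (probability that a reported interaction is real);
    # illustrative values only, not RAID's actual weighting.
    SOURCE_RELIABILITY = {
        "strong_experiment": 0.90,   # e.g. low-throughput validated assay
        "weak_experiment":   0.60,   # e.g. high-throughput screen
        "prediction":        0.30,   # computational prediction only
    }

    def integrative_confidence(evidence_sources):
        """Noisy-OR combination: an interaction is scored false only if every
        independent evidence source is wrong."""
        p_all_wrong = 1.0
        for source in evidence_sources:
            p_all_wrong *= 1.0 - SOURCE_RELIABILITY[source]
        return 1.0 - p_all_wrong

    print(integrative_confidence(["prediction"]))                        # 0.30
    print(integrative_confidence(["weak_experiment", "prediction"]))     # 0.72
    print(integrative_confidence(["strong_experiment", "prediction"]))   # 0.93
    ```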

  9. Computing nucleon EDM on a lattice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abramczyk, Michael; Izubuchi, Taku

    I will discuss briefly recent changes in the methodology of computing the baryon EDM on a lattice. The associated correction substantially reduces presently existing lattice values for the proton and neutron theta-induced EDMs, so that even the most precise previous lattice results become consistent with zero. On one hand, this change removes previous disagreements between these lattice results and the phenomenological estimates of the nucleon EDM. On the other hand, the nucleon EDM becomes much harder to compute on a lattice. In addition, I will review the progress in computing quark chromo-EDM-induced nucleon EDM using chiral quark action.

  10. Computational identification of promising thermoelectric materials among known quasi-2D binary compounds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorai, Prashun; Toberer, Eric S.; Stevanović, Vladan

    Quasi low-dimensional structures are abundant among known thermoelectric materials, primarily because of their low lattice thermal conductivities. In this work, we have computationally assessed the potential of 427 known binary quasi-2D structures in 272 different chemistries for thermoelectric performance. To assess the thermoelectric performance, we employ an improved version of our previously developed descriptor for thermoelectric performance [Yan et al., Energy Environ. Sci., 2015, 8, 983]. The improvement is in the explicit treatment of van der Waals interactions in quasi-2D materials, which leads to significantly better predictions of their crystal structures and lattice thermal conductivities. The improved methodology correctly identifies known binary quasi-2D thermoelectric materials such as Sb2Te3, Bi2Te3, SnSe, SnS, InSe, and In2Se3. As a result, we propose candidate quasi-2D binary materials, a number of which have not been previously considered for thermoelectric applications.

  11. Design of a serotonin 4 receptor radiotracer with decreased lipophilicity for single photon emission computed tomography.

    PubMed

    Fresneau, Nathalie; Dumas, Noé; Tournier, Benjamin B; Fossey, Christine; Ballandonne, Céline; Lesnard, Aurélien; Millet, Philippe; Charnay, Yves; Cailly, Thomas; Bouillon, Jean-Philippe; Fabis, Frédéric

    2015-04-13

    With the aim of developing a suitable radiotracer for brain imaging of the serotonin 4 receptor subtype (5-HT4R) using single photon emission computed tomography (SPECT), we synthesized and evaluated a library of di- and triazaphenanthridines with lipophilicity values in the range expected to favour brain penetration and with demonstrated specific binding to the target of interest. By adding nitrogen atoms to previously described phenanthridine ligands that exhibited high nonspecific binding, we were able to design a radioiodinated compound [(125)I]14. This compound exhibited a binding affinity of 0.094 nM toward human 5-HT4R and a high selectivity over other serotonin receptor subtypes (5-HTR). In vivo SPECT imaging studies and competition experiments demonstrated that the decreased lipophilicity (in comparison with our previously reported compounds 4 and 5) allowed more specific labelling of the 5-HT4R-containing brain regions. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  12. USSR Report, Cybernetics, Computers and Automation Technology

    DTIC Science & Technology

    1987-04-02

    Communication Channel (NTR: PROBLEMY I RESHENIYA, No 14, 22 Jul-4 Aug 86) 52 EDUCATION Informatics and the National Information Resource (I. Chebotaru...the method of actions, which were successful in the past. The experience of previous developments is implemented in the prototype programs. Many data...of the converter lining, due to reduction of ferroalloy consumption, oxygen consumption and energy resource consumption and due to a decrease of

  13. Studies on Vapor Adsorption Systems

    NASA Technical Reports Server (NTRS)

    Shamsundar, N.; Ramotowski, M.

    1998-01-01

    The project consisted of performing experiments on single and dual bed vapor adsorption systems, thermodynamic cycle optimization, and thermal modeling. The work was described in a technical paper that appeared in conference proceedings and a Master's thesis, which were previously submitted to NASA. The present report describes some additional thermal modeling work done subsequently, and includes listings of computer codes developed during the project. Recommendations for future work are provided.

  14. Patrol force allocation for law enforcement: An introductory planning guide

    NASA Technical Reports Server (NTRS)

    Sohn, R. L.; Kennedy, R. D.

    1976-01-01

    Previous and current methods for analyzing police patrol forces are reviewed and discussed. The steps in developing an allocation analysis procedure are defined, including the prediction of the rate of calls for service, determination of the number of patrol units needed, designing sectors, and analyzing dispatch strategies. Existing computer programs used for this purpose are briefly described, and some results of their application are given.
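    The abstract does not give the guide's specific formulas, but the step of determining the number of patrol units needed from a predicted call-for-service rate is often illustrated with a simple M/M/c (Erlang C) queueing criterion. The sketch below is such an illustration under that assumption, not the guide's procedure; the function names, the 10% delay target, and the example figures are all assumptions.

        from math import factorial

        def erlang_c(offered_load, units):
            # Probability that an arriving call must wait for a free unit (M/M/c queue).
            if offered_load >= units:
                return 1.0
            top = (offered_load ** units / factorial(units)) * units / (units - offered_load)
            bottom = sum(offered_load ** k / factorial(k) for k in range(units)) + top
            return top / bottom

        def units_needed(calls_per_hour, hours_per_call, max_wait_probability=0.10):
            # Smallest number of patrol units keeping the delay probability under a target.
            offered_load = calls_per_hour * hours_per_call   # workload in Erlangs
            units = int(offered_load) + 1
            while erlang_c(offered_load, units) > max_wait_probability:
                units += 1
            return units

        # Illustrative figures only: 12 calls/hour, 45 min average service time.
        print(units_needed(12, 0.75))   # smallest fleet meeting the 10% delay target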

  15. Early Examples from the Integrated Multi-Satellite Retrievals for GPM (IMERG)

    NASA Astrophysics Data System (ADS)

    Huffman, George; Bolvin, David; Braithwaite, Daniel; Hsu, Kuolin; Joyce, Robert; Kidd, Christopher; Sorooshian, Soroosh; Xie, Pingping

    2014-05-01

    The U.S. GPM Science Team's Day-1 algorithm for computing combined precipitation estimates as part of GPM is the Integrated Multi-satellitE Retrievals for GPM (IMERG). The goal is to compute the best time series of (nearly) global precipitation from "all" precipitation-relevant satellites and global surface precipitation gauge analyses. IMERG is being developed as a unified U.S. algorithm drawing on strengths in the three contributing groups, whose previous work includes: 1) the TRMM Multi-satellite Precipitation Analysis (TMPA); 2) the CPC Morphing algorithm with Kalman Filtering (K-CMORPH); and 3) the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks using a Cloud Classification System (PERSIANN-CCS). We review the IMERG design and development, plans for testing, and current status. Some of the lessons learned in running and reprocessing the previous data sets include the importance of quality-controlling input data sets, strategies for coping with transitions in the various input data sets, and practical approaches to retrospective analysis of multiple output products (namely the real- and post-real-time data streams). IMERG output will be illustrated using early test data, including the variety of supporting fields, such as the merged-microwave and infrared estimates, and the precipitation type. We end by considering recent changes in input data specifications, the transition from TRMM-based calibration to GPM-based, and further "Day 2" development.

  16. Computational models for the analysis/design of hypersonic scramjet components. I - Combustor and nozzle models

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; Sinha, N.; Wolf, D. E.; York, B. J.

    1986-01-01

    An overview of computational models developed for the complete, design-oriented analysis of a scramjet propulsion system is provided. The modular approach taken involves the use of different PNS models to analyze the individual propulsion system components. The external compression and internal inlet flowfields are analyzed by the SCRAMP and SCRINT components discussed in Part II of this paper. The combustor is analyzed by the SCORCH code, which is based upon the SPLITP PNS pressure-split methodology formulated by Dash and Sinha. The nozzle is analyzed by the SCHNOZ code, which is based upon the SCIPVIS PNS shock-capturing methodology formulated by Dash and Wolf. The current status of these models, previous developments leading to this status, and progress toward future hybrid and 3D versions are discussed in this paper.

  17. Measurement of breast volume using body scan technology (computer-aided anthropometry).

    PubMed

    Veitch, Daisy; Burford, Karen; Dench, Phil; Dean, Nicola; Griffin, Philip

    2012-01-01

    Assessment of breast volume is an important tool for preoperative planning in various breast surgeries and other applications, such as bra development. Accurate assessment can improve the consistency and quality of surgery outcomes. This study outlines a non-invasive method to measure breast volume using a whole body 3D laser surface anatomy scanner, the Cyberware WBX. It expands on a previous publication where this method was validated against patients undergoing mastectomy. It specifically outlines and expands the computer-aided anthropometric (CAA) method for extracting breast volumes in a non-invasive way from patients enrolled in a breast reduction study at Flinders Medical Centre, South Australia. This step-by-step description allows others to replicate this work and provides an additional tool to assist them in their own clinical practice and development of designs.
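    As a rough illustration of how a volume can be extracted from a closed, triangulated surface of the kind produced by a whole-body scanner, the sketch below sums signed tetrahedron volumes (the divergence-theorem identity). It is not the CAA procedure described in the paper; the function name and the toy tetrahedron data are assumptions.

        import numpy as np

        def mesh_volume(vertices, faces):
            # Enclosed volume of a closed, outward-oriented triangle mesh:
            # sum of signed volumes of tetrahedra (origin, v0, v1, v2).
            v0 = vertices[faces[:, 0]]
            v1 = vertices[faces[:, 1]]
            v2 = vertices[faces[:, 2]]
            signed = np.einsum("ij,ij->i", v0, np.cross(v1, v2)) / 6.0
            return abs(signed.sum())

        # Toy check: a unit right tetrahedron has volume 1/6.
        verts = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
        tris = np.array([[0, 2, 1], [0, 1, 3], [0, 3, 2], [1, 2, 3]])
        print(mesh_volume(verts, tris))   # 0.1666...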

  18. The FITS model office ergonomics program: a model for best practice.

    PubMed

    Chim, Justine M Y

    2014-01-01

    An effective office ergonomics program can predict positive results in reducing musculoskeletal injury rates, enhancing productivity, and improving staff well-being and job satisfaction. Its objective is to provide a systematic solution to manage the potential risk of musculoskeletal disorders among computer users in an office setting. A FITS Model office ergonomics program has been developed, drawing on the legislative requirements for promoting the health and safety of workers who use computers for extended periods as well as on previous research findings. The model is informed by practical industrial knowledge in ergonomics, occupational health and safety management, and human resources management in Hong Kong and overseas. This paper proposes a comprehensive office ergonomics program, the FITS Model, which considers (1) Furniture Evaluation and Selection; (2) Individual Workstation Assessment; (3) Training and Education; and (4) Stretching Exercises and Rest Breaks as elements of an effective program. An experienced ergonomics practitioner should be included in the program design and implementation. Through the FITS Model Office Ergonomics Program, the risk of musculoskeletal disorders among computer users can be eliminated or minimized, and workplace health and safety and employees' wellness enhanced.

  19. A Cervico-Thoraco-Lumbar Multibody Dynamic Model for the Estimation of Joint Loads and Muscle Forces.

    PubMed

    Khurelbaatar, Tsolmonbaatar; Kim, Kyungsoo; Hyuk Kim, Yoon

    2015-11-01

    Computational musculoskeletal models have been developed to predict mechanical joint loads on the human spine, such as the forces and moments applied to vertebral and facet joints and the forces that act on ligaments and muscles, because of difficulties in the direct measurement of joint loads. However, many whole-spine models lack certain elements; for example, the detailed facet joints in the cervical region or in the whole spine may not be implemented. In this study, a detailed cervico-thoraco-lumbar multibody musculoskeletal model with all major ligaments, separated structures of facet contact and intervertebral disk joints, and the rib cage was developed. The model was validated by comparing the intersegmental rotations, ligament tensile forces, facet joint contact forces, compressive and shear forces on disks, and muscle forces to those reported in previous experimental and computational studies, both by region (cervical, thoracic, or lumbar) and for the whole model. The comparisons demonstrated that our whole-spine model is consistent with in vitro and in vivo experimental studies and with computational studies. The model developed in this study can be used in further studies to better understand spine structures and the injury mechanisms of spinal disorders.

  20. The provision of feedback through computer-based technology to promote self-managed post-stroke rehabilitation in the home.

    PubMed

    Parker, Jack; Mawson, Susan; Mountain, Gail; Nasr, Nasrin; Davies, Richard; Zheng, Huiru

    2014-11-01

    Building on previous research findings, this article describes the development of the feedback interfaces for a Personalised Self-Managed Rehabilitation System (PSMrS) for home-based post-stroke rehabilitation using computer-based technology. Embedded within a realistic evaluative methodological approach, the development of the feedback interfaces for the PSMrS involved the incorporation of existing and emerging theories and a hybrid of health and social sciences research and user-centred design methods. User testing confirmed that extrinsic feedback for home-based post-stroke rehabilitation through computer-based technology needs to be personalisable, accurate, rewarding and measurable. In addition, user testing also confirmed the feasibility of using specific components of the PSMrS. A number of key elements are crucial for the development and potential utilisation of technology in what is an inevitable shift towards the use of innovative methods of delivering post-stroke rehabilitation. This includes the specific elements that are essential for the promotion of self-managed rehabilitation and rehabilitative behaviour change; the impact of the context on the mechanisms; and, importantly, the need for reliability and accuracy of the technology.

  1. Space environment and lunar surface processes

    NASA Technical Reports Server (NTRS)

    Comstock, G. M.

    1979-01-01

    The development of a general rock/soil model capable of simulating, in a self-consistent manner, the mechanical and exposure history of an assemblage of solid and loose material from submicron to planetary size scales, applicable to lunar and other space-exposed planetary surfaces, is discussed. The model was incorporated into a computer code called MESS.2 (model for the evolution of space-exposed surfaces). MESS.2, which represents a considerable increase in sophistication and scope over previous soil and rock surface models, is described. The capabilities of previous models for near-surface soil and rock surfaces are compared with those of the rock/soil model, MESS.2.

  2. Nonlinear calculations of the time evolution of black hole accretion disks

    NASA Technical Reports Server (NTRS)

    Luo, C.

    1994-01-01

    Based on previous works on black hole accretion disks, I continue to explore the disk dynamics using the finite difference method to solve the highly nonlinear problem of the time-dependent alpha-disk equations. Here a radially zoned model is used to develop a computational scheme that accommodates a functional dependence of the viscosity parameter alpha on the disk scale height and/or surface density. This work builds on the author's previous work on the steady disk structure and the linear analysis of disk dynamics, with the aim of applying the results to X-ray emission from black hole candidates (i.e., multiple-state spectra, instabilities, QPOs, etc.).
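    As a hedged illustration of the kind of explicit finite-difference update used for time-dependent disk equations, the sketch below advances the standard one-dimensional thin-disk diffusion equation for the surface density on a fixed radial grid, with the viscosity supplied as a (possibly surface-density-dependent) function, which is how an alpha prescription introduces the nonlinearity. It is not the author's radially zoned scheme; the viscosity law, grid, time step, and boundary treatment are placeholders.

        import numpy as np

        def evolve_disk(sigma, r, viscosity, dt, steps):
            # Explicit finite-difference update of the thin-disk diffusion equation
            #   dSigma/dt = (3/r) d/dr [ sqrt(r) d/dr ( nu * Sigma * sqrt(r) ) ]
            # viscosity(r, sigma) may depend on Sigma, which is where the
            # nonlinearity of an alpha prescription enters.
            for _ in range(steps):
                nu = viscosity(r, sigma)
                g = nu * sigma * np.sqrt(r)
                flux = np.sqrt(r) * np.gradient(g, r)
                sigma = sigma + dt * 3.0 / r * np.gradient(flux, r)
                sigma[0], sigma[-1] = sigma[1], sigma[-2]   # crude zero-gradient edges
            return sigma

        # Illustrative run: a ring of material spreading under nu ~ alpha * sqrt(r).
        r = np.linspace(1.0, 20.0, 200)
        sigma0 = np.exp(-((r - 10.0) ** 2))
        sigma = evolve_disk(sigma0, r, lambda r, s: 1e-3 * np.sqrt(r), dt=1e-2, steps=500)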

  3. A high-performance seizure detection algorithm based on Discrete Wavelet Transform (DWT) and EEG

    PubMed Central

    Chen, Duo; Wan, Suiren; Xiang, Jing; Bao, Forrest Sheng

    2017-01-01

    In the past decade, the Discrete Wavelet Transform (DWT), a powerful time-frequency tool, has been widely used in computer-aided signal analysis of epileptic electroencephalography (EEG), such as the detection of seizures. One important hurdle in the application of DWT is the choice of DWT settings, which were chosen empirically or arbitrarily in previous works. This study aimed to develop a framework for automatically searching the optimal DWT settings to improve accuracy and to reduce the computational cost of seizure detection. To address this, we developed a method to decompose EEG data using 7 commonly used wavelet families, up to the maximum theoretical level of each mother wavelet. The wavelets and decomposition levels providing the highest accuracy in each wavelet family were then searched in an exhaustive selection of frequency bands, which showed optimal accuracy and low computational cost. The selection of frequency bands and features removed approximately 40% of redundancies. The developed algorithm achieved promising performance on two well-tested EEG datasets (accuracy >90% for both datasets). The experimental results demonstrate that the DWT settings substantially affect performance on seizure detection. Compared with existing wavelet-based seizure detection methods, the new approach is more accurate and transferable among datasets. PMID:28278203
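    A minimal sketch of the kind of search described, using PyWavelets: decompose a segment to the maximum theoretical level for a candidate mother wavelet and compare simple sub-band energy features across wavelet families. The wavelet names, the energy feature, and the synthetic signal are illustrative assumptions, not the paper's exact settings.

        import numpy as np
        import pywt

        def wavelet_band_energies(signal, wavelet="db4"):
            # Decompose a 1-D EEG segment to the maximum DWT level for the given
            # mother wavelet and return the relative energy of each sub-band.
            level = pywt.dwt_max_level(len(signal), pywt.Wavelet(wavelet).dec_len)
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            energies = np.array([np.sum(c ** 2) for c in coeffs])
            return energies / energies.sum()

        # Compare candidate mother wavelets on a synthetic segment.
        rng = np.random.default_rng(0)
        segment = rng.standard_normal(1024)
        for name in ("db4", "sym5", "coif3"):
            print(name, wavelet_band_energies(segment, name).round(3))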

  4. Training faculty in Bangladesh to use a microcomputer for public health: followup report.

    PubMed Central

    Gould, J B; Frerichs, R R

    1986-01-01

    In 1984 the Western Consortium for the Health Professions, Inc., under contract to the United Nations Fund for Population Activities (UNFPA), began a project to assist Bangladesh's National Institute for Preventive and Social Medicine (NIPSOM) in establishing a microcomputing capability. The project's goal was to enable NIPSOM to become self-sufficient in the analysis by microcomputer of health, population, and family planning data; program evaluation; and policy activities. Lack of a local microcomputer infrastructure demanded that a local team of experts be developed to run the system described in a previous report. Five NIPSOM faculty members--three of whom had taken the workshop held when the system was first installed--were assigned to a computer committee, which was responsible for the computer's well-being. Six months after the microcomputer system was installed, a second 2-week workshop was given. The consortium's consultant facilitated the development of a basic microcomputer course, which was taught by four members of the computer committee to an additional eight NIPSOM faculty members. Emphasis was placed on developing local self-reliance and the need to overcome obstacles imposed by the lack of local hardware and software support systems. A strategy is proposed for the successful introduction of microcomputers in developing countries. PMID:3097742

  5. The development and validation of a numerical integration method for non-linear viscoelastic modeling

    PubMed Central

    Ramo, Nicole L.; Puttlitz, Christian M.

    2018-01-01

    Compelling evidence that many biological soft tissues display both strain- and time-dependent behavior has led to the development of fully non-linear viscoelastic modeling techniques to represent the tissue’s mechanical response under dynamic conditions. Since the current stress state of a viscoelastic material is dependent on all previous loading events, numerical analyses are complicated by the requirement of computing and storing the stress at each step throughout the load history. This requirement quickly becomes computationally expensive, and in some cases intractable, for finite element models. Therefore, we have developed a strain-dependent numerical integration approach for capturing non-linear viscoelasticity that enables calculation of the current stress from a strain-dependent history state variable stored from the preceding time step only, which improves both fitting efficiency and computational tractability. This methodology was validated based on its ability to recover non-linear viscoelastic coefficients from simulated stress-relaxation (six strain levels) and dynamic cyclic (three frequencies) experimental stress-strain data. The model successfully fit each data set with average errors in recovered coefficients of 0.3% for stress-relaxation fits and 0.1% for cyclic. The results support the use of the presented methodology to develop linear or non-linear viscoelastic models from stress-relaxation or cyclic experimental data of biological soft tissues. PMID:29293558
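    A minimal sketch of the general device the abstract describes: carrying the load history in one state variable per viscoelastic branch, updated once per time step, here with a standard recursive Prony-series update for a one-dimensional quasi-linear material. This is not the authors' specific strain-dependent formulation; the elastic law, Prony constants, and loading are illustrative.

        import numpy as np

        def viscoelastic_stress(strain_history, dt, elastic_stress, g_inf, branches):
            # Recursive 1-D Prony-series update: the full load history is carried in
            # one history state variable per branch, updated each step, instead of
            # re-summing the whole convolution integral.
            # branches is a list of (g_i, tau_i) pairs; elastic_stress(strain) is the
            # instantaneous (possibly non-linear) elastic response.
            h = np.zeros(len(branches))
            se_prev = elastic_stress(strain_history[0])
            stresses = [g_inf * se_prev + h.sum()]
            for eps in strain_history[1:]:
                se = elastic_stress(eps)
                for i, (g_i, tau_i) in enumerate(branches):
                    a = np.exp(-dt / tau_i)
                    h[i] = a * h[i] + g_i * tau_i * (1.0 - a) / dt * (se - se_prev)
                stresses.append(g_inf * se + h.sum())
                se_prev = se
            return np.array(stresses)

        # Stress relaxation under a ramp-and-hold strain with a cubic elastic law.
        t = np.linspace(0.0, 10.0, 1001)
        strain = np.where(t < 0.1, t / 0.1, 1.0) * 0.05
        stress = viscoelastic_stress(strain, dt=t[1] - t[0],
                                     elastic_stress=lambda e: 1e3 * e + 5e4 * e ** 3,
                                     g_inf=0.4, branches=[(0.35, 0.5), (0.25, 5.0)])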

  6. Continued development of a detailed model of arc discharge dynamics

    NASA Technical Reports Server (NTRS)

    Beers, B. L.; Pine, V. W.; Ives, S. T.

    1982-01-01

    Using a previously developed set of codes (SEMC, CASCAD, ACORN), a parametric study was performed to quantify the parameters which describe the development of a single-electron-initiated avalanche into a negative tip streamer. The electron distribution function in Teflon is presented for values of the electric field in the range of four hundred million volts/meter to four billion volts/meter. A formulation of the scattering parameters is developed which shows that the transport can be represented by three independent variables. The distribution of ionization sites is used to initiate an avalanche. The self-consistent evolution of the avalanche is computed over the parameter range of the scattering set.

  7. Design and Development of Intelligent Electrodes for Future Digital Health Monitoring: A Review

    NASA Astrophysics Data System (ADS)

    Khairuddin, A. M.; Azir, K. N. F. Ku; Kan, P. Eh

    2018-03-01

    Electrodes are sensors used in electrocardiography (ECG) monitoring systems to diagnose heart diseases. Over the years, diverse types of electrodes have been designed and developed to improve ECG monitoring systems. More recently, with the technological advances and capabilities arising from the Internet of Things (IoT), cloud computing, and data analytics in personalized healthcare, researchers are attempting to design and develop more effective and flexible ECG devices using intelligent electrodes. This paper reviews previous works on electrodes used in electrocardiography (ECG) monitoring devices to identify the key features for designing and developing intelligent electrodes in digital health monitoring devices.

  8. Hollow cathodes as electron-emitting plasma contactors: Theory and computer modeling

    NASA Technical Reports Server (NTRS)

    Davis, V. A.; Katz, I.; Mandell, M. J.; Parks, D. E.

    1987-01-01

    Several researchers have suggested using hollow cathodes as plasma contactors for electrodynamic tethers, particularly to prevent the Shuttle Orbiter from charging to large negative potentials. Previous studies have shown that fluid models with anomalous scattering can describe the electron transport in hollow cathode generated plasmas. An improved theory of the hollow cathode plasmas is developed and computational results using the theory are compared with laboratory experiments. Numerical predictions for a hollow cathode plasma source of the type considered for use on the Shuttle are presented, as are three-dimensional NASCAP/LEO calculations of the emitted ion trajectories and the resulting potentials in the vicinity of the Orbiter. The computer calculations show that the hollow cathode plasma source makes vastly superior contact with the ionospheric plasma compared with either an electron gun or passive ion collection by the Orbiter.

  9. Cone beam computed tomography and intraoral radiography for diagnosis of dental abnormalities in dogs and cats

    PubMed Central

    Silva, Luiz Antonio F.; Barriviera, Mauricio; Januário, Alessandro L.; Bezerra, Ana Cristina B.; Fioravanti, Maria Clorinda S.

    2011-01-01

    The development of veterinary dentistry has substantially improved the ability to diagnose canine and feline dental abnormalities. Consequently, examinations previously performed only on humans are now available for small animals, thus improving diagnostic quality. This has increased the need for technical qualification of veterinary professionals and increased technological investments. This study evaluated the use of cone beam computed tomography and intraoral radiography as complementary exams for diagnosing dental abnormalities in dogs and cats. Cone beam computed tomography provided faster image acquisition with high image quality, was associated with low ionizing radiation levels, enabled image editing, and reduced the exam duration. Our results showed that intraoral radiography was an effective, low-cost method for dental examination with fast execution times, and that it can be performed during surgical procedures. PMID:22122905

  10. Steady-state and dynamic models for particle engulfment during solidification

    NASA Astrophysics Data System (ADS)

    Tao, Yutao; Yeckel, Andrew; Derby, Jeffrey J.

    2016-06-01

    Steady-state and dynamic models are developed to study the physical mechanisms that determine the pushing or engulfment of a solid particle at a moving solid-liquid interface. The mathematical model formulation rigorously accounts for energy and momentum conservation, while faithfully representing the interfacial phenomena affecting solidification phase change and particle motion. A numerical solution approach is developed using the Galerkin finite element method and elliptic mesh generation in an arbitrary Lagrangian-Eulerian implementation, thus allowing for a rigorous representation of forces and dynamics previously inaccessible by approaches using analytical approximations. We demonstrate that this model accurately computes the solidification interface shape while simultaneously resolving thin fluid layers around the particle that arise from premelting during particle engulfment. We reinterpret the significance of premelting via the definition of an unambiguous critical velocity for engulfment from steady-state analysis and bifurcation theory. We also explore the complicated transient behaviors that underlie the steady states of this system and posit the significance of dynamical behavior for engulfment events in many systems. We critically examine the onset of engulfment by comparing our computational predictions to those obtained using the analytical model of Rempel and Worster [29]. We assert that, while the accurate calculation of van der Waals repulsive forces remains an open issue, the computational model developed here provides a clear benefit over prior models for computing particle drag forces and other phenomena needed for the faithful simulation of particle engulfment.

  11. Semiautomated skeletonization of the pulmonary arterial tree in micro-CT images

    NASA Astrophysics Data System (ADS)

    Hanger, Christopher C.; Haworth, Steven T.; Molthen, Robert C.; Dawson, Christopher A.

    2001-05-01

    We present a simple and robust approach that utilizes planar images at different angular rotations combined with unfiltered back-projection to locate the central axes of the pulmonary arterial tree. Three-dimensional points are selected interactively by the user. The computer calculates a sub-volume unfiltered back-projection orthogonal to the vector connecting the two points and centered on the first point. Because more x-rays are absorbed at the thickest portion of the vessel, the darkest pixel in the unfiltered back-projection is assumed to be the center of the vessel. The computer replaces the first point with this newly calculated point. A second back-projection is calculated around the original point, orthogonal to a vector connecting the newly calculated first point and the user-determined second point. The darkest pixel within this reconstruction is determined, and the computer replaces the second point with the XYZ coordinates of that pixel. Following a vector based on a moving average of previously determined 3-dimensional points along the vessel's axis, the computer continues this skeletonization process until stopped by the user. The computer estimates the vessel diameter along the set of previously determined points using a method similar to the full width-half max algorithm. On all subsequent vessels, the process works the same way, except that at each point the distances between the current point and all previously determined points along different vessels are computed. If a distance is less than the previously estimated diameter, the vessels are assumed to branch. This user/computer interaction continues until the vascular tree has been skeletonized.
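    A schematic sketch of the centre-tracking loop described above: step along a moving-average direction, re-centre on the darkest pixel of a local reconstruction, and repeat. The local unfiltered back-projection is stubbed out as a caller-supplied function, and the step size, moving-average window, and stopping rule are assumptions rather than the paper's exact values.

        import numpy as np

        def track_vessel(seed, direction, local_patch, step=1.0, max_points=200):
            # Follow a vessel centreline: advance along a moving-average direction,
            # then snap to the darkest pixel of a small local reconstruction, which
            # is assumed to mark the vessel centre (thickest absorption path).
            # local_patch(point, direction) -> (patch, coords): patch is a 2-D
            # intensity array and coords has shape patch.shape + (3,), giving the
            # 3-D position of each patch pixel; in the paper this would come from a
            # sub-volume unfiltered back-projection (stubbed out here).
            points = [np.asarray(seed, dtype=float)]
            direction = np.asarray(direction, dtype=float)
            for _ in range(max_points):
                if len(points) >= 5:                      # moving-average direction
                    direction = points[-1] - points[-5]
                direction = direction / np.linalg.norm(direction)
                guess = points[-1] + step * direction
                patch, coords = local_patch(guess, direction)
                darkest = np.unravel_index(np.argmin(patch), patch.shape)
                points.append(np.asarray(coords[darkest], dtype=float))
            return np.array(points)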

  12. Logic integration of mRNA signals by an RNAi-based molecular computer.

    PubMed

    Xie, Zhen; Liu, Siyuan John; Bleris, Leonidas; Benenson, Yaakov

    2010-05-01

    Synthetic in vivo molecular 'computers' could rewire biological processes by establishing programmable, non-native pathways between molecular signals and biological responses. Multiple molecular computer prototypes have been shown to work in simple buffered solutions. Many of those prototypes were made of DNA strands and performed computations using cycles of annealing-digestion or strand displacement. We have previously introduced RNA interference (RNAi)-based computing as a way of implementing complex molecular logic in vivo. Because it also relies on nucleic acids for its operation, RNAi computing could benefit from the tools developed for DNA systems. However, these tools must be harnessed to produce bioactive components and be adapted for harsh operating environments that reflect in vivo conditions. In a step toward this goal, we report the construction and implementation of biosensors that 'transduce' mRNA levels into bioactive, small interfering RNA molecules via RNA strand exchange in a cell-free Drosophila embryo lysate, a step beyond simple buffered environments. We further integrate the sensors with our RNAi 'computational' module to evaluate two-input logic functions on mRNA concentrations. Our results show how RNA strand exchange can expand the utility of RNAi computing and point toward the possibility of using strand exchange in a native biological setting.

  13. Discovering Synergistic Drug Combination from a Computational Perspective.

    PubMed

    Ding, Pingjian; Luo, Jiawei; Liang, Cheng; Xiao, Qiu; Cao, Buwen; Li, Guanghui

    2018-03-30

    Synergistic drug combinations play an important role in the treatment of complex diseases. The identification of effective drug combinations is vital to further reduce side effects and improve therapeutic efficiency. In previous years, in vitro methods have been the main route to discovering synergistic drug combinations; however, in vitro screening is limited by its time and resource demands. Therefore, with the rapid development of computational models and the explosive growth of large-scale phenotypic data, computational methods for discovering synergistic drug combinations are an efficient and promising tool and contribute to precision medicine. How the computational model is constructed is key for these methods, and different computational strategies yield different performance. In this review, recent advancements in computational methods for predicting effective drug combinations are summarized from multiple aspects. First, various datasets utilized to discover synergistic drug combinations are summarized. Second, we discuss feature-based approaches and partition these methods into two classes: feature-based methods using similarity measures and feature-based methods using machine learning. Third, we discuss network-based approaches for uncovering synergistic drug combinations. Finally, we analyze and offer perspectives on computational methods for predicting effective drug combinations. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  14. A Rural Community's Involvement in the Design and Usability Testing of a Computer-Based Informed Consent Process for the Personalized Medicine Research Project

    PubMed Central

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success for improving participants’ understanding, satisfaction, and retention. This failure may be due to a lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, such as genetic clinical trials consents. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. The next step will involve contacting participants to measure whether understanding of what they consented to is retained over time. PMID:24273095

  15. A rural community's involvement in the design and usability testing of a computer-based informed consent process for the Personalized Medicine Research Project.

    PubMed

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success for improving participants' understanding, satisfaction, and retention. This failure may be due to a lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, including those for trials involving treatment of genetic disorders. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. The next step will involve contacting participants to measure whether understanding of what they consented to is retained over time. © 2013 Wiley Periodicals, Inc.

  16. Solubility prediction, solvate and cocrystal screening as tools for rational crystal engineering.

    PubMed

    Loschen, Christoph; Klamt, Andreas

    2015-06-01

    The fact that novel drug candidates are becoming increasingly insoluble is a major problem of current drug development. Computational tools may address this issue by screening for suitable solvents or by identifying potential novel cocrystal formers that increase bioavailability. In contrast to other more specialized methods, the fluid phase thermodynamics approach COSMO-RS (conductor-like screening model for real solvents) allows for a comprehensive treatment of drug solubility, solvate and cocrystal formation and many other thermodynamic properties in liquids. This article gives an overview of recent COSMO-RS developments that are of interest for drug development and contains several new application examples for solubility prediction and solvate/cocrystal screening. For all property predictions COSMO-RS has been used. The basic concept of COSMO-RS consists of using the screening charge density as computed from first principles calculations in combination with fast statistical thermodynamics to compute the chemical potential of a compound in solution. The fast and accurate assessment of drug solubility and the identification of suitable solvents, solvate or cocrystal formers is nowadays possible and may be used to complement modern drug development. Efficiency is increased by avoiding costly quantum-chemical computations using a database of previously computed molecular fragments. COSMO-RS theory can be applied to a range of physico-chemical properties, which are of interest in rational crystal engineering. Most notably, in combination with experimental reference data, accurate quantitative solubility predictions in any solvent or solvent mixture are possible. Additionally, COSMO-RS can be extended to the prediction of cocrystal formation, which results in considerable predictive accuracy concerning coformer screening. In a recent variant, costly quantum chemical calculations are avoided, resulting in a significant speed-up and ease of use. © 2015 Royal Pharmaceutical Society.

  17. Ergonomics standards and guidelines for computer workstation design and the impact on users' health - a review.

    PubMed

    Woo, E H C; White, P; Lai, C W K

    2016-03-01

    This paper presents an overview of global ergonomics standards and guidelines for design of computer workstations, with particular focus on their inconsistency and associated health risk impact. Overall, considerable disagreements were found in the design specifications of computer workstations globally, particularly in relation to the results from previous ergonomics research and the outcomes from current ergonomics standards and guidelines. To cope with the rapid advancement in computer technology, this article provides justifications and suggestions for modifications in the current ergonomics standards and guidelines for the design of computer workstations. Practitioner Summary: A research gap exists in ergonomics standards and guidelines for computer workstations. We explore the validity and generalisability of ergonomics recommendations by comparing previous ergonomics research through to recommendations and outcomes from current ergonomics standards and guidelines.

  18. Coupled Aerodynamic and Structural Sensitivity Analysis of a High-Speed Civil Transport

    NASA Technical Reports Server (NTRS)

    Mason, B. H.; Walsh, J. L.

    2001-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity finite-element structural analysis and computational fluid dynamics aerodynamic analysis. In a previous study, a multidisciplinary analysis system for a high-speed civil transport was formulated to integrate a set of existing discipline analysis codes, some of them computationally intensive. This paper is an extension of the previous study, in which the sensitivity analysis for the coupled aerodynamic and structural analysis problem is formulated and implemented. Uncoupled stress sensitivities computed with a constant load vector in a commercial finite element analysis code are compared to coupled aeroelastic sensitivities computed by finite differences. The computational expense of these sensitivity calculation methods is discussed.
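    A minimal sketch of coupled sensitivities by central finite differences, of the kind compared against the uncoupled analytic sensitivities above: each perturbed design requires re-converging the coupled aeroelastic analysis, which is what makes this route expensive. The coupled_analysis callable and the step size are placeholders, not the paper's codes.

        def finite_difference_sensitivities(coupled_analysis, design_vars, h=1e-4):
            # Central-difference sensitivity of a coupled response (e.g. a stress)
            # with respect to each design variable.  coupled_analysis(x) must
            # re-converge the coupled aeroelastic iteration for the perturbed design
            # x and return the scalar response.
            sens = []
            for i in range(len(design_vars)):
                x_plus = list(design_vars)
                x_minus = list(design_vars)
                x_plus[i] += h
                x_minus[i] -= h
                sens.append((coupled_analysis(x_plus) - coupled_analysis(x_minus)) / (2.0 * h))
            return sens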

  19. Intra Frame Coding In Advanced Video Coding Standard (H.264) to Obtain Consistent PSNR and Reduce Bit Rate for Diagonal Down Left Mode Using Gaussian Pulse

    NASA Astrophysics Data System (ADS)

    Manjanaik, N.; Parameshachari, B. D.; Hanumanthappa, S. N.; Banu, Reshma

    2017-08-01

    The intra prediction process of the H.264 video coding standard is used to code the first (intra) frame of a video and achieves better coding efficiency than earlier video coding standards. Intra frame coding reduces spatial pixel redundancy within the current frame, reduces computational complexity, and provides better rate-distortion performance. Intra frames are conventionally coded with Rate Distortion Optimization (RDO), which increases computational complexity and bit rate and reduces picture quality, making it difficult to use in real-time applications; many researchers have therefore developed fast mode decision algorithms for intra frame coding. Previous work on intra frame coding in H.264 using fast mode decision intra prediction algorithms based on various techniques incurred increased bit rate and degraded picture quality (PSNR) at different quantization parameters. Many earlier fast mode decision approaches reduced only the computational complexity, or saved encoding time, at the cost of increased bit rate and loss of picture quality. To avoid the increase in bit rate and the loss of picture quality, a better approach was developed. In this paper, a Gaussian pulse is applied to intra frame coding using the diagonal down-left intra prediction mode to achieve higher coding efficiency in terms of PSNR and bit rate. In the proposed method, the Gaussian pulse is multiplied with each 4x4 block of frequency-domain coefficients of the 4x4 sub-macroblocks of the current frame before quantization. Multiplying each 4x4 block of integer-transformed coefficients by the Gaussian pulse at the macroblock level scales the coefficient information in a reversible manner; the frequency samples are scaled in a known and controllable way without intermixing of coefficients, which prevents the picture from being badly degraded at higher quantization parameter values. The proposed work was implemented using MATLAB and the JM 18.6 reference software, and the PSNR, bit rate, and compression of intra frames of YUV video sequences at QCIF resolution were measured for different quantization parameter values with the Gaussian pulse applied in the diagonal down-left intra prediction mode. The simulation results are tabulated and compared with the previous algorithm of Tian et al. The proposed algorithm reduced the bit rate by 30.98% on average and maintained consistent picture quality for QCIF sequences compared to the method of Tian et al.
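    A hedged sketch of the coefficient-scaling idea as the abstract describes it: each 4x4 block of transform coefficients is multiplied elementwise by a 2-D Gaussian before quantization and divided by it after inverse quantization, so the scaling is reversible. The pulse width, the plain uniform quantizer standing in for the H.264 quantizer, and the random test block are assumptions, not the paper's parameters.

        import numpy as np

        def gaussian_pulse(size=4, sigma=1.5):
            # 2-D Gaussian weighting applied elementwise to a size x size block.
            ax = np.arange(size) - (size - 1) / 2.0
            xx, yy = np.meshgrid(ax, ax)
            return np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))

        def code_block(coeffs, qstep, pulse):
            # Scale, quantize, then undo the scaling after inverse quantization.
            # A plain uniform quantizer stands in for the H.264 quantizer here.
            levels = np.round(coeffs * pulse / qstep)
            return levels * qstep / pulse

        # Illustrative run on one random 4x4 block of transform coefficients.
        rng = np.random.default_rng(1)
        block = rng.normal(scale=40.0, size=(4, 4))
        recon = code_block(block, qstep=16.0, pulse=gaussian_pulse())
        print(np.abs(block - recon).max())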

  20. Allan deviation computations of a linear frequency synthesizer system using frequency domain techniques

    NASA Technical Reports Server (NTRS)

    Wu, Andy

    1995-01-01

    Allan deviation computations of linear frequency synthesizer systems have previously been reported using real-time simulations. Even though this takes less time than the actual measurement, it is still very time consuming to compute the Allan deviation for long sample times with the desired confidence level. Moreover, noise processes such as flicker phase noise and flicker frequency noise cannot be simulated precisely. The use of frequency domain techniques can overcome these drawbacks. In this paper the system error model of a fictitious linear frequency synthesizer is developed and its performance, using a cesium (Cs) atomic frequency standard (AFS) as a reference, is evaluated using frequency domain techniques. For a linear timing system, the power spectral density at the system output can be computed from known system transfer functions and known power spectral densities of the input noise sources. The resulting power spectral density can then be used to compute the Allan variance at the system output. Sensitivities of the Allan variance at the system output to each of its independent input noises are obtained; these are valuable for design trade-offs and troubleshooting.
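    The last step of the frequency-domain route outlined above is commonly done with the standard relation between the Allan variance and the one-sided PSD of fractional frequency at the system output, sigma_y^2(tau) = 2 * integral_0^inf S_y(f) sin^4(pi f tau) / (pi f tau)^2 df. The sketch below evaluates that integral numerically and checks it against white frequency noise, for which sigma_y^2(tau) = h0 / (2 tau); the frequency grid and noise level are illustrative, not the paper's system model.

        import numpy as np

        def allan_variance_from_psd(freqs, s_y, tau):
            # Allan variance at averaging time tau from the one-sided PSD of
            # fractional frequency S_y(f):
            #   sigma_y^2(tau) = 2 * int_0^inf S_y(f) sin^4(pi f tau) / (pi f tau)^2 df
            x = np.pi * freqs * tau
            kernel = np.where(x > 0.0, np.sin(x) ** 4 / x ** 2, 0.0)
            return 2.0 * np.trapz(s_y * kernel, freqs)

        # Check against white frequency noise: sigma_y^2(tau) = h0 / (2 tau).
        freqs = np.linspace(1e-4, 100.0, 1_000_000)
        h0 = 1e-22
        for tau in (1.0, 10.0):
            print(tau, allan_variance_from_psd(freqs, np.full_like(freqs, h0), tau),
                  h0 / (2.0 * tau))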

  1. The quantum computer game: citizen science

    NASA Astrophysics Data System (ADS)

    Damgaard, Sidse; Mølmer, Klaus; Sherson, Jacob

    2013-05-01

    Progress in the field of quantum computation is hampered by daunting technical challenges. Here we present an alternative approach to solving these challenges by enlisting the aid of computer players around the world. We have previously examined a quantum computation architecture involving ultracold atoms in optical lattices and strongly focused tweezers of light. In The Quantum Computer Game (see http://www.scienceathome.org/), we have encapsulated the time-dependent Schrödinger equation for the problem in a graphical user interface allowing for easy user input. Players can then search the parameter space with real-time graphical feedback in a game context, with a global high score that rewards short gate times and robustness to experimental errors. The game, which is still in a demo version, has so far been tried by several hundred players. Extensions of the approach to other models, such as Gross-Pitaevskii and Bose-Hubbard, are currently under development. The game has also been incorporated into science education at high-school and university level as an alternative method for teaching quantum mechanics. Initial quantitative evaluation results are very positive. AU Ideas Center for Community Driven Research, CODER.

  2. Acceleration of High Angular Momentum Electron Repulsion Integrals and Integral Derivatives on Graphics Processing Units.

    PubMed

    Miao, Yipu; Merz, Kenneth M

    2015-04-14

    We present an efficient implementation of ab initio self-consistent field (SCF) energy and gradient calculations that run on Compute Unified Device Architecture (CUDA) enabled graphics processing units (GPUs) using recurrence relations. We first discuss the machine-generated code that calculates the electron-repulsion integrals (ERIs) for different ERI types. Next we describe the porting of the SCF gradient calculation to GPUs, which results in an acceleration of the computation of the first-order derivatives of the ERIs. However, only s, p, and d ERIs and s and p derivatives could be executed simultaneously on GPUs using the current version of CUDA and generation of NVidia GPUs with a previously described algorithm [Miao and Merz J. Chem. Theory Comput. 2013, 9, 965-976]. Hence, we developed an algorithm to compute f-type ERIs and d-type ERI derivatives on GPUs. Our benchmarks show that GPU-enabled ERI and ERI derivative computation yielded speedups of 10-18 times relative to traditional CPU execution. An accuracy analysis using double-precision calculations demonstrates that the overall accuracy is satisfactory for most applications.

  3. Prosthetically directed implant placement using computer software to ensure precise placement and predictable prosthetic outcomes. Part 2: rapid-prototype medical modeling and stereolithographic drilling guides requiring bone exposure.

    PubMed

    Rosenfeld, Alan L; Mandelaris, George A; Tardieu, Philippe B

    2006-08-01

    The purpose of this paper is to expand on part 1 of this series (published in the previous issue) regarding the emerging future of computer-guided implant dentistry. This article will introduce the concept of rapid-prototype medical modeling as well as describe the utilization and fabrication of computer-generated surgical drilling guides used during implant surgery. The placement of dental implants has traditionally been an intuitive process, whereby the surgeon relies on mental navigation to achieve optimal implant positioning. Through rapid-prototype medical modeling and the stereolithographic process, surgical drilling guides (eg, SurgiGuide) can be created. These guides are generated from a surgical implant plan created with a computer software system that incorporates all relevant prosthetic information from which the surgical plan is developed. The utilization of computer-generated planning and stereolithographically generated surgical drilling guides embraces the concept of collaborative accountability and supersedes traditional mental navigation on all levels of implant therapy.

  4. Discharge Chamber Primary Electron Modeling Activities in Three-Dimensions

    NASA Technical Reports Server (NTRS)

    Steuber, Thomas J.

    2004-01-01

    Designing discharge chambers for ion thrusters involves many geometric configuration decisions. Various decisions will impact discharge chamber performance with respect to propellant utilization efficiency, ion production costs, and grid lifetime. These hardware design decisions can benefit from the assistance of computational modeling. Computational modeling for discharge chambers has been limited to two-dimensional codes that leveraged symmetry for interpretation into three-dimensional analysis. This paper presents model development activities towards a three-dimensional discharge chamber simulation to aid discharge chamber design decisions. Specifically, of the many geometric configuration decisions toward attainment of a worthy discharge chamber, this paper focuses on addressing magnetic circuit considerations with a three-dimensional discharge chamber simulation as a tool. With this tool, candidate discharge chamber magnetic circuit designs can be analyzed computationally to gain insight into factors that may influence discharge chamber performance such as: primary electron loss width in magnetic cusps, cathode tip position with respect to the low magnetic field volume, definition of a low magnetic field region, and maintenance of a low magnetic field region across the grid span. Corroborating experimental data will be obtained from mockup hardware tests. Initially, simulated candidate magnetic circuit designs will resemble previous successful thruster designs. To provide opportunity to improve beyond previous performance benchmarks, off-design modifications will be simulated and experimentally tested.

  5. Giant anterior sacral meningocele presenting as bacterial meningitis in a previously healthy adult.

    PubMed

    Miletic, D; Poljak, I; Eskinja, N; Valkovic, P; Sestan, B; Troselj-Vukic, B

    2008-02-01

    Meningocele may be asymptomatic and incidentally discovered. Presenting as a retrorectal mass, a sacral meningocele may produce urinary, rectal, and menstrual pain. Anterior sacral meningocele may be the cause of tethered cord syndrome. This article presents the case of a previously healthy 39-year-old man with a large meningeal herniation occupying the entire pelvis who developed symptoms of bacterial meningitis. A 39-year-old man was admitted with fever, chills, headache and photophobia. Escherichia coli was isolated from cerebrospinal fluid culture. Moderate improvement of the meningeal symptoms was noted with intravenous antibiotic therapy, but intense pain in the lower back associated with constipation, fecal and urinary incontinence, and saddle anesthesia developed. Abdominal ultrasound was negative. Plain radiographs and computed tomography demonstrated a sacral bone defect and a retrorectal expansive mass. MRI confirmed an anterior sacral meningocele with cord tethering. After posterior laminectomy and dural opening, the communication between the meningocele and the intrathecal compartment was obliterated. Computed tomography-guided percutaneous drainage through the ischiorectal fossa was performed to treat the residual presacral cyst. The delayed diagnosis in our patient was related to misleading signs of bacterial meningitis without symptoms of intrapelvic expansion until the second week of illness. In our patient, surgical treatment was unavoidable owing to refractory meningitis, acute back pain, and symptoms of a space-occupying pelvic lesion. The neurosurgical approach was successful in treating the meningitis and neurological disorders. Computed tomography-guided evacuation of the residual retrorectal cyst was less invasive than laparotomy, resulting in normalization of defecation and micturition despite incomplete evacuation. Further follow-up studies may provide insight into the most effective treatment of such conditions.

  6. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; McDougal, Matthew; Russell, Sam

    2012-01-01

    Objective: To develop a software application utilizing general purpose graphics processing units (GPUs) for the analysis of large sets of thermographic data. Background: Over the past few years, an increasing effort among scientists and engineers to utilize the GPU in a more general purpose fashion is allowing for supercomputer-level results at individual workstations. As data sets grow, the methods to work with them must grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU to allow for throughput that was previously reserved for compute clusters. These common computations have high degrees of data parallelism, that is, they are the same computation applied to a large set of data where the result does not depend on other data elements. Signal (image) processing is one area where GPUs are being used to greatly increase the performance of certain algorithms and analysis techniques. Technical Methodology/Approach: Apply massively parallel algorithms and data structures to the specific analysis requirements presented when working with thermographic data sets.

  7. Computational Methods in Drug Discovery

    PubMed Central

    Sliwoski, Gregory; Kothiwale, Sandeepkumar; Meiler, Jens

    2014-01-01

    Computer-aided drug discovery/design methods have played a major role in the development of therapeutically important small molecules for over three decades. These methods are broadly classified as either structure-based or ligand-based methods. Structure-based methods are in principle analogous to high-throughput screening in that both target and ligand structure information is imperative. Structure-based approaches include ligand docking, pharmacophore, and ligand design methods. The article discusses the theory behind the most important methods and recent successful applications. Ligand-based methods use only ligand information for predicting activity, depending on its similarity/dissimilarity to previously known active ligands. We review widely used ligand-based methods such as ligand-based pharmacophores, molecular descriptors, and quantitative structure-activity relationships. In addition, important tools such as target/ligand databases, homology modeling, ligand fingerprint methods, etc., necessary for successful implementation of various computer-aided drug discovery/design methods in a drug discovery campaign, are discussed. Finally, computational methods for toxicity prediction and optimization for favorable physiologic properties are discussed with successful examples from the literature. PMID:24381236

  8. Electron Impact Ionization: A New Parameterization for 100 eV to 1 MeV Electrons

    NASA Technical Reports Server (NTRS)

    Fang, Xiaohua; Randall, Cora E.; Lummerzheim, Dirk; Solomon, Stanley C.; Mills, Michael J.; Marsh, Daniel; Jackman, Charles H.; Wang, Wenbin; Lu, Gang

    2008-01-01

    Low, medium and high energy electrons can penetrate to the thermosphere (90-400 km; 55-240 miles) and mesosphere (50-90 km; 30-55 miles). These precipitating electrons ionize that region of the atmosphere, creating positively charged atoms and molecules and knocking off other negatively charged electrons. The precipitating electrons also create nitrogen-containing compounds along with other constituents. Since the electron precipitation amounts change within minutes, it is necessary to have a rapid method of computing the ionization and production of nitrogen-containing compounds for inclusion in computationally-demanding global models. A new methodology has been developed, which has parameterized a more detailed model computation of the ionizing impact of precipitating electrons over the very large range of 100 eV up to 1,000,000 eV. This new parameterization method is more accurate than a previous parameterization scheme, when compared with the more detailed model computation. Global models at the National Center for Atmospheric Research will use this new parameterization method in the near future.

  9. 2-D Model for Normal and Sickle Cell Blood Microcirculation

    NASA Astrophysics Data System (ADS)

    Tekleab, Yonatan; Harris, Wesley

    2011-11-01

    Sickle cell disease (SCD) is a genetic disorder that alters the red blood cell (RBC) structure and function such that hemoglobin (Hb) cannot effectively bind and release oxygen. Previous computational models have been designed to study the microcirculation for insight into blood disorders such as SCD. Our novel 2-D computational model represents a fast, time-efficient method developed to analyze flow dynamics, O2 diffusion, and cell deformation in the microcirculation. The model uses a finite difference, Crank-Nicolson scheme to compute the flow and O2 concentration, and the level set computational method to advect the RBC membrane on a staggered grid. Several sets of initial and boundary conditions were tested. Simulation data indicate that a few parameters significantly perturb the blood flow and O2 concentration profiles: specifically, the Hill coefficient, the arterial O2 partial pressure, the O2 partial pressure at 50% Hb saturation, and the cell membrane stiffness. Results were found to be consistent with those of Le Floch [2010] and Secomb [2006].
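    As a minimal illustration of the Crank-Nicolson scheme mentioned above, the sketch below advances a 1-D diffusion equation for an O2 concentration profile with fixed end values. The actual model is 2-D and coupled to the flow and a level-set membrane, none of which is reproduced here; the diffusivity, grid, and boundary values are placeholders.

        import numpy as np

        def crank_nicolson_step(c, d_coef, dx, dt):
            # One Crank-Nicolson step of 1-D diffusion dc/dt = D d2c/dx2 with fixed
            # (Dirichlet) end values; a minimal stand-in for the 2-D O2 solver.
            n = len(c)
            r = d_coef * dt / (2.0 * dx ** 2)
            A = np.eye(n) * (1.0 + 2.0 * r)
            B = np.eye(n) * (1.0 - 2.0 * r)
            for i in range(1, n - 1):
                A[i, i - 1] = A[i, i + 1] = -r
                B[i, i - 1] = B[i, i + 1] = r
            A[0, 0] = A[-1, -1] = 1.0
            B[0, :] = 0.0; B[0, 0] = 1.0        # hold boundary values fixed
            B[-1, :] = 0.0; B[-1, -1] = 1.0
            return np.linalg.solve(A, B @ c)

        # O2 diffusing from a high-concentration boundary into the domain.
        c = np.zeros(101); c[0] = 1.0
        for _ in range(200):
            c = crank_nicolson_step(c, d_coef=1e-3, dx=0.01, dt=0.05)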

  10. Digital pathology in nephrology clinical trials, research, and pathology practice.

    PubMed

    Barisoni, Laura; Hodgin, Jeffrey B

    2017-11-01

    In this review, we will discuss (i) how the recent advancements in digital technology and computational engineering are currently applied to nephropathology in the setting of clinical research, trials, and practice; (ii) the benefits of the new digital environment; (iii) how recognizing its challenges provides opportunities for transformation; and (iv) nephropathology in the upcoming era of kidney precision and predictive medicine. Recent studies highlighted how new standardized protocols facilitate the harmonization of digital pathology database infrastructure and morphologic, morphometric, and computer-aided quantitative analyses. Digital pathology enables robust protocols for clinical trials and research, with the potential to identify previously underused or unrecognized clinically useful parameters. The integration of digital pathology with molecular signatures is leading the way to establishing clinically relevant morpho-omic taxonomies of renal diseases. The introduction of digital pathology in clinical research and trials, and the progressive implementation of the modern software ecosystem, opens opportunities for the development of new predictive diagnostic paradigms and computer-aided algorithms, transforming the practice of renal disease into a modern computational science.

  11. Remotely Accessible Testbed for Software Defined Radio Development

    NASA Technical Reports Server (NTRS)

    Lux, James P.; Lang, Minh; Peters, Kenneth J.; Taylor, Gregory H.

    2012-01-01

    Previous development testbeds have assumed that the developer was physically present in front of the hardware being used. No provision for remote operation of basic functions (power on/off or reset) was made, because the developer/operator was sitting in front of the hardware, and could just push the button manually. In this innovation, a completely remotely accessible testbed has been created, with all diagnostic equipment and tools set up for remote access, and using standardized interfaces so that failed equipment can be quickly replaced. In this testbed, over 95% of the operating hours were used for testing without the developer being physically present. The testbed includes a pair of personal computers, one running Linux and one running Windows. A variety of peripherals is connected via Ethernet and USB (universal serial bus) interfaces. A private internal Ethernet is used to connect to test instruments and other devices, so that the sole connection to the outside world is via the two PCs. An important design consideration was that all of the instruments and interfaces used stable, long-lived industry standards, such as Ethernet, USB, and GPIB (general purpose interface bus). There are no plug-in cards for the two PCs, so there are no problems with finding replacement computers with matching interfaces, device drivers, and installation. The only thing unique to the two PCs is the locally developed software, which is not specific to computer or operating system version. If a device (including one of the computers) were to fail or become unavailable (e.g., a test instrument needed to be recalibrated), replacing it is a straightforward process with a standard, off-the-shelf device.

  12. Prediction of quantitative intrathoracic fluid volume to diagnose pulmonary oedema using LabVIEW.

    PubMed

    Urooj, Shabana; Khan, M; Ansari, A Q; Lay-Ekuakille, Aimé; Salhan, Ashok K

    2012-01-01

    Pulmonary oedema is a life-threatening disease that requires special attention in the area of research and clinical diagnosis. Computer-based techniques are rarely used to quantify the intrathoracic fluid volume (IFV) for diagnostic purposes. This paper discusses a software program developed to detect and diagnose pulmonary oedema using LabVIEW. The software operates on anthropometric dimensions and physiological parameters, mainly transthoracic electrical impedance (TEI), and is more accurate and faster than existing manual techniques. The LabVIEW software was used to compute the parameters required to quantify IFV. An equation relating per cent control and IFV was obtained. The results of predicted TEI and measured TEI were compared with previously reported data to validate the developed program. It was found that the predicted values of TEI obtained from the computer-based technique were much closer to the measured values of TEI. Six new subjects were enrolled to measure and predict transthoracic impedance and hence to quantify IFV. A similar difference was also observed in the measured and predicted values of TEI for the new subjects.

  13. The challenge of raising ethical awareness: a case-based aiding system for use by computing and ICT students.

    PubMed

    Sherratt, Don; Rogerson, Simon; Ben Fairweather, N

    2005-04-01

    Students, the future Information and Communication Technology (ICT) professionals, are often perceived to have little understanding of the ethical issues associated with the use of ICTs. There is a growing recognition that the moral issues associated with the use of the new technologies should be brought to the attention of students. Furthermore, they should be encouraged to explore and think more deeply about the social and legal consequences of the use of ICTs. This paper describes the development of a tool designed to raise students' awareness of the social impact of ICTs. The tool offers guidance to students undertaking computing and computer-related courses when considering the social, legal and professional implications of the actions of participants in situations of ethical conflict. However, unlike previous work in this field, this tool is not based on an artificial intelligence paradigm. Aspects of the theoretical basis for the design of the tool and the tool's practical development are discussed. Preliminary results from the testing of the tool are also discussed.

  14. Multiphase Modeling of Secondary Atomization in a Shock Environment

    NASA Astrophysics Data System (ADS)

    St. Clair, Jeffrey; McGrath, Thomas; Balachandar, Sivaramakrishnan

    2017-06-01

    Understanding and developing accurate modeling strategies for shock-particulate interaction remains a challenging and important topic, with application to energetic materials development, volcanic eruptions, and safety/risk assessment. This work presents computational modeling of compressible multiphase flows with shock-induced droplet atomization. Droplet size has a strong influence on the interphase momentum and heat transfer. A test case is presented that is sensitive to this, requiring dynamic modeling of the secondary atomization process that occurs when the shock impacts the droplets. An Eulerian-Eulerian computational model that treats all phases as compressible, is hyperbolic, and satisfies the second law of thermodynamics is applied. Four different breakup models are applied to the test case, in which a planar shock wave encounters a cloud of water droplets. The numerical results are compared with both experimental and previously generated modeling results. The effect of the drag relation used is also investigated. The computed results indicate the necessity of using a droplet breakup model for this application, and the relative accuracy of results obtained with the different droplet breakup and drag models is discussed.
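
    To make the role of droplet breakup modeling concrete, the sketch below evaluates a standard Weber-number breakup criterion for a water droplet behind a shock. The critical value and droplet properties are generic textbook numbers chosen for illustration; the four breakup models compared in the work are more elaborate than this single threshold test.

```python
# Hedged illustration of the kind of criterion a droplet breakup model uses:
# a Weber number threshold for breakup of a droplet in a gas stream.

def weber_number(rho_gas, u_rel, diameter, sigma):
    """We = rho_g * u_rel^2 * d / sigma."""
    return rho_gas * u_rel**2 * diameter / sigma

def breakup_expected(rho_gas, u_rel, diameter, sigma, we_crit=12.0):
    """A critical Weber number of about 12 is a commonly quoted bag-breakup threshold."""
    return weber_number(rho_gas, u_rel, diameter, sigma) > we_crit

# Water droplet (sigma ~ 0.072 N/m) with 100 m/s post-shock slip velocity.
print(breakup_expected(rho_gas=1.2, u_rel=100.0, diameter=100e-6, sigma=0.072))
```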

  15. Development of Compositionally Graded Metallic Glass Alloys with Desirable Properties

    DTIC Science & Technology

    2016-06-01

    individual study. Include participation in conferences, workshops, and seminars not listed under major activities. At OSU, one graduate student ... best glass former in a fraction of the experimental time of the previous study, which identified similar compositions of Cu57.6Zr32.4Ti10,13 and ... years, several experimental and computational studies have suggested a direct link between the population of shear transformation zones (STZs

  16. Advances in optimal routing through computer networks

    NASA Technical Reports Server (NTRS)

    Paz, I. M.

    1977-01-01

    The optimal routing problem is defined. Progress in solving the problem during the previous decade is reviewed, with special emphasis on technical developments made during the last few years. The relationships between the routing, the throughput, and the switching technology used are discussed and their future trends are reviewed. Economic aspects are also briefly considered. Modern technical approaches for handling the routing problems and, more generally, the flow control problems are reviewed.

  17. Ownership and Agency of an Independent Supernumerary Hand Induced by an Imitation Brain-Computer Interface.

    PubMed

    Bashford, Luke; Mehring, Carsten

    2016-01-01

    To study body ownership and control, illusions that elicit these feelings in non-body objects are widely used. Classically introduced with the Rubber Hand Illusion, these illusions have been replicated more recently in virtual reality and by using brain-computer interfaces. Traditionally these illusions investigate the replacement of a body part by an artificial counterpart; however, as brain-computer interface research develops, it offers the possibility to explore the case where non-body objects are controlled in addition to movements of our own limbs. We therefore propose a new illusion designed to test the feeling of ownership and control of an independent supernumerary hand. Subjects are under the impression that they control a virtual reality hand via a brain-computer interface, but in reality there is no causal connection between brain activity and virtual hand movement; instead, correct movements are displayed with 80% probability. These imitation brain-computer interface trials are interspersed with movements of both of the subjects' real hands, which are in view throughout the experiment. We show that subjects develop strong feelings of ownership and control over the third hand, despite only receiving visual feedback with no causal link to the actual brain signals. Our illusion is crucially different from previously reported studies because we demonstrate independent ownership and control of the third hand without loss of ownership in the real hands.

  18. Development of a Boundary Layer Property Interpolation Tool in Support of Orbiter Return To Flight

    NASA Technical Reports Server (NTRS)

    Greene, Francis A.; Hamilton, H. Harris

    2006-01-01

    A new tool was developed to predict the boundary layer quantities required by several physics-based predictive/analytic methods that assess damaged Orbiter tile. This new tool, the Boundary Layer Property Prediction (BLPROP) tool, supplies boundary layer values used in correlations that determine boundary layer transition onset and surface heating-rate augmentation/attenuation factors inside tile gouges (i.e. cavities). BLPROP interpolates through a database of computed solutions and provides boundary layer and wall data (delta, theta, Re(sub theta)/M(sub e), Re(sub theta), P(sub w), and q(sub w)) based on user input surface location and free stream conditions. Surface locations are limited to the Orbiter's windward surface. Constructed using predictions from an inviscid w/boundary-layer method and benchmark viscous CFD, the computed database covers the hypersonic continuum flight regime based on two reference flight trajectories. First-order one-dimensional Lagrange interpolation accounts for Mach number and angle-of-attack variations, whereas non-dimensional normalization accounts for differences between the reference and input Reynolds number. Employing the same computational methods used to construct the database, solutions at other trajectory points taken from previous STS flights were computed; these results validate the BLPROP algorithm. Percentage differences between interpolated and computed values are presented and are used to establish the level of uncertainty of the new tool.
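
    The interpolation strategy described above is simple to sketch. The snippet below performs first-order (linear) Lagrange interpolation in Mach number and angle of attack over a tiny placeholder table; the grid values, variable names, and the quantity interpolated are invented for illustration and are not taken from the BLPROP database.

```python
# First-order (linear) Lagrange interpolation in two flight-condition variables.
def lagrange1(x, x0, x1, f0, f1):
    """First-order Lagrange interpolation between two database points."""
    return f0 * (x - x1) / (x0 - x1) + f1 * (x - x0) / (x1 - x0)

def interp_mach_aoa(mach, aoa, db):
    """db maps (mach, aoa) -> a boundary layer quantity at the bracketing grid points."""
    (m0, m1), (a0, a1) = db["mach_pts"], db["aoa_pts"]
    f_a0 = lagrange1(mach, m0, m1, db[(m0, a0)], db[(m1, a0)])
    f_a1 = lagrange1(mach, m0, m1, db[(m0, a1)], db[(m1, a1)])
    return lagrange1(aoa, a0, a1, f_a0, f_a1)

# Hypothetical bracketing points and values (e.g., a momentum thickness in meters).
db = {"mach_pts": (18.0, 22.0), "aoa_pts": (35.0, 40.0),
      (18.0, 35.0): 0.011, (22.0, 35.0): 0.013,
      (18.0, 40.0): 0.012, (22.0, 40.0): 0.015}
print(interp_mach_aoa(20.0, 38.0, db))
```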

  19. Study of basic computer competence among public health nurses in Taiwan.

    PubMed

    Yang, Kuei-Feng; Yu, Shu; Lin, Ming-Sheng; Hsu, Chia-Ling

    2004-03-01

    Rapid advances in information technology and media have made distance learning on the Internet possible. This new model of learning allows greater efficiency and flexibility in knowledge acquisition. Since basic computer competence is a prerequisite for this new learning model, this study was conducted to examine the basic computer competence of public health nurses in Taiwan and explore factors influencing computer competence. A national cross-sectional randomized study was conducted with 329 public health nurses. A questionnaire was used to collect data and was delivered by mail. Results indicate that the basic computer competence of public health nurses in Taiwan still needs to be improved (mean = 57.57 ± 2.83, total score range 26-130). Among the five most frequently used software programs, nurses were most knowledgeable about Word and least knowledgeable about PowerPoint. Stepwise multiple regression analysis revealed eight variables (weekly number of hours spent online at home, weekly amount of time spent online at work, weekly frequency of computer use at work, previous computer training, computer at workplace and Internet access, job position, education level, and age) that significantly influenced computer competence and together accounted for 39.0% of the variance. In conclusion, greater computer competence, broader educational programs regarding computer technology, and a greater emphasis on computers at work are necessary to increase the usefulness of distance learning via the Internet in Taiwan. Building a user-friendly environment is important in developing this new media model of learning for the future.

  20. Improving Simulated Annealing by Recasting it as a Non-Cooperative Game

    NASA Technical Reports Server (NTRS)

    Wolpert, David; Bandari, Esfandiar; Tumer, Kagan

    2001-01-01

    The game-theoretic field of COllective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved "as a side-effect". Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed game-theory-motivated algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting improves simulated annealing by several orders of magnitude for spin glass relaxation and bin-packing.
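
    For context, the sketch below is a conventional simulated annealing loop applied to a small spin-glass relaxation problem, i.e., the kind of baseline that the game-theoretic recasting is designed to improve upon. It is not the COIN algorithm described in the paper; the coupling matrix and cooling schedule are arbitrary placeholders.

```python
# Conventional simulated annealing on a toy spin glass (baseline, not COIN).
import math, random

def spin_glass_energy(spins, J):
    return -sum(J[i][j] * spins[i] * spins[j]
                for i in range(len(spins)) for j in range(i + 1, len(spins)))

def simulated_annealing(J, n, steps=5000, t0=2.0, cooling=0.999):
    spins = [random.choice([-1, 1]) for _ in range(n)]
    energy, temp = spin_glass_energy(spins, J), t0
    for _ in range(steps):
        i = random.randrange(n)                 # each move flips one variable
        spins[i] *= -1
        new_energy = spin_glass_energy(spins, J)
        if new_energy > energy and random.random() > math.exp((energy - new_energy) / temp):
            spins[i] *= -1                      # reject the uphill move: undo the flip
        else:
            energy = new_energy
        temp *= cooling                         # geometric cooling schedule
    return spins, energy

n = 8
J = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]
print(simulated_annealing(J, n)[1])
```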

  1. A computer-aided detection (CAD) system with a 3D algorithm for small acute intracranial hemorrhage

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Fernandez, James; Deshpande, Ruchi; Lee, Joon K.; Chan, Tao; Liu, Brent

    2012-02-01

    Acute intracranial hemorrhage (AIH) requires urgent diagnosis in the emergency setting to mitigate eventual sequelae. However, experienced radiologists may not always be available to make a timely diagnosis. This is especially true for small AIH, defined as lesions smaller than 10 mm in size. A computer-aided detection (CAD) system for the detection of small AIH would facilitate timely diagnosis. A previously developed 2D algorithm shows high false positive rates in the evaluation based on LAC/USC cases, due to the limitation of setting up a correct coordinate system for the knowledge-based classification system. To achieve higher sensitivity and specificity, a new 3D algorithm was developed. The algorithm utilizes a top-hat transformation and a dynamic threshold map to detect small AIH lesions. Several key structures of the brain are detected and are used to set up a 3D anatomical coordinate system. A rule-based classification of the detected lesions is applied based on the anatomical coordinate system. For convenient evaluation in the clinical environment, the CAD module is integrated with a stand-alone system. The CAD is evaluated with small AIH cases and matched normal cases collected at LAC/USC. The results of the 3D CAD and the previous 2D CAD are compared.
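
    The image operations named above are easy to illustrate. The sketch below applies a white top-hat transform to a synthetic slice and thresholds the result to flag small bright candidates; the structuring-element size, threshold, and test image are placeholders, and the real system's dynamic threshold map, 3D anatomical coordinate system, and rule-based classifier are not reproduced.

```python
# Top-hat transform plus a simple fixed threshold to flag small bright candidates.
import numpy as np
from scipy import ndimage

def candidate_lesions(ct_slice, structure_size=9, threshold=40.0):
    """Return a boolean mask of small bright candidates in one CT slice (HU)."""
    # The white top-hat keeps structures brighter than their neighborhood and
    # smaller than the structuring element.
    tophat = ndimage.white_tophat(ct_slice, size=structure_size)
    return tophat > threshold

ct_slice = np.random.normal(30.0, 5.0, size=(256, 256))   # placeholder image
ct_slice[100:104, 120:124] += 60.0                         # synthetic small lesion
print(candidate_lesions(ct_slice).sum())
```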

  2. Investigating the Transonic Flutter Boundary of the Benchmark Supercritical Wing

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Chwalowski, Pawel

    2017-01-01

    This paper builds on the computational aeroelastic results published previously and generated in support of the second Aeroelastic Prediction Workshop for the NASA Benchmark Supercritical Wing configuration. The computational results are obtained using FUN3D, an unstructured grid Reynolds-Averaged Navier-Stokes solver developed at the NASA Langley Research Center. The analysis results focus on understanding the dip in the transonic flutter boundary at a single Mach number (0.74), exploring an angle-of-attack range of -1 to 8 degrees and dynamic pressures from wind off to beyond flutter onset. The rigid analysis results are examined for insights into the behavior of the aeroelastic system. Both static and dynamic aeroelastic simulation results are also examined.

  3. Computing arbitrary defect structures on arbitrary lattices on arbitrary geometries from arbitrary energies

    NASA Astrophysics Data System (ADS)

    Allen, Brian; Travesset, Alex

    2004-03-01

    Dislocations and disclinations play a fundamental role in the properties of two-dimensional crystals. In this talk, it will be shown that a general computational framework can be developed by combining previous work of Seung and Nelson* with modern advances in object-oriented design. This allows separating the problem into independent classes such as: geometry (sphere, plane, torus, ...), lattice (triangular, square, etc.), type of defect (dislocation, disclination, etc.), boundary conditions, type of order (crystalline, hexatic), or energy functional. As applications, the ground state of crystals in several geometries will be discussed. Experimental examples with colloidal particles will be shown. *S. Seung and D. Nelson, Phys. Rev. A 38, 1005 (1988)
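
    The class separation described above might look roughly like the sketch below, where geometry, lattice, and energy functional are independent abstract interfaces that a simulation combines freely. The class and method names are assumptions made for illustration, not the abstract's actual API.

```python
# Assumed object-oriented skeleton: independent, interchangeable building blocks.
from abc import ABC, abstractmethod

class Geometry(ABC):
    @abstractmethod
    def embed(self, uv):            # map lattice coordinates to 3D positions
        ...

class Lattice(ABC):
    @abstractmethod
    def neighbors(self, site):      # connectivity (triangular, square, ...)
        ...

class EnergyFunctional(ABC):
    @abstractmethod
    def energy(self, positions, bonds):
        ...

class Sphere(Geometry):
    def __init__(self, radius):
        self.radius = radius
    def embed(self, uv):
        import math
        theta, phi = uv
        return (self.radius * math.sin(theta) * math.cos(phi),
                self.radius * math.sin(theta) * math.sin(phi),
                self.radius * math.cos(theta))

# A simulation would then be assembled from any compatible combination, e.g.
# relax(Sphere(1.0), TriangularLattice(), HarmonicBonds(), defects=[...])
# (TriangularLattice, HarmonicBonds, relax are hypothetical names).
```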

  4. Instrumentation and control of harmonic oscillators via a single-board microprocessor-FPGA device.

    PubMed

    Picone, Rico A R; Davis, Solomon; Devine, Cameron; Garbini, Joseph L; Sidles, John A

    2017-04-01

    We report the development of an instrumentation and control system instantiated on a microprocessor-field programmable gate array (FPGA) device for a harmonic oscillator comprising a portion of a magnetic resonance force microscope. The specific advantages of the system are that it minimizes computation, increases maintainability, and reduces the technical barrier required to enter the experimental field of magnetic resonance force microscopy. Heterodyne digital control and measurement yields computational advantages. A single microprocessor-FPGA device improves system maintainability by using a single programming language. The system presented requires significantly less technical expertise to instantiate than the instrumentation of previous systems, yet integrity of performance is retained and demonstrated with experimental data.

  5. Instrumentation and control of harmonic oscillators via a single-board microprocessor-FPGA device

    NASA Astrophysics Data System (ADS)

    Picone, Rico A. R.; Davis, Solomon; Devine, Cameron; Garbini, Joseph L.; Sidles, John A.

    2017-04-01

    We report the development of an instrumentation and control system instantiated on a microprocessor-field programmable gate array (FPGA) device for a harmonic oscillator comprising a portion of a magnetic resonance force microscope. The specific advantages of the system are that it minimizes computation, increases maintainability, and reduces the technical barrier required to enter the experimental field of magnetic resonance force microscopy. Heterodyne digital control and measurement yields computational advantages. A single microprocessor-FPGA device improves system maintainability by using a single programming language. The system presented requires significantly less technical expertise to instantiate than the instrumentation of previous systems, yet integrity of performance is retained and demonstrated with experimental data.
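
    The heterodyne measurement idea mentioned in both records can be sketched in a few lines: mix the oscillator signal with quadrature references at the drive frequency and low-pass the products to recover amplitude and phase. The sample rate, oscillator frequency, and moving-average filter below are placeholders and do not describe the authors' FPGA implementation.

```python
# Heterodyne (lock-in style) demodulation of a single-tone oscillator signal.
import numpy as np

def heterodyne_demodulate(signal, f_ref, fs, navg=1000):
    """Mix with quadrature references and low-pass (moving average) to get I/Q."""
    t = np.arange(signal.size) / fs
    i_mix = signal * np.cos(2 * np.pi * f_ref * t)
    q_mix = signal * np.sin(2 * np.pi * f_ref * t)
    kernel = np.ones(navg) / navg
    i_lp = np.convolve(i_mix, kernel, mode="same")
    q_lp = np.convolve(q_mix, kernel, mode="same")
    amplitude = 2 * np.hypot(i_lp, q_lp)
    phase = np.arctan2(q_lp, i_lp)
    return amplitude, phase

fs, f0 = 1.0e6, 8.0e3                      # placeholder sample rate and oscillator frequency
t = np.arange(200000) / fs
sig = 1e-3 * np.cos(2 * np.pi * f0 * t + 0.3)
amp, ph = heterodyne_demodulate(sig, f0, fs)
print(amp[100000], ph[100000])
```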

  6. Digital SAR processing using a fast polynomial transform

    NASA Technical Reports Server (NTRS)

    Truong, T. K.; Lipes, R. G.; Butman, S. A.; Reed, I. S.; Rubin, A. L.

    1984-01-01

    A new digital processing algorithm based on the fast polynomial transform is developed for producing images from Synthetic Aperture Radar data. This algorithm enables the computation of the two dimensional cyclic correlation of the raw echo data with the impulse response of a point target, thereby reducing distortions inherent in one dimensional transforms. This SAR processing technique was evaluated on a general-purpose computer and an actual Seasat SAR image was produced. However, regular production runs will require a dedicated facility. It is expected that such a new SAR processing algorithm could provide the basis for a real-time SAR correlator implementation in the Deep Space Network. Previously announced in STAR as N82-11295
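
    The fast polynomial transform itself is not reproduced here, but the operation it accelerates, a two-dimensional cyclic correlation of the raw echo data with a point-target impulse response, can be illustrated with ordinary FFTs as in the sketch below; the array sizes and data are placeholders.

```python
# 2-D circular cross-correlation via the Fourier convolution theorem.
import numpy as np

def cyclic_correlate_2d(echo, impulse_response):
    """Cyclic correlation of an echo block with a point-target impulse response."""
    E = np.fft.fft2(echo)
    H = np.fft.fft2(impulse_response, s=echo.shape)
    return np.real(np.fft.ifft2(E * np.conj(H)))

echo = np.random.randn(64, 64)             # placeholder raw SAR echo block
psf = np.zeros((64, 64)); psf[0, 0] = 1.0  # trivial point-target response
image = cyclic_correlate_2d(echo, psf)
```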

  7. Determination of optimal parameters for dual-layer cathode of polymer electrolyte fuel cell using computational intelligence-aided design.

    PubMed

    Chen, Yi; Huang, Weina; Peng, Bei

    2014-01-01

    Because of the demands for sustainable and renewable energy, fuel cells have become increasingly popular, particularly the polymer electrolyte fuel cell (PEFC). Among the various components, the cathode plays a key role in the operation of a PEFC. In this study, a quantitative dual-layer cathode model was proposed for determining the optimal parameters that minimize the over-potential difference η and improve the efficiency using a newly developed bat swarm algorithm with a variable population embedded in the computational intelligence-aided design. The simulation results were in agreement with previously reported results, suggesting that the proposed technique has potential applications for automating and optimizing the design of PEFCs.
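
    For readers unfamiliar with the metaheuristic named above, the sketch below implements a standard fixed-population bat algorithm minimizing a placeholder objective. The study's variable-population variant and its dual-layer cathode model (whose over-potential difference η is the real objective) are not reproduced.

```python
# Standard bat algorithm (fixed population) minimizing a placeholder objective.
import random

def bat_algorithm(objective, dim, n_bats=20, iters=200,
                  f_min=0.0, f_max=2.0, loudness=0.5, pulse_rate=0.5):
    pos = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_bats)]
    vel = [[0.0] * dim for _ in range(n_bats)]
    best = min(pos, key=objective)
    for _ in range(iters):
        for i in range(n_bats):
            freq = f_min + (f_max - f_min) * random.random()
            vel[i] = [v + (x - b) * freq for v, x, b in zip(vel[i], pos[i], best)]
            cand = [x + v for x, v in zip(pos[i], vel[i])]
            if random.random() > pulse_rate:           # local random walk near the best bat
                cand = [b + 0.01 * random.gauss(0, 1) for b in best]
            if random.random() < loudness and objective(cand) < objective(pos[i]):
                pos[i] = cand
            if objective(pos[i]) < objective(best):
                best = list(pos[i])
    return best, objective(best)

# Placeholder objective standing in for the over-potential difference.
print(bat_algorithm(lambda x: sum(xi**2 for xi in x), dim=3))
```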

  8. Sustaining and Extending the Open Science Grid: Science Innovation on a PetaScale Nationwide Facility (DE-FC02-06ER41436) SciDAC-2 Closeout Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livny, Miron; Shank, James; Ernst, Michael

    Under this SciDAC-2 grant the project's goal was to stimulate new discoveries by providing scientists with effective and dependable access to an unprecedented national distributed computational facility: the Open Science Grid (OSG). We proposed to achieve this through the work of the Open Science Grid Consortium: a unique hands-on multi-disciplinary collaboration of scientists, software developers and providers of computing resources. Together the stakeholders in this consortium sustain and use a shared distributed computing environment that transforms simulation and experimental science in the US. The OSG consortium is an open collaboration that actively engages new research communities. We operate an open facility that brings together a broad spectrum of compute, storage, and networking resources and interfaces to other cyberinfrastructures, including the US XSEDE (previously TeraGrid), the European Grids for ESciencE (EGEE), as well as campus and regional grids. We leverage middleware provided by computer science groups, facility IT support organizations, and computing programs of application communities for the benefit of consortium members and the US national CI.

  9. Identification of Protein–Excipient Interaction Hotspots Using Computational Approaches

    PubMed Central

    Barata, Teresa S.; Zhang, Cheng; Dalby, Paul A.; Brocchini, Steve; Zloh, Mire

    2016-01-01

    Protein formulation development relies on the selection of excipients that inhibit protein–protein interactions preventing aggregation. Empirical strategies involve screening many excipient and buffer combinations using forced degradation studies. Such methods do not readily provide information on intermolecular interactions responsible for the protective effects of excipients. This study describes a molecular docking approach to screen and rank interactions allowing for the identification of protein–excipient hotspots to aid in the selection of excipients to be experimentally screened. Previously published work with Drosophila Su(dx) was used to develop and validate the computational methodology, which was then used to determine the formulation hotspots for Fab A33. Commonly used excipients were examined and compared to the regions in Fab A33 prone to protein–protein interactions that could lead to aggregation. This approach could provide information on a molecular level about the protective interactions of excipients in protein formulations to aid the more rational development of future formulations. PMID:27258262

  10. The Unification of Space Qualified Integrated Circuits by Example of International Space Project GAMMA-400

    NASA Astrophysics Data System (ADS)

    Bobkov, S. G.; Serdin, O. V.; Arkhangelskiy, A. I.; Arkhangelskaja, I. V.; Suchkov, S. I.; Topchiev, N. P.

    The problem of unifying electronic components at different levels (circuits, interfaces, hardware, and software) in the space industry is considered. The development of computer systems for space applications is discussed using the scientific data acquisition system of the GAMMA-400 space project as an example. The basic characteristics of the highly reliable, fault-tolerant chips developed by SRISA RAS for space-applicable computational systems are given. To reduce power consumption and enhance data reliability, the embedded system interconnect is made hierarchical: the upper level is Serial RapidIO 1x or 4x with a transfer rate of 1.25 Gbaud; the next level is SpaceWire with transfer rates up to 400 Mbaud; and the lower level is MIL-STD-1553B and RS232/RS485. A 10/100 Ethernet port serves as the technology interface and also provides connection to previously released modules. This interconnection scheme allows different redundancy configurations to be created, and designers can develop heterogeneous systems that exploit the peer-to-peer networking performance of Serial RapidIO using multiprocessor clusters interconnected by SpaceWire.

  11. A Simple Memristor Model for Circuit Simulations

    NASA Astrophysics Data System (ADS)

    Fullerton, Farrah-Amoy; Joe, Aaleyah; Gergel-Hackett, Nadine; Department of Chemistry; Physics Team

    This work describes the development of a model for the memristor, a novel nanoelectronic technology. The model was designed to replicate the real-world electrical characteristics of previously fabricated memristor devices, but was constructed with basic circuit elements using a free, widely available circuit simulator, LTspice. The modeled memristors were then used to construct a circuit that performs material implication. Material implication is a digital logic that can be used to perform all of the same basic functions as traditional CMOS gates, but with fewer nanoelectronic devices. This memristor-based digital logic could enable memristors' use in new paradigms of computer architecture with advantages in size, speed, and power over traditional computing circuits. Additionally, the ability to model the real-world electrical characteristics of memristors in a free circuit simulator using its standard library of elements could enable not only the development of memristor material implication, but also the development of a virtually unlimited array of other memristor-based circuits.

  12. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking

    PubMed Central

    Gilbert, David

    2016-01-01

    Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However for gaining a systems level understanding of how biological organisms function it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate it. In addition, the methodology can be automatically adapted to case study specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle and the acute inflammation of the gut and lung. Our methodology and software will enable computational biologists to efficiently develop reliable multilevel computational models of biological systems. PMID:27187178

  13. A Novel Method to Verify Multilevel Computational Models of Biological Systems Using Multiscale Spatio-Temporal Meta Model Checking.

    PubMed

    Pârvu, Ovidiu; Gilbert, David

    2016-01-01

    Insights gained from multilevel computational models of biological systems can be translated into real-life applications only if the model correctness has been verified first. One of the most frequently employed in silico techniques for computational model verification is model checking. Traditional model checking approaches only consider the evolution of numeric values, such as concentrations, over time and are appropriate for computational models of small scale systems (e.g. intracellular networks). However for gaining a systems level understanding of how biological organisms function it is essential to consider more complex large scale biological systems (e.g. organs). Verifying computational models of such systems requires capturing both how numeric values and properties of (emergent) spatial structures (e.g. area of multicellular population) change over time and across multiple levels of organization, which are not considered by existing model checking approaches. To address this limitation we have developed a novel approximate probabilistic multiscale spatio-temporal meta model checking methodology for verifying multilevel computational models relative to specifications describing the desired/expected system behaviour. The methodology is generic and supports computational models encoded using various high-level modelling formalisms because it is defined relative to time series data and not the models used to generate it. In addition, the methodology can be automatically adapted to case study specific types of spatial structures and properties using the spatio-temporal meta model checking concept. To automate the computational model verification process we have implemented the model checking approach in the software tool Mule (http://mule.modelchecking.org). Its applicability is illustrated against four systems biology computational models previously published in the literature encoding the rat cardiovascular system dynamics, the uterine contractions of labour, the Xenopus laevis cell cycle and the acute inflammation of the gut and lung. Our methodology and software will enable computational biologists to efficiently develop reliable multilevel computational models of biological systems.

  14. Update on PISCES

    NASA Technical Reports Server (NTRS)

    Pearson, Don; Hamm, Dustin; Kubena, Brian; Weaver, Jonathan K.

    2010-01-01

    An updated version of the Platform Independent Software Components for the Exploration of Space (PISCES) software library is available. A previous version was reported in Library for Developing Spacecraft-Mission-Planning Software (MSC-22983), NASA Tech Briefs, Vol. 25, No. 7 (July 2001), page 52. To recapitulate: This software provides for Web-based, collaborative development of computer programs for planning trajectories and trajectory- related aspects of spacecraft-mission design. The library was built using state-of-the-art object-oriented concepts and software-development methodologies. The components of PISCES include Java-language application programs arranged in a hierarchy of classes that facilitates the reuse of the components. As its full name suggests, the PISCES library affords platform-independence: The Java language makes it possible to use the classes and application programs with a Java virtual machine, which is available in most Web-browser programs. Another advantage is expandability: Object orientation facilitates expansion of the library through creation of a new class. Improvements in the library since the previous version include development of orbital-maneuver- planning and rendezvous-launch-window application programs, enhancement of capabilities for propagation of orbits, and development of a desktop user interface.

  15. Efficient Parallel Kernel Solvers for Computational Fluid Dynamics Applications

    NASA Technical Reports Server (NTRS)

    Sun, Xian-He

    1997-01-01

    Distributed-memory parallel computers dominate today's parallel computing arena. These machines, such as the Intel Paragon, IBM SP2, and Cray Origin2000, have successfully delivered high performance computing power for solving some of the so-called "grand-challenge" problems. Despite initial success, parallel machines have not been widely accepted in production engineering environments due to the complexity of parallel programming. On a parallel computing system, a task has to be partitioned and distributed appropriately among processors to reduce communication cost and to attain load balance. More importantly, even with careful partitioning and mapping, the performance of an algorithm may still be unsatisfactory, since conventional sequential algorithms may be serial in nature and may not be implemented efficiently on parallel machines. In many cases, new algorithms have to be introduced to increase parallel performance. In order to achieve optimal performance, in addition to partitioning and mapping, a careful performance study should be conducted for a given application to find a good algorithm-machine combination. This process, however, is usually painful and elusive. The goal of this project is to design and develop efficient parallel algorithms for highly accurate Computational Fluid Dynamics (CFD) simulations and other engineering applications. The work plan was to 1) develop highly accurate parallel numerical algorithms, 2) conduct preliminary testing to verify the effectiveness and potential of these algorithms, and 3) incorporate the newly developed algorithms into actual simulation packages. This work plan has been well achieved. Two highly accurate, efficient Poisson solvers have been developed and tested based on two different approaches: (1) adopting a mathematical geometry which has a better capacity to describe the fluid, and (2) using a compact scheme to gain high-order accuracy in the numerical discretization. The previously developed Parallel Diagonal Dominant (PDD) algorithm and Reduced Parallel Diagonal Dominant (RPDD) algorithm have been carefully studied on different parallel platforms for different applications, and a NASA simulation code developed by Man M. Rai and his colleagues has been parallelized and implemented based on data dependency analysis. These achievements are addressed in detail in the paper.

  16. Amoeba-inspired nanoarchitectonic computing: solving intractable computational problems using nanoscale photoexcitation transfer dynamics.

    PubMed

    Aono, Masashi; Naruse, Makoto; Kim, Song-Ju; Wakabayashi, Masamitsu; Hori, Hirokazu; Ohtsu, Motoichi; Hara, Masahiko

    2013-06-18

    Biologically inspired computing devices and architectures are expected to overcome the limitations of conventional technologies in terms of solving computationally demanding problems, adapting to complex environments, reducing energy consumption, and so on. We previously demonstrated that a primitive single-celled amoeba (a plasmodial slime mold), which exhibits complex spatiotemporal oscillatory dynamics and sophisticated computing capabilities, can be used to search for a solution to a very hard combinatorial optimization problem. We successfully extracted the essential spatiotemporal dynamics by which the amoeba solves the problem. This amoeba-inspired computing paradigm can be implemented by various physical systems that exhibit suitable spatiotemporal dynamics resembling the amoeba's problem-solving process. In this Article, we demonstrate that photoexcitation transfer phenomena in certain quantum nanostructures mediated by optical near-field interactions generate the amoebalike spatiotemporal dynamics and can be used to solve the satisfiability problem (SAT), which is the problem of judging whether a given logical proposition (a Boolean formula) is self-consistent. SAT is related to diverse application problems in artificial intelligence, information security, and bioinformatics and is a crucially important nondeterministic polynomial time (NP)-complete problem, which is believed to become intractable for conventional digital computers when the problem size increases. We show that our amoeba-inspired computing paradigm dramatically outperforms a conventional stochastic search method. These results indicate the potential for developing highly versatile nanoarchitectonic computers that realize powerful solution searching with low energy consumption.
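
    As a point of reference for the comparison mentioned above, the sketch below is a conventional stochastic local search for SAT in the WalkSAT style, i.e., the kind of baseline the amoeba-inspired paradigm is benchmarked against. It is not the photoexcitation-transfer algorithm itself, and the tiny clause set is a placeholder.

```python
# WalkSAT-style stochastic local search for SAT (conventional baseline).
import random

def walksat(clauses, n_vars, max_flips=10000, p=0.5):
    """clauses: list of tuples of nonzero ints; literal k means var |k| is True if k > 0."""
    assign = {v: random.choice([True, False]) for v in range(1, n_vars + 1)}
    sat = lambda lit: assign[abs(lit)] if lit > 0 else not assign[abs(lit)]
    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return assign                      # satisfying assignment found
        clause = random.choice(unsat)
        if random.random() < p:
            var = abs(random.choice(clause))   # random walk move
        else:                                  # greedy move: flip the variable satisfying most clauses
            def score(v):
                assign[v] = not assign[v]
                s = sum(any(sat(l) for l in c) for c in clauses)
                assign[v] = not assign[v]
                return s
            var = max((abs(l) for l in clause), key=score)
        assign[var] = not assign[var]
    return None                                # gave up within the flip budget

print(walksat([(1, -2), (-1, 2), (2, 3)], n_vars=3))
```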

  17. Acceleration of Monte Carlo simulation of photon migration in complex heterogeneous media using Intel many-integrated core architecture.

    PubMed

    Gorshkov, Anton V; Kirillin, Mikhail Yu

    2015-08-01

    Over two decades, the Monte Carlo technique has become a gold standard in simulation of light propagation in turbid media, including biotissues. Technological solutions provide further advances of this technique. The Intel Xeon Phi coprocessor is a new type of accelerator for highly parallel general purpose computing, which allows execution of a wide range of applications without substantial code modification. We present a technical approach of porting our previously developed Monte Carlo (MC) code for simulation of light transport in tissues to the Intel Xeon Phi coprocessor. We show that employing the accelerator allows reducing computational time of MC simulation and obtaining simulation speed-up comparable to GPU. We demonstrate the performance of the developed code for simulation of light transport in the human head and determination of the measurement volume in near-infrared spectroscopy brain sensing.
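
    The core of such Monte Carlo codes is a photon random walk, sketched below with exponential step sampling, albedo weighting, and isotropic scattering. Optical coefficients and the termination rule are placeholder values; the ported code additionally handles heterogeneous voxelized tissue, anisotropic phase functions, and detector geometry, none of which appears here.

```python
# Minimal photon random-walk kernel for light transport in a homogeneous medium.
import numpy as np

def propagate_photon(mu_a, mu_s, max_steps=1000, rng=np.random.default_rng()):
    mu_t = mu_a + mu_s
    pos, direction, weight = np.zeros(3), np.array([0.0, 0.0, 1.0]), 1.0
    for _ in range(max_steps):
        step = -np.log(rng.random()) / mu_t          # free path length
        pos += step * direction
        weight *= mu_s / mu_t                        # survival fraction (albedo weighting)
        if weight < 1e-4:
            break
        cos_t = 2.0 * rng.random() - 1.0             # isotropic scattering direction
        phi = 2.0 * np.pi * rng.random()
        sin_t = np.sqrt(1.0 - cos_t**2)
        direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return pos, weight

print(propagate_photon(mu_a=0.1, mu_s=10.0))
```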

  18. Combining metric episodes with semantic event concepts within the Symbolic and Sub-Symbolic Robotics Intelligence Control System (SS-RICS)

    NASA Astrophysics Data System (ADS)

    Kelley, Troy D.; McGhee, S.

    2013-05-01

    This paper describes the ongoing development of a robotic control architecture that is inspired by computational cognitive architectures from the discipline of cognitive psychology. The Symbolic and Sub-Symbolic Robotics Intelligence Control System (SS-RICS) combines symbolic and sub-symbolic representations of knowledge into a unified control architecture. The new architecture leverages previous work in cognitive architectures, specifically the development of the Adaptive Character of Thought-Rational (ACT-R) and Soar. This paper details current work on learning from episodes or events. The use of episodic memory as a learning mechanism has, until recently, been largely ignored by computational cognitive architectures. This paper details work on metric-level episodic memory streams and methods for translating episodes into abstract schemas. The presentation will include research on learning through novelty and self-generated feedback mechanisms for autonomous systems.

  19. Implementing finite state machines in a computer-based teaching system

    NASA Astrophysics Data System (ADS)

    Hacker, Charles H.; Sitte, Renate

    1999-09-01

    Finite State Machines (FSM) are models for functions commonly implemented in digital circuits such as timers, remote controls, and vending machines. Teaching FSM is core in the curriculum of many university digital electronics or discrete mathematics subjects. Students often have difficulties grasping the theoretical concepts in the design and analysis of FSM. This has prompted the author to develop MS-Windows(TM)-compatible software, WinState, that provides a tutorial-style teaching aid for understanding the mechanisms of FSM. The animated computer screen is ideal for visually conveying the required design and analysis procedures. WinState complements other software for combinatorial logic previously developed by the author, and enhances the existing teaching package by adding sequential logic circuits. WinState enables the construction of a student's own FSM, which can be simulated to test the design for functionality and possible errors.
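
    A minimal text-only counterpart to what WinState animates is sketched below: an FSM described by a transition table, stepped over an input string. The example machine (an odd-parity detector) is invented for illustration and is not one of WinState's bundled exercises.

```python
# Tiny finite state machine simulator driven by a transition table.
def run_fsm(transitions, start, accepting, inputs):
    """transitions: {(state, symbol): next_state}; returns (trace, accepted)."""
    state, trace = start, [start]
    for sym in inputs:
        state = transitions[(state, sym)]
        trace.append(state)
    return trace, state in accepting

# Example: detect an odd number of '1's in a binary string (two-state machine).
transitions = {("even", "0"): "even", ("even", "1"): "odd",
               ("odd", "0"): "odd",  ("odd", "1"): "even"}
print(run_fsm(transitions, start="even", accepting={"odd"}, inputs="10110"))
```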

  20. Computational Modeling of a Mechanized Benchtop Apparatus for Leading-Edge Slat Noise Treatment Device Prototypes

    NASA Technical Reports Server (NTRS)

    Turner, Travis L.; Moore, James B.; Long, David L.

    2017-01-01

    Airframe noise is a growing concern in the vicinity of airports because of population growth and gains in engine noise reduction that have rendered the airframe an equal contributor during the approach and landing phases of flight for many transport aircraft. The leading-edge-slat device of a typical high-lift system for transport aircraft is a prominent source of airframe noise. Two technologies have significant potential for slat noise reduction; the slat-cove filler (SCF) and the slat-gap filler (SGF). Previous work was done on a 2D section of a transport-aircraft wing to demonstrate the implementation feasibility of these concepts. Benchtop hardware was developed in that work for qualitative parametric study. The benchtop models were mechanized for quantitative measurements of performance. Computational models of the mechanized benchtop apparatus for the SCF were developed and the performance of the system for five different SCF assemblies is demonstrated.

  1. Computer literacy and attitudes towards e-learning among first year medical students

    PubMed Central

    Link, Thomas Michael; Marz, Richard

    2006-01-01

    Background At the Medical University of Vienna, most information for students is available only online. In 2005, an e-learning project was initiated and there are plans to introduce a learning management system. In this study, we estimate the level of students' computer skills, the number of students having difficulty with e-learning, and the number of students opposed to e-learning. Methods The study was conducted in an introductory course on computer-based and web-based training (CBT/WBT). Students were asked to fill out a questionnaire online that covered a wide range of relevant attitudes and experiences. Results While the great majority of students possess sufficient computer skills and acknowledge the advantages of interactive and multimedia-enhanced learning material, a small percentage lacks basic computer skills and/or is very skeptical about e-learning. There is also a consistently significant albeit weak gender difference in available computer infrastructure and Internet access. As for student attitudes toward e-learning, we found that age, computer use, and previous exposure to computers are more important than gender. A sizable number of students, 12% of the total, make little or no use of existing e-learning offerings. Conclusion Many students would benefit from a basic introduction to computers and to the relevant computer-based resources of the university. Given the wide range of computer skills among students, a single computer course for all students would be neither useful nor accepted. Special measures should be taken to prevent students who lack computer skills from being disadvantaged or from developing computer-hostile attitudes. PMID:16784524

  2. Computer literacy and attitudes towards e-learning among first year medical students.

    PubMed

    Link, Thomas Michael; Marz, Richard

    2006-06-19

    At the Medical University of Vienna, most information for students is available only online. In 2005, an e-learning project was initiated and there are plans to introduce a learning management system. In this study, we estimate the level of students' computer skills, the number of students having difficulty with e-learning, and the number of students opposed to e-learning. The study was conducted in an introductory course on computer-based and web-based training (CBT/WBT). Students were asked to fill out a questionnaire online that covered a wide range of relevant attitudes and experiences. While the great majority of students possess sufficient computer skills and acknowledge the advantages of interactive and multimedia-enhanced learning material, a small percentage lacks basic computer skills and/or is very skeptical about e-learning. There is also a consistently significant albeit weak gender difference in available computer infrastructure and Internet access. As for student attitudes toward e-learning, we found that age, computer use, and previous exposure to computers are more important than gender. A sizable number of students, 12% of the total, make little or no use of existing e-learning offerings. Many students would benefit from a basic introduction to computers and to the relevant computer-based resources of the university. Given the wide range of computer skills among students, a single computer course for all students would be neither useful nor accepted. Special measures should be taken to prevent students who lack computer skills from being disadvantaged or from developing computer-hostile attitudes.

  3. Computer Anxiety: Relationship to Math Anxiety and Holland Types.

    ERIC Educational Resources Information Center

    Bellando, Jayne; Winer, Jane L.

    Although the number of computers in the school system is increasing, many schools are not using computers to their capacity. One reason for this may be computer anxiety on the part of the teacher. A review of the computer anxiety literature reveals little information on the subject, and findings from previous studies suggest that basic controlled…

  4. RAID v2.0: an updated resource of RNA-associated interactions across organisms.

    PubMed

    Yi, Ying; Zhao, Yue; Li, Chunhua; Zhang, Lin; Huang, Huiying; Li, Yana; Liu, Lanlan; Hou, Ping; Cui, Tianyu; Tan, Puwen; Hu, Yongfei; Zhang, Ting; Huang, Yan; Li, Xiaobo; Yu, Jia; Wang, Dong

    2017-01-04

    With the development of biotechnologies and computational prediction algorithms, the number of experimentally determined and computationally predicted RNA-associated interactions has grown rapidly in recent years. However, diverse RNA-associated interactions are scattered over a wide variety of resources and organisms, whereas a fully comprehensive view of diverse RNA-associated interactions is still not available for any species. Hence, we have updated the RAID database to version 2.0 (RAID v2.0, www.rna-society.org/raid/) by integrating experimental and computationally predicted interactions from manual literature curation and other database resources under one common framework. The new developments in RAID v2.0 include (i) an over 850-fold increase in RNA-associated interactions compared to the previous version; (ii) numerous resources integrated with experimental or computational prediction evidence for each RNA-associated interaction; (iii) a reliability assessment for each RNA-associated interaction based on an integrative confidence score; and (iv) an increase of species coverage to 60. Consequently, RAID v2.0 recruits more than 5.27 million RNA-associated interactions, including more than 4 million RNA-RNA interactions and more than 1.2 million RNA-protein interactions, referring to nearly 130 000 RNA/protein symbols across 60 species. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  5. A human ether-á-go-go-related (hERG) ion channel atomistic model generated by long supercomputer molecular dynamics simulations and its use in predicting drug cardiotoxicity.

    PubMed

    Anwar-Mohamed, Anwar; Barakat, Khaled H; Bhat, Rakesh; Noskov, Sergei Y; Tyrrell, D Lorne; Tuszynski, Jack A; Houghton, Michael

    2014-11-04

    Acquired cardiac long QT syndrome (LQTS) is a frequent drug-induced toxic event that is often caused through blocking of the human ether-á-go-go-related (hERG) K(+) ion channel. This has led to the removal of several major drugs post-approval and is a frequent cause of termination of clinical trials. We report here a computational atomistic model derived using long molecular dynamics that allows sensitive prediction of hERG blockage. It identified drug-mediated hERG blocking activity of a test panel of 18 compounds with high sensitivity and specificity and was experimentally validated using hERG binding assays and patch clamp electrophysiological assays. The model discriminates between potent, weak, and non-hERG blockers and is superior to previous computational methods. This computational model serves as a powerful new tool to predict hERG blocking thus rendering drug development safer and more efficient. As an example, we show that a drug that was halted recently in clinical development because of severe cardiotoxicity is a potent inhibitor of hERG in two different biological assays which could have been predicted using our new computational model. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  6. 1995 Joseph E. Whitley, MD, Award. A World Wide Web gateway to the radiologic learning file.

    PubMed

    Channin, D S

    1995-12-01

    Computer networks in general, and the Internet specifically, are changing the way information is manipulated in the world at large and in radiology. The goal of this project was to develop a computer system in which images from the Radiologic Learning File, available previously only via a single-user laser disc, are made available over a generic, high-availability computer network to many potential users simultaneously. Using a networked workstation in our laboratory and freely available distributed hypertext software, we established a World Wide Web (WWW) information server for radiology. Images from the Radiologic Learning File are requested through the WWW client software, digitized from a single laser disc containing the entire teaching file and then transmitted over the network to the client. The text accompanying each image is incorporated into the transmitted document. The Radiologic Learning File is now on-line, and requests to view the cases result in the delivery of the text and images. Image digitization via a frame grabber takes 1/30th of a second. Conversion of the image to a standard computer graphic format takes 45-60 sec. Text and image transmission speed on a local area network varies between 200 and 400 kilobytes (KB) per second depending on the network load. We have made images from a laser disc of the Radiologic Learning File available through an Internet-based hypertext server. The images previously available through a single-user system located in a remote section of our department are now ubiquitously available throughout our department via the department's computer network. We have thus converted a single-user, limited functionality system into a multiuser, widely available resource.

  7. Embracing the Cloud: Six Ways to Look at the Shift to Cloud Computing

    ERIC Educational Resources Information Center

    Ullman, David F.; Haggerty, Blake

    2010-01-01

    Cloud computing is the latest paradigm shift for the delivery of IT services. Where previous paradigms (centralized, decentralized, distributed) were based on fairly straightforward approaches to technology and its management, cloud computing is radical in comparison. The literature on cloud computing, however, suffers from many divergent…

  8. Brute force meets Bruno force in parameter optimisation: introduction of novel constraints for parameter accuracy improvement by symbolic computation.

    PubMed

    Nakatsui, M; Horimoto, K; Lemaire, F; Ürgüplü, A; Sedoglavic, A; Boulier, F

    2011-09-01

    Recent remarkable advances in computer performance have enabled us to estimate parameter values by the huge power of numerical computation, the so-called 'Brute force', resulting in the high-speed simultaneous estimation of a large number of parameter values. However, these advancements have not been fully utilised to improve the accuracy of parameter estimation. Here the authors review a novel method for parameter estimation using symbolic computation power, 'Bruno force', named after Bruno Buchberger, who introduced the Gröbner basis. In this method, objective functions are formulated by combining symbolic computation techniques. First, the authors utilise a symbolic computation technique, differential elimination, which symbolically reduces the system of differential equations of a given model to an equivalent system. Second, since this equivalent system is frequently composed of large equations, the system is further simplified by another symbolic computation. The performance of the authors' method for parameter accuracy improvement is illustrated by two representative models in biology, a simple cascade model and a negative feedback model, in comparison with previous numerical methods. Finally, the limits and extensions of the authors' method are discussed in terms of the possible power of 'Bruno force' for the development of a new horizon in parameter estimation.
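
    The differential-elimination step itself requires specialized packages, but the underlying idea, eliminating variables symbolically via a Gröbner basis in a lexicographic order, can be shown with SymPy on a toy algebraic system as in the sketch below; the polynomials and symbols are placeholders, not a model from the paper.

```python
# Variable elimination with a lexicographic Groebner basis (toy example).
from sympy import symbols, groebner

x, y, k = symbols("x y k")
# Toy system: eliminate y to obtain a relation between x and the parameter k.
G = groebner([x + y - k, x * y - 1], y, x, k, order="lex")
print(G)   # the basis contains a polynomial involving only x and k
```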

  9. Deep Learning in Medical Imaging: General Overview

    PubMed Central

    Lee, June-Goo; Jun, Sanghoon; Cho, Young-Won; Lee, Hyunna; Kim, Guk Bae

    2017-01-01

    The artificial neural network (ANN)–a machine learning technique inspired by the human neuronal synapse system–was introduced in the 1950s. However, the ANN was previously limited in its ability to solve actual problems, due to the vanishing gradient and overfitting problems with training of deep architecture, lack of computing power, and primarily the absence of sufficient data to train the computer system. Interest in this concept has lately resurfaced, due to the availability of big data, enhanced computing power with the current graphics processing units, and novel algorithms to train the deep neural network. Recent studies on this technology suggest its potential to perform better than humans in some visual and auditory recognition tasks, which may portend its applications in medicine and healthcare, especially in medical imaging, in the foreseeable future. This review article offers perspectives on the history, development, and applications of deep learning technology, particularly regarding its applications in medical imaging. PMID:28670152

  10. Deep Learning in Medical Imaging: General Overview.

    PubMed

    Lee, June-Goo; Jun, Sanghoon; Cho, Young-Won; Lee, Hyunna; Kim, Guk Bae; Seo, Joon Beom; Kim, Namkug

    2017-01-01

    The artificial neural network (ANN)-a machine learning technique inspired by the human neuronal synapse system-was introduced in the 1950s. However, the ANN was previously limited in its ability to solve actual problems, due to the vanishing gradient and overfitting problems with training of deep architecture, lack of computing power, and primarily the absence of sufficient data to train the computer system. Interest in this concept has lately resurfaced, due to the availability of big data, enhanced computing power with the current graphics processing units, and novel algorithms to train the deep neural network. Recent studies on this technology suggest its potential to perform better than humans in some visual and auditory recognition tasks, which may portend its applications in medicine and healthcare, especially in medical imaging, in the foreseeable future. This review article offers perspectives on the history, development, and applications of deep learning technology, particularly regarding its applications in medical imaging.

  11. A method for estimating the probability of lightning causing a methane ignition in an underground mine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sacks, H.K.; Novak, T.

    2008-03-15

    During the past decade, several methane/air explosions in abandoned or sealed areas of underground coal mines have been attributed to lightning. Previously published work by the authors showed, through computer simulations, that currents from lightning could propagate down steel-cased boreholes and ignite explosive methane/air mixtures. The presented work expands on the model and describes a methodology based on IEEE Standard 1410-2004 to estimate the probability of an ignition. The methodology provides a means to better estimate the likelihood that an ignition could occur underground and, more importantly, allows the calculation of what-if scenarios to investigate the effectiveness of engineering controls to reduce the hazard. The computer software used for calculating fields and potentials is also verified by comparing computed results with an independently developed theoretical model of electromagnetic field propagation through a conductive medium.

  12. COSIM: A Finite-Difference Computer Model to Predict Ternary Concentration Profiles Associated With Oxidation and Interdiffusion of Overlay-Coated Substrates

    NASA Technical Reports Server (NTRS)

    Nesbitt, James A.

    2001-01-01

    A finite-difference computer program (COSIM) has been written which models the one-dimensional, diffusional transport associated with high-temperature oxidation and interdiffusion of overlay-coated substrates. The program predicts concentration profiles for up to three elements in the coating and substrate after various oxidation exposures. Surface recession due to solute loss is also predicted. Ternary cross terms and concentration-dependent diffusion coefficients are taken into account. The program also incorporates a previously-developed oxide growth and spalling model to simulate either isothermal or cyclic oxidation exposures. In addition to predicting concentration profiles after various oxidation exposures, the program can also be used to predict coating life based on a concentration dependent failure criterion (e.g., surface solute content drops to 2%). The computer code is written in FORTRAN and employs numerous subroutines to make the program flexible and easily modifiable to other coating oxidation problems.
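
    The sketch below is a minimal illustration of the kind of one-dimensional explicit finite-difference step such a code takes for a single solute with constant diffusivity and a fixed surface concentration; the ternary cross terms, concentration-dependent diffusivities, oxide growth, and surface recession handled by COSIM are omitted, and all values are placeholders.

        # Minimal 1D explicit finite-difference diffusion sketch (placeholder values).
        import numpy as np

        D = 1.0e-14            # m^2/s, illustrative diffusivity
        dx = 1.0e-6            # m, grid spacing through coating and substrate
        dt = 0.4 * dx**2 / D   # explicit stability limit (factor < 0.5)
        c = np.full(200, 0.20)        # initial solute fraction in the coating
        c_surface = 0.02              # solute held at the oxidizing surface

        for _ in range(5000):
            c[0] = c_surface                                        # surface boundary
            c[1:-1] += D * dt / dx**2 * (c[2:] - 2*c[1:-1] + c[:-2])
            c[-1] = c[-2]                                           # zero-flux far field

        print(f"solute fraction at 10 um depth: {c[10]:.3f}")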

  13. Simulating and assessing boson sampling experiments with phase-space representations

    NASA Astrophysics Data System (ADS)

    Opanchuk, Bogdan; Rosales-Zárate, Laura; Reid, Margaret D.; Drummond, Peter D.

    2018-04-01

    The search for new, application-specific quantum computers designed to outperform any classical computer is driven by the ending of Moore's law and the quantum advantages potentially obtainable. Photonic networks are promising examples, with experimental demonstrations and potential for obtaining a quantum computer to solve problems believed classically impossible. This introduces a challenge: how does one design or understand such photonic networks? One must be able to calculate observables using general methods capable of treating arbitrary inputs, dissipation, and noise. We develop complex phase-space software for simulating these photonic networks, and apply this to boson sampling experiments. Our techniques give sampling errors orders of magnitude lower than experimental correlation measurements for the same number of samples. We show that these techniques remove systematic errors in previous algorithms for estimating correlations, with large improvements in errors in some cases. In addition, we obtain a scalable channel-combination strategy for assessment of boson sampling devices.

  14. Probability calculations for three-part mineral resource assessments

    USGS Publications Warehouse

    Ellefsen, Karl J.

    2017-06-27

    Three-part mineral resource assessment is a methodology for predicting, in a specified geographic region, both the number of undiscovered mineral deposits and the amount of mineral resources in those deposits. These predictions are based on probability calculations performed with newly implemented computer software. Compared to the previous implementation, the new implementation includes new features for the probability calculations themselves and for checks of those calculations. The development of the new implementation led to a new understanding of the probability calculations, namely of the assumptions inherent in them. Several assumptions strongly affect the mineral resource predictions, so it is crucial that they are checked during an assessment. The evaluation of the new implementation leads to new findings about the probability calculations, namely the precision of the computations, the computation time, and the sensitivity of the calculation results to the input.

  15. Site-Mutation of Hydrophobic Core Residues Synchronically Poise Super Interleukin 2 for Signaling: Identifying Distant Structural Effects through Affordable Computations.

    PubMed

    Mei, Longcan; Zhou, Yanping; Zhu, Lizhe; Liu, Changlin; Wu, Zhuo; Wang, Fangkui; Hao, Gefei; Yu, Di; Yuan, Hong; Cui, Yanfang

    2018-03-20

    A superkine variant of interleukin-2 with six site mutations away from the binding interface, developed using the yeast display technique, has previously been characterized as undergoing a distal structural alteration that is responsible for its super-potency. It provides an elegant case study for gaining insight into how allosteric effects can be utilized to achieve desirable protein functions. By examining the dynamic network and the allosteric pathways related to those mutated residues using various computational approaches, we found that nanosecond-timescale all-atom molecular dynamics simulations can identify the dynamic network as efficiently as an ensemble algorithm. The differentiated pathways for the six core residues form a dynamic network that outlines the area of structural alteration. The results suggest the potential of using affordable computing power to predict the allosteric structures of mutants in knowledge-based mutagenesis.

  16. Subaru Telescope Network III (STN-III): more effective, more operation-oriented, and more inexpensive solutions for the observatory's needs

    NASA Astrophysics Data System (ADS)

    Noumaru, Junichi; Kawai, Jun A.; Schubert, Kiaina; Yagi, Masafumi; Takata, Tadafumi; Winegar, Tom; Scanlon, Tim; Nishida, Takuhiro; Fox, Camron; Hayasaka, James; Forester, Jason; Uchida, Kenji; Nakamura, Isamu; Tom, Richard; Koura, Norikazu; Yamamoto, Tadahiro; Tanoue, Toshiya; Yamada, Toru

    2008-07-01

    Subaru Telescope has recently replaced most of the equipment of Subaru Telescope Network II with new equipment that includes a 124 TB RAID system for the data archive. Switching the data storage from tape to RAID lets users access the data faster. STN-III dropped some important components of STN-II, such as the supercomputers, the development and testing subsystem for the Subaru Observation Control System, and the data processing subsystem. On the other hand, more computers were invested in the remote operation system. Thanks to IT innovations, our LAN, as well as the network between Hilo and the summit, was upgraded to a gigabit network at a similar or even reduced cost compared with the previous system. As a result of redesigning the computer system with a stronger focus on observatory operations, we greatly reduced the total cost of computer rental, purchase, and maintenance.

  17. Single-shot ultrafast tomographic imaging by spectral multiplexing

    NASA Astrophysics Data System (ADS)

    Matlis, N. H.; Axley, A.; Leemans, W. P.

    2012-10-01

    Computed tomography has profoundly impacted science, medicine and technology by using projection measurements scanned over multiple angles to permit cross-sectional imaging of an object. The application of computed tomography to moving or dynamically varying objects, however, has been limited by the temporal resolution of the technique, which is set by the time required to complete the scan. For objects that vary on ultrafast timescales, traditional scanning methods are not an option. Here we present a non-scanning method capable of resolving structure on femtosecond timescales by using spectral multiplexing of a single laser beam to perform tomographic imaging over a continuous range of angles simultaneously. We use this technique to demonstrate the first single-shot ultrafast computed tomography reconstructions and obtain previously inaccessible structure and position information for laser-induced plasma filaments. This development enables real-time tomographic imaging for ultrafast science, and offers a potential solution to the challenging problem of imaging through scattering surfaces.

  18. Classified one-step high-radix signed-digit arithmetic units

    NASA Astrophysics Data System (ADS)

    Cherri, Abdallah K.

    1998-08-01

    High-radix number systems enable higher information storage density, less complexity, fewer system components, and fewer cascaded gates and operations. A simple one-step fully parallel high-radix signed-digit arithmetic is proposed for parallel optical computing based on new joint spatial encodings. This reduces hardware requirements and improves throughput by reducing the space-bandwidth product needed. The high-radix signed-digit arithmetic operations are based on classifying the neighboring input digit pairs into various groups to reduce the computation rules. A new joint spatial encoding technique is developed to present both the operands and the computation rules. This technique increases the spatial bandwidth product of the spatial light modulators of the system. An optical implementation of the proposed high-radix signed-digit arithmetic operations is also presented. It is shown that our one-step trinary signed-digit and quaternary signed-digit arithmetic units are much simpler and better than all previously reported high-radix signed-digit techniques.

  19. Ontology-based prediction of surgical events in laparoscopic surgery

    NASA Astrophysics Data System (ADS)

    Katić, Darko; Wekerle, Anna-Laura; Gärtner, Fabian; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2013-03-01

    Context-aware technologies have great potential to help surgeons during laparoscopic interventions. Their underlying idea is to create systems which can adapt their assistance functions automatically to the situation in the OR, thus relieving surgeons from the burden of managing computer-assisted surgery devices manually. To this end, a certain kind of understanding of the current situation in the OR is essential. Beyond that, anticipatory knowledge of incoming events is beneficial, e.g. for early warnings of imminent risk situations. To achieve the goal of predicting surgical events based on previously observed ones, we developed a language to describe surgeries and surgical events using Description Logics and integrated it with methods from computational linguistics. Using n-Grams to compute probabilities of follow-up events, we are able to make sensible predictions of upcoming events in real time. The system was evaluated on professionally recorded and labeled surgeries and showed an average prediction rate of 80%.
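
    A minimal sketch of the n-gram step described above (the ontology and Description Logics layer is not reproduced): trigram counts from labeled surgeries give the probability of the next event conditioned on the two preceding ones. The event names and sequences are hypothetical.

        # Hypothetical trigram model for next-event prediction.
        from collections import Counter, defaultdict

        surgeries = [
            ["incision", "dissection", "clipping", "cutting", "suturing"],
            ["incision", "dissection", "coagulation", "clipping", "cutting"],
            ["incision", "dissection", "clipping", "cutting", "irrigation"],
        ]

        trigrams = defaultdict(Counter)
        for events in surgeries:
            for a, b, c in zip(events, events[1:], events[2:]):
                trigrams[(a, b)][c] += 1

        def predict_next(prev_two):
            """Most probable next event and its probability, given the last two."""
            counts = trigrams[tuple(prev_two)]
            event, n = counts.most_common(1)[0]
            return event, n / sum(counts.values())

        print(predict_next(["dissection", "clipping"]))   # ('cutting', 1.0)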

  20. Magnetic skyrmion-based artificial neuron device

    NASA Astrophysics Data System (ADS)

    Li, Sai; Kang, Wang; Huang, Yangqi; Zhang, Xichao; Zhou, Yan; Zhao, Weisheng

    2017-08-01

    Neuromorphic computing, inspired by the biological nervous system, has attracted considerable attention. Intensive research has been conducted in this field for developing artificial synapses and neurons, attempting to mimic the behaviors of biological synapses and neurons, which are two basic elements of a human brain. Recently, magnetic skyrmions have been investigated as promising candidates in neuromorphic computing design owing to their topologically protected particle-like behaviors, nanoscale size and low driving current density. In one of our previous studies, a skyrmion-based artificial synapse was proposed, with which both short-term plasticity and long-term potentiation functions have been demonstrated. In this work, we further report on a skyrmion-based artificial neuron by exploiting the tunable current-driven skyrmion motion dynamics, mimicking the leaky-integrate-fire function of a biological neuron. With a simple single-device implementation, this proposed artificial neuron may enable us to build a dense and energy-efficient spiking neuromorphic computing system.
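
    As an illustration of the leaky-integrate-fire behaviour such a device is meant to reproduce, the sketch below simulates a plain circuit-level LIF neuron (no micromagnetics or skyrmion dynamics); all parameter values are illustrative.

        # Plain leaky-integrate-and-fire neuron, illustrative parameters only.
        import numpy as np

        dt, tau, v_th, v_reset = 1e-4, 2e-3, 1.0, 0.0   # step, leak time, threshold, reset
        t = np.arange(0.0, 0.05, dt)
        drive = 600.0 * (t > 0.01)                      # step input current
        v, spikes = 0.0, []

        for ti, i_in in zip(t, drive):
            v += dt / tau * (-v + i_in * 2e-3)          # leaky integration
            if v >= v_th:                               # fire ...
                spikes.append(ti)
                v = v_reset                             # ... and reset

        print(f"{len(spikes)} spikes, first at t = {spikes[0] * 1e3:.1f} ms")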

  1. A description of the index of active Florida water data collection stations and a user's guide for station or site information retrieval using computer program Findex H578

    USGS Publications Warehouse

    Merritt, M.L.

    1977-01-01

    A computerized index of water-data collection activities and retrieval software to generate publication list of this information was developed for Florida. This system serves a vital need in the administration of the many and diverse water-data collection activities. Previously, needed data was very difficult to assemble for use in program planning or project implementation. Largely descriptive, the report tells how a file of computer card images has been established which contains entries for all sites in Florida at which there is currently a water-data-collection activity. Entries include information such as identification number, station name, location, type of site, county, information about data collection, funding, and other pertinent details. The computer program FINDEX selectively retrieves entries and lists them in a format suitable for publication. Updating the index is done routinely. (Woodard-USGS)

  2. Efficient biometric authenticated key agreements based on extended chaotic maps for telecare medicine information systems.

    PubMed

    Lou, Der-Chyuan; Lee, Tian-Fu; Lin, Tsung-Hung

    2015-05-01

    Authenticated key agreements for telecare medicine information systems allow patients, doctors, nurses and health visitors to access medical information systems and obtain remote services efficiently and conveniently over an open network. To achieve higher security, many authenticated key agreement schemes add biometric keys for identification in addition to passwords and smartcards. Because of their many transmissions and high computational costs, these authenticated key agreement schemes are inefficient in communication and computation. This investigation develops two secure and efficient authenticated key agreement schemes for telecare medicine information systems using biometric keys and extended chaotic maps. One scheme is synchronization-based, while the other is nonce-based. Compared to related approaches, the proposed schemes not only retain the same security properties as previous schemes, but also provide users with privacy protection and require fewer transmissions and lower computational cost.
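
    The schemes themselves are not reproduced here, but the Chebyshev-map property that extended-chaotic-map key agreements typically rely on is easy to illustrate: T_a(T_b(x)) = T_ab(x) = T_b(T_a(x)), so two parties holding private integers a and b can derive the same key from exchanged values, Diffie-Hellman style. The values below are arbitrary.

        # Chebyshev semigroup property behind chaotic-map key agreement (illustrative).
        import math

        def chebyshev(n, x):
            """T_n(x) = cos(n * arccos(x)) for x in [-1, 1]."""
            return math.cos(n * math.acos(x))

        x = 0.53                  # public seed
        a, b = 97, 211            # private integers of parties A and B
        ya = chebyshev(a, x)      # value sent by A
        yb = chebyshev(b, x)      # value sent by B

        key_a = chebyshev(a, yb)  # A computes T_a(T_b(x))
        key_b = chebyshev(b, ya)  # B computes T_b(T_a(x))
        print(abs(key_a - key_b) < 1e-6, round(key_a, 6))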

  3. An efficient technique for the numerical solution of the bidomain equations.

    PubMed

    Whiteley, Jonathan P

    2008-08-01

    Computing the numerical solution of the bidomain equations is widely accepted to be a significant computational challenge. In this study we extend a previously published semi-implicit numerical scheme with good stability properties that has been used to solve the bidomain equations (Whiteley, J.P. IEEE Trans. Biomed. Eng. 53:2139-2147, 2006). A new, efficient numerical scheme is developed which utilizes the observation that the only component of the ionic current that must be calculated on a fine spatial mesh and updated frequently is the fast sodium current. Other components of the ionic current may be calculated on a coarser mesh and updated less frequently, and then interpolated onto the finer mesh. Use of this technique to calculate the transmembrane potential and extracellular potential induces very little error in the solution. For the simulations presented in this study an increase in computational efficiency of over two orders of magnitude over standard numerical techniques is obtained.
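
    A schematic sketch of the idea (not the published implementation): only the stiff fast sodium current is evaluated on the fine mesh at every step, while the lumped slower currents are updated on a coarse mesh every k steps and interpolated onto the fine mesh. The current models and numbers below are placeholders.

        # Placeholder illustration of fine/coarse splitting of ionic currents.
        import numpy as np

        x_fine = np.linspace(0.0, 1.0, 401)
        coarse_stride = 8
        x_coarse = x_fine[::coarse_stride]
        v = -80.0 + 100.0 * np.exp(-((x_fine - 0.3) / 0.05) ** 2)   # mock potential

        def i_na(v):      # fast current: needs the fine mesh and frequent updates
            return 20.0 / (1.0 + np.exp(-(v + 40.0) / 5.0))

        def i_slow(v):    # lumped slow currents: smooth in space and time
            return 0.5 * (v + 85.0)

        k = 10
        i_slow_fine = np.interp(x_fine, x_coarse, i_slow(v[::coarse_stride]))
        for step in range(100):
            if step % k == 0:     # infrequent coarse-mesh update, then interpolate
                i_slow_fine = np.interp(x_fine, x_coarse, i_slow(v[::coarse_stride]))
            i_ion = i_na(v) + i_slow_fine
            v += -0.01 * i_ion    # placeholder membrane update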

  4. Correlative multiple porosimetries for reservoir sandstones with adoption of a new reference-sample-guided computed-tomographic method.

    PubMed

    Jin, Jae Hwa; Kim, Junho; Lee, Jeong-Yil; Oh, Young Min

    2016-07-22

    One of the main interests in petroleum geology and reservoir engineering is to quantify the porosity of reservoir beds as accurately as possible. A variety of direct measurements, including methods of mercury intrusion, helium injection and petrographic image analysis, have been developed; however, their application frequently yields equivocal results because these methods are different in theoretical bases, means of measurement, and causes of measurement errors. Here, we present a set of porosities measured in Berea Sandstone samples by the multiple methods, in particular with adoption of a new method using computed tomography and reference samples. The multiple porosimetric data show a marked correlativeness among different methods, suggesting that these methods are compatible with each other. The new method of reference-sample-guided computed tomography is more effective than the previous methods when the accompanied merits such as experimental conveniences are taken into account.

  5. Correlative multiple porosimetries for reservoir sandstones with adoption of a new reference-sample-guided computed-tomographic method

    PubMed Central

    Jin, Jae Hwa; Kim, Junho; Lee, Jeong-Yil; Oh, Young Min

    2016-01-01

    One of the main interests in petroleum geology and reservoir engineering is to quantify the porosity of reservoir beds as accurately as possible. A variety of direct measurements, including methods of mercury intrusion, helium injection and petrographic image analysis, have been developed; however, their application frequently yields equivocal results because these methods are different in theoretical bases, means of measurement, and causes of measurement errors. Here, we present a set of porosities measured in Berea Sandstone samples by the multiple methods, in particular with adoption of a new method using computed tomography and reference samples. The multiple porosimetric data show a marked correlativeness among different methods, suggesting that these methods are compatible with each other. The new method of reference-sample-guided computed tomography is more effective than the previous methods when the accompanied merits such as experimental conveniences are taken into account. PMID:27445105

  6. Development and Application of Agglomerated Multigrid Methods for Complex Geometries

    NASA Technical Reports Server (NTRS)

    Nishikawa, Hiroaki; Diskin, Boris; Thomas, James L.

    2010-01-01

    We report progress in the development of agglomerated multigrid techniques for fully unstructured grids in three dimensions, building upon two previous studies focused on efficiently solving a model diffusion equation. We demonstrate a robust fully-coarsened agglomerated multigrid technique for 3D complex geometries, incorporating the following key developments: consistent and stable coarse-grid discretizations, a hierarchical agglomeration scheme, and line-agglomeration/relaxation using prismatic-cell discretizations in the highly-stretched grid regions. A significant speed-up in computer time is demonstrated for a model diffusion problem, the Euler equations, and the Reynolds-averaged Navier-Stokes equations for 3D realistic complex geometries.

  7. Finite Larmor radius effects on the (m = 2, n = 1) cylindrical tearing mode

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Chowdhury, J.; Parker, S. E.; Wan, W.

    2015-04-01

    New field solvers are developed in the gyrokinetic code GEM [Chen and Parker, J. Comput. Phys. 220, 839 (2007)] to simulate low-n modes. A novel discretization is developed for the ion polarization term in the gyrokinetic vorticity equation. An eigenmode analysis with finite Larmor radius effects is developed to study the linear resistive tearing mode. The mode growth rate is shown to scale with resistivity as γ ~ η^(1/3), the same as the semi-collisional regime in previous kinetic treatments [Drake and Lee, Phys. Fluids 20, 1341 (1977)]. Tearing mode simulations with gyrokinetic ions are verified with the eigenmode calculation.

  8. A stochastic Markov chain approach for tennis: Monte Carlo simulation and modeling

    NASA Astrophysics Data System (ADS)

    Aslam, Kamran

    This dissertation describes the computational formulation of probability density functions (pdfs) that facilitate head-to-head match simulations in tennis along with ranking systems developed from their use. A background on the statistical method used to develop the pdfs, the Monte Carlo method, and the resulting rankings are included along with a discussion on ranking methods currently being used both in professional sports and in other applications. Using an analytical theory developed by Newton and Keller in [34] that defines a tennis player's probability of winning a game, set, match and single-elimination tournament, a computational simulation has been developed in Matlab that allows further modeling not previously possible with the analytical theory alone. Such experimentation consists of the exploration of non-iid effects, considers the concept of the varying importance of points in a match, and allows an unlimited number of matches to be simulated between unlikely opponents. The results of these studies have provided pdfs that accurately model an individual tennis player's ability along with a realistic, fair and mathematically sound platform for ranking them.
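
    As a small illustration of the Monte Carlo building block involved (not the dissertation code), the sketch below estimates a player's probability of winning a game from an iid point-win probability p and compares it with the standard closed-form expression that follows from a Newton-Keller style analysis.

        # Monte Carlo estimate of P(win game | point-win probability p) vs closed form.
        import random

        def win_game(p, rng):
            a = b = 0
            while True:
                if rng.random() < p:
                    a += 1
                else:
                    b += 1
                if a >= 4 and a - b >= 2:
                    return True
                if b >= 4 and b - a >= 2:
                    return False

        def game_prob_mc(p, trials=200_000, seed=1):
            rng = random.Random(seed)
            return sum(win_game(p, rng) for _ in range(trials)) / trials

        def game_prob_exact(p):
            q = 1.0 - p
            # win to 0, 15, 30, plus reaching deuce times winning from deuce
            return p**4 * (1 + 4*q + 10*q**2) + 20 * p**3 * q**3 * p**2 / (1 - 2*p*q)

        print(game_prob_mc(0.55), game_prob_exact(0.55))   # both close to 0.623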

  9. Space shuttle propulsion estimation development verification, volume 1

    NASA Technical Reports Server (NTRS)

    Rogers, Robert M.

    1989-01-01

    The results of the Propulsion Estimation Development Verification are summarized. A computer program developed under a previous contract (NAS8-35324) was modified to include improved models for the Solid Rocket Booster (SRB) internal ballistics, the Space Shuttle Main Engine (SSME) power coefficient model, the vehicle dynamics using quaternions, and an improved Kalman filter algorithm based on the U-D factorized algorithm. As additional output, the estimated propulsion performance for each device is computed with the associated 1-sigma bounds. The outputs of the estimation program are provided in graphical plots. An additional effort was expended to examine the use of the estimation approach to evaluate single-engine test data. In addition to the propulsion estimation program PFILTER, a program was developed to produce a best estimate of trajectory (BET). The program LFILTER also uses the U-D factorized algorithm form of the Kalman filter, as in the propulsion estimation program PFILTER. The necessary definitions and equations explaining the Kalman filtering approach for the PFILTER program, the models used in this application for dynamics and measurements, the program description, and program operation are presented.
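
    For orientation, the sketch below shows the conventional covariance-form Kalman filter predict/update cycle on a generic toy system; the report itself uses the numerically more robust U-D factorized form of the same algorithm, and none of the matrices below correspond to the SRB or SSME models.

        # Conventional covariance-form Kalman filter on a toy system (illustrative only).
        import numpy as np

        F = np.array([[1.0, 0.1], [0.0, 1.0]])   # state transition (position, rate)
        H = np.array([[1.0, 0.0]])               # measure position only
        Q = 1e-4 * np.eye(2)                     # process noise covariance
        R = np.array([[0.05]])                   # measurement noise covariance

        x = np.zeros(2)                          # state estimate
        P = np.eye(2)                            # estimate covariance

        def kf_step(x, P, z):
            x = F @ x                            # predict
            P = F @ P @ F.T + Q
            S = H @ P @ H.T + R                  # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
            x = x + K @ (z - H @ x)              # update with measurement z
            P = (np.eye(2) - K @ H) @ P
            return x, P

        for z in [0.11, 0.25, 0.37, 0.52]:       # synthetic position measurements
            x, P = kf_step(x, P, np.array([z]))
        print(x)                                 # estimated position and rate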

  10. Development of advanced Czochralski growth process to produce low cost 150 kg silicon ingots from a single crucible for technology readiness

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The process development continued, with a total of nine crystal growth runs. One of these was a 150 kg run of 5 crystals of approximately 30 kg each. Several machine and process problems were corrected, and the 150 kg run was as successful as previous long runs on CG2000 RC's. The accelerated recharge and growth will be attempted when the development program resumes at full capacity in FY '82. The automation controls (Automatic Grower Light Computer System) were integrated with the seed dip temperature, shoulder, and diameter sensors on the CG2000 RC development grower. Test growths included four crystals, which were grown by the computer/sensor system from seed dip through tail off. This system will be integrated on the Mod CG2000 grower during the next quarter. The analytical task included the completion and preliminary testing of the gas chromatograph portion of the Furnace Atmosphere Analysis System. The system can detect CO concentrations and will be expanded to oxygen and water analysis in FY '82.

  11. Generation of the Human Biped Stance by a Neural Controller Able to Compensate Neurological Time Delay

    PubMed Central

    Jiang, Ping; Chiba, Ryosuke; Takakusaki, Kaoru; Ota, Jun

    2016-01-01

    The development of a physiologically plausible computational model of a neural controller that can realize a human-like biped stance is important for a large number of potential applications, such as assisting device development and designing robotic control systems. In this paper, we develop a computational model of a neural controller that can maintain a musculoskeletal model in a standing position, while incorporating a 120-ms neurological time delay. Unlike previous studies that have used an inverted pendulum model, a musculoskeletal model with seven joints and 70 muscular-tendon actuators is adopted to represent the human anatomy. Our proposed neural controller is composed of both feed-forward and feedback controls. The feed-forward control corresponds to the constant activation input necessary for the musculoskeletal model to maintain a standing posture. This compensates for gravity and regulates stiffness. The developed neural controller model can replicate two salient features of the human biped stance: (1) physiologically plausible muscle activations for quiet standing; and (2) selection of a low active stiffness for low energy consumption. PMID:27655271

  12. Simulating electron energy loss spectroscopy with the MNPBEM toolbox

    NASA Astrophysics Data System (ADS)

    Hohenester, Ulrich

    2014-03-01

    Within the MNPBEM toolbox, we show how to simulate electron energy loss spectroscopy (EELS) of plasmonic nanoparticles using a boundary element method approach. The methodology underlying our approach closely follows the concepts developed by García de Abajo and coworkers (Garcia de Abajo, 2010). We introduce two classes eelsret and eelsstat that allow in combination with our recently developed MNPBEM toolbox for a simple, robust, and efficient computation of EEL spectra and maps. The classes are accompanied by a number of demo programs for EELS simulation of metallic nanospheres, nanodisks, and nanotriangles, and for electron trajectories passing by or penetrating through the metallic nanoparticles. We also discuss how to compute electric fields induced by the electron beam and cathodoluminescence.
    Catalogue identifier: AEKJ_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKJ_v2_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 38886
    No. of bytes in distributed program, including test data, etc.: 1222650
    Distribution format: tar.gz
    Programming language: Matlab 7.11.0 (R2010b).
    Computer: Any which supports Matlab 7.11.0 (R2010b).
    Operating system: Any which supports Matlab 7.11.0 (R2010b).
    RAM: ≥1 GB
    Classification: 18.
    Catalogue identifier of previous version: AEKJ_v1_0
    Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 370
    External routines: MESH2D available at www.mathworks.com
    Does the new version supersede the previous version?: Yes
    Nature of problem: Simulation of electron energy loss spectroscopy (EELS) for plasmonic nanoparticles.
    Solution method: Boundary element method using electromagnetic potentials.
    Reasons for new version: The new version of the toolbox includes two additional classes for the simulation of electron energy loss spectroscopy (EELS) of plasmonic nanoparticles, and corrects a few minor bugs and inconsistencies.
    Summary of revisions: New classes “eelsstat” and “eelsret” for the simulation of electron energy loss spectroscopy (EELS) of plasmonic nanoparticles have been added. A few minor errors in the implementation of dipole excitation have been corrected.
    Running time: Depending on surface discretization, between seconds and hours.

  13. Development of expert systems for analyzing electronic documents

    NASA Astrophysics Data System (ADS)

    Abeer Yassin, Al-Azzawi; Shidlovskiy, S.; Jamal, A. A.

    2018-05-01

    The paper analyses a database management system (DBMS). Expert systems, databases, and database technology have become an essential component of everyday life in modern society. As databases are widely used in every organization with a computer system, data resource control and data management are very important [1]. A DBMS is the most significant tool developed to serve multiple users in a database environment, consisting of programs that enable users to create and maintain a database. This paper focuses on the development of a database management system for the General Directorate for Education of Diyala in Iraq (GDED) using CLIPS, Java NetBeans and Alfresco, together with system components previously developed at Tomsk State University at the Faculty of Innovative Technology.

  14. HEPData: a repository for high energy physics data

    NASA Astrophysics Data System (ADS)

    Maguire, Eamonn; Heinrich, Lukas; Watt, Graeme

    2017-10-01

    The Durham High Energy Physics Database (HEPData) has been built up over the past four decades as a unique open-access repository for scattering data from experimental particle physics papers. It comprises data points underlying several thousand publications. Over the last two years, the HEPData software has been completely rewritten using modern computing technologies as an overlay on the Invenio v3 digital library framework. The software is open source with the new site available at https://hepdata.net now replacing the previous site at http://hepdata.cedar.ac.uk. In this write-up, we describe the development of the new site and explain some of the advantages it offers over the previous platform.

  15. Identifying messaging completion in a parallel computer by checking for change in message received and transmitted count at each node

    DOEpatents

    Archer, Charles J [Rochester, MN; Hardwick, Camesha R [Fayetteville, NC; McCarthy, Patrick J [Rochester, MN; Wallenfelt, Brian P [Eden Prairie, MN

    2009-06-23

    Methods, parallel computers, and products are provided for identifying messaging completion on a parallel computer. The parallel computer includes a plurality of compute nodes, the compute nodes coupled for data communications by at least two independent data communications networks including a binary tree data communications network optimal for collective operations that organizes the nodes as a tree and a torus data communications network optimal for point-to-point operations that organizes the nodes as a torus. Embodiments include reading all counters at each node of the torus data communications network; calculating at each node a current node value in dependence upon the values read from the counters at each node; and determining for all nodes whether the current node value for each node is the same as a previously calculated node value for each node. If the current node value is the same as the previously calculated node value for all nodes of the torus data communications network, embodiments include determining that messaging is complete; if the current node value is not the same as the previously calculated node value for all nodes of the torus data communications network, embodiments include determining that messaging is currently incomplete.
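
    A simplified sketch of the completion test described above: messaging is judged complete only when every node's combined counter value is unchanged between two successive polls. Here the counter snapshots are faked with plain lists; in the patent they are read from the nodes of the torus network.

        # Simplified counter-snapshot completion check (counters are faked).
        def node_values(snapshot):
            """Current node value = sum of that node's sent/received counters."""
            return [sum(counters) for counters in snapshot]

        def messaging_complete(previous_values, snapshot):
            current = node_values(snapshot)
            done = previous_values is not None and current == previous_values
            return done, current          # return values for the next poll

        prev = None
        polls = [
            [(3, 2), (5, 4), (1, 1)],     # (sent, received) per node, poll 1
            [(4, 3), (5, 5), (1, 2)],     # counts still changing
            [(4, 3), (5, 5), (1, 2)],     # unchanged -> messaging complete
        ]
        for snapshot in polls:
            done, prev = messaging_complete(prev, snapshot)
            print(done)                   # False, False, True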

  16. Development of a high resolution voxelised head phantom for medical physics applications.

    PubMed

    Giacometti, V; Guatelli, S; Bazalova-Carter, M; Rosenfeld, A B; Schulte, R W

    2017-01-01

    Computational anthropomorphic phantoms have become an important investigation tool for medical imaging and dosimetry for radiotherapy and radiation protection. The development of computational phantoms with realistic anatomical features contributes significantly to the development of novel methods in medical physics. For many applications, it is desirable that such computational phantoms have a real-world physical counterpart in order to verify the obtained results. In this work, we report the development of a voxelised phantom, the HIGH_RES_HEAD, modelling a paediatric head based on the commercial phantom 715-HN (CIRS). HIGH_RES_HEAD is unique for its anatomical details and high spatial resolution (0.18×0.18 mm² pixel size). The development of such a phantom was required to investigate the performance of a new proton computed tomography (pCT) system, in terms of detector technology and image reconstruction algorithms. The HIGH_RES_HEAD was used in an ad-hoc Geant4 simulation modelling the pCT system. The simulation application was previously validated with respect to experimental results. When compared to a standard spatial resolution voxelised phantom of the same paediatric head, it was shown that in pCT reconstruction studies, the use of the HIGH_RES_HEAD translates into a reduction from 2% to 0.7% of the average relative stopping power difference between experimental and simulated results, thus improving the overall quality of the head phantom simulation. The HIGH_RES_HEAD can also be used for other medical physics applications such as treatment planning studies. A second version of the voxelised phantom was created that contains a prototypic base of skull tumour and surrounding organs at risk. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  17. One-dimensional high-order compact method for solving Euler's equations

    NASA Astrophysics Data System (ADS)

    Mohamad, M. A. H.; Basri, S.; Basuno, B.

    2012-06-01

    In the field of computational fluid dynamics, many numerical algorithms have been developed to simulate inviscid, compressible flows problems. Among those most famous and relevant are based on flux vector splitting and Godunov-type schemes. Previously, this system was developed through computational studies by Mawlood [1]. However the new test cases for compressible flows, the shock tube problems namely the receding flow and shock waves were not investigated before by Mawlood [1]. Thus, the objective of this study is to develop a high-order compact (HOC) finite difference solver for onedimensional Euler equation. Before developing the solver, a detailed investigation was conducted to assess the performance of the basic third-order compact central discretization schemes. Spatial discretization of the Euler equation is based on flux-vector splitting. From this observation, discretization of the convective flux terms of the Euler equation is based on a hybrid flux-vector splitting, known as the advection upstream splitting method (AUSM) scheme which combines the accuracy of flux-difference splitting and the robustness of flux-vector splitting. The AUSM scheme is based on the third-order compact scheme to the approximate finite difference equation was completely analyzed consequently. In one-dimensional problem for the first order schemes, an explicit method is adopted by using time integration method. In addition to that, development and modification of source code for the one-dimensional flow is validated with four test cases namely, unsteady shock tube, quasi-one-dimensional supersonic-subsonic nozzle flow, receding flow and shock waves in shock tubes. From these results, it was also carried out to ensure that the definition of Riemann problem can be identified. Further analysis had also been done in comparing the characteristic of AUSM scheme against experimental results, obtained from previous works and also comparative analysis with computational results generated by van Leer, KFVS and AUSMPW schemes. Furthermore, there is a remarkable improvement with the extension of the AUSM scheme from first-order to third-order accuracy in terms of shocks, contact discontinuities and rarefaction waves.

  18. Knowledge-Assisted Approach to Identify Pathways with Differential Dependencies | Office of Cancer Genomics

    Cancer.gov

    We have previously developed a statistical method to identify gene sets enriched with condition-specific genetic dependencies. The method constructs gene dependency networks from bootstrapped samples in one condition and computes the divergence between distributions of network likelihood scores from different conditions. It was shown to be capable of sensitive and specific identification of pathways with phenotype-specific dysregulation, i.e., rewiring of dependencies between genes in different conditions.

  19. Ship Structure Committee Long-Range Research Plan - Guidelines for Program Development.

    DTIC Science & Technology

    1982-01-01

    many scientists and engineers who contributed their time and expertise. We are indebted especially to Mr. J. J. Hopkinson, Dr. J. G. Giannotti, Mr...projection is better than assuming extension of the status quo. There is a long lead time in the use of new knowledge. Scientific research maturation...They even promise to remove traditional constraints previously too intractable to be labelled problems. The modern computer is an outstanding example

  20. Eye/Brain/Task Testbed And Software

    NASA Technical Reports Server (NTRS)

    Janiszewski, Thomas; Mainland, Nora; Roden, Joseph C.; Rothenheber, Edward H.; Ryan, Arthur M.; Stokes, James M.

    1994-01-01

    Eye/brain/task (EBT) testbed records electroencephalograms, movements of eyes, and structures of tasks to provide comprehensive data on neurophysiological experiments. Intended to serve continuing effort to develop means for interactions between human brain waves and computers. Software library associated with testbed provides capabilities to recall collected data, to process data on movements of eyes, to correlate eye-movement data with electroencephalographic data, and to present data graphically. Cognitive processes investigated in ways not previously possible.

  1. Timeline Analysis Program (TLA-1)

    NASA Technical Reports Server (NTRS)

    Miller, K. H.

    1976-01-01

    The Timeline Analysis Program (TLA-1) was described. This program is a crew workload analysis computer program that was developed and expanded from previous workload analysis programs, and is designed to be used on the NASA terminal controlled vehicle program. The following information is described: derivation of the input data, processing of the data, and form of the output data. Eight scenarios that were created, programmed, and analyzed as verification of this model were also described.

  2. The LARSYS Educational Package: Instructor's Notes for Use with the Data 100

    NASA Technical Reports Server (NTRS)

    Lindenlaub, J. C.; Russell, J. D.

    1977-01-01

    The LARSYS Educational Package is a set of instructional materials developed to train people to analyze remotely sensed multispectral data using LARSYS, a computer software system. The materials included in this volume have been designed to assist LARSYS instructors as they guide students through the LARSYS Educational Package. All of the materials have been updated from the previous version to reflect the use of a Data 100 Remote Terminal.

  3. Experimental and Analytical Characterization of the Macromechanical Response for Triaxial Braided Composite Materials

    NASA Technical Reports Server (NTRS)

    Littell, Justin D.

    2013-01-01

    Increasingly, carbon composite structures are being used in aerospace applications. Their high-strength, high-stiffness, and low-weight properties make them good candidates for replacing many aerospace structures currently made of aluminum or steel. Recently, many of the aircraft engine manufacturers have developed new commercial jet engines that will use composite fan cases. Instead of using traditional composite layup techniques, these new fan cases will use a triaxially braided pattern, which improves case performance. The impact characteristics of composite materials for jet engine fan case applications have been an important research topic because Federal regulations require that an engine case be able to contain a blade and blade fragments during an engine blade-out event. Once the impact characteristics of these triaxial braided materials become known, computer models can be developed to simulate a jet engine blade-out event, thus reducing cost and time in the development of these composite jet engine cases. The two main problems that have arisen in this area of research are that the properties for these materials have not been fully determined and computationally efficient computer models, which incorporate much of the microscale deformation and failure mechanisms, are not available. The research reported herein addresses some of the deficiencies present in previous research regarding these triaxial braided composite materials. The current research develops new techniques to accurately quantify the material properties of the triaxial braided composite materials. New test methods are developed for the polymer resin composite constituent and representative composite coupons. These methods expand previous research by using novel specimen designs along with using a noncontact measuring system that is also capable of identifying and quantifying many of the microscale failure mechanisms present in the materials. Finally, using the data gathered, a new hybrid micro-macromechanical computer model is created to simulate the behavior of these composite material systems under static and ballistic impact loading using the test data acquired. The model also quantifies the way in which the fiber/matrix interface affects material response under static and impact loading. The results show that the test methods are capable of accurately quantifying the polymer resin under a variety of strain rates and temperature for three loading conditions. The resin strength and stiffness data show a clear rate and temperature dependence. The data also show the hydrostatic stress effects and hysteresis, all of which can be used by researchers developing composite constitutive models for the resins. The results for the composite data reveal noticeable differences in strength, failure strain, and stiffness in the different material systems presented. The investigations into the microscale failure mechanisms provide information about the nature of the different material system behaviors. Finally, the developed computer model predicts composite static strength and stiffness to within 10 percent of the gathered test data and also agrees with composite impact data, where available.

  4. Intelligence Applied to Air Vehicles

    NASA Technical Reports Server (NTRS)

    Rosen, Robert; Gross, Anthony R.; Fletcher, L. Skip; Zornetzer, Steven (Technical Monitor)

    2000-01-01

    The exponential growth in information technology has provided the potential for air vehicle capabilities that were previously unavailable to mission and vehicle designers. The increasing capabilities of computer hardware and software, including new developments such as neural networks, provide a new balance of work between humans and machines. This paper will describe several NASA projects, and review results and conclusions from ground and flight investigations where vehicle intelligence was developed and applied to aeronautical and space systems. In the first example, flight results from a neural network flight control demonstration will be reviewed. Using a highly modified F-15 aircraft, a NASA/Dryden experimental flight test program has demonstrated how the neural network software can correctly identify and respond to changes in aircraft stability and control characteristics. Using its on-line learning capability, the neural net software would identify that something in the vehicle has changed, then reconfigure the flight control computer system to adapt to those changes. The results of the Remote Agent software project will be presented. This capability will reduce the cost of future spacecraft operations as computers become "thinking" partners along with humans. In addition, the paper will describe the objectives and plans for the autonomous airplane program and the autonomous rotorcraft project. Technologies will also be developed.

  5. Development of self-compressing BLSOM for comprehensive analysis of big sequence data.

    PubMed

    Kikuchi, Akihito; Ikemura, Toshimichi; Abe, Takashi

    2015-01-01

    With the remarkable increase in genomic sequence data from various organisms, novel tools are needed for comprehensive analyses of available big sequence data. We previously developed a Batch-Learning Self-Organizing Map (BLSOM), which can cluster genomic fragment sequences according to phylotype solely on the basis of oligonucleotide composition, and applied it to genome and metagenomic studies. BLSOM is suitable for high-performance parallel computing and can analyze big data simultaneously, but a large-scale BLSOM needs large computational resources. We have developed Self-Compressing BLSOM (SC-BLSOM) to reduce computation time, which allows us to carry out comprehensive analysis of big sequence data without the use of high-performance supercomputers. The strategy of SC-BLSOM is to hierarchically construct BLSOMs according to data class, such as phylotype. The first-layer BLSOMs were constructed with each of the divided input data pieces representing a data subclass, such as a phylotype division, thereby compressing the number of data pieces. The second-layer BLSOM was then constructed from the weight vectors obtained in the first-layer BLSOMs. We compared SC-BLSOM with the conventional BLSOM by analyzing bacterial genome sequences. SC-BLSOM could be constructed faster than BLSOM and cluster the sequences according to phylotype with high accuracy, showing the method's suitability for efficient knowledge discovery from big sequence data.
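
    A two-layer sketch of the compression strategy (not the authors' BLSOM code), using the minisom package as a stand-in self-organizing map: one small first-layer map is trained per data subclass, and the second-layer map is trained on the pooled first-layer weight vectors rather than on all raw composition vectors. The data here are mock oligonucleotide-composition vectors.

        # Hierarchical (self-compressing) SOM sketch using minisom as a stand-in.
        import numpy as np
        from minisom import MiniSom

        rng = np.random.default_rng(0)
        # Mock composition vectors for three subclasses (e.g. phylotype divisions).
        subclasses = [rng.normal(loc=m, scale=0.05, size=(300, 16)) for m in (0.2, 0.5, 0.8)]

        first_layer_codebooks = []
        for data in subclasses:                        # first layer: one map per subclass
            som = MiniSom(6, 6, 16, sigma=1.0, learning_rate=0.5, random_seed=0)
            som.train_random(data, 500)
            first_layer_codebooks.append(som.get_weights().reshape(-1, 16))

        compressed = np.vstack(first_layer_codebooks)  # 3 x 36 codebook vectors, not 900 inputs
        second = MiniSom(10, 10, 16, sigma=1.5, learning_rate=0.5, random_seed=0)
        second.train_random(compressed, 1000)          # second layer on weight vectors
        print(compressed.shape, second.get_weights().shape)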

  6. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.

    PubMed

    Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances, that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.

  7. Exploratory analysis regarding the domain definitions for computer based analytical models

    NASA Astrophysics Data System (ADS)

    Raicu, A.; Oanta, E.; Barhalescu, M.

    2017-08-01

    Our previous computer-based studies dedicated to structural problems using analytical methods defined the composite cross section of a beam as the result of Boolean operations on so-called ‘simple’ shapes. Through generalisation, the class of ‘simple’ shapes came to include areas bounded by curves approximated using spline functions and areas approximated as polygons. However, particular definitions lead to particular solutions. To move beyond these limitations, we conceived a general definition of the cross sections, which are now treated as calculus domains consisting of several subdomains. The corresponding input data set uses complex parameterizations. This new vision allows us to naturally assign an arbitrary number of attributes to the subdomains. In this way, new phenomena that use map-wise information, such as the equilibrium diagrams of metal alloys, can be modelled. The hierarchy of the input data text files, which use the comma-separated-value format, and their structure are also presented and discussed in the paper. This new approach allows us to reuse the concepts and part of the data-processing software instruments already developed. The corresponding software to be developed subsequently will be modularised and generalised so that it can be used in upcoming projects that require rapid development of computer-based models.

  8. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE PAGES

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction - connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web - going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances, that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.

  9. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction - connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code hosted on the GitHub platform with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web - going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances, that can be customized to the sites/research performed. Data gets stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.

  10. Theoretical Investigations of Well-Defined Graphene Nanostructures: Catalysis, Spectroscopy, and Development of Novel Fragment-Based Models

    NASA Astrophysics Data System (ADS)

    Noffke, Benjamin W.

    Carbon materials have the potential to replace some precious metals in renewable energy applications. These materials are particularly attractive because of the elemental abundance and relatively low nuclear mass of carbon, implying economically feasible and lightweight materials. Targeted design of carbon materials is hindered by the lack of fundamental understanding that is required to tailor their properties for the desired application. However, most available synthetic methods to create carbon materials involve harsh conditions that limit the control of the resulting structure. Without a well-defined structure, the system is too complex and fundamental studies cannot be definitive. This work seeks to gain fundamental understanding through the development and application of efficient computational models for these systems, in conjunction with experiments performed on soluble, well-defined graphene nanostructures prepared by our group using a bottom-up synthetic approach. Theory is used to determine mechanistic details for well-defined carbon systems in applications of catalysis and electrochemical transformations. The resulting computational models do well to explain previous observations of carbon materials and provide suggestions for future directions. However, as the system size of the nanostructures gets larger, the computational cost can become prohibitive. To reduce the computational scaling of quantum chemical calculations, a new fragmentation scheme has been developed that addresses the challenges of fragmenting conjugated molecules. By selecting fragments that retain important structural characteristics in graphene, a more efficient method is achieved. The new method paves the way for an automated, systematic fragmentation scheme of graphene molecules.

  11. Instability Mechanisms of Thermally-Driven Interfacial Flows in Liquid-Encapsulated Crystal Growth

    NASA Technical Reports Server (NTRS)

    Haj-Hariri, Hossein; Borhan, Ali

    1997-01-01

    During the past year, a great deal of effort was focused on the enhancement and refinement of the computational tools developed as part of our previous NASA grant. In particular, the interface mollification algorithm developed earlier was extended to incorporate the effects of surface-rheological properties in order to allow the study of thermocapillary flows in the presence of surface contamination. These tools will be used in the computational component of the proposed research in the remaining years of this grant. A detailed description of the progress made in this area is provided elsewhere. Briefly, the method developed allows for the convection and diffusion of bulk-insoluble surfactants on a moving and deforming interface. The novelty of the method is its grid independence: there is no need for front tracking, surface reconstruction, body-fitted grid generation, or metric evaluations; these are all very expensive computational tasks in three dimensions. For small local radii of curvature there is a need for local grid adaption so that the smearing thickness remains a small fraction of the radius of curvature. A special Neumann boundary condition was devised and applied so that the calculated surfactant concentration has no variations normal to the interface, and it is hence truly a surface-defined quantity. The discretized governing equations are solved subsequently using a time-split integration scheme which updates the concentration and the shape successively. Results demonstrate excellent agreement between the computed and exact solutions.

  12. Computational correction of copy number effect improves specificity of CRISPR-Cas9 essentiality screens in cancer cells.

    PubMed

    Meyers, Robin M; Bryan, Jordan G; McFarland, James M; Weir, Barbara A; Sizemore, Ann E; Xu, Han; Dharia, Neekesh V; Montgomery, Phillip G; Cowley, Glenn S; Pantel, Sasha; Goodale, Amy; Lee, Yenarae; Ali, Levi D; Jiang, Guozhi; Lubonja, Rakela; Harrington, William F; Strickland, Matthew; Wu, Ting; Hawes, Derek C; Zhivich, Victor A; Wyatt, Meghan R; Kalani, Zohra; Chang, Jaime J; Okamoto, Michael; Stegmaier, Kimberly; Golub, Todd R; Boehm, Jesse S; Vazquez, Francisca; Root, David E; Hahn, William C; Tsherniak, Aviad

    2017-12-01

    The CRISPR-Cas9 system has revolutionized gene editing both at single genes and in multiplexed loss-of-function screens, thus enabling precise genome-scale identification of genes essential for proliferation and survival of cancer cells. However, previous studies have reported that a gene-independent antiproliferative effect of Cas9-mediated DNA cleavage confounds such measurement of genetic dependency, thereby leading to false-positive results in copy number-amplified regions. We developed CERES, a computational method to estimate gene-dependency levels from CRISPR-Cas9 essentiality screens while accounting for the copy number-specific effect. In our efforts to define a cancer dependency map, we performed genome-scale CRISPR-Cas9 essentiality screens across 342 cancer cell lines and applied CERES to this data set. We found that CERES decreased false-positive results and estimated sgRNA activity for both this data set and previously published screens performed with different sgRNA libraries. We further demonstrate the utility of this collection of screens, after CERES correction, for identifying cancer-type-specific vulnerabilities.
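
    CERES itself jointly fits gene effects, copy-number effects, and sgRNA activities; the sketch below is not that model, but a minimal toy illustration of why uncorrected essentiality scores in copy-number-amplified regions produce false positives and how regressing out copy number removes the bias. All data and thresholds here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes = 1000

# Hypothetical data: true gene dependency plus a cutting-toxicity term
# proportional to locus copy number (the gene-independent Cas9 effect).
true_dependency = rng.normal(0.0, 1.0, n_genes)
copy_number = rng.choice([1, 2, 4, 8], size=n_genes, p=[0.1, 0.7, 0.15, 0.05])
observed_score = true_dependency - 0.3 * copy_number + rng.normal(0.0, 0.2, n_genes)

# Naive essentiality call: very negative observed scores.
naive_hits = observed_score < -2.0

# Simple correction: regress the observed score on copy number and keep the residual.
slope, intercept = np.polyfit(copy_number, observed_score, 1)
corrected_score = observed_score - (slope * copy_number + intercept)
corrected_hits = corrected_score < -2.0

amplified = copy_number >= 4
print("false positives among amplified genes (naive):",
      np.sum(naive_hits & amplified & (true_dependency > -2.0)))
print("false positives among amplified genes (corrected):",
      np.sum(corrected_hits & amplified & (true_dependency > -2.0)))
```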

  13. Computational correction of copy-number effect improves specificity of CRISPR-Cas9 essentiality screens in cancer cells

    PubMed Central

    Meyers, Robin M.; Bryan, Jordan G.; McFarland, James M.; Weir, Barbara A.; Sizemore, Ann E.; Xu, Han; Dharia, Neekesh V.; Montgomery, Phillip G.; Cowley, Glenn S.; Pantel, Sasha; Goodale, Amy; Lee, Yenarae; Ali, Levi D.; Jiang, Guozhi; Lubonja, Rakela; Harrington, William F.; Strickland, Matthew; Wu, Ting; Hawes, Derek C.; Zhivich, Victor A.; Wyatt, Meghan R.; Kalani, Zohra; Chang, Jaime J.; Okamoto, Michael; Stegmaier, Kimberly; Golub, Todd R.; Boehm, Jesse S.; Vazquez, Francisca; Root, David E.; Hahn, William C.; Tsherniak, Aviad

    2017-01-01

    The CRISPR-Cas9 system has revolutionized gene editing both on single genes and in multiplexed loss-of-function screens, enabling precise genome-scale identification of genes essential to proliferation and survival of cancer cells1,2. However, previous studies reported that a gene-independent anti-proliferative effect of Cas9-mediated DNA cleavage confounds such measurement of genetic dependency, leading to false positive results in copy number amplified regions3,4. We developed CERES, a computational method to estimate gene dependency levels from CRISPR-Cas9 essentiality screens while accounting for the copy-number-specific effect. As part of our efforts to define a cancer dependency map, we performed genome-scale CRISPR-Cas9 essentiality screens across 342 cancer cell lines and applied CERES to this dataset. We found that CERES reduced false positive results and estimated sgRNA activity for both this dataset and previously published screens performed with different sgRNA libraries. Here, we demonstrate the utility of this collection of screens, upon CERES correction, in revealing cancer-type-specific vulnerabilities. PMID:29083409

  14. FIT: statistical modeling tool for transcriptome dynamics under fluctuating field conditions

    PubMed Central

    Iwayama, Koji; Aisaka, Yuri; Kutsuna, Natsumaro

    2017-01-01

    Motivation: Considerable attention has been given to the quantification of environmental effects on organisms. In natural conditions, environmental factors are continuously changing in a complex manner. To reveal the effects of such environmental variations on organisms, transcriptome data in field environments have been collected and analyzed. Nagano et al. proposed a model that describes the relationship between transcriptomic variation and environmental conditions and demonstrated the capability to predict transcriptome variation in rice plants. However, the computational cost of parameter optimization has prevented its wide application. Results: We propose a new statistical model and efficient parameter optimization based on the previous study. We developed and released FIT, an R package that offers functions for parameter optimization and transcriptome prediction. The proposed method achieves comparable or better prediction performance within a shorter computational time than the previous method. The package will facilitate the study of the environmental effects on transcriptomic variation in field conditions. Availability and Implementation: Freely available from CRAN (https://cran.r-project.org/web/packages/FIT/). Contact: anagano@agr.ryukoku.ac.jp. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28158396

  15. On the convergence of an iterative formulation of the electromagnetic scattering from an infinite grating of thin wires

    NASA Technical Reports Server (NTRS)

    Brand, J. C.

    1985-01-01

    Contraction theory is applied to an iterative formulation of electromagnetic scattering from periodic structures and a computational method for insuring convergence is developed. A short history of spectral (or k-space) formulation is presented with an emphasis on application to periodic surfaces. The mathematical background for formulating an iterative equation is covered using straightforward single variable examples including an extension to vector spaces. To insure a convergent solution of the iterative equation, a process called the contraction corrector method is developed. Convergence properties of previously presented iterative solutions to one-dimensional problems are examined utilizing contraction theory and the general conditions for achieving a convergent solution are explored. The contraction corrector method is then applied to several scattering problems including an infinite grating of thin wires with the solution data compared to previous works.
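
    The contraction corrector method itself is detailed in the report; as a generic illustration of the underlying idea, the sketch below solves a fixed-point equation x = G(x) by Picard iteration and applies a simple under-relaxation factor when the estimated contraction constant approaches one. The operator G and the relaxation rule are illustrative assumptions, not the scattering operator of the paper.

```python
import numpy as np

def contractive_iteration(G, x0, tol=1e-10, max_iter=500):
    """Solve x = G(x) by Picard iteration with adaptive under-relaxation."""
    x = np.asarray(x0, dtype=float)
    prev_step = None
    alpha = 1.0  # relaxation factor
    for _ in range(max_iter):
        step = G(x) - x
        if prev_step is not None:
            # Crude estimate of the contraction constant from successive steps.
            k = np.linalg.norm(step) / (np.linalg.norm(prev_step) + 1e-30)
            if k >= 1.0:
                alpha *= 0.5          # damp the update if the map is not contracting
        x_new = x + alpha * step
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        prev_step = step
        x = x_new
    return x

# Example: a contractive map in two variables.
G = lambda x: np.array([0.5 * np.cos(x[1]), 0.5 * np.sin(x[0])])
print(contractive_iteration(G, [0.0, 0.0]))
```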

  16. Automated laser guidance of neuronal growth cones using a spatial light modulator.

    PubMed

    Carnegie, David J; Cizmár, Tomás; Baumgartl, Jörg; Gunn-Moore, Frank J; Dholakia, Kishan

    2009-11-01

    The growth cone of a developing neuron can be guided using a focused infra-red (IR) laser beam [1]. In previous setups this process has required a significant amount of user intervention to continuously adjust the laser beam as the neuron grows. Previously, a system using an acousto-optical deflector (AOD) was developed to steer the beam [2]. To enhance the controllability of this system, here we demonstrate the use of a computer-controlled spatial light modulator (SLM) to steer and shape a laser beam for use in guided neuronal growth. This new experimental setup paves the way for a comprehensive investigation of beam-shaping effects on neuronal growth, and we show neuronal growth initiated by a Bessel light mode. This is a robust platform for exploring the biochemistry of this novel phenomenon. (c) 2009 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

  17. All-in-one model for designing optimal water distribution pipe networks

    NASA Astrophysics Data System (ADS)

    Aklog, Dagnachew; Hosoi, Yoshihiko

    2017-05-01

    This paper discusses the development of an easy-to-use, all-in-one model for designing optimal water distribution networks. The model combines different optimization techniques into a single package in which a user can easily choose which optimizer to use and compare the results of different optimizers to gain confidence in the models' performance. At present, three optimization techniques are included in the model: linear programming (LP), a genetic algorithm (GA) and a heuristic one-by-one reduction method (OBORM) that was previously developed by the authors. The optimizers were tested on a number of benchmark problems and performed very well in terms of finding optimal or near-optimal solutions with reasonable computational effort. The results indicate that the model effectively addresses the issues of complexity and limited performance trust associated with previous models and can thus be used for practical purposes.

  18. Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University

    ERIC Educational Resources Information Center

    Plane, Jandelyn

    2010-01-01

    This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…

  19. Multi-scale Modeling of the Evolution of a Large-Scale Nourishment

    NASA Astrophysics Data System (ADS)

    Luijendijk, A.; Hoonhout, B.

    2016-12-01

    Morphological predictions are often computed using a single morphological model, commonly forced with schematized boundary conditions representing the time scale of the prediction. Recent model developments now allow us to think and act differently. This study presents recent developments in coastal morphological modeling, focusing on flexible meshes, flexible coupling between models operating at different time scales, and a recently developed morphodynamic model for the intertidal and dry beach. This integrated modeling approach is applied to the Sand Engine mega nourishment in The Netherlands to illustrate the added value of the integrated approach in both accuracy and computational efficiency. The state-of-the-art Delft3D Flexible Mesh (FM) model is applied at the study site under moderate wave conditions. One advantage is that the flexibility of the mesh structure allows a better representation of the water exchange with the lagoon and the corresponding morphological behavior than the curvilinear grid used in the previous version of Delft3D. The XBeach model is applied to compute the morphodynamic response to storm events in detail, incorporating the long-wave effects on bed level changes. The recently developed aeolian transport and bed change model AeoLiS is used to compute the bed changes in the intertidal and dry beach area. To enable flexible couplings between the three abovementioned models, a component-based environment has been developed using the BMI method. This allows a serial coupling of Delft3D FM and XBeach steered by a control module that uses a hydrodynamic time series as input (see figure). In addition, a parallel online coupling, with information exchange at each time step, is made with the AeoLiS model, which predicts the bed level changes in the intertidal and dry beach area. This study presents the first years of evolution of the Sand Engine computed with the integrated modeling approach. Detailed comparisons are made between the observed and computed morphological behavior of the Sand Engine at an aggregated as well as a sub-system level.
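
    The actual coupling uses the Basic Model Interface (BMI) implementations of Delft3D FM, XBeach, and AeoLiS; the control-module loop below is only a schematic of such a serial coupling, written against a minimal hypothetical wrapper with BMI-like initialize/update/get_value/set_value/finalize calls rather than the real model APIs.

```python
# Schematic serial coupling of a tidal-flow model, a storm-impact model, and an
# aeolian-transport model through BMI-style calls. The classes are stand-ins.

class ToyBmiModel:
    def __init__(self, name): self.name, self.bed = name, None
    def initialize(self, config): self.bed = [0.0] * 10
    def update(self): pass                      # advance one time step
    def get_value(self, var): return self.bed   # e.g. "bed_level"
    def set_value(self, var, values): self.bed = list(values)
    def finalize(self): pass

flow, storm, aeolian = (ToyBmiModel(n) for n in ("flow", "storm", "aeolian"))
for model in (flow, storm, aeolian):
    model.initialize("config.yaml")

hydro_series = [{"storm": t % 5 == 4} for t in range(10)]  # toy forcing time series

for step, forcing in enumerate(hydro_series):
    driver = storm if forcing["storm"] else flow   # serial switch between models
    driver.set_value("bed_level", flow.get_value("bed_level"))
    driver.update()
    # Parallel online coupling: the aeolian model sees the updated bed every step.
    aeolian.set_value("bed_level", driver.get_value("bed_level"))
    aeolian.update()
    flow.set_value("bed_level", aeolian.get_value("bed_level"))

for model in (flow, storm, aeolian):
    model.finalize()
```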

  20. ORCA Project: Research on high-performance parallel computer programming environments. Final report, 1 Apr-31 Mar 90

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, L.; Notkin, D.; Adams, L.

    1990-03-31

    This task relates to research on programming massively parallel computers. Previous work on the Ensemble concept of programming was extended, and an investigation into nonshared-memory models of parallel computation was undertaken. Previous work on the Ensemble concept defined a set of programming abstractions and was used to organize the programming task into three distinct levels: composition of machine instructions, composition of processes, and composition of phases. It was applied to shared-memory models of computation. During the present research period, these concepts were extended to nonshared-memory models. During this period, one Ph.D. thesis was completed, and one book chapter and six conference proceedings papers were published.

  1. Scheduler Design Criteria: Requirements and Considerations

    NASA Technical Reports Server (NTRS)

    Lee, Hanbong

    2016-01-01

    This presentation covers fundamental requirements and considerations for developing schedulers for airport operations. We first introduce performance and functional requirements for airport surface schedulers. Among the various optimization problems in airport operations, we focus on the airport surface scheduling problem, including runway and taxiway operations. We then describe a basic methodology for airport surface scheduling, such as the node-link network model and previously developed scheduling algorithms. Next, we explain in more detail how to design a mathematical formulation, which consists of objectives, decision variables, and constraints. Lastly, we review other considerations, including optimization tools, computational performance, and performance metrics for evaluation.
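
    The presentation describes the formulation only at a high level; as a hedged illustration of how objectives, decision variables, and constraints fit together, the sketch below brute-forces a tiny runway-sequencing instance: the decision variable is the take-off order, the constraint is a minimum separation between successive departures, and the objective is total delay relative to each flight's earliest feasible take-off time. All flight data are made up.

```python
from itertools import permutations

# Hypothetical flights: (flight id, earliest time at runway in minutes).
flights = [("AC1", 0), ("AC2", 1), ("AC3", 1), ("AC4", 4)]
SEPARATION = 2  # minimum minutes between successive departures (constraint)

def total_delay(order):
    """Objective: sum of delays when flights depart in the given order."""
    t, delay = -SEPARATION, 0
    for fid, earliest in order:
        t = max(earliest, t + SEPARATION)   # respect the separation constraint
        delay += t - earliest
    return delay

best = min(permutations(flights), key=total_delay)
print("best sequence:", [f for f, _ in best], "total delay:", total_delay(best))
```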

  2. An analytical method to predict efficiency of aircraft gearboxes

    NASA Technical Reports Server (NTRS)

    Anderson, N. E.; Loewenthal, S. H.; Black, J. D.

    1984-01-01

    A spur gear efficiency prediction method previously developed by the authors was extended to include the power loss of planetary gearsets. A friction coefficient model was developed for MIL-L-7808 oil based on disc machine data. This, combined with the recent capability of predicting losses in spur gears of nonstandard proportions, allows the calculation of power loss for complete aircraft gearboxes that utilize spur gears. The method was applied to the T56/501 turboprop gearbox and compared with measured test data. Bearing losses were calculated with large-scale computer programs. Breakdowns of the gearbox losses point out areas for possible improvement.

  3. Stress analysis of ribbon parachutes

    NASA Technical Reports Server (NTRS)

    Reynolds, D. T.; Mullins, W. M.

    1975-01-01

    An analytical method has been developed for determining the internal load distribution for ribbon parachutes subjected to known riser and aerodynamic forces. Finite elements with non-linear elastic properties represent the parachute structure. This method is an extension of the analysis previously developed by the authors and implemented in the digital computer program CANO. The present analysis accounts for the effect of vertical ribbons in the solution for canopy shape and stress distribution. Parametric results are presented which relate the canopy stress distribution to such factors as vertical ribbon strength, number of gores, and gore shape in a ribbon parachute.

  4. Chemical Markup, XML and the World-Wide Web. 8. Polymer Markup Language.

    PubMed

    Adams, Nico; Winter, Jerry; Murray-Rust, Peter; Rzepa, Henry S

    2008-11-01

    Polymers are among the most important classes of materials but are only inadequately supported by modern informatics. The paper discusses the reasons why polymer informatics is considerably more challenging than small molecule informatics and develops a vision for the computer-aided design of polymers, based on modern semantic web technologies. The paper then discusses the development of Polymer Markup Language (PML). PML is an extensible language, designed to support the (structural) representation of polymers and polymer-related information. PML closely interoperates with Chemical Markup Language (CML) and overcomes a number of the previously identified challenges.

  5. 4D Subject-Specific Inverse Modeling of the Chick Embryonic Heart Outflow Tract Hemodynamics

    PubMed Central

    Goenezen, Sevan; Chivukula, Venkat Keshav; Midgett, Madeline; Phan, Ly; Rugonyi, Sandra

    2015-01-01

    Blood flow plays a critical role in regulating embryonic cardiac growth and development, with altered flow leading to congenital heart disease. Progress in the field, however, is hindered by a lack of quantification of hemodynamic conditions in the developing heart. In this study, we present a methodology to quantify blood flow dynamics in the embryonic heart using subject-specific computational fluid dynamics (CFD) models. While the methodology is general, we focused on a model of the chick embryonic heart outflow tract (OFT), which distally connects the heart to the arterial system, and is the region of origin of many congenital cardiac defects. Using structural and Doppler velocity data collected from optical coherence tomography (OCT), we generated 4D (3D + time) embryo-specific CFD models of the heart OFT. To replicate the blood flow dynamics over time during the cardiac cycle, we developed an iterative inverse-method optimization algorithm, which determines the CFD model boundary conditions such that differences between computed velocities and measured velocities at one point within the OFT lumen are minimized. Results from our developed CFD model agree with previously measured hemodynamics in the OFT. Further, computed velocities and measured velocities differ by less than 15% at locations that were not used in the optimization, validating the model. The presented methodology can be used in quantifications of embryonic cardiac hemodynamics under normal and altered blood flow conditions, enabling an in depth quantitative study of how blood flow influences cardiac development. PMID:26361767
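
    The paper's inverse method tunes the CFD boundary conditions so that computed velocities match the OCT/Doppler-measured velocities at a point in the OFT lumen; the loop below is a much simplified, hypothetical stand-in in which the "CFD model" is a cheap surrogate function and a single boundary parameter is adjusted by a one-dimensional golden-section search on the velocity mismatch.

```python
import math

def simulated_velocity(inlet_pressure):
    """Hypothetical surrogate for the CFD model: velocity at the monitor point."""
    return 0.8 * math.sqrt(max(inlet_pressure, 0.0))

measured_velocity = 1.2  # hypothetical OCT/Doppler measurement

def mismatch(p):
    return (simulated_velocity(p) - measured_velocity) ** 2

# Golden-section search for the inlet pressure minimizing the mismatch.
lo, hi = 0.0, 10.0
phi = (math.sqrt(5.0) - 1.0) / 2.0
for _ in range(60):
    a = hi - phi * (hi - lo)
    b = lo + phi * (hi - lo)
    if mismatch(a) < mismatch(b):
        hi = b
    else:
        lo = a
p_opt = 0.5 * (lo + hi)
print(f"optimized inlet pressure: {p_opt:.3f}, "
      f"velocity error: {abs(simulated_velocity(p_opt) - measured_velocity):.4f}")
```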

  6. Plant-Level Modeling and Simulation of Used Nuclear Fuel Dissolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Almeida, Valmor F.

    2012-09-07

    Plant-level modeling and simulation of a used nuclear fuel prototype dissolver is presented. Emphasis is given to developing a modeling and simulation approach to be explored by other processes involved in the recycling of used fuel. The commonality concepts presented in a previous communication were used to create a model and realize its software module. An initial model was established based on a theory of chemical thermomechanical network transport outlined previously. A software module prototype was developed with the required external behavior and internal mathematical structure. Results obtained demonstrate the generality of the design approach and establish an extensible mathematical model with its corresponding software module for a wide range of dissolvers. Scale-up numerical tests were made varying the type of used fuel (breeder and light-water reactors) and the dissolution capacity (0.5 t/d to 1.7 t/d). These tests were motivated by user requirements in the area of nuclear materials safeguards. A computer module written in high-level programming languages (MATLAB and Octave) was developed, tested, and provided as open-source code (MATLAB) for integration into the Separations and Safeguards Performance Model application in development at Sandia National Laboratories. The modeling approach presented here is intended to serve as a template for a rational modeling of all plant-level modules. This will facilitate the practical application of the commonality features underlying the unifying network transport theory proposed recently. In addition, by example, this model describes explicitly the data needed from sub-scale models and logical extensions for future model development. For example, from thermodynamics, an off-line molecular dynamics simulation could quantify partial molar volumes for the species in the liquid phase; this simulation is currently within reach for high-performance computing. From fluid mechanics, a hold-up capacity function is needed for the dissolver device; this simulation is currently within reach for computational fluid mechanics given the existing CAD geometry. From chemical transport phenomena, a simulation of the particle-scale dissolution front is needed to derive an improved solid dissolution kinetics law by predicting the local surface area change; an example was provided in this report. In addition, the associated reaction mechanisms for dissolution are presently largely untested and simplified, hence a parallel experimental program in reaction kinetics is needed to support modeling and simulation efforts. Last but not least, a simple account of finite rates of solid feed and transfer can be readily introduced via a coupled delayed model. These are some of the theoretical benefits of a rational plant-level modeling approach, which guides the development of smaller length- and time-scale modeling. Practical and other theoretical benefits were presented in a previous report.

  7. Stochastic Surface Mesh Reconstruction

    NASA Astrophysics Data System (ADS)

    Ozendi, M.; Akca, D.; Topan, H.

    2018-05-01

    A generic and practical methodology is presented for 3D surface mesh reconstruction from terrestrial laser scanner (TLS) derived point clouds. It has two main steps. The first step develops an anisotropic point error model, which is capable of computing the theoretical precisions of the 3D coordinates of each individual point in the point cloud. The magnitude and direction of the errors are represented in the form of error ellipsoids. The second step focuses on the stochastic surface mesh reconstruction. It exploits the previously determined error ellipsoids by computing a point-wise quality measure, which takes into account the semi-diagonal axis length of the error ellipsoid. Only the points with the smallest errors are used in the surface triangulation; the remaining ones are automatically discarded.
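
    A hedged numeric sketch of the second step: given an assumed 3x3 covariance matrix per point, the error-ellipsoid semi-axis lengths are the square roots of its eigenvalues, a scalar quality measure is derived from the longest semi-axis, and only points below a threshold are kept for triangulation. The covariance values and threshold are placeholders, not the paper's TLS error model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_points = 500

# Hypothetical per-point 3x3 covariance matrices from an anisotropic error model.
A = rng.normal(0.0, 0.003, (n_points, 3, 3))
covariances = A @ np.transpose(A, (0, 2, 1))        # symmetric positive semi-definite

# Semi-axis lengths of each error ellipsoid = sqrt of the covariance eigenvalues.
eigvals = np.linalg.eigvalsh(covariances)           # ascending, per point
semi_axes = np.sqrt(np.clip(eigvals, 0.0, None))

# Point-wise quality measure: the longest semi-axis (worst-case 1-sigma error).
quality = semi_axes[:, -1]

threshold = np.percentile(quality, 75)               # keep the better 75 percent
keep = quality <= threshold
print(f"kept {keep.sum()} of {n_points} points for triangulation")
```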

  8. Automated Illustration of Patients Instructions

    PubMed Central

    Bui, Duy; Nakamura, Carlos; Bray, Bruce E.; Zeng-Treitler, Qing

    2012-01-01

    A picture can be a powerful communication tool. However, creating pictures to illustrate patient instructions can be a costly and time-consuming task. Building on our prior research in this area, we developed a computer application that automatically converts text to pictures using natural language processing and computer graphics techniques. After iterative testing, the automated illustration system was evaluated using 49 previously unseen cardiology discharge instructions. The completeness of the system-generated illustrations was assessed by three raters using a three-level scale. The average inter-rater agreement for text correctly represented in the pictograph was about 66 percent. Since illustration in this context is intended to enhance rather than replace text, these results support the feasibility of conducting automated illustration. PMID:23304392

  9. Combination of visual and symbolic knowledge: A survey in anatomy.

    PubMed

    Banerjee, Imon; Patané, Giuseppe; Spagnuolo, Michela

    2017-01-01

    Anatomy is among the most extensively studied fields in medicine and has produced a huge amount of knowledge that is heterogeneous and covers aspects that are largely independent of one another. Visual and symbolic modalities are the main means of expressing knowledge about human anatomy and are crucial for the evolution of computational anatomy. In particular, a tight integration of visual and symbolic modalities is beneficial to support knowledge-driven methods for biomedical investigation. In this paper, we review previous work on the presentation and sharing of anatomical knowledge and the development of advanced methods for computational anatomy, focusing on the key research challenges in harmonizing symbolic knowledge and spatial 3D data. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Functional integration of vertical flight path and speed control using energy principles

    NASA Technical Reports Server (NTRS)

    Lambregts, A. A.

    1984-01-01

    A generalized automatic flight control system was developed which integrates all longitudinal flight path and speed control functions previously provided by a pitch autopilot and autothrottle. In this design, a net thrust command is computed based on total energy demand arising from both flight path and speed targets. The elevator command is computed based on the energy distribution error between flight path and speed. The engine control is configured to produce the commanded net thrust. The design incorporates control strategies and hierarchy to deal systematically and effectively with all aircraft operational requirements, control nonlinearities, and performance limits. Consistent decoupled maneuver control is achieved for all modes and flight conditions without outer loop gain schedules, control law submodes, or control function duplication.
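
    A hedged sketch of the energy-based control laws described above: with flight-path angle γ and longitudinal acceleration V̇, the specific total-energy rate is roughly γ + V̇/g and the energy-distribution rate is γ − V̇/g, so thrust is commanded from the total-energy-rate error and the elevator (pitch) from the distribution error. The gains and signals below are illustrative only, not the certified design.

```python
G = 9.81  # m/s^2

def energy_control(gamma_cmd, gamma, vdot_cmd, vdot, k_thrust=2.0, k_elev=1.5):
    """Return (thrust command increment, pitch command increment)."""
    # Specific total-energy rate (per unit weight) and its demanded value.
    e_rate = gamma + vdot / G
    e_rate_cmd = gamma_cmd + vdot_cmd / G
    # Energy-distribution rate: how energy is split between flight path and speed.
    d_rate = gamma - vdot / G
    d_rate_cmd = gamma_cmd - vdot_cmd / G

    thrust_cmd = k_thrust * (e_rate_cmd - e_rate)   # net thrust from total-energy error
    pitch_cmd = k_elev * (d_rate_cmd - d_rate)      # elevator from distribution error
    return thrust_cmd, pitch_cmd

# Example: aircraft slightly below the demanded climb gradient at constant speed.
print(energy_control(gamma_cmd=0.05, gamma=0.03, vdot_cmd=0.0, vdot=0.0))
```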

  11. Determination of Optimal Parameters for Dual-Layer Cathode of Polymer Electrolyte Fuel Cell Using Computational Intelligence-Aided Design

    PubMed Central

    Chen, Yi; Huang, Weina; Peng, Bei

    2014-01-01

    Because of the demands for sustainable and renewable energy, fuel cells have become increasingly popular, particularly the polymer electrolyte fuel cell (PEFC). Among the various components, the cathode plays a key role in the operation of a PEFC. In this study, a quantitative dual-layer cathode model was proposed for determining the optimal parameters that minimize the over-potential difference and improve the efficiency using a newly developed bat swarm algorithm with a variable population embedded in the computational intelligence-aided design. The simulation results were in agreement with previously reported results, suggesting that the proposed technique has potential applications for automating and optimizing the design of PEFCs. PMID:25490761
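
    The paper's optimizer is a bat swarm algorithm with a variable population; the sketch below is the standard, fixed-population bat algorithm applied to a toy objective, included only to make the frequency/velocity/loudness mechanics concrete. The parameter values and the objective are placeholders, not the cathode model of the paper.

```python
import numpy as np

def bat_algorithm(objective, dim=2, n_bats=20, n_iter=200,
                  f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_bats, dim))          # positions
    v = np.zeros((n_bats, dim))                    # velocities
    loudness = np.ones(n_bats)
    pulse_rate = np.zeros(n_bats)
    fitness = np.apply_along_axis(objective, 1, x)
    best = x[np.argmin(fitness)].copy()

    for t in range(n_iter):
        for i in range(n_bats):
            freq = f_min + (f_max - f_min) * rng.random()
            v[i] += (x[i] - best) * freq
            cand = x[i] + v[i]
            if rng.random() > pulse_rate[i]:       # local random walk around the best
                cand = best + 0.01 * loudness.mean() * rng.normal(size=dim)
            f_cand = objective(cand)
            if f_cand < fitness[i] and rng.random() < loudness[i]:
                x[i], fitness[i] = cand, f_cand
                loudness[i] *= alpha               # quieter, more exploitative
                pulse_rate[i] = 1 - np.exp(-gamma * t)
            if f_cand < objective(best):
                best = cand.copy()
    return best

sphere = lambda z: float(np.sum(z ** 2))           # toy stand-in for the over-potential
print(bat_algorithm(sphere))
```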

  12. Leading-edge effects on boundary-layer receptivity

    NASA Technical Reports Server (NTRS)

    Gatski, Thomas B.; Kerschen, Edward J.

    1990-01-01

    Numerical calculations are presented for the incompressible flow over a parabolic cylinder. The computational domain extends from a region upstream of the body downstream to the region where the Blasius boundary-layer solution holds. A steady mean flow solution is computed and the results for the scaled surface vorticity, surface pressure and displacement thickness are compared to previous studies. The unsteady problem is then formulated as a perturbation solution starting with and evolving from the mean flow. The response to irrotational time harmonic pulsation of the free-stream is examined. Results for the initial development of the velocity profile and displacement thickness are presented. These calculations will be extended to later times to investigate the initiation of instability waves within the boundary-layer.

  13. Recent observations with phase-contrast x-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Momose, Atsushi; Takeda, Tohoru; Itai, Yuji; Tu, Jinhong; Hirano, Keiichi

    1999-09-01

    Recent development in phase-contrast X-ray computed tomography using an X-ray interferometer is reported. To observe larger samples than is possible with our previous X-ray interferometer, a large monolithic X-ray interferometer and a separated-type X-ray interferometer were studied. At the present time, 2.5 cm X 1.5 cm interference patterns have been generated with the X-ray interferometers using synchrotron X-rays. The large monolithic X-ray interferometer has produced interference fringes with 80% visibility, and has been used to measure various tissues. To produce images with higher spatial resolution, we fabricated another X-ray interferometer whose wafer was partially thinned by chemical etching. A preliminary test suggested that the spatial resolution has been improved.

  14. Observation of temperature trace, induced by changing of temperature inside the human body, on the human body skin using commercially available IR camera

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2015-05-01

    As is well known, the passive THz camera is a very promising tool for security applications: it can reveal concealed objects without contact and poses no danger to the person being screened. In previous papers, we demonstrated a new possibility of using the passive THz camera to observe temperature differences on the human skin when these differences are caused by different temperatures inside the body. To verify this claim, we performed a similar physical experiment using an IR camera. We show that a temperature trace appears on the skin of the human body when the temperature inside the body changes as a result of drinking water. We used both a commercially available computer code for processing images captured by an IR camera manufactured by Flir Corp. and our own computer code developed for processing these images. Using both codes, we clearly demonstrate the change in human body skin temperature induced by drinking water. The observed phenomena are very important for detecting forbidden samples and substances concealed inside the human body by non-destructive means without the use of X-rays. Earlier, we demonstrated this possibility using THz radiation. The experiments carried out here can be applied to counter-terrorism problems. We also developed original filters for computer processing of images captured by IR cameras; their application enhances the temperature resolution of the cameras.

  15. Identification of informative features for predicting proinflammatory potentials of engine exhausts.

    PubMed

    Wang, Chia-Chi; Lin, Ying-Chi; Lin, Yuan-Chung; Jhang, Syu-Ruei; Tung, Chun-Wei

    2017-08-18

    The immunotoxicity of engine exhausts is of high concern to human health due to the increasing prevalence of immune-related diseases. However, the evaluation of the immunotoxicity of engine exhausts is currently based on expensive and time-consuming experiments, so it is desirable to develop efficient assessment methods. To accelerate the development of safe alternative fuels, this study proposed a computational method for identifying informative features for predicting the proinflammatory potentials of engine exhausts. A principal component regression (PCR) algorithm was applied to develop prediction models, and the informative features were identified by a sequential backward feature elimination (SBFE) algorithm. A total of 19 informative chemical and biological features were successfully identified by the SBFE algorithm and utilized to develop a computational method named FS-CBM for predicting the proinflammatory potentials of engine exhausts. The FS-CBM model achieved high performance, with correlation coefficients of 0.997 and 0.943 on the training and independent test sets, respectively, a large improvement in prediction performance over our previous CBM model. The proposed method could be further applied to construct models for the bioactivities of mixtures.
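
    A hedged sketch of the two ingredients named above, using scikit-learn: a principal component regression pipeline (PCA followed by linear regression) scored by cross-validation, wrapped in a sequential backward feature elimination loop that repeatedly drops the feature whose removal hurts the score least. The feature matrix and response are synthetic placeholders, not the chemical and biological features of the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 12))                       # 60 samples, 12 candidate features
y = X[:, 0] + 0.5 * X[:, 3] + rng.normal(0.0, 0.1, size=60)

def pcr_score(cols):
    """Cross-validated R^2 of a PCA + linear regression pipeline on selected columns."""
    model = make_pipeline(PCA(n_components=min(5, len(cols))), LinearRegression())
    return cross_val_score(model, X[:, cols], y, cv=5, scoring="r2").mean()

# Sequential backward feature elimination.
selected = list(range(X.shape[1]))
best_score = pcr_score(selected)
while len(selected) > 2:
    scores = {c: pcr_score([k for k in selected if k != c]) for c in selected}
    drop, score = max(scores.items(), key=lambda kv: kv[1])
    if score < best_score:
        break                                       # removing any feature makes it worse
    selected.remove(drop)
    best_score = score

print("informative features:", selected, "cv r2:", round(best_score, 3))
```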

  16. A novel semi-transductive learning framework for efficient atypicality detection in chest radiographs

    NASA Astrophysics Data System (ADS)

    Alzubaidi, Mohammad; Balasubramanian, Vineeth; Patel, Ameet; Panchanathan, Sethuraman; Black, John A., Jr.

    2012-03-01

    Inductive learning refers to machine learning algorithms that learn a model from a set of training data instances; any test instance is then classified by comparing it to the learned model. When the training instances lend themselves well to modeling, the use of a model substantially reduces the computational cost of classification. However, some training data sets are complex and do not lend themselves well to modeling. Transductive learning refers to machine learning algorithms that classify test instances by comparing them to all of the training instances, without creating an explicit model. This can produce better classification performance, but at a much higher computational cost. Medical images vary greatly across human populations, constituting a data set that does not lend itself well to modeling. Our previous work showed that the wide variations seen across training sets of "normal" chest radiographs make it difficult to successfully classify test radiographs with an inductive (modeling) approach, and that a transductive approach leads to much better performance in detecting atypical regions. The problem with the transductive approach is its high computational cost. This paper develops and demonstrates a novel semi-transductive framework that addresses the unique challenges of atypicality detection in chest radiographs. The proposed framework combines the superior performance of transductive methods with the reduced computational cost of inductive methods. Our results show that the proposed semi-transductive approach provides both effective and efficient detection of atypical regions within a set of chest radiographs previously labeled by Mayo Clinic expert thoracic radiologists.

  17. A Lagrangian subgrid-scale model with dynamic estimation of Lagrangian time scale for large eddy simulation of complex flows

    NASA Astrophysics Data System (ADS)

    Verma, Aman; Mahesh, Krishnan

    2012-08-01

    The dynamic Lagrangian averaging approach for the dynamic Smagorinsky model for large eddy simulation is extended to an unstructured grid framework and applied to complex flows. The Lagrangian time scale is dynamically computed from the solution and does not need any adjustable parameter. The time scale used in the standard Lagrangian model contains an adjustable parameter θ. The dynamic time scale is computed based on a "surrogate-correlation" of the Germano-identity error (GIE). Also, a simple material derivative relation is used to approximate GIE at different events along a pathline instead of Lagrangian tracking or multi-linear interpolation. Previously, the time scale for homogeneous flows was computed by averaging along directions of homogeneity. The present work proposes modifications for inhomogeneous flows. This development allows the Lagrangian averaged dynamic model to be applied to inhomogeneous flows without any adjustable parameter. The proposed model is applied to LES of turbulent channel flow on unstructured zonal grids at various Reynolds numbers. Improvement is observed when compared to other averaging procedures for the dynamic Smagorinsky model, especially at coarse resolutions. The model is also applied to flow over a cylinder at two Reynolds numbers and good agreement with previous computations and experiments is obtained. Noticeable improvement is obtained using the proposed model over the standard Lagrangian model. The improvement is attributed to a physically consistent Lagrangian time scale. The model also shows good performance when applied to flow past a marine propeller in an off-design condition; it regularizes the eddy viscosity and adjusts locally to the dominant flow features.
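
    A hedged scalar sketch of the Lagrangian-averaging idea: the Germano-identity contractions L·M and M·M are relaxed exponentially along a pathline with time scale T, and the dynamic Smagorinsky coefficient is their ratio. In the paper, T is computed dynamically from a surrogate correlation of the Germano-identity error; here it is simply an input, and all signals are synthetic.

```python
import random

def lagrangian_average(lm_series, mm_series, dt, T):
    """Exponentially relax I_LM and I_MM along a pathline; return the C_s^2 history."""
    eps = (dt / T) / (1.0 + dt / T)        # relaxation weight for one time step
    i_lm, i_mm = lm_series[0], mm_series[0]
    cs2 = []
    for lm, mm in zip(lm_series, mm_series):
        i_lm = eps * lm + (1.0 - eps) * i_lm
        i_mm = eps * mm + (1.0 - eps) * i_mm
        cs2.append(max(i_lm, 0.0) / (i_mm + 1e-30))   # clip to avoid negative viscosity
    return cs2

# Synthetic, noisy Germano-identity contractions along a pathline.
random.seed(0)
lm = [0.02 + 0.05 * (random.random() - 0.5) for _ in range(100)]
mm = [1.0 + 0.2 * (random.random() - 0.5) for _ in range(100)]
print("final averaged C_s^2:", round(lagrangian_average(lm, mm, dt=1e-3, T=1e-2)[-1], 4))
```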

  18. Space-Time Conservation Element and Solution Element Method Being Developed

    NASA Technical Reports Server (NTRS)

    Chang, Sin-Chung; Himansu, Ananda; Jorgenson, Philip C. E.; Loh, Ching-Yuen; Wang, Xiao-Yen; Yu, Sheng-Tao

    1999-01-01

    The engineering research and design requirements of today pose great computer-simulation challenges to engineers and scientists who are called on to analyze phenomena in continuum mechanics. The future will bring even more daunting challenges, when increasingly complex phenomena must be analyzed with increased accuracy. Traditionally used numerical simulation methods have evolved to their present state by repeated incremental extensions to broaden their scope. They are reaching the limits of their applicability and will need to be radically revised, at the very least, to meet future simulation challenges. At the NASA Lewis Research Center, researchers have been developing a new numerical framework for solving conservation laws in continuum mechanics, namely, the Space-Time Conservation Element and Solution Element Method, or the CE/SE method. This method has been built from fundamentals and is not a modification of any previously existing method. It has been designed with generality, simplicity, robustness, and accuracy as cornerstones. The CE/SE method has thus far been applied in the fields of computational fluid dynamics, computational aeroacoustics, and computational electromagnetics. Computer programs based on the CE/SE method have been developed for calculating flows in one, two, and three spatial dimensions. Results have been obtained for numerous problems and phenomena, including various shock-tube problems, ZND detonation waves, an implosion and explosion problem, shocks over a forward-facing step, a blast wave discharging from a nozzle, various acoustic waves, and shock/acoustic-wave interactions. The method can clearly resolve shock/acoustic-wave interactions in which the acoustic wave and the shock differ in magnitude by up to six orders. In two-dimensional flows, the reflected shock is as crisp as the leading shock. CE/SE schemes are currently being used for advanced applications to jet and fan noise prediction and to chemically reacting flows.

  19. Effect of computer game playing on baseline laparoscopic simulator skills.

    PubMed

    Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd

    2013-08-01

    Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and no relationship between computer game playing and baseline performance on laparoscopic simulators has been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years earlier. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Performance on the simulator was then analyzed for association with their computer game experience. The setting was a local high school in Norway; forty-eight students from 2 high school classes volunteered to participate in the study. No association between prior and present computer game playing and baseline performance was found. The results were similar for both prior and present action game playing and prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance on a virtual reality simulator.

  20. Fast Construction of Near Parsimonious Hybridization Networks for Multiple Phylogenetic Trees.

    PubMed

    Mirzaei, Sajad; Wu, Yufeng

    2016-01-01

    Hybridization networks represent plausible evolutionary histories of species that are affected by reticulate evolutionary processes. An established computational problem on hybridization networks is constructing the most parsimonious hybridization network such that each of the given phylogenetic trees (called gene trees) is "displayed" in the network. There have been several previous approaches, including an exact method and several heuristics, for this NP-hard problem. However, the exact method is only applicable to a limited range of data, and heuristic methods can be less accurate and also slow sometimes. In this paper, we develop a new algorithm for constructing near parsimonious networks for multiple binary gene trees. This method is more efficient for large numbers of gene trees than previous heuristics. This new method also produces more parsimonious results on many simulated datasets as well as a real biological dataset than a previous method. We also show that our method produces topologically more accurate networks for many datasets.

  1. Computer program to minimize prediction error in models from experiments with 16 hypercube points and 0 to 6 center points

    NASA Technical Reports Server (NTRS)

    Holms, A. G.

    1982-01-01

    A previous report described a backward deletion procedure for model selection that was optimized for minimum prediction error and used a multiparameter combination of the F-distribution and an order-statistics distribution due to Cochran. A computer program is described that applies the previously optimized procedure to real data. The use of the program is illustrated by examples.

  2. Simulated hydrologic effects of possible ground-water and surface-water management alternatives in and near the Platte River, south-central Nebraska

    USGS Publications Warehouse

    Burns, Alan W.

    1981-01-01

    Digital computer models were developed and used to simulate the hydrologic effects of hypothetical water-management alternatives on the wetland habitat area near Grand Island, Nebr. Areally distributed recharge to and discharge from the aquifer system adjacent to the Platte River between Overton and Grand Island were computed for four hypothetical water-management alternatives. Using stream-aquifer response functions, the stream depletions resulting from the different alternatives ranged from 53,000 acre-feet per year for increased surface-water irrigation to 177,000 acre-feet per year for increased ground-water pumpage. Current conditions would result in stream depletions of 125,000 acre-feet per year. Using the relationship between discharge and river stage, frequency curves of the stage in the river near the wildlife habitat area were computed using a 50-year sequence of historical streamflow at Overton, minus the stream depletions resulting from various management practices. For the management alternatives previously discussed, differences in the stage-frequency curves were minimal. For comparative purposes, three additional water-management alternatives whose application would change the incoming streamflow at Overton were simulated. Although in these alternatives the amounts of water that were diverted or imported were similar to the amounts in the previous alternatives, their effects on the stage-frequency curves were much more dramatic. (USGS)

  3. Simulation of Mach Probes in Non-Uniform Magnetized Plasmas: the Influence of a Background Density Gradient

    NASA Astrophysics Data System (ADS)

    Haakonsen, Christian Bernt; Hutchinson, Ian H.

    2013-10-01

    Mach probes can be used to measure transverse flow in magnetized plasmas, but what they actually measure in strongly non-uniform plasmas has not been definitively established. A fluid treatment in previous work has suggested that the diamagnetic drifts associated with background density and temperature gradients affect transverse flow measurements, but detailed computational study is required to validate and elaborate on those results; it is really a kinetic problem, since the probe deforms and introduces voids in the ion and electron distribution functions. A new code, the Plasma-Object Simulator with Iterated Trajectories (POSIT) has been developed to self-consistently compute the steady-state six-dimensional ion and electron distribution functions in the perturbed plasma. Particle trajectories are integrated backwards in time to the domain boundary, where arbitrary background distribution functions can be specified. This allows POSIT to compute the ion and electron density at each node of its unstructured mesh, update the potential based on those densities, and then iterate until convergence. POSIT is used to study the impact of a background density gradient on transverse Mach probe measurements, and the results compared to the previous fluid theory. C.B. Haakonsen was supported in part by NSF/DOE Grant No. DE-FG02-06ER54512, and in part by an SCGF award administered by ORISE under DOE Contract No. DE-AC05-06OR23100.

  4. An Integrative and Collaborative Approach to Creating a Diverse and Computationally Competent Geoscience Workforce

    NASA Astrophysics Data System (ADS)

    Moore, S. L.; Kar, A.; Gomez, R.

    2015-12-01

    A partnership between Fort Valley State University (FVSU), the Jackson School of Geosciences at The University of Texas (UT) at Austin, and the Texas Advanced Computing Center (TACC) is engaging computational geoscience faculty and researchers with academically talented underrepresented minority (URM) students, training them to solve grand challenges. These next-generation computational geoscientists are being trained to tackle some of the world's most demanding geoscience grand challenges, which require data-intensive, large-scale modeling and simulation on high-performance computers. UT Austin's geoscience outreach program GeoFORCE, recently awarded the Presidential Award for Excellence in Science, Mathematics and Engineering Mentoring, contributes collaborative best practices in engaging researchers with URM students. Collaborative efforts over the past decade are providing data demonstrating that integrative pipeline programs with mentoring and paid internship opportunities, multi-year scholarships, computational training, and communication skills development are having an impact on URMs developing middle skills for geoscience careers. Since 1997, the Cooperative Developmental Energy Program at FVSU and its collaborating universities have graduated 87 engineers, 33 geoscientists, and eight health physicists. Recruited as early as high school, students enroll for three years at FVSU, majoring in mathematics, chemistry or biology, and then transfer to UT Austin or other partner institutions to complete a second STEM degree, including in the geosciences. A partnership with the Integrative Computational Education and Research Traineeship (ICERT), a National Science Foundation (NSF) Research Experience for Undergraduates (REU) Site at TACC, provides students with a 10-week summer research experience at UT Austin. Mentored by TACC researchers, students with no previous background in computational science learn to use some of the world's most powerful high-performance computing resources to address a grand geosciences problem. Students increase their ability to understand and explain the societal impact of their research and communicate the research to multidisciplinary and lay audiences via near-peer mentoring, poster presentations, and publication opportunities.

  5. Cloud based emergency health care information service in India.

    PubMed

    Karthikeyan, N; Sukanesh, R

    2012-12-01

    A hospital is a health care organization providing patient treatment by expert physicians, surgeons and equipment. A report from a health care accreditation group says that miscommunication between patients and health care providers is the reason for the gap in providing emergency medical care to people in need. In developing countries, illiteracy is a major root cause of deaths resulting from undiagnosed diseases and constitutes a serious public health problem. Mentally affected, differently abled and unconscious patients cannot communicate their medical history to medical practitioners, and medical practitioners cannot view or edit DICOM images instantly. Our aim is to provide a palm vein pattern recognition based medical record retrieval system, using cloud computing, for the above mentioned people. Distributed computing technology is emerging in new forms such as grid computing and cloud computing, which promise to deliver Information Technology (IT) as a service. In this paper, we describe how these new forms of distributed computing can help modern health care industries. Cloud computing is extending its benefits to industrial sectors, especially in medical scenarios: IT-related capabilities and resources are provided as services, via distributed computing, on demand. This paper is concerned with developing software as a service (SaaS) by means of cloud computing, with the aim of bringing the emergency health care sector under one umbrella with physically secured patient records. In delivering emergency healthcare treatment, the crucial information needed to make decisions about patients is their previous health records, so ubiquitous access to appropriate records is essential. Palm vein pattern recognition promises secure patient record access. Likewise, our paper presents an efficient means to view, edit or transfer DICOM images instantly, which has been a challenging task for medical practitioners in past years. We have developed two services for health care: 1. a cloud based palm vein recognition system and 2. distributed medical image processing tools for medical practitioners.

  6. Application of NASA General-Purpose Solver to Large-Scale Computations in Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Watson, Willie R.; Storaasli, Olaf O.

    2004-01-01

    Of several iterative and direct equation solvers evaluated previously for computations in aeroacoustics, the most promising was the NASA-developed General-Purpose Solver (winner of NASA's 1999 software of the year award). This paper presents detailed, single-processor statistics of the performance of this solver, which has been tailored and optimized for large-scale aeroacoustic computations. The statistics, compiled using an SGI ORIGIN 2000 computer with 12 GB of available memory (RAM) and eight available processors, are the central processing unit time, RAM requirements, and solution error. The equation solver is capable of solving 10 thousand complex unknowns in as little as 0.01 sec using 0.02 GB of RAM, and 8.4 million complex unknowns in slightly less than 3 hours using all 12 GB. This latter solution is the largest aeroacoustics problem solved to date with this technique. The study was unable to detect any noticeable error in the solution, since noise levels predicted from the solution vectors are in excellent agreement with the noise levels computed from the exact solution. The equation solver provides a means for obtaining numerical solutions to aeroacoustics problems in three dimensions.

  7. Special-purpose computer for holography HORN-4 with recurrence algorithm

    NASA Astrophysics Data System (ADS)

    Shimobaba, Tomoyoshi; Hishinuma, Sinsuke; Ito, Tomoyoshi

    2002-10-01

    We designed and built a special-purpose computer for holography, HORN-4 (HOlographic ReconstructioN), using PLD (Programmable Logic Device) technology. HORN computers have a pipeline architecture. We use HORN-4 as an attached processor to enhance the performance of a general-purpose computer when it generates holograms using the "recurrence formulas" algorithm developed in our previous paper. In the HORN-4 system, we designed the pipeline around this recurrence-formula algorithm, which calculates the phase on a hologram. As a result, we could integrate the pipeline, composed of 21 units, into one PLD chip. The units in the pipeline consist of one BPU (Basic Phase Unit) and twenty CUs (Cascade Units), which can compute twenty light intensities on a hologram plane at one time. By mounting two of the PLD chips on a PCI (Peripheral Component Interconnect) universal board, HORN-4 can calculate holograms at a high speed equivalent to about 42 Gflops. The cost of the HORN-4 board is about 1,700 US dollars. With the HORN-4 system we could obtain an 800×600-grid hologram from a 3D image composed of 415 points in about 0.45 sec.
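
    Under the Fresnel approximation, the per-point phase contribution is quadratic along a hologram row, so its second difference is constant and the phase can be accumulated with two additions per pixel; that is the essence of a recurrence-formula CGH kernel, although the exact recurrence implemented in HORN-4 follows the authors' earlier paper. The sketch below compares the incremental accumulation with direct evaluation for one object point along one row; the sampling and wavelength values are arbitrary.

```python
import math

# One hypothetical object point (xj, yj, zj) and one hologram row at y = 0.
xj, yj, zj = 12.0, 3.0, 5000.0      # positions in pixel units
wavelength = 0.5                    # arbitrary units
n_pix = 8

def phase_direct(x):
    # Fresnel approximation: phase ~ pi * ((x - xj)^2 + yj^2) / (wavelength * zj)
    return math.pi * ((x - xj) ** 2 + yj ** 2) / (wavelength * zj)

# Recurrence: the phase is quadratic in x, so its second difference is constant.
gamma = 2.0 * math.pi / (wavelength * zj)            # constant second difference
theta = phase_direct(0.0)                            # phase at the first pixel
delta = phase_direct(1.0) - phase_direct(0.0)        # first difference at the start

for x in range(n_pix):
    assert abs(theta - phase_direct(float(x))) < 1e-9
    theta += delta                                   # two additions per pixel
    delta += gamma
print("recurrence matches direct evaluation for", n_pix, "pixels")
```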

  8. Iterative load-balancing method with multigrid level relaxation for particle simulation with short-range interactions

    NASA Astrophysics Data System (ADS)

    Furuichi, Mikito; Nishiura, Daisuke

    2017-10-01

    We developed dynamic load-balancing algorithms for particle simulation methods (PSM) involving short-range interactions, such as Smoothed Particle Hydrodynamics (SPH), the Moving Particle Semi-implicit (MPS) method, and the Discrete Element Method (DEM). These are needed to handle the billions of particles modeled on large distributed-memory computer systems. Our method utilizes a flexible orthogonal domain decomposition, allowing the sub-domain boundaries in each column to differ from row to row. The imbalances in execution time between parallel logical processes are treated as a nonlinear residual. Load balancing is achieved by minimizing the residual within the framework of an iterative nonlinear solver, combined with a multigrid technique in the local smoother. Our iterative method is suitable for adjusting the sub-domains frequently by monitoring the performance of each computational process, because it is computationally cheaper in terms of communication and memory costs than non-iterative methods. Numerical tests demonstrated the ability of our approach to handle workload imbalances arising from a non-uniform particle distribution, differences in particle types, or heterogeneous computer architecture, which was difficult with previously proposed methods. We analyzed the parallel efficiency and scalability of our method using the Earth Simulator and K computer supercomputer systems.
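
    A hedged one-dimensional illustration of the iterative idea: treat the per-subdomain execution times as a residual, and nudge each domain boundary toward the heavier (slower) neighbour a little each iteration until the times equalize. The real method works on flexible orthogonal decompositions with a multigrid-accelerated nonlinear solver; the smoother below is only the simplest relaxation analogue on synthetic particle positions.

```python
import numpy as np

rng = np.random.default_rng(3)
particles = rng.beta(2.0, 5.0, 200000)               # non-uniform particle positions in [0, 1]
n_domains = 8
bounds = np.linspace(0.0, 1.0, n_domains + 1)        # initial equal-width decomposition

def workloads(bounds):
    counts, _ = np.histogram(particles, bins=bounds)
    return counts.astype(float)                      # proxy for per-domain execution time

for it in range(200):
    w = workloads(bounds)
    if w.max() / w.mean() < 1.01:                    # residual small enough: balanced
        break
    # Relaxation step: move each interior boundary toward the heavier neighbour.
    shift = 0.02 * (w[1:] - w[:-1]) / w.mean() * np.diff(bounds).mean()
    bounds[1:-1] += shift
    bounds = np.clip(bounds, 0.0, 1.0)
    bounds[1:-1] = np.maximum.accumulate(bounds[1:-1])  # keep boundaries ordered

w = workloads(bounds)
print(f"iterations: {it}, final imbalance: {w.max() / w.mean():.3f}")
```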

  9. Modeling resident error-making patterns in detection of mammographic masses using computer-extracted image features: preliminary experiments

    NASA Astrophysics Data System (ADS)

    Mazurowski, Maciej A.; Zhang, Jing; Lo, Joseph Y.; Kuzmiak, Cherie M.; Ghate, Sujata V.; Yoon, Sora

    2014-03-01

    Providing high-quality mammography education to radiology trainees is essential, as good interpretation skills potentially ensure the highest benefit of screening mammography for patients. We have previously proposed a computer-aided education system that utilizes trainee models, which relate human-assessed image characteristics to interpretation error, and proposed that these models be used to identify the most difficult, and therefore most educationally useful, cases for each trainee. In this study, as a next step in our research, we propose building trainee models that utilize features automatically extracted from images using computer vision algorithms. To predict error, we used logistic regression, which accepts imaging features as input and returns predicted error as output. Reader data from 3 experts and 3 trainees were used, and receiver operating characteristic analysis was applied to evaluate the proposed trainee models. Our experiments showed that, for three trainees, our models were able to predict error better than chance. This is an important step in the development of adaptive computer-aided education systems, since computer-extracted features will allow for faster and more extensive search of imaging databases to identify the most educationally beneficial cases.
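
    A hedged sketch of the modeling step: a logistic regression that maps computer-extracted image features to the probability that a given trainee misses a mass, evaluated by the area under the ROC curve. The feature matrix and error labels are synthetic placeholders for the reader-study data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_cases = 300
features = rng.normal(size=(n_cases, 6))             # computer-extracted image features
# Synthetic "trainee error" labels loosely driven by two of the features.
p_error = 1.0 / (1.0 + np.exp(-(0.8 * features[:, 0] - 1.2 * features[:, 2])))
errors = (rng.random(n_cases) < p_error).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    features, errors, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"ROC AUC for predicting trainee error: {auc:.2f}")     # better than chance (0.5)
```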

  10. Compact Method for Modeling and Simulation of Memristor Devices

    DTIC Science & Technology

    2011-08-01

    ... single-valued equations. Subject terms: memristor, neuromorphic, cognitive, computing, memory, emerging technology, computational intelligence. A memristor's resistance state depends on its previous state and present electrical biasing conditions, and when combined with transistors in a hybrid chip ... computers, reconfigurable electronics and neuromorphic computing [3,4]. According to Chua [4], the memristor behaves like a linear resistor with ...

  11. 40 CFR 86.099-17 - Emission control diagnostic system for 1999 and later light-duty vehicles and light-duty trucks.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of computer codes. The emission control diagnostic system shall record and store in computer memory..., shall be stored in computer memory to identify correctly functioning emission control systems and those... in computer memory. Should a subsequent fuel system or misfire malfunction occur, any previously...

  12. 40 CFR 86.099-17 - Emission control diagnostic system for 1999 and later light-duty vehicles and light-duty trucks.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of computer codes. The emission control diagnostic system shall record and store in computer memory..., shall be stored in computer memory to identify correctly functioning emission control systems and those... in computer memory. Should a subsequent fuel system or misfire malfunction occur, any previously...

  13. 37 CFR 1.825 - Amendments to or replacement of sequence listing and computer readable copy thereof.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of sequence listing and computer readable copy thereof. 1.825 Section 1.825 Patents, Trademarks, and... Amino Acid Sequences § 1.825 Amendments to or replacement of sequence listing and computer readable copy... copy of the computer readable form (§ 1.821(e)) including all previously submitted data with the...

  14. 37 CFR 1.825 - Amendments to or replacement of sequence listing and computer readable copy thereof.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of sequence listing and computer readable copy thereof. 1.825 Section 1.825 Patents, Trademarks, and... Amino Acid Sequences § 1.825 Amendments to or replacement of sequence listing and computer readable copy... copy of the computer readable form (§ 1.821(e)) including all previously submitted data with the...

  15. 37 CFR 1.825 - Amendments to or replacement of sequence listing and computer readable copy thereof.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of sequence listing and computer readable copy thereof. 1.825 Section 1.825 Patents, Trademarks, and... Amino Acid Sequences § 1.825 Amendments to or replacement of sequence listing and computer readable copy... copy of the computer readable form (§ 1.821(e)) including all previously submitted data with the...

  16. 37 CFR 1.825 - Amendments to or replacement of sequence listing and computer readable copy thereof.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of sequence listing and computer readable copy thereof. 1.825 Section 1.825 Patents, Trademarks, and... Amino Acid Sequences § 1.825 Amendments to or replacement of sequence listing and computer readable copy... copy of the computer readable form (§ 1.821(e)) including all previously submitted data with the...

  17. 37 CFR 1.825 - Amendments to or replacement of sequence listing and computer readable copy thereof.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... of sequence listing and computer readable copy thereof. 1.825 Section 1.825 Patents, Trademarks, and... Amino Acid Sequences § 1.825 Amendments to or replacement of sequence listing and computer readable copy... copy of the computer readable form (§ 1.821(e)) including all previously submitted data with the...

  18. Computer-assisted photogrammetric mapping systems for geologic studies-A progress report

    USGS Publications Warehouse

    Pillmore, C.L.; Dueholm, K.S.; Jepsen, H.S.; Schuch, C.H.

    1981-01-01

    Photogrammetry has played an important role in geologic mapping for many years; however, only recently have attempts been made to automate mapping functions for geology. Computer-assisted photogrammetric mapping systems for geologic studies have been developed and are currently in use in offices of the Geological Survey of Greenland at Copenhagen, Denmark, and the U.S. Geological Survey at Denver, Colorado. Though differing somewhat, the systems are similar in that they integrate Kern PG-2 photogrammetric plotting instruments and small desk-top computers that are programmed to perform special geologic functions and operate flat-bed plotters by means of specially designed hardware and software. A z-drive capability, in which stepping motors control the z-motions of the PG-2 plotters, is an integral part of both systems. This feature enables the computer to automatically position the floating mark on computer-calculated, previously defined geologic planes, such as contacts or the base of coal beds, throughout the stereoscopic model in order to improve the mapping capabilities of the instrument and to aid in correlation and tracing of geologic units. The common goal is to enhance the capabilities of the PG-2 plotter and provide a means by which geologists can make conventional geologic maps more efficiently and explore ways to apply computer technology to geologic studies. © 1981.

  19. Logic integration of mRNA signals by an RNAi-based molecular computer

    PubMed Central

    Xie, Zhen; Liu, Siyuan John; Bleris, Leonidas; Benenson, Yaakov

    2010-01-01

    Synthetic in vivo molecular ‘computers’ could rewire biological processes by establishing programmable, non-native pathways between molecular signals and biological responses. Multiple molecular computer prototypes have been shown to work in simple buffered solutions. Many of those prototypes were made of DNA strands and performed computations using cycles of annealing-digestion or strand displacement. We have previously introduced RNA interference (RNAi)-based computing as a way of implementing complex molecular logic in vivo. Because it also relies on nucleic acids for its operation, RNAi computing could benefit from the tools developed for DNA systems. However, these tools must be harnessed to produce bioactive components and be adapted for harsh operating environments that reflect in vivo conditions. In a step toward this goal, we report the construction and implementation of biosensors that ‘transduce’ mRNA levels into bioactive, small interfering RNA molecules via RNA strand exchange in a cell-free Drosophila embryo lysate, a step beyond simple buffered environments. We further integrate the sensors with our RNAi ‘computational’ module to evaluate two-input logic functions on mRNA concentrations. Our results show how RNA strand exchange can expand the utility of RNAi computing and point toward the possibility of using strand exchange in a native biological setting. PMID:20194121

  20. Thermodynamics of natural selection III: Landauer's principle in computation and chemistry.

    PubMed

    Smith, Eric

    2008-05-21

    This is the third in a series of three papers devoted to energy flow and entropy changes in chemical and biological processes, and their relations to the thermodynamics of computation. The previous two papers have developed reversible chemical transformations as idealizations for studying physiology and natural selection, and derived bounds from the second law of thermodynamics, between information gain in an ensemble and the chemical work required to produce it. This paper concerns the explicit mapping of chemistry to computation, and particularly the Landauer decomposition of irreversible computations, in which reversible logical operations generating no heat are separated from heat-generating erasure steps which are logically irreversible but thermodynamically reversible. The Landauer arrangement of computation is shown to produce the same entropy-flow diagram as that of the chemical Carnot cycles used in the second paper of the series to idealize physiological cycles. The specific application of computation to data compression and error-correcting encoding also makes possible a Landauer analysis of the somewhat different problem of optimal molecular recognition, which has been considered as an information theory problem. It is shown here that bounds on maximum sequence discrimination from the enthalpy of complex formation, although derived from the same logical model as the Shannon theorem for channel capacity, arise from exactly the opposite model for erasure.
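
    For reference, the standard statement of Landauer's principle invoked in this line of argument (the textbook bound, not a result specific to this paper) is:

    ```latex
    % Minimum heat dissipated when one bit is erased at temperature T
    Q_{\mathrm{erase}} \;\ge\; k_{B} T \ln 2 \quad \text{per bit erased.}
    ```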

  1. A five-layer users' need hierarchy of computer input device selection: a contextual observation survey of computer users with cervical spinal injuries (CSI).

    PubMed

    Tsai, Tsai-Hsuan; Nash, Robert J; Tseng, Kevin C

    2009-05-01

    This article presents how the researcher goes about answering the research question, 'How does assistive technology impact computer use among individuals with cervical spinal cord injury?', through an in-depth investigation into the real-life situations of computer operators with cervical spinal cord injuries (CSI). An in-depth survey was carried out to provide insight into the functional abilities and limitations, habitual practices and preferences, choices and utilisation of input devices, personal and/or technical assistance, environmental set-up and arrangements, and special requirements of 20 experienced computer users with cervical spinal cord injuries. Following the survey findings, a five-layer CSI users' needs hierarchy of input device selection and use was proposed. These needs were ranked in order, beginning with the most basic criterion at the bottom of the pyramid; lower-level criteria must be met before one moves on to the higher level. The users' needs hierarchy for CSI computer users had not been applied in previous research work, and it establishes a rationale for the development of alternative input devices. If an input device achieves the criteria set out in the needs hierarchy, then a good match of person and technology will be achieved.

  2. Computer-Assisted Instruction and Its Application to Air Force Civil Engineering.

    DTIC Science & Technology

    1987-09-01

    ...train the student. The student's responses may cause the computer to present the previous material in a different manner if these responses indicated that the... presented by Schlechter. He reports study results that indicate that CAI time savings may be due to self-pacing, a characteristic of other less-expensive... lesson developer must ask: Is this instructional requirement suited for individual self-paced interactive instruction? If the answer is no, then...

  3. The Beneficial Role of Random Strategies in Social and Financial Systems

    NASA Astrophysics Data System (ADS)

    Biondo, Alessio Emanuele; Pluchino, Alessandro; Rapisarda, Andrea

    2013-05-01

    In this paper we focus on the beneficial role of random strategies in social sciences by means of simple mathematical and computational models. We briefly review recent results obtained by two of us in previous contributions for the case of the Peter principle and the efficiency of a Parliament. Then, we develop a new application of random strategies to the case of financial trading and discuss in detail our findings about forecasts of markets dynamics.

  4. Improving Simulated Annealing by Replacing Its Variables with Game-Theoretic Utility Maximizers

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Bandari, Esfandiar; Tumer, Kagan

    2001-01-01

    The game-theory field of Collective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved as a side-effect. Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting significantly improves simulated annealing for a model of an economic process run over an underlying small-worlds topology. Furthermore, these experiments reveal novel small-worlds phenomena, and highlight the shortcomings of conventional mechanism design in bounded rationality domains.

  5. Numerical Analysis of Organ Doses Delivered During Computed Tomography Examinations Using Japanese Adult Phantoms with the WAZA-ARI Dosimetry System.

    PubMed

    Takahashi, Fumiaki; Sato, Kaoru; Endo, Akira; Ono, Koji; Ban, Nobuhiko; Hasegawa, Takayuki; Katsunuma, Yasushi; Yoshitake, Takayasu; Kai, Michiaki

    2015-08-01

    A dosimetry system for computed tomography (CT) examinations, named WAZA-ARI, is being developed to accurately assess radiation doses to patients in Japan. For dose calculations in WAZA-ARI, organ doses were numerically analyzed using average adult Japanese male (JM) and female (JF) phantoms with the Particle and Heavy Ion Transport code System (PHITS). Experimental studies clarified the energy distribution of emitted photons and the dose profiles on the table for some multi-detector row CT (MDCT) devices. Numerical analyses using a source model in PHITS could specifically take into account emission of x rays from the tube to the table, with attenuation of photons through a beam-shaping filter, for each MDCT device, based on the experimental results. The source model was validated by measuring the CT dose index (CTDI). Numerical analyses with PHITS revealed a concordance of organ doses with the body sizes of the JM and JF phantoms. The organ doses in the JM phantoms were compared with data obtained using previously developed systems. In addition, the dose calculations in WAZA-ARI were verified against previously reported results obtained with realistic NUBAS phantoms and against radiation dose measurements using a physical Japanese model (THRA1 phantom). The results imply that numerical analyses using the Japanese phantoms and specified source models can give reasonable dose estimates for MDCT devices for typical Japanese adults.

  6. Performance analysis of wireless sensor networks in geophysical sensing applications

    NASA Astrophysics Data System (ADS)

    Uligere Narasimhamurthy, Adithya

    Performance is an important criterion to consider before switching from a wired network to a wireless sensing network. Performance is especially important in geophysical sensing, where the quality of the sensing system is measured by the precision of the acquired signal. Can a wireless sensing network maintain the same reliability and quality metrics that a wired system provides? Our work focuses on evaluating the wireless GeoMote sensor motes that were developed by previous computer science graduate students at Mines. Specifically, we conducted a set of experiments, namely WalkAway and Linear Array experiments, to characterize the performance of the wireless motes. The motes were also equipped with the Sticking Heartbeat Aperture Resynchronization Protocol (SHARP), a time synchronization protocol developed by a previous computer science graduate student at Mines. This protocol should automatically synchronize the motes' internal clocks and reduce time synchronization errors. We also collected passive data to evaluate the response of GeoMotes to various frequency components associated with seismic waves. With the data collected from these experiments, we evaluated the performance of the SHARP protocol and compared the performance of our GeoMote wireless system against the industry-standard wired seismograph system (Geometric-Geode). Using arrival time analysis and seismic velocity calculations, we set out to answer the following question: can our wireless sensing system (GeoMotes) perform similarly to a traditional wired system in a realistic scenario?

  7. Multiscale intensity homogeneity transformation method and its application to computer-aided detection of pulmonary embolism in computed tomographic pulmonary angiography (CTPA)

    NASA Astrophysics Data System (ADS)

    Guo, Yanhui; Zhou, Chuan; Chan, Heang-Ping; Wei, Jun; Chughtai, Aamer; Sundaram, Baskaran; Hadjiiski, Lubomir M.; Patel, Smita; Kazerooni, Ella A.

    2013-04-01

    A 3D multiscale intensity homogeneity transformation (MIHT) method was developed to reduce false positives (FPs) in our previously developed CAD system for pulmonary embolism (PE) detection. In MIHT, the voxel intensity of a PE candidate region was transformed to an intensity homogeneity value (IHV) with respect to the local median intensity. The IHVs were calculated in multiscales (MIHVs) to measure the intensity homogeneity, taking into account vessels of different sizes and different degrees of occlusion. Seven new features including the entropy, gradient, and moments that characterized the intensity distributions of the candidate regions were derived from the MIHVs and combined with the previously designed features that described the shape and intensity of PE candidates for the training of a linear classifier to reduce the FPs. 59 CTPA PE cases were collected from our patient files (UM set) with IRB approval and 69 cases from the PIOPED II data set with access permission. 595 and 800 PEs were identified as reference standard by experienced thoracic radiologists in the UM and PIOPED set, respectively. FROC analysis was used for performance evaluation. Compared with our previous CAD system, at a test sensitivity of 80%, the new method reduced the FP rate from 18.9 to 14.1/scan for the PIOPED set when the classifier was trained with the UM set and from 22.6 to 16.0/scan vice versa. The improvement was statistically significant (p<0.05) by JAFROC analysis. This study demonstrated that the MIHT method is effective in reducing FPs and improving the performance of the CAD system.
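
    One plausible reading of the transformation described above (an illustrative interpretation, not the published definition of MIHT) is a per-voxel deviation from a local median computed at several neighbourhood scales; the scales and the random volume below are placeholders.

    ```python
    # Sketch: multiscale intensity "homogeneity" maps relative to a local median.
    import numpy as np
    from scipy.ndimage import median_filter

    def multiscale_homogeneity(volume, scales=(3, 5, 9)):
        """Return one homogeneity map per neighbourhood scale for a 3-D sub-volume."""
        maps = []
        for s in scales:
            local_median = median_filter(volume, size=s)
            maps.append(volume - local_median)      # near zero where the vessel fills homogeneously
        return np.stack(maps)

    vol = np.random.default_rng(1).normal(100.0, 5.0, size=(32, 32, 32))  # placeholder intensities
    print(multiscale_homogeneity(vol).shape)        # (3, 32, 32, 32)
    ```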

  8. Three-dimensional rendering of segmented object using matlab - biomed 2010.

    PubMed

    Anderson, Jeffrey R; Barrett, Steven F

    2010-01-01

    The three-dimensional rendering of microscopic objects is a difficult and challenging task that often requires specialized image processing techniques. Previous work has described a semi-automatic segmentation process for fluorescently stained neurons collected as a sequence of slice images with a confocal laser scanning microscope. Once properly segmented, each individual object can be rendered and studied as a three-dimensional virtual object. This paper describes the work associated with the design and development of Matlab files to create three-dimensional images from the segmented object data previously mentioned. Part of the motivation for this work is to integrate both the segmentation and rendering processes into one software application, providing a seamless transition from the segmentation tasks to the rendering and visualization tasks. Previously these tasks were accomplished on two different computer systems, Windows and Linux. This division basically limits the usefulness of the segmentation and rendering applications to those who have both computer systems readily available. The focus of this work is to create custom Matlab image processing algorithms for object rendering and visualization, and to merge these capabilities with the Matlab files that were developed especially for the image segmentation task. The completed Matlab application will contain both the segmentation and rendering processes in a single graphical user interface, or GUI. This process for rendering three-dimensional images in Matlab requires that a sequence of two-dimensional binary images, each representing a cross-sectional slice of the object, be reassembled in a 3D space and covered with a surface. Additional segmented objects can be rendered in the same 3D space. The surface properties of each object can be varied by the user to aid in the study and analysis of the objects. This interactive process becomes a powerful visual tool to study and understand microscopic objects.

  9. Symbolic Computation of Strongly Connected Components Using Saturation

    NASA Technical Reports Server (NTRS)

    Zhao, Yang; Ciardo, Gianfranco

    2010-01-01

    Finding strongly connected components (SCCs) in the state-space of discrete-state models is a critical task in formal verification of LTL and fair CTL properties, but the potentially huge number of reachable states and SCCs constitutes a formidable challenge. This paper is concerned with computing the sets of states in SCCs or terminal SCCs of asynchronous systems. Because of its advantages in many applications, we employ saturation on two previously proposed approaches: the Xie-Beerel algorithm and transitive closure. First, saturation speeds up state-space exploration when computing each SCC in the Xie-Beerel algorithm. Then, our main contribution is a novel algorithm to compute the transitive closure using saturation. Experimental results indicate that our improved algorithms achieve a clear speedup over previous algorithms in some cases. With the help of the new transitive closure computation algorithm, up to 10^150 SCCs can be explored within a few seconds.
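
    The set-based skeleton that saturation accelerates can be shown explicitly (no decision diagrams and no saturation, just the Xie-Beerel idea): pick a state, intersect its forward- and backward-reachable sets within the remaining states to obtain one SCC, and repeat on what is left. The toy graph below is hypothetical.

    ```python
    # Explicit-state sketch of the Xie-Beerel scheme: SCC(v) = forward(v) ∩ backward(v).
    def reachable(start, edges, universe):
        seen, frontier = {start}, [start]
        while frontier:
            v = frontier.pop()
            for w in edges.get(v, ()):
                if w in universe and w not in seen:
                    seen.add(w)
                    frontier.append(w)
        return seen

    def sccs(states, succ, pred):
        remaining = set(states)
        while remaining:
            v = next(iter(remaining))
            scc = reachable(v, succ, remaining) & reachable(v, pred, remaining)
            yield scc
            remaining -= scc

    succ = {1: [2], 2: [3], 3: [1, 4], 4: [5], 5: [4]}
    pred = {2: [1], 3: [2], 1: [3], 4: [3, 5], 5: [4]}
    print(list(sccs(range(1, 6), succ, pred)))      # e.g. [{1, 2, 3}, {4, 5}]
    ```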

  10. Extended precision data types for the development of the original computer aided engineering applications

    NASA Astrophysics Data System (ADS)

    Pescaru, A.; Oanta, E.; Axinte, T.; Dascalescu, A.-D.

    2015-11-01

    Computer aided engineering is based on models of phenomena which are expressed as algorithms. The implementations of the algorithms are usually software applications which process a large volume of numerical data, regardless of the size of the input data. In this way, finite element method applications used to have an input data generator which created the entire volume of geometrical data, starting from the initial geometrical information and the parameters stored in the input data file. Moreover, there were several data processing stages, such as: renumbering of the nodes to minimize the band length of the system of equations to be solved, computation of the equivalent nodal forces, computation of the element stiffness matrix, assembly of the system of equations, solving the system of equations, and computation of the secondary variables. Modern software applications use pre-processing and post-processing programs to handle the information easily. Beyond this example, CAE applications involve various stages of complex computation, and the accuracy of the final results is of particular interest. Over time, the development of CAE applications has been a constant concern of the authors, and the accuracy of the results has been a very important target. The paper presents the various computing techniques which were devised and implemented in the resulting applications: finite element method programs, finite difference element method programs, applied general numerical methods applications, data generators, graphical applications, and experimental data reduction programs. In this context, the use of extended precision data types was one of the solutions, the limitations being imposed by the size of the memory which may be allocated. To avoid memory-related problems, the data were stored in files. To minimize the execution time, part of the file was accessed using dynamic memory allocation facilities. One of the most important outcomes of the paper is the design of a library which includes the optimized solutions previously tested, and which may be used for the easy development of original cross-platform CAE applications. Last but not least, besides the generality of the data type solutions, the work targets the development of a software library which may be used for the easy development of node-based CAE applications, each node having several known or unknown parameters, with the system of equations being automatically generated and solved.

  11. Monte Carlo Perturbation Theory Estimates of Sensitivities to System Dimensions

    DOE PAGES

    Burke, Timothy P.; Kiedrowski, Brian C.

    2017-12-11

    Here, Monte Carlo methods are developed using adjoint-based perturbation theory and the differential operator method to compute the sensitivities of the k-eigenvalue, linear functions of the flux (reaction rates), and bilinear functions of the forward and adjoint flux (kinetics parameters) to system dimensions for uniform expansions or contractions. The calculation of sensitivities to system dimensions requires computing scattering and fission sources at material interfaces using collisions occurring at the interface, which is a set of events with infinitesimal probability. Kernel density estimators are used to estimate the source at interfaces using collisions occurring near the interface. The methods for computing sensitivities of linear and bilinear ratios are derived using the differential operator method and adjoint-based perturbation theory and are shown to be equivalent to methods previously developed using a collision history-based approach. The methods for determining sensitivities to system dimensions are tested on a series of fast, intermediate, and thermal critical benchmarks as well as a pressurized water reactor benchmark problem with iterated fission probability used for adjoint-weighting. The estimators are shown to agree within 5% and 3σ of reference solutions obtained using direct perturbations with central differences for the majority of test problems.

  12. Approximate Algorithms for Computing Spatial Distance Histograms with Accuracy Guarantees

    PubMed Central

    Grupcev, Vladimir; Yuan, Yongke; Tu, Yi-Cheng; Huang, Jin; Chen, Shaoping; Pandit, Sagar; Weng, Michael

    2014-01-01

    Particle simulation has become an important research tool in many scientific and engineering fields. Data generated by such simulations impose great challenges to database storage and query processing. One of the queries against particle simulation data, the spatial distance histogram (SDH) query, is the building block of many high-level analytics, and requires quadratic time to compute using a straightforward algorithm. Previous work has developed efficient algorithms that compute exact SDHs. While beating the naive solution, such algorithms are still not practical in processing SDH queries against large-scale simulation data. In this paper, we take a different path to tackle this problem by focusing on approximate algorithms with provable error bounds. We first present a solution derived from the aforementioned exact SDH algorithm, and this solution has running time that is unrelated to the system size N. We also develop a mathematical model to analyze the mechanism that leads to errors in the basic approximate algorithm. Our model provides insights on how the algorithm can be improved to achieve higher accuracy and efficiency. Such insights give rise to a new approximate algorithm with improved time/accuracy tradeoff. Experimental results confirm our analysis. PMID:24693210
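
    For orientation, the quadratic-time baseline that these approximate algorithms improve upon simply bins all pairwise particle distances; a minimal sketch with hypothetical bucket parameters is shown below.

    ```python
    # Naive O(N^2) spatial distance histogram (SDH): bin every pairwise distance once.
    import numpy as np

    def sdh_naive(points, bucket_width, max_dist):
        d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
        d = d[np.triu_indices(len(points), k=1)]             # unique pairs only
        edges = np.arange(0.0, max_dist + bucket_width, bucket_width)
        hist, _ = np.histogram(d, bins=edges)
        return hist

    pts = np.random.default_rng(0).random((500, 3))          # placeholder particle positions
    print(sdh_naive(pts, bucket_width=0.1, max_dist=np.sqrt(3.0)))
    ```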

  13. A novel computational approach for simultaneous tracking and feature extraction of C. elegans populations in fluid environments.

    PubMed

    Tsechpenakis, Gabriel; Bianchi, Laura; Metaxas, Dimitris; Driscoll, Monica

    2008-05-01

    The nematode Caenorhabditis elegans (C. elegans) is a genetic model widely used to dissect conserved basic biological mechanisms of development and nervous system function. C. elegans locomotion is under complex neuronal regulation and is impacted by genetic and environmental factors; thus, its analysis is expected to shed light on how genetic, environmental, and pathophysiological processes control behavior. To date, computer-based approaches have been used for analysis of C. elegans locomotion; however, none of these is both high resolution and high throughput. We used computer vision methods to develop a novel automated approach for analyzing the C. elegans locomotion. Our method provides information on the position, trajectory, and body shape during locomotion and is designed to efficiently track multiple animals (C. elegans) in cluttered images and under lighting variations. We used this method to describe in detail C. elegans movement in liquid for the first time and to analyze six unc-8, one mec-4, and one odr-1 mutants. We report features of nematode swimming not previously noted and show that our method detects differences in the swimming profile of mutants that appear at first glance similar.

  14. An efficient and robust method for predicting helicopter rotor high-speed impulsive noise

    NASA Technical Reports Server (NTRS)

    Brentner, Kenneth S.

    1996-01-01

    A new formulation for the Ffowcs Williams-Hawkings quadrupole source, which is valid for a far-field in-plane observer, is presented. The far-field approximation is new and unique in that no further approximation of the quadrupole source strength is made and integrands with r^-2 and r^-3 dependence are retained. This paper focuses on the development of a retarded-time formulation in which time derivatives are analytically taken inside the integrals to avoid unnecessary computational work when the observer moves with the rotor. The new quadrupole formulation is similar to Farassat's thickness and loading formulation 1A. Quadrupole noise prediction is carried out in two parts: a preprocessing stage in which the previously computed flow field is integrated in the direction normal to the rotor disk, and a noise computation stage in which quadrupole surface integrals are evaluated for a particular observer position. Preliminary predictions for hover and forward flight agree well with experimental data. The method is robust and requires computer resources comparable to thickness and loading noise prediction.

  15. A Parallel Distributed-Memory Particle Method Enables Acquisition-Rate Segmentation of Large Fluorescence Microscopy Images

    PubMed Central

    Afshar, Yaser; Sbalzarini, Ivo F.

    2016-01-01

    Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers orchestrate the collective solving of the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10^10 pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments. PMID:27046144

  16. Rapid Calculation of Max-Min Fair Rates for Multi-Commodity Flows in Fat-Tree Networks

    DOE PAGES

    Mollah, Md Atiqul; Yuan, Xin; Pakin, Scott; ...

    2017-08-29

    Max-min fairness is often used in the performance modeling of interconnection networks. Existing methods to compute max-min fair rates for multi-commodity flows have high complexity and are computationally infeasible for large networks. In this paper, we show that by considering topological features, this problem can be solved efficiently for the fat-tree topology that is widely used in data centers and high performance compute clusters. Several efficient new algorithms are developed for this problem, including a parallel algorithm that can take advantage of multi-core and shared-memory architectures. Using these algorithms, we demonstrate that it is possible to find the max-min fair rate allocation for multi-commodity flows in fat-tree networks that support tens of thousands of nodes. We evaluate the run-time performance of the proposed algorithms and show improvement in orders of magnitude over the previously best known method. Finally, we further demonstrate a new application of max-min fair rate allocation that is only computationally feasible using our new algorithms.
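
    As background, max-min fair rates for a set of flows can be computed with the classical progressive-filling procedure sketched below (the general baseline, not the fat-tree-specific algorithms contributed by this paper): all unfrozen flows are raised together until some link saturates, and the flows crossing that link are then frozen.

    ```python
    # Progressive filling: textbook computation of max-min fair flow rates.
    # links: capacity per link id; flows: the set of link ids each flow traverses.
    def max_min_fair(links, flows):
        rate = [0.0] * len(flows)
        active = set(range(len(flows)))
        residual = dict(links)
        while active:
            # Smallest equal increment that saturates some link used by an active flow.
            share = min(residual[l] / sum(1 for f in active if l in flows[f])
                        for l in residual if any(l in flows[f] for f in active))
            saturated = set()
            for l in residual:
                users = [f for f in active if l in flows[f]]
                residual[l] -= share * len(users)
                if users and residual[l] <= 1e-12:
                    saturated.add(l)
            for f in list(active):
                rate[f] += share
                if flows[f] & saturated:
                    active.remove(f)
        return rate

    # Two unit-capacity links; flow 0 crosses both, flows 1 and 2 cross one each.
    print(max_min_fair({"a": 1.0, "b": 1.0}, [{"a", "b"}, {"a"}, {"b"}]))   # [0.5, 0.5, 0.5]
    ```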

  17. Rapid Calculation of Max-Min Fair Rates for Multi-Commodity Flows in Fat-Tree Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mollah, Md Atiqul; Yuan, Xin; Pakin, Scott

    Max-min fairness is often used in the performance modeling of interconnection networks. Existing methods to compute max-min fair rates for multi-commodity flows have high complexity and are computationally infeasible for large networks. In this paper, we show that by considering topological features, this problem can be solved efficiently for the fat-tree topology that is widely used in data centers and high performance compute clusters. Several efficient new algorithms are developed for this problem, including a parallel algorithm that can take advantage of multi-core and shared-memory architectures. Using these algorithms, we demonstrate that it is possible to find the max-min fair rate allocation for multi-commodity flows in fat-tree networks that support tens of thousands of nodes. We evaluate the run-time performance of the proposed algorithms and show improvement in orders of magnitude over the previously best known method. Finally, we further demonstrate a new application of max-min fair rate allocation that is only computationally feasible using our new algorithms.

  18. A combined registration and finite element analysis method for fast estimation of intraoperative brain shift; phantom and animal model study.

    PubMed

    Mohammadi, Amrollah; Ahmadian, Alireza; Rabbani, Shahram; Fattahi, Ehsan; Shirani, Shapour

    2017-12-01

    Finite element models for estimation of intraoperative brain shift suffer from a huge computational cost. In these models, image registration and finite element analysis are two time-consuming processes. The proposed method is an improved version of our previously developed Finite Element Drift (FED) registration algorithm. In this work the registration process is combined with the finite element analysis. In the Combined FED (CFED), the deformation of the whole brain mesh is iteratively calculated by geometrical extension of a local load vector which is computed by FED. While the processing time of the FED-based method, including registration and finite element analysis, was about 70 s, the computation time of the CFED was about 3.2 s. The computational cost of CFED is almost 50% less than that of similar state-of-the-art brain shift estimators based on finite element models. The proposed combination of registration and structural analysis can make the calculation of brain deformation much faster. Copyright © 2016 John Wiley & Sons, Ltd.

  19. Symmetrical compression distance for arrhythmia discrimination in cloud-based big-data services.

    PubMed

    Lillo-Castellano, J M; Mora-Jiménez, I; Santiago-Mozos, R; Chavarría-Asso, F; Cano-González, A; García-Alberola, A; Rojo-Álvarez, J L

    2015-07-01

    The current development of cloud computing is completely changing the paradigm of knowledge extraction from huge databases. An example of this technology in the cardiac arrhythmia field is the SCOOP platform, a national-level scientific cloud-based big data service for implantable cardioverter defibrillators. In this scenario, we here propose a new methodology for automatic classification of intracardiac electrograms (EGMs) in a cloud computing system, designed for minimal signal preprocessing. A new compression-based similarity measure (CSM) is created for low computational burden, the so-called weighted fast compression distance, which provides better performance when compared with other CSMs in the literature. Using simple machine learning techniques, a set of 6848 EGMs extracted from the SCOOP platform were classified into seven cardiac arrhythmia classes and one noise class, reaching nearly 90% accuracy when previous patient arrhythmia information was available and 63% otherwise, hence outperforming in all cases the classification provided by the majority class. Results show that this methodology can be used as a high-quality service of cloud computing, providing support to physicians for improving knowledge on patient diagnosis.
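
    The family of similarity measures referred to above builds on the normalized compression distance; a minimal zlib-based sketch of the standard NCD (not the weighted fast compression distance proposed in the paper) is:

    ```python
    # Normalized compression distance between two signals serialized as byte strings.
    # NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), with C = compressed size.
    import zlib

    def ncd(x: bytes, y: bytes) -> float:
        cx, cy, cxy = len(zlib.compress(x)), len(zlib.compress(y)), len(zlib.compress(x + y))
        return (cxy - min(cx, cy)) / max(cx, cy)

    egm_a = bytes(50 * [1, 2, 3, 4])          # placeholder "electrogram" byte strings
    egm_b = bytes(50 * [1, 2, 3, 5])
    egm_c = bytes(range(200))
    print(ncd(egm_a, egm_b), ncd(egm_a, egm_c))   # the similar pair scores lower
    ```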

  20. Navier-Stokes simulations of slender axisymmetric shapes in supersonic, turbulent flow

    NASA Astrophysics Data System (ADS)

    Moran, Kenneth J.; Beran, Philip S.

    1994-07-01

    Computational fluid dynamics is used to study flows about slender, axisymmetric bodies at very high speeds. Numerical experiments are conducted to simulate a broad range of flight conditions. Mach number is varied from 1.5 to 8 and Reynolds number is varied from 1 × 10^6/m to 10^8/m. The primary objective is to develop and validate a computational methodology for the accurate simulation of a wide variety of flow structures. Accurate results are obtained for detached bow shocks, recompression shocks, corner-point expansions, base-flow recirculations, and turbulent boundary layers. Accuracy is assessed through comparison with theory and experimental data; computed surface pressure, shock structure, base-flow structure, and velocity profiles are within measurement accuracy throughout the range of conditions tested. The methodology is both practical and general: general in its applicability, and practical in its performance. To achieve high accuracy, modifications to previously reported techniques are implemented in the scheme. These modifications improve computed results in the vicinity of symmetry lines and in the base-flow region, including the turbulent wake.

  1. Improved full analytical polygon-based method using Fourier analysis of the three-dimensional affine transformation.

    PubMed

    Pan, Yijie; Wang, Yongtian; Liu, Juan; Li, Xin; Jia, Jia

    2014-03-01

    Previous research [Appl. Opt.52, A290 (2013)] has revealed that Fourier analysis of three-dimensional affine transformation theory can be used to improve the computation speed of the traditional polygon-based method. In this paper, we continue our research and propose an improved full analytical polygon-based method developed upon this theory. Vertex vectors of primitive and arbitrary triangles and the pseudo-inverse matrix were used to obtain an affine transformation matrix representing the spatial relationship between the two triangles. With this relationship and the primitive spectrum, we analytically obtained the spectrum of the arbitrary triangle. This algorithm discards low-level angular dependent computations. In order to add diffusive reflection to each arbitrary surface, we also propose a whole matrix computation approach that takes advantage of the affine transformation matrix and uses matrix multiplication to calculate shifting parameters of similar sub-polygons. The proposed method improves hologram computation speed for the conventional full analytical approach. Optical experimental results are demonstrated which prove that the proposed method can effectively reconstruct three-dimensional scenes.
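
    The single step of recovering the affine map between a primitive and an arbitrary triangle from their vertex vectors can be sketched with a pseudo-inverse in homogeneous coordinates (an illustration of that step only, with made-up vertices; the pseudo-inverse yields the minimum-norm map consistent with the three correspondences, not the paper's full hologram pipeline).

    ```python
    # Recover a 4x4 affine matrix A with V_arb = A @ V_prim via the pseudo-inverse.
    import numpy as np

    prim = np.array([[0.0, 1.0, 0.0],         # primitive triangle vertices (one per row)
                     [0.0, 0.0, 1.0],
                     [0.0, 0.0, 0.0]]).T
    arb = np.array([[1.0, 2.0, 0.5],          # arbitrary triangle vertices (one per row)
                    [1.5, 1.0, 0.2],
                    [0.3, 0.8, 1.0]]).T

    V_prim = np.vstack([prim, np.ones((1, 3))])   # 4 x 3 homogeneous vertex columns
    V_arb = np.vstack([arb, np.ones((1, 3))])
    A = V_arb @ np.linalg.pinv(V_prim)            # minimum-norm affine transformation
    print(np.allclose(A @ V_prim, V_arb))         # True: the map reproduces the vertices
    ```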

  2. A Parallel Distributed-Memory Particle Method Enables Acquisition-Rate Segmentation of Large Fluorescence Microscopy Images.

    PubMed

    Afshar, Yaser; Sbalzarini, Ivo F

    2016-01-01

    Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers orchestrate the collective solving of the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10^10 pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments.

  3. Gene discovery in the hamster: a comparative genomics approach for gene annotation by sequencing of hamster testis cDNAs

    PubMed Central

    Oduru, Sreedhar; Campbell, Janee L; Karri, SriTulasi; Hendry, William J; Khan, Shafiq A; Williams, Simon C

    2003-01-01

    Background Complete genome annotation will likely be achieved through a combination of computer-based analysis of available genome sequences combined with direct experimental characterization of expressed regions of individual genomes. We have utilized a comparative genomics approach involving the sequencing of randomly selected hamster testis cDNAs to begin to identify genes not previously annotated on the human, mouse, rat and Fugu (pufferfish) genomes. Results 735 distinct sequences were analyzed for their relatedness to known sequences in public databases. Eight of these sequences were derived from previously unidentified genes and expression of these genes in testis was confirmed by Northern blotting. The genomic locations of each sequence were mapped in human, mouse, rat and pufferfish, where applicable, and the structure of their cognate genes was derived using computer-based predictions, genomic comparisons and analysis of uncharacterized cDNA sequences from human and macaque. Conclusion The use of a comparative genomics approach resulted in the identification of eight cDNAs that correspond to previously uncharacterized genes in the human genome. The proteins encoded by these genes included a new member of the kinesin superfamily, a SET/MYND-domain protein, and six proteins for which no specific function could be predicted. Each gene was expressed primarily in testis, suggesting that they may play roles in the development and/or function of testicular cells. PMID:12783626

  4. Identification of genes related to proliferative diabetic retinopathy through RWR algorithm based on protein-protein interaction network.

    PubMed

    Zhang, Jian; Suo, Yan; Liu, Min; Xu, Xun

    2018-06-01

    Proliferative diabetic retinopathy (PDR) is one of the most common complications of diabetes and can lead to blindness. Proteomic studies have provided insight into the pathogenesis of PDR and a series of PDR-related genes has been identified but are far from fully characterized because the experimental methods are expensive and time consuming. In our previous study, we successfully identified 35 candidate PDR-related genes through the shortest-path algorithm. In the current study, we developed a computational method using the random walk with restart (RWR) algorithm and the protein-protein interaction (PPI) network to identify potential PDR-related genes. After some possible genes were obtained by the RWR algorithm, a three-stage filtration strategy, which includes the permutation test, interaction test and enrichment test, was applied to exclude potential false positives caused by the structure of PPI network, the poor interaction strength, and the limited similarity on gene ontology (GO) terms and biological pathways. As a result, 36 candidate genes were discovered by the method which was different from the 35 genes reported in our previous study. A literature review showed that 21 of these 36 genes are supported by previous experiments. These findings suggest the robustness and complementary effects of both our efforts using different computational methods, thus providing an alternative method to study PDR pathogenesis. Copyright © 2017 Elsevier B.V. All rights reserved.
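
    A compact sketch of the random walk with restart iteration used in such methods (on a tiny, made-up adjacency matrix rather than a real PPI network, and with an arbitrary restart probability) is shown below: the walker repeatedly steps along edges but returns to the seed genes with probability r, and the stationary probabilities rank candidate genes by network proximity to the seeds.

    ```python
    # Random walk with restart (RWR): p <- (1 - r) * W @ p + r * p0, iterated to a fixed point.
    import numpy as np

    def rwr(adj, seeds, restart=0.7, tol=1e-10):
        W = adj / adj.sum(axis=0, keepdims=True)      # column-normalized transition matrix
        p0 = np.zeros(adj.shape[0])
        p0[seeds] = 1.0 / len(seeds)
        p = p0.copy()
        while True:
            p_next = (1 - restart) * W @ p + restart * p0
            if np.abs(p_next - p).sum() < tol:
                return p_next
            p = p_next

    # Toy 5-protein network; proteins 0 and 1 stand in for known PDR-related seed genes.
    adj = np.array([[0, 1, 1, 0, 0],
                    [1, 0, 1, 0, 0],
                    [1, 1, 0, 1, 0],
                    [0, 0, 1, 0, 1],
                    [0, 0, 0, 1, 0]], dtype=float)
    print(np.argsort(rwr(adj, seeds=[0, 1]))[::-1])   # proteins ranked by affinity to the seeds
    ```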

  5. Multithreaded transactions in scientific computing. The Growth06_v2 program

    NASA Astrophysics Data System (ADS)

    Daniluk, Andrzej

    2009-07-01

    Writing a concurrent program can be more difficult than writing a sequential program. A programmer needs to think about synchronization, race conditions, and shared variables. Transactions help reduce the inconvenience of using threads. A transaction is an abstraction which allows programmers to group a sequence of actions on the program into a logical, higher-level computation unit. This paper presents a new version of the GROWTHGr and GROWTH06 programs. New version program summary - Program title: GROWTH06_v2 Catalogue identifier: ADVL_v2_1 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVL_v2_1.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 65 255 No. of bytes in distributed program, including test data, etc.: 865 985 Distribution format: tar.gz Programming language: Object Pascal Computer: Pentium-based PC Operating system: Windows 9x, XP, NT, Vista RAM: more than 1 MB Classification: 4.3, 7.2, 6.2, 8, 14 Catalogue identifier of previous version: ADVL_v2_0 Journal reference of previous version: Comput. Phys. Comm. 175 (2006) 678 Does the new version supersede the previous version?: Yes Nature of problem: The programs compute the RHEED intensities during the growth of thin epitaxial structures prepared using molecular beam epitaxy (MBE). The computations are based on the use of kinematical diffraction theory. Solution method: Epitaxial growth of thin films is modelled by a set of non-linear differential equations [1]. The Runge-Kutta method with adaptive stepsize control was used for solving the initial value problem for the non-linear differential equations [2]. Reasons for new version: According to users' suggestions, the functionality of the program has been improved. Moreover, new use cases have been added which make the handling of the program easier and more efficient than the previous ones [3]. Summary of revisions: The design pattern (see Fig. 2 of Ref. [3]) has been modified according to the scheme shown in Fig. 1. A graphical user interface (GUI) for the program has been reconstructed. Fig. 2 presents a hybrid diagram of a GUI that shows how onscreen objects connect to use cases. The program has been compiled with English/USA regional and language options. Note: The figures mentioned above are contained in the program distribution file. Unusual features: The program is distributed in the form of the source project GROWTH06_v2.dpr with associated files, and should be compiled using Borland Delphi compilers version 6 or later (including Borland Developer Studio 2006 and CodeGear compilers for Delphi). Additional comments: Two figures are included in the program distribution file. These are captioned 'Static classes model for Transaction design pattern' and 'A model of a window that shows how onscreen objects connect to use cases'. Running time: The typical running time is machine and user-parameters dependent. References: [1] A. Daniluk, Comput. Phys. Comm. 170 (2005) 265. [2] W.H. Press, B.P. Flannery, S.A. Teukolsky, W.T. Vetterling, Numerical Recipes in Pascal: The Art of Scientific Computing, first ed., Cambridge University Press, 1989. [3] M. Brzuszek, A. Daniluk, Comput. Phys. Comm. 175 (2006) 678.

  6. Finite Larmor radius effects on the (m = 2, n = 1) cylindrical tearing mode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Y.; Chowdhury, J.; Parker, S. E.

    2015-04-15

    New field solvers are developed in the gyrokinetic code GEM [Chen and Parker, J. Comput. Phys. 220, 839 (2007)] to simulate low-n modes. A novel discretization is developed for the ion polarization term in the gyrokinetic vorticity equation. An eigenmode analysis with finite Larmor radius effects is developed to study the linear resistive tearing mode. The mode growth rate is shown to scale with resistivity as γ ∼ η^{1/3}, the same as the semi-collisional regime in previous kinetic treatments [Drake and Lee, Phys. Fluids 20, 1341 (1977)]. Tearing mode simulations with gyrokinetic ions are verified with the eigenmode calculation.

  7. A systematic investigation of computation models for predicting Adverse Drug Reactions (ADRs).

    PubMed

    Kuang, Qifan; Wang, MinQi; Li, Rong; Dong, YongCheng; Li, Yizhou; Li, Menglong

    2014-01-01

    Early and accurate identification of adverse drug reactions (ADRs) is critically important for drug development and clinical safety. Computer-aided prediction of ADRs has attracted increasing attention in recent years, and many computational models have been proposed. However, because of the lack of systematic analysis and comparison of the different computational models, there remain limitations in designing more effective algorithms and selecting more useful features. There is therefore an urgent need to review and analyze previous computational models to obtain general conclusions that can provide useful guidance for constructing more effective computational models to predict ADRs. In the current study, the main work is to compare and analyze the performance of existing computational methods to predict ADRs, by implementing and evaluating additional algorithms that have previously been used for predicting drug targets. Our results indicated that topological and intrinsic features were complementary to an extent and that the Jaccard coefficient had an important and general effect on the prediction of drug-ADR associations. By comparing the structure of each algorithm, we found that the final formulas of these algorithms could all be converted to a linear model in form; based on this finding, we propose a new algorithm, called the general weighted profile method, which yielded the best overall performance among the algorithms investigated in this paper. Several meaningful conclusions and useful findings regarding the prediction of ADRs are provided for selecting optimal features and algorithms.
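
    As a small illustration of the feature the study singles out, the Jaccard coefficient between the known ADR profiles of two drugs (treated as sets of ADR terms; the profiles below are invented) is simply:

    ```python
    # Jaccard coefficient between two drugs' ADR profiles; profile-based predictors
    # score a candidate drug-ADR pair from such set similarities.
    def jaccard(profile_a: set, profile_b: set) -> float:
        if not profile_a and not profile_b:
            return 0.0
        return len(profile_a & profile_b) / len(profile_a | profile_b)

    drug_x = {"nausea", "headache", "rash"}           # hypothetical ADR profiles
    drug_y = {"nausea", "rash", "dizziness"}
    print(jaccard(drug_x, drug_y))                    # 0.5
    ```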

  8. Weighted fusion of depth and inertial data to improve view invariance for real-time human action recognition

    NASA Astrophysics Data System (ADS)

    Chen, Chen; Hao, Huiyan; Jafari, Roozbeh; Kehtarnavaz, Nasser

    2017-05-01

    This paper presents an extension to our previously developed fusion framework [10] involving a depth camera and an inertial sensor in order to improve its view invariance aspect for real-time human action recognition applications. A computationally efficient view estimation based on skeleton joints is considered in order to select the most relevant depth training data when recognizing test samples. Two collaborative representation classifiers, one for depth features and one for inertial features, are appropriately weighted to generate a decision making probability. The experimental results applied to a multi-view human action dataset show that this weighted extension improves the recognition performance by about 5% over equally weighted fusion deployed in our previous fusion framework.
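
    The weighting step described above can be sketched as a convex combination of the two classifiers' per-class probabilities (the weights and probability vectors below are placeholders, not values from the paper):

    ```python
    # Decision-level fusion of a depth-based and an inertial-based classifier.
    import numpy as np

    def fuse(p_depth, p_inertial, w_depth=0.6):
        p = w_depth * p_depth + (1.0 - w_depth) * p_inertial
        return p / p.sum(), int(np.argmax(p))

    p_depth = np.array([0.10, 0.70, 0.20])            # placeholder per-class probabilities
    p_inertial = np.array([0.30, 0.40, 0.30])
    probs, label = fuse(p_depth, p_inertial)
    print(probs, "predicted class:", label)
    ```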

  9. Statistical Analysis of CFD Solutions From the Fifth AIAA Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.

    2013-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using a common grid sequence and multiple turbulence models for the June 2012 fifth Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for the 4th Drag Prediction Workshop. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with previous workshops.

  10. Improved optical design of nontracking concentrators

    NASA Astrophysics Data System (ADS)

    Kwan, B. M.; Bannerot, R. B.

    1984-08-01

    Optical designs based on a two reflections or less criterion have been developed for one and two-facet trapezoidal concentrators. Collector designs resulting from this criterion have been evaluated with the aid of a ray-trace computer simulation which includes the effects of nonideal reflectors. Results indicate a marked increase in performance, particularly for the one-facet designs, as compared to the collectors previously designed with the one reflection or less criterion. A significant result is that when a proper accounting is made for the actual acceptance angle for the concentrators, the performances of the optimal one and two-facet designs become nearly identical, indicating that the previously held contention that improved performance could be achieved with multifaceted reflectors (geometrically approaching the compound parabolic shape) may be incorrect.

  11. Free-style dual plane recycling tensor fascia lata musculocutaneous perforator flap for reconstruction of recurrent trochanteric defects.

    PubMed

    Han, Ba Leun; Choi, Hwan Jun

    2014-03-01

    Sequential flap coverage might be required for recurrent defects, but reusing a flap as a donor site has seldom been reported. The concept of a "free-style flap" has been developed, and it allows reconstructive surgeons to raise flaps with various designs reliably, even at sites of previous flap surgery. This article presents the concept of free-style recycling of a tensor fascia lata flap into a perforator-based flap separated in 2 planes in a patient with a recurrent bilateral trochanteric defect. If a reliable perforator is preserved and identified within the tissues by computed tomography angiography or a Doppler device, a new perforator flap can be designed and raised at the previous flap site.

  12. Body-wide anatomy recognition in PET/CT images

    NASA Astrophysics Data System (ADS)

    Wang, Huiqian; Udupa, Jayaram K.; Odhner, Dewey; Tong, Yubing; Zhao, Liming; Torigian, Drew A.

    2015-03-01

    With the rapid growth of positron emission tomography/computed tomography (PET/CT)-based medical applications, body-wide anatomy recognition on whole-body PET/CT images becomes crucial for quantifying body-wide disease burden. This, however, is a challenging problem and seldom studied due to unclear anatomy reference frame and low spatial resolution of PET images as well as low contrast and spatial resolution of the associated low-dose CT images. We previously developed an automatic anatomy recognition (AAR) system [15] whose applicability was demonstrated on diagnostic computed tomography (CT) and magnetic resonance (MR) images in different body regions on 35 objects. The aim of the present work is to investigate strategies for adapting the previous AAR system to low-dose CT and PET images toward automated body-wide disease quantification. Our adaptation of the previous AAR methodology to PET/CT images in this paper focuses on 16 objects in three body regions - thorax, abdomen, and pelvis - and consists of the following steps: collecting whole-body PET/CT images from existing patient image databases, delineating all objects in these images, modifying the previous hierarchical models built from diagnostic CT images to account for differences in appearance in low-dose CT and PET images, automatically locating objects in these images following object hierarchy, and evaluating performance. Our preliminary evaluations indicate that the performance of the AAR approach on low-dose CT images achieves object localization accuracy within about 2 voxels, which is comparable to the accuracies achieved on diagnostic contrast-enhanced CT images. Object recognition on low-dose CT images from PET/CT examinations without requiring diagnostic contrast-enhanced CT seems feasible.

  13. A tetrahedron beam computed tomography benchtop system with a multiple pixel field emission x-ray tube.

    PubMed

    Xu, Xiaochao; Kim, Joshua; Laganis, Philip; Schulze, Derek; Liang, Yongguang; Zhang, Tiezhi

    2011-10-01

    To demonstrate the feasibility of Tetrahedron Beam Computed Tomography (TBCT) using a carbon nanotube (CNT) multiple pixel field emission x-ray (MPFEX) tube. A multiple pixel x-ray source facilitates the creation of novel x-ray imaging modalities. In a previous publication, the authors proposed a Tetrahedron Beam Computed Tomography (TBCT) imaging system which comprises a linear source array and a linear detector array that are orthogonal to each other. TBCT is expected to reduce scatter compared with Cone Beam Computed Tomography (CBCT) and to have better detector performance. Therefore, it may produce improved image quality for image guided radiotherapy. In this study, a TBCT benchtop system has been developed with an MPFEX tube. The tube has 75 CNT cold cathodes, which generate 75 x-ray focal spots on an elongated anode, and has 4 mm pixel spacing. An in-house-developed, 5-row CT detector array using silicon photodiodes and CdWO4 scintillators was employed in the system. Hardware and software were developed for tube control and detector data acquisition. The raw data were preprocessed for beam hardening and detector response linearity and were reconstructed with an FDK-based image reconstruction algorithm. The focal spots were measured at about 1 × 2 mm^2 using a star phantom. Each cathode generates around 3 mA cathode current with 2190 V gate voltage. The benchtop system is able to perform TBCT scans with a prolonged scanning time. Images of a commercial CT phantom were successfully acquired. A prototype system was developed, and preliminary phantom images were successfully acquired. MPFEX is a promising x-ray source for TBCT. Further improvement of tube output is needed in order for it to be used in clinical TBCT systems.

  14. Node Resource Manager: A Distributed Computing Software Framework Used for Solving Geophysical Problems

    NASA Astrophysics Data System (ADS)

    Lawry, B. J.; Encarnacao, A.; Hipp, J. R.; Chang, M.; Young, C. J.

    2011-12-01

    With the rapid growth of multi-core computing hardware, it is now possible for scientific researchers to run complex, computationally intensive software on affordable, in-house commodity hardware. Multi-core CPUs (central processing units) and GPUs (graphics processing units) are now commonplace in desktops and servers. Developers today have access to extremely powerful hardware that enables the execution of software that could previously only be run on expensive, massively parallel systems. It is no longer cost-prohibitive for an institution to build a parallel computing cluster consisting of commodity multi-core servers. In recent years, our research team has developed a distributed, multi-core computing system and used it to construct global 3D earth models using seismic tomography. Traditionally, computational limitations forced certain assumptions and shortcuts in the calculation of tomographic models; however, with the recent rapid growth in computational hardware, including faster CPUs, increased RAM, and the development of multi-core computers, we are now able to perform seismic tomography, 3D ray tracing, and seismic event location using distributed parallel algorithms running on commodity hardware, thereby eliminating the need for many of these shortcuts. We describe Node Resource Manager (NRM), a system we developed that leverages the capabilities of a parallel computing cluster. NRM is a software-based parallel computing management framework that works in tandem with the Java Parallel Processing Framework (JPPF, http://www.jppf.org/), a third-party library that provides a flexible and innovative way to take advantage of modern multi-core hardware. NRM enables multiple applications to use and share a common set of networked computers, regardless of their hardware platform or operating system. Using NRM, algorithms can be parallelized to run on multiple processing cores of a distributed computing cluster of servers and desktops, which results in a dramatic speedup in execution time. NRM is sufficiently generic to support applications in any domain, as long as the application is parallelizable (i.e., can be subdivided into multiple individual processing tasks). At present, NRM has been effective in decreasing the overall runtime of several algorithms: 1) the generation of a global 3D model of the compressional velocity distribution in the Earth using tomographic inversion, 2) the calculation of the model resolution matrix, model covariance matrix, and travel-time uncertainty for the aforementioned velocity model, and 3) the correlation of waveforms with archival data on a massive scale for seismic event detection. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
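
    The JPPF/NRM API itself is not reproduced here; the hedged sketch below only illustrates the underlying pattern the abstract relies on, namely dividing a parallelizable job into independent tasks and farming them out to a pool of workers, using Python's standard library rather than the Java framework named above.

        from concurrent.futures import ProcessPoolExecutor

        def trace_rays(task):
            """Stand-in for one independent unit of work, e.g. 3D ray tracing
            for a block of source-receiver pairs (hypothetical payload)."""
            block_id, pairs = task
            return block_id, [sum(p) for p in pairs]   # placeholder computation

        def run_job(pairs, n_blocks=8, workers=4):
            # Subdivide the job into independent tasks, the key requirement the
            # abstract states for NRM-style distributed execution.
            blocks = [(i, pairs[i::n_blocks]) for i in range(n_blocks)]
            with ProcessPoolExecutor(max_workers=workers) as pool:
                return dict(pool.map(trace_rays, blocks))

        if __name__ == "__main__":
            demo_pairs = [(i, i + 1) for i in range(1000)]
            print(len(run_job(demo_pairs)))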

  15. Production version of the extended NASA-Langley Vortex Lattice FORTRAN computer program. Volume 1: User's guide

    NASA Technical Reports Server (NTRS)

    Lamar, J. E.; Herbert, H. E.

    1982-01-01

    The latest production version, MARK IV, of the NASA-Langley vortex lattice computer program is summarized. All viable subcritical aerodynamic features of previous versions were retained. This version extends the previously documented program capabilities to four planforms and 400 panels, and enables the user to obtain vortex-flow aerodynamics on cambered planforms, flowfield properties off the configuration in attached flow, and planform longitudinal load distributions.
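
    For readers unfamiliar with the vortex lattice method, its computational core is a dense linear solve: an aerodynamic influence coefficient (AIC) matrix relates panel circulation strengths to the flow-tangency condition at each control point. The sketch below shows only that step, assuming the AIC matrix and normal-wash vector have already been assembled; it is illustrative and not part of the MARK IV code.

        import numpy as np

        def solve_circulations(aic, normal_wash):
            """Solve aic @ gamma = -normal_wash for panel circulation strengths,
            where aic[i, j] is the normal velocity induced at control point i by
            unit circulation on panel j, and normal_wash[i] is the freestream
            component normal to the surface at control point i."""
            return np.linalg.solve(aic, -normal_wash)

        def panel_lift(gamma, rho, v_inf, panel_span):
            """Kutta-Joukowski lift per panel: L_i = rho * V_inf * Gamma_i * b_i."""
            return rho * v_inf * gamma * panel_span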

  16. More IMPATIENT: A Gridding-Accelerated Toeplitz-based Strategy for Non-Cartesian High-Resolution 3D MRI on GPUs

    PubMed Central

    Gai, Jiading; Obeid, Nady; Holtrop, Joseph L.; Wu, Xiao-Long; Lam, Fan; Fu, Maojing; Haldar, Justin P.; Hwu, Wen-mei W.; Liang, Zhi-Pei; Sutton, Bradley P.

    2013-01-01

    Several recent methods have been proposed to obtain significant speed-ups in MRI image reconstruction by leveraging the computational power of GPUs. Previously, we implemented a GPU-based image reconstruction technique called the Illinois Massively Parallel Acquisition Toolkit for Image reconstruction with ENhanced Throughput in MRI (IMPATIENT MRI) for reconstructing data collected along arbitrary 3D trajectories. In this paper, we improve IMPATIENT by removing computational bottlenecks, using a gridding approach to accelerate the computation of various data structures needed by the previous routine. Further, we enhance the routine with capabilities for off-resonance correction and multi-sensor parallel imaging reconstruction. Through the incorporation of optimized gridding into our iterative reconstruction scheme, the improved GPU implementation provides speed-ups of more than a factor of 200 over the previous accelerated GPU code. PMID:23682203
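
    Gridding, the acceleration used here, resamples non-Cartesian k-space samples onto a Cartesian grid by convolving each sample onto its neighboring grid points with a small kernel. A minimal 1D illustration follows (it is not the IMPATIENT GPU code; the Gaussian kernel and sizes are arbitrary stand-ins for the Kaiser-Bessel kernels typically used).

        import numpy as np

        def grid_1d(kx, data, grid_size, width=4.0):
            """Spread each non-Cartesian sample onto nearby Cartesian grid points,
            weighted by a small convolution kernel."""
            grid = np.zeros(grid_size, dtype=complex)
            centers = (kx + 0.5) * (grid_size - 1)      # map k in [-0.5, 0.5) to grid indices
            half = int(width // 2)
            for c, d in zip(centers, data):
                lo, hi = int(np.floor(c)) - half, int(np.floor(c)) + half + 1
                for g in range(max(lo, 0), min(hi, grid_size)):
                    w = np.exp(-((g - c) ** 2) / (0.5 * width))   # illustrative kernel
                    grid[g] += w * d
            return grid

        # In IMPATIENT's setting the same idea is applied in 3D on the GPU to build
        # the data structures that previously dominated the reconstruction runtime.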

  17. Quantitative Computed Tomography (QCT) derived Bone Mineral Density (BMD) in finite element studies: a review of the literature.

    PubMed

    Knowles, Nikolas K; Reeves, Jacob M; Ferreira, Louis M

    2016-12-01

    Finite element modeling of human bone provides a powerful tool to evaluate a wide variety of outcomes in a highly repeatable and parametric manner. These models are most often derived from computed tomography data, with mechanical properties related to bone mineral density (BMD) estimated from the x-ray energy attenuation provided by these data. To increase accuracy, many researchers report the use of quantitative computed tomography (QCT), in which a calibration phantom is used during image acquisition to improve the estimation of BMD. Since model accuracy depends on the methods used in the calculation of BMD and the density-mechanical property relationships, it is important to use relationships developed for the same anatomical location and with the same scanner settings, as these may impact model accuracy. The purpose of this literature review is to report the relationships used in recent finite element (FE) studies to convert QCT equivalent density measures to the ash, apparent, and/or tissue densities used in common density-modulus relationships. For studies reporting experimental validation, the validation metrics and results are presented. Of the studies reviewed, 29% reported the use of a dipotassium phosphate (K₂HPO₄) phantom, 47% a hydroxyapatite (HA) phantom, 13% did not report phantom type, 7% reported use of both K₂HPO₄ and HA phantoms, and 4% used alternate phantom types. Scanner type and/or settings were omitted or only partially reported in 31% of studies. The majority of studies used densitometric and/or density-modulus relationships derived from different anatomical locations scanned in different scanners with different scanner settings. The methods used to derive the various densitometric relationships are reported, and recommendations are provided toward the standardization of reporting metrics. This review assessed the current state of QCT-based FE modeling with clinical scanners. It was found that previously developed densitometric relationships vary by anatomical location, scanner type, and settings. Reporting of all parameters used when referring to previously developed relationships, or when developing new relationships, may increase the accuracy and repeatability of future FE models.
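
    As a concrete illustration of the conversion chain the review surveys, the sketch below maps phantom-calibrated Hounsfield units to an equivalent density, then to apparent density, and finally to an elastic modulus via a power law. All coefficients are placeholders; as the review emphasizes, real values are specific to the anatomical site, phantom, scanner, and scan settings.

        def hu_to_equivalent_density(hu, slope=0.0008, intercept=-0.002):
            """Phantom calibration: rho_eq (g/cm^3) = slope * HU + intercept.
            Slope and intercept are illustrative; in practice they come from the
            calibration phantom scanned with the subject."""
            return slope * hu + intercept

        def equivalent_to_apparent(rho_eq, a=0.0, b=1.2):
            """Convert equivalent density to apparent density (placeholder linear map)."""
            return a + b * rho_eq

        def apparent_to_modulus(rho_app, c=6850.0, p=1.49):
            """Power-law density-modulus relationship E (MPa) = c * rho_app ** p.
            The coefficient and exponent vary by anatomical location in the literature."""
            return c * rho_app ** p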

  18. A collaborative environment for developing and validating predictive tools for protein biophysical characteristics

    NASA Astrophysics Data System (ADS)

    Johnston, Michael A.; Farrell, Damien; Nielsen, Jens Erik

    2012-04-01

    The exchange of information between experimentalists and theoreticians is crucial to improving the predictive ability of theoretical methods and hence our understanding of the related biology. However, many barriers prevent the flow of information between the two disciplines. Enabling effective collaboration requires that experimentalists can easily apply computational tools to their data and share their data with theoreticians, and that both the experimental data and computational results are accessible to the wider community. We present a prototype collaborative environment for developing and validating predictive tools for protein biophysical characteristics. The environment is built on two central components: a new Python-based integration module, which allows theoreticians to provide and manage remote access to their programs, and PEATDB, a program for storing and sharing experimental data from protein biophysical characterisation studies. We demonstrate our approach by integrating PEATSA, a web-based service for predicting changes in protein biophysical characteristics, into PEATDB. Furthermore, we illustrate how the resulting environment aids method development using the Potapov dataset of experimentally measured ΔΔGfold values, previously employed to validate and train protein stability prediction algorithms.
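
    The integration module described above is, in essence, a registry that lets a database front end invoke remotely hosted prediction programs and collect their results. A purely illustrative sketch of that pattern (not the PEATDB/PEATSA API; the endpoint and payload names are hypothetical) follows.

        import json
        from urllib import request

        class ToolRegistry:
            """Registry mapping tool names to remote endpoints, so that an
            experimental-data front end can launch predictions and pull results."""
            def __init__(self):
                self._tools = {}

            def register(self, name, url):
                self._tools[name] = url

            def run(self, name, payload):
                # POST the job description (e.g., a structure plus mutations) and
                # return the parsed prediction results.
                req = request.Request(
                    self._tools[name],
                    data=json.dumps(payload).encode(),
                    headers={"Content-Type": "application/json"},
                )
                with request.urlopen(req) as resp:
                    return json.load(resp)

        # Hypothetical usage:
        # registry = ToolRegistry()
        # registry.register("stability_predictor", "https://example.org/api/predict")
        # results = registry.run("stability_predictor", {"pdb": "...", "mutations": ["A23G"]})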

  19. Efficient generation of connectivity in neuronal networks from simulator-independent descriptions

    PubMed Central

    Djurfeldt, Mikael; Davison, Andrew P.; Eppler, Jochen M.

    2014-01-01

    Simulator-independent descriptions of connectivity in neuronal networks promise greater ease of model sharing, improved reproducibility of simulation results, and reduced programming effort for computational neuroscientists. However, until now, enabling the use of such descriptions in a given simulator in a computationally efficient way has entailed considerable work for simulator developers, which must be repeated for each new connectivity-generating library that is developed. We have developed a generic connection generator interface that provides a standard way to connect a connectivity-generating library to a simulator, such that one library can easily be replaced by another according to the modeler's needs. We have used the connection generator interface to connect C++ and Python implementations of the previously described connection-set algebra to the NEST simulator. We also demonstrate how the simulator-independent modeling framework PyNN can transparently take advantage of this, passing a connection description through to the simulator layer for rapid processing in C++ where a simulator supports the connection generator interface, and falling back to slower iteration in Python otherwise. A set of benchmarks demonstrates the good performance of the interface. PMID:24795620
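
    Conceptually, a connection generator is an object that, given source and target index sets, yields the (source, target, parameter) tuples the simulator should instantiate, leaving only the final connection call simulator-specific. A minimal sketch of such an interface (an illustration, not the actual connection generator or connection-set algebra API) follows.

        import random

        class RandomConnectionGenerator:
            """Simulator-independent description of pairwise-Bernoulli connectivity;
            a simulator iterates over it to create connections in its own data structures."""
            def __init__(self, p, weight=1.0, delay=1.0, seed=0):
                self.p, self.weight, self.delay = p, weight, delay
                self.rng = random.Random(seed)

            def connections(self, sources, targets):
                for s in sources:
                    for t in targets:
                        if self.rng.random() < self.p:
                            yield s, t, self.weight, self.delay

        def build_network(simulator_connect, generator, sources, targets):
            """The only simulator-specific piece is simulator_connect(s, t, w, d)."""
            for s, t, w, d in generator.connections(sources, targets):
                simulator_connect(s, t, w, d)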

  20. Critical Low-Noise Technologies Being Developed for Engine Noise Reduction Systems Subproject

    NASA Technical Reports Server (NTRS)

    Grady, Joseph E.; Civinskas, Kestutis C.

    2004-01-01

    NASA's previous Advanced Subsonic Technology (AST) Noise Reduction Program delivered the initial technologies for meeting a 10-year goal of a 10-dB reduction in total aircraft system noise. Technology Readiness Levels achieved for the engine-noise-reduction technologies ranged from 4 (rig scale) to 6 (engine demonstration). The current Quiet Aircraft Technology (QAT) project is building on those AST accomplishments to achieve the additional noise reduction needed to meet the Aerospace Technology Enterprise's 10-year goal, again validated through a combination of laboratory rig and engine demonstration tests. In order to meet the Aerospace Technology Enterprise goal for future aircraft of a 50-percent reduction in perceived noise level, reductions of 4 dB are needed in both fan and jet noise. The primary objectives of the Engine Noise Reduction Systems (ENRS) subproject are, therefore, to develop technologies that reduce both fan and jet noise by 4 dB, to demonstrate these technologies in engine tests, and to develop and experimentally validate Computational Aero Acoustics (CAA) computer codes that will improve our ability to predict engine noise.
