Science.gov

Sample records for accounting functions computer

  1. Accounting & Computing Curriculum Guide.

    ERIC Educational Resources Information Center

    Avani, Nathan T.; And Others

    This curriculum guide consists of materials for use in teaching a competency-based accounting and computing course that is designed to prepare students for employability in the following occupational areas: inventory control clerk, invoice clerk, payroll clerk, traffic clerk, general ledger bookkeeper, accounting clerk, account information clerk,…

  2. Teaching Accounting with Computers.

    ERIC Educational Resources Information Center

    Shaoul, Jean

    This paper addresses the numerous ways that computers may be used to enhance the teaching of accounting and business topics. It focuses on the pedagogical use of spreadsheet software to improve the conceptual coverage of accounting principles and practice, increase student understanding by involvement in the solution process, and reduce the amount…

  3. Rayleigh radiance computations for satellite remote sensing: accounting for the effect of sensor spectral response function.

    PubMed

    Wang, Menghua

    2016-05-30

    To understand and assess the effect of the sensor spectral response function (SRF) on the accuracy of the top of the atmosphere (TOA) Rayleigh-scattering radiance computation, new TOA Rayleigh radiance lookup tables (LUTs) over global oceans and inland waters have been generated. The new Rayleigh LUTs include spectral coverage of 335-2555 nm, all possible solar-sensor geometries, and surface wind speeds of 0-30 m/s. Using the new Rayleigh LUTs, the sensor SRF effect on the accuracy of the TOA Rayleigh radiance computation has been evaluated for spectral bands of the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (SNPP) satellite and the Joint Polar Satellite System (JPSS)-1, showing some important uncertainties for VIIRS-SNPP particularly for large solar- and/or sensor-zenith angles as well as for large Rayleigh optical thicknesses (i.e., short wavelengths) and bands with broad spectral bandwidths. To accurately account for the sensor SRF effect, a new correction algorithm has been developed for VIIRS spectral bands, which improves the TOA Rayleigh radiance accuracy to ~0.01% even for the large solar-zenith angles of 70°-80°, compared with the error of ~0.7% without applying the correction for the VIIRS-SNPP 410 nm band. The same methodology that accounts for the sensor SRF effect on the Rayleigh radiance computation can be used for other satellite sensors. In addition, with the new Rayleigh LUTs, the effect of surface atmospheric pressure variation on the TOA Rayleigh radiance computation can be calculated precisely, and no specific atmospheric pressure correction algorithm is needed. There are some other important applications and advantages to using the new Rayleigh LUTs for satellite remote sensing, including an efficient and accurate TOA Rayleigh radiance computation for hyperspectral satellite remote sensing, detector-based TOA Rayleigh radiance computation, Rayleigh radiance calculations for high altitude
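The SRF effect the correction targets can be illustrated with a toy band average: because Rayleigh optical thickness falls off steeply with wavelength (roughly as lambda^-4), a finite-width band does not "see" the value at its nominal centre wavelength. The triangular SRF and the simple lambda^-4 law below are illustrative assumptions, not the paper's LUTs or the actual VIIRS SRFs.

```python
def srf_weighted_average(values, srf):
    """SRF-weighted mean on a uniform wavelength grid: sum(v*s)/sum(s)."""
    return sum(v * s for v, s in zip(values, srf)) / sum(srf)

wl = [400 + i for i in range(21)]                        # 400-420 nm, 1 nm grid
tau = [0.1 * (550.0 / w) ** 4 for w in wl]               # illustrative lambda^-4 law
srf = [max(0.0, 1.0 - abs(w - 410) / 10.0) for w in wl]  # triangular SRF centred at 410 nm

band_tau = srf_weighted_average(tau, srf)
centre_tau = 0.1 * (550.0 / 410.0) ** 4
print(band_tau > centre_tau)  # True: the band "sees" more Rayleigh than its centre wavelength
```

The gap between `band_tau` and `centre_tau` is the kind of band-averaging error that grows with bandwidth and Rayleigh optical thickness, consistent with the abstract's finding that short-wavelength, broad bands are most affected.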

  5. Vocational Accounting and Computing Programs.

    ERIC Educational Resources Information Center

    Avani, Nathan T.

    1986-01-01

    Describes an "Accounting and Computing" program in Michigan that emphasizes computerized accounting procedures. This article describes the program curriculum and duty areas (such as handling accounts receivable), presents a list of sample tasks in each duty area, and specifies components of each task. Computer equipment necessary for this program…

  6. Integrating Computer Concepts into Principles of Accounting.

    ERIC Educational Resources Information Center

    Beck, Henry J.; Parrish, Roy James, Jr.

    A package of instructional materials for an undergraduate principles of accounting course at Danville Community College was developed based upon the following assumptions: (1) the principles of accounting student does not need to be able to write computer programs; (2) computerized accounting concepts should be presented in this course; (3)…

  7. Space shuttle configuration accounting functional design specification

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An analysis is presented of the requirements for an on-line automated system which must be capable of tracking the status of requirements and engineering changes and of providing accurate and timely records. The functional design specification provides the definition, description, and character length of the required data elements and the interrelationship of data elements to adequately track, display, and report the status of active configuration changes. As changes to the space shuttle program levels II and III configuration are proposed, evaluated, and dispositioned, it is the function of the configuration management office to maintain records regarding changes to the baseline and to track and report the status of those changes. The configuration accounting system will consist of a combination of computers, computer terminals, software, and procedures, all of which are designed to store, retrieve, display, and process information required to track proposed and approved engineering changes to maintain baseline documentation of the space shuttle program levels II and III.

  8. Computational complexity of Boolean functions

    NASA Astrophysics Data System (ADS)

    Korshunov, Aleksei D.

    2012-02-01

    Boolean functions are among the fundamental objects of discrete mathematics, especially in those of its subdisciplines which fall under mathematical logic and mathematical cybernetics. The language of Boolean functions is convenient for describing the operation of many discrete systems such as contact networks, Boolean circuits, branching programs, and some others. An important parameter of discrete systems of this kind is their complexity. This characteristic has been actively investigated starting from Shannon's works. There is a large body of scientific literature presenting many fundamental results. The purpose of this survey is to give an account of the main results over the last sixty years related to the complexity of computation (realization) of Boolean functions by contact networks, Boolean circuits, and Boolean circuits without branching. Bibliography: 165 titles.

  9. A Study of Work Values of Computer Education in Accounting.

    ERIC Educational Resources Information Center

    Ahadiat, Nasrollah

    1992-01-01

    Describes a survey of students at a large university and a random sample of faculty at accounting accredited schools that investigated the use of microcomputers in accounting curricula. Topics discussed include work values and computer education, student and teacher attitudes toward computers, and differences between upper division and lower…

  10. A Computational Account of Bilingual Aphasia Rehabilitation

    ERIC Educational Resources Information Center

    Kiran, Swathi; Grasemann, Uli; Sandberg, Chaleece; Miikkulainen, Risto

    2013-01-01

    Current research on bilingual aphasia highlights the paucity of recommendations for optimal rehabilitation for bilingual aphasic patients (Edmonds & Kiran, 2006; Roberts & Kiran, 2007). In this paper, we have developed a computational model to simulate an English-Spanish bilingual language system in which language representations can vary by age…

  11. A Computational Account of Bilingual Aphasia Rehabilitation.

    PubMed

    Kiran, Swathi; Grasemann, Uli; Sandberg, Chaleece; Miikkulainen, Risto

    2013-04-01

    Current research on bilingual aphasia highlights the paucity of recommendations for optimal rehabilitation for bilingual aphasic patients (Roberts & Kiran, 2007; Edmonds & Kiran, 2006). In this paper, we have developed a computational model to simulate an English-Spanish bilingual language system in which language representations can vary by age of acquisition (AoA) and relative proficiency in the two languages to model individual participants. This model is subsequently lesioned by varying connection strengths between the semantic and phonological networks and retrained based on individual patient demographic information to evaluate whether or not the model's prediction of rehabilitation matched the actual treatment outcome. In most cases the model comes close to the target performance subsequent to language therapy in the language trained, indicating the validity of this model in simulating rehabilitation of naming impairment in bilingual aphasia. Additionally, the amount of cross-language transfer is limited both in the patient performance and in the model's predictions and is dependent on that specific patient's AoA, language exposure and language impairment. The model also suggests how well alternative treatment scenarios would have fared, including some cases where the alternative would have done better. Overall, the study suggests how computational modeling could be used in the future to design customized treatment recipes that result in better recovery than is currently possible.

  12. The Organizational Account of Function is an Etiological Account of Function.

    PubMed

    Artiga, Marc; Martínez, Manolo

    2016-06-01

    The debate on the notion of function has been historically dominated by dispositional and etiological accounts, but recently a third contender has gained prominence: the organizational account. This original theory of function is intended to offer an alternative account based on the notion of self-maintaining system. However, there is a set of cases where organizational accounts seem to generate counterintuitive results. These cases involve cross-generational traits, that is, traits that do not contribute in any relevant way to the self-maintenance of the organism carrying them, but instead have very important effects on organisms that belong to the next generation. We argue that any plausible solution to the problem of cross-generational traits shows that the organizational account just is a version of the etiological theory and, furthermore, that it does not provide any substantive advantage over standard etiological theories of function.

  13. Network Coding for Function Computation

    ERIC Educational Resources Information Center

    Appuswamy, Rathinakumar

    2011-01-01

    In this dissertation, the following "network computing problem" is considered. Source nodes in a directed acyclic network generate independent messages and a single receiver node computes a target function f of the messages. The objective is to maximize the average number of times f can be computed per network usage, i.e., the "computing…

  14. Program Computes Thermodynamic Functions

    NASA Technical Reports Server (NTRS)

    Mcbride, Bonnie J.; Gordon, Sanford

    1994-01-01

    PAC91 is the latest in the PAC (Properties and Coefficients) series. Its two principal features are (1) generating theoretical thermodynamic functions from molecular constants and (2) least-squares fitting of these functions to empirical equations. PAC91 is written in FORTRAN 77 to be machine-independent.
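PAC91's actual empirical forms (the NASA thermodynamic polynomials) are not reproduced here, but step (2), least-squares fitting of a tabulated thermodynamic function, can be sketched with a plain polynomial fit via the normal equations. The heat-capacity data below are invented for the example.

```python
def polyfit_normal_eqs(xs, ys, degree):
    """Least-squares polynomial coefficients c0..cd, solved via the normal equations."""
    n = degree + 1
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * p for a, p in zip(A[r], A[col])]
            b[r] -= f * b[col]
    c = [0.0] * n
    for r in range(n - 1, -1, -1):
        c[r] = (b[r] - sum(A[r][j] * c[j] for j in range(r + 1, n))) / A[r][r]
    return c

T = [300.0, 500.0, 700.0, 900.0, 1100.0]     # temperatures, K
Cp = [29.1 + 0.005 * t for t in T]           # a toy "thermodynamic function"
c = polyfit_normal_eqs(T, Cp, 1)
print(c)  # recovers roughly [29.1, 0.005], the coefficients that generated the data
```

For the higher-degree fits a production code performs, an orthogonal-basis or QR approach is numerically preferable to raw normal equations; this sketch only illustrates the fitting step itself.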

  15. Common Accounting System for Monitoring the ATLAS Distributed Computing Resources

    NASA Astrophysics Data System (ADS)

    Karavakis, E.; Andreeva, J.; Campana, S.; Gayazov, S.; Jezequel, S.; Saiz, P.; Sargsyan, L.; Schovancova, J.; Ueda, I.; Atlas Collaboration

    2014-06-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within the ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources, either generic or ATLAS-specific. This set of tools provides high-quality, scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  16. PC-DYMAC: Personal Computer---DYnamic Materials ACcounting

    SciTech Connect

    Jackson, B.G.

    1989-11-01

    This manual was designed to provide complete documentation for the computer system used by the EBR-II Fuels and Materials Department, Argonne National Laboratory-West (ANL-W) for accountability of special nuclear materials (SNM). This document includes background information on the operation of the Fuel Manufacturing Facility (FMF), instructions on computer operations in relation to production, and a detailed manual for DYMAC operation. 60 figs.

  17. Computers Can Help Student Retention in Introductory College Accounting.

    ERIC Educational Resources Information Center

    Price, Richard L.; Murvin, Harry J.

    1992-01-01

    Almost all students in a study of an integrated instructional approach indicated that using a computer and workbook was very helpful in understanding financial accounting. A related study found that students with lower reading levels benefited most from this approach, and withdrawal dropped from 10 percent to 2 percent. (JOW)

  18. Computational Models for Neuromuscular Function

    PubMed Central

    Valero-Cuevas, Francisco J.; Hoffmann, Heiko; Kurse, Manish U.; Kutch, Jason J.; Theodorou, Evangelos A.

    2011-01-01

    Computational models of the neuromuscular system hold the potential to allow us to reach a deeper understanding of neuromuscular function and clinical rehabilitation by complementing experimentation. By serving as a means to distill and explore specific hypotheses, computational models emerge from prior experimental data and motivate future experimental work. Here we review computational tools used to understand neuromuscular function including musculoskeletal modeling, machine learning, control theory, and statistical model analysis. We conclude that these tools, when used in combination, have the potential to further our understanding of neuromuscular function by serving as a rigorous means to test scientific hypotheses in ways that complement and leverage experimental data. PMID:21687779

  19. Automatic computation of transfer functions

    SciTech Connect

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
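The netlist-to-transfer-function idea above can be sketched with nodal analysis: stamp each element's admittance into a matrix and solve for the output node voltage at a given complex frequency. The netlist format below is invented for the example, and only resistors and capacitors with an ideal unit-voltage drive are handled.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (complex-valued)."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            M[r] = [mr - f * mc for mr, mc in zip(M[r], M[col])]
    x = [0j] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def transfer(netlist, s, out_node, n_nodes):
    """H(s) = V(out_node) with node 1 held at 1 V (node 0 is ground)."""
    G = [[0j] * n_nodes for _ in range(n_nodes)]
    for kind, na, nb, val in netlist:
        y = 1.0 / val if kind == "R" else s * val    # R or C admittance
        for n1, n2 in ((na, nb), (nb, na)):
            if n1:                                   # ground row/column is omitted
                G[n1 - 1][n1 - 1] += y
                if n2:
                    G[n1 - 1][n2 - 1] -= y
    # overwrite node 1's equation with V1 = 1 (unit drive)
    G[0] = [1 + 0j] + [0j] * (n_nodes - 1)
    rhs = [1 + 0j] + [0j] * (n_nodes - 1)
    return solve(G, rhs)[out_node - 1]

# RC low-pass: 1 kOhm from node 1 to node 2, 1 uF from node 2 to ground.
net = [("R", 1, 2, 1e3), ("C", 2, 0, 1e-6)]
H = transfer(net, s=1j * 1e3, out_node=2, n_nodes=2)  # evaluated at the corner frequency
print(abs(H))  # ~0.707, matching 1/|1 + j*w*R*C|
```

Evaluating `transfer` over a sweep of `s = jw` values traces out the frequency response; a symbolic treatment, as the patent describes for general electrical, mechanical, and electrochemical systems, would keep `s` as an indeterminate instead.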

  20. Computer Experiments for Function Approximations

    SciTech Connect

    Chang, A; Izmailov, I; Rizzo, S; Wynter, S; Alexandrov, O; Tong, C

    2007-10-15

    This research project falls in the domain of response surface methodology, which seeks cost-effective ways to accurately fit an approximate function to experimental data. Modeling and computer simulation are essential tools in modern science and engineering. A computer simulation can be viewed as a function that receives input from a given parameter space and produces an output. Running the simulation repeatedly amounts to an equivalent number of function evaluations, and for complex models, such function evaluations can be very time-consuming. It is then of paramount importance to intelligently choose a relatively small set of sample points in the parameter space at which to evaluate the given function, and then use this information to construct a surrogate function that is close to the original function and takes little time to evaluate. This study was divided into two parts. The first part consisted of comparing four sampling methods and two function approximation methods in terms of efficiency and accuracy for simple test functions. The sampling methods used were Monte Carlo, Quasi-Random LPτ, Maximin Latin Hypercubes, and Orthogonal-Array-Based Latin Hypercubes. The function approximation methods utilized were Multivariate Adaptive Regression Splines (MARS) and Support Vector Machines (SVM). The second part of the study concerned adaptive sampling methods with a focus on creating useful sets of sample points specifically for monotonic functions, functions with a single minimum and functions with a bounded first derivative.
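Of the four sampling strategies compared, basic Latin hypercube sampling is simple enough to sketch: each dimension is cut into as many strata as there are samples, and each stratum receives exactly one point. (The maximin and orthogonal-array variants add further structure, and the MARS/SVM surrogate fitting needs external libraries, so both are omitted.)

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """One point per stratum in each dimension: stratified, space-filling samples in [0,1)^d."""
    rng = random.Random(seed)
    points = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        # a random permutation assigns each sample its own stratum [i/n, (i+1)/n)
        perm = list(range(n_samples))
        rng.shuffle(perm)
        for i in range(n_samples):
            points[i][d] = (perm[i] + rng.random()) / n_samples
    return points

pts = latin_hypercube(10, 2)
# Defining property: projected onto either axis, no two samples share a stratum.
print(sorted(int(p[0] * 10) for p in pts))  # [0, 1, ..., 9], each stratum exactly once
```

Compared with plain Monte Carlo, this guarantees good one-dimensional coverage at the same budget, which is why Latin hypercube designs are a common default for expensive simulations.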

  1. Neural computation of arithmetic functions

    NASA Technical Reports Server (NTRS)

    Siu, Kai-Yeung; Bruck, Jehoshua

    1990-01-01

    An area of application of neural networks is considered. A neuron is modeled as a linear threshold gate, and the network architecture considered is the layered feedforward network. It is shown how common arithmetic functions such as multiplication and sorting can be efficiently computed in a shallow neural network. Some known results are improved by showing that the product of two n-bit numbers and sorting of n n-bit numbers can be computed by a polynomial-size neural network using only four and five unit delays, respectively. Moreover, the weights of each threshold element in the neural networks require O(log n)-bit (instead of n-bit) accuracy. These results can be extended to more complicated functions such as multiple products, division, rational functions, and approximation of analytic functions.
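The neuron model the abstract names, a linear threshold gate, is easy to make concrete: a single gate already computes the comparison of two n-bit numbers, one of the primitives behind such shallow arithmetic circuits. Note this naive construction uses n-bit weights; the paper's stronger result is that O(log n)-bit weights suffice.

```python
def threshold_gate(weights, threshold, inputs):
    """Fires (returns 1) iff the weighted sum of the 0/1 inputs reaches the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def greater_than(n):
    """One gate computing [X > Y] for n-bit numbers given as bit lists, most significant first."""
    weights = [2 ** i for i in range(n - 1, -1, -1)] + [-(2 ** i) for i in range(n - 1, -1, -1)]
    # weighted sum is exactly X - Y, so threshold 1 tests X - Y >= 1, i.e. X > Y
    return lambda xbits, ybits: threshold_gate(weights, 1, xbits + ybits)

gt = greater_than(4)
print(gt([1, 1, 0, 1], [1, 0, 1, 1]))  # 1, since 13 > 11
print(gt([1, 0, 1, 1], [1, 1, 0, 1]))  # 0, since 11 < 13
```

Stacking a few layers of such gates, with carefully chosen weights, is what yields the constant-depth multiplication and sorting circuits the abstract describes.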

  2. An APEL Tool Based CPU Usage Accounting Infrastructure for Large Scale Computing Grids

    NASA Astrophysics Data System (ADS)

    Jiang, Ming; Novales, Cristina Del Cano; Mathieu, Gilles; Casson, John; Rogers, William; Gordon, John

    The APEL (Accounting Processor for Event Logs) is the fundamental tool for the CPU usage accounting infrastructure deployed within the WLCG and EGEE Grids. In these Grids, jobs are submitted by users to computing resources via a Grid Resource Broker (e.g. gLite Workload Management System). As a log processing tool, APEL interprets Grid gatekeeper logs (e.g. Globus) and batch system logs (e.g. PBS, LSF, SGE and Condor) to produce CPU job accounting records identified with Grid identities. These records provide a complete description of the usage of computing resources by users' jobs. APEL publishes accounting records into an accounting record repository at a Grid Operations Centre (GOC) for access from a GUI web tool. The functions of log file parsing, record generation and publication are implemented by the APEL Parser, APEL Core, and APEL Publisher components, respectively. Within the distributed accounting infrastructure, accounting records are transported from APEL Publishers at Grid sites to either a regionalised accounting system or the central one, by choice, via a common ActiveMQ message broker network. This provides an open transport layer for other accounting systems to publish relevant accounting data to a central accounting repository via a unified interface provided by an APEL Publisher, and will also give regional/National Grid Initiative (NGI) Grids flexibility in their choice of accounting system. The robust and secure delivery of accounting record messages at the NGI level, and between NGI accounting instances and the central one, is achieved by using configurable APEL Publishers and an ActiveMQ message broker network.
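The parser-to-record step described above can be sketched as follows. The log-line format here is invented for the illustration; the real APEL Parser handles the actual PBS/LSF/SGE/Condor and gatekeeper log formats, and the record fields are simplified stand-ins.

```python
import re

# Hypothetical batch-log line format for the sketch, not a real PBS/LSF format.
LOG_LINE = re.compile(
    r"job=(?P<job>\S+) user=(?P<dn>\S+) cpu=(?P<cpu>\d+) wall=(?P<wall>\d+)"
)

def parse_batch_log(lines):
    """Turn raw batch-log lines into per-job usage records (the APEL Parser's role)."""
    records = []
    for line in lines:
        m = LOG_LINE.search(line)
        if m:                                   # malformed lines are skipped
            records.append({
                "JobID": m.group("job"),
                "GridDN": m.group("dn"),        # the Grid identity the record is tied to
                "CpuSeconds": int(m.group("cpu")),
                "WallSeconds": int(m.group("wall")),
            })
    return records

logs = [
    "2014-06-01 job=1042 user=/DC=org/CN=alice cpu=3600 wall=4000",
    "2014-06-01 malformed line",
    "2014-06-01 job=1043 user=/DC=org/CN=bob cpu=10 wall=12",
]
recs = parse_batch_log(logs)
print(len(recs), recs[0]["CpuSeconds"])  # 2 records parsed; 3600 CPU seconds for the first
```

In the real infrastructure these records would then be serialised and handed to the Publisher component for transport over the ActiveMQ broker network rather than kept in memory.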

  3. Computer program for the automated attendance accounting system

    NASA Technical Reports Server (NTRS)

    Poulson, P.; Rasmusson, C.

    1971-01-01

    The automated attendance accounting system (AAAS) was developed under the auspices of the Space Technology Applications Program. The task is basically the adaptation of a small digital computer, coupled with specially developed pushbutton terminals located in school classrooms and offices for the purpose of taking daily attendance, maintaining complete attendance records, and producing partial and summary reports. Especially developed for high schools, the system is intended to relieve both teachers and office personnel from the time-consuming and dreary task of recording and analyzing the myriad classroom attendance data collected throughout the semester. In addition, since many school district budgets are related to student attendance, the increase in accounting accuracy is expected to augment district income. A major component of this system is the real-time AAAS software system, which is described.

  4. FUNCTION GENERATOR FOR ANALOGUE COMPUTERS

    DOEpatents

    Skramstad, H.K.; Wright, J.H.; Taback, L.

    1961-12-12

    An improved analogue computer is designed which can be used to determine the final ground position of radioactive fallout particles in an atomic cloud. The computer determines the fallout pattern on the basis of known wind velocity and direction at various altitudes, and intensity of radioactivity in the mushroom cloud as a function of particle size and initial height in the cloud. The output is then displayed on a cathode-ray tube so that the average or total luminance of the tube screen at any point represents the intensity of radioactive fallout at the geographical location represented by that point. (AEC)

  5. Metacognition: computation, biology and function.

    PubMed

    Fleming, Stephen M; Dolan, Raymond J; Frith, Christopher D

    2012-05-19

    Many complex systems maintain a self-referential check and balance. In animals, such reflective monitoring and control processes have been grouped under the rubric of metacognition. In this introductory article to a Theme Issue on metacognition, we review recent and rapidly progressing developments from neuroscience, cognitive psychology, computer science and philosophy of mind. While each of these areas is represented in detail by individual contributions to the volume, we take this opportunity to draw links between disciplines, and highlight areas where further integration is needed. Specifically, we cover the definition, measurement, neurobiology and possible functions of metacognition, and assess the relationship between metacognition and consciousness. We propose a framework in which level of representation, order of behaviour and access consciousness are orthogonal dimensions of the conceptual landscape. PMID:22492746

  6. Metacognition: computation, biology and function

    PubMed Central

    Fleming, Stephen M.; Dolan, Raymond J.; Frith, Christopher D.

    2012-01-01

    Many complex systems maintain a self-referential check and balance. In animals, such reflective monitoring and control processes have been grouped under the rubric of metacognition. In this introductory article to a Theme Issue on metacognition, we review recent and rapidly progressing developments from neuroscience, cognitive psychology, computer science and philosophy of mind. While each of these areas is represented in detail by individual contributions to the volume, we take this opportunity to draw links between disciplines, and highlight areas where further integration is needed. Specifically, we cover the definition, measurement, neurobiology and possible functions of metacognition, and assess the relationship between metacognition and consciousness. We propose a framework in which level of representation, order of behaviour and access consciousness are orthogonal dimensions of the conceptual landscape. PMID:22492746

  7. Integration of a Computer Application in a First Year Accounting Curriculum: An Evaluation of Student Attitudes

    ERIC Educational Resources Information Center

    Laing, Gregory Kenneth; Perrin, Ronald William

    2012-01-01

    This paper presents the findings of a field study conducted to ascertain the perceptions of first year accounting students concerning the integration of computer applications in the accounting curriculum. The results indicate that both student cohorts perceived the computer as a valuable educational tool. The use of computers to enhance the…

  8. Genre Analysis of Tax Computation Letters: How and Why Tax Accountants Write the Way They Do

    ERIC Educational Resources Information Center

    Flowerdew, John; Wan, Alina

    2006-01-01

    This study is a genre analysis which explores the specific discourse community of tax accountants. Tax computation letters from one international accounting firm in Hong Kong were analyzed and compared. To probe deeper into the tax accounting discourse community, a group of tax accountants from the same firm was observed and questioned. The texts…

  9. Computation of generating functions for biological molecules

    SciTech Connect

    Howell, J.A.; Smith, T.F.; Waterman, M.S.

    1980-08-01

    The object of this paper is to give algorithms and techniques for computing generating functions of certain RNA configurations. Combinatorics and symbolic computation are utilized to calculate the generating functions for small RNA molecules. From these generating functions, it is possible to obtain information about the bonding and structure of the molecules. Specific examples of interest to biology are given and discussed.
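The paper's algorithms are not reproduced here, but the combinatorial object they manipulate can be sketched: the coefficients of the simplest such generating function count pseudoknot-free secondary structures. A minimal version of the classic recursion (parameter names are ours):

```python
def structure_counts(n_max, m=1):
    """S[n] = number of secondary structures on n bases, hairpin loops >= m unpaired bases."""
    S = [1] * (n_max + 1)                    # strands too short to hold any pair
    for n in range(m + 2, n_max + 1):
        # base n is either unpaired (S[n-1]), or paired with base j, splitting the
        # strand into an inside part (length n-1-j) and an outside part (length j-1)
        S[n] = S[n - 1] + sum(S[j - 1] * S[n - 1 - j] for j in range(1, n - m))
    return S

print(structure_counts(8))  # [1, 1, 1, 2, 4, 8, 17, 37, 82], the Stein-Waterman "RNA numbers"
```

The generating function is then the formal power series with these counts as coefficients; weighting each pair and unpaired base by symbolic variables, as the paper does, refines the count into bonding and structure information.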

  10. Techniques for developing reliable and functional materials control and accounting software

    SciTech Connect

    Barlich, G.

    1988-01-01

    The media has increasingly focused on failures of computer systems resulting in financial, material, and other losses and on systems failing to function as advertised. Unfortunately, such failures with equally disturbing losses are possible in computer systems providing materials control and accounting (MC&A) functions. Major improvements in the reliability and correctness of systems are possible with disciplined design and development techniques applied during software development. This paper describes some of the techniques used in the Safeguard Systems Group at Los Alamos National Laboratory for various MC&A systems. 9 refs.


  11. Computing functions by approximating the input

    NASA Astrophysics Data System (ADS)

    Goldberg, Mayer

    2012-12-01

    In computing real-valued functions, it is ordinarily assumed that the input to the function is known, and it is the output that we need to approximate. In this work, we take the opposite approach: we show how to compute the values of some transcendental functions by approximating the input to these functions, and obtaining exact answers for their output. Our approach assumes only the most rudimentary knowledge of algebra and trigonometry, and makes no use of calculus.
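The flavour of this approach can be sketched with an example of our own (not necessarily one from the paper): estimating log10(2) using only exact integer arithmetic. The input 2^a lands near a power of 10, and its digit count, an exact integer answer, pins down a·log10(2) to within one unit.

```python
def log10_of_2(a):
    """Estimate log10(2): the digit count of 2**a equals floor(a*log10(2)) + 1."""
    digits = len(str(2 ** a))   # exact big-integer computation, no floating point
    return (digits - 1) / a     # within 1/a of the true value

for a in (10, 100, 10_000):
    print(a, log10_of_2(a))
# converges toward 0.3010299..., using nothing beyond elementary arithmetic
```

For instance, 2^10 = 1024 has 4 digits, giving the classic estimate log10(2) ≈ 3/10; larger exponents tighten the bound without ever invoking calculus, in keeping with the paper's stated prerequisites.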

  12. 49 CFR 1242.46 - Computers and data processing equipment (account XX-27-46).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 9 2012-10-01 2012-10-01 false Computers and data processing equipment (account XX-27-46). 1242.46 Section 1242.46 Transportation Other Regulations Relating to Transportation... RAILROADS 1 Operating Expenses-Equipment § 1242.46 Computers and data processing equipment (account...

  13. 49 CFR 1242.46 - Computers and data processing equipment (account XX-27-46).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 9 2011-10-01 2011-10-01 false Computers and data processing equipment (account XX-27-46). 1242.46 Section 1242.46 Transportation Other Regulations Relating to Transportation... RAILROADS 1 Operating Expenses-Equipment § 1242.46 Computers and data processing equipment (account...

  14. 49 CFR 1242.46 - Computers and data processing equipment (account XX-27-46).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 9 2013-10-01 2013-10-01 false Computers and data processing equipment (account XX-27-46). 1242.46 Section 1242.46 Transportation Other Regulations Relating to Transportation... RAILROADS 1 Operating Expenses-Equipment § 1242.46 Computers and data processing equipment (account...

  15. 49 CFR 1242.46 - Computers and data processing equipment (account XX-27-46).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 9 2010-10-01 2010-10-01 false Computers and data processing equipment (account XX-27-46). 1242.46 Section 1242.46 Transportation Other Regulations Relating to Transportation... RAILROADS 1 Operating Expenses-Equipment § 1242.46 Computers and data processing equipment (account...

  16. 49 CFR 1242.46 - Computers and data processing equipment (account XX-27-46).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 9 2014-10-01 2014-10-01 false Computers and data processing equipment (account XX-27-46). 1242.46 Section 1242.46 Transportation Other Regulations Relating to Transportation... RAILROADS 1 Operating Expenses-Equipment § 1242.46 Computers and data processing equipment (account...

  17. Connecting Neural Coding to Number Cognition: A Computational Account

    ERIC Educational Resources Information Center

    Prather, Richard W.

    2012-01-01

    The current study presents a series of computational simulations that demonstrate how the neural coding of numerical magnitude may influence number cognition and development. This includes behavioral phenomena cataloged in cognitive literature such as the development of numerical estimation and operational momentum. Though neural research has…

  18. Computational Modeling of Mitochondrial Function

    PubMed Central

    Cortassa, Sonia; Aon, Miguel A.

    2012-01-01

    The advent of techniques with the ability to scan massive changes in cellular makeup (genomics, proteomics, etc.) has revealed the compelling need for analytical methods to interpret and make sense of those changes. Computational models built on a sound physico-chemical mechanistic basis are indispensable when integrating, interpreting, and simulating high-throughput experimental data. Another powerful role of computational models is predicting new behavior, provided they are adequately validated. Mitochondrial energy transduction has been traditionally studied with thermodynamic models. More recently, kinetic or thermo-kinetic models have been proposed, leading the path toward an understanding of the control and regulation of mitochondrial energy metabolism and its interaction with cytoplasmic and other compartments. In this work, we outline the methods, step-by-step, that should be followed to build a computational model of mitochondrial energetics in isolation or integrated to a network of cellular processes. Depending on the question addressed by the modeler, the methodology explained herein can be applied with different levels of detail, from the mitochondrial energy producing machinery in a network of cellular processes to the dynamics of a single enzyme during its catalytic cycle. PMID:22057575
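A minimal kinetic-model sketch in the spirit of the step-by-step methodology described: one Michaelis-Menten flux consuming a substrate pool, integrated with explicit Euler. The rate constants are invented and the model is a single-pool toy, not a mitochondrial energetics model.

```python
def simulate(s0, vmax, km, inflow, dt=0.01, steps=5000):
    """Integrate ds/dt = inflow - vmax*s/(km + s); returns the final substrate level."""
    s = s0
    for _ in range(steps):
        v = vmax * s / (km + s)        # Michaelis-Menten consumption flux
        s += dt * (inflow - v)         # explicit Euler step
    return s

# With inflow below vmax, the pool settles where vmax*s/(km+s) == inflow,
# i.e. at the steady state s* = km*inflow/(vmax - inflow).
s_final = simulate(s0=0.0, vmax=2.0, km=0.5, inflow=1.0)
print(s_final)  # approaches s* = 0.5
```

Chaining many such fluxes into a coupled ODE system, and then validating against experimental time courses, is essentially the workflow the abstract outlines at larger scale.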

  19. Accountability.

    ERIC Educational Resources Information Center

    Mullen, David J., Ed.

    This monograph, prepared to assist Georgia elementary principals to better understand accountability and its implications for educational improvement, sets forth many of the theoretical and philosophical bases from which accountability is being considered. Leon M. Lessinger begins this 5-paper presentation by describing the need for accountability…

  20. Accountability.

    ERIC Educational Resources Information Center

    Lashway, Larry

    1999-01-01

    This issue reviews publications that provide a starting point for principals looking for a way through the accountability maze. Each publication views accountability differently, but collectively these readings argue that even in an era of state-mandated assessment, principals can pursue proactive strategies that serve students' needs. James A.…

  1. Accountability.

    ERIC Educational Resources Information Center

    The Newsletter of the Comprehensive Center-Region VI, 1999

    1999-01-01

    Controversy surrounding the accountability movement is related to how the movement began in response to dissatisfaction with public schools. Opponents see it as one-sided, somewhat mean-spirited, and a threat to the professional status of teachers. Supporters argue that all other spheres of the workplace have accountability systems and that the…

  2. Gender Differences in Attitudes toward Computers and Performance in the Accounting Information Systems Class

    ERIC Educational Resources Information Center

    Lenard, Mary Jane; Wessels, Susan; Khanlarian, Cindi

    2010-01-01

    Using a model developed by Young (2000), this paper explores the relationship between performance in the Accounting Information Systems course, self-assessed computer skills, and attitudes toward computers. Results show that after taking the AIS course, students experience a change in perception about their use of computers. Females'…

  3. Testing Neuronal Accounts of Anisotropic Motion Perception with Computational Modelling

    PubMed Central

    Wong, William; Chiang Price, Nicholas Seow

    2014-01-01

    There is an over-representation of neurons in early visual cortical areas that respond most strongly to cardinal (horizontal and vertical) orientations and directions of visual stimuli, and cardinal- and oblique-preferring neurons are reported to have different tuning curves. Collectively, these neuronal anisotropies can explain two commonly-reported phenomena of motion perception – the oblique effect and reference repulsion – but it remains unclear whether neuronal anisotropies can simultaneously account for both perceptual effects. We show in psychophysical experiments that reference repulsion and the oblique effect do not depend on the duration of a moving stimulus, and that brief adaptation to a single direction simultaneously causes a reference repulsion in the orientation domain, and the inverse of the oblique effect in the direction domain. We attempted to link these results to underlying neuronal anisotropies by implementing a large family of neuronal decoding models with parametrically varied levels of anisotropy in neuronal direction-tuning preferences, tuning bandwidths and spiking rates. Surprisingly, no model instantiation was able to satisfactorily explain our perceptual data. We argue that the oblique effect arises from the anisotropic distribution of preferred directions evident in V1 and MT, but that reference repulsion occurs separately, perhaps reflecting a process of categorisation occurring in higher-order cortical areas. PMID:25409518

  4. Sequential decisions: a computational comparison of observational and reinforcement accounts.

    PubMed

    Mohammadi Sepahvand, Nazanin; Stöttinger, Elisabeth; Danckert, James; Anderson, Britt

    2014-01-01

    Right brain damaged patients show impairments in sequential decision making tasks for which healthy people do not show any difficulty. We hypothesized that this difficulty could be due to the failure of right brain damaged patients to develop well-matched models of the world. Our motivation is the idea that to navigate uncertainty, humans use models of the world to direct the decisions they make when interacting with their environment. The better the model is, the better their decisions are. To explore the model building and updating process in humans and the basis for impairment after brain injury, we used a computational model of non-stationary sequence learning. RELPH (Reinforcement and Entropy Learned Pruned Hypothesis space) was able to qualitatively and quantitatively reproduce the results of left and right brain damaged patient groups and healthy controls playing a sequential version of Rock, Paper, Scissors. Our results suggest that, in general, humans employ a sub-optimal reinforcement-based learning method rather than an objectively better statistical learning approach, and that differences between right brain damaged and healthy control groups can be explained by different exploration policies, rather than qualitatively different learning mechanisms.
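
    The reinforcement-based learning contrasted with statistical learning in this abstract can be illustrated with a minimal sketch. This is a hypothetical toy, not the RELPH model from the paper: a simple incremental value learner playing Rock-Paper-Scissors against an opponent with an assumed fixed bias.

```python
import random

# Minimal sketch (hypothetical; not the RELPH model from the paper): a simple
# reinforcement learner tracking action values against a biased
# Rock-Paper-Scissors opponent.
BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}  # move -> counter

def play(n_rounds=5000, alpha=0.1, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    q = {a: 0.0 for a in BEATS}  # action-value estimates
    wins = 0
    for _ in range(n_rounds):
        # Assumed biased opponent: plays rock 60% of the time.
        opp = rng.choices(["rock", "paper", "scissors"], [0.6, 0.2, 0.2])[0]
        if rng.random() < epsilon:            # explore
            act = rng.choice(list(BEATS))
        else:                                 # exploit the current best action
            act = max(q, key=q.get)
        reward = 1.0 if act == BEATS[opp] else (0.0 if act == opp else -1.0)
        q[act] += alpha * (reward - q[act])   # incremental value update
        wins += reward == 1.0
    return wins / n_rounds

print(play())  # well above the 1/3 chance level
```

    A purely statistical learner would instead estimate the opponent's full move distribution and always play the best response; the reinforcement learner above only tracks the payoff of its own actions.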

  5. On computation of Hough functions

    NASA Astrophysics Data System (ADS)

    Wang, Houjun; Boyd, John P.; Akmaev, Rashid A.

    2016-04-01

    Hough functions are the eigenfunctions of the Laplace tidal equation governing fluid motion on a rotating sphere with a resting basic state. Several numerical methods have been used in the past. In this paper, we compare two of those methods: normalized associated Legendre polynomial expansion and Chebyshev collocation. Neither method is widely used, but both have advantages over the commonly used unnormalized associated Legendre polynomial expansion method. Comparable results are obtained using both methods. For the first method, we note some details of the numerical implementation. The Chebyshev collocation method was first used for the Laplace tidal problem by Boyd (1976) and is relatively easy to use. A compact MATLAB code is provided for this method. We also illustrate the importance and effect of including a parity factor in Chebyshev polynomial expansions for modes with odd zonal wave numbers.
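
    Chebyshev collocation for eigenproblems can be sketched on a much simpler equation than the Laplace tidal equation. The toy below (an assumption-laden illustration, not the paper's MATLAB code) solves -u'' = λu on [-1, 1] with u(±1) = 0, whose exact eigenvalues are (kπ/2)²; the differentiation matrix follows the standard Gauss-Lobatto construction.

```python
import numpy as np

# Illustrative sketch: Chebyshev collocation applied to a simple eigenproblem,
# -u'' = lambda * u on [-1, 1] with u(-1) = u(1) = 0.
# Exact eigenvalues: (k * pi / 2) ** 2 for k = 1, 2, 3, ...
def cheb(n):
    """Chebyshev differentiation matrix on n+1 Gauss-Lobatto points."""
    x = np.cos(np.pi * np.arange(n + 1) / n)
    c = np.hstack([2.0, np.ones(n - 1), 2.0]) * (-1.0) ** np.arange(n + 1)
    dX = x[:, None] - x[None, :]
    D = np.outer(c, 1.0 / c) / (dX + np.eye(n + 1))   # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                       # fix the diagonal
    return D, x

D, x = cheb(32)
D2 = (D @ D)[1:-1, 1:-1]   # impose Dirichlet BCs by deleting boundary rows/cols
lam = np.sort(np.linalg.eigvals(-D2).real)
print(lam[:3])             # low modes converge spectrally to the exact values
```

    The parity factor mentioned in the abstract plays the same role as the boundary-row deletion here: it restricts the expansion to basis functions with the symmetry of the sought mode.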

  6. The Introductory Computer Course in the Accounting Curriculum: Objectives and Performance.

    ERIC Educational Resources Information Center

    Steedle, Lamont F.; Sinclair, Kenneth P.

    1984-01-01

    This study identified computer objectives based on recommendations of the authoritative accounting bodies, determined whether the typical introductory computer course has these same objectives, and examined the influence of the academic department responsible for teaching the course. Relationships between department and course objectives,…

  7. Approximate Bayesian computation with functional statistics.

    PubMed

    Soubeyrand, Samuel; Carpentier, Florence; Guiton, François; Klein, Etienne K

    2013-03-26

    Functional statistics are commonly used to characterize spatial patterns in general and spatial genetic structures in population genetics in particular. Such functional statistics also enable the estimation of parameters of spatially explicit (and genetic) models. Recently, Approximate Bayesian Computation (ABC) has been proposed to estimate model parameters from functional statistics. However, applying ABC with functional statistics may be cumbersome because of the high dimension of the set of statistics and the dependences among them. To tackle this difficulty, we propose an ABC procedure which relies on an optimized weighted distance between observed and simulated functional statistics. We applied this procedure to a simple step model, a spatial point process characterized by its pair correlation function and a pollen dispersal model characterized by genetic differentiation as a function of distance. These applications showed how the optimized weighted distance improved estimation accuracy. In the discussion, we consider the application of the proposed ABC procedure to functional statistics characterizing non-spatial processes.
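
    The ABC idea described here can be sketched in a few lines. This is a toy rejection sampler, not the authors' optimized procedure: the model, the two summary statistics, and the weights are all assumptions chosen for illustration.

```python
import random
import statistics

# Toy ABC rejection sampler (illustration only, not the authors' optimized
# procedure): estimate the mean of a Gaussian using a weighted distance
# between two summary statistics (sample mean and standard deviation).
def simulate(theta, n, rng):
    return [rng.gauss(theta, 1.0) for _ in range(n)]

def summaries(xs):
    return statistics.fmean(xs), statistics.stdev(xs)

def weighted_dist(s1, s2, w=(1.0, 0.5)):
    # Each summary statistic gets its own weight; the weights here are
    # arbitrary choices, whereas the paper optimizes them.
    return sum(wi * (a - b) ** 2 for wi, a, b in zip(w, s1, s2))

rng = random.Random(1)
observed = simulate(3.0, 200, rng)   # pretend data with true theta = 3.0
s_obs = summaries(observed)

accepted = [
    theta
    for theta in (rng.uniform(0.0, 6.0) for _ in range(10000))  # uniform prior
    if weighted_dist(s_obs, summaries(simulate(theta, 200, rng))) < 0.01
]
print(round(statistics.fmean(accepted), 1))  # posterior mean near the true value
```

    With functional statistics such as a pair correlation function, the tuples above become vectors of function values and the weighting becomes critical, which is the difficulty the paper addresses.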

  8. Dynamics and computation in functional shifts

    NASA Astrophysics Data System (ADS)

    Namikawa, Jun; Hashimoto, Takashi

    2004-07-01

    We introduce a new type of shift dynamics as an extended model of symbolic dynamics, and investigate the characteristics of shift spaces from the viewpoints of both dynamics and computation. This shift dynamics is called a functional shift, which is defined by a set of bi-infinite sequences of some functions on a set of symbols. To analyse the complexity of functional shifts, we measure them in terms of topological entropy, and locate their languages in the Chomsky hierarchy. Through this study, we argue that considering functional shifts from the viewpoints of both dynamics and computation gives us opposite results about the complexity of systems. We also describe a new class of shift spaces whose languages are not recursively enumerable.

  9. Does Participation in a Computer-Based Learning Program in Introductory Financial Accounting Course Lead to Choosing Accounting as a Major?

    ERIC Educational Resources Information Center

    Owhoso, Vincent; Malgwi, Charles A.; Akpomi, Margaret

    2014-01-01

    The authors examine whether students who completed a computer-based intervention program, designed to help them develop abilities and skills in introductory accounting, later declared accounting as a major. A sample of 1,341 students participated in the study, of which 74 completed the intervention program (computer-based assisted learning [CBAL])…

  10. Computer Games Functioning as Motivation Stimulants

    ERIC Educational Resources Information Center

    Lin, Grace Hui Chin; Tsai, Tony Kung Wan; Chien, Paul Shih Chieh

    2011-01-01

    Numerous scholars have recommended computer games can function as influential motivation stimulants of English learning, showing benefits as learning tools (Clarke and Dede, 2007; Dede, 2009; Klopfer and Squire, 2009; Liu and Chu, 2010; Mitchell, Dede & Dunleavy, 2009). This study aimed to further test and verify the above suggestion,…

  11. Computationally efficient method to construct scar functions

    NASA Astrophysics Data System (ADS)

    Revuelta, F.; Vergini, E. G.; Benito, R. M.; Borondo, F.

    2012-02-01

    The performance of a simple method [E. L. Sibert III, E. Vergini, R. M. Benito, and F. Borondo, New J. Phys. 10, 053016 (2008)] to efficiently compute scar functions along unstable periodic orbits with complicated trajectories in configuration space is discussed, using a classically chaotic two-dimensional quartic oscillator as an illustration.

  12. Functional quantum computing: An optical approach

    NASA Astrophysics Data System (ADS)

    Rambo, Timothy M.; Altepeter, Joseph B.; Kumar, Prem; D'Ariano, G. Mauro

    2016-05-01

    Recent theoretical investigations treat quantum computations as functions, quantum processes which operate on other quantum processes, rather than circuits. Much attention has been given to the N -switch function which takes N black-box quantum operators as input, coherently permutes their ordering, and applies the result to a target quantum state. This is something which cannot be equivalently done using a quantum circuit. Here, we propose an all-optical system design which implements coherent operator permutation for an arbitrary number of input operators.
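
    The N = 2 case of the switch can be sketched numerically. The following toy (an illustration, not the paper's optical design) builds the controlled-ordering operator for two single-qubit gates and shows how a control qubit in superposition detects that the two operators anticommute.

```python
import numpy as np

# Toy numerical sketch of the N = 2 "quantum switch": a control qubit
# coherently selects the ordering of two single-qubit operators A and B.
A = np.array([[0, 1], [1, 0]], dtype=complex)    # Pauli X
B = np.array([[1, 0], [0, -1]], dtype=complex)   # Pauli Z
P0 = np.diag([1.0 + 0j, 0.0])                    # control projectors
P1 = np.diag([0.0 + 0j, 1.0])

# Control |0> applies B then A; control |1> applies A then B.
switch = np.kron(P0, A @ B) + np.kron(P1, B @ A)

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
psi_in = np.kron(plus, np.array([1, 0], dtype=complex))  # control |+>, target |0>
psi_out = switch @ psi_in

# Because X and Z anticommute, the control ends in |->: measuring it in the
# +/- basis reveals the (anti)commutativity of the two black boxes.
minus = np.array([1, -1], dtype=complex) / np.sqrt(2)
target = np.kron(minus, np.array([0, 1], dtype=complex))
print(abs(np.vdot(target, psi_out)))
```

    A fixed quantum circuit must commit to one ordering of A and B, which is why this coherent permutation has no circuit equivalent.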

  13. Analysis of Ventricular Function by Computed Tomography

    PubMed Central

    Rizvi, Asim; Deaño, Roderick C.; Bachman, Daniel P.; Xiong, Guanglei; Min, James K.; Truong, Quynh A.

    2014-01-01

    The assessment of ventricular function, cardiac chamber dimensions and ventricular mass is fundamental for clinical diagnosis, risk assessment, therapeutic decisions, and prognosis in patients with cardiac disease. Although cardiac computed tomography (CT) is a noninvasive imaging technique often used for the assessment of coronary artery disease, it can also be utilized to obtain important data about left and right ventricular function and morphology. In this review, we will discuss the clinical indications for the use of cardiac CT for ventricular analysis, review the evidence on the assessment of ventricular function compared to existing imaging modalities such as cardiac MRI and echocardiography, provide a typical cardiac CT protocol for image acquisition and post-processing for ventricular analysis, and provide step-by-step instructions to acquire multiplanar cardiac views for ventricular assessment from the standard axial, coronal, and sagittal planes. Furthermore, both qualitative and quantitative assessments of ventricular function as well as sample reporting are detailed. PMID:25576407

  14. New Computer Simulations of Macular Neural Functioning

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.

    1994-01-01

    We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method insures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.

  15. A Computational Account of Children's Analogical Reasoning: Balancing Inhibitory Control in Working Memory and Relational Representation

    ERIC Educational Resources Information Center

    Morrison, Robert G.; Doumas, Leonidas A. A.; Richland, Lindsey E.

    2011-01-01

    Theories accounting for the development of analogical reasoning tend to emphasize either the centrality of relational knowledge accretion or changes in information processing capability. Simulations in LISA (Hummel & Holyoak, 1997, 2003), a neurally inspired computer model of analogical reasoning, allow us to explore how these factors may…

  16. Technology Readiness, Internet Self-Efficacy and Computing Experience of Professional Accounting Students

    ERIC Educational Resources Information Center

    Lai, Ming-Ling

    2008-01-01

    Purpose: This study aims to assess the state of technology readiness of professional accounting students in Malaysia, to examine their level of internet self-efficacy, to assess their prior computing experience, and to explore if they are satisfied with the professional course that they are pursuing in improving their technology skills.…

  17. Written and Computer-Mediated Accounting Communication Skills: An Employer Perspective

    ERIC Educational Resources Information Center

    Jones, Christopher G.

    2011-01-01

    Communication skills are a fundamental personal competency for a successful career in accounting. What is not so obvious is the specific written communication skill set employers look for and the extent those skills are computer mediated. Using survey research, this article explores the particular skills employers desire and their satisfaction…

  18. Computer network defense through radial wave functions

    NASA Astrophysics Data System (ADS)

    Malloy, Ian J.

    The purpose of this research is to synthesize basic and fundamental findings in quantum computing, as applied to the attack and defense of conventional computer networks. The concept focuses on the use of radio waves as a shield for, and an attack against, traditional computers. A logic bomb is analogous to a landmine in a computer network, and if one were to implement it as non-trivial mitigation, it would aid computer network defense. As has been seen in kinetic warfare, the use of landmines has been devastating to geopolitical regions in that they are severely difficult for a civilian to avoid triggering given the unknown position of a landmine. Thus, the importance of understanding a logic bomb is relevant and has corollaries to quantum mechanics as well. The research synthesizes quantum logic phase shifts in certain respects using the Dynamic Data Exchange protocol in software written for this work, as well as a C-NOT gate applied to a virtual quantum circuit environment by implementing a Quantum Fourier Transform. The research applies the principles of coherence and entanglement from quantum physics, the concept of expert systems in artificial intelligence, principles of prime-number-based cryptography with trapdoor functions, and modeling of radio wave propagation against an event with unknown parameters. This comes as a program relying on the artificial intelligence concept of an expert system in conjunction with trigger events for a trapdoor function relying on infinite recursion, as well as system mechanics for elliptic curve cryptography along orbital angular momenta. Here, trapdoor denotes both the form of cipher and the implied relationship to logic bombs.

  19. Computational functions in biochemical reaction networks.

    PubMed Central

    Arkin, A; Ross, J

    1994-01-01

    In prior work we demonstrated the implementation of logic gates, sequential computers (universal Turing machines), and parallel computers by means of the kinetics of chemical reaction mechanisms. In the present article we develop this subject further by first investigating the computational properties of several enzymatic (single and multiple) reaction mechanisms: we show their steady states are analogous to either Boolean or fuzzy logic gates. Nearly perfect digital function is obtained only in the regime in which the enzymes are saturated with their substrates. With these enzymatic gates, we construct combinational chemical networks that execute a given truth-table. The dynamic range of a network's output is strongly affected by "input/output matching" conditions among the internal gate elements. We find a simple mechanism, similar to the interconversion of fructose-6-phosphate between its two bisphosphate forms (fructose-1,6-bisphosphate and fructose-2,6-bisphosphate), that functions analogously to an AND gate. When the simple model is supplanted with one in which the enzyme rate laws are derived from experimental data, the steady state of the mechanism functions as an asymmetric fuzzy aggregation operator with properties akin to a fuzzy AND gate. The qualitative behavior of the mechanism does not change when situated within a large model of glycolysis/gluconeogenesis and the TCA cycle. The mechanism, in this case, switches the pathway's mode from glycolysis to gluconeogenesis in response to chemical signals of low blood glucose (cAMP) and abundant fuel for the TCA cycle (acetyl coenzyme A). PMID:7948674
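
    The saturation-dependent digital behaviour described above can be sketched with a generic saturating response. This is a hypothetical Hill-function toy, not the paper's kinetic mechanism: output is high only when both inputs saturate the (assumed) enzyme, mimicking a chemical AND gate.

```python
# Illustrative sketch (not the paper's kinetic model): a saturating Hill-type
# response acting as a two-input chemical AND gate.
def hill(s, K=1.0, n=4):
    """Fractional saturation of a hypothetical enzyme at substrate level s."""
    return s ** n / (K ** n + s ** n)

def and_gate(a, b):
    # High output only when both input concentrations are in the
    # saturated regime, as the abstract notes for digital function.
    return hill(a) * hill(b)

# Truth-table-like behaviour with "low" = 0.1 and "high" = 10.0:
for a, b in [(0.1, 0.1), (0.1, 10.0), (10.0, 0.1), (10.0, 10.0)]:
    print(f"a={a:<5} b={b:<5} out={and_gate(a, b):.4f}")
```

    Away from saturation the same function gives graded intermediate outputs, which is the fuzzy-logic regime the paper analyzes.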

  20. Structure, function, and behaviour of computational models in systems biology

    PubMed Central

    2013-01-01

    Background: Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such “bio-models” necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But, even if computational bio-models themselves are represented precisely in terms of mathematical expressions, their full meaning is not yet formally specified and only described in natural language. Results: We present a conceptual framework – the meaning facets – which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: On the one hand it is a mathematical expression which can be used in computational simulations (intrinsic meaning). On the other hand the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model’s components (structure), the meaning of the model’s intended use (function), and the meaning of the model’s dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. Conclusions: The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: First, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Secondly, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research

  1. Computer simulation as a teaching aid in pharmacy management--Part 1: Principles of accounting.

    PubMed

    Morrison, D J

    1987-06-01

    The need for pharmacists to develop management expertise through participation in formal courses is now widely acknowledged. Many schools of pharmacy lay the foundations for future management training by providing introductory courses as an integral or elective part of the undergraduate syllabus. The benefit of such courses may, however, be limited by the lack of opportunity for the student to apply the concepts and procedures in a practical working environment. Computer simulations provide a means to overcome this problem, particularly in the field of resource management. In this, the first of two articles, the use of a computer model to demonstrate basic accounting principles is described. PMID:3301875
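
    The basic accounting principle such a simulation demonstrates, double-entry bookkeeping, can be sketched in a few lines. This is a hypothetical toy, not the article's model: every transaction posts a debit to one account and an equal credit to another, so the trial balance is always zero.

```python
from collections import defaultdict

# Minimal sketch (hypothetical; not the article's simulation): a double-entry
# ledger where debits are positive and credits negative, so the books balance.
class Ledger:
    def __init__(self):
        self.balances = defaultdict(float)

    def post(self, debit_account, credit_account, amount):
        self.balances[debit_account] += amount
        self.balances[credit_account] -= amount

    def trial_balance(self):
        # Sums to zero whenever every posting was double-entry.
        return sum(self.balances.values())

ledger = Ledger()
ledger.post("inventory", "cash", 500.0)  # buy stock for cash
ledger.post("cash", "sales", 700.0)      # sell for cash
print(ledger.balances["cash"], ledger.trial_balance())
```

    A teaching simulation would wrap such a ledger in scenarios (purchases, payroll, depreciation) and let the student see how each transaction propagates to the financial statements.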

  3. The intrinsic quasar luminosity function: Accounting for accretion disk anisotropy

    SciTech Connect

    DiPompeo, M. A.; Myers, A. D.; Brotherton, M. S.; Runnoe, J. C.; Green, R. F.

    2014-05-20

    Quasar luminosity functions are a fundamental probe of the growth and evolution of supermassive black holes. Measuring the intrinsic luminosity function is difficult in practice, due to a multitude of observational and systematic effects. As sample sizes increase and measurement errors drop, characterizing the systematic effects is becoming more important. It is well known that the continuum emission from the accretion disk of quasars is anisotropic—in part due to its disk-like structure—but current luminosity function calculations effectively assume isotropy over the range of unobscured lines of sight. Here, we provide the first steps in characterizing the effect of random quasar orientations and simple models of anisotropy on observed luminosity functions. We find that the effect of orientation is not insignificant and exceeds other potential corrections such as those from gravitational lensing of foreground structures. We argue that current observational constraints may overestimate the intrinsic luminosity function by as much as a factor of ∼2 on the bright end. This has implications for models of quasars and their role in the universe, such as quasars' contribution to cosmological backgrounds.

  4. A cognitive neurobiological account of deception: evidence from functional neuroimaging.

    PubMed Central

    Spence, Sean A; Hunter, Mike D; Farrow, Tom F D; Green, Russell D; Leung, David H; Hughes, Catherine J; Ganesan, Venkatasubramanian

    2004-01-01

    An organism may use misinformation, knowingly (through deception) or unknowingly (as in the case of camouflage), to gain advantage in a competitive environment. From an evolutionary perspective, greater tactical deception occurs among primates closer to humans, with larger neocortices. In humans, the onset of deceptive behaviours in childhood exhibits a developmental trajectory, which may be regarded as 'normal' in the majority and deficient among a minority with certain neurodevelopmental disorders (e.g. autism). In the human adult, deception and lying exhibit features consistent with their use of 'higher' or 'executive' brain systems. Accurate detection of deception in humans may be of particular importance in forensic practice, while an understanding of its cognitive neurobiology may have implications for models of 'theory of mind' and social cognition, and societal notions of responsibility, guilt and mitigation. In recent years, functional neuroimaging techniques (especially functional magnetic resonance imaging) have been used to study deception. Though few in number, and using very different experimental protocols, studies published in the peer-reviewed literature exhibit certain consistencies. Attempted deception is associated with activation of executive brain regions (particularly prefrontal and anterior cingulate cortices), while truthful responding has not been shown to be associated with any areas of increased activation (relative to deception). Hence, truthful responding may comprise a relative 'baseline' in human cognition and communication. The subject who lies may necessarily engage 'higher' brain centres, consistent with a purpose or intention (to deceive). While the principle of executive control during deception remains plausible, its precise anatomy awaits elucidation. PMID:15590616

  5. Green's Function Analysis of Periodic Structures in Computational Electromagnetics

    NASA Astrophysics Data System (ADS)

    Van Orden, Derek

    2011-12-01

    Periodic structures are used widely in electromagnetic devices, including filters, waveguiding structures, and antennas. Their electromagnetic properties may be analyzed computationally by solving an integral equation, in which an unknown equivalent current distribution in a single unit cell is convolved with a periodic Green's function that accounts for the system's boundary conditions. Fast computation of the periodic Green's function is therefore essential to achieve high accuracy solutions of complicated periodic structures, including analysis of modal wave propagation and scattering from external sources. This dissertation first presents alternative spectral representations of the periodic Green's function of the Helmholtz equation for cases of linear periodic systems in 2D and 3D free space and near planarly layered media. Although there exist multiple representations of the periodic Green's function, most are not efficient in the important case where the fields are observed near the array axis. We present spectral-spatial representations for rapid calculation of the periodic Green's functions for linear periodic arrays of current sources residing in free space as well as near a planarly layered medium. They are based on the integral expansion of the periodic Green's functions in terms of the spectral parameters transverse to the array axis. These schemes are important for the rapid computation of the interaction among unit cells of a periodic array, and, by extension, the complex dispersion relations of guided waves. Extensions of this approach to planar periodic structures are discussed. With these computation tools established, we study the traveling wave properties of linear resonant arrays placed near surfaces, and examine the coupling mechanisms that lead to radiation into guided waves supported by the surface. This behavior is especially important to understand the properties of periodic structures printed on dielectric substrates, such as periodic

  6. Computation of the lattice Green function for a dislocation

    NASA Astrophysics Data System (ADS)

    Tan, Anne Marie Z.; Trinkle, Dallas R.

    2016-08-01

    Modeling isolated dislocations is challenging due to their long-ranged strain fields. Flexible boundary condition methods capture the correct long-range strain field of a defect by coupling the defect core to an infinite harmonic bulk through the lattice Green function (LGF). To improve the accuracy and efficiency of flexible boundary condition methods, we develop a numerical method to compute the LGF specifically for a dislocation geometry; in contrast to previous methods, where the LGF was computed for the perfect bulk as an approximation for the dislocation. Our approach directly accounts for the topology of a dislocation, and the errors in the LGF computation converge rapidly for edge dislocations in a simple cubic model system as well as in BCC Fe with an empirical potential. When used within the flexible boundary condition approach, the dislocation LGF relaxes dislocation core geometries in fewer iterations than when the perfect bulk LGF is used as an approximation for the dislocation, making a flexible boundary condition approach more efficient.

  7. Visual perception can account for the close relation between numerosity processing and computational fluency

    PubMed Central

    Zhou, Xinlin; Wei, Wei; Zhang, Yiyun; Cui, Jiaxin; Chen, Chuansheng

    2015-01-01

    Studies have shown that numerosity processing (e.g., comparison of numbers of dots in two dot arrays) is significantly correlated with arithmetic performance. Researchers have attributed this association to the fact that both tasks share magnitude processing. The current investigation tested an alternative hypothesis, which states that visual perceptual ability (as measured by a figure-matching task) can account for the close relation between numerosity processing and arithmetic performance (computational fluency). Four hundred and twenty four third- to fifth-grade children (220 boys and 204 girls, 8.0–11.0 years old; 120 third graders, 146 fourth graders, and 158 fifth graders) were recruited from two schools (one urban and one suburban) in Beijing, China. Six classes were randomly selected from each school, and all students in each selected class participated in the study. All children were given a series of cognitive and mathematical tests, including numerosity comparison, figure matching, forward verbal working memory, visual tracing, non-verbal matrices reasoning, mental rotation, choice reaction time, arithmetic tests and curriculum-based mathematical achievement test. Results showed that figure-matching ability had higher correlations with numerosity processing and computational fluency than did other cognitive factors (e.g., forward verbal working memory, visual tracing, non-verbal matrix reasoning, mental rotation, and choice reaction time). More important, hierarchical multiple regression showed that figure matching ability accounted for the well-established association between numerosity processing and computational fluency. In support of the visual perception hypothesis, the results suggest that visual perceptual ability, rather than magnitude processing, may be the shared component of numerosity processing and arithmetic performance. PMID:26441740
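
    The hierarchical-regression logic of the study can be illustrated on synthetic data. This sketch assumes a shared "perception" factor drives both predictors (an assumption mirroring the paper's hypothesis, not its dataset), so numerosity explains little additional variance in fluency once perception is entered.

```python
import numpy as np

# Synthetic illustration (not the study's data): hierarchical regression in
# which a common factor accounts for the numerosity-fluency association.
rng = np.random.default_rng(0)
n = 400
perception = rng.normal(size=n)                    # e.g. figure-matching ability
numerosity = perception + 0.5 * rng.normal(size=n)
fluency = perception + 0.5 * rng.normal(size=n)

def r_squared(y, predictors):
    # Ordinary least squares with an intercept column.
    X = np.column_stack([np.ones(len(y))] + predictors)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_num = r_squared(fluency, [numerosity])               # step 1: numerosity alone
r2_perc = r_squared(fluency, [perception])              # perception alone
r2_both = r_squared(fluency, [numerosity, perception])  # step 2: add perception
print(round(r2_num, 2), round(r2_perc, 2), round(r2_both, 2))
```

    In this setup numerosity alone predicts fluency well, yet adds almost nothing beyond perception, which is the pattern the paper reports for figure-matching ability.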

  8. An Atomistic Statistically Effective Energy Function for Computational Protein Design.

    PubMed

    Topham, Christopher M; Barbe, Sophie; André, Isabelle

    2016-08-01

    Shortcomings in the definition of effective free-energy surfaces of proteins are recognized to be a major contributory factor responsible for the low success rates of existing automated methods for computational protein design (CPD). The formulation of an atomistic statistically effective energy function (SEEF) suitable for a wide range of CPD applications and its derivation from structural data extracted from protein domains and protein-ligand complexes are described here. The proposed energy function comprises nonlocal atom-based and local residue-based SEEFs, which are coupled using a novel atom connectivity number factor to scale short-range, pairwise, nonbonded atomic interaction energies and a surface-area-dependent cavity energy term. This energy function was used to derive additional SEEFs describing the unfolded-state ensemble of any given residue sequence based on computed average energies for partially or fully solvent-exposed fragments in regions of irregular structure in native proteins. Relative thermal stabilities of 97 T4 bacteriophage lysozyme mutants were predicted from calculated energy differences for folded and unfolded states with an average unsigned error (AUE) of 0.84 kcal mol(-1) when compared to experiment. To demonstrate the utility of the energy function for CPD, further validation was carried out in tests of its capacity to recover cognate protein sequences and to discriminate native and near-native protein folds, loop conformers, and small-molecule ligand binding poses from non-native benchmark decoys. Experimental ligand binding free energies for a diverse set of 80 protein complexes could be predicted with an AUE of 2.4 kcal mol(-1) using an additional energy term to account for the loss in ligand configurational entropy upon binding. The atomistic SEEF is expected to improve the accuracy of residue-based coarse-grained SEEFs currently used in CPD and to extend the range of applications of extant atom-based protein statistical

  10. Assigning unique identification numbers to new user accounts and groups in a computing environment with multiple registries

    DOEpatents

    DeRobertis, Christopher V.; Lu, Yantian T.

    2010-02-23

    A method, system, and program storage device for creating a new user account or user group with a unique identification number in a computing environment having multiple user registries is provided. In response to receiving a command to create a new user account or user group, an operating system of a clustered computing environment automatically checks multiple registries configured for the operating system to determine whether a candidate identification number for the new user account or user group has been assigned already to one or more existing user accounts or groups, respectively. The operating system automatically assigns the candidate identification number to the new user account or user group created in a target user registry if the checking indicates that the candidate identification number has not been assigned already to any of the existing user accounts or user groups, respectively.
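The patented procedure reduces to a check-then-assign loop over all registries. A minimal sketch, with hypothetical in-memory dictionaries standing in for the operating system's user registries:

```python
def next_unique_id(registries, start=1000):
    """Return the smallest candidate ID not assigned in ANY registry."""
    used = set()
    for registry in registries:
        used.update(registry.values())
    candidate = start
    while candidate in used:
        candidate += 1
    return candidate

def create_account(registries, target, name, start=1000):
    """Create `name` in the target registry with a cluster-wide unique ID."""
    uid = next_unique_id(registries, start)
    registries[target][name] = uid
    return uid
```

The essential point of the record is only the cross-registry check before assignment; conflict policies, ID ranges, and locking are all omitted here.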

  11. An Experimental Analysis of Computer-Mediated Instruction and Student Attitudes in a Principles of Financial Accounting Course.

    ERIC Educational Resources Information Center

    Basile, Anthony; D'Aquila, Jill M.

    2002-01-01

    Accounting students received either traditional instruction (n=46) or used computer-mediated communication and WebCT course management software. There were no significant differences in attitudes about the course. However, computer users were more positive about course delivery and course management tools. (Contains 17 references.) (SK)

  12. Using an Online Homework System to Submit Accounting Homework: Role of Cognitive Need, Computer Efficacy, and Perception

    ERIC Educational Resources Information Center

    Peng, Jacob C.

    2009-01-01

    The author investigated whether students' effort in working on homework problems was affected by their need for cognition, their perception of the system, and their computer efficacy when instructors used an online system to collect accounting homework. Results showed that individual intrinsic motivation and computer efficacy are important factors…

  13. Computer method for identification of boiler transfer functions

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1972-01-01

An iterative computer-aided procedure was developed for identifying boiler transfer functions from frequency response data. The method yields satisfactory transfer functions for both high and low vapor exit quality data.
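The record does not specify the model structure, so as an assumed illustration, fitting a hypothetical first-order transfer function K/(1 + tau*jw) to frequency-response data can be done with a Levy-style linearized least squares:

```python
import numpy as np

def fit_first_order(omega, H):
    """Linearized least-squares (Levy-style) fit of a first-order transfer
    function H(jw) = K / (1 + tau*jw) to frequency-response data.
    Rearranging K - tau*(jw*H) = H gives equations linear in (K, tau)."""
    jwH = 1j * omega * H
    # Stack real and imaginary parts into one real least-squares problem.
    A = np.column_stack([
        np.concatenate([np.ones_like(omega), np.zeros_like(omega)]),
        np.concatenate([-jwH.real, -jwH.imag]),
    ])
    b = np.concatenate([H.real, H.imag])
    (K, tau), *_ = np.linalg.lstsq(A, b, rcond=None)
    return float(K), float(tau)
```

Higher-order rational fits follow the same rearrangement with more columns; the iterative refinement the abstract alludes to (reweighting to undo the linearization bias) is omitted.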

  14. Computing the Partition Function for Kinetically Trapped RNA Secondary Structures

    PubMed Central

    Lorenz, William A.; Clote, Peter

    2011-01-01

An RNA secondary structure is locally optimal if there is no lower energy structure that can be obtained by the addition or removal of a single base pair, where energy is defined according to the widely accepted Turner nearest neighbor model. Locally optimal structures form kinetic traps, since any evolution away from a locally optimal structure must involve energetically unfavorable folding steps. Here, we present a novel, efficient algorithm to compute the partition function over all locally optimal secondary structures of a given RNA sequence. Our software, RNAlocopt, runs in O(n3) time and O(n2) space. Additionally, RNAlocopt samples a user-specified number of structures from the Boltzmann subensemble of all locally optimal structures. We apply RNAlocopt to show that (1) the number of locally optimal structures is far fewer than the total number of structures – indeed, the number of locally optimal structures is approximately equal to the square root of the number of all structures, (2) the structural diversity of this subensemble may be either similar to or quite different from the structural diversity of the entire Boltzmann ensemble, a situation that depends on the type of input RNA, (3) the (modified) maximum expected accuracy structure, computed by taking into account base pairing frequencies of locally optimal structures, is a more accurate prediction of the native structure than other current thermodynamics-based methods. The software RNAlocopt constitutes a technical breakthrough in our study of the folding landscape for RNA secondary structures. For the first time, locally optimal structures (kinetic traps in the Turner energy model) can be rapidly generated for long RNA sequences, previously impossible with methods that involved exhaustive enumeration. Use of locally optimal structures leads to state-of-the-art secondary structure prediction, as benchmarked against methods involving the computation of minimum free energy and of maximum expected accuracy. Web server

  15. Computing the partition function for kinetically trapped RNA secondary structures.

    PubMed

    Lorenz, William A; Clote, Peter

    2011-01-01

An RNA secondary structure is locally optimal if there is no lower energy structure that can be obtained by the addition or removal of a single base pair, where energy is defined according to the widely accepted Turner nearest neighbor model. Locally optimal structures form kinetic traps, since any evolution away from a locally optimal structure must involve energetically unfavorable folding steps. Here, we present a novel, efficient algorithm to compute the partition function over all locally optimal secondary structures of a given RNA sequence. Our software, RNAlocopt, runs in O(n3) time and O(n2) space. Additionally, RNAlocopt samples a user-specified number of structures from the Boltzmann subensemble of all locally optimal structures. We apply RNAlocopt to show that (1) the number of locally optimal structures is far fewer than the total number of structures--indeed, the number of locally optimal structures is approximately equal to the square root of the number of all structures, (2) the structural diversity of this subensemble may be either similar to or quite different from the structural diversity of the entire Boltzmann ensemble, a situation that depends on the type of input RNA, (3) the (modified) maximum expected accuracy structure, computed by taking into account base pairing frequencies of locally optimal structures, is a more accurate prediction of the native structure than other current thermodynamics-based methods. The software RNAlocopt constitutes a technical breakthrough in our study of the folding landscape for RNA secondary structures. For the first time, locally optimal structures (kinetic traps in the Turner energy model) can be rapidly generated for long RNA sequences, previously impossible with methods that involved exhaustive enumeration. Use of locally optimal structures leads to state-of-the-art secondary structure prediction, as benchmarked against methods involving the computation of minimum free energy and of maximum expected accuracy.
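The notion of a locally optimal structure under single base-pair moves can be illustrated by brute force on a short sequence. This is a toy sketch, not the RNAlocopt algorithm (which avoids enumeration): a toy energy of -1 per base pair stands in for the Turner model, so removals always raise energy and "locally optimal" reduces to "no pair can be added":

```python
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
MIN_LOOP = 3  # minimum number of unpaired bases enclosed by a pair

def can_pair(seq, i, j):
    return j - i > MIN_LOOP and (seq[i], seq[j]) in PAIRS

def conflict(p, q):
    """True if two base pairs cross or share a base."""
    (i, j), (k, l) = sorted((p, q))
    return len({i, j, k, l}) < 4 or (i < k < j < l)

def all_structures(seq):
    """Enumerate every set of mutually compatible base pairs (brute force)."""
    cands = [(i, j) for i in range(len(seq)) for j in range(i + 1, len(seq))
             if can_pair(seq, i, j)]
    out = []
    def extend(start, chosen):
        out.append(chosen)
        for k in range(start, len(cands)):
            if all(not conflict(cands[k], p) for p in chosen):
                extend(k + 1, chosen | {cands[k]})
    extend(0, frozenset())
    return out, cands

def locally_optimal(structures, cands):
    """Toy energy of -1 per pair: a structure is a kinetic trap iff no
    candidate pair can be added without a conflict."""
    return [s for s in structures
            if not any(p not in s and all(not conflict(p, q) for q in s)
                       for p in cands)]
```

On the hairpin-forming toy sequence GGGAAACCC this enumerates 20 structures, of which only a minority are kinetic traps; the fully nested stem {(0,8), (1,7), (2,6)} is one of them, while the open chain is not.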

  16. Computing black hole partition functions from quasinormal modes

    DOE PAGES

    Arnold, Peter; Szepietowski, Phillip; Vaman, Diana

    2016-07-07

We propose a method of computing one-loop determinants in black hole space-times (with emphasis on asymptotically anti-de Sitter black holes) that may be used for numerics when completely-analytic results are unattainable. The method utilizes the expression for one-loop determinants in terms of quasinormal frequencies determined by Denef, Hartnoll and Sachdev in [1]. A numerical evaluation must face the fact that the sum over the quasinormal modes, indexed by momentum and overtone numbers, is divergent. A necessary ingredient is then a regularization scheme to handle the divergent contributions of individual fixed-momentum sectors to the partition function. To this end, we formulate an effective two-dimensional problem in which a natural refinement of standard heat kernel techniques can be used to account for contributions to the partition function at fixed momentum. We test our method in a concrete case by reproducing the scalar one-loop determinant in the BTZ black hole background. We then discuss the application of such techniques to more complicated spacetimes.

  17. Computing black hole partition functions from quasinormal modes

    NASA Astrophysics Data System (ADS)

    Arnold, Peter; Szepietowski, Phillip; Vaman, Diana

    2016-07-01

    We propose a method of computing one-loop determinants in black hole space-times (with emphasis on asymptotically anti-de Sitter black holes) that may be used for numerics when completely-analytic results are unattainable. The method utilizes the expression for one-loop determinants in terms of quasinormal frequencies determined by Denef, Hartnoll and Sachdev in [1]. A numerical evaluation must face the fact that the sum over the quasinormal modes, indexed by momentum and overtone numbers, is divergent. A necessary ingredient is then a regularization scheme to handle the divergent contributions of individual fixed-momentum sectors to the partition function. To this end, we formulate an effective two-dimensional problem in which a natural refinement of standard heat kernel techniques can be used to account for contributions to the partition function at fixed momentum. We test our method in a concrete case by reproducing the scalar one-loop determinant in the BTZ black hole background. We then discuss the application of such techniques to more complicated spacetimes.

  18. Pair correlation function integrals: Computation and use

    NASA Astrophysics Data System (ADS)

    Wedberg, Rasmus; O'Connell, John P.; Peters, Günther H.; Abildskov, Jens

    2011-08-01

We describe a method for extending radial distribution functions obtained from molecular simulations of pure and mixed molecular fluids to arbitrary distances. The method allows total correlation function integrals to be reliably calculated from simulations of relatively small systems. The long-distance behavior of radial distribution functions is determined by requiring that the corresponding direct correlation functions follow certain approximations at long distances. We have briefly described the method and tested its performance in previous communications [R. Wedberg, J. P. O'Connell, G. H. Peters, and J. Abildskov, Mol. Simul. 36, 1243 (2010); Fluid Phase Equilib. 302, 32 (2011)], but describe here its theoretical basis more thoroughly and derive long-distance approximations for the direct correlation functions. We describe the numerical implementation of the method in detail, and report numerical tests complementing previous results. Pure molecular fluids are here studied in the isothermal-isobaric ensemble with isothermal compressibilities evaluated from the total correlation function integrals and compared with values derived from volume fluctuations. For systems where the radial distribution function has structure beyond the sampling limit imposed by the system size, the integration is more reliable, and usually more accurate, than simple integral truncation.
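The compressibility route used here is the standard total correlation function integral, rho*kT*kappa_T = 1 + 4*pi*rho * integral of (g(r)-1) r^2 dr. A minimal sketch of its evaluation by trapezoidal quadrature (assuming g(r) has already been extended to its long-range limit; the authors' direct-correlation-function extension is the part not shown):

```python
import numpy as np

def trapezoid(y, x):
    """Plain trapezoidal rule."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def compressibility_from_gr(r, g, rho, kT):
    """Isothermal compressibility kappa_T from
       rho * kT * kappa_T = 1 + 4*pi*rho * integral[(g(r) - 1) * r^2 dr]."""
    G = trapezoid((g - 1.0) * r**2, r)
    return (1.0 + 4.0 * np.pi * rho * G) / (rho * kT)
```

A quick sanity check: for an ideal gas, g(r) = 1 everywhere, the integral vanishes, and the formula reduces to kappa_T = 1/(rho*kT).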

  19. The Computer and Its Functions; How to Communicate with the Computer.

    ERIC Educational Resources Information Center

    Ward, Peggy M.

    A brief discussion of why it is important for students to be familiar with computers and their functions and a list of some practical applications introduce this two-part paper. Focusing on how the computer works, the first part explains the various components of the computer, different kinds of memory storage devices, disk operating systems, and…

  20. Targeted computational probabilistic corroboration of experimental knee wear simulator: the importance of accounting for variability.

    PubMed

    Strickland, M A; Dressler, M R; Render, T; Browne, M; Taylor, M

    2011-04-01

Experimental testing is widely used to predict wear of total knee replacement (TKR) devices. Computational models cannot replace this essential in vitro testing, but they do have complementary strengths and capabilities, which make in silico models a valuable support tool for experimental wear investigations. For effective exploitation, these two separate domains should be closely corroborated together; this requires extensive data-sharing and cross-checking at every stage of simulation and testing. However, isolated deterministic corroborations provide only a partial perspective; in vitro testing is inherently variable, and relatively small changes in the environmental and kinematic conditions at the articulating interface can account for considerable variation in the reported wear rates. Understanding these variations will be key to managing uncertainty in the tests, resulting in a 'cleaner' investigation environment for further refining current theories of wear. This study demonstrates the value of probabilistic in silico methods by describing a specific, targeted corroboration of the AMTI knee wear simulator, using rigid body dynamics software models. A deterministic model of the simulator under displacement-control was created for investigation. Firstly, a large sample of experimental data (N>100) was collated, and a probabilistic computational study (N>1000 trials) was used to compare the kinetic performance envelopes for in vitro and in silico models, to more fully corroborate the mechanical model. Secondly, corresponding theoretical wear-rate predictions were compared to the experimentally reported wear data, to assess the robustness of current wear theories to uncertainty (as distinct from the mechanical variability). The results reveal a good corroboration for the physical mechanics of the wear test rig; however they demonstrate that the distributions for wear are not currently well-predicted. The probabilistic domain is found to be far more sensitive at

  1. Graded modality-specific specialisation in semantics: A computational account of optic aphasia.

    PubMed

    Plaut, David C

    2002-10-01

    A long-standing debate regarding the representation of semantic knowledge is whether such knowledge is represented in a single, amodal system or whether it is organised into multiple subsystems based on modality of input or type of information. The current paper presents a distributed connectionist model of semantics that constitutes a middle ground between these unitary- versus multiple-semantics accounts. In the model, semantic representations develop under the pressure of learning to mediate between multiple input and output modalities in performing various tasks. The system has a topographic bias on learning that favours short connections, leading to a graded degree of modality-specific functional specialisation within semantics. The model is applied to the specific empirical phenomena of optic aphasia--a neuropsychological disorder in which patients exhibit a selective deficit in naming visually presented objects that is not attributable to more generalised impairments in object recognition (visual agnosia) or naming (anomia). As a result of the topographic bias in the model, as well as the relative degrees of systematicity among tasks, damage to connections from vision to regions of semantics near phonology impairs visual object naming far more than visual gesturing or tactile naming, as observed in optic aphasia. Moreover, as in optic aphasia, the system is better at generating the name of an action associated with an object than at generating the name of the object itself, because action naming receives interactive support from the activation of action representations. The ability of the model to account for the pattern of performance observed in optic aphasia across the full range of severity of impairment provides support for the claim that semantic representations exhibit graded functional specialisation rather than being entirely amodal or modality-specific.

  2. Basic mathematical function libraries for scientific computation

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1989-01-01

    Ada packages implementing selected mathematical functions for the support of scientific and engineering applications were written. The packages provide the Ada programmer with the mathematical function support found in the languages Pascal and FORTRAN as well as an extended precision arithmetic and a complete complex arithmetic. The algorithms used are fully described and analyzed. Implementation assumes that the Ada type FLOAT objects fully conform to the IEEE 754-1985 standard for single binary floating-point arithmetic, and that INTEGER objects are 32-bit entities. Codes for the Ada packages are included as appendixes.

  3. Computing Partial Transposes and Related Entanglement Functions

    NASA Astrophysics Data System (ADS)

    Maziero, Jonas

    2016-10-01

    The partial transpose (PT) is an important function for entanglement testing and quantification and also for the study of geometrical aspects of the quantum state space. In this article, considering general bipartite and multipartite discrete systems, explicit formulas ready for the numerical implementation of the PT and of related entanglement functions are presented and the Fortran code produced for that purpose is described. What is more, we obtain an analytical expression for the Hilbert-Schmidt entanglement of two-qudit systems and for the associated closest separable state. In contrast to previous works on this matter, we only use the properties of the PT, not applying Lagrange multipliers.
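The partial transpose itself is a cheap index permutation. A minimal NumPy sketch for a bipartite state, together with the negativity (the sum of absolute values of the negative eigenvalues of rho^T_B), assuming the standard row-major (dA, dB) composite indexing; this illustrates the function the record implements in Fortran, not that code itself:

```python
import numpy as np

def partial_transpose(rho, dims, sys=1):
    """Partial transpose of a bipartite density matrix with subsystem
    dimensions dims = (dA, dB), transposing subsystem `sys` (0 or 1)."""
    dA, dB = dims
    r = rho.reshape(dA, dB, dA, dB)        # r[a, b, a', b'] = rho[(a,b),(a',b')]
    if sys == 0:
        r = r.transpose(2, 1, 0, 3)        # swap the A indices
    else:
        r = r.transpose(0, 3, 2, 1)        # swap the B indices
    return r.reshape(dA * dB, dA * dB)

def negativity(rho, dims):
    """Entanglement negativity: sum of |negative eigenvalues| of rho^{T_B}."""
    evals = np.linalg.eigvalsh(partial_transpose(rho, dims, sys=1))
    return float(-evals[evals < 0].sum())
```

For a two-qubit Bell state the negativity is 1/2, while any separable state gives 0 (the PPT criterion); the PT is also an involution, which makes a convenient self-test.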

  4. 45 CFR 302.20 - Separation of cash handling and accounting functions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 2 2011-10-01 2011-10-01 false Separation of cash handling and accounting functions. 302.20 Section 302.20 Public Welfare Regulations Relating to Public Welfare OFFICE OF CHILD..., DEPARTMENT OF HEALTH AND HUMAN SERVICES STATE PLAN REQUIREMENTS § 302.20 Separation of cash handling...

  5. 45 CFR 302.20 - Separation of cash handling and accounting functions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 2 2014-10-01 2012-10-01 true Separation of cash handling and accounting functions. 302.20 Section 302.20 Public Welfare Regulations Relating to Public Welfare OFFICE OF CHILD..., DEPARTMENT OF HEALTH AND HUMAN SERVICES STATE PLAN REQUIREMENTS § 302.20 Separation of cash handling...

  6. 49 CFR 1242.78 - Employees performing clerical and accounting functions, and loss and damage claims processing...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... functions, and loss and damage claims processing (accounts XX-55-76 and XX-55-78). 1242.78 Section 1242.78... Employees performing clerical and accounting functions, and loss and damage claims processing (accounts XX-55-76 and XX-55-78). If the sum of the direct freight and the direct passenger expenses is more...

  7. 49 CFR 1242.78 - Employees performing clerical and accounting functions, and loss and damage claims processing...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... functions, and loss and damage claims processing (accounts XX-55-76 and XX-55-78). 1242.78 Section 1242.78... Employees performing clerical and accounting functions, and loss and damage claims processing (accounts XX-55-76 and XX-55-78). If the sum of the direct freight and the direct passenger expenses is more...

  8. 49 CFR 1242.78 - Employees performing clerical and accounting functions, and loss and damage claims processing...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... functions, and loss and damage claims processing (accounts XX-55-76 and XX-55-78). 1242.78 Section 1242.78... Employees performing clerical and accounting functions, and loss and damage claims processing (accounts XX-55-76 and XX-55-78). If the sum of the direct freight and the direct passenger expenses is more...

  9. 49 CFR 1242.78 - Employees performing clerical and accounting functions, and loss and damage claims processing...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... functions, and loss and damage claims processing (accounts XX-55-76 and XX-55-78). 1242.78 Section 1242.78... Employees performing clerical and accounting functions, and loss and damage claims processing (accounts XX-55-76 and XX-55-78). If the sum of the direct freight and the direct passenger expenses is more...

  10. Really computing nonperturbative real time correlation functions

    NASA Astrophysics Data System (ADS)

    Bödeker, Dietrich; McLerran, Larry; Smilga, Andrei

    1995-10-01

It has been argued by Grigoriev and Rubakov that one can simulate real time processes involving baryon number nonconservation at high temperature using real time evolution of classical equations, and summing over initial conditions with a classical thermal weight. It is known that such a naive algorithm is plagued by ultraviolet divergences. In quantum theory the divergences are regularized, but the corresponding graphs involve the contributions from the hard momentum region and also the new scale ~gT comes into play. We propose a modified algorithm which involves solving the classical equations of motion for the effective hard thermal loop Hamiltonian with an ultraviolet cutoff μ>>gT and integrating over initial conditions with a proper thermal weight. Such an algorithm should provide a determination of the infrared behavior of the real time correlation function determining the baryon violation rate. Hopefully, the results obtained in this modified algorithm will be cutoff independent.
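The bare Grigoriev-Rubakov prescription (before the hard-thermal-loop modification proposed here) can be sketched on a toy system: draw initial conditions with a classical thermal weight, evolve the classical equations of motion, and average. For a single harmonic oscillator the answer <x(t)x(0)> = (T/m*w^2) cos(w*t) is known in closed form, which makes the sketch checkable; all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def classical_correlator(n_samples=20000, kT=1.0, m=1.0, omega=1.0,
                         t_max=5.0, dt=0.01):
    """<x(t) x(0)> for a classical harmonic oscillator: sample (x, p) from
    the Boltzmann distribution, evolve with leapfrog, average x(t)*x(0)."""
    n_steps = int(t_max / dt)
    x = rng.normal(0.0, np.sqrt(kT / (m * omega**2)), n_samples)
    p = rng.normal(0.0, np.sqrt(m * kT), n_samples)
    x0 = x.copy()
    corr = np.empty(n_steps + 1)
    corr[0] = np.mean(x0 * x0)
    for step in range(1, n_steps + 1):
        p -= 0.5 * dt * m * omega**2 * x   # half kick (force = -m w^2 x)
        x += dt * p / m                    # drift
        p -= 0.5 * dt * m * omega**2 * x   # half kick
        corr[step] = np.mean(x * x0)
    return np.arange(n_steps + 1) * dt, corr
```

The ultraviolet problem the abstract describes only appears once this is promoted to a field with many coupled modes; the single oscillator has no divergence to regulate.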

  11. Basis Function Sampling for Material Property Computations

    NASA Astrophysics Data System (ADS)

    Whitmer, Jonathan K.; Chiu, Chi-Cheng; Joshi, Abhijeet A.; de Pablo, Juan J.

    2014-03-01

    Wang-Landau sampling, and the associated class of flat histogram simulation methods, have been particularly successful for free energy calculations in a wide array of physical systems. Practically, the convergence of these calculations to a target free energy surface is hampered by reliance on parameters which are unknown a priori. We derive and implement a method based on orthogonal (basis) functions which is fast, parameter-free, and geometrically robust. An important feature of this method is its ability to achieve arbitrary levels of description for the free energy. It is thus ideally suited to in silico measurement of elastic moduli and other quantities related to free energy perturbations. We demonstrate the utility of such applications by applying our method to calculation of the Frank elastic constants of the Lebwohl-Lasher model.
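Plain Wang-Landau sampling, the baseline that this basis-function method refines, fits in a few lines. A sketch for a toy system of non-interacting two-state spins, whose density of states is known exactly (binomial coefficients), so the flat-histogram estimate can be checked; this is the standard histogram-based algorithm, not the orthogonal-function variant the abstract proposes:

```python
import math
import random

random.seed(1)

def wang_landau_ln_g(n_spins=10, ln_f_final=1e-6, flatness=0.8, sweep=10000):
    """Flat-histogram estimate of ln g(E) for n_spins non-interacting spins,
    with E = number of up spins. Exact answer: g(E) = C(n_spins, E)."""
    spins = [0] * n_spins
    E = 0
    ln_g = [0.0] * (n_spins + 1)
    hist = [0] * (n_spins + 1)
    ln_f = 1.0
    while ln_f > ln_f_final:
        for _ in range(sweep):
            i = random.randrange(n_spins)
            E_new = E + (1 - 2 * spins[i])   # flipping spin i changes E by +/-1
            dln = ln_g[E] - ln_g[E_new]      # accept with min(1, g(E)/g(E_new))
            if dln >= 0 or random.random() < math.exp(dln):
                spins[i] ^= 1
                E = E_new
            ln_g[E] += ln_f
            hist[E] += 1
        if min(hist) > flatness * (sum(hist) / len(hist)):
            hist = [0] * (n_spins + 1)
            ln_f *= 0.5                      # refine the modification factor
    return ln_g
```

The unknown-a-priori parameters the abstract criticizes are visible here: the flatness threshold, the sweep length, and the refinement schedule for ln_f all have to be chosen by hand.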

  12. Do Parents Recognize Autistic Deviant Behavior Long before Diagnosis? Taking into Account Interaction Using Computational Methods

    PubMed Central

    Saint-Georges, Catherine; Mahdhaoui, Ammar; Chetouani, Mohamed; Cassel, Raquel S.; Laznik, Marie-Christine; Apicella, Fabio; Muratori, Pietro; Maestro, Sandra; Muratori, Filippo; Cohen, David

    2011-01-01

    Background: To assess whether taking into account interaction synchrony would help to better differentiate autism (AD) from intellectual disability (ID) and typical development (TD) in family home movies of infants aged less than 18 months, we used computational methods. Methodology and Principal Findings: First, we analyzed interactive sequences extracted from home movies of children with AD (N = 15), ID (N = 12), or TD (N = 15) through the Infant and Caregiver Behavior Scale (ICBS). Second, discrete behaviors between baby (BB) and Care Giver (CG) co-occurring in less than 3 seconds were selected as single interactive patterns (or dyadic events) for analysis of the two directions of interaction (CG→BB and BB→CG) by group and semester. To do so, we used a Markov assumption, a Generalized Linear Mixed Model, and non-negative matrix factorization. Compared to TD children, BBs with AD exhibit a growing deviant development of interactive patterns whereas those with ID rather show an initial delay of development. Parents of AD and ID do not differ very much from parents of TD when responding to their child. However, when initiating interaction, parents use more touching and regulation up behaviors as early as the first semester. Conclusion: When studying interactive patterns, deviant autistic behaviors appear before 18 months. Parents seem to feel the lack of interactive initiative and responsiveness of their babies and try to increasingly supply soliciting behaviors. Thus we stress that credence should be given to parents' intuition as they recognize, long before diagnosis, the pathological process through the interactive pattern with their child. PMID:21818320

  13. Post choice information integration as a causal determinant of confidence: Novel data and a computational account.

    PubMed

    Moran, Rani; Teodorescu, Andrei R; Usher, Marius

    2015-05-01

Confidence judgments are pivotal in the performance of daily tasks and in many domains of scientific research including the behavioral sciences, psychology and neuroscience. Positive resolution, i.e., the positive correlation between choice-correctness and choice-confidence, is a critical property of confidence judgments, which justifies their ubiquity. In the current paper, we study the mechanism underlying confidence judgments and their resolution by investigating the source of the inputs for the confidence-calculation. We focus on the intriguing debate between two families of confidence theories. According to single stage theories, confidence is based on the same information that underlies the decision (or on some other aspect of the decision process), whereas according to dual stage theories, confidence is affected by novel information that is collected after the decision was made. In three experiments, we support the case for dual stage theories by showing that post-choice perceptual availability manipulations exert a causal effect on confidence-resolution in the decision followed by confidence paradigm. These findings establish the role of RT2, the duration of the post-choice information-integration stage, as a prime dependent variable that theories of confidence should account for. We then present a novel list of robust empirical patterns ('hurdles') involving RT2 to guide further theorizing about confidence judgments. Finally, we present a unified computational dual stage model for choice, confidence and their latencies, namely the collapsing confidence boundary model (CCB). According to CCB, a diffusion-process choice is followed by a second evidence-integration stage towards a stochastic collapsing confidence boundary. Despite its simplicity, CCB clears the entire list of hurdles.
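The two-stage structure of a CCB-style model can be caricatured in a few lines. Everything below is an assumption-laden sketch (parameter values, the linear form of the collapsing bound, and the confidence read-out are illustrative choices, not the authors' parameterization): a diffusion to a choice boundary gives the choice and RT1, then integration continues until a collapsing confidence bound is reached, giving RT2 and confidence:

```python
import numpy as np

rng = np.random.default_rng(7)

def ccb_trial(drift=0.5, noise=1.0, a=1.0, conf_b0=1.5, collapse=2.0,
              dt=0.005):
    """One trial: (1) drift-diffusion to +/-a yields choice and RT1;
    (2) evidence keeps accumulating until its distance from the chosen
    boundary exceeds a linearly collapsing bound, yielding RT2 and a
    confidence reading (signed evidence relative to the choice)."""
    x, t = 0.0, 0.0
    while abs(x) < a:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    choice = 1 if x > 0 else -1
    t2, b = 0.0, conf_b0
    while b > 0 and abs(x - choice * a) < b:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t2 += dt
        b = conf_b0 - collapse * t2      # the confidence bound collapses in time
    return choice, t, t2, x * choice
```

Because post-choice evidence keeps favoring the correct option, simulated confidence is higher on correct than on error trials, which is the positive resolution the abstract emphasizes.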

  14. Local brain atrophy accounts for functional activity differences in normal aging.

    PubMed

    Kalpouzos, Grégoria; Persson, Jonas; Nyberg, Lars

    2012-03-01

    Functional brain imaging studies of normal aging typically show age-related under- and overactivations during episodic memory tasks. Older individuals also undergo nonuniform gray matter volume (GMv) loss. Thus, age differences in functional brain activity could at least in part result from local atrophy. We conducted a series of voxel-based blood oxygen level-dependent (BOLD)-GMv analyses to highlight whether age-related under- and overrecruitment was accounted for by GMv changes. Occipital GMv loss accounted for underrecruitment at encoding. Efficiency reduction of sensory-perceptual mechanisms underpinned by these areas may partly be due to local atrophy. At retrieval, local GMv loss accounted for age-related overactivation of left dorsolateral prefrontal cortex, but not of left dorsomedial prefrontal cortex. Local atrophy also accounted for age-related overactivation in left lateral parietal cortex. Activity in these frontoparietal regions correlated with performance in the older group. Atrophy in the overrecruited regions was modest in comparison with other regions as shown by a between-group voxel-based morphometry comparison. Collectively, these findings link age-related structural differences to age-related functional under- as well as overrecruitment.

  15. Walk a Mile in My Shoes: Stakeholder Accounts of Testing Experience with a Computer-Administered Test

    ERIC Educational Resources Information Center

    Fox, Janna; Cheng, Liying

    2015-01-01

    In keeping with the trend to elicit multiple stakeholder responses to operational tests as part of test validation, this exploratory mixed methods study examines test-taker accounts of an Internet-based (i.e., computer-administered) test in the high-stakes context of proficiency testing for university admission. In 2013, as language testing…

  16. Computer-Intensive Algebra and Students' Conceptual Knowledge of Functions.

    ERIC Educational Resources Information Center

    O'Callaghan, Brian R.

    1998-01-01

    Describes a research project that examined the effects of the Computer-Intensive Algebra (CIA) and traditional algebra curricula on students' (N=802) understanding of the function concept. Results indicate that CIA students achieved a better understanding of functions and were better at the components of modeling, interpreting, and translating.…

  17. Positive Wigner Functions Render Classical Simulation of Quantum Computation Efficient

    NASA Astrophysics Data System (ADS)

    Mari, A.; Eisert, J.

    2012-12-01

    We show that quantum circuits where the initial state and all the following quantum operations can be represented by positive Wigner functions can be classically efficiently simulated. This is true both for continuous-variable and for discrete-variable systems in odd prime dimensions, two cases which are treated on entirely the same footing. Because Clifford and Gaussian operations preserve the positivity of the Wigner function, our result generalizes the Gottesman-Knill theorem. Our algorithm provides a way of sampling from the output distribution of a computation or a simulation, including efficient sampling from an approximate output distribution in the case of sampling imperfections for initial states, gates, or measurements. In this sense, this work highlights the role of the positive Wigner function as separating classically efficiently simulable systems from those that are potentially universal for quantum computing and simulation, and it emphasizes the role of negativity of the Wigner function as a computational resource.

  18. PERFORMANCE OF A COMPUTER-BASED ASSESSMENT OF COGNITIVE FUNCTION MEASURES IN TWO COHORTS OF SENIORS

    PubMed Central

    Espeland, Mark A.; Katula, Jeffrey A.; Rushing, Julia; Kramer, Arthur F.; Jennings, Janine M.; Sink, Kaycee M.; Nadkarni, Neelesh K.; Reid, Kieran F.; Castro, Cynthia M.; Church, Timothy; Kerwin, Diana R.; Williamson, Jeff D.; Marottoli, Richard A.; Rushing, Scott; Marsiske, Michael; Rapp, Stephen R.

    2013-01-01

    Background Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials; however, its performance in these settings has not been systematically evaluated. Design The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool for assessing memory performance and executive functioning. The Lifestyle Interventions and Independence for Seniors (LIFE) investigators incorporated this battery in a full-scale multicenter clinical trial (N=1635). We describe relationships that test scores have with those from interviewer-administered cognitive function tests and risk factors for cognitive deficits, and describe performance measures (completeness, intra-class correlations). Results Computer-based assessments of cognitive function had consistent relationships across the pilot and full-scale trial cohorts with interviewer-administered assessments of cognitive function, age, and a measure of physical function. In the LIFE cohort, their external validity was further demonstrated by associations with other risk factors for cognitive dysfunction: education, hypertension, diabetes, and physical function. Acceptable levels of data completeness (>83%) were achieved on all computer-based measures; however, rates of missing data were higher among older participants (odds ratio=1.06 for each additional year; p<0.001) and those who reported no current computer use (odds ratio=2.71; p<0.001). Intra-class correlations among clinics were at least as low (ICC≤0.013) as for interviewer measures (ICC≤0.023), reflecting good standardization. All cognitive measures loaded onto the first principal component (global cognitive function), which accounted for 40% of the overall variance. Conclusion Our results support the use of computer-based tools for assessing cognitive function in multicenter clinical trials of older individuals. PMID:23589390

  19. Computer method for identification of boiler transfer functions

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1971-01-01

    An iterative computer method is described for identifying boiler transfer functions using frequency response data. An objective penalized performance measure and a nonlinear minimization technique are used to cause the locus of points generated by a transfer function to resemble the locus of points obtained from frequency response measurements. Different transfer functions can be tried until a satisfactory empirical transfer function for the system is found. To illustrate the method, examples and results are given from a study of inlet-impedance measurements of a single-tube forced-flow boiler with inserts.
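
    The general idea can be sketched as follows (a hedged illustration: the first-order model, the synthetic data, and the crude pattern search here are stand-ins for the paper's penalized performance measure and nonlinear minimization): adjust transfer-function parameters until the model's frequency-response locus matches the measured one.

```python
def first_order(K, tau, w):
    """Frequency response of G(s) = K / (1 + tau*s) at s = j*w."""
    return K / (1 + tau * 1j * w)

# Synthetic "measured" frequency response from a known system.
K_true, tau_true = 2.0, 0.5
freqs = [0.1 * (i + 1) for i in range(40)]              # rad/s
data = [first_order(K_true, tau_true, w) for w in freqs]

def fit_error(K, tau):
    """Sum of squared deviations between model and measured loci."""
    return sum(abs(first_order(K, tau, w) - d) ** 2
               for w, d in zip(freqs, data))

# Crude pattern search over (K, tau): move to the best neighbor,
# and halve the step whenever no neighbor improves the fit.
best, step = (1.0, 1.0), 0.5
while step > 1e-4:
    K0, t0 = best
    candidates = [(K0 + i * step, t0 + j * step)
                  for i in (-1, 0, 1) for j in (-1, 0, 1)]
    nxt = min(candidates, key=lambda p: fit_error(*p))
    if nxt == best:
        step /= 2        # no improvement: refine the search
    else:
        best = nxt

print(best)  # recovers (K_true, tau_true)
```

    In practice one would try several candidate transfer-function structures (extra poles, zeros, delays) and keep the one whose fitted locus best resembles the measurements, which is the loop the abstract describes.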

  20. Computational approaches for rational design of proteins with novel functionalities.

    PubMed

    Tiwari, Manish Kumar; Singh, Ranjitha; Singh, Raushan Kumar; Kim, In-Won; Lee, Jung-Kul

    2012-01-01

    Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has numerous potential applications. Protein design algorithms have been applied to design or engineer proteins that fold, fold faster, catalyze, catalyze faster, signal, and adopt preferred conformational states. The field of de novo protein design, although only a few decades old, is beginning to produce exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein designing has recently succeeded at engineering target activities. Here, we review recently reported de novo functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes.

  1. The flight telerobotic servicer: From functional architecture to computer architecture

    NASA Technical Reports Server (NTRS)

    Lumia, Ronald; Fiala, John

    1989-01-01

    After a brief tutorial on the NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) functional architecture, the approach to its implementation is shown. First, interfaces must be defined which are capable of supporting the known algorithms. This is illustrated by considering the interfaces required for the SERVO level of the NASREM functional architecture. After interface definition, the specific computer architecture for the implementation must be determined. This choice is obviously technology dependent. An example illustrating one possible mapping of the NASREM functional architecture to a particular set of computers which implements it is shown. The result of choosing the NASREM functional architecture is that it provides a technology independent paradigm which can be mapped into a technology dependent implementation capable of evolving with technology in the laboratory and in space.

  2. Quantum computing without wavefunctions: time-dependent density functional theory for universal quantum computation.

    PubMed

    Tempel, David G; Aspuru-Guzik, Alán

    2012-01-01

    We prove that the theorems of TDDFT can be extended to a class of qubit Hamiltonians that are universal for quantum computation. The theorems of TDDFT applied to universal Hamiltonians imply that single-qubit expectation values can be used as the basic variables in quantum computation and information theory, rather than wavefunctions. From a practical standpoint this opens the possibility of approximating observables of interest in quantum computations directly in terms of single-qubit quantities (i.e. as density functionals). Additionally, we also demonstrate that TDDFT provides an exact prescription for simulating universal Hamiltonians with other universal Hamiltonians that have different, and possibly easier-to-realize two-qubit interactions. This establishes the foundations of TDDFT for quantum computation and opens the possibility of developing density functionals for use in quantum algorithms.

  3. Quantum Computing Without Wavefunctions: Time-Dependent Density Functional Theory for Universal Quantum Computation

    PubMed Central

    Tempel, David G.; Aspuru-Guzik, Alán

    2012-01-01

    We prove that the theorems of TDDFT can be extended to a class of qubit Hamiltonians that are universal for quantum computation. The theorems of TDDFT applied to universal Hamiltonians imply that single-qubit expectation values can be used as the basic variables in quantum computation and information theory, rather than wavefunctions. From a practical standpoint this opens the possibility of approximating observables of interest in quantum computations directly in terms of single-qubit quantities (i.e. as density functionals). Additionally, we also demonstrate that TDDFT provides an exact prescription for simulating universal Hamiltonians with other universal Hamiltonians that have different, and possibly easier-to-realize two-qubit interactions. This establishes the foundations of TDDFT for quantum computation and opens the possibility of developing density functionals for use in quantum algorithms. PMID:22553483

  4. Outcomes Assessment of Computer-Assisted Behavioral Objectives for Accounting Graduates.

    ERIC Educational Resources Information Center

    Moore, John W.; Mitchem, Cheryl E.

    1997-01-01

    Presents behavioral objectives for accounting students and an outcomes assessment plan with five steps: (1) identification and definition of student competencies; (2) selection of valid instruments; (3) integration of assessment and instruction; (4) determination of levels of assessment; and (5) attribution of improvements to the program. (SK)

  5. 17 CFR 1.32 - Segregated account; daily computation and record.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... commodity and option customers; and (3) the amount of the futures commission merchant's residual interest in... business day, on a currency-by-currency basis: (1) The total amount of customer funds on deposit in segregated accounts on behalf of commodity and option customers; (2) the amount of such customer...

  6. Computational design of proteins with novel structure and functions

    NASA Astrophysics Data System (ADS)

    Wei, Yang; Lu-Hua, Lai

    2016-01-01

    Computational design of proteins is a relatively new field, where scientists search the enormous sequence space for sequences that can fold into a desired structure and perform desired functions. With the computational approach, proteins can be designed, for example, as regulators of biological processes, novel enzymes, or as biotherapeutics. These approaches not only provide valuable information for understanding sequence-structure-function relations in proteins, but also hold promise for applications to protein engineering and biomedical research. In this review, we briefly introduce the rationale for computational protein design, then summarize the recent progress in this field, including de novo protein design, enzyme design, and design of protein-protein interactions. Challenges and future prospects of this field are also discussed. Project supported by the National Basic Research Program of China (Grant No. 2015CB910300), the National High Technology Research and Development Program of China (Grant No. 2012AA020308), and the National Natural Science Foundation of China (Grant No. 11021463).

  7. Robust Computation of Morse-Smale Complexes of Bilinear Functions

    SciTech Connect

    Norgard, G; Bremer, P T

    2010-11-30

    The Morse-Smale (MS) complex has proven to be a useful tool in extracting and visualizing features from scalar-valued data. However, existing algorithms to compute the MS complex are restricted to either piecewise linear or discrete scalar fields. This paper presents a new combinatorial algorithm to compute MS complexes for two dimensional piecewise bilinear functions defined on quadrilateral meshes. We derive a new invariant of the gradient flow within a bilinear cell and use it to develop a provably correct computation which is unaffected by numerical instabilities. This includes a combinatorial algorithm to detect and classify critical points as well as a way to determine the asymptotes of cell-based saddles and their intersection with cell edges. Finally, we introduce a simple data structure to compute and store integral lines on quadrilateral meshes which by construction prevents intersections and enables us to enforce constraints on the gradient flow to preserve known invariants.
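
    For intuition, here is a sketch of the critical-point detection step only (not the paper's full combinatorial algorithm): a bilinear function f(x, y) = a + b·x + c·y + d·x·y has at most one critical point, and it is always a saddle, which is what makes cell-based saddles and their axis-parallel asymptotes tractable.

```python
def bilinear_saddle(a, b, c, d):
    """Interior critical point of f(x, y) = a + b*x + c*y + d*x*y
    on the unit cell [0, 1] x [0, 1].

    The gradient (b + d*y, c + d*x) vanishes at (-c/d, -b/d); the
    Hessian [[0, d], [d, 0]] has determinant -d**2 < 0, so any
    critical point of a bilinear function is a saddle.  Its two
    asymptotes are the axis-parallel lines x = -c/d and y = -b/d.
    """
    if d == 0:
        return None                  # f is linear: no critical point
    x, y = -c / d, -b / d
    inside = 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0
    return x, y, inside

# f(x, y) = x*y - 0.5*x - 0.5*y has a cell-based saddle at (0.5, 0.5).
print(bilinear_saddle(0.0, -0.5, -0.5, 1.0))
```

    Intersecting the two asymptote lines with the cell edges, as the abstract describes, then reduces to one-dimensional interval tests per edge.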

  8. SNAP: A computer program for generating symbolic network functions

    NASA Technical Reports Server (NTRS)

    Lin, P. M.; Alderson, G. E.

    1970-01-01

    The computer program SNAP (symbolic network analysis program) generates symbolic network functions for networks containing R, L, and C type elements and all four types of controlled sources. The program is efficient with respect to program storage and execution time. A discussion of the basic algorithms is presented, together with user's and programmer's guides.

  9. Computer program for calculating and fitting thermodynamic functions

    NASA Technical Reports Server (NTRS)

    Mcbride, Bonnie J.; Gordon, Sanford

    1992-01-01

    A computer program is described which (1) calculates thermodynamic functions (heat capacity, enthalpy, entropy, and free energy) for several optional forms of the partition function, (2) fits these functions to empirical equations by means of a least-squares fit, and (3) calculates, as a function of temperature, heats of formation and equilibrium constants. The program provides several methods for calculating ideal gas properties. For monatomic gases, three methods are given which differ in the technique used for truncating the partition function. For diatomic and polyatomic molecules, five methods are given which differ in the corrections to the rigid-rotator harmonic-oscillator approximation. A method for estimating thermodynamic functions for some species is also given.
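
    As a small illustration of step (1), the textbook harmonic-oscillator vibrational contribution can be derived directly from its partition function (this is the standard formula underlying the rigid-rotator harmonic-oscillator approximation, not the program's actual code or its empirical fitting forms):

```python
import math

def vib_thermo(theta_v, T):
    """Vibrational contributions, in units of the gas constant R, from
    the harmonic-oscillator partition function q = 1/(1 - exp(-x)),
    where x = theta_v / T and theta_v is the characteristic
    vibrational temperature in kelvin.

    Returns (Cp/R, (H - H0)/(R*T), S/R) for the vibrational mode.
    """
    x = theta_v / T
    ex = math.exp(x)
    cp = x * x * ex / (ex - 1.0) ** 2        # heat capacity / R
    h = x / (ex - 1.0)                       # enthalpy / (R*T)
    s = h - math.log(1.0 - math.exp(-x))     # entropy / R
    return cp, h, s
```

    In the classical limit T >> theta_v the heat capacity contribution approaches R (equipartition), and at low temperature it vanishes, the two sanity checks usually applied to such routines.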

  10. A computational account of the production effect: Still playing twenty questions with nature.

    PubMed

    Jamieson, Randall K; Mewhort, D J K; Hockley, William E

    2016-06-01

    People remember words that they read aloud better than words that they read silently, a result known as the production effect. The standing explanation for the production effect is that producing a word renders it distinctive in memory and, thus, memorable at test. By one key account, distinctiveness is defined in terms of sensory feedback. We formalize the sensory-feedback account using MINERVA 2, a standard model of memory. The model accommodates the basic result in recognition as well as the facts that the mixed-list production effect is larger than its pure-list counterpart, that the production effect is robust to forgetting, and that the production and generation effects have additive influences on performance. A final simulation addresses the strength-based account and suggests that it will be more difficult to distinguish a strength-based from a distinctiveness-based explanation than is typically thought. We conclude that the production effect is consistent with existing theory and discuss our analysis in relation to Alan Newell's (1973) classic criticism of psychology and his call for an analysis of psychological principles instead of laboratory phenomena.

  11. Computing the hadronic vacuum polarization function by analytic continuation

    DOE PAGES

    Feng, Xu; Hashimoto, Shoji; Hotzel, Grit; Jansen, Karl; Petschlies, Marcus; Renner, Dru B.

    2013-08-29

    We propose a method to compute the hadronic vacuum polarization function on the lattice at continuous values of photon momenta, bridging between the spacelike and timelike regions. We provide two independent demonstrations to show that this method leads to the desired hadronic vacuum polarization function in Minkowski spacetime. Using the example of the leading-order QCD correction to the muon anomalous magnetic moment, we show that this approach can provide a valuable alternative method for calculations of physical quantities where the hadronic vacuum polarization function enters.

  12. Environment parameters and basic functions for floating-point computation

    NASA Technical Reports Server (NTRS)

    Brown, W. S.; Feldman, S. I.

    1978-01-01

    A language-independent proposal for environment parameters and basic functions for floating-point computation is presented. Basic functions are proposed to analyze, synthesize, and scale floating-point numbers. The model provides a small set of parameters and a small set of axioms along with sharp measures of roundoff error. The parameters and functions can be used to write portable and robust codes that deal intimately with the floating-point representation. Subject to underflow and overflow constraints, a number can be scaled by a power of the floating-point radix inexpensively and without loss of precision. A specific representation for FORTRAN is included.
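
    Python's math module exposes close analogs of the proposed analyze/synthesize/scale primitives for radix-2 floating point (shown here as a sketch of the idea, not the proposal's exact interface): frexp decomposes a number into fraction and exponent, and ldexp rebuilds or rescales it by a power of the radix exactly.

```python
import math

# Analyze: split x into fraction f and base-2 exponent e, so that
# x == f * 2**e with 0.5 <= |f| < 1.
x = 0.15625
f, e = math.frexp(x)
print(f, e)                    # 0.625, -2

# Synthesize: the round trip is exact, with no rounding error.
assert math.ldexp(f, e) == x

# Scale: multiplying by a power of the radix is exact whenever the
# result is representable (i.e., barring overflow and underflow).
y = 3.141592653589793
assert math.ldexp(y, 40) / (2.0 ** 40) == y
```

    This is exactly the property the abstract relies on: subject to underflow and overflow constraints, scaling by a power of the floating-point radix is inexpensive and loses no precision.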

  13. Assessment of cardiac function: magnetic resonance and computed tomography.

    PubMed

    Greenberg, S B

    2000-10-01

    A complete cardiac study requires both anatomic and physiologic evaluation. Cardiac function can be evaluated noninvasively by magnetic resonance imaging (MRI) or ultrafast computed tomography (CT). MRI allows for evaluation of cardiac function by cine gradient echo imaging of the ventricles and flow analysis across cardiac valves and the great vessels. Cine gradient echo imaging is useful for evaluation of cardiac wall motion, ventricular volumes, and ventricular mass. Flow analysis allows for measurement of velocity and flow during the cardiac cycle that reflects cardiac function. Ultrafast CT allows for measurement of cardiac indices similar to those provided by gradient echo imaging of the ventricles.

  14. A Survey of Computational Intelligence Techniques in Protein Function Prediction

    PubMed Central

    Tiwari, Arvind Kumar; Srivastava, Rajeev

    2014-01-01

    With the advancement of high-throughput microarray technologies, the number of proteins with unknown function has grown massively. Protein function prediction is among the most challenging problems in bioinformatics. In the past, homology-based approaches were used to predict protein function, but they fail when a new protein is dissimilar to previously characterized ones. Therefore, to alleviate the problems associated with traditional homology-based approaches, numerous computational intelligence techniques have been proposed in the recent past. This paper presents a state-of-the-art comprehensive review of various computational intelligence techniques for protein function prediction using sequence, structure, protein-protein interaction network, and gene expression data, applied in wide areas such as prediction of DNA and RNA binding sites, subcellular localization, enzyme functions, signal peptides, catalytic residues, nuclear/G-protein coupled receptors, membrane proteins, and pathway analysis from gene expression datasets. This paper also summarizes the results obtained by many researchers who solved these problems using computational intelligence techniques with appropriate datasets to improve prediction performance. The summary shows that ensemble classifiers and integration of multiple heterogeneous data are useful for protein function prediction. PMID:25574395

  15. Integrated command, control, communications and computation system functional architecture

    NASA Technical Reports Server (NTRS)

    Cooley, C. G.; Gilbert, L. E.

    1981-01-01

    The functional architecture for an integrated command, control, communications, and computation system applicable to the command and control portion of the NASA End-to-End Data System is described, including the downlink data processing and analysis functions required to support the uplink processes. The functional architecture is composed of four elements: (1) the functional hierarchy, which provides the decomposition and allocation of the command and control functions to the system elements; (2) the key system features, which summarize the major system capabilities; (3) the operational activity threads, which illustrate the interrelationship between the system elements; and (4) the interfaces, which illustrate those elements that originate or generate data and those elements that use the data. The interfaces also provide a description of the data and the data utilization and access techniques.

  16. Computer-based accountability system (Phase I) for special nuclear materials at Argonne-West

    SciTech Connect

    Ingermanson, R.S.; Proctor, A.E.

    1982-05-01

    An automated accountability system for special nuclear materials (SNM) is under development at Argonne National Laboratory-West. Phase I of the development effort has established the following basic features of the system: a unique file organization allows rapid updating or retrieval of the status of various SNM, based on batch numbers, storage location, serial number, or other attributes. Access to the program is controlled by an interactive user interface that can be easily understood by operators who have had no prior background in electronic data processing. Extensive use of structured programming techniques makes the software package easy to understand and to modify for specific applications. All routines are written in FORTRAN.

  17. A general model for likelihood computations of genetic marker data accounting for linkage, linkage disequilibrium, and mutations.

    PubMed

    Kling, Daniel; Tillmar, Andreas; Egeland, Thore; Mostad, Petter

    2015-09-01

    Several applications necessitate an unbiased determination of relatedness, be it in linkage or association studies or in a forensic setting. An appropriate model to compute the joint probability of some genetic data for a set of persons given some hypothesis about the pedigree structure is then required. The increasing number of markers available through high-density SNP microarray typing and NGS technologies intensifies the demand, where using a large number of markers may lead to biased results due to strong dependencies between closely located loci, both within pedigrees (linkage) and in the population (allelic association or linkage disequilibrium (LD)). We present a new general model, based on a Markov chain for inheritance patterns and another Markov chain for founder allele patterns, the latter allowing us to account for LD. We also demonstrate a specific implementation for X chromosomal markers that allows for computation of likelihoods based on hypotheses of alleged relationships and genetic marker data. The algorithm can simultaneously account for linkage, LD, and mutations. We demonstrate its feasibility using simulated examples. The algorithm is implemented in the software FamLinkX, providing a user-friendly GUI for Windows systems (FamLinkX, as well as further usage instructions, is freely available at www.famlink.se). Our software provides the necessary means to solve cases where no previous implementation exists. In addition, the software has the possibility to perform simulations in order to further study the impact of linkage and LD on computed likelihoods for an arbitrary set of markers.

  18. Computational design of receptor and sensor proteins with novel functions

    NASA Astrophysics Data System (ADS)

    Looger, Loren L.; Dwyer, Mary A.; Smith, James J.; Hellinga, Homme W.

    2003-05-01

    The formation of complexes between proteins and ligands is fundamental to biological processes at the molecular level. Manipulation of molecular recognition between ligands and proteins is therefore important for basic biological studies and has many biotechnological applications, including the construction of enzymes, biosensors, genetic circuits, signal transduction pathways and chiral separations. The systematic manipulation of binding sites remains a major challenge. Computational design offers enormous generality for engineering protein structure and function. Here we present a structure-based computational method that can drastically redesign protein ligand-binding specificities. This method was used to construct soluble receptors that bind trinitrotoluene, L-lactate or serotonin with high selectivity and affinity. These engineered receptors can function as biosensors for their new ligands; we also incorporated them into synthetic bacterial signal transduction pathways, regulating gene expression in response to extracellular trinitrotoluene or L-lactate. The use of various ligands and proteins shows that a high degree of control over biomolecular recognition has been established computationally. The biological and biosensing activities of the designed receptors illustrate potential applications of computational design.

  19. Computer Code For Calculation Of The Mutual Coherence Function

    NASA Astrophysics Data System (ADS)

    Bugnolo, Dimitri S.

    1986-05-01

    We present a computer code in FORTRAN 77 for the calculation of the mutual coherence function (MCF) of a plane wave normally incident on a stochastic half-space. This is an exact result. The user need only input the path length, the wavelength, the outer scale size, and the structure constant. This program may be used to calculate the MCF of a well-collimated laser beam in the atmosphere.
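
    For orientation, the widely used Kolmogorov-spectrum approximation for the plane-wave MCF can be computed directly from the same inputs the abstract lists (path length, wavelength, structure constant). Note this textbook approximation neglects the outer scale, unlike the exact result the paper's code evaluates, so it is a hedged sketch rather than the paper's formula:

```python
import math

def mcf_plane_wave(rho, wavelength, path_length, cn2):
    """Plane-wave mutual coherence function under the standard
    Kolmogorov-spectrum approximation (inertial range; outer scale
    neglected): MCF = exp(-0.5 * D(rho)), where the wave structure
    function is D = 2.914 * k**2 * Cn2 * L * rho**(5/3).

    rho: transverse separation (m); wavelength (m); path_length (m);
    cn2: refractive-index structure constant (m^-2/3).
    """
    k = 2.0 * math.pi / wavelength
    d = 2.914 * k ** 2 * cn2 * path_length * rho ** (5.0 / 3.0)
    return math.exp(-0.5 * d)

# 1 km horizontal path at 1.06 um in moderate turbulence.
for rho in (0.001, 0.01, 0.1):       # separations in meters
    print(rho, mcf_plane_wave(rho, 1.06e-6, 1000.0, 1e-14))
```

    The MCF equals 1 at zero separation and decays monotonically with rho; including an outer scale, as the paper's code does, limits how far the structure function can grow at large separations.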

  20. Computations involving differential operators and their actions on functions

    NASA Technical Reports Server (NTRS)

    Crouch, Peter E.; Grossman, Robert; Larson, Richard

    1991-01-01

    The algorithms derived by Grossmann and Larson (1989) are further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear dynamical systems. These algorithms are extended in two different directions: the algorithms are generalized so that they apply to differential operators on groups and the data structures and algorithms are developed to compute symbolically the action of differential operators on functions. Both of these generalizations are needed for applications.

  1. Efficient quantum algorithm for computing n-time correlation functions.

    PubMed

    Pedernales, J S; Di Candia, R; Egusquiza, I L; Casanova, J; Solano, E

    2014-07-11

    We propose a method for computing n-time correlation functions of arbitrary spinorial, fermionic, and bosonic operators, consisting of an efficient quantum algorithm that encodes these correlations in an initially added ancillary qubit for probe and control tasks. For spinorial and fermionic systems, the reconstruction of arbitrary n-time correlation functions requires the measurement of two ancilla observables, while for bosonic variables time derivatives of the same observables are needed. Finally, we provide examples applicable to different quantum platforms within the framework of linear response theory.

  2. Computational studies of the purine-functionalized graphene sheets

    NASA Astrophysics Data System (ADS)

    Mirzaei, Mahmoud; Yousefi, Mohammad

    2012-10-01

    We performed a computational study to investigate the properties of graphene sheets (S) functionalized by adenine (A) and guanine (G) purine nucleobases. To achieve the purpose of this work, we examined the functionalization of the armchair and zigzag tips of the S model by each of the A and G purines. The results indicated that the optimized properties of the investigated hybrid structures differ depending on the tip of functionalization and the purine nucleobase used. Moreover, the atomic-level properties of the investigated structures were examined by evaluating quadrupole coupling constants (CQ) for the atoms of the optimized structures. The notable trend in the CQ parameters is that the changes in atomic properties are much more significant for functionalization of the zigzag tip by the G nucleobase, which is in agreement with the results of the optimized properties.

  3. Computational aspects of the continuum quaternionic wave functions for hydrogen

    SciTech Connect

    Morais, J.

    2014-10-15

    Over the past few years considerable attention has been given to the role played by the Hydrogen Continuum Wave Functions (HCWFs) in quantum theory. The HCWFs arise via the method of separation of variables for the time-independent Schrödinger equation in spherical coordinates. The HCWFs are composed of products of a radial part involving associated Laguerre polynomials multiplied by exponential factors and an angular part that is the spherical harmonics. In the present paper we introduce the continuum wave functions for hydrogen within quaternionic analysis ((R)QHCWFs), a result which is not available in the existing literature. In particular, the underlying functions are of three real variables and take on values in either the reduced or the full quaternions (identified, respectively, with R^3 and R^4). We prove that the (R)QHCWFs are orthonormal to one another. The representation of these functions in terms of the HCWFs is explicitly given, from which several recurrence formulae for fast computer implementations can be derived. A summary of fundamental properties and further computation of the hydrogen-like atom transforms of the (R)QHCWFs are also discussed. We address all the above and explore some basic facts of the arising quaternionic function theory. As an application, we provide the reader with plot simulations that demonstrate the effectiveness of our approach. (R)QHCWFs are new in the literature and have some consequences that are now under investigation.

  4. A hybrid method for the parallel computation of Green's functions

    SciTech Connect

    Petersen, Dan Erik; Li Song; Stokbro, Kurt; Sorensen, Hans Henrik B.; Hansen, Per Christian; Skelboe, Stig; Darve, Eric

    2009-08-01

    Quantum transport models for nanodevices using the non-equilibrium Green's function method require the repeated calculation of the block tridiagonal part of the Green's and lesser Green's function matrices. This problem is related to the calculation of the inverse of a sparse matrix. Because of the large number of times this calculation needs to be performed, this is computationally very expensive even on supercomputers. The classical approach is based on recurrence formulas which cannot be efficiently parallelized. This practically prevents the solution of large problems with hundreds of thousands of atoms. We propose new recurrences for a general class of sparse matrices to calculate Green's and lesser Green's function matrices which extend formulas derived by Takahashi and others. We show that these recurrences may lead to a dramatically reduced computational cost because they only require computing a small number of entries of the inverse matrix. Then, we propose a parallelization strategy for block tridiagonal matrices which involves a combination of Schur complement calculations and cyclic reduction. It achieves good scalability even on problems of modest size.
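
    The flavor of such recurrences can be seen in the scalar (1×1 block) case: the diagonal of the inverse of a tridiagonal matrix follows from two O(n) determinant sweeps. This is a textbook Usmani-style formula, shown as a hedged analog of the block recurrences, not the authors' Schur-complement/cyclic-reduction algorithm:

```python
def tridiag_inverse_diagonal(a, b, c):
    """Diagonal of the inverse of a tridiagonal matrix with main
    diagonal a (length n), superdiagonal b and subdiagonal c (each
    length n-1), via forward/backward determinant recurrences.

    theta[i] is the leading principal minor of order i, phi[i] the
    trailing minor starting at row i; then
    (T^-1)[i][i] = theta[i-1] * phi[i+1] / theta[n].
    """
    n = len(a)
    theta = [0.0] * (n + 1)
    theta[0], theta[1] = 1.0, a[0]
    for i in range(2, n + 1):
        theta[i] = (a[i - 1] * theta[i - 1]
                    - b[i - 2] * c[i - 2] * theta[i - 2])
    phi = [0.0] * (n + 2)
    phi[n + 1], phi[n] = 1.0, a[n - 1]
    for i in range(n - 1, 0, -1):
        phi[i] = (a[i - 1] * phi[i + 1]
                  - b[i - 1] * c[i - 1] * phi[i + 2])
    det = theta[n]
    return [theta[i - 1] * phi[i + 1] / det for i in range(1, n + 1)]

# Discrete 1-D Laplacian [[2,-1,0],[-1,2,-1],[0,-1,2]]:
print(tridiag_inverse_diagonal([2.0, 2.0, 2.0],
                               [-1.0, -1.0], [-1.0, -1.0]))
```

    After the two sweeps, each requested diagonal entry costs O(1), versus O(n³) for a full inversion; computing only the needed entries of the inverse is precisely the saving the abstract's block recurrences exploit.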

  5. The Functionality of Spontaneous Mimicry and Its Influences on Affiliation: An Implicit Socialization Account.

    PubMed

    Kavanagh, Liam C; Winkielman, Piotr

    2016-01-01

    There is a broad theoretical and empirical interest in spontaneous mimicry, or the automatic reproduction of a model's behavior. Evidence shows that people mimic models they like, and that mimicry enhances liking for the mimic. Yet, there is no satisfactory account of this phenomenon, especially in terms of its functional significance. While affiliation is often cited as the driver of mimicry, we argue that mimicry is primarily driven by a learning process that helps to produce the appropriate bodily and emotional responses to relevant social situations. Because the learning process and the resulting knowledge are implicit, they cannot easily be rejected, criticized, revised, and employed by the learner in a deliberative or deceptive manner. We argue that these characteristics will lead individuals to preferentially mimic ingroup members, whose implicit information is worth incorporating. Conversely, mimicry of the wrong person is costly because individuals will internalize "bad habits," including emotional reactions and mannerisms indicating wrong group membership. This pattern of mimicry, in turn, means that observed mimicry is an honest signal of group affiliation. We propose that the preferences of models for the mimic stem from this true signal value. Further, just like facial expressions, mimicry communicates a genuine disposition when it is truly spontaneous. Consequently, perceivers are attuned to relevant cues such as appropriate timing, fidelity, and selectivity. Our account, while assuming no previously unknown biological endowments, also explains greater mimicry of powerful people, and why affiliation can be signaled by mimicry of seemingly inconsequential behaviors. PMID:27064398

  6. The Functionality of Spontaneous Mimicry and Its Influences on Affiliation: An Implicit Socialization Account

    PubMed Central

    Kavanagh, Liam C.; Winkielman, Piotr

    2016-01-01

    There is a broad theoretical and empirical interest in spontaneous mimicry, or the automatic reproduction of a model’s behavior. Evidence shows that people mimic models they like, and that mimicry enhances liking for the mimic. Yet, there is no satisfactory account of this phenomenon, especially in terms of its functional significance. While affiliation is often cited as the driver of mimicry, we argue that mimicry is primarily driven by a learning process that helps to produce the appropriate bodily and emotional responses to relevant social situations. Because the learning process and the resulting knowledge are implicit, they cannot easily be rejected, criticized, revised, and employed by the learner in a deliberative or deceptive manner. We argue that these characteristics will lead individuals to preferentially mimic ingroup members, whose implicit information is worth incorporating. Conversely, mimicry of the wrong person is costly because individuals will internalize “bad habits,” including emotional reactions and mannerisms indicating wrong group membership. This pattern of mimicry, in turn, means that observed mimicry is an honest signal of group affiliation. We propose that the preferences of models for the mimic stem from this true signal value. Further, just like facial expressions, mimicry communicates a genuine disposition when it is truly spontaneous. Consequently, perceivers are attuned to relevant cues such as appropriate timing, fidelity, and selectivity. Our account, while assuming no previously unknown biological endowments, also explains greater mimicry of powerful people, and why affiliation can be signaled by mimicry of seemingly inconsequential behaviors. PMID:27064398

  8. A radial basis function network approach for the computation of inverse continuous time variant functions.

    PubMed

    Mayorga, René V; Carrera, Jonathan

    2007-06-01

    This paper presents an efficient approach for the fast computation of inverse continuous time variant functions through the proper use of Radial Basis Function Networks (RBFNs). The approach implements RBFNs to compute inverse continuous time variant functions via an overall damped least squares solution that includes a novel null-space vector for singularity prevention. The singularity-avoidance null-space vector is derived from a sufficiency condition for singularity prevention, which leads to certain characterizing matrices and an associated performance index.
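    The "damped least squares solution with a null-space vector" follows the familiar damped pseudo-inverse pattern. A generic sketch in our notation (not the paper's RBFN formulation), where J is a Jacobian, e the task-space error, and z an arbitrary vector projected into the (approximate) null space:

```python
import numpy as np

def damped_ls_step(J, e, lam=0.1, z=None):
    """dq = J^T (J J^T + lam^2 I)^-1 e, plus an optional null-space term
    (I - J^+ J) z. Note: with lam > 0 the projector is only approximate."""
    m, n = J.shape
    JJt = J @ J.T + (lam ** 2) * np.eye(m)
    dq = J.T @ np.linalg.solve(JJt, e)
    if z is not None:
        Jp = J.T @ np.linalg.inv(JJt)          # damped pseudo-inverse
        dq = dq + (np.eye(n) - Jp @ J) @ z     # move in the null space
    return dq
```

    The damping term lam trades tracking accuracy for bounded step sizes near singularities, which is the role the paper's sufficiency condition formalizes.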

  9. The Melanopic Sensitivity Function Accounts for Melanopsin-Driven Responses in Mice under Diverse Lighting Conditions

    PubMed Central

    Brown, Timothy M.; Allen, Annette E.; al-Enezi, Jazi; Wynne, Jonathan; Schlangen, Luc; Hommes, Vanja; Lucas, Robert J.

    2013-01-01

    In addition to rods and cones, photoreception in mammals extends to a third retinal cell type expressing the photopigment melanopsin. The influences of this novel opsin are widespread, ranging from pupillary and circadian responses to brightness perception, yet established approaches to quantifying the biological effects of light do not adequately account for melanopsin sensitivity. We have recently proposed a novel metric, the melanopic sensitivity function (VZλ), to address this deficiency. Here, we further validate this new measure with a variety of tests based on potential barriers to its applicability identified in the literature or relating to obvious practical benefits. Using electrophysiological approaches and pupillometry, initially in rodless+coneless mice, our data demonstrate that under a very wide range of different conditions (including switching between stimuli with highly divergent spectral content) the VZλ function provides an accurate prediction of the sensitivity of melanopsin-dependent responses. We further show that VZλ provides the best available description of the spectral sensitivity of at least one aspect of the visual response in mice with functional rods and cones: tonic firing activity in the lateral geniculate nuclei. Together, these data establish VZλ as an important new approach for light measurement with widespread practical utility. PMID:23301090
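    Any sensitivity function of this kind is applied as a wavelength-weighted integral of the stimulus spectrum. The sketch below uses a Gaussian stand-in for VZλ purely for illustration; the real melanopic function is tabulated (peaking near 480 nm), not Gaussian:

```python
import numpy as np

wavelengths = np.arange(380.0, 781.0, 1.0)   # nm, 1 nm bins
# Hypothetical Gaussian stand-in for the tabulated melanopic curve.
vz = np.exp(-0.5 * ((wavelengths - 480.0) / 40.0) ** 2)

def melanopic_weighted(spectrum):
    """Riemann sum of spectral power weighted by the sensitivity curve (1 nm bins)."""
    return float((spectrum * vz).sum() * 1.0)
```

    With a curve like this, a short-wavelength stimulus of given power yields a far larger weighted sum than a long-wavelength one, which is the sense in which the metric predicts melanopsin-driven responses.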

  10. Why do interracial interactions impair executive function? A resource depletion account.

    PubMed

    Richeson, Jennifer A; Trawalter, Sophie

    2005-06-01

    Three studies investigated the veracity of a resource depletion account of the impairment of inhibitory task performance after interracial contact. White individuals engaged in either an interracial or same-race interaction, then completed an ostensibly unrelated Stroop color-naming test. In each study, the self-regulatory demands of the interaction were either increased (Study 1) or decreased (Studies 2 and 3). Results revealed that increasing the self-regulatory demands of an interracial interaction led to greater Stroop interference compared with control, whereas reducing self-regulatory demands led to less Stroop interference. Manipulating self-regulatory demands did not affect Stroop performance after same-race interactions. Taken together, the present studies point to resource depletion as the likely mechanism underlying the impairment of cognitive functioning after interracial dyadic interactions. PMID:15982114

  11. 49 CFR 1242.78 - Employees performing clerical and accounting functions, and loss and damage claims processing...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 9 2010-10-01 2010-10-01 false Employees performing clerical and accounting functions, and loss and damage claims processing (accounts XX-55-76 and XX-55-78). 1242.78 Section 1242.78 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION BOARD, DEPARTMENT OF TRANSPORTATION...

  12. A three-dimensional model of mammalian tyrosinase active site accounting for loss of function mutations.

    PubMed

    Schweikardt, Thorsten; Olivares, Concepción; Solano, Francisco; Jaenicke, Elmar; García-Borrón, José Carlos; Decker, Heinz

    2007-10-01

    Tyrosinases are the first and rate-limiting enzymes in the synthesis of melanin pigments responsible for colouring hair, skin and eyes. Mutation of tyrosinases often decreases melanin production resulting in albinism, but the effects are not always understood at the molecular level. Homology modelling of mouse tyrosinase based on recently published crystal structures of non-mammalian tyrosinases provides an active site model accounting for loss-of-function mutations. According to the model, the copper-binding histidines are located in a helix bundle comprising four densely packed helices. A loop containing residues M374, S375 and V377 connects the CuA and CuB centres, with the peptide oxygens of M374 and V377 serving as hydrogen acceptors for the NH-groups of the imidazole rings of the copper-binding His367 and His180. Therefore, this loop is essential for the stability of the active site architecture. A double substitution (374)MS(375) → (374)GG(375) or a single M374G mutation leads to a local perturbation of the protein matrix at the active site affecting the orientation of the H367 side chain, which may be unable to bind CuB reliably, resulting in loss of activity. The model also accounts for loss of function in two naturally occurring albino mutations, S380P and V393F. The hydroxyl group in S380 contributes to the correct orientation of M374, and the substitution of V393 for a bulkier phenylalanine sterically impedes correct side chain packing at the active site. Therefore, our model explains the mechanistic necessity for conservation of not only active site histidines but also adjacent amino acids in tyrosinase. PMID:17850513

  13. Computational approaches for inferring the functions of intrinsically disordered proteins

    PubMed Central

    Varadi, Mihaly; Vranken, Wim; Guharoy, Mainak; Tompa, Peter

    2015-01-01

    Intrinsically disordered proteins (IDPs) are ubiquitously involved in cellular processes and often implicated in human pathological conditions. The critical biological roles of these proteins, despite not adopting a well-defined fold, encouraged structural biologists to revisit their views on the protein structure-function paradigm. Unfortunately, investigating the characteristics and describing the structural behavior of IDPs is far from trivial, and inferring the function(s) of a disordered protein region remains a major challenge. Computational methods have proven particularly relevant for studying IDPs: on the sequence level their dependence on distinct characteristics determined by the local amino acid context makes sequence-based prediction algorithms viable and reliable tools for large scale analyses, while on the structure level the in silico integration of fundamentally different experimental data types is essential to describe the behavior of a flexible protein chain. Here, we offer an overview of the latest developments and computational techniques that aim to uncover how protein function is connected to intrinsic disorder. PMID:26301226
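    Sequence-based disorder predictors exploit the biased composition of IDPs, classically low hydropathy combined with high net charge. A toy illustration: the Kyte-Doolittle hydropathy scale below is real, but the weighting is hypothetical; real predictors are trained models, not a two-term formula:

```python
# Kyte-Doolittle hydropathy scale (real values).
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
      'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
      'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
      'Y': -1.3, 'V': 4.2}
CHARGED = set('DEKR')

def disorder_score(seq):
    """Higher score = more disorder-prone (toy charge/hydropathy rule)."""
    h = sum(KD[aa] for aa in seq) / len(seq)          # mean hydropathy
    q = sum(aa in CHARGED for aa in seq) / len(seq)   # charged-residue fraction
    return q - 0.1 * h                                # hypothetical weighting
```

    A charge-rich, hydrophilic stretch scores high; a hydrophobic stretch that would fold scores low.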

  14. Accounting for dye diffusion and orientation when relating FRET measurements to distances: three simple computational methods.

    PubMed

    Walczewska-Szewc, Katarzyna; Corry, Ben

    2014-06-28

    Förster resonance energy transfer (FRET) allows, in principle, the structural changes of biological systems to be revealed by monitoring distributions and distance fluctuations between parts of individual molecules. However, because flexible probes usually have to be attached to the macromolecule to conduct these experiments, the measurements suffer from uncertainty in probe positions and orientations. One way to address this issue is to use molecular dynamics (MD) simulations to explicitly model the likely positions of the probes, but this is still not widely accessible because of the large computational effort required. Here we compare three simpler methods that can potentially replace MD simulations in FRET data interpretation. In the first, the volume accessible for dye movement is calculated using a fast geometrical algorithm. The next method, adapted from the analysis of electron paramagnetic resonance studies, utilises a library of rotamers describing probe conformations. The last method uses preliminary MD simulations of fluorescent dyes in solution to identify all conformational states of the dyes and overlays these on the macromolecular system. A comparison of these methods on the simple system of dye-labelled polyproline shows that, when there is no interaction between the dye and the host, all three give results comparable with MD simulations but require much less time. Differences between these three methods and their ability to compete with MD simulations in the analysis of real experiments are demonstrated and discussed using the examples of cold shock protein and leucine transporter systems.
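    The first method, the geometric accessible-volume calculation, can be caricatured as a grid search: keep the grid points within the linker length of the attachment site that do not clash with the macromolecule. A deliberately simplified sketch with hypothetical radii; real accessible-volume algorithms also model linker width and dye shape:

```python
import numpy as np

def accessible_volume(attach, obstacles, linker=10.0, dye_r=3.5, grid=1.0):
    """Grid points within `linker` of the attachment site that do not clash
    with any obstacle atom; returns the points and the estimated volume."""
    attach = np.asarray(attach, float)
    r = np.arange(-linker, linker + grid, grid)
    X, Y, Z = np.meshgrid(r, r, r, indexing='ij')
    pts = np.stack([X, Y, Z], axis=-1).reshape(-1, 3) + attach
    ok = np.linalg.norm(pts - attach, axis=1) <= linker      # within linker reach
    for obs in obstacles:
        ok &= np.linalg.norm(pts - np.asarray(obs, float), axis=1) > dye_r
    return pts[ok], float(ok.sum()) * grid ** 3
```

    With no obstacles the estimate approaches the volume of a sphere of radius `linker`; each obstacle atom carves out an exclusion zone.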

  15. On the Hydrodynamic Function of Sharkskin: A Computational Investigation

    NASA Astrophysics Data System (ADS)

    Boomsma, Aaron; Sotiropoulos, Fotis

    2014-11-01

    Denticles (placoid scales) are small structures that cover the epidermis of some sharks. The hydrodynamic function of denticles is unclear. Because they resemble riblets, they have been thought to passively reduce skin friction, for which there is some experimental evidence. Others have experimentally shown that denticles increase skin friction and have hypothesized that denticles act as vortex generators to delay separation. To help clarify their function, we use high-resolution large eddy and direct numerical simulations, with an immersed boundary method, to simulate flow patterns past Mako Short Fin denticles and to calculate the drag force on them. Simulations are carried out for denticles placed in a canonical turbulent boundary layer as well as in the vicinity of a separation bubble. The computed results elucidate the three-dimensional structure of the flow around denticles and provide insights into the hydrodynamic function of sharkskin.

  16. Multiple von Neumann computers: an evolutionary approach to functional emergence.

    PubMed

    Suzuki, H

    1997-01-01

    A novel system composed of multiple von Neumann computers and an appropriate problem environment is proposed and simulated. Each computer has a memory to store the machine instruction program, and when a program is executed, a series of machine codes in the memory is sequentially decoded, leading to register operations in the central processing unit (CPU). By means of these operations, the computer not only can handle its generally used registers but also can read and write the environmental database. Simulation is driven by genetic algorithms (GAs) performed on the population of program memories. Mutation and crossover create program diversity in the memory, and selection facilitates the reproduction of appropriate programs. Through these evolutionary operations, advantageous combinations of machine codes are created and fixed in the population one by one, and the higher function, which enables the computer to calculate an appropriate number from the environment, finally emerges in the program memory. In the latter half of the article, the performance of GAs on this system is studied. Under different sets of parameters, the evolutionary speed, which is determined by the time until the domination of the final program, is examined and the conditions for faster evolution are clarified. At an intermediate mutation rate and at an intermediate population size, crossover helps create novel advantageous sets of machine codes and evidently accelerates optimization by GAs.
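    The evolutionary loop described (mutation and crossover over program memories, selection of fitter programs) is the standard GA skeleton. A minimal sketch on bit strings, with a "ones-max" fitness standing in for the environmental task:

```python
import random

def evolve(fitness, genome_len=16, pop_size=30, gens=60, mut_rate=0.02, seed=0):
    """Minimal GA: tournament selection, one-point crossover, per-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(gens):
        def pick():
            a, b = rng.sample(pop, 2)                   # binary tournament
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, genome_len)          # one-point crossover
            child = [1 - g if rng.random() < mut_rate else g
                     for g in p1[:cut] + p2[cut:]]      # per-bit mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve(sum)   # fitness = number of 1-bits ("ones-max")
```

    Here `sum` counts 1-bits, so selection should drive genomes toward all ones, the analogue of the "appropriate program" dominating the memory population.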

  17. 95Mo nuclear magnetic resonance parameters of molybdenum hexacarbonyl from density functional theory: appraisal of computational and geometrical parameters.

    PubMed

    Cuny, Jérôme; Sykina, Kateryna; Fontaine, Bruno; Le Pollès, Laurent; Pickard, Chris J; Gautier, Régis

    2011-11-21

    Solid-state (95)Mo nuclear magnetic resonance (NMR) properties of molybdenum hexacarbonyl have been computed using density functional theory (DFT) based methods. Both quadrupolar coupling and chemical shift parameters were evaluated and compared with parameters of high precision determined using single-crystal (95)Mo NMR experiments. Within a molecular approach, the effects of major computational parameters, i.e. basis set, exchange-correlation functional, treatment of relativity, have been evaluated. Except for the isotropic parameter of both chemical shift and chemical shielding, computed NMR parameters are more sensitive to geometrical variations than computational details. Relativistic effects do not play a crucial part in the calculations of such parameters for the 4d transition metal, in particular isotropic chemical shift. Periodic DFT calculations were tackled to measure the influence of neighbouring molecules on the crystal structure. These effects have to be taken into account to compute accurate solid-state (95)Mo NMR parameters even for such an inorganic molecular compound.

  18. Efficiently accounting for ion correlations in electrokinetic nanofluidic devices using density functional theory.

    PubMed

    Gillespie, Dirk; Khair, Aditya S; Bardhan, Jaydeep P; Pennathur, Sumita

    2011-07-15

    The electrokinetic behavior of nanofluidic devices is dominated by the electrical double layers at the device walls. Therefore, accurate, predictive models of double layers are essential for device design and optimization. In this paper, we demonstrate that density functional theory (DFT) of electrolytes is an accurate and computationally efficient method for computing finite ion size effects and the resulting ion-ion correlations that are neglected in classical double layer theories such as Poisson-Boltzmann. Because DFT is derived from liquid-theory thermodynamic principles, it is ideal for nanofluidic systems with small spatial dimensions, high surface charge densities, high ion concentrations, and/or large ions. Ion-ion correlations are expected to be important in these regimes, leading to nonlinear phenomena such as charge inversion, wherein more counterions adsorb at the wall than is necessary to neutralize its surface charge, leading to a second layer of co-ions. We show that DFT, unlike other theories that do not include ion-ion correlations, can predict charge inversion and other nonlinear phenomena that lead to qualitatively different current densities and ion velocities for both pressure-driven and electro-osmotic flows. We therefore propose that DFT can be a valuable modeling and design tool for nanofluidic devices as they become smaller and more highly charged.
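    For contrast with DFT, the classical Poisson-Boltzmann double layer for a 1:1 electrolyte has a closed-form Gouy-Chapman potential profile; this is the point-ion baseline whose neglected correlations DFT restores. A sketch assuming the room-temperature thermal voltage:

```python
import numpy as np

def gouy_chapman_potential(x, psi0, debye_len):
    """Potential psi(x) a distance x from a charged wall in a 1:1 electrolyte,
    classical Poisson-Boltzmann with point ions (room temperature assumed)."""
    kT_e = 25.7e-3                                    # thermal voltage, volts
    gamma = np.tanh(psi0 / (4.0 * kT_e))
    return 4.0 * kT_e * np.arctanh(gamma * np.exp(-x / debye_len))
```

    The profile decays monotonically from the surface potential, so phenomena like charge inversion are impossible here by construction; that is precisely what finite ion size and ion-ion correlations in DFT add.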

  19. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow...

  20. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow...

  1. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow...

  2. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow...

  3. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow...

  4. Confidence and psychosis: a neuro-computational account of contingency learning disruption by NMDA blockade.

    PubMed

    Vinckier, F; Gaillard, R; Palminteri, S; Rigoux, L; Salvador, A; Fornito, A; Adapa, R; Krebs, M O; Pessiglione, M; Fletcher, P C

    2016-07-01

    A state of pathological uncertainty about environmental regularities might represent a key step in the pathway to psychotic illness. Early psychosis can be investigated in healthy volunteers under ketamine, an NMDA receptor antagonist. Here, we explored the effects of ketamine on contingency learning using a placebo-controlled, double-blind, crossover design. During functional magnetic resonance imaging, participants performed an instrumental learning task, in which cue-outcome contingencies were probabilistic and reversed between blocks. Bayesian model comparison indicated that in such an unstable environment, reinforcement learning parameters are downregulated depending on confidence level, an adaptive mechanism that was specifically disrupted by ketamine administration. Drug effects were underpinned by altered neural activity in a fronto-parietal network, which reflected the confidence-based shift to exploitation of learned contingencies. Our findings suggest that an early characteristic of psychosis lies in a persistent doubt that undermines the stabilization of behavioral policy resulting in a failure to exploit regularities in the environment. PMID:26055423
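    The confidence-gated downregulation of learning can be caricatured in a few lines: a two-armed instrumental learner whose learning rate shrinks as the value difference (a crude confidence proxy) grows. Everything here (rates, softmax temperature, the confidence proxy) is a hypothetical illustration, not the authors' computational model:

```python
import math
import random

def run_agent(p_reward=(0.8, 0.2), trials=300, beta=3.0, seed=1):
    """Q-learning with a confidence-scaled learning rate (toy illustration)."""
    rng = random.Random(seed)
    q = [0.5, 0.5]
    choices = []
    for _ in range(trials):
        confidence = abs(q[0] - q[1])                 # crude confidence proxy
        alpha = 0.1 + 0.2 * (1.0 - confidence)        # downregulate when confident
        p0 = 1.0 / (1.0 + math.exp(-beta * (q[0] - q[1])))   # softmax, two arms
        a = 0 if rng.random() < p0 else 1
        r = 1.0 if rng.random() < p_reward[a] else 0.0
        q[a] += alpha * (r - q[a])                    # delta-rule update
        choices.append(a)
    return q, choices

q, choices = run_agent()
```

    The abstract's proposal corresponds to the alpha line: as confidence stabilizes, learning slows and the agent exploits; blocking that gating (e.g., holding alpha high) keeps behavior perpetually labile.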

  5. Complete RNA inverse folding: computational design of functional hammerhead ribozymes

    PubMed Central

    Dotu, Ivan; Garcia-Martin, Juan Antonio; Slinger, Betty L.; Mechery, Vinodh; Meyer, Michelle M.; Clote, Peter

    2014-01-01

    Nanotechnology and synthetic biology currently constitute one of the most innovative, interdisciplinary fields of research, poised to radically transform society in the 21st century. This paper concerns the synthetic design of ribonucleic acid molecules, using our recent algorithm, RNAiFold, which can determine all RNA sequences whose minimum free energy secondary structure is a user-specified target structure. Using RNAiFold, we design ten cis-cleaving hammerhead ribozymes, all of which are shown to be functional by a cleavage assay. We additionally use RNAiFold to design a functional cis-cleaving hammerhead as a modular unit of a synthetic larger RNA. Analysis of kinetics on this small set of hammerheads suggests that cleavage rate of computationally designed ribozymes may be correlated with positional entropy, ensemble defect, structural flexibility/rigidity and related measures. Artificial ribozymes have been designed in the past either manually or by SELEX (Systematic Evolution of Ligands by Exponential Enrichment); however, this appears to be the first purely computational design and experimental validation of novel functional ribozymes. RNAiFold is available at http://bioinformatics.bc.edu/clotelab/RNAiFold/. PMID:25209235
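    Inverse folding searches sequence space for sequences whose optimal structure is a given target. A common seeding step, sketched below, parses the target dot-bracket string and forces Watson-Crick complements at paired positions; this is only the initialization of such a search, not RNAiFold's constraint-programming algorithm over the full energy model:

```python
import random

def parse_dotbracket(db):
    """Return the (i, j) base pairs encoded by a dot-bracket string."""
    stack, pairs = [], []
    for i, c in enumerate(db):
        if c == '(':
            stack.append(i)
        elif c == ')':
            pairs.append((stack.pop(), i))
    return pairs

def seed_sequence(db, seed=0):
    """Random sequence with complementary nucleotides forced at paired positions."""
    comp = {'A': 'U', 'U': 'A', 'G': 'C', 'C': 'G'}
    rng = random.Random(seed)
    seq = [rng.choice('ACGU') for _ in db]
    for i, j in parse_dotbracket(db):
        seq[j] = comp[seq[i]]
    return ''.join(seq)
```

    A seed like this is guaranteed compatible with the target pairing, after which the search mutates positions until the minimum free energy structure actually equals the target.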

  6. Computer Modeling of the Earliest Cellular Structures and Functions

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew; Chipot, Christophe; Schweighofer, Karl

    2000-01-01

    In the absence of an extinct or extant record of protocells (the earliest ancestors of contemporary cells), the most direct way to test our understanding of the origin of cellular life is to construct laboratory models of protocells. Such efforts are currently underway in the NASA Astrobiology Program. They are accompanied by computational studies aimed at explaining the self-organization of simple molecules into ordered structures and at developing designs for molecules that perform proto-cellular functions. Many of these functions, such as import of nutrients, capture and storage of energy, and response to changes in the environment, are carried out by proteins bound to membranes. We will discuss a series of large-scale, molecular-level computer simulations which demonstrate (a) how small proteins (peptides) organize themselves into ordered structures at water-membrane interfaces and insert into membranes, (b) how these peptides aggregate to form membrane-spanning structures (e.g., channels), and (c) by what mechanisms such aggregates perform essential proto-cellular functions, such as the transport of protons across cell walls, a key step in cellular bioenergetics. The simulations were performed using the molecular dynamics method, in which Newton's equations of motion for each atom in the system are solved iteratively. The problems of interest required simulations on multi-nanosecond time scales, corresponding to 10^6 to 10^8 time steps.
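    The iterative solution of Newton's equations mentioned above is typically done with the velocity-Verlet scheme. A self-contained sketch, checked here against a harmonic oscillator; the actual simulations use full biomolecular force fields in place of the toy force function:

```python
import numpy as np

def velocity_verlet(pos, vel, force, mass, dt, steps):
    """Standard velocity-Verlet integration of Newton's equations of motion."""
    f = force(pos)
    for _ in range(steps):
        pos = pos + vel * dt + 0.5 * (f / mass) * dt ** 2   # position update
        f_new = force(pos)
        vel = vel + 0.5 * (f + f_new) / mass * dt           # velocity update
        f = f_new
    return pos, vel

# Harmonic oscillator check: with k = m = 1 the period is 2*pi.
k = 1.0
pos, vel = velocity_verlet(np.array([1.0]), np.array([0.0]),
                           lambda x: -k * x, 1.0, 0.01, 628)
```

    After 628 steps of 0.01 (about one period), the oscillator returns close to its initial state; the scheme's good long-time energy behavior is why it dominates MD practice.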

  8. Non-functioning adrenal adenomas discovered incidentally on computed tomography

    SciTech Connect

    Mitnick, J.S.; Bosniak, M.A.; Megibow, A.J.; Naidich, D.P.

    1983-08-01

    Eighteen patients with unilateral non-metastatic non-functioning adrenal masses were studied with computed tomography (CT). Pathological examination in cases revealed benign adrenal adenomas. The others were followed up with serial CT scans and found to show no change in tumor size over a period of six months to three years. On the basis of these findings, the authors suggest certain criteria for a benign adrenal mass, including (a) diameter less than 5 cm, (b) smooth contour, (c) well-defined margin, and (d) no change in size on follow-up. Serial CT scanning can be used as an alternative to surgery in the management of many of these patients.
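    The four suggested criteria translate directly into a conjunctive screening rule; a trivial sketch, illustrative only and not clinical guidance:

```python
def likely_benign(diameter_cm, smooth_contour, well_defined_margin, stable_on_followup):
    """All four criteria from the abstract must hold: <5 cm diameter, smooth
    contour, well-defined margin, and no change in size on follow-up."""
    return (diameter_cm < 5.0 and smooth_contour
            and well_defined_margin and stable_on_followup)
```

    Any single failed criterion removes the mass from the "likely benign" category.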

  9. Enhancing the Reliability of Spectral Correlation Function with Distributed Computing

    NASA Astrophysics Data System (ADS)

    Alfaqawi, M. I.; Chebil, J.; Habaebi, M. H.; Ramli, N.; Mohamad, H.

    2013-12-01

    Various random time series used in signal processing systems are cyclostationary, owing to sinusoidal carriers, pulse trains, periodic motion, or other physical phenomena. The cyclostationarity of a signal can be analysed using the spectral correlation function (SCF). However, computing the SCF is highly complex because of its two-dimensional nature and the long observation time required. The SCF can be computed by various methods, but two are used in practice: the FFT accumulation method (FAM) and the strip spectral correlation algorithm (SSCA). This paper shows the benefit, in both complexity and reliability, of distributing the workload of one processor across several cooperating processors. It finds that, as the reliability of the SCF is increased, the number of cooperating processors needed to reach half the maximum complexity is reduced.
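    The SCF measures correlation between spectral components separated by a cycle frequency alpha. A crude averaged cyclic periodogram in the spirit of FAM, heavily simplified (no channelization, windowing, or down-conversion bookkeeping):

```python
import numpy as np

def scf_estimate(x, alpha, fs=1.0, nfft=256):
    """Averaged cyclic periodogram: correlate spectra of the signal shifted
    by +alpha/2 and -alpha/2, averaged over contiguous segments."""
    n = len(x)
    t = np.arange(n) / fs
    u = x * np.exp(-1j * np.pi * alpha * t)   # shift spectrum by +alpha/2
    v = x * np.exp(+1j * np.pi * alpha * t)   # shift spectrum by -alpha/2
    segs = n // nfft
    acc = np.zeros(nfft, complex)
    for k in range(segs):
        U = np.fft.fft(u[k * nfft:(k + 1) * nfft])
        V = np.fft.fft(v[k * nfft:(k + 1) * nfft])
        acc += U * np.conj(V)
    return acc / (segs * nfft)

t = np.arange(4096)
x = np.cos(2 * np.pi * 0.125 * t)   # sinusoid: cyclic feature at alpha = 0.25
```

    A sinusoid at f0 = 0.125 produces a strong feature at alpha = 2*f0 = 0.25 and essentially none at other cycle frequencies, which is the signature distributed SCF computations search for.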

  10. Toward a functional categorization of slow waves: taking into account past and future events.

    PubMed

    Rösler, F; Heil, M

    1991-05-01

    Ruchkin, Johnson, Mahaffey, and Sutton (1988) presented evidence for a frontal positive/posterior negative late slow wave (SW) which they found to be functionally related to conceptual load, i.e., the difficulty of mental calculation problems increased both its positive and negative parts. In the present study we replicated the paradigm of Ruchkin et al. with some modifications, and we also found that this late SW pattern is actually due to a superimposition of two slow potentials. Our results suggest that one potential (positive at frontopolar scalp) is related to the mental operation of division. However, the other potential (negative over posterior scalp) is not related to the computational task itself but to the expectation of stimuli that follow the task. In addition, we found that memorizing a digit seems to be associated with a positive slow wave over posterior scalp. Altogether, our data suggest that load imposed on working memory is associated with positive slow waves which show a task-specific topography: mental division is associated with a pSW at FPZ, remembering with a pSW at PZ/OZ. On the other hand, the state of stimulus and task anticipation is associated with negative slow waves. The latter reach their amplitude maximum over posterior scalp if visually presented information is anticipated. Our study demonstrates how functionally distinct slow waves can be disentangled by a systematic manipulation of events which either precede or follow the slow wave activity. Moreover, it shows that recording epochs must be of considerable length if the functional significance of slow waves is the objective of research.

  11. Representing and analysing molecular and cellular function using the computer.

    PubMed

    van Helden, J; Naim, A; Mancuso, R; Eldridge, M; Wernisch, L; Gilbert, D; Wodak, S J

    2000-01-01

    Determining the biological function of a myriad of genes, and understanding how they interact to yield a living cell, is the major challenge of the post genome-sequencing era. The complexity of biological systems is such that this cannot be envisaged without the help of powerful computer systems capable of representing and analysing the intricate networks of physical and functional interactions between the different cellular components. In this review we try to provide the reader with an appreciation of where we stand in this regard. We discuss some of the inherent problems in describing the different facets of biological function, give an overview of how information on function is currently represented in the major biological databases, and describe different systems for organising and categorising the functions of gene products. In a second part, we present a new general data model, currently under development, which describes information on molecular function and cellular processes in a rigorous manner. The model is capable of representing a large variety of biochemical processes, including metabolic pathways, regulation of gene expression and signal transduction. It also incorporates taxonomies for categorising molecular entities, interactions and processes, and it offers means of viewing the information at different levels of resolution, and dealing with incomplete knowledge. The data model has been implemented in the database on protein function and cellular processes 'aMAZE' (http://www.ebi.ac.uk/research/pfbp/), which presently covers metabolic pathways and their regulation. Several tools for querying, displaying, and performing analyses on such pathways are briefly described in order to illustrate the practical applications enabled by the model.

  12. Enzymatic Halogenases and Haloperoxidases: Computational Studies on Mechanism and Function.

    PubMed

    Timmins, Amy; de Visser, Sam P

    2015-01-01

    Although halogenated compounds are rare in biology, a number of organisms have developed processes to utilize halogens, and in recent years a string of enzymes have been identified that selectively insert halogen atoms into, for instance, an aliphatic C-H bond. Thus, a number of natural products, including antibiotics, contain halogenated functional groups. This unusual process has great relevance to the chemical industry for stereoselective and regiospecific synthesis of haloalkanes. Currently, however, industry makes little use of biological haloperoxidases and halogenases, and efforts are underway to understand their catalytic mechanism so that their catalytic function can be scaled up. In this review, we summarize experimental and computational studies on the catalytic mechanism of a range of haloperoxidases and halogenases with structurally very different catalytic features and cofactors. This chapter gives an overview of heme-dependent haloperoxidases, nonheme vanadium-dependent haloperoxidases, and flavin adenine dinucleotide-dependent haloperoxidases. In addition, we discuss the S-adenosyl-l-methionine fluorinase and nonheme iron/α-ketoglutarate-dependent halogenases. In particular, computational efforts have been applied extensively to several of these haloperoxidases and halogenases and have given insight into the essential structural features that enable these enzymes to perform the unusual halogen atom transfer to substrates. PMID:26415843

  14. Functional Connectivity’s Degenerate View of Brain Computation

    PubMed Central

    Giron, Alain; Rudrauf, David

    2016-01-01

    Brain computation relies on effective interactions between ensembles of neurons. In neuroimaging, measures of functional connectivity (FC) aim at statistically quantifying such interactions, often to study normal or pathological cognition. Their capacity to reflect a meaningful variety of patterns as expected from neural computation in relation to cognitive processes remains debated. The relative weights of time-varying local neurophysiological dynamics versus static structural connectivity (SC) in the generation of FC as measured remain unsettled. Empirical evidence features mixed results: from little to significant FC variability and correlation with cognitive functions, within and between participants. We used a unified approach combining multivariate analysis, bootstrap, and computational modeling to characterize the potential variety of patterns of FC and SC both qualitatively and quantitatively. Empirical data and simulations from generative models with different dynamical behaviors demonstrated, largely irrespective of FC metrics, that a linear subspace with dimension one or two could explain much of the variability across patterns of FC. In contrast, the variability across BOLD time-courses could not be reduced to such a small subspace. FC appeared to strongly reflect SC and to be partly governed by a Gaussian process. The main differences between simulated and empirical data related to limitations of DWI-based SC estimation (and SC itself could then be estimated from FC). Above and beyond the limited dynamical range of the BOLD signal itself, measures of FC may offer a degenerate representation of brain interactions, with limited access to the underlying complexity. They feature an invariant common core, reflecting the channel capacity of the network as conditioned by SC, with limited, though perhaps meaningful, residual variability. PMID:27736900
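
    As a minimal illustration of the FC measurements discussed above, the sketch below computes the Pearson correlation between simulated time courses in which two "regions" share a common driving input (a crude stand-in for SC-conditioned coupling) while a third is independent. All signals and parameters are invented:

```python
import math
import random

def pearson(x, y):
    """Pearson correlation -- the usual FC estimate between two time courses."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

random.seed(1)
shared = [random.gauss(0, 1) for _ in range(500)]     # common drive ("SC")
r1 = [s + 0.5 * random.gauss(0, 1) for s in shared]   # region 1
r2 = [s + 0.5 * random.gauss(0, 1) for s in shared]   # region 2
r3 = [random.gauss(0, 1) for _ in range(500)]         # unconnected region

fc_coupled = pearson(r1, r2)      # high: inherited from the shared drive
fc_uncoupled = pearson(r1, r3)    # near zero
```

    The structurally coupled pair shows strong FC regardless of the local noise, which is the sense in which measured FC can mostly reflect the static SC core rather than rich time-varying dynamics.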

  15. Towards computational prediction of microRNA function and activity

    PubMed Central

    Ulitsky, Igor; Laurent, Louise C.; Shamir, Ron

    2010-01-01

    While it has been established that microRNAs (miRNAs) play key roles throughout development and are dysregulated in many human pathologies, the specific processes and pathways regulated by individual miRNAs are mostly unknown. Here, we use computational target predictions in order to automatically infer the processes affected by human miRNAs. Our approach improves upon standard statistical tools by addressing specific characteristics of miRNA regulation. Our analysis is based on a novel compendium of experimentally verified miRNA-pathway and miRNA-process associations that we constructed, which can be a useful resource by itself. Our method also predicts novel miRNA-regulated pathways, refines the annotation of miRNAs for which only crude functions are known, and assigns differential functions to miRNAs with closely related sequences. Applying our approach to groups of co-expressed genes allows us to identify miRNAs and genomic miRNA clusters with functional importance in specific stages of early human development. A full list of the predicted mRNA functions is available at http://acgt.cs.tau.ac.il/fame/. PMID:20576699
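
    The standard statistical tool that such approaches improve upon is the hypergeometric over-representation test: the overlap between a miRNA's predicted targets and a pathway's gene set is compared with what chance would produce. A minimal stdlib sketch with invented toy counts (not the paper's corrected statistic):

```python
from math import comb

def enrichment_p(n_universe, n_pathway, n_targets, n_overlap):
    """One-sided hypergeometric tail: the probability of seeing at least
    `n_overlap` pathway genes among `n_targets` predicted targets drawn at
    random from a universe of `n_universe` genes, `n_pathway` of which
    belong to the pathway."""
    total = comb(n_universe, n_targets)
    upper = min(n_pathway, n_targets)
    return sum(comb(n_pathway, k) * comb(n_universe - n_pathway, n_targets - k)
               for k in range(n_overlap, upper + 1)) / total

# Toy numbers: 5 overlapping genes are expected by chance; 15 observed
# indicates enrichment, 5 observed does not.
p_enriched = enrichment_p(5000, 100, 250, 15)
p_null = enrichment_p(5000, 100, 250, 5)
```

    The paper's contribution is precisely that plain tests like this ignore characteristics of miRNA regulation (e.g. target-prediction biases), which their method corrects for.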

  16. The Time Transfer Functions: an efficient tool to compute range, Doppler and astrometric observables

    NASA Astrophysics Data System (ADS)

    Hees, A.; Bertone, S.; Le Poncin-Lafitte, C.; Teyssandier, P.

    2015-12-01

    Determining range, Doppler and astrometric observables is of crucial interest for modelling and analyzing space observations. We recall how these observables can be computed when the travel time of a light ray is known as a function of the positions of the emitter and the receiver for a given instant of reception (or emission). For a long time, such a function, called a reception (or emission) time transfer function, was almost exclusively calculated by integrating the null geodesic equations describing the light rays. However, other methods avoiding such an integration have been considerably developed in the last twelve years. We give a survey of the analytical results obtained with these new methods up to the third order in the gravitational constant G for a mass monopole. We briefly discuss the case of quasi-conjunctions, where higher-order enhanced terms must be taken into account to calculate the effects correctly. We summarize the results obtained at first order in G when the multipole structure and the motion of an axisymmetric body are taken into account. We present some applications to ongoing or future missions like Gaia and Juno. We give a short review of the recent works devoted to numerical estimates of the time transfer functions and their derivatives.
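
    At first order in G for a mass monopole, the time transfer function reduces to the geometric light time plus the well-known Shapiro logarithm. The sketch below evaluates that first-order gravitational term; the constants are rounded and the positions are illustrative:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 299792458.0      # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg

def shapiro_term(x_emit, x_recv, mass=M_SUN):
    """First-order-in-G part of the time transfer function for a mass
    monopole at the origin: the classic Shapiro logarithm (seconds)."""
    r1 = math.dist((0.0, 0.0, 0.0), x_emit)
    r2 = math.dist((0.0, 0.0, 0.0), x_recv)
    r12 = math.dist(x_emit, x_recv)
    return (2 * G * mass / C**3) * math.log((r1 + r2 + r12) / (r1 + r2 - r12))

# Ray grazing the Sun between two points about 1 au away on opposite sides
# (impact parameter ~ one solar radius, 7e8 m):
dt = shapiro_term((7e8, 1.496e11, 0.0), (7e8, -1.496e11, 0.0))
```

    For this near-conjunction geometry the term is on the order of 100 microseconds, and it diverges as the ray approaches the mass, which is why the quasi-conjunction regime needs the enhanced higher-order terms the paper discusses.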

  17. Computational principles of syntax in the regions specialized for language: integrating theoretical linguistics and functional neuroimaging

    PubMed Central

    Ohta, Shinri; Fukui, Naoki; Sakai, Kuniyoshi L.

    2013-01-01

    The nature of computational principles of syntax remains to be elucidated. One promising approach to this problem would be to construct formal and abstract linguistic models that parametrically predict the activation modulations in the regions specialized for linguistic processes. In this article, we review recent advances in theoretical linguistics and functional neuroimaging in the following respects. First, we introduce the two fundamental linguistic operations: Merge (which combines two words or phrases to form a larger structure) and Search (which searches and establishes a syntactic relation of two words or phrases). We also illustrate certain universal properties of human language, and present hypotheses regarding how sentence structures are processed in the brain. Hypothesis I is that the Degree of Merger (DoM), i.e., the maximum depth of merged subtrees within a given domain, is a key computational concept to properly measure the complexity of tree structures. Hypothesis II is that the basic frame of the syntactic structure of a given linguistic expression is determined essentially by functional elements, which trigger Merge and Search. We then present our recent functional magnetic resonance imaging experiment, demonstrating that the DoM is indeed a key syntactic factor that accounts for syntax-selective activations in the left inferior frontal gyrus and supramarginal gyrus. Hypothesis III is that the DoM domain changes dynamically in accordance with iterative Merge applications, the Search distances, and/or task requirements. We confirm that the DoM accounts for activations in various sentence types. Hypothesis III successfully explains activation differences between object- and subject-relative clauses, as well as activations during explicit syntactic judgment tasks. Future research on the computational principles of syntax will further deepen our understanding of uniquely human mental faculties. PMID:24385957
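
    The DoM of Hypothesis I can be illustrated with a toy Merge implementation in which structures are nested pairs and the DoM is read off as the maximum nesting depth. This is an illustrative rendering, not the authors' formal definition:

```python
def merge(x, y):
    """Merge: combine two words or phrases into a larger structure."""
    return (x, y)

def degree_of_merger(node):
    """DoM rendered as maximum nesting depth: the deepest chain of Merge
    applications above any word in the structure."""
    if isinstance(node, tuple):
        return 1 + max(degree_of_merger(child) for child in node)
    return 0

# "[[the dog] [chased [the cat]]]": the deepest word sits under three Mergers.
sentence = merge(merge("the", "dog"), merge("chased", merge("the", "cat")))
dom = degree_of_merger(sentence)
```

    On this rendering, structures that differ in embedding depth (e.g. object- vs subject-relative clauses) receive different DoM values, which is the parametric quantity the fMRI activations were regressed against.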

  19. A new computational account of cognitive control over reinforcement-based decision-making: Modeling of a probabilistic learning task.

    PubMed

    Zendehrouh, Sareh

    2015-11-01

    Recent work in the decision-making field offers an account of dual-system theory for the decision-making process. This theory holds that the process is conducted by two main controllers: a goal-directed system and a habitual system. In the reinforcement learning (RL) domain, habitual behaviors are connected with model-free methods, in which appropriate actions are learned through trial-and-error experiences, whereas goal-directed behaviors are associated with model-based methods of RL, in which actions are selected using a model of the environment. Studies on cognitive control also suggest that during processes like decision-making, some cortical and subcortical structures work in concert to monitor the consequences of decisions and to adjust control according to current task demands. Here a computational model is presented based on dual-system theory and the cognitive control perspective of decision-making. The proposed model is used to simulate human performance on a variant of a probabilistic learning task. The basic proposal is that the brain implements a dual controller, while an accompanying monitoring system detects certain kinds of conflict, including a hypothetical cost conflict. The simulation results address existing theories about two event-related potentials, namely error-related negativity (ERN) and feedback-related negativity (FRN), and explore the best account of them. Based on the results, some testable predictions are also presented. PMID:26339919
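
    The two controllers can be contrasted in a minimal one-state task: the model-based (goal-directed) controller evaluates actions directly from a model of the reward contingencies, while the model-free (habitual) controller caches values learned by trial and error. The task probabilities and learning parameters below are invented, not the paper's task:

```python
import random

random.seed(0)
P_REWARD = {"left": 0.8, "right": 0.2}   # hypothetical reward contingencies

# Goal-directed / model-based controller: evaluate actions from the model.
model_based = {a: p * 1.0 for a, p in P_REWARD.items()}

# Habitual / model-free controller: cache values by trial and error.
q = {a: 0.0 for a in P_REWARD}
alpha = 0.1                               # learning rate
for _ in range(2000):
    a = random.choice(list(P_REWARD))     # sample both actions
    r = 1.0 if random.random() < P_REWARD[a] else 0.0
    q[a] += alpha * (r - q[a])            # temporal-difference update
```

    Both controllers converge on the better action here, but the model-based one does so instantly from the model while the model-free one needs experience; the paper's monitoring system arbitrates between such controllers when they conflict.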

  1. Computer-Based Screening of Functional Conformers of Proteins

    PubMed Central

    Montiel Molina, Héctor Marlosti; Millán-Pacheco, César; Pastor, Nina; del Rio, Gabriel

    2008-01-01

    A long-standing goal in biology is to establish the link between function, structure, and dynamics of proteins. Considering that protein function at the molecular level is understood through the ability of proteins to bind to other molecules, the limited structural data on proteins in association with other biomolecules represents a major hurdle to understanding protein function at the structural level. Recent reports show that protein function can be linked to protein structure and dynamics through network centrality analysis, suggesting that the structures of proteins bound to natural ligands may be inferred computationally. In the present work, a new method is described to discriminate protein conformations relevant to the specific recognition of a ligand. The method relies on a scoring system that matches critical residues with central residues in different structures of a given protein, where central residues are those traversed with the highest frequency in networks derived from protein structures. We tested our method on a set of 24 different proteins and more than 260,000 structures of these proteins, either in the absence of a ligand or bound to one. To illustrate the usefulness of our method in the study of the structure/dynamics/function relationship of proteins, we analyzed mutants of the yeast TATA-binding protein with impaired DNA binding. Our results indicate that critical residues for an interaction are preferentially found as central residues of protein structures in complex with a ligand. Thus, our scoring system effectively distinguishes protein conformations relevant to the function of interest. PMID:18463705
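
    Centrality of this "most traversed" kind is commonly computed as betweenness centrality, for which Brandes' algorithm is the standard method; whether this exact measure matches the authors' scoring is an assumption. A sketch on a toy residue-contact graph:

```python
from collections import deque

def betweenness(graph):
    """Node betweenness centrality (Brandes' algorithm) on an unweighted
    graph given as {node: [neighbours]}: nodes lying on many shortest
    paths score highest."""
    bc = {v: 0.0 for v in graph}
    for s in graph:
        stack = []
        preds = {v: [] for v in graph}
        sigma = {v: 0 for v in graph}; sigma[s] = 1
        dist = {v: -1 for v in graph}; dist[s] = 0
        queue = deque([s])
        while queue:                       # BFS counting shortest paths
            v = queue.popleft()
            stack.append(v)
            for w in graph[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        delta = {v: 0.0 for v in graph}
        while stack:                       # back-propagate dependencies
            w = stack.pop()
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc

# Toy 'contact network': a chain in which residue C bridges the two halves.
chain = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"],
         "D": ["C", "E"], "E": ["D"]}
bc = betweenness(chain)
```

    The bridging residue scores highest, mirroring the paper's observation that functionally critical residues tend to be central in ligand-bound conformations.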

  2. Do Executive Function and Impulsivity Predict Adolescent Health Behaviour after Accounting for Intelligence? Findings from the ALSPAC Cohort

    PubMed Central

    Pechey, Rachel; Couturier, Dominique-Laurent; Deary, Ian J.; Marteau, Theresa M.

    2016-01-01

    Objective Executive function, impulsivity, and intelligence are correlated markers of cognitive resource that predict health-related behaviours. It is unknown whether executive function and impulsivity are unique predictors of these behaviours after accounting for intelligence. Methods Data from 6069 participants from the Avon Longitudinal Study of Parents and Children were analysed to investigate whether components of executive function (selective attention, attentional control, working memory, and response inhibition) and impulsivity (parent-rated) measured between ages 8 and 10, predicted having ever drunk alcohol, having ever smoked, fruit and vegetable consumption, physical activity, and overweight at age 13, after accounting for intelligence at age 8 and childhood socioeconomic characteristics. Results Higher intelligence predicted having drunk alcohol, not smoking, greater fruit and vegetable consumption, and not being overweight. After accounting for intelligence, impulsivity predicted alcohol use (odds ratio = 1.10; 99% confidence interval = 1.02, 1.19) and smoking (1.22; 1.11, 1.34). Working memory predicted not being overweight (0.90; 0.81, 0.99). Conclusions After accounting for intelligence, executive function predicts overweight status but not health-related behaviours in early adolescence, whilst impulsivity predicts the onset of alcohol and cigarette use, all with small effects. This suggests overlap between executive function and intelligence as predictors of health behaviour in this cohort, with trait impulsivity accounting for additional variance. PMID:27479488
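
    The odds ratios with 99% confidence intervals reported above are the standard effect measure for binary outcomes. A minimal sketch of computing an OR with a Wald interval from a 2x2 table; the counts below are invented, not ALSPAC data:

```python
import math

def odds_ratio(a, b, c, d, z=2.576):
    """Odds ratio for a 2x2 table (a/b = events and non-events in the
    exposed group, c/d in the unexposed group), with a Wald confidence
    interval on the log-odds scale; z = 2.576 gives the 99% level used
    in the study."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts: 30/100 high-impulsivity drinkers vs 20/100 low-impulsivity.
or_, lo, hi = odds_ratio(30, 70, 20, 80)
```

    An interval that excludes 1 (as for the study's alcohol OR of 1.10, CI 1.02-1.19) indicates a statistically reliable, if small, association.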

  3. Enhancing Commitment or Tightening Control: The Function of Teacher Professional Development in an Era of Accountability

    ERIC Educational Resources Information Center

    Smith, Thomas M.; Rowley, Kristie J.

    2005-01-01

    During the past decade or so, popular rhetoric has shifted away from site-based management and participatory governance as the centerpiece of school reform strategies as accountability and standards-based reform have become the reform mantra of policy makers at all levels of government. Critics of accountability-based reforms have suggested that…

  4. Theoretical studies of structure, function and reactivity of molecules— A personal account

    PubMed Central

    Morokuma, Keiji

    2009-01-01

    Over the last few decades, theoretical/computational studies of the structure, function, and reactivity of molecules have contributed significantly to chemistry by explaining experimental results, deepening understanding of underlying principles, and predicting unknown experimental outcomes. The accuracy needed in chemistry has long been established, but because the cost of such accurate methods scales steeply with molecular size, applying theoretical methods to large molecular systems has remained a major challenge. In the present article we review some examples of such applications. One is the theoretical study of the growth/formation of carbon nanostructures such as fullerenes and carbon nanotubes, using a quantum mechanical molecular dynamics method. For growth of a single-walled carbon nanotube from a transition metal cluster, we have demonstrated continued growth of the attached nanotube, cap formation, and growth from small carbon fragments. For homogeneous catalysis, we present results of studies on N2 activation by Zr complexes. For biomolecular reactions, we use active-site and protein models and show that in some catalyses the protein environment is involved in the reaction and changes the preferred pathway, while in other cases the effect is modest. The review concludes with a perspective. PMID:19444009

  5. An Evolutionary Computation Approach to Examine Functional Brain Plasticity

    PubMed Central

    Roy, Arnab; Campbell, Colin; Bernier, Rachel A.; Hillary, Frank G.

    2016-01-01

    One common research goal in systems neurosciences is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well suited to the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback to this approach is that much information is lost by averaging heterogeneous voxels, and therefore functional relationships within an ROI-pair that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI-pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional-pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional-pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in the strength
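
    A minimal genetic algorithm of the kind underlying such EC procedures can be sketched as follows; the fitness function, binary-mask encoding, and parameters are invented stand-ins, not the paper's voxel-level objective:

```python
import random

random.seed(2)

# Toy fitness: agreement between a candidate voxel mask and a hidden
# 'true' sub-region (stands in for the gain/loss-of-connectivity score).
TRUE_MASK = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1]

def fitness(mask):
    return sum(m == t for m, t in zip(mask, TRUE_MASK))

def evolve(pop_size=30, generations=40, mut=0.1):
    """Minimal genetic algorithm: tournament selection, uniform crossover,
    and bit-flip mutation over binary voxel masks."""
    pop = [[random.randint(0, 1) for _ in TRUE_MASK] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            p1 = max(random.sample(pop, 3), key=fitness)   # tournament
            p2 = max(random.sample(pop, 3), key=fitness)
            child = [a if random.random() < 0.5 else b for a, b in zip(p1, p2)]
            child = [1 - g if random.random() < mut else g for g in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

best = evolve()
```

    The population converges toward masks that match the hidden sub-region; the paper's procedure additionally enforces spatial connectedness of the selected voxels and statistical significance of the connectivity change.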

  7. Computer Modeling of Protocellular Functions: Peptide Insertion in Membranes

    NASA Technical Reports Server (NTRS)

    Rodriquez-Gomez, D.; Darve, E.; Pohorille, A.

    2006-01-01

    Lipid vesicles became the precursors to protocells by acquiring the capabilities needed to survive and reproduce. These include transport of ions, nutrients, and waste products across cell walls, and capture of energy and its conversion into a chemically usable form. In modern organisms these functions are carried out by membrane-bound proteins (about 30% of the genome codes for proteins of this kind). A number of properties of alpha-helical peptides suggest that their associations are excellent candidates for protobiological precursors of proteins. In particular, some simple alpha-helical peptides can aggregate spontaneously and form functional channels. This process can be described conceptually by a three-step thermodynamic cycle: 1 - folding of helices at the water-membrane interface, 2 - helix insertion into the lipid bilayer, and 3 - specific interactions of these helices that result in functional tertiary structures. Although a crucial step, helix insertion has not been adequately studied because of the insolubility and aggregation of hydrophobic peptides. In this work, we use computer simulation methods (molecular dynamics) to characterize the energetics of helix insertion, and we discuss its importance in an evolutionary context. Specifically, helices could self-assemble only if their interactions were sufficiently strong to compensate for the unfavorable free energy of insertion of individual helices into membranes, providing a selection mechanism for protobiological evolution.
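
    The three-step cycle implies that the overall free energy of forming a functional channel is the sum of the step free energies, and assembly is favorable only when helix-helix association outweighs the unfavorable insertion step. A toy calculation with invented free-energy values (not results from the simulations):

```python
import math

R_GAS = 8.314e-3   # gas constant, kJ/(mol*K)
T = 300.0          # temperature, K

# Invented step free energies (kJ/mol) for the three-step cycle:
dG_fold = -10.0      # 1: folding at the water-membrane interface
dG_insert = +15.0    # 2: insertion into the lipid bilayer (unfavorable)
dG_assoc = -20.0     # 3: helix-helix association into a channel

dG_total = dG_fold + dG_insert + dG_assoc   # net free energy of the cycle
K = math.exp(-dG_total / (R_GAS * T))       # corresponding equilibrium constant
# Here dG_total < 0 (K > 1): association compensates the insertion penalty.
```

    This is the selection logic described above: only peptides whose association free energy is strong enough to pay the insertion cost would self-assemble into channels.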

  8. Assessing Executive Function Using a Computer Game: Computational Modeling of Cognitive Processes

    PubMed Central

    Hagler, Stuart; Jimison, Holly B.; Pavel, Misha

    2014-01-01

    Early and reliable detection of cognitive decline is one of the most important challenges of current healthcare. In this project we developed an approach whereby a frequently played computer game can be used to assess a variety of cognitive processes and estimate the results of the pen-and-paper Trail-Making Test (TMT) – known to measure executive function, as well as visual pattern recognition, speed of processing, working memory, and set-switching ability. We developed a computational model of the TMT based on a decomposition of the test into several independent processes, each characterized by a set of parameters that can be estimated from play of a computer game designed to resemble the TMT. An empirical evaluation of the model suggests that it is possible to use the game data to estimate the parameters of the underlying cognitive processes and using the values of the parameters to estimate the TMT performance. Cognitive measures and trends in these measures can be used to identify individuals for further assessment, to provide a mechanism for improving the early detection of neurological problems, and to provide feedback and monitoring for cognitive interventions in the home. PMID:25014944
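
    The decomposition idea can be sketched by modeling trail completion time as a sum of independent per-target processes, e.g. a visual-search/set-switching term plus a Fitts'-law movement term. The decomposition and all parameter values here are illustrative, not the paper's model:

```python
import math

def simulated_tmt_time(targets, search=0.4, fitts_a=0.1, fitts_b=0.15):
    """Toy decomposition of trail completion time into independent
    processes: a fixed visual-search/set-switching cost per target plus
    a Fitts'-law movement time. All parameter values are invented."""
    t = 0.0
    pos = targets[0]
    for nxt in targets[1:]:
        d = math.dist(pos, nxt)
        t += search                                 # find the next target
        t += fitts_a + fitts_b * math.log2(1 + d)   # move the pen/cursor to it
        pos = nxt
    return t

short_trail = [(0, 0), (1, 0), (1, 1)]
long_trail = [(0, 0), (1, 0), (1, 1), (2, 1), (2, 2), (3, 2)]
```

    Fitting such per-process parameters to game play, rather than a single total time, is what lets the model estimate the distinct cognitive components and then predict pen-and-paper TMT performance.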

  9. Optimizing high performance computing workflow for protein functional annotation.

    PubMed

    Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene

    2014-09-10

    Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low-complexity classification algorithm to assign proteins to existing clusters of orthologous groups of proteins. Based on the Position-Specific Iterative Basic Local Alignment Search Tool, the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data. PMID:25313296
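
    The classification stage of such a workflow is embarrassingly parallel: each protein can be assigned to its best-scoring cluster independently, with a score threshold controlling specificity. A toy sketch in which a naive matching score stands in for a PSI-BLAST-style alignment score; the cluster profiles and sequences are invented:

```python
from concurrent.futures import ThreadPoolExecutor

# Invented cluster 'profiles'; real workflows would use PSI-BLAST profiles.
CLUSTERS = {"COG0001": "MKTAYIAK", "COG0002": "MLSRRQFL"}

def score(protein, profile):
    """Crude stand-in for an alignment score: count of matching positions."""
    return sum(a == b for a, b in zip(protein, profile))

def classify(protein, min_score=5):
    """Assign a protein to its best-scoring cluster, or None when no
    cluster clears the threshold (a crude specificity cutoff)."""
    best = max(CLUSTERS, key=lambda c: score(protein, CLUSTERS[c]))
    return best if score(protein, CLUSTERS[best]) >= min_score else None

proteins = ["MKTAYIAR", "MLSRRQFL", "AAAAAAAA"]
with ThreadPoolExecutor(max_workers=4) as pool:
    labels = list(pool.map(classify, proteins))
```

    On a supercomputer the same map-over-proteins pattern is distributed across thousands of cores, which is how millions of proteins become tractable.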

  11. Computational Effective Fault Detection by Means of Signature Functions

    PubMed Central

    Baranski, Przemyslaw; Pietrzak, Piotr

    2016-01-01

    The paper presents a computationally effective method for fault detection. A system’s responses are measured under healthy and ill conditions. These signals are used to calculate so-called signature functions that create a signal space. The current system’s response is projected into this space, and the signal’s location in this space readily determines the fault. No classifier such as a neural network or hidden Markov model is required. The advantage of the proposed method is its efficiency, as computing projections amounts to calculating dot products. Therefore, this method is suitable for real-time embedded systems due to its simplicity and undemanding processing requirements, which permit the use of low-cost hardware and allow rapid implementation. The approach performs well for systems that can be considered linear and stationary. The communication presents an application in which an industrial moulding process is supervised. The machine is composed of forms (dies) whose alignment must be precisely set and maintained during operation. Typically, the process is stopped periodically to check the alignment manually. The applied algorithm allows on-line monitoring of the device by analysing the acceleration signal from a sensor mounted on a die. This enables failures to be detected at an early stage, thus prolonging the machine’s life. PMID:26949942
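
    The projection step described above can be sketched in a few lines: responses recorded under known healthy and faulty conditions serve as signature vectors, and classifying a new response reduces to dot products against those signatures. The signals, labels, and cosine-style normalization below are illustrative assumptions, not the authors' exact procedure.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    # Scale to unit length so the dot product acts as a cosine similarity.
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

def classify(signal, signatures):
    # Project the measured signal onto each signature; the largest
    # projection identifies the closest known condition. No trained
    # classifier is involved, only dot products.
    s = normalize(signal)
    scores = {label: dot(s, normalize(ref)) for label, ref in signatures.items()}
    return max(scores, key=scores.get)

# Hypothetical reference responses (signature vectors):
signatures = {
    "healthy": [1.0, 0.9, 1.1, 1.0],
    "faulty":  [1.0, 0.2, 1.8, 0.3],
}
print(classify([0.95, 0.85, 1.05, 1.0], signatures))  # healthy
```

    Because each classification costs only a handful of multiplications, the approach fits low-cost embedded hardware, matching the efficiency argument in the abstract.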

  12. Imaging local brain function with emission computed tomography

    SciTech Connect

    Kuhl, D.E.

    1984-03-01

    Positron emission tomography (PET) using ¹⁸F-fluorodeoxyglucose (FDG) was used to map local cerebral glucose utilization in the study of local cerebral function. This information differs fundamentally from structural assessment by means of computed tomography (CT). In normal human volunteers, the FDG scan was used to determine the cerebral metabolic response to controlled sensory stimulation and the effects of aging. Cerebral metabolic patterns are distinctive among depressed and demented elderly patients. The FDG scan appears normal in depressed patients and studded with multiple metabolic defects in patients with multiple infarct dementia; in patients with Alzheimer disease, metabolism is particularly reduced in the parietal cortex but only slightly reduced in the caudate and thalamus. The interictal FDG scan effectively detects hypometabolic brain zones that are sites of onset for seizures in patients with partial epilepsy, even though these zones usually appear normal on CT scans. The future prospects of PET are discussed.

  13. A computer vision based candidate for functional balance test.

    PubMed

    Nalci, Alican; Khodamoradi, Alireza; Balkan, Ozgur; Nahab, Fatta; Garudadri, Harinath

    2015-08-01

    Balance in humans is a motor skill based on complex multimodal sensing, processing and control. The ability to maintain balance in activities of daily living (ADL) is compromised by aging, diseases, injuries and environmental factors. The Centers for Disease Control and Prevention (CDC) estimated the cost of falls among older adults at $34 billion in 2013, a figure expected to reach $54.9 billion by 2020. In this paper, we present a brief review of balance impairments followed by subjective and objective tools currently used in clinical settings for human balance assessment. We propose a novel computer vision (CV) based approach as a candidate for a functional balance test. The test takes less than a minute to administer and is expected to be objective, repeatable and highly discriminative in quantifying the ability to maintain posture and balance. We present an informal study with preliminary data from 10 healthy volunteers, and compare performance with a balance assessment system called the BTrackS Balance Assessment Board. Our results show a high degree of correlation with BTrackS. The proposed system promises to be a good candidate for objective functional balance tests and warrants further investigation to assess validity in clinical settings, including acute care, long term care and assisted living care facilities. Our long term goals include non-intrusive approaches to assess balance competence during ADL in independent living environments.

  14. Chemical Visualization of Boolean Functions: A Simple Chemical Computer

    NASA Astrophysics Data System (ADS)

    Blittersdorf, R.; Müller, J.; Schneider, F. W.

    1995-08-01

    We present a chemical realization of the Boolean functions AND, OR, NAND, and NOR with a neutralization reaction carried out in three coupled continuous flow stirred tank reactors (CSTR). Two of these CSTRs are used as input reactors, the third reactor marks the output. The chemical reaction is the neutralization of hydrochloric acid (HCl) with sodium hydroxide (NaOH) in the presence of phenolphthalein as an indicator, which is red in alkaline solutions and colorless in acidic solutions, representing the two binary states 1 and 0, respectively. The time required for a "chemical computation" is determined by the flow rate of reactant solutions into the reactors since the neutralization reaction itself is very fast. While the acid flow to all reactors is equal and constant, the flow rate of NaOH solution controls the states of the input reactors. The connectivities between the input and output reactors determine the flow rate of NaOH solution into the output reactor, according to the chosen Boolean function. Thus the state of the output reactor depends on the states of the input reactors.
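
    The reactor logic can be abstracted numerically as a threshold on the total NaOH inflow reaching the output reactor. The sketch below is a hypothetical abstraction of the connectivities described above, not a model of the chemistry itself; states follow the abstract's convention (1 = alkaline/red, 0 = acidic/colorless).

```python
def output_state(in1, in2, gate):
    # Each input reactor contributes one unit of NaOH inflow when in
    # state 1. A threshold on the total inflow into the output reactor
    # reproduces the chosen Boolean function.
    flow = in1 + in2
    thresholds = {"AND": flow >= 2, "OR": flow >= 1,
                  "NAND": flow < 2, "NOR": flow < 1}
    return int(thresholds[gate])

for gate in ("AND", "OR", "NAND", "NOR"):
    print(gate, [output_state(a, b, gate)
                 for a, b in ((0, 0), (0, 1), (1, 0), (1, 1))])
```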

  15. Astrocytes, Synapses and Brain Function: A Computational Approach

    NASA Astrophysics Data System (ADS)

    Nadkarni, Suhita

    2006-03-01

    Modulation of synaptic reliability is one of the leading mechanisms involved in long-term potentiation (LTP) and long-term depression (LTD) and therefore has implications for information processing in the brain. A recently discovered mechanism for modulating synaptic reliability critically involves the recruitment of astrocytes - star-shaped cells that outnumber the neurons in most parts of the central nervous system. Astrocytes until recently were thought to be subordinate cells merely participating in supporting neuronal functions. New evidence, however, made available by advances in imaging technology has changed the way we envision the role of these cells in synaptic transmission and as modulators of neuronal excitability. We put forward a novel mathematical framework based on the biophysics of bidirectional neuron-astrocyte interactions that quantitatively accounts for two distinct experimental manifestations of the recruitment of astrocytes in synaptic transmission: a) the transformation of a low-fidelity synapse into a high-fidelity synapse and b) enhanced postsynaptic spontaneous currents when astrocytes are activated. Such a framework is not only useful for modeling neuronal dynamics in a realistic environment but also provides a conceptual basis for interpreting experiments. Based on this modeling framework, we explore the role of astrocytes in neuronal network behavior such as synchrony and correlations and compare with experimental data from cultured networks.

  16. HANOIPC3: a computer program to evaluate executive functions.

    PubMed

    Guevara, M A; Rizo, L; Ruiz-Díaz, M; Hernández-González, M

    2009-08-01

    This article describes a computer program (HANOIPC3) based on the Tower of Hanoi game that, by analyzing a series of parameters during execution, allows a fast and accurate evaluation of data related to certain executive functions, especially planning, organizing and problem-solving. This computerized version has only one level of difficulty based on the use of 3 disks, but it stipulates an additional rule: only one disk may be moved at a time, and only to an adjacent peg (i.e., no peg can be skipped over). In the original version--without this stipulation--the minimum number of movements required to complete the task is 7, but under the conditions of this computerized version this increases to 26. HANOIPC3 has three important advantages: (1) it allows a researcher or clinician to modify the rules by adding or removing certain conditions, thus augmenting the utility and flexibility in test execution and the interpretation of results; (2) it provides on-line feedback to subjects about their execution; and (3) it creates a specific file to store the scores that correspond to the parameters obtained during trials. The parameters that can be measured include: latencies (time taken for each movement, measured in seconds), total test time, total number of movements, and the number of correct and incorrect movements. The efficacy and adaptability of this program have been confirmed. PMID:19303660
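
    The 26-move minimum under the adjacent-peg rule can be verified independently of HANOIPC3 with a breadth-first search over board states; this sketch (not part of the program) confirms the figure quoted above.

```python
from collections import deque

def min_moves_adjacent(n):
    # Breadth-first search over all 3**n board states. A state records,
    # for each disk (smallest first), which of the three pegs it is on.
    start, goal = (0,) * n, (2,) * n
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        state, moves = queue.popleft()
        if state == goal:
            return moves
        for disk in range(n):
            # A disk may move only if no smaller disk lies on top of it.
            if any(state[k] == state[disk] for k in range(disk)):
                continue
            # The extra rule: moves are allowed only to an adjacent peg.
            for target in (state[disk] - 1, state[disk] + 1):
                if 0 <= target <= 2 and not any(
                        state[k] == target for k in range(disk)):
                    nxt = state[:disk] + (target,) + state[disk + 1:]
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, moves + 1))

print(min_moves_adjacent(3))  # 26
```

    More generally, the adjacent-peg variant needs 3^n - 1 moves for n disks, versus 2^n - 1 for the unrestricted puzzle (7 for 3 disks, as the abstract notes).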

  17. Computer Modelling of Functional Aspects of Noise in Endogenously Oscillating Neurons

    NASA Astrophysics Data System (ADS)

    Huber, M. T.; Dewald, M.; Voigt, K.; Braun, H. A.; Moss, F.

    1998-03-01

    Membrane potential oscillations are a widespread feature of neuronal activity. When such oscillations operate close to the spike-triggering threshold, noise can become an essential property of spike-generation. Accordingly, we developed a minimal Hodgkin-Huxley-type computer model which includes a noise term. This model accounts for experimental data from quite different cells ranging from mammalian cortical neurons to fish electroreceptors. With slight modifications of the parameters, the model's behavior can be tuned to bursting activity, which additionally allows it to mimic temperature encoding in peripheral cold receptors including transitions to apparently chaotic dynamics as indicated by methods for the detection of unstable periodic orbits. Under all conditions, cooperative effects between noise and nonlinear dynamics can be shown which, beyond stochastic resonance, might be of functional significance for stimulus encoding and neuromodulation.
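
    How a noise term enters such a model can be illustrated with an Euler-Maruyama integration of a simple two-variable oscillator. The FitzHugh-Nagumo equations and parameter values below are a generic stand-in chosen for brevity, not the authors' Hodgkin-Huxley-type model.

```python
import math, random

def simulate(steps=5000, dt=0.05, noise=0.1, seed=1):
    # Euler-Maruyama integration of a FitzHugh-Nagumo-type oscillator
    # with an additive noise term on the fast (voltage-like) variable.
    rng = random.Random(seed)
    v, w = -1.0, 1.0
    trace = []
    for _ in range(steps):
        dv = v - v ** 3 / 3 - w + 0.5        # fast voltage-like variable
        dw = 0.08 * (v + 0.7 - 0.8 * w)      # slow recovery variable
        v += dv * dt + noise * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        w += dw * dt
        trace.append(v)
    return trace

trace = simulate()
# With noise = 0 the model oscillates deterministically; lowering the
# drive current toward threshold makes spike generation noise-dependent.
print(len(trace), round(min(trace), 2), round(max(trace), 2))
```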

  18. Accounting Specialist.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication identifies 20 subjects appropriate for use in a competency list for the occupation of accounting specialist, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 20 units are as follows:…

  19. On computing closed forms for summations. [polynomials and rational functions

    NASA Technical Reports Server (NTRS)

    Moenck, R.

    1977-01-01

    The problem of finding closed forms for a summation involving polynomials and rational functions is considered. A method closely related to Hermite's method for the integration of rational functions is derived. The method expresses the sum of a rational function as a rational function part and a transcendental part involving derivatives of the gamma function.
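
    The flavor of such closed forms can be shown with an elementary telescoping example (a special case, not Moenck's algorithm): the rational summand 1/(k(k+1)) has a purely rational closed form, which exact arithmetic confirms.

```python
from fractions import Fraction

def partial_sum(n):
    # Direct summation of the rational summand 1/(k*(k+1)).
    return sum(Fraction(1, k * (k + 1)) for k in range(1, n + 1))

def closed_form(n):
    # Partial fractions give 1/(k*(k+1)) = 1/k - 1/(k+1), so the
    # sum telescopes to the rational function n/(n+1).
    return Fraction(n, n + 1)

assert all(partial_sum(n) == closed_form(n) for n in range(1, 60))
print(closed_form(10))  # 10/11
```

    Summands whose partial-fraction pieces do not cancel contribute the transcendental part the abstract mentions (polygamma terms), which is where the Hermite-style decomposition comes in.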

  20. General methodology to optimize damping functions to account for charge penetration effects in electrostatic calculations using multicentered multipolar expansions.

    PubMed

    Werneck, Araken S; Filho, Tarcísio M Rocha; Dardenne, Laurent E

    2008-01-17

    We developed a methodology to optimize exponential damping functions to account for charge penetration effects when computing molecular electrostatic properties using the multicentered multipolar expansion method (MME). This methodology is based on the optimization of a damping parameter set using a two-step fast local fitting procedure, with the ab initio (Hartree-Fock/6-31G** and 6-31G**+) electrostatic potential calculated on a set of concentric grids of points as reference. The principal aspect of the methodology is a first local fitting step which generates a focused initial guess to improve the performance of a simplex method, avoiding the use of multiple runs and the choice of initial guesses. Three different strategies for the determination of optimized damping parameters were tested in the following studies: (1) investigation of the error in the calculation of the electrostatic interaction energy for five hydrogen-bonded dimers at standard and nonstandard hydrogen-bonded geometries and at nonequilibrium geometries; (2) calculation of the electrostatic molecular properties (potential and electric field) for eight small molecular systems (methanol, ammonia, water, formamide, dichloromethane, acetone, dimethyl sulfoxide, and acetonitrile) and for the 20 amino acids. Our results show that the methodology performs well not only for small molecules but also for relatively larger molecular systems. The analysis of the distinct parameter sets associated with different optimization strategies shows that (i) a specific parameter set is more suitable and more general for electrostatic interaction energy calculations, with an average absolute error of 0.46 kcal/mol at hydrogen-bond geometries; (ii) a second parameter set is more suitable for electrostatic potential and electric field calculations at and outside the van der Waals (vdW) envelope, with an average error decrease >72% at the vdW surface. 
A more general amino acid damping parameter set was constructed from the
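
    A common form of exponential damping (assumed here for illustration; the paper's optimized functions may differ) multiplies the bare Coulomb term by 1 - exp(-alpha*r), suppressing the interaction at short range where charge penetration matters while leaving the long-range multipolar behavior intact.

```python
import math

def coulomb(q1, q2, r):
    # Bare point-charge interaction (atomic units).
    return q1 * q2 / r

def damped_coulomb(q1, q2, r, alpha):
    # The (1 - exp(-alpha*r)) factor attenuates the interaction at
    # short range, mimicking charge penetration, while converging to
    # the undamped value as r grows. alpha is the damping parameter
    # that would be fitted against ab initio reference data.
    return q1 * q2 / r * (1.0 - math.exp(-alpha * r))

# Short range: the damped value is markedly smaller in magnitude.
print(damped_coulomb(1, -1, 0.5, 2.0), coulomb(1, -1, 0.5))
# Long range: the two agree to high accuracy.
print(damped_coulomb(1, -1, 10.0, 2.0), coulomb(1, -1, 10.0))
```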

  1. Computational Refinement of Functional Single Nucleotide Polymorphisms Associated with ATM Gene

    PubMed Central

    George Priya Doss, C.; Rajith, B.

    2012-01-01

    Background Understanding and predicting the molecular basis of disease is one of the major challenges in modern biology and medicine. SNPs associated with complex disorders can create, destroy, or modify protein coding sites. Single amino acid substitutions in the ATM gene are the most common forms of genetic variation that account for various forms of cancer. However, the extent to which SNPs interfere with gene regulation and affect cancer susceptibility remains largely unknown. Principal findings We analyzed the deleterious nsSNPs associated with the ATM gene based on different computational methods. An integrative scoring system and sequence conservation of amino acid residues were adopted for a priori nsSNP analysis of variants associated with cancer. We further extended our approach to SNPs that could potentially influence protein post-translational modifications in the ATM gene. Significance In the absence of adequate prior reports on the possible deleterious effects of nsSNPs, we have systematically analyzed and characterized the functional variants in both coding and non-coding regions that can alter the expression and function of the ATM gene. In silico characterization of nsSNPs affecting ATM gene function can aid in better understanding of genetic differences in disease susceptibility. PMID:22529920

  2. Texture functions in image analysis: A computationally efficient solution

    NASA Technical Reports Server (NTRS)

    Cox, S. C.; Rose, J. F.

    1983-01-01

    A computationally efficient means for calculating texture measurements from digital images by use of the co-occurrence technique is presented. The calculation of the statistical descriptors of image texture and a solution that circumvents the need for calculating and storing a co-occurrence matrix are discussed. The results show that existing efficient algorithms for calculating sums, sums of squares, and cross products can be used to compute complex co-occurrence relationships directly from the digital image input.
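
    The circumvention of the co-occurrence matrix can be sketched as follows: texture statistics such as the co-occurrence correlation depend only on running sums over pixel pairs at a given offset, so they can be accumulated in one pass over the image. A single offset direction and a toy image are assumed for illustration.

```python
def cooccurrence_correlation(image, dr, dc):
    # Accumulate running sums over all pixel pairs at offset (dr, dc).
    # The correlation texture measure needs only these totals, so the
    # co-occurrence matrix itself is never built or stored.
    n = s_p = s_q = s_pp = s_qq = s_pq = 0
    rows, cols = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                p, q = image[r][c], image[r2][c2]
                n += 1
                s_p += p; s_q += q
                s_pp += p * p; s_qq += q * q; s_pq += p * q
    mu_p, mu_q = s_p / n, s_q / n
    var_p = s_pp / n - mu_p ** 2
    var_q = s_qq / n - mu_q ** 2
    return (s_pq / n - mu_p * mu_q) / (var_p * var_q) ** 0.5

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [2, 2, 3, 3],
       [2, 2, 3, 3]]
print(cooccurrence_correlation(img, 0, 1))  # 10/11 ≈ 0.909 for this image
```

    A symmetric co-occurrence measure would also accumulate the reversed pairs; the running-sum idea is unchanged.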

  3. Multiple, correlated covariates associated with differential item functioning (DIF): Accounting for language DIF when education levels differ across languages

    PubMed Central

    Gibbons, Laura E.; Crane, Paul K.; Mehta, Kala M.; Pedraza, Otto; Tang, Yuxiao; Manly, Jennifer J.; Narasimhalu, Kaavya; Teresi, Jeanne; Jones, Richard N.; Mungas, Dan

    2012-01-01

    Differential item functioning (DIF) occurs when a test item has different statistical properties in subgroups, controlling for the underlying ability measured by the test. DIF assessment is necessary when evaluating measurement bias in tests used across different language groups. However, other factors such as educational attainment can differ across language groups, and DIF due to these other factors may also exist. How to conduct DIF analyses in the presence of multiple, correlated factors remains largely unexplored. This study assessed DIF related to Spanish versus English language in a 44-item object naming test. Data come from a community-based sample of 1,755 Spanish- and English-speaking older adults. We compared simultaneous accounting, a new strategy for handling differences in educational attainment across language groups, with existing methods. Compared to other methods, simultaneously accounting for language- and education-related DIF yielded salient differences in some object naming scores, particularly for Spanish speakers with at least 9 years of education. Accounting for factors that vary across language groups can be important when assessing language DIF. The use of simultaneous accounting will be relevant to other cross-cultural studies in cognition and in other fields, including health-related quality of life. PMID:22900138

  4. Developmental Language Impairment through the Lens of the ICF: An Integrated Account of Children's Functioning

    ERIC Educational Resources Information Center

    Dempsey, Lynn; Skarakis-Doyle, Elizabeth

    2010-01-01

    The conceptual framework of the World Health Organization's International Classification of Functioning, Disability and Health (ICF) has the potential to advance understanding of developmental language impairment (LI) and enhance clinical practice. The framework provides a systematic way of unifying numerous lines of research, which have linked a…

  5. pH-Regulated Mechanisms Account for Pigment-Type Differences in Epidermal Barrier Function

    PubMed Central

    Gunathilake, Roshan; Schurer, Nanna Y.; Shoo, Brenda A.; Celli, Anna; Hachem, Jean-Pierre; Crumrine, Debra; Sirimanna, Ganga; Feingold, Kenneth R.; Mauro, Theodora M.; Elias, Peter M.

    2009-01-01

    To determine whether pigment type determines differences in epidermal function, we studied stratum corneum (SC) pH, permeability barrier homeostasis, and SC integrity in three geographically disparate populations with pigment type I–II versus IV–V skin (Fitzpatrick I–VI scale). Type IV–V subjects showed: (i) lower surface pH (≈0.5 U); (ii) enhanced SC integrity (transepidermal water loss change with sequential tape strippings); and (iii) more rapid barrier recovery than type I–II subjects. Enhanced barrier function could be ascribed to increased epidermal lipid content, increased lamellar body production, and reduced acidity, leading to enhanced lipid processing. Compromised SC integrity in type I–II subjects could be ascribed to increased serine protease activity, resulting in accelerated desmoglein-1 (DSG-1)/corneodesmosome degradation. In contrast, DSG-1-positive CDs persisted in type IV–V subjects, but due to enhanced cathepsin-D activity, SC thickness did not increase. Adjustment of pH of type I–II SC to type IV–V levels improved epidermal function. Finally, dendrites from type IV–V melanocytes were more acidic than those from type I–II subjects, and they transfer more melanosomes to the SC, suggesting that melanosome secretion could contribute to the more acidic pH of type IV–V skin. These studies show marked pigment-type differences in epidermal structure and function that are pH driven. PMID:19177137

  6. pH-regulated mechanisms account for pigment-type differences in epidermal barrier function.

    PubMed

    Gunathilake, Roshan; Schurer, Nanna Y; Shoo, Brenda A; Celli, Anna; Hachem, Jean-Pierre; Crumrine, Debra; Sirimanna, Ganga; Feingold, Kenneth R; Mauro, Theodora M; Elias, Peter M

    2009-07-01

    To determine whether pigment type determines differences in epidermal function, we studied stratum corneum (SC) pH, permeability barrier homeostasis, and SC integrity in three geographically disparate populations with pigment type I-II versus IV-V skin (Fitzpatrick I-VI scale). Type IV-V subjects showed: (i) lower surface pH (approximately 0.5 U); (ii) enhanced SC integrity (transepidermal water loss change with sequential tape strippings); and (iii) more rapid barrier recovery than type I-II subjects. Enhanced barrier function could be ascribed to increased epidermal lipid content, increased lamellar body production, and reduced acidity, leading to enhanced lipid processing. Compromised SC integrity in type I-II subjects could be ascribed to increased serine protease activity, resulting in accelerated desmoglein-1 (DSG-1)/corneodesmosome degradation. In contrast, DSG-1-positive CDs persisted in type IV-V subjects, but due to enhanced cathepsin-D activity, SC thickness did not increase. Adjustment of pH of type I-II SC to type IV-V levels improved epidermal function. Finally, dendrites from type IV-V melanocytes were more acidic than those from type I-II subjects, and they transfer more melanosomes to the SC, suggesting that melanosome secretion could contribute to the more acidic pH of type IV-V skin. These studies show marked pigment-type differences in epidermal structure and function that are pH driven.

  7. 45 CFR 302.20 - Separation of cash handling and accounting functions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... functions. 302.20 Section 302.20 Public Welfare Regulations Relating to Public Welfare OFFICE OF CHILD SUPPORT ENFORCEMENT (CHILD SUPPORT ENFORCEMENT PROGRAM), ADMINISTRATION FOR CHILDREN AND FAMILIES... will maintain methods of administration designed to assure that persons responsible for handling...

  8. Understanding of emotional experience in autism: insights from the personal accounts of high-functioning children with autism.

    PubMed

    Losh, Molly; Capps, Lisa

    2006-09-01

    In this study, the authors investigate emotional understanding in autism through a discourse analytic framework to provide a window into children's strategies for interpreting emotional versus nonemotional encounters and consider the implications for the mechanisms underlying emotional understanding in typical development. Accounts were analyzed for thematic content and discourse structure. Whereas high-functioning children with autism were able to discuss contextually appropriate accounts of simple emotions, their strategies for interpreting all types of emotional (but not nonemotional) experiences differed from those used by typically developing children. High-functioning children with autism were less inclined to organize their emotional accounts in personalized causal-explanatory frameworks and displayed a tendency to describe visually salient elements of experiences seldom observed among comparison children. Findings suggest that children with autism possess less coherent representations of emotional experiences and use alternative strategies for interpreting emotionally evocative encounters. Discussion focuses on the significance of these findings for informing the nature of emotional dysfunction in autism as well as implications for theories of emotional understanding in typical development.

  9. Calibration function for the Orbitrap FTMS accounting for the space charge effect.

    PubMed

    Gorshkov, Mikhail V; Good, David M; Lyutvinskiy, Yaroslav; Yang, Hongqian; Zubarev, Roman A

    2010-11-01

    Ion storage in an electrostatic trap has been implemented with the introduction of the Orbitrap Fourier transform mass spectrometer (FTMS), which demonstrates performance similar to high-field ion cyclotron resonance MS. High mass spectral characteristics resulted in rapid acceptance of the Orbitrap FTMS for Life Sciences applications. The basics of Orbitrap operation are well documented; however, as in any ion trap MS technology, its performance is limited by interactions between the ion clouds. These interactions result in ion cloud couplings, systematic errors in measured masses, interference between ion clouds of different size yet with close m/z ratios, etc. In this work, we have characterized the space-charge effect on the measured frequency for the Orbitrap FTMS, exploring the possibility of achieving sub-ppm levels of mass measurement accuracy (MMA) for peptides over a wide range of total ion populations. As a result of this characterization, we proposed an m/z calibration law for the Orbitrap FTMS that accounts for the total ion population present in the trap during a data acquisition event. Using this law, we were able to achieve a zero-space charge MMA limit of 80 ppb for the commercial Orbitrap FTMS system and a sub-ppm level of MMA over a wide range of total ion populations, with automatic gain control values varying from 10 to 10^7.
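
    The idea of a population-aware calibration law can be sketched as a frequency-to-m/z mapping with a correction term linear in the total ion population N. The functional form and all constants below are illustrative assumptions, not the paper's fitted law; the sketch only shows why ignoring N leaves a systematic ppm-level error.

```python
def mz_model(f, N, A, c):
    # Hypothetical calibration law: the usual inverse-square frequency
    # law A / f**2 scaled by a term linear in the total ion population N
    # (the space-charge correction). A and c are assumed constants.
    return A / f ** 2 * (1.0 + c * N)

A_true, c_true = 1.0e13, 2.0e-12

def observed_frequency(mz, N):
    # Synthetic "measured" frequency for a known m/z: larger ion
    # populations shift the frequency slightly (simulated space charge).
    return (A_true * (1.0 + c_true * N) / mz) ** 0.5

N = 1e7  # large total ion population
f = observed_frequency(500.0, N)
uncorrected = mz_model(f, 0, A_true, 0.0)   # ignores the ion population
corrected = mz_model(f, N, A_true, c_true)  # accounts for it
ppm_error = abs(uncorrected - 500.0) / 500.0 * 1e6
print(f"uncorrected: {ppm_error:.1f} ppm off; corrected: {abs(corrected - 500.0):.2e}")
```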

  10. Elusive accountabilities in the HIV scale-up: 'ownership' as a functional tautology.

    PubMed

    Esser, Daniel E

    2014-01-01

    Mounting concerns over aid effectiveness have rendered 'ownership' a central concept in the vocabulary of development assistance for health (DAH). The article investigates the application of both 'national ownership' and 'country ownership' in the broader development discourse as well as more specifically in the context of internationally funded HIV/AIDS interventions. Based on comprehensive literature reviews, the research uncovers a multiplicity of definitions, most of which either divert from or plainly contradict the concept's original meaning and intent. During the last 10 years in particular, it appears that both public and private donors have advocated for greater 'ownership' by recipient governments and countries to hedge their own political risk rather than to work towards greater inclusion of the latter in agenda-setting and programming. Such politically driven semantic dynamics suggest that the concept's salience is not merely a discursive reflection of globally skewed power relations in DAH but a deliberate exercise in limiting donors' accountabilities. At the same time, the research also finds evidence that this conceptual contortion frames current global public health scholarship, thus adding further urgency to the need to critically re-evaluate the international political economy of global public health from a discursive perspective.

  11. Accounting for heterogeneity among treatment sites and time trends in developing crash modification functions.

    PubMed

    Sacchi, Emanuele; Sayed, Tarek

    2014-11-01

    Collision modification factors (CMFs) are commonly used to quantify the impact of safety countermeasures. The CMFs obtained from observational before-after (BA) studies are usually estimated by averaging the safety impact (i.e., index of effectiveness) for a group of treatment sites. The heterogeneity among the treatment locations, in terms of their characteristics, and the effect of this heterogeneity on safety treatment effectiveness are usually ignored. This is in contrast to treatment evaluations in other fields like medical statistics, where variations in the magnitude (or in the direction) of response to the same treatment given to different patients are considered. This paper introduces an approach for estimating a CMFunction from BA safety studies that accounts for variable treatment location characteristics (heterogeneity). The treatment-site heterogeneity was incorporated into the CMFunction using fixed-effects and random-effects regression models. In addition to heterogeneity, the paper also advocates the use of CMFunctions with a time variable to acknowledge that the safety treatment (intervention) effects do not occur instantaneously but are spread over future time. This is achieved using non-linear intervention (Koyck) models, developed within a hierarchical full Bayes (FB) context. To demonstrate the approach, a case study is presented to evaluate the safety effectiveness of the "Signal Head Upgrade Program" recently implemented in the city of Surrey (British Columbia, Canada), where signal visibility was improved at several urban signalized intersections. The results demonstrated the importance of considering treatment-site heterogeneity and time trends when developing CMFunctions. PMID:25033279
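
    The Koyck (geometric distributed lag) structure mentioned above spreads a treatment's effect over time rather than applying it all at once; a minimal sketch, with assumed illustrative values for the immediate effect beta and decay rate lam:

```python
def koyck_effects(beta, lam, periods):
    # Koyck (geometric distributed lag) model: the incremental effect of
    # an intervention t periods after implementation is beta * lam**t,
    # so the impact decays geometrically instead of being instantaneous.
    return [beta * lam ** t for t in range(periods)]

# Assumed values: a 12% immediate collision reduction decaying at rate 0.6.
effects = koyck_effects(beta=-0.12, lam=0.6, periods=12)
total = sum(effects)
# The cumulative effect converges to beta / (1 - lam) as t grows.
print(round(total, 4), round(-0.12 / (1 - 0.6), 4))
```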

  13. 17 CFR 1.32 - Reporting of segregated account computation and details regarding the holding of futures customer...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... and Exchange Commission (17 CFR 241.15c3-1(c)(2)(vi)), held for the same futures customer's account... accordance with Rule 240.15c3-1(c)(2)(vi) of the Securities and Exchange Commission (17 CFR 240.15c3-1(c)(2... in a safekeeping account with a bank, trust company, derivatives clearing organization, or...

  14. Challenges in computational studies of enzyme structure, function and dynamics.

    PubMed

    Carvalho, Alexandra T P; Barrozo, Alexandre; Doron, Dvir; Kilshtain, Alexandra Vardi; Major, Dan Thomas; Kamerlin, Shina Caroline Lynn

    2014-11-01

In this review we give an overview of the field of computational enzymology. We start by describing the birth of the field, with emphasis on the work of the 2013 Nobel Laureates in Chemistry. We then present key features of the state of the art in the field, showing what theory, accompanied by experiment, has taught us so far about enzymes. We also briefly describe computational methods, such as quantum mechanics-molecular mechanics approaches, reaction coordinate treatments, and free energy simulation approaches. We conclude by discussing open questions and challenges.

  15. Basic processes in reading: a critical review of pseudohomophone effects in reading aloud and a new computational account.

    PubMed

    Reynolds, Michael; Besner, Derek

    2005-08-01

There are pervasive lexical influences on the time that it takes to read aloud novel letter strings that sound like real words (e.g., brane from brain). However, the literature presents a complicated picture, given that the time taken to read aloud such items is sometimes shorter and sometimes longer than a control string (e.g., frane) and that the time to read aloud is sometimes affected by the frequency of the base word and other times is not. In the present review, we first organize these data to show that there is considerably more consistency than has previously been acknowledged. We then consider six different accounts that have been proposed to explain various aspects of these data. Four of them immediately fail in one way or another. The remaining two accounts may be able to explain these findings, but they either make counterintuitive assumptions or invoke a novel mechanism solely to explain these findings. A new account is advanced that is able to explain all of the effects reviewed here and has none of the problems associated with the other accounts. According to this account, different types of lexical knowledge are used when pseudohomophones and nonword controls are read aloud in mixed and pure lists. This account is then implemented in Coltheart, Rastle, Perry, Langdon, and Ziegler's (2001) dual route cascaded model in order to provide an existence proof that it accommodates all of the effects, while retaining the ability to simulate three standard effects seen in nonword reading aloud. PMID:16447376

  16. Do Children's Executive Functions Account for Associations Between Early Autonomy-Supportive Parenting and Achievement Through High School?

    PubMed Central

    Bindman, Samantha W.; Pomerantz, Eva M.; Roisman, Glenn I.

    2015-01-01

    This study evaluated whether the positive association between early autonomy-supportive parenting and children's subsequent achievement is mediated by children's executive functions. Using observations of mothers’ parenting from the NICHD Study of Early Child Care and Youth Development (N = 1,306), analyses revealed that mothers’ autonomy support over the first 3 years of life predicted enhanced executive functions (i.e., inhibition, delay of gratification, and sustained attention) during the year prior to kindergarten and academic achievement in elementary and high school even when mothers’ warmth and cognitive stimulation, as well as other factors (e.g., children's early general cognitive skills and mothers’ educational attainment) were covaried. Mediation analyses demonstrated that over and above other attributes (e.g., temperament), children's executive functions partially accounted for the association between early autonomy-supportive parenting and children's subsequent achievement. PMID:26366009

  17. Dose spread functions in computed tomography: A Monte Carlo study

    PubMed Central

    Boone, John M.

    2009-01-01

Purpose: Current CT dosimetry employing CTDI methodology has come under fire in recent years, partially in response to the increasing width of collimated x-ray fields in modern CT scanners. This study was conducted to provide a better understanding of the radiation dose distributions in CT. Methods: Monte Carlo simulations were used to evaluate radiation dose distributions along the z axis arising from CT imaging in cylindrical phantoms. Mathematical cylinders were simulated with compositions of water, polymethyl methacrylate (PMMA), and polyethylene. Cylinder diameters from 10 to 50 cm were studied. X-ray spectra typical of several CT manufacturers (80, 100, 120, and 140 kVp) were used. In addition to no bow tie filter, the head and body bow tie filters from modern General Electric and Siemens CT scanners were evaluated. Each cylinder was divided into three concentric regions of equal volume such that the energy deposited is proportional to dose for each region. Two additional dose assessment regions, central and edge locations 10 mm in diameter, were included for comparisons to CTDI100 measurements. Dose spread functions (DSFs) were computed for a wide range of imaging parameters. Results: DSFs generally exhibit a biexponential falloff from the z=0 position. For a very narrow primary beam input (≪1 mm), DSFs demonstrated significant low amplitude long range scatter dose tails. For body imaging conditions (30 cm diameter in water), the DSF at the center showed ∼160 mm at full width at tenth maximum (FWTM), while at the edge the FWTM was ∼80 mm. Polyethylene phantoms exhibited wider DSFs than PMMA or water, as did higher tube voltages in any material. The FWTM were 80, 180, and 250 mm for 10, 30, and 50 cm phantom diameters, respectively, at the center in water at 120 kVp with a typical body bow tie filter. Scatter to primary dose ratios (SPRs) increased with phantom diameter from 4 at the center (1 cm diameter) for a 16 cm diameter cylinder to ∼12.5 for a
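The biexponential falloff described above lends itself to a small numerical illustration. In this hedged sketch the amplitudes and decay lengths are invented for illustration, not the study's fitted values, and the FWTM is found by simple outward scanning:

```python
import math

def dsf(z, a1=0.8, b1=20.0, a2=0.2, b2=120.0):
    """Toy biexponential dose spread function along z (mm). The two
    amplitudes and decay lengths are invented for illustration."""
    return a1 * math.exp(-abs(z) / b1) + a2 * math.exp(-abs(z) / b2)

def fwtm(profile, z_max=1000.0, dz=0.01):
    """Full width at tenth maximum of a symmetric profile peaked at z = 0,
    found by scanning outward until the profile drops below f(0)/10."""
    tenth = profile(0.0) / 10.0
    z = 0.0
    while profile(z) > tenth and z < z_max:
        z += dz
    return 2.0 * z
```

For a single exponential a·exp(-|z|/b), this scan recovers the analytic FWTM of 2b·ln(10), which is a useful sanity check on the toy model.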

  18. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    NASA Technical Reports Server (NTRS)

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. Information Management System hardware configuration as directly influencing the executive design is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.

  19. Do general intellectual functioning and socioeconomic status account for performance on the Children's Gambling Task?

    PubMed Central

    Mata, Fernanda; Sallum, Isabela; Miranda, Débora M.; Bechara, Antoine; Malloy-Diniz, Leandro F.

    2013-01-01

Studies that use the Iowa Gambling Task (IGT) and its age-appropriate versions as indices of affective decision-making during childhood and adolescence have demonstrated significant individual differences in scores. Our study investigated the association between general intellectual functioning and socioeconomic status (SES) and its effect on the development of affective decision-making in preschoolers by using a computerized version of the Children's Gambling Task (CGT). We administered the CGT and the Columbia Mental Maturity Scale (CMMS) to 137 Brazilian children between the ages of 3 and 5 years old to assess their general intellectual functioning. We also used the Brazilian Criterion of Economic Classification (CCEB) to assess their SES. Age differences between 3- and 4-year-olds, but not between 4- and 5-year-olds, confirmed the results obtained by Kerr and Zelazo (2004), indicating the rapid development of affective decision-making during the preschool period. Both 4- and 5-year-olds performed significantly above chance on blocks 3, 4, and 5 of the CGT, whereas 3-year-olds' mean scores did not differ from chance. We found that general intellectual functioning was not related to affective decision-making. On the other hand, our findings showed that children with high SES performed better on the last block of the CGT in comparison to children with low SES, which indicates that children from the former group seem more likely to use the information about the gain/loss aspects of the decks to efficiently choose cards from the advantageous deck throughout the task. PMID:23760222

  20. A Functional Analytic Approach to Computer-Interactive Mathematics

    ERIC Educational Resources Information Center

    Ninness, Chris; Rumph, Robin; McCuller, Glen; Harrison, Carol; Ford, Angela M.; Ninness, Sharon K.

    2005-01-01

    Following a pretest, 11 participants who were naive with regard to various algebraic and trigonometric transformations received an introductory lecture regarding the fundamentals of the rectangular coordinate system. Following the lecture, they took part in a computer-interactive matching-to-sample procedure in which they received training on…

  1. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.

    1983-01-01

Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F. Other mathematical functions include the Bessel function I₀, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
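The routines themselves are Fortran; purely as an illustration of two of the listed distributions, here is a minimal Python sketch using the standard closed-form expressions (not the report's code):

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Gaussian (normal) cumulative distribution function via the
    error function: Phi(x) = (1 + erf((x - mu) / (sigma * sqrt(2)))) / 2."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def weibull_cdf(x, shape, scale=1.0):
    """Weibull cumulative distribution function,
    F(x) = 1 - exp(-(x / scale) ** shape) for x > 0."""
    if x <= 0.0:
        return 0.0
    return 1.0 - math.exp(-((x / scale) ** shape))
```

As with the report's routines, these are general-purpose functions meant to be called from a user-written main program.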

  2. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.H.

    1980-01-01

Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F. Other mathematical functions include the Bessel function I₀, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)

  3. 'A Leg to Stand On' by Oliver Sacks: a unique autobiographical account of functional paralysis.

    PubMed

    Stone, Jon; Perthen, Jo; Carson, Alan J

    2012-09-01

Oliver Sacks, the well-known neurologist and writer, published his fourth book, 'A Leg to Stand On', in 1984, following an earlier essay, 'The Leg', in 1982. The book described his recovery after a fall in a remote region of Norway in which he injured his leg. Following surgery to reattach his quadriceps muscle, he experienced an emotional period in which his leg no longer felt a part of his body, and he struggled to regain his ability to walk. Sacks attributed the experience to a neurologically determined disorder of body-image and body-ego induced by peripheral injury. In the first edition of his book Sacks explicitly rejected the diagnosis of 'hysterical paralysis' as it was then understood, although he approached this diagnosis more closely in subsequent revisions. In this article we propose that, in the light of better understanding of functional neurological symptoms, Sacks' experiences deserve to be reappraised as a unique insight into a genuinely experienced functional/psychogenic leg paralysis following injury.

  4. EDF: Computing electron number probability distribution functions in real space from molecular wave functions

    NASA Astrophysics Data System (ADS)

    Francisco, E.; Pendás, A. Martín; Blanco, M. A.

    2008-04-01

Given an N-electron molecule and an exhaustive partition of real space (ℝ³) into m arbitrary regions Ω₁, Ω₂, …, Ωₘ (⋃ᵢ₌₁ᵐ Ωᵢ = ℝ³), the edf program computes all the probabilities P(n₁, n₂, …, nₘ) of having exactly n₁ electrons in Ω₁, n₂ electrons in Ω₂, …, and nₘ electrons (n₁ + n₂ + ⋯ + nₘ = N) in Ωₘ. Each Ωᵢ may correspond to a single basin (atomic domain) or several such basins (functional group). In the latter case, each atomic domain must belong to a single Ωᵢ. The program can manage both single- and multi-determinant wave functions which are read in from an aimpac-like wave function description (.wfn) file (T.A. Keith et al., The AIMPAC95 programs, http://www.chemistry.mcmaster.ca/aimpac, 1995). For multi-determinant wave functions a generalization of the original .wfn file has been introduced. The new format is completely backwards compatible, adding to the previous structure a description of the configuration interaction (CI) coefficients and the determinants of correlated wave functions. Besides the .wfn file, edf only needs the overlap integrals between the molecular orbitals (MOs) over all the atomic domains. After the P(n₁, n₂, …, nₘ) probabilities are computed, edf obtains from them several magnitudes relevant to chemical bonding theory, such as average electronic populations and localization/delocalization indices. Regarding spin, edf may be used in two ways: with or without a splitting of the P(n₁, n₂, …, nₘ) probabilities into α and β spin components. Program summary. Program title: edf. Catalogue identifier: AEAJ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAJ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 5387. No. of bytes in distributed program, including test data, etc.: 52 381. Distribution format: tar.gz. Programming language: Fortran 77. Computer
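In the limiting case of statistically independent electrons, the probabilities P(n₁, …, nₘ) reduce to a multinomial. The toy sketch below covers only that limiting case; the real edf program handles correlated, multi-determinant wave functions via the domain overlap integrals described above:

```python
from math import comb

def independent_edf(region_probs, counts):
    """Multinomial P(n1, ..., nm) for N statistically independent electrons:
    region_probs[i] is the probability that a single electron is found in
    region Omega_i (the probabilities must sum to 1), and counts[i] is the
    number of electrons assigned to that region."""
    prob, remaining = 1.0, sum(counts)
    for p, n in zip(region_probs, counts):
        prob *= comb(remaining, n) * p ** n
        remaining -= n
    return prob
```

Summing over all ways to distribute the N electrons among the regions yields 1, as any electron-number probability distribution must.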

  5. Autobiographical accounts of sensing in Asperger syndrome and high-functioning autism.

    PubMed

    Elwin, Marie; Ek, Lena; Schröder, Agneta; Kjellin, Lars

    2012-10-01

    Sensory experiences in Asperger syndrome (AS) or high-functioning autism (HFA) were explored by qualitative content analysis of autobiographical texts by persons with AS/HFA. Predetermined categories of hyper- and hyposensitivity were applied to texts. Hypersensitivity consists of strong reactions and heightened apprehension in reaction to external stimuli, sometimes together with overfocused or unselective attention. It was common in vision, hearing, and touch. In contrast, hyposensitivity was frequent in reaction to internal and body stimuli such as interoception, proprioception, and pain. It consists of less registration, discrimination, and recognition of stimuli as well as cravings for specific stimuli. Awareness of the strong impact of sensitivity is essential for creating good environments and encounters in the context of psychiatric and other health care.

  6. Clinical evaluation of cochlear implant sound coding taking into account conjectural masking functions, MP3000™

    PubMed Central

    Buechner, Andreas; Beynon, Andy; Szyfter, Witold; Niemczyk, Kazimierz; Hoppe, Ulrich; Hey, Matthias; Brokx, Jan; Eyles, Julie; Van de Heyning, Paul; Paludetti, Gaetano; Zarowski, Andrzej; Quaranta, Nicola; Wesarg, Thomas; Festen, Joost; Olze, Heidi; Dhooge, Ingeborg; Müller-Deile, Joachim; Ramos, Angel; Roman, Stephane; Piron, Jean-Pierre; Cuda, Domenico; Burdo, Sandro; Grolman, Wilko; Vaillard, Samantha Roux; Huarte, Alicia; Frachet, Bruno; Morera, Constantine; Garcia-Ibáñez, Luis; Abels, Daniel; Walger, Martin; Müller-Mazotta, Jochen; Leone, Carlo Antonio; Meyer, Bernard; Dillier, Norbert; Steffens, Thomas; Gentine, André; Mazzoli, Manuela; Rypkema, Gerben; Killian, Matthijs; Smoorenburg, Guido

    2011-01-01

    Efficacy of the SPEAK and ACE coding strategies was compared with that of a new strategy, MP3000™, by 37 European implant centers including 221 subjects. The SPEAK and ACE strategies are based on selection of 8–10 spectral components with the highest levels, while MP3000 is based on the selection of only 4–6 components, with the highest levels relative to an estimate of the spread of masking. The pulse rate per component was fixed. No significant difference was found for the speech scores and for coding preference between the SPEAK/ACE and MP3000 strategies. Battery life was 24% longer for the MP3000 strategy. With MP3000 the best results were found for a selection of six components. In addition, the best results were found for a masking function with a low-frequency slope of 50 dB/Bark and a high-frequency slope of 37 dB/Bark (50/37) as compared to the other combinations examined of 40/30 and 20/15 dB/Bark. The best results found for the steepest slopes do not seem to agree with current estimates of the spread of masking in electrical stimulation. Future research might reveal if performance with respect to SPEAK/ACE can be enhanced by increasing the number of channels in MP3000 beyond 4–6 and it should shed more light on the optimum steepness of the slopes of the masking functions applied in MP3000. PMID:22251806
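The selection principle, keeping the components whose levels most exceed a masking estimate, can be sketched with a toy linear masking slope. This is an assumption-laden illustration of the principle only, not the MP3000 algorithm, which uses psychoacoustic masking functions specified in dB/Bark:

```python
def select_channels(levels_db, n_select, slope_db=3.0):
    """Toy masking-relative channel selection: keep the n_select channels
    whose level most exceeds the strongest masking contribution from any
    other channel. The linear slope_db-per-channel masker is an invented
    simplification of psychoacoustic spread-of-masking curves."""
    margins = []
    for i, level in enumerate(levels_db):
        mask = max((levels_db[j] - slope_db * abs(i - j)
                    for j in range(len(levels_db)) if j != i),
                   default=float("-inf"))
        margins.append(level - mask)
    ranked = sorted(range(len(levels_db)), key=lambda i: margins[i], reverse=True)
    return sorted(ranked[:n_select])
```

With this selection rule, a weak component far from any strong masker can win a slot over a louder component sitting next to a dominant peak, which is the intended contrast with plain highest-level (SPEAK/ACE-style) selection.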

  7. Using computational models to relate structural and functional brain connectivity

    PubMed Central

    Hlinka, Jaroslav; Coombes, Stephen

    2012-01-01

Modern imaging methods allow a non-invasive assessment of both structural and functional brain connectivity. This has led to the identification of disease-related alterations affecting functional connectivity. The mechanism of how such alterations in functional connectivity arise in a structured network of interacting neural populations is as yet poorly understood. Here we use a modeling approach to explore the way in which this can arise and to highlight the important role that local population dynamics can have in shaping emergent spatial functional connectivity patterns. The local dynamics for a neural population is taken to be of the Wilson–Cowan type, whilst the structural connectivity patterns used, describing long-range anatomical connections, cover both realistic scenarios (from the CoComac database) and idealized ones that allow for more detailed theoretical study. We have calculated graph-theoretic measures of functional network topology from numerical simulations of model networks. The effect of the form of local dynamics on the observed network state is quantified by examining the correlation between structural and functional connectivity. We document a profound and systematic dependence of the simulated functional connectivity patterns on the parameters controlling the dynamics. Importantly, we show that a weakly coupled oscillator theory explaining these correlations and their variation across parameter space can be developed. This theoretical development provides a novel way to characterize the mechanisms for the breakdown of functional connectivity in diseases through changes in local dynamics. PMID:22805059
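The Wilson–Cowan local dynamics referred to above can be sketched for a single excitatory-inhibitory population pair. The coupling weights and drives below are generic illustrative values, not those used in the study, and a plain Euler integrator stands in for whatever scheme the authors used:

```python
import math

def sigmoid(x):
    """Logistic response function mapping net input to firing activity."""
    return 1.0 / (1.0 + math.exp(-x))

def wilson_cowan(steps=1000, dt=0.01, w_ee=12.0, w_ei=10.0,
                 w_ie=9.0, w_ii=3.0, p_e=1.0, p_i=0.0, tau=1.0):
    """Euler integration of one excitatory-inhibitory Wilson-Cowan pair:
    tau * dE/dt = -E + S(w_ee*E - w_ei*I + p_e), and similarly for I.
    All parameter values are generic illustrative choices."""
    E, I = 0.1, 0.1
    traj = []
    for _ in range(steps):
        dE = (-E + sigmoid(w_ee * E - w_ei * I + p_e)) / tau
        dI = (-I + sigmoid(w_ie * E - w_ii * I + p_i)) / tau
        E, I = E + dt * dE, I + dt * dI
        traj.append((E, I))
    return traj
```

In a network model of the kind the paper studies, many such pairs would be coupled through a structural connectivity matrix, and functional connectivity would then be read off as correlations between the simulated E traces.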

  8. Introduction to Classical Density Functional Theory by a Computational Experiment

    ERIC Educational Resources Information Center

    Jeanmairet, Guillaume; Levy, Nicolas; Levesque, Maximilien; Borgis, Daniel

    2014-01-01

    We propose an in silico experiment to introduce the classical density functional theory (cDFT). Density functional theories, whether quantum or classical, rely on abstract concepts that are nonintuitive; however, they are at the heart of powerful tools and active fields of research in both physics and chemistry. They led to the 1998 Nobel Prize in…

  9. The computational foundations of time dependent density functional theory

    NASA Astrophysics Data System (ADS)

    Whitfield, James

    2014-03-01

    The mathematical foundations of TDDFT are established through the formal existence of a fictitious non-interacting system (known as the Kohn-Sham system), which can reproduce the one-electron reduced probability density of the actual system. We build upon these works and show that on the interior of the domain of existence, the Kohn-Sham system can be efficiently obtained given the time-dependent density. Since a quantum computer can efficiently produce such time-dependent densities, we present a polynomial time quantum algorithm to generate the time-dependent Kohn-Sham potential with controllable error bounds. Further, we find that systems do not immediately become non-representable but rather become ill-representable as one approaches this boundary. A representability parameter is defined in our work which quantifies the distance to the boundary of representability and the computational difficulty of finding the Kohn-Sham system.

  10. Numerical computation of lightning transfer functions for layered, anisotropically conducting shielding structures by the method of moments

    NASA Astrophysics Data System (ADS)

    Happ, Fabian; Brüns, Heinz-D.; Mavraj, Gazmend; Gronwald, Frank

    2016-09-01

A formalism for the computation of lightning transfer functions by the method of moments, involving shielding structures that may consist of layered, anisotropically conducting composite materials, is presented in this contribution. The composite materials, of a type widely used in spacecraft and aircraft design, are electrically characterized by an equivalent conductivity. As the basis for the quantitative analysis, the method of moments is used, where shielding surfaces can be treated by a thin-layer technique that utilizes analytical solutions inside the layer. The effect of an extended lightning channel can also be taken into account. The method is applied to geometries that resemble an actual airplane fuselage.

  11. A threshold theory account of psychometric functions with response confidence under the balance condition.

    PubMed

    Hsu, Yung-Fong; Doble, Christopher W

    2015-02-01

    The study of thresholds for discriminability has been of long-standing interest in psychophysics. While threshold theories embrace the concept of discrete-state thresholds, signal detection theory discounts such a concept. In this paper we concern ourselves with the concept of thresholds from the discrete-state modelling viewpoint. In doing so, we find it necessary to clarify some fundamental issues germane to the psychometric function (PF), which is customarily constructed using psychophysical methods with a binary-response format. We challenge this response format and argue that response confidence also plays an important role in the construction of PFs, and thus should have some impact on threshold estimation. We motivate the discussion by adopting a three-state threshold theory for response confidence proposed by Krantz (1969, Psychol. Rev., 76, 308-324), which is a modification of Luce's (1963, Psychol. Rev., 70, 61-79) low-threshold theory. In particular, we discuss the case in which the practice of averaging over order (or position) is enforced in data collection. Finally, we illustrate the fit of the Luce-Krantz model to data from a line-discrimination task with response confidence.

  12. Piecemeal recruitment of left-lateralized brain areas during reading: a spatio-functional account.

    PubMed

    Levy, Jonathan; Pernet, Cyril; Treserras, Sebastien; Boulanouar, Kader; Berry, Isabelle; Aubry, Florent; Demonet, Jean-Francois; Celsis, Pierre

    2008-11-15

    Neuroimaging studies of reading converge to suggest that linguistically elementary stimuli are confined to the activation of bilateral posterior regions, whereas linguistically complex stimuli additionally recruit left hemispheric anterior regions, raising the hypotheses of a gradual bilateral-to-left and a posterior-to-anterior recruitment of reading related areas. Here, we tested these two hypotheses by contrasting a repertoire of eight categories of stimuli ranging from simple orthographic-like characters to words and pseudowords in a single experiment, and by measuring BOLD signal changes and connectivity while 16 fluent readers passively viewed the stimuli. Our results confirm the existence of a bilateral-to-left and posterior-to-anterior recruitment of reading related areas, straightforwardly resulting from the increase in stimuli's linguistic processing load, which reflects reading processes: visual analysis, orthographic encoding and phonological decoding. Connectivity analyses strengthened the validity of these observations and additionally revealed an enhancement of the left parieto-frontal information trafficking for higher linguistic processing. Our findings clearly establish the notion of a gradual spatio-functional recruitment of reading areas and demonstrate, to the best of our knowledge, the first evidence of a robust and staged link between the level of linguistic processing, the spatial distribution of brain activity and its information trafficking. PMID:18778780

  13. Investigating Constituent Order Change with Elicited Pantomime: A Functional Account of SVO Emergence

    PubMed Central

    Hall, Matthew L.; Ferreira, Victor S.; Mayberry, Rachel I.

    2014-01-01

    One of the most basic functions of human language is to convey who did what to whom. In the world’s languages, the order of these three constituents (subject (S), verb (V), and object (O)) is uneven, with SOV and SVO being most common. Recent experiments using experimentally-elicited pantomime provide a possible explanation for the prevalence of SOV, but extant explanations for the prevalence of SVO could benefit from further empirical support. Here, we test whether SVO might emerge because (a) SOV is not well suited for describing reversible events (a woman pushing a boy), and (b) pressures to be efficient and mention subjects before objects conspire to rule out many other alternatives. We tested this by asking participants to describe reversible and non-reversible events in pantomime, and instructed some participants to be consistent in the form of their gestures and to teach them to the experimenter. These manipulations led to the emergence of SVO in speakers of both English (SVO) and Turkish (SOV). PMID:24641486

  14. Computer Corner: Spreadsheets, Power Series, Generating Functions, and Integers.

    ERIC Educational Resources Information Center

    Snow, Donald R.

    1989-01-01

Implements a table algorithm on a spreadsheet program and obtains generating functions for several number sequences such as the Fibonacci and Catalan numbers. Considers other applications of the table algorithm to integers represented in various number bases. (YP)
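The abstract does not reproduce the table algorithm itself. A plausible minimal sketch extracts power-series coefficients of a rational generating function by the standard long-division recurrence, recovering the Fibonacci numbers from F(x) = x/(1 - x - x^2); the Catalan numbers, whose generating function is not rational, are computed by their convolution recurrence:

```python
def series_coeffs(num, den, n_terms):
    """First n_terms power-series coefficients of num(x)/den(x), computed
    column by column as a spreadsheet table would (requires den[0] != 0)."""
    c = []
    for n in range(n_terms):
        acc = num[n] if n < len(num) else 0.0
        for k in range(1, min(n, len(den) - 1) + 1):
            acc -= den[k] * c[n - k]
        c.append(acc / den[0])
    return c

# Fibonacci numbers from their generating function x / (1 - x - x^2)
fib = series_coeffs([0, 1], [1, -1, -1], 8)

# Catalan numbers via the convolution recurrence c_n = sum_k c_k * c_{n-1-k}
catalan = [1]
for n in range(1, 8):
    catalan.append(sum(catalan[k] * catalan[n - 1 - k] for k in range(n)))
```

Each new coefficient depends only on earlier ones, which is exactly what makes the computation natural to lay out row by row in a spreadsheet.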

  15. Improvement in protein functional site prediction by distinguishing structural and functional constraints on protein family evolution using computational design.

    PubMed

    Cheng, Gong; Qian, Bin; Samudrala, Ram; Baker, David

    2005-01-01

The prediction of functional sites in newly solved protein structures is a challenge for computational structural biology. Most methods for approaching this problem use evolutionary conservation as the primary indicator of the location of functional sites. However, sequence conservation reflects not only evolutionary selection at functional sites to maintain protein function, but also selection throughout the protein to maintain the stability of the folded state. To disentangle sequence conservation due to protein functional constraints from sequence conservation due to protein structural constraints, we use all-atom computational protein design methodology to predict sequence profiles expected under solely structural constraints, and to compute the free energy difference between the naturally occurring amino acid and the lowest free energy amino acid at each position. We show that functional sites are more likely than non-functional sites to have computed sequence profiles which differ significantly from the naturally occurring sequence profiles and to have residues with sub-optimal free energies, and that incorporation of these two measures improves sequence-based prediction of protein functional sites. The combined sequence- and structure-based functional site prediction method has been implemented in a publicly available web server.
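Comparing designed against natural sequence profiles requires some per-position divergence measure. A symmetric Kullback-Leibler sketch is shown below; the paper's actual statistic is not specified in the abstract and may well differ:

```python
import math

def profile_divergence(p_natural, p_designed, eps=1e-9):
    """Symmetric Kullback-Leibler divergence between two amino-acid
    frequency profiles at one position (each a list of frequencies,
    e.g. 20 entries summing to 1). The scoring choice is a hypothetical
    stand-in for the paper's unspecified comparison measure."""
    d = 0.0
    for p, q in zip(p_natural, p_designed):
        p, q = p + eps, q + eps  # avoid log(0) for absent residue types
        d += p * math.log(p / q) + q * math.log(q / p)
    return d
```

Under the paper's logic, positions where this divergence is large (the design under purely structural constraints disagrees with nature) are candidate functional sites.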

  16. Proton-Λ correlation functions at energies available at the CERN Large Hadron Collider taking into account residual correlations

    NASA Astrophysics Data System (ADS)

    Shapoval, V. M.; Sinyukov, Yu. M.; Naboka, V. Yu.

    2015-10-01

The theoretical analysis of the p̄Λ ⊕ pΛ̄ correlation function in the 10% most central Au+Au collisions at the Relativistic Heavy Ion Collider (RHIC) energy √s_NN = 200 GeV shows that the contribution of residual correlations is a necessary factor for obtaining a satisfactory description of the experimental data. Neglecting the residual correlation effect leads to an unrealistically low source radius, about 2 times smaller than the corresponding value for the pΛ ⊕ p̄Λ̄ case, when one fits the experimental correlation function within the Lednický-Lyuboshitz analytical model. Recently an approach that accounts effectively for residual correlations in the baryon-antibaryon correlation function was proposed, and a good description of the RHIC data was reached with the source radius extracted from the hydrokinetic model (HKM). The p̄Λ scattering length, as well as the parameters characterizing the residual correlation effect (the annihilation dip amplitude and its inverse width), were extracted from the corresponding fit. In this paper we use these extracted values and HKM-simulated source functions for Pb+Pb collisions at the LHC energy √s_NN = 2.76 TeV to predict the corresponding pΛ and pΛ̄ correlation functions.

  17. COMPUTATIONAL STRATEGIES FOR THE DESIGN OF NEW ENZYMATIC FUNCTIONS

    PubMed Central

    Świderek, K; Tuñón, I.; Moliner, V.; Bertran, J.

    2015-01-01

In this contribution, recent developments in the design of biocatalysts are reviewed, with particular emphasis on the de novo strategy. Studies based on three different reactions, the Kemp elimination, Diels-Alder, and retro-aldolase, are used to illustrate the varying degrees of success achieved in recent years. Finally, a section is devoted to the particular case of designed metalloenzymes. As a general conclusion, the interplay between new and more sophisticated engineering protocols and computational methods, based on molecular dynamics simulations with Quantum Mechanics/Molecular Mechanics potentials and fully flexible models, seems to constitute the bedrock for present and future successful design strategies. PMID:25797438

  18. A computationally efficient double hybrid density functional based on the random phase approximation.

    PubMed

    Grimme, Stefan; Steinmetz, Marc

    2016-08-01

    We present a revised form of a double hybrid density functional (DHDF) dubbed PWRB95. It contains semi-local Perdew-Wang exchange and Becke95 correlation with a fixed amount of 50% non-local Fock exchange. New features are that the robust random phase approximation (RPA) is used to calculate the non-local correlation part instead of a second-order perturbative treatment as in standard DHDFs, and that the Fock exchange is evaluated non-self-consistently with Kohn-Sham orbitals obtained at the GGA level, which leads to a significant reduction of the computational effort. To account for London dispersion effects we include the non-local VV10 dispersion functional. Only three empirical scaling parameters were adjusted. The PWRB95 results for extensive standard thermochemical benchmarks (GMTKN30 data base) are compared to those of well-known functionals from the classes of (meta-)GGAs, (meta-)hybrid functionals, and DHDFs, as well as to standard (direct) RPA. The new method is furthermore tested on prototype bond activations with (Ni/Pd)-based transition metal catalysts, and two difficult cases for DHDF, namely the isomerization reaction of the [Cu2(en)2O2](2+) complex and the singlet-triplet energy difference in highly unsaturated cyclacenes. The results show that PWRB95 is almost as accurate as standard DHDF for main-group thermochemistry but has a similar or better performance for non-covalent interactions, more difficult transition-metal-containing molecules, and other electronically problematic cases. Because of its relatively weak basis set dependence, PWRB95 can be applied even in combination with AO basis sets of only triple-zeta quality, which yields huge overall computational savings by a factor of about 40 compared to standard DHDF/'quadruple-zeta' calculations. Structure optimizations of small molecules with PWRB95 indicate an accurate description of bond distances superior to that provided by TPSS-D3, PBE0-D3, or other RPA type methods. PMID:26695184

  19. Automated attendance accounting system

    NASA Technical Reports Server (NTRS)

    Chapman, C. P. (Inventor)

    1973-01-01

    An automated accounting system useful for applying data to a computer from any or all of a multiplicity of data terminals is disclosed. The system essentially includes a preselected number of data terminals which are each adapted to convert data words of decimal form to another form, i.e., binary, usable with the computer. Each data terminal may take the form of a keyboard unit having a number of depressable buttons or switches corresponding to selected data digits and/or function digits. A bank of data buffers, one of which is associated with each data terminal, is provided as a temporary storage. Data from the terminals is applied to the data buffers on a digit by digit basis for transfer via a multiplexer to the computer.
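The digit-by-digit conversion and buffering described above can be sketched in software; this is a minimal illustrative model, not the patent's hardware design, and the `DataBuffer` class and method names are hypothetical:

```python
def encode_digit(d):
    """Convert one decimal key press (0-9) to its 4-bit binary (BCD) form,
    as each data terminal does before the digit is buffered."""
    return [(d >> b) & 1 for b in (3, 2, 1, 0)]

class DataBuffer:
    """Temporary per-terminal storage: digits accumulate one at a time and
    are later transferred (via the multiplexer) to the computer."""
    def __init__(self):
        self.bits = []

    def press(self, digit):
        # each key press appends one encoded digit to the buffer
        self.bits.append(encode_digit(digit))

    def flush(self):
        # models the multiplexer emptying the buffer toward the computer
        out, self.bits = self.bits, []
        return out
```

For example, pressing the keys 4 then 2 buffers the BCD words `[0,1,0,0]` and `[0,0,1,0]` before transfer.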

  20. Efficient and Flexible Computation of Many-Electron Wave Function Overlaps

    PubMed Central

    2016-01-01

    A new algorithm for the computation of the overlap between many-electron wave functions is described. This algorithm allows for the extensive use of recurring intermediates and thus provides high computational efficiency. Because of the general formalism employed, overlaps can be computed for varying wave function types, molecular orbitals, basis sets, and molecular geometries. This paves the way for efficiently computing nonadiabatic interaction terms for dynamics simulations. In addition, other application areas can be envisaged, such as the comparison of wave functions constructed at different levels of theory. Aside from explaining the algorithm and evaluating the performance, a detailed analysis of the numerical stability of wave function overlaps is carried out, and strategies for overcoming potential severe pitfalls due to displaced atoms and truncated wave functions are presented. PMID:26854874

  1. Computing Legacy Software Behavior to Understand Functionality and Security Properties: An IBM/370 Demonstration

    SciTech Connect

    Linger, Richard C; Pleszkoch, Mark G; Prowell, Stacy J; Sayre, Kirk D; Ankrum, Scott

    2013-01-01

    Organizations maintaining mainframe legacy software can benefit from code modernization and incorporation of security capabilities to address the current threat environment. Oak Ridge National Laboratory is developing the Hyperion system to compute the behavior of software as a means to gain understanding of software functionality and security properties. Computation of functionality is critical to revealing security attributes, which are in fact specialized functional behaviors of software. Oak Ridge is collaborating with MITRE Corporation to conduct a demonstration project to compute behavior of legacy IBM Assembly Language code for a federal agency. The ultimate goal is to understand functionality and security vulnerabilities as a basis for code modernization. This paper reports on the first phase, to define functional semantics for IBM Assembly instructions and conduct behavior computation experiments.

  2. An account of Sandia's research booth at Supercomputing '92: A collaborative effort in high-performance computing and networking

    SciTech Connect

    Breckenridge, A.; Vahle, M.O.

    1993-03-01

    Supercomputing '92, a high-performance computing and communications conference, was held November 16-20, 1992, in Minneapolis, Minnesota. This paper documents the applications and technologies that were showcased in Sandia's research booth at that conference. In particular, the demonstrations in high-performance networking, audio-visual applications in engineering, virtual reality, and supercomputing applications are all described.

  3. Cognition and control in schizophrenia: a computational model of dopamine and prefrontal function.

    PubMed

    Braver, T S; Barch, D M; Cohen, J D

    1999-08-01

    Behavioral deficits suffered by patients with schizophrenia in a wide array of cognitive domains can be conceptualized as failures of cognitive control, due to an impaired ability to internally represent, maintain, and update context information. A theory is described that postulates a single neurobiological mechanism for these disturbances, involving dysfunctional interactions between the dopamine neurotransmitter system and the prefrontal cortex. Specifically, it is hypothesized that in schizophrenia, there is increased noise in the activity of the dopamine system, leading to abnormal "gating" of information into prefrontal cortex. The theory is implemented as an explicit connectionist computational model that incorporates the roles of both dopamine and prefrontal cortex in cognitive control. A simulation is presented of behavioral performance in a version of the Continuous Performance Test specifically adapted to measure critical aspects of cognitive control function. Schizophrenia patients exhibit clear behavioral deficits on this task that reflect impairments in both the maintenance and updating of context information. The simulation results suggest that the model can successfully account for these impairments in terms of abnormal dopamine activity. This theory provides a potential point of contact between research on the neurobiological and psychological aspects of schizophrenia, by illustrating how a particular physiological disturbance might lead to precise and quantifiable consequences for behavior.

  4. Utility functions and resource management in an oversubscribed heterogeneous computing environment

    DOE PAGES

    Khemka, Bhavesh; Friese, Ryan; Briceno, Luis Diego; Siegel, Howard Jay; Maciejewski, Anthony A.; Koenig, Gregory A.; Groer, Christopher S.; Hilton, Marcia M.; Poole, Stephen W.; Okonski, G.; et al

    2014-09-26

    We model an oversubscribed heterogeneous computing system where tasks arrive dynamically and a scheduler maps the tasks to machines for execution. The environment and workloads are based on those being investigated by the Extreme Scale Systems Center at Oak Ridge National Laboratory. Utility functions that are designed based on specifications from the system owner and users are used to create a metric for the performance of resource allocation heuristics. Each task has a time-varying utility (importance) that the enterprise will earn based on when the task successfully completes execution. We design multiple heuristics, which include a technique to drop low utility-earning tasks, to maximize the total utility that can be earned by completing tasks. The heuristics are evaluated using simulation experiments with two levels of oversubscription. The results show the benefit of having fast heuristics that account for the importance of a task and the heterogeneity of the environment when making allocation decisions in an oversubscribed environment. Furthermore, the ability to drop low utility-earning tasks allows the heuristics to tolerate the high oversubscription as well as earn significant utility.

  5. Utility functions and resource management in an oversubscribed heterogeneous computing environment

    SciTech Connect

    Khemka, Bhavesh; Friese, Ryan; Briceno, Luis Diego; Siegel, Howard Jay; Maciejewski, Anthony A.; Koenig, Gregory A.; Groer, Christopher S.; Hilton, Marcia M.; Poole, Stephen W.; Okonski, G.; Rambharos, R.

    2014-09-26

    We model an oversubscribed heterogeneous computing system where tasks arrive dynamically and a scheduler maps the tasks to machines for execution. The environment and workloads are based on those being investigated by the Extreme Scale Systems Center at Oak Ridge National Laboratory. Utility functions that are designed based on specifications from the system owner and users are used to create a metric for the performance of resource allocation heuristics. Each task has a time-varying utility (importance) that the enterprise will earn based on when the task successfully completes execution. We design multiple heuristics, which include a technique to drop low utility-earning tasks, to maximize the total utility that can be earned by completing tasks. The heuristics are evaluated using simulation experiments with two levels of oversubscription. The results show the benefit of having fast heuristics that account for the importance of a task and the heterogeneity of the environment when making allocation decisions in an oversubscribed environment. Furthermore, the ability to drop low utility-earning tasks allows the heuristics to tolerate the high oversubscription as well as earn significant utility.
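The drop-low-utility idea can be illustrated with a small greedy sketch. The exponential decay shape and all parameter names below are assumptions for illustration only; the paper's utility functions come from system-owner specifications, not this formula:

```python
import math

def utility(task, t):
    """Hypothetical time-varying utility: starts at task['u0'] and decays
    exponentially after the task's arrival time."""
    return task["u0"] * math.exp(-task["decay"] * (t - task["arrival"]))

def assign_order(tasks, t, drop_below):
    """Greedy heuristic sketch: drop tasks whose current utility has fallen
    below a threshold, then schedule the highest-utility tasks first."""
    kept = [k for k in tasks if utility(k, t) >= drop_below]
    return sorted(kept, key=lambda k: -utility(k, t))
```

At time t = 2 with a drop threshold of 0.5, a fast-decaying low-utility task is discarded while the remaining tasks are ordered by their current utility.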

  6. Computational properties of three-term recurrence relations for Kummer functions

    NASA Astrophysics Data System (ADS)

    Deaño, Alfredo; Segura, Javier; Temme, Nico M.

    2010-01-01

    Several three-term recurrence relations for confluent hypergeometric functions are analyzed from a numerical point of view. Minimal and dominant solutions for complex values of the variable z are given, derived from asymptotic estimates of the Whittaker functions with large parameters. The Laguerre polynomials and the regular Coulomb wave functions are studied as particular cases, with numerical examples of their computation.
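As a concrete example, the Laguerre polynomials (one of the particular cases studied) satisfy the three-term recurrence (k+1) L_{k+1}(x) = (2k+1-x) L_k(x) - k L_{k-1}(x). A forward-recurrence sketch is shown below; note that for minimal solutions of such recurrences, which the paper analyzes, forward recurrence is unstable and backward recurrence would be needed instead:

```python
def laguerre(n, x):
    """Evaluate the Laguerre polynomial L_n(x) by forward application of the
    three-term recurrence (k+1) L_{k+1} = (2k+1-x) L_k - k L_{k-1},
    starting from L_0(x) = 1 and L_1(x) = 1 - x."""
    if n == 0:
        return 1.0
    prev, cur = 1.0, 1.0 - x
    for k in range(1, n):
        prev, cur = cur, ((2 * k + 1 - x) * cur - k * prev) / (k + 1)
    return cur
```

For instance, L_2(x) = (x^2 - 4x + 2)/2, so L_2(1) = -1/2.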

  7. Spaceborne computer executive routine functional design specification. Volume 1: Functional design of a flight computer executive program for the reusable shuttle

    NASA Technical Reports Server (NTRS)

    Curran, R. T.

    1971-01-01

    A flight computer functional executive design for the reusable shuttle is presented. The design is given in the form of functional flowcharts and prose description. Techniques utilized in the regulation of process flow to accomplish activation, resource allocation, suspension, termination, and error masking based on process primitives are considered. Preliminary estimates of main storage utilization by the Executive are furnished. Conclusions and recommendations for timely, effective software-hardware integration in the reusable shuttle avionics system are proposed.

  8. The Contingency of Cocaine Administration Accounts for Structural and Functional Medial Prefrontal Deficits and Increased Adrenocortical Activation

    PubMed Central

    Anderson, Rachel M.; Cosme, Caitlin V.; Glanz, Ryan M.; Miller, Mary C.; Romig-Martin, Sara A.; LaLumiere, Ryan T.

    2015-01-01

    The prelimbic region (PL) of the medial prefrontal cortex (mPFC) is implicated in the relapse of drug-seeking behavior. Optimal mPFC functioning relies on synaptic connections involving dendritic spines in pyramidal neurons, whereas prefrontal dysfunction resulting from elevated glucocorticoids, stress, aging, and mental illness are each linked to decreased apical dendritic branching and spine density in pyramidal neurons in these cortical fields. The fact that cocaine use induces activation of the stress-responsive hypothalamo-pituitary-adrenal axis raises the possibility that cocaine-related impairments in mPFC functioning may be manifested by similar changes in neuronal architecture in mPFC. Nevertheless, previous studies have generally identified increases, rather than decreases, in structural plasticity in mPFC after cocaine self-administration. Here, we use 3D imaging and analysis of dendritic spine morphometry to show that chronic cocaine self-administration leads to mild decreases of apical dendritic branching, prominent dendritic spine attrition in PL pyramidal neurons, and working memory deficits. Importantly, these impairments were largely accounted for in groups of rats that self-administered cocaine compared with yoked-cocaine- and saline-matched counterparts. Follow-up experiments failed to demonstrate any effects of either experimenter-administered cocaine or food self-administration on structural alterations in PL neurons. Finally, we verified that the cocaine self-administration group was distinguished by more protracted increases in adrenocortical activity compared with yoked-cocaine- and saline-matched controls. These studies suggest a mechanism whereby increased adrenocortical activity resulting from chronic cocaine self-administration may contribute to regressive prefrontal structural and functional plasticity. SIGNIFICANCE STATEMENT Stress, aging, and mental illness are each linked to decreased prefrontal plasticity. Here, we show that chronic

  9. Toward high-resolution computational design of helical membrane protein structure and function

    PubMed Central

    Barth, Patrick; Senes, Alessandro

    2016-01-01

    The computational design of α-helical membrane proteins is still in its infancy but has made important progress. De novo design has produced stable, specific and active minimalistic oligomeric systems. Computational re-engineering can improve stability and modulate the function of natural membrane proteins. Currently, the major hurdle for the field is not computational, but the experimental characterization of the designs. The emergence of new structural methods for membrane proteins will accelerate progress. PMID:27273630

  10. Functional Genomics Reveals That a Compact Terpene Synthase Gene Family Can Account for Terpene Volatile Production in Apple

    PubMed Central

    Nieuwenhuizen, Niels J.; Green, Sol A.; Chen, Xiuyin; Bailleul, Estelle J.D.; Matich, Adam J.; Wang, Mindy Y.; Atkinson, Ross G.

    2013-01-01

    Terpenes are specialized plant metabolites that act as attractants to pollinators and as defensive compounds against pathogens and herbivores, but they also play an important role in determining the quality of horticultural food products. We show that the genome of cultivated apple (Malus domestica) contains 55 putative terpene synthase (TPS) genes, of which only 10 are predicted to be functional. This low number of predicted functional TPS genes compared with other plant species was supported by the identification of only eight potentially functional TPS enzymes in apple ‘Royal Gala’ expressed sequence tag databases, including the previously characterized apple (E,E)-α-farnesene synthase. In planta functional characterization of these TPS enzymes showed that they could account for the majority of terpene volatiles produced in cv Royal Gala, including the sesquiterpenes germacrene-D and (E)-β-caryophyllene, the monoterpenes linalool and α-pinene, and the homoterpene (E)-4,8-dimethyl-1,3,7-nonatriene. Relative expression analysis of the TPS genes indicated that floral and vegetative tissues were the primary sites of terpene production in cv Royal Gala. However, production of cv Royal Gala floral-specific terpenes and TPS genes was observed in the fruit of some heritage apple cultivars. Our results suggest that the apple TPS gene family has been shaped by a combination of ancestral and more recent genome-wide duplication events. The relatively small number of functional enzymes suggests that the remaining terpenes produced in floral and vegetative and fruit tissues are maintained under a positive selective pressure, while the small number of terpenes found in the fruit of modern cultivars may be related to commercial breeding strategies. PMID:23256150

  11. Functional genomics reveals that a compact terpene synthase gene family can account for terpene volatile production in apple.

    PubMed

    Nieuwenhuizen, Niels J; Green, Sol A; Chen, Xiuyin; Bailleul, Estelle J D; Matich, Adam J; Wang, Mindy Y; Atkinson, Ross G

    2013-02-01

    Terpenes are specialized plant metabolites that act as attractants to pollinators and as defensive compounds against pathogens and herbivores, but they also play an important role in determining the quality of horticultural food products. We show that the genome of cultivated apple (Malus domestica) contains 55 putative terpene synthase (TPS) genes, of which only 10 are predicted to be functional. This low number of predicted functional TPS genes compared with other plant species was supported by the identification of only eight potentially functional TPS enzymes in apple 'Royal Gala' expressed sequence tag databases, including the previously characterized apple (E,E)-α-farnesene synthase. In planta functional characterization of these TPS enzymes showed that they could account for the majority of terpene volatiles produced in cv Royal Gala, including the sesquiterpenes germacrene-D and (E)-β-caryophyllene, the monoterpenes linalool and α-pinene, and the homoterpene (E)-4,8-dimethyl-1,3,7-nonatriene. Relative expression analysis of the TPS genes indicated that floral and vegetative tissues were the primary sites of terpene production in cv Royal Gala. However, production of cv Royal Gala floral-specific terpenes and TPS genes was observed in the fruit of some heritage apple cultivars. Our results suggest that the apple TPS gene family has been shaped by a combination of ancestral and more recent genome-wide duplication events. The relatively small number of functional enzymes suggests that the remaining terpenes produced in floral and vegetative and fruit tissues are maintained under a positive selective pressure, while the small number of terpenes found in the fruit of modern cultivars may be related to commercial breeding strategies. PMID:23256150

  12. Fair and Square Computation of Inverse "Z"-Transforms of Rational Functions

    ERIC Educational Resources Information Center

    Moreira, M. V.; Basilio, J. C.

    2012-01-01

    All methods presented in textbooks for computing inverse "Z"-transforms of rational functions have some limitation: 1) the direct division method does not, in general, provide enough information to derive an analytical expression for the time-domain sequence "x"("k") whose "Z"-transform is "X"("z"); 2) computation using the inversion integral…
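The direct division method mentioned above amounts to power-series long division of the numerator by the denominator in powers of z^-1, yielding the sequence x(k) term by term (which is exactly why it gives no closed-form expression). A minimal sketch, with an illustrative function name:

```python
def inverse_z_by_division(num, den, n_terms):
    """Compute the first n_terms of x(k) by long division of
    N(z^-1)/D(z^-1). num and den are coefficient lists in increasing
    powers of z^-1; den[0] must be nonzero."""
    x = []
    rem = list(num) + [0.0] * n_terms  # running remainder of the division
    for k in range(n_terms):
        coef = rem[k] / den[0]         # next series coefficient x(k)
        x.append(coef)
        for j, d in enumerate(den):    # subtract coef * D(z^-1) * z^-k
            rem[k + j] -= coef * d
    return x
```

For example, X(z) = 1/(1 - 0.5 z^-1) gives the geometric sequence x(k) = 0.5^k, i.e. 1, 0.5, 0.25, 0.125, ...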

  13. A Systematic Approach for Understanding Slater-Gaussian Functions in Computational Chemistry

    ERIC Educational Resources Information Center

    Stewart, Brianna; Hylton, Derrick J.; Ravi, Natarajan

    2013-01-01

    A systematic way to understand the intricacies of quantum mechanical computations done by a software package known as "Gaussian" is undertaken via an undergraduate research project. These computations involve the evaluation of key parameters in a fitting procedure to express a Slater-type orbital (STO) function in terms of the linear…

  14. Effects of Computer versus Paper Administration of an Adult Functional Writing Assessment

    ERIC Educational Resources Information Center

    Chen, Jing; White, Sheida; McCloskey, Michael; Soroui, Jaleh; Chun, Young

    2011-01-01

    This study investigated the comparability of paper and computer versions of a functional writing assessment administered to adults 16 and older. Three writing tasks were administered in both paper and computer modes to volunteers in the field test of an assessment of adult literacy in 2008. One set of analyses examined mode effects on scoring by…

  15. Performance of a computer-based assessment of cognitive function measures in two cohorts of seniors

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials, however its performance in these settings has not been systematically evaluated. The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool f...

  16. A Functional Specification for a Programming Language for Computer Aided Learning Applications.

    ERIC Educational Resources Information Center

    National Research Council of Canada, Ottawa (Ontario).

    In 1972 there were at least six different course authoring languages in use in Canada with little exchange of course materials between Computer Assisted Learning (CAL) centers. In order to improve facilities for producing "transportable" computer based course materials, a working panel undertook the definition of functional requirements of a user…

  17. Method reduces computer time for smoothing functions and derivatives through ninth order polynomials

    NASA Technical Reports Server (NTRS)

    Glauz, R. D.; Wilgus, C. A.

    1969-01-01

    The analysis presented is an efficient technique for adjusting previously calculated orthogonal polynomial coefficients for an odd number of equally spaced data points. The adjustment technique is derived for a ninth-order polynomial. It reduces computer time for smoothing functions.
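A widely known low-order instance of this kind of orthogonal-polynomial least-squares smoothing over equally spaced points is the 5-point quadratic (Savitzky-Golay) smoother sketched below; this is an illustrative example, not the paper's ninth-order adjustment technique:

```python
def smooth5(y):
    """5-point quadratic least-squares smoother for equally spaced data.
    The classic convolution weights (-3, 12, 17, 12, -3)/35 reproduce any
    polynomial of degree <= 3 exactly; endpoints are left unsmoothed."""
    w = (-3.0, 12.0, 17.0, 12.0, -3.0)
    out = list(y)
    for i in range(2, len(y) - 2):
        out[i] = sum(wj * y[i + j - 2] for j, wj in enumerate(w)) / 35.0
    return out
```

Because the fit is exact for cubics, smoothing data sampled from y = x^2 returns the same values at interior points.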

  18. Functions and Requirements and Specifications for Replacement of the Computer Automated Surveillance System (CASS)

    SciTech Connect

    SCAIEF, C.C.

    1999-12-16

    This functions, requirements and specifications document defines the baseline requirements and criteria for the design, purchase, fabrication, construction, installation, and operation of the system to replace the Computer Automated Surveillance System (CASS) alarm monitoring.

  19. Basis Function Sampling: A New Paradigm for Material Property Computation

    NASA Astrophysics Data System (ADS)

    Whitmer, Jonathan K.; Chiu, Chi-cheng; Joshi, Abhijeet A.; de Pablo, Juan J.

    2014-11-01

    Wang-Landau sampling and the associated class of flat-histogram simulation methods have been remarkably helpful for calculations of the free energy in a wide variety of physical systems. Practically, convergence of these calculations to a target free energy surface is hampered by reliance on parameters which are unknown a priori. Here, we derive and implement a method built upon orthogonal functions which is fast, parameter-free, and (importantly) geometrically robust. The method is shown to be highly effective in achieving convergence. An important feature of this method is its ability to attain arbitrary levels of description for the free energy. It is thus ideally suited to in silico measurement of elastic moduli and other material properties related to free energy perturbations. We demonstrate the utility of such applications by applying our method to calculate the Frank elastic constants of the Lebwohl-Lasher model of liquid crystals.

  20. Computational complexity of time-dependent density functional theory

    NASA Astrophysics Data System (ADS)

    Whitfield, J. D.; Yung, M.-H.; Tempel, D. G.; Boixo, S.; Aspuru-Guzik, A.

    2014-08-01

    Time-dependent density functional theory (TDDFT) is rapidly emerging as a premier method for solving dynamical many-body problems in physics and chemistry. The mathematical foundations of TDDFT are established through the formal existence of a fictitious non-interacting system (known as the Kohn-Sham system), which can reproduce the one-electron reduced probability density of the actual system. We build upon these works and show that on the interior of the domain of existence, the Kohn-Sham system can be efficiently obtained given the time-dependent density. We introduce a V-representability parameter which diverges at the boundary of the existence domain and serves to quantify the numerical difficulty of constructing the Kohn-Sham potential. For bounded values of V-representability, we present a polynomial time quantum algorithm to generate the time-dependent Kohn-Sham potential with controllable error bounds.

  1. Computer-Aided Evaluation of Liver Functional Assessment

    PubMed Central

    Lesmo, Leonardo; Saitta, Lorenza; Torasso, Piero

    1980-01-01

    This paper describes the organization of a computerized system whose purpose is to ascertain the presence of functional impairments in the liver and to evaluate their seriousness. The system is composed of categorical rules and decision procedures. The symptoms and the anamnestic data of a given patient trigger the categorical rules, which constrain the set of hypothesizable impairments. This set of hypotheses acts as a focus of attention for the system by allowing the selection of the bioclinical tests most relevant to determining the seriousness of those impairments. The outcomes of the selected tests are input to the decision procedures, which operate on the basis of fuzzy relations that allow a quantitative evaluation of the seriousness of the hypothesized impairments. Whereas the categorical rules have been built on the basis of the a priori knowledge of the physicians, the parameters of the fuzzy relations have been learned automatically by means of a fuzzy inference procedure.

  2. A Computational Account of Borderline Personality Disorder: Impaired Predictive Learning about Self and Others Through Bodily Simulation

    PubMed Central

    Fineberg, Sarah K.; Steinfeld, Matthew; Brewer, Judson A.; Corlett, Philip R.

    2014-01-01

    Social dysfunction is a prominent and disabling aspect of borderline personality disorder. We reconsider traditional explanations for this problem, especially early disruption in the way an infant feels physical care from its mother, in terms of recent developments in computational psychiatry. In particular, social learning may depend on reinforcement learning through embodied simulations. Such modeling involves calculations based on structures outside the brain, such as the face and hands, calculations on one's own body that are used to make inferences about others. We discuss ways to test the role of embodied simulation in BPD and potential implications for treatment. PMID:25221523

  3. A Computer Program for the Computation of Running Gear Temperatures Using Green's Function

    NASA Technical Reports Server (NTRS)

    Koshigoe, S.; Murdock, J. W.; Akin, L. S.; Townsend, D. P.

    1996-01-01

    A new technique has been developed to study two-dimensional heat transfer problems in gears. This technique consists of transforming the heat equation into a line integral equation with the use of Green's theorem. The equation is then expressed in terms of eigenfunctions that satisfy the Helmholtz equation, and their corresponding eigenvalues, for an arbitrarily shaped region of interest. The eigenfunctions are obtained by solving an integral equation. Once the eigenfunctions are found, the temperature is expanded in terms of the eigenfunctions with unknown time-dependent coefficients that can be solved for by using Runge-Kutta methods. The time integration is extremely efficient. Therefore, any changes in the time-dependent coefficients or source terms in the boundary conditions do not impose a great computational burden on the user. The method is demonstrated by applying it to a sample gear tooth. Temperature histories at representative surface locations are given.

  4. Computation of pair distribution functions and three-dimensional densities with a reduced variance principle

    NASA Astrophysics Data System (ADS)

    Borgis, Daniel; Assaraf, Roland; Rotenberg, Benjamin; Vuilleumier, Rodolphe

    2013-12-01

    No fancy statistical objects here: we go back to the computation of one of the most basic and fundamental quantities in the statistical mechanics of fluids, namely the pair distribution function. These functions are usually computed in molecular simulations using histogram techniques. We show here that they can instead be estimated using global information on the instantaneous forces acting on the particles, and that this leads to a reduced variance compared to the standard histogram estimators. The technique is extended successfully to the computation of three-dimensional solvent densities around tagged molecular solutes, quantities that are noisy and slow to converge when accumulated in histograms.

  5. Passive Dendrites Enable Single Neurons to Compute Linearly Non-separable Functions

    PubMed Central

    Cazé, Romain Daniel; Humphries, Mark; Gutkin, Boris

    2013-01-01

    Local supra-linear summation of excitatory inputs occurring in pyramidal cell dendrites, the so-called dendritic spikes, results in independent spiking dendritic sub-units, which turn pyramidal neurons into two-layer neural networks capable of computing linearly non-separable functions, such as the exclusive OR. Other neuron classes, such as interneurons, may possess only a few independent dendritic sub-units, or only passive dendrites where input summation is purely sub-linear, and where dendritic sub-units are only saturating. To determine if such neurons can also compute linearly non-separable functions, we enumerate, for a given parameter range, the Boolean functions implementable by a binary neuron model with a linear sub-unit and either a single spiking or a saturating dendritic sub-unit. We then analytically generalize these numerical results to an arbitrary number of non-linear sub-units. First, we show that a single non-linear dendritic sub-unit, in addition to the somatic non-linearity, is sufficient to compute linearly non-separable functions. Second, we analytically prove that, with a sufficient number of saturating dendritic sub-units, a neuron can compute all functions computable with purely excitatory inputs. Third, we show that these linearly non-separable functions can be implemented with at least two strategies: one where a dendritic sub-unit is sufficient to trigger a somatic spike; another where somatic spiking requires the cooperation of multiple dendritic sub-units. We formally prove that implementing the latter architecture is possible with both types of dendritic sub-units whereas the former is only possible with spiking dendrites. Finally, we show how linearly non-separable functions can be computed by a generic two-compartment biophysical model and a realistic neuron model of the cerebellar stellate cell interneuron. Taken together our results demonstrate that passive dendrites are sufficient to enable neurons to compute linearly non
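A minimal sketch of the two-layer idea: with two supra-linear ("spiking") dendritic sub-units and purely excitatory weights, a binary neuron can compute the linearly non-separable feature-binding function f = (x1 AND x2) OR (x3 AND x4), which no single linear threshold unit can realize. The weights and thresholds below are illustrative assumptions, not the paper's fitted parameters:

```python
def theta(v):
    """Heaviside threshold non-linearity."""
    return 1 if v >= 0 else 0

def two_layer_neuron(x1, x2, x3, x4):
    """Binary neuron with two spiking dendritic sub-units (each an AND over
    its pair of excitatory inputs) and a somatic OR over the sub-units."""
    d1 = theta(x1 + x2 - 2)   # sub-unit 1 spikes only if both inputs active
    d2 = theta(x3 + x4 - 2)   # sub-unit 2 likewise
    return theta(d1 + d2 - 1)  # soma fires if any sub-unit spiked
```

Linear non-separability is easy to check: any linear unit satisfying f(1,1,0,0) = f(0,0,1,1) = 1 and f(1,0,1,0) = f(0,1,0,1) = 0 would need w1+w2+w3+w4 both ≥ 2θ and < 2θ, a contradiction.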

  6. A mesh-decoupled height function method for computing interface curvature

    NASA Astrophysics Data System (ADS)

    Owkes, Mark; Desjardins, Olivier

    2015-01-01

    In this paper, a mesh-decoupled height function method is proposed and tested. The method is based on computing height functions within columns that are not aligned with the underlying mesh and have variable dimensions. Because they are decoupled from the computational mesh, the columns can be aligned with the interface normal vector, which is found to improve the curvature calculation for under-resolved interfaces where the standard height function method often fails. A computational geometry toolbox is used to compute the heights in the complex geometry that is formed at the intersection of the computational mesh and the columns. The toolbox reduces the complexity of the problem to a series of straightforward geometric operations using simplices. The proposed scheme is shown to compute more accurate curvatures than the standard height function method on coarse meshes. A combined method that uses the standard height function where it is well defined and the proposed scheme in under-resolved regions is tested. This approach achieves accurate and robust curvatures for under-resolved interface features and second-order converging curvatures for well-resolved interfaces.
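
The curvature evaluation at the core of the method can be sketched as follows (a minimal stand-in: standard height-function curvature from three column heights, assuming a mesh-aligned column stencil rather than the paper's interface-normal-aligned, variable-dimension columns):

```python
import math

def height_function_curvature(h, dx):
    """Curvature from three interface heights in adjacent columns of width dx:
    kappa = |h''| / (1 + h'^2)^(3/2), using central finite differences."""
    hp = (h[2] - h[0]) / (2.0 * dx)            # first derivative
    hpp = (h[2] - 2.0 * h[1] + h[0]) / dx**2   # second derivative
    return abs(hpp) / (1.0 + hp * hp) ** 1.5

# Heights sampled from a circle of radius R = 2 (exact curvature 1/R = 0.5).
R, dx = 2.0, 0.05
heights = [math.sqrt(R**2 - (i * dx) ** 2) for i in (-1, 0, 1)]
kappa = height_function_curvature(heights, dx)
print(f"computed curvature {kappa:.6f}, exact {1.0 / R}")
```

For a well-resolved interface this converges at second order in dx; the paper's contribution is making the column construction robust when such a stencil is under-resolved.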

  7. Functional Competency Development Model for Academic Personnel Based on International Professional Qualification Standards in Computing Field

    ERIC Educational Resources Information Center

    Tumthong, Suwut; Piriyasurawong, Pullop; Jeerangsuwan, Namon

    2016-01-01

    This research proposes a functional competency development model for academic personnel based on international professional qualification standards in computing field and examines the appropriateness of the model. Specifically, the model consists of three key components which are: 1) functional competency development model, 2) blended training…

  8. Computation of turbulent boundary layers employing the defect wall-function method. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Brown, Douglas L.

    1994-01-01

    In order to decrease the overall computational time requirements of a spatially marching parabolized Navier-Stokes finite-difference computer code when applied to turbulent fluid flow, a wall-function methodology, originally proposed by R. Barnwell, was implemented. This numerical effort increases computational speed and calculates reasonably accurate wall shear stress spatial distributions and boundary-layer profiles. Since the wall shear stress is analytically determined from the wall-function model, the computational grid near the wall is not required to spatially resolve the laminar-viscous sublayer. Consequently, a substantially increased computational integration step size is achieved, resulting in a considerable decrease in net computational time. This wall-function technique is demonstrated for adiabatic flat plate test cases from Mach 2 to Mach 8. These test cases are verified analytically employing: (1) Eckert reference method solutions, (2) experimental turbulent boundary-layer data of Mabey, and (3) finite-difference computational code solutions with fully resolved laminar-viscous sublayers. Additionally, results have been obtained for two pressure-gradient cases: (1) an adiabatic expansion corner and (2) an adiabatic compression corner.
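
The wall-function idea of recovering wall shear stress analytically, without resolving the sublayer, can be sketched with a fixed-point iteration on the incompressible log law (an illustrative simplification; Barnwell's compressible defect wall-function formulation is more involved):

```python
import math

def log_law_friction_velocity(u, y, nu, kappa=0.41, B=5.0, iters=50):
    """Fixed-point iteration for the friction velocity u_tau from the log law
    u/u_tau = (1/kappa) * ln(y+) + B, where y+ = y * u_tau / nu.
    Valid for a first grid point in the log layer (y+ roughly 30-300)."""
    u_tau = 0.05 * u  # initial guess
    for _ in range(iters):
        u_tau = kappa * u / (math.log(y * u_tau / nu) + kappa * B)
    return u_tau

# Tangential velocity u at wall distance y; air-like kinematic viscosity.
u, y, nu, rho = 10.0, 1.0e-3, 1.5e-5, 1.2
u_tau = log_law_friction_velocity(u, y, nu)
tau_wall = rho * u_tau**2          # wall shear stress, no sublayer resolution
y_plus = y * u_tau / nu
print(f"u_tau = {u_tau:.4f} m/s, tau_w = {tau_wall:.4f} Pa, y+ = {y_plus:.1f}")
```

Because tau_wall comes from this closed relation, the first grid point can sit far outside the viscous sublayer, which is the source of the step-size savings described above.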

  9. Functional Specifications for Computer Aided Training Systems Development and Management (CATSDM) Support Functions. Final Report.

    ERIC Educational Resources Information Center

    Hughes, John; And Others

    This report provides a description of a Computer Aided Training System Development and Management (CATSDM) environment based on state-of-the-art hardware and software technology, and including recommendations for off the shelf systems to be utilized as a starting point in addressing the particular systematic training and instruction design and…

  10. Do Tasks Make a Difference? Accounting for Heterogeneity of Performance of Children with Reading Difficulties on Tasks of Executive Function: Findings from a Meta-Analysis

    ERIC Educational Resources Information Center

    Booth, Josephine N.; Boyle, James M. E.; Kelly, Steve W.

    2010-01-01

    Research studies have implicated executive functions in reading difficulties (RD). But while some studies have found children with RD to be impaired on tasks of executive function other studies report unimpaired performance. A meta-analysis was carried out to determine whether these discrepant findings can be accounted for by differences in the…

  11. An analytic method to account for drag in the Vinti satellite theory. [computer program using quadrature algorithm

    NASA Technical Reports Server (NTRS)

    Watson, J. S.; Mistretta, G. D.; Bonavito, N. L.

    1975-01-01

    A quadrature algorithm is presented which employs analytical expressions for the variations of satellite orbital elements caused by air drag. The Hamiltonian is formally preserved and the Jacobi constants of the motion are advanced with time through the variational equations. The atmospheric density profile is written as a fitted exponential function of the eccentric anomaly, which adheres to tabulated data at all altitudes and simultaneously reduces the variational equations to definite integrals with closed-form evaluations, whose limits are in terms of the eccentric anomaly. Results are given for two satellites subject to intense air drag and indicate that the satellite ephemerides produced by this method in conjunction with the Vinti program are of very high accuracy.

  12. The default-mode, ego-functions and free-energy: a neurobiological account of Freudian ideas

    PubMed Central

    Friston, K. J.

    2010-01-01

    This article explores the notion that Freudian constructs may have neurobiological substrates. Specifically, we propose that Freud’s descriptions of the primary and secondary processes are consistent with self-organized activity in hierarchical cortical systems and that his descriptions of the ego are consistent with the functions of the default-mode and its reciprocal exchanges with subordinate brain systems. This neurobiological account rests on a view of the brain as a hierarchical inference or Helmholtz machine. In this view, large-scale intrinsic networks occupy supraordinate levels of hierarchical brain systems that try to optimize their representation of the sensorium. This optimization has been formulated as minimizing a free-energy; a process that is formally similar to the treatment of energy in Freudian formulations. We substantiate this synthesis by showing that Freud’s descriptions of the primary process are consistent with the phenomenology and neurophysiology of rapid eye movement sleep, the early and acute psychotic state, the aura of temporal lobe epilepsy and hallucinogenic drug states. PMID:20194141

  13. Fluorescence microscopy point spread function model accounting for aberrations due to refractive index variability within a specimen.

    PubMed

    Ghosh, Sreya; Preza, Chrysanthe

    2015-07-01

    A three-dimensional (3-D) point spread function (PSF) model for wide-field fluorescence microscopy, suitable for imaging samples with variable refractive index (RI) in multilayered media, is presented. This PSF model is a key component for accurate 3-D image restoration of thick biological samples, such as lung tissue. Microscope- and specimen-derived parameters are combined with a rigorous vectorial formulation to obtain a new PSF model that accounts for additional aberrations due to specimen RI variability. Experimental evaluation and verification of the PSF model was accomplished using images from 175-nm fluorescent beads in a controlled test sample. Fundamental experimental validation of the advantage of using improved PSFs in depth-variant restoration was accomplished by restoring experimental data from beads (6 μm in diameter) mounted in a sample with RI variation. In this study, improvement in restoration accuracy in the range of 18 to 35% was observed when PSFs from the proposed model were used over restoration using PSFs from an existing model. The new PSF model was further validated by showing that its prediction compares to an experimental PSF (determined from 175-nm beads located below a thick rat lung slice) with a 42% improved accuracy over the current PSF model prediction. PMID:26154937

  14. Toward high-resolution computational design of the structure and function of helical membrane proteins.

    PubMed

    Barth, Patrick; Senes, Alessandro

    2016-06-01

    The computational design of α-helical membrane proteins is still in its infancy but has already made great progress. De novo design allows stable, specific and active minimal oligomeric systems to be obtained. Computational reengineering can improve the stability and function of naturally occurring membrane proteins. Currently, the major hurdle for the field is the experimental characterization of the designs. The emergence of new structural methods for membrane proteins will accelerate progress. PMID:27273630

  15. Locating and computing in parallel all the simple roots of special functions using PVM

    NASA Astrophysics Data System (ADS)

    Plagianakos, V. P.; Nousis, N. K.; Vrahatis, M. N.

    2001-08-01

    An algorithm is proposed for locating and computing in parallel and with certainty all the simple roots of any twice continuously differentiable function in any specific interval. To compute with certainty all the roots, the proposed method is heavily based on the knowledge of the total number of roots within the given interval. To obtain this information we use results from topological degree theory and, in particular, the Kronecker-Picard approach. This theory gives a formula for the computation of the total number of roots of a system of equations within a given region, which can be computed in parallel. With this tool in hand, we construct a parallel procedure for the localization and isolation of all the roots by dividing the given region successively and applying the above formula to these subregions until the final domains contain at most one root. The subregions with no roots are discarded, while for the rest a modification of the well-known bisection method is employed for the computation of the contained root. The new aspect of the present contribution is that the computation of the total number of zeros using the Kronecker-Picard integral, as well as the localization and computation of all the roots, is performed in parallel using the parallel virtual machine (PVM). PVM is an integrated set of software tools and libraries that emulates a general-purpose, flexible, heterogeneous concurrent computing framework on interconnected computers of varied architectures. The proposed algorithm has large granularity and low synchronization, and is robust. It has been implemented and tested, and in our experience it can compute with certainty all the roots in a given interval, even in massive computations. Performance information from massive computations related to a recently proposed conjecture due to Elbert (this issue, J. Comput. Appl. Math. 133 (2001) 65-83) is reported.
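
A serial sketch of the subdivide-isolate-bisect strategy is below. For simplicity it brackets roots by sign changes on a fixed grid of cells, whereas the paper certifies the root count in each subregion with the Kronecker-Picard integral before discarding it (which is what guarantees no root is missed):

```python
import math

def isolate_and_bisect(f, a, b, cells=200, tol=1e-12):
    """Subdivide [a, b] into cells, keep cells whose endpoints bracket a sign
    change (discarding the rest), then bisect each bracketing cell to tol.
    Simple roots of a continuous f produce exactly such sign changes when the
    cells are fine enough."""
    roots = []
    xs = [a + (b - a) * i / cells for i in range(cells + 1)]
    for lo, hi in zip(xs, xs[1:]):
        flo, fhi = f(lo), f(hi)
        if flo == 0.0:
            roots.append(lo)
            continue
        if flo * fhi > 0.0:
            continue              # no bracketed root: discard this cell
        while hi - lo > tol:      # bisection on the isolated root
            mid = 0.5 * (lo + hi)
            if flo * f(mid) <= 0.0:
                hi = mid
            else:
                lo, flo = mid, f(mid)
        roots.append(0.5 * (lo + hi))
    return roots

roots = isolate_and_bisect(math.sin, 0.5, 10.0)
print(roots)  # the three roots near pi, 2*pi, 3*pi
```

In the parallel scheme the cells (and the degree computation per cell) are what get farmed out to PVM workers, since each cell is processed independently.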

  16. Extended Krylov subspaces approximations of matrix functions. Application to computational electromagnetics

    SciTech Connect

    Druskin, V.; Lee, Ping; Knizhnerman, L.

    1996-12-31

    There is now a growing interest in the area of using Krylov subspace approximations to compute the actions of matrix functions. The main application of this approach is the solution of ODE systems obtained after discretization of partial differential equations by the method of lines. When applying the matrix inverse is relatively inexpensive, it is sometimes attractive to solve the ODE using the extended Krylov subspaces, generated by the actions of both positive and negative matrix powers. Examples of such problems can be found frequently in computational electromagnetics.
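
A minimal NumPy sketch of the extended Krylov idea, assuming a symmetric positive definite matrix and f(x) = exp(-x): the subspace is spanned by both positive and negative powers of A acting on v, and f(A)v is approximated by projecting A onto that subspace:

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric positive definite test matrix with a known eigendecomposition.
n = 200
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
lam = np.linspace(0.1, 10.0, n)
A = (Q * lam) @ Q.T
v = rng.standard_normal(n)

# Extended Krylov basis: spans {A^-m v, ..., A^-1 v, v, A v, ..., A^m v}.
m = 10
vecs = [v]
wp, wn = v, v
for _ in range(m):
    wp = A @ wp                   # positive powers
    wn = np.linalg.solve(A, wn)   # negative powers: one solve per vector
    vecs += [wp, wn]
B, _ = np.linalg.qr(np.column_stack(vecs))   # orthonormal basis

# Project, evaluate f on the small matrix, lift back: f(A)v ~ B f(H) B^T v.
H = B.T @ A @ B
w, U = np.linalg.eigh((H + H.T) / 2)
fH = (U * np.exp(-w)) @ U.T
approx = B @ (fH @ (B.T @ v))

exact = (Q * np.exp(-lam)) @ Q.T @ v
err = np.linalg.norm(approx - exact) / np.linalg.norm(exact)
print(f"relative error with {B.shape[1]} basis vectors: {err:.2e}")
```

The inverse powers are exactly the extra cost the abstract refers to: each one is a linear solve, worthwhile when the inverse action is cheap (e.g. a precomputed factorization).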

  17. Renormalization group improved computation of correlation functions in theories with nontrivial phase diagram

    NASA Astrophysics Data System (ADS)

    Codello, Alessandro; Tonero, Alberto

    2016-07-01

    We present a simple and consistent way to compute correlation functions in interacting theories with a nontrivial phase diagram. As an example, we show how to consistently compute the four-point function in three-dimensional Z2-scalar theories. The idea is to perform the path integral by weighting the momentum modes that contribute to it according to their renormalization group (RG) relevance, i.e. we weight each mode according to the value of the running couplings at that scale. In this way, we are able to encode in a loop computation the information regarding the RG trajectory along which we are integrating. We show that depending on the initial condition, or initial point in the phase diagram, we obtain different behaviors of the four-point function at the endpoint of the flow.

  18. On computational algorithms for real-valued continuous functions of several variables.

    PubMed

    Sprecher, David

    2014-11-01

    The subject of this paper is algorithms for computing superpositions of real-valued continuous functions of several variables based on space-filling curves. The prototypes of these algorithms were based on Kolmogorov's dimension-reducing superpositions (Kolmogorov, 1957). Interest in these grew significantly with Hecht-Nielsen's discovery that a version of Kolmogorov's formula has an interpretation as a feedforward neural network (Hecht-Nielsen, 1987). These superpositions were constructed with devil's staircase-type functions to answer a question in functional complexity, rather than to serve as computational algorithms, and their utility as an efficient computational tool turned out to be limited by the characteristics of the space-filling curves that they determined. After discussing the link between the algorithms and these curves, this paper presents two algorithms for the case of two variables: one based on space-filling curves with worked-out coding, and one based on the Hilbert curve (Hilbert, 1891).
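
For reference, the second algorithm family rests on the Hilbert curve; the standard bit-twiddling index-to-coordinate map for that curve (a well-known construction, not the paper's code) is:

```python
def d2xy(n, d):
    """Map index d along the order-n Hilbert curve (n a power of two) to
    (x, y) on the n-by-n grid; classic rotate-and-flip construction."""
    x = y = 0
    t = d
    s = 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:               # rotate/flip the quadrant
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

pts = [d2xy(16, d) for d in range(16 * 16)]
assert len(set(pts)) == 256                       # visits every cell once
assert all(abs(a[0] - b[0]) + abs(a[1] - b[1]) == 1
           for a, b in zip(pts, pts[1:]))         # consecutive cells adjacent
print("order-16 Hilbert curve is a space-filling Hamiltonian path")
```

The adjacency property checked here (consecutive indices map to neighboring cells) is what makes the curve attractive for dimension reduction: closeness in one dimension implies closeness in two.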

  19. The Environmental Impacts of a Desktop Computer: Influence of Choice of Functional Unit, System Boundary and User Behaviour

    NASA Astrophysics Data System (ADS)

    Simanovska, J.; Šteina, Māra; Valters, K.; Bažbauers, G.

    2009-01-01

    Pollution prevention during the design phase of products and processes is gaining importance in environmental policy over the older, better-established principle of end-of-pipe pollution reduction. This approach requires predicting the potential environmental impacts to be avoided or reduced and prioritising the most efficient areas for action. Currently the most appropriate method for this purpose is life cycle assessment (LCA), a method for accounting and attributing all environmental impacts which arise during the lifetime of a product, starting with the production of raw materials and ending with disposal or recycling of the product at the end of its life. The LCA, however, can be misleading if the performers of the study disregard gaps in information and the limitations of the chosen methodology. In this study we assessed the environmental impact of desktop computers, using a simplified LCA method (Indicators' 99) and developing various scenarios (changing service life, user behaviour, energy supply, etc.). The study demonstrates that actions for improvement lie in very different areas. It also concludes that the approach to defining the functional unit must be sufficiently flexible to avoid discounting areas of potential action. Therefore, with regard to computers, we agree with other authors using the functional unit "one computer" but suggest not binding it to service life or usage time, but rather developing several scenarios varying these parameters. The study also demonstrates the importance of a systemic approach when assessing complex product systems: the more complex the system, the broader the scope for potential actions. We conclude that, for computers, which are energy-using and material-intensive products, the measures to reduce environmental impacts lie not only with the producer and user of the particular product, but also with the whole national energy supply and waste management systems.

  20. Performance of computational tools in evaluating the functional impact of laboratory-induced amino acid mutations.

    PubMed

    Gray, Vanessa E; Kukurba, Kimberly R; Kumar, Sudhir

    2012-08-15

    Site-directed mutagenesis is frequently used by scientists to investigate the functional impact of amino acid mutations in the laboratory. Over 10,000 such laboratory-induced mutations have been reported in the UniProt database along with the outcomes of functional assays. Here, we explore the performance of state-of-the-art computational tools (Condel, PolyPhen-2 and SIFT) in correctly annotating the function-altering potential of 10,913 laboratory-induced mutations from 2372 proteins. We find that computational tools are very successful in diagnosing laboratory-induced mutations that elicit significant functional change in the laboratory (up to 92% accuracy). But these tools consistently fail in correctly annotating laboratory-induced mutations that show no functional impact in the laboratory assays. Therefore, the overall accuracy of computational tools for laboratory-induced mutations is much lower than that observed for the naturally occurring human variants. We tested and rejected the possibilities that the preponderance of changes to alanine and the presence of multiple base-pair mutations in the laboratory were the reasons for the observed discordance between the performance of computational tools for natural and laboratory mutations. Instead, we discover that the laboratory-induced mutations occur predominantly at the highly conserved positions in proteins, where the computational tools have the lowest accuracy of correct prediction for variants that do not impact function (neutral). Therefore, the comparisons of experimental-profiling results with those from computational predictions need to be sensitive to the evolutionary conservation of the positions harboring the amino acid change. PMID:22685075

  1. Projection of Young-Old and Old-Old with Functional Disability: Does Accounting for the Changing Educational Composition of the Elderly Population Make a Difference?

    PubMed Central

    Ansah, John P.; Malhotra, Rahul; Lew, Nicola; Chiu, Chi-Tsun; Chan, Angelique; Bayer, Steffen; Matchar, David B.

    2015-01-01

    This study compares projections, up to year 2040, of young-old (aged 60-79) and old-old (aged 80+) with functional disability in Singapore with and without accounting for the changing educational composition of the Singaporean elderly. Two multi-state population models, with and without accounting for educational composition respectively, were developed, parameterized with age-gender-(education)-specific transition probabilities (between active, functional disability and death states) estimated from two waves (2009 and 2011) of a nationally representative survey of community-dwelling Singaporeans aged ≥60 years (N=4,990). Probabilistic sensitivity analysis with the bootstrap method was used to obtain the 95% confidence interval of the transition probabilities. Not accounting for educational composition overestimated the young-old with functional disability by 65 percent and underestimated the old-old by 20 percent in 2040. Accounting for educational composition, the proportion of old-old with functional disability increased from 40.8 percent in 2000 to 64.4 percent by 2040; not accounting for educational composition, the proportion in 2040 was 49.4 percent. Since the health profiles, and hence care needs, of the old-old differ from those of the young-old, health care service utilization and expenditure and the demand for formal and informal caregiving will be affected, impacting health and long-term care policy. PMID:25974069
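
The multi-state projection logic can be sketched with hypothetical transition matrices (the study's are estimated by age, gender and education from survey data); stratifying by education changes the projected disabled count even when the total cohort is the same:

```python
# States: 0 = active, 1 = functionally disabled, 2 = dead.
# Hypothetical 2-year transition matrices by education stratum (rows sum to 1).
P_low  = [[0.82, 0.12, 0.06],
          [0.05, 0.80, 0.15],
          [0.00, 0.00, 1.00]]
P_high = [[0.88, 0.07, 0.05],
          [0.08, 0.79, 0.13],
          [0.00, 0.00, 1.00]]

def step(counts, P):
    """Advance state counts one period through transition matrix P."""
    return [sum(counts[i] * P[i][j] for i in range(3)) for j in range(3)]

def project(counts, P, periods):
    for _ in range(periods):
        counts = step(counts, P)
    return counts

# Same total cohort (1100), different treatment of educational composition.
mixed = [sum(c) for c in zip(project([600.0, 60.0, 0.0], P_low, 5),
                             project([400.0, 40.0, 0.0], P_high, 5))]
uniform = project([1000.0, 100.0, 0.0], P_low, 5)
print(f"disabled after 10 years: stratified {mixed[1]:.0f}, "
      f"unstratified {uniform[1]:.0f}")
```

The gap between the two projections is the sketch-level analogue of the 65/20 percent over- and underestimates reported above.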

  3. Analysis and selection of optimal function implementations in massively parallel computer

    DOEpatents

    Archer, Charles Jens; Peters, Amanda; Ratterman, Joseph D.

    2011-05-31

    An apparatus, program product and method optimize the operation of a parallel computer system by, in part, collecting performance data for a set of implementations of a function capable of being executed on the parallel computer system based upon the execution of the set of implementations under varying input parameters in a plurality of input dimensions. The collected performance data may be used to generate selection program code that is configured to call selected implementations of the function in response to a call to the function under varying input parameters. The collected performance data may be used to perform more detailed analysis to ascertain the comparative performance of the set of implementations of the function under the varying input parameters.
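
The selection mechanism can be sketched as follows, with entirely hypothetical performance data: runtimes are collected per implementation and input size, and a generated selector dispatches to the fastest implementation for the requested input regime:

```python
from bisect import bisect_left

def build_selector(perf_data):
    """From collected {implementation: {input_size: runtime}} measurements,
    generate a selector that returns the fastest implementation for a
    requested input size (using the next measured size at or above it)."""
    sizes = sorted({n for times in perf_data.values() for n in times})
    best = {n: min(perf_data, key=lambda impl: perf_data[impl][n])
            for n in sizes}

    def selector(n):
        i = min(bisect_left(sizes, n), len(sizes) - 1)
        return best[sizes[i]]

    return selector

# Hypothetical collected performance data (seconds) for two implementations.
perf = {
    "small_n_impl": {10: 1e-6, 100: 1e-5, 1000: 9e-4, 10000: 8e-2},
    "large_n_impl": {10: 5e-6, 100: 2e-5, 1000: 3e-4, 10000: 4e-3},
}
select = build_selector(perf)
print(select(100), select(10000))
```

The patented scheme additionally varies several input dimensions at once and emits the selection logic as generated program code rather than a closure, but the dispatch-on-measured-performance idea is the same.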

  4. Use of global functions for improvement in efficiency of nonlinear analysis. [in computer structural displacement estimation

    NASA Technical Reports Server (NTRS)

    Almroth, B. O.; Stehlin, P.; Brogan, F. A.

    1981-01-01

    A method for improving the efficiency of nonlinear structural analysis by the use of global displacement functions is presented. The computer programs include options to define the global functions as input or let the program automatically select and update these functions. The program was applied to a number of structures: (1) 'pear-shaped cylinder' in compression, (2) bending of a long cylinder, (3) spherical shell subjected to point force, (4) panel with initial imperfections, (5) cylinder with cutouts. The sample cases indicate the usefulness of the procedure in the solution of nonlinear structural shell problems by the finite element method. It is concluded that the use of global functions for extrapolation will lead to savings in computer time.

  5. The Krigifier: A Procedure for Generating Pseudorandom Nonlinear Objective Functions for Computational Experimentation

    NASA Technical Reports Server (NTRS)

    Trosset, Michael W.

    1999-01-01

    Comprehensive computational experiments to assess the performance of algorithms for numerical optimization require (among other things) a practical procedure for generating pseudorandom nonlinear objective functions. We propose a procedure that is based on the convenient fiction that objective functions are realizations of stochastic processes. This report details the calculations necessary to implement our procedure for the case of certain stationary Gaussian processes and presents a specific implementation in the statistical programming language S-PLUS.
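
The "convenient fiction" translates directly into code: draw one realization of a stationary Gaussian process and use it as a reproducible pseudorandom test objective (a NumPy sketch of the idea; the report's implementation is in S-PLUS):

```python
import numpy as np

def krigifier_1d(n_points=200, length_scale=0.1, seed=42):
    """Draw one realization of a stationary Gaussian process on [0, 1]
    (squared-exponential covariance) to serve as a reproducible
    pseudorandom test objective on a grid."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 1.0, n_points)
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / length_scale) ** 2)
    L = np.linalg.cholesky(K + 1e-8 * np.eye(n_points))  # jitter: stability
    f = L @ rng.standard_normal(n_points)
    return x, f

x, f = krigifier_1d()
x2, f2 = krigifier_1d()
assert np.allclose(f, f2)   # same seed -> same objective, as benchmarking needs
print(f"objective sampled at {x.size} points, "
      f"min {f.min():.3f} at x = {x[np.argmin(f)]:.3f}")
```

Fixing the seed gives every optimizer in a comparison the identical landscape, while varying the seed generates a whole family of structurally similar test problems.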

  6. Monte Carlo computation of the spectral density function in the interacting scalar field theory

    NASA Astrophysics Data System (ADS)

    Abbasi, Navid; Davody, Ali

    2015-12-01

    We study the ϕ4 field theory in d = 4. Using the bold diagrammatic Monte Carlo method, we solve the Schwinger-Dyson equations and find the spectral density function of the theory beyond the weak-coupling regime. We then compare our result with the one obtained from perturbation theory. Finally, we utilize our Monte Carlo result to find the vertex function as the basis for the computation of physical scattering amplitudes.

  7. MRIVIEW: An interactive computational tool for investigation of brain structure and function

    SciTech Connect

    Ranken, D.; George, J.

    1993-12-31

    MRIVIEW is a software system which uses image processing and visualization to provide neuroscience researchers with an integrated environment for combining functional and anatomical information. Key features of the software include semi-automated segmentation of volumetric head data and an interactive coordinate reconciliation method which utilizes surface visualization. The current system is a precursor to a computational brain atlas. We describe features this atlas will incorporate, including methods under development for visualizing brain functional data obtained from several different research modalities.

  8. A fast computation method for MUSIC spectrum function based on circular arrays

    NASA Astrophysics Data System (ADS)

    Du, Zhengdong; Wei, Ping

    2015-02-01

    The large computational cost of evaluating the multiple signal classification (MUSIC) spectrum function seriously affects the timeliness of direction-finding systems using the MUSIC algorithm, especially in two-dimensional direction-of-arrival (DOA) estimation of azimuth and elevation with a large antenna array. This paper proposes a fast computation method for the MUSIC spectrum that is suitable for any circular array. First, the circular array is transformed into a virtual uniform circular array. Then, exploiting the cyclic structure of the steering vector, the inner products in the spatial-spectrum calculation are realised by cyclic convolution. The computational load of the MUSIC spectrum is thus markedly lower than that of the conventional method, making this a very practical approach to MUSIC spectrum computation for circular arrays.
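
For context, the conventional (direct) MUSIC spectrum that the paper accelerates looks as follows; this sketch uses a half-wavelength uniform linear array for brevity, whereas the paper's cyclic-convolution speed-up targets the equivalent inner products for circular arrays:

```python
import numpy as np

def music_spectrum(R, n_sources, m, grid_deg):
    """Conventional MUSIC pseudospectrum for an m-element half-wavelength
    uniform linear array: for each candidate angle, project the steering
    vector onto the noise subspace of the covariance matrix R."""
    w, E = np.linalg.eigh(R)          # eigenvalues ascending
    En = E[:, : m - n_sources]        # noise subspace
    spec = []
    for theta in grid_deg:
        a = np.exp(-1j * np.pi * np.arange(m) * np.sin(np.deg2rad(theta)))
        spec.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return np.array(spec)

m, theta_true = 8, 20.0
a = np.exp(-1j * np.pi * np.arange(m) * np.sin(np.deg2rad(theta_true)))
R = np.outer(a, a.conj()) + 0.1 * np.eye(m)   # one source plus noise floor
grid = np.arange(-90.0, 90.0, 0.25)
est = grid[np.argmax(music_spectrum(R, 1, m, grid))]
print(f"estimated DOA: {est:.2f} degrees")
```

The inner loop above evaluates one inner product per grid angle per noise eigenvector; it is exactly this load, blown up for 2-D azimuth-elevation grids, that the cyclic-convolution reformulation reduces.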

  9. Computer-program documentation of an interactive-accounting model to simulate streamflow, water quality, and water-supply operations in a river basin

    USGS Publications Warehouse

    Burns, A.W.

    1988-01-01

    This report describes an interactive-accounting model used to simulate streamflow, chemical-constituent concentrations and loads, and water-supply operations in a river basin. The model uses regression equations to compute flow from incremental (internode) drainage areas. Conservative chemical constituents (typically dissolved solids) also are computed from regression equations. Both flow and water-quality loads are accumulated downstream. Optionally, the model simulates the water use and the simplified groundwater systems of a basin. Water users include agricultural, municipal, industrial, and in-stream users, and reservoir operators. Water users list their potential water sources, including direct diversions, groundwater pumpage, interbasin imports, or reservoir releases, in the order in which they will be used. Direct diversions conform to basinwide water-law priorities. The model is interactive, and although the input data exist in files, the user can modify them interactively. A major feature of the model is its color-graphic-output options. This report includes a description of the model, organizational charts of subroutines, and examples of the graphics. Detailed format instructions for the input data, example files of input data, definitions of program variables, and a listing of the FORTRAN source code are attachments to the report. (USGS)
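
The downstream accumulation at the heart of the model can be sketched with a hypothetical four-node network (node names, flows and loads invented for illustration): incremental flow and a conservative constituent load enter at each node and are summed toward the outlet:

```python
# Hypothetical river network: node -> downstream node (None = basin outlet).
network = {"A": "C", "B": "C", "C": "D", "D": None}

# Incremental (internode) contributions, as the model's regression equations
# would supply them: node -> (flow in m3/s, conservative load in kg/day).
incremental = {"A": (2.0, 40.0), "B": (1.5, 25.0),
               "C": (0.5, 10.0), "D": (1.0, 5.0)}

def accumulate(order, network, incremental):
    """Single upstream-to-downstream pass summing incremental flow and
    conservative load from every upstream node into each node."""
    flow = {n: incremental[n][0] for n in order}
    load = {n: incremental[n][1] for n in order}
    for node in order:            # order must list upstream nodes first
        down = network[node]
        if down is not None:
            flow[down] += flow[node]
            load[down] += load[node]
    return flow, load

flow, load = accumulate(["A", "B", "C", "D"], network, incremental)
print(f"outlet flow {flow['D']} m3/s, load {load['D']} kg/day")
# Concentration at any node then follows as load / flow.
```

Diversions, pumpage and reservoir releases in the real model amount to additional signed terms entering this same pass in priority order.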

  10. Identifying Differential Item Functioning in Multi-Stage Computer Adaptive Testing

    ERIC Educational Resources Information Center

    Gierl, Mark J.; Lai, Hollis; Li, Johnson

    2013-01-01

    The purpose of this study is to evaluate the performance of CATSIB (Computer Adaptive Testing-Simultaneous Item Bias Test) for detecting differential item functioning (DIF) when items in the matching and studied subtest are administered adaptively in the context of a realistic multi-stage adaptive test (MST). MST was simulated using a 4-item…

  11. Integrating Computer Software into the Functional Mathematics Curriculum: A Diagnostic Approach.

    ERIC Educational Resources Information Center

    Prince George's County Public Schools, Upper Marlboro, MD.

    This curriculum guide was written to provide information on the skills covered in the Maryland Functional Math Test (MFMT) and to outline a process which will allow teachers to fully integrate computer software into their instruction. The materials produced in this directory are designed to assist mild to moderately handicapped students who will…

  12. Computing the Partial Fraction Decomposition of Rational Functions with Irreducible Quadratic Factors in the Denominators

    ERIC Educational Resources Information Center

    Man, Yiu-Kwong

    2012-01-01

    In this note, a new method for computing the partial fraction decomposition of rational functions with irreducible quadratic factors in the denominators is presented. This method involves polynomial divisions and substitutions only, without having to solve for the complex roots of the irreducible quadratic polynomial or to solve a system of linear…
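
The abstract does not reproduce the algorithm in full; the following sketch works a concrete example in the same spirit — only substitutions (the cover-up rule) and one exact polynomial division, with no complex roots and no linear system:

```python
from fractions import Fraction as F

# Decompose (3x + 5) / ((x - 1)(x^2 + x + 1))
#         = A/(x - 1) + (Bx + C)/(x^2 + x + 1).

def poly_eval(p, x):              # p = [a0, a1, ...] for a0 + a1*x + ...
    return sum(c * x**k for k, c in enumerate(p))

num  = [F(5), F(3)]               # numerator 3x + 5
quad = [F(1), F(1), F(1)]         # irreducible quadratic x^2 + x + 1
root = F(1)                       # root of the linear factor (x - 1)

# Cover-up substitution at x = 1 gives A directly.
A = poly_eval(num, root) / poly_eval(quad, root)

# Remainder polynomial num - A*quad is guaranteed divisible by (x - 1).
rem = [n - A * q for n, q in zip(num + [F(0)], quad)]

# Synthetic (Horner) division of rem by (x - 1) yields Bx + C exactly.
b2 = rem[2]                        # leading coefficient of the quotient
b1 = rem[1] + root * b2            # Horner step
assert rem[0] + root * b1 == 0     # division remainder must vanish
B, C = b2, b1

print(f"A = {A}, B = {B}, C = {C}")
```

Exact rational arithmetic makes the zero-remainder check a genuine correctness certificate; the identity can also be spot-checked by evaluating both sides at any point away from the poles.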

  13. A Computational Model Quantifies the Effect of Anatomical Variability on Velopharyngeal Function

    ERIC Educational Resources Information Center

    Inouye, Joshua M.; Perry, Jamie L.; Lin, Kant Y.; Blemker, Silvia S.

    2015-01-01

    Purpose: This study predicted the effects of velopharyngeal (VP) anatomical parameters on VP function to provide a greater understanding of speech mechanics and aid in the treatment of speech disorders. Method: We created a computational model of the VP mechanism using dimensions obtained from magnetic resonance imaging measurements of 10 healthy…

  14. PuFT: Computer-Assisted Program for Pulmonary Function Tests.

    ERIC Educational Resources Information Center

    Boyle, Joseph

    1983-01-01

    PuFT computer program (Microsoft Basic) is designed to help in understanding/interpreting pulmonary function tests (PFT). The program provides predicted values for common PFT after entry of patient data, calculates/plots graph simulating force vital capacity (FVC), and allows observations of effects on predicted PFT values and FVC curve when…

  15. Maple (Computer Algebra System) in Teaching Pre-Calculus: Example of Absolute Value Function

    ERIC Educational Resources Information Center

    Tuluk, Güler

    2014-01-01

    Modules in Computer Algebra Systems (CAS) make Mathematics interesting and easy to understand. The present study focused on the implementation of the algebraic, tabular (numerical), and graphical approaches used for the construction of the concept of absolute value function in teaching mathematical content knowledge along with Maple 9. The study…

  16. Accounting for polarization cost when using fixed charge force fields. II. Method and application for computing effect of polarization cost on free energy of hydration.

    PubMed

    Swope, William C; Horn, Hans W; Rice, Julia E

    2010-07-01

    Polarization cost is the energy needed to distort the wave function of a molecule from one appropriate to the gas phase to one appropriate for some condensed phase. Although it is not currently standard practice, polarization cost should be considered when deriving improved fixed charge force fields based on fits to certain types of experimental data and when using such force fields to compute observables that involve changes in molecular polarization. Building on earlier work, we present mathematical expressions and a method to estimate the effect of polarization cost on free energy and enthalpy implied by a charge model meant to represent a solvated state. The charge model can be any combination of point charges, higher-order multipoles, or even distributed charge densities, as long as they do not change in response to environment. The method is illustrated by computing the effect of polarization cost on free energies of hydration for the neutral amino acid side chain analogues as predicted using two popular fixed charge force fields and one based on electron densities computed using quantum chemistry techniques that employ an implicit model to represent aqueous solvent. From comparison of the computed and experimental hydration free energies, we find that two commonly used force fields are too underpolarized in their description of the solute-water interaction. On the other hand, a charge model based on the charge density from a hybrid density functional calculation that used an implicit model for aqueous solvent performs well for hydration free energies of these molecules after the correction for dipole polarization is applied. As such, an improved description of the density (e.g., B3LYP, MP2) in conjunction with an implicit solvent (e.g., PCM) or explicit solvent (e.g., QM/MM) approach may offer promise as a starting point for the development of improved fixed charge models for force fields.

  17. Computer generation of symbolic network functions - A new theory and implementation.

    NASA Technical Reports Server (NTRS)

    Alderson, G. E.; Lin, P.-M.

    1972-01-01

    A new method is presented for obtaining network functions in which some, none, or all of the network elements are represented by symbolic parameters (i.e., symbolic network functions). Unlike the topological tree enumeration or signal flow graph methods generally used to derive symbolic network functions, the proposed procedure employs fast, efficient, numerical-type algorithms to determine the contribution of those network branches that are not represented by symbolic parameters. A computer program called NAPPE (for Network Analysis Program using Parameter Extractions) and incorporating all of the concepts discussed has been written. Several examples illustrating the usefulness and efficiency of NAPPE are presented.

  18. On computation and use of Fourier coefficients for associated Legendre functions

    NASA Astrophysics Data System (ADS)

    Gruber, Christian; Abrykosov, Oleh

    2016-06-01

The computation of spherical harmonic series at very high resolution is known to be delicate in terms of performance and numerical stability. A major problem is keeping results inside the numerical range of the data type used during calculations, as under- or overflow arises. Extended data types are currently not desirable, since the arithmetic complexity will grow exponentially with higher resolution levels. If the associated Legendre functions are computed in the spectral domain, then regular grid transformations can be applied to be highly efficient and convenient for derived quantities as well. In this article, we compare three recursive computations of the associated Legendre functions as trigonometric series, thereby ensuring a defined numerical range for each constituent wave number separately. The results to a high degree and order show the numerical strength of the proposed method. First, the evaluation of Fourier coefficients of the associated Legendre functions has been done with respect to the floating-point precision requirements. Secondly, the numerical accuracy in the cases of standard double and long double precision arithmetic is demonstrated. Following Bessel's inequality, the obtained accuracy estimates of the Fourier coefficients are directly transferable to the associated Legendre functions themselves and to derived functionals as well. Therefore, they can provide essential insight for modern geodetic applications that depend on efficient spherical harmonic analysis and synthesis beyond 5 × 5 arcmin resolution.

  19. Response function theories that account for size distribution effects - A review. [mathematical models concerning composite propellant heterogeneity effects on combustion instability

    NASA Technical Reports Server (NTRS)

    Cohen, N. S.

    1980-01-01

    The paper presents theoretical models developed to account for the heterogeneity of composite propellants in expressing the pressure-coupled combustion response function. It is noted that the model of Lengelle and Williams (1968) furnishes a viable basis to explain the effects of heterogeneity.

  20. How to Compute Green's Functions for Entire Mass Trajectories Within Krylov Solvers

    NASA Astrophysics Data System (ADS)

    Glässner, Uwe; Güsken, Stephan; Lippert, Thomas; Ritzenhöfer, Gero; Schilling, Klaus; Frommer, Andreas

The availability of efficient Krylov subspace solvers plays a vital role in the solution of a variety of numerical problems in computational science. Here we consider lattice field theory. We present a new general numerical method to compute many Green's functions for complex non-singular matrices within one iteration process. Our procedure applies to matrices of structure A = D - m, with m proportional to the unit matrix, and can be integrated within any Krylov subspace solver. We can compute the derivatives x(n) of the solution vector x with respect to the parameter m and construct the Taylor expansion of x around m. We demonstrate the advantages of our method using a minimal residual solver. Here the procedure requires one intermediate vector for each Green's function computed. As a real-life example, we determine a mass trajectory of the Wilson fermion matrix for lattice QCD. Here we find that we can obtain Green's functions at all masses ≥ m at the price of one inversion at mass m.
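The trick rests on the shift invariance of Krylov subspaces: K_k(A - mI, b) = K_k(A, b), so a single Arnoldi process can serve every mass. The following is a minimal NumPy sketch using a Galerkin (FOM-style) projection, not the authors' minimal residual variant; the matrix and shifts are invented test data.

```python
import numpy as np

def arnoldi(A, b, k):
    """Build an orthonormal Krylov basis Q and Hessenberg matrix H."""
    n = len(b)
    Q = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A @ Q[:, j]
        for i in range(j + 1):          # modified Gram-Schmidt
            H[i, j] = Q[:, i] @ w
            w -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        Q[:, j + 1] = w / H[j + 1, j]
    return Q, H

def shifted_solve(A, b, shifts, k):
    """Approximate x_m = (A - m*I)^{-1} b for all shifts from ONE basis."""
    Q, H = arnoldi(A, b, k)
    e1 = np.zeros(k)
    e1[0] = np.linalg.norm(b)
    # Shifting A only shifts the diagonal of the small projected system.
    return {m: Q[:, :k] @ np.linalg.solve(H[:k, :k] - m * np.eye(k), e1)
            for m in shifts}

rng = np.random.default_rng(0)
n = 60
A = np.diag(np.linspace(2.0, 5.0, n)) + 0.01 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
xs = shifted_solve(A, b, shifts=[0.0, 0.5, 1.0], k=40)
for m, x in xs.items():
    assert np.linalg.norm((A - m * np.eye(n)) @ x - b) < 1e-6
```

The cost of the extra masses is only the small k-by-k solves, which matches the paper's claim of one inversion covering a whole mass trajectory.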

  1. Algorithms for Efficient Computation of Transfer Functions for Large Order Flexible Systems

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Giesy, Daniel P.

    1998-01-01

An efficient and robust computational scheme is given for the calculation of the frequency response function of a large order, flexible system implemented with a linear, time invariant control system. Advantage is taken of the highly structured sparsity of the system matrix of the plant based on a model of the structure using normal mode coordinates. The computational time per frequency point of the new computational scheme is a linear function of system size, a significant improvement over traditional, full-matrix techniques whose computational times per frequency point range from quadratic to cubic functions of system size. This permits the practical frequency domain analysis of systems of much larger order than by traditional, full-matrix techniques. Formulations are given for both open- and closed-loop systems. Numerical examples are presented showing the advantages of the present formulation over traditional approaches, both in speed and in accuracy. Using a model with 703 structural modes, the present method was up to two orders of magnitude faster than a traditional method. The present method generally showed good to excellent accuracy throughout the range of test frequencies, while traditional methods gave adequate accuracy for lower frequencies, but generally deteriorated in performance at higher frequencies with worst case errors being many orders of magnitude times the correct values.
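The structural idea can be sketched in a few lines (this is a hedged illustration, not the paper's algorithm): in normal mode coordinates the open-loop plant matrix is diagonal, so the resolvent (iωI - A)^{-1} is an elementwise division and each frequency point of H(ω) = cᵀ(iωI - A)^{-1} b costs O(n) rather than the O(n³) of a dense solve. The mode data below are invented.

```python
# O(n)-per-frequency transfer function evaluation for a diagonal plant.
def frf_modal(eigvals, b, c, freqs):
    """H(w) = sum_k c[k] * b[k] / (i*w - lambda_k) for A = diag(eigvals)."""
    return [sum(ck * bk / (1j * w - lk) for ck, bk, lk in zip(c, b, eigvals))
            for w in freqs]

# Three lightly damped modes: eigenvalue pairs -zeta*wn +/- i*wn (invented).
wn = [1.0, 3.0, 7.0]
zeta = 0.02
lam = [complex(-zeta * w, s * w) for w in wn for s in (+1, -1)]
b = [1.0] * len(lam)
c = [1.0] * len(lam)

on_res, off_res = frf_modal(lam, b, c, [3.0, 5.0])
assert abs(on_res) > abs(off_res)  # response peaks near a natural frequency
```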

  2. Computational aspects of maximum likelihood estimation and reduction in sensitivity function calculations

    NASA Technical Reports Server (NTRS)

    Gupta, N. K.; Mehra, R. K.

    1974-01-01

    This paper discusses numerical aspects of computing maximum likelihood estimates for linear dynamical systems in state-vector form. Different gradient-based nonlinear programming methods are discussed in a unified framework and their applicability to maximum likelihood estimation is examined. The problems due to singular Hessian or singular information matrix that are common in practice are discussed in detail and methods for their solution are proposed. New results on the calculation of state sensitivity functions via reduced order models are given. Several methods for speeding convergence and reducing computation time are also discussed.

  3. Understanding entangled cerebral networks: a prerequisite for restoring brain function with brain-computer interfaces.

    PubMed

    Mandonnet, Emmanuel; Duffau, Hugues

    2014-01-01

Historically, cerebral processing has been conceptualized as a framework based on statically localized functions. However, a growing amount of evidence supports a hodotopical (delocalized) and flexible organization. A number of studies have reported absence of a permanent neurological deficit after massive surgical resections of eloquent brain tissue. These results highlight the tremendous plastic potential of the brain. Understanding anatomo-functional correlates underlying this cerebral reorganization is a prerequisite to restore brain functions through brain-computer interfaces (BCIs) in patients with cerebral diseases, or even to potentiate brain functions in healthy individuals. Here, we review current knowledge of neural networks that could be utilized in the BCIs that enable movements and language. To this end, intraoperative electrical stimulation in awake patients provides valuable information on the cerebral functional maps, their connectomics and plasticity. Overall, these studies indicate that the complex cerebral circuitry that underpins interactions between action, cognition and behavior should be thoroughly investigated before progress in BCI approaches can be achieved.

  4. Redox Biology: Computational Approaches to the Investigation of Functional Cysteine Residues

    PubMed Central

    Marino, Stefano M.

    2011-01-01

Cysteine (Cys) residues serve many functions, such as catalysis, stabilization of protein structure through disulfides, metal binding, and regulation of protein function. Cys residues are also subject to numerous post-translational modifications. In recent years, various computational tools aiming at classifying and predicting different functional categories of Cys have been developed, particularly for structural and catalytic Cys. On the other hand, given the complexity of the subject, bioinformatics approaches have been less successful for the investigation of regulatory Cys sites. In this review, we introduce different functional categories of Cys residues. For each category, an overview of state-of-the-art bioinformatics methods and tools is provided, along with examples of successful applications and potential limitations associated with each approach. Finally, we discuss Cys-based redox switches, which modify the view of distinct functional categories of Cys in proteins. Antioxid. Redox Signal. 15, 135–146. PMID:20812876

  5. Symphony Time Accounting Resource (STAR)

    SciTech Connect

    Newfield, S.E.; Booth, J.W.; Redman, D.L.

    1986-05-01

The Symphony Time Accounting Resource, a new time accounting system that can be run on personal computers instead of mainframes, is described. The new system is useful for organizations that do work under several job order numbers and/or accounting codes, and it could also be adapted for use by organizations on the recharge system. 1 fig., 2 tabs.

  6. Computing Wigner distributions and time correlation functions using the quantum thermal bath method: application to proton transfer spectroscopy.

    PubMed

    Basire, Marie; Borgis, Daniel; Vuilleumier, Rodolphe

    2013-08-14

    Langevin dynamics coupled to a quantum thermal bath (QTB) allows for the inclusion of vibrational quantum effects in molecular dynamics simulations at virtually no additional computer cost. We investigate here the ability of the QTB method to reproduce the quantum Wigner distribution of a variety of model potentials, designed to assess the performances and limits of the method. We further compute the infrared spectrum of a multidimensional model of proton transfer in the gas phase and in solution, using classical trajectories sampled initially from the Wigner distribution. It is shown that for this type of system involving large anharmonicities and strong nonlinear coupling to the environment, the quantum thermal bath is able to sample the Wigner distribution satisfactorily and to account for both zero point energy and tunneling effects. It leads to quantum time correlation functions having the correct short-time behavior, and the correct associated spectral frequencies, but that are slightly too overdamped. This is attributed to the classical propagation approximation rather than the generation of the quantized initial conditions themselves.

  7. PLATO IV Accountancy Index.

    ERIC Educational Resources Information Center

    Pondy, Dorothy, Comp.

    The catalog was compiled to assist instructors in planning community college and university curricula using the 48 computer-assisted accountancy lessons available on PLATO IV (Programmed Logic for Automatic Teaching Operation) for first semester accounting courses. It contains information on lesson access, lists of acceptable abbreviations for…

  8. Understanding of Emotional Experience in Autism: Insights from the Personal Accounts of High-Functioning Children with Autism

    ERIC Educational Resources Information Center

    Losh, Molly; Capps, Lisa

    2006-01-01

    In this study, the authors investigate emotional understanding in autism through a discourse analytic framework to provide a window into children's strategies for interpreting emotional versus nonemotional encounters and consider the implications for the mechanisms underlying emotional understanding in typical development. Accounts were analyzed…

  9. 25 CFR 547.9 - What are the minimum technical standards for Class II gaming system accounting functions?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... OF CLASS II GAMES § 547.9 What are the minimum technical standards for Class II gaming system... digits to accommodate the design of the game. (3) Accounting data displayed to the player may be... audit, configuration, recall and test modes; or (ii) Temporarily, during entertaining displays of...

  10. 25 CFR 547.9 - What are the minimum technical standards for Class II gaming system accounting functions?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... OF CLASS II GAMES § 547.9 What are the minimum technical standards for Class II gaming system... digits to accommodate the design of the game. (3) Accounting data displayed to the player may be... audit, configuration, recall and test modes; or (ii) Temporarily, during entertaining displays of...

  11. 25 CFR 547.9 - What are the minimum technical standards for Class II gaming system accounting functions?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... OF CLASS II GAMES § 547.9 What are the minimum technical standards for Class II gaming system... digits to accommodate the design of the game. (3) Accounting data displayed to the player may be... audit, configuration, recall and test modes; or (ii) Temporarily, during entertaining displays of...

  12. Storing files in a parallel computing system based on user-specified parser function

    DOEpatents

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
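The flow described in the claim can be sketched as follows; every name here (the parser signature, the placement policy, the metadata layout) is invented for illustration and is not the patent's interface.

```python
# Hypothetical sketch: the distributed application supplies a parser; the
# storage layer keeps only files the parser accepts and stores extracted
# metadata alongside the data for later search.
def store_files(files, parser, nodes):
    """files: {name: bytes}; parser(name, data) -> metadata dict or None."""
    stored = {}
    for i, (name, data) in enumerate(sorted(files.items())):
        meta = parser(name, data)
        if meta is None:                 # fails the parser's semantic checks
            continue
        stored[name] = {"node": nodes[i % len(nodes)],  # trivial placement
                        "meta": meta,
                        "data": data}
    return stored

def csv_parser(name, data):
    # Accept only .csv files; record the row count as searchable metadata.
    if not name.endswith(".csv"):
        return None
    return {"rows": data.decode().count("\n")}

files = {"a.csv": b"1,2\n3,4\n", "b.log": b"noise", "c.csv": b"5,6\n"}
out = store_files(files, csv_parser, nodes=["node0", "node1"])
assert set(out) == {"a.csv", "c.csv"}      # the .log file was filtered out
assert out["a.csv"]["meta"]["rows"] == 2
```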

  13. Computational Perspectives into Plasmepsins Structure—Function Relationship: Implications to Inhibitors Design

    PubMed Central

    Gil L., Alejandro; Valiente, Pedro A.; Pascutti, Pedro G.; Pons, Tirso

    2011-01-01

The development of efficient and selective antimalarials remains a challenge for the pharmaceutical industry. The aspartic proteases plasmepsins, whose inhibition leads to parasite death, are classified as targets for the design of potent drugs. Combinatorial synthesis is currently being used to generate inhibitor libraries for these enzymes, and together with computational methodologies it has been shown capable of selecting lead compounds. The high structural flexibility of plasmepsins, revealed by their X-ray structures and molecular dynamics simulations, further complicates the prediction of putative binding modes, and therefore the use of common computational tools such as docking and free-energy calculations. In this review, we survey the computational strategies utilized so far in structure-function relationship studies of the plasmepsin family, with special focus on recent advances in the improvement of the linear interaction estimation (LIE) method, which is one of the most successful methodologies for evaluating plasmepsin-inhibitor binding affinity. PMID:21760810

  14. Computational perspectives into plasmepsins structure-function relationship: implications to inhibitors design.

    PubMed

    Gil L, Alejandro; Valiente, Pedro A; Pascutti, Pedro G; Pons, Tirso

    2011-01-01

The development of efficient and selective antimalarials remains a challenge for the pharmaceutical industry. The aspartic proteases plasmepsins, whose inhibition leads to parasite death, are classified as targets for the design of potent drugs. Combinatorial synthesis is currently being used to generate inhibitor libraries for these enzymes, and together with computational methodologies it has been shown capable of selecting lead compounds. The high structural flexibility of plasmepsins, revealed by their X-ray structures and molecular dynamics simulations, further complicates the prediction of putative binding modes, and therefore the use of common computational tools such as docking and free-energy calculations. In this review, we survey the computational strategies utilized so far in structure-function relationship studies of the plasmepsin family, with special focus on recent advances in the improvement of the linear interaction estimation (LIE) method, which is one of the most successful methodologies for evaluating plasmepsin-inhibitor binding affinity. PMID:21760810

  15. Time Utility Functions for Modeling and Evaluating Resource Allocations in a Heterogeneous Computing System

    SciTech Connect

    Briceno, Luis Diego; Khemka, Bhavesh; Siegel, Howard Jay; Maciejewski, Anthony A; Groer, Christopher S; Koenig, Gregory A; Okonski, Gene D; Poole, Stephen W

    2011-01-01

    This study considers a heterogeneous computing system and corresponding workload being investigated by the Extreme Scale Systems Center (ESSC) at Oak Ridge National Laboratory (ORNL). The ESSC is part of a collaborative effort between the Department of Energy (DOE) and the Department of Defense (DoD) to deliver research, tools, software, and technologies that can be integrated, deployed, and used in both DOE and DoD environments. The heterogeneous system and workload described here are representative of a prototypical computing environment being studied as part of this collaboration. Each task can exhibit a time-varying importance or utility to the overall enterprise. In this system, an arriving task has an associated priority and precedence. The priority is used to describe the importance of a task, and precedence is used to describe how soon the task must be executed. These two metrics are combined to create a utility function curve that indicates how valuable it is for the system to complete a task at any given moment. This research focuses on using time-utility functions to generate a metric that can be used to compare the performance of different resource schedulers in a heterogeneous computing system. The contributions of this paper are: (a) a mathematical model of a heterogeneous computing system where tasks arrive dynamically and need to be assigned based on their priority, precedence, utility characteristic class, and task execution type, (b) the use of priority and precedence to generate time-utility functions that describe the value a task has at any given time, (c) the derivation of a metric based on the total utility gained from completing tasks to measure the performance of the computing environment, and (d) a comparison of the performance of resource allocation heuristics in this environment.
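The priority/precedence combination in contribution (b) can be sketched with a toy utility curve; the exponential shape and all parameter values below are invented, since the paper defines its own utility characteristic classes.

```python
import math

# Toy time-utility function: priority sets the height of the curve,
# precedence sets how quickly the earned utility decays with time.
def utility(priority, precedence, t):
    """Utility of completing a task at time t (soft deadline model)."""
    return priority * math.exp(-t / precedence)

# Metric (c): a scheduler is scored by the total utility it accrues.
def total_utility(tasks, completion_times):
    return sum(utility(p, d, t) for (p, d), t in zip(tasks, completion_times))

tasks = [(10.0, 5.0), (4.0, 20.0)]          # (priority, precedence) pairs
early = total_utility(tasks, [1.0, 2.0])    # a fast schedule
late = total_utility(tasks, [6.0, 12.0])    # a slow schedule
assert early > late                          # finishing sooner scores higher
```

Comparing `total_utility` across candidate resource allocation heuristics is exactly the kind of performance comparison the paper's metric enables.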

  16. The Use of Computers in Broadcast Education.

    ERIC Educational Resources Information Center

    Singleton, Timothy J.

    Speech communication and journalism instructors should be aware of how computers are used in the mass media and what role computer systems can play in broadcast education. Many radio and television stations commonly use computers for accounting and business functions, and they are beginning to expand their computer operations to program logs and…

17. Coal-seismic, desktop computer programs in BASIC; Part 6, Develop rms velocity functions and apply mute and normal moveout

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report presents computer programs used to develop rms velocity functions and apply mute and normal moveout to a 12-trace seismogram.
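The normal moveout step applied by such programs follows the standard hyperbolic traveltime relation: a reflection with zero-offset time t0 arrives at offset x at t(x) = sqrt(t0² + (x/v_rms)²), and the correction shifts each trace back by t(x) - t0. A small sketch (geometry values invented, not the USGS program's):

```python
import math

def nmo_shift(t0, offset, v_rms):
    """Normal moveout correction, in seconds, for one trace sample."""
    return math.sqrt(t0 ** 2 + (offset / v_rms) ** 2) - t0

offsets = [15.0 * (i + 1) for i in range(12)]   # 12-trace spread, metres
shifts = [nmo_shift(0.100, x, 2000.0) for x in offsets]

assert abs(nmo_shift(0.100, 0.0, 2000.0)) < 1e-12   # no shift at zero offset
assert all(a < b for a, b in zip(shifts, shifts[1:]))  # grows with offset
```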

  18. Structure, dynamics, and function of the monooxygenase P450 BM-3: insights from computer simulations studies

    NASA Astrophysics Data System (ADS)

    Roccatano, Danilo

    2015-07-01

The monooxygenase P450 BM-3 is a NADPH-dependent fatty acid hydroxylase enzyme isolated from the soil bacterium Bacillus megaterium. As a pivotal member of the cytochrome P450 superfamily, it has been intensely studied for the comprehension of structure-dynamics-function relationships in this class of enzymes. In addition, due to its peculiar properties, it is also a promising enzyme for biochemical and biomedical applications. However, despite these efforts, a full understanding of the enzyme's structure and dynamics has not yet been achieved. Computational studies, particularly molecular dynamics (MD) simulations, have contributed importantly to this endeavor by providing new insights at an atomic level regarding the correlations between structure, dynamics, and function of the protein. This topical review summarizes computational studies based on MD simulations of the cytochrome P450 BM-3 and gives an outlook on future directions.

  19. A comparison of computational methods and algorithms for the complex gamma function

    NASA Technical Reports Server (NTRS)

    Ng, E. W.

    1974-01-01

A survey and comparison of some computational methods and algorithms for gamma and log-gamma functions of complex arguments are presented. Methods and algorithms reported include Chebyshev approximations, Padé expansion, and Stirling's asymptotic series. The comparison leads to the conclusion that Algorithm 421, published in the Communications of the ACM by H. Kuki, is the best program either for individual application or for inclusion in subroutine libraries.
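One widely published method of the kind surveyed here is the Lanczos approximation; the sketch below uses the common g = 7, 9-term coefficient set together with the reflection formula for Re(z) < 1/2. This is a generic illustration, not Kuki's Algorithm 421.

```python
import cmath
import math

# Lanczos approximation of the gamma function for complex arguments,
# using the widely circulated g = 7, n = 9 coefficient set.
LANCZOS_G = 7
COEF = [0.99999999999980993, 676.5203681218851, -1259.1392167224028,
        771.32342877765313, -176.61502916214059, 12.507343278686905,
        -0.13857109526572012, 9.9843695780195716e-6, 1.5056327351493116e-7]

def cgamma(z):
    if z.real < 0.5:                      # reflection formula
        return math.pi / (cmath.sin(math.pi * z) * cgamma(1 - z))
    z -= 1
    x = COEF[0]
    for i in range(1, len(COEF)):
        x += COEF[i] / (z + i)
    t = z + LANCZOS_G + 0.5
    return math.sqrt(2 * math.pi) * t ** (z + 0.5) * cmath.exp(-t) * x

assert abs(cgamma(5 + 0j) - 24) < 1e-10            # Gamma(5) = 4! = 24
assert abs(cgamma(0.5 + 0j) - math.sqrt(math.pi)) < 1e-10
```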

  20. Method, systems, and computer program products for implementing function-parallel network firewall

    DOEpatents

    Fulp, Errin W.; Farley, Ryan J.

    2011-10-11

    Methods, systems, and computer program products for providing function-parallel firewalls are disclosed. According to one aspect, a function-parallel firewall includes a first firewall node for filtering received packets using a first portion of a rule set including a plurality of rules. The first portion includes less than all of the rules in the rule set. At least one second firewall node filters packets using a second portion of the rule set. The second portion includes at least one rule in the rule set that is not present in the first portion. The first and second portions together include all of the rules in the rule set.
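The rule-partitioning idea can be sketched in a few lines; the packet fields, rule semantics, and verdict-combination policy below are invented for illustration and are not the patent's design.

```python
# Function-parallel sketch: the ordered rule set is split across nodes,
# each node filters with only its slice, and verdicts are combined in
# rule order (first node with an opinion wins, matching first-match
# semantics for an ordered prefix/suffix split).
def make_node(rules):
    def filter_packet(packet):
        for match, action in rules:
            if match(packet):
                return action
        return None                       # this node has no opinion
    return filter_packet

rules = [
    (lambda p: p["port"] == 22, "deny"),
    (lambda p: p["port"] == 80, "accept"),
    (lambda p: p["src"].startswith("10."), "accept"),
    (lambda p: True, "deny"),             # default deny
]
node_a = make_node(rules[:2])             # first portion of the rule set
node_b = make_node(rules[2:])             # the remaining rules

def firewall(packet):
    for node in (node_a, node_b):
        verdict = node(packet)
        if verdict is not None:
            return verdict
    return "deny"

assert firewall({"port": 22, "src": "10.0.0.1"}) == "deny"
assert firewall({"port": 80, "src": "8.8.8.8"}) == "accept"
assert firewall({"port": 443, "src": "10.0.0.1"}) == "accept"
```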

  1. A deconvolution function for single photon emission computed tomography with constant attenuation

    SciTech Connect

    Tomitani, T.

    1986-02-01

A shift-invariant spatial deconvolution function for single-photon emission computed tomography with constant attenuation is presented. The image reconstruction algorithm is similar to the conventional convolution-backprojection algorithm, except that an exponential weight is applied in the backprojection process. The deconvolution function was obtained as the solution of a generalized Schlomilch integral equation. A method to solve the integral equation is described briefly. The present deconvolution function incorporates frequency roll-off, and the image resolution can be preset. In the limit of ideal image reconstruction, the deconvolution function is identical to that deduced by Kim et al., and its Fourier transform was proved to be identical to the filter deduced by Tretiak and Delaney and by Gullberg and Budinger. The variance of the reconstructed image was analyzed and some numerical results are given. The algorithm was tested with computer simulation.

  2. Liver Function After Irradiation Based on Computed Tomographic Portal Vein Perfusion Imaging

    SciTech Connect

Cao, Yue; Pan, Charlie; Balter, James M.; Platt, Joel F.; Francis, Isaac R.; Knol, James A.; Normolle, Daniel; Ben-Josef, Edgar; Ten Haken, Randall K.; Lawrence, Theodore S.

    2008-01-01

    Purpose: To determine whether individual and regional liver sensitivity to radiation could be assessed by measuring liver perfusion during a course of treatment using dynamic contrast-enhanced computed tomography scanning. Methods and Materials: Patients with intrahepatic cancer undergoing conformal radiotherapy underwent dynamic contrast-enhanced computed tomography (to measure perfusion distribution) and an indocyanine extraction study (to measure liver function) before, during, and 1 month after treatment. We hoped to determine whether the residual functioning liver (i.e., those regions showing portal vein perfusion) could be used to predict overall liver function after irradiation. Results: Radiation doses from 45 to 84 Gy resulted in undetectable regional portal vein perfusion 1 month after treatment. The volume of each liver with undetectable portal vein perfusion ranged from 0 to 39% and depended both on the patient's sensitivity and on dose distribution. There was a significant correlation between indocyanine green clearance and the mean of the estimated portal vein perfusion in the functional liver parenchyma (p < 0.001). Conclusion: This study reveals substantial individual variability in the sensitivity of the liver to irradiation. In addition, these findings suggest that hepatic perfusion imaging may be a marker for liver function and has the potential to be a tool for individualizing therapy.

  3. Computing the Evans function via solving a linear boundary value ODE

    NASA Astrophysics Data System (ADS)

    Wahl, Colin; Nguyen, Rose; Ventura, Nathaniel; Barker, Blake; Sandstede, Bjorn

    2015-11-01

    Determining the stability of traveling wave solutions to partial differential equations can oftentimes be computationally intensive but of great importance to understanding the effects of perturbations on the physical systems (chemical reactions, hydrodynamics, etc.) they model. For waves in one spatial dimension, one may linearize around the wave and form an Evans function - an analytic Wronskian-like function which has zeros that correspond in multiplicity to the eigenvalues of the linearized system. If eigenvalues with a positive real part do not exist, the traveling wave will be stable. Two methods exist for calculating the Evans function numerically: the exterior-product method and the method of continuous orthogonalization. The first is numerically expensive, and the second reformulates the originally linear system as a nonlinear system. We develop a new algorithm for computing the Evans function through appropriate linear boundary-value problems. This algorithm is cheaper than the previous methods, and we prove that it preserves analyticity of the Evans function. We also provide error estimates and implement it on some classical one- and two-dimensional systems, one being the Swift-Hohenberg equation in a channel, to show the advantages.

  4. Educational Accountability

    ERIC Educational Resources Information Center

    Pincoffs, Edmund L.

    1973-01-01

    Discusses educational accountability as the paradigm of performance contracting, presents some arguments for and against accountability, and discusses the goals of education and the responsibility of the teacher. (Author/PG)

  5. Comparison of x ray computed tomography number to proton relative linear stopping power conversion functions using a standard phantom

    SciTech Connect

    Moyers, M. F.

    2014-06-15

    Purpose: Adequate evaluation of the results from multi-institutional trials involving light ion beam treatments requires consideration of the planning margins applied to both targets and organs at risk. A major uncertainty that affects the size of these margins is the conversion of x ray computed tomography numbers (XCTNs) to relative linear stopping powers (RLSPs). Various facilities engaged in multi-institutional clinical trials involving proton beams have been applying significantly different margins in their patient planning. This study was performed to determine the variance in the conversion functions used at proton facilities in the U.S.A. wishing to participate in National Cancer Institute sponsored clinical trials. Methods: A simplified method of determining the conversion function was developed using a standard phantom containing only water and aluminum. The new method was based on the premise that all scanners have their XCTNs for air and water calibrated daily to constant values but that the XCTNs for high density/high atomic number materials are variable with different scanning conditions. The standard phantom was taken to 10 different proton facilities and scanned with the local protocols resulting in 14 derived conversion functions which were compared to the conversion functions used at the local facilities. Results: For tissues within ±300 XCTN of water, all facility functions produced converted RLSP values within ±6% of the values produced by the standard function and within 8% of the values from any other facility's function. For XCTNs corresponding to lung tissue, converted RLSP values differed by as great as ±8% from the standard and up to 16% from the values of other facilities. For XCTNs corresponding to low-density immobilization foam, the maximum to minimum values differed by as much as 40%. Conclusions: The new method greatly simplifies determination of the conversion function, reduces ambiguity, and in the future could promote
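The kind of conversion function compared in the study can be sketched as a piecewise-linear map anchored at air, water, and a measured high-density point; all numeric values below are invented, and real facility curves use many more tissue-substitute points.

```python
# Hypothetical piecewise-linear XCTN -> RLSP conversion built from three
# calibration points: air (-1000), water (0), and a measured aluminum point.
def make_conversion(al_xctn, al_rlsp):
    points = [(-1000.0, 0.001), (0.0, 1.0), (al_xctn, al_rlsp)]
    def convert(xctn):
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            if xctn <= x1:
                return y0 + (y1 - y0) * (xctn - x0) / (x1 - x0)
        (x0, y0), (x1, y1) = points[-2], points[-1]  # extrapolate last slope
        return y0 + (y1 - y0) * (xctn - x0) / (x1 - x0)
    return convert

convert = make_conversion(al_xctn=2200.0, al_rlsp=2.1)
assert abs(convert(0.0) - 1.0) < 1e-12        # water maps to RLSP 1
assert abs(convert(-1000.0) - 0.001) < 1e-12  # air maps to near zero
assert convert(1100.0) < convert(2200.0)      # monotone through bone range
```

The study's point is precisely that the high-density anchor (here `al_xctn`) varies with scanner and protocol, which shifts the upper segment of such a curve between facilities.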

  6. Clinical Validation of 4-Dimensional Computed Tomography Ventilation With Pulmonary Function Test Data

    SciTech Connect

    Brennan, Douglas; Schubert, Leah; Diot, Quentin; Castillo, Richard; Castillo, Edward; Guerrero, Thomas; Martel, Mary K.; Linderman, Derek; Gaspar, Laurie E.; Miften, Moyed; Kavanagh, Brian D.; Vinogradskiy, Yevgeniy

    2015-06-01

    Purpose: A new form of functional imaging has been proposed in the form of 4-dimensional computed tomography (4DCT) ventilation. Because 4DCTs are acquired as part of routine care for lung cancer patients, calculating ventilation maps from 4DCTs provides spatial lung function information without added dosimetric or monetary cost to the patient. Before 4DCT-ventilation is implemented clinically, it needs to be validated. Pulmonary function tests (PFTs) provide a clinically established way of evaluating lung function. The purpose of our work was to perform a clinical validation by comparing 4DCT-ventilation metrics with PFT data. Methods and Materials: Ninety-eight lung cancer patients with pretreatment 4DCT and PFT data were included in the study. Pulmonary function test metrics used to diagnose obstructive lung disease were recorded: forced expiratory volume in 1 second (FEV1) and FEV1/forced vital capacity. Four-dimensional CT data sets and spatial registration were used to compute 4DCT-ventilation images using a density change–based and a Jacobian-based model. The ventilation maps were reduced to single metrics intended to reflect the degree of ventilation obstruction. Specifically, we computed the coefficient of variation (SD/mean) and ventilation V20 (volume of lung ≤20% ventilation), and correlated these ventilation metrics with PFT data. Regression analysis was used to determine whether 4DCT-ventilation data could predict normal versus abnormal lung function using PFT thresholds. Results: Correlation coefficients comparing 4DCT-ventilation with PFT data ranged from 0.63 to 0.72, with the best agreement between FEV1 and the coefficient of variation. Four-dimensional CT ventilation metrics were able to significantly discriminate between clinically normal and abnormal PFT results. Conclusions: Validation of 4DCT ventilation with clinically relevant metrics is essential. We demonstrate good global agreement between PFTs and 4DCT-ventilation, indicating that 4DCT
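
The reduction of a ventilation map to the two scalar metrics can be sketched directly (the V20 normalization relative to mean ventilation is an assumption; the abstract does not fix the normalization):

```python
import numpy as np

def ventilation_metrics(vent_map, lung_mask):
    """Reduce a 4DCT ventilation image to scalar obstruction metrics.
    Returns the coefficient of variation (SD/mean) and V20, taken here
    as the fractional lung volume at <= 20% of mean ventilation
    (an assumed normalization, for illustration only)."""
    v = vent_map[lung_mask]
    cov = v.std() / v.mean()                     # coefficient of variation
    v20 = float(np.mean(v <= 0.20 * v.mean()))   # fractional lung volume
    return cov, v20
```

These scalars are what would then be correlated against FEV1 and FEV1/FVC.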

  7. It Might Not Make a Big DIF: Improved Differential Test Functioning Statistics That Account for Sampling Variability

    ERIC Educational Resources Information Center

    Chalmers, R. Philip; Counsell, Alyssa; Flora, David B.

    2016-01-01

    Differential test functioning, or DTF, occurs when one or more items in a test demonstrate differential item functioning (DIF) and the aggregate of these effects is observed at the test level. In many applications, DTF can be more important than DIF when the overall effects of DIF at the test level can be quantified. However, optimal statistical…

  8. Computer-Based Cognitive Training for Executive Functions after Stroke: A Systematic Review

    PubMed Central

    van de Ven, Renate M.; Murre, Jaap M. J.; Veltman, Dick J.; Schmand, Ben A.

    2016-01-01

    Background: Stroke commonly results in cognitive impairments in working memory, attention, and executive function, which may be restored with appropriate training programs. Our aim was to systematically review the evidence for computer-based cognitive training of executive dysfunctions. Methods: Studies were included if they concerned adults who had suffered stroke or other types of acquired brain injury, if the intervention was computer training of executive functions, and if the outcome was related to executive functioning. We searched in MEDLINE, PsycINFO, Web of Science, and The Cochrane Library. Study quality was evaluated based on the CONSORT Statement. Treatment effect was evaluated based on differences compared to pre-treatment and/or to a control group. Results: Twenty studies were included. Two were randomized controlled trials that used an active control group. The other studies included multiple baselines, a passive control group, or were uncontrolled. Improvements were observed in tasks similar to the training (near transfer) and in tasks dissimilar to the training (far transfer). However, these effects were not larger in trained than in active control groups. Two studies evaluated neural effects and found changes in both functional and structural connectivity. Most studies suffered from methodological limitations (e.g., lack of an active control group and no adjustment for multiple testing) hampering differentiation of training effects from spontaneous recovery, retest effects, and placebo effects. Conclusions: The positive findings of most studies, including neural changes, warrant continuation of research in this field, but only if its methodological limitations are addressed. PMID:27148007

  9. Using computational fluid dynamics to test functional and ecological hypotheses in fossil taxa

    NASA Astrophysics Data System (ADS)

    Rahman, Imran

    2016-04-01

    Reconstructing how ancient organisms moved and fed is a major focus of study in palaeontology. Traditionally, this has been hampered by a lack of objective data on the functional morphology of extinct species, especially those without a clear modern analogue. However, cutting-edge techniques for characterizing specimens digitally and in three dimensions, coupled with state-of-the-art computer models, now provide a robust framework for testing functional and ecological hypotheses even in problematic fossil taxa. One such approach is computational fluid dynamics (CFD), a method for simulating fluid flows around objects that has primarily been applied to complex engineering-design problems. Here, I will present three case studies of CFD applied to fossil taxa, spanning a range of specimen sizes, taxonomic groups and geological ages. First, I will show how CFD enabled a rigorous test of hypothesized feeding modes in an enigmatic Ediacaran organism with three-fold symmetry, revealing previously unappreciated complexity of pre-Cambrian ecosystems. Second, I will show how CFD was used to evaluate hydrodynamic performance and feeding in Cambrian stem-group echinoderms, shedding light on the probable feeding strategy of the latest common ancestor of all deuterostomes. Third, I will show how CFD allowed us to explore the link between form and function in Mesozoic ichthyosaurs. These case studies serve to demonstrate the enormous potential of CFD for addressing long-standing hypotheses for a variety of fossil taxa, opening up an exciting new avenue in palaeontological studies of functional morphology.

  10. Ab initio quasi-particle approximation bandgaps of silicon nanowires calculated at density functional theory/local density approximation computational effort

    SciTech Connect

    Ribeiro, M.

    2015-06-21

    Ab initio calculations of hydrogen-passivated Si nanowires were performed using density functional theory within LDA-1/2 to account for excited-state properties. A range of diameters was calculated to draw conclusions about the ability of the method to correctly describe the main trends of bandgap, quantum confinement, and self-energy corrections versus the diameter of the nanowire. Bandgaps are predicted with excellent accuracy compared with other theoretical results, such as GW, and with experiment, but at a low computational cost.

  11. CAP: A Computer Code for Generating Tabular Thermodynamic Functions from NASA Lewis Coefficients

    NASA Technical Reports Server (NTRS)

    Zehe, Michael J.; Gordon, Sanford; McBride, Bonnie J.

    2001-01-01

    For several decades the NASA Glenn Research Center has been providing a file of thermodynamic data for use in several computer programs. These data are in the form of least-squares coefficients that have been calculated from tabular thermodynamic data by means of the NASA Properties and Coefficients (PAC) program. The source thermodynamic data are obtained from the literature or from standard compilations. Most gas-phase thermodynamic functions are calculated by the authors from molecular constant data using ideal gas partition functions. The Coefficients and Properties (CAP) program described in this report permits the generation of tabulated thermodynamic functions from the NASA least-squares coefficients. CAP provides considerable flexibility in the output format, the number of temperatures to be tabulated, and the energy units of the calculated properties. This report provides a detailed description of input preparation, examples of input and output for several species, and a listing of all species in the current NASA Glenn thermodynamic data file.
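
As an illustration of the kind of evaluation CAP performs, the sketch below tabulates dimensionless thermodynamic functions from the classic 7-coefficient NASA polynomial form (the current NASA Glenn files use a 9-coefficient generalization; the coefficient values in the test are placeholders, not data for any real species):

```python
import math

def thermo_functions(a, T):
    """Evaluate the classic 7-coefficient NASA polynomial form at
    temperature T (kelvin). a = (a1..a7). Returns the dimensionless
    Cp/R, H/RT, and S/R. CAP itself reads the NASA Glenn coefficient
    file and supports more general forms; this is only a sketch."""
    a1, a2, a3, a4, a5, a6, a7 = a
    cp_R = a1 + a2*T + a3*T**2 + a4*T**3 + a5*T**4
    h_RT = a1 + a2*T/2 + a3*T**2/3 + a4*T**3/4 + a5*T**4/5 + a6/T
    s_R = a1*math.log(T) + a2*T + a3*T**2/2 + a4*T**3/3 + a5*T**4/4 + a7
    return cp_R, h_RT, s_R

def tabulate(a, temperatures):
    """Generate a simple (T, Cp/R, H/RT, S/R) table like CAP's output."""
    return [(T, *thermo_functions(a, T)) for T in temperatures]
```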

  12. CAP: A Computer Code for Generating Tabular Thermodynamic Functions from NASA Lewis Coefficients. Revised

    NASA Technical Reports Server (NTRS)

    Zehe, Michael J.; Gordon, Sanford; McBride, Bonnie J.

    2002-01-01

    For several decades the NASA Glenn Research Center has been providing a file of thermodynamic data for use in several computer programs. These data are in the form of least-squares coefficients that have been calculated from tabular thermodynamic data by means of the NASA Properties and Coefficients (PAC) program. The source thermodynamic data are obtained from the literature or from standard compilations. Most gas-phase thermodynamic functions are calculated by the authors from molecular constant data using ideal gas partition functions. The Coefficients and Properties (CAP) program described in this report permits the generation of tabulated thermodynamic functions from the NASA least-squares coefficients. CAP provides considerable flexibility in the output format, the number of temperatures to be tabulated, and the energy units of the calculated properties. This report provides a detailed description of input preparation, examples of input and output for several species, and a listing of all species in the current NASA Glenn thermodynamic data file.

  13. Systematic construction of density functionals based on matrix product state computations

    NASA Astrophysics Data System (ADS)

    Lubasch, Michael; Fuks, Johanna I.; Appel, Heiko; Rubio, Angel; Cirac, J. Ignacio; Bañuls, Mari-Carmen

    2016-08-01

    We propose a systematic procedure for the approximation of density functionals in density functional theory that consists of two parts. First, for the efficient approximation of a general density functional, we introduce an ansatz whose non-locality can be increased systematically. Second, we present a fitting strategy that is based on systematically enlarging a reasonably chosen set of training densities. We investigate our procedure in the context of strongly correlated fermions on a one-dimensional lattice, for which we compute accurate training densities with the help of matrix product states. Focusing on the exchange-correlation energy, we demonstrate how an efficient approximation can be found that includes and systematically improves beyond the local density approximation. Importantly, this systematic improvement is shown for target densities that are quite different from the training densities.
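
A toy version of the second part, fitting a purely local polynomial ansatz to training densities by least squares, can be sketched as follows (the paper's ansatz adds systematically increasing non-locality, and its training densities come from matrix product state computations; everything below is illustrative):

```python
import numpy as np

def fit_local_functional(densities, energies, degree=3):
    """Fit a local ansatz E[n] ~ sum_i f(n_i), with f a polynomial,
    by least squares over a set of training densities. A toy stand-in
    for the fitting strategy described in the abstract."""
    # Design matrix: column k holds sum_i n_i^(k+1) for each density.
    A = np.stack([[np.sum(n ** (k + 1)) for k in range(degree)]
                  for n in densities])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(energies), rcond=None)
    return coeffs

def evaluate(coeffs, density):
    """Evaluate the fitted local functional on a new density."""
    return sum(c * np.sum(density ** (k + 1)) for k, c in enumerate(coeffs))
```

Enlarging the training set and the polynomial degree mimics, very loosely, the systematic improvement strategy.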

  14. Effective electron displacements: A tool for time-dependent density functional theory computational spectroscopy

    SciTech Connect

    Guido, Ciro A.; Cortona, Pietro; Adamo, Carlo

    2014-03-14

    We extend our previous definition of the metric Δr for electronic excitations in the framework of time-dependent density functional theory [C. A. Guido, P. Cortona, B. Mennucci, and C. Adamo, J. Chem. Theory Comput. 9, 3118 (2013)] by including a measure of the difference of electronic position variances in passing from occupied to virtual orbitals. This new definition, called Γ, permits applications in situations where the Δr-index is not helpful: transitions in centrosymmetric systems and Rydberg excitations. The Γ-metric is then extended by using the Natural Transition Orbitals, thus providing an intuitive picture of how locally the electron density changes during electronic transitions. Furthermore, the Γ values give insight into the performance of functionals in reproducing different types of transitions, and allow one to define a “confidence radius” for GGA and hybrid functionals.
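
The spirit of these diagnostics can be conveyed with a small sketch: a transition-amplitude-weighted average of orbital centroid displacements (a schematic of the original Δr-type index, not the variance-based Γ extension; the array shapes are assumptions):

```python
import numpy as np

def delta_r(weights, r_occ, r_virt):
    """Schematic Delta-r-style diagnostic for an electronic excitation:
    average displacement between occupied and virtual orbital centroids,
    weighted by the squared transition amplitudes K_ia.
    r_occ, r_virt: (n_pairs, 3) centroids; weights: (n_pairs,) amplitudes."""
    w2 = np.asarray(weights) ** 2
    disp = np.linalg.norm(np.asarray(r_virt) - np.asarray(r_occ), axis=1)
    return float((w2 * disp).sum() / w2.sum())
```

Small values flag short-range (local) transitions; large values flag charge-transfer-like ones, which is where functional performance differs most.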

  15. Computing the three-point correlation function of galaxies in O(N^2) time

    NASA Astrophysics Data System (ADS)

    Slepian, Zachary; Eisenstein, Daniel J.

    2015-12-01

    We present an algorithm that computes the multipole coefficients of the galaxy three-point correlation function (3PCF) without explicitly considering triplets of galaxies. Rather, centring on each galaxy in the survey, it expands the radially binned density field in spherical harmonics and combines these to form the multipoles without ever requiring the relative angle between a pair of galaxies about the central galaxy. This approach scales with number and number density in the same way as the two-point correlation function, allowing run-times that are comparable, and 500 times faster than a naive triplet count. It is exact in angle and easily handles edge correction. We demonstrate the algorithm on the LasDamas SDSS-DR7 mock catalogues, computing an edge-corrected 3PCF out to 90 Mpc h-1 in under an hour on modest computing resources. We expect this algorithm will make it possible to obtain the large-scale 3PCF for upcoming surveys such as Euclid, the Large Synoptic Survey Telescope (LSST), and the Dark Energy Spectroscopic Instrument.
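
The central trick, accumulating spherical-harmonic coefficients of the radially binned density about each central galaxy and combining them into multipoles, can be sketched as follows (bin assignment and normalization are simplified assumptions; the real code handles weights and edge correction):

```python
import numpy as np
from scipy.special import sph_harm

def multipole_coeffs(ell, phi, theta, rbin, nbins):
    """Accumulate a_lm(r) = sum_j Y*_lm(direction_j) over the neighbours
    of one central galaxy, binned radially -- the core O(N) step per
    central of the O(N^2) 3PCF algorithm. phi: azimuth, theta: polar."""
    a = np.zeros((2 * ell + 1, nbins), dtype=complex)
    for m in range(-ell, ell + 1):
        ylm = np.conj(sph_harm(m, ell, phi, theta))
        for b in range(nbins):
            a[m + ell, b] = ylm[rbin == b].sum()
    return a

def zeta_ell(a):
    """Combine harmonic coefficients of two radial bins into this
    central's contribution to the 3PCF multipole (up to normalization),
    never forming the pair angle explicitly."""
    return np.real(np.einsum('mb,mc->bc', a, np.conj(a)))
```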

  16. Computing reward-prediction error: an integrated account of cortical timing and basal-ganglia pathways for appetitive and aversive learning.

    PubMed

    Morita, Kenji; Kawaguchi, Yasuo

    2015-08-01

    There are two prevailing notions regarding the involvement of the corticobasal ganglia system in value-based learning: (i) the direct and indirect pathways of the basal ganglia are crucial for appetitive and aversive learning, respectively, and (ii) the activity of midbrain dopamine neurons represents reward-prediction error. Although (ii) constitutes a critical assumption of (i), it remains elusive how (ii) holds given (i), with the basal-ganglia influence on the dopamine neurons. Here we present a computational neural-circuit model that potentially resolves this issue. Based on the latest analyses of the heterogeneous corticostriatal neurons and connections, our model posits that the direct and indirect pathways, respectively, represent the values of upcoming and previous actions, and up-regulate and down-regulate the dopamine neurons via the basal-ganglia output nuclei. This explains how the difference between the upcoming and previous values, which constitutes the core of reward-prediction error, is calculated. Simultaneously, it predicts that blockade of the direct/indirect pathway causes a negative/positive shift of reward-prediction error and thereby impairs learning from positive/negative error, i.e. appetitive/aversive learning. Through simulation of reward-reversal learning and punishment-avoidance learning, we show that our model could indeed account for the experimentally observed features that are suggested to support notion (i) and could also provide predictions on neural activity. We also present a behavioral prediction of our model, through simulation of inter-temporal choice, on how the balance between the two pathways relates to the subject's time preference. These results indicate that our model, incorporating the heterogeneity of the cortical influence on the basal ganglia, is expected to provide a closed-circuit mechanistic understanding of appetitive/aversive learning.
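
The core quantity, the difference between upcoming and previous values, is the familiar temporal-difference error; a generic sketch (not the authors' circuit model, and the discount factor is an assumption):

```python
def td_error(reward, v_next, v_prev, gamma=0.95):
    """Reward-prediction error as (discounted) upcoming value plus reward
    minus previous value -- the difference the model proposes is computed
    by the direct (upcoming) and indirect (previous) pathways acting on
    dopamine neurons. Standard temporal-difference form, for illustration."""
    return reward + gamma * v_next - v_prev
```

In the model's terms, weakening the direct pathway shifts this error negatively (impairing appetitive learning) and weakening the indirect pathway shifts it positively (impairing aversive learning).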

  18. An evolutionary computational theory of prefrontal executive function in decision-making

    PubMed Central

    Koechlin, Etienne

    2014-01-01

    The prefrontal cortex subserves executive control and decision-making, that is, the coordination and selection of thoughts and actions in the service of adaptive behaviour. We present here a computational theory describing the evolution of the prefrontal cortex from rodents to humans as gradually adding new inferential Bayesian capabilities for dealing with a computationally intractable decision problem: exploring and learning new behavioural strategies versus exploiting and adjusting previously learned ones through reinforcement learning (RL). We provide a principled account identifying three inferential steps optimizing this arbitration through the emergence of (i) factual reactive inferences in paralimbic prefrontal regions in rodents; (ii) factual proactive inferences in lateral prefrontal regions in primates and (iii) counterfactual reactive and proactive inferences in human frontopolar regions. The theory clarifies the integration of model-free and model-based RL through the notion of strategy creation. The theory also shows that counterfactual inferences in humans yield to the notion of hypothesis testing, a critical reasoning ability for approximating optimal adaptive processes and presumably endowing humans with a qualitative evolutionary advantage in adaptive behaviour. PMID:25267817

  19. An evolutionary computational theory of prefrontal executive function in decision-making.

    PubMed

    Koechlin, Etienne

    2014-11-01

    The prefrontal cortex subserves executive control and decision-making, that is, the coordination and selection of thoughts and actions in the service of adaptive behaviour. We present here a computational theory describing the evolution of the prefrontal cortex from rodents to humans as gradually adding new inferential Bayesian capabilities for dealing with a computationally intractable decision problem: exploring and learning new behavioural strategies versus exploiting and adjusting previously learned ones through reinforcement learning (RL). We provide a principled account identifying three inferential steps optimizing this arbitration through the emergence of (i) factual reactive inferences in paralimbic prefrontal regions in rodents; (ii) factual proactive inferences in lateral prefrontal regions in primates and (iii) counterfactual reactive and proactive inferences in human frontopolar regions. The theory clarifies the integration of model-free and model-based RL through the notion of strategy creation. The theory also shows that counterfactual inferences in humans yield to the notion of hypothesis testing, a critical reasoning ability for approximating optimal adaptive processes and presumably endowing humans with a qualitative evolutionary advantage in adaptive behaviour.

  20. Studying the Chemistry of Cationized Triacylglycerols Using Electrospray Ionization Mass Spectrometry and Density Functional Theory Computations

    NASA Astrophysics Data System (ADS)

    Grossert, J. Stuart; Herrera, Lisandra Cubero; Ramaley, Louis; Melanson, Jeremy E.

    2014-08-01

    Analysis of triacylglycerols (TAGs), found as complex mixtures in living organisms, is typically accomplished using liquid chromatography, often coupled to mass spectrometry. TAGs, weak bases not protonated using electrospray ionization, are usually ionized by adduct formation with a cation, including those present in the solvent (e.g., Na+). There are relatively few reports on the binding of TAGs with cations or on the mechanisms by which cationized TAGs fragment. This work examines binding efficiencies, determined by mass spectrometry and computations, for the complexation of TAGs to a range of cations (Na+, Li+, K+, Ag+, NH4 +). While most cations bind to oxygen, Ag+ binding to unsaturation in the acid side chains is significant. The importance of dimer formation, [2TAG + M]+ was demonstrated using several different types of mass spectrometers. From breakdown curves, it became apparent that two or three acid side chains must be attached to glycerol for strong cationization. Possible mechanisms for fragmentation of lithiated TAGs were modeled by computations on tripropionylglycerol. Viable pathways were found for losses of neutral acids and lithium salts of acids from different positions on the glycerol moiety. Novel lactone structures were proposed for the loss of a neutral acid from one position of the glycerol moiety. These were studied further using triple-stage mass spectrometry (MS3). These lactones can account for all the major product ions in the MS3 spectra in both this work and the literature, which should allow for new insights into the challenging analytical methods needed for naturally occurring TAGs.

  2. Computing light statistics in heterogeneous media based on a mass weighted probability density function method.

    PubMed

    Jenny, Patrick; Mourad, Safer; Stamm, Tobias; Vöge, Markus; Simon, Klaus

    2007-08-01

    Based on the transport theory, we present a modeling approach to light scattering in turbid material. It uses an efficient and general statistical description of the material's scattering and absorption behavior. The model estimates the spatial distribution of intensity and the flow direction of radiation, both of which are required, e.g., for adaptable predictions of the appearance of colors in halftone prints. This is achieved by employing a computational particle method, which solves a model equation for the probability density function of photon positions and propagation directions. In this framework, each computational particle represents a finite probability of finding a photon in a corresponding state, including properties like wavelength. Model evaluations and verifications conclude the discussion.
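
A drastically simplified 1-D Monte Carlo sketch of the computational-particle idea, in which each particle carries a finite probability of a photon state through a turbid slab; the slab geometry, isotropic 1-D scattering, and escape criterion are assumptions for illustration, not the authors' PDF model equation:

```python
import math
import random

def absorbed_fraction(n_photons, mu_s, mu_a, thickness, seed=0):
    """Propagate computational particles through a homogeneous slab with
    scattering (mu_s) and absorption (mu_a) coefficients; return the
    fraction absorbed. Illustrative sketch only."""
    rng = random.Random(seed)
    mu_t = mu_s + mu_a
    absorbed = 0
    for _ in range(n_photons):
        z, direction = 0.0, 1.0
        while True:
            z += direction * (-math.log(rng.random()) / mu_t)  # free path
            if not 0.0 <= z <= thickness:
                break                                # particle left the slab
            if rng.random() < mu_a / mu_t:
                absorbed += 1                        # interaction: absorbed
                break
            direction = rng.choice((-1.0, 1.0))      # interaction: scattered
    return absorbed / n_photons
```

The paper's method generalizes this by evolving a probability density over photon position, direction, and wavelength rather than simple 1-D walks.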

  3. A computational theory of hippocampal function, and tests of the theory: new developments.

    PubMed

    Kesner, Raymond P; Rolls, Edmund T

    2015-01-01

    The aims of the paper are to update Rolls' quantitative computational theory of hippocampal function and the predictions it makes about the different subregions (dentate gyrus, CA3 and CA1), and to examine behavioral and electrophysiological data that address the functions of the hippocampus and particularly its subregions. Based on the computational proposal that the dentate gyrus produces sparse representations by competitive learning and via the mossy fiber pathway forces new representations on the CA3 during learning (encoding), it has been shown behaviorally that the dentate gyrus supports spatial pattern separation during learning. Based on the computational proposal that CA3-CA3 autoassociative networks are important for episodic memory, it has been shown behaviorally that the CA3 supports spatial rapid one-trial learning, learning of arbitrary associations where space is a component, pattern completion, spatial short-term memory, and spatial sequence learning by associations formed between successive items. The concept that the CA1 recodes information from CA3 and sets up associatively learned backprojections to neocortex to allow subsequent retrieval of information to neocortex, is consistent with findings on consolidation. Behaviorally, the CA1 is implicated in processing temporal information as shown by investigations requiring temporal order pattern separation and associations across time; and computationally this could involve associations in CA1 between object and timing information that have their origins in the lateral and medial entorhinal cortex respectively. The perforant path input from the entorhinal cortex to DG is implicated in learning, to CA3 in retrieval from CA3, and to CA1 in retrieval after longer time intervals ("intermediate-term memory") and in the temporal sequence memory for objects. PMID:25446947

  4. Computer-aided analyses of transport protein sequences: gleaning evidence concerning function, structure, biogenesis, and evolution.

    PubMed Central

    Saier, M H

    1994-01-01

    Three-dimensional structures have been elucidated for very few integral membrane proteins. Computer methods can be used as guides for estimation of solute transport protein structure, function, biogenesis, and evolution. In this paper the application of currently available computer programs to over a dozen distinct families of transport proteins is reviewed. The reliability of sequence-based topological and localization analyses and the importance of sequence and residue conservation to structure and function are evaluated. Evidence concerning the nature and frequency of occurrence of domain shuffling, splicing, fusion, deletion, and duplication during evolution of specific transport protein families is also evaluated. Channel proteins are proposed to be functionally related to carriers. It is argued that energy coupling to transport was a late occurrence, superimposed on preexisting mechanisms of solute facilitation. It is shown that several transport protein families have evolved independently of each other, employing different routes, at different times in evolutionary history, to give topologically similar transmembrane protein complexes. The possible significance of this apparent topological convergence is discussed. PMID:8177172

  5. Computing single step operators of logic programming in radial basis function neural networks

    NASA Astrophysics Data System (ADS)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-01

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single step operator of any logic program is defined as a function (Tp:I→I). Logic programming is well suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.
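
The single step operator itself is easy to state in code. Below is a standard textbook sketch of T_P for ground normal programs, together with the fixed-point iteration that the recurrent network is trained to reach (the clause encoding is an assumption for illustration):

```python
def tp(program, interpretation):
    """Single-step (immediate consequence) operator T_P of a normal logic
    program. 'program' is a list of (head, positive_body, negative_body)
    clauses over ground atoms; 'interpretation' is the set of atoms valued
    true. Returns T_P(I)."""
    return {
        head
        for head, pos, neg in program
        if all(atom in interpretation for atom in pos)
        and all(atom not in interpretation for atom in neg)
    }

def fixed_point(program, interpretation=frozenset(), limit=100):
    """Iterate T_P until it stabilizes -- the steady state targeted by
    the recurrent RBF network (may not converge for arbitrary programs)."""
    for _ in range(limit):
        nxt = tp(program, interpretation)
        if nxt == interpretation:
            return nxt
        interpretation = nxt
    return interpretation
```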

  6. Boolean Combinations of Implicit Functions for Model Clipping in Computer-Assisted Surgical Planning

    PubMed Central

    2016-01-01

    This paper proposes an interactive method of model clipping for computer-assisted surgical planning. The model is separated by a data filter that is defined by the implicit function of the clipping path. The clipping path, composed of plane widgets, can be manually repositioned by surgeons along the desired presurgical path, which means that surgeons can produce any accurate shape of the clipped model. The implicit function is acquired through a recursive algorithm based on the Boolean combinations (including Boolean union and Boolean intersection) of the plane widgets’ implicit functions. The algorithm is highly efficient, as its best time performance is linear, which applies to most cases in computer-assisted surgical planning. Based on this algorithm, a user-friendly module named SmartModelClip was developed on the Slicer platform using VTK. A number of arbitrary clipping paths have been tested. Experimental results of presurgical planning for three types of Le Fort fractures and for tumor removal demonstrate the high reliability and efficiency of our recursive algorithm and the robustness of the module. PMID:26751685
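
The Boolean machinery such clipping relies on can be sketched with implicit half-space functions, taking a point to be inside when f(p) > 0; the min/max composition rule and the sign convention are standard constructive solid geometry, assumed here for illustration rather than taken from the paper:

```python
import numpy as np

def plane(normal, point):
    """Implicit function of a plane widget: f(p) > 0 on the normal side."""
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)
    d = n @ np.asarray(point, dtype=float)
    return lambda p: np.asarray(p) @ n - d

def union(f, g):
    """Boolean union: a point is inside if it is inside either half-space."""
    return lambda p: np.maximum(f(p), g(p))

def intersection(f, g):
    """Boolean intersection: inside only if inside both half-spaces."""
    return lambda p: np.minimum(f(p), g(p))
```

Recursively combining widgets this way yields a single implicit function that a data filter can evaluate per vertex to separate the model.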

  7. Computing single step operators of logic programming in radial basis function neural networks

    SciTech Connect

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-10

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single step operator of any logic program is defined as a function (Tp:I→I). Logic programming is well suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in the radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used the recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.

  8. Boolean Combinations of Implicit Functions for Model Clipping in Computer-Assisted Surgical Planning.

    PubMed

    Zhan, Qiqin; Chen, Xiaojun

    2016-01-01

    This paper proposes an interactive method of model clipping for computer-assisted surgical planning. The model is separated by a data filter defined by the implicit function of the clipping path. Because the clipping path, composed of plane widgets, is interactive, surgeons can manually reposition it along the desired presurgical path and thus produce any accurate shape of the clipped model. The implicit function is obtained through a recursive algorithm based on Boolean combinations (union and intersection) of the plane widgets' implicit functions. The algorithm is highly efficient: its best-case time complexity is linear, which covers most cases in computer-assisted surgical planning. Based on this algorithm, a user-friendly module named SmartModelClip was developed on the Slicer platform with VTK. A number of arbitrary clipping paths were tested, and experimental results of presurgical planning for three types of Le Fort fractures and for tumor removal demonstrate the high reliability and efficiency of the recursive algorithm and the robustness of the module.
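
    Boolean combinations of implicit functions of the kind described can be sketched with the standard min/max construction. The sign convention (f < 0 means "inside") and the slab example are our own illustration, not the paper's implementation:

```python
import numpy as np

def plane(normal, point):
    """Implicit function of a plane widget: f(p) = n·p - d, negative on
    the kept side (the side the normal points away from)."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    d = n @ np.asarray(point, dtype=float)
    return lambda p: n @ np.asarray(p, dtype=float) - d

def union(f, g):          # inside either half-space
    return lambda p: min(f(p), g(p))

def intersection(f, g):   # inside both half-spaces
    return lambda p: max(f(p), g(p))

# Hypothetical clipping path: keep the slab 0 <= x <= 1.
keep = intersection(plane((-1, 0, 0), (0, 0, 0)),   # keeps x >= 0
                    plane((1, 0, 0), (1, 0, 0)))    # keeps x <= 1

print(keep((0.5, 0, 0)) < 0)   # inside the slab → True
print(keep((2.0, 0, 0)) < 0)   # outside → False
```

    A recursive algorithm like the paper's nests such combinations over a series of plane widgets; a mesh vertex is retained when the combined implicit function evaluates negative.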

  9. Talking while Computing in Groups: The Not-so-Private Functions of Computational Private Speech in Mathematical Discussions

    ERIC Educational Resources Information Center

    Zahner, William; Moschkovich, Judit

    2010-01-01

    Students often voice computations during group discussions of mathematics problems. Yet, this type of private speech has received little attention from mathematics educators or researchers. In this article, we use excerpts from middle school students' group mathematical discussions to illustrate and describe "computational private speech." We…

  10. An effective method to verify line and point spread functions measured in computed tomography

    SciTech Connect

    Ohkubo, Masaki; Wada, Sinichi; Matsumoto, Toru; Nishizawa, Kanae

    2006-08-15

    This study describes an effective method for verifying line spread function (LSF) and point spread function (PSF) measured in computed tomography (CT). The CT image of an assumed object function is known to be calculable using LSF or PSF based on a model for the spatial resolution in a linear imaging system. Therefore, the validity of the LSF and PSF can be confirmed by comparing the computed images with the images obtained by scanning phantoms corresponding to the object function. Differences between computed and measured images will depend on the accuracy of the LSF and PSF used in the calculations. First, we measured LSF in our scanner, and derived the two-dimensional PSF in the scan plane from the LSF. Second, we scanned the phantom including uniform cylindrical objects parallel to the long axis of a patient's body (z direction). Measured images of such a phantom were characterized according to the spatial resolution in the scan plane, and did not depend on the spatial resolution in the z direction. Third, images were calculated by two-dimensionally convolving the true object as a function of space with the PSF. As a result of comparing computed images with measured ones, good agreement was found and was demonstrated by image subtraction. As a criterion for evaluating quantitatively the overall differences of images, we defined the normalized standard deviation (SD) in the differences between computed and measured images. These normalized SDs were less than 5.0% (ranging from 1.3% to 4.8%) for three types of image reconstruction kernels and for various diameters of cylindrical objects, indicating that the PSF and LSF had been measured accurately. Further, we also obtained another LSF using an inappropriate procedure, and calculated the images as above. This time, the computed images did not agree with the measured ones. The normalized SDs were 6.0% or more (ranging from 6.0% to 13.8%), indicating the inaccuracy of the PSF and LSF. We
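
    The verification scheme reduces to convolving a known object function with a candidate PSF and scoring the difference against the measured image. A minimal sketch follows; the Gaussian PSFs stand in for a real scanner PSF, and the specific normalization is an illustrative assumption:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def normalized_sd(computed, measured):
    """SD of the difference image, normalized by the measured image range."""
    diff = computed - measured
    return diff.std() / (measured.max() - measured.min())

# Object function: a uniform cylinder (disk) in the scan plane.
y, x = np.mgrid[-64:64, -64:64]
obj = (x**2 + y**2 <= 20**2).astype(float)

measured = gaussian_filter(obj, sigma=2.0)   # stand-in for a scanned image
good = gaussian_filter(obj, sigma=2.0)       # image computed with accurate PSF
bad = gaussian_filter(obj, sigma=5.0)        # image computed with inaccurate PSF

print(normalized_sd(good, measured))                          # ~0 → PSF verified
print(normalized_sd(bad, measured) > normalized_sd(good, measured))  # → True
```

    A small normalized SD (the paper reports under 5%) indicates an accurate PSF/LSF; an inaccurate PSF inflates it, as the deliberately mismatched case shows.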

  11. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU.

    PubMed

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

    This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using synthetic data. Experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation in group-level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful as a fast screening tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis.

  12. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU

    PubMed Central

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

    This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using synthetic data. Experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation in group-level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful as a fast screening tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis. PMID:23840507
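
    The two reported figures (155x for the parallel part, ~7x overall per model) are consistent under Amdahl's law. The back-of-envelope check below, including the inferred parallel fraction, is our own illustration, not a number from the paper:

```python
def overall_speedup(parallel_fraction, parallel_speedup):
    """Amdahl's law: overall speedup when only a fraction is accelerated."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / parallel_speedup)

def parallel_fraction_from(overall, parallel_speedup):
    """Invert Amdahl's law for the fraction of runtime that was parallelized."""
    return (1.0 - 1.0 / overall) / (1.0 - 1.0 / parallel_speedup)

f = parallel_fraction_from(overall=7.0, parallel_speedup=155.0)
print(round(f, 3))                        # → 0.863: most of the runtime is GPU-side
print(round(overall_speedup(f, 155.0)))   # → 7 (consistency check)
```

    The remaining ~14% serial work (data transfer, EM bookkeeping on the CPU) caps the achievable end-to-end speedup regardless of GPU throughput.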

  13. Computer simulation on the cooperation of functional molecules during the early stages of evolution.

    PubMed

    Ma, Wentao; Hu, Jiming

    2012-01-01

    It is very likely that life began with some RNA (or RNA-like) molecules, self-replicating by base-pairing and exhibiting enzyme-like functions that favored the self-replication. Different functional molecules may have emerged by favoring their own self-replication at different aspects. Then, a direct route towards complexity/efficiency may have been through the coexistence/cooperation of these molecules. However, the likelihood of this route remains quite unclear, especially because the molecules would be competing for limited common resources. By computer simulation using a Monte-Carlo model (with "micro-resolution" at the level of nucleotides and membrane components), we show that the coexistence/cooperation of these molecules can occur naturally, both in a naked form and in a protocell form. The results of the computer simulation also lead to quite a few deductions concerning the environment and history in the scenario. First, a naked stage (with functional molecules catalyzing template-replication and metabolism) may have occurred early in evolution but required high concentration and limited dispersal of the system (e.g., on some mineral surface); the emergence of protocells enabled a "habitat-shift" into bulk water. Second, the protocell stage started with a substage of "pseudo-protocells", with functional molecules catalyzing template-replication and metabolism, but still missing the function involved in the synthesis of membrane components, the emergence of which would lead to a subsequent "true-protocell" substage. Third, the initial unstable membrane, composed of prebiotically available fatty acids, should have been superseded quite early by a more stable membrane (e.g., composed of phospholipids, like modern cells). Additionally, the membrane-takeover probably occurred at the transition of the two substages of the protocells. The scenario described in the present study should correspond to an episode in early evolution, after the emergence of single

  14. An accurate Fortran code for computing hydrogenic continuum wave functions at a wide range of parameters

    NASA Astrophysics Data System (ADS)

    Peng, Liang-You; Gong, Qihuang

    2010-12-01

    The accurate computation of hydrogenic continuum wave functions is very important in many branches of physics such as electron-atom collisions, cold atom physics, and atomic ionization in strong laser fields. Although various algorithms and codes already exist, most of them are reliable only in certain ranges of parameters. In some practical applications, accurate continuum wave functions need to be calculated at extremely low energies, large radial distances, and/or large angular momentum numbers. Here we provide such a code, which can generate accurate hydrogenic continuum wave functions and the corresponding Coulomb phase shifts over a wide range of parameters. Without any essential restriction on the angular momentum number, the present code is able to give reliable results at the electron energy range [10,10] eV for radial distances of [10,10] a.u. We also find the present code to be very efficient, and it should find numerous applications in fields such as strong field physics. Program summary: Program title: HContinuumGautchi. Catalogue identifier: AEHD_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHD_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 1233. No. of bytes in distributed program, including test data, etc.: 7405. Distribution format: tar.gz. Programming language: Fortran90 in fixed format. Computer: AMD Processors. Operating system: Linux. RAM: 20 MBytes. Classification: 2.7, 4.5. Nature of problem: The accurate computation of atomic continuum wave functions is very important in many research fields such as strong field physics and cold atom physics. Although various algorithms and codes already exist, most of them are applicable and reliable only in a certain range of parameters. We present here an accurate FORTRAN program for
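
    One quantity such a code must supply is the Coulomb phase shift, σ_l = arg Γ(l + 1 + iη). The sketch below uses the standard textbook definition with η = -Z/k for an electron in an attractive Coulomb field (atomic units); it illustrates the quantity, not the paper's Gautschi-based algorithm:

```python
from math import sqrt
from scipy.special import loggamma  # supports complex arguments

def coulomb_phase_shift(l, eta):
    """sigma_l = arg Gamma(l + 1 + i*eta), via the imaginary part of log-Gamma."""
    return loggamma(l + 1 + 1j * eta).imag

Z, energy = 1.0, 0.5          # hydrogen, E = 0.5 hartree
k = sqrt(2.0 * energy)        # electron momentum (a.u.)
eta = -Z / k                  # Sommerfeld parameter

print(coulomb_phase_shift(0, 0.0))   # eta = 0 (free particle) → 0.0
print(coulomb_phase_shift(0, eta))   # nonzero Coulomb phase shift
```

    Using log-Gamma rather than Γ itself keeps the evaluation stable for the large l and |η| regimes the abstract emphasizes.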

  15. Non-Superior Disembedding Performance in Children with High-Functioning Autism and Its Cognitive Style Account

    ERIC Educational Resources Information Center

    Chen, Fei; Lemonnier, Eric; Lazartigues, Alain; Planche, Pascale

    2008-01-01

    Some early studies showed a superior disembedding performance in autistic people while other studies found no difference between autistic and controls. The present study aimed to assess such disembedding ability in 14 boys with high-functioning autism (HFA) and 14 chronological age and non-verbal IQ matched typically developed boys using an…

  16. The Luminosity Function of Quasars (active Galactic Nuclei) in a Merging Model with the Eddington Limit Taken Into Account

    NASA Astrophysics Data System (ADS)

    Kontorovich, V. M.; Krivitsky, D. S.

    The influence of the Eddington limit on the active galactic nuclei (AGN) luminosity function is investigated within the framework of a phenomenological activity model (Kats and Kontorovich, 1990, 1991) based on angular momentum compensation in the process of galaxy merging. In particular, it is shown that, although the galaxy merging probability depends essentially on the galaxies' masses, in the most important and interesting case it behaves effectively as a constant, so that the previously noted (Kats and Kontorovich, 1991) correspondence between the power-law exponents of the observed galaxy mass function (Binggeli et al., 1988) and of the quasar luminosity function (Boyle et al., 1988; Koo and Kron, 1988; Cristiani et al., 1993) for a constant merger probability holds in reality. In certain cases, a break in the power-law dependence of the luminosity function due to the Eddington restriction (cf. Dibai, 1981; Padovani and Rafanelli, 1988) is obtained. A possible correlation between the masses of black holes in AGN and the masses of their host galaxies is discussed. A more detailed paper containing the results presented at this conference was published in Pis'ma v Astron. Zh. (Kontorovich and Krivitsky, 1995); here we have also added some notes and references.

  17. ACCOUNTING FOR THE ENDOGENEITY OF HEALTH AND ENVIRONMENTAL TOBACCO SMOKE EXPOSURE IN CHILDREN: AN APPLICATION TO CONTINUOUS LUNG FUNCTION

    EPA Science Inventory

    The goal of this study is to estimate an unbiased exposure effect of environmental tobacco smoke (ETS) exposure on children's continuous lung function. A majority of the evidence from health studies suggests that ETS exposure in early life contributes significantly to childhood ...

  18. Do Children's Executive Functions Account for Associations between Early Autonomy-Supportive Parenting and Achievement through High School?

    ERIC Educational Resources Information Center

    Bindman, Samantha W.; Pomerantz, Eva M.; Roisman, Glenn I.

    2015-01-01

    This study evaluated whether the positive association between early autonomy-supportive parenting and children's subsequent achievement is mediated by children's executive functions. Using observations of mothers' parenting from the National Institute of Child Health and Human Development (NICHD) Study of Early Child Care and Youth Development (N…

  19. Distinct Quantitative Computed Tomography Emphysema Patterns Are Associated with Physiology and Function in Smokers

    PubMed Central

    San José Estépar, Raúl; Mendoza, Carlos S.; Hersh, Craig P.; Laird, Nan; Crapo, James D.; Lynch, David A.; Silverman, Edwin K.; Washko, George R.

    2013-01-01

    Rationale: Emphysema occurs in distinct pathologic patterns, but little is known about the epidemiologic associations of these patterns. Standard quantitative measures of emphysema from computed tomography (CT) do not distinguish between distinct patterns of parenchymal destruction. Objectives: To study the epidemiologic associations of distinct emphysema patterns with measures of lung-related physiology, function, and health care use in smokers. Methods: Using a local histogram-based assessment of lung density, we quantified distinct patterns of low attenuation in 9,313 smokers in the COPDGene Study. To determine if such patterns provide novel insights into chronic obstructive pulmonary disease epidemiology, we tested for their association with measures of physiology, function, and health care use. Measurements and Main Results: Compared with percentage of low-attenuation area less than −950 Hounsfield units (%LAA-950), local histogram-based measures of distinct CT low-attenuation patterns are more predictive of measures of lung function, dyspnea, quality of life, and health care use. These patterns are strongly associated with a wide array of measures of respiratory physiology and function, and most of these associations remain highly significant (P < 0.005) after adjusting for %LAA-950. In smokers without evidence of chronic obstructive pulmonary disease, the mild centrilobular disease pattern is associated with lower FEV1 and worse functional status (P < 0.005). Conclusions: Measures of distinct CT emphysema patterns provide novel information about the relationship between emphysema and key measures of physiology, physical function, and health care use. Measures of mild emphysema in smokers with preserved lung function can be extracted from CT scans and are significantly associated with functional measures. PMID:23980521
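
    The densitometric baseline the study compares against, %LAA-950, is simple to state: the percentage of lung voxels with attenuation below -950 Hounsfield units. A minimal sketch on synthetic HU values (illustrative distributions, not CT data) follows:

```python
import numpy as np

def percent_laa(hu_values, threshold=-950):
    """Percentage of voxels below the low-attenuation threshold (in HU)."""
    hu = np.asarray(hu_values)
    return 100.0 * np.count_nonzero(hu < threshold) / hu.size

rng = np.random.default_rng(0)
normal_lung = rng.normal(-850, 40, 10_000)   # mostly above -950 HU
emphysema = rng.normal(-930, 60, 10_000)     # many voxels below -950 HU

print(percent_laa(normal_lung) < percent_laa(emphysema))  # → True
```

    The paper's point is that a single global threshold like this discards spatial pattern; the local histogram approach classifies each neighborhood's density distribution instead, which is why it retains predictive power after adjusting for %LAA-950.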

  20. Krylov-space algorithms for time-dependent Hartree-Fock and density functional computations

    SciTech Connect

    Chernyak, Vladimir; Schulz, Michael F.; Mukamel, Shaul; Tretiak, Sergei; Tsiper, Eugene V.

    2000-07-01

    A fast, low memory cost, Krylov-space-based algorithm is proposed for the diagonalization of large Hamiltonian matrices required in time-dependent Hartree-Fock (TDHF) and adiabatic time-dependent density-functional theory (TDDFT) computations of electronic excitations. A deflection procedure based on the symplectic structure of the TDHF equations is introduced and its capability to find higher eigenmodes of the linearized TDHF operator for a given numerical accuracy is demonstrated. The algorithm may be immediately applied to the formally-identical adiabatic TDDFT equations. (c) 2000 American Institute of Physics.
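
    The Krylov-space idea behind such algorithms can be sketched with a plain Lanczos recursion: extreme eigenvalues of a large symmetric matrix are recovered from a small tridiagonal projection using only matrix-vector products. (The TDHF operator additionally has the symplectic structure the paper's deflection procedure exploits; an ordinary symmetric matrix is used here for illustration.)

```python
import numpy as np

def lanczos_extreme(matvec, n, m=30, seed=0):
    """Ritz values of a symmetric operator from an m-step Lanczos recursion."""
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    q_prev = np.zeros(n)
    alphas, betas, beta = [], [], 0.0
    for _ in range(m):
        w = matvec(q) - beta * q_prev
        alpha = q @ w
        alphas.append(alpha)
        w -= alpha * q
        beta = np.linalg.norm(w)
        if beta < 1e-12:              # invariant subspace found early
            break
        betas.append(beta)
        q_prev, q = q, w / beta
    k = len(alphas)
    t = np.diag(alphas) + np.diag(betas[:k - 1], 1) + np.diag(betas[:k - 1], -1)
    return np.linalg.eigvalsh(t)      # Ritz values, ascending

n = 500
d = np.linspace(1.0, 100.0, n)
d[-1] = 200.0                         # well-separated extreme eigenvalue
a = np.diag(d) + 1e-3                 # dense symmetric test matrix

ritz = lanczos_extreme(lambda v: a @ v, n)
exact = np.linalg.eigvalsh(a)
print(abs(ritz[-1] - exact[-1]) < 1e-8)   # → True: 30 matvecs suffice
```

    Memory cost stays at a few vectors of length n, which is the "low memory cost" property the abstract highlights; the full matrix is only ever touched through matvec.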

  1. Numerical ray-tracing approach with laser intensity distribution for LIDAR signal power function computation

    NASA Astrophysics Data System (ADS)

    Shi, Guangyuan; Li, Song; Huang, Ke; Li, Zile; Zheng, Guoxing

    2016-10-01

    We have developed a new numerical ray-tracing approach for LIDAR signal power function computation, in which the light round-trip propagation is analyzed by geometrical optics and a simple experiment is employed to acquire the laser intensity distribution. The approach is more accurate and flexible than previous methods. We discuss in particular the relationship between the inclined angle and the dynamic range of the detector output signal in a biaxial LIDAR system. Results indicate that an appropriate negative angle can compress the signal dynamic range. The technique has been validated by comparison with real measurements.

  2. Numerical ray-tracing approach with laser intensity distribution for LIDAR signal power function computation

    NASA Astrophysics Data System (ADS)

    Shi, Guangyuan; Li, Song; Huang, Ke; Li, Zile; Zheng, Guoxing

    2016-08-01

    We have developed a new numerical ray-tracing approach for LIDAR signal power function computation, in which the light round-trip propagation is analyzed by geometrical optics and a simple experiment is employed to acquire the laser intensity distribution. The approach is more accurate and flexible than previous methods. We discuss in particular the relationship between the inclined angle and the dynamic range of the detector output signal in a biaxial LIDAR system. Results indicate that an appropriate negative angle can compress the signal dynamic range. The technique has been validated by comparison with real measurements.

  3. Computed Rankine-Hugoniot relations for hexanitrostilbene and hexanitrohexaazaisowurtzitane via density functional theory based molecular dynamics

    NASA Astrophysics Data System (ADS)

    Wixom, Ryan; Mattsson, Ann; Mattsson, Thomas

    2011-06-01

    Density Functional Theory (DFT) has become an indispensable tool for understanding the behavior of matter under extreme conditions, for example confirming experimental findings into the TPa regime and amending experimental data for constructing wide-range equations of state (EOS). The ability to perform high-fidelity calculations is even more important for cases where experiments are impossible to perform, dangerous, and/or prohibitively expensive. We will present computed shock properties for hexanitrostilbene and hexanitrohexaazaisowurtzitane, making comparisons with experimental shock data or diamond anvil cell data, where available. Credibility of the results and proposed methods for validation will be discussed.

  4. Using an iterative eigensolver to compute vibrational energies with phase-spaced localized basis functions

    SciTech Connect

    Brown, James; Carrington, Tucker

    2015-07-28

    Although phase-space localized Gaussians are themselves poor basis functions, they can be used to effectively contract a discrete variable representation basis [A. Shimshovitz and D. J. Tannor, Phys. Rev. Lett. 109, 070402 (2012)]. This works despite the fact that elements of the Hamiltonian and overlap matrices labelled by discarded Gaussians are not small. By formulating the matrix problem as a regular (i.e., not a generalized) matrix eigenvalue problem, we show that it is possible to use an iterative eigensolver to compute vibrational energy levels in the Gaussian basis.
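
    The key step the abstract describes is posing the problem as a regular rather than generalized eigenvalue problem so that an iterative eigensolver applies. A generic sketch of the textbook route, folding the overlap matrix in via its Cholesky factors, is shown below; this is the standard transformation, not necessarily the specific formulation used in the paper, and the random matrices are stand-ins for a real Hamiltonian and Gaussian overlap:

```python
import numpy as np
from scipy.linalg import cholesky, eigh, solve_triangular
from scipy.sparse.linalg import eigsh

rng = np.random.default_rng(1)
n = 200
h = rng.standard_normal((n, n))
h = (h + h.T) / 2                      # symmetric "Hamiltonian"
b = rng.standard_normal((n, n))
s = b @ b.T + n * np.eye(n)            # SPD "overlap" of a nonorthogonal basis

# Generalized problem H c = E S c → regular problem (L^-1 H L^-T) y = E y,
# where S = L L^T and c = L^-T y.
l = cholesky(s, lower=True)
h_reg = solve_triangular(l, solve_triangular(l, h, lower=True).T, lower=True).T
h_reg = (h_reg + h_reg.T) / 2          # symmetrize away roundoff

low = np.sort(eigsh(h_reg, k=3, which="SA")[0])   # iterative (Lanczos) solver
exact = eigh(h, s, eigvals_only=True)[:3]         # dense reference
print(np.allclose(low, exact, atol=1e-8))         # → True
```

    Once the problem is regular and symmetric, Lanczos-type solvers converge to the lowest vibrational levels using only matrix-vector products, which is what makes the contracted Gaussian basis practical.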

  5. Understanding entangled cerebral networks: a prerequisite for restoring brain function with brain-computer interfaces

    PubMed Central

    Mandonnet, Emmanuel; Duffau, Hugues

    2014-01-01

    Historically, cerebral processing has been conceptualized as a framework based on statically localized functions. However, a growing amount of evidence supports a hodotopical (delocalized) and flexible organization. A number of studies have reported the absence of a permanent neurological deficit after massive surgical resections of eloquent brain tissue. These results highlight the tremendous plastic potential of the brain. Understanding anatomo-functional correlates underlying this cerebral reorganization is a prerequisite to restore brain functions through brain-computer interfaces (BCIs) in patients with cerebral diseases, or even to potentiate brain functions in healthy individuals. Here, we review current knowledge of neural networks that could be utilized in the BCIs that enable movements and language. To this end, intraoperative electrical stimulation in awake patients provides valuable information on the cerebral functional maps, their connectomics and plasticity. Overall, these studies indicate that the complex cerebral circuitry that underpins interactions between action, cognition and behavior should be thoroughly investigated before progress in BCI approaches can be achieved. PMID:24834030

  6. Functional Priorities, Assistive Technology, and Brain-Computer Interfaces after Spinal Cord Injury

    PubMed Central

    Collinger, Jennifer L.; Boninger, Michael L.; Bruns, Tim M.; Curley, Kenneth; Wang, Wei; Weber, Douglas J.

    2012-01-01

    Spinal cord injury often impacts a person’s ability to perform critical activities of daily living and can have a negative impact on their quality of life. Assistive technology aims to bridge this gap to augment function and increase independence. It is critical to involve consumers in the design and evaluation process as new technologies, like brain-computer interfaces (BCIs), are developed. In a survey study of fifty-seven veterans with spinal cord injury who were participating in the National Veterans Wheelchair Games, we found that restoration of bladder/bowel control, walking, and arm/hand function (tetraplegia only) were all high priorities for improving quality of life. Many of the participants had not used or heard of some currently available technologies designed to improve function or the ability to interact with their environment. The majority of individuals in this study were interested in using a BCI, particularly for controlling functional electrical stimulation to restore lost function. Independent operation was considered to be the most important design criteria. Interestingly, many participants reported that they would be willing to consider surgery to implant a BCI even though non-invasiveness was a high priority design requirement. This survey demonstrates the interest of individuals with spinal cord injury in receiving and contributing to the design of BCI. PMID:23760996

  7. Functional priorities, assistive technology, and brain-computer interfaces after spinal cord injury.

    PubMed

    Collinger, Jennifer L; Boninger, Michael L; Bruns, Tim M; Curley, Kenneth; Wang, Wei; Weber, Douglas J

    2013-01-01

    Spinal cord injury (SCI) often affects a person's ability to perform critical activities of daily living and can negatively affect his or her quality of life. Assistive technology aims to bridge this gap in order to augment function and increase independence. It is critical to involve consumers in the design and evaluation process as new technologies such as brain-computer interfaces (BCIs) are developed. In a survey study of 57 veterans with SCI participating in the 2010 National Veterans Wheelchair Games, we found that restoration of bladder and bowel control, walking, and arm and hand function (tetraplegia only) were all high priorities for improving quality of life. Many of the participants had not used or heard of some currently available technologies designed to improve function or the ability to interact with their environment. The majority of participants in this study were interested in using a BCI, particularly for controlling functional electrical stimulation to restore lost function. Independent operation was considered to be the most important design criteria. Interestingly, many participants reported that they would consider surgery to implant a BCI even though noninvasiveness was a high-priority design requirement. This survey demonstrates the interest of individuals with SCI in receiving and contributing to the design of BCIs.

  8. Distribution of computer functionality for accelerator control at the Brookhaven AGS

    SciTech Connect

    Stevens, A.; Clifford, T.; Frankel, R.

    1985-01-01

    A set of physical and functional system components and their interconnection protocols have been established for all controls work at the AGS. Portions of these designs were tested as part of enhanced operation of the AGS as a source of polarized protons, and additional segments will be implemented during the continuing construction efforts that are adding heavy ion capability to the facility. Our efforts include the following computer and control system elements: a broadband local area network, which embodies modems, transmission systems, and branch interface units; a hierarchical layer, which performs certain database and watchdog/alarm functions; a group of workstation processors (Apollos), which perform the function of traditional minicomputer hosts; and a layer that provides both real-time control and standardization functions for accelerator devices and instrumentation. Database and other accelerator functionality is assigned to the most appropriate level within the network for real-time performance, long-term utility, and orderly growth.

  9. Distributed Accounting on the Grid

    NASA Technical Reports Server (NTRS)

    Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.

    2001-01-01

    By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (High Performance Computing) systems, and efforts were initiated to link the national infrastructure of high performance computational and data storage resources into a general computational utility "grid", analogous to the national electrical power grid. The purpose of the computational grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.
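
    The core bookkeeping in distributed Grid accounting is reconciling per-site usage records against a user's allocation. The toy sketch below illustrates that flow; the record fields are our own illustration, not a real schema (e.g. not the OGF Usage Record format):

```python
from collections import defaultdict

def aggregate(records):
    """Sum CPU-hours per user across all reporting sites."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["user"]] += rec["cpu_hours"]
    return dict(totals)

# Hypothetical usage records reported independently by two sites.
records = [
    {"site": "site-a", "user": "alice", "cpu_hours": 120.0},
    {"site": "site-b", "user": "alice", "cpu_hours": 40.0},
    {"site": "site-a", "user": "bob", "cpu_hours": 10.0},
]

usage = aggregate(records)
allocation = {"alice": 200.0, "bob": 5.0}
over = [u for u, used in usage.items() if used > allocation.get(u, 0.0)]

print(usage["alice"])   # → 160.0 (summed across sites)
print(over)             # → ['bob'] (exceeded allocation)
```

    In a fully distributed design like the one the paper proposes, no single site holds the whole picture; each site keeps its own records and the aggregation is performed across the Grid when an allocation decision is needed.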

  10. Beyond localized and distributed accounts of brain functions. Comment on “Understanding brain networks and brain organization” by Pessoa

    NASA Astrophysics Data System (ADS)

    Cauda, Franco; Costa, Tommaso; Tamietto, Marco

    2014-09-01

    Recent evidence in cognitive neuroscience lends support to the idea that network models of brain architecture provide a privileged access to the understanding of the relation between brain organization and cognitive processes [1]. The core perspective holds that cognitive processes depend on the interactions among distributed neuronal populations and brain structures, and that the impact of a given region on behavior largely depends on its pattern of anatomical and functional connectivity [2,3].
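
    The network view sketched above is often operationalized as functional connectivity: a region's role is characterized by how strongly its activity covaries with that of other regions. A toy illustration on synthetic time series (not neuroimaging data) follows:

```python
import numpy as np

rng = np.random.default_rng(42)
t = 300
driver = rng.standard_normal(t)                  # shared "network" signal
region_a = driver + 0.3 * rng.standard_normal(t)
region_b = driver + 0.3 * rng.standard_normal(t)
region_c = rng.standard_normal(t)                # region outside the network

# Functional connectivity matrix: pairwise correlations of regional signals.
fc = np.corrcoef([region_a, region_b, region_c])
print(fc[0, 1] > fc[0, 2])   # coupled regions correlate more strongly → True
```

    On this view, a lesion's behavioral impact depends less on the damaged tissue itself than on which rows and columns of such a connectivity matrix it perturbs.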

  11. Functional assessment of coronary artery disease by intravascular ultrasound and computational fluid dynamics simulation.

    PubMed

    Carrizo, Sebastián; Xie, Xinzhou; Peinado-Peinado, Rafael; Sánchez-Recalde, Angel; Jiménez-Valero, Santiago; Galeote-Garcia, Guillermo; Moreno, Raúl

    2014-10-01

    Clinical trials have shown that functional assessment of coronary stenosis by fractional flow reserve (FFR) improves clinical outcomes. Intravascular ultrasound (IVUS) complements conventional angiography, and is a powerful tool to assess atherosclerotic plaques and to guide percutaneous coronary intervention (PCI). Computational fluid dynamics (CFD) simulation represents a novel method for the functional assessment of coronary flow. A CFD simulation can be calculated from the data normally acquired by IVUS images. A case of coronary heart disease studied with FFR and IVUS, before and after PCI, is presented. A three-dimensional model was constructed based on IVUS images, to which CFD was applied. A discussion of the literature concerning the clinical utility of CFD simulation is provided. PMID:25441999

  12. Function and dynamics of macromolecular complexes explored by integrative structural and computational biology.

    PubMed

    Purdy, Michael D; Bennett, Brad C; McIntire, William E; Khan, Ali K; Kasson, Peter M; Yeager, Mark

    2014-08-01

    Three vignettes exemplify the potential of combining EM and X-ray crystallographic data with molecular dynamics (MD) simulation to explore the architecture, dynamics and functional properties of multicomponent, macromolecular complexes. The first two describe how EM and X-ray crystallography were used to solve structures of the ribosome and the Arp2/3-actin complex, which enabled MD simulations that elucidated functional dynamics. The third describes how EM, X-ray crystallography, and microsecond MD simulations of a GPCR:G protein complex were used to explore transmembrane signaling by the β-adrenergic receptor. Recent technical advancements in EM, X-ray crystallography and computational simulation create unprecedented synergies for integrative structural biology to reveal new insights into heretofore intractable biological systems.

  13. Study of space shuttle orbiter system management computer function. Volume 1: Analysis, baseline design

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A system analysis of the shuttle orbiter baseline system management (SM) computer function is performed. This analysis results in an alternative SM design which is also described. The alternative design exhibits several improvements over the baseline, some of which are increased crew usability, improved flexibility, and improved growth potential. The analysis consists of two parts: an application assessment and an implementation assessment. The former is concerned with the SM user needs and design functional aspects. The latter is concerned with design flexibility, reliability, growth potential, and technical risk. The system analysis is supported by several topical investigations. These include: treatment of false alarms, treatment of off-line items, significant interface parameters, and a design evaluation checklist. An in-depth formulation of techniques, concepts, and guidelines for design of automated performance verification is discussed.

  14. Functional assessment of coronary artery disease by intravascular ultrasound and computational fluid dynamics simulation.

    PubMed

    Carrizo, Sebastián; Xie, Xinzhou; Peinado-Peinado, Rafael; Sánchez-Recalde, Angel; Jiménez-Valero, Santiago; Galeote-Garcia, Guillermo; Moreno, Raúl

    2014-10-01

    Clinical trials have shown that functional assessment of coronary stenosis by fractional flow reserve (FFR) improves clinical outcomes. Intravascular ultrasound (IVUS) complements conventional angiography and is a powerful tool for assessing atherosclerotic plaques and guiding percutaneous coronary intervention (PCI). Computational fluid dynamics (CFD) simulation represents a novel method for the functional assessment of coronary flow. A CFD simulation can be computed from data routinely acquired during IVUS imaging. A case of coronary heart disease studied with FFR and IVUS, before and after PCI, is presented. A three-dimensional model was constructed from the IVUS images, to which CFD was applied. A discussion of the literature concerning the clinical utility of CFD simulation is provided.
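
    Whether the pressures come from a pressure wire or are extracted from a CFD solution, the functional index itself reduces to a simple ratio: FFR is mean distal coronary pressure divided by mean aortic pressure under hyperemia. A minimal sketch, with illustrative pressure values (not from the reported case):

    ```python
    # FFR = Pd / Pa: mean distal coronary pressure over mean aortic pressure
    # during hyperemia. The pressures below are illustrative, not patient data.
    def ffr(p_distal_mmhg: float, p_aortic_mmhg: float) -> float:
        return p_distal_mmhg / p_aortic_mmhg

    value = ffr(p_distal_mmhg=71.0, p_aortic_mmhg=95.0)
    print(round(value, 2))  # a value <= 0.80 is the usual ischemia cut-off
    ```

    The downstream decision rule is the same either way; the contribution of an IVUS-based CFD model is in producing the distal pressure noninvasively.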

  15. Management of Liver Cancer Argon-helium Knife Therapy with Functional Computer Tomography Perfusion Imaging.

    PubMed

    Wang, Hongbo; Shu, Shengjie; Li, Jinping; Jiang, Huijie

    2016-02-01

    The objective of this study was to observe the change in blood perfusion of liver cancer following argon-helium knife treatment with functional computed tomography perfusion imaging. Twenty-seven patients with primary liver cancer treated with the argon-helium knife were included in this study. Plain computed tomography (CT) and computed tomography perfusion (CTP) imaging were conducted in all patients before and after treatment. Perfusion parameters including blood flow, blood volume, hepatic artery perfusion fraction, hepatic artery perfusion, and hepatic portal venous perfusion were used for evaluating the therapeutic effect. All parameters in liver cancer were significantly decreased after argon-helium knife treatment (p < 0.05 for all). A significant decrease in hepatic artery perfusion was also observed in pericancerous liver tissue, but the other parameters remained constant. CT perfusion imaging is able to detect the decrease in blood perfusion of liver cancer after argon-helium knife therapy. Therefore, CTP imaging can play an important role in the management of liver cancer following argon-helium knife therapy.

  16. Computerized material accounting

    SciTech Connect

    Claborn, J.; Erkkila, B.

    1995-07-01

    With the advent of fast, reliable database servers running on inexpensive networked personal computers, it is possible to create material accountability systems that are easy to learn, easy to use, and cost-effective to implement. Maintaining the material data in a relational database allows the data to be viewed in ways that were previously very difficult. This paper describes software and hardware platforms for the implementation of such an accountability system.
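
    The "views that were previously very difficult" are plain relational queries once the ledger lives in a database. A minimal sketch with Python's built-in sqlite3; the table and column names are hypothetical, not from the paper's system:

    ```python
    import sqlite3

    # Hypothetical schema for a material-accountability ledger.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE material (
        item_id INTEGER PRIMARY KEY,
        location TEXT,
        material_type TEXT,
        mass_g REAL)""")
    conn.executemany(
        "INSERT INTO material VALUES (?, ?, ?, ?)",
        [(1, "Vault A", "U-235", 120.0),
         (2, "Vault A", "Pu-239", 40.0),
         (3, "Vault B", "U-235", 80.0)])

    def inventory_by_location(conn):
        """Total mass per location and material type -- a 'view' that is
        tedious with paper ledgers but one GROUP BY in SQL."""
        cur = conn.execute(
            "SELECT location, material_type, SUM(mass_g) "
            "FROM material GROUP BY location, material_type "
            "ORDER BY location, material_type")
        return cur.fetchall()

    print(inventory_by_location(conn))
    ```

    Any other slice of the data (per material type, per custodian, per time period) is a similar one-line query, which is the point the paper makes about relational storage.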

  17. A Function Accounting for Training Set Size and Marker Density to Model the Average Accuracy of Genomic Prediction

    PubMed Central

    Erbe, Malena; Gredler, Birgit; Seefried, Franz Reinhold; Bapst, Beat; Simianer, Henner

    2013-01-01

    Prediction of genomic breeding values is of major practical relevance in dairy cattle breeding. Deterministic equations have been suggested to predict the accuracy of genomic breeding values in a given design based on training set size, the reliability of phenotypes, and the number of independent chromosome segments. The aim of our study was to find a general deterministic equation for the average accuracy of genomic breeding values that also accounts for marker density and can be fitted empirically. Two data sets of 5′698 Holstein Friesian bulls genotyped with 50 K SNPs and 1′332 Brown Swiss bulls genotyped with 50 K SNPs and imputed to ∼600 K SNPs were available. Different k-fold (k = 2–10, 15, 20) cross-validation scenarios (50 replicates, random assignment) were performed using a genomic BLUP approach. A maximum likelihood approach was used to estimate the parameters of different prediction equations. The highest likelihood was obtained when using a modified form of the deterministic equation of Daetwyler et al. (2010), augmented by a weighting factor (w) based on the assumption that the maximum achievable accuracy is limited by the proportion of the genetic variance captured by the markers. This proportion, for the complete SNP sets, was 0.76 to 0.82 for Holstein Friesian and 0.72 to 0.75 for Brown Swiss. When modifying the number of SNPs, w was found to be proportional to the log of the marker density up to a limit which is population and trait specific and was found to be reached with ∼20′000 SNPs in the Brown Swiss population studied. PMID:24339895
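
    The unmodified Daetwyler et al. (2010) expectation, which the study extends with the weighting factor w, can be sketched as follows. The parameter values are illustrative, not the fitted values from the paper:

    ```python
    import math

    def expected_accuracy(n_train, h2, m_e, w=1.0):
        """Expected accuracy of genomic prediction in the form of
        Daetwyler et al. (2010): sqrt(N*h2 / (N*h2 + Me)), scaled by a
        weighting factor w (w = 1 recovers the original equation)."""
        return w * math.sqrt(n_train * h2 / (n_train * h2 + m_e))

    # Accuracy grows with training set size and shrinks with more
    # independent chromosome segments (illustrative values only):
    small = expected_accuracy(n_train=1000, h2=0.5, m_e=1000)
    large = expected_accuracy(n_train=5000, h2=0.5, m_e=1000)
    print(small < large)
    ```

    In the paper, w is then fitted empirically and found to grow with the log of marker density up to a population- and trait-specific ceiling.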

  18. A computer adaptive testing approach for assessing physical functioning in children and adolescents.

    PubMed

    Haley, Stephen M; Ni, Pengsheng; Fragala-Pinkham, Maria A; Skrinar, Alison M; Corzo, Deyanira

    2005-02-01

    The purpose of this article is to demonstrate: (1) the accuracy and (2) the reduction in amount of time and effort in assessing physical functioning (self-care and mobility domains) of children and adolescents using computer-adaptive testing (CAT). A CAT algorithm selects questions directly tailored to the child's ability level, based on previous responses. Using a CAT algorithm, a simulation study was used to determine the number of items necessary to approximate the score of a full-length assessment. We built simulated CAT (5-, 10-, 15-, and 20-item versions) for self-care and mobility domains and tested their accuracy in a normative sample (n=373; 190 males, 183 females; mean age 6y 11mo [SD 4y 2m], range 4mo to 14y 11mo) and a sample of children and adolescents with Pompe disease (n=26; 21 males, 5 females; mean age 6y 1mo [SD 3y 10mo], range 5mo to 14y 10mo). Results indicated that comparable score estimates (based on computer simulations) to the full-length tests can be achieved in a 20-item CAT version for all age ranges and for normative and clinical samples. No more than 13 to 16% of the items in the full-length tests were needed for any one administration. These results support further consideration of using CAT programs for accurate and efficient clinical assessments of physical functioning.
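
    The core of a CAT algorithm of this kind fits in a few lines: under a one-parameter (Rasch) response model, administer the unused item whose difficulty is closest to the current ability estimate, then nudge the estimate according to the response. This is a generic illustration with a deterministic stand-in for the child's responses, not the scoring engine used in the study:

    ```python
    import math

    def prob_correct(ability, difficulty):
        """Rasch model: probability of a correct response."""
        return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

    def next_item(ability, difficulties, used):
        """Pick the unused item whose difficulty is closest to the estimate."""
        unused = [i for i in range(len(difficulties)) if i not in used]
        return min(unused, key=lambda i: abs(difficulties[i] - ability))

    def run_cat(true_ability, difficulties, n_items=5, step=0.5):
        theta, used = 0.0, set()
        for _ in range(n_items):
            i = next_item(theta, difficulties, used)
            used.add(i)
            # Deterministic stand-in for a response (real CATs observe answers):
            correct = prob_correct(true_ability, difficulties[i]) > 0.5
            theta += step if correct else -step
        return theta

    bank = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]
    print(run_cat(true_ability=1.2, difficulties=bank))
    ```

    Because each item is targeted at the current estimate, the estimate homes in quickly, which is why the simulations above needed only 13 to 16% of the full item bank per administration.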

  19. Morphological and Functional Evaluation of Quadricuspid Aortic Valves Using Cardiac Computed Tomography

    PubMed Central

    Song, Inyoung; Park, Jung Ah; Choi, Bo Hwa; Shin, Je Kyoun; Chee, Hyun Keun; Kim, Jun Seok

    2016-01-01

    Objective The aim of this study was to identify the morphological and functional characteristics of quadricuspid aortic valves (QAV) on cardiac computed tomography (CCT). Materials and Methods We retrospectively enrolled 11 patients with QAV. All patients underwent CCT and transthoracic echocardiography (TTE), and 7 patients underwent cardiovascular magnetic resonance (CMR). The presence and classification of QAV assessed by CCT was compared with that of TTE and intraoperative findings. The regurgitant orifice area (ROA) measured by CCT was compared with the severity of aortic regurgitation (AR) by TTE and the regurgitant fraction (RF) by CMR. Results All of the patients had AR; 9 had pure AR, 1 had combined aortic stenosis and regurgitation, and 1 had combined subaortic stenosis and regurgitation. Two patients had a subaortic fibrotic membrane and 1 of them showed a subaortic stenosis. One QAV was misdiagnosed as a tricuspid aortic valve on TTE. In accordance with the Hurwitz and Roberts classification, consensus was reached on the QAV classification between the CCT and TTE findings in 7 of 10 patients. The patients were classified as type A (n = 1), type B (n = 3), type C (n = 1), type D (n = 4), and type F (n = 2) on CCT. A very high correlation existed between ROA by CCT and RF by CMR (r = 0.99), whereas a good correlation existed between ROA by CCT and regurgitant severity by TTE (r = 0.62). Conclusion Cardiac computed tomography provides comprehensive anatomical and functional information about the QAV. PMID:27390538

  20. Planar quantum quenches: computation of exact time-dependent correlation functions at large N

    NASA Astrophysics Data System (ADS)

    Cortés Cubero, Axel

    2016-08-01

    We study a quantum quench of an integrable quantum field theory in the planar infinite-N limit. Unlike isovector-valued O(N) models, matrix-valued field theories in the infinite-N limit are not solvable by the Hartree-Fock approximation, and are nontrivial interacting theories. We study quenches with initial states that are color-charge neutral, correspond to integrability-preserving boundary conditions, and lead to nontrivial correlation functions of operators. We compute exactly, at infinite N, the time-dependent one- and two-point correlation functions of the energy-momentum tensor and renormalized field operator after this quench, using known exact form factors. This computation can be done fully analytically, due to the simplicity of the initial state and of the form factors in the planar limit. We also show that this type of quench preserves factorizability at all times and allows for particle transmission from the pre-quench state, while still having nontrivial interacting post-quench dynamics.

  1. Experimental evidence validating the computational inference of functional associations from gene fusion events: a critical survey.

    PubMed

    Promponas, Vasilis J; Ouzounis, Christos A; Iliopoulos, Ioannis

    2014-05-01

    More than a decade ago, a number of methods were proposed for the inference of protein interactions, using whole-genome information from gene clusters, gene fusions and phylogenetic profiles. This structural and evolutionary view of entire genomes has provided a valuable approach for the functional characterization of proteins, especially those without sequence similarity to proteins of known function. Furthermore, this view has raised the real possibility to detect functional associations of genes and their corresponding proteins for any entire genome sequence. Yet, despite these exciting developments, there have been relatively few cases of real use of these methods outside the computational biology field, as reflected by citation analysis. These methods have the potential to be used in high-throughput experimental settings in functional genomics and proteomics to validate results with very high accuracy and good coverage. In this critical survey, we provide a comprehensive overview of the 30 most prominent examples of single pairwise protein interaction cases in small-scale studies, where protein interactions have either been detected by gene fusion or yielded additional, corroborating evidence from biochemical observations. Our conclusion is that with the derivation of a validated gold-standard corpus and better data integration with big experiments, gene fusion detection can truly become a valuable tool for large-scale experimental biology.

  2. Experimental evidence validating the computational inference of functional associations from gene fusion events: a critical survey

    PubMed Central

    Promponas, Vasilis J.; Ouzounis, Christos A.; Iliopoulos, Ioannis

    2014-01-01

    More than a decade ago, a number of methods were proposed for the inference of protein interactions, using whole-genome information from gene clusters, gene fusions and phylogenetic profiles. This structural and evolutionary view of entire genomes has provided a valuable approach for the functional characterization of proteins, especially those without sequence similarity to proteins of known function. Furthermore, this view has raised the real possibility to detect functional associations of genes and their corresponding proteins for any entire genome sequence. Yet, despite these exciting developments, there have been relatively few cases of real use of these methods outside the computational biology field, as reflected by citation analysis. These methods have the potential to be used in high-throughput experimental settings in functional genomics and proteomics to validate results with very high accuracy and good coverage. In this critical survey, we provide a comprehensive overview of the 30 most prominent examples of single pairwise protein interaction cases in small-scale studies, where protein interactions have either been detected by gene fusion or yielded additional, corroborating evidence from biochemical observations. Our conclusion is that with the derivation of a validated gold-standard corpus and better data integration with big experiments, gene fusion detection can truly become a valuable tool for large-scale experimental biology. PMID:23220349

  3. Intersections between the Autism Spectrum and the Internet: Perceived Benefits and Preferred Functions of Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Gillespie-Lynch, Kristen; Kapp, Steven K.; Shane-Simpson, Christina; Smith, David Shane; Hutman, Ted

    2014-01-01

    An online survey compared the perceived benefits and preferred functions of computer-mediated communication of participants with (N = 291) and without ASD (N = 311). Participants with autism spectrum disorder (ASD) perceived benefits of computer-mediated communication in terms of increased comprehension and control over communication, access to…

  4. Feasibility Study for a Remote Terminal Central Computing Facility Serving School and College Institutions. Volume I, Functional Requirements.

    ERIC Educational Resources Information Center

    International Business Machines Corp., White Plains, NY.

    The economic and technical feasibility of providing a remote terminal central computing facility to serve a group of 25-75 secondary schools and colleges was investigated. The general functions of a central facility for an educational cluster were defined to include training in computer techniques, the solution of student development problems in…

  5. A Computational Method Designed to Aid in the Teaching of Copolymer Composition and Microstructure as a Function of Conversion.

    ERIC Educational Resources Information Center

    Coleman, M. M.; Varnell, W. D.

    1982-01-01

    Describes a computer program (FORTRAN and APPLESOFT) demonstrating the effect of copolymer composition as a function of conversion, providing theoretical background and examples of types of information gained from computer calculations. Suggests that the program enhances undergraduate students' understanding of basic copolymerization theory.…

  6. Application of the new neutron monitor yield function computed for different altitudes to an analysis of GLEs

    NASA Astrophysics Data System (ADS)

    Mishev, Alexander; Usoskin, Ilya

    2016-07-01

    A precise analysis of SEP (solar energetic particle) spectral and angular characteristics using neutron monitor (NM) data requires realistic modeling of the propagation of those particles in the Earth's magnetosphere and atmosphere. On the basis of a method comprising a sequence of consecutive steps, namely a detailed computation of the SEP asymptotic cones of acceptance, the application of a neutron monitor yield function, and a convenient optimization procedure, we derived the rigidity spectra and anisotropy characteristics of several major GLEs. Here we present several major GLEs of solar cycle 23: the Bastille Day event on 14 July 2000 (GLE 59), GLE 69 on 20 January 2005, and GLE 70 on 13 December 2006. The SEP spectra and pitch angle distributions were computed in their dynamical development. For the computation we use the newly computed yield function of the standard 6NM64 neutron monitor for primary proton and alpha CR nuclei. In addition, we present new computations of the NM yield function for altitudes of 3000 m and 5000 m above sea level. The computations were carried out with the Planetocosmics and CORSIKA codes as standardized Monte-Carlo tools for atmospheric cascade simulations. The flux of secondary neutrons and protons was computed using the Planetocosmics code, applying a realistic curved-atmosphere model. Updated information concerning the NM registration efficiency for secondary neutrons and protons was used. The derived results for spectral and angular characteristics obtained using the newly computed NM yield function at several altitudes are compared with those obtained previously using the double attenuation method.
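
    The role of the yield function in such an analysis is to fold an assumed rigidity spectrum into a predicted count rate, which the optimization procedure then matches against the NM network data. A toy version of that folding step; both functions below are placeholder shapes, not the 6NM64 yield function computed in the work:

    ```python
    # NM count rate ~ integral over rigidity P (above the geomagnetic cutoff)
    # of yield function Y(P) times the primary particle spectrum J(P).
    def toy_yield(P):
        return P ** 1.5            # placeholder: yield grows with rigidity

    def toy_spectrum(P, gamma=5.0):
        return P ** (-gamma)       # placeholder: power-law SEP spectrum

    def count_rate(P_cutoff, P_max=20.0, n=2000):
        """Midpoint-rule integration of Y(P)*J(P) from cutoff to P_max (GV)."""
        dP = (P_max - P_cutoff) / n
        total = 0.0
        for k in range(n):
            P = P_cutoff + (k + 0.5) * dP
            total += toy_yield(P) * toy_spectrum(P) * dP
        return total

    # Stations with higher cutoff rigidity see lower count rates:
    print(count_rate(1.0) > count_rate(5.0))
    ```

    Inverting this relation over many stations with different cutoffs and asymptotic directions is what lets the method recover the SEP spectrum and its anisotropy.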

  7. Multiscale Theoretical and Computational Modeling of the Synthesis, Structure and Performance of Functional Carbon Materials

    NASA Astrophysics Data System (ADS)

    Mushrif, Samir Hemant

    2010-09-01

    Functional carbon-based/supported materials, including those doped with transition metal, are widely applied in hydrogen mediated catalysis and are currently being designed for hydrogen storage applications. This thesis focuses on acquiring a fundamental understanding and quantitative characterization of: (i) the chemistry of their synthesis procedure, (ii) their microstructure and chemical composition and (iii) their functionality, using multiscale modeling and simulation methodologies. Palladium and palladium(II) acetylacetonate are the transition metal and its precursor of interest, respectively. A first-principles modeling approach consisting of the planewave-pseudopotential implementation of the Kohn-Sham density functional theory, combined with the Car-Parrinello molecular dynamics, is implemented to model the palladium doping step in the synthesis of carbon-based/supported material and its interaction with hydrogen. The electronic structure is analyzed using the electron localization function and, when required, the hydrogen interaction dynamics are accelerated and the energetics are computed using the metadynamics technique. Palladium pseudopotentials are tested and validated for their use in a hydrocarbon environment by successfully computing the experimentally observed crystal structure of palladium(II) acetylacetonate. Long-standing hypotheses related to the palladium doping process are confirmed and new fundamental insights about its molecular chemistry are revealed. The dynamics, mechanism and energy landscape and barriers of hydrogen adsorption and migration on and desorption from the carbon-based/supported palladium clusters are reported for the first time. The effects of palladium doping and of the synthesis procedure on the pore structure of palladium-doped activated carbon fibers are quantified by applying novel statistical mechanical based methods to the experimental physisorption isotherms. 
The drawbacks of the conventional adsorption-based pore

  8. Study of dust particle charging in weakly ionized inert gases taking into account the nonlocality of the electron energy distribution function

    SciTech Connect

    Filippov, A. V.; Dyatko, N. A.; Kostenko, A. S.

    2014-11-15

    The charging of dust particles in weakly ionized inert gases at atmospheric pressure has been investigated. The conditions under which the gas is ionized by an external source, a beam of fast electrons, are considered. The electron energy distribution function in argon, krypton, and xenon has been calculated for three rates of gas ionization by fast electrons: 10¹³, 10¹⁴, and 10¹⁵ cm⁻¹. A model of dust particle charging with allowance for the nonlocal formation of the electron energy distribution function in the region of strong plasma quasi-neutrality violation around the dust particle is described. The nonlocality is taken into account in an approximation where the distribution function is a function of only the total electron energy. Comparative calculations of the dust particle charge with and without allowance for the nonlocality of the electron energy distribution function have been performed. Allowance for the nonlocality is shown to lead to a noticeable increase in the dust particle charge due to the influence of the group of hot electrons from the tail of the distribution function. It has been established that the screening constant virtually coincides with the smallest screening constant determined according to the asymptotic theory of screening with the electron transport and recombination coefficients in an unperturbed plasma.
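
    As a point of reference for such charging models: the textbook orbital-motion-limited (OML) balance of electron and ion currents, with a local Maxwellian distribution (i.e. without the nonlocality studied in the paper), already fixes a grain's floating potential. A sketch assuming equal electron and ion temperatures; this is the standard baseline, not the paper's model:

    ```python
    import math

    def oml_potential(mass_ratio, lo=0.0, hi=20.0, iters=60):
        """Solve the OML current balance exp(-x) = sqrt(me/mi)*(1 + x)
        for x = -e*phi/kTe (equal temperatures) by bisection; the left side
        falls and the right side rises, so the root is unique."""
        f = lambda x: math.exp(-x) - math.sqrt(1.0 / mass_ratio) * (1.0 + x)
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            if f(mid) > 0:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    x_ar = oml_potential(mass_ratio=40 * 1836)  # argon ion/electron mass ratio
    print(round(x_ar, 2))  # dimensionless floating potential, about 4 kTe/e
    ```

    The paper's point is that a nonlocal, beam-sustained electron distribution with a hot tail shifts this balance and increases the grain charge relative to such local estimates.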

  9. Acidity of the amidoxime functional group in aqueous solution. A combined experimental and computational study

    DOE PAGES

    Mehio, Nada; Lashely, Mark A.; Nugent, Joseph W.; Tucker, Lyndsay; Correia, Bruna; Do-Thanh, Chi-Linh; Dai, Sheng; Hancock, Robert D.; Bryantsev, Vyacheslav S.

    2015-01-26

    Poly(acrylamidoxime) adsorbents are often invoked in discussions of mining uranium from seawater. It has been demonstrated repeatedly in the literature that the success of these materials is due to the amidoxime functional group. While the amidoxime-uranyl chelation mode has been established, a number of essential binding constants remain unclear. This is largely due to the wide range of conflicting pKa values that have been reported for the amidoxime functional group in the literature. To resolve this existing controversy we investigated the pKa values of the amidoxime functional group using a combination of experimental and computational methods. Experimentally, we used spectroscopic titrations to measure the pKa values of representative amidoximes, acetamidoxime and benzamidoxime. Computationally, we report on the performance of several protocols for predicting the pKa values of aqueous oxoacids. Calculations carried out at the MP2 or M06-2X levels of theory combined with solvent effects calculated using the SMD model provide the best overall performance with a mean absolute error of 0.33 pKa units and 0.35 pKa units, respectively, and a root mean square deviation of 0.46 pKa units and 0.45 pKa units, respectively. Finally, we employ our two best methods to predict the pKa values of promising, uncharacterized amidoxime ligands. Hence, our study provides a convenient means for screening suitable amidoxime monomers for future generations of poly(acrylamidoxime) adsorbents used to mine uranium from seawater.
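
    The bridge from the electronic-structure side (MP2/M06-2X with SMD solvation) to a pKa is the standard thermodynamic relation pKa = ΔG_aq/(RT ln 10), where ΔG_aq is the aqueous deprotonation free energy. A minimal sketch; the free-energy value is illustrative, not a number from the paper:

    ```python
    import math

    R_KCAL = 1.987204e-3   # gas constant, kcal/(mol*K)
    T = 298.15             # standard temperature, K

    def pka_from_dg(dg_kcal_per_mol):
        """pKa = dG_aq / (RT ln 10) for a computed deprotonation free energy."""
        return dg_kcal_per_mol / (R_KCAL * T * math.log(10))

    # Illustrative deprotonation free energy of 16 kcal/mol:
    print(round(pka_from_dg(16.0), 2))
    ```

    Since RT ln 10 is about 1.36 kcal/mol at 298 K, the paper's mean absolute error of ~0.33 pKa units corresponds to under half a kcal/mol in the underlying free energy, which is why the choice of solvation model matters so much.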

  10. Acidity of the amidoxime functional group in aqueous solution. A combined experimental and computational study

    SciTech Connect

    Mehio, Nada; Lashely, Mark A.; Nugent, Joseph W.; Tucker, Lyndsay; Correia, Bruna; Do-Thanh, Chi-Linh; Dai, Sheng; Hancock, Robert D.; Bryantsev, Vyacheslav S.

    2015-01-26

    Poly(acrylamidoxime) adsorbents are often invoked in discussions of mining uranium from seawater. It has been demonstrated repeatedly in the literature that the success of these materials is due to the amidoxime functional group. While the amidoxime-uranyl chelation mode has been established, a number of essential binding constants remain unclear. This is largely due to the wide range of conflicting pKa values that have been reported for the amidoxime functional group in the literature. To resolve this existing controversy we investigated the pKa values of the amidoxime functional group using a combination of experimental and computational methods. Experimentally, we used spectroscopic titrations to measure the pKa values of representative amidoximes, acetamidoxime and benzamidoxime. Computationally, we report on the performance of several protocols for predicting the pKa values of aqueous oxoacids. Calculations carried out at the MP2 or M06-2X levels of theory combined with solvent effects calculated using the SMD model provide the best overall performance with a mean absolute error of 0.33 pKa units and 0.35 pKa units, respectively, and a root mean square deviation of 0.46 pKa units and 0.45 pKa units, respectively. Finally, we employ our two best methods to predict the pKa values of promising, uncharacterized amidoxime ligands. Hence, our study provides a convenient means for screening suitable amidoxime monomers for future generations of poly(acrylamidoxime) adsorbents used to mine uranium from seawater.

  11. Computer Simulations Reveal Multiple Functions for Aromatic Residues in Cellulase Enzymes (Fact Sheet)

    SciTech Connect

    Not Available

    2012-07-01

    NREL researchers use high-performance computing to demonstrate fundamental roles of aromatic residues in cellulase enzyme tunnels. National Renewable Energy Laboratory (NREL) computer simulations of a key industrial enzyme, the Trichoderma reesei Family 6 cellulase (Cel6A), predict that aromatic residues near the enzyme's active site and at the entrance and exit tunnel perform different functions in substrate binding and catalysis, depending on their location in the enzyme. These results suggest that nature employs aromatic-carbohydrate interactions with a wide variety of binding affinities for diverse functions. Outcomes also suggest that protein engineering strategies in which mutations are made around the binding sites may require tailoring specific to the enzyme family. Cellulase enzymes ubiquitously exhibit tunnels or clefts lined with aromatic residues for processing carbohydrate polymers to monomers, but the molecular-level role of these aromatic residues remains unknown. In silico mutation of the aromatic residues near the catalytic site of Cel6A has little impact on the binding affinity, but simulation suggests that these residues play a major role in the glucopyranose ring distortion necessary for cleaving glycosidic bonds to produce fermentable sugars. Removal of aromatic residues at the entrance and exit of the cellulase tunnel, however, dramatically impacts the binding affinity. This suggests that these residues play a role in acquiring cellulose chains from the cellulose crystal and stabilizing the reaction product, respectively. These results illustrate that the role of aromatic-carbohydrate interactions varies dramatically depending on the position in the enzyme tunnel. 
As aromatic-carbohydrate interactions are present in all carbohydrate-active enzymes, the results have implications for understanding protein structure-function relationships in carbohydrate metabolism and recognition, carbon turnover in nature, and protein engineering strategies for

  12. Management Needs for Computer Support.

    ERIC Educational Resources Information Center

    Irby, Alice J.

    University management has many and varied needs for effective computer services in support of their processing and information functions. The challenge for the computer center managers is to better understand these needs and assist in the development of effective and timely solutions. Management needs can range from accounting and payroll to…

  13. ABINIT: Plane-Wave-Based Density-Functional Theory on High Performance Computers

    NASA Astrophysics Data System (ADS)

    Torrent, Marc

    2014-03-01

    For several years, a continuous effort has been made to adapt electronic structure codes based on Density-Functional Theory to future computing architectures. Among these codes, ABINIT is based on a plane-wave description of the wave functions which allows it to treat systems of any kind. Porting such a code to petascale architectures poses difficulties related to the many-body nature of the DFT equations. To improve the performances of ABINIT - especially for what concerns standard LDA/GGA ground-state and response-function calculations - several strategies have been followed: A full multi-level MPI parallelisation scheme has been implemented, exploiting all possible levels and distributing both computation and memory. It allows the number of distributed processes to be increased and could not have been achieved without a strong restructuring of the code. The core algorithm used to solve the eigenproblem (``Locally Optimal Blocked Conjugate Gradient''), a Blocked-Davidson-like algorithm, is based on a distribution of processes combining plane-waves and bands. In addition to the distributed-memory parallelization, a full hybrid scheme has been implemented, using standard shared-memory directives (OpenMP/OpenACC) or porting some time-consuming code sections to Graphics Processing Units (GPU). As no simple performance model exists, the complexity of use has increased; the code efficiency strongly depends on the distribution of processes among the numerous levels. ABINIT is able to predict the performances of several process distributions and automatically choose the most favourable one. On the other hand, a large effort has been carried out to analyse the performances of the code on petascale architectures, showing which sections of the code have to be improved; they are all related to matrix algebra (diagonalisation, orthogonalisation). The different strategies employed to improve the code scalability will be described.
They are based on an exploration of new diagonalization

  14. Using Data Mining and Computational Approaches to Study Intermediate Filament Structure and Function.

    PubMed

    Parry, David A D

    2016-01-01

    Experimental and theoretical research aimed at determining the structure and function of the family of intermediate filament proteins has made significant advances over the past 20 years. Much of this has either contributed to or relied on the amino acid sequence databases that are now available online, and the data mining approaches that have been developed to analyze these sequences. As the quality of sequence data is generally high, it follows that it is the design of the computational and graphical methodologies that is of especial importance to researchers who aspire to gain a greater understanding of those sequence features that specify both function and structural hierarchy. However, these techniques are necessarily subject to limitations and it is important that these be recognized. In addition, no single method is likely to be successful in solving a particular problem, and a coordinated approach using a suite of methods is generally required. A final step in the process involves the interpretation of the results obtained and the construction of a working model or hypothesis that suggests further experimentation. While such methods allow meaningful progress to be made it is still important that the data are interpreted correctly and conservatively. New data mining methods are continually being developed, and it can be expected that even greater understanding of the relationship between structure and function will be gleaned from sequence data in the coming years.
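
    A concrete example of the kind of sequence-analysis step involved: intermediate filament rod domains are coiled coils built on a heptad repeat (abcdefg)n with hydrophobic residues enriched at positions a and d, and a simple scan can score that periodicity. This is a toy illustration of the idea, not any published tool:

    ```python
    # One-letter codes for broadly hydrophobic residues (illustrative set).
    HYDROPHOBIC = set("AILMFVWY")

    def heptad_score(seq, frame=0):
        """Fraction of heptad 'a'/'d' positions (offsets 0 and 3 in the
        chosen frame) occupied by hydrophobic residues."""
        positions = [i for i in range(len(seq)) if (i - frame) % 7 in (0, 3)]
        hits = sum(seq[i] in HYDROPHOBIC for i in positions)
        return hits / len(positions)

    # An idealized coiled-coil-like repeat scores high; a purely charged
    # repeat scores zero (synthetic sequences, not real IF proteins):
    print(heptad_score("LKELAEK" * 4), heptad_score("KEKEKEK" * 4))
    ```

    Real analyses must also handle frame shifts, stutters and discontinuities in the repeat, which is one reason the abstract stresses that a suite of coordinated methods, carefully interpreted, is needed.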

  15. Novel hold-release functionality in a P300 brain-computer interface

    NASA Astrophysics Data System (ADS)

    Alcaide-Aguirre, R. E.; Huggins, J. E.

    2014-12-01

    Assistive technology control interface theory describes interface activation and interface deactivation as distinct properties of any control interface. Separating control of activation and deactivation allows precise timing of the duration of the activation. Objective. We propose a novel P300 brain-computer interface (BCI) functionality with separate control of the initial activation and the deactivation (hold-release) of a selection. Approach. Using two different layouts and off-line analysis, we tested the accuracy with which subjects could (1) hold their selection and (2) quickly change between selections. Main results. Mean accuracy across all subjects for the hold-release algorithm was 85% with one hold-release classification and 100% with two hold-release classifications. Using a layout designed to lower perceptual errors, accuracy increased to a mean of 90% and the time subjects could hold a selection was 40% longer than with the standard layout. Hold-release functionality provides improved response time (6-16 times faster) over the initial P300 BCI selection by allowing the BCI to make hold-release decisions from very few flashes instead of after multiple sequences of flashes. Significance. For the BCI user, hold-release functionality allows for faster, more continuous control with a P300 BCI, creating new options for BCI applications.
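
    The hold-release logic is separable from the P300 classifier itself: given a stream of per-flash classifier scores for the currently held selection, release can be declared as soon as a few consecutive scores fall below a threshold, which is why decisions need only a handful of flashes rather than full stimulation sequences. A hedged sketch; the decision rule and numbers are illustrative, not the authors' algorithm:

    ```python
    def hold_release(scores, threshold=0.0, n_required=2):
        """Return the index of the flash at which release is detected, or
        None if the selection is held throughout. A score below `threshold`
        is a per-flash vote for 'release'; n_required consecutive votes
        trigger the release."""
        votes = 0
        for i, s in enumerate(scores):
            votes = votes + 1 if s < threshold else 0
            if votes >= n_required:
                return i
        return None

    held = [0.8, 0.6, 0.9, 0.7]        # scores stay high: keep holding
    released = [0.8, 0.7, -0.4, -0.6]  # two low scores in a row: release
    print(hold_release(held), hold_release(released))
    ```

    Requiring more consecutive votes trades response speed for robustness, mirroring the paper's one- versus two-classification accuracy comparison (85% vs 100%).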

  16. Synaptic Efficacy as a Function of Ionotropic Receptor Distribution: A Computational Study

    PubMed Central

    Allam, Sushmita L.; Bouteiller, Jean-Marie C.; Hu, Eric Y.; Ambert, Nicolas; Greget, Renaud; Bischoff, Serge; Baudry, Michel; Berger, Theodore W.

    2015-01-01

    Glutamatergic synapses are the most prevalent functional elements of information processing in the brain. Changes in pre-synaptic activity and in the function of various post-synaptic elements contribute to generate a large variety of synaptic responses. Previous studies have explored postsynaptic factors responsible for regulating synaptic strength variations, but have given far less importance to synaptic geometry, and more specifically to the subcellular distribution of ionotropic receptors. We analyzed the functional effects resulting from changing the subsynaptic localization of ionotropic receptors by using a hippocampal synaptic computational framework. The present study was performed using the EONS (Elementary Objects of the Nervous System) synaptic modeling platform, which was specifically developed to explore the roles of subsynaptic elements as well as their interactions, and that of synaptic geometry. More specifically, we determined the effects of changing the localization of ionotropic receptors relative to the presynaptic glutamate release site, on synaptic efficacy and its variations following single pulse and paired-pulse stimulation protocols. The results indicate that changes in synaptic geometry do have consequences on synaptic efficacy and its dynamics. PMID:26480028

  17. Reproducibility of physiologic parameters obtained using functional computed tomography in mice

    NASA Astrophysics Data System (ADS)

    Krishnamurthi, Ganapathy; Stantz, Keith M.; Steinmetz, Rosemary; Hutchins, Gary D.; Liang, Yun

    2004-04-01

    High-speed X-ray computed tomography (CT) has the potential to observe the transport of iodinated radio-opaque contrast agent (CA) through tissue, enabling the quantification of tissue physiology in organs and tumors. The concentration of iodine in the tissue and in the left ventricle is extracted as a function of time and fit to a compartmental model for physiologic parameter estimation. The reproducibility of the physiologic parameters depends on (1) the image-sampling rate (according to our simulations, 5-second sampling is required for a CA injection rate of 1.0 ml/min) and (2) how faithfully the compartmental model reflects the real tissue function, which it must do to give meaningful results. To verify these limits, a functional CT study was carried out in a group of 3 mice. Dynamic CT scans were performed on all the mice with 0.5 ml/min, 1 ml/min and 2 ml/min CA injection rates. The physiologic parameters were extracted using 4-parameter and 6-parameter two-compartmental models (2CM). Single-factor ANOVA did not indicate a significant difference in perfusion in the kidneys for the different injection rates. The physiologic parameters obtained using the 6-parameter 2CM were in line with literature values, and the 6-parameter model significantly improved the chi-square goodness of fit in two cases.
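    The forward model behind such fits can be illustrated with a minimal sketch. The study's 4- and 6-parameter two-compartment models are more elaborate; the one-compartment convolution below (hypothetical `tissue_curve` helper, with uptake constant K1 and washout rate k2 as the free parameters) only shows the general shape of the model that would be fit to the extracted iodine concentration curves:

```python
import math

def tissue_curve(c_a, dt, K1, k2):
    """One-compartment forward model: C_t(t) = K1 * (C_a convolved with
    exp(-k2 * t)), evaluated by a discrete rectangle-rule convolution.
    Hypothetical helper; the study's 4- and 6-parameter 2CM are larger."""
    c_t = []
    for i in range(len(c_a)):
        acc = 0.0
        for j in range(i + 1):
            acc += c_a[j] * math.exp(-k2 * (i - j) * dt) * dt
        c_t.append(K1 * acc)
    return c_t

# Illustrative values: a constant arterial input drives the tissue curve
# toward the steady state K1 / k2 (here 0.5 / 0.25 = 2.0).
curve = tissue_curve([1.0] * 400, 0.1, 0.5, 0.25)
print(curve[-1])
```

    A fitting routine would adjust K1 and k2 (plus the extra parameters of the fuller models) to minimize the misfit between this predicted curve and the measured one.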

  18. Synaptic Efficacy as a Function of Ionotropic Receptor Distribution: A Computational Study.

    PubMed

    Allam, Sushmita L; Bouteiller, Jean-Marie C; Hu, Eric Y; Ambert, Nicolas; Greget, Renaud; Bischoff, Serge; Baudry, Michel; Berger, Theodore W

    2015-01-01

    Glutamatergic synapses are the most prevalent functional elements of information processing in the brain. Changes in pre-synaptic activity and in the function of various post-synaptic elements contribute to generate a large variety of synaptic responses. Previous studies have explored postsynaptic factors responsible for regulating synaptic strength variations, but have given far less importance to synaptic geometry, and more specifically to the subcellular distribution of ionotropic receptors. We analyzed the functional effects resulting from changing the subsynaptic localization of ionotropic receptors by using a hippocampal synaptic computational framework. The present study was performed using the EONS (Elementary Objects of the Nervous System) synaptic modeling platform, which was specifically developed to explore the roles of subsynaptic elements as well as their interactions, and that of synaptic geometry. More specifically, we determined the effects of changing the localization of ionotropic receptors relative to the presynaptic glutamate release site, on synaptic efficacy and its variations following single pulse and paired-pulse stimulation protocols. The results indicate that changes in synaptic geometry do have consequences on synaptic efficacy and its dynamics.

  19. Functional neuroanatomy of remote episodic, semantic and spatial memory: a unified account based on multiple trace theory

    PubMed Central

    Moscovitch, Morris; Rosenbaum, R Shayna; Gilboa, Asaf; Addis, Donna Rose; Westmacott, Robyn; Grady, Cheryl; McAndrews, Mary Pat; Levine, Brian; Black, Sandra; Winocur, Gordon; Nadel, Lynn

    2005-01-01

    We review lesion and neuroimaging evidence on the role of the hippocampus, and other structures, in retention and retrieval of recent and remote memories. We examine episodic, semantic and spatial memory, and show that important distinctions exist among different types of these memories and the structures that mediate them. We argue that retention and retrieval of detailed, vivid autobiographical memories depend on the hippocampal system no matter how long ago they were acquired. Semantic memories, on the other hand, benefit from hippocampal contribution for some time before they can be retrieved independently of the hippocampus. Even semantic memories, however, can have episodic elements associated with them that continue to depend on the hippocampus. Likewise, we distinguish between experientially detailed spatial memories (akin to episodic memory) and more schematic memories (akin to semantic memory) that are sufficient for navigation but not for re-experiencing the environment in which they were acquired. Like their episodic and semantic counterparts, the former type of spatial memory is dependent on the hippocampus no matter how long ago it was acquired, whereas the latter can survive independently of the hippocampus and is represented in extra-hippocampal structures. In short, the evidence reviewed suggests strongly that the function of the hippocampus (and possibly that of related limbic structures) is to help encode, retain, and retrieve experiences, no matter how long ago the events comprising the experience occurred, and no matter whether the memories are episodic or spatial. We conclude that the evidence favours a multiple trace theory (MTT) of memory over two other models: (1) traditional consolidation models which posit that the hippocampus is a time-limited memory structure for all forms of memory; and (2) versions of cognitive map theory which posit that the hippocampus is needed for representing all forms of allocentric space in memory. PMID

  20. Vibration of isotropic and composite plates using computed shape function and its application to elastic support optimization

    NASA Astrophysics Data System (ADS)

    Kong, Jackson

    2009-10-01

    Vibration of plates with various boundary and internal support conditions is analyzed, based on classical thin-plate theory and the Rayleigh-Ritz approach. To satisfy the support conditions, a new set of admissible functions, namely the computed shape functions, is applied to each of the two orthogonal in-plane directions. Similar to conventional finite element shape functions, parameters associated with each term of the proposed functions represent the actual displacements of the plates, thus making the method easily applicable to a wide range of support conditions, including continuous or partial edge supports and discrete internal supports. The method can also be applied to plates consisting of rectangular segments, like an L-shape plate, whose sub-domains can be formulated using the computed shape functions and subsequently assembled in the usual finite element manner. Unlike many other admissible functions proposed in the literature, however, the computed shape functions presented herein are C1-continuous and involve no complicated mathematical functions; they can be easily computed a priori by means of a continuous beam computer program, and only the conventional third-order beam shape functions are involved in subsequent formulation. In all the examples given herein, only a few terms of these functions are sufficient to obtain accurate frequencies, thus demonstrating the method's computational effectiveness and accuracy. The method is further extended to the study of optimal location and stiffness of discrete elastic supports for maximizing the fundamental frequency of plates. Unlike rigid point supports with infinite stiffness, whose optimal locations have been studied by many researchers, only discrete supports with a finite stiffness are considered in this paper. The optimal location and stiffness of discrete supports are determined for isotropic plates and laminated plates with various stacking sequences, and the results are presented for the first time in

  1. Comparison of measured and computed phase functions of individual tropospheric ice crystals

    NASA Astrophysics Data System (ADS)

    Stegmann, Patrick G.; Tropea, Cameron; Järvinen, Emma; Schnaiter, Martin

    2016-07-01

    Airplanes passing through the incus (Latin: anvil) regions of tropical cumulonimbus clouds are at risk of suffering an engine power-loss event and engine damage due to ice ingestion (Mason et al., 2006 [1]). Research in this field relies on optical measurement methods to characterize ice crystals; however, the design and implementation of such methods presently suffer from the lack of reliable and efficient means of predicting the light scattering from ice crystals. The nascent discipline of direct measurement of phase functions of ice crystals, in conjunction with particle imaging and forward modelling through geometrical-optics and transition-matrix codes, for the first time allows us to obtain a deeper understanding of the optical properties of real tropospheric ice crystals. In this manuscript, a sample phase function obtained via the Particle Habit Imaging and Polar Scattering (PHIPS) probe during a measurement campaign in flight over Brazil is compared with three different light scattering codes: a newly developed first-order geometrical optics code taking into account the influence of the Gaussian beam illumination used in the PHIPS device, the reference ray-tracing code of Macke, and the T-matrix code of Kahnert.

  2. Structure function analysis of serpin super-family: "a computational approach".

    PubMed

    Singh, Poonam; Jairajpuri, Mohamad Aman

    2014-01-01

    Serine protease inhibitors (serpins) are a superfamily of proteins that control the proteinases involved in the inflammation, complement, coagulation and fibrinolytic pathways. Serpins are prone to conformational diseases owing to a complex inhibition mechanism that involves a large-scale conformational change, and point mutations can lead to functional defects. Serpins are associated with diseases like emphysema/cirrhosis, angioedema, familial dementia, chronic obstructive bronchitis and thrombosis. Serpin-polymerization-based pathologies are fairly widespread, and devising a cure has been difficult due to a lack of clarity regarding the mechanism. A serpin can exist in various conformational states and has variable cofactor-binding ability. Large genome and proteome databases are available that can be utilized to gain critical insight into serpin structure, mechanism and defects. Comprehensive computational studies of the serpin family are lacking; most of the work done to date is limited and deals mostly with a few individual serpins. We have analyzed several aspects of this family using diverse computational biology tools and have shown the following: (a) a residue-burial-linked shift in conformational stability is a major factor in increasing polymer propensity in serpins; (b) amino acids involved in polymerization are in general completely buried in the native conformation; (c) an isozyme-specific antithrombin study showed the structural basis of improved heparin binding to beta-antithrombin compared with alpha-antithrombin; (d) a comprehensive cavity analysis showed its importance in inhibition and polymerization; and finally (e) an interface analysis of various serpin-protease complexes identified critical evolutionarily conserved exosite residues that determine protease specificity. This work introduces the problem and emphasizes the need for in-depth computational studies of the serpin superfamily.

  3. Parallel-META 2.0: Enhanced Metagenomic Data Analysis with Functional Annotation, High Performance Computing and Advanced Visualization

    PubMed Central

    Song, Baoxing; Xu, Jian; Ning, Kang

    2014-01-01

    The metagenomic method directly sequences and analyses genome information from microbial communities. The main computational tasks for metagenomic analyses include taxonomical and functional structure analysis for all genomes in a microbial community (also referred to as a metagenomic sample). With the advancement of Next Generation Sequencing (NGS) techniques, the number of metagenomic samples and the data size for each sample are increasing rapidly. Current metagenomic analysis is both data- and computation-intensive, especially when there are many species in a metagenomic sample, and each has a large number of sequences. As such, metagenomic analyses require extensive computational power. The increasing analytical requirements further augment the challenges for computation analysis. In this work, we have proposed Parallel-META 2.0, a metagenomic analysis software package, to cope with such needs for efficient and fast analyses of taxonomical and functional structures for microbial communities. Parallel-META 2.0 is an extended and improved version of Parallel-META 1.0, which enhances the taxonomical analysis using multiple databases, improves computation efficiency by optimized parallel computing, and supports interactive visualization of results in multiple views. Furthermore, it enables functional analysis for metagenomic samples including short-reads assembly, gene prediction and functional annotation. Therefore, it could provide accurate taxonomical and functional analyses of the metagenomic samples in a high-throughput manner and on a large scale. PMID:24595159

  4. Parallel-META 2.0: enhanced metagenomic data analysis with functional annotation, high performance computing and advanced visualization.

    PubMed

    Su, Xiaoquan; Pan, Weihua; Song, Baoxing; Xu, Jian; Ning, Kang

    2014-01-01

    The metagenomic method directly sequences and analyses genome information from microbial communities. The main computational tasks for metagenomic analyses include taxonomical and functional structure analysis for all genomes in a microbial community (also referred to as a metagenomic sample). With the advancement of Next Generation Sequencing (NGS) techniques, the number of metagenomic samples and the data size for each sample are increasing rapidly. Current metagenomic analysis is both data- and computation-intensive, especially when there are many species in a metagenomic sample, and each has a large number of sequences. As such, metagenomic analyses require extensive computational power. The increasing analytical requirements further augment the challenges for computation analysis. In this work, we have proposed Parallel-META 2.0, a metagenomic analysis software package, to cope with such needs for efficient and fast analyses of taxonomical and functional structures for microbial communities. Parallel-META 2.0 is an extended and improved version of Parallel-META 1.0, which enhances the taxonomical analysis using multiple databases, improves computation efficiency by optimized parallel computing, and supports interactive visualization of results in multiple views. Furthermore, it enables functional analysis for metagenomic samples including short-reads assembly, gene prediction and functional annotation. Therefore, it could provide accurate taxonomical and functional analyses of the metagenomic samples in a high-throughput manner and on a large scale.

  5. Functional near-infrared spectroscopy for adaptive human-computer interfaces

    NASA Astrophysics Data System (ADS)

    Yuksel, Beste F.; Peck, Evan M.; Afergan, Daniel; Hincks, Samuel W.; Shibata, Tomoki; Kainerstorfer, Jana; Tgavalekos, Kristen; Sassaroli, Angelo; Fantini, Sergio; Jacob, Robert J. K.

    2015-03-01

    We present a brain-computer interface (BCI) that detects, analyzes and responds to user cognitive state in real-time using machine learning classifications of functional near-infrared spectroscopy (fNIRS) data. Our work is aimed at increasing the narrow communication bandwidth between the human and computer by implicitly measuring users' cognitive state without any additional effort on the part of the user. Traditionally, BCIs have been designed to explicitly send signals as the primary input. However, such systems are usually designed for people with severe motor disabilities and are too slow and inaccurate for the general population. In this paper, we demonstrate with previous work1 that a BCI that implicitly measures cognitive workload can improve user performance and awareness compared to a control condition by adapting to user cognitive state in real-time. We also discuss some of the other applications we have used in this field to measure and respond to cognitive states such as cognitive workload, multitasking, and user preference.

  6. Technical Report: Toward a Scalable Algorithm to Compute High-Dimensional Integrals of Arbitrary Functions

    SciTech Connect

    Snyder, Abigail C.; Jiao, Yu

    2010-10-01

    Neutron experiments at the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL) frequently generate large amounts of data (on the order of 10^6 to 10^12 data points). Hence, traditional data analysis tools run on a single CPU take too long to be practical and scientists are unable to efficiently analyze all data generated by experiments. Our goal is to develop a scalable algorithm to efficiently compute high-dimensional integrals of arbitrary functions. This algorithm can then be used to integrate the four-dimensional integrals that arise as part of modeling intensity from the experiments at the SNS. Here, three different one-dimensional numerical integration solvers from the GNU Scientific Library were modified and implemented to solve four-dimensional integrals. The results of these solvers on a final integrand provided by scientists at the SNS can be compared to the results of other methods, such as quasi-Monte Carlo methods, computing the same integral. A parallelized version of the most efficient method can allow scientists the opportunity to more effectively analyze all experimental data.
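    As a concrete, if simplified, illustration of the quasi-Monte Carlo alternative the report compares against, a four-dimensional integral over the unit hypercube can be estimated with a Halton low-discrepancy sequence. The integrand and point count below are illustrative, not the SNS integrand:

```python
def halton(i, base):
    """i-th element (1-indexed) of the van der Corput sequence in `base`."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def qmc_integrate_4d(f, n):
    """Quasi-Monte Carlo estimate of the integral of f over [0,1]^4,
    using a 4-dimensional Halton sequence (one prime base per axis)."""
    bases = (2, 3, 5, 7)
    total = 0.0
    for i in range(1, n + 1):
        total += f(*(halton(i, b) for b in bases))
    return total / n

# Integral of x*y*z*w over the unit hypercube is (1/2)^4 = 0.0625.
est = qmc_integrate_4d(lambda x, y, z, w: x * y * z * w, 4096)
print(est)
```

    Low-discrepancy points converge faster than plain pseudo-random sampling for smooth integrands, which is why quasi-Monte Carlo is a natural baseline for nested 1-D quadrature schemes.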

  7. Training Older Adults to Use Tablet Computers: Does It Enhance Cognitive Function?

    PubMed Central

    Chan, Micaela Y.; Haber, Sara; Drew, Linda M.; Park, Denise C.

    2016-01-01

    Purpose of the Study: Recent evidence shows that engaging in learning new skills improves episodic memory in older adults. In this study, older adults who were computer novices were trained to use a tablet computer and associated software applications. We hypothesize that sustained engagement in this mentally challenging training would yield a dual benefit of improved cognition and enhancement of everyday function by introducing useful skills. Design and Methods: A total of 54 older adults (age 60-90) committed 15 hr/week for 3 months. Eighteen participants received extensive iPad training, learning a broad range of practical applications. The iPad group was compared with 2 separate controls: a Placebo group that engaged in passive tasks requiring little new learning; and a Social group that had regular social interaction, but no active skill acquisition. All participants completed the same cognitive battery pre- and post-engagement. Results: Compared with both controls, the iPad group showed greater improvements in episodic memory and processing speed but did not differ in mental control or visuospatial processing. Implications: iPad training improved cognition relative to engaging in social or nonchallenging activities. Mastering relevant technological devices has the added advantage of providing older adults with technological skills useful in facilitating everyday activities (e.g., banking). This work informs the selection of targeted activities for future interventions and community programs. PMID:24928557

  8. Computational modeling of heterogeneity and function of CD4+ T cells

    PubMed Central

    Carbo, Adria; Hontecillas, Raquel; Andrew, Tricity; Eden, Kristin; Mei, Yongguo; Hoops, Stefan; Bassaganya-Riera, Josep

    2014-01-01

    The immune system is composed of many different cell types and hundreds of intersecting molecular pathways and signals. This large biological complexity requires coordination between distinct pro-inflammatory and regulatory cell subsets to respond to infection while maintaining tissue homeostasis. CD4+ T cells play a central role in orchestrating immune responses and in maintaining a balance between pro- and anti- inflammatory responses. This tight balance between regulatory and effector reactions depends on the ability of CD4+ T cells to modulate distinct pathways within large molecular networks, since dysregulated CD4+ T cell responses may result in chronic inflammatory and autoimmune diseases. The CD4+ T cell differentiation process comprises an intricate interplay between cytokines, their receptors, adaptor molecules, signaling cascades and transcription factors that help delineate cell fate and function. Computational modeling can help to describe, simulate, analyze, and predict some of the behaviors in this complicated differentiation network. This review provides a comprehensive overview of existing computational immunology methods as well as novel strategies used to model immune responses with a particular focus on CD4+ T cell differentiation. PMID:25364738

  9. Development of computer-aided functions in clinical neurosurgery with PACS

    NASA Astrophysics Data System (ADS)

    Mukasa, Minoru; Aoki, Makoto; Satoh, Minoru; Kowada, Masayoshi; Kikuchi, K.

    1991-07-01

    The introduction of the "Picture Archiving and Communications System (known as PACS)" provides many benefits, including the application of C.A.D. (Computer-Aided Diagnosis). Clinically, this allows the measurement and design of an operation to be completed easily on the CRT monitors of PACS rather than on film, as has been customary in the past. Under the leadership of the Department of Neurosurgery, Akita University School of Medicine, and the Southern Tohoku Research Institute for Neuroscience, Koriyama, new computer-aided functions with EFPACS (Fuji Electric's PACS) have been developed for use in clinical neurosurgery. The image processing is composed of three parts, as follows: (1) automatic mapping of small lesions depicted on Magnetic Resonance (MR) images onto the brain atlas; (2) superimposition of two angiographic films onto a single synthesized image; (3) automatic mapping of the lesion's position (as shown on the CT images) onto the image produced in part (2). The processing in part (1) provides a reference for anatomical estimation. The processing in part (2) is used for general analysis of the condition of a disease. The processing in part (3) is used to design the operation. This image processing is currently being used with good results.

  10. Comparison of functional MRI image realignment tools using a computer-generated phantom.

    PubMed

    Morgan, V L; Pickens, D R; Hartmann, S L; Price, R R

    2001-09-01

    This study discusses the development of a computer-generated phantom to compare the effects of image realignment programs on functional MRI (fMRI) pixel activation. The phantom is a whole-head MRI volume with added random noise, activation, and motion. It allows simulation of realistic head motions with controlled areas of activation. Without motion, the phantom shows the effects of realignment on motion-free data sets. Prior to realignment, the phantom illustrates some activation corruption due to motion. Finally, three widely used realignment packages are examined. The results showed that the most accurate algorithms are able to increase specificity through accurate realignment while maintaining sensitivity through effective resampling techniques. In fact, accurate realignment alone is not a powerful indicator of the most effective algorithm in terms of true activation.

  11. Computing frequency by using generalized zero-crossing applied to intrinsic mode functions

    NASA Technical Reports Server (NTRS)

    Huang, Norden E. (Inventor)

    2006-01-01

    This invention presents a method for computing Instantaneous Frequency by applying Empirical Mode Decomposition to a signal and using Generalized Zero-Crossing (GZC) and Extrema Sifting. The GZC approach is the most direct, local, and also the most accurate in the mean. Furthermore, this approach will also give a statistical measure of the scattering of the frequency value. For most practical applications, this mean frequency localized down to a quarter of a wave period is already a well-accepted result. As this method physically measures the period, or part of it, the values obtained can serve as the best local mean over the period to which it applies. Through Extrema Sifting, instead of the cubic spline fitting, this invention constructs the upper envelope and the lower envelope by connecting local maxima points and local minima points of the signal with straight lines, respectively, when extracting a collection of Intrinsic Mode Functions (IMFs) from a signal under consideration.
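    The core idea of estimating frequency from zero crossings can be sketched simply. This is not the patented GZC procedure (which also folds in extrema and reports scatter statistics over several nested periods); it is a minimal zero-crossing frequency estimate for illustration, with the function name chosen here:

```python
import math

def zero_crossing_frequency(samples, dt):
    """Estimate the mean frequency (Hz) of an oscillatory signal from its
    zero crossings: each pair of consecutive crossings spans half a period."""
    crossings = []
    for i in range(len(samples) - 1):
        a, b = samples[i], samples[i + 1]
        if a == 0.0 or a * b < 0.0:  # sign change between samples
            # linear interpolation to locate the crossing time
            t = i * dt + dt * (0.0 - a) / (b - a) if b != a else i * dt
            crossings.append(t)
    if len(crossings) < 2:
        return None
    half_periods = [t2 - t1 for t1, t2 in zip(crossings, crossings[1:])]
    return 1.0 / (2.0 * sum(half_periods) / len(half_periods))

# A 5 Hz sine sampled at 1 kHz should yield an estimate close to 5 Hz.
dt = 0.001
sig = [math.sin(2 * math.pi * 5.0 * i * dt) for i in range(1000)]
print(zero_crossing_frequency(sig, dt))
```

    Because each half-period is measured directly, the per-interval values also give the scatter of the local frequency, which is the statistical measure the GZC approach emphasizes.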

  12. Computed myography: three-dimensional reconstruction of motor functions from surface EMG data

    NASA Astrophysics Data System (ADS)

    van den Doel, Kees; Ascher, Uri M.; Pai, Dinesh K.

    2008-12-01

    We describe a methodology called computed myography to qualitatively and quantitatively determine the activation level of individual muscles by voltage measurements from an array of voltage sensors on the skin surface. A finite element model for electrostatics simulation is constructed from morphometric data. For the inverse problem, we utilize a generalized Tikhonov regularization. This imposes smoothness on the reconstructed sources inside the muscles and suppresses sources outside the muscles using a penalty term. Results from experiments with simulated and human data are presented for activation reconstructions of three muscles in the upper arm (biceps brachii, brachialis and triceps). This approach potentially offers a new clinical tool to sensitively assess muscle function in patients suffering from neurological disorders (e.g., spinal cord injury), and could more accurately guide advances in the evaluation of specific rehabilitation training regimens.
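    The generalized Tikhonov step can be sketched in a few lines: minimize ||Ax - b||^2 + lam ||Lx||^2 via its normal equations. The operator L, matrix sizes, and data below are illustrative toys; the paper's finite element forward model and muscle-specific penalty are far larger:

```python
def solve(M, v):
    """Solve M x = v by Gaussian elimination with partial pivoting."""
    n = len(v)
    A = [row[:] + [v[i]] for i, row in enumerate(M)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(c + 1, n):
            m = A[r][c] / A[c][c]
            for k in range(c, n + 1):
                A[r][k] -= m * A[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (A[r][n] - sum(A[r][k] * x[k] for k in range(r + 1, n))) / A[r][r]
    return x

def tikhonov(A, b, L, lam):
    """Minimize ||Ax - b||^2 + lam * ||Lx||^2 by solving the normal
    equations (A^T A + lam L^T L) x = A^T b."""
    m, n = len(A), len(A[0])
    AtA = [[sum(A[r][i] * A[r][j] for r in range(m)) for j in range(n)] for i in range(n)]
    LtL = [[sum(L[r][i] * L[r][j] for r in range(len(L))) for j in range(n)] for i in range(n)]
    M = [[AtA[i][j] + lam * LtL[i][j] for j in range(n)] for i in range(n)]
    Atb = [sum(A[r][i] * b[r] for r in range(m)) for i in range(n)]
    return solve(M, Atb)
```

    With lam = 0 this is ordinary least squares; increasing lam trades data fit for smoothness of the reconstructed sources, which is how the penalty suppresses activity outside the muscles.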

  13. Computational genomic identification and functional reconstitution of plant natural product biosynthetic pathways

    PubMed Central

    2016-01-01

    Covering: 2003 to 2016. The last decade has seen the first major discoveries regarding the genomic basis of plant natural product biosynthetic pathways. Four key computationally driven strategies have been developed to identify such pathways, which make use of physical clustering, co-expression, evolutionary co-occurrence and epigenomic co-regulation of the genes involved in producing a plant natural product. Here, we discuss how these approaches can be used for the discovery of plant biosynthetic pathways encoded by both chromosomally clustered and non-clustered genes. Additionally, we will discuss opportunities to prioritize plant gene clusters for experimental characterization, and end with a forward-looking perspective on how synthetic biology technologies will allow effective functional reconstitution of candidate pathways using a variety of genetic systems. PMID:27321668

  14. Computational design of intrinsic molecular rectifiers based on asymmetric functionalization of N-phenylbenzamide

    SciTech Connect

    Ding, Wendu; Koepf, Matthieu; Koenigsmann, Christopher; Batra, Arunabh; Venkataraman, Latha; Negre, Christian F. A.; Brudvig, Gary W.; Crabtree, Robert H.; Schmuttenmaer, Charles A.; Batista, Victor S.

    2015-12-08

    Here, we report a systematic computational search of molecular frameworks for intrinsic rectification of electron transport. The screening of molecular rectifiers includes 52 molecules and conformers spanning over 9 series of structural motifs. N-Phenylbenzamide is found to be a promising framework with both suitable conductance and rectification properties. A targeted screening performed on 30 additional derivatives and conformers of N-phenylbenzamide yielded enhanced rectification based on asymmetric functionalization. We demonstrate that electron-donating substituent groups that maintain an asymmetric distribution of charge in the dominant transport channel (e.g., HOMO) enhance rectification by raising the channel closer to the Fermi level. These findings are particularly valuable for the design of molecular assemblies that could ensure directionality of electron transport in a wide range of applications, from molecular electronics to catalytic reactions.

  15. Computational design of intrinsic molecular rectifiers based on asymmetric functionalization of N-phenylbenzamide

    DOE PAGES

    Ding, Wendu; Koepf, Matthieu; Koenigsmann, Christopher; Batra, Arunabh; Venkataraman, Latha; Negre, Christian F. A.; Brudvig, Gary W.; Crabtree, Robert H.; Schmuttenmaer, Charles A.; Batista, Victor S.

    2015-11-03

    Here, we report a systematic computational search of molecular frameworks for intrinsic rectification of electron transport. The screening of molecular rectifiers includes 52 molecules and conformers spanning over 9 series of structural motifs. N-Phenylbenzamide is found to be a promising framework with both suitable conductance and rectification properties. A targeted screening performed on 30 additional derivatives and conformers of N-phenylbenzamide yielded enhanced rectification based on asymmetric functionalization. We demonstrate that electron-donating substituent groups that maintain an asymmetric distribution of charge in the dominant transport channel (e.g., HOMO) enhance rectification by raising the channel closer to the Fermi level. These findings are particularly valuable for the design of molecular assemblies that could ensure directionality of electron transport in a wide range of applications, from molecular electronics to catalytic reactions.

  17. Computational Simulation of a Simple Pendulum Driven by a Natural Chaotic Function

    NASA Astrophysics Data System (ADS)

    Tomesh, Trevor

    2010-03-01

    A simple pendulum is computationally modeled and driven according to the natural non-linear dynamical functions that arise out of the Hodgkin-Huxley membrane model of squid giant axons. Driving a neural membrane with a sinusoidal current can stimulate chaotic potential oscillations that can be modeled mathematically. The solution of the Hodgkin-Huxley membrane model provides the amplitude of the impulse to the simple pendulum at the lowest point in its swing. The phase-space plots of a simple harmonic oscillator, a randomly driven chaotic oscillator, and a Hodgkin-Huxley-driven chaotic oscillator are compared. The similarities and differences between the motion of the pendulum resulting from the Hodgkin-Huxley driving impulse and from a random impulse are explored.
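
    The driving scheme above (an impulse applied each time the pendulum passes its lowest point) can be sketched generically. Here a simple periodic callable stands in for the Hodgkin-Huxley impulse amplitude, since the HH solution itself is not reproduced; damping, time step, and kick size are illustrative assumptions.

```python
import math

def simulate(kick, steps=20000, dt=0.001, q=0.5):
    """Damped pendulum kicked at its lowest point (theta crossing zero
    with positive velocity). `kick(n)` supplies the amplitude of the
    n-th impulse; in the paper this would come from the Hodgkin-Huxley
    solution. Returns the (theta, omega) trajectory."""
    theta, omega = 0.2, 0.0
    traj = []
    n_kick = 0
    for _ in range(steps):
        alpha = -math.sin(theta) - q * omega   # gravity + damping
        omega += alpha * dt
        prev = theta
        theta += omega * dt
        if prev < 0.0 <= theta:                # upward lowest-point crossing
            omega += kick(n_kick)              # apply driving impulse
            n_kick += 1
        traj.append((theta, omega))
    return traj

# a periodic stand-in for the Hodgkin-Huxley amplitude:
traj = simulate(lambda n: 0.1 * math.sin(0.7 * n))
```

    Plotting theta against omega from `traj` gives the phase-space portrait the abstract compares across driving schemes.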

  18. Localization of functional adrenal tumors by computed tomography and venous sampling

    SciTech Connect

    Dunnick, N.R.; Doppman, J.L.; Gill, J.R. Jr.; Strott, C.A.; Keiser, H.R.; Brennan, M.F.

    1982-02-01

    Fifty-eight patients with functional lesions of the adrenal glands underwent radiographic evaluation. Twenty-eight patients had primary aldosteronism (Conn syndrome), 20 had Cushing syndrome, and 10 had pheochromocytoma. Computed tomography (CT) correctly identified adrenal tumors in 11 (61%) of 18 patients with aldosteronomas, 6 of 6 patients with benign cortisol-producing adrenal tumors, and 5 (83%) of 6 patients with pheochromocytomas. No false-positive diagnoses were encountered among patients with adrenal adenomas. Bilateral adrenal hyperplasia appeared on CT scans as normal or prominent adrenal glands with a normal configuration; however, CT was not able to exclude the presence of small adenomas. Adrenal venous sampling was correct in each case, and reliably distinguished adrenal tumors from hyperplasia. Recurrent pheochromocytomas were the most difficult to localize on CT due to the surgical changes in the region of the adrenals and the frequent extra-adrenal locations.

  19. Computational Design of Intrinsic Molecular Rectifiers Based on Asymmetric Functionalization of N-Phenylbenzamide.

    PubMed

    Ding, Wendu; Koepf, Matthieu; Koenigsmann, Christopher; Batra, Arunabh; Venkataraman, Latha; Negre, Christian F A; Brudvig, Gary W; Crabtree, Robert H; Schmuttenmaer, Charles A; Batista, Victor S

    2015-12-01

    We report a systematic computational search of molecular frameworks for intrinsic rectification of electron transport. The screening of molecular rectifiers includes 52 molecules and conformers spanning over 9 series of structural motifs. N-Phenylbenzamide is found to be a promising framework with both suitable conductance and rectification properties. A targeted screening performed on 30 additional derivatives and conformers of N-phenylbenzamide yielded enhanced rectification based on asymmetric functionalization. We demonstrate that electron-donating substituent groups that maintain an asymmetric distribution of charge in the dominant transport channel (e.g., HOMO) enhance rectification by raising the channel closer to the Fermi level. These findings are particularly valuable for the design of molecular assemblies that could ensure directionality of electron transport in a wide range of applications, from molecular electronics to catalytic reactions.

  20. Accountability Overboard

    ERIC Educational Resources Information Center

    Chieppo, Charles D.; Gass, James T.

    2009-01-01

    This article reports that special interest groups opposed to charter schools and high-stakes testing have hijacked Massachusetts's once-independent board of education and stand poised to water down the Massachusetts Comprehensive Assessment System (MCAS) tests and the accountability system they support. President Barack Obama and Massachusetts…

  1. Painless Accountability.

    ERIC Educational Resources Information Center

    Brown, R. W.; And Others

    The computerized Painless Accountability System is a performance objective system from which instructional programs are developed. Three main simplified behavioral response levels characterize this system: (1) the cognitive, (2) the psychomotor, and (3) the affective domains. Each of these objectives is classified by one of 16 descriptors. The second major…

  2. Multi-Rate Mass Transfer : Computing the Memory Function Using Micro-Tomographic Images

    NASA Astrophysics Data System (ADS)

    Gouze, P.; Melean, Y.; Leborgne, T.; Carrera, J.

    2006-12-01

    Several in situ and laboratory experiments display strongly dissymmetrical breakthrough curves (BTC), ending with a concentration decrease over time close to C(t) ~ t ^{-γ}. Matrix diffusion is a widely recognized process producing this class of non-Fickian transport behavior, characterized by an apparently infinite variance of the temporal distribution. The matrix-diffusion sink/source term in the macroscopic advection-dispersion transport equation can be expressed as the convolution product of a memory function G(t) with the concentration measured in the mobile (advective) part of the aquifer. A memory function displaying a power-law decrease ~ t ^{1-γ} at early time can be obtained by assuming an immobile domain made of single-diffusion-length structures, such as spheres or slabs. Indeed, diffusion in a distribution of spheres of different sizes may produce a large spectrum of power-law memory functions. However, the structure of the immobile domain of real rocks is generally completely different from such sphere assemblies. Here, we present a method for calculating the true memory function of heterogeneous structures (reef calcareous rocks) using 3D X-ray micro-tomography images of rock samples. Several steps of data processing are required to quantify precisely the structure, the porosity distribution, and the properties of the mobile/immobile interface before solving the diffusion problem (here using a random-walk approach). In parallel, tracer experiments (at the meter scale) are performed in the same medium. The obtained BTCs display a long-tailed decrease over several orders of magnitude. Using very few assumptions, we compute memory functions (measured on centimeter-scale samples) similar to those expected to control the BTCs at the meter scale. Results show that the memory function is strongly controlled by the diffusivity distribution in the matrix and, to a lesser extent, by the mobile-immobile interface geometry, so that power-law exponents of the BTC tail
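
    The sink/source term described above is a convolution of the memory function with the mobile-zone concentration. A minimal discrete sketch follows, using an illustrative power-law kernel rather than one derived from tomography; the exponent, time step, and injection pulse are all assumptions for the sketch.

```python
def convolve_memory(g, c, dt):
    """Discrete form of the mobile/immobile exchange term
    (g * c)(t) = integral_0^t g(t - s) c(s) ds in the MRMT equation."""
    return [dt * sum(g[i - j] * c[j] for j in range(i + 1))
            for i in range(len(c))]

dt = 0.1
n = 100
# illustrative power-law memory kernel (exponent chosen for the sketch):
g = [(dt * (k + 1)) ** -0.5 for k in range(n)]
# short injection pulse of mobile-zone concentration:
c = [1.0 if k < 5 else 0.0 for k in range(n)]
flux = convolve_memory(g, c, dt)
# the exchange flux inherits the kernel's power-law late-time tail
```

    After the pulse ends, the flux decays like the kernel itself, which is how a power-law memory function imprints a power-law BTC tail.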

  3. Beta-1 integrin-mediated adhesion may be initiated by multiple incomplete bonds, thus accounting for the functional importance of receptor clustering.

    PubMed

    Vitte, Joana; Benoliel, Anne-Marie; Eymeric, Philippe; Bongrand, Pierre; Pierres, Anne

    2004-06-01

    The regulation of cell integrin receptors involves modulation of membrane expression, shifts between different affinity states, and topographical redistribution on the cell membrane. Here we attempted to assess quantitatively the functional importance of receptor clustering. We studied beta-1 integrin-mediated attachment of THP-1 cells to fibronectin-coated surfaces under low shear flow. Cells displayed multiple binding events with a half-life on the order of 1 s. The duration of binding events after the first second after arrest was quantitatively accounted for by a model assuming the existence of a short-lived intermediate binding state with a 3.6 s⁻¹ dissociation rate and a 1.3 s⁻¹ transition frequency toward a more stable state. Cell binding to surfaces coated with lower fibronectin densities was concluded to be mediated by single molecular interactions, whereas multiple bonds were formed <1 s after contact with higher fibronectin surface densities. Cell treatment with microfilament inhibitors or a neutral anti-integrin antibody decreased bond number without changing the aforementioned kinetic parameters, whereas a function-enhancing antibody increased the rate of bond formation and/or the lifetime of the intermediate state. Receptor aggregation was induced by treating cells with a neutral anti-integrin antibody and anti-immunoglobulin antibodies. A semiquantitative confocal microscopy study suggested that this treatment increased by 40% to 100% the average number of integrin receptors located in a volume of approximately 0.045 μm³ surrounding each integrin. This aggregation induced up to a 2.7-fold increase in the average number of bonds. Flow cytometric analysis of fluorescent ligand binding showed that THP-1 cells displayed low-affinity beta-1 integrins with a dissociation constant in the micromolar range. It is concluded that the initial step of cell adhesion was mediated by multiple incomplete bonds rather than a single equilibrium-state ligand receptor
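
    The two-rate model above implies a branching ratio between dissociation and stabilization of the intermediate state. A small Monte Carlo sketch using the quoted rates (3.6 s⁻¹ and 1.3 s⁻¹); modeling the two fates as competing exponential waiting times is the standard first-order assumption, not something stated in the abstract.

```python
import random

K_OFF = 3.6   # s^-1, dissociation from the intermediate state (from the abstract)
K_STAB = 1.3  # s^-1, transition toward the more stable state (from the abstract)

def simulate_bonds(n, seed=1):
    """Monte Carlo over single intermediate bonds: each bond either
    dissociates (rate K_OFF) or is stabilized (rate K_STAB), whichever
    exponential clock fires first. Returns the stabilized fraction."""
    rng = random.Random(seed)
    stabilized = 0
    for _ in range(n):
        t_off = rng.expovariate(K_OFF)
        t_stab = rng.expovariate(K_STAB)
        if t_stab < t_off:
            stabilized += 1
    return stabilized / n

frac = simulate_bonds(100_000)
# analytic branching ratio: K_STAB / (K_STAB + K_OFF) = 1.3 / 4.9 ≈ 0.265
```

    So under this model roughly a quarter of arrests would mature into the stable state, which is consistent with most observed binding events being short-lived.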

  4. Brain-computer interface using a simplified functional near-infrared spectroscopy system.

    PubMed

    Coyle, Shirley M; Ward, Tomás E; Markham, Charles M

    2007-09-01

    A brain-computer interface (BCI) is a device that allows a user to communicate with external devices through thought processes alone. A novel signal acquisition tool for BCIs is near-infrared spectroscopy (NIRS), an optical technique to measure localized cortical brain activity. The benefits of using this non-invasive modality are safety, portability and accessibility. A number of commercial multi-channel NIRS systems are available; however, we have developed a straightforward custom-built system to investigate the functionality of a fNIRS-BCI system. This work describes the construction of the device, the principles of operation and the implementation of a fNIRS-BCI application, 'Mindswitch', that harnesses motor imagery for control. Analysis is performed online and feedback of performance is presented to the user. Mindswitch presents a basic 'on/off' switching option to the user, where selection of either state takes 1 min. Initial results show that fNIRS can support simple BCI functionality and shows much potential. Although performance may currently be inferior to many EEG systems, there is much scope for development, particularly with more sophisticated signal processing and classification techniques. We hope that by presenting fNIRS as an accessible and affordable option, a new avenue of exploration will open within the BCI research community and stimulate further research in fNIRS-BCIs. PMID:17873424

  5. Characterizing Molecular Structure by Combining Experimental Measurements with Density Functional Theory Computations

    NASA Astrophysics Data System (ADS)

    Lopez-Encarnacion, Juan M.

    2016-06-01

    In this talk, the power and synergy of combining experimental measurements with density functional theory computations as a single tool to unambiguously characterize the molecular structure of complex atomic systems are shown. Here, we present three cases where experiment and theory are in very good agreement for both finite and extended systems: 1) Characterizing Metal Coordination Environments in Porous Organic Polymers: A Joint Density Functional Theory and Experimental Infrared Spectroscopy Study; 2) Characterization of Rhenium Compounds Obtained by Electrochemical Synthesis After Aging Process; and 3) Infrared Study of H(D)2 + Co4+ Chemical Reaction: Characterizing Molecular Structures. References: J.M. López-Encarnación, K.K. Tanabe, M.J.A. Johnson, J. Jellinek, Chemistry-A European Journal 19 (41), 13646-13651; A. Vargas-Uscategui, E. Mosquera, J.M. López-Encarnación, B. Chornik, R. S. Katiyar, L. Cifuentes, Journal of Solid State Chemistry 220, 17-21.

  6. Computational identification of riboswitches based on RNA conserved functional sequences and conformations.

    PubMed

    Chang, Tzu-Hao; Huang, Hsien-Da; Wu, Li-Ching; Yeh, Chi-Ta; Liu, Baw-Jhiune; Horng, Jorng-Tzong

    2009-07-01

    Riboswitches are cis-acting genetic regulatory elements within a specific mRNA that can regulate both transcription and translation by interacting with their corresponding metabolites. Recently, an increasing number of riboswitches have been identified in different species and investigated for their roles in regulatory functions. Both the sequence contexts and structural conformations are important characteristics of riboswitches. None of the previously developed tools, such as covariance models (CMs), Riboswitch finder, and RibEx, provides a web server for efficiently searching homologous instances of known riboswitches or considers the two crucial characteristics of each riboswitch: the structural conformations and the sequence contexts of functional regions. Therefore, we developed a systematic method for identifying 12 kinds of riboswitches. The method is implemented and provided as a web server, RiboSW, to efficiently and conveniently identify riboswitches within messenger RNA sequences. The predictive accuracy of the proposed method is comparable with that of previous tools. The efficiency of the proposed method for identifying riboswitches was improved in order to achieve a reasonable computational time for prediction, which makes it possible to have an accurate and convenient web server for biologists to obtain the results of their analysis of a given mRNA sequence. RiboSW is now available on the web at http://RiboSW.mbc.nctu.edu.tw/. PMID:19460868

  7. Brain computer interface using a simplified functional near-infrared spectroscopy system

    NASA Astrophysics Data System (ADS)

    Coyle, Shirley M.; Ward, Tomás E.; Markham, Charles M.

    2007-09-01

    A brain-computer interface (BCI) is a device that allows a user to communicate with external devices through thought processes alone. A novel signal acquisition tool for BCIs is near-infrared spectroscopy (NIRS), an optical technique to measure localized cortical brain activity. The benefits of using this non-invasive modality are safety, portability and accessibility. A number of commercial multi-channel NIRS systems are available; however, we have developed a straightforward custom-built system to investigate the functionality of a fNIRS-BCI system. This work describes the construction of the device, the principles of operation and the implementation of a fNIRS-BCI application, 'Mindswitch', that harnesses motor imagery for control. Analysis is performed online and feedback of performance is presented to the user. Mindswitch presents a basic 'on/off' switching option to the user, where selection of either state takes 1 min. Initial results show that fNIRS can support simple BCI functionality and shows much potential. Although performance may currently be inferior to many EEG systems, there is much scope for development, particularly with more sophisticated signal processing and classification techniques. We hope that by presenting fNIRS as an accessible and affordable option, a new avenue of exploration will open within the BCI research community and stimulate further research in fNIRS-BCIs.

  8. Can ultrasound and computed tomography replace high-dose urography in patients with impaired renal function?

    PubMed

    Webb, J A; Reznek, R H; White, F E; Cattell, W R; Fry, I K; Baker, L R

    1984-01-01

    Ninety-one patients with unexplained impaired renal function were investigated by high-dose urography, ultrasound and computed tomography (CT) without contrast. The aim was to evaluate the role of ultrasound and CT in renal failure, in particular their ability to define renal length and to show collecting system dilatation. In the majority of patients, renal length could be measured accurately by ultrasound. Measurements were less than those at urography because of the absence of magnification. Renal measurement by CT was not a sufficiently accurate indicator of renal length to be of clinical use. Both ultrasound and CT were sensitive detectors of collecting system dilatation: neither technique missed any case diagnosed by urography. However, in the presence of staghorn calculi or multiple cysts, neither ultrasound nor CT could exclude collecting system dilatation. CT was the only technique which demonstrated retroperitoneal nodes or fibrosis causing obstruction. It is proposed that the first investigation when renal function is impaired should be ultrasound, with plain films and renal tomograms to show calculi. CT should be reserved for those patients in whom ultrasound is not diagnostic or in whom ultrasound shows collecting system dilatation but does not demonstrate the cause. Using this scheme, ultrasound, plain radiography and CT would have demonstrated collecting system dilatation and, where appropriate, shown the cause of obstruction in 84 per cent of patients in this series. Only 16 per cent of patients would have required either high-dose urography or retrograde ureterograms.

  9. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    USGS Publications Warehouse

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as those performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously.

  10. Functional source separation and hand cortical representation for a brain–computer interface feature extraction

    PubMed Central

    Tecchio, Franca; Porcaro, Camillo; Barbati, Giulia; Zappasodi, Filippo

    2007-01-01

    A brain–computer interface (BCI) can be defined as any system that can track a person's intent as embedded in his/her brain activity and, from it alone, translate that intention into commands for a computer. Among the brain signal monitoring systems best suited for this challenging task, electroencephalography (EEG) and magnetoencephalography (MEG) are the most realistic, since both are non-invasive, EEG is portable, and MEG can provide more specific information that could later be exploited also through EEG signals. The first two BCI steps require setting up the appropriate experimental protocol while recording the brain signal and then extracting interesting features from the recorded cerebral activity. To provide information useful in these BCI stages, our aim is to give an overview of a new procedure we recently developed, named functional source separation (FSS). As it derives from blind source separation algorithms, it exploits the most valuable information provided by the electrophysiological techniques, i.e. the waveform signal properties, while remaining blind to the biophysical nature of the signal sources. FSS returns the single-trial source activity, estimates the time course of a neuronal pool across different experimental states on the basis of a specific functional requirement in a specific time period, and uses simulated annealing as the optimization procedure, which allows the use of non-differentiable functional constraints. Moreover, a minor section is included, devoted to information acquired by MEG in stroke patients, to guide BCI applications aiming at sustaining motor behaviour in these patients. Relevant BCI features – spatial and time-frequency properties – are in fact altered by a stroke in the regions devoted to hand control. Moreover, a method to investigate the relationship between sensory and motor hand cortical network activities is described, providing information useful to develop BCI feedback control systems. This
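
    FSS relies on simulated annealing precisely because its functional constraints need not be differentiable. A generic annealing sketch on a toy kinked (non-differentiable) objective is shown below; this is a stand-in illustrating the optimization idea, not the FSS implementation, and the cooling schedule and step size are assumptions.

```python
import math
import random

def anneal(score, x0, steps=5000, t0=1.0, seed=7):
    """Generic simulated-annealing maximizer: random perturbations with a
    linear cooling schedule. Works on objectives with kinks, where
    gradient-based optimizers cannot be applied."""
    rng = random.Random(seed)
    x = list(x0)
    fx = score(x)
    best_x, best_f = list(x), fx
    for k in range(steps):
        temp = t0 * (1.0 - k / steps) + 1e-6       # linear cooling
        cand = [xi + rng.gauss(0.0, 0.1) for xi in x]
        fc = score(cand)
        # accept improvements always; worse moves with Boltzmann probability
        if fc >= fx or rng.random() < math.exp((fc - fx) / temp):
            x, fx = cand, fc
            if fx > best_f:
                best_x, best_f = list(x), fx
    return best_x, best_f

# toy non-differentiable objective with a kink at its maximum (1, -2):
x, fx = anneal(lambda v: -abs(v[0] - 1.0) - abs(v[1] + 2.0), [0.0, 0.0])
```

    In FSS the role of the toy objective is played by the functional constraint scoring how well a candidate source matches the required activity in a given time window.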

  11. Computational Diffusion Magnetic Resonance Imaging Based on Time-Dependent Bloch NMR Flow Equation and Bessel Functions.

    PubMed

    Awojoyogbe, Bamidele O; Dada, Michael O; Onwu, Samuel O; Ige, Taofeeq A; Akinwande, Ninuola I

    2016-04-01

    Magnetic resonance imaging (MRI) uses a powerful magnetic field along with radio waves and a computer to produce highly detailed "slice-by-slice" pictures of virtually all internal structures of matter. The results enable physicians to examine parts of the body in minute detail and identify diseases in ways that are not possible with other techniques. For example, MRI is one of the few imaging tools that can see through bones, making it an excellent tool for examining the brain and other soft tissues. Pulsed-field gradient experiments provide a straightforward means of obtaining information on the translational motion of nuclear spins. However, the interpretation of the data is complicated by the effects of restricting geometries, as in the case of most cancerous tissues, and the mathematics required to account for this becomes very difficult. Most diffusion magnetic resonance techniques are based on the Stejskal-Tanner formulation, usually derived from the Bloch-Torrey partial differential equation by including additional terms to accommodate the diffusion effect. Despite the early success of this technique, it has been shown to have important limitations, the most important of which occurs when there is orientation heterogeneity of the fibers in the voxel of interest (VOI). Overcoming this difficulty requires specifying the diffusion coefficients as a function of the spatial coordinate(s), and such a phenomenon is an indication of non-uniform compartmental conditions, which can be analyzed accurately by solving the time-dependent Bloch NMR flow equation analytically. In this study, a mathematical formulation of a magnetic resonance flow sequence in restricted geometry is developed based on a general second-order partial differential equation derived directly from the fundamental Bloch NMR flow equations. The NMR signal is obtained completely in terms of NMR experimental parameters. The process is described in terms of Bessel functions and properties that can make it
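
    The Bessel functions that arise in such restricted-geometry solutions can be evaluated directly from the standard integral representation J_n(x) = (1/π) ∫₀^π cos(nτ − x sin τ) dτ; a self-contained sketch using midpoint quadrature (the quadrature resolution is an implementation choice):

```python
import math

def bessel_j(n, x, m=2000):
    """Bessel function of the first kind, integer order n, via the
    integral representation J_n(x) = (1/pi) * int_0^pi cos(n*t - x*sin t) dt,
    evaluated with the midpoint rule on m subintervals."""
    h = math.pi / m
    s = sum(math.cos(n * (k + 0.5) * h - x * math.sin((k + 0.5) * h))
            for k in range(m))
    return s * h / math.pi

print(bessel_j(0, 0.0))  # J0(0) = 1
```

    The midpoint rule converges very rapidly here because the integrand's odd-order derivatives vanish at both endpoints, so modest m already gives near machine-precision values.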

  13. EUPDF: Eulerian Monte Carlo Probability Density Function Solver for Applications With Parallel Computing, Unstructured Grids, and Sprays

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    1998-01-01

    The success of any solution methodology used in the study of gas-turbine combustor flows depends a great deal on how well it can model the various complex and rate controlling processes associated with the spray's turbulent transport, mixing, chemical kinetics, evaporation, and spreading rates, as well as convective and radiative heat transfer and other phenomena. The phenomena to be modeled, which are controlled by these processes, often strongly interact with each other at different times and locations. In particular, turbulence plays an important role in determining the rates of mass and heat transfer, chemical reactions, and evaporation in many practical combustion devices. The influence of turbulence in a diffusion flame manifests itself in several forms, ranging from the so-called wrinkled, or stretched, flamelets regime to the distributed combustion regime, depending upon how turbulence interacts with various flame scales. Conventional turbulence models have difficulty treating highly nonlinear reaction rates. A solution procedure based on the composition joint probability density function (PDF) approach holds the promise of modeling various important combustion phenomena relevant to practical combustion devices (such as extinction, blowoff limits, and emissions predictions) because it can account for nonlinear chemical reaction rates without making approximations. In an attempt to advance the state-of-the-art in multidimensional numerical methods, we at the NASA Lewis Research Center extended our previous work on the PDF method to unstructured grids, parallel computing, and sprays. EUPDF, which was developed by M.S. Raju of Nyma, Inc., was designed to be massively parallel and could easily be coupled with any existing gas-phase and/or spray solvers. EUPDF can use an unstructured mesh with mixed triangular, quadrilateral, and/or tetrahedral elements. The application of the PDF method showed favorable results when applied to several supersonic
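
    A composition-PDF solver represents the joint PDF by an ensemble of notional particles and closes molecular mixing with a model. Below is a minimal sketch of the widely used IEM (interaction-by-exchange-with-the-mean) closure; the mixing rate and initial data are illustrative, and EUPDF's actual mixing and chemistry models may differ.

```python
import random

def iem_mixing(particles, omega, dt, steps):
    """IEM mixing model: each notional particle's composition relaxes
    toward the ensemble mean at rate 0.5 * omega, which in a real solver
    would come from the turbulence model. Conserves the mean and decays
    the variance, mimicking turbulent scalar mixing."""
    phi = list(particles)
    for _ in range(steps):
        mean = sum(phi) / len(phi)
        phi = [p - 0.5 * omega * (p - mean) * dt for p in phi]
    return phi

rng = random.Random(0)
phi0 = [rng.random() for _ in range(1000)]          # initial particle compositions
phi = iem_mixing(phi0, omega=2.0, dt=0.01, steps=200)
# mixing conserves the ensemble mean and reduces the variance
```

    The attraction of the particle representation is that nonlinear reaction rates can then be evaluated exactly per particle, with no closure approximation, which is the property the abstract highlights.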

  14. Arkansas' Curriculum Guide. Competency Based Computerized Accounting.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock. Div. of Vocational, Technical and Adult Education.

    This guide contains the essential parts of a total curriculum for a one-year secondary-level course in computerized accounting. Addressed in the individual sections of the guide are the following topics: the complete accounting cycle, computer operations for accounting, computerized accounting and general ledgers, computerized accounts payable,…

  15. Educational Accounting Procedures.

    ERIC Educational Resources Information Center

    Tidwell, Sam B.

    This chapter of "Principles of School Business Management" reviews the functions, procedures, and reports with which school business officials must be familiar in order to interpret and make decisions regarding the school district's financial position. Among the accounting functions discussed are financial management, internal auditing, annual…

  16. Computed Tomography-Based Centrilobular Emphysema Subtypes Relate with Pulmonary Function

    PubMed Central

    Takahashi, Mamoru; Yamada, Gen; Koba, Hiroyuki; Takahashi, Hiroki

    2013-01-01

    Introduction: Centrilobular emphysema (CLE) is recognized as low attenuation areas (LAA) with a centrilobular pattern on high-resolution computed tomography (CT). However, several shapes of LAA are observed. Our preliminary study showed three types of LAA in CLE by CT-pathologic correlations. This study was performed to investigate whether the morphological features of LAA affect pulmonary function. Materials and Methods: A total of 73 Japanese patients with stable CLE (63 males, 10 females) were evaluated visually by CT and classified into three subtypes based on the morphology of LAA, including shape and sharpness of border: patients showing round or oval LAA with a well-defined border (Subtype A), polygonal or irregular-shaped LAA with an ill-defined border (Subtype B), and irregular-shaped LAA with an ill-defined border coalescing with each other (Subtype C). CT score, pulmonary function test results, and smoking index were compared among the three subtypes. Results: Twenty (27%), 45 (62%) and 8 (11%) of the patients were grouped into Subtype A, Subtype B and Subtype C, respectively. In CT score and smoking index, both Subtype B and Subtype C were significantly higher than Subtype A. In FEV1%, Subtype C was significantly lower than both Subtype A and Subtype B. In diffusing capacity of lung for carbon monoxide, Subtype B was significantly lower than Subtype A. Conclusion: The morphological differences of LAA may relate to airflow limitation and alveolar diffusing capacity. Assessing the morphological features of LAA may be helpful for predicting respiratory function. PMID:23935765

  17. Indices of cognitive function measured in rugby union players using a computer-based test battery.

    PubMed

    MacDonald, Luke A; Minahan, Clare L

    2016-09-01

    The purpose of this study was to investigate the intra- and inter-day reliability of cognitive performance using a computer-based test battery in team-sport athletes. Eighteen elite male rugby union players (age: 19 ± 0.5 years) performed three experimental trials (T1, T2 and T3) of the test battery: T1 and T2 on the same day and T3, on the following day, 24 h later. The test battery comprised of four cognitive tests assessing the cognitive domains of executive function (Groton Maze Learning Task), psychomotor function (Detection Task), vigilance (Identification Task), visual learning and memory (One Card Learning Task). The intraclass correlation coefficients (ICCs) for the Detection Task, the Identification Task and the One Card Learning Task performance variables ranged from 0.75 to 0.92 when comparing T1 to T2 to assess intraday reliability, and 0.76 to 0.83 when comparing T1 and T3 to assess inter-day reliability. The ICCs for the Groton Maze Learning Task intra- and inter-day reliability were 0.67 and 0.57, respectively. We concluded that the Detection Task, the Identification Task and the One Card Learning Task are reliable measures of psychomotor function, vigilance, visual learning and memory in rugby union players. The reliability of the Groton Maze Learning Task is questionable (mean coefficient of variation (CV) = 19.4%) and, therefore, results should be interpreted with caution.
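
    Reliability here is quantified by intraclass correlation coefficients. A minimal one-way random-effects ICC(1,1) sketch on synthetic test-retest data follows; the study's exact ICC model is not stated in this abstract, so this simple variant is an assumption, and the data are invented for illustration.

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1): `ratings` is a list of
    per-subject lists, each holding k repeated trials. Computed from the
    between-subject and within-subject mean squares."""
    n = len(ratings)
    k = len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    subj_means = [sum(r) / k for r in ratings]
    ms_between = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)
    ms_within = sum((x - m) ** 2
                    for r, m in zip(ratings, subj_means)
                    for x in r) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# two trials per subject, highly consistent scores -> ICC near 1
data = [[10, 11], [20, 19], [30, 31], [40, 39], [25, 26]]
print(icc_oneway(data))
```

    Against such a benchmark, the 0.75-0.92 ICCs reported for the Detection, Identification, and One Card Learning tasks indicate good test-retest agreement, whereas the Groton Maze Learning Task values (0.57-0.67) do not.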

  18. COPD phenotypes on computed tomography and its correlation with selected lung function variables in severe patients

    PubMed Central

    da Silva, Silvia Maria Doria; Paschoal, Ilma Aparecida; De Capitani, Eduardo Mello; Moreira, Marcos Mello; Palhares, Luciana Campanatti; Pereira, Mônica Corso

    2016-01-01

    Background Computed tomography (CT) phenotypic characterization helps in understanding the clinical diversity of chronic obstructive pulmonary disease (COPD) patients, but its clinical relevance and its relationship with functional features are not clarified. Volumetric capnography (VC) uses the principle of gas washout and analyzes the pattern of CO2 elimination as a function of expired volume. The main variables analyzed were end-tidal concentration of carbon dioxide (ETCO2), Slope of phase 2 (Slp2), and Slope of phase 3 (Slp3) of the capnogram, the curve which represents the total amount of CO2 eliminated by the lungs during each breath. Objective To investigate, in a group of patients with severe COPD, whether phenotypic analysis by CT could identify different subsets of patients, and whether there was an association between CT findings and functional variables. Subjects and methods Sixty-five patients with COPD Gold III–IV were admitted for clinical evaluation, high-resolution CT, and functional evaluation (spirometry, 6-minute walk test [6MWT], and VC). The presence and profusion of tomography findings were evaluated, and later, the patients were identified as having emphysema (EMP) or airway disease (AWD) phenotype. EMP and AWD groups were compared; tomography findings scores were evaluated versus spirometric, 6MWT, and VC variables. Results Bronchiectasis was found in 33.8% and peribronchial thickening in 69.2% of the 65 patients. Structural findings of airways had no significant correlation with spirometric variables. Air trapping and EMP were strongly correlated with VC variables, but in opposite directions. There was some overlap between the EMP and AWD groups, but EMP patients had significantly lower body mass index, worse obstruction, and shorter walked distance on 6MWT. Concerning VC, EMP patients had significantly lower ETCO2, Slp2, and Slp3. Increases in Slp3 characterize heterogeneous involvement of the distal air spaces, as in AWD. Conclusion Visual assessment and
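
    The VC variables named above (ETCO2 and the phase slopes) are extracted from the CO2-versus-expired-volume curve. A toy sketch on a synthetic capnogram (all numbers are illustrative, not patient data), taking Slp3 as the least-squares slope over the final 40% of expired volume:

```python
import numpy as np

# Synthetic single-breath capnogram: CO2 (%) vs expired volume (mL)
vol = np.linspace(0.0, 500.0, 100)
rise = np.clip((vol - 50.0) / 30.0, 0.0, None)   # phases 1-2: rapid rise
plateau = 4.5 + 0.002 * vol                      # phase 3: alveolar plateau
co2 = np.minimum(rise, plateau)

# Slp3: slope of the quasi-linear phase 3, here the last 40% of volume
phase3 = vol >= 0.6 * vol[-1]
slp3, _ = np.polyfit(vol[phase3], co2[phase3], 1)

etco2 = co2[-1]   # end-tidal CO2 concentration (last expired sample)
```

    On real data the phase boundaries are detected from the curve itself rather than fixed at 60% of expired volume.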

  19. Tools for Computing the AGN Feedback: Radio-loudness Distribution and the Kinetic Luminosity Function

    NASA Astrophysics Data System (ADS)

    La Franca, F.; Melini, G.; Fiore, F.

    2010-07-01

    We studied the active galactic nucleus (AGN) radio emission from a compilation of hard X-ray-selected samples, all observed in the 1.4 GHz band. A total of more than 1600 AGNs with 2-10 keV de-absorbed luminosities higher than 10^42 erg s^-1 were used. For a sub-sample of about fifty z ≲ 0.1 AGNs, it was possible to reach ~80% radio detections and therefore, for the first time, to almost completely measure the probability distribution function of the ratio between the radio and the X-ray luminosity, R_X = log(L_1.4/L_X), where L_1.4/L_X = νL_ν(1.4 GHz)/L_X(2-10 keV). The probability distribution function of R_X was functionally fitted as dependent on the X-ray luminosity and redshift, P(R_X|L_X, z). It roughly spans six decades (-7 < R_X < -1) and does not show any sign of bi-modality. The result is that the probability of finding large values of the R_X ratio increases with decreasing X-ray luminosity and (possibly) with increasing redshift. No statistically significant difference was found between the radio properties of the X-ray absorbed (N_H > 10^22 cm^-2) and un-absorbed AGNs. Measurement of the probability distribution function of R_X allowed us to compute the kinetic luminosity function and the kinetic energy density which, at variance with what is assumed in many galaxy evolution models, is observed to decrease by about a factor of 5 at redshift below 0.5. About half of the kinetic energy density turns out to be produced by the more radio-quiet (R_X < -4) AGNs. In agreement with previous estimates, the AGN efficiency ε_kin in converting the accreted mass energy into kinetic power (L_K = ε_kin ṁ c^2) is, on average, ε_kin ≈ 5 × 10^-3. The data suggest a possible increase of ε_kin at low redshifts.
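
    The two quantities defined in this abstract are straightforward to evaluate numerically. The sketch below uses hypothetical luminosity values (not the paper's data) to compute R_X and to invert the kinetic-power relation L_K = ε_kin ṁ c^2 for the accretion rate:

```python
import numpy as np

# Hypothetical example values (not from the paper)
nu = 1.4e9      # Hz, radio frequency
L_nu = 1.0e29   # erg s^-1 Hz^-1, monochromatic 1.4 GHz luminosity
L_X = 1.0e43    # erg s^-1, 2-10 keV luminosity

# R_X = log(nu L_nu(1.4 GHz) / L_X)
R_X = np.log10(nu * L_nu / L_X)

# Invert L_K = eps_kin * mdot * c^2 for the accretion rate mdot
c = 2.998e10    # cm s^-1
eps_kin = 5e-3  # average efficiency quoted in the abstract
L_K = 1.0e44    # erg s^-1, assumed kinetic luminosity
mdot = L_K / (eps_kin * c ** 2)                  # g s^-1
mdot_msun_yr = mdot * 3.156e7 / 1.989e33         # solar masses per year
```

    With these numbers R_X ≈ -4.85, comfortably inside the observed -7 < R_X < -1 range.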

  20. Density functional theory computation of Nuclear Magnetic Resonance parameters in light and heavy nuclei

    NASA Astrophysics Data System (ADS)

    Sutter, Kiplangat

    This thesis illustrates the use of density functional theory (DFT) in calculations of gas- and solution-phase Nuclear Magnetic Resonance (NMR) properties of light and heavy nuclei. Computing NMR properties is still a challenge, and many factors are still being explored, for instance the influence of hydrogen bonding, thermal motion, vibration, rotation, and solvent effects. In the theoretical studies of the 195Pt NMR chemical shift in cisplatin and its derivatives presented in Chapters 2 and 3 of this thesis, the importance of representing solvent molecules explicitly around the Pt center in cisplatin complexes was outlined. In the same complexes, the solvent effect contributed about half of the J(Pt-N) coupling constant, indicating the significance of considering the surrounding solvent molecules in elucidating the NMR measurements of cisplatin binding to DNA. In Chapter 4, we explore the spin-orbit (SO) effects on the 29Si and 13C chemical shifts induced by the surrounding metal and ligands. The unusual Ni, Pd, Pt trends in SO effects on 29Si in metallasilatrane complexes X-Si-(mu-mt)4-M-Y were interpreted in terms of electronic and relativistic effects rather than structural differences between the complexes. In addition, we develop a non-linear model for predicting NMR SO effects in a series of organics bonded to heavy-nucleus halides. In Chapter 5, we extend the "chemist's orbitals" LMO analysis to the quantum chemical proton NMR computation of systems with internal resonance-assisted hydrogen bonds. Consequently, we explicitly link the NMR parameters of H-bonded systems to the intuitive picture of a chemical bond from quantum calculations. The analysis shows how NMR signatures characteristic of the H-bond can be explained by local bonding and electron delocalization concepts.
One shortcoming of some of the anti-cancer agents like cisplatin is that they are toxic and researchers are looking for

  1. Analysis of the Uncertainty in the Computation of Receiver Functions and Improvement in the Estimation of Receiver, PP and SS functions

    NASA Astrophysics Data System (ADS)

    Huang, X.; Gurrola, H.

    2013-12-01

    methods. All of these methods performed well in terms of stdev, but we chose ARU for its high-quality data and low signal-to-noise ratios (the average S/N ratio for these data was 4%). With real data, we tend to assume that the method with the lowest stdev is the best, but stdev does not account for a systematic bias toward incorrect values. In this case the LSD once again had the lowest stdev in computed amplitudes of Pds phases, but it also had the smallest values. The FID, FWLD and MID tended to produce the largest amplitudes, while the LSD and TID tended toward the lower amplitudes. Considering that in the synthetics all of these methods showed a bias toward low amplitude, we believe that with real data the methods producing the largest amplitudes will be closest to the 'true values', and that this is a better measure of the better method than a small stdev in amplitude estimates. We will also present results from applying the TID and FID methods to the production of PP and SS precursor functions. When applied to these data, it is possible to moveout-correct the cross-correlation functions before extracting the signal from each PdP (or SdS) phase. As a result, a much cleaner Earth function is produced and frequency content is significantly improved.
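
    For background on what these estimators compute: a receiver function R(t) is a deconvolution of one component seismogram from another, such that radial ≈ vertical * R. A generic frequency-domain water-level deconvolution (a standard textbook method, not one of the specific LSD/FID/FWLD/MID/TID estimators benchmarked above) can be sketched as:

```python
import numpy as np

def water_level_deconv(radial, vertical, dt, water_level=0.01, gauss_a=2.5):
    """Water-level spectral division with a Gaussian low-pass filter.
    Returns an estimate of R(t) such that radial ~ vertical convolved with R."""
    n = len(radial)
    nfft = 2 * n                                  # zero-pad to reduce wrap-around
    R = np.fft.rfft(radial, nfft)
    V = np.fft.rfft(vertical, nfft)
    denom = (V * np.conj(V)).real                 # power spectrum of denominator
    wl = water_level * denom.max()
    denom = np.maximum(denom, wl)                 # water-level stabilization
    f = np.fft.rfftfreq(nfft, dt)
    gauss = np.exp(-(2 * np.pi * f) ** 2 / (4 * gauss_a ** 2))  # low-pass
    rf = np.fft.irfft(R * np.conj(V) / denom * gauss, nfft)[:n]
    return rf
```

    The water level keeps the spectral division from blowing up at frequencies where the vertical component has little energy; the Gaussian width `gauss_a` sets the output bandwidth.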

  2. Response functions for computing absorbed dose to skeletal tissues from neutron irradiation

    NASA Astrophysics Data System (ADS)

    Bahadori, Amir A.; Johnson, Perry; Jokisch, Derek W.; Eckerman, Keith F.; Bolch, Wesley E.

    2011-11-01

    Spongiosa in the adult human skeleton consists of three tissues—active marrow (AM), inactive marrow (IM) and trabecularized mineral bone (TB). AM is considered to be the target tissue for assessment of both long-term leukemia risk and acute marrow toxicity following radiation exposure. The total shallow marrow (TM50), defined as all tissues lying within the first 50 µm of the bone surfaces, is considered to be the radiation target tissue of relevance for radiogenic bone cancer induction. For irradiation by sources external to the body, kerma to homogeneous spongiosa has been used as a surrogate for absorbed dose to both of these tissues, as direct dose calculations are not possible using computational phantoms with homogenized spongiosa. Recent micro-CT imaging of a 40-year-old male cadaver has allowed for the accurate modeling of the fine microscopic structure of spongiosa in many regions of the adult skeleton (Hough et al 2011 Phys. Med. Biol. 56 2309-46). This microstructure, along with associated masses and tissue compositions, was used to compute specific absorbed fraction (SAF) values for protons originating in axial and appendicular bone sites (Jokisch et al 2011 Phys. Med. Biol. 56 6857-72). These proton SAFs, bone masses, tissue compositions and proton production cross sections were subsequently used to construct neutron dose-response functions (DRFs) for both AM and TM50 targets in each bone of the reference adult male. Kerma conditions were assumed for other resultant charged particles. For comparison, AM, TM50 and spongiosa kerma coefficients were also calculated. At low incident neutron energies, AM kerma coefficients for neutrons correlate well with values of the AM DRF, while total marrow (TM) kerma coefficients correlate well with values of the TM50 DRF. At high incident neutron energies, all kerma coefficients and DRFs tend to converge as charged-particle equilibrium is established across the bone site. In the range of 10 eV to 100 Me

  3. Micro-computed tomography assessment of fracture healing: relationships among callus structure, composition, and mechanical function.

    PubMed

    Morgan, Elise F; Mason, Zachary D; Chien, Karen B; Pfeiffer, Anthony J; Barnes, George L; Einhorn, Thomas A; Gerstenfeld, Louis C

    2009-02-01

    Non-invasive characterization of fracture callus structure and composition may facilitate development of surrogate measures of the regain of mechanical function. As such, quantitative computed tomography- (CT-) based analyses of fracture calluses could enable more reliable clinical assessments of bone healing. Although previous studies have used CT to quantify and predict fracture healing, it is unclear which of the many CT-derived metrics of callus structure and composition are the most predictive of callus mechanical properties. The goal of this study was to identify the changes in fracture callus structure and composition that occur over time and that are most closely related to the regain of mechanical function. Micro-computed tomography (microCT) imaging and torsion testing were performed on murine fracture calluses (n=188) at multiple post-fracture timepoints and under different experimental conditions that alter fracture healing. Total callus volume (TV), mineralized callus volume (BV), callus mineralized volume fraction (BV/TV), bone mineral content (BMC), tissue mineral density (TMD), standard deviation of mineral density (sigma(TMD)), effective polar moment of inertia (J(eff)), torsional strength, and torsional rigidity were quantified. Multivariate statistical analyses, including multivariate analysis of variance, principal components analysis, and stepwise regression were used to identify differences in callus structure and composition among experimental groups and to determine which of the microCT outcome measures were the strongest predictors of mechanical properties. Although calluses varied greatly in the absolute and relative amounts of mineralized tissue (BV, BMC, and BV/TV), differences among timepoints were most strongly associated with changes in tissue mineral density. Torsional strength and rigidity were dependent on mineral density as well as the amount of mineralized tissue: TMD, BV, and sigma(TMD) explained 62% of the variation in
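
    The regression idea described above (predicting torsional strength from microCT metrics such as TMD, BV, and σ(TMD)) can be sketched with an ordinary least-squares fit. All values below are synthetic, fabricated for illustration, and are not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 188  # same sample size as the study; the data here are synthetic

# Hypothetical microCT predictors (arbitrary units)
TMD = rng.normal(900.0, 60.0, n)       # tissue mineral density
BV = rng.normal(5.0, 1.0, n)           # mineralized callus volume
sigmaTMD = rng.normal(150.0, 20.0, n)  # std of mineral density

# Synthetic "torsional strength" generated from the predictors plus noise
strength = 0.02 * TMD + 1.5 * BV - 0.01 * sigmaTMD + rng.normal(0.0, 1.0, n)

# Multiple linear regression with an intercept
X = np.column_stack([np.ones(n), TMD, BV, sigmaTMD])
beta, *_ = np.linalg.lstsq(X, strength, rcond=None)
pred = X @ beta
r2 = 1 - ((strength - pred) ** 2).sum() / ((strength - strength.mean()) ** 2).sum()
```

    A stepwise procedure would add or drop columns of `X` based on how much each predictor improves the fit; the R² here plays the role of the "62% of the variation explained" statistic in the abstract.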

  4. Computerized accounting methods. Final report

    SciTech Connect

    1994-12-31

    This report summarizes the results of the research performed under the Task Order on computerized accounting methods in the period from 03 August to 31 December 1994. Computerized nuclear material accounting methods are analyzed and evaluated. Selected methods are implemented in a hardware-software complex developed as a prototype of the local network-based CONMIT system. This complex has been put into trial operation for test and evaluation of the selected methods at two selected "Kurchatov Institute" Russian Research Center ("KI" RRC) nuclear facilities. Trial operation has been carried out since the Initial Physical Inventory Taking performed at these facilities in November 1994. Operation of the CONMIT prototype system was demonstrated in the middle of December 1994. Results of the evaluation of the CONMIT prototype system's features and functioning under real operating conditions are considered. Conclusions are formulated on the ways of further development of computerized nuclear material accounting methods. The most important conclusion is the need to strengthen computer and information security features supported by the operating environment. Security provisions, as well as other LANL Client/Server System approaches being developed by Los Alamos National Laboratory, are recommended for the selection of software and hardware components to be integrated into the production version of the CONMIT system for KI RRC.

  5. Functional Analysis of Metabolic Channeling and Regulation in Lignin Biosynthesis: A Computational Approach

    PubMed Central

    Lee, Yun; Escamilla-Treviño, Luis; Dixon, Richard A.; Voit, Eberhard O.

    2012-01-01

    Lignin is a polymer in secondary cell walls of plants that is known to have negative impacts on forage digestibility, pulping efficiency, and sugar release from cellulosic biomass. While targeted modifications of different lignin biosynthetic enzymes have permitted the generation of transgenic plants with desirable traits, such as improved digestibility or reduced recalcitrance to saccharification, some of the engineered plants exhibit monomer compositions that are clearly at odds with the expected outcomes when the biosynthetic pathway is perturbed. In Medicago, such discrepancies were partly reconciled by the recent finding that certain biosynthetic enzymes may be spatially organized into two independent channels for the synthesis of guaiacyl (G) and syringyl (S) lignin monomers. Nevertheless, the mechanistic details, as well as the biological function of these interactions, remain unclear. To decipher the working principles of this and similar control mechanisms, we propose and employ here a novel computational approach that permits an expedient and exhaustive assessment of hundreds of minimal designs that could arise in vivo. Interestingly, this comparative analysis not only helps distinguish two most parsimonious mechanisms of crosstalk between the two channels by formulating a targeted and readily testable hypothesis, but also suggests that the G lignin-specific channel is more important for proper functioning than the S lignin-specific channel. While the proposed strategy of analysis in this article is tightly focused on lignin synthesis, it is likely to be of similar utility in extracting unbiased information in a variety of situations, where the spatial organization of molecular components is critical for coordinating the flow of cellular information, and where initially various control designs seem equally valid. PMID:23144605

  6. Computed Tomography-Derived Parameters of Myocardial Morphology and Function in Black and White Patients With Acute Chest Pain.

    PubMed

    Takx, Richard A P; Vliegenthart, Rozemarijn; Schoepf, U Joseph; Abro, Joseph A; Nance, John W; Ebersberger, Ullrich; Bamberg, Fabian; Carr, Christine M; Apfaltrer, Paul

    2016-02-01

    Blacks have higher mortality and hospitalization rates because of congestive heart failure compared with their white counterparts. Differences in cardiac structure and function may contribute to the racial disparity in cardiovascular outcomes. Our aim was to compare computed tomography (CT)-derived cardiac measurements between black patients with acute chest pain and age- and gender-matched white patients. We performed a retrospective analysis under an institutional review board waiver and in Health Insurance Portability and Accountability Act compliance. We investigated patients who underwent cardiac dual-source CT for acute chest pain. Myocardial mass, left ventricular (LV) ejection fraction, LV end-systolic volume, and LV end-diastolic volume were quantified using an automated analysis algorithm. Septal wall thickness and cardiac chamber diameters were manually measured. Measurements were compared by independent t test and linear regression. The study population consisted of 300 patients (150 black: mean age 54 ± 12 years, 46% men; 150 white: mean age 55 ± 11 years, 46% men). Myocardial mass was larger for blacks compared with whites (176.1 ± 58.4 vs 155.9 ± 51.7 g, p = 0.002), which remained significant after adjusting for age, gender, body mass index, and hypertension. Septal wall thickness was slightly greater (11.9 ± 2.7 vs 11.2 ± 3.1 mm, p = 0.036). The LV inner diameter was moderately larger in black patients in systole (32.3 ± 9.0 vs 30.1 ± 5.4 mm, p = 0.010) and in diastole (50.1 ± 7.8 vs 48.9 ± 5.2 mm, p = 0.137), as was LV end-diastolic volume (134.5 ± 42.7 vs 128.2 ± 30.6 ml, p = 0.143). Ejection fraction was nonsignificantly lower in blacks (67.1 ± 13.5% vs 69.0 ± 9.6%, p = 0.169). In conclusion, CT-derived myocardial mass was larger in blacks compared with whites, whereas LV functional parameters were generally not statistically different, suggesting that LV mass might be a possible contributing factor to the higher rate of cardiac events

  7. Insights into the function of ion channels by computational electrophysiology simulations.

    PubMed

    Kutzner, Carsten; Köpfer, David A; Machtens, Jan-Philipp; de Groot, Bert L; Song, Chen; Zachariae, Ulrich

    2016-07-01

    Ion channels are of universal importance for all cell types and play key roles in cellular physiology and pathology. Increased insight into their functional mechanisms is crucial to enable drug design on this important class of membrane proteins, and to enhance our understanding of some of the fundamental features of cells. This review presents the concepts behind the recently developed simulation protocol Computational Electrophysiology (CompEL), which facilitates the atomistic simulation of ion channels in action. In addition, the review provides guidelines for its application in conjunction with the molecular dynamics software package GROMACS. We first lay out the rationale for designing CompEL as a method that models the driving force for ion permeation through channels the way it is established in cells, i.e., by electrochemical ion gradients across the membrane. This is followed by an outline of its implementation and a description of key settings and parameters helpful to users wishing to set up and conduct such simulations. In recent years, key mechanistic and biophysical insights have been obtained by employing the CompEL protocol to address a wide range of questions on ion channels and permeation. We summarize these recent findings on membrane proteins, which span a spectrum from highly ion-selective, narrow channels to wide diffusion pores. Finally we discuss the future potential of CompEL in light of its limitations and strengths. This article is part of a Special Issue entitled: Membrane Proteins edited by J.C. Gumbart and Sergei Noskov.

  8. Computer-mediated communication preferences predict biobehavioral measures of social-emotional functioning.

    PubMed

    Babkirk, Sarah; Luehring-Jones, Peter; Dennis-Tiwary, Tracy A

    2016-12-01

    The use of computer-mediated communication (CMC) as a form of social interaction has become increasingly prevalent, yet few studies examine individual differences that may shed light on implications of CMC for adjustment. The current study examined neurocognitive individual differences associated with preferences to use technology in relation to social-emotional outcomes. In Study 1 (N = 91), a self-report measure, the Social Media Communication Questionnaire (SMCQ), was evaluated as an assessment of preferences for communicating positive and negative emotions on a scale ranging from purely via CMC to purely face-to-face. In Study 2, SMCQ preferences were examined in relation to event-related potentials (ERPs) associated with early emotional attention capture and reactivity (the frontal N1) and later sustained emotional processing and regulation (the late positive potential (LPP)). Electroencephalography (EEG) was recorded while 22 participants passively viewed emotional and neutral pictures and completed an emotion regulation task with instructions to increase, decrease, or maintain their emotional responses. A greater preference for CMC was associated with reduced size of and satisfaction with social support, greater early (N1) attention capture by emotional stimuli, and reduced LPP amplitudes to unpleasant stimuli in the increase emotion regulatory task. These findings are discussed in the context of possible emotion- and social-regulatory functions of CMC.
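
    ERP components such as the frontal N1 and the LPP are commonly scored as the mean amplitude within a latency window of the averaged waveform. A generic sketch (the window bounds and sampling grid here are illustrative, not the paper's exact analysis parameters):

```python
import numpy as np

def mean_amplitude(erp, times, window):
    """Mean ERP amplitude within a latency window (seconds).
    erp: (..., n_times) averaged waveform(s); times: (n_times,) in seconds."""
    lo, hi = window
    mask = (times >= lo) & (times <= hi)
    return erp[..., mask].mean(axis=-1)
```

    With a (subjects, conditions, n_times) array, the same call returns one LPP score per subject and condition, ready for the kind of condition contrasts described above.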

  9. [Functional multispiral computed tomography of sound-transmitting structures in the middle ear].

    PubMed

    Bodrova, I V; Rusektskiĭ, Iu Iu; Kulakova, L A; Lopatin, A S; Ternovoĭ, S K

    2011-01-01

    The objective of this work was to estimate the potential of functional multispiral computed tomography (fMSCT) for the choice and planning of the treatment strategy and the extent of surgical intervention in the patients presenting with fibroosseous diseases of the middle ear associated with the pathologically altered mobility of the auditory ossicles. Studies with the use of MSCT and fMSCT for the examination of temporal bones in 21 patients (25 observations) provided information about normal CT anatomy of the middle ear and a basis for the development of the fMSCT protocol; moreover they allowed the range of mobility of the auditory ossicles to be determined in healthy subjects and patients with middle ear disorders. It is concluded that fMSCT of temporal bones may be recommended to patients suffering otosclerosis, tympanosclerosis, and adhesive otitis media. The use of this technique improves the accuracy of diagnosis and facilitates the choice and planning of the treatment strategy and the extent of surgical intervention in the patients presenting with middle ear diseases.

  10. A Computational Model Quantifies the Effect of Anatomical Variability on Velopharyngeal Function

    PubMed Central

    Inouye, Joshua M.; Perry, Jamie L.; Lin, Kant Y.

    2015-01-01

    Purpose This study predicted the effects of velopharyngeal (VP) anatomical parameters on VP function to provide a greater understanding of speech mechanics and aid in the treatment of speech disorders. Method We created a computational model of the VP mechanism using dimensions obtained from magnetic resonance imaging measurements of 10 healthy adults. The model components included the levator veli palatini (LVP), the velum, and the posterior pharyngeal wall, and the simulations were based on material parameters from the literature. The outcome metrics were the VP closure force and LVP muscle activation required to achieve VP closure. Results Our average model compared favorably with experimental data from the literature. Simulations of 1,000 random anatomies reflected the large variability in closure forces observed experimentally. VP distance had the greatest effect on both outcome metrics when considering the observed anatomic variability. Other anatomical parameters were ranked by their predicted influences on the outcome metrics. Conclusions Our results support the implication that interventions for VP dysfunction that decrease anterior to posterior VP portal distance, increase velar length, and/or increase LVP cross-sectional area may be very effective. Future modeling studies will help to further our understanding of speech mechanics and optimize treatment of speech disorders. PMID:26049120

  11. Restoring unassisted natural gait to paraplegics via functional neuromuscular stimulation: a computer simulation study.

    PubMed

    Yamaguchi, G T; Zajac, F E

    1990-09-01

    Functional neuromuscular stimulation (FNS) of paralyzed muscles has enabled spinal-cord-injured patients to regain a semblance of lower-extremity control, for example to ambulate while relying heavily on the use of walkers. Given the limitations of FNS, specifically low muscle strengths, high rates of fatigue, and a limited ability to modulate muscle excitations, it remains unclear, however, whether FNS can be developed as a practical means to control the lower extremity musculature to restore aesthetic, unsupported gait to paraplegics. A computer simulation of FNS-assisted bipedal gait shows that it is difficult, but possible to attain undisturbed, level gait at normal speeds provided the electrically-stimulated ankle plantarflexors exhibit either near-normal strengths or are augmented by an orthosis, and at least seven muscle-groups in each leg are stimulated. A combination of dynamic programming and an open-loop, trial-and-error adjustment process was used to find a suboptimal set of discretely-varying muscle stimulation patterns needed for a 3-D, 8 degree-of-freedom dynamic model to sustain a step. An ankle-foot orthosis was found to be especially useful, as it helped to stabilize the stance leg and simplified the task of controlling the foot during swing. It is believed that the process of simulating natural gait with this model will serve to highlight difficulties to be expected during laboratory and clinical trials.

  12. Evaluation of Coupled Perturbed and Density Functional Methods of Computing the Parity-Violating Energy Difference between Enantiomers

    NASA Astrophysics Data System (ADS)

    MacDermott, A. J.; Hyde, G. O.; Cohen, A. J.

    2009-03-01

    We present new coupled-perturbed Hartree-Fock (CPHF) and density functional theory (DFT) computations of the parity-violating energy difference (PVED) between enantiomers for H2O2 and H2S2. Our DFT PVED computations are the first for H2S2 and the first with the new HCTH and OLYP functionals. Like other “second generation” PVED computations, our results are an order of magnitude larger than the original “first generation” uncoupled-perturbed Hartree-Fock computations of Mason and Tranter. We offer an explanation for the dramatically larger size in terms of cancellation of contributions of opposing signs, which also explains the basis set sensitivity of the PVED, and its conformational hypersensitivity (addressed in the following paper). This paper also serves as a review of the different types of “second generation” PVED computations: we set our work in context, comparing our results with those of four other groups, and noting the good agreement between results obtained by very different methods. DFT PVEDs tend to be somewhat inflated compared to the CPHF values, but this is not a problem when only sign and order of magnitude are required. Our results with the new OLYP functional are less inflated than those with other functionals, and OLYP is also more efficient computationally. We therefore conclude that DFT computation offers a promising approach for low-cost extension to larger biosystems, especially polymers. The following two papers extend to terrestrial and extra-terrestrial amino acids respectively, and later work will extend to polymers.

  13. Development of the Computer-Adaptive Version of the Late-Life Function and Disability Instrument

    PubMed Central

    Tian, Feng; Kopits, Ilona M.; Moed, Richard; Pardasaney, Poonam K.; Jette, Alan M.

    2012-01-01

    Background. Having psychometrically strong disability measures that minimize response burden is important in the assessment of older adults. Methods. Using the original 48 items from the Late-Life Function and Disability Instrument and newly developed items, a 158-item Activity Limitation and a 62-item Participation Restriction item pool were developed. The item pools were administered to a convenience sample of 520 community-dwelling adults 60 years or older. Confirmatory factor analysis and item response theory were employed to identify content structure, calibrate items, and build the computer-adaptive tests (CATs). We evaluated real-data simulations of 10-item CAT subscales. We collected data from 102 older adults to validate the 10-item CATs against the Veteran's Short Form-36 and assessed test–retest reliability in a subsample of 57 subjects. Results. Confirmatory factor analysis revealed a bifactor structure, and multidimensional item response theory was used to calibrate an overall Activity Limitation Scale (141 items) and an overall Participation Restriction Scale (55 items). Fit statistics were acceptable (Activity Limitation: comparative fit index = 0.95, Tucker-Lewis Index = 0.95, root mean square error of approximation = 0.03; Participation Restriction: comparative fit index = 0.95, Tucker-Lewis Index = 0.95, root mean square error of approximation = 0.05). Correlations of the 10-item CATs with the full item banks were substantial (Activity Limitation: r = .90; Participation Restriction: r = .95). Test–retest reliability estimates were high (Activity Limitation: r = .85; Participation Restriction: r = .80). The strength and pattern of correlations with Veteran's Short Form-36 subscales were as hypothesized. Each CAT, on average, took 3.56 minutes to administer. Conclusions. The Late-Life Function and Disability Instrument CATs demonstrated strong reliability, validity, accuracy, and precision. The Late-Life Function and Disability Instrument CAT can achieve
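
    A core step in any CAT like the one described is choosing the next item so as to maximize information at the current ability estimate. A minimal sketch under a unidimensional 2PL model (the instrument above is calibrated with a multidimensional bifactor model, so this is a deliberate simplification):

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function: P(correct | ability theta) for items
    with discriminations a and difficulties b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of each item at ability theta."""
    p = p_correct(theta, a, b)
    return a ** 2 * p * (1 - p)

def next_item(theta_hat, a, b, administered):
    """Pick the unadministered item with maximum information at theta_hat."""
    info = item_information(theta_hat, a, b)
    info[list(administered)] = -np.inf   # exclude already-used items
    return int(np.argmax(info))
```

    In a real CAT this selection alternates with re-estimating theta after each response until a stopping rule (here, 10 items) is met.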

  14. Sensory processing during viewing of cinematographic material: computational modeling and functional neuroimaging.

    PubMed

    Bordier, Cecile; Puja, Francesco; Macaluso, Emiliano

    2013-02-15

    The investigation of brain activity using naturalistic, ecologically-valid stimuli is becoming an important challenge for neuroscience research. Several approaches have been proposed, primarily relying on data-driven methods (e.g. independent component analysis, ICA). However, data-driven methods often require some post-hoc interpretation of the imaging results to draw inferences about the underlying sensory, motor or cognitive functions. Here, we propose using a biologically-plausible computational model to extract (multi-)sensory stimulus statistics that can be used for standard hypothesis-driven analyses (general linear model, GLM). We ran two separate fMRI experiments, which both involved subjects watching an episode of a TV-series. In Exp 1, we manipulated the presentation by switching on-and-off color, motion and/or sound at variable intervals, whereas in Exp 2, the video was played in the original version, with all the consequent continuous changes of the different sensory features intact. Both for vision and audition, we extracted stimulus statistics corresponding to spatial and temporal discontinuities of low-level features, as well as a combined measure related to the overall stimulus saliency. Results showed that activity in occipital visual cortex and the superior temporal auditory cortex co-varied with changes of low-level features. Visual saliency was found to further boost activity in extra-striate visual cortex plus posterior parietal cortex, while auditory saliency was found to enhance activity in the superior temporal cortex. Data-driven ICA analyses of the same datasets also identified "sensory" networks comprising visual and auditory areas, but without providing specific information about the possible underlying processes, e.g., these processes could relate to modality, stimulus features and/or saliency. We conclude that the combination of computational modeling and GLM enables the tracking of the impact of bottom-up signals on brain activity
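
    The hypothesis-driven approach proposed above amounts to convolving each model-derived feature time course with a hemodynamic response function (HRF) and regressing voxel signals on the result. A toy sketch with a simple double-gamma HRF and synthetic data (all parameter values are illustrative; SPM's canonical HRF uses slightly different constants):

```python
import numpy as np
from math import gamma

def canonical_hrf(t):
    # Double-gamma shape: positive lobe peaking near 5 s, small undershoot
    g = lambda t, k: np.where(t > 0, t ** (k - 1) * np.exp(-t) / gamma(k), 0.0)
    return g(t, 6.0) - g(t, 16.0) / 6.0

tr = 2.0                                   # repetition time in seconds
t = np.arange(0.0, 32.0, tr)
h = canonical_hrf(t)

rng = np.random.default_rng(1)
saliency = rng.normal(0.0, 1.0, 240)       # z-scored saliency, one value per scan
regressor = np.convolve(saliency, h)[:saliency.size]

# One synthetic voxel: true effect of 2.0 plus noise; fit GLM with intercept
voxel = 2.0 * regressor + rng.normal(0.0, 0.2, saliency.size)
X = np.column_stack([np.ones(saliency.size), regressor])
beta, *_ = np.linalg.lstsq(X, voxel, rcond=None)
```

    The fitted `beta[1]` is the per-voxel effect of the saliency regressor, which is what the whole-brain statistical maps in such an analysis display.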

  15. Localized basis functions and other computational improvements in variational nonorthogonal basis function methods for quantum mechanical scattering problems involving chemical reactions

    NASA Technical Reports Server (NTRS)

    Schwenke, David W.; Truhlar, Donald G.

    1990-01-01

    The Generalized Newton Variational Principle for 3D quantum mechanical reactive scattering is briefly reviewed. Then three techniques are described which improve the efficiency of the computations. First, the fact that the Hamiltonian is Hermitian is used to reduce the number of integrals computed, and then the properties of localized basis functions are exploited in order to eliminate redundant work in the integral evaluation. A new type of localized basis function with desirable properties is suggested. It is shown how partitioned matrices can be used with localized basis functions to reduce the amount of work required to handle the complex boundary conditions. The new techniques do not introduce any approximations into the calculations, so they may be used to obtain converged solutions of the Schroedinger equation.

  16. PLATO Instruction for Elementary Accounting.

    ERIC Educational Resources Information Center

    McKeown, James C.

    A progress report of a study using computer assisted instruction (CAI) materials for an elementary course in accounting principles is presented. The study was based on the following objectives: (1) improvement of instruction in the elementary accounting sequence, and (2) help for transfer students from two-year institutions. The materials under…

  17. The application of computer assisted technologies (CAT) in the rehabilitation of cognitive functions in psychiatric disorders of childhood and adolescence.

    PubMed

    Srebnicki, Tomasz; Bryńska, Anita

    2016-01-01

    First applications of computer-assisted technologies (CAT) in the rehabilitation of cognitive deficits, including child and adolescent psychiatric disorders, date back to the 1980s. Recent developments in computer technologies, wide access to the Internet, and the vast expansion of electronic devices have resulted in a dynamic increase in therapeutic software as well as supporting devices. The aim of computer-assisted technologies is to improve comfort and quality of life as well as to rehabilitate impaired functions. This article presents the most common computer-assisted technologies used in the therapy of children and adolescents with cognitive deficits and reviews the literature on their effectiveness, including the challenges and limitations of implementing such interventions. PMID:27556116

  18. Older Children and Adolescents with High-Functioning Autism Spectrum Disorders Can Comprehend Verbal Irony in Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Glenwright, Melanie; Agbayewa, Abiola S.

    2012-01-01

    We compared the comprehension of verbal irony presented in computer-mediated conversations for older children and adolescents with high-functioning autism spectrum disorders (HFASD) and typically developing (TD) controls. We also determined whether participants' interpretations of irony were affected by the relationship between characters in the…

  19. Content Range and Precision of a Computer Adaptive Test of Upper Extremity Function for Children with Cerebral Palsy

    ERIC Educational Resources Information Center

    Montpetit, Kathleen; Haley, Stephen; Bilodeau, Nathalie; Ni, Pengsheng; Tian, Feng; Gorton, George, III; Mulcahey, M. J.

    2011-01-01

    This article reports on the content range and measurement precision of an upper extremity (UE) computer adaptive testing (CAT) platform of physical function in children with cerebral palsy. Upper extremity items representing skills of all abilities were administered to 305 parents. These responses were compared with two traditional standardized…

  20. Density Functional Computations and Mass Spectrometric Measurements. Can this Coupling Enlarge the Knowledge of Gas-Phase Chemistry?

    NASA Astrophysics Data System (ADS)

    Marino, T.; Russo, N.; Sicilia, E.; Toscano, M.; Mineva, T.

    A series of gas-phase properties of the systems has been investigated by using different exchange-correlation potentials and basis sets of increasing size in the framework of Density Functional theory with the aim to determine a strategy able to give reliable results with reasonable computational efforts.

  1. Utilization of high resolution computed tomography to visualize the three dimensional structure and function of plant vasculature

    Technology Transfer Automated Retrieval System (TEKTRAN)

    High resolution x-ray computed tomography (HRCT) is a non-destructive diagnostic imaging technique with sub-micron resolution capability that is now being used to evaluate the structure and function of plant xylem network in three dimensions (3D). HRCT imaging is based on the same principles as medi...

  2. Investigating the Potential of Computer Environments for the Teaching and Learning of Functions: A Double Analysis from Two Research Traditions

    ERIC Educational Resources Information Center

    Lagrange, Jean-Baptiste; Psycharis, Giorgos

    2014-01-01

    The general goal of this paper is to explore the potential of computer environments for the teaching and learning of functions. To address this, different theoretical frameworks and corresponding research traditions are available. In this study, we aim to network different frameworks by following a "double analysis" method to analyse two…

  3. Differential Item Functioning (DIF) Analysis of Computation, Word Problem and Geometry Questions across Gender and SES Groups.

    ERIC Educational Resources Information Center

    Berberoglu, Giray

    1995-01-01

    Item characteristic curves were compared across gender and socioeconomic status (SES) groups for the university entrance mathematics examination in Turkey to see if any group had an advantage in solving computation, word-problem, or geometry questions. Differential item functioning was found, and patterns are discussed. (SLD)
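
    The study above compared item characteristic curves across groups; a common complementary DIF screen is the Mantel-Haenszel common odds ratio computed over ability strata. The sketch below is illustrative only (the names and the choice of Mantel-Haenszel are mine, not the study's method):

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel common odds ratio for one test item across
    ability strata. Each table is (a, b, c, d) = reference-group
    correct/incorrect, focal-group correct/incorrect counts.
    A value near 1 suggests no uniform DIF for the item."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den
```

    For example, strata in which both groups have identical odds of a correct response yield an odds ratio of 1.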

  4. INTERP3: A computer routine for linear interpolation of trivariate functions defined by nondistinct unequally spaced variables

    NASA Technical Reports Server (NTRS)

    Hill, D. C.; Morris, S. J., Jr.

    1979-01-01

    A report on the computer routine INTERP3 is presented. The routine is designed to linearly interpolate a variable which is a function of three independent variables. The variables within the parameter arrays do not have to be distinct, or equally spaced, and the array variables can be in increasing or decreasing order.
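
    INTERP3 itself is a FORTRAN routine; the following is a minimal Python sketch of the trilinear scheme it describes, restricted for brevity to strictly increasing grids (the actual routine also accepts decreasing order and nondistinct values). All names here are illustrative:

```python
import bisect

def bracket(grid, x):
    """Return (i, t): interval index and fractional position of x in a
    strictly increasing 1-D grid, clamped to the grid's range."""
    i = bisect.bisect_right(grid, x) - 1
    i = max(0, min(i, len(grid) - 2))
    t = (x - grid[i]) / (grid[i + 1] - grid[i])
    return i, max(0.0, min(t, 1.0))

def interp3(xs, ys, zs, f, x, y, z):
    """Trilinear interpolation of f[i][j][k] tabulated on the (possibly
    unequally spaced) grids xs, ys, zs."""
    i, tx = bracket(xs, x)
    j, ty = bracket(ys, y)
    k, tz = bracket(zs, z)
    def lerp(a, b, t):
        return a + (b - a) * t
    # interpolate along z, then y, then x
    c = [[lerp(f[i + di][j + dj][k], f[i + di][j + dj][k + 1], tz)
          for dj in (0, 1)] for di in (0, 1)]
    d = [lerp(c[di][0], c[di][1], ty) for di in (0, 1)]
    return lerp(d[0], d[1], tx)
```

    Because the scheme is linear in each variable, it reproduces any trilinear function exactly, which makes a convenient sanity check.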

  5. Implementation of the AES as a Hash Function for Confirming the Identity of Software on a Computer System

    SciTech Connect

    Hansen, Randy R.; Bass, Robert B.; Kouzes, Richard T.; Mileson, Nicholas D.

    2003-01-20

    This paper provides a brief overview of the implementation of the Advanced Encryption Standard (AES) as a hash function for confirming the identity of software resident on a computer system. The PNNL Software Authentication team chose to use a hash function to confirm software identity on a system for situations where: (1) there is limited time to perform the confirmation and (2) access to the system is restricted to keyboard or thumbwheel input and output can only be displayed on a monitor. PNNL reviewed three popular algorithms: the Secure Hash Algorithm - 1 (SHA-1), the Message Digest - 5 (MD-5), and the Advanced Encryption Standard (AES), and selected the AES to incorporate into the software confirmation tool we developed. This paper gives a brief overview of the SHA-1, MD-5, and the AES and cites references for further detail. It then explains the overall processing steps of the AES to reduce a large amount of generic data (the plain text, such as is present in memory and other data storage media in a computer system) to a small amount of data (the hash digest), which is a mathematically unique representation or signature of the former that could be displayed on a computer's monitor. This paper starts with a simple definition and example to illustrate the use of a hash function. It concludes with a description of how the software confirmation tool uses the hash function to confirm the identity of software on a computer system.
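
    The PNNL tool builds its hash from AES; Python's standard library has no AES-based hash, so the sketch below uses SHA-1 (one of the algorithms the team reviewed) purely to illustrate the confirmation workflow the abstract describes. The function names are mine, not PNNL's:

```python
import hashlib

def digest_of(data: bytes, algorithm: str = "sha1") -> str:
    """Reduce an arbitrary byte stream (e.g. a program image read from
    memory or storage) to a short, displayable hash digest."""
    h = hashlib.new(algorithm)
    h.update(data)
    return h.hexdigest()

def confirm_identity(data: bytes, expected_hex: str,
                     algorithm: str = "sha1") -> bool:
    """Software is 'confirmed' when its digest matches a trusted
    reference value, e.g. one compared by eye on a monitor."""
    return digest_of(data, algorithm) == expected_hex
```

    The digest for `b"abc"` under SHA-1 is the standard FIPS test vector, which makes the sketch easy to verify by hand.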

  6. On One Unusual Method of Computation of Limits of Rational Functions in the Program Mathematica[R

    ERIC Educational Resources Information Center

    Hora, Jaroslav; Pech, Pavel

    2005-01-01

    Computing limits of functions is a traditional part of mathematical analysis which is very difficult for students. Now an algorithm for the elimination of quantifiers in the field of real numbers is implemented in the program Mathematica. This offers a non-traditional view on this classical theme. (Contains 1 table.)
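
    The article's method relies on quantifier elimination inside Mathematica; as a much simpler classroom-style cross-check (not the article's algorithm), one can approach the point numerically from both sides. Names below are illustrative:

```python
def numeric_limit(f, a, steps=8, h0=0.1):
    """Estimate lim_{x->a} f(x) by averaging f at symmetric points
    approaching a (a numerical sanity check, not a symbolic method)."""
    h = h0
    estimate = None
    for _ in range(steps):
        estimate = (f(a + h) + f(a - h)) / 2.0
        h /= 10.0
    return estimate
```

    For the classic removable singularity (x^2 - 1)/(x - 1) at x = 1, the symmetric averages are exactly 2 at every step, so the estimate converges immediately.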

  7. Computational and functional analyses of a small-molecule binding site in ROMK.

    PubMed

    Swale, Daniel R; Sheehan, Jonathan H; Banerjee, Sreedatta; Husni, Afeef S; Nguyen, Thuy T; Meiler, Jens; Denton, Jerod S

    2015-03-10

    The renal outer medullary potassium channel (ROMK, or Kir1.1, encoded by KCNJ1) critically regulates renal tubule electrolyte and water transport and hence blood volume and pressure. The discovery of loss-of-function mutations in KCNJ1 underlying renal salt and water wasting and lower blood pressure has sparked interest in developing new classes of antihypertensive diuretics targeting ROMK. The recent development of nanomolar-affinity small-molecule inhibitors of ROMK creates opportunities for exploring the chemical and physical basis of ligand-channel interactions required for selective ROMK inhibition. We previously reported that the bis-nitro-phenyl ROMK inhibitor VU591 exhibits voltage-dependent knock-off at hyperpolarizing potentials, suggesting that the binding site is located within the ion-conduction pore. In this study, comparative molecular modeling and in silico ligand docking were used to interrogate the full-length ROMK pore for energetically favorable VU591 binding sites. Cluster analysis of 2498 low-energy poses resulting from 9900 Monte Carlo docking trajectories on each of 10 conformationally distinct ROMK comparative homology models identified two putative binding sites in the transmembrane pore that were subsequently tested for a role in VU591-dependent inhibition using site-directed mutagenesis and patch-clamp electrophysiology. Introduction of mutations into the lower site had no effect on the sensitivity of the channel to VU591. In contrast, mutations of Val168 or Asn171 in the upper site, which are unique to ROMK within the Kir channel family, led to a dramatic reduction in VU591 sensitivity. This study highlights the utility of computational modeling for defining ligand-ROMK interactions and proposes a mechanism for inhibition of ROMK. PMID:25762321

  8. Computational and Functional Analyses of a Small-Molecule Binding Site in ROMK

    PubMed Central

    Swale, Daniel R.; Sheehan, Jonathan H.; Banerjee, Sreedatta; Husni, Afeef S.; Nguyen, Thuy T.; Meiler, Jens; Denton, Jerod S.

    2015-01-01

    The renal outer medullary potassium channel (ROMK, or Kir1.1, encoded by KCNJ1) critically regulates renal tubule electrolyte and water transport and hence blood volume and pressure. The discovery of loss-of-function mutations in KCNJ1 underlying renal salt and water wasting and lower blood pressure has sparked interest in developing new classes of antihypertensive diuretics targeting ROMK. The recent development of nanomolar-affinity small-molecule inhibitors of ROMK creates opportunities for exploring the chemical and physical basis of ligand-channel interactions required for selective ROMK inhibition. We previously reported that the bis-nitro-phenyl ROMK inhibitor VU591 exhibits voltage-dependent knock-off at hyperpolarizing potentials, suggesting that the binding site is located within the ion-conduction pore. In this study, comparative molecular modeling and in silico ligand docking were used to interrogate the full-length ROMK pore for energetically favorable VU591 binding sites. Cluster analysis of 2498 low-energy poses resulting from 9900 Monte Carlo docking trajectories on each of 10 conformationally distinct ROMK comparative homology models identified two putative binding sites in the transmembrane pore that were subsequently tested for a role in VU591-dependent inhibition using site-directed mutagenesis and patch-clamp electrophysiology. Introduction of mutations into the lower site had no effect on the sensitivity of the channel to VU591. In contrast, mutations of Val168 or Asn171 in the upper site, which are unique to ROMK within the Kir channel family, led to a dramatic reduction in VU591 sensitivity. This study highlights the utility of computational modeling for defining ligand-ROMK interactions and proposes a mechanism for inhibition of ROMK. PMID:25762321

  9. Complex functionality with minimal computation: Promise and pitfalls of reduced-tracer ocean biogeochemistry models

    NASA Astrophysics Data System (ADS)

    Galbraith, Eric D.; Dunne, John P.; Gnanadesikan, Anand; Slater, Richard D.; Sarmiento, Jorge L.; Dufour, Carolina O.; de Souza, Gregory F.; Bianchi, Daniele; Claret, Mariona; Rodgers, Keith B.; Marvasti, Seyedehsafoura Sedigh

    2015-12-01

    Earth System Models increasingly include ocean biogeochemistry models in order to predict changes in ocean carbon storage, hypoxia, and biological productivity under climate change. However, state-of-the-art ocean biogeochemical models include many advected tracers that significantly increase the computational resources required, forcing a trade-off with spatial resolution. Here, we compare a state-of-the-art model with 30 prognostic tracers (TOPAZ) with two reduced-tracer models, one with 6 tracers (BLING), and the other with 3 tracers (miniBLING). The reduced-tracer models employ parameterized, implicit biological functions, which nonetheless capture many of the most important processes resolved by TOPAZ. All three are embedded in the same coupled climate model. Despite the large difference in tracer number, the absence of tracers for living organic matter is shown to have a minimal impact on the transport of nutrient elements, and the three models produce similar mean annual preindustrial distributions of macronutrients, oxygen, and carbon. Significant differences do exist among the models, in particular the seasonal cycle of biomass and export production, but it does not appear that these are necessary consequences of the reduced tracer number. With increasing CO2, changes in dissolved oxygen and anthropogenic carbon uptake are very similar across the different models. Thus, while the reduced-tracer models do not explicitly resolve the diversity and internal dynamics of marine ecosystems, we demonstrate that such models are applicable to a broad suite of major biogeochemical concerns, including anthropogenic change. These results are very promising for the further development and application of reduced-tracer biogeochemical models that incorporate "sub-ecosystem-scale" parameterizations.

  10. Using High Resolution Computed Tomography to Visualize the Three Dimensional Structure and Function of Plant Vasculature

    PubMed Central

    McElrone, Andrew J.; Choat, Brendan; Parkinson, Dilworth Y.; MacDowell, Alastair A.; Brodersen, Craig R.

    2013-01-01

    High resolution x-ray computed tomography (HRCT) is a non-destructive diagnostic imaging technique with sub-micron resolution capability that is now being used to evaluate the structure and function of plant xylem network in three dimensions (3D) (e.g. Brodersen et al. 2010; 2011; 2012a,b). HRCT imaging is based on the same principles as medical CT systems, but a high intensity synchrotron x-ray source results in higher spatial resolution and decreased image acquisition time. Here, we demonstrate in detail how synchrotron-based HRCT (performed at the Advanced Light Source-LBNL Berkeley, CA, USA) in combination with Avizo software (VSG Inc., Burlington, MA, USA) is being used to explore plant xylem in excised tissue and living plants. This new imaging tool allows users to move beyond traditional static, 2D light or electron micrographs and study samples using virtual serial sections in any plane. An infinite number of slices in any orientation can be made on the same sample, a feature that is physically impossible using traditional microscopy methods. Results demonstrate that HRCT can be applied to both herbaceous and woody plant species, and a range of plant organs (i.e. leaves, petioles, stems, trunks, roots). Figures presented here help demonstrate both a range of representative plant vascular anatomy and the type of detail extracted from HRCT datasets, including scans for coast redwood (Sequoia sempervirens), walnut (Juglans spp.), oak (Quercus spp.), and maple (Acer spp.) tree saplings to sunflowers (Helianthus annuus), grapevines (Vitis spp.), and ferns (Pteridium aquilinum and Woodwardia fimbriata). Excised and dried samples from woody species are easiest to scan and typically yield the best images. However, recent improvements (i.e. more rapid scans and sample stabilization) have made it possible to use this visualization technique on green tissues (e.g. petioles) and in living plants. On occasion some shrinkage of hydrated green plant tissues will cause

  11. Efficient computation of the angularly resolved chord length distributions and lineal path functions in large microstructure datasets

    NASA Astrophysics Data System (ADS)

    Turner, David M.; Niezgoda, Stephen R.; Kalidindi, Surya R.

    2016-10-01

    Chord length distributions (CLDs) and lineal path functions (LPFs) have been successfully utilized in prior literature as measures of the size and shape distributions of the important microscale constituents in the material system. Typically, these functions are parameterized only by line lengths, and thus calculated and derived independent of the angular orientation of the chord or line segment. We describe in this paper computationally efficient methods for estimating chord length distributions and lineal path functions for 2D (two dimensional) and 3D microstructure images defined on any number of arbitrary chord orientations. These so-called fully angularly resolved distributions can be computed for over 1000 orientations on large microstructure images (500³ voxels) in minutes on modest hardware. We present these methods as new tools for characterizing microstructures in a statistically meaningful way.
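
    As a concrete illustration of the basic quantity involved, the sketch below counts foreground chord lengths of a 2-D binary image along one orientation (horizontal rows). The angularly resolved method of the paper repeats this kind of scan along many sampled orientations; this toy version and its names are mine:

```python
from collections import Counter

def chord_lengths_rows(image):
    """Chord-length counts of the foreground phase (value 1) along the
    horizontal orientation of a 2-D binary image (nested lists).
    A chord is a maximal run of consecutive foreground pixels."""
    counts = Counter()
    for row in image:
        run = 0
        for v in row:
            if v:
                run += 1
            elif run:
                counts[run] += 1
                run = 0
        if run:                 # close a run that touches the boundary
            counts[run] += 1
    return counts
```

    Normalizing the counts by their total gives the (discrete) chord length distribution for that orientation.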

  12. [COMPUTER TECHNOLOGY FOR ACCOUNTING OF CONFOUNDERS IN THE RISK ASSESSMENT IN COMPARATIVE STUDIES ON THE BASE OF THE METHOD OF STANDARDIZATION].

    PubMed

    Shalaumova, Yu V; Varaksin, A N; Panov, V G

    2016-01-01

    We analyzed the accounting of concomitant variables (confounders), which introduce a systematic error into the assessment of the impact of risk factors on a response variable. The analysis showed that standardization is an effective method for reducing bias in risk assessment. We propose an algorithm implementing standardization based on stratification, which minimizes the difference between the distributions of confounders in the risk-factor groups. To automate the standardization procedure, we developed software that is available on the website of the Institute of Industrial Ecology, UB RAS. Using this software and numerical modeling, we determined the conditions of applicability of stratification-based standardization for the case of a normally distributed response and confounder with a linear relationship between them. A comparison of the standardization results with statistical methods (logistic regression and analysis of covariance) on a problem in human ecology showed that close results are obtained only when the applicability conditions of the statistical methods are met exactly; standardization is less sensitive to violations of its applicability conditions. PMID:27266034
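
    The authors' software is not reproduced here; the toy sketch below shows the core idea of standardization by stratification: compute each group's mean response with confounder strata re-weighted to a common standard distribution, so that group comparisons are not distorted by differing confounder mixes. All names and the simple weighting scheme are illustrative assumptions:

```python
def standardized_mean(responses, strata, standard_weights):
    """Mean response re-weighted to a standard confounder distribution.
    responses: observed values; strata: confounder stratum label of each
    observation; standard_weights: {stratum: weight}, weights sum to 1."""
    by_stratum = {}
    for y, s in zip(responses, strata):
        by_stratum.setdefault(s, []).append(y)
    return sum(w * (sum(by_stratum[s]) / len(by_stratum[s]))
               for s, w in standard_weights.items())
```

    Applying the same `standard_weights` to an exposed and an unexposed group yields standardized means whose difference is free of confounding by the stratifying variable (to the extent the strata capture it).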

  14. Accounting for the environment.

    PubMed

    Lutz, E; Munasinghe, M

    1991-03-01

    Environmental awareness in the 1980s has led to efforts to improve the current UN System of National Accounts (SNA) for better measurement of the value of environmental resources when estimating income. National governments, the UN, the International Monetary Fund, and the World Bank are interested in solving this issue. The World Bank relies heavily on national aggregates in income accounts compiled by means of the SNA, which was published in 1968 and stressed gross domestic product (GDP). GDP measures mainly market activity, but it does not consider the consumption of natural capital, and indirectly inhibits sustained development. The deficiencies of the current method of accounting are the inconsistent treatment of manmade and natural capital, the omission of natural resources and their depletion from balance sheets, and the omission of pollution cleanup costs from national income. In the calculation of GDP, pollution is overlooked, and beneficial environmental inputs are valued at zero. The calculation of environmentally adjusted net domestic product (EDP) and environmentally adjusted net income (ENI) would lower income and growth rates, as the World Resources Institute found with respect to Indonesia for 1971-84: when depreciation of oil, timber, and topsoil was included, the net domestic product (NDP) growth rate was only 4%, compared with 7.1% for GDP. The World Bank has advocated environmental accounting since 1983 in SNA revisions. The 1989 revised Blue Book of the SNA takes environmental concerns into account. Relevant research is under way in Mexico and Papua New Guinea using the UN Statistical Office framework as a system for environmentally adjusted economic accounts that computes EDP and ENI and integrates environmental data with national accounts while preserving SNA concepts. PMID:12285741

  15. FIT: Computer Program that Interactively Determines Polynomial Equations for Data which are a Function of Two Independent Variables

    NASA Technical Reports Server (NTRS)

    Arbuckle, P. D.; Sliwa, S. M.; Roy, M. L.; Tiffany, S. H.

    1985-01-01

    A computer program for interactively developing least-squares polynomial equations to fit user-supplied data is described. The program is characterized by the ability to compute the polynomial equations of a surface fit through data that are a function of two independent variables. The program utilizes the Langley Research Center graphics packages to display polynomial equation curves and data points, facilitating a qualitative evaluation of the effectiveness of the fit. An explanation of the fundamental principles and features of the program, as well as sample input and corresponding output, is included.
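
    FIT itself is an interactive graphics program; the non-interactive core of such a surface fit is solving the least-squares normal equations. The sketch below fits the lowest-order surface z = c0 + c1*x + c2*y as an illustration (FIT handles higher-degree polynomials; all names here are mine):

```python
def solve(A, b):
    """Solve A c = b by Gaussian elimination with partial pivoting
    (A is a small dense matrix given as nested lists)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    c = [0.0] * n
    for r in range(n - 1, -1, -1):
        c[r] = (M[r][n] - sum(M[r][k] * c[k]
                              for k in range(r + 1, n))) / M[r][r]
    return c

def fit_plane(xs, ys, zs):
    """Least-squares fit of z = c0 + c1*x + c2*y via normal equations."""
    basis = [[1.0, x, y] for x, y in zip(xs, ys)]
    A = [[sum(bi[i] * bi[j] for bi in basis) for j in range(3)]
         for i in range(3)]
    b = [sum(bi[i] * z for bi, z in zip(basis, zs)) for i in range(3)]
    return solve(A, b)
```

    Higher-degree surface fits follow the same pattern with more basis terms (x², xy, y², ...), at the cost of a larger normal-equations system.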

  16. Open and closed cortico-subcortical loops: A neuro-computational account of access to consciousness in the distractor-induced blindness paradigm.

    PubMed

    Ebner, Christian; Schroll, Henning; Winther, Gesche; Niedeggen, Michael; Hamker, Fred H

    2015-09-01

    How the brain decides which information to process 'consciously' has been debated for decades without a simple explanation at hand. While most experiments manipulate the perceptual energy of presented stimuli, the distractor-induced blindness task is a prototypical paradigm to investigate gating of information into consciousness without or with only minor visual manipulation. In this paradigm, subjects are asked to report intervals of coherent dot motion in a rapid serial visual presentation (RSVP) stream, whenever these are preceded by a particular color stimulus in a different RSVP stream. If distractors (i.e., intervals of coherent dot motion prior to the color stimulus) are shown, subjects' abilities to perceive and report intervals of target dot motion decrease, particularly with short delays between intervals of target color and target motion. We propose a biologically plausible neuro-computational model of how the brain controls access to consciousness to explain how distractor-induced blindness originates from information processing in the cortex and basal ganglia. The model suggests that conscious perception requires reverberation of activity in cortico-subcortical loops and that basal-ganglia pathways can either allow or inhibit this reverberation. In the distractor-induced blindness paradigm, inadequate distractor-induced response tendencies are suppressed by the inhibitory 'hyperdirect' pathway of the basal ganglia. If a target follows such a distractor closely, temporal aftereffects of distractor suppression prevent target identification. The model reproduces experimental data on how delays between target color and target motion affect the probability of target detection.

  17. Functional Assessment for Human-Computer Interaction: A Method for Quantifying Physical Functional Capabilities for Information Technology Users

    ERIC Educational Resources Information Center

    Price, Kathleen J.

    2011-01-01

    The use of information technology is a vital part of everyday life, but for a person with functional impairments, technology interaction may be difficult at best. Information technology is commonly designed to meet the needs of a theoretical "normal" user. However, there is no such thing as a "normal" user. A user's capabilities will vary over…

  18. On one-dimensional stretching functions for finite-difference calculations. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Vinokur, M.

    1983-01-01

    The class of one-dimensional stretching functions used in finite-difference calculations is studied. For solutions containing a highly localized region of rapid variation, simple criteria for a stretching function are derived using a truncation error analysis. These criteria are used to investigate two types of stretching functions. One is an interior stretching function, for which the location and slope of an interior clustering region are specified. The simplest such function satisfying the criteria is found to be one based on the inverse hyperbolic sine. The other type of function is a two-sided stretching function, for which the arbitrary slopes at the two ends of the one-dimensional interval are specified. The simplest such general function is found to be one based on the inverse tangent. Previously announced in STAR as N80-25055
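
    To make the idea of an interior clustering function concrete, the sketch below generates a grid on [0, 1] with points concentrated near a chosen interior location, using a standard sinh-based transformation (the same hyperbolic-sine family the abstract discusses, though not necessarily Vinokur's exact function). Names and the particular formula are illustrative:

```python
import math

def clustered_grid(n, xc, b):
    """n-point grid on [0, 1] with points clustered near x = xc.
    b > 0 sets the clustering strength. Uses the classical sinh
    clustering transformation: x(xi) = xc * (1 + sinh(b*(xi - A)) /
    sinh(b*A)), with A chosen so that x(0) = 0 and x(1) = 1."""
    A = (1.0 / (2.0 * b)) * math.log(
        (1.0 + (math.exp(b) - 1.0) * xc) /
        (1.0 + (math.exp(-b) - 1.0) * xc))
    return [xc * (1.0 + math.sinh(b * (i / (n - 1) - A)) / math.sinh(b * A))
            for i in range(n)]
```

    Increasing b tightens the spacing near xc while widening it near the boundaries, which is exactly the behavior wanted for resolving a localized region of rapid variation.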

  19. On one-dimensional stretching functions for finite-difference calculations. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Vinokur, M.

    1979-01-01

    The class of one-dimensional stretching functions used in finite-difference calculations is studied. For solutions containing a highly localized region of rapid variation, simple criteria for a stretching function are derived using a truncation error analysis. These criteria are used to investigate two types of stretching functions. One is an interior stretching function, for which the location and slope of an interior clustering region are specified. The simplest such function satisfying the criteria is found to be one based on the inverse hyperbolic sine. The other type of function is a two-sided stretching function, for which the arbitrary slopes at the two ends of the one-dimensional interval are specified. The simplest such general function is found to be one based on the inverse tangent.

  20. Next-generation computers

    SciTech Connect

    Torrero, E.A.

    1985-01-01

    Developments related to tomorrow's computers are discussed, taking into account advances toward the fifth generation in Japan, the challenge to U.S. supercomputers, plans concerning the creation of supersmart computers for the U.S. military, a U.S. industry response to the Japanese challenge, a survey of U.S. and European research, Great Britain, the European Common Market, codifying human knowledge for machine reading, software engineering, next-generation software, plans for obtaining the million-transistor chip, and fabrication issues for next-generation circuits. Other topics explored are related to a status report regarding artificial intelligence, an assessment of the technical challenges, aspects of sociotechnology, and defense advanced research projects. Attention is also given to expert systems, speech recognition, computer vision, function-level programming and automated programming, computing at the speed limit, VLSI, and superpower computers.

  1. Evaluating the Appropriateness of a New Computer-Administered Measure of Adaptive Function for Children and Youth with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Coster, Wendy J.; Kramer, Jessica M.; Tian, Feng; Dooley, Meghan; Liljenquist, Kendra; Kao, Ying-Chia; Ni, Pengsheng

    2016-01-01

    The Pediatric Evaluation of Disability Inventory-Computer Adaptive Test is an alternative method for describing the adaptive function of children and youth with disabilities using a computer-administered assessment. This study evaluated the performance of the Pediatric Evaluation of Disability Inventory-Computer Adaptive Test with a national…

  2. Fast Computation of Solvation Free Energies with Molecular Density Functional Theory: Thermodynamic-Ensemble Partial Molar Volume Corrections.

    PubMed

    Sergiievskyi, Volodymyr P; Jeanmairet, Guillaume; Levesque, Maximilien; Borgis, Daniel

    2014-06-01

    Molecular density functional theory (MDFT) offers an efficient implicit-solvent method to estimate the solvation free energies of molecules while conserving a fully molecular representation of the solvent. Even within a second-order approximation for the free-energy functional, the so-called homogeneous reference fluid approximation, we show that the hydration free energies computed for a data set of 500 organic compounds are of similar quality to those obtained from molecular dynamics free-energy perturbation simulations, with a computer cost reduced by 2-3 orders of magnitude. This requires introducing the proper partial molar volume correction to transform the results from the grand canonical to the isobaric-isothermal ensemble that is pertinent to experiments. We show that this correction can be extended to 3D-RISM calculations, giving a sound theoretical justification to empirical partial molar volume corrections that have been proposed recently.

  3. The use of computer graphic techniques for the determination of ventricular function.

    NASA Technical Reports Server (NTRS)

    Sandler, H.; Rasmussen, D.

    1972-01-01

    Description of computer techniques employed to increase the speed, accuracy, reliability, and scope of angiocardiographic analyses determining human heart dimensions. Chamber margins are traced with a Calma 303 digitizer from projections of the angiographic films. The digitized margins of the ventricular images are filed in a computer for subsequent analysis. The margins can be displayed on the television screen of a graphics unit for individual study or they can be viewed in real time (or at any selected speed) to study dynamic changes in the chamber outline. The construction of three dimensional images of the ventricle is described.
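    The abstract does not state which volume formula the analysis used. As a hedged sketch of the general pipeline, digitized chamber margins like these are commonly reduced to a cross-sectional area (shoelace formula) and then to a volume with the single-plane area-length (Dodge) method, V = 8A²/(3πL); the code below assumes that standard method and a closed polygon of margin points, and is not the authors' actual analysis code.

    ```python
    import math

    def polygon_area(pts):
        """Shoelace area of a closed digitized margin (list of (x, y) points)."""
        a = 0.0
        n = len(pts)
        for i in range(n):
            x1, y1 = pts[i]
            x2, y2 = pts[(i + 1) % n]
            a += x1 * y2 - x2 * y1
        return abs(a) / 2.0

    def area_length_volume(area, long_axis):
        """Single-plane area-length (Dodge) estimate: V = 8*A^2 / (3*pi*L)."""
        return 8.0 * area ** 2 / (3.0 * math.pi * long_axis)

    # Sanity check: a digitized circle of radius 2 cm with long axis 4 cm
    # should reproduce the volume of a 2 cm sphere (~33.5 cm^3).
    circle = [(2 * math.cos(2 * math.pi * t / 360), 2 * math.sin(2 * math.pi * t / 360))
              for t in range(360)]
    A = polygon_area(circle)        # ~ pi * r^2 = 12.57 cm^2
    V = area_length_volume(A, 4.0)  # ~ 33.5 cm^3
    ```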

  4. Real-time functional magnetic imaging-brain-computer interface and virtual reality promising tools for the treatment of pedophilia.

    PubMed

    Renaud, Patrice; Joyal, Christian; Stoleru, Serge; Goyette, Mathieu; Weiskopf, Nikolaus; Birbaumer, Niels

    2011-01-01

    This chapter proposes a prospective view on using a real-time functional magnetic imaging (rt-fMRI) brain-computer interface (BCI) application as a new treatment for pedophilia. Neurofeedback mediated by interactive virtual stimuli is presented as the key process in this new BCI application. Results on the diagnostic discriminant power of virtual characters depicting sexual stimuli relevant to pedophilia are given. Finally, practical and ethical implications are briefly addressed.

  5. Computation of dynamical correlation functions for many-fermion systems with auxiliary-field quantum Monte Carlo

    NASA Astrophysics Data System (ADS)

    Vitali, Ettore; Shi, Hao; Qin, Mingpu; Zhang, Shiwei

    2016-08-01

    We address the calculation of dynamical correlation functions for many-fermion systems at zero temperature, using the auxiliary-field quantum Monte Carlo method. The two-dimensional Hubbard Hamiltonian is used as a model system. Although most of the calculations performed here are for cases where the sign problem is absent, the discussions are kept general for applications to physical problems when the sign problem does arise. We study the use of twisted boundary conditions to improve the extrapolation of the results to the thermodynamic limit. A strategy is proposed to drastically reduce finite-size effects, relying on a minimization among the twist angles. This approach is demonstrated by computing the charge gap at half filling. We obtain accurate results showing the scaling of the gap with the interaction strength U in two dimensions, connecting to the scaling of the unrestricted Hartree-Fock method at small U and to the exact Bethe ansatz result in one dimension at large U. An alternative algorithm is then proposed to compute dynamical Green functions and correlation functions which explicitly varies the number of particles during the random walks in the manifold of Slater determinants. In dilute systems, such as ultracold Fermi gases, this algorithm enables calculations with much more favorable complexity, with computational cost proportional to basis size or the number of lattice sites.
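    The twist idea can be illustrated on a free tight-binding band, a toy stand-in for the interacting AFQMC setting: averaging the canonical ground-state energy over boundary-condition twists removes most finite-size shell effects and lands close to the thermodynamic-limit energy per site (−2t/π for a half-filled spinless chain). This is a simplified sketch of twist averaging, not the minimization strategy of the paper.

    ```python
    import math

    def gs_energy_per_site(L, n_elec, theta, t=1.0):
        """Ground-state energy per site of a 1D tight-binding ring of L sites
        under twisted boundary conditions: allowed momenta k_m = (2*pi*m + theta)/L."""
        levels = sorted(-2.0 * t * math.cos((2.0 * math.pi * m + theta) / L)
                        for m in range(L))
        return sum(levels[:n_elec]) / L

    # Half filling on a small ring: a single L=10 cluster sits well away from
    # the bulk value, but averaging over 64 twist angles nearly recovers it.
    L, n = 10, 5
    twists = [2.0 * math.pi * i / 64 for i in range(64)]
    e_avg = sum(gs_energy_per_site(L, n, th) for th in twists) / len(twists)
    e_bulk = -2.0 / math.pi   # exact thermodynamic-limit energy per site
    ```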

  6. Variability in Reading Ability Gains as a Function of Computer-Assisted Instruction Method of Presentation

    ERIC Educational Resources Information Center

    Johnson, Erin Phinney; Perry, Justin; Shamir, Haya

    2010-01-01

    This study examines the effects on early reading skills of three different methods of presenting material with computer-assisted instruction (CAI): (1) learner-controlled picture menu, which allows the student to choose activities, (2) linear sequencer, which progresses the students through lessons at a pre-specified pace, and (3) mastery-based…

  7. Discourse Functions and Vocabulary Use in English Language Learners' Synchronous Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Rabab'ah, Ghaleb

    2013-01-01

    This study explores the discourse generated by English as a foreign language (EFL) learners using synchronous computer-mediated communication (CMC) as an approach to help English language learners to create social interaction in the classroom. It investigates the impact of synchronous CMC mode on the quantity of total words, lexical range and…

  8. Computational insights into function and inhibition of fatty acid amide hydrolase.

    PubMed

    Palermo, Giulia; Rothlisberger, Ursula; Cavalli, Andrea; De Vivo, Marco

    2015-02-16

    The Fatty Acid Amide Hydrolase (FAAH) enzyme is a membrane-bound serine hydrolase responsible for the deactivating hydrolysis of a family of naturally occurring fatty acid amides. FAAH is a critical enzyme of the endocannabinoid system, being mainly responsible for regulating the level of its main cannabinoid substrate anandamide. For this reason, pharmacological inhibition of FAAH, which increases the level of endogenous anandamide, is a promising strategy to cure a variety of diseases including pain, inflammation, and cancer. Much structural, mutagenesis, and kinetic data on FAAH has been generated over the last couple of decades. This has prompted several informative computational investigations to elucidate, at the atomic-level, mechanistic details on catalysis and inhibition of this pharmaceutically relevant enzyme. Here, we review how these computational studies - based on classical molecular dynamics, full quantum mechanics, and hybrid QM/MM methods - have clarified the binding and reactivity of some relevant substrates and inhibitors of FAAH. We also discuss the experimental implications of these computational insights, which have provided a thoughtful elucidation of the complex physical and chemical steps of the enzymatic mechanism of FAAH. Finally, we discuss how computations have been helpful for building structure-activity relationships of potent FAAH inhibitors. PMID:25240419

  9. Integrating computational modeling and functional assays to decipher the structure-function relationship of influenza virus PB1 protein

    PubMed Central

    Li, Chunfeng; Wu, Aiping; Peng, Yousong; Wang, Jingfeng; Guo, Yang; Chen, Zhigao; Zhang, Hong; Wang, Yongqiang; Dong, Jiuhong; Wang, Lulan; Qin, F. Xiao-Feng; Cheng, Genhong; Deng, Tao; Jiang, Taijiao

    2014-01-01

    The influenza virus PB1 protein is the core subunit of the heterotrimeric polymerase complex (PA, PB1 and PB2) in which PB1 is responsible for catalyzing RNA polymerization and binding to the viral RNA promoter. Among the three subunits, PB1 is the least known subunit so far in terms of its structural information. In this work, by integrating template-based structural modeling approach with all known sequence and functional information about the PB1 protein, we constructed a modeled structure of PB1. Based on this model, we performed mutagenesis analysis for the key residues that constitute the RNA template binding and catalytic (TBC) channel in an RNP reconstitution system. The results correlated well with the model and further identified new residues of PB1 that are critical for RNA synthesis. Moreover, we derived 5 peptides from the sequence of PB1 that form the TBC channel and 4 of them can inhibit the viral RNA polymerase activity. Interestingly, we found that one of them named PB1(491–515) can inhibit influenza virus replication by disrupting viral RNA promoter binding activity of polymerase. Therefore, this study has not only deepened our understanding of structure-function relationship of PB1, but also promoted the development of novel therapeutics against influenza virus. PMID:25424584

  10. Computer Equipment Repair Curriculum Guide.

    ERIC Educational Resources Information Center

    Reneau, Fred; And Others

    This guide is intended for use in a course to train students to repair computer equipment and perform related administrative and customer service tasks. Addressed in the individual units are the following topics (with selected subtopics in brackets): performing administrative functions (preparing service bills, maintaining accounts and labor…

  11. Head sinuses, melon, and jaws of bottlenose dolphins, Tursiops truncatus, observed with computed tomography structural and single photon emission computed tomography functional imaging

    NASA Astrophysics Data System (ADS)

    Ridgway, Sam; Houser, Dorian; Finneran, James J.; Carder, Don; van Bonn, William; Smith, Cynthia; Hoh, Carl; Corbeil, Jacqueline; Mattrey, Robert

    2003-04-01

    The head sinuses, melon, and lower jaws of dolphins have been studied extensively with various methods including radiography, chemical analysis, and imaging of dead specimens. Here we report the first structural and functional imaging of live dolphins. Two animals were imaged, one male and one female. Computed tomography (CT) revealed extensive air cavities posterior and medial to the ear as well as between the ear and sound-producing nasal structures. Single photon emission computed tomography (SPECT) employing 50 mCi of the intravenously injected ligand technetium [Tc-99m] biscisate (Neurolite) revealed extensive uptake in the core of the melon as well as near the pan bone area of the lower jaw. Count density on SPECT images was four times greater in the melon than in the surrounding tissue and blubber layer, suggesting that the melon is an active rather than a passive tissue. Since the dolphin temporal bone is not attached to the skull except by fibrous suspensions, the air cavities medial and posterior to the ear, as well as the abutment of the temporal bone to the acoustic fat bodies of each lower jaw, should be considered in modeling the mechanism of sound transmission from the environment to the dolphin ear.

  12. High-Throughput Computational Design of Advanced Functional Materials: Topological Insulators and Two-Dimensional Electron Gas Systems

    NASA Astrophysics Data System (ADS)

    Yang, Kesong

    As a rapidly growing area of materials science, high-throughput (HT) computational materials design is playing a crucial role in accelerating the discovery and development of novel functional materials. In this presentation, I will first introduce the strategy of HT computational materials design, and take the HT discovery of topological insulators (TIs) as a practical example to show the usage of such an approach. Topological insulators are one of the most studied classes of novel materials because of their great potential for applications ranging from spintronics to quantum computers. Here I will show that, by defining a reliable and accessible descriptor, which represents the topological robustness or feasibility of the candidate, and by searching the quantum materials repository aflowlib.org, we have automatically discovered 28 TIs (some of them already known) in five different symmetry families. Next, I will talk about our recent research work on the HT computational design of the perovskite-based two-dimensional electron gas (2DEG) systems. The 2DEG formed on the perovskite oxide heterostructure (HS) has potential applications in next-generation nanoelectronic devices. In order to achieve practical implementation of the 2DEG in the device design, desired physical properties such as high charge carrier density and mobility are necessary. Here I show that, using the same strategy with the HT discovery of TIs, by introducing a series of combinatorial descriptors, we have successfully identified a series of candidate 2DEG systems based on the perovskite oxides. This work provides another exemplar of applying HT computational design approach for the discovery of advanced functional materials.

  13. A Computation of the Frequency Dependent Dielectric Function for Energetic Materials

    NASA Astrophysics Data System (ADS)

    Zwitter, D. E.; Kuklja, M. M.; Kunz, A. B.

    1999-06-01

    The imaginary part of the dielectric function as a function of frequency is calculated for the solids RDX, TATB, ADN, and PETN. Calculations have been performed including the effects of isotropic and uniaxial pressure. Simple lattice defects are included in some of the calculations.

  14. Computer analysis of protein functional sites projection on exon structure of genes in Metazoa

    PubMed Central

    2015-01-01

    Background: Study of the relationship between the structural and functional organization of proteins and their coding genes is necessary for understanding the evolution of molecular systems and can provide new knowledge for many applications, such as designing proteins with improved medical and biological properties. It is well known that the functional properties of proteins are determined by their functional sites. Functional sites are usually represented by a small number of amino acid residues that are distantly located from each other in the amino acid sequence. They are highly conserved within their functional group and vary significantly in structure between such groups. Given these facts, analysis of the general properties of the structural organization of functional sites at the protein level and at the level of the exon-intron structure of the coding gene remains an open problem. Results: One approach to this analysis is the projection of amino acid residue positions of the functional sites, along with the exon boundaries, onto the gene structure. In this paper, we examined the discontinuity of the functional sites in the exon-intron structure of genes and the distribution of lengths and phases of the functional-site-encoding exons in vertebrate genes. We show that the DNA fragments coding the functional sites lie in the same exon or in close exons. The observed tendency of exons that code functional sites to cluster could be considered a unit of protein evolution. We studied the characteristics of the exon boundaries that do and do not code functional sites in 11 Metazoa species. Coding of functional sites is accompanied by a reduced frequency of intercodon gaps (phase 0) in the exons encoding functional-site residues, which may be evidence of evolutionary limitations on exon shuffling. Conclusions: These results characterize the features of the coding exon-intron structure that affect the

  15. Cosmic Reionization on Computers: The Faint End of the Galaxy Luminosity Function

    NASA Astrophysics Data System (ADS)

    Gnedin, Nickolay Y.

    2016-07-01

    Using numerical cosmological simulations completed under the “Cosmic Reionization On Computers” project, I explore theoretical predictions for the faint end of the galaxy UV luminosity functions at z ≳ 6. A commonly used Schechter function approximation with the magnitude cut at M_cut ~ -13 provides a reasonable fit to the actual luminosity function of simulated galaxies. When the Schechter functional form is forced on the luminosity functions from the simulations, the magnitude cut M_cut is found to vary between -12 and -14 with a mild redshift dependence. An analytical model of reionization from Madau et al., as used by Robertson et al., provides a good description of the simulated results, which can be improved even further by adding two physically motivated modifications to the original Madau et al. equation.
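    The truncated Schechter form discussed above is easy to write down directly. In magnitudes, φ(M) = 0.4·ln(10)·φ*·x^(α+1)·e^(−x) with x = 10^(0.4(M*−M)), cut off faintward of M_cut. The parameter values below are placeholder numbers for illustration, not the fitted values from the simulations.

    ```python
    import math

    def schechter_mag(M, phi_star, M_star, alpha):
        """Schechter luminosity function in absolute magnitude:
        phi(M) = 0.4*ln(10)*phi* * x^(alpha+1) * exp(-x), x = 10^(0.4*(M*-M))."""
        x = 10.0 ** (0.4 * (M_star - M))
        return 0.4 * math.log(10.0) * phi_star * x ** (alpha + 1) * math.exp(-x)

    def schechter_with_cut(M, phi_star, M_star, alpha, M_cut=-13.0):
        """Schechter form truncated at a faint-end cut: galaxies fainter than
        M_cut (i.e. M > M_cut) do not contribute."""
        return 0.0 if M > M_cut else schechter_mag(M, phi_star, M_star, alpha)

    # Illustrative z ~ 6 parameters (assumed values, for demonstration only)
    phi = schechter_with_cut(-15.0, phi_star=1.9e-3, M_star=-20.9, alpha=-1.9)
    ```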

  16. Cosmic reionization on computers: The faint end of the galaxy luminosity function

    DOE PAGES

    Gnedin, Nickolay Y.

    2016-07-01

    Using numerical cosmological simulations completed under the “Cosmic Reionization On Computers” project, I explore theoretical predictions for the faint end of the galaxy UV luminosity functions at z ≳ 6. A commonly used Schechter function approximation with the magnitude cut at M_cut ~ -13 provides a reasonable fit to the actual luminosity function of simulated galaxies. When the Schechter functional form is forced on the luminosity functions from the simulations, the magnitude cut M_cut is found to vary between -12 and -14 with a mild redshift dependence. Here, an analytical model of reionization from Madau et al., as used by Robertson et al., provides a good description of the simulated results, which can be improved even further by adding two physically motivated modifications to the original Madau et al. equation.

  17. Towards a fully automated computation of RG functions for the three-dimensional O(N) vector model: parametrizing amplitudes

    NASA Astrophysics Data System (ADS)

    Guida, Riccardo; Ribeca, Paolo

    2006-02-01

    Within the framework of field-theoretical description of second-order phase transitions via the three-dimensional O(N) vector model, accurate predictions for critical exponents can be obtained from (resummation of) the perturbative series of renormalization-group functions, which are in turn derived—following Parisi's approach—from the expansions of appropriate field correlators evaluated at zero external momenta. Such a technique was fully exploited 30 years ago in two seminal works of Baker, Nickel, Green and Meiron, which led to the knowledge of the β-function up to the six-loop level; they succeeded in obtaining a precise numerical evaluation of all needed Feynman amplitudes in momentum space by lowering the dimensionalities of each integration with a cleverly arranged set of computational simplifications. In fact, extending this computation is not straightforward, due both to the factorial proliferation of relevant diagrams and to the increasing dimensionality of their associated integrals; in any case, this task can reasonably be carried out only in the framework of an automated environment. On the road towards the creation of such an environment, we here show how a strategy closely inspired by that of Nickel and co-workers can be stated in algorithmic form, and successfully implemented on a computer. As an application, we plot the minimized distributions of residual integrations for the sets of diagrams needed to obtain RG functions to the full seven-loop level; they represent a good evaluation of the computational effort which will be required to improve the currently available estimates of critical exponents.

  18. Computing diffuse reflection from particulate planetary surface with a new function.

    PubMed

    Wolff, M

    1981-07-15

    An equation is derived to compute the amount of diffuse light reflected by a particulate surface such as on Mars or an asteroid. The method traces the paths of rays within an ensemble of randomly shaped grains and finds the eventual probability of emission. The amount of diffuse, unpolarized emitted light is obtained in terms of the real index of refraction, the imaginary index, and the average diameter of particles making up the surface. The equation is used to compute the empirical rule for obtaining the planetary albedo from the slope of its polarization curve. Accuracy of the equation, estimated at +/-4%, seems justified because of quantitative agreement with experimental measures of the empirical rule. It is also shown that the equation can be applied to bubble-enclosing surfaces such as volcanic foams. Results for the indices of the moon, Mars, Io, and Europa are obtained and compared with other data.

  19. Using Speech Recognition to Enhance the Tongue Drive System Functionality in Computer Access

    PubMed Central

    Huo, Xueliang; Ghovanloo, Maysam

    2013-01-01

    Tongue Drive System (TDS) is a wireless tongue operated assistive technology (AT), which can enable people with severe physical disabilities to access computers and drive powered wheelchairs using their volitional tongue movements. TDS offers six discrete commands, simultaneously available to the users, for pointing and typing as a substitute for mouse and keyboard in computer access, respectively. To enhance the TDS performance in typing, we have added a microphone, an audio codec, and a wireless audio link to its readily available 3-axial magnetic sensor array, and combined it with a commercially available speech recognition software, the Dragon Naturally Speaking, which is regarded as one of the most efficient ways for text entry. Our preliminary evaluations indicate that the combined TDS and speech recognition technologies can provide end users with significantly higher performance than using each technology alone, particularly in completing tasks that require both pointing and text entry, such as web surfing. PMID:22255801

  20. Substrate tunnels in enzymes: structure-function relationships and computational methodology.

    PubMed

    Kingsley, Laura J; Lill, Markus A

    2015-04-01

    In enzymes, the active site is the location where incoming substrates are chemically converted to products. In some enzymes, this site is deeply buried within the core of the protein, and, in order to access the active site, substrates must pass through the body of the protein via a tunnel. In many systems, these tunnels act as filters and have been found to influence both substrate specificity and catalytic mechanism. Identifying and understanding how these tunnels exert such control has been of growing interest over the past several years because of implications in fields such as protein engineering and drug design. This growing interest has spurred the development of several computational methods to identify and analyze tunnels and how ligands migrate through these tunnels. The goal of this review is to outline how tunnels influence substrate specificity and catalytic efficiency in enzymes with buried active sites and to provide a brief summary of the computational tools used to identify and evaluate these tunnels.

  1. Meta-Analysis of Diagnostic Performance of Coronary Computed Tomography Angiography, Computed Tomography Perfusion, and Computed Tomography-Fractional Flow Reserve in Functional Myocardial Ischemia Assessment Versus Invasive Fractional Flow Reserve.

    PubMed

    Gonzalez, Jorge A; Lipinski, Michael J; Flors, Lucia; Shaw, Peter W; Kramer, Christopher M; Salerno, Michael

    2015-11-01

    We sought to compare the diagnostic performance of coronary computed tomography angiography (CCTA), computed tomography perfusion (CTP), and computed tomography (CT)-fractional flow reserve (FFR) for assessing the functional significance of coronary stenosis as defined by invasive FFR in patients with known or suspected coronary artery disease (CAD). CCTA has proved clinically useful for excluding obstructive CAD because of its high sensitivity and negative predictive value (NPV); however, the ability of CTA to identify functionally significant CAD has remained challenging. We searched PubMed/Medline for studies evaluating CCTA, CTP, or CT-FFR for the noninvasive detection of obstructive CAD compared with catheter-derived FFR as the reference standard. Pooled sensitivity, specificity, PPV, NPV, likelihood ratios, and odds ratio of all diagnostic tests were assessed. Eighteen studies involving a total of 1,535 patients were included. CTA demonstrated a pooled sensitivity of 0.92, specificity 0.43, PPV of 0.56, and NPV of 0.87 on a per-patient level. CT-FFR and CTP increased the specificity to 0.72 and 0.77, respectively (p = 0.004 and p = 0.0009) resulting in higher point estimates for PPV 0.70 and 0.83, respectively. There was no improvement in the sensitivity. The CTP protocol involved more radiation (3.5 mSv CCTA vs 9.6 mSv CTP) and a higher volume of iodinated contrast (145 ml). In conclusion, CTP and CT-FFR improve the specificity of CCTA for detecting functionally significant stenosis as defined by invasive FFR on a per-patient level; both techniques could advance the ability to noninvasively detect the functional significance of coronary lesions.
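    The pooled PPV and NPV quoted above follow from sensitivity, specificity, and disease prevalence via Bayes' rule. The sketch below checks that relationship; the ~44% prevalence is a back-calculated illustration consistent with the quoted numbers, not a figure reported in the abstract.

    ```python
    def predictive_values(sens, spec, prevalence):
        """Bayes' rule for post-test probabilities from sensitivity/specificity."""
        tp = sens * prevalence                 # true positives
        fp = (1.0 - spec) * (1.0 - prevalence) # false positives
        fn = (1.0 - sens) * prevalence         # false negatives
        tn = spec * (1.0 - prevalence)         # true negatives
        return tp / (tp + fp), tn / (tn + fn)  # (PPV, NPV)

    # Pooled per-patient CCTA estimates from the meta-analysis (sens 0.92,
    # spec 0.43): at an assumed ischemia prevalence of ~44%, Bayes' rule
    # reproduces the reported PPV ~0.56 and NPV ~0.87.
    ppv, npv = predictive_values(0.92, 0.43, 0.44)
    ```

    The same function shows how the higher specificities of CT-FFR (0.72) and CTP (0.77) translate into the higher point estimates for PPV at a fixed prevalence.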

  2. Noncovalent functionalization of single-walled carbon nanotubes by aromatic diisocyanate molecules: A computational study

    NASA Astrophysics Data System (ADS)

    Goclon, Jakub; Kozlowska, Mariana; Rodziewicz, Pawel

    2014-04-01

    We investigate the noncovalent functionalization of metallic single-walled carbon nanotubes (SWCNT) (6,0) by 4,4′-methylene diphenyl diisocyanate (MDI) and toluene-2,4-diisocyanate (TDI) molecules using the density functional theory (DFT) method with van der Waals dispersion correction. The obtained local minima reveal the dependence of the binding energies on the molecular arrangement of the adsorbates on the SWCNT surface. We analyze the interplay between the π-π stacking interactions and the isocyanate functional groups. To analyze the changes in the electronic structure, we calculate the density of states (DOS) and charge density plots.
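    Binding energies of this kind are conventionally obtained as total-energy differences between the adsorbed complex and its isolated fragments. A minimal sketch with hypothetical energy values; the sign convention (negative = favorable adsorption) is an assumption, since conventions vary between papers.

    ```python
    def binding_energy(e_complex, e_surface, e_molecule):
        """Adsorption binding energy: E_b = E(SWCNT+molecule) - E(SWCNT) - E(molecule).
        Under the convention used here, negative values indicate favorable
        (exothermic) adsorption."""
        return e_complex - e_surface - e_molecule

    # Hypothetical DFT total energies in eV (illustrative numbers only)
    print(round(binding_energy(-1001.30, -950.10, -50.40), 2))  # -> -0.8
    ```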

  3. Use of 4-Dimensional Computed Tomography-Based Ventilation Imaging to Correlate Lung Dose and Function With Clinical Outcomes

    SciTech Connect

    Vinogradskiy, Yevgeniy; Castillo, Richard; Castillo, Edward; Tucker, Susan L.; Liao, Zhongxing; Guerrero, Thomas; Martel, Mary K.

    2013-06-01

    Purpose: Four-dimensional computed tomography (4DCT)-based ventilation is an emerging imaging modality that can be used in the thoracic treatment planning process. The clinical benefit of using ventilation images in radiation treatment plans remains to be tested. The purpose of the current work was to test the potential benefit of using ventilation in treatment planning by evaluating whether dose to highly ventilated regions of the lung resulted in increased incidence of clinical toxicity. Methods and Materials: Pretreatment 4DCT data were used to compute pretreatment ventilation images for 96 lung cancer patients. Ventilation images were calculated using 4DCT data, deformable image registration, and a density-change based algorithm. Dose–volume and ventilation-based dose–function metrics were computed for each patient. The ability of the dose–volume and ventilation-based dose–function metrics to predict for severe (grade 3+) radiation pneumonitis was assessed using logistic regression analysis, area under the curve (AUC) metrics, and bootstrap methods. Results: A specific patient example is presented that demonstrates how incorporating ventilation-based functional information can help separate patients with and without toxicity. The logistic regression significance values were all lower for the dose–function metrics (range, P=.093-.250) than for their dose–volume equivalents (range, P=.331-.580). The AUC values were all greater for the dose–function metrics (range, 0.569-0.620) than for their dose–volume equivalents (range, 0.500-0.544). Bootstrap results revealed an improvement in model fit using dose–function metrics compared to dose–volume metrics that approached significance (range, P=.118-.155). Conclusions: To our knowledge, this is the first study that attempts to correlate lung dose and 4DCT ventilation-based function to thoracic toxicity after radiation therapy. Although the results were not significant at the .05 level, our data suggests
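    The dose–function metrics referenced above weight dose coverage by ventilation rather than by volume alone. The sketch below contrasts a standard dose-volume metric (V20, fraction of lung volume receiving at least 20 Gy) with an assumed ventilation-weighted analogue (fV20); the exact weighting used in the paper may differ, so this form is our illustrative assumption.

    ```python
    def v_x(dose, threshold):
        """Dose-volume metric: fraction of voxels receiving >= threshold (Gy)."""
        return sum(d >= threshold for d in dose) / len(dose)

    def fv_x(dose, ventilation, threshold):
        """Assumed ventilation-weighted dose-function metric: fraction of total
        ventilation contained in voxels receiving >= threshold (Gy)."""
        total = sum(ventilation)
        hot = sum(v for d, v in zip(dose, ventilation) if d >= threshold)
        return hot / total

    # Toy 6-voxel lung: V20 says half the lung is irradiated, but the
    # ventilation weighting reveals that 75% of the function is in the field.
    dose = [25, 30, 22, 5, 8, 3]                  # Gy
    vent = [0.30, 0.25, 0.20, 0.10, 0.10, 0.05]   # relative ventilation
    print(v_x(dose, 20), round(fv_x(dose, vent, 20), 2))  # -> 0.5 0.75
    ```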

  4. Explicit Hilbert-space representations of atomic and molecular photoabsorption spectra - Computational studies of Stieltjes-Tchebycheff functions

    NASA Technical Reports Server (NTRS)

    Hermann, M. R.; Langhoff, P. W.

    1983-01-01

    Computational methods are reported for construction of discrete and continuum Schroedinger states in atoms and molecules employing explicit Hilbert space procedures familiar from bound state studies. As a theoretical development, the Schroedinger problem of interest is described, the Cauchy-Lanczos bases and orthonormal polynomials used in constructing L-squared Stieltjes-Tchebycheff (ST) approximations to the discrete and continuum states are defined, and certain properties of these functions are indicated. Advantages and limitations of the ST approach to spectral studies relative to more conventional calculations are discussed, and aspects of the approach in single-channel approximations to larger molecules are described. Procedures are indicated for construction of photoejection anisotropies and for performing coupled-channel calculations employing the ST formalism. Finally, explicit descriptive intercomparisons are made of the nature and diagnostic value of ST functions with more conventional scattering functions.

  5. Development of microgravity, full body functional reach envelope using 3-D computer graphic models and virtual reality technology

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1994-01-01

    In microgravity conditions mobility is greatly enhanced and body stability is difficult to achieve. Because of these difficulties, optimum placement and accessibility of objects and controls can be critical to required tasks on board shuttle flights or on the proposed space station. Anthropometric measurements of the maximum reach of occupants of a microgravity environment provide knowledge about maximum functional placement for tasking situations. Calculations for a full body, functional reach envelope for microgravity environments are imperative. To this end, three dimensional computer modeled human figures, providing a method of anthropometric measurement, were used to locate the data points that define the full body, functional reach envelope. Virtual reality technology was utilized to enable an occupant of the microgravity environment to experience movement within the reach envelope while immersed in a simulated microgravity environment.

  6. Parallel computers

    SciTech Connect

    Treveaven, P.

    1989-01-01

    This book presents an introduction to object-oriented, functional, and logic parallel computing on which the fifth generation of computer systems will be based. Coverage includes concepts for parallel computing languages, a parallel object-oriented system (DOOM) and its language (POOL), an object-oriented multilevel VLSI simulator using POOL, and implementation of lazy functional languages on parallel architectures.

  7. When can Empirical Green Functions be computed from Noise Cross-Correlations? Hints from different Geographical and Tectonic environments

    NASA Astrophysics Data System (ADS)

    Matos, Catarina; Silveira, Graça; Custódio, Susana; Domingues, Ana; Dias, Nuno; Fonseca, João F. B.; Matias, Luís; Krueger, Frank; Carrilho, Fernando

    2014-05-01

    Noise cross-correlations are now widely used to extract Green functions between station pairs. But, do all the cross-correlations routinely computed produce successful Green Functions? What is the relationship between noise recorded in a couple of stations and the cross-correlation between them? During the last decade, we have been involved in the deployment of several temporary dense broadband (BB) networks within the scope of both national projects and international collaborations. From 2000 to 2002, a pool of 8 BB stations continuously operated in the Azores in the scope of the Memorandum of Understanding COSEA (COordinated Seismic Experiment in the Azores). Thanks to the Project WILAS (West Iberia Lithosphere and Astenosphere Structure, PTDC/CTE-GIX/097946/2008) we temporarily increased the number of BB deployed in mainland Portugal to more than 50 (permanent + temporary) during the period 2010 - 2012. In 2011/12 a temporary pool of 12 seismometers continuously recorded BB data in the Madeira archipelago, as part of the DOCTAR (Deep Ocean Test Array Experiment) project. Project CV-PLUME (Investigation on the geometry and deep signature of the Cape Verde mantle plume, PTDC/CTE-GIN/64330/2006) covered the archipelago of Cape Verde, North Atlantic, with 40 temporary BB stations in 2007/08. Project MOZART (Mozambique African Rift Tomography, PTDC/CTE-GIX/103249/2008), covered Mozambique, East Africa, with 30 temporary BB stations in the period 2011 - 2013. These networks, located in very distinct geographical and tectonic environments, offer an interesting opportunity to study seasonal and spatial variations of noise sources and their impact on Empirical Green functions computed from noise cross-correlation. Seismic noise recorded at different seismic stations is evaluated by computation of the probability density functions of power spectral density (PSD) of continuous data. 
To assess seasonal variations of ambient noise sources in frequency content, time-series of
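
    As a schematic of the workflow the record above describes, the sketch below cross-correlates synthetic noise from two hypothetical stations, applying simple spectral whitening before correlation and recovering the inter-station travel-time delay from the peak of the stacked correlation. All station names, lengths, and parameters are illustrative, not those of the projects cited.

    ```python
    import numpy as np

    def noise_cross_correlation(trace_a, trace_b, max_lag):
        """Cross-correlate two continuous noise records; stacked over many
        windows, this converges toward the inter-station Green function
        (illustrative single-window sketch)."""
        n = len(trace_a)
        fa = np.fft.rfft(trace_a)
        fb = np.fft.rfft(trace_b)
        # spectral whitening: keep only phase, suppressing dominant sources
        fa /= np.abs(fa) + 1e-12
        fb /= np.abs(fb) + 1e-12
        cc = np.fft.irfft(fa * np.conj(fb), n)
        cc = np.roll(cc, max_lag)          # put zero lag at the center
        return cc[:2 * max_lag + 1]

    rng = np.random.default_rng(0)
    common = rng.standard_normal(4096)                 # shared wavefield
    a = common + 0.5 * rng.standard_normal(4096)       # "station A" record
    b = np.roll(common, 7) + 0.5 * rng.standard_normal(4096)  # "station B", 7-sample delay
    cc = noise_cross_correlation(a, b, max_lag=50)
    lag = np.argmax(np.abs(cc)) - 50                   # recovered delay (in samples)
    ```

    In practice the correlation would be stacked over months of day-long windows, which is exactly why the seasonal variation of noise sources studied in the record above matters.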

  8. A first principle approach using Maximally Localized Wannier Functions for computing and understanding elasto-optic response

    NASA Astrophysics Data System (ADS)

    Liang, Xin; Ismail-Beigi, Sohrab

    Strain-induced changes of optical properties are of use in the design and functioning of devices that couple photons and phonons. The elasto-optic (or photo-elastic) effect describes a general materials property where strain induces a change in the dielectric tensor. Despite a number of experimental and computational works, it is fair to say that a basic physical understanding of the effect and its materials dependence is lacking: e.g., we know of no materials design rule for enhancing or suppressing elasto-optic response. Based on our previous work, we find that a real space representation, as opposed to a k-space description, is a promising way to understand this effect. We have finished the development of a method of computing the dielectric and elasto-optic tensors using Maximally Localized Wannier Functions (MLWFs). By analyzing responses to uniaxial strain, we find that both tensors respond in a localized manner to the perturbation: the dominant optical transitions are between local electronic states on nearby bonds. We describe the method, the resulting physical picture and computed results for semiconductors. This work is supported by the National Science Foundation through Grant NSF DMR-1104974.
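
    The elasto-optic tensor described above relates a small strain to the change in the inverse dielectric tensor, p_ijkl = ∂(ε⁻¹)_ij/∂η_kl. A minimal finite-difference sketch of that definition, assuming a hypothetical `eps_of_strain` callable standing in for the first-principles (e.g., MLWF-based) dielectric calculation:

    ```python
    import numpy as np

    def elasto_optic_component(eps_of_strain, i, j, k, l, h=1e-4):
        """Central finite-difference estimate of p_ijkl = d(eps^-1)_ij / d(eta_kl).
        `eps_of_strain` maps a 3x3 strain tensor to a 3x3 dielectric tensor;
        in a real calculation it would come from first principles."""
        eta = np.zeros((3, 3))
        eta[k, l] = h
        inv_plus = np.linalg.inv(eps_of_strain(eta))
        inv_minus = np.linalg.inv(eps_of_strain(-eta))
        return (inv_plus[i, j] - inv_minus[i, j]) / (2 * h)

    # toy model (an assumption, not a real material): isotropic dielectric
    # whose magnitude shifts linearly with the trace of the strain
    def toy_eps(eta):
        return (10.0 + 2.0 * np.trace(eta)) * np.eye(3)

    p_1111 = elasto_optic_component(toy_eps, 0, 0, 0, 0)
    ```

    For the toy model, (ε⁻¹)₁₁ = 1/(10 + 2η), so the derivative at zero strain is -2/100 = -0.02, which the finite difference reproduces.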

  9. An atomic orbital based real-time time-dependent density functional theory for computing electronic circular dichroism band spectra.

    PubMed

    Goings, Joshua J; Li, Xiaosong

    2016-06-21

    One of the challenges of interpreting electronic circular dichroism (ECD) band spectra is that different states may have different rotatory strength signs, determined by their absolute configuration. If the states are closely spaced and opposite in sign, observed transitions may be washed out by nearby states, unlike absorption spectra, where transitions are always positive and additive. To accurately compute ECD bands, it is necessary to compute a large number of excited states, which may be prohibitively costly if one uses the linear-response time-dependent density functional theory (TDDFT) framework. Here we implement a real-time, atomic-orbital-based TDDFT method for computing the entire ECD spectrum simultaneously. The method is advantageous for large systems with a high density of states. In contrast to previous implementations based on real-space grids, the method is variational, independent of nuclear orientation, and does not rely on pseudopotential approximations, making it suitable for computation of chiroptical properties well into the X-ray regime.
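
    A real-time approach of the kind described above obtains the whole band spectrum from one propagation: after a delta-function electric "kick", the Fourier transform of the induced magnetic dipole carries the rotatory response at every frequency at once. The sketch below is schematic in its normalization (a production code uses the full Rosenfeld tensor and the actual propagated dipoles); the toy two-state signal and all parameters are assumptions, illustrating how closely spaced states of opposite sign appear in one spectrum.

    ```python
    import numpy as np

    def ecd_spectrum(times, mag_dipole, kick_strength, damping=0.005):
        """Schematic extraction of an ECD band shape from a real-time signal:
        damp the induced magnetic dipole (-> Lorentzian bands), Fourier
        transform, and scale by frequency. Normalization is illustrative."""
        dt = times[1] - times[0]
        window = np.exp(-damping * times)
        signal = mag_dipole * window
        freqs = np.fft.rfftfreq(len(times), dt) * 2 * np.pi
        beta = np.fft.rfft(signal) * dt / kick_strength
        return freqs, freqs * beta.imag       # R(omega) ~ omega * Im beta(omega)

    # toy propagated signal: two states with opposite rotatory signs
    t = np.arange(0.0, 2000.0, 0.1)
    m = 0.02 * np.sin(0.5 * t) - 0.015 * np.sin(0.8 * t)
    w, spec = ecd_spectrum(t, m, kick_strength=1e-3)
    ```

    The resulting spectrum shows one band near ω = 0.5 and one of opposite sign near ω = 0.8, the washing-out scenario the abstract describes when such bands overlap.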

  10. An atomic orbital based real-time time-dependent density functional theory for computing electronic circular dichroism band spectra

    NASA Astrophysics Data System (ADS)

    Goings, Joshua J.; Li, Xiaosong

    2016-06-01

    One of the challenges of interpreting electronic circular dichroism (ECD) band spectra is that different states may have different rotatory strength signs, determined by their absolute configuration. If the states are closely spaced and opposite in sign, observed transitions may be washed out by nearby states, unlike absorption spectra, where transitions are always positive and additive. To accurately compute ECD bands, it is necessary to compute a large number of excited states, which may be prohibitively costly if one uses the linear-response time-dependent density functional theory (TDDFT) framework. Here we implement a real-time, atomic-orbital-based TDDFT method for computing the entire ECD spectrum simultaneously. The method is advantageous for large systems with a high density of states. In contrast to previous implementations based on real-space grids, the method is variational, independent of nuclear orientation, and does not rely on pseudopotential approximations, making it suitable for computation of chiroptical properties well into the X-ray regime.

  11. WAPA Daily Energy Accounting Activities

    1990-10-01

    ISA (Interchange, Scheduling, & Accounting) is the interchange scheduling system used by the DOE Western Area Power Administration to perform energy accounting functions associated with the daily activities of the Watertown Operations Office (WOO). The system's primary role is to provide accounting functions for scheduled energy which is exchanged with other power companies and power operating organizations. The system has a secondary role of providing a historical record of all scheduled interchange transactions. The following major functions are performed by ISA: scheduled energy accounting for received and delivered energy; generation scheduling accounting for both fossil and hydro-electric power plants; metered energy accounting for received and delivered totals; energy accounting for Direct Current (D.C.) Ties; regulation accounting; automatic generation control set calculations; accounting summaries for Basin, Heartland Consumers Power District, and the Missouri Basin Municipal Power Agency; calculation of estimated generation for the Laramie River Station plant; daily and monthly reports; and dual control areas.
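
    The core of the scheduled-versus-metered accounting described above can be sketched as a small ledger. The record layout and counterparty names below are illustrative, not WAPA's actual ISA schema; the "inadvertent interchange" (metered minus scheduled) is the quantity such a system must track and settle.

    ```python
    from dataclasses import dataclass

    @dataclass
    class InterchangeRecord:
        """One interchange entry (hypothetical layout). Positive MWh means
        energy received; negative means energy delivered."""
        counterparty: str
        scheduled_mwh: float   # energy scheduled with the other utility
        metered_mwh: float     # energy actually measured at the tie point

    def daily_summary(records):
        """Totals of received and delivered energy plus the inadvertent
        interchange (metered minus scheduled) across all ties."""
        received = sum(r.metered_mwh for r in records if r.metered_mwh > 0)
        delivered = sum(-r.metered_mwh for r in records if r.metered_mwh < 0)
        inadvertent = sum(r.metered_mwh - r.scheduled_mwh for r in records)
        return {"received": received, "delivered": delivered,
                "inadvertent": inadvertent}

    day = [
        InterchangeRecord("Basin", 100.0, 98.5),
        InterchangeRecord("Heartland", -40.0, -41.0),
    ]
    summary = daily_summary(day)
    ```

    Keeping every record also serves the system's secondary role noted above: a historical archive of all scheduled interchange transactions.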

  12. MATERIAL CONTROL ACCOUNTING INMM

    SciTech Connect

    Hasty, T.

    2009-06-14

    Since 1996, the Mining and Chemical Combine (MCC - formerly known as K-26) and the United States Department of Energy (DOE) have been cooperating under the Nuclear Material Protection, Control and Accounting (MPC&A) Program between the Russian Federation and U.S. Governments. Since MCC continues to operate a reactor for steam and electricity production for the site and the city of Zheleznogorsk, which results in the production of weapons-grade plutonium, one of the goals of the MPC&A program is to support implementation of an expanded, comprehensive nuclear material control and accounting (MC&A) program. To date, MCC has completed the upgrades identified in the initial gap analysis and documented in the site MC&A Plan, and is implementing additional upgrades identified during an update to the gap analysis. The scope of these upgrades includes implementation of the MCC organizational structure relating to MC&A, establishment of a material balance area structure for special nuclear material (SNM) storage and bulk processing areas, and material control functions including SNM portal monitors at target locations. Material accounting function upgrades include enhancements in the conduct of physical inventories, limit-of-error inventory difference procedure enhancements, implementation of a basic computerized accounting system for four SNM storage areas, implementation of measurement equipment for improved accountability reporting, and both new and revised site-level MC&A procedures. This paper discusses the implementation of MC&A upgrades at MCC based on the requirements established in the comprehensive MC&A plan developed by the Mining and Chemical Combine as part of the MPC&A Program.
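
    The bookkeeping at the heart of the material accounting upgrades above is the material balance for one material balance area. A minimal sketch of the standard inventory-difference calculation (illustrative only, not a site procedure; quantities and tolerances are assumptions):

    ```python
    def inventory_difference(beginning, receipts, removals, ending_physical):
        """Material balance for one area over one balance period:
        ID = (beginning book inventory + receipts - removals)
             - measured ending physical inventory.
        A small ID within measurement uncertainty is expected; a large ID
        triggers investigation."""
        book_ending = beginning + receipts - removals
        return book_ending - ending_physical

    # hypothetical balance period, quantities in kg of nuclear material
    id_kg = inventory_difference(beginning=120.0, receipts=15.0,
                                 removals=10.0, ending_physical=124.8)
    ```

    Here the book inventory is 120 + 15 - 10 = 125 kg against a measured 124.8 kg, giving an inventory difference of 0.2 kg; a limit-of-error procedure, as mentioned in the record, would compare that difference against its propagated measurement uncertainty.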

  13. [Covalent chloramine inhibitors of blood platelet functions: computational indices for their reactivity and antiplatelet activity].

    PubMed

    Roshchupkin, D I; Murina, M A; Sergienko, V I

    2011-01-01

    Quantum-mechanical computations of the reactivities of chloramine derivatives of amino acids and taurine have been performed. A pair of computational indices that reflect the predisposition of alpha-amino acid chloramines to chemical decay has been identified. The first index is the dihedral angle for the chain of four atoms: the carbons at the beta and alpha positions, the carbon of the carboxyl group, and the carbonyl oxygen. The second index is the sum of the partial charges of the three or two carbon atoms in the chain. Amino acid chloramines with high values of these indices showed enhanced stability. Partial charges of the active chlorine in known chloramines of different structures have been computed; the charges correlate with the rate constants of the reaction between chloramines and the thiol group of reduced glutathione. New derivatives of taurine chloramines have been constructed by introducing different substituents into the chloramine moiety. Among them, the amidoderivatives had the greatest active-chlorine charges (0.19-0.23). A study of the reactions of N-acetyl-N-chlorotaurine and N-propionyl-N-chlorotaurine with amino acids and peptides possessing thiol, thioester, or disulphide groups showed that the amidoderivatives exhibit thiol chemoselectivity. N-Acetyl-N-chlorotaurine and N-propionyl-N-chlorotaurine suppress the aggregation activity of blood platelets upon activation by the agonists ADP and collagen. It cannot be excluded that the amidoderivatives studied prevent platelet aggregation by modifying a critical thiol group in the purine receptor P2Y12. PMID:22117450
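
    The first index above is a standard four-atom dihedral angle. A self-contained sketch of its computation with the usual atan2 formulation (the coordinates are a made-up planar example, not molecular data from the study):

    ```python
    import numpy as np

    def dihedral_angle(p0, p1, p2, p3):
        """Dihedral angle in degrees for a chain of four atoms, e.g. the
        beta-carbon / alpha-carbon / carboxyl-carbon / carbonyl-oxygen
        chain used as a stability index above."""
        b0 = p1 - p0
        b1 = p2 - p1
        b2 = p3 - p2
        n1 = np.cross(b0, b1)                     # normal of the first plane
        n2 = np.cross(b1, b2)                     # normal of the second plane
        m1 = np.cross(n1, b1 / np.linalg.norm(b1))
        x = np.dot(n1, n2)
        y = np.dot(m1, n2)
        return np.degrees(np.arctan2(y, x))

    # planar cis arrangement of four points -> dihedral of 0 degrees
    cis = dihedral_angle(np.array([0., 1., 0.]), np.array([0., 0., 0.]),
                         np.array([1., 0., 0.]), np.array([1., 1., 0.]))
    ```

    The atan2 form returns a signed angle in (-180°, 180°], which avoids the sign ambiguity of an arccos-based formula.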

  14. Using brain–computer interfaces to induce neural plasticity and restore function

    PubMed Central

    Grosse-Wentrup, Moritz; Mattia, Donatella; Oweiss, Karim

    2015-01-01

    Analyzing neural signals and providing feedback in real time is one of the core characteristics of a brain–computer interface (BCI). As this feature may be employed to induce neural plasticity, utilizing BCI technology for therapeutic purposes is increasingly gaining popularity in the BCI community. In this paper, we discuss the state-of-the-art of research on this topic, address the principles of and challenges in inducing neural plasticity by means of a BCI, and delineate the problems of study design and outcome evaluation arising in this context. We conclude with a list of open questions and recommendations for future research in this field. PMID:21436534

  15. Using brain-computer interfaces to induce neural plasticity and restore function

    NASA Astrophysics Data System (ADS)

    Grosse-Wentrup, Moritz; Mattia, Donatella; Oweiss, Karim

    2011-04-01

    Analyzing neural signals and providing feedback in real time is one of the core characteristics of a brain-computer interface (BCI). As this feature may be employed to induce neural plasticity, utilizing BCI technology for therapeutic purposes is increasingly gaining popularity in the BCI community. In this paper, we discuss the state-of-the-art of research on this topic, address the principles of and challenges in inducing neural plasticity by means of a BCI, and delineate the problems of study design and outcome evaluation arising in this context. We conclude with a list of open questions and recommendations for future research in this field.

  16. Accuracy and computational efficiency of real-time subspace propagation schemes for the time-dependent density functional theory

    NASA Astrophysics Data System (ADS)

    Russakoff, Arthur; Li, Yonghui; He, Shenglai; Varga, Kalman

    2016-05-01

    Time-dependent density functional theory (TDDFT) has become successful because of its balance of economy and accuracy. However, the application of TDDFT to large systems or long time scales remains prohibitively expensive. In this paper, we investigate the numerical stability and accuracy of two subspace propagation methods for solving the time-dependent Kohn-Sham equations with finite and periodic boundary conditions. The bases considered are the Lanczos basis and the adiabatic eigenbasis. The results are compared to a benchmark fourth-order Taylor expansion of the time propagator. Our results show that it is possible to use larger time steps with the subspace methods, leading to computational speedups by a factor of 2-3 over Taylor propagation. Accuracy is found to be maintained for certain energy regimes and small time scales.
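
    The two propagators compared above can be illustrated on a small Hermitian matrix standing in for the Kohn-Sham Hamiltonian: a fourth-order Taylor expansion of exp(-iHΔt), and a short-iterative-Lanczos step that exponentiates an m-dimensional tridiagonal projection of H exactly. The matrix, time step, and subspace size are assumptions for illustration, not the paper's settings.

    ```python
    import numpy as np

    def taylor4_step(H, psi, dt):
        """Fourth-order Taylor expansion of exp(-i H dt) applied to psi."""
        out = psi.copy()
        term = psi.copy()
        for n in range(1, 5):
            term = (-1j * dt / n) * (H @ term)   # accumulates (-i dt)^n H^n / n!
            out = out + term
        return out

    def lanczos_step(H, psi, dt, m=8):
        """Short-iterative-Lanczos propagator: build an m-dimensional
        Krylov subspace, project H onto it as a tridiagonal matrix T,
        and apply exp(-i T dt) exactly."""
        n = len(psi)
        V = np.zeros((n, m), dtype=complex)
        alpha = np.zeros(m)
        beta = np.zeros(m - 1)
        V[:, 0] = psi / np.linalg.norm(psi)
        for j in range(m):
            w = H @ V[:, j]
            alpha[j] = np.real(np.vdot(V[:, j], w))
            w = w - alpha[j] * V[:, j]
            if j > 0:
                w = w - beta[j - 1] * V[:, j - 1]
            if j < m - 1:
                beta[j] = np.linalg.norm(w)
                V[:, j + 1] = w / beta[j]
        T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
        evals, evecs = np.linalg.eigh(T)
        expT = evecs @ np.diag(np.exp(-1j * dt * evals)) @ evecs.conj().T
        return np.linalg.norm(psi) * (V @ expT[:, 0])

    rng = np.random.default_rng(1)
    A = rng.standard_normal((40, 40))
    H = (A + A.T) / 2                      # toy Hermitian "Hamiltonian"
    psi = rng.standard_normal(40) + 0j
    psi /= np.linalg.norm(psi)

    # exact reference via full eigendecomposition
    w_, U = np.linalg.eigh(H)
    exact = U @ (np.exp(-1j * 0.05 * w_) * (U.conj().T @ psi))
    err_taylor = np.linalg.norm(taylor4_step(H, psi, 0.05) - exact)
    err_lanczos = np.linalg.norm(lanczos_step(H, psi, 0.05) - exact)
    ```

    For the same time step the Lanczos step is markedly more accurate, which is consistent with the paper's observation that subspace methods tolerate larger time steps than Taylor propagation.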

  17. A Computationally Inexpensive Optimal Guidance via Radial-Basis-Function Neural Network for Autonomous Soft Landing on Asteroids.

    PubMed

    Zhang, Peng; Liu, Keping; Zhao, Bo; Li, Yuanchun

    2015-01-01

    Optimal guidance is essential for the soft-landing task. However, because of its high computational complexity, it is rarely applied in autonomous guidance. In this paper, a computationally inexpensive optimal guidance algorithm based on a radial basis function neural network (RBFNN) is proposed. The optimization problem for the soft-landing trajectory on asteroids is formulated and transformed into a two-point boundary value problem (TPBVP). Combining a database of initial states with the corresponding initial co-states, an RBFNN is trained offline. The optimal soft-landing trajectory is then determined rapidly by applying the trained network in online guidance. Monte Carlo simulations of soft landing on asteroid 433 Eros are performed to demonstrate the effectiveness of the proposed guidance algorithm. PMID:26367382
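
    The offline/online split described above can be sketched with a plain RBF network: the expensive step is solving many TPBVPs to build a database of (initial state, initial co-state) pairs; the cheap online step is a single matrix product. The target function below is a smooth stand-in for the co-state mapping, and all sizes and widths are assumptions, not the paper's values.

    ```python
    import numpy as np

    def rbf_features(X, centers, width):
        """Gaussian radial-basis features for each input row."""
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2 * width ** 2))

    def train_rbfnn(X, Y, centers, width):
        """Offline step: fit the linear output weights by least squares,
        mirroring the mapping from initial states to initial co-states."""
        Phi = rbf_features(X, centers, width)
        W, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
        return W

    rng = np.random.default_rng(2)
    X = rng.uniform(-1, 1, size=(200, 2))        # sampled "initial states"
    Y = np.sin(X[:, :1]) + X[:, 1:] ** 2         # smooth stand-in "co-states"
    centers = rng.uniform(-1, 1, size=(30, 2))
    W = train_rbfnn(X, Y, centers, width=0.5)

    # online step: evaluating the trained network is one matrix product
    x_new = np.array([[0.3, -0.2]])
    lam = rbf_features(x_new, centers, 0.5) @ W
    ```

    Because the online evaluation avoids solving the TPBVP, it is fast enough for closed-loop autonomous guidance, which is the point of the method.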

  18. Reverse energy partitioning-An efficient algorithm for computing the density of states, partition functions, and free energy of solids.

    PubMed

    Do, Hainam; Wheatley, Richard J

    2016-08-28

    A robust and model-free Monte Carlo simulation method is proposed to address the challenge in computing the classical density of states and partition function of solids. Starting from the minimum configurational energy, the algorithm partitions the entire energy range in the increasing energy direction ("upward") into subdivisions whose integrated density of states is known. When combined with the density of states computed from the "downward" energy partitioning approach [H. Do, J. D. Hirst, and R. J. Wheatley, J. Chem. Phys. 135, 174105 (2011)], the equilibrium thermodynamic properties can be evaluated at any temperature and in any phase. The method is illustrated in the context of the Lennard-Jones system and can readily be extended to other molecular systems and clusters for which the structures are known. PMID:27586913
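
    Once the density of states over the energy subdivisions is known, the partition function and free energy follow by direct summation, Z(β) = Σ g(E) e^{-βE} and F = -ln Z / β. A minimal sketch using the standard log-sum-exp trick for numerical stability (the two-level toy DOS is an assumption, not Lennard-Jones data):

    ```python
    import numpy as np

    def thermo_from_dos(energies, log_dos, beta):
        """Evaluate ln Z and the free energy F = -ln Z / beta from a
        (log) density of states over energy subdivisions, using
        log-sum-exp to avoid overflow when beta*E is large."""
        log_terms = log_dos - beta * energies
        m = log_terms.max()
        log_z = m + np.log(np.exp(log_terms - m).sum())
        free_energy = -log_z / beta
        return log_z, free_energy

    # toy DOS: degeneracy 1 at E = 0 and degeneracy 3 at E = 1
    E = np.array([0.0, 1.0])
    g = np.array([1.0, 3.0])
    log_z, F = thermo_from_dos(E, np.log(g), beta=1.0)
    ```

    This is why a stitched upward/downward density of states gives thermodynamics "at any temperature and in any phase": the temperature only enters through the Boltzmann weights applied at evaluation time, not through the simulation itself.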

  19. A Computationally Inexpensive Optimal Guidance via Radial-Basis-Function Neural Network for Autonomous Soft Landing on Asteroids.

    PubMed

    Zhang, Peng; Liu, Keping; Zhao, Bo; Li, Yuanchun

    2015-01-01

    Optimal guidance is essential for the soft-landing task. However, because of its high computational complexity, it is rarely applied in autonomous guidance. In this paper, a computationally inexpensive optimal guidance algorithm based on a radial basis function neural network (RBFNN) is proposed. The optimization problem for the soft-landing trajectory on asteroids is formulated and transformed into a two-point boundary value problem (TPBVP). Combining a database of initial states with the corresponding initial co-states, an RBFNN is trained offline. The optimal soft-landing trajectory is then determined rapidly by applying the trained network in online guidance. Monte Carlo simulations of soft landing on asteroid 433 Eros are performed to demonstrate the effectiveness of the proposed guidance algorithm.

  20. Reverse energy partitioning—An efficient algorithm for computing the density of states, partition functions, and free energy of solids

    NASA Astrophysics Data System (ADS)

    Do, Hainam; Wheatley, Richard J.

    2016-08-01

    A robust and model-free Monte Carlo simulation method is proposed to address the challenge in computing the classical density of states and partition function of solids. Starting from the minimum configurational energy, the algorithm partitions the entire energy range in the increasing energy direction ("upward") into subdivisions whose integrated density of states is known. When combined with the density of states computed from the "downward" energy partitioning approach [H. Do, J. D. Hirst, and R. J. Wheatley, J. Chem. Phys. 135, 174105 (2011)], the equilibrium thermodynamic properties can be evaluated at any temperature and in any phase. The method is illustrated in the context of the Lennard-Jones system and can readily be extended to other molecular systems and clusters for which the structures are known.