Science.gov

Sample records for accounting functions computer

  1. Accounting & Computing Curriculum Guide.

    ERIC Educational Resources Information Center

    Avani, Nathan T.; And Others

    This curriculum guide consists of materials for use in teaching a competency-based accounting and computing course that is designed to prepare students for employability in the following occupational areas: inventory control clerk, invoice clerk, payroll clerk, traffic clerk, general ledger bookkeeper, accounting clerk, account information clerk,…

  2. Teaching Accounting with Computers.

    ERIC Educational Resources Information Center

    Shaoul, Jean

    This paper addresses the numerous ways that computers may be used to enhance the teaching of accounting and business topics. It focuses on the pedagogical use of spreadsheet software to improve the conceptual coverage of accounting principles and practice, increase student understanding by involvement in the solution process, and reduce the amount…

  3. Rayleigh radiance computations for satellite remote sensing: accounting for the effect of sensor spectral response function.

    PubMed

    Wang, Menghua

    2016-05-30

    To understand and assess the effect of the sensor spectral response function (SRF) on the accuracy of the top of the atmosphere (TOA) Rayleigh-scattering radiance computation, new TOA Rayleigh radiance lookup tables (LUTs) over global oceans and inland waters have been generated. The new Rayleigh LUTs include spectral coverage of 335-2555 nm, all possible solar-sensor geometries, and surface wind speeds of 0-30 m/s. Using the new Rayleigh LUTs, the sensor SRF effect on the accuracy of the TOA Rayleigh radiance computation has been evaluated for spectral bands of the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (SNPP) satellite and the Joint Polar Satellite System (JPSS)-1, showing important uncertainties for VIIRS-SNPP, particularly for large solar- and/or sensor-zenith angles as well as for large Rayleigh optical thicknesses (i.e., short wavelengths) and bands with broad spectral bandwidths. To accurately account for the sensor SRF effect, a new correction algorithm has been developed for VIIRS spectral bands, which improves the TOA Rayleigh radiance accuracy to ~0.01% even for large solar-zenith angles of 70°-80°, compared with an error of ~0.7% without the correction for the VIIRS-SNPP 410 nm band. The same methodology that accounts for the sensor SRF effect on the Rayleigh radiance computation can be used for other satellite sensors. In addition, with the new Rayleigh LUTs, the effect of surface atmospheric pressure variation on the TOA Rayleigh radiance computation can be calculated precisely, and no specific atmospheric pressure correction algorithm is needed. There are some other important applications and advantages to using the new Rayleigh LUTs for satellite remote sensing, including efficient and accurate TOA Rayleigh radiance computation for hyperspectral satellite remote sensing, detector-based TOA Rayleigh radiance computation, and Rayleigh radiance calculations for high altitude…
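
The SRF correction evaluated in this record reduces, at its core, to band-averaging a spectral quantity over the sensor response. A minimal stdlib Python sketch of that band-averaging step (the Gaussian SRF and the lambda^-4 Rayleigh-like spectrum below are hypothetical stand-ins, not VIIRS instrument data):

```python
import math

def band_average(wavelengths, srf, radiance):
    """SRF-weighted mean: sum(SRF * L) / sum(SRF) on a common grid."""
    num = sum(s * L for s, L in zip(srf, radiance))
    return num / sum(srf)

# Hypothetical 410 nm band sampled every 0.5 nm with a Gaussian SRF.
grid = [400.0 + 0.5 * i for i in range(41)]            # 400-420 nm
srf = [math.exp(-0.5 * ((w - 410.0) / 8.0) ** 2) for w in grid]
rayleigh = [(410.0 / w) ** 4 for w in grid]            # ~lambda^-4 spectrum

print(band_average(grid, srf, rayleigh))
```

The band-averaged value differs from the band-center value because the lambda^-4 spectrum is convex across the band, which is one reason the abstract reports larger effects for broad-bandwidth, short-wavelength bands.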

  4. Vocational Accounting and Computing Programs.

    ERIC Educational Resources Information Center

    Avani, Nathan T.

    1986-01-01

    Describes an "Accounting and Computing" program in Michigan that emphasizes computerized accounting procedures. This article describes the program curriculum and duty areas (such as handling accounts receivable), presents a list of sample tasks in each duty area, and specifies components of each task. Computer equipment necessary for this program…

  5. Does the regulation of local excitation-inhibition balance aid in recovery of functional connectivity? A computational account.

    PubMed

    Vattikonda, Anirudh; Surampudi, Bapi Raju; Banerjee, Arpan; Deco, Gustavo; Roy, Dipanjan

    2016-08-01

    Computational modeling of the spontaneous dynamics over the whole brain provides critical insight into the spatiotemporal organization of brain dynamics at multiple resolutions and their alteration by changes in brain structure (e.g. in diseased states, aging, across individuals). Recent experimental evidence further suggests that the adverse effect of lesions is visible on spontaneous dynamics characterized by changes in resting state functional connectivity and its graph theoretical properties (e.g. modularity). These changes originate from altered neural dynamics in individual brain areas that are otherwise poised towards a homeostatic equilibrium to maintain a stable excitatory and inhibitory activity. In this work, we employ a homeostatic inhibitory mechanism, balancing excitation and inhibition in the local brain areas of the entire cortex under neurological impairments like lesions, to understand global functional recovery (across brain networks and individuals). Previous computational and empirical studies have demonstrated that the resting state functional connectivity varies primarily due to the location and specific topological characteristics of the lesion. We show that local homeostatic balance provides functional recovery by re-establishing excitation-inhibition balance in all areas affected by the lesion. We systematically compare the extent of recovery in the primary hub areas (e.g. default mode network (DMN), medial temporal lobe, medial prefrontal cortex) as well as in other areas such as the primary motor area, supplementary motor area, and the fronto-parietal and temporo-parietal networks. Our findings suggest that stability and richness similar to normal brain dynamics at rest are achievable by re-establishment of balance. PMID:27177761

  6. Integrating Computer Concepts into Principles of Accounting.

    ERIC Educational Resources Information Center

    Beck, Henry J.; Parrish, Roy James, Jr.

    A package of instructional materials for an undergraduate principles of accounting course at Danville Community College was developed based upon the following assumptions: (1) the principles of accounting student does not need to be able to write computer programs; (2) computerized accounting concepts should be presented in this course; (3)…

  7. Space shuttle configuration accounting functional design specification

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An analysis is presented of the requirements for an on-line automated system which must be capable of tracking the status of requirements and engineering changes and of providing accurate and timely records. The functional design specification provides the definition, description, and character length of the required data elements and the interrelationship of data elements to adequately track, display, and report the status of active configuration changes. As changes to the space shuttle program levels II and III configuration are proposed, evaluated, and dispositioned, it is the function of the configuration management office to maintain records regarding changes to the baseline and to track and report the status of those changes. The configuration accounting system will consist of a combination of computers, computer terminals, software, and procedures, all of which are designed to store, retrieve, display, and process information required to track proposed and approved engineering changes to maintain baseline documentation of the space shuttle program levels II and III.

  8. Symbolic functions from neural computation.

    PubMed

    Smolensky, Paul

    2012-07-28

    Is thought computation over ideas? Turing, and many cognitive scientists since, have assumed so, and formulated computational systems in which meaningful concepts are encoded by symbols which are the objects of computation. Cognition has been carved into parts, each a function defined over such symbols. This paper reports on a research program aimed at computing these symbolic functions without computing over the symbols. Symbols are encoded as patterns of numerical activation over multiple abstract neurons, each neuron simultaneously contributing to the encoding of multiple symbols. Computation is carried out over the numerical activation values of such neurons, which individually have no conceptual meaning. This is massively parallel numerical computation operating within a continuous computational medium. The paper presents an axiomatic framework for such a computational account of cognition, including a number of formal results. Within the framework, a class of recursive symbolic functions can be computed. Formal languages defined by symbolic rewrite rules can also be specified, the subsymbolic computations producing symbolic outputs that simultaneously display central properties of both facets of human language: universal symbolic grammatical competence and statistical, imperfect performance. PMID:22711873
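
The encoding scheme this abstract describes can be caricatured in a few lines of stdlib Python. The random vectors, dot-product decoding, and numerical operation below are a toy illustration of "symbols as distributed activation patterns", not Smolensky's actual tensor-product formalism:

```python
import random

random.seed(0)
DIM = 64
symbols = ["A", "B", "C"]
# Each symbol -> a distributed pattern over DIM abstract "neurons";
# every neuron takes part in encoding every symbol.
enc = {s: [random.gauss(0, 1) for _ in range(DIM)] for s in symbols}

def nearest(pattern):
    """Decode a pattern to the symbol with the largest dot product."""
    return max(symbols,
               key=lambda s: sum(a * b for a, b in zip(enc[s], pattern)))

# A purely numerical operation on the activations (scaling plus noise)
# still leaves the encoded symbol recoverable.
noisy = [2.0 * x + random.gauss(0, 0.1) for x in enc["B"]]
print(nearest(noisy))
```

No individual coordinate carries conceptual meaning; only the whole pattern does, which is the point the abstract makes about neurons that "individually have no conceptual meaning".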

  9. Assessment of the Accounting and Joint Accounting/Computer Information Systems Programs.

    ERIC Educational Resources Information Center

    Appiah, John; Cernigliaro, James; Davis, Jeffrey; Gordon, Millicent; Richards, Yves; Santamaria, Fernando; Siegel, Annette; Lytle, Namy; Wharton, Patrick

    This document presents the City University of New York LaGuardia Community College Department of Accounting and Managerial Studies' assessment report on its accounting and joint accounting/computer information systems programs, and includes the following items: (1) description of the mission and goals of the Department of Accounting and Managerial…

  10. Computational complexity of Boolean functions

    NASA Astrophysics Data System (ADS)

    Korshunov, Aleksei D.

    2012-02-01

    Boolean functions are among the fundamental objects of discrete mathematics, especially in those of its subdisciplines which fall under mathematical logic and mathematical cybernetics. The language of Boolean functions is convenient for describing the operation of many discrete systems such as contact networks, Boolean circuits, branching programs, and some others. An important parameter of discrete systems of this kind is their complexity. This characteristic has been actively investigated starting from Shannon's works. There is a large body of scientific literature presenting many fundamental results. The purpose of this survey is to give an account of the main results over the last sixty years related to the complexity of computation (realization) of Boolean functions by contact networks, Boolean circuits, and Boolean circuits without branching. Bibliography: 165 titles.

  11. A Computational Account of Bilingual Aphasia Rehabilitation

    ERIC Educational Resources Information Center

    Kiran, Swathi; Grasemann, Uli; Sandberg, Chaleece; Miikkulainen, Risto

    2013-01-01

    Current research on bilingual aphasia highlights the paucity in recommendations for optimal rehabilitation for bilingual aphasic patients (Edmonds & Kiran, 2006; Roberts & Kiran, 2007). In this paper, we have developed a computational model to simulate an English-Spanish bilingual language system in which language representations can vary by age…

  12. Computer-Based Instruction in Accounting Using the CREATE System.

    ERIC Educational Resources Information Center

    Henkle, Edward B.; Robertson, Kenneth W.

    The Graduate Logistics program of the United States Air Force (USAF) Institute of Technology has required that prospective students show a satisfactory level of competence in basic accounting procedures before entering the program. The purpose of this thesis was to develop accounting case problems for use with the CREATE computer system that would…

  13. Network Coding for Function Computation

    ERIC Educational Resources Information Center

    Appuswamy, Rathinakumar

    2011-01-01

    In this dissertation, the following "network computing problem" is considered. Source nodes in a directed acyclic network generate independent messages and a single receiver node computes a target function f of the messages. The objective is to maximize the average number of times f can be computed per network usage, i.e., the "computing…

  14. Program Computes Thermodynamic Functions

    NASA Technical Reports Server (NTRS)

    Mcbride, Bonnie J.; Gordon, Sanford

    1994-01-01

    PAC91 is the latest program in the PAC (Properties and Coefficients) series. Its two principal features are (1) generation of theoretical thermodynamic functions from molecular constants and (2) least-squares fitting of these functions to empirical equations. PAC91 is written in FORTRAN 77 to be machine-independent.
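
The second feature, least-squares fitting of thermodynamic functions to empirical equations, can be sketched as follows. The NASA-style polynomial form Cp/R ~ a1 + a2*tau + a3*tau^2 in reduced temperature tau = T/1000 and the synthetic data are illustrative assumptions, not PAC91 code:

```python
def lstsq_poly(ts, ys, degree):
    """Least-squares polynomial fit via the normal equations, solved by
    Gaussian elimination with partial pivoting (stdlib only)."""
    n = degree + 1
    ata = [[sum(t ** (i + j) for t in ts) for j in range(n)] for i in range(n)]
    aty = [sum(y * t ** i for t, y in zip(ts, ys)) for i in range(n)]
    for col in range(n):                       # forward elimination
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    coeffs = [0.0] * n                         # back substitution
    for r in range(n - 1, -1, -1):
        s = aty[r] - sum(ata[r][c] * coeffs[c] for c in range(r + 1, n))
        coeffs[r] = s / ata[r][r]
    return coeffs

# Synthetic Cp/R data from known coefficients in reduced temperature
# tau = T/1000 (scaling keeps the normal equations well conditioned);
# the fit should recover the coefficients.
true = [3.5, 1.0, -0.2]
ts = [0.30 + 0.05 * k for k in range(20)]
ys = [true[0] + true[1] * t + true[2] * t * t for t in ts]
print(lstsq_poly(ts, ys, 2))
```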

  15. Common Accounting System for Monitoring the ATLAS Distributed Computing Resources

    NASA Astrophysics Data System (ADS)

    Karavakis, E.; Andreeva, J.; Campana, S.; Gayazov, S.; Jezequel, S.; Saiz, P.; Sargsyan, L.; Schovancova, J.; Ueda, I.; Atlas Collaboration

    2014-06-01

    This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources, either generic or ATLAS-specific. This set of tools provides high-quality, scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.

  16. PC-DYMAC: Personal Computer---DYnamic Materials ACcounting

    SciTech Connect

    Jackson, B.G.

    1989-11-01

    This manual was designed to provide complete documentation for the computer system used by the EBR-II Fuels and Materials Department, Argonne National Laboratory-West (ANL-W) for accountability of special nuclear materials (SNM). This document includes background information on the operation of the Fuel Manufacturing Facility (FMF), instructions on computer operations in correlation with production and a detailed manual for DYMAC operation. 60 figs.

  17. Computers Can Help Student Retention in Introductory College Accounting.

    ERIC Educational Resources Information Center

    Price, Richard L.; Murvin, Harry J.

    1992-01-01

    Almost all students in a study of an integrated instructional approach indicated that using a computer and workbook was very helpful in understanding financial accounting. A related study found that students with lower reading levels benefited most from this approach, and withdrawal dropped from 10 percent to 2 percent. (JOW)

  18. Computational Models for Neuromuscular Function

    PubMed Central

    Valero-Cuevas, Francisco J.; Hoffmann, Heiko; Kurse, Manish U.; Kutch, Jason J.; Theodorou, Evangelos A.

    2011-01-01

    Computational models of the neuromuscular system hold the potential to allow us to reach a deeper understanding of neuromuscular function and clinical rehabilitation by complementing experimentation. By serving as a means to distill and explore specific hypotheses, computational models emerge from prior experimental data and motivate future experimental work. Here we review computational tools used to understand neuromuscular function including musculoskeletal modeling, machine learning, control theory, and statistical model analysis. We conclude that these tools, when used in combination, have the potential to further our understanding of neuromuscular function by serving as a rigorous means to test scientific hypotheses in ways that complement and leverage experimental data. PMID:21687779

  19. Automatic computation of transfer functions

    DOEpatents

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
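
As a minimal sketch of the general idea, assuming a one-unknown-node nodal analysis rather than the patent's actual matrix formulation, a frequency response can be read off a netlist-like element list. The RC element values are hypothetical:

```python
import math

# "Netlist": (name, node_a, node_b, value); node 0 is ground and node 1
# is driven by an ideal source Vin. These element values are hypothetical.
netlist = [("R1", 1, 2, 1000.0),   # 1 kOhm series resistor
           ("C1", 2, 0, 1e-6)]     # 1 uF shunt capacitor

def transfer(netlist, omega):
    """H(jw) = V(node 2)/Vin by nodal analysis at the single unknown node."""
    y_sum = 0j   # total admittance hanging off node 2
    y_in = 0j    # admittance coupling node 2 to the driven node 1
    for name, a, b, value in netlist:
        y = 1.0 / value if name.startswith("R") else 1j * omega * value
        if 2 in (a, b):
            y_sum += y
            if 1 in (a, b):
                y_in += y
    return y_in / y_sum

omega_c = 1.0 / (1000.0 * 1e-6)          # analytic corner: 1/RC rad/s
print(abs(transfer(netlist, omega_c)))   # 1/sqrt(2) at the corner
```

For this single-node case H(jw) = 1/(1 + jwRC), the textbook low-pass response; a general netlist would populate a full admittance matrix instead.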

  20. Computer Experiments for Function Approximations

    SciTech Connect

    Chang, A; Izmailov, I; Rizzo, S; Wynter, S; Alexandrov, O; Tong, C

    2007-10-15

    This research project falls in the domain of response surface methodology, which seeks cost-effective ways to accurately fit an approximate function to experimental data. Modeling and computer simulation are essential tools in modern science and engineering. A computer simulation can be viewed as a function that receives input from a given parameter space and produces an output. Running the simulation repeatedly amounts to an equivalent number of function evaluations, and for complex models, such function evaluations can be very time-consuming. It is then of paramount importance to intelligently choose a relatively small set of sample points in the parameter space at which to evaluate the given function, and then use this information to construct a surrogate function that is close to the original function and takes little time to evaluate. This study was divided into two parts. The first part consisted of comparing four sampling methods and two function approximation methods in terms of efficiency and accuracy for simple test functions. The sampling methods used were Monte Carlo, Quasi-Random LPτ, Maximin Latin Hypercubes, and Orthogonal-Array-Based Latin Hypercubes. The function approximation methods utilized were Multivariate Adaptive Regression Splines (MARS) and Support Vector Machines (SVM). The second part of the study concerned adaptive sampling methods with a focus on creating useful sets of sample points specifically for monotonic functions, functions with a single minimum and functions with a bounded first derivative.
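
Of the sampling methods compared, a plain (non-maximin) Latin hypercube is the easiest to sketch in stdlib Python; this is a generic illustration of the stratification idea, not the study's code:

```python
import random

def latin_hypercube(n, dims, seed=42):
    """n points in [0, 1)^dims: each axis is cut into n equal strata and
    every stratum is occupied exactly once per dimension."""
    rng = random.Random(seed)
    cols = []
    for _ in range(dims):
        strata = list(range(n))
        rng.shuffle(strata)                      # random pairing across axes
        cols.append([(s + rng.random()) / n for s in strata])
    return list(zip(*cols))

pts = latin_hypercube(5, 2)
print(pts)
```

Compared with plain Monte Carlo, the one-point-per-stratum guarantee spreads a small sample budget more evenly over the parameter space, which is exactly what surrogate fitting needs.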

  1. On computing special functions in marine engineering

    NASA Astrophysics Data System (ADS)

    Constantinescu, E.; Bogdan, M.

    2015-11-01

    Important modeling applications in marine engineering lead to a special class of solutions of difficult differential equations with variable coefficients. In order to solve and implement such models (in wave theory, in acoustics, in hydrodynamics, in electromagnetic waves, and in many other engineering fields), it is necessary to compute so-called special functions: Bessel functions, modified Bessel functions, spherical Bessel functions, and Hankel functions. The aim of this paper is to develop numerical solutions in Matlab for the above-mentioned special functions. Taking into account the main properties of Bessel and modified Bessel functions, we briefly present analytical solutions (where possible) in the form of series. In particular, the behavior of these special functions is studied using Matlab facilities: numerical solutions and plotting. Finally, the behavior of the special functions is compared, and other directions for investigating properties of Bessel and spherical Bessel functions are pointed out. The asymptotic forms of Bessel functions and modified Bessel functions allow determination of important properties of these functions. The modified Bessel functions tend to look like decaying and growing exponentials.
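
The series evaluation the paper describes can be sketched in a few lines, shown here in stdlib Python rather than Matlab (whose built-in besselj would normally be used):

```python
import math

def bessel_j(n, x, terms=30):
    """Bessel function of the first kind via its power series:
    J_n(x) = sum_k (-1)^k / (k! (n+k)!) * (x/2)^(2k+n)."""
    half = x / 2.0
    return sum((-1) ** k / (math.factorial(k) * math.factorial(n + k))
               * half ** (2 * k + n) for k in range(terms))

# The first zero of J0 lies near x = 2.40483; the series converges
# rapidly for moderate x (the asymptotic forms the abstract mentions
# take over for large x, where the series becomes ill-conditioned).
print(bessel_j(0, 2.40483))
```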

  2. Computer program for the automated attendance accounting system

    NASA Technical Reports Server (NTRS)

    Poulson, P.; Rasmusson, C.

    1971-01-01

    The automated attendance accounting system (AAAS) was developed under the auspices of the Space Technology Applications Program. The task is basically the adaptation of a small digital computer, coupled with specially developed pushbutton terminals located in school classrooms and offices for the purpose of taking daily attendance, maintaining complete attendance records, and producing partial and summary reports. Especially developed for high schools, the system is intended to relieve both teachers and office personnel from the time-consuming and dreary task of recording and analyzing the myriad classroom attendance data collected throughout the semester. In addition, since many school district budgets are related to student attendance, the increase in accounting accuracy is expected to augment district income. A major component of this system is the real-time AAAS software system, which is described.

  3. FUNCTION GENERATOR FOR ANALOGUE COMPUTERS

    DOEpatents

    Skramstad, H.K.; Wright, J.H.; Taback, L.

    1961-12-12

    An improved analogue computer is designed which can be used to determine the final ground position of radioactive fallout particles in an atomic cloud. The computer determines the fallout pattern on the basis of known wind velocity and direction at various altitudes, and intensity of radioactivity in the mushroom cloud as a function of particle size and initial height in the cloud. The output is then displayed on a cathode-ray tube so that the average or total luminance of the tube screen at any point represents the intensity of radioactive fallout at the geographical location represented by that point. (AEC)

  4. Accountability for Early Childhood Education (Assessing Global Functioning).

    ERIC Educational Resources Information Center

    Cassel, Russell N.

    1995-01-01

    Discusses the pacing of learning activity, knowledge of progress in student learning, teacher role, accountability in learning, feedback on knowledge of success, the global functioning assessment concept, and the mother surrogate. (RS)

  5. Living through a computer voice: a personal account.

    PubMed

    Martin, Alan; Newell, Christopher

    2013-10-01

    Alan Martin, the first author of this paper, has cerebral palsy and uses a voice output communication aid (VOCA) to speak, and this paper describes the personal experience of living 'through' a computer voice (or VOCA) in the form of an interview of Mr Martin conducted by Dr Newell. The interview focuses on the computerized voice output rather than other features of the VOCA. In presenting a first-hand account of the experience of actually using VOCA, the intention is that both everyday, practical issues of the technology and broader imaginative, philosophical, and sociological implications will be explored. Based upon the interview, the authors offer an informal set of design requirements and recommendations for the development of future VOCAs. PMID:23841537

  6. Metacognition: computation, biology and function

    PubMed Central

    Fleming, Stephen M.; Dolan, Raymond J.; Frith, Christopher D.

    2012-01-01

    Many complex systems maintain a self-referential check and balance. In animals, such reflective monitoring and control processes have been grouped under the rubric of metacognition. In this introductory article to a Theme Issue on metacognition, we review recent and rapidly progressing developments from neuroscience, cognitive psychology, computer science and philosophy of mind. While each of these areas is represented in detail by individual contributions to the volume, we take this opportunity to draw links between disciplines, and highlight areas where further integration is needed. Specifically, we cover the definition, measurement, neurobiology and possible functions of metacognition, and assess the relationship between metacognition and consciousness. We propose a framework in which level of representation, order of behaviour and access consciousness are orthogonal dimensions of the conceptual landscape. PMID:22492746

  7. Metacognition: computation, biology and function.

    PubMed

    Fleming, Stephen M; Dolan, Raymond J; Frith, Christopher D

    2012-05-19

    Many complex systems maintain a self-referential check and balance. In animals, such reflective monitoring and control processes have been grouped under the rubric of metacognition. In this introductory article to a Theme Issue on metacognition, we review recent and rapidly progressing developments from neuroscience, cognitive psychology, computer science and philosophy of mind. While each of these areas is represented in detail by individual contributions to the volume, we take this opportunity to draw links between disciplines, and highlight areas where further integration is needed. Specifically, we cover the definition, measurement, neurobiology and possible functions of metacognition, and assess the relationship between metacognition and consciousness. We propose a framework in which level of representation, order of behaviour and access consciousness are orthogonal dimensions of the conceptual landscape. PMID:22492746

  8. Computing Functions by Approximating the Input

    ERIC Educational Resources Information Center

    Goldberg, Mayer

    2012-01-01

    In computing real-valued functions, it is ordinarily assumed that the input to the function is known, and it is the output that we need to approximate. In this work, we take the opposite approach: we show how to compute the values of some transcendental functions by approximating the input to these functions, and obtaining exact answers for their…

  9. Integration of a Computer Application in a First Year Accounting Curriculum: An Evaluation of Student Attitudes

    ERIC Educational Resources Information Center

    Laing, Gregory Kenneth; Perrin, Ronald William

    2012-01-01

    This paper presents the findings of a field study conducted to ascertain the perceptions of first year accounting students concerning the integration of computer applications in the accounting curriculum. The results indicate that both student cohorts perceived the computer as a valuable educational tool. The use of computers to enhance the…

  10. Genre Analysis of Tax Computation Letters: How and Why Tax Accountants Write the Way They Do

    ERIC Educational Resources Information Center

    Flowerdew, John; Wan, Alina

    2006-01-01

    This study is a genre analysis which explores the specific discourse community of tax accountants. Tax computation letters from one international accounting firm in Hong Kong were analyzed and compared. To probe deeper into the tax accounting discourse community, a group of tax accountants from the same firm was observed and questioned. The texts…

  11. Teaching with Computers: A Cautionary Finding in an Accounting Class

    ERIC Educational Resources Information Center

    Jones, Stuart H.; Wright, Michael

    2005-01-01

    The study assesses the effects of a hypertext learning aid and GPA on performance in advanced financial accounting. Results indicate that the type of learning aid and GPA significantly affect performance. High GPA students performed better than did the low GPA students. In the study, two versions of the hypertext learning aid were utilized by two…

  12. 49 CFR 1242.46 - Computers and data processing equipment (account XX-27-46).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 9 2011-10-01 2011-10-01 false Computers and data processing equipment (account XX-27-46). 1242.46 Section 1242.46 Transportation Other Regulations Relating to Transportation... RAILROADS 1 Operating Expenses-Equipment § 1242.46 Computers and data processing equipment (account...

  13. 49 CFR 1242.46 - Computers and data processing equipment (account XX-27-46).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 9 2014-10-01 2014-10-01 false Computers and data processing equipment (account XX-27-46). 1242.46 Section 1242.46 Transportation Other Regulations Relating to Transportation... RAILROADS 1 Operating Expenses-Equipment § 1242.46 Computers and data processing equipment (account...

  14. 49 CFR 1242.46 - Computers and data processing equipment (account XX-27-46).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 9 2013-10-01 2013-10-01 false Computers and data processing equipment (account XX-27-46). 1242.46 Section 1242.46 Transportation Other Regulations Relating to Transportation... RAILROADS 1 Operating Expenses-Equipment § 1242.46 Computers and data processing equipment (account...

  15. 49 CFR 1242.46 - Computers and data processing equipment (account XX-27-46).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 9 2012-10-01 2012-10-01 false Computers and data processing equipment (account XX-27-46). 1242.46 Section 1242.46 Transportation Other Regulations Relating to Transportation... RAILROADS 1 Operating Expenses-Equipment § 1242.46 Computers and data processing equipment (account...

  16. 49 CFR 1242.46 - Computers and data processing equipment (account XX-27-46).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 9 2010-10-01 2010-10-01 false Computers and data processing equipment (account XX-27-46). 1242.46 Section 1242.46 Transportation Other Regulations Relating to Transportation... RAILROADS 1 Operating Expenses-Equipment § 1242.46 Computers and data processing equipment (account...

  17. Connecting Neural Coding to Number Cognition: A Computational Account

    ERIC Educational Resources Information Center

    Prather, Richard W.

    2012-01-01

    The current study presents a series of computational simulations that demonstrate how the neural coding of numerical magnitude may influence number cognition and development. This includes behavioral phenomena cataloged in cognitive literature such as the development of numerical estimation and operational momentum. Though neural research has…

  18. Accountability.

    ERIC Educational Resources Information Center

    The Newsletter of the Comprehensive Center-Region VI, 1999

    1999-01-01

    Controversy surrounding the accountability movement is related to how the movement began in response to dissatisfaction with public schools. Opponents see it as one-sided, somewhat mean-spirited, and a threat to the professional status of teachers. Supporters argue that all other spheres of the workplace have accountability systems and that the…

  19. Accountability.

    ERIC Educational Resources Information Center

    Lashway, Larry

    1999-01-01

    This issue reviews publications that provide a starting point for principals looking for a way through the accountability maze. Each publication views accountability differently, but collectively these readings argue that even in an era of state-mandated assessment, principals can pursue proactive strategies that serve students' needs. James A.…

  20. Sequential decisions: a computational comparison of observational and reinforcement accounts.

    PubMed

    Mohammadi Sepahvand, Nazanin; Stöttinger, Elisabeth; Danckert, James; Anderson, Britt

    2014-01-01

    Right brain damaged patients show impairments in sequential decision making tasks on which healthy people show no difficulty. We hypothesized that this difficulty could be due to the failure of right brain damaged patients to develop well-matched models of the world. Our motivation is the idea that to navigate uncertainty, humans use models of the world to direct the decisions they make when interacting with their environment: the better the model, the better the decisions. To explore the model building and updating process in humans and the basis for impairment after brain injury, we used a computational model of non-stationary sequence learning. RELPH (Reinforcement and Entropy Learned Pruned Hypothesis space) was able to qualitatively and quantitatively reproduce the results of left and right brain damaged patient groups and healthy controls playing a sequential version of Rock, Paper, Scissors. Our results suggest that, in general, humans employ a sub-optimal reinforcement-based learning method rather than an objectively better statistical learning approach, and that differences between the right brain damaged and healthy control groups can be explained by different exploration policies rather than by qualitatively different learning mechanisms. PMID:24747416
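
The contrast the authors draw, reinforcement-based versus statistical learning, can be caricatured with two toy Rock-Paper-Scissors learners facing a biased opponent. This sketch is illustrative only and is not the RELPH model:

```python
import random

MOVES = ["rock", "paper", "scissors"]
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def counter(move):
    """The move that beats `move`."""
    return next(m for m in MOVES if BEATS[m] == move)

rng = random.Random(1)
weights = {m: 1.0 for m in MOVES}   # reinforcement-style learner
counts = {m: 0 for m in MOVES}      # frequency (statistical) learner

for _ in range(2000):
    opp = rng.choices(MOVES, weights=[0.6, 0.2, 0.2])[0]  # biased opponent
    mine = rng.choices(MOVES, weights=[weights[m] for m in MOVES])[0]
    if BEATS[mine] == opp:          # reward only the winning move
        weights[mine] += 1.0
    counts[opp] += 1

rl_pick = max(weights, key=weights.get)           # what reinforcement favors
stat_pick = counter(max(counts, key=counts.get))  # best reply to the mode
print(rl_pick, stat_pick)
```

The statistical learner converges on the objectively best reply as soon as the opponent's bias is visible in the counts, while the reinforcement learner's reward-driven sampling can explore (or lock in) differently, which is the kind of policy difference the abstract invokes.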

  1. On computation of Hough functions

    NASA Astrophysics Data System (ADS)

    Wang, Houjun; Boyd, John P.; Akmaev, Rashid A.

    2016-04-01

Hough functions are the eigenfunctions of the Laplace tidal equation governing fluid motion on a rotating sphere with a resting basic state. Several numerical methods have been used in the past. In this paper, we compare two of those methods: normalized associated Legendre polynomial expansion and Chebyshev collocation. Neither method is widely used, but both have advantages over the commonly used unnormalized associated Legendre polynomial expansion method. Comparable results are obtained using both methods. For the first method we note some details on numerical implementation. The Chebyshev collocation method was first used for the Laplace tidal problem by Boyd (1976) and is relatively easy to use. A compact MATLAB code is provided for this method. We also illustrate the importance and effect of including a parity factor in Chebyshev polynomial expansions for modes with odd zonal wave numbers.
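The Chebyshev collocation idea can be seen on a much simpler model eigenproblem than the Laplace tidal equation. The sketch below (standard Trefethen-style differentiation matrix, ported to NumPy; it is not the paper's MATLAB code) recovers the eigenvalues of -u'' = λu with Dirichlet conditions:

```python
import numpy as np

def cheb(N):
    """Chebyshev differentiation matrix D and nodes x
    (Trefethen, Spectral Methods in MATLAB, in NumPy form)."""
    if N == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))  # diagonal fixed by row-sum identity
    return D, x

# Model problem: -u'' = lam * u on [-1, 1], u(+-1) = 0;
# exact eigenvalues are (k * pi / 2)**2, k = 1, 2, ...
N = 32
D, x = cheb(N)
D2 = (D @ D)[1:-1, 1:-1]   # drop boundary rows/cols (Dirichlet BCs)
lam = np.sort(np.linalg.eigvals(-D2).real)
print(lam[:3])             # low modes ~ [2.4674, 9.8696, 22.207]
```

The parity factor mentioned in the abstract plays an analogous role: restricting the expansion to even or odd polynomials halves the basis and enforces the mode's symmetry exactly.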

  2. Accounting Students Are Unable to Recognize the Various Types of Accounting Functions.

    ERIC Educational Resources Information Center

    Frank, Gary B.; And Others

    1989-01-01

    The authors discuss 258 undergraduate business majors' perceptions of the nature and uses of financial and managerial accounting. Perceptions were measured with Stapel Scales constructed on 11 descriptive statements. Findings indicated that students distinguish between financial and managerial accounting, but that they do not view the two as…

  3. Final Report for the Account Creation/Deletion Reengineering Task for the Scientific Computing Department

    SciTech Connect

    JENNINGS, BARBARA J.; MCALLISTER, PAULA L.

    2002-04-01

In October 2000, the personnel responsible for administration of the corporate computers managed by the Scientific Computing Department assembled to reengineer the process of creating and deleting users' computer accounts. Using the Carnegie Mellon Software Engineering Institute (SEI) Capability Maturity Model (CMM) quality-improvement process, the team performed the reengineering by way of process modeling and by defining and measuring the maturity of the processes, per SEI and CMM practices. The computers residing in the classified environment are bound by security requirements of the Secure Classified Network (SCN) Security Plan. These security requirements delimited the scope of the project, specifically mandating validation of all user accounts on the central corporate computer systems. System administrators, in addition to their assigned responsibilities, were spending valuable hours performing the additional tacit responsibility of tracking user accountability for user-generated data. For example, in cases where the data originator was no longer an employee, the administrators were forced to spend considerable time and effort determining the appropriate management personnel to assume ownership or disposition of the former owner's data files. To prevent this sort of problem from occurring, and to have a defined procedure in the event of an anomaly, the computer account management procedure was thoroughly reengineered, as detailed in this document. An automated procedure is now in place that is initiated, and supplied data, by central corporate processes certifying the integrity, timeliness and authentication of account holders and their management. Automated scripts identify when an account is about to expire, to preempt the problem of data becoming "orphaned" without a responsible "owner" on the system. The automated account-management procedure currently operates on and provides a standard process for all of the computers maintained by the

  4. Does Participation in a Computer-Based Learning Program in Introductory Financial Accounting Course Lead to Choosing Accounting as a Major?

    ERIC Educational Resources Information Center

    Owhoso, Vincent; Malgwi, Charles A.; Akpomi, Margaret

    2014-01-01

    The authors examine whether students who completed a computer-based intervention program, designed to help them develop abilities and skills in introductory accounting, later declared accounting as a major. A sample of 1,341 students participated in the study, of which 74 completed the intervention program (computer-based assisted learning [CBAL])…

  5. Computer Games Functioning as Motivation Stimulants

    ERIC Educational Resources Information Center

    Lin, Grace Hui Chin; Tsai, Tony Kung Wan; Chien, Paul Shih Chieh

    2011-01-01

Numerous scholars have suggested that computer games can function as influential motivation stimulants for English learning, showing benefits as learning tools (Clarke and Dede, 2007; Dede, 2009; Klopfer and Squire, 2009; Liu and Chu, 2010; Mitchell, Dede & Dunleavy, 2009). This study aimed to further test and verify the above suggestion,…

  6. Deterministic Function Computation with Chemical Reaction Networks*

    PubMed Central

    Chen, Ho-Lin; Doty, David; Soloveichik, David

    2013-01-01

    Chemical reaction networks (CRNs) formally model chemistry in a well-mixed solution. CRNs are widely used to describe information processing occurring in natural cellular regulatory networks, and with upcoming advances in synthetic biology, CRNs are a promising language for the design of artificial molecular control circuitry. Nonetheless, despite the widespread use of CRNs in the natural sciences, the range of computational behaviors exhibited by CRNs is not well understood. CRNs have been shown to be efficiently Turing-universal (i.e., able to simulate arbitrary algorithms) when allowing for a small probability of error. CRNs that are guaranteed to converge on a correct answer, on the other hand, have been shown to decide only the semilinear predicates (a multi-dimensional generalization of “eventually periodic” sets). We introduce the notion of function, rather than predicate, computation by representing the output of a function f : ℕk → ℕl by a count of some molecular species, i.e., if the CRN starts with x1, …, xk molecules of some “input” species X1, …, Xk, the CRN is guaranteed to converge to having f(x1, …, xk) molecules of the “output” species Y1, …, Yl. We show that a function f : ℕk → ℕl is deterministically computed by a CRN if and only if its graph {(x, y) ∈ ℕk × ℕl ∣ f(x) = y} is a semilinear set. Finally, we show that each semilinear function f (a function whose graph is a semilinear set) can be computed by a CRN on input x in expected time O(polylog ∥x∥1). PMID:25383068
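The semilinear characterization can be made concrete with the classic CRN that computes f(x1, x2) = max(x1, x2), a semilinear function. The toy simulator below fires applicable reactions in random order until the network converges; it is an illustrative sketch of the convergence semantics, not the authors' construction verbatim:

```python
import random

def run_crn(reactions, counts, seed=0):
    """Fire applicable reactions in random order until none can fire."""
    rng = random.Random(seed)
    while True:
        applicable = [r for r in reactions
                      if all(counts.get(s, 0) >= n for s, n in r[0].items())]
        if not applicable:
            return counts
        reactants, products = rng.choice(applicable)
        for s, n in reactants.items():
            counts[s] -= n
        for s, n in products.items():
            counts[s] = counts.get(s, 0) + n

# max(x1, x2) = x1 + x2 - min(x1, x2):
#   X1 -> Y + Z1,  X2 -> Y + Z2      (Y accumulates x1 + x2)
#   Z1 + Z2 -> K,  K + Y -> (empty)  (subtract min(x1, x2))
reactions = [
    ({"X1": 1}, {"Y": 1, "Z1": 1}),
    ({"X2": 1}, {"Y": 1, "Z2": 1}),
    ({"Z1": 1, "Z2": 1}, {"K": 1}),
    ({"K": 1, "Y": 1}, {}),
]
final = run_crn(reactions, {"X1": 3, "X2": 7})
print(final["Y"])  # -> 7 == max(3, 7)
```

Whatever order the reactions fire in, each fires a fixed total number of times, so the count of Y always converges to x1 + x2 - min(x1, x2) = max(x1, x2), matching the paper's deterministic-computation notion.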

  7. The emerging discipline of Computational Functional Anatomy

    PubMed Central

    Miller, Michael I.; Qiu, Anqi

    2010-01-01

    Computational Functional Anatomy (CFA) is the study of functional and physiological response variables in anatomical coordinates. For this we focus on two things: (i) the construction of bijections (via diffeomorphisms) between the coordinatized manifolds of human anatomy, and (ii) the transfer (group action and parallel transport) of functional information into anatomical atlases via these bijections. We review advances in the unification of the bijective comparison of anatomical submanifolds via point-sets including points, curves and surface triangulations as well as dense imagery. We examine the transfer via these bijections of functional response variables into anatomical coordinates via group action on scalars and matrices in DTI as well as parallel transport of metric information across multiple templates which preserves the inner product. PMID:19103297

  8. New Computer Simulations of Macular Neural Functioning

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.

    1994-01-01

We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method ensures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.

  9. Efficient computation of Wigner-Eisenbud functions

    NASA Astrophysics Data System (ADS)

    Raffah, Bahaaudin M.; Abbott, Paul C.

    2013-06-01

The R-matrix method, introduced by Wigner and Eisenbud (1947) [1], has been applied to a broad range of electron transport problems in nanoscale quantum devices. With the rapid increase in the development and modeling of nanodevices, efficient, accurate, and general computation of Wigner-Eisenbud functions is required. This paper presents the Mathematica package WignerEisenbud, which uses the Fourier discrete cosine transform to compute the Wigner-Eisenbud functions in dimensionless units for an arbitrary potential in one dimension, and two dimensions in cylindrical coordinates. Program summary: Program title: WignerEisenbud. Catalogue identifier: AEOU_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOU_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. Distribution format: tar.gz. Programming language: Mathematica. Operating system: Any platform supporting Mathematica 7.0 and above. Keywords: Wigner-Eisenbud functions, discrete cosine transform (DCT), cylindrical nanowires. Classification: 7.3, 7.9, 4.6, 5. Nature of problem: Computing the 1D and 2D Wigner-Eisenbud functions for arbitrary potentials using the DCT. Solution method: The R-matrix method is applied to the physical problem. Separation of variables is used for eigenfunction expansion of the 2D Wigner-Eisenbud functions. Eigenfunction computation is performed using the DCT to convert the Schrödinger equation with Neumann boundary conditions to a generalized matrix eigenproblem. Limitations: Restricted to uniform (rectangular grid) sampling of the potential. In 1D the number of sample points, n, results in matrix computations involving n×n matrices. Unusual features: Eigenfunction expansion using the DCT is fast and accurate. Users can specify scattering potentials using functions, or interactively using mouse input. Use of dimensionless units permits application to a
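The underlying construction — expand in cosine modes, which automatically satisfy the Neumann boundary conditions, then diagonalize — can be sketched in a few lines. This is a simplified stand-in for the package's DCT machinery (the basis truncation, grid, and quadrature below are illustrative), not the WignerEisenbud code itself:

```python
import numpy as np

def wigner_eisenbud_1d(V, nbasis=40, npts=2001):
    """Eigenvalues of -psi'' + V(x) psi = E psi on [0, 1] with Neumann BCs
    psi'(0) = psi'(1) = 0, expanded in the cosine basis
    phi_0 = 1, phi_n = sqrt(2) cos(n pi x)."""
    x = np.linspace(0.0, 1.0, npts)
    dx = x[1] - x[0]
    w = np.full(npts, dx)
    w[0] = w[-1] = dx / 2.0                     # trapezoid quadrature weights
    phi = np.ones((nbasis, npts))
    for n in range(1, nbasis):
        phi[n] = np.sqrt(2.0) * np.cos(n * np.pi * x)
    H = np.diag((np.arange(nbasis) * np.pi) ** 2.0)  # kinetic term is diagonal
    Vx = V(x)
    for m in range(nbasis):
        for n in range(m, nbasis):
            vmn = np.sum(w * phi[m] * Vx * phi[n])   # potential matrix element
            H[m, n] += vmn
            H[n, m] = H[m, n]
    return np.linalg.eigvalsh(H)

print(wigner_eisenbud_1d(lambda x: 50.0 * (x - 0.5) ** 2)[:4])
```

With V = 0 the Hamiltonian is exactly diagonal and the eigenvalues reduce to (n pi)^2, which makes a convenient sanity check.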

  10. Computing Balance Column Amount in Ledger Accounts. Student Manual and Instructor's Manual.

    ERIC Educational Resources Information Center

    McElveen, Peggy C.

    Supporting performance objective 31 of the V-TECS (Vocational-Technical Education Consortium of States) Secretarial Catalog, both a set of student materials and an instructor's manual on computing the balance column amount in ledger accounts are included in this packet, which is one of a series. The student materials include a record of a…

  11. A Computational Account of Children's Analogical Reasoning: Balancing Inhibitory Control in Working Memory and Relational Representation

    ERIC Educational Resources Information Center

    Morrison, Robert G.; Doumas, Leonidas A. A.; Richland, Lindsey E.

    2011-01-01

    Theories accounting for the development of analogical reasoning tend to emphasize either the centrality of relational knowledge accretion or changes in information processing capability. Simulations in LISA (Hummel & Holyoak, 1997, 2003), a neurally inspired computer model of analogical reasoning, allow us to explore how these factors may…

  12. Technology Readiness, Internet Self-Efficacy and Computing Experience of Professional Accounting Students

    ERIC Educational Resources Information Center

    Lai, Ming-Ling

    2008-01-01

    Purpose: This study aims to assess the state of technology readiness of professional accounting students in Malaysia, to examine their level of internet self-efficacy, to assess their prior computing experience, and to explore if they are satisfied with the professional course that they are pursuing in improving their technology skills.…

  13. Written and Computer-Mediated Accounting Communication Skills: An Employer Perspective

    ERIC Educational Resources Information Center

    Jones, Christopher G.

    2011-01-01

    Communication skills are a fundamental personal competency for a successful career in accounting. What is not so obvious is the specific written communication skill set employers look for and the extent those skills are computer mediated. Using survey research, this article explores the particular skills employers desire and their satisfaction…

  14. Neutron monitor yield function: New improved computations

    NASA Astrophysics Data System (ADS)

    Mishev, A. L.; Usoskin, I. G.; Kovaltsov, G. A.

    2013-06-01

A ground-based neutron monitor (NM) is a standard tool to measure cosmic ray (CR) variability near Earth, and it is crucially important to know its yield function for primary CRs. Although there are several earlier theoretically calculated yield functions, none of them agrees with experimental data of latitude surveys of sea-level NMs, thus suggesting an inconsistency. A newly computed yield function of the standard sea-level 6NM64 NM is presented here separately for primary CR protons and α-particles, the latter also representing heavier species of CRs. The computations have been done using the GEANT-4 PLANETOCOSMICS Monte-Carlo tool and a realistic curved atmospheric model. For the first time, an effect of the geometrical correction of the NM effective area, related to the finite lateral expansion of the CR induced atmospheric cascade, is considered, which was neglected in the previous studies. This correction slightly enhances the relative impact of higher-energy CRs (energy above 5-10 GeV/nucleon) in NM count rate. The new computation finally resolves the long-standing problem of disagreement between the theoretically calculated spatial variability of CRs over the globe and experimental latitude surveys. The newly calculated yield function, corrected for this geometrical factor, appears fully consistent with the experimental latitude surveys of NMs performed during three consecutive solar minima in 1976-1977, 1986-1987, and 1996-1997. Thus, we provide a new yield function of the standard sea-level NM 6NM64 that is validated against experimental data.

  15. Computer network defense through radial wave functions

    NASA Astrophysics Data System (ADS)

    Malloy, Ian J.

The purpose of this research is to synthesize basic and fundamental findings in quantum computing, as applied to the attack and defense of conventional computer networks. The concept focuses on uses of radio waves as a shield for, and attack against, traditional computers. A logic bomb is analogous to a landmine in a computer network, and if one were to implement it as non-trivial mitigation, it would aid computer network defense. As has been seen in kinetic warfare, the use of landmines has been devastating to geopolitical regions in that they are severely difficult for a civilian to avoid triggering given the unknown position of a landmine. Thus, understanding the logic bomb is relevant and has corollaries to quantum mechanics as well. The research synthesizes quantum logic phase shifts in certain respects using the Dynamic Data Exchange protocol in software written for this work, as well as a C-NOT gate applied to a virtual quantum circuit environment by implementing a Quantum Fourier Transform. The research applies the principles of coherence and entanglement from quantum physics, the concept of expert systems in artificial intelligence, principles of prime-number-based cryptography with trapdoor functions, and modeling of radio wave propagation against an event with unknown parameters. This comes as a program relying on the artificial intelligence concept of an expert system in conjunction with trigger events for a trapdoor function relying on infinite recursion, as well as system mechanics for elliptic curve cryptography along orbital angular momenta. Here trapdoor denotes both the form of cipher and the implied relationship to logic bombs.

  16. The intrinsic quasar luminosity function: Accounting for accretion disk anisotropy

    SciTech Connect

    DiPompeo, M. A.; Myers, A. D.; Brotherton, M. S.; Runnoe, J. C.; Green, R. F.

    2014-05-20

    Quasar luminosity functions are a fundamental probe of the growth and evolution of supermassive black holes. Measuring the intrinsic luminosity function is difficult in practice, due to a multitude of observational and systematic effects. As sample sizes increase and measurement errors drop, characterizing the systematic effects is becoming more important. It is well known that the continuum emission from the accretion disk of quasars is anisotropic—in part due to its disk-like structure—but current luminosity function calculations effectively assume isotropy over the range of unobscured lines of sight. Here, we provide the first steps in characterizing the effect of random quasar orientations and simple models of anisotropy on observed luminosity functions. We find that the effect of orientation is not insignificant and exceeds other potential corrections such as those from gravitational lensing of foreground structures. We argue that current observational constraints may overestimate the intrinsic luminosity function by as much as a factor of ∼2 on the bright end. This has implications for models of quasars and their role in the universe, such as quasars' contribution to cosmological backgrounds.

  17. Computational functions in biochemical reaction networks.

    PubMed Central

    Arkin, A; Ross, J

    1994-01-01

In prior work we demonstrated the implementation of logic gates, sequential computers (universal Turing machines), and parallel computers by means of the kinetics of chemical reaction mechanisms. In the present article we develop this subject further by first investigating the computational properties of several enzymatic (single and multiple) reaction mechanisms: we show their steady states are analogous to either Boolean or fuzzy logic gates. Nearly perfect digital function is obtained only in the regime in which the enzymes are saturated with their substrates. With these enzymatic gates, we construct combinational chemical networks that execute a given truth-table. The dynamic range of a network's output is strongly affected by "input/output matching" conditions among the internal gate elements. We find a simple mechanism, similar to the interconversion of fructose-6-phosphate between its two bisphosphate forms (fructose-1,6-bisphosphate and fructose-2,6-bisphosphate), that functions analogously to an AND gate. When the simple model is supplanted with one in which the enzyme rate laws are derived from experimental data, the steady state of the mechanism functions as an asymmetric fuzzy aggregation operator with properties akin to a fuzzy AND gate. The qualitative behavior of the mechanism does not change when situated within a large model of glycolysis/gluconeogenesis and the TCA cycle. The mechanism, in this case, switches the pathway's mode from glycolysis to gluconeogenesis in response to chemical signals of low blood glucose (cAMP) and abundant fuel for the TCA cycle (acetyl coenzyme A). PMID:7948674
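The saturation point in the abstract — near-digital behavior only when enzymes are saturated, fuzzy behavior otherwise — can be illustrated with a toy steady-state model. Hypothetical Hill kinetics stand in here for the actual enzymatic rate laws; the constants are illustrative:

```python
def hill(s, K=1.0, n=4):
    """Hill activation: fractional enzyme activity at substrate level s."""
    return s ** n / (K ** n + s ** n)

def enzymatic_and(a, b, K=1.0, n=4):
    """Toy two-input gate: output is high only when both inputs saturate
    their (hypothetical) activating enzymes -- a fuzzy AND."""
    return hill(a, K, n) * hill(b, K, n)

LO, HI = 0.1, 10.0  # well below / well above the half-saturation constant K
for a in (LO, HI):
    for b in (LO, HI):
        print(f"A={a:>4} B={b:>4} -> {enzymatic_and(a, b):.4f}")
```

With inputs far from saturation (near K) the output is graded — a fuzzy aggregation — while deeply saturated inputs push the truth table toward clean Boolean AND behavior, mirroring the regime dependence described above.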

  18. Discrete Wigner functions and quantum computational speedup

    SciTech Connect

    Galvao, Ernesto F.

    2005-04-01

Gibbons et al. [Phys. Rev. A 70, 062101 (2004)] have recently defined a class of discrete Wigner functions W to represent quantum states in a finite Hilbert space dimension d. I characterize the set C_d of states having non-negative W simultaneously in all definitions of W in this class. For d ≤ 5 I show C_d is the convex hull of stabilizer states. This supports the conjecture that negativity of W is necessary for exponential speedup in pure-state quantum computation.

  19. Accounting for a Functional Category: German "Drohen" "to Threaten"

    ERIC Educational Resources Information Center

    Heine, Bernd; Miyashita, Hiroyuki

    2008-01-01

In many languages there are words that behave like lexical verbs on the one hand and like functional categories expressing distinctions of tense, aspect, modality, etc., on the other. The grammatical status of such words is frequently controversial; while some authors treat them as belonging to one and the same grammatical category, others…

  20. Computer simulation as a teaching aid in pharmacy management--Part 1: Principles of accounting.

    PubMed

    Morrison, D J

    1987-06-01

    The need for pharmacists to develop management expertise through participation in formal courses is now widely acknowledged. Many schools of pharmacy lay the foundations for future management training by providing introductory courses as an integral or elective part of the undergraduate syllabus. The benefit of such courses may, however, be limited by the lack of opportunity for the student to apply the concepts and procedures in a practical working environment. Computer simulations provide a means to overcome this problem, particularly in the field of resource management. In this, the first of two articles, the use of a computer model to demonstrate basic accounting principles is described. PMID:3301875

  1. A cognitive neurobiological account of deception: evidence from functional neuroimaging.

    PubMed Central

    Spence, Sean A; Hunter, Mike D; Farrow, Tom F D; Green, Russell D; Leung, David H; Hughes, Catherine J; Ganesan, Venkatasubramanian

    2004-01-01

    An organism may use misinformation, knowingly (through deception) or unknowingly (as in the case of camouflage), to gain advantage in a competitive environment. From an evolutionary perspective, greater tactical deception occurs among primates closer to humans, with larger neocortices. In humans, the onset of deceptive behaviours in childhood exhibits a developmental trajectory, which may be regarded as 'normal' in the majority and deficient among a minority with certain neurodevelopmental disorders (e.g. autism). In the human adult, deception and lying exhibit features consistent with their use of 'higher' or 'executive' brain systems. Accurate detection of deception in humans may be of particular importance in forensic practice, while an understanding of its cognitive neurobiology may have implications for models of 'theory of mind' and social cognition, and societal notions of responsibility, guilt and mitigation. In recent years, functional neuroimaging techniques (especially functional magnetic resonance imaging) have been used to study deception. Though few in number, and using very different experimental protocols, studies published in the peer-reviewed literature exhibit certain consistencies. Attempted deception is associated with activation of executive brain regions (particularly prefrontal and anterior cingulate cortices), while truthful responding has not been shown to be associated with any areas of increased activation (relative to deception). Hence, truthful responding may comprise a relative 'baseline' in human cognition and communication. The subject who lies may necessarily engage 'higher' brain centres, consistent with a purpose or intention (to deceive). While the principle of executive control during deception remains plausible, its precise anatomy awaits elucidation. PMID:15590616

  2. Computational based functional analysis of Bacillus phytases.

    PubMed

    Verma, Anukriti; Singh, Vinay Kumar; Gaur, Smriti

    2016-02-01

Phytase is an enzyme which catalyzes the total hydrolysis of phytate to less phosphorylated myo-inositol derivatives and inorganic phosphate, digesting the otherwise indigestible phytate fraction present in seeds and grains and thereby providing digestible phosphorus, calcium and other mineral nutrients. Phytases are frequently added to the feed of monogastric animals so that the bioavailability of phytic acid-bound phosphate increases, ultimately enhancing the nutritional value of diets. Bacillus phytase is well suited to use in animal feed because of its optimum pH and excellent thermal stability. The present study performs an in silico comparative characterization and functional analysis of phytases from Bacillus amyloliquefaciens to explore their physico-chemical properties using various bio-computational tools. All proteins are acidic and thermostable and can be used as suitable candidates in the feed industry. PMID:26672917

  3. Functional requirements for gas characterization system computer software

    SciTech Connect

    Tate, D.D.

    1996-01-01

This document provides the Functional Requirements for the Computer Software operating the Gas Characterization System (GCS), which monitors the combustible gases in the vapor space of selected tanks. Necessary computer functions are defined to support design, testing, operation, and change control. The GCS requires several individual computers to address the control and data acquisition functions of instruments and sensors. These computers are networked for communication, and must multi-task to accommodate operation in parallel.

  4. Green's Function Analysis of Periodic Structures in Computational Electromagnetics

    NASA Astrophysics Data System (ADS)

    Van Orden, Derek

    2011-12-01

    Periodic structures are used widely in electromagnetic devices, including filters, waveguiding structures, and antennas. Their electromagnetic properties may be analyzed computationally by solving an integral equation, in which an unknown equivalent current distribution in a single unit cell is convolved with a periodic Green's function that accounts for the system's boundary conditions. Fast computation of the periodic Green's function is therefore essential to achieve high accuracy solutions of complicated periodic structures, including analysis of modal wave propagation and scattering from external sources. This dissertation first presents alternative spectral representations of the periodic Green's function of the Helmholtz equation for cases of linear periodic systems in 2D and 3D free space and near planarly layered media. Although there exist multiple representations of the periodic Green's function, most are not efficient in the important case where the fields are observed near the array axis. We present spectral-spatial representations for rapid calculation of the periodic Green's functions for linear periodic arrays of current sources residing in free space as well as near a planarly layered medium. They are based on the integral expansion of the periodic Green's functions in terms of the spectral parameters transverse to the array axis. These schemes are important for the rapid computation of the interaction among unit cells of a periodic array, and, by extension, the complex dispersion relations of guided waves. Extensions of this approach to planar periodic structures are discussed. With these computation tools established, we study the traveling wave properties of linear resonant arrays placed near surfaces, and examine the coupling mechanisms that lead to radiation into guided waves supported by the surface. This behavior is especially important to understand the properties of periodic structures printed on dielectric substrates, such as periodic
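A minimal numerical sketch of the spectral-representation idea: for an infinite phased array of 2D line sources, the periodic Green's function can be written as a Floquet-mode sum that converges exponentially off the array axis. The geometry and parameters below are illustrative, not drawn from the dissertation:

```python
import numpy as np

def periodic_greens_spectral(x, y, k, beta, d, M=60):
    """Floquet (spectral) sum for the Green's function of an infinite phased
    array of 2D line sources with period d along x and phase gradient beta.
    Converges exponentially for observation points off the axis (y != 0)."""
    m = np.arange(-M, M + 1)
    kxm = beta + 2.0 * np.pi * m / d                 # Floquet wavenumbers
    kym = np.sqrt((k ** 2 - kxm ** 2).astype(complex))  # principal branch decays
    terms = np.exp(1j * (kxm * x + kym * abs(y))) / kym
    return terms.sum() / (2j * d)

G = periodic_greens_spectral(0.1, 0.3, k=2 * np.pi, beta=0.5, d=1.0)
print(G)
```

A built-in sanity check is the quasi-periodicity condition G(x + d, y) = exp(i*beta*d) * G(x, y), which the spectral form satisfies term by term — exactly the boundary condition the periodic Green's function encodes.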

  5. Computation of the lattice Green function for a dislocation

    NASA Astrophysics Data System (ADS)

    Tan, Anne Marie Z.; Trinkle, Dallas R.

    2016-08-01

    Modeling isolated dislocations is challenging due to their long-ranged strain fields. Flexible boundary condition methods capture the correct long-range strain field of a defect by coupling the defect core to an infinite harmonic bulk through the lattice Green function (LGF). To improve the accuracy and efficiency of flexible boundary condition methods, we develop a numerical method to compute the LGF specifically for a dislocation geometry; in contrast to previous methods, where the LGF was computed for the perfect bulk as an approximation for the dislocation. Our approach directly accounts for the topology of a dislocation, and the errors in the LGF computation converge rapidly for edge dislocations in a simple cubic model system as well as in BCC Fe with an empirical potential. When used within the flexible boundary condition approach, the dislocation LGF relaxes dislocation core geometries in fewer iterations than when the perfect bulk LGF is used as an approximation for the dislocation, making a flexible boundary condition approach more efficient.

  6. An Atomistic Statistically Effective Energy Function for Computational Protein Design.

    PubMed

    Topham, Christopher M; Barbe, Sophie; André, Isabelle

    2016-08-01

    Shortcomings in the definition of effective free-energy surfaces of proteins are recognized to be a major contributory factor responsible for the low success rates of existing automated methods for computational protein design (CPD). The formulation of an atomistic statistically effective energy function (SEEF) suitable for a wide range of CPD applications and its derivation from structural data extracted from protein domains and protein-ligand complexes are described here. The proposed energy function comprises nonlocal atom-based and local residue-based SEEFs, which are coupled using a novel atom connectivity number factor to scale short-range, pairwise, nonbonded atomic interaction energies and a surface-area-dependent cavity energy term. This energy function was used to derive additional SEEFs describing the unfolded-state ensemble of any given residue sequence based on computed average energies for partially or fully solvent-exposed fragments in regions of irregular structure in native proteins. Relative thermal stabilities of 97 T4 bacteriophage lysozyme mutants were predicted from calculated energy differences for folded and unfolded states with an average unsigned error (AUE) of 0.84 kcal mol(-1) when compared to experiment. To demonstrate the utility of the energy function for CPD, further validation was carried out in tests of its capacity to recover cognate protein sequences and to discriminate native and near-native protein folds, loop conformers, and small-molecule ligand binding poses from non-native benchmark decoys. Experimental ligand binding free energies for a diverse set of 80 protein complexes could be predicted with an AUE of 2.4 kcal mol(-1) using an additional energy term to account for the loss in ligand configurational entropy upon binding. The atomistic SEEF is expected to improve the accuracy of residue-based coarse-grained SEEFs currently used in CPD and to extend the range of applications of extant atom-based protein statistical

  7. Visual perception can account for the close relation between numerosity processing and computational fluency

    PubMed Central

    Zhou, Xinlin; Wei, Wei; Zhang, Yiyun; Cui, Jiaxin; Chen, Chuansheng

    2015-01-01

    Studies have shown that numerosity processing (e.g., comparison of numbers of dots in two dot arrays) is significantly correlated with arithmetic performance. Researchers have attributed this association to the fact that both tasks share magnitude processing. The current investigation tested an alternative hypothesis, which states that visual perceptual ability (as measured by a figure-matching task) can account for the close relation between numerosity processing and arithmetic performance (computational fluency). Four hundred and twenty-four third- to fifth-grade children (220 boys and 204 girls, 8.0–11.0 years old; 120 third graders, 146 fourth graders, and 158 fifth graders) were recruited from two schools (one urban and one suburban) in Beijing, China. Six classes were randomly selected from each school, and all students in each selected class participated in the study. All children were given a series of cognitive and mathematical tests, including numerosity comparison, figure matching, forward verbal working memory, visual tracing, non-verbal matrix reasoning, mental rotation, choice reaction time, arithmetic tests, and a curriculum-based mathematical achievement test. Results showed that figure-matching ability had higher correlations with numerosity processing and computational fluency than did other cognitive factors (e.g., forward verbal working memory, visual tracing, non-verbal matrix reasoning, mental rotation, and choice reaction time). More importantly, hierarchical multiple regression showed that figure-matching ability accounted for the well-established association between numerosity processing and computational fluency. In support of the visual perception hypothesis, the results suggest that visual perceptual ability, rather than magnitude processing, may be the shared component of numerosity processing and arithmetic performance. PMID:26441740

  8. Assigning unique identification numbers to new user accounts and groups in a computing environment with multiple registries

    DOEpatents

    DeRobertis, Christopher V.; Lu, Yantian T.

    2010-02-23

    A method, system, and program storage device for creating a new user account or user group with a unique identification number in a computing environment having multiple user registries is provided. In response to receiving a command to create a new user account or user group, an operating system of a clustered computing environment automatically checks multiple registries configured for the operating system to determine whether a candidate identification number for the new user account or user group has been assigned already to one or more existing user accounts or groups, respectively. The operating system automatically assigns the candidate identification number to the new user account or user group created in a target user registry if the checking indicates that the candidate identification number has not been assigned already to any of the existing user accounts or user groups, respectively.

  9. Computer program for Bessel and Hankel functions

    NASA Technical Reports Server (NTRS)

    Kreider, Kevin L.; Saule, Arthur V.; Rice, Edward J.; Clark, Bruce J.

    1991-01-01

    A set of FORTRAN subroutines for calculating Bessel and Hankel functions is presented. The routines calculate Bessel and Hankel functions of the first and second kinds, as well as their derivatives, for wide ranges of integer order and real or complex argument in single or double precision. Depending on the order and argument, one of three evaluation methods is used: the power series definition, an Airy function expansion, or an asymptotic expansion. Routines to calculate Airy functions and their derivatives are also included.
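    Of the three evaluation methods named, the power-series definition is the natural branch for small arguments. The NASA routines are FORTRAN; the following is only a minimal Python sketch of that one branch, for the first kind and integer order.

```python
# Bessel function of the first kind via its power series:
#   J_n(x) = sum_k (-1)^k / (k! (n+k)!) * (x/2)^(2k+n)
# Suitable for small |x|; large arguments need the Airy-expansion or
# asymptotic branches mentioned in the abstract.

from math import factorial

def bessel_j(n, x, terms=30):
    """J_n(x) by direct summation of the defining power series."""
    total = 0.0
    for k in range(terms):
        total += (-1) ** k / (factorial(k) * factorial(n + k)) * (x / 2) ** (2 * k + n)
    return total

# Standard tabulated values: J_0(1) = 0.76519768..., J_1(1) = 0.44005058...
print(bessel_j(0, 1.0), bessel_j(1, 1.0))
```

    Hankel functions of the first and second kinds are then the combinations J_n(x) ± i·Y_n(x); computing Y_n stably is the harder part and is not attempted in this sketch.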

  10. Computer method for identification of boiler transfer functions

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1972-01-01

    An iterative computer-aided procedure was developed that identifies boiler transfer functions from frequency response data. The method uses the frequency response data to obtain a satisfactory transfer function for both high and low vapor exit quality data.

  11. An Experimental Analysis of Computer-Mediated Instruction and Student Attitudes in a Principles of Financial Accounting Course.

    ERIC Educational Resources Information Center

    Basile, Anthony; D'Aquila, Jill M.

    2002-01-01

    Accounting students received either traditional instruction (n=46) or used computer-mediated communication and WebCT course management software. There were no significant differences in attitudes about the course. However, computer users were more positive about course delivery and course management tools. (Contains 17 references.) (SK)

  12. Computing black hole partition functions from quasinormal modes

    NASA Astrophysics Data System (ADS)

    Arnold, Peter; Szepietowski, Phillip; Vaman, Diana

    2016-07-01

    We propose a method of computing one-loop determinants in black hole space-times (with emphasis on asymptotically anti-de Sitter black holes) that may be used for numerics when completely analytic results are unattainable. The method utilizes the expression for one-loop determinants in terms of quasinormal frequencies determined by Denef, Hartnoll and Sachdev in [1]. A numerical evaluation must face the fact that the sum over the quasinormal modes, indexed by momentum and overtone numbers, is divergent. A necessary ingredient is then a regularization scheme to handle the divergent contributions of individual fixed-momentum sectors to the partition function. To this end, we formulate an effective two-dimensional problem in which a natural refinement of standard heat kernel techniques can be used to account for contributions to the partition function at fixed momentum. We test our method in a concrete case by reproducing the scalar one-loop determinant in the BTZ black hole background. We then discuss the application of such techniques to more complicated spacetimes.

  13. Some computational techniques for estimating human operator describing functions

    NASA Technical Reports Server (NTRS)

    Levison, W. H.

    1986-01-01

    Computational procedures for improving the reliability of human operator describing functions are described. Special attention is given to the estimation of standard errors associated with mean operator gain and phase shift as computed from an ensemble of experimental trials. This analysis pertains to experiments using sum-of-sines forcing functions. Both open-loop and closed-loop measurement environments are considered.
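    The basic estimate underlying such sum-of-sines analyses can be sketched as follows: extract the single-frequency DFT coefficients of the forcing input and the operator's output at each forcing frequency, and take their ratio as the describing-function estimate. This is a generic illustration with a synthetic linear "operator" (gain 2, phase lag 0.5 rad), not the report's specific procedure.

```python
# Estimate operator gain and phase at one forcing frequency of a
# sum-of-sines experiment from sampled input u and output y.

import cmath, math

def dft_at(signal, freq, fs):
    """Single-frequency DFT coefficient of `signal` sampled at rate fs."""
    n = len(signal)
    return sum(x * cmath.exp(-2j * math.pi * freq * k / fs)
               for k, x in enumerate(signal)) / n

def describing_fn(u, y, freq, fs):
    """Gain and phase of the operator at one forcing frequency."""
    h = dft_at(y, freq, fs) / dft_at(u, freq, fs)
    return abs(h), cmath.phase(h)

fs, freq, n = 100.0, 2.0, 500          # 5 s record: exactly 10 forcing cycles
t = [k / fs for k in range(n)]
u = [math.sin(2 * math.pi * freq * tk) for tk in t]
y = [2.0 * math.sin(2 * math.pi * freq * tk - 0.5) for tk in t]
gain, phase = describing_fn(u, y, freq, fs)
print(gain, phase)   # ~2.0 gain, ~-0.5 rad phase
```

    Repeating this per trial and per frequency gives an ensemble of gain and phase estimates; the standard errors discussed in the abstract are then the across-trial standard deviations divided by the square root of the number of trials.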

  14. Computer Use and the Relation between Age and Cognitive Functioning

    ERIC Educational Resources Information Center

    Soubelet, Andrea

    2012-01-01

    This article investigates whether computer use for leisure could mediate or moderate the relations between age and cognitive functioning. Findings supported smaller age differences in measures of cognitive functioning for people who reported spending more hours using a computer. Because of the cross-sectional design of the study, two alternative…

  15. Pair correlation function integrals: Computation and use

    NASA Astrophysics Data System (ADS)

    Wedberg, Rasmus; O'Connell, John P.; Peters, Günther H.; Abildskov, Jens

    2011-08-01

    We describe a method for extending radial distribution functions obtained from molecular simulations of pure and mixed molecular fluids to arbitrary distances. The method allows total correlation function integrals to be reliably calculated from simulations of relatively small systems. The long-distance behavior of radial distribution functions is determined by requiring that the corresponding direct correlation functions follow certain approximations at long distances. We have briefly described the method and tested its performance in previous communications [R. Wedberg, J. P. O'Connell, G. H. Peters, and J. Abildskov, Mol. Simul. 36, 1243 (2010), 10.1080/08927020903536366; Fluid Phase Equilib. 302, 32 (2011), 10.1016/j.fluid.2010.10.004], but describe here its theoretical basis more thoroughly and derive long-distance approximations for the direct correlation functions. We describe the numerical implementation of the method in detail, and report numerical tests complementing previous results. Pure molecular fluids are here studied in the isothermal-isobaric ensemble with isothermal compressibilities evaluated from the total correlation function integrals and compared with values derived from volume fluctuations. For systems where the radial distribution function has structure beyond the sampling limit imposed by the system size, the integration is more reliable, and usually more accurate, than simple integral truncation.
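    The quantity at stake is the total correlation function integral (Kirkwood-Buff integral) G = 4π ∫ h(r) r² dr with h(r) = g(r) − 1, whose truncation at the simulation box radius is exactly the problem the long-distance matching addresses. A minimal numerical sketch, using a synthetic h(r) = −exp(−r) so the exact answer (−8π) is known:

```python
# Trapezoidal estimate of the Kirkwood-Buff integral
#   G = 4*pi * int_0^R h(r) r^2 dr
# on a finite grid truncated at R; h(r) = g(r) - 1.

import math

def kb_integral(r, h):
    """Trapezoidal estimate of 4*pi * int h(r) r^2 dr on the grid r."""
    total = 0.0
    for i in range(len(r) - 1):
        fa = h[i] * r[i] ** 2
        fb = h[i + 1] * r[i + 1] ** 2
        total += 0.5 * (fa + fb) * (r[i + 1] - r[i])
    return 4.0 * math.pi * total

r = [0.01 * i for i in range(4001)]        # truncate at R = 40
h = [-math.exp(-ri) for ri in r]           # synthetic total correlation fn
G = kb_integral(r, h)
# exact: 4*pi * int_0^inf -e^{-r} r^2 dr = -8*pi = -25.1327...
print(G)
```

    The isothermal compressibility then follows from the compressibility equation, κ_T/κ_T^ideal = 1 + ρG; with a simulated g(r) the r² weight amplifies the noisy, truncated tail, which is why the paper's long-distance extension matters.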

  16. Singular Function Integration in Computational Physics

    NASA Astrophysics Data System (ADS)

    Hasbun, Javier

    2009-03-01

    In teaching computational methods in the undergraduate physics curriculum, standard integration approaches taught include the rectangular, trapezoidal, Simpson, Romberg, and others. Over time, these techniques have proven to be invaluable and students are encouraged to employ the most efficient method that is expected to perform best when applied to a given problem. However, some physics research applications require techniques that can handle singularities. While decreasing the step size in traditional approaches is an alternative, this may not always work and repetitive processes make this route even more inefficient. Here, I present two existing integration rules designed to handle singular integrals. I compare them to traditional rules as well as to the exact analytic results. I suggest that it is perhaps time to include such approaches in the undergraduate computational physics course.
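    The point can be made with ∫₀¹ dx/√x = 2: a naive composite rule converges slowly near the x = 0 singularity, while the substitution x = t² (dx = 2t dt) removes the singularity entirely. This is one of several standard singularity-handling tricks; a classroom-scale sketch:

```python
# Compare a naive midpoint rule against the same rule after the
# singularity-removing substitution x = t^2 for int_0^1 x^(-1/2) dx = 2.

import math

def midpoint(f, a, b, n):
    """Composite midpoint rule with n panels."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: 1.0 / math.sqrt(x)

naive = midpoint(f, 0.0, 1.0, 1000)
# substitution: int_0^1 2*t*f(t^2) dt, whose integrand is the constant 2
transformed = midpoint(lambda t: 2.0 * t * f(t * t), 0.0, 1.0, 1000)

print(naive, transformed)   # naive is off in the 2nd decimal; transformed is exact
```

    With 1000 panels the naive rule is still wrong by about 0.02 because the error is dominated by the panels nearest the singularity, while the transformed integrand is smooth (here, constant) and integrates exactly.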

  17. 45 CFR 302.20 - Separation of cash handling and accounting functions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 2 2011-10-01 2011-10-01 false Separation of cash handling and accounting functions. 302.20 Section 302.20 Public Welfare Regulations Relating to Public Welfare OFFICE OF CHILD..., DEPARTMENT OF HEALTH AND HUMAN SERVICES STATE PLAN REQUIREMENTS § 302.20 Separation of cash handling...

  18. 45 CFR 302.20 - Separation of cash handling and accounting functions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 2 2014-10-01 2012-10-01 true Separation of cash handling and accounting functions. 302.20 Section 302.20 Public Welfare Regulations Relating to Public Welfare OFFICE OF CHILD..., DEPARTMENT OF HEALTH AND HUMAN SERVICES STATE PLAN REQUIREMENTS § 302.20 Separation of cash handling...

  19. A Management Study of the MCPS Accounting System and Certain Related Financial Services Functions. Final Report.

    ERIC Educational Resources Information Center

    Young (Arthur) and Co., Washington, DC.

    Several years ago, Montgomery County Public Schools (MCPS) began a Management Operations Review and Evaluation (MORE) of the entire school system, excluding school-based instruction. This MORE study is an evaluation of MCPS's current accounting system and certain related financial services functions within the Department of Financial Services. In…

  20. 45 CFR 302.20 - Separation of cash handling and accounting functions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 2 2010-10-01 2010-10-01 false Separation of cash handling and accounting functions. 302.20 Section 302.20 Public Welfare Regulations Relating to Public Welfare OFFICE OF CHILD SUPPORT ENFORCEMENT (CHILD SUPPORT ENFORCEMENT PROGRAM), ADMINISTRATION FOR CHILDREN AND...

  1. Basic mathematical function libraries for scientific computation

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1989-01-01

    Ada packages implementing selected mathematical functions for the support of scientific and engineering applications were written. The packages provide the Ada programmer with the mathematical function support found in the languages Pascal and FORTRAN as well as an extended precision arithmetic and a complete complex arithmetic. The algorithms used are fully described and analyzed. Implementation assumes that the Ada type FLOAT objects fully conform to the IEEE 754-1985 standard for single binary floating-point arithmetic, and that INTEGER objects are 32-bit entities. Codes for the Ada packages are included as appendixes.

  2. The Computer and Its Functions; How to Communicate with the Computer.

    ERIC Educational Resources Information Center

    Ward, Peggy M.

    A brief discussion of why it is important for students to be familiar with computers and their functions and a list of some practical applications introduce this two-part paper. Focusing on how the computer works, the first part explains the various components of the computer, different kinds of memory storage devices, disk operating systems, and…

  3. 49 CFR 1242.78 - Employees performing clerical and accounting functions, and loss and damage claims processing...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... functions, and loss and damage claims processing (accounts XX-55-76 and XX-55-78). 1242.78 Section 1242.78... Employees performing clerical and accounting functions, and loss and damage claims processing (accounts XX-55-76 and XX-55-78). If the sum of the direct freight and the direct passenger expenses is more...

  4. Inaccuracies of trigonometric functions in computer mathematical libraries

    NASA Astrophysics Data System (ADS)

    Ito, Takashi; Kojima, Sadamu

    Recent progress in the development of high speed computers has enabled us to perform larger and faster numerical experiments in astronomy. However, sometimes the high speed of numerical computation is achieved at the cost of accuracy. In this paper we show an example of accuracy loss by some mathematical functions on certain computer platforms at the Astronomical Data Analysis Center, National Astronomical Observatory of Japan. We focus in particular on the numerical inaccuracy in sine and cosine functions, demonstrating how accuracy deterioration emerges. We also describe the measures that we have so far taken against these numerical inaccuracies. In general, computer vendors are not eager to improve the numerical accuracy in the mathematical libraries that they are supposed to be responsible for. Therefore scientists have to be aware of the existence of numerical inaccuracies, and protect their computational results from contamination by the potential errors that many computer platforms inherently contain.
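    The mechanism of such accuracy loss is easy to demonstrate (this illustrates the general phenomenon, not any particular vendor library): summing the Taylor series of sin(x) directly at a large argument suffers catastrophic cancellation between huge alternating terms, while a good library implementation reduces the argument to a small range first.

```python
# Direct Maclaurin-series evaluation of sin(x): mathematically the series
# converges for every x, but numerically the alternating terms grow to
# ~7e11 at x = 30 before cancelling, so double precision loses ~12 digits.

import math

def taylor_sin(x, terms=60):
    """sin(x) summed directly from its Maclaurin series."""
    total, term = 0.0, x
    for k in range(terms):
        total += term
        term *= -x * x / ((2 * k + 2) * (2 * k + 3))
    return total

print(taylor_sin(30.0))   # typically wrong around the 4th decimal or worse
print(math.sin(30.0))     # -0.98803..., the library does argument reduction
```

    The same cancellation, hidden inside a poorly implemented library routine or a sloppy argument-reduction step, is the kind of platform-dependent inaccuracy the paper warns about.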

  5. Examining Functions in Mathematics and Science Using Computer Interfacing.

    ERIC Educational Resources Information Center

    Walton, Karen Doyle

    1988-01-01

    Introduces microcomputer interfacing as a method for explaining and demonstrating various aspects of the concept of function. Provides three experiments with illustrations and typical computer graphic displays: pendulum motion, pendulum study using two pendulums, and heat absorption and radiation. (YP)

  6. Do Parents Recognize Autistic Deviant Behavior Long before Diagnosis? Taking into Account Interaction Using Computational Methods

    PubMed Central

    Saint-Georges, Catherine; Mahdhaoui, Ammar; Chetouani, Mohamed; Cassel, Raquel S.; Laznik, Marie-Christine; Apicella, Fabio; Muratori, Pietro; Maestro, Sandra; Muratori, Filippo; Cohen, David

    2011-01-01

    Background To assess whether taking into account interaction synchrony would help to better differentiate autism (AD) from intellectual disability (ID) and typical development (TD) in family home movies of infants aged less than 18 months, we used computational methods. Methodology and Principal Findings First, we analyzed interactive sequences extracted from home movies of children with AD (N = 15), ID (N = 12), or TD (N = 15) through the Infant and Caregiver Behavior Scale (ICBS). Second, discrete behaviors between baby (BB) and Care Giver (CG) co-occurring in less than 3 seconds were selected as single interactive patterns (or dyadic events) for analysis of the two directions of interaction (CG→BB and BB→CG) by group and semester. To do so, we used a Markov assumption, a Generalized Linear Mixed Model, and non-negative matrix factorization. Compared to TD children, BBs with AD exhibit a growing deviant development of interactive patterns whereas those with ID rather show an initial delay of development. Parents of AD and ID do not differ very much from parents of TD when responding to their child. However, when initiating interaction, parents use more touching and regulation up behaviors as early as the first semester. Conclusion When studying interactive patterns, deviant autistic behaviors appear before 18 months. Parents seem to feel the lack of interactive initiative and responsiveness of their babies and try to increasingly supply soliciting behaviors. Thus we stress that credence should be given to parents' intuition as they recognize, long before diagnosis, the pathological process through the interactive pattern with their child. PMID:21818320

  7. Walk a Mile in My Shoes: Stakeholder Accounts of Testing Experience with a Computer-Administered Test

    ERIC Educational Resources Information Center

    Fox, Janna; Cheng, Liying

    2015-01-01

    In keeping with the trend to elicit multiple stakeholder responses to operational tests as part of test validation, this exploratory mixed methods study examines test-taker accounts of an Internet-based (i.e., computer-administered) test in the high-stakes context of proficiency testing for university admission. In 2013, as language testing…

  8. Factors accounting for psychosocial functioning in patients with low back pain

    PubMed Central

    Steuden, Stanisława; Kuryłowicz, Joanna

    2009-01-01

    Low back pain (LBP) is a chronic disorder which exerts a profound impact on various spheres of psychosocial functioning, including emotional distress, functional limitations and decrements in social contacts. The objective of this study was to investigate the associations between the indices of psychosocial functioning in patients with chronic LBP and a range of psychological factors. Specifically, the study aimed at exploring the relative participation of personality, social support, disease-related cognitive appraisals and coping styles in accounting for the differences in psychosocial functioning of patients with LBP. One-hundred-twenty patients with LBP took part in the study and completed a battery of psychological questionnaires: NEO–Five Factors Inventory, Ways of Coping Questionnaire, Disease-Related Social Support Scale, Disease-Related Appraisals Scale and Psychosocial Functioning Questionnaire (PFQ). The PFQ dimensions were used as dependent variables in a series of stepwise regression analysis models with the scores from other questionnaires entered as independent variables. A cognitive appraisal of the disease in terms of an obstacle was strongly related to all domains of functioning; however, other appraisals (threat, challenge, harm, profit and overall disease importance) were uniquely associated with particular domains of functioning. Deprivation of social support was a significant predictor of distress experienced in interpersonal context and of sense of being disabled. Among basic personality traits, agreeableness was negatively associated with distress in interpersonal context, and conscientiousness was positively related to acceptance of life with the disease. Problem-focus coping was linked to higher acceptance of life with the disease. Among sociodemographic variables, older age and lower educational level were related to greater subjective feelings of being disabled. Pain severity was found unrelated to any of psychosocial functioning

  9. Neuromotor recovery from stroke: computational models at central, functional, and muscle synergy level

    PubMed Central

    Casadio, Maura; Tamagnone, Irene; Summa, Susanna; Sanguineti, Vittorio

    2013-01-01

    Computational models of neuromotor recovery after a stroke might help to unveil the underlying physiological mechanisms and might suggest how to make recovery faster and more effective. At least in principle, these models could serve: (i) To provide testable hypotheses on the nature of recovery; (ii) To predict the recovery of individual patients; (iii) To design patient-specific “optimal” therapy, by setting the treatment variables for maximizing the amount of recovery or for achieving a better generalization of the learned abilities across different tasks. Here we review the state of the art of computational models for neuromotor recovery through exercise, and their implications for treatment. We show that to properly account for the computational mechanisms of neuromotor recovery, multiple levels of description need to be taken into account. The review specifically covers models of recovery at central, functional and muscle synergy level. PMID:23986688

  10. Computer-Intensive Algebra and Students' Conceptual Knowledge of Functions.

    ERIC Educational Resources Information Center

    O'Callaghan, Brian R.

    1998-01-01

    Describes a research project that examined the effects of the Computer-Intensive Algebra (CIA) and traditional algebra curricula on students' (N=802) understanding of the function concept. Results indicate that CIA students achieved a better understanding of functions and were better at the components of modeling, interpreting, and translating.…

  11. Assessment of cognitive function in alcoholics by computer: a control study.

    PubMed

    Acker, C; Acker, W; Shaw, G K

    1984-01-01

    Results are presented of the performance by 103 alcoholics and 90 controls on six computer-administered tests of cognitive function. The main analysis compared performance of the two groups when pre-existing differences in intellectual capacity, as estimated by NART, were accounted for statistically. The performance of the alcoholics was worse, at a statistically significant level, on 18 of 23 measures. Procedurally, the tests were found to offer practical advantages over conventional procedures. PMID:6508877

  12. Convergence rate for numerical computation of the lattice Green's function.

    PubMed

    Ghazisaeidi, M; Trinkle, D R

    2009-03-01

    Flexible boundary-condition methods couple an isolated defect to bulk through the bulk lattice Green's function. Direct computation of the lattice Green's function requires projecting out the singular subspace of uniform displacements and forces for the infinite lattice. We calculate the convergence rates for elastically isotropic and anisotropic cases for three different techniques: relative displacement, elastic Green's function correction, and discontinuity correction. The discontinuity correction has the most rapid convergence for the general case. PMID:19392089

  13. PERFORMANCE OF A COMPUTER-BASED ASSESSMENT OF COGNITIVE FUNCTION MEASURES IN TWO COHORTS OF SENIORS

    PubMed Central

    Espeland, Mark A.; Katula, Jeffrey A.; Rushing, Julia; Kramer, Arthur F.; Jennings, Janine M.; Sink, Kaycee M.; Nadkarni, Neelesh K.; Reid, Kieran F.; Castro, Cynthia M.; Church, Timothy; Kerwin, Diana R.; Williamson, Jeff D.; Marottoli, Richard A.; Rushing, Scott; Marsiske, Michael; Rapp, Stephen R.

    2013-01-01

    Background Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials; however, its performance in these settings has not been systematically evaluated. Design The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool for assessing memory performance and executive functioning. The Lifestyle Interventions and Independence for Seniors (LIFE) investigators incorporated this battery in a full scale multicenter clinical trial (N=1635). We describe relationships that test scores have with those from interviewer-administered cognitive function tests and risk factors for cognitive deficits and describe performance measures (completeness, intra-class correlations). Results Computer-based assessments of cognitive function had consistent relationships across the pilot and full scale trial cohorts with interviewer-administered assessments of cognitive function, age, and a measure of physical function. In the LIFE cohort, their external validity was further demonstrated by associations with other risk factors for cognitive dysfunction: education, hypertension, diabetes, and physical function. Acceptable levels of data completeness (>83%) were achieved on all computer-based measures; however, rates of missing data were higher among older participants (odds ratio=1.06 for each additional year; p<0.001) and those who reported no current computer use (odds ratio=2.71; p<0.001). Intra-class correlations among clinics were at least as low (ICC≤0.013) as for interviewer measures (ICC≤0.023), reflecting good standardization. All cognitive measures loaded onto the first principal component (global cognitive function), which accounted for 40% of the overall variance. Conclusion Our results support the use of computer-based tools for assessing cognitive function in multicenter clinical trials of older individuals. PMID:23589390

  14. Wigner Function Negativity and Contextuality in Quantum Computation on Rebits

    NASA Astrophysics Data System (ADS)

    Delfosse, Nicolas; Allard Guerin, Philippe; Bian, Jacob; Raussendorf, Robert

    2015-04-01

    We describe a universal scheme of quantum computation by state injection on rebits (states with real density matrices). For this scheme, we establish contextuality and Wigner function negativity as computational resources, extending results of M. Howard et al. [Nature (London) 510, 351 (2014), 10.1038/nature13460] to two-level systems. For this purpose, we define a Wigner function suited to systems of n rebits and prove a corresponding discrete Hudson's theorem. We introduce contextuality witnesses for rebit states and discuss the compatibility of our result with state-independent contextuality.

  15. Computer method for identification of boiler transfer functions

    NASA Technical Reports Server (NTRS)

    Miles, J. H.

    1971-01-01

    An iterative computer method is described for identifying boiler transfer functions using frequency response data. An objective penalized performance measure and a nonlinear minimization technique are used to cause the locus of points generated by a transfer function to resemble the locus of points obtained from frequency response measurements. Different transfer functions can be tried until a satisfactory empirical transfer function to the system is found. To illustrate the method, some examples and some results from a study of a set of data consisting of measurements of the inlet impedance of a single tube forced flow boiler with inserts are given.
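    The loop the abstract describes, namely proposing a transfer-function form, scoring it against measured frequency-response points, and iterating the parameters until the model locus matches the data locus, can be sketched in miniature. Here a first-order lag K/(1 + τs) is fitted to synthetic data by a naive coordinate search; the report uses a penalized performance measure and a proper nonlinear minimizer, so this is only a sketch of the overall structure.

```python
# Fit a candidate transfer function K / (1 + tau*s) to frequency-response
# data by minimizing the squared distance between model and measured loci.

import math

def model(K, tau, w):
    """Frequency response of K / (1 + tau*s) at s = j*w."""
    return complex(K) / complex(1.0, tau * w)

def cost(K, tau, data):
    """Sum of squared distances between model and measured response points."""
    return sum(abs(model(K, tau, w) - h) ** 2 for w, h in data)

# synthetic "measurements" from a true system with K=3, tau=0.5
freqs = [0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0]
data = [(w, model(3.0, 0.5, w)) for w in freqs]

K, tau, step = 1.0, 1.0, 0.5
for _ in range(200):                       # crude coordinate descent
    improved = False
    for dK, dt in ((step, 0), (-step, 0), (0, step), (0, -step)):
        if cost(K + dK, tau + dt, data) < cost(K, tau, data):
            K, tau = K + dK, tau + dt
            improved = True
    if not improved:
        step /= 2.0                        # no move helped: refine the step

print(K, tau)                              # converges toward K=3, tau=0.5
```

    Trying different transfer-function forms, as the abstract suggests, amounts to swapping out `model` and rerunning the same minimization until a satisfactory empirical fit is found.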

  16. A large-scale evaluation of computational protein function prediction

    PubMed Central

    Radivojac, Predrag; Clark, Wyatt T; Ronnen Oron, Tal; Schnoes, Alexandra M; Wittkop, Tobias; Sokolov, Artem; Graim, Kiley; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa; Pandey, Gaurav; Yunes, Jeffrey M; Talwalkar, Ameet S; Repo, Susanna; Souza, Michael L; Piovesan, Damiano; Casadio, Rita; Wang, Zheng; Cheng, Jianlin; Fang, Hai; Gough, Julian; Koskinen, Patrik; Törönen, Petri; Nokso-Koivisto, Jussi; Holm, Liisa; Cozzetto, Domenico; Buchan, Daniel W A; Bryson, Kevin; Jones, David T; Limaye, Bhakti; Inamdar, Harshal; Datta, Avik; Manjari, Sunitha K; Joshi, Rajendra; Chitale, Meghana; Kihara, Daisuke; Lisewski, Andreas M; Erdin, Serkan; Venner, Eric; Lichtarge, Olivier; Rentzsch, Robert; Yang, Haixuan; Romero, Alfonso E; Bhat, Prajwal; Paccanaro, Alberto; Hamp, Tobias; Kassner, Rebecca; Seemayer, Stefan; Vicedo, Esmeralda; Schaefer, Christian; Achten, Dominik; Auer, Florian; Böhm, Ariane; Braun, Tatjana; Hecht, Maximilian; Heron, Mark; Hönigschmid, Peter; Hopf, Thomas; Kaufmann, Stefanie; Kiening, Michael; Krompass, Denis; Landerer, Cedric; Mahlich, Yannick; Roos, Manfred; Björne, Jari; Salakoski, Tapio; Wong, Andrew; Shatkay, Hagit; Gatzmann, Fanny; Sommer, Ingolf; Wass, Mark N; Sternberg, Michael J E; Škunca, Nives; Supek, Fran; Bošnjak, Matko; Panov, Panče; Džeroski, Sašo; Šmuc, Tomislav; Kourmpetis, Yiannis A I; van Dijk, Aalt D J; ter Braak, Cajo J F; Zhou, Yuanpeng; Gong, Qingtian; Dong, Xinran; Tian, Weidong; Falda, Marco; Fontana, Paolo; Lavezzo, Enrico; Di Camillo, Barbara; Toppo, Stefano; Lan, Liang; Djuric, Nemanja; Guo, Yuhong; Vucetic, Slobodan; Bairoch, Amos; Linial, Michal; Babbitt, Patricia C; Brenner, Steven E; Orengo, Christine; Rost, Burkhard; Mooney, Sean D; Friedberg, Iddo

    2013-01-01

    Automated annotation of protein function is challenging. As the number of sequenced genomes rapidly grows, the overwhelming majority of protein products can only be annotated computationally. If computational predictions are to be relied upon, it is crucial that the accuracy of these methods be high. Here we report the results from the first large-scale community-based Critical Assessment of protein Function Annotation (CAFA) experiment. Fifty-four methods representing the state-of-the-art for protein function prediction were evaluated on a target set of 866 proteins from eleven organisms. Two findings stand out: (i) today’s best protein function prediction algorithms significantly outperformed widely-used first-generation methods, with large gains on all types of targets; and (ii) although the top methods perform well enough to guide experiments, there is significant need for improvement of currently available tools. PMID:23353650
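    The flavor of such an evaluation can be sketched with the protein-centric precision/recall commonly used in CAFA-style assessments: a predictor assigns scores to ontology terms for a protein, and sweeping a score threshold traces a precision-recall curve whose maximum F1 ("Fmax") summarizes the method. Toy data below; the real benchmark also propagates terms through the ontology graph and averages over many target proteins, both omitted here.

```python
# Maximum F1 over score thresholds for one protein's term predictions.

def f_max(predictions, truth):
    """predictions: {term: score}; truth: set of true terms."""
    best = 0.0
    for t in sorted(set(predictions.values())):
        called = {term for term, s in predictions.items() if s >= t}
        if not called:
            continue
        prec = len(called & truth) / len(called)
        rec = len(called & truth) / len(truth)
        if prec + rec > 0:
            best = max(best, 2 * prec * rec / (prec + rec))
    return best

truth = {"GO:0003677", "GO:0006355"}
pred = {"GO:0003677": 0.9, "GO:0006355": 0.6, "GO:0005515": 0.4}
print(f_max(pred, truth))   # 1.0: at threshold 0.6 the call set is exact
```

    At threshold 0.4 precision drops to 2/3 (one false positive), at 0.9 recall drops to 1/2 (one missed term), and at 0.6 both are 1, so Fmax is 1.0 for this toy predictor.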

  17. The flight telerobotic servicer: From functional architecture to computer architecture

    NASA Technical Reports Server (NTRS)

    Lumia, Ronald; Fiala, John

    1989-01-01

    After a brief tutorial on the NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) functional architecture, the approach to its implementation is shown. First, interfaces must be defined which are capable of supporting the known algorithms. This is illustrated by considering the interfaces required for the SERVO level of the NASREM functional architecture. After interface definition, the specific computer architecture for the implementation must be determined. This choice is obviously technology dependent. An example illustrating one possible mapping of the NASREM functional architecture to a particular set of computers which implements it is shown. The result of choosing the NASREM functional architecture is that it provides a technology independent paradigm which can be mapped into a technology dependent implementation capable of evolving with technology in the laboratory and in space.

  18. Computational approaches for rational design of proteins with novel functionalities

    PubMed Central

    Tiwari, Manish Kumar; Singh, Ranjitha; Singh, Raushan Kumar; Kim, In-Won; Lee, Jung-Kul

    2012-01-01

    Proteins are the most multifaceted macromolecules in living systems and have various important functions, including structural, catalytic, sensory, and regulatory functions. Rational design of enzymes is a great challenge to our understanding of protein structure and physical chemistry and has numerous potential applications. Protein design algorithms have been applied to design or engineer proteins that fold, fold faster, catalyze, catalyze faster, signal, and adopt preferred conformational states. The field of de novo protein design, although only a few decades old, is beginning to produce exciting results. Developments in this field are already having a significant impact on biotechnology and chemical biology. The application of powerful computational methods for functional protein designing has recently succeeded at engineering target activities. Here, we review recently reported de novo functional proteins that were developed using various protein design approaches, including rational design, computational optimization, and selection from combinatorial libraries, highlighting recent advances and successes. PMID:24688643

  19. A large-scale evaluation of computational protein function prediction.

    PubMed

    Radivojac, Predrag; Clark, Wyatt T; Oron, Tal Ronnen; Schnoes, Alexandra M; Wittkop, Tobias; Sokolov, Artem; Graim, Kiley; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa; Pandey, Gaurav; Yunes, Jeffrey M; Talwalkar, Ameet S; Repo, Susanna; Souza, Michael L; Piovesan, Damiano; Casadio, Rita; Wang, Zheng; Cheng, Jianlin; Fang, Hai; Gough, Julian; Koskinen, Patrik; Törönen, Petri; Nokso-Koivisto, Jussi; Holm, Liisa; Cozzetto, Domenico; Buchan, Daniel W A; Bryson, Kevin; Jones, David T; Limaye, Bhakti; Inamdar, Harshal; Datta, Avik; Manjari, Sunitha K; Joshi, Rajendra; Chitale, Meghana; Kihara, Daisuke; Lisewski, Andreas M; Erdin, Serkan; Venner, Eric; Lichtarge, Olivier; Rentzsch, Robert; Yang, Haixuan; Romero, Alfonso E; Bhat, Prajwal; Paccanaro, Alberto; Hamp, Tobias; Kaßner, Rebecca; Seemayer, Stefan; Vicedo, Esmeralda; Schaefer, Christian; Achten, Dominik; Auer, Florian; Boehm, Ariane; Braun, Tatjana; Hecht, Maximilian; Heron, Mark; Hönigschmid, Peter; Hopf, Thomas A; Kaufmann, Stefanie; Kiening, Michael; Krompass, Denis; Landerer, Cedric; Mahlich, Yannick; Roos, Manfred; Björne, Jari; Salakoski, Tapio; Wong, Andrew; Shatkay, Hagit; Gatzmann, Fanny; Sommer, Ingolf; Wass, Mark N; Sternberg, Michael J E; Škunca, Nives; Supek, Fran; Bošnjak, Matko; Panov, Panče; Džeroski, Sašo; Šmuc, Tomislav; Kourmpetis, Yiannis A I; van Dijk, Aalt D J; ter Braak, Cajo J F; Zhou, Yuanpeng; Gong, Qingtian; Dong, Xinran; Tian, Weidong; Falda, Marco; Fontana, Paolo; Lavezzo, Enrico; Di Camillo, Barbara; Toppo, Stefano; Lan, Liang; Djuric, Nemanja; Guo, Yuhong; Vucetic, Slobodan; Bairoch, Amos; Linial, Michal; Babbitt, Patricia C; Brenner, Steven E; Orengo, Christine; Rost, Burkhard; Mooney, Sean D; Friedberg, Iddo

    2013-03-01

    Automated annotation of protein function is challenging. As the number of sequenced genomes rapidly grows, the overwhelming majority of protein products can only be annotated computationally. If computational predictions are to be relied upon, it is crucial that the accuracy of these methods be high. Here we report the results from the first large-scale community-based critical assessment of protein function annotation (CAFA) experiment. Fifty-four methods representing the state of the art for protein function prediction were evaluated on a target set of 866 proteins from 11 organisms. Two findings stand out: (i) today's best protein function prediction algorithms substantially outperform widely used first-generation methods, with large gains on all types of targets; and (ii) although the top methods perform well enough to guide experiments, there is considerable need for improvement of currently available tools. PMID:23353650

  20. Individual and developmental differences in semantic priming: empirical and computational support for a single-mechanism account of lexical processing.

    PubMed

    Plaut, D C; Booth, J R

    2000-10-01

    Existing accounts of single-word semantic priming phenomena incorporate multiple mechanisms, such as spreading activation, expectancy-based processes, and postlexical semantic matching. The authors provide empirical and computational support for a single-mechanism distributed network account. Previous studies have found greater semantic priming for low- than for high-frequency target words as well as inhibition following unrelated primes only at long stimulus-onset asynchronies (SOAs). A series of experiments examined the modulation of these effects by individual differences in age or perceptual ability. Third-grade, 6th-grade, and college students performed a lexical-decision task on high- and low-frequency target words preceded by related, unrelated, and nonword primes. Greater priming for low-frequency targets was exhibited only by participants with high perceptual ability. Moreover, unlike the college students, the children showed no inhibition even at the long SOA. The authors provide an account of these results in terms of the properties of distributed network models and support this account with an explicit computational simulation. PMID:11089407

  1. Outcomes Assessment of Computer-Assisted Behavioral Objectives for Accounting Graduates.

    ERIC Educational Resources Information Center

    Moore, John W.; Mitchem, Cheryl E.

    1997-01-01

    Presents behavioral objectives for accounting students and an outcomes assessment plan with five steps: (1) identification and definition of student competencies; (2) selection of valid instruments; (3) integration of assessment and instruction; (4) determination of levels of assessment; and (5) attribution of improvements to the program. (SK)

  2. Computational design of proteins with novel structure and functions

    NASA Astrophysics Data System (ADS)

    Wei, Yang; Lu-Hua, Lai

    2016-01-01

    Computational design of proteins is a relatively new field, where scientists search the enormous sequence space for sequences that can fold into desired structures and perform desired functions. With the computational approach, proteins can be designed, for example, as regulators of biological processes, novel enzymes, or as biotherapeutics. These approaches not only provide valuable information for understanding sequence-structure-function relations in proteins, but also hold promise for applications to protein engineering and biomedical research. In this review, we briefly introduce the rationale for computational protein design, then summarize the recent progress in this field, including de novo protein design, enzyme design, and design of protein-protein interactions. Challenges and future prospects of this field are also discussed. Project supported by the National Basic Research Program of China (Grant No. 2015CB910300), the National High Technology Research and Development Program of China (Grant No. 2012AA020308), and the National Natural Science Foundation of China (Grant No. 11021463).

  3. Quantum Computing Without Wavefunctions: Time-Dependent Density Functional Theory for Universal Quantum Computation

    PubMed Central

    Tempel, David G.; Aspuru-Guzik, Alán

    2012-01-01

    We prove that the theorems of TDDFT can be extended to a class of qubit Hamiltonians that are universal for quantum computation. The theorems of TDDFT applied to universal Hamiltonians imply that single-qubit expectation values can be used as the basic variables in quantum computation and information theory, rather than wavefunctions. From a practical standpoint this opens the possibility of approximating observables of interest in quantum computations directly in terms of single-qubit quantities (i.e. as density functionals). Additionally, we also demonstrate that TDDFT provides an exact prescription for simulating universal Hamiltonians with other universal Hamiltonians that have different, and possibly easier-to-realize two-qubit interactions. This establishes the foundations of TDDFT for quantum computation and opens the possibility of developing density functionals for use in quantum algorithms. PMID:22553483

  4. Quantum computing without wavefunctions: time-dependent density functional theory for universal quantum computation.

    PubMed

    Tempel, David G; Aspuru-Guzik, Alán

    2012-01-01

    We prove that the theorems of TDDFT can be extended to a class of qubit Hamiltonians that are universal for quantum computation. The theorems of TDDFT applied to universal Hamiltonians imply that single-qubit expectation values can be used as the basic variables in quantum computation and information theory, rather than wavefunctions. From a practical standpoint this opens the possibility of approximating observables of interest in quantum computations directly in terms of single-qubit quantities (i.e. as density functionals). Additionally, we also demonstrate that TDDFT provides an exact prescription for simulating universal Hamiltonians with other universal Hamiltonians that have different, and possibly easier-to-realize two-qubit interactions. This establishes the foundations of TDDFT for quantum computation and opens the possibility of developing density functionals for use in quantum algorithms. PMID:22553483

  5. SNAP: A computer program for generating symbolic network functions

    NASA Technical Reports Server (NTRS)

    Lin, P. M.; Alderson, G. E.

    1970-01-01

    The computer program SNAP (symbolic network analysis program) generates symbolic network functions for networks containing R, L, and C type elements and all four types of controlled sources. The program is efficient with respect to program storage and execution time. A discussion of the basic algorithms is presented, together with user's and programmer's guides.
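A small numerical cross-check of the kind of network function SNAP derives symbolically (the circuit, component values, and test frequencies below are illustrative assumptions, not taken from the SNAP report). For an RC low-pass filter, nodal analysis gives the same transfer function as the closed form H(s) = 1/(1 + sRC):

```python
# Cross-check a simple network function by nodal analysis, in the spirit of
# the symbolic functions SNAP generates for R, L, C networks. Here the single
# nodal equation (1/R + s*C) * Vout = Vin/R is solved with Vin = 1 and
# compared against the closed-form H(s) = 1 / (1 + s*R*C).

def rc_lowpass_nodal(R, C, s):
    """Solve the nodal equation (1/R + s*C) * Vout = Vin/R with Vin = 1."""
    return (1.0 / R) / (1.0 / R + s * C)

def rc_lowpass_closed_form(R, C, s):
    return 1.0 / (1.0 + s * R * C)

R, C = 1e3, 1e-6                          # 1 kOhm, 1 uF: corner near 159 Hz
for f in (10.0, 159.15, 1e4):
    s = 1j * 2.0 * 3.141592653589793 * f  # s = j*omega on the imaginary axis
    assert abs(rc_lowpass_nodal(R, C, s) - rc_lowpass_closed_form(R, C, s)) < 1e-12
```

At the corner frequency f = 1/(2*pi*R*C) the magnitude drops to 1/sqrt(2), as expected for a first-order low-pass section.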

  6. Robust Computation of Morse-Smale Complexes of Bilinear Functions

    SciTech Connect

    Norgard, G; Bremer, P T

    2010-11-30

    The Morse-Smale (MS) complex has proven to be a useful tool in extracting and visualizing features from scalar-valued data. However, existing algorithms to compute the MS complex are restricted to either piecewise linear or discrete scalar fields. This paper presents a new combinatorial algorithm to compute MS complexes for two dimensional piecewise bilinear functions defined on quadrilateral meshes. We derive a new invariant of the gradient flow within a bilinear cell and use it to develop a provably correct computation which is unaffected by numerical instabilities. This includes a combinatorial algorithm to detect and classify critical points as well as a way to determine the asymptotes of cell-based saddles and their intersection with cell edges. Finally, we introduce a simple data structure to compute and store integral lines on quadrilateral meshes which by construction prevents intersections and enables us to enforce constraints on the gradient flow to preserve known invariants.
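One ingredient of the algorithm described above can be sketched in a few lines: locating the interior critical point of a bilinear interpolant on a unit quadrilateral cell. Writing f(x,y) = a + (b-a)x + (c-a)y + k*x*y with k = a - b - c + d (a, b, c, d being the corner values at (0,0), (1,0), (0,1), (1,1)), the gradient vanishes at x* = (a-c)/k, y* = (a-b)/k, and the Hessian determinant is -k**2 < 0, so any interior critical point is a saddle. This is general bilinear-interpolation math, not the paper's actual code:

```python
# Locate the saddle of a bilinear interpolant on the unit cell, if it exists.
# Corner values: a = f(0,0), b = f(1,0), c = f(0,1), d = f(1,1).

def bilinear_saddle(a, b, c, d):
    """Return the saddle point (x, y) of the bilinear interpolant if it lies
    strictly inside the unit cell, else None."""
    k = a - b - c + d
    if k == 0:                       # interpolant is linear: no critical point
        return None
    x, y = (a - c) / k, (a - b) / k  # where both partial derivatives vanish
    if 0.0 < x < 1.0 and 0.0 < y < 1.0:
        return (x, y)
    return None
```

For corner values 0, 1, 1, 0 the saddle sits at the cell centre (0.5, 0.5); for monotone corner data such as 0, 1, 2, 3 there is no interior critical point.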

  7. Computer program for calculating and fitting thermodynamic functions

    NASA Technical Reports Server (NTRS)

    Mcbride, Bonnie J.; Gordon, Sanford

    1992-01-01

    A computer program is described which (1) calculates thermodynamic functions (heat capacity, enthalpy, entropy, and free energy) for several optional forms of the partition function, (2) fits these functions to empirical equations by means of a least-squares fit, and (3) calculates, as a function of temperature, heats of formation and equilibrium constants. The program provides several methods for calculating ideal gas properties. For monatomic gases, three methods are given which differ in the technique used for truncating the partition function. For diatomic and polyatomic molecules, five methods are given which differ in the corrections to the rigid-rotator harmonic-oscillator approximation. A method for estimating thermodynamic functions for some species is also given.
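As a minimal illustration of the kind of quantity such a program computes from a partition function, here is the textbook harmonic-oscillator contribution to the heat capacity of an ideal gas, C_v/R = x^2 * e^x / (e^x - 1)^2 with x = theta/T, where theta = h*nu/k_B is the characteristic vibrational temperature (this is the standard rigid-rotator harmonic-oscillator term, not the program's own code):

```python
import math

def vib_heat_capacity(theta, T):
    """Dimensionless vibrational heat capacity C_v/R for one harmonic mode
    with characteristic temperature theta = h*nu/k_B, at temperature T."""
    x = theta / T
    return x * x * math.exp(x) / (math.exp(x) - 1.0) ** 2
```

The limits behave as expected: the contribution approaches the equipartition value C_v/R = 1 at high temperature and freezes out to 0 at low temperature.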

  8. A computational account of the production effect: Still playing twenty questions with nature.

    PubMed

    Jamieson, Randall K; Mewhort, D J K; Hockley, William E

    2016-06-01

    People remember words that they read aloud better than words that they read silently, a result known as the production effect. The standing explanation for the production effect is that producing a word renders it distinctive in memory and, thus, memorable at test. By one key account, distinctiveness is defined in terms of sensory feedback. We formalize the sensory-feedback account using MINERVA 2, a standard model of memory. The model accommodates the basic result in recognition as well as the fact that the mixed-list production effect is larger than its pure-list counterpart, that the production effect is robust to forgetting, and that the production and generation effects have additive influences on performance. A final simulation addresses the strength-based account and suggests that it will be more difficult to distinguish a strength-based versus distinctiveness-based explanation than is typically thought. We conclude that the production effect is consistent with existing theory and discuss our analysis in relation to Alan Newell's (1973) classic criticism of psychology and his call for an analysis of psychological principles instead of laboratory phenomena. (PsycINFO Database Record) PMID:27244357
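A toy MINERVA 2-style simulation can illustrate the sensory-feedback account sketched above. All parameters here (feature counts, encoding probability, probe composition) are our own illustrative assumptions, not the authors' simulation: items read aloud are stored with extra production (sensory-feedback) features, so at test they yield a higher echo intensity than silently read items.

```python
import random

# MINERVA 2-style sketch: traces are feature vectors; a probe's similarity to
# each trace is the normalized dot product, activation is similarity cubed,
# and echo intensity is the summed activation across all traces.

random.seed(1)
N_ITEM, N_PROD = 20, 10            # item features / production features
L = 0.7                            # probability a feature is encoded

def new_item():
    return [random.choice((-1, 1)) for _ in range(N_ITEM + N_PROD)]

def encode(item, aloud):
    trace = []
    for i, f in enumerate(item):
        kept = f if random.random() < L else 0
        if i >= N_ITEM and not aloud:   # silent items lack production features
            kept = 0
        trace.append(kept)
    return trace

def intensity(probe, memory):
    total = 0.0
    for t in memory:
        nr = sum(1 for p, q in zip(probe, t) if p or q)
        s = sum(p * q for p, q in zip(probe, t)) / nr if nr else 0.0
        total += s ** 3                 # cubing sharpens strong matches
    return total

items = [new_item() for _ in range(40)]
memory = [encode(it, aloud=(k % 2 == 0)) for k, it in enumerate(items)]
aloud = [intensity(items[k], memory) for k in range(0, 40, 2)]
silent = [intensity(items[k], memory) for k in range(1, 40, 2)]
assert sum(aloud) / len(aloud) > sum(silent) / len(silent)
```

Because the aloud traces match their probes on the extra production features, their self-match similarity (and hence echo intensity) is reliably higher, mirroring the production effect in recognition.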

  9. Computing the hadronic vacuum polarization function by analytic continuation

    DOE PAGES Beta

    Feng, Xu; Hashimoto, Shoji; Hotzel, Grit; Jansen, Karl; Petschlies, Marcus; Renner, Dru B.

    2013-08-29

    We propose a method to compute the hadronic vacuum polarization function on the lattice at continuous values of photon momenta bridging between the spacelike and timelike regions. We provide two independent demonstrations to show that this method leads to the desired hadronic vacuum polarization function in Minkowski spacetime. Using the example of the leading-order QCD correction to the muon anomalous magnetic moment, we show that this approach can provide a valuable alternative method for calculations of physical quantities where the hadronic vacuum polarization function enters.

  10. A Survey of Computational Intelligence Techniques in Protein Function Prediction

    PubMed Central

    Tiwari, Arvind Kumar; Srivastava, Rajeev

    2014-01-01

    In the recent past, knowledge of proteins of unknown function has grown massively with the advancement of high-throughput microarray technologies. Protein function prediction is among the most challenging problems in bioinformatics. Traditionally, homology-based approaches were used to predict protein function, but they fail when a new protein differs substantially from previously characterized ones. Therefore, to alleviate the problems associated with traditional homology-based approaches, numerous computational intelligence techniques have been proposed in the recent past. This paper presents a state-of-the-art comprehensive review of various computational intelligence techniques for protein function prediction using sequence, structure, protein-protein interaction network, and gene expression data, across a wide range of applications such as prediction of DNA and RNA binding sites, subcellular localization, enzyme functions, signal peptides, catalytic residues, nuclear/G-protein coupled receptors, membrane proteins, and pathway analysis from gene expression datasets. This paper also summarizes the results obtained by many researchers who have addressed these problems using computational intelligence techniques with appropriate datasets to improve prediction performance. The summary shows that ensemble classifiers and integration of multiple heterogeneous data are useful for protein function prediction. PMID:25574395

  11. Computer-Supported Instructional Communication: A Multidisciplinary Account of Relevant Factors

    ERIC Educational Resources Information Center

    Rummel, Nikol; Kramer, Nicole

    2010-01-01

    The papers in the present special issue summarize research that aims at compiling and understanding variables associated with successful communication in computer-supported instructional settings. They also address the question of how adaptiveness of instructional communication may be achieved. A particular strength of the special issue…

  12. Computation of three-dimensional flows using two stream functions

    NASA Technical Reports Server (NTRS)

    Greywall, Mahesh S.

    1991-01-01

    An approach to compute 3-D flows using two stream functions is presented. The method generates a boundary-fitted grid as part of its solution. Two steps that are commonly performed separately are combined into a single step in the present approach: (1) boundary-fitted grid generation; and (2) solution of the Navier-Stokes equations on the generated grid. The method can be used to compute 3-D viscous flows directly, or its potential-flow approximation can be used to generate grids for other algorithms that compute 3-D viscous flows. The independent variables used are chi, a spatial coordinate, and xi and eta, the values of the stream functions along two sets of suitably chosen intersecting stream surfaces. The dependent variables are the streamwise velocity and two functions that describe the stream surfaces. Since for a 3-D flow there is no unique way to define two sets of intersecting stream surfaces covering the given flow, several different types of intersecting stream-surface pairs are considered. First, the metric of the (chi, xi, eta) curvilinear coordinate system associated with each type is presented. Next, equations for the steady-state transport of mass, momentum, and energy are presented in terms of the metric of the (chi, xi, eta) coordinate system. The inviscid and parabolized approximations to the general transport equations are also included.

  13. Integrated command, control, communications and computation system functional architecture

    NASA Technical Reports Server (NTRS)

    Cooley, C. G.; Gilbert, L. E.

    1981-01-01

    The functional architecture for an integrated command, control, communications, and computation system applicable to the command and control portion of the NASA End-to-End Data System is described, including the downlink data processing and analysis functions required to support the uplink processes. The functional architecture is composed of four elements: (1) the functional hierarchy, which provides the decomposition and allocation of the command and control functions to the system elements; (2) the key system features, which summarize the major system capabilities; (3) the operational activity threads, which illustrate the interrelationship between the system elements; and (4) the interfaces, which identify those elements that originate or generate data and those that use the data. The interfaces also provide a description of the data and of the data utilization and access techniques.

  14. Computer-based accountability system (Phase I) for special nuclear materials at Argonne-West

    SciTech Connect

    Ingermanson, R.S.; Proctor, A.E.

    1982-05-01

    An automated accountability system for special nuclear materials (SNM) is under development at Argonne National Laboratory-West. Phase I of the development effort has established the following basic features of the system: a unique file organization allows rapid updating or retrieval of the status of various SNM based on batch number, storage location, serial number, or other attributes; access to the program is controlled by an interactive user interface that can be easily understood by operators with no prior background in electronic data processing; and extensive use of structured programming techniques makes the software package easy to understand and to modify for specific applications. All routines are written in FORTRAN.

  15. The Functionality of Spontaneous Mimicry and Its Influences on Affiliation: An Implicit Socialization Account

    PubMed Central

    Kavanagh, Liam C.; Winkielman, Piotr

    2016-01-01

    There is a broad theoretical and empirical interest in spontaneous mimicry, or the automatic reproduction of a model’s behavior. Evidence shows that people mimic models they like, and that mimicry enhances liking for the mimic. Yet, there is no satisfactory account of this phenomenon, especially in terms of its functional significance. While affiliation is often cited as the driver of mimicry, we argue that mimicry is primarily driven by a learning process that helps to produce the appropriate bodily and emotional responses to relevant social situations. Because the learning process and the resulting knowledge is implicit, it cannot easily be rejected, criticized, revised, and employed by the learner in a deliberative or deceptive manner. We argue that these characteristics will lead individuals to preferentially mimic ingroup members, whose implicit information is worth incorporating. Conversely, mimicry of the wrong person is costly because individuals will internalize “bad habits,” including emotional reactions and mannerisms indicating wrong group membership. This pattern of mimicry, in turn, means that observed mimicry is an honest signal of group affiliation. We propose that the preferences of models for the mimic stem from this true signal value. Further, just like facial expressions, mimicry communicates a genuine disposition when it is truly spontaneous. Consequently, perceivers are attuned to relevant cues such as appropriate timing, fidelity, and selectivity. Our account, while assuming no previously unknown biological endowments, also explains greater mimicry of powerful people, and why affiliation can be signaled by mimicry of seemingly inconsequential behaviors. PMID:27064398

  16. The Functionality of Spontaneous Mimicry and Its Influences on Affiliation: An Implicit Socialization Account.

    PubMed

    Kavanagh, Liam C; Winkielman, Piotr

    2016-01-01

    There is a broad theoretical and empirical interest in spontaneous mimicry, or the automatic reproduction of a model's behavior. Evidence shows that people mimic models they like, and that mimicry enhances liking for the mimic. Yet, there is no satisfactory account of this phenomenon, especially in terms of its functional significance. While affiliation is often cited as the driver of mimicry, we argue that mimicry is primarily driven by a learning process that helps to produce the appropriate bodily and emotional responses to relevant social situations. Because the learning process and the resulting knowledge is implicit, it cannot easily be rejected, criticized, revised, and employed by the learner in a deliberative or deceptive manner. We argue that these characteristics will lead individuals to preferentially mimic ingroup members, whose implicit information is worth incorporating. Conversely, mimicry of the wrong person is costly because individuals will internalize "bad habits," including emotional reactions and mannerisms indicating wrong group membership. This pattern of mimicry, in turn, means that observed mimicry is an honest signal of group affiliation. We propose that the preferences of models for the mimic stem from this true signal value. Further, just like facial expressions, mimicry communicates a genuine disposition when it is truly spontaneous. Consequently, perceivers are attuned to relevant cues such as appropriate timing, fidelity, and selectivity. Our account, while assuming no previously unknown biological endowments, also explains greater mimicry of powerful people, and why affiliation can be signaled by mimicry of seemingly inconsequential behaviors. PMID:27064398

  17. The Melanopic Sensitivity Function Accounts for Melanopsin-Driven Responses in Mice under Diverse Lighting Conditions

    PubMed Central

    Brown, Timothy M.; Allen, Annette E.; al-Enezi, Jazi; Wynne, Jonathan; Schlangen, Luc; Hommes, Vanja; Lucas, Robert J.

    2013-01-01

    In addition to rods and cones, photoreception in mammals extends to a third retinal cell type expressing the photopigment melanopsin. The influences of this novel opsin are widespread, ranging from pupillary and circadian responses to brightness perception, yet established approaches to quantifying the biological effects of light do not adequately account for melanopsin sensitivity. We have recently proposed a novel metric, the melanopic sensitivity function (VZλ), to address this deficiency. Here, we further validate this new measure with a variety of tests based on potential barriers to its applicability identified in the literature or relating to obvious practical benefits. Using electrophysiological approaches and pupillometry, initially in rodless+coneless mice, our data demonstrate that under a very wide range of different conditions (including switching between stimuli with highly divergent spectral content) the VZλ function provides an accurate prediction of the sensitivity of melanopsin-dependent responses. We further show that VZλ provides the best available description of the spectral sensitivity of at least one aspect of the visual response in mice with functional rods and cones: tonic firing activity in the lateral geniculate nuclei. Together, these data establish VZλ as an important new approach for light measurement with widespread practical utility. PMID:23301090

  18. Optimization of removal function in computer controlled optical surfacing

    NASA Astrophysics Data System (ADS)

    Chen, Xi; Guo, Peiji; Ren, Jianfeng

    2010-10-01

    The technical principle of computer controlled optical surfacing (CCOS) and the common methods of optimizing the removal function used in CCOS are introduced in this paper. A new optimization method, time-sharing synthesis of removal functions, is proposed to solve two problems encountered in the planet-motion and translation-rotation modes: the removal function departing far from the Gaussian type, and slow convergence of the removal-function error. Time-sharing synthesis using six removal functions is discussed in detail. For a given region on the workpiece, six positions are selected as the centers of the removal function; the polishing tool, controlled by the CCOS executive system, revolves around each centre in turn to complete a cycle. The overall removal function obtained by the time-sharing process is the ratio of the total material removal in six cycles to the duration of the six cycles, and it depends on the arrangement and distribution of the six removal functions. Simulations of the synthesized overall removal functions under the two modes of motion, i.e., planet motion and translation-rotation, are performed, from which the optimized combination of tool parameters and the distribution of the time-sharing synthesis removal functions are obtained. The evaluation function used in the optimization is an approaching factor, defined as the ratio of the material removal within half of the polishing-tool coverage from the polishing center to the total material removal within the full polishing-tool coverage area. After optimization, the removal function obtained by time-sharing synthesis is found to be closer to the ideal Gaussian-type removal function than those obtained by traditional methods. The time-sharing synthesis method provides an efficient way to increase the convergence speed of the surface error in CCOS for the fabrication of aspheric optical surfaces, and to reduce intermediate- and high-spatial-frequency errors.
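The time-sharing idea above can be sketched numerically: the overall removal function is the total material removed over six cycles divided by the total dwell time, and the approaching factor compares removal near the polishing centre to removal over the whole tool footprint. The 1-D Gaussian footprints, offsets, and dwell times below are our own illustrative assumptions, not the paper's tool parameters:

```python
import math

# Time-averaged removal rate from six time-shared removal functions, each
# modeled as a 1-D Gaussian footprint centred at a different offset.

def overall_removal(r, centers, sigma=1.0, dwell=1.0):
    """Total removal over all cycles at radius r, divided by total time."""
    total = sum(dwell * math.exp(-((r - c) ** 2) / (2.0 * sigma ** 2))
                for c in centers)
    return total / (dwell * len(centers))

centers = [-0.5, -0.3, -0.1, 0.1, 0.3, 0.5]   # six hypothetical centres

# Approaching factor: removal within half the tool radius / total removal,
# evaluated here by a simple trapezoid-free Riemann sum over the footprint.
R_tool, n = 2.0, 400
rs = [-R_tool + 2.0 * R_tool * i / n for i in range(n + 1)]
total = sum(overall_removal(r, centers) for r in rs)
inner = sum(overall_removal(r, centers) for r in rs if abs(r) <= R_tool / 2)
approach = inner / total
assert 0.5 < approach < 1.0   # synthesis concentrates removal near the centre
```

A larger approaching factor means the synthesized removal function is more sharply peaked at the polishing centre, i.e., closer to the desired Gaussian-like shape.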

  19. Why do interracial interactions impair executive function? A resource depletion account.

    PubMed

    Richeson, Jennifer A; Trawalter, Sophie

    2005-06-01

    Three studies investigated the veracity of a resource depletion account of the impairment of inhibitory task performance after interracial contact. White individuals engaged in either an interracial or same-race interaction, then completed an ostensibly unrelated Stroop color-naming test. In each study, the self-regulatory demands of the interaction were either increased (Study 1) or decreased (Studies 2 and 3). Results revealed that increasing the self-regulatory demands of an interracial interaction led to greater Stroop interference compared with control, whereas reducing self-regulatory demands led to less Stroop interference. Manipulating self-regulatory demands did not affect Stroop performance after same-race interactions. Taken together, the present studies point to resource depletion as the likely mechanism underlying the impairment of cognitive functioning after interracial dyadic interactions. PMID:15982114

  20. 17 CFR 1.32 - Segregated account; daily computation and record.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...., “securities haircuts”) as set forth in Rule 15c3-1(c)(2)(vi) of the Securities and Exchange Commission (17 CFR... (17 CFR 240.15c3-1(c)(11)(i)). (c) The daily computations required by this section must be completed... business day, on a currency-by-currency basis: (1) The total amount of customer funds on deposit...

  1. A three-dimensional model of mammalian tyrosinase active site accounting for loss of function mutations.

    PubMed

    Schweikardt, Thorsten; Olivares, Concepción; Solano, Francisco; Jaenicke, Elmar; García-Borrón, José Carlos; Decker, Heinz

    2007-10-01

    Tyrosinases are the first and rate-limiting enzymes in the synthesis of melanin pigments responsible for colouring hair, skin and eyes. Mutation of tyrosinases often decreases melanin production resulting in albinism, but the effects are not always understood at the molecular level. Homology modelling of mouse tyrosinase based on recently published crystal structures of non-mammalian tyrosinases provides an active site model accounting for loss-of-function mutations. According to the model, the copper-binding histidines are located in a helix bundle comprising four densely packed helices. A loop containing residues M374, S375 and V377 connects the CuA and CuB centres, with the peptide oxygens of M374 and V377 serving as hydrogen acceptors for the NH-groups of the imidazole rings of the copper-binding His367 and His180. Therefore, this loop is essential for the stability of the active site architecture. A double substitution (374)MS(375) → (374)GG(375) or a single M374G mutation leads to a local perturbation of the protein matrix at the active site affecting the orientation of the H367 side chain, which may be unable to bind CuB reliably, resulting in loss of activity. The model also accounts for loss of function in two naturally occurring albino mutations, S380P and V393F. The hydroxyl group in S380 contributes to the correct orientation of M374, and the substitution of V393 for a bulkier phenylalanine sterically impedes correct side chain packing at the active site. Therefore, our model explains the mechanistic necessity for conservation of not only active site histidines but also adjacent amino acids in tyrosinase. PMID:17850513

  2. Time-Dependent Density Functional Theory for Universal Quantum Computation

    NASA Astrophysics Data System (ADS)

    Tempel, David

    2015-03-01

    In this talk, I will discuss how the theorems of TDDFT can be applied to a class of qubit Hamiltonians that are universal for quantum computation. The theorems of TDDFT applied to universal Hamiltonians imply that single-qubit expectation values can be used as the basic variables in quantum computation and information theory, rather than wavefunctions. From a practical standpoint this opens the possibility of approximating observables of interest in quantum computations directly in terms of single-qubit quantities (i.e. as density functionals). Additionally, I will discuss how TDDFT provides an exact prescription for simulating universal Hamiltonians with other universal Hamiltonians that have different, and possibly easier-to-realize two-qubit interactions.

  3. Computational predictions of energy materials using density functional theory

    NASA Astrophysics Data System (ADS)

    Jain, Anubhav; Shin, Yongwoo; Persson, Kristin A.

    2016-01-01

    In the search for new functional materials, quantum mechanics is an exciting starting point. The fundamental laws that govern the behaviour of electrons have the possibility, at the other end of the scale, to predict the performance of a material for a targeted application. In some cases, this is achievable using density functional theory (DFT). In this Review, we highlight DFT studies predicting energy-related materials that were subsequently confirmed experimentally. The attributes and limitations of DFT for the computational design of materials for lithium-ion batteries, hydrogen production and storage materials, superconductors, photovoltaics and thermoelectric materials are discussed. In the future, we expect that the accuracy of DFT-based methods will continue to improve and that growth in computing power will enable millions of materials to be virtually screened for specific applications. Thus, these examples represent a first glimpse of what may become a routine and integral step in materials discovery.

  4. Optimized Kaiser-Bessel Window Functions for Computed Tomography.

    PubMed

    Nilchian, Masih; Ward, John Paul; Vonesch, Cedric; Unser, Michael

    2015-11-01

Kaiser-Bessel window functions are frequently used to discretize tomographic problems because they have two desirable properties: 1) their short support leads to a low computational cost and 2) their rotational symmetry makes their imaging transform independent of the direction. In this paper, we aim at optimizing the parameters of these basis functions. We present a formalism based on the theory of approximation and point out the importance of the partition-of-unity condition. While we prove that, for compact-support functions, this condition is incompatible with isotropy, we show that minimizing the deviation from the partition-of-unity condition is highly beneficial. The numerical results confirm that the proposed tuning of the Kaiser-Bessel window functions yields the best performance. PMID:26151939
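A minimal numerical sketch of the order-0 Kaiser-Bessel window and of the deviation-from-partition-of-unity criterion discussed above. The support radius and shape parameter below are illustrative placeholders, not the optimized values from the paper.

```python
import numpy as np

def kaiser_bessel(r, a=2.0, alpha=10.4):
    """Order-0 Kaiser-Bessel window with support radius a and shape
    parameter alpha (values are illustrative, not the paper's optima)."""
    r = np.asarray(r, dtype=float)
    inside = np.abs(r) <= a
    z = np.zeros_like(r)
    z[inside] = np.i0(alpha * np.sqrt(1 - (r[inside] / a) ** 2)) / np.i0(alpha)
    return z

def partition_of_unity_deviation(a=2.0, alpha=10.4, h=1.0):
    """Relative variation of sum_k b(x - k*h) over one period; zero would
    mean the shifted windows exactly reproduce constant functions."""
    x = np.linspace(0.0, h, 201)
    shifts = np.arange(-np.ceil(a / h) - 1, np.ceil(a / h) + 2) * h
    s = sum(kaiser_bessel(x - k, a, alpha) for k in shifts)
    return (s.max() - s.min()) / s.mean()

print(partition_of_unity_deviation())
```

Tuning alpha to minimize this deviation, rather than fixing it by rule of thumb, is the kind of optimization the paper formalizes.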

  5. Computer Code For Calculation Of The Mutual Coherence Function

    NASA Astrophysics Data System (ADS)

    Bugnolo, Dimitri S.

    1986-05-01

    We present a computer code in FORTRAN 77 for the calculation of the mutual coherence function (MCF) of a plane wave normally incident on a stochastic half-space. This is an exact result. The user need only input the path length, the wavelength, the outer scale size, and the structure constant. This program may be used to calculate the MCF of a well-collimated laser beam in the atmosphere.
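The published program itself is in FORTRAN 77 and accounts for the outer scale. As a hedged illustration only, the sketch below implements the standard inertial-range approximation for a plane wave in Kolmogorov turbulence (wave structure function D = 2.914 k^2 Cn^2 L rho^(5/3), MCF modulus exp(-D/2)); it ignores the outer-scale dependence that the program includes.

```python
import math

def plane_wave_mcf(rho, wavelength, cn2, path_length):
    """Modulus of the mutual coherence function of a plane wave after
    propagating path_length through Kolmogorov turbulence with structure
    constant cn2, for transverse separation rho. Outer scale is ignored,
    so this is the textbook inertial-range approximation, not the
    paper's exact outer-scale-dependent result."""
    k = 2 * math.pi / wavelength  # optical wavenumber
    d_wave = 2.914 * k**2 * cn2 * path_length * rho ** (5.0 / 3.0)
    return math.exp(-0.5 * d_wave)

# e.g. a 1.55 um beam over 1 km of moderate turbulence
print(plane_wave_mcf(0.01, 1.55e-6, 1e-14, 1000.0))
```

The separation at which the modulus drops to 1/e gives the transverse coherence radius of the beam.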

  6. Computations involving differential operators and their actions on functions

    NASA Technical Reports Server (NTRS)

    Crouch, Peter E.; Grossman, Robert; Larson, Richard

    1991-01-01

The algorithms derived by Grossman and Larson (1989) are further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear dynamical systems. These algorithms are extended in two different directions: the algorithms are generalized so that they apply to differential operators on groups, and the data structures and algorithms are developed to compute symbolically the action of differential operators on functions. Both of these generalizations are needed for applications.
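The noncommutative rewriting such algorithms formalize can be illustrated in one dimension. The toy code below represents polynomials as coefficient lists and differential operators as compositions of two primitives, D = d/dx and X = multiplication by x; it shows that DXD and XDD differ by D, reflecting the commutation rule [D, X] = 1. This is a minimal sketch, not the tree-based algorithms of the paper.

```python
def deriv(p):
    """d/dx of a polynomial p given as a coefficient list, p[i] = coeff of x**i."""
    return [i * p[i] for i in range(1, len(p))] or [0]

def mul_x(p):
    """Multiply a polynomial by x."""
    return [0] + list(p)

def apply_ops(ops, p):
    """Apply a composition of operators to p, rightmost operator first."""
    for op in reversed(ops):
        p = op(p)
    return p

f = [0, 0, 0, 1]                             # f(x) = x**3
lhs = apply_ops([deriv, mul_x, deriv], f)    # D X D f = 9 x**2
rhs = apply_ops([mul_x, deriv, deriv], f)    # X D D f = 6 x**2
# lhs - rhs = 3 x**2 = D f, as predicted by D X = X D + 1
```

Rewriting engines of the kind described above normalize such compositions systematically instead of evaluating them case by case.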

  7. Functional imaging of the brain using computed tomography.

    PubMed

    Berninger, W H; Axel, L; Norman, D; Napel, S; Redington, R W

    1981-03-01

    Data from rapid-sequence CT scans of the same cross section, obtained following bolus injection of contrast material, were analyzed by functional imaging. The information contained in a large number of images can be compressed into one or two gray-scale images which can be evaluated both qualitatively and quantitatively. The computational techniques are described and applied to the generation of images depicting bolus transit time, arrival time, peak time, and effective width. PMID:7465851
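The compression of a scan sequence into functional images can be sketched in a few lines of array code. The fraction-of-peak arrival criterion below is an invented placeholder, not the paper's algorithm.

```python
import numpy as np

def functional_images(series, times, frac=0.1):
    """Collapse a (T, H, W) contrast-enhancement time series into
    per-pixel functional images: peak time, and arrival time defined
    here as the first sample exceeding frac of the pixel's peak.
    A simplified sketch of the idea, not the published method."""
    peak_idx = series.argmax(axis=0)
    peak_time = times[peak_idx]
    thresh = frac * series.max(axis=0)
    above = series >= thresh[None, :, :]
    arrival_idx = above.argmax(axis=0)   # index of first True along time
    arrival_time = times[arrival_idx]
    return peak_time, arrival_time
```

Each returned array is a single gray-scale image summarizing the whole sequence, which is exactly the data reduction the abstract describes.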

  8. Computational aspects of the continuum quaternionic wave functions for hydrogen

    SciTech Connect

    Morais, J.

    2014-10-15

Over the past few years considerable attention has been given to the role played by the Hydrogen Continuum Wave Functions (HCWFs) in quantum theory. The HCWFs arise via the method of separation of variables for the time-independent Schrödinger equation in spherical coordinates. The HCWFs are composed of products of a radial part involving associated Laguerre polynomials multiplied by exponential factors and an angular part that is the spherical harmonics. In the present paper we introduce the continuum wave functions for hydrogen within quaternionic analysis ((R)QHCWFs), a result which is not available in the existing literature. In particular, the underlying functions are of three real variables and take values in either the reduced or the full quaternions (identified, respectively, with R{sup 3} and R{sup 4}). We prove that the (R)QHCWFs are orthonormal to one another. The representation of these functions in terms of the HCWFs is explicitly given, from which several recurrence formulae for fast computer implementations can be derived. A summary of fundamental properties and further computation of the hydrogen-like atom transforms of the (R)QHCWFs are also discussed. We address all the above and explore some basic facts of the arising quaternionic function theory. As an application, we provide the reader with plot simulations that demonstrate the effectiveness of our approach. (R)QHCWFs are new in the literature and have some consequences that are now under investigation.

  9. INTEGRATING COMPUTATIONAL PROTEIN FUNCTION PREDICTION INTO DRUG DISCOVERY INITIATIVES

    PubMed Central

    Grant, Marianne A.

    2014-01-01

Pharmaceutical researchers must evaluate vast numbers of protein sequences and formulate innovative strategies for identifying valid targets and discovering leads against them as a way of accelerating drug discovery. The ever-increasing number and diversity of novel protein sequences identified by genomic sequencing projects and the success of worldwide structural genomics initiatives have spurred great interest and impetus in the development of methods for accurate, computationally empowered protein function prediction and active site identification. Previously, in the absence of direct experimental evidence, homology-based protein function annotation remained the gold standard for in silico analysis and prediction of protein function. However, with the continued exponential expansion of sequence databases, this approach is not always applicable, as fewer query protein sequences demonstrate significant homology to protein gene products of known function. As a result, several non-homology-based methods for protein function prediction have emerged that are based on sequence features, structure, evolution, and biochemical and genetic knowledge. Herein, we review current bioinformatic programs and approaches for protein function prediction/annotation and discuss their integration into drug discovery initiatives. The development of such methods to annotate protein functional sites and their application to large protein functional families is crucial to successfully utilizing the vast amounts of genomic sequence information available to drug discovery and development processes. PMID:25530654

  10. Preprocessing functions for computed radiography images in a PACS environment

    NASA Astrophysics Data System (ADS)

    McNitt-Gray, Michael F.; Pietka, Ewa; Huang, H. K.

    1992-05-01

In a picture archiving and communications system (PACS), images are acquired from several modalities including computed radiography (CR). This modality has unique image characteristics and presents several problems that need to be resolved before the image is available for viewing at a display workstation. A set of preprocessing functions has been applied to all CR images in a PACS environment to enhance the display of images. The first function reformats CR images that are acquired with different plate sizes to a standard size for display. Another function removes the distracting white background caused by the collimation used at the time of exposure. A third function determines the orientation of each image and rotates those images that are in nonstandard positions into a standard viewing position. Another function creates a default look-up table based on the gray levels actually used by the image (instead of allocated gray levels). Finally, there is a function which creates (for chest images only) the piece-wise linear look-up tables that can be applied to enhance different tissue densities. These functions have all been implemented in a PACS environment. Each of these functions has been very successful in improving the viewing conditions of CR images, and together they contribute to the clinical acceptance of PACS by reducing the effort required to display CR images.
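The default look-up table idea, stretching the display range over the gray levels the image actually uses rather than the full allocated range, can be sketched as follows. The 12-bit allocated depth and 8-bit output are illustrative assumptions.

```python
import numpy as np

def default_lut(image, allocated=4096, out_levels=256):
    """Build a default display LUT stretched over the gray levels the
    image actually uses, instead of the full allocated range.
    Illustrative sketch of the idea, not the published implementation."""
    used = np.flatnonzero(np.bincount(image.ravel(), minlength=allocated))
    lo, hi = int(used[0]), int(used[-1])
    ramp = (np.arange(allocated) - lo) / max(hi - lo, 1)
    return np.clip(np.round(ramp * (out_levels - 1)),
                   0, out_levels - 1).astype(np.uint8)

# apply with: display = default_lut(image)[image]
```

Because unused low and high bins are excluded, the displayed contrast is concentrated on the gray levels that actually occur in the exposure.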

  11. Exploring the cognitive and motor functions of the basal ganglia: an integrative review of computational cognitive neuroscience models

    PubMed Central

    Helie, Sebastien; Chakravarthy, Srinivasa; Moustafa, Ahmed A.

    2013-01-01

    Many computational models of the basal ganglia (BG) have been proposed over the past twenty-five years. While computational neuroscience models have focused on closely matching the neurobiology of the BG, computational cognitive neuroscience (CCN) models have focused on how the BG can be used to implement cognitive and motor functions. This review article focuses on CCN models of the BG and how they use the neuroanatomy of the BG to account for cognitive and motor functions such as categorization, instrumental conditioning, probabilistic learning, working memory, sequence learning, automaticity, reaching, handwriting, and eye saccades. A total of 19 BG models accounting for one or more of these functions are reviewed and compared. The review concludes with a discussion of the limitations of existing CCN models of the BG and prescriptions for future modeling, including the need for computational models of the BG that can simultaneously account for cognitive and motor functions, and the need for a more complete specification of the role of the BG in behavioral functions. PMID:24367325

  12. On the Hydrodynamic Function of Sharkskin: A Computational Investigation

    NASA Astrophysics Data System (ADS)

    Boomsma, Aaron; Sotiropoulos, Fotis

    2014-11-01

Denticles (placoid scales) are small structures that cover the epidermis of some sharks. The hydrodynamic function of denticles is unclear. Because they resemble riblets, they have been thought to passively reduce skin friction, for which there is some experimental evidence. Others have experimentally shown that denticles increase skin friction and have hypothesized that denticles act as vortex generators to delay separation. To help clarify their function, we use high-resolution large eddy and direct numerical simulations, with an immersed boundary method, to simulate flow patterns past, and calculate the drag force on, Shortfin Mako denticles. Simulations are carried out for the denticles placed in a canonical turbulent boundary layer as well as in the vicinity of a separation bubble. The computed results elucidate the three-dimensional structure of the flow around denticles and provide insights into the hydrodynamic function of sharkskin.

  13. A Riemannian framework for orientation distribution function computing.

    PubMed

    Cheng, Jian; Ghosh, Aurobrata; Jiang, Tianzi; Deriche, Rachid

    2009-01-01

Compared with Diffusion Tensor Imaging (DTI), High Angular Resolution Diffusion Imaging (HARDI) can better explore the complex microstructure of white matter. The Orientation Distribution Function (ODF) is used to describe the probability of the fiber direction. The Fisher information metric has been constructed for probability density families in Information Geometry theory and has been successfully applied to tensor computing in DTI. In this paper, we present a state-of-the-art Riemannian framework for ODF computing based on Information Geometry and sparse representation of orthonormal bases. In this Riemannian framework, the exponential map, logarithmic map and geodesic have closed forms, and the weighted Frechet mean exists uniquely on this manifold. We also propose a novel scalar measurement, named Geometric Anisotropy (GA), which is the Riemannian geodesic distance between the ODF and the isotropic ODF. The Renyi entropy H1/2 of the ODF can be computed from the GA. Moreover, we present an Affine-Euclidean framework and a Log-Euclidean framework so that we can work in a Euclidean space. As an application, Lagrange interpolation on the ODF field is proposed based on the weighted Frechet mean. We validate our methods on synthetic and real data experiments. Compared with existing Riemannian frameworks on ODF, our framework is model-free. The estimation of the parameters, i.e. Riemannian coordinates, is robust and linear. Moreover, it should be noted that our theoretical results can be used for any probability density function (PDF) under an orthonormal basis representation. PMID:20426075
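A discrete caricature of the GA measure: under the square-root map, probability vectors live on the unit sphere, where the Fisher-Rao geodesic distance is the arc length arccos of the inner product (conventions differ by a constant factor; the equal-weight spherical sampling here is an assumption for illustration, not the paper's basis representation).

```python
import numpy as np

def fisher_rao_distance(p, q):
    """Geodesic (Fisher-Rao) distance between two discrete probability
    vectors, computed via the square-root map onto the unit sphere."""
    p = p / p.sum()
    q = q / q.sum()
    cos = np.clip(np.sqrt(p * q).sum(), -1.0, 1.0)
    return np.arccos(cos)

def geometric_anisotropy(odf):
    """Sketch of GA: distance from the ODF to the isotropic ODF,
    using equal-weight samples on the sphere."""
    iso = np.full_like(odf, 1.0 / len(odf))
    return fisher_rao_distance(odf, iso)
```

An isotropic ODF gives GA = 0, while a sharply peaked ODF gives a large GA, matching the intended anisotropy interpretation.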

  14. Computational models of basal-ganglia pathway functions: focus on functional neuroanatomy

    PubMed Central

    Schroll, Henning; Hamker, Fred H.

    2013-01-01

    Over the past 15 years, computational models have had a considerable impact on basal-ganglia research. Most of these models implement multiple distinct basal-ganglia pathways and assume them to fulfill different functions. As there is now a multitude of different models, it has become complex to keep track of their various, sometimes just marginally different assumptions on pathway functions. Moreover, it has become a challenge to oversee to what extent individual assumptions are corroborated or challenged by empirical data. Focusing on computational, but also considering non-computational models, we review influential concepts of pathway functions and show to what extent they are compatible with or contradict each other. Moreover, we outline how empirical evidence favors or challenges specific model assumptions and propose experiments that allow testing assumptions against each other. PMID:24416002

  15. Analog computation of auto and cross-correlation functions

    NASA Technical Reports Server (NTRS)

    1974-01-01

    For analysis of the data obtained from the cross beam systems it was deemed desirable to compute the auto- and cross-correlation functions by both digital and analog methods to provide a cross-check of the analysis methods and an indication as to which of the two methods would be most suitable for routine use in the analysis of such data. It is the purpose of this appendix to provide a concise description of the equipment and procedures used for the electronic analog analysis of the cross beam data. A block diagram showing the signal processing and computation set-up used for most of the analog data analysis is provided. The data obtained at the field test sites were recorded on magnetic tape using wide-band FM recording techniques. The data as recorded were band-pass filtered by electronic signal processing in the data acquisition systems.
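The digital counterpart of the analog computation, the cross-check mentioned above, amounts to a lagged sum of mean-removed products. The sketch below is a generic biased estimator, not the original analysis code.

```python
import numpy as np

def cross_correlation(x, y, max_lag):
    """Biased estimate of the cross-correlation function R_xy(tau) for
    integer lags -max_lag..max_lag; with y = x it gives the
    autocorrelation function."""
    x = x - x.mean()
    y = y - y.mean()
    n = len(x)
    lags = np.arange(-max_lag, max_lag + 1)
    r = np.array([np.sum(x[max(0, -k):n - max(0, k)] *
                         y[max(0, k):n - max(0, -k)]) / n for k in lags])
    return lags, r
```

For cross-beam data, the location of the peak of R_xy gives the transit delay between the two beams.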

  16. A general model for likelihood computations of genetic marker data accounting for linkage, linkage disequilibrium, and mutations.

    PubMed

    Kling, Daniel; Tillmar, Andreas; Egeland, Thore; Mostad, Petter

    2015-09-01

Several applications necessitate an unbiased determination of relatedness, be it in linkage or association studies or in a forensic setting. An appropriate model to compute the joint probability of some genetic data for a set of persons given some hypothesis about the pedigree structure is then required. The increasing number of markers available through high-density SNP microarray typing and NGS technologies intensifies the demand, where using a large number of markers may lead to biased results due to strong dependencies between closely located loci, both within pedigrees (linkage) and in the population (allelic association or linkage disequilibrium (LD)). We present a new general model, based on a Markov chain for inheritance patterns and another Markov chain for founder allele patterns, the latter allowing us to account for LD. We also demonstrate a specific implementation for X chromosomal markers that allows for computation of likelihoods based on hypotheses of alleged relationships and genetic marker data. The algorithm can simultaneously account for linkage, LD, and mutations. We demonstrate its feasibility using simulated examples. The algorithm is implemented in the software FamLinkX, providing a user-friendly GUI for Windows systems (FamLinkX, as well as further usage instructions, is freely available at www.famlink.se). Our software provides the necessary means to solve cases where no previous implementation exists. In addition, the software has the possibility to perform simulations in order to further study the impact of linkage and LD on computed likelihoods for an arbitrary set of markers. PMID:25425094

  17. Simulation of Preterm Neonatal Brain Metabolism During Functional Neuronal Activation Using a Computational Model.

    PubMed

    Hapuarachchi, T; Scholkmann, F; Caldwell, M; Hagmann, C; Kleiser, S; Metz, A J; Pastewski, M; Wolf, M; Tachtsidis, I

    2016-01-01

    We present a computational model of metabolism in the preterm neonatal brain. The model has the capacity to mimic haemodynamic and metabolic changes during functional activation and simulate functional near-infrared spectroscopy (fNIRS) data. As an initial test of the model's efficacy, we simulate data obtained from published studies investigating functional activity in preterm neonates. In addition we simulated recently collected data from preterm neonates during visual activation. The model is well able to predict the haemodynamic and metabolic changes from these observations. In particular, we found that changes in cerebral blood flow and blood pressure may account for the observed variability of the magnitude and sign of stimulus-evoked haemodynamic changes reported in preterm infants. PMID:26782202

  18. Computer Modeling of the Earliest Cellular Structures and Functions

    NASA Astrophysics Data System (ADS)

    Pohorille, Andrew

    2000-03-01

In the absence of extinct or extant record of protocells (the earliest ancestors of contemporary cells), the most direct way to test our understanding of the origin of cellular life is to construct laboratory models of protocells. Such efforts are currently underway in the NASA Astrobiology Program. They are accompanied by computational studies aimed at explaining self-organization of simple molecules into ordered structures and developing designs for molecules that perform protocellular functions. Many of these functions, such as import of nutrients, capture and storage of energy, and response to changes in the environment are carried out by proteins bound to membranes. We will discuss a series of large-scale, molecular-level computer simulations which demonstrate (a) how small proteins (peptides) organize themselves into ordered structures at water-membrane interfaces and insert into membranes, (b) how these peptides aggregate to form membrane-spanning structures (e.g. channels), and (c) by what mechanisms such aggregates perform essential protocellular functions, such as the transport of protons across cell walls, a key step in cellular bioenergetics. The simulations were performed using the molecular dynamics method, in which Newton's equations of motion for each atom in the system are solved iteratively. The problems of interest required simulations on multi-nanosecond time scales, which corresponded to 10^6-10^8 time steps.
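The iterative solution of Newton's equations in molecular dynamics is typically done with the velocity Verlet scheme; the generic sketch below is a minimal one-particle illustration, not the simulation code of the study.

```python
import numpy as np

def velocity_verlet(pos, vel, force_fn, mass, dt, n_steps):
    """Integrate Newton's equations with the velocity Verlet scheme:
    positions are advanced with the current force, then velocities with
    the average of the old and new forces."""
    f = force_fn(pos)
    for _ in range(n_steps):
        pos = pos + vel * dt + 0.5 * (f / mass) * dt**2
        f_new = force_fn(pos)
        vel = vel + 0.5 * (f + f_new) / mass * dt
        f = f_new
    return pos, vel

# sanity check: harmonic oscillator with k = m = 1 gives x(t) = cos(t)
pos, vel = velocity_verlet(np.array([1.0]), np.array([0.0]),
                           lambda x: -x, 1.0, 1e-3, 1000)
```

A production run repeats this update millions of times over all atoms, which is what the 10^6-10^8 time steps above refer to.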

  19. Computer Modeling of the Earliest Cellular Structures and Functions

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew; Chipot, Christophe; Schweighofer, Karl

    2000-01-01

In the absence of extinct or extant record of protocells (the earliest ancestors of contemporary cells), the most direct way to test our understanding of the origin of cellular life is to construct laboratory models of protocells. Such efforts are currently underway in the NASA Astrobiology Program. They are accompanied by computational studies aimed at explaining self-organization of simple molecules into ordered structures and developing designs for molecules that perform protocellular functions. Many of these functions, such as import of nutrients, capture and storage of energy, and response to changes in the environment are carried out by proteins bound to membranes. We will discuss a series of large-scale, molecular-level computer simulations which demonstrate (a) how small proteins (peptides) organize themselves into ordered structures at water-membrane interfaces and insert into membranes, (b) how these peptides aggregate to form membrane-spanning structures (e.g. channels), and (c) by what mechanisms such aggregates perform essential protocellular functions, such as the transport of protons across cell walls, a key step in cellular bioenergetics. The simulations were performed using the molecular dynamics method, in which Newton's equations of motion for each atom in the system are solved iteratively. The problems of interest required simulations on multi-nanosecond time scales, which corresponded to 10^6-10^8 time steps.

  20. Complete RNA inverse folding: computational design of functional hammerhead ribozymes

    PubMed Central

    Dotu, Ivan; Garcia-Martin, Juan Antonio; Slinger, Betty L.; Mechery, Vinodh; Meyer, Michelle M.; Clote, Peter

    2014-01-01

    Nanotechnology and synthetic biology currently constitute one of the most innovative, interdisciplinary fields of research, poised to radically transform society in the 21st century. This paper concerns the synthetic design of ribonucleic acid molecules, using our recent algorithm, RNAiFold, which can determine all RNA sequences whose minimum free energy secondary structure is a user-specified target structure. Using RNAiFold, we design ten cis-cleaving hammerhead ribozymes, all of which are shown to be functional by a cleavage assay. We additionally use RNAiFold to design a functional cis-cleaving hammerhead as a modular unit of a synthetic larger RNA. Analysis of kinetics on this small set of hammerheads suggests that cleavage rate of computationally designed ribozymes may be correlated with positional entropy, ensemble defect, structural flexibility/rigidity and related measures. Artificial ribozymes have been designed in the past either manually or by SELEX (Systematic Evolution of Ligands by Exponential Enrichment); however, this appears to be the first purely computational design and experimental validation of novel functional ribozymes. RNAiFold is available at http://bioinformatics.bc.edu/clotelab/RNAiFold/. PMID:25209235

  1. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow...

  2. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow...

  3. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow...

  4. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow...

  5. 21 CFR 870.1435 - Single-function, preprogrammed diagnostic computer.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Single-function, preprogrammed diagnostic computer... Single-function, preprogrammed diagnostic computer. (a) Identification. A single-function, preprogrammed diagnostic computer is a hard-wired computer that calculates a specific physiological or blood-flow...

  6. Confidence and psychosis: a neuro-computational account of contingency learning disruption by NMDA blockade.

    PubMed

    Vinckier, F; Gaillard, R; Palminteri, S; Rigoux, L; Salvador, A; Fornito, A; Adapa, R; Krebs, M O; Pessiglione, M; Fletcher, P C

    2016-07-01

    A state of pathological uncertainty about environmental regularities might represent a key step in the pathway to psychotic illness. Early psychosis can be investigated in healthy volunteers under ketamine, an NMDA receptor antagonist. Here, we explored the effects of ketamine on contingency learning using a placebo-controlled, double-blind, crossover design. During functional magnetic resonance imaging, participants performed an instrumental learning task, in which cue-outcome contingencies were probabilistic and reversed between blocks. Bayesian model comparison indicated that in such an unstable environment, reinforcement learning parameters are downregulated depending on confidence level, an adaptive mechanism that was specifically disrupted by ketamine administration. Drug effects were underpinned by altered neural activity in a fronto-parietal network, which reflected the confidence-based shift to exploitation of learned contingencies. Our findings suggest that an early characteristic of psychosis lies in a persistent doubt that undermines the stabilization of behavioral policy resulting in a failure to exploit regularities in the environment. PMID:26055423
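The confidence-dependent downregulation of learning described above can be caricatured with a delta rule whose learning rate shrinks as confidence grows. This toy model is purely illustrative (the parameter values and the confidence update are invented, not the authors' Bayesian model).

```python
def confidence_weighted_rl(outcomes, alpha_max=0.5):
    """Toy instrumental-learning model: the effective learning rate is
    downregulated as confidence in the current contingency grows, so a
    confident learner exploits rather than keeps updating."""
    q = 0.5           # estimated reward probability of the cue
    confidence = 0.0  # grows while predictions keep being confirmed
    history = []
    for r in outcomes:
        err = r - q
        alpha = alpha_max * (1.0 - confidence)  # exploit when confident
        q += alpha * err
        # small errors build confidence; large errors erode it
        confidence = max(0.0, min(1.0, confidence + 0.1 * (0.5 - abs(err))))
        history.append(q)
    return history
```

Blocking the confidence build-up (as ketamine is argued to do) would keep the learning rate high, so learned contingencies are never stably exploited.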

  7. Non-functioning adrenal adenomas discovered incidentally on computed tomography

    SciTech Connect

    Mitnick, J.S.; Bosniak, M.A.; Megibow, A.J.; Naidich, D.P.

    1983-08-01

Eighteen patients with unilateral non-metastatic non-functioning adrenal masses were studied with computed tomography (CT). Pathological examination, in the cases where it was performed, revealed benign adrenal adenomas. The others were followed up with serial CT scans and found to show no change in tumor size over a period of six months to three years. On the basis of these findings, the authors suggest certain criteria for a benign adrenal mass, including (a) diameter less than 5 cm, (b) smooth contour, (c) well-defined margin, and (d) no change in size on follow-up. Serial CT scanning can be used as an alternative to surgery in the management of many of these patients.

  8. Do Executive Function and Impulsivity Predict Adolescent Health Behaviour after Accounting for Intelligence? Findings from the ALSPAC Cohort

    PubMed Central

    Pechey, Rachel; Couturier, Dominique-Laurent; Deary, Ian J.; Marteau, Theresa M.

    2016-01-01

    Objective Executive function, impulsivity, and intelligence are correlated markers of cognitive resource that predict health-related behaviours. It is unknown whether executive function and impulsivity are unique predictors of these behaviours after accounting for intelligence. Methods Data from 6069 participants from the Avon Longitudinal Study of Parents and Children were analysed to investigate whether components of executive function (selective attention, attentional control, working memory, and response inhibition) and impulsivity (parent-rated) measured between ages 8 and 10, predicted having ever drunk alcohol, having ever smoked, fruit and vegetable consumption, physical activity, and overweight at age 13, after accounting for intelligence at age 8 and childhood socioeconomic characteristics. Results Higher intelligence predicted having drunk alcohol, not smoking, greater fruit and vegetable consumption, and not being overweight. After accounting for intelligence, impulsivity predicted alcohol use (odds ratio = 1.10; 99% confidence interval = 1.02, 1.19) and smoking (1.22; 1.11, 1.34). Working memory predicted not being overweight (0.90; 0.81, 0.99). Conclusions After accounting for intelligence, executive function predicts overweight status but not health-related behaviours in early adolescence, whilst impulsivity predicts the onset of alcohol and cigarette use, all with small effects. This suggests overlap between executive function and intelligence as predictors of health behaviour in this cohort, with trait impulsivity accounting for additional variance. PMID:27479488

  9. Computing the effective action with the functional renormalization group

    NASA Astrophysics Data System (ADS)

    Codello, Alessandro; Percacci, Roberto; Rachwał, Lesław; Tonero, Alberto

    2016-04-01

The "exact" or "functional" renormalization group equation describes the renormalization group flow of the effective average action Γ_k. The ordinary effective action Γ_0 can be obtained by integrating the flow equation from an ultraviolet scale k=Λ down to k=0. We give several examples of such calculations at one-loop, both in renormalizable and in effective field theories. We reproduce the four-point scattering amplitude in the case of a real scalar field theory with quartic potential and in the case of the pion chiral Lagrangian. In the case of gauge theories, we reproduce the vacuum polarization of QED and of Yang-Mills theory. We also compute the two-point functions for scalars and gravitons in the effective field theory of scalar fields minimally coupled to gravity.
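For reference, the flow equation in question is, in its standard (Wetterich) form, with R_k the infrared regulator and Γ_k^{(2)} the second functional derivative of the effective average action:

```latex
\partial_t \Gamma_k \;=\; \tfrac{1}{2}\,
\mathrm{Tr}\!\left[\left(\Gamma_k^{(2)} + R_k\right)^{-1} \partial_t R_k\right],
\qquad t = \ln k,
\qquad
\Gamma_0 \;=\; \Gamma_\Lambda \;-\; \int_0^\Lambda \frac{dk}{k}\,\partial_t \Gamma_k .
```

The second relation is simply the flow integrated from the ultraviolet scale Λ down to k = 0, as described above.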

  10. Enhancing Commitment or Tightening Control: The Function of Teacher Professional Development in an Era of Accountability

    ERIC Educational Resources Information Center

    Smith, Thomas M.; Rowley, Kristie J.

    2005-01-01

    During the past decade or so, popular rhetoric has shifted away from site-based management and participatory governance as the centerpiece of school reform strategies as accountability and standards-based reform have become the reform mantra of policy makers at all levels of government. Critics of accountability-based reforms have suggested that…

  11. Enzymatic Halogenases and Haloperoxidases: Computational Studies on Mechanism and Function.

    PubMed

    Timmins, Amy; de Visser, Sam P

    2015-01-01

Despite the fact that halogenated compounds are rare in biology, a number of organisms have developed processes to utilize halogens, and in recent years a string of enzymes have been identified that selectively insert halogen atoms into, for instance, an aliphatic C-H bond. Thus, a number of natural products, including antibiotics, contain halogenated functional groups. This unusual process has great relevance to the chemical industry for stereoselective and regiospecific synthesis of haloalkanes. Currently, however, industry makes little use of biological haloperoxidases and halogenases, and efforts are underway to understand their catalytic mechanisms so that their catalytic function can be scaled up. In this review, we summarize experimental and computational studies on the catalytic mechanism of a range of haloperoxidases and halogenases with structurally very different catalytic features and cofactors. This chapter gives an overview of heme-dependent haloperoxidases, nonheme vanadium-dependent haloperoxidases, and flavin adenine dinucleotide-dependent haloperoxidases. In addition, we discuss the S-adenosyl-L-methionine fluorinase and nonheme iron/α-ketoglutarate-dependent halogenases. In particular, computational efforts have been applied extensively to several of these haloperoxidases and halogenases and have given insight into the essential structural features that enable these enzymes to perform the unusual halogen atom transfer to substrates. PMID:26415843

  12. Range, Doppler and astrometric observables computed from Time Transfer Functions: a survey

    NASA Astrophysics Data System (ADS)

    Hees, A.; Bertone, S.; Le Poncin-Lafitte, C.; Teyssandier, P.

    2015-08-01

    Determining range, Doppler and astrometric observables is of crucial interest for modelling and analyzing space observations. We recall how these observables can be computed when the travel time of a light ray is known as a function of the positions of the emitter and the receiver for a given instant of reception (or emission). For a long time, such a function, called a reception (or emission) time transfer function, has been almost exclusively calculated by integrating the null geodesic equations describing the light rays. However, other methods avoiding such an integration have been considerably developed in the last twelve years. We give a survey of the analytical results obtained with these new methods up to the third order in the gravitational constant G for a mass monopole. We briefly discuss the case of quasi-conjunctions, where higher-order enhanced terms must be taken into account for correctly calculating the effects. We summarize the results obtained at the first order in G when the multipole structure and the motion of an axisymmetric body is taken into account. We present some applications to ongoing or future missions like Gaia and Juno. We give a short review of the recent works devoted to the numerical estimates of the time transfer functions and their derivatives.
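    For orientation, the first-order term in G of the reception time transfer function for a static mass monopole M is the classic Shapiro delay (here r_A and r_B are the radial distances of emitter and receiver, and R_AB their mutual distance); the survey's higher-order results extend this expression:

```latex
\mathcal{T}_r(\boldsymbol{x}_A, t_B, \boldsymbol{x}_B)
  = \frac{R_{AB}}{c}
  + \frac{2GM}{c^{3}}
    \ln\!\left(\frac{r_A + r_B + R_{AB}}{r_A + r_B - R_{AB}}\right)
  + O(G^{2}),
\qquad R_{AB} = |\boldsymbol{x}_B - \boldsymbol{x}_A| .
```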

  13. The Time Transfer Functions: an efficient tool to compute range, Doppler and astrometric observables

    NASA Astrophysics Data System (ADS)

    Hees, A.; Bertone, S.; Le Poncin-Lafitte, C.; Teyssandier, P.

    2015-12-01

    Determining range, Doppler and astrometric observables is of crucial interest for modelling and analyzing space observations. We recall how these observables can be computed when the travel time of a light ray is known as a function of the positions of the emitter and the receiver for a given instant of reception (or emission). For a long time, such a function--called a reception (or emission) time transfer function--has been almost exclusively calculated by integrating the null geodesic equations describing the light rays. However, other methods avoiding such an integration have been considerably developed in the last twelve years. We give a survey of the analytical results obtained with these new methods up to the third order in the gravitational constant G for a mass monopole. We briefly discuss the case of quasi-conjunctions, where higher-order enhanced terms must be taken into account for correctly calculating the effects. We summarize the results obtained at the first order in G when the multipole structure and the motion of an axisymmetric body is taken into account. We present some applications to ongoing or future missions like Gaia and Juno. We give a short review of the recent works devoted to the numerical estimates of the time transfer functions and their derivatives.

  14. Computational principles of syntax in the regions specialized for language: integrating theoretical linguistics and functional neuroimaging

    PubMed Central

    Ohta, Shinri; Fukui, Naoki; Sakai, Kuniyoshi L.

    2013-01-01

    The nature of computational principles of syntax remains to be elucidated. One promising approach to this problem would be to construct formal and abstract linguistic models that parametrically predict the activation modulations in the regions specialized for linguistic processes. In this article, we review recent advances in theoretical linguistics and functional neuroimaging in the following respects. First, we introduce the two fundamental linguistic operations: Merge (which combines two words or phrases to form a larger structure) and Search (which searches and establishes a syntactic relation between two words or phrases). We also illustrate certain universal properties of human language, and present hypotheses regarding how sentence structures are processed in the brain. Hypothesis I is that the Degree of Merger (DoM), i.e., the maximum depth of merged subtrees within a given domain, is a key computational concept to properly measure the complexity of tree structures. Hypothesis II is that the basic frame of the syntactic structure of a given linguistic expression is determined essentially by functional elements, which trigger Merge and Search. We then present our recent functional magnetic resonance imaging experiment, demonstrating that the DoM is indeed a key syntactic factor that accounts for syntax-selective activations in the left inferior frontal gyrus and supramarginal gyrus. Hypothesis III is that the DoM domain changes dynamically in accordance with iterative Merge applications, the Search distances, and/or task requirements. We confirm that the DoM accounts for activations in various sentence types. Hypothesis III successfully explains activation differences between object- and subject-relative clauses, as well as activations during explicit syntactic judgment tasks. Future research on the computational principles of syntax will further deepen our understanding of uniquely human mental faculties. PMID:24385957
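    The DoM, as defined in the abstract, is straightforward to compute on a toy representation. The sketch below (not the authors' code; the nested-tuple encoding is an assumption made here) represents each Merge as a pair and takes the DoM to be the maximum nesting depth of merged subtrees:

```python
# Toy illustration: a Merge-built syntactic object as a nested pair;
# the Degree of Merger (DoM) is the maximum depth of merged subtrees.

def merge(a, b):
    """Merge combines two words or phrases into a larger structure."""
    return (a, b)

def dom(node):
    """Maximum depth of merged subtrees (0 for a bare word)."""
    if not isinstance(node, tuple):
        return 0
    return 1 + max(dom(child) for child in node)

# "[[the dog] [chased [the cat]]]"
np1 = merge("the", "dog")
np2 = merge("the", "cat")
vp = merge("chased", np2)
s = merge(np1, vp)
print(dom(s))  # → 3
```

The deeper object-relative embeddings discussed in the article would simply yield larger DoM values under this kind of measure.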

  15. A new computational account of cognitive control over reinforcement-based decision-making: Modeling of a probabilistic learning task.

    PubMed

    Zendehrouh, Sareh

    2015-11-01

    Recent work in the decision-making field offers an account of dual-system theory for the decision-making process. This theory holds that this process is conducted by two main controllers: a goal-directed system and a habitual system. In the reinforcement learning (RL) domain, habitual behaviors are connected with model-free methods, in which appropriate actions are learned through trial-and-error experiences. In contrast, goal-directed behaviors are associated with model-based methods of RL, in which actions are selected using a model of the environment. Studies on cognitive control also suggest that during processes like decision-making, some cortical and subcortical structures work in concert to monitor the consequences of decisions and to adjust control according to current task demands. Here a computational model is presented based on dual-system theory and the cognitive control perspective of decision-making. The proposed model is used to simulate human performance on a variant of a probabilistic learning task. The basic proposal is that the brain implements a dual controller, while an accompanying monitoring system detects several kinds of conflict, including a hypothetical cost conflict. The simulation results address existing theories about two event-related potentials, namely error-related negativity (ERN) and feedback-related negativity (FRN), and explore the best account of them. Based on the results, some testable predictions are also presented. PMID:26339919
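    The model-free ("habitual") half of the dual-system picture can be illustrated in a few lines. This is a generic delta-rule learner on a two-action probabilistic task, a minimal sketch under assumed parameter values, not the paper's model:

```python
import random

random.seed(0)

# Model-free ("habitual") learning by trial and error on a two-action
# probabilistic task: action 0 is rewarded 80% of the time, action 1 20%.
p_reward = [0.8, 0.2]
q = [0.0, 0.0]          # learned action values
alpha, eps = 0.1, 0.1   # learning rate, exploration rate (assumed)

for _ in range(2000):
    # epsilon-greedy action selection
    a = random.randrange(2) if random.random() < eps else q.index(max(q))
    r = 1.0 if random.random() < p_reward[a] else 0.0
    q[a] += alpha * (r - q[a])   # delta-rule (model-free) update

print(q[0] > q[1])  # the richer action acquires the higher value
```

A model-based controller would instead plan over an explicit model of the task; the paper's contribution is the monitoring system arbitrating between the two.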

  16. An Evolutionary Computation Approach to Examine Functional Brain Plasticity.

    PubMed

    Roy, Arnab; Campbell, Colin; Bernier, Rachel A; Hillary, Frank G

    2016-01-01

    One common research goal in systems neurosciences is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well-suited for the study of developmental processes, learning, and even in recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signal representing each region. The drawback to this approach is that much information is lost due to averaging heterogeneous voxels, and therefore a functional relationship between an ROI-pair that evolves at a spatial scale much finer than the ROIs remains undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI-pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional-pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional-pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC based procedure is able to detect functional plasticity where a traditional averaging based approach fails. The subject-specific plasticity estimates obtained using the EC-procedure are highly consistent across multiple runs. 
Group-level analyses using these plasticity estimates showed an increase in the strength

  17. An Evolutionary Computation Approach to Examine Functional Brain Plasticity

    PubMed Central

    Roy, Arnab; Campbell, Colin; Bernier, Rachel A.; Hillary, Frank G.

    2016-01-01

    One common research goal in systems neurosciences is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well-suited for the study of developmental processes, learning, and even in recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signal representing each region. The drawback to this approach is that much information is lost due to averaging heterogeneous voxels, and therefore a functional relationship between an ROI-pair that evolves at a spatial scale much finer than the ROIs remains undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI-pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional-pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional-pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC based procedure is able to detect functional plasticity where a traditional averaging based approach fails. The subject-specific plasticity estimates obtained using the EC-procedure are highly consistent across multiple runs. 
Group-level analyses using these plasticity estimates showed an increase in the strength

  18. Computer Modeling of Protocellular Functions: Peptide Insertion in Membranes

    NASA Technical Reports Server (NTRS)

    Rodriquez-Gomez, D.; Darve, E.; Pohorille, A.

    2006-01-01

    Lipid vesicles became the precursors to protocells by acquiring the capabilities needed to survive and reproduce. These include transport of ions, nutrients and waste products across cell walls and capture of energy and its conversion into a chemically usable form. In modern organisms these functions are carried out by membrane-bound proteins (about 30% of the genome codes for this kind of protein). A number of properties of alpha-helical peptides suggest that their associations are excellent candidates for protobiological precursors of proteins. In particular, some simple alpha-helical peptides can aggregate spontaneously and form functional channels. This process can be described conceptually by a three-step thermodynamic cycle: (1) folding of helices at the water-membrane interface, (2) helix insertion into the lipid bilayer, and (3) specific interactions of these helices that result in functional tertiary structures. Although a crucial step, helix insertion has not been adequately studied because of the insolubility and aggregation of hydrophobic peptides. In this work, we use computer simulation methods (Molecular Dynamics) to characterize the energetics of helix insertion and we discuss its importance in an evolutionary context. Specifically, helices could self-assemble only if their interactions were sufficiently strong to compensate for the unfavorable free energy of insertion of individual helices into membranes, providing a selection mechanism for protobiological evolution.
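    The three-step cycle translates into a free-energy decomposition; the term labels below are chosen here for illustration, and the sign condition restates the selection argument of the abstract:

```latex
\Delta G_{\mathrm{total}}
  = \Delta G_{\mathrm{fold}}
  + \Delta G_{\mathrm{insert}}
  + \Delta G_{\mathrm{assoc}},
\qquad
\Delta G_{\mathrm{total}} < 0
\;\text{for spontaneous self-assembly,}
```

i.e., a favorable association term must outweigh the unfavorable insertion term for helices to self-assemble in the membrane.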

  19. Computation of Multimodal Size-Velocity-Temperature Spray Distribution Functions

    NASA Astrophysics Data System (ADS)

    Archambault, Mark R.

    2002-09-01

    An alternative approach to modeling spray flows, one which does not involve simulation or stochastic integration, is to directly compute the evolution of the probability density function (PDF) describing the drops. The purpose of this paper is to continue exploring an alternative method of solving the spray flow problem. The approach is to derive and solve a set of Eulerian moment transport equations for the quantities of interest in the spray, coupled with the appropriate gas-phase (Eulerian) equations. A second purpose is to continue to explore how a maximum-entropy criterion may be used to provide closure for such a moment-based model. The hope is to further develop an Eulerian-Eulerian model that will permit one to solve for detailed droplet statistics directly without the use of stochastic integration or post-averaging of simulations.

  20. Imaging local brain function with emission computed tomography

    SciTech Connect

    Kuhl, D.E.

    1984-03-01

    Positron emission tomography (PET) using 18F-fluorodeoxyglucose (FDG) was used to map local cerebral glucose utilization in the study of local cerebral function. This information differs fundamentally from structural assessment by means of computed tomography (CT). In normal human volunteers, the FDG scan was used to determine the cerebral metabolic response to controlled sensory stimulation and the effects of aging. Cerebral metabolic patterns are distinctive among depressed and demented elderly patients. The FDG scan appears normal in the depressed patient but is studded with multiple metabolic defects in patients with multi-infarct dementia; in patients with Alzheimer disease, metabolism is particularly reduced in the parietal cortex but only slightly reduced in the caudate and thalamus. The interictal FDG scan effectively detects hypometabolic brain zones that are sites of onset for seizures in patients with partial epilepsy, even though these zones usually appear normal on CT scans. The future prospects of PET are discussed.

  1. Optimizing high performance computing workflow for protein functional annotation.

    PubMed

    Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene

    2014-09-10

    Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low complexity classification algorithm to assign proteins into existing clusters of orthologous groups of proteins. Based on the Position-Specific Iterative Basic Local Alignment Search Tool (PSI-BLAST), the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data. PMID:25313296

  2. Computational Effective Fault Detection by Means of Signature Functions

    PubMed Central

    Baranski, Przemyslaw; Pietrzak, Piotr

    2016-01-01

    The paper presents a computationally effective method for fault detection. A system’s responses are measured under healthy and ill conditions. These signals are used to calculate so-called signature functions that create a signal space. The current system’s response is projected into this space. The signal location in this space allows the fault to be determined easily. No classifier such as a neural network, hidden Markov models, etc. is required. The advantage of this proposed method is its efficiency, as computing projections amounts to calculating dot products. Therefore, this method is suitable for real-time embedded systems due to its simplicity and undemanding processing capabilities, which permit the use of low-cost hardware and allow rapid implementation. The approach performs well for systems that can be considered linear and stationary. The communication presents an application, whereby an industrial process of moulding is supervised. The machine is composed of forms (dies) whose alignment must be precisely set and maintained during the work. Typically, the process is stopped periodically to manually control the alignment. The applied algorithm allows on-line monitoring of the device by analysing the acceleration signal from a sensor mounted on a die. This enables failures to be detected at an early stage, thus prolonging the machine’s life. PMID:26949942
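    The core of the idea is cheap to sketch. The details below are assumptions made for illustration, not the authors' algorithm: reference responses recorded under healthy and faulty conditions serve as signature functions, and the current response is classified by its normalized dot-product projection onto each:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

# Hypothetical reference responses recorded under each condition.
signatures = {
    "healthy": [1.0, 0.9, 1.1, 1.0],
    "fault":   [1.0, 0.1, -0.8, 0.2],
}

def classify(signal):
    # Normalized projection (cosine similarity) onto each signature;
    # the largest projection indicates the system state.
    return max(signatures,
               key=lambda k: dot(signal, signatures[k]) /
                             (norm(signal) * norm(signatures[k])))

print(classify([1.05, 0.95, 1.0, 1.02]))  # → healthy
```

Each classification costs only a handful of multiply-accumulates, which is why the method suits low-cost real-time embedded hardware.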

  3. Computational Effective Fault Detection by Means of Signature Functions.

    PubMed

    Baranski, Przemyslaw; Pietrzak, Piotr

    2016-01-01

    The paper presents a computationally effective method for fault detection. A system's responses are measured under healthy and ill conditions. These signals are used to calculate so-called signature functions that create a signal space. The current system's response is projected into this space. The signal location in this space allows the fault to be determined easily. No classifier such as a neural network, hidden Markov models, etc. is required. The advantage of this proposed method is its efficiency, as computing projections amounts to calculating dot products. Therefore, this method is suitable for real-time embedded systems due to its simplicity and undemanding processing capabilities, which permit the use of low-cost hardware and allow rapid implementation. The approach performs well for systems that can be considered linear and stationary. The communication presents an application, whereby an industrial process of moulding is supervised. The machine is composed of forms (dies) whose alignment must be precisely set and maintained during the work. Typically, the process is stopped periodically to manually control the alignment. The applied algorithm allows on-line monitoring of the device by analysing the acceleration signal from a sensor mounted on a die. This enables failures to be detected at an early stage, thus prolonging the machine's life. PMID:26949942

  4. Assessing executive function using a computer game: computational modeling of cognitive processes.

    PubMed

    Hagler, Stuart; Jimison, Holly Brugge; Pavel, Misha

    2014-07-01

    Early and reliable detection of cognitive decline is one of the most important challenges of current healthcare. In this project, we developed an approach whereby a frequently played computer game can be used to assess a variety of cognitive processes and estimate the results of the pen-and-paper trail making test (TMT)--known to measure executive function, as well as visual pattern recognition, speed of processing, working memory, and set-switching ability. We developed a computational model of the TMT based on a decomposition of the test into several independent processes, each characterized by a set of parameters that can be estimated from play of a computer game designed to resemble the TMT. An empirical evaluation of the model suggests that it is possible to use the game data to estimate the parameters of the underlying cognitive processes and using the values of the parameters to estimate the TMT performance. Cognitive measures and trends in these measures can be used to identify individuals for further assessment, to provide a mechanism for improving the early detection of neurological problems, and to provide feedback and monitoring for cognitive interventions in the home. PMID:25014944

  5. Chemical Visualization of Boolean Functions: A Simple Chemical Computer

    NASA Astrophysics Data System (ADS)

    Blittersdorf, R.; Müller, J.; Schneider, F. W.

    1995-08-01

    We present a chemical realization of the Boolean functions AND, OR, NAND, and NOR with a neutralization reaction carried out in three coupled continuous flow stirred tank reactors (CSTR). Two of these CSTRs are used as input reactors, the third reactor marks the output. The chemical reaction is the neutralization of hydrochloric acid (HCl) with sodium hydroxide (NaOH) in the presence of phenolphthalein as an indicator, which is red in alkaline solutions and colorless in acidic solutions, representing the two binary states 1 and 0, respectively. The time required for a "chemical computation" is determined by the flow rate of reactant solutions into the reactors since the neutralization reaction itself is very fast. While the acid flow to all reactors is equal and constant, the flow rate of NaOH solution controls the states of the input reactors. The connectivities between the input and output reactors determine the flow rate of NaOH solution into the output reactor, according to the chosen Boolean function. Thus the state of the output reactor depends on the states of the input reactors.
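    The logic of the scheme can be mimicked numerically. The threshold values below are assumptions for illustration, not the paper's kinetics: a reactor is alkaline (state 1, red) when its NaOH inflow exceeds the constant HCl inflow, and the gate is selected by how much NaOH each active input contributes to the output reactor:

```python
ACID = 0.9  # constant HCl inflow to the output reactor (assumed units)

def output_state(in1, in2, naoh_per_input):
    """Output reactor is alkaline (1) iff total NaOH inflow exceeds the acid."""
    naoh = naoh_per_input * (in1 + in2)
    return 1 if naoh > ACID else 0

# AND: each active input contributes half a unit, so both are needed;
# OR: each active input contributes a full unit, so one suffices.
AND  = lambda a, b: output_state(a, b, 0.5)
OR   = lambda a, b: output_state(a, b, 1.0)
NAND = lambda a, b: 1 - AND(a, b)
NOR  = lambda a, b: 1 - OR(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", AND(a, b), OR(a, b), NAND(a, b), NOR(a, b))
```

In the chemical system the inversion in NAND/NOR comes from the chosen connectivities rather than an arithmetic complement, but the truth tables coincide.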

  6. Accounting for Accountability.

    ERIC Educational Resources Information Center

    Colorado State Dept. of Education, Denver. Cooperative Accountability Project.

    This publication reports on two Regional Educational Accountability Conferences on Techniques sponsored by the Cooperative Accountability Project. Accountability is described as an "emotionally-charged issue" and an "operationally demanding concept." Overviewing accountability, major speakers emphasized that accountability is a means toward…

  7. Memory and Generativity in Very High Functioning Autism: A Firsthand Account, and an Interpretation

    ERIC Educational Resources Information Center

    Boucher, Jill

    2007-01-01

    JS is a highly able person with Asperger syndrome whose language and intellectual abilities are, and always have been, superior. The first part of this short article consists of JS's analytical account of his atypical memory abilities, and the strategies he uses for memorizing and learning. JS has also described specific difficulties with creative…

  8. A survey. Financial accounting and internal control functions pursued by hospital boards.

    PubMed

    Gavin, T A

    1984-09-01

    Justification for a board committee's existence is its ability to devote time to issues judged to be important by the full board. This seems to have happened. Multiple committees pursue more functions than the other committee structures. Boards lacking an FA/IC committee pursue significantly fewer functions than their counterparts with committees. Substantial respondent agreement exists on those functions most and least frequently pursued, those perceived to be most and least important, and those perceived to be most and least effectively undertaken. Distinctions between committee structures and the full board, noted in the previous paragraph, hold true with respect to the importance of functions. All board structures identified reviewing the budget and comparing it to actual results as important. Committee structures are generally more inclined to address functions related to the work of the independent auditor and the effectiveness of the hospital's systems and controls than are full board structures. Functions related to the internal auditor are pursued least frequently by all FA/IC board structures. The following suggestions are made to help boards pay adequate attention to and obtain objective information about the financial affairs of their hospitals. Those boards that do not have some form of an FA/IC committee should consider starting one. Evidence shows chief financial officers have been a moving force in establishing and strengthening such committees. Boards having a joint or single committee structure should consider upgrading their structure to either a single committee or multiple committees, respectively. The complexity of the healthcare environment requires that more FA/IC functions be addressed by the board. The board or its FA/IC committee(s) should meet with their independent CPAs, fiscal intermediary auditors, and internal auditors. Where the hospital lacks an internal audit function, a study should be undertaken to determine the feasibility of

  9. HANOIPC3: a computer program to evaluate executive functions.

    PubMed

    Guevara, M A; Rizo, L; Ruiz-Díaz, M; Hernández-González, M

    2009-08-01

    This article describes a computer program (HANOIPC3) based on the Tower of Hanoi game that, by analyzing a series of parameters during execution, allows a fast and accurate evaluation of data related to certain executive functions, especially planning, organizing and problem-solving. This computerized version has only one level of difficulty based on the use of 3 disks, but it stipulates an additional rule: only one disk may be moved at a time, and only to an adjacent peg (i.e., no peg can be skipped over). In the original version--without this stipulation--the minimum number of movements required to complete the task is 7, but under the conditions of this computerized version this increases to 26. HANOIPC3 has three important advantages: (1) it allows a researcher or clinician to modify the rules by adding or removing certain conditions, thus augmenting the utility and flexibility in test execution and the interpretation of results; (2) it provides on-line feedback to subjects about their execution; and (3) it creates a specific file to store the scores that correspond to the parameters obtained during trials. The parameters that can be measured include: latencies (time taken for each movement, measured in seconds), total test time, total number of movements, and the number of correct and incorrect movements. The efficacy and adaptability of this program have been confirmed. PMID:19303660
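    The 26-move figure for the adjacent-peg variant can be checked independently of HANOIPC3 with a small breadth-first search; this sketch is mine, not the program's code:

```python
from collections import deque

# State: tuple peg-index per disk (disk 0 is the smallest), pegs 0..2.
# Rule set: one disk at a time, adjacent pegs only, no larger disk on
# a smaller one. BFS gives the minimum number of moves from peg 0 to peg 2.

def top_disk(state, peg):
    disks = [d for d in range(len(state)) if state[d] == peg]
    return min(disks) if disks else None

def neighbors(state):
    for peg in range(3):
        d = top_disk(state, peg)
        if d is None:
            continue
        for target in (peg - 1, peg + 1):   # adjacent pegs only
            if 0 <= target <= 2:
                t = top_disk(state, target)
                if t is None or t > d:      # only onto a larger disk
                    s = list(state)
                    s[d] = target
                    yield tuple(s)

def min_moves(n):
    start, goal = (0,) * n, (2,) * n
    dist = {start: 0}
    q = deque([start])
    while q:
        s = q.popleft()
        if s == goal:
            return dist[s]
        for nxt in neighbors(s):
            if nxt not in dist:
                dist[nxt] = dist[s] + 1
                q.append(nxt)

print(min_moves(3))  # → 26
```

This reproduces the abstract's claim: with the adjacency restriction the 3-disk minimum rises from 7 to 26 moves.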

  10. Accounting Specialist.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication identifies 20 subjects appropriate for use in a competency list for the occupation of accounting specialist, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 20 units are as follows:…

  11. pH-Regulated Mechanisms Account for Pigment-Type Differences in Epidermal Barrier Function

    PubMed Central

    Gunathilake, Roshan; Schurer, Nanna Y.; Shoo, Brenda A.; Celli, Anna; Hachem, Jean-Pierre; Crumrine, Debra; Sirimanna, Ganga; Feingold, Kenneth R.; Mauro, Theodora M.; Elias, Peter M.

    2009-01-01

    To determine whether pigment type determines differences in epidermal function, we studied stratum corneum (SC) pH, permeability barrier homeostasis, and SC integrity in three geographically disparate populations with pigment type I–II versus IV–V skin (Fitzpatrick I–VI scale). Type IV–V subjects showed: (i) lower surface pH (≈0.5 U); (ii) enhanced SC integrity (transepidermal water loss change with sequential tape strippings); and (iii) more rapid barrier recovery than type I–II subjects. Enhanced barrier function could be ascribed to increased epidermal lipid content, increased lamellar body production, and reduced acidity, leading to enhanced lipid processing. Compromised SC integrity in type I–II subjects could be ascribed to increased serine protease activity, resulting in accelerated desmoglein-1 (DSG-1)/corneodesmosome degradation. In contrast, DSG-1-positive CDs persisted in type IV–V subjects, but due to enhanced cathepsin-D activity, SC thickness did not increase. Adjustment of pH of type I–II SC to type IV–V levels improved epidermal function. Finally, dendrites from type IV–V melanocytes were more acidic than those from type I–II subjects, and they transfer more melanosomes to the SC, suggesting that melanosome secretion could contribute to the more acidic pH of type IV–V skin. These studies show marked pigment-type differences in epidermal structure and function that are pH driven. PMID:19177137

  12. Developmental Language Impairment through the Lens of the ICF: An Integrated Account of Children's Functioning

    ERIC Educational Resources Information Center

    Dempsey, Lynn; Skarakis-Doyle, Elizabeth

    2010-01-01

    The conceptual framework of the World Health Organization's International Classification of Functioning, Disability and Health (ICF) has the potential to advance understanding of developmental language impairment (LI) and enhance clinical practice. The framework provides a systematic way of unifying numerous lines of research, which have linked a…

  13. Calibration function for the Orbitrap FTMS accounting for the space charge effect.

    PubMed

    Gorshkov, Mikhail V; Good, David M; Lyutvinskiy, Yaroslav; Yang, Hongqian; Zubarev, Roman A

    2010-11-01

    Ion storage in an electrostatic trap has been implemented with the introduction of the Orbitrap Fourier transform mass spectrometer (FTMS), which demonstrates performance similar to high-field ion cyclotron resonance MS. High mass spectral characteristics resulted in rapid acceptance of the Orbitrap FTMS for Life Sciences applications. The basics of Orbitrap operation are well documented; however, as in any ion trap MS technology, its performance is limited by interactions between the ion clouds. These interactions result in ion cloud couplings, systematic errors in measured masses, interference between ion clouds of different size yet with close m/z ratios, etc. In this work, we have characterized the space-charge effect on the measured frequency for the Orbitrap FTMS, exploring the possibility of achieving sub-ppm levels of mass measurement accuracy (MMA) for peptides over a wide range of total ion population. As a result of this characterization, we proposed an m/z calibration law for the Orbitrap FTMS that accounts for the total ion population present in the trap during a data acquisition event. Using this law, we were able to achieve a zero-space charge MMA limit of 80 ppb for the commercial Orbitrap FTMS system and sub-ppm level of MMA over a wide range of total ion populations, with the automatic gain control values varying from 10 to 10^7. PMID:20696596
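    To make the idea of a population-dependent calibration concrete, the sketch below fits a generic two-term law of the assumed form m/z = A/f^2 + B*N/f^2 (this functional form and all numbers are illustrative assumptions, not the formula or data from the paper). Linearizing as (m/z)*f^2 = A + B*N reduces the fit to ordinary least squares:

```python
def fit_calibration(freqs, populations, mzs):
    """Least-squares fit of (m/z)*f^2 = A + B*N over observed peaks."""
    ys = [mz * f * f for mz, f in zip(mzs, freqs)]
    n = len(ys)
    sn = sum(populations)
    sy = sum(ys)
    snn = sum(N * N for N in populations)
    sny = sum(N * y for N, y in zip(populations, ys))
    # normal equations for a straight-line fit y = A + B*N
    B = (n * sny - sn * sy) / (n * snn - sn * sn)
    A = (sy - B * sn) / n
    return A, B

# Synthetic data generated with A = 2.0e9, B = -4.0 (arbitrary units):
freqs = [1000.0, 1000.0, 2000.0]
pops = [1e4, 1e6, 1e5]
mzs = [(2.0e9 - 4.0 * N) / (f * f) for f, N in zip(freqs, pops)]
A, B = fit_calibration(freqs, pops, mzs)
print(round(A / 1e9, 3), round(B, 3))  # → 2.0 -4.0
```

Once A and B are calibrated, the population term corrects the space-charge-induced frequency shift at acquisition time.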

  14. Accounting for heterogeneity among treatment sites and time trends in developing crash modification functions.

    PubMed

    Sacchi, Emanuele; Sayed, Tarek

    2014-11-01

    Collision modification factors (CMFs) are commonly used to quantify the impact of safety countermeasures. The CMFs obtained from observational before-after (BA) studies are usually estimated by averaging the safety impact (i.e., index of effectiveness) for a group of treatment sites. The heterogeneity among the treatment locations, in terms of their characteristics, and the effect of this heterogeneity on safety treatment effectiveness are usually ignored. This is in contrast to treatment evaluations in other fields, like medical statistics, where variations in the magnitude (or in the direction) of response to the same treatment given to different patients are considered. This paper introduces an approach for estimating a CMFunction from BA safety studies that accounts for variable treatment location characteristics (heterogeneity). Treatment site heterogeneity was incorporated into the CMFunction using fixed-effects and random-effects regression models. In addition to heterogeneity, the paper also advocates the use of CMFunctions with a time variable to acknowledge that safety treatment (intervention) effects do not occur instantaneously but are spread over future time. This is achieved using non-linear intervention (Koyck) models, developed within a hierarchical full Bayes (FB) context. To demonstrate the approach, a case study is presented to evaluate the safety effectiveness of the "Signal Head Upgrade Program" recently implemented in the city of Surrey (British Columbia, Canada), where signal visibility was improved at several urban signalized intersections. The results demonstrated the importance of considering treatment site heterogeneity and time trends when developing CMFunctions. PMID:25033279
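    The Koyck (geometric lag) idea invoked in the abstract is that a treatment's effect builds up over time with geometrically decaying lag weights. The sketch below uses assumed parameter values purely for illustration, not the paper's fitted model:

```python
def koyck_effect(beta, lam, k):
    """Cumulative treatment effect k steps after the intervention:
    sum of geometric lag weights beta * lam**j for j = 0..k."""
    return sum(beta * lam ** j for j in range(k + 1))

# Assumed values: immediate effect beta and decay parameter lam.
beta, lam = -0.10, 0.5

for k in (0, 1, 2, 10):
    print(k, round(koyck_effect(beta, lam, k), 4))

# The long-run effect converges to beta / (1 - lam) = -0.2,
# i.e., twice the immediate effect under these assumed values.
```

In the paper this lag structure enters a hierarchical full Bayes intervention model rather than a closed-form sum, but the time profile of the treatment effect is the same geometric build-up.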

  15. Elusive accountabilities in the HIV scale-up: 'ownership' as a functional tautology.

    PubMed

    Esser, Daniel E

    2014-01-01

    Mounting concerns over aid effectiveness have rendered 'ownership' a central concept in the vocabulary of development assistance for health (DAH). The article investigates the application of both 'national ownership' and 'country ownership' in the broader development discourse as well as, more specifically, in the context of internationally funded HIV/AIDS interventions. Based on comprehensive literature reviews, the research uncovers a multiplicity of definitions, most of which either diverge from or plainly contradict the concept's original meaning and intent. During the last 10 years in particular, it appears that both public and private donors have advocated for greater 'ownership' by recipient governments and countries to hedge their own political risk rather than to work towards greater inclusion of the latter in agenda-setting and programming. Such politically driven semantic dynamics suggest that the concept's salience is not merely a discursive reflection of globally skewed power relations in DAH but a deliberate exercise in limiting donors' accountabilities. At the same time, the research also finds evidence that this conceptual contortion frames current global public health scholarship, adding further urgency to the need to critically re-evaluate the international political economy of global public health from a discursive perspective. PMID:24498888

  16. Computational Multiscale Toxicodynamic Modeling of Silver and Carbon Nanoparticle Effects on Mouse Lung Function

    PubMed Central

    Mukherjee, Dwaipayan; Botelho, Danielle; Gow, Andrew J.; Zhang, Junfeng; Georgopoulos, Panos G.

    2013-01-01

    A computational, multiscale toxicodynamic model has been developed to quantify and predict pulmonary effects due to uptake of engineered nanomaterials (ENMs) in mice. The model consists of a collection of coupled toxicodynamic modules that were independently developed and tested using information obtained from the literature. The modules describe the dynamics of tissue, with explicit focus on the cells and surfactant chemicals that regulate the process of breathing, as well as the response of the pulmonary system to xenobiotics. Alveolar type I and type II cells and alveolar macrophages were included in the model, along with surfactant phospholipids and surfactant proteins, to account for processes occurring at multiple biological scales, coupling cellular and surfactant dynamics affected by nanoparticle exposure and linking the effects to tissue-level changes in lung function. Nanoparticle properties such as size, surface chemistry, and zeta potential were explicitly considered in modeling the interactions of these particles with biological media. The model predictions were compared with in vivo lung function measurements in mice and with analyses of mouse lung lavage fluid following exposures to silver and carbon nanoparticles. The predictions were found to follow the trends of observed changes in mouse surfactant composition over 7 days post-dosing and are in good agreement with the observed changes in mouse lung function over the same period. PMID:24312506

  17. Using the compute unified device architecture programming environment in simulation of ion-beam injection line with account for space charge effect

    NASA Astrophysics Data System (ADS)

    Yudin, I. P.; Perepelkin, E. E.; Tyutyunnikov, S. I.

    2011-11-01

    A simulation of the beam injection line of a synchrotron is performed within the Veksler and Baldin Laboratory of High Energy Physics, Joint Institute for Nuclear Research (VBLHEP JINR) project "The Development and Implementation of Units of a Synchrotron for Hadron Therapy." The parameters of the injection line are chosen for the transport of beams with intensities of 25-100 mA through the injection channel of the synchrotron, taking the space-charge effect into account. The simulation was performed using the method of macroparticles (the PIC method). Massively parallel computation on graphics processors using Compute Unified Device Architecture (CUDA) technology was applied to accelerate the computations. A 66-fold speedup was obtained using the Tesla C1060 computing module relative to a single 2.4 GHz CPU core.
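
    The PIC method mentioned above starts each step by depositing macroparticle charge onto a grid before solving for the space-charge field. A minimal 1-D cloud-in-cell deposition sketch (plain Python for clarity; the paper's implementation is a CUDA kernel, and this is not the paper's code):

```python
def deposit_charge(positions, charges, grid_n, dx):
    """Cloud-in-cell (linear-weighting) deposition of macroparticle
    charge onto a periodic 1-D grid of grid_n cells of width dx.
    Each particle's charge is split between its two nearest grid
    points in proportion to its distance from them."""
    rho = [0.0] * grid_n
    for x, q in zip(positions, charges):
        cell = int(x / dx)
        frac = x / dx - cell          # fractional position inside the cell
        rho[cell % grid_n] += q * (1.0 - frac) / dx
        rho[(cell + 1) % grid_n] += q * frac / dx
    return rho
```

    On a GPU, the per-particle loop is what gets parallelized; the only subtlety is that concurrent deposits into the same cell must use atomic additions.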

  18. Enhancing functionality and performance in the PVM network computing system

    SciTech Connect

    Sunderam, V.

    1996-09-01

    The research funded by this grant is part of an ongoing research project in heterogeneous distributed computing with the PVM system, at Emory as well as at Oak Ridge Labs and the University of Tennessee. This grant primarily supports research at Emory that continues to evolve new concepts and systems in distributed computing, but it also includes the PI's ongoing interaction with the other groups in terms of collaborative research as well as software systems development and maintenance. Our second-year efforts (July 1995 - June 1996) continued the first year's topics, namely (a) visualization of PVM programs to complement XPVM displays; (b) I/O and generalized distributed computing in PVM; and (c) evolution of a multithreaded concurrent computing model. 12 refs.

  19. Texture functions in image analysis: A computationally efficient solution

    NASA Technical Reports Server (NTRS)

    Cox, S. C.; Rose, J. F.

    1983-01-01

    A computationally efficient means for calculating texture measurements from digital images by use of the co-occurrence technique is presented. The calculation of the statistical descriptors of image texture and a solution that circumvents the need for calculating and storing a co-occurrence matrix are discussed. The results show that existing efficient algorithms for calculating sums, sums of squares, and cross products can be used to compute complex co-occurrence relationships directly from the digital image input.
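
    The idea of computing co-occurrence statistics from running sums, sums of squares, and cross products, without materializing the co-occurrence matrix, can be sketched as follows (a simplified illustration for horizontally adjacent pixel pairs, not the report's algorithm):

```python
def cooccurrence_texture(image):
    """Contrast and correlation of horizontally adjacent pixel pairs,
    accumulated directly from the image. Only running sums, sums of
    squares, and cross products are kept, so no co-occurrence matrix
    is ever built or stored."""
    n = 0
    s_i = s_j = s_ii = s_jj = s_ij = s_contrast = 0.0
    for row in image:
        for a, b in zip(row, row[1:]):
            n += 1
            s_i += a
            s_j += b
            s_ii += a * a
            s_jj += b * b
            s_ij += a * b
            s_contrast += (a - b) ** 2
    mu_i, mu_j = s_i / n, s_j / n
    var_i = s_ii / n - mu_i ** 2
    var_j = s_jj / n - mu_j ** 2
    contrast = s_contrast / n
    corr = (s_ij / n - mu_i * mu_j) / ((var_i * var_j) ** 0.5 or 1.0)
    return contrast, corr
```

    A checkerboard-like image, where every neighbor differs maximally, gives contrast 1 and correlation -1.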

  20. Do general intellectual functioning and socioeconomic status account for performance on the Children's Gambling Task?

    PubMed Central

    Mata, Fernanda; Sallum, Isabela; Miranda, Débora M.; Bechara, Antoine; Malloy-Diniz, Leandro F.

    2013-01-01

    Studies that use the Iowa Gambling Task (IGT) and its age-appropriate versions as indices of affective decision-making during childhood and adolescence have demonstrated significant individual differences in scores. Our study investigated the association between general intellectual functioning and socioeconomic status (SES) and their effect on the development of affective decision-making in preschoolers, using a computerized version of the Children's Gambling Task (CGT). We administered the CGT and the Columbia Mental Maturity Scale (CMMS) to 137 Brazilian children between the ages of 3 and 5 years to assess their general intellectual functioning. We also used the Brazilian Criterion of Economic Classification (CCEB) to assess their SES. Age differences between 3- and 4-year-olds, but not between 4- and 5-year-olds, confirmed the results obtained by Kerr and Zelazo (2004), indicating the rapid development of affective decision-making during the preschool period. Both 4- and 5-year-olds performed significantly above chance on blocks 3, 4, and 5 of the CGT, whereas 3-year-olds' mean scores did not differ from chance. We found that general intellectual functioning was not related to affective decision-making. On the other hand, our findings showed that children with high SES performed better on the last block of the CGT than children with low SES, which indicates that children from the former group seem more likely to use the information about the gain/loss aspects of the decks to efficiently choose cards from the advantageous deck throughout the task. PMID:23760222

  1. 'A Leg to Stand On' by Oliver Sacks: a unique autobiographical account of functional paralysis.

    PubMed

    Stone, Jon; Perthen, Jo; Carson, Alan J

    2012-09-01

    Oliver Sacks, the well-known neurologist and writer, published his fourth book, 'A Leg to Stand On', in 1984, following an earlier essay, 'The Leg', in 1982. The book described his recovery after a fall in a remote region of Norway in which he injured his leg. Following surgery to reattach his quadriceps muscle, he experienced an emotional period in which his leg no longer felt a part of his body, and he struggled to regain his ability to walk. Sacks attributed the experience to a neurologically determined disorder of body-image and body-ego induced by peripheral injury. In the first edition of his book Sacks explicitly rejected the diagnosis of 'hysterical paralysis' as it was then understood, although he approached this diagnosis more closely in subsequent revisions. In this article we propose that, in the light of better understanding of functional neurological symptoms, Sacks' experiences deserve to be reappraised as a unique insight into a genuinely experienced functional/psychogenic leg paralysis following injury. PMID:22872718

  2. A comprehensive account of spectral, Hartree Fock, and Density Functional Theory studies of 2-chlorobenzothiazole

    NASA Astrophysics Data System (ADS)

    Daswani, Ujla; Sharma, Pratibha; Kumar, Ashok

    2015-01-01

    The benzothiazole moiety plays an important role in medicinal chemistry, with a wide range of pharmacological activities. Herein, a simple benzothiazole derivative, 2-chlorobenzothiazole (2CBT), has been analyzed. The spectroscopic properties of the target compound were examined by FT-IR (4400-450 cm-1), FT-Raman (4000-50 cm-1), and NMR techniques. The 1H and 13C NMR spectra were recorded in DMSO. Theoretical calculations were performed by the ab initio Hartree-Fock and Density Functional Theory (DFT)/B3LYP methods using various basis set combinations. The scaled B3LYP/6-311++G(d,p) results closely match the experimental findings. Electronic absorption spectra, along with energies and oscillator strengths, were obtained by the TDDFT method. Atomic charges have also been reported. The total density isosurface and the total density mapped with the electrostatic potential surface (MESP) have been shown.

  3. Clinical evaluation of cochlear implant sound coding taking into account conjectural masking functions, MP3000™

    PubMed Central

    Buechner, Andreas; Beynon, Andy; Szyfter, Witold; Niemczyk, Kazimierz; Hoppe, Ulrich; Hey, Matthias; Brokx, Jan; Eyles, Julie; Van de Heyning, Paul; Paludetti, Gaetano; Zarowski, Andrzej; Quaranta, Nicola; Wesarg, Thomas; Festen, Joost; Olze, Heidi; Dhooge, Ingeborg; Müller-Deile, Joachim; Ramos, Angel; Roman, Stephane; Piron, Jean-Pierre; Cuda, Domenico; Burdo, Sandro; Grolman, Wilko; Vaillard, Samantha Roux; Huarte, Alicia; Frachet, Bruno; Morera, Constantine; Garcia-Ibáñez, Luis; Abels, Daniel; Walger, Martin; Müller-Mazotta, Jochen; Leone, Carlo Antonio; Meyer, Bernard; Dillier, Norbert; Steffens, Thomas; Gentine, André; Mazzoli, Manuela; Rypkema, Gerben; Killian, Matthijs; Smoorenburg, Guido

    2011-01-01

    Efficacy of the SPEAK and ACE coding strategies was compared with that of a new strategy, MP3000™, by 37 European implant centers including 221 subjects. The SPEAK and ACE strategies are based on selection of the 8–10 spectral components with the highest levels, while MP3000 is based on selection of only 4–6 components with the highest levels relative to an estimate of the spread of masking. The pulse rate per component was fixed. No significant difference was found in speech scores or in coding preference between the SPEAK/ACE and MP3000 strategies. Battery life was 24% longer for the MP3000 strategy. With MP3000 the best results were found for a selection of six components. In addition, the best results were found for a masking function with a low-frequency slope of 50 dB/Bark and a high-frequency slope of 37 dB/Bark (50/37), as compared with the other combinations examined, 40/30 and 20/15 dB/Bark. That the best results were found for the steepest slopes does not seem to agree with current estimates of the spread of masking in electrical stimulation. Future research might reveal whether performance relative to SPEAK/ACE can be enhanced by increasing the number of channels in MP3000 beyond 4–6, and it should shed more light on the optimum steepness of the slopes of the masking functions applied in MP3000. PMID:22251806
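
    The selection rule described above, keeping the components whose level is highest relative to a spread-of-masking estimate, can be sketched roughly as below. This is an illustration only: the slopes are applied per channel step as a stand-in for the dB/Bark values quoted, whereas the real MP3000 strategy works on a psychoacoustic Bark scale.

```python
def select_components(levels, n_select=6, lo_slope=50.0, hi_slope=37.0):
    """Select the n_select channels whose level stands out most against
    a crude spread-of-masking estimate. Each other channel j masks
    channel i with a triangular pattern: masking falls off by hi_slope
    dB per channel step above the masker and lo_slope dB below it."""
    n = len(levels)
    salience = []
    for i in range(n):
        mask = max(
            (levels[j] - (hi_slope * (i - j) if j < i else lo_slope * (j - i))
             for j in range(n) if j != i),
            default=float("-inf"),
        )
        salience.append((levels[i] - mask, i))
    top = sorted(salience, reverse=True)[:n_select]
    return sorted(i for _, i in top)
```

    Unlike a plain "highest levels" rule (SPEAK/ACE), a quiet channel adjacent to a loud one scores poorly here even if its absolute level would have qualified.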

  4. Basic processes in reading: a critical review of pseudohomophone effects in reading aloud and a new computational account.

    PubMed

    Reynolds, Michael; Besner, Derek

    2005-08-01

    There are pervasive lexical influences on the time that it takes to read aloud novel letter strings that sound like real words (e.g., brane from brain). However, the literature presents a complicated picture, given that the time taken to read aloud such items is sometimes shorter and sometimes longer than for a control string (e.g., frane), and that reading-aloud time is sometimes affected by the frequency of the base word and other times is not. In the present review, we first organize these data to show that there is considerably more consistency than has previously been acknowledged. We then consider six different accounts that have been proposed to explain various aspects of these data. Four of them immediately fail in one way or another. The remaining two accounts may be able to explain these findings, but they either make counterintuitive assumptions or invoke a novel mechanism solely to explain these findings. A new account is advanced that is able to explain all of the effects reviewed here and has none of the problems associated with the other accounts. According to this account, different types of lexical knowledge are used when pseudohomophones and nonword controls are read aloud in mixed and pure lists. This account is then implemented in Coltheart, Rastle, Perry, Langdon, and Ziegler's (2001) dual route cascaded model in order to provide an existence proof that it accommodates all of the effects, while retaining the ability to simulate three standard effects seen in nonword reading aloud. PMID:16447376

  5. Challenges in computational studies of enzyme structure, function and dynamics.

    PubMed

    Carvalho, Alexandra T P; Barrozo, Alexandre; Doron, Dvir; Kilshtain, Alexandra Vardi; Major, Dan Thomas; Kamerlin, Shina Caroline Lynn

    2014-11-01

    In this review we give an overview of the field of computational enzymology. We start by describing the birth of the field, with emphasis on the work of the 2013 Nobel Laureates in Chemistry. We then present key features of the state of the art in the field, showing what theory, accompanied by experiments, has taught us so far about enzymes. We also briefly describe computational methods, such as quantum mechanics-molecular mechanics approaches, reaction coordinate treatments, and free energy simulation approaches. We conclude by discussing open questions and challenges. PMID:25306098

  6. Computing options for multiple-trait test-day random regression models while accounting for heat tolerance.

    PubMed

    Aguilar, I; Tsuruta, S; Misztal, I

    2010-06-01

    Data included 90,242,799 test-day records from the first, second, and third parities of 5,402,484 Holstein cows, with 9,326,754 animals in the pedigree. Additionally, daily temperature-humidity indexes (THI) from 202 weather stations were available. The fixed effects included herd test day, age at calving, milking frequency, and days in milk (DIM) classes. Random effects were additive genetic, permanent environment, and herd-year, all fit as random regressions. Covariates included linear splines with four knots at 5, 50, 200, and 305 DIM and a function of THI. Mixed model equations were solved using an iteration-on-data program with a preconditioned conjugate gradient algorithm. The preconditioners used were diagonal (D), block diagonal due to traits (BT), and block diagonal due to traits and correlated effects (BTCORR). One run used BT with a 'diagonalized' model in which the random effects were reparameterized for diagonal (co)variance matrices among traits (BTDIAG). Memory requirements were 8.7 Gb for D, 10.4 Gb for BT and BTDIAG, and 24.3 Gb for BTCORR. Computing times (rounds) were 14 days (952) for D, 10.7 days (706) for BT, 7.7 days (494) for BTDIAG, and 4.6 days (289) for BTCORR. The convergence pattern was strongly influenced by the choice of fixed effects. When sufficient memory is available, the BTCORR option is the fastest and simplest to implement; the next most efficient method, BTDIAG, requires additional steps for diagonalization and back-diagonalization. PMID:20536641
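
    The preconditioned conjugate gradient method at the heart of the comparison above can be sketched with the simplest of the preconditioners, the diagonal (Jacobi) one. This toy dense-matrix version only illustrates the algorithm; the study's solver works matrix-free by iterating over the data.

```python
def pcg(A, b, tol=1e-10, max_iter=200):
    """Solve A x = b for symmetric positive-definite A by conjugate
    gradients with a diagonal (Jacobi) preconditioner M = diag(A)."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                                   # residual b - A x for x = 0
    m_inv = [1.0 / A[i][i] for i in range(n)]  # inverse of the preconditioner
    z = [m_inv[i] * r[i] for i in range(n)]
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        for i in range(n):
            x[i] += alpha * p[i]
            r[i] -= alpha * Ap[i]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [m_inv[i] * r[i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
        rz = rz_new
    return x
```

    The block preconditioners compared in the study follow the same recurrence; only the solve with M per iteration changes, trading memory for fewer rounds exactly as the timings above show.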

  7. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.

    1983-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F. Other mathematical functions include the Bessel function I0, the gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
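
    The closing remark, that uniform and normal generators can feed other distributions, rests on standard reductions. One example of the idea (in Python rather than the report's Fortran): a chi-square variate with k degrees of freedom is the sum of k squared standard normals.

```python
import random

def chi_square_sample(df, rng):
    """Draw one chi-square(df) variate as a sum of df squared standard
    normal draws, building one distribution's generator on top of
    another's, as the report suggests."""
    return sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(df))

rng = random.Random(42)
samples = [chi_square_sample(3, rng) for _ in range(20000)]
mean = sum(samples) / len(samples)   # should be close to df = 3
```

    Similar reductions give gamma variates from exponentials and F variates from two chi-squares.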

  8. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.H.

    1980-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F tests. Other mathematical functions include the Bessel function I0, the gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)

  9. A method to account for outliers in the development of safety performance functions.

    PubMed

    El-Basyouny, Karim; Sayed, Tarek

    2010-07-01

    Accident data sets can include some unusual data points that are not typical of the rest of the data. The presence of these data points (usually termed outliers) can have a significant impact on the estimates of the parameters of safety performance functions (SPFs). Few studies have considered outlier analysis in the development of SPFs, and in those studies the practice has been to identify and then exclude outliers from further analysis. This paper introduces alternative mixture models based on multivariate Poisson lognormal (MVPLN) regression. The proposed approach presents outlier-resistance modeling techniques that provide robust safety inferences by down-weighting the outlying observations rather than rejecting them. The first proposed model is a scale-mixture model obtained by replacing the normal distribution in the Poisson-lognormal hierarchy with the Student t distribution, which has heavier tails. The second model is a two-component mixture (contaminated normal model) in which it is assumed that most of the observations come from a basic distribution, whereas the remaining few outliers arise from an alternative distribution with a larger variance. The results indicate that the estimates of the extra-Poisson variation parameters were considerably smaller under the mixture models, leading to higher precision. Also, both mixture models identified the same set of outliers. In terms of goodness-of-fit, both mixture models outperformed the MVPLN. The outlier-rejecting MVPLN model provided a superior fit in terms of a much smaller DIC and smaller standard deviations for the parameter estimates; however, this approach tends to underestimate uncertainty by producing too-small standard deviations for the parameter estimates, which may lead to incorrect conclusions. It is recommended that the proposed outlier-resistance modeling techniques be used unless the exclusion of the outlying observations can be justified for data-related reasons.
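
    The down-weighting mechanism behind the Student-t scale mixture can be illustrated on the simplest possible case, a robust location estimate: interpreting the t distribution as a scale mixture of normals gives each observation the weight (nu + 1) / (nu + z²), so outliers are discounted rather than discarded. This toy one-dimensional sketch is not the paper's MVPLN model.

```python
def robust_mean(xs, nu=4.0, iters=50):
    """Iteratively reweighted mean under a Student-t error model with
    nu degrees of freedom. Observations far from the current estimate
    (large standardized residual z) get weight (nu + 1) / (nu + z**2),
    which shrinks toward zero instead of being hard-rejected."""
    mu = sum(xs) / len(xs)
    scale = (sum((x - mu) ** 2 for x in xs) / len(xs)) ** 0.5 or 1.0
    for _ in range(iters):
        w = [(nu + 1.0) / (nu + ((x - mu) / scale) ** 2) for x in xs]
        mu = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
    return mu
```

    On data clustered near 1 with a single outlier at 10, the plain mean is pulled to 2.5 while the t-weighted estimate stays much closer to the bulk of the data.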

  10. A Functional Analytic Approach to Computer-Interactive Mathematics

    ERIC Educational Resources Information Center

    Ninness, Chris; Rumph, Robin; McCuller, Glen; Harrison, Carol; Ford, Angela M.; Ninness, Sharon K.

    2005-01-01

    Following a pretest, 11 participants who were naive with regard to various algebraic and trigonometric transformations received an introductory lecture regarding the fundamentals of the rectangular coordinate system. Following the lecture, they took part in a computer-interactive matching-to-sample procedure in which they received training on…

  11. A Computational Framework Discovers New Copy Number Variants with Functional Importance

    PubMed Central

    Banerjee, Samprit; Oldridge, Derek; Poptsova, Maria; Hussain, Wasay M.; Chakravarty, Dimple; Demichelis, Francesca

    2011-01-01

    Structural variants that cause changes in copy number constitute an important component of genomic variability. They account for 0.7% of the genomic differences between two individual genomes, of which copy number variants (CNVs) are the largest component. A recent population-based CNV study revealed the need for better characterization of CNVs, especially the small ones (<500 bp). We propose a three-step computational framework (Identification of germline Changes in Copy Number, or IgC2N) to discover and genotype germline CNVs. First, we detect candidate CNV loci by combining information across multiple samples without imposing restrictions on the number of coverage markers or the variant size. Second, we fine-tune the detection of rare variants and infer the putative copy number classes for each locus. Last, for each variant we combine the relative distance between consecutive copy number classes with genetic information in a novel attempt to estimate the reference model bias. This computational approach was applied to genome-wide data from 1250 HapMap individuals. Novel variants were discovered and characterized in terms of size, minor allele frequency, type of polymorphism (gains, losses, or both), and mechanism of formation. Using data generated for a subset of individuals by a 42-million-marker platform, we validated the majority of the variants; the highest validation rate (66.7%) was for variants larger than 1 kb. Finally, we queried transcriptomic data from 129 individuals determined by RNA sequencing as further validation and to assess the functional role of the new variants. We investigated possible enrichment for variants' regulatory effects and found that smaller variants (<1 kb) are more likely to regulate gene transcripts than larger variants (p-value = 2.04e-08). Our results support the validity of the computational framework to detect novel variants relevant to disease susceptibility studies and provide evidence of the importance of

  12. Proton-Λ correlation functions at energies available at the CERN Large Hadron Collider taking into account residual correlations

    NASA Astrophysics Data System (ADS)

    Shapoval, V. M.; Sinyukov, Yu. M.; Naboka, V. Yu.

    2015-10-01

    The theoretical analysis of the p̄-Λ ⊕ p-Λ̄ correlation function in the 10% most central Au+Au collisions at the Relativistic Heavy Ion Collider (RHIC) energy √(s_NN) = 200 GeV shows that the contribution of residual correlations is a necessary factor for obtaining a satisfactory description of the experimental data. Neglecting the residual correlation effect leads to an unrealistically low source radius, about 2 times smaller than the corresponding value for the p-Λ ⊕ p̄-Λ̄ case, when one fits the experimental correlation function within the Lednický-Lyuboshitz analytical model. Recently an approach that accounts effectively for residual correlations in the baryon-antibaryon correlation function was proposed, and a good description of the RHIC data was reached with the source radius extracted from the hydrokinetic model (HKM). The p̄-Λ scattering length, as well as the parameters characterizing the residual correlation effect (the annihilation dip amplitude and its inverse width), were extracted from the corresponding fit. In this paper we use these extracted values, together with source functions simulated in HKM for Pb+Pb collisions at the LHC energy √(s_NN) = 2.76 TeV, to predict the corresponding p-Λ and p-Λ̄ correlation functions.

  13. EDF: Computing electron number probability distribution functions in real space from molecular wave functions

    NASA Astrophysics Data System (ADS)

    Francisco, E.; Pendás, A. Martín; Blanco, M. A.

    2008-04-01

    Given an N-electron molecule and an exhaustive partition of real space (R³) into m arbitrary regions Ω1, Ω2, …, Ωm (⋃_{i=1..m} Ωi = R³), the edf program computes all the probabilities P(n1, n2, …, nm) of having exactly n1 electrons in Ω1, n2 electrons in Ω2, …, and nm electrons (n1 + n2 + ⋯ + nm = N) in Ωm. Each Ωi may correspond to a single basin (atomic domain) or to several such basins (functional group). In the latter case, each atomic domain must belong to a single Ωi. The program can manage both single- and multi-determinant wave functions, which are read in from an aimpac-like wave function description (.wfn) file (T.A. Keith et al., The AIMPAC95 programs, http://www.chemistry.mcmaster.ca/aimpac, 1995). For multi-determinant wave functions a generalization of the original .wfn file has been introduced. The new format is completely backwards compatible, adding to the previous structure a description of the configuration interaction (CI) coefficients and the determinants of correlated wave functions. Besides the .wfn file, edf only needs the overlap integrals over all the atomic domains between the molecular orbitals (MOs). After the P(n1, n2, …, nm) probabilities are computed, edf obtains from them several magnitudes relevant to chemical bonding theory, such as average electronic populations and localization/delocalization indices. Regarding spin, edf may be used in two ways: with or without a splitting of the P(n1, n2, …, nm) probabilities into α and β spin components. Program summary Program title: edf Catalogue identifier: AEAJ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAJ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 5387 No. of bytes in distributed program, including test data, etc.: 52 381 Distribution format: tar.gz Programming language: Fortran 77 Computer
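
    In the fully uncorrelated limit, where each electron independently falls in region Ωi with probability p_i, the electron distribution function reduces to a multinomial. The sketch below shows only that limiting case; the real edf computation uses domain overlap integrals of the molecular orbitals and handles correlated, multi-determinant wave functions.

```python
from itertools import product
from math import factorial

def edf_probabilities(p_regions, n_electrons):
    """P(n1, ..., nm) for N independent electrons, where p_regions are
    per-region one-electron probabilities summing to 1. Each admissible
    occupation tuple gets the multinomial probability
    N! / (n1! ... nm!) * p1**n1 * ... * pm**nm."""
    m = len(p_regions)
    probs = {}
    for ns in product(range(n_electrons + 1), repeat=m):
        if sum(ns) != n_electrons:
            continue
        coef = factorial(n_electrons)
        val = 1.0
        for n_i, p_i in zip(ns, p_regions):
            coef //= factorial(n_i)
            val *= p_i ** n_i
        probs[ns] = coef * val
    return probs
```

    For two equal regions and N = 2, this gives P(1,1) = 0.5 and P(2,0) = P(0,2) = 0.25; deviations of real molecules from such multinomial values are what the localization/delocalization indices quantify.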

  14. Spaceborne computer executive routine functional design specification. Volume 2: Computer executive design for space station/base

    NASA Technical Reports Server (NTRS)

    Kennedy, J. R.; Fitzpatrick, W. S.

    1971-01-01

    The computer executive functional system design concepts derived from study of the Space Station/Base are presented. Information Management System hardware configuration as directly influencing the executive design is reviewed. The hardware configuration and generic executive design requirements are considered in detail in a previous report (System Configuration and Executive Requirements Specifications for Reusable Shuttle and Space Station/Base, 9/25/70). This report defines basic system primitives and delineates processes and process control. Supervisor states are considered for describing basic multiprogramming and multiprocessing systems. A high-level computer executive including control of scheduling, allocation of resources, system interactions, and real-time supervisory functions is defined. The description is oriented to provide a baseline for a functional simulation of the computer executive system.

  15. Functional requirements for design of the Space Ultrareliable Modular Computer (SUMC) system simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.; Hornfeck, W. A.

    1972-01-01

    The functional requirements for the design of an interpretive simulator for the space ultrareliable modular computer (SUMC) are presented. A review of applicable existing computer simulations is included along with constraints on the SUMC simulator functional design. Input requirements, output requirements, and language requirements for the simulator are discussed in terms of a SUMC configuration which may vary according to the application.

  16. A high-radix CORDIC architecture dedicated to compute the Gaussian potential function in neural networks

    NASA Astrophysics Data System (ADS)

    Meyer-Baese, Uwe H.; Meyer-Baese, Anke; Ramirez, Javier; Garcia, Antonio

    2003-08-01

    In this paper, a new parallel hardware architecture dedicated to computing the Gaussian potential function is proposed. This function is commonly utilized in neural radial basis classifiers for pattern recognition, as described by Lee; Girosi and Poggio; and Musavi et al. Attention is confined to a simplified Gaussian potential function that processes uncorrelated features. The operations of most interest in the Gaussian potential function are the exponential and the square function. Our hardware computes the exponential function and its exponent at the same time, and the contributions of all features to the exponent are computed in parallel. This parallelism reduces the computational delay in the output function, and the delay does not depend on the number of features processed. Software and hardware case studies are presented to evaluate the new CORDIC architecture.
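
    For reference, the simplified (uncorrelated-features) Gaussian potential function being computed is the following; the per-feature exponent terms are mutually independent, which is exactly what the hardware evaluates in parallel before a single exponential.

```python
import math

def gaussian_potential(x, center, sigma):
    """Gaussian potential of one RBF unit with uncorrelated features:
    exp(-sum_i (x_i - c_i)^2 / (2 * sigma_i^2)). Each term of the sum
    depends on one feature only, so all terms can be computed at once."""
    exponent = sum((xi - ci) ** 2 / (2.0 * si ** 2)
                   for xi, ci, si in zip(x, center, sigma))
    return math.exp(-exponent)
```

    The output is 1 at the center and decays with squared distance, feature by feature.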

  17. Using computational models to relate structural and functional brain connectivity

    PubMed Central

    Hlinka, Jaroslav; Coombes, Stephen

    2012-01-01

    Modern imaging methods allow a non-invasive assessment of both structural and functional brain connectivity. This has led to the identification of disease-related alterations affecting functional connectivity. The mechanism by which such alterations in functional connectivity arise in a structured network of interacting neural populations is as yet poorly understood. Here we use a modeling approach to explore the way in which this can arise and to highlight the important role that local population dynamics can have in shaping emergent spatial functional connectivity patterns. The local dynamics for a neural population is taken to be of the Wilson-Cowan type, whilst the structural connectivity patterns used, describing long-range anatomical connections, cover both realistic scenarios (from the CoCoMac database) and idealized ones that allow for more detailed theoretical study. We have calculated graph-theoretic measures of functional network topology from numerical simulations of model networks. The effect of the form of local dynamics on the observed network state is quantified by examining the correlation between structural and functional connectivity. We document a profound and systematic dependence of the simulated functional connectivity patterns on the parameters controlling the dynamics. Importantly, we show that a weakly coupled oscillator theory explaining these correlations and their variation across parameter space can be developed. This theoretical development provides a novel way to characterize the mechanisms for the breakdown of functional connectivity in diseases through changes in local dynamics. PMID:22805059
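
    A minimal version of the pipeline described above, two Wilson-Cowan excitatory-inhibitory nodes joined by long-range excitatory coupling, with functional connectivity read off as the correlation of the excitatory time courses, can be sketched as follows. All parameter values here are illustrative placeholders, not those of the paper.

```python
import math

def simulate(coupling, steps=5000, dt=0.01):
    """Euler integration of two coupled Wilson-Cowan-type nodes.
    Each node has an excitatory (e) and an inhibitory (i) population;
    long-range structural connectivity enters via coupling * e[other].
    Returns the two excitatory time courses."""
    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))
    e = [0.1, 0.2]
    i = [0.1, 0.1]
    trace_a, trace_b = [], []
    for _ in range(steps):
        new_e, new_i = [], []
        for k in range(2):
            input_e = 12.0 * e[k] - 10.0 * i[k] + coupling * e[1 - k] + 1.0
            input_i = 10.0 * e[k] - 2.0 * i[k]
            new_e.append(e[k] + dt * (-e[k] + sigmoid(input_e)))
            new_i.append(i[k] + dt * (-i[k] + sigmoid(input_i)))
        e, i = new_e, new_i
        trace_a.append(e[0])
        trace_b.append(e[1])
    return trace_a, trace_b

def pearson(a, b):
    """Pearson correlation, used here as the functional-connectivity
    measure between two simulated time courses."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5
```

    Sweeping the local-dynamics parameters while holding the structural coupling fixed is then a direct (if tiny) analogue of the paper's point that functional connectivity depends strongly on local dynamics, not on anatomy alone.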

  18. Introduction to Classical Density Functional Theory by a Computational Experiment

    ERIC Educational Resources Information Center

    Jeanmairet, Guillaume; Levy, Nicolas; Levesque, Maximilien; Borgis, Daniel

    2014-01-01

    We propose an in silico experiment to introduce the classical density functional theory (cDFT). Density functional theories, whether quantum or classical, rely on abstract concepts that are nonintuitive; however, they are at the heart of powerful tools and active fields of research in both physics and chemistry. They led to the 1998 Nobel Prize in…

  19. The computational foundations of time dependent density functional theory

    NASA Astrophysics Data System (ADS)

    Whitfield, James

    2014-03-01

    The mathematical foundations of TDDFT are established through the formal existence of a fictitious non-interacting system (known as the Kohn-Sham system), which can reproduce the one-electron reduced probability density of the actual system. We build upon these works and show that on the interior of the domain of existence, the Kohn-Sham system can be efficiently obtained given the time-dependent density. Since a quantum computer can efficiently produce such time-dependent densities, we present a polynomial time quantum algorithm to generate the time-dependent Kohn-Sham potential with controllable error bounds. Further, we find that systems do not immediately become non-representable but rather become ill-representable as one approaches this boundary. A representability parameter is defined in our work which quantifies the distance to the boundary of representability and the computational difficulty of finding the Kohn-Sham system.

  20. Computational approaches to identify functional genetic variants in cancer genomes

    PubMed Central

    Gonzalez-Perez, Abel; Mustonen, Ville; Reva, Boris; Ritchie, Graham R.S.; Creixell, Pau; Karchin, Rachel; Vazquez, Miguel; Fink, J. Lynn; Kassahn, Karin S.; Pearson, John V.; Bader, Gary; Boutros, Paul C.; Muthuswamy, Lakshmi; Ouellette, B.F. Francis; Reimand, Jüri; Linding, Rune; Shibata, Tatsuhiro; Valencia, Alfonso; Butler, Adam; Dronov, Serge; Flicek, Paul; Shannon, Nick B.; Carter, Hannah; Ding, Li; Sander, Chris; Stuart, Josh M.; Stein, Lincoln D.; Lopez-Bigas, Nuria

    2014-01-01

    The International Cancer Genome Consortium (ICGC) aims to catalog genomic abnormalities in tumors from 50 different cancer types. Genome sequencing reveals hundreds to thousands of somatic mutations in each tumor, but only a minority drive tumor progression. We present the result of discussions within the ICGC on how to address the challenge of identifying mutations that contribute to oncogenesis, tumor maintenance or response to therapy, and recommend computational techniques to annotate somatic variants and predict their impact on cancer phenotype. PMID:23900255

  1. A brain-computer interface to support functional recovery.

    PubMed

    Kjaer, Troels W; Sørensen, Helge B

    2013-01-01

    Brain-computer interfaces (BCIs) register changes in brain activity and utilize this to control computers. The most widely used method is based on registration of electrical signals from the cerebral cortex using extracranially placed electrodes, also called electroencephalography (EEG). The features extracted from the EEG may, besides controlling the computer, also be fed back to the patient, for instance as visual input. This facilitates a learning process. BCIs allow us to utilize brain activity in the rehabilitation of patients after stroke. The activity of the cerebral cortex varies with the type of movement we imagine, and by letting the patient know the type of brain activity best associated with the intended movement, the rehabilitation process may be faster and more efficient. The focus of BCI utilization in medicine has changed in recent years. While we previously focused on devices facilitating communication in the rather few patients with locked-in syndrome, much interest is now devoted to the therapeutic use of BCI in rehabilitation. For this latter group of patients, the device is not intended to be a lifelong assistive companion but rather a 'teacher' during the rehabilitation period. PMID:23859968

  2. Computerizing the Accounting Curriculum.

    ERIC Educational Resources Information Center

    Nash, John F.; England, Thomas G.

    1986-01-01

    Discusses the use of computers in college accounting courses. Argues that the success of new efforts in using computers in teaching accounting is dependent upon increasing instructors' computer skills, and choosing appropriate hardware and software, including commercially available business software packages. (TW)

  3. Computer Corner: Spreadsheets, Power Series, Generating Functions, and Integers.

    ERIC Educational Resources Information Center

    Snow, Donald R.

    1989-01-01

    Implements a table algorithm on a spreadsheet program and obtains functions for several number sequences such as the Fibonacci and Catalan numbers. Considers other applications of the table algorithm to integers represented in various number bases. (YP)
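
A minimal stand-in for the spreadsheet table algorithm, generating the two sequences named in the abstract by their standard recurrences:

```python
def fibonacci(n):
    """First n Fibonacci numbers: 1, 1, 2, 3, 5, ..."""
    seq = []
    a, b = 1, 1
    for _ in range(n):
        seq.append(a)
        a, b = b, a + b
    return seq

def catalan(n):
    """First n Catalan numbers via C_0 = 1 and
    C_{k+1} = C_k * 2(2k+1) / (k+2)."""
    seq = [1]
    for k in range(n - 1):
        seq.append(seq[-1] * 2 * (2 * k + 1) // (k + 2))
    return seq

print(fibonacci(6))  # → [1, 1, 2, 3, 5, 8]
print(catalan(6))    # → [1, 1, 2, 5, 14, 42]
```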

  4. Adaptive, associative, and self-organizing functions in neural computing.

    PubMed

    Kohonen, T

    1987-12-01

    This paper contains an attempt to describe certain adaptive and cooperative functions encountered in neural networks. The approach is a compromise between biological accuracy and mathematical clarity. Two types of differential equation seem to describe the basic effects underlying the formation of these functions: the equation for the electrical activity of the neuron and the adaptation equation that describes changes in its input connectivities. Various phenomena and operations are derivable from them: clustering of activity in a laterally interconnected network; adaptive formation of feature detectors; the autoassociative memory function; and self-organized formation of ordered sensory maps. The discussion considers which functions are readily amenable to analytical modeling and which phenomena seem to ensue from the more complex interactions that take place in the brain. PMID:20523469

  5. Multiple multiresolution representation of functions and calculus for fast computation

    SciTech Connect

    Fann, George I; Harrison, Robert J; Hill, Judith C; Jia, Jun; Galindo, Diego A

    2010-01-01

    We describe the mathematical representations, data structures, and the implementation of the numerical calculus of functions in MADNESS, a multiresolution adaptive numerical environment for scientific simulation. In MADNESS, each smooth function is represented using an adaptive pseudo-spectral expansion in the multiwavelet basis to an arbitrary but finite precision. This is an extension of the capabilities of most existing net-, mesh-, and spectral-based methods, where the discretization is based on a single adaptive mesh or expansion.

  6. Evaluation of computing systems using functionals of a Stochastic process

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.; Wu, L. T.

    1980-01-01

    An intermediate model was used to represent the probabilistic nature of a total system at a level which is higher than the base model and thus closer to the performance variable. A class of intermediate models, generally referred to as functionals of a Markov process, was considered. A closed-form solution of performability was developed for the case where performance is identified with the minimum value of a functional.

  7. Computational strategies for the design of new enzymatic functions.

    PubMed

    Świderek, K; Tuñón, I; Moliner, V; Bertran, J

    2015-09-15

    In this contribution, recent developments in the design of biocatalysts are reviewed, with particular emphasis on the de novo strategy. Studies based on three different reactions, Kemp elimination, Diels-Alder, and Retro-Aldolase, are used to illustrate the different degrees of success achieved during the last years. Finally, a section is devoted to the particular case of designed metalloenzymes. As a general conclusion, the interplay between new and more sophisticated engineering protocols and computational methods, based on molecular dynamics simulations with Quantum Mechanics/Molecular Mechanics potentials and fully flexible models, seems to constitute the bedrock for present and future successful design strategies. PMID:25797438

  8. Automated attendance accounting system

    NASA Technical Reports Server (NTRS)

    Chapman, C. P. (Inventor)

    1973-01-01

    An automated accounting system useful for applying data to a computer from any or all of a multiplicity of data terminals is disclosed. The system essentially includes a preselected number of data terminals which are each adapted to convert data words of decimal form to another form, i.e., binary, usable with the computer. Each data terminal may take the form of a keyboard unit having a number of depressable buttons or switches corresponding to selected data digits and/or function digits. A bank of data buffers, one of which is associated with each data terminal, is provided as a temporary storage. Data from the terminals is applied to the data buffers on a digit by digit basis for transfer via a multiplexer to the computer.
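
The digit-by-digit decimal-to-binary conversion the patent describes can be illustrated with a toy sketch; the 4-bit (binary-coded decimal) encoding is an assumption for illustration, not a detail taken from the abstract.

```python
def encode_word(decimal_word):
    """Encode a decimal data word digit by digit as 4-bit groups
    (binary-coded decimal), mirroring how each data terminal converts
    keypad input to a computer-usable form and hands it to the data
    buffers one digit at a time."""
    return [format(int(d), "04b") for d in decimal_word]

print(encode_word("409"))  # → ['0100', '0000', '1001']
```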

  9. Applications of a new wall function to turbulent flow computations

    NASA Technical Reports Server (NTRS)

    Chen, Y. S.

    1986-01-01

    A new wall function approach is developed based on a wall law suitable for incompressible turbulent boundary layers under strong adverse pressure gradients. This wall law was derived from a one-dimensional analysis of the turbulent kinetic energy equation, with a gradient diffusion concept employed in modeling the near-wall shear stress gradient. Numerical test cases for the present wall functions include turbulent separating flows around an airfoil and turbulent recirculating flows in several confined regions. Improvements in the predictions using the present wall functions are illustrated. For cases of internal recirculating flows, a modification factor for improving the performance of the k-epsilon turbulence model in flow recirculation regions is also included.
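
For context only, the classical equilibrium log-law wall function that pressure-gradient-sensitized wall laws such as this one generalize can be written as u+ = (1/κ) ln(E y+); the paper's own wall law is not reproduced here, and the constants below are the commonly quoted textbook values.

```python
import math

def u_plus_log_law(y_plus, kappa=0.41, E=9.0):
    """Standard equilibrium log-law wall function,
    u+ = (1/kappa) * ln(E * y+), valid in the logarithmic layer of an
    attached, zero-pressure-gradient turbulent boundary layer.  Shown
    only as the classical baseline the paper's wall law improves upon."""
    return math.log(E * y_plus) / kappa

print(round(u_plus_log_law(100.0), 2))  # → 16.59
```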

  10. Bread dough rheology: Computing with a damage function model

    NASA Astrophysics Data System (ADS)

    Tanner, Roger I.; Qi, Fuzhong; Dai, Shaocong

    2015-01-01

    We describe an improved damage function model for bread dough rheology. The model has relatively few parameters, all of which can easily be found from simple experiments. Small deformations in the linear region are described by a gel-like power-law memory function. A set of large non-reversing deformations - stress relaxation after a step of shear, steady shearing and elongation beginning from rest, and biaxial stretching - is used to test the model. With the introduction of a revised strain measure which includes a Mooney-Rivlin term, all of these motions can be well described by the damage function described in previous papers. For reversing step strains, larger-amplitude oscillatory shearing, and recoil, reasonable predictions have been found. The numerical methods used are discussed and some examples are given.

  11. A computationally efficient double hybrid density functional based on the random phase approximation.

    PubMed

    Grimme, Stefan; Steinmetz, Marc

    2016-08-01

    We present a revised form of a double hybrid density functional (DHDF) dubbed PWRB95. It contains semi-local Perdew-Wang exchange and Becke95 correlation with a fixed amount of 50% non-local Fock exchange. New features are the use of the robust random phase approximation (RPA) to calculate the non-local correlation part, instead of the second-order perturbative treatment of standard DHDFs, and the non-self-consistent evaluation of the Fock exchange with KS orbitals at the GGA level, which leads to a significant reduction of the computational effort. To account for London dispersion effects we include the non-local VV10 dispersion functional. Only three empirical scaling parameters were adjusted. The PWRB95 results for extensive standard thermochemical benchmarks (GMTKN30 data base) are compared to those of well-known functionals from the classes of (meta-)GGAs, (meta-)hybrid functionals, and DHDFs, as well as to standard (direct) RPA. The new method is furthermore tested on prototype bond activations with (Ni/Pd)-based transition metal catalysts, and two difficult cases for DHDFs, namely the isomerization reaction of the [Cu2(en)2O2](2+) complex and the singlet-triplet energy difference in highly unsaturated cyclacenes. The results show that PWRB95 is almost as accurate as standard DHDFs for main-group thermochemistry but has a similar or better performance for non-covalent interactions, more difficult transition-metal-containing molecules, and other electronically problematic cases. Because of its relatively weak basis set dependence, PWRB95 can be applied even in combination with AO basis sets of only triple-zeta quality, which yields huge overall computational savings, by a factor of about 40 compared to standard DHDF/'quadruple-zeta' calculations. Structure optimizations of small molecules with PWRB95 indicate an accurate description of bond distances superior to that provided by TPSS-D3, PBE0-D3, or other RPA-type methods. PMID:26695184

  12. The Contingency of Cocaine Administration Accounts for Structural and Functional Medial Prefrontal Deficits and Increased Adrenocortical Activation

    PubMed Central

    Anderson, Rachel M.; Cosme, Caitlin V.; Glanz, Ryan M.; Miller, Mary C.; Romig-Martin, Sara A.; LaLumiere, Ryan T.

    2015-01-01

    The prelimbic region (PL) of the medial prefrontal cortex (mPFC) is implicated in the relapse of drug-seeking behavior. Optimal mPFC functioning relies on synaptic connections involving dendritic spines in pyramidal neurons, whereas prefrontal dysfunction resulting from elevated glucocorticoids, stress, aging, and mental illness are each linked to decreased apical dendritic branching and spine density in pyramidal neurons in these cortical fields. The fact that cocaine use induces activation of the stress-responsive hypothalamo-pituitary-adrenal axis raises the possibility that cocaine-related impairments in mPFC functioning may be manifested by similar changes in neuronal architecture in mPFC. Nevertheless, previous studies have generally identified increases, rather than decreases, in structural plasticity in mPFC after cocaine self-administration. Here, we use 3D imaging and analysis of dendritic spine morphometry to show that chronic cocaine self-administration leads to mild decreases of apical dendritic branching, prominent dendritic spine attrition in PL pyramidal neurons, and working memory deficits. Importantly, these impairments were largely accounted for in groups of rats that self-administered cocaine compared with yoked-cocaine- and saline-matched counterparts. Follow-up experiments failed to demonstrate any effects of either experimenter-administered cocaine or food self-administration on structural alterations in PL neurons. Finally, we verified that the cocaine self-administration group was distinguished by more protracted increases in adrenocortical activity compared with yoked-cocaine- and saline-matched controls. These studies suggest a mechanism whereby increased adrenocortical activity resulting from chronic cocaine self-administration may contribute to regressive prefrontal structural and functional plasticity. SIGNIFICANCE STATEMENT Stress, aging, and mental illness are each linked to decreased prefrontal plasticity. 
Here, we show that chronic

  13. Changes in collagen metabolism account for ventricular functional recovery following beta-blocker therapy in patients with chronic heart failure.

    PubMed

    Fukui, Miho; Goda, Akiko; Komamura, Kazuo; Nakabo, Ayumi; Masaki, Mitsuru; Yoshida, Chikako; Hirotani, Shinichi; Lee-Kawabata, Masaaki; Tsujino, Takeshi; Mano, Toshiaki; Masuyama, Tohru

    2016-02-01

    While beta blockade improves left ventricular (LV) function in patients with chronic heart failure (CHF), the mechanisms are not well known. This study aimed to examine whether changes in myocardial collagen metabolism account for LV functional recovery following beta-blocker therapy in 62 CHF patients with reduced ejection fraction (EF). LV function was echocardiographically measured at baseline and 1, 6, and 12 months after bisoprolol therapy along with serum markers of collagen metabolism including C-terminal telopeptide of collagen type I (CITP) and matrix metalloproteinase (MMP)-2. Deceleration time of mitral early velocity (DcT) increased even in the early phase, but LVEF gradually improved throughout the study period. Heart rate (HR) was reduced from the early stage, and CITP gradually decreased. LVEF and DcT increased more so in patients with the larger decreases in CITP (r = -0.33, p < 0.05; r = -0.28, p < 0.05, respectively), and HR (r = -0.31, p < 0.05; r = -0.38, p < 0.05, respectively). In addition, there were greater decreases in CITP, MMP-2 and HR from baseline to 1, 6, or 12 months in patients with above-average improvement in LVEF than in those with below-average improvement in LVEF. Similar results were obtained in terms of DcT. There was no significant correlation between the changes in HR and CITP. In conclusion, improvement in LV systolic/diastolic function was greatest in patients with the larger inhibition of collagen degradation. Changes in myocardial collagen metabolism are closely related to LV functional recovery somewhat independently from HR reduction. PMID:25351137

  14. Functional genomics reveals that a compact terpene synthase gene family can account for terpene volatile production in apple.

    PubMed

    Nieuwenhuizen, Niels J; Green, Sol A; Chen, Xiuyin; Bailleul, Estelle J D; Matich, Adam J; Wang, Mindy Y; Atkinson, Ross G

    2013-02-01

    Terpenes are specialized plant metabolites that act as attractants to pollinators and as defensive compounds against pathogens and herbivores, but they also play an important role in determining the quality of horticultural food products. We show that the genome of cultivated apple (Malus domestica) contains 55 putative terpene synthase (TPS) genes, of which only 10 are predicted to be functional. This low number of predicted functional TPS genes compared with other plant species was supported by the identification of only eight potentially functional TPS enzymes in apple 'Royal Gala' expressed sequence tag databases, including the previously characterized apple (E,E)-α-farnesene synthase. In planta functional characterization of these TPS enzymes showed that they could account for the majority of terpene volatiles produced in cv Royal Gala, including the sesquiterpenes germacrene-D and (E)-β-caryophyllene, the monoterpenes linalool and α-pinene, and the homoterpene (E)-4,8-dimethyl-1,3,7-nonatriene. Relative expression analysis of the TPS genes indicated that floral and vegetative tissues were the primary sites of terpene production in cv Royal Gala. However, production of cv Royal Gala floral-specific terpenes and TPS genes was observed in the fruit of some heritage apple cultivars. Our results suggest that the apple TPS gene family has been shaped by a combination of ancestral and more recent genome-wide duplication events. The relatively small number of functional enzymes suggests that the remaining terpenes produced in floral and vegetative and fruit tissues are maintained under a positive selective pressure, while the small number of terpenes found in the fruit of modern cultivars may be related to commercial breeding strategies. PMID:23256150

  15. Functional Genomics Reveals That a Compact Terpene Synthase Gene Family Can Account for Terpene Volatile Production in Apple

    PubMed Central

    Nieuwenhuizen, Niels J.; Green, Sol A.; Chen, Xiuyin; Bailleul, Estelle J.D.; Matich, Adam J.; Wang, Mindy Y.; Atkinson, Ross G.

    2013-01-01

    Terpenes are specialized plant metabolites that act as attractants to pollinators and as defensive compounds against pathogens and herbivores, but they also play an important role in determining the quality of horticultural food products. We show that the genome of cultivated apple (Malus domestica) contains 55 putative terpene synthase (TPS) genes, of which only 10 are predicted to be functional. This low number of predicted functional TPS genes compared with other plant species was supported by the identification of only eight potentially functional TPS enzymes in apple ‘Royal Gala’ expressed sequence tag databases, including the previously characterized apple (E,E)-α-farnesene synthase. In planta functional characterization of these TPS enzymes showed that they could account for the majority of terpene volatiles produced in cv Royal Gala, including the sesquiterpenes germacrene-D and (E)-β-caryophyllene, the monoterpenes linalool and α-pinene, and the homoterpene (E)-4,8-dimethyl-1,3,7-nonatriene. Relative expression analysis of the TPS genes indicated that floral and vegetative tissues were the primary sites of terpene production in cv Royal Gala. However, production of cv Royal Gala floral-specific terpenes and TPS genes was observed in the fruit of some heritage apple cultivars. Our results suggest that the apple TPS gene family has been shaped by a combination of ancestral and more recent genome-wide duplication events. The relatively small number of functional enzymes suggests that the remaining terpenes produced in floral and vegetative and fruit tissues are maintained under a positive selective pressure, while the small number of terpenes found in the fruit of modern cultivars may be related to commercial breeding strategies. PMID:23256150

  16. 17 CFR 1.32 - Reporting of segregated account computation and details regarding the holding of futures customer...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... and Exchange Commission (17 CFR 241.15c3-1(c)(2)(vi)), held for the same futures customer's account... accordance with Rule 240.15c3-1(c)(2)(vi) of the Securities and Exchange Commission (17 CFR 240.15c3-1(c)(2... Exchange Commission (17 CFR 240.15c3-1(c)(11)(i)). (c) Each futures commission merchant is required...

  17. Efficient and Flexible Computation of Many-Electron Wave Function Overlaps

    PubMed Central

    2016-01-01

    A new algorithm for the computation of the overlap between many-electron wave functions is described. This algorithm allows for the extensive use of recurring intermediates and thus provides high computational efficiency. Because of the general formalism employed, overlaps can be computed for varying wave function types, molecular orbitals, basis sets, and molecular geometries. This paves the way for efficiently computing nonadiabatic interaction terms for dynamics simulations. In addition, other application areas can be envisaged, such as the comparison of wave functions constructed at different levels of theory. Aside from explaining the algorithm and evaluating the performance, a detailed analysis of the numerical stability of wave function overlaps is carried out, and strategies for overcoming potential severe pitfalls due to displaced atoms and truncated wave functions are presented. PMID:26854874
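
In the simplest special case, the overlap of two single Slater determinants built from orbitals in a common orthonormal basis reduces to a determinant of the occupied-MO overlap matrix. The sketch below shows only that special case; the paper's algorithm generalizes far beyond it (multi-determinant wave functions, differing basis sets and geometries).

```python
import numpy as np

def determinant_overlap(mo_a, mo_b):
    """Overlap <Psi_A|Psi_B> of two single Slater determinants whose
    occupied MO coefficients (columns of mo_a, mo_b) are expressed in a
    common orthonormal basis: det(C_A^T C_B)."""
    s = mo_a.T @ mo_b  # occupied-MO overlap matrix
    return np.linalg.det(s)

# Identical determinants built from orthonormal orbitals overlap to 1
c = np.linalg.qr(np.random.default_rng(1).random((6, 3)))[0]
print(round(float(determinant_overlap(c, c)), 6))  # → 1.0
```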

  18. Computing Legacy Software Behavior to Understand Functionality and Security Properties: An IBM/370 Demonstration

    SciTech Connect

    Linger, Richard C; Pleszkoch, Mark G; Prowell, Stacy J; Sayre, Kirk D; Ankrum, Scott

    2013-01-01

    Organizations maintaining mainframe legacy software can benefit from code modernization and incorporation of security capabilities to address the current threat environment. Oak Ridge National Laboratory is developing the Hyperion system to compute the behavior of software as a means to gain understanding of software functionality and security properties. Computation of functionality is critical to revealing security attributes, which are in fact specialized functional behaviors of software. Oak Ridge is collaborating with MITRE Corporation to conduct a demonstration project to compute behavior of legacy IBM Assembly Language code for a federal agency. The ultimate goal is to understand functionality and security vulnerabilities as a basis for code modernization. This paper reports on the first phase, to define functional semantics for IBM Assembly instructions and conduct behavior computation experiments.

  19. Memory intensive functional architecture for distributed computer control systems

    SciTech Connect

    Dimmler, D.G.

    1983-10-01

    A memory-intensive functional architecture for distributed data-acquisition, monitoring, and control systems with large numbers of nodes has been conceptually developed and applied in several large-scale and some smaller systems. This discussion concentrates on: (1) the basic architecture; (2) recent expansions of the architecture which now become feasible in view of the rapidly developing component technologies in microprocessors and functional large-scale integration circuits; and (3) implementation of some key hardware and software structures and one system implementation: a system for performing control and data acquisition for a neutron spectrometer at the Brookhaven High Flux Beam Reactor. The spectrometer is equipped with a large-area position-sensitive neutron detector.

  20. Frequency domain transfer function identification using the computer program SYSFIT

    SciTech Connect

    Trudnowski, D.J.

    1992-12-01

    Because the primary application of SYSFIT for BPA involves studying power system dynamics, this investigation was geared toward simulating the effects that might be encountered in studying electromechanical oscillations in power systems. Although the intended focus of this work is power system oscillations, the studies are sufficiently generic that the results can be applied to many types of oscillatory systems with closely-spaced modes. In general, there are two possible ways of solving the optimization problem. One is to use a least-squares optimization function and to write the system in such a form that the problem becomes one of linear least-squares. The solution can then be obtained using a standard least-squares technique. The other method involves using a search method to obtain the optimal model. This method allows considerably more freedom in forming the optimization function and model, but it requires an initial guess of the system parameters. SYSFIT employs this second approach. Detailed investigations were conducted into three main areas: (1) fitting to exact frequency response data of a linear system; (2) fitting to the discrete Fourier transformation of noisy data; and (3) fitting to multi-path systems. The first area consisted of investigating the effects of alternative optimization cost function options; using different optimization search methods; incorrect model order; missing response data; closely-spaced poles; and closely-spaced pole-zero pairs. Within the second area, different noise colorations and levels were studied. In the third area, methods were investigated for improving fitting results by incorporating more than one system path. The study concludes with a list of guidelines and properties for fitting a transfer function to the frequency response of a system using optimization search methods.
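
The linear least-squares route that the abstract contrasts with SYSFIT's search approach can be sketched for a first-order model H(s) = b/(s + a). This Levy-style linearization is a generic illustration under stated assumptions, not SYSFIT's actual method, and all names are hypothetical.

```python
import numpy as np

def fit_first_order(omega, h):
    """Linear least-squares fit of H(jw) = b / (jw + a).
    Rearranging H * (jw + a) = b gives the linear system
    H*a - b = -H*jw in the unknowns (a, b); stacking real and
    imaginary parts turns it into an ordinary real least-squares
    problem, so no initial guess is needed."""
    jw = 1j * omega
    A = np.column_stack([h, -np.ones_like(h)])
    rhs = -h * jw
    A_ri = np.vstack([A.real, A.imag])
    rhs_ri = np.concatenate([rhs.real, rhs.imag])
    a, b = np.linalg.lstsq(A_ri, rhs_ri, rcond=None)[0]
    return a, b

omega = np.linspace(0.1, 10.0, 50)
h = 3.0 / (1j * omega + 2.0)  # exact response of a = 2, b = 3
print([round(float(v), 3) for v in fit_first_order(omega, h)])  # → [2.0, 3.0]
```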

  1. Utility functions and resource management in an oversubscribed heterogeneous computing environment

    DOE PAGESBeta

    Khemka, Bhavesh; Friese, Ryan; Briceno, Luis Diego; Siegel, Howard Jay; Maciejewski, Anthony A.; Koenig, Gregory A.; Groer, Christopher S.; Hilton, Marcia M.; Poole, Stephen W.; Okonski, G.; et al

    2014-09-26

    We model an oversubscribed heterogeneous computing system where tasks arrive dynamically and a scheduler maps the tasks to machines for execution. The environment and workloads are based on those being investigated by the Extreme Scale Systems Center at Oak Ridge National Laboratory. Utility functions that are designed based on specifications from the system owner and users are used to create a metric for the performance of resource allocation heuristics. Each task has a time-varying utility (importance) that the enterprise will earn based on when the task successfully completes execution. We design multiple heuristics, which include a technique to drop low utility-earning tasks, to maximize the total utility that can be earned by completing tasks. The heuristics are evaluated using simulation experiments with two levels of oversubscription. The results show the benefit of having fast heuristics that account for the importance of a task and the heterogeneity of the environment when making allocation decisions in an oversubscribed environment. Furthermore, the ability to drop low utility-earning tasks allows the heuristics to tolerate the high oversubscription as well as earn significant utility.
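
The core idea, dropping low utility-earning tasks when the system is oversubscribed, can be illustrated with a toy greedy heuristic. This hypothetical simplification ignores the paper's time-varying utilities and machine heterogeneity.

```python
def schedule(tasks, capacity):
    """Toy greedy scheduler for an oversubscribed system: rank tasks
    by utility, run the most valuable ones up to 'capacity', and shed
    the rest.  Each task is a (name, utility) pair."""
    ranked = sorted(tasks, key=lambda t: t[1], reverse=True)
    accepted = ranked[:capacity]  # run the highest-utility tasks
    dropped = ranked[capacity:]   # drop low utility-earning tasks
    return accepted, sum(u for _, u in accepted), dropped

tasks = [("a", 10), ("b", 1), ("c", 7), ("d", 3)]
accepted, total, dropped = schedule(tasks, 2)
print(total)  # → 17
```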

  2. Computed versus measured ion velocity distribution functions in a Hall effect thruster

    SciTech Connect

    Garrigues, L.; Mazouffre, S.; Bourgeois, G.

    2012-06-01

    We compare time-averaged and time-varying measured and computed ion velocity distribution functions in a Hall effect thruster for typical operating conditions. The ion properties are measured by means of laser induced fluorescence spectroscopy. Simulations of the plasma properties are performed with a two-dimensional hybrid model. In the electron fluid description of the hybrid model, the anomalous transport responsible for the electron diffusion across the magnetic field barrier is deduced from the experimental profile of the time-averaged electric field. The use of a steady-state anomalous mobility profile allows the hybrid model to capture some properties, like the time-averaged ion mean velocity. Yet, the model fails to reproduce the time evolution of the ion velocity. This fact reveals a complex underlying physics that necessitates accounting for the electron dynamics over a short time-scale. This study also shows the necessity for electron temperature measurements. Moreover, the strength of the self-magnetic field due to the rotating Hall current is found to be negligible.

  3. Utility functions and resource management in an oversubscribed heterogeneous computing environment

    SciTech Connect

    Khemka, Bhavesh; Friese, Ryan; Briceno, Luis Diego; Siegel, Howard Jay; Maciejewski, Anthony A.; Koenig, Gregory A.; Groer, Christopher S.; Hilton, Marcia M.; Poole, Stephen W.; Okonski, G.; Rambharos, R.

    2014-09-26

    We model an oversubscribed heterogeneous computing system where tasks arrive dynamically and a scheduler maps the tasks to machines for execution. The environment and workloads are based on those being investigated by the Extreme Scale Systems Center at Oak Ridge National Laboratory. Utility functions that are designed based on specifications from the system owner and users are used to create a metric for the performance of resource allocation heuristics. Each task has a time-varying utility (importance) that the enterprise will earn based on when the task successfully completes execution. We design multiple heuristics, which include a technique to drop low utility-earning tasks, to maximize the total utility that can be earned by completing tasks. The heuristics are evaluated using simulation experiments with two levels of oversubscription. The results show the benefit of having fast heuristics that account for the importance of a task and the heterogeneity of the environment when making allocation decisions in an oversubscribed environment. Furthermore, the ability to drop low utility-earning tasks allows the heuristics to tolerate the high oversubscription as well as earn significant utility.

  4. An account of Sandia's research booth at Supercomputing '92: A collaborative effort in high-performance computing and networking

    SciTech Connect

    Breckenridge, A.; Vahle, M.O.

    1993-03-01

    Supercomputing '92, a high-performance computing and communications conference, was held November 16-20, 1992, in Minneapolis, Minnesota. This paper documents the applications and technologies that were showcased in Sandia's research booth at that conference. In particular, the demonstrations in high-performance networking, audio-visual applications in engineering, virtual reality, and supercomputing applications are all described.

  5. Computational properties of three-term recurrence relations for Kummer functions

    NASA Astrophysics Data System (ADS)

    Deaño, Alfredo; Segura, Javier; Temme, Nico M.

    2010-01-01

    Several three-term recurrence relations for confluent hypergeometric functions are analyzed from a numerical point of view. Minimal and dominant solutions for complex values of the variable z are given, derived from asymptotic estimates of the Whittaker functions with large parameters. The Laguerre polynomials and the regular Coulomb wave functions are studied as particular cases, with numerical examples of their computation.
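    The minimal/dominant distinction matters in practice because forward recurrence destroys a minimal solution through rounding error. A standard illustration, assuming the familiar Bessel-function case (itself expressible through Kummer functions) rather than the paper's own examples, is Miller's backward-recurrence algorithm:

```python
def bessel_j_miller(n, x, start=30):
    """Miller's backward-recurrence algorithm for a minimal solution.
    Forward recurrence loses the minimal solution to rounding; recurring
    backward from an arbitrary seed well above order n and normalizing with
    the identity J_0(x) + 2*sum_k J_{2k}(x) = 1 recovers it stably.
    Bessel J_n stands in here for the confluent hypergeometric cases."""
    jp, j = 0.0, 1e-30          # arbitrary unnormalized seed at order `start`
    norm, result = 0.0, 0.0
    for k in range(start, 0, -1):
        jm = (2.0 * k / x) * j - jp   # J_{k-1} = (2k/x) J_k - J_{k+1}
        jp, j = j, jm                 # j now holds unnormalized J_{k-1}
        if (k - 1) % 2 == 0:
            norm += (2.0 if k - 1 > 0 else 1.0) * j
        if k - 1 == n:
            result = j
    return result / norm
```

    The ratio `result / norm` cancels the unknown scale of the seed, which is why the starting value can be arbitrary.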

  6. Computational Modeling of Airway and Pulmonary Vascular Structure and Function: Development of a “Lung Physiome”

    PubMed Central

    Tawhai, M. H.; Clark, A. R.; Donovan, G. M.; Burrowes, K. S.

    2011-01-01

    Computational models of lung structure and function necessarily span multiple spatial and temporal scales, i.e., dynamic molecular interactions give rise to whole organ function, and the link between these scales cannot be fully understood if only molecular or organ-level function is considered. Here, we review progress in constructing multiscale finite element models of lung structure and function that are aimed at providing a computational framework for bridging the spatial scales from molecular to whole organ. These include structural models of the intact lung, embedded models of the pulmonary airways that couple to model lung tissue, and models of the pulmonary vasculature that account for distinct structural differences at the extra- and intra-acinar levels. Biophysically based functional models for tissue deformation, pulmonary blood flow, and airway bronchoconstriction are also described. The development of these advanced multiscale models has led to a better understanding of complex physiological mechanisms that govern regional lung perfusion and emergent heterogeneity during bronchoconstriction. PMID:22011236

  7. A Functional Analytic Approach To Computer-Interactive Mathematics

    PubMed Central

    2005-01-01

    Following a pretest, 11 participants who were naive with regard to various algebraic and trigonometric transformations received an introductory lecture regarding the fundamentals of the rectangular coordinate system. Following the lecture, they took part in a computer-interactive matching-to-sample procedure in which they received training on particular formula-to-formula and formula-to-graph relations as these formulas pertain to reflections and vertical and horizontal shifts. In training A-B, standard formulas served as samples and factored formulas served as comparisons. In training B-C, factored formulas served as samples and graphs served as comparisons. Subsequently, the program assessed for mutually entailed B-A and C-B relations as well as combinatorially entailed C-A and A-C relations. After all participants demonstrated mutual entailment and combinatorial entailment, we employed a test of novel relations to assess 40 different and complex variations of the original training formulas and their respective graphs. Six of 10 participants who completed training demonstrated perfect or near-perfect performance in identifying novel formula-to-graph relations. Three of the 4 participants who made more than three incorrect responses during the assessment of novel relations showed some commonality among their error patterns. Derived transfer of stimulus control using mathematical relations is discussed. PMID:15898471

  8. Toward high-resolution computational design of helical membrane protein structure and function

    PubMed Central

    Barth, Patrick; Senes, Alessandro

    2016-01-01

    The computational design of α-helical membrane proteins is still in its infancy but has made important progress. De novo design has produced stable, specific and active minimalistic oligomeric systems. Computational re-engineering can improve stability and modulate the function of natural membrane proteins. Currently, the major hurdle for the field is not computational, but the experimental characterization of the designs. The emergence of new structural methods for membrane proteins will accelerate progress. PMID:27273630

  9. Computer programs for calculation of thermodynamic functions of mixing in crystalline solutions

    NASA Technical Reports Server (NTRS)

    Comella, P. A.; Saxena, S. K.

    1972-01-01

    The computer programs Beta, GEGIM, REGSOL1, REGSOL2, Matrix, and Quasi are presented. The programs are useful in various calculations for the thermodynamic functions of mixing and the activity-composition relations in rock forming minerals.

  10. Functions and Requirements and Specifications for Replacement of the Computer Automated Surveillance System (CASS)

    SciTech Connect

    SCAIEF, C.C.

    1999-12-16

    This functions, requirements and specifications document defines the baseline requirements and criteria for the design, purchase, fabrication, construction, installation, and operation of the system to replace the Computer Automated Surveillance System (CASS) alarm monitoring.

  11. Computation of Schenberg response function by using finite element modelling

    NASA Astrophysics Data System (ADS)

    Frajuca, C.; Bortoli, F. S.; Magalhaes, N. S.

    2016-05-01

    Schenberg is a resonant-mass gravitational wave detector with a central operating frequency of 3200 Hz. Transducers located on the surface of the resonant sphere, arranged in a half-dodecahedron distribution, are used to monitor its strain amplitude. The development of mechanical impedance matchers that increase the coupling of the transducers to the sphere is a major challenge because of the high frequency and small size involved. The objective of this work is to study the Schenberg response function obtained by finite element modeling (FEM). Finally, the result is compared with that of a simplified mass-spring model to verify whether the simplified model is suitable for determining the detector's sensitivity; both models are found to give the same results.

  12. Computational complexity of time-dependent density functional theory

    NASA Astrophysics Data System (ADS)

    Whitfield, J. D.; Yung, M.-H.; Tempel, D. G.; Boixo, S.; Aspuru-Guzik, A.

    2014-08-01

    Time-dependent density functional theory (TDDFT) is rapidly emerging as a premier method for solving dynamical many-body problems in physics and chemistry. The mathematical foundations of TDDFT are established through the formal existence of a fictitious non-interacting system (known as the Kohn-Sham system), which can reproduce the one-electron reduced probability density of the actual system. We build upon these works and show that on the interior of the domain of existence, the Kohn-Sham system can be efficiently obtained given the time-dependent density. We introduce a V-representability parameter which diverges at the boundary of the existence domain and serves to quantify the numerical difficulty of constructing the Kohn-Sham potential. For bounded values of V-representability, we present a polynomial time quantum algorithm to generate the time-dependent Kohn-Sham potential with controllable error bounds.

  13. Fair and Square Computation of Inverse "Z"-Transforms of Rational Functions

    ERIC Educational Resources Information Center

    Moreira, M. V.; Basilio, J. C.

    2012-01-01

    All methods presented in textbooks for computing inverse "Z"-transforms of rational functions have limitations: 1) the direct division method does not, in general, provide enough information to derive an analytical expression for the time-domain sequence "x"("k") whose "Z"-transform is "X"("z"); 2) computation using the inversion integral…
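    The direct-division limitation the abstract mentions is easy to see in code: long division of X(z) yields sample values one at a time, never a closed-form x(k). A minimal sketch, with a hypothetical function name and coefficients given in ascending powers of z^{-1}:

```python
def inverse_z_series(b, a, n_terms):
    """Power-series (direct-division) inversion of X(z) = B(z)/A(z), where
    b and a hold coefficients in ascending powers of z^{-1} and a[0] != 0.
    Returns x(0), ..., x(n_terms-1): a numeric sequence, not a closed form."""
    x = []
    for k in range(n_terms):
        acc = b[k] if k < len(b) else 0.0
        # a[0]*x(k) = b(k) - sum_{j>=1} a[j]*x(k-j), from X(z)*A(z) = B(z)
        for j in range(1, min(k, len(a) - 1) + 1):
            acc -= a[j] * x[k - j]
        x.append(acc / a[0])
    return x
```

    For X(z) = 1/(1 - 0.5 z^{-1}) this returns 1, 0.5, 0.25, …, but recognizing the pattern as x(k) = 0.5^k is left to the reader, which is precisely the method's limitation.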

  14. Effects of Computer versus Paper Administration of an Adult Functional Writing Assessment

    ERIC Educational Resources Information Center

    Chen, Jing; White, Sheida; McCloskey, Michael; Soroui, Jaleh; Chun, Young

    2011-01-01

    This study investigated the comparability of paper and computer versions of a functional writing assessment administered to adults 16 and older. Three writing tasks were administered in both paper and computer modes to volunteers in the field test of an assessment of adult literacy in 2008. One set of analyses examined mode effects on scoring by…

  15. Performance of a computer-based assessment of cognitive function measures in two cohorts of seniors

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials, however its performance in these settings has not been systematically evaluated. The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool f...

  16. A Systematic Approach for Understanding Slater-Gaussian Functions in Computational Chemistry

    ERIC Educational Resources Information Center

    Stewart, Brianna; Hylton, Derrick J.; Ravi, Natarajan

    2013-01-01

    A systematic way to understand the intricacies of quantum mechanical computations done by a software package known as "Gaussian" is undertaken via an undergraduate research project. These computations involve the evaluation of key parameters in a fitting procedure to express a Slater-type orbital (STO) function in terms of the linear…

  17. A Functional Specification for a Programming Language for Computer Aided Learning Applications.

    ERIC Educational Resources Information Center

    National Research Council of Canada, Ottawa (Ontario).

    In 1972 there were at least six different course authoring languages in use in Canada with little exchange of course materials between Computer Assisted Learning (CAL) centers. In order to improve facilities for producing "transportable" computer based course materials, a working panel undertook the definition of functional requirements of a user…

  18. A Computational Account of Borderline Personality Disorder: Impaired Predictive Learning about Self and Others Through Bodily Simulation

    PubMed Central

    Fineberg, Sarah K.; Steinfeld, Matthew; Brewer, Judson A.; Corlett, Philip R.

    2014-01-01

    Social dysfunction is a prominent and disabling aspect of borderline personality disorder. We reconsider traditional explanations for this problem, especially early disruption in the way an infant feels physical care from its mother, in terms of recent developments in computational psychiatry. In particular, social learning may depend on reinforcement learning through embodied simulations. Such modeling involves calculations based on structures outside the brain such as the face and hands, calculations on one’s own body that are used to make inferences about others. We discuss ways to test the role of embodied simulation in BPD and potential implications for treatment. PMID:25221523

  19. Theoretical and computational studies in protein folding, design, and function

    NASA Astrophysics Data System (ADS)

    Morrissey, Michael Patrick

    2000-10-01

    In this work, simplified statistical models are used to understand an array of processes related to protein folding and design. In Part I, lattice models are utilized to test several theories about the statistical properties of protein-like systems. In Part II, sequence analysis and all-atom simulations are used to advance a novel theory for the behavior of a particular protein. Part I is divided into five chapters. In Chapter 2, a method of sequence design for model proteins, based on statistical mechanical first-principles, is developed. The cumulant design method uses a mean-field approximation to expand the free energy of a sequence in temperature. The method successfully designs sequences which fold to a target lattice structure at a specific temperature, a feat which was not possible using previous design methods. The next three chapters are computational studies of the double mutant cycle, which has been used experimentally to predict intra-protein interactions. Complete structure prediction is demonstrated for a model system using exhaustive, and also sub-exhaustive, double mutants. Nonadditivity of enthalpy, rather than of free energy, is proposed and demonstrated to be a superior marker for inter-residue contact. Next, a new double mutant protocol, called exchange mutation, is introduced. Although simple statistical arguments predict exchange mutation to be a more accurate contact predictor than standard mutant cycles, this hypothesis was not upheld in lattice simulations. Reasons for this inconsistency will be discussed. Finally, a multi-chain folding algorithm is introduced. Known as LINKS, this algorithm was developed to test a method of structure prediction which utilizes chain-break mutants. While structure prediction was not successful, LINKS should nevertheless be a useful tool for the study of protein-protein and protein-ligand interactions. 
The last chapter of Part I utilizes the lattice to explore the differences between standard folding, from

  20. A Computer Program for the Computation of Running Gear Temperatures Using Green's Function

    NASA Technical Reports Server (NTRS)

    Koshigoe, S.; Murdock, J. W.; Akin, L. S.; Townsend, D. P.

    1996-01-01

    A new technique has been developed to study two-dimensional heat transfer problems in gears. This technique consists of transforming the heat equation into a line integral equation with the use of Green's theorem. The equation is then expressed in terms of eigenfunctions that satisfy the Helmholtz equation, and their corresponding eigenvalues, for an arbitrarily shaped region of interest. The eigenfunctions are obtained by solving an integral equation. Once the eigenfunctions are found, the temperature is expanded in terms of the eigenfunctions with unknown time-dependent coefficients that can be solved by using Runge-Kutta methods. The time integration is extremely efficient. Therefore, any changes in the time-dependent coefficients or source terms in the boundary conditions do not impose a great computational burden on the user. The method is demonstrated by applying it to a sample gear tooth. Temperature histories at representative surface locations are given.
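    The time-integration step can be sketched for the simplest analogous case. This is a one-dimensional stand-in, assuming sine eigenfunctions on a rod rather than the paper's two-dimensional gear-tooth eigenfunctions; the function name and signature are hypothetical.

```python
import math

def heat_coefficients_rk4(alpha, L, n_modes, source, t_end, dt):
    """Modal (eigenfunction-expansion) solution of the 1-D heat equation on
    a rod of length L: with eigenfunctions sin(n*pi*x/L) and eigenvalues
    lam_n = (n*pi/L)**2, each expansion coefficient obeys the decoupled ODE
    dc_n/dt = -alpha*lam_n*c_n + s_n(t), integrated with classical RK4."""
    lam = [(n * math.pi / L) ** 2 for n in range(1, n_modes + 1)]
    c = [0.0] * n_modes
    steps = round(t_end / dt)
    for i in range(steps):
        t = i * dt
        for n in range(n_modes):
            def f(tt, cc, n=n):
                return -alpha * lam[n] * cc + source(n, tt)
            k1 = f(t, c[n])
            k2 = f(t + dt / 2, c[n] + dt / 2 * k1)
            k3 = f(t + dt / 2, c[n] + dt / 2 * k2)
            k4 = f(t + dt, c[n] + dt * k3)
            c[n] += dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return c
```

    Because the modal ODEs are decoupled and cheap, changing a source term between steps costs almost nothing, which is the efficiency the abstract highlights.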

  1. A mesh-decoupled height function method for computing interface curvature

    NASA Astrophysics Data System (ADS)

    Owkes, Mark; Desjardins, Olivier

    2015-01-01

    In this paper, a mesh-decoupled height function method is proposed and tested. The method is based on computing height functions within columns that are not aligned with the underlying mesh and have variable dimensions. Because they are decoupled from the computational mesh, the columns can be aligned with the interface normal vector, which is found to improve the curvature calculation for under-resolved interfaces where the standard height function method often fails. A computational geometry toolbox is used to compute the heights in the complex geometry that is formed at the intersection of the computational mesh and the columns. The toolbox reduces the complexity of the problem to a series of straightforward geometric operations using simplices. The proposed scheme is shown to compute more accurate curvatures than the standard height function method on coarse meshes. A combined method that uses the standard height function where it is well defined and the proposed scheme in under-resolved regions is tested. This approach achieves accurate and robust curvatures for under-resolved interface features and second-order converging curvatures for well-resolved interfaces.
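    For contrast with the mesh-decoupled method, the standard mesh-aligned height-function curvature calculation can be sketched in a few lines; the column layout and function name here are illustrative assumptions.

```python
def height_function_curvature(vof, dx):
    """Standard height-function curvature estimate on a uniform 2-D grid:
    sum volume fractions within three mesh-aligned vertical columns to get
    interface heights, then apply centered finite differences:
    kappa = h'' / (1 + h'^2)**1.5. The paper's method instead decouples
    the columns from the mesh and aligns them with the interface normal."""
    # vof[i] is the list of volume fractions in column i (3 columns expected)
    heights = [sum(col) * dx for col in vof]
    hx = (heights[2] - heights[0]) / (2 * dx)
    hxx = (heights[2] - 2 * heights[1] + heights[0]) / dx ** 2
    return hxx / (1 + hx * hx) ** 1.5
```

    When the interface is under-resolved or steep relative to the mesh, these fixed vertical columns may not bracket the interface at all, which is the failure mode the mesh-decoupled columns are designed to avoid.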

  2. Do Tasks Make a Difference? Accounting for Heterogeneity of Performance of Children with Reading Difficulties on Tasks of Executive Function: Findings from a Meta-Analysis

    ERIC Educational Resources Information Center

    Booth, Josephine N.; Boyle, James M. E.; Kelly, Steve W.

    2010-01-01

    Research studies have implicated executive functions in reading difficulties (RD). But while some studies have found children with RD to be impaired on tasks of executive function other studies report unimpaired performance. A meta-analysis was carried out to determine whether these discrepant findings can be accounted for by differences in the…

  3. Fluorescence microscopy point spread function model accounting for aberrations due to refractive index variability within a specimen.

    PubMed

    Ghosh, Sreya; Preza, Chrysanthe

    2015-07-01

    A three-dimensional (3-D) point spread function (PSF) model for wide-field fluorescence microscopy, suitable for imaging samples with variable refractive index (RI) in multilayered media, is presented. This PSF model is a key component for accurate 3-D image restoration of thick biological samples, such as lung tissue. Microscope- and specimen-derived parameters are combined with a rigorous vectorial formulation to obtain a new PSF model that accounts for additional aberrations due to specimen RI variability. Experimental evaluation and verification of the PSF model was accomplished using images from 175-nm fluorescent beads in a controlled test sample. Fundamental experimental validation of the advantage of using improved PSFs in depth-variant restoration was accomplished by restoring experimental data from beads (6  μm in diameter) mounted in a sample with RI variation. In the investigated study, improvement in restoration accuracy in the range of 18 to 35% was observed when PSFs from the proposed model were used over restoration using PSFs from an existing model. The new PSF model was further validated by showing that its prediction compares to an experimental PSF (determined from 175-nm beads located below a thick rat lung slice) with a 42% improved accuracy over the current PSF model prediction. PMID:26154937

  4. The default-mode, ego-functions and free-energy: a neurobiological account of Freudian ideas

    PubMed Central

    Friston, K. J.

    2010-01-01

    This article explores the notion that Freudian constructs may have neurobiological substrates. Specifically, we propose that Freud’s descriptions of the primary and secondary processes are consistent with self-organized activity in hierarchical cortical systems and that his descriptions of the ego are consistent with the functions of the default-mode and its reciprocal exchanges with subordinate brain systems. This neurobiological account rests on a view of the brain as a hierarchical inference or Helmholtz machine. In this view, large-scale intrinsic networks occupy supraordinate levels of hierarchical brain systems that try to optimize their representation of the sensorium. This optimization has been formulated as minimizing a free-energy; a process that is formally similar to the treatment of energy in Freudian formulations. We substantiate this synthesis by showing that Freud’s descriptions of the primary process are consistent with the phenomenology and neurophysiology of rapid eye movement sleep, the early and acute psychotic state, the aura of temporal lobe epilepsy and hallucinogenic drug states. PMID:20194141

  5. Functional Competency Development Model for Academic Personnel Based on International Professional Qualification Standards in Computing Field

    ERIC Educational Resources Information Center

    Tumthong, Suwut; Piriyasurawong, Pullop; Jeerangsuwan, Namon

    2016-01-01

    This research proposes a functional competency development model for academic personnel based on international professional qualification standards in computing field and examines the appropriateness of the model. Specifically, the model consists of three key components which are: 1) functional competency development model, 2) blended training…

  6. Paraplegic standing controlled by functional neuromuscular stimulation: Part I--computer model and control-system design.

    PubMed

    Khang, G; Zajac, F E

    1989-09-01

    We have developed a planar computer model to investigate paraplegic standing induced by functional neuromuscular stimulation. The model consists of nonlinear musculotendon dynamics (pulse train activation dynamics and musculotendon actuator dynamics), nonlinear body-segmental dynamics, and a linear output-feedback control law. The model of activation dynamics is an analytic expression that characterizes the relation between the stimulus parameters (pulse width and interpulse interval) and the muscle activation. Hill's classic two-element muscle model was modified into a musculotendon actuator model in order to account for the effects of submaximal activation and tendon elasticity on development of force by the actuator. The three-segment, multijoint body model accounts for the anterior-posterior movements of the head and trunk, the thigh, and the shank. We modeled arm movement as an external disturbance and imposed the disturbance on the body-segmental dynamics by means of a quasistatic analysis. Linearization, and at times linear approximation of the computer model, enabled us to compute a constant, linear feedback-gain matrix, whose output is the net activation needed by a dynamical joint-torque actuator. Motivated by an assumption that minimization of energy expenditure lessens muscle fatigue, we developed an algorithm that then computes how to distribute the net activation among all the muscles crossing the joint. In part II, the combined feedback control strategy is applied to the nonlinear model of musculotendon and body-segmental dynamics to study how well the body can maintain balance when the feedback control strategy is employed. PMID:2789177

  7. Incompressible flow computations based on the vorticity-stream function and velocity-pressure formulations

    NASA Technical Reports Server (NTRS)

    Tezduyar, T. E.; Liou, J.; Ganjoo, D. K.

    1990-01-01

    Finite element procedures and computations based on the velocity-pressure and vorticity-stream function formulations of incompressible flows are presented. Two new multistep velocity-pressure formulations are proposed and compared with the vorticity-stream function and one-step formulations. The example problems chosen are the standing vortex problem and flow past a circular cylinder. Benchmark quality computations are performed for the cylinder problem. The numerical results indicate that the vorticity-stream function formulation and one of the two new multistep formulations involve much less numerical dissipation than the one-step formulation.

  8. Computation of turbulent boundary layers employing the defect wall-function method. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Brown, Douglas L.

    1994-01-01

    In order to decrease the overall computational time requirements of a spatially-marching parabolized Navier-Stokes finite-difference computer code when applied to turbulent fluid flow, a wall-function methodology, originally proposed by R. Barnwell, was implemented. This numerical effort increases computational speed and calculates reasonably accurate wall shear stress spatial distributions and boundary-layer profiles. Since the wall shear stress is analytically determined from the wall-function model, the computational grid near the wall is not required to spatially resolve the laminar-viscous sublayer. Consequently, a substantially increased computational integration step size is achieved, resulting in a considerable decrease in net computational time. This wall-function technique is demonstrated for adiabatic flat plate test cases from Mach 2 to Mach 8. These test cases are analytically verified employing: (1) Eckert reference method solutions, (2) experimental turbulent boundary-layer data of Mabey, and (3) finite-difference computational code solutions with fully resolved laminar-viscous sublayers. Additionally, results have been obtained for two pressure-gradient cases: (1) an adiabatic expansion corner and (2) an adiabatic compression corner.
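    The essence of any wall-function approach is that wall shear stress is computed from a model rather than from a grid-resolved sublayer. A minimal incompressible sketch using the classical log law (not Barnwell's specific formulation; names and constants kappa = 0.41, B = 5.0 are the usual textbook assumptions):

```python
import math

def wall_shear_stress(u, y, nu, rho, kappa=0.41, B=5.0):
    """Solve the log law u/u_tau = (1/kappa)*ln(y*u_tau/nu) + B for the
    friction velocity u_tau by fixed-point iteration, then return the wall
    shear stress tau_w = rho*u_tau**2. Illustrates the wall-function idea:
    u is sampled at the first grid point y, which may lie well outside the
    viscous sublayer, so the sublayer need not be resolved by the mesh."""
    u_tau = math.sqrt(nu * u / y)        # laminar estimate as initial guess
    for _ in range(50):                  # contraction map; converges fast
        u_tau = u / ((1.0 / kappa) * math.log(y * u_tau / nu) + B)
    return rho * u_tau ** 2
```

    Because the first grid point can sit in the log layer rather than at y+ of order one, the integration step size grows substantially, which is the source of the speedup described above.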

  9. Functional Specifications for Computer Aided Training Systems Development and Management (CATSDM) Support Functions. Final Report.

    ERIC Educational Resources Information Center

    Hughes, John; And Others

    This report provides a description of a Computer Aided Training System Development and Management (CATSDM) environment based on state-of-the-art hardware and software technology, and including recommendations for off the shelf systems to be utilized as a starting point in addressing the particular systematic training and instruction design and…

  10. The Lung Physiome: merging imaging-based measures with predictive computational models of structure and function

    PubMed Central

    Tawhai, Merryn H; Hoffman, Eric A; Lin, Ching-Long

    2009-01-01

    Global measurements of the lung provided by standard pulmonary function tests do not give insight into the regional basis of lung function and lung disease. Advances in imaging methodologies, computer technologies, and subject-specific simulations are creating new opportunities for studying structure-function relationships in the lung through multi-disciplinary research. The digital Human Lung Atlas is an imaging-based resource compiled from male and female subjects spanning several decades of age. The Atlas comprises both structural and functional measures, and includes computational models derived to match individual subjects for personalized prediction of function. The computational models in the Atlas form part of the Lung Physiome project, which is an international effort to develop integrative models of lung function at all levels of biological organization. The computational models provide mechanistic interpretation of imaging measures; the Atlas provides structural data upon which to base model geometry, and functional data against which to test hypotheses. The example of simulating air flow on a subject-specific basis is considered. Methods for deriving multi-scale models of the airway geometry for individual subjects in the Atlas are outlined, and methods for modeling turbulent flows in the airway are reviewed. PMID:20835982

  11. An analytic method to account for drag in the Vinti satellite theory. [computer program using quadrature algorithm

    NASA Technical Reports Server (NTRS)

    Watson, J. S.; Mistretta, G. D.; Bonavito, N. L.

    1975-01-01

    A quadrature algorithm is presented which employs analytical expressions for the variations of satellite orbital elements caused by air drag. The Hamiltonian is formally preserved and the Jacobi constants of the motion are advanced with time through the variational equations. The atmospheric density profile is written as a fitted exponential function of the eccentric anomaly, which adheres to tabulated data at all altitudes and simultaneously reduces the variational equations to definite integrals with closed-form evaluations, whose limits are in terms of the eccentric anomaly. Results are given for two satellites subject to intense air drag and indicate that the satellite ephemerides produced by this method in conjunction with the Vinti program are of very high accuracy.

  12. Computer/gaming station use in youth: Correlations among use, addiction and functional impairment

    PubMed Central

    Baer, Susan; Saran, Kelly; Green, David A

    2012-01-01

    OBJECTIVE: Computer/gaming station use is ubiquitous in the lives of youth today. Overuse is a concern, but it remains unclear whether problems arise from addictive patterns of use or simply excessive time spent on use. The goal of the present study was to evaluate computer/gaming station use in youth and to examine the relationship between amounts of use, addictive features of use and functional impairment. METHOD: A total of 110 subjects (11 to 17 years of age) from local schools participated. Time spent on television, video gaming and non-gaming recreational computer activities was measured. Addictive features of computer/gaming station use were ascertained, along with emotional/behavioural functioning. Multiple linear regressions were used to understand how youth functioning varied with time of use and addictive features of use. RESULTS: Mean (± SD) total screen time was 4.5±2.4 h/day. Addictive features of use were consistently correlated with functional impairment across multiple measures and informants, whereas time of use, after controlling for addiction, was not. CONCLUSIONS: Youth are spending many hours each day in front of screens. In the absence of addictive features of computer/gaming station use, time spent is not correlated with problems; however, youth with addictive features of use show evidence of poor emotional/ behavioural functioning. PMID:24082802

  13. Resources, attitudes and culture: an understanding of the factors that influence the functioning of accountability mechanisms in primary health care settings

    PubMed Central

    2013-01-01

    Background District level health system governance is recognised as an important but challenging element of health system development in low and middle-income countries. Accountability is a more recent focus in health system debates. Accountability mechanisms are governance tools that seek to regulate answerability between the health system and the community (external accountability) and/or between different levels of the health system (bureaucratic accountability). External accountability has attracted significant attention in recent years, but bureaucratic accountability mechanisms, and the interactions between the two forms of accountability, have been relatively neglected. This is an important gap given that webs of accountability relationships exist within every health system. There is a need to strike a balance between achieving accountability upwards within the health system (for example through information reporting arrangements) while at the same time allowing for the local level innovation that could improve quality of care and patient responsiveness. Methods Using a descriptive literature review, this paper examines the factors that influence the functioning of accountability mechanisms and relationships within the district health system, and draws out the implications for responsiveness to patients and communities. We also seek to understand the practices that might strengthen accountability in ways that improve responsiveness – of the health system to citizens’ needs and rights, and of providers to patients. Results The review highlights the ways in which bureaucratic accountability mechanisms often constrain the functioning of external accountability mechanisms. For example, meeting the expectations of relatively powerful managers further up the system may crowd out efforts to respond to citizens and patients. Organisational cultures characterized by supervision and management systems focused on compliance to centrally defined outputs and targets

  14. Projection of Young-Old and Old-Old with Functional Disability: Does Accounting for the Changing Educational Composition of the Elderly Population Make a Difference?

    PubMed Central

    Ansah, John P.; Malhotra, Rahul; Lew, Nicola; Chiu, Chi-Tsun; Chan, Angelique; Bayer, Steffen; Matchar, David B.

    2015-01-01

    This study compares projections, up to year 2040, of young-old (aged 60-79) and old-old (aged 80+) with functional disability in Singapore with and without accounting for the changing educational composition of the Singaporean elderly. Two multi-state population models, with and without accounting for educational composition respectively, were developed, parameterized with age-gender-(education)-specific transition probabilities (between active, functional disability and death states) estimated from two waves (2009 and 2011) of a nationally representative survey of community-dwelling Singaporeans aged ≥60 years (N=4,990). Probabilistic sensitivity analysis with the bootstrap method was used to obtain the 95% confidence interval of the transition probabilities. Not accounting for educational composition overestimated the young-old with functional disability by 65 percent and underestimated the old-old by 20 percent in 2040. Accounting for educational composition, the proportion of old-old with functional disability increased from 40.8 percent in 2000 to 64.4 percent by 2040; not accounting for educational composition, the proportion in 2040 was 49.4 percent. Since the health profiles, and hence care needs, of the old-old differ from those of the young-old, health care service utilization and expenditure and the demand for formal and informal caregiving will be affected, impacting health and long-term care policy. PMID:25974069
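The multi-state projection logic described above can be sketched as a simple Markov cohort model. The transition probabilities below are hypothetical placeholders for illustration only, not the age-gender-education-specific estimates from the Singapore survey.

```python
import numpy as np

# Hypothetical annual transition probabilities between "active",
# "functional disability" and "dead" states (each row sums to 1).
P = np.array([
    [0.90, 0.07, 0.03],   # from active
    [0.05, 0.85, 0.10],   # from disabled
    [0.00, 0.00, 1.00],   # dead is absorbing
])

def project(initial, years):
    """Project state counts forward with a time-homogeneous Markov chain."""
    state = np.asarray(initial, dtype=float)
    for _ in range(years):
        state = state @ P   # one annual transition step
    return state

# Start with 1000 active and 100 disabled persons, project 10 years ahead.
counts = project([1000.0, 100.0, 0.0], years=10)
```

The actual models additionally stratify the transition matrix by age, gender and (in the second model) education, and propagate bootstrap uncertainty in the estimated probabilities.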

  15. Analytic computation of energy derivatives - Relationships among partial derivatives of a variationally determined function

    NASA Technical Reports Server (NTRS)

    King, H. F.; Komornicki, A.

    1986-01-01

    Formulas are presented relating Taylor series expansion coefficients of three functions of several variables, the energy of the trial wave function (W), the energy computed using the optimized variational wave function (E), and the response function (lambda), under certain conditions. Partial derivatives of lambda are obtained through solution of a recursive system of linear equations, and solution through order n yields derivatives of E through order 2n + 1, extending Pulay's application of Wigner's 2n + 1 rule to partial derivatives in coupled perturbation theory. An examination of numerical accuracy shows that the usual two-term second derivative formula is less stable than an alternative four-term formula, and that previous claims that energy derivatives are stationary properties of the wave function are fallacious. The results have application to quantum theoretical methods for the computation of derivative properties such as infrared frequencies and intensities.

  16. Renormalization group improved computation of correlation functions in theories with nontrivial phase diagram

    NASA Astrophysics Data System (ADS)

    Codello, Alessandro; Tonero, Alberto

    2016-07-01

    We present a simple and consistent way to compute correlation functions in interacting theories with nontrivial phase diagram. As an example we show how to consistently compute the four-point function in three dimensional Z2 -scalar theories. The idea is to perform the path integral by weighting the momentum modes that contribute to it according to their renormalization group (RG) relevance, i.e. we weight each mode according to the value of the running couplings at that scale. In this way, we are able to encode in a loop computation the information regarding the RG trajectory along which we are integrating. We show that depending on the initial condition, or initial point in the phase diagram, we obtain different behaviors of the four-point function at the endpoint of the flow.

  17. Extended Krylov subspaces approximations of matrix functions. Application to computational electromagnetics

    SciTech Connect

    Druskin, V.; Lee, Ping; Knizhnerman, L.

    1996-12-31

    There is now a growing interest in the area of using Krylov subspace approximations to compute the actions of matrix functions. The main application of this approach is the solution of ODE systems, obtained after discretization of partial differential equations by the method of lines. When applying the matrix inverse is relatively inexpensive, it is sometimes attractive to solve the ODE using extended Krylov subspaces, generated by the actions of both positive and negative matrix powers. Examples of such problems can be found frequently in computational electromagnetics.
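A minimal sketch of the standard (positive-power) Krylov idea for the action of a matrix function, using the Arnoldi process; the extended variant discussed in the abstract additionally includes actions of the matrix inverse, which is omitted here. The function name is illustrative, not from the paper.

```python
import numpy as np
from scipy.linalg import expm

def krylov_expm_action(A, b, m=20):
    """Approximate exp(A) @ b from an m-dimensional Krylov subspace via the
    Arnoldi process: exp(A) b ~ ||b|| * V_m exp(H_m) e_1."""
    n = len(b)
    m = min(m, n)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(b)
    V[:, 0] = b / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):          # modified Gram-Schmidt orthogonalization
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:         # happy breakdown: subspace is invariant
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    e1 = np.zeros(m)
    e1[0] = 1.0
    # Project to the small subspace, exponentiate there, lift back.
    return beta * V[:, :m] @ (expm(H[:m, :m]) @ e1)
```

With m equal to the full dimension the approximation becomes exact (up to roundoff), which makes the sketch easy to check against a dense `expm`.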

  18. Performance of computational tools in evaluating the functional impact of laboratory-induced amino acid mutations.

    PubMed

    Gray, Vanessa E; Kukurba, Kimberly R; Kumar, Sudhir

    2012-08-15

    Site-directed mutagenesis is frequently used by scientists to investigate the functional impact of amino acid mutations in the laboratory. Over 10,000 such laboratory-induced mutations have been reported in the UniProt database along with the outcomes of functional assays. Here, we explore the performance of state-of-the-art computational tools (Condel, PolyPhen-2 and SIFT) in correctly annotating the function-altering potential of 10,913 laboratory-induced mutations from 2372 proteins. We find that computational tools are very successful in diagnosing laboratory-induced mutations that elicit significant functional change in the laboratory (up to 92% accuracy). But, these tools consistently fail in correctly annotating laboratory-induced mutations that show no functional impact in the laboratory assays. Therefore, the overall accuracy of computational tools for laboratory-induced mutations is much lower than that observed for the naturally occurring human variants. We tested and rejected the possibilities that the preponderance of changes to alanine and the presence of multiple base-pair mutations in the laboratory were the reasons for the observed discordance between the performance of computational tools for natural and laboratory mutations. Instead, we discover that the laboratory-induced mutations occur predominately at the highly conserved positions in proteins, where the computational tools have the lowest accuracy of correct prediction for variants that do not impact function (neutral). Therefore, the comparisons of experimental-profiling results with those from computational predictions need to be sensitive to the evolutionary conservation of the positions harboring the amino acid change. PMID:22685075

  19. Computation of determinant expansion coefficients within the graphically contracted function method.

    SciTech Connect

    Gidofalvi, G.; Shepard, R.; Chemical Sciences and Engineering Division

    2009-11-30

    Most electronic structure methods express the wavefunction as an expansion of N-electron basis functions that are chosen to be either Slater determinants or configuration state functions. Although the expansion coefficient of a single determinant may be readily computed from configuration state function coefficients for small wavefunction expansions, traditional algorithms are impractical for systems with a large number of electrons and spatial orbitals. In this work, we describe an efficient algorithm for the evaluation of a single determinant expansion coefficient for wavefunctions expanded as a linear combination of graphically contracted functions. Each graphically contracted function has significant multiconfigurational character and depends on a relatively small number of variational parameters called arc factors. Because the graphically contracted function approach expresses the configuration state function coefficients as products of arc factors, a determinant expansion coefficient may be computed recursively more efficiently than with traditional configuration interaction methods. Although the cost of computing determinant coefficients scales exponentially with the number of spatial orbitals for traditional methods, the algorithm presented here exploits two levels of recursion and scales polynomially with system size. Hence, as demonstrated through applications to systems with hundreds of electrons and orbitals, it may readily be applied to very large systems.

  20. Computation of determinant expansion coefficients within the graphically contracted function method.

    PubMed

    Gidofalvi, Gergely; Shepard, Ron

    2009-11-30

    Most electronic structure methods express the wavefunction as an expansion of N-electron basis functions that are chosen to be either Slater determinants or configuration state functions. Although the expansion coefficient of a single determinant may be readily computed from configuration state function coefficients for small wavefunction expansions, traditional algorithms are impractical for systems with a large number of electrons and spatial orbitals. In this work, we describe an efficient algorithm for the evaluation of a single determinant expansion coefficient for wavefunctions expanded as a linear combination of graphically contracted functions. Each graphically contracted function has significant multiconfigurational character and depends on a relatively small number of variational parameters called arc factors. Because the graphically contracted function approach expresses the configuration state function coefficients as products of arc factors, a determinant expansion coefficient may be computed recursively more efficiently than with traditional configuration interaction methods. Although the cost of computing determinant coefficients scales exponentially with the number of spatial orbitals for traditional methods, the algorithm presented here exploits two levels of recursion and scales polynomially with system size. Hence, as demonstrated through applications to systems with hundreds of electrons and orbitals, it may readily be applied to very large systems. PMID:19360796

  1. Use of global functions for improvement in efficiency of nonlinear analysis. [in computer structural displacement estimation

    NASA Technical Reports Server (NTRS)

    Almroth, B. O.; Stehlin, P.; Brogan, F. A.

    1981-01-01

    A method for improving the efficiency of nonlinear structural analysis by the use of global displacement functions is presented. The computer programs include options to define the global functions as input or let the program automatically select and update these functions. The program was applied to a number of structures: (1) 'pear-shaped cylinder' in compression, (2) bending of a long cylinder, (3) spherical shell subjected to point force, (4) panel with initial imperfections, (5) cylinder with cutouts. The sample cases indicate the usefulness of the procedure in the solution of nonlinear structural shell problems by the finite element method. It is concluded that the use of global functions for extrapolation will lead to savings in computer time.

  2. Analysis and selection of optimal function implementations in massively parallel computer

    DOEpatents

    Archer, Charles Jens; Peters, Amanda; Ratterman, Joseph D.

    2011-05-31

    An apparatus, program product and method optimize the operation of a parallel computer system by, in part, collecting performance data for a set of implementations of a function capable of being executed on the parallel computer system based upon the execution of the set of implementations under varying input parameters in a plurality of input dimensions. The collected performance data may be used to generate selection program code that is configured to call selected implementations of the function in response to a call to the function under varying input parameters. The collected performance data may be used to perform more detailed analysis to ascertain the comparative performance of the set of implementations of the function under the varying input parameters.
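A toy analogue of the selection idea, assuming a dictionary of interchangeable implementations: benchmark each candidate on representative inputs and dispatch to the fastest. All names here are hypothetical, and the patented method additionally varies inputs across multiple dimensions and generates selection code.

```python
import timeit

def loop_sum(xs):
    """Deliberately naive alternative implementation for comparison."""
    total = 0
    for x in xs:
        total += x
    return total

def select_fastest(implementations, sample_args, number=200, repeats=3):
    """Time each interchangeable implementation on representative inputs and
    return the name of the fastest one."""
    timings = {}
    for name, fn in implementations.items():
        timings[name] = min(
            timeit.timeit(lambda: fn(*sample_args), number=number)
            for _ in range(repeats)
        )
    return min(timings, key=timings.get)

impls = {"builtin": sum, "loop": loop_sum}
best = select_fastest(impls, (list(range(1000)),))
```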

  3. The Krigifier: A Procedure for Generating Pseudorandom Nonlinear Objective Functions for Computational Experimentation

    NASA Technical Reports Server (NTRS)

    Trosset, Michael W.

    1999-01-01

    Comprehensive computational experiments to assess the performance of algorithms for numerical optimization require (among other things) a practical procedure for generating pseudorandom nonlinear objective functions. We propose a procedure that is based on the convenient fiction that objective functions are realizations of stochastic processes. This report details the calculations necessary to implement our procedure for the case of certain stationary Gaussian processes and presents a specific implementation in the statistical programming language S-PLUS.
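The "convenient fiction" of objective functions as stochastic-process realizations can be sketched, under the assumption of a squared-exponential covariance, by drawing one sample path of a stationary Gaussian process. The report's implementation is in S-PLUS; this is a generic Python sketch with illustrative parameter names.

```python
import numpy as np

def gp_realization(xs, length_scale=0.5, variance=1.0, seed=0):
    """Draw one realization of a zero-mean stationary Gaussian process with a
    squared-exponential covariance at the 1-D points xs."""
    xs = np.asarray(xs, dtype=float)
    d = xs[:, None] - xs[None, :]
    K = variance * np.exp(-0.5 * (d / length_scale) ** 2)
    K += 1e-8 * np.eye(len(xs))            # jitter for numerical stability
    L = np.linalg.cholesky(K)              # K = L L^T
    rng = np.random.default_rng(seed)
    return L @ rng.standard_normal(len(xs))  # sample ~ N(0, K)

sample = gp_realization(np.linspace(0.0, 1.0, 50))
```

Fixing the seed makes the pseudorandom objective reproducible, which is exactly what controlled computational experiments on optimization algorithms require.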

  4. MRIVIEW: An interactive computational tool for investigation of brain structure and function

    SciTech Connect

    Ranken, D.; George, J.

    1993-12-31

    MRIVIEW is a software system which uses image processing and visualization to provide neuroscience researchers with an integrated environment for combining functional and anatomical information. Key features of the software include semi-automated segmentation of volumetric head data and an interactive coordinate reconciliation method which utilizes surface visualization. The current system is a precursor to a computational brain atlas. We describe features this atlas will incorporate, including methods under development for visualizing brain functional data obtained from several different research modalities.

  5. Response function theories that account for size distribution effects - A review. [mathematical models concerning composite propellant heterogeneity effects on combustion instability

    NASA Technical Reports Server (NTRS)

    Cohen, N. S.

    1980-01-01

    The paper presents theoretical models developed to account for the heterogeneity of composite propellants in expressing the pressure-coupled combustion response function. It is noted that the model of Lengelle and Williams (1968) furnishes a viable basis to explain the effects of heterogeneity.

  6. Maple (Computer Algebra System) in Teaching Pre-Calculus: Example of Absolute Value Function

    ERIC Educational Resources Information Center

    Tuluk, Güler

    2014-01-01

    Modules in Computer Algebra Systems (CAS) make Mathematics interesting and easy to understand. The present study focused on the implementation of the algebraic, tabular (numerical), and graphical approaches used for the construction of the concept of absolute value function in teaching mathematical content knowledge along with Maple 9. The study…

  7. Effects of a Computer-Based Intervention Program on the Communicative Functions of Children with Autism

    ERIC Educational Resources Information Center

    Hetzroni, Orit E.; Tannous, Juman

    2004-01-01

    This study investigated the use of computer-based intervention for enhancing communication functions of children with autism. The software program was developed based on daily life activities in the areas of play, food, and hygiene. The following variables were investigated: delayed echolalia, immediate echolalia, irrelevant speech, relevant…

  8. Computing the Partial Fraction Decomposition of Rational Functions with Irreducible Quadratic Factors in the Denominators

    ERIC Educational Resources Information Center

    Man, Yiu-Kwong

    2012-01-01

    In this note, a new method for computing the partial fraction decomposition of rational functions with irreducible quadratic factors in the denominators is presented. This method involves polynomial divisions and substitutions only, without having to solve for the complex roots of the irreducible quadratic polynomial or to solve a system of linear…
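The note's method uses polynomial divisions and substitutions only; the sketch below merely shows the target decomposition, with the quadratic factor kept irreducible, using SymPy's built-in `apart` rather than the method itself.

```python
import sympy as sp

x = sp.symbols('x')
# A rational function whose denominator has an irreducible quadratic factor.
expr = (3*x + 5) / ((x - 1) * (x**2 + x + 1))
# Partial fraction decomposition over the rationals: the quadratic factor
# x**2 + x + 1 stays irreducible (no complex roots are introduced).
decomp = sp.apart(expr, x)
```

Mathematically, the decomposition here is 8/(3(x-1)) - (8x+7)/(3(x**2+x+1)).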

  9. PuFT: Computer-Assisted Program for Pulmonary Function Tests.

    ERIC Educational Resources Information Center

    Boyle, Joseph

    1983-01-01

    PuFT computer program (Microsoft Basic) is designed to help in understanding/interpreting pulmonary function tests (PFT). The program provides predicted values for common PFT after entry of patient data, calculates/plots a graph simulating forced vital capacity (FVC), and allows observations of effects on predicted PFT values and FVC curve when…

  10. Identifying Differential Item Functioning in Multi-Stage Computer Adaptive Testing

    ERIC Educational Resources Information Center

    Gierl, Mark J.; Lai, Hollis; Li, Johnson

    2013-01-01

    The purpose of this study is to evaluate the performance of CATSIB (Computer Adaptive Testing-Simultaneous Item Bias Test) for detecting differential item functioning (DIF) when items in the matching and studied subtest are administered adaptively in the context of a realistic multi-stage adaptive test (MST). MST was simulated using a 4-item…

  11. A Computational Model Quantifies the Effect of Anatomical Variability on Velopharyngeal Function

    ERIC Educational Resources Information Center

    Inouye, Joshua M.; Perry, Jamie L.; Lin, Kant Y.; Blemker, Silvia S.

    2015-01-01

    Purpose: This study predicted the effects of velopharyngeal (VP) anatomical parameters on VP function to provide a greater understanding of speech mechanics and aid in the treatment of speech disorders. Method: We created a computational model of the VP mechanism using dimensions obtained from magnetic resonance imaging measurements of 10 healthy…

  12. Computer-program documentation of an interactive-accounting model to simulate streamflow, water quality, and water-supply operations in a river basin

    USGS Publications Warehouse

    Burns, A.W.

    1988-01-01

    This report describes an interactive-accounting model used to simulate streamflow, chemical-constituent concentrations and loads, and water-supply operations in a river basin. The model uses regression equations to compute flow from incremental (internode) drainage areas. Conservative chemical constituents (typically dissolved solids) also are computed from regression equations. Both flow and water quality loads are accumulated downstream. Optionally, the model simulates the water use and the simplified groundwater systems of a basin. Water users include agricultural, municipal, industrial, and in-stream users, and reservoir operators. Water users list their potential water sources, including direct diversions, groundwater pumpage, interbasin imports, or reservoir releases, in the order in which they will be used. Direct diversions conform to basinwide water law priorities. The model is interactive, and although the input data exist in files, the user can modify them interactively. A major feature of the model is its color-graphic-output options. This report includes a description of the model, organizational charts of subroutines, and examples of the graphics. Detailed format instructions for the input data, example files of input data, definitions of program variables, and a listing of the FORTRAN source code are attachments to the report. (USGS)

  13. A fast computation method for MUSIC spectrum function based on circular arrays

    NASA Astrophysics Data System (ADS)

    Du, Zhengdong; Wei, Ping

    2015-02-01

    The large computational cost of the multiple signal classification (MUSIC) spectrum function seriously affects the timeliness of direction-finding systems using the MUSIC algorithm, especially in two-dimensional direction-of-arrival (DOA) estimation of azimuth and elevation with a large antenna array. This paper proposes a fast computation method for the MUSIC spectrum that is suitable for any circular array. First, the circular array is transformed into a virtual uniform circular array. Then, in calculating the MUSIC spectrum, the cyclic structure of the steering vector allows the inner products in the spatial spectrum to be realised by cyclic convolution. The computational cost of the MUSIC spectrum is markedly lower than that of the conventional method, making this a very practical approach for MUSIC spectrum computation with circular arrays.
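The core trick, that inner products against all cyclic shifts of a vector form a circular correlation computable with one FFT pair, can be sketched as follows (a generic illustration of the cyclic-convolution idea, not the paper's full algorithm):

```python
import numpy as np

def shift_inner_products(a, b):
    """Inner products of b with every cyclic shift of a, computed at once via
    the circular-correlation theorem: one FFT pair replaces N direct O(N)
    inner products. Entry k equals sum_n a[(n + k) % N] * conj(b[n]), the kind
    of cyclic-shift structure steering vectors of a uniform circular array have."""
    A = np.fft.fft(a)
    B = np.fft.fft(b)
    return np.fft.ifft(A * np.conj(B))
```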

  14. A new Fortran 90 program to compute regular and irregular associated Legendre functions

    NASA Astrophysics Data System (ADS)

    Schneider, Barry I.; Segura, Javier; Gil, Amparo; Guan, Xiaoxu; Bartschat, Klaus

    2010-12-01

    We present a modern Fortran 90 code to compute the regular Plm(x) and irregular Qlm(x) associated Legendre functions for all x∈(-1,+1) (on the cut) and |x|>1 and integer degree (l) and order (m). The code applies either forward or backward recursion in (l) and (m) in the stable direction, starting with analytically known values for forward recursion and considering both a Wronskian-based and a modified Miller's method for backward recursion. While some Fortran 77 codes existed for computing the functions off the cut, no Fortran 90 code was available for accurately computing the functions for all real values of x different from x=±1 where the irregular functions are not defined. Program summary. Program title: Associated Legendre Functions Catalogue identifier: AEHE_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHE_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 6722 No. of bytes in distributed program, including test data, etc.: 310 210 Distribution format: tar.gz Programming language: Fortran 90 Computer: Linux systems Operating system: Linux RAM: bytes Classification: 4.7 Nature of problem: Compute the regular and irregular associated Legendre functions for integer values of the degree and order and for all real arguments. The computation of the interaction of two electrons, 1/|r-r′|, in prolate spheroidal coordinates is used as one example where these functions are required for all values of the argument and we are able to easily compare the series expansion in associated Legendre functions and the exact value. Solution method: The code evaluates the regular and irregular associated Legendre functions using forward recursion when |x|<1 starting the recursion with the analytically known values of the first two members of the sequence. For values of
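The stable forward recursion in degree can be illustrated for the simplest case, order m = 0 (the Legendre polynomials P_l(x), via the Bonnet three-term recurrence); the published Fortran 90 code handles general (l, m), the irregular functions, and backward recursion as well. This Python sketch is for illustration only.

```python
def legendre_p(lmax, x):
    """Regular Legendre polynomials P_l(x) for l = 0..lmax by the forward
    Bonnet recurrence: (l+1) P_{l+1} = (2l+1) x P_l - l P_{l-1},
    started from the analytically known P_0 = 1 and P_1 = x."""
    p = [1.0, x]
    for l in range(1, lmax):
        p.append(((2 * l + 1) * x * p[l] - l * p[l - 1]) / (l + 1))
    return p[: lmax + 1]

vals = legendre_p(4, 0.5)   # P_0(0.5) .. P_4(0.5)
```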

  15. 25 CFR 547.9 - What are the minimum technical standards for Class II gaming system accounting functions?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... OF CLASS II GAMES § 547.9 What are the minimum technical standards for Class II gaming system... digits to accommodate the design of the game. (3) Accounting data displayed to the player may be... audit, configuration, recall and test modes; or (ii) Temporarily, during entertaining displays of...

  16. 25 CFR 547.9 - What are the minimum technical standards for Class II gaming system accounting functions?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... OF CLASS II GAMES § 547.9 What are the minimum technical standards for Class II gaming system... digits to accommodate the design of the game. (3) Accounting data displayed to the player may be... audit, configuration, recall and test modes; or (ii) Temporarily, during entertaining displays of...

  17. 25 CFR 547.9 - What are the minimum technical standards for Class II gaming system accounting functions?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... OF CLASS II GAMES § 547.9 What are the minimum technical standards for Class II gaming system... digits to accommodate the design of the game. (3) Accounting data displayed to the player may be... audit, configuration, recall and test modes; or (ii) Temporarily, during entertaining displays of...

  18. Computer generation of symbolic network functions - A new theory and implementation.

    NASA Technical Reports Server (NTRS)

    Alderson, G. E.; Lin, P.-M.

    1972-01-01

    A new method is presented for obtaining network functions in which some, none, or all of the network elements are represented by symbolic parameters (i.e., symbolic network functions). Unlike the topological tree enumeration or signal flow graph methods generally used to derive symbolic network functions, the proposed procedure employs fast, efficient, numerical-type algorithms to determine the contribution of those network branches that are not represented by symbolic parameters. A computer program called NAPPE (for Network Analysis Program using Parameter Extractions) and incorporating all of the concepts discussed has been written. Several examples illustrating the usefulness and efficiency of NAPPE are presented.
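As a minimal illustration of a fully symbolic network function (not NAPPE's parameter-extraction algorithm), the single node equation of a series-R, shunt-C low-pass divider can be solved symbolically:

```python
import sympy as sp

R, C, s, Vin, Vout = sp.symbols('R C s V_in V_out')
# Kirchhoff current balance at the output node of the RC divider:
# current through R into the node equals current into the capacitor.
node_eq = sp.Eq((Vin - Vout) / R, Vout * s * C)
# Symbolic transfer function H(s) = V_out / V_in with R and C as parameters.
H = sp.simplify(sp.solve(node_eq, Vout)[0] / Vin)
```

The result is the familiar H(s) = 1/(RCs + 1), with every element kept as a symbolic parameter.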

  19. On computation and use of Fourier coefficients for associated Legendre functions

    NASA Astrophysics Data System (ADS)

    Gruber, Christian; Abrykosov, Oleh

    2016-06-01

    The computation of spherical harmonic series in very high resolution is known to be delicate in terms of performance and numerical stability. A major problem is to keep results inside a numerical range of the used data type during calculations as under-/overflow arises. Extended data types are currently not desirable since the arithmetic complexity will grow exponentially with higher resolution levels. If the associated Legendre functions are computed in the spectral domain, then regular grid transformations can be applied to be highly efficient and convenient for derived quantities as well. In this article, we compare three recursive computations of the associated Legendre functions as trigonometric series, thereby ensuring a defined numerical range for each constituent wave number, separately. The results to a high degree and order show the numerical strength of the proposed method. First, the evaluation of Fourier coefficients of the associated Legendre functions has been done with respect to the floating-point precision requirements. Secondly, the numerical accuracy in the cases of standard double and long double precision arithmetic is demonstrated. Following Bessel's inequality, the obtained accuracy estimates of the Fourier coefficients are directly transferable to the associated Legendre functions themselves and to derived functionals as well. Therefore, they can provide an essential insight to modern geodetic applications that depend on efficient spherical harmonic analysis and synthesis beyond 5 × 5 arcmin resolution.

  20. Radial subsampling for fast cost function computation in intensity-based 3D image registration

    NASA Astrophysics Data System (ADS)

    Boettger, Thomas; Wolf, Ivo; Meinzer, Hans-Peter; Celi, Juan Carlos

    2007-03-01

    Image registration is always a trade-off between accuracy and speed. Looking towards clinical scenarios, the time for bringing two or more images into registration should be around a few seconds only. We present a new scheme for subsampling 3D-image data to allow for efficient computation of cost functions in intensity-based image registration. Starting from an arbitrary center point, voxels are sampled along scan lines that extend radially from it. We analyzed the characteristics of different cost functions computed on the subsampled data and compared them to known cost functions with respect to local optima. Results show the cost functions are smooth and give high peaks at the expected optima. Furthermore, we investigated the capture range of cost functions computed under the new subsampling scheme. Capture range was remarkably better for the new scheme compared to metrics using all voxels or different subsampling schemes, and high registration accuracy was achieved as well. The most important result is the improvement in terms of speed, making this scheme very interesting for clinical scenarios. We conclude that using the new subsampling scheme, intensity-based 3D image registration can be performed much faster than with other approaches while maintaining high accuracy. A variety of different extensions of the new approach is conceivable, e.g. non-regular distribution of the scan lines, or not to let the scan lines start from a center point only, but from the surface of an organ model for example.
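The radial sampling pattern can be sketched in 2-D (the paper works with 3-D volumes; the function and parameter names here are illustrative, not from the paper):

```python
import numpy as np

def radial_samples(center, n_lines=32, n_radii=16, max_radius=1.0):
    """Sample coordinates along scan lines extending radially from a center
    point. Returns an (n_lines * n_radii, 2) array of (x, y) positions."""
    # Evenly spaced line directions around the full circle.
    angles = np.linspace(0.0, 2.0 * np.pi, n_lines, endpoint=False)
    # Sample positions along each line, excluding the center itself.
    radii = np.linspace(max_radius / n_radii, max_radius, n_radii)
    cx, cy = center
    xs = cx + radii[None, :] * np.cos(angles[:, None])
    ys = cy + radii[None, :] * np.sin(angles[:, None])
    return np.stack([xs.ravel(), ys.ravel()], axis=1)

pts = radial_samples((0.0, 0.0), n_lines=8, n_radii=4, max_radius=2.0)
```

A cost function (e.g. a sum of squared intensity differences) would then be evaluated only at these positions, rather than over every voxel.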

  1. On computation and use of Fourier coefficients for associated Legendre functions

    NASA Astrophysics Data System (ADS)

    Gruber, Christian; Abrykosov, Oleh

    2016-02-01

    The computation of spherical harmonic series in very high resolution is known to be delicate in terms of performance and numerical stability. A major problem is to keep results inside a numerical range of the used data type during calculations as under-/overflow arises. Extended data types are currently not desirable since the arithmetic complexity will grow exponentially with higher resolution levels. If the associated Legendre functions are computed in the spectral domain, then regular grid transformations can be applied to be highly efficient and convenient for derived quantities as well. In this article, we compare three recursive computations of the associated Legendre functions as trigonometric series, thereby ensuring a defined numerical range for each constituent wave number, separately. The results to a high degree and order show the numerical strength of the proposed method. First, the evaluation of Fourier coefficients of the associated Legendre functions has been done with respect to the floating-point precision requirements. Secondly, the numerical accuracy in the cases of standard double and long double precision arithmetic is demonstrated. Following Bessel's inequality, the obtained accuracy estimates of the Fourier coefficients are directly transferable to the associated Legendre functions themselves and to derived functionals as well. Therefore, they can provide an essential insight to modern geodetic applications that depend on efficient spherical harmonic analysis and synthesis beyond 5 × 5 arcmin resolution.

  2. Algorithms for Efficient Computation of Transfer Functions for Large Order Flexible Systems

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Giesy, Daniel P.

    1998-01-01

    An efficient and robust computational scheme is given for the calculation of the frequency response function of a large order, flexible system implemented with a linear, time invariant control system. Advantage is taken of the highly structured sparsity of the system matrix of the plant based on a model of the structure using normal mode coordinates. The computational time per frequency point of the new computational scheme is a linear function of system size, a significant improvement over traditional, full-matrix techniques whose computational times per frequency point range from quadratic to cubic functions of system size. This permits the practical frequency domain analysis of systems of much larger order than by traditional, full-matrix techniques. Formulations are given for both open- and closed-loop systems. Numerical examples are presented showing the advantages of the present formulation over traditional approaches, both in speed and in accuracy. Using a model with 703 structural modes, the present method was up to two orders of magnitude faster than a traditional method. The present method generally showed good to excellent accuracy throughout the range of test frequencies, while traditional methods gave adequate accuracy for lower frequencies, but generally deteriorated in performance at higher frequencies with worst-case errors being many orders of magnitude times the correct values.
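The key saving, that a modal (diagonal) state matrix turns the per-frequency resolvent solve into an elementwise operation, can be sketched as follows. This is a generic illustration of the idea, not the paper's formulation.

```python
import numpy as np

def frf_modal(lam, B, C, omegas):
    """Frequency response G(jw) = C (jwI - A)^{-1} B for a diagonal (modal)
    state matrix A = diag(lam): each frequency point costs O(n) for the
    resolvent instead of a dense O(n^3) linear solve."""
    out = []
    for w in omegas:
        g = 1.0 / (1j * w - lam)          # diagonal resolvent, elementwise
        out.append(C @ (g[:, None] * B))  # equals C @ diag(g) @ B
    return np.array(out)                  # shape: (n_freq, n_outputs, n_inputs)
```

Checking one frequency point against a dense solve confirms the two formulations agree.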

  3. PLATO IV Accountancy Index.

    ERIC Educational Resources Information Center

    Pondy, Dorothy, Comp.

    The catalog was compiled to assist instructors in planning community college and university curricula using the 48 computer-assisted accountancy lessons available on PLATO IV (Programmed Logic for Automatic Teaching Operation) for first semester accounting courses. It contains information on lesson access, lists of acceptable abbreviations for…

  4. Computational aspects of maximum likelihood estimation and reduction in sensitivity function calculations

    NASA Technical Reports Server (NTRS)

    Gupta, N. K.; Mehra, R. K.

    1974-01-01

    This paper discusses numerical aspects of computing maximum likelihood estimates for linear dynamical systems in state-vector form. Different gradient-based nonlinear programming methods are discussed in a unified framework and their applicability to maximum likelihood estimation is examined. The problems due to singular Hessian or singular information matrix that are common in practice are discussed in detail and methods for their solution are proposed. New results on the calculation of state sensitivity functions via reduced order models are given. Several methods for speeding convergence and reducing computation time are also discussed.
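
One standard remedy for the singular-Hessian problem mentioned above is to regularize the Newton (or scoring) step. A minimal sketch, assuming a Levenberg-Marquardt-style ridge (our choice of remedy for illustration; the paper surveys several methods):

```python
import numpy as np

def regularized_newton_step(grad, hess, lam=1e-6):
    """Parameter update when the Hessian (or information matrix) is
    singular: add a ridge lam*I so the solve is well posed.
    (A generic remedy of the kind the paper discusses, not its
    specific proposal.)"""
    n = len(grad)
    return -np.linalg.solve(hess + lam * np.eye(n), grad)

# Rank-deficient Hessian: a plain Newton solve would fail, but the
# ridge-regularized step still points downhill.
H = np.array([[1.0, 1.0], [1.0, 1.0]])   # rank 1, singular
g = np.array([1.0, 1.0])
step = regularized_newton_step(g, H, lam=1e-3)
assert g @ step < 0                      # descent direction
```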

  5. Can Expanded Bacteriochlorins Act as Photosensitizers in Photodynamic Therapy? Good News from Density Functional Theory Computations.

    PubMed

    Mazzone, Gloria; Alberto, Marta E; De Simone, Bruna C; Marino, Tiziana; Russo, Nino

    2016-01-01

    The main photophysical properties of a series of expanded bacteriochlorins, recently synthesized, have been investigated by means of DFT and TD-DFT methods. Absorption spectra computed with different exchange-correlation functionals, B3LYP, M06 and ωB97XD, have been compared with the experimental ones. In good agreement with experiment, all the considered systems show a maximum absorption wavelength that falls in the therapeutic window (600-800 nm). The obtained singlet-triplet energy gaps are large enough to ensure the production of cytotoxic singlet molecular oxygen. The computed spin-orbit matrix elements suggest a good probability of intersystem spin-crossing between singlet and triplet excited states, since they are higher than those computed for 5,10,15,20-tetrakis-(m-hydroxyphenyl)chlorin (Foscan©) already used in the photodynamic therapy (PDT) protocol. Because of the investigated properties, these expanded bacteriochlorins can be proposed as PDT agents. PMID:26938516

  6. Storing files in a parallel computing system based on user-specified parser function

    DOEpatents

    Faibish, Sorin; Bent, John M; Tzelnic, Percy; Grider, Gary; Manzanares, Adam; Torres, Aaron

    2014-10-21

    Techniques are provided for storing files in a parallel computing system based on a user-specified parser function. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a parser from the distributed application for processing the plurality of files prior to storage; and storing one or more of the plurality of files in one or more storage nodes of the parallel computing system based on the processing by the parser. The plurality of files comprise one or more of a plurality of complete files and a plurality of sub-files. The parser can optionally store only those files that satisfy one or more semantic requirements of the parser. The parser can also extract metadata from one or more of the files and the extracted metadata can be stored with one or more of the plurality of files and used for searching for files.
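
The flow described above can be sketched in a few lines: the distributed application supplies a parser, the parser acts as a semantic filter and metadata extractor, and surviving files are spread over storage nodes. All names and the node-selection rule here are invented for illustration; the patent does not prescribe this API.

```python
def store_files(files, parser, storage_nodes):
    """Store files only if a user-supplied parser accepts them, keeping
    any metadata the parser extracts.  (Schematic sketch of the
    patented idea; the API is hypothetical.)"""
    stored = []
    for name, data in sorted(files.items()):
        meta = parser(name, data)
        if meta is None:                        # semantic filter: skip file
            continue
        node = hash(name) % len(storage_nodes)  # pick a storage node
        storage_nodes[node].append({"name": name, "meta": meta, "data": data})
        stored.append(name)
    return stored

# Example parser: keep only checkpoint files, index them by step number
def checkpoint_parser(name, data):
    if not name.endswith(".ckpt"):
        return None
    return {"step": int(name.split(".")[0])}

nodes = [[], []]
kept = store_files({"10.ckpt": b"...", "notes.txt": b"...", "20.ckpt": b"..."},
                   checkpoint_parser, nodes)
assert kept == ["10.ckpt", "20.ckpt"]
```

The extracted metadata travels with each stored file, which is what later makes metadata-based search possible.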

  7. Mirrors for X-ray telescopes: Fresnel diffraction-based computation of point spread functions from metrology

    NASA Astrophysics Data System (ADS)

    Raimondi, L.; Spiga, D.

    2015-01-01

    Context. The imaging sharpness of an X-ray telescope is chiefly determined by the optical quality of its focusing optics, which in turn mostly depends on the shape accuracy and the surface finishing of the grazing-incidence X-ray mirrors that compose the optical modules. To ensure the imaging performance during the mirror manufacturing, a fundamental step is predicting the mirror point spread function (PSF) from the metrology of its surface. Traditionally, the PSF computation in X-rays is assumed to be different depending on whether the surface defects are classified as figure errors or roughness. This classical approach, however, requires setting a boundary between these two asymptotic regimes, which is not known a priori. Aims: The aim of this work is to overcome this limit by providing analytical formulae that are valid at any light wavelength, for computing the PSF of an X-ray mirror shell from the measured longitudinal profiles and the roughness power spectral density, without distinguishing spectral ranges with different treatments. Methods: The method we adopted is based on the Huygens-Fresnel principle for computing the diffracted intensity from measured or modeled profiles. In particular, we have simplified the computation of the surface integral to only one dimension, owing to the grazing incidence that reduces the influence of the azimuthal errors by orders of magnitude. The method can be extended to optical systems with an arbitrary number of reflections - in particular the Wolter-I, which is frequently used in X-ray astronomy - and can be used in both near- and far-field approximation. Finally, it accounts simultaneously for profile, roughness, and aperture diffraction. 
Results: We describe the formalism with which one can self-consistently compute the PSF of grazing-incidence mirrors, and we show some PSF simulations including the UV band, where the aperture diffraction dominates the PSF, and hard X-rays where the X-ray scattering has a major impact.
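
The single-dimension diffraction integral the method reduces to can be illustrated with a toy Huygens-Fresnel sum: sampling a flat 1-D aperture and summing complex phases at each observation angle reproduces the familiar aperture-diffraction pattern. This is only the flat-aperture limiting case (a real mirror adds the measured profile and roughness as a phase term), and all names are ours:

```python
import numpy as np

def fraunhofer_1d(x_ap, theta, wavelength):
    """1-D Huygens-Fresnel sum: far-field amplitude of a flat aperture
    sampled at points x_ap, observed at angles theta.  (Toy version of
    the single-dimension diffraction integral; a mirror profile would
    enter as an extra phase term per sample.)"""
    k = 2 * np.pi / wavelength
    phase = np.exp(1j * k * np.outer(np.sin(theta), x_ap))
    return phase.sum(axis=1) / len(x_ap)   # normalized coherent sum

lam, a = 500e-9, 1e-3                      # wavelength, aperture width (m)
x = np.linspace(-a / 2, a / 2, 2001)
th = np.array([0.0, lam / a])              # central peak, first dark fringe
I = np.abs(fraunhofer_1d(x, th, lam))**2
assert I[0] > 1e3 * I[1]                   # first sinc zero is ~dark
```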

  8. Randomly Accountable

    ERIC Educational Resources Information Center

    Kane, Thomas J.; Staiger, Douglas O.; Geppert, Jeffrey

    2002-01-01

    The accountability debate tends to devolve into a battle between the pro-testing and anti-testing crowds. When it comes to the design of a school accountability system, the devil is truly in the details. A well-designed accountability plan may go a long way toward giving school personnel the kinds of signals they need to improve performance.…

  9. School Accountability.

    ERIC Educational Resources Information Center

    Evers, Williamson M., Ed.; Walberg, Herbert J., Ed.

    This book presents the perspectives of experts from the fields of history, economics, political science, and psychology on what is known about accountability, what still needs to be learned, what should be done right now, and what should be avoided in devising accountability systems. The common myths about accountability are dispelled and how it…

  10. Colorful Accounting

    ERIC Educational Resources Information Center

    Warrick, C. Shane

    2006-01-01

    As instructors of accounting, we should take an abstract topic (at least to most students) and connect it to content known by students to help increase the effectiveness of our instruction. In a recent semester, ordinary items such as colors, a basketball, and baseball were used to relate the subject of accounting. The accounting topics of account…

  11. It Might Not Make a Big DIF: Improved Differential Test Functioning Statistics That Account for Sampling Variability

    ERIC Educational Resources Information Center

    Chalmers, R. Philip; Counsell, Alyssa; Flora, David B.

    2016-01-01

    Differential test functioning, or DTF, occurs when one or more items in a test demonstrate differential item functioning (DIF) and the aggregate of these effects are witnessed at the test level. In many applications, DTF can be more important than DIF when the overall effects of DIF at the test level can be quantified. However, optimal statistical…

  12. Time Utility Functions for Modeling and Evaluating Resource Allocations in a Heterogeneous Computing System

    SciTech Connect

    Briceno, Luis Diego; Khemka, Bhavesh; Siegel, Howard Jay; Maciejewski, Anthony A; Groer, Christopher S; Koenig, Gregory A; Okonski, Gene D; Poole, Stephen W

    2011-01-01

    This study considers a heterogeneous computing system and corresponding workload being investigated by the Extreme Scale Systems Center (ESSC) at Oak Ridge National Laboratory (ORNL). The ESSC is part of a collaborative effort between the Department of Energy (DOE) and the Department of Defense (DoD) to deliver research, tools, software, and technologies that can be integrated, deployed, and used in both DOE and DoD environments. The heterogeneous system and workload described here are representative of a prototypical computing environment being studied as part of this collaboration. Each task can exhibit a time-varying importance or utility to the overall enterprise. In this system, an arriving task has an associated priority and precedence. The priority is used to describe the importance of a task, and precedence is used to describe how soon the task must be executed. These two metrics are combined to create a utility function curve that indicates how valuable it is for the system to complete a task at any given moment. This research focuses on using time-utility functions to generate a metric that can be used to compare the performance of different resource schedulers in a heterogeneous computing system. The contributions of this paper are: (a) a mathematical model of a heterogeneous computing system where tasks arrive dynamically and need to be assigned based on their priority, precedence, utility characteristic class, and task execution type, (b) the use of priority and precedence to generate time-utility functions that describe the value a task has at any given time, (c) the derivation of a metric based on the total utility gained from completing tasks to measure the performance of the computing environment, and (d) a comparison of the performance of resource allocation heuristics in this environment.
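
The priority/precedence combination described above can be sketched as a utility curve plus a total-utility metric. The exact curve shape below (full priority until a precedence-derived soft deadline, exponential decay afterwards) is one plausible choice for illustration, not the paper's specific parameterization:

```python
import math

def time_utility(t, priority, soft_deadline, decay_rate=1.0):
    """Value of completing a task at time t: full priority up to the
    precedence-derived soft deadline, exponential decay afterwards.
    (One plausible utility-curve shape; hypothetical parameters.)"""
    if t <= soft_deadline:
        return priority
    return priority * math.exp(-decay_rate * (t - soft_deadline))

def total_utility(completions):
    """Performance metric: total utility accrued over completed tasks,
    each given as (finish_time, priority, soft_deadline)."""
    return sum(time_utility(t, p, d) for t, p, d in completions)

# Two schedulers finishing the same tasks at different times: the one
# that completes the lower-priority task sooner accrues more utility.
tasks = [(1.0, 8.0, 2.0), (3.0, 4.0, 2.0)]
late  = [(1.0, 8.0, 2.0), (5.0, 4.0, 2.0)]
assert total_utility(tasks) > total_utility(late)
```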

  13. Structure, dynamics, and function of the monooxygenase P450 BM-3: insights from computer simulations studies

    NASA Astrophysics Data System (ADS)

    Roccatano, Danilo

    2015-07-01

    The monooxygenase P450 BM-3 is a NADPH-dependent fatty acid hydroxylase isolated from the soil bacterium Bacillus megaterium. As a pivotal member of the cytochrome P450 superfamily, it has been intensively studied to elucidate structure-dynamics-function relationships in this class of enzymes. In addition, owing to its peculiar properties, it is a promising enzyme for biochemical and biomedical applications. Despite these efforts, however, a full understanding of the enzyme's structure and dynamics has not yet been achieved. Computational studies, particularly molecular dynamics (MD) simulations, have contributed substantially to this endeavor by providing new atomic-level insights into the correlations between the structure, dynamics, and function of the protein. This topical review summarizes computational studies based on MD simulations of cytochrome P450 BM-3 and gives an outlook on future directions.

  14. Coal-seismic, desktop computer programs in BASIC; Part 6, Develop rms velocity functions and apply mute and normal moveout

    USGS Publications Warehouse

    Hasbrouck, W.P.

    1983-01-01

    Processing of data taken with the U.S. Geological Survey's coal-seismic system is done with a desktop, stand-alone computer. Programs for this computer are written in the extended BASIC language utilized by the Tektronix 4051 Graphic System. This report presents computer programs used to develop rms velocity functions and apply mute and normal moveout to a 12-trace seismogram.

  15. Broadband transmission functions for atmospheric IR flux computations and climate studies

    NASA Technical Reports Server (NTRS)

    Chou, M.-D.

    1983-01-01

    In order to reduce the size of precomputed tables which are used in the emissivity approach to computing IR radiation in a climate model, the three-dimensional transmission function in the water vapor bands is defined in this study by a simple regression equation consisting of three two-dimensional parameters. The transmittances in the 9.6 and 15 micron bands are individually parameterized as functions of the amount of scaled absorber. This approach can thus be applied to atmospheres with a variable CO2 concentration.

  16. Method, systems, and computer program products for implementing function-parallel network firewall

    DOEpatents

    Fulp, Errin W.; Farley, Ryan J.

    2011-10-11

    Methods, systems, and computer program products for providing function-parallel firewalls are disclosed. According to one aspect, a function-parallel firewall includes a first firewall node for filtering received packets using a first portion of a rule set including a plurality of rules. The first portion includes less than all of the rules in the rule set. At least one second firewall node filters packets using a second portion of the rule set. The second portion includes at least one rule in the rule set that is not present in the first portion. The first and second portions together include all of the rules in the rule set.
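
The partitioning described above can be sketched by giving each node a disjoint slice of the rule set, letting every node check the same packet against its own slice, and merging node results by original rule order. This is a minimal illustration of the function-parallel idea, not the patented design:

```python
def first_match(packet, rules):
    """Sequential reference firewall: first matching rule wins."""
    for i, (pred, action) in enumerate(rules):
        if pred(packet):
            return i, action
    return None

def function_parallel(packet, rules, n_nodes):
    """Function-parallel evaluation: each node filters with a disjoint
    portion of the rule set; the earliest match in the original rule
    order is then selected, preserving first-match semantics.
    (Minimal sketch; node assignment here is simple round-robin.)"""
    matches = []
    for node in range(n_nodes):
        portion = [(i, r) for i, r in enumerate(rules) if i % n_nodes == node]
        for i, (pred, action) in portion:
            if pred(packet):
                matches.append((i, action))
                break                       # a node stops at its first match
    return min(matches) if matches else None

rules = [(lambda p: p["port"] == 22, "allow"),
         (lambda p: p["port"] < 1024, "deny"),
         (lambda p: True, "allow")]
pkt = {"port": 80}
assert function_parallel(pkt, rules, 2) == first_match(pkt, rules)
```

Because the slices are disjoint and results are merged by rule index, the parallel result always agrees with the sequential firewall while each node handles less than the full rule set.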

  17. A comparison of computational methods and algorithms for the complex gamma function

    NASA Technical Reports Server (NTRS)

    Ng, E. W.

    1974-01-01

    A survey and comparison of some computational methods and algorithms for gamma and log-gamma functions of complex arguments are presented. Methods and algorithms reported include Chebyshev approximations, Pade expansion and Stirling's asymptotic series. The comparison leads to the conclusion that Algorithm 421 published in the Communications of ACM by H. Kuki is the best program either for individual application or for the inclusion in subroutine libraries.
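
To make the comparison concrete, here is one of the classic approaches for complex arguments, the Lanczos approximation with the standard g = 7 coefficients, plus the reflection formula for the left half-plane. (This illustrates the family of methods surveyed; Kuki's Algorithm 421 itself is based on Stirling's asymptotic series.)

```python
import cmath, math

# Standard published Lanczos coefficients for g = 7, n = 9
_G = 7
_C = [0.99999999999980993, 676.5203681218851, -1259.1392167224028,
      771.32342877765313, -176.61502916214059, 12.507343278686905,
      -0.13857109526572012, 9.9843695780195716e-6, 1.5056327351493116e-7]

def cgamma(z):
    """Gamma function of a complex argument via the Lanczos
    approximation, using the reflection formula for Re(z) < 0.5."""
    if z.real < 0.5:
        # Reflection: Gamma(z) = pi / (sin(pi z) * Gamma(1 - z))
        return math.pi / (cmath.sin(math.pi * z) * cgamma(1 - z))
    z -= 1
    x = _C[0] + sum(c / (z + i) for i, c in enumerate(_C[1:], start=1))
    t = z + _G + 0.5
    return math.sqrt(2 * math.pi) * t ** (z + 0.5) * cmath.exp(-t) * x

assert abs(cgamma(5 + 0j) - 24) < 1e-9              # Gamma(5) = 4! = 24
assert abs(cgamma(0.5 + 0j) - math.sqrt(math.pi)) < 1e-10
```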

  18. Economic probes of mental function and the extraction of computational phenotypes☆

    PubMed Central

    Kishida, Kenneth T.; Montague, P. Read

    2013-01-01

    Economic games are now routinely used to characterize human cognition across multiple dimensions. These games allow for effective computational modeling of mental function because they typically come equipped with notions of optimal play, which provide quantitatively prescribed target functions that can be tracked throughout an experiment. The combination of these games, computational models, and neuroimaging tools opens up the possibility of new ways to characterize normal cognition and associated brain function. We propose that these tools may also be used to characterize mental dysfunction, such as that found in a range of psychiatric illnesses. We describe early efforts using a multi-round trust game to probe brain responses associated with healthy social exchange and review how this game has provided a novel and useful characterization of autism spectrum disorder. Lastly, we use the multi-round trust game as an example to discuss how these kinds of games could produce novel bases for representing healthy behavior and brain function and thus provide objectively identifiable subtypes within a broad spectrum of mental function. PMID:24926112

  19. Comparison of x ray computed tomography number to proton relative linear stopping power conversion functions using a standard phantom

    SciTech Connect

    Moyers, M. F.

    2014-06-15

    Purpose: Adequate evaluation of the results from multi-institutional trials involving light ion beam treatments requires consideration of the planning margins applied to both targets and organs at risk. A major uncertainty that affects the size of these margins is the conversion of x ray computed tomography numbers (XCTNs) to relative linear stopping powers (RLSPs). Various facilities engaged in multi-institutional clinical trials involving proton beams have been applying significantly different margins in their patient planning. This study was performed to determine the variance in the conversion functions used at proton facilities in the U.S.A. wishing to participate in National Cancer Institute sponsored clinical trials. Methods: A simplified method of determining the conversion function was developed using a standard phantom containing only water and aluminum. The new method was based on the premise that all scanners have their XCTNs for air and water calibrated daily to constant values but that the XCTNs for high density/high atomic number materials are variable with different scanning conditions. The standard phantom was taken to 10 different proton facilities and scanned with the local protocols resulting in 14 derived conversion functions which were compared to the conversion functions used at the local facilities. Results: For tissues within ±300 XCTN of water, all facility functions produced converted RLSP values within ±6% of the values produced by the standard function and within 8% of the values from any other facility's function. For XCTNs corresponding to lung tissue, converted RLSP values differed by as much as ±8% from the standard and up to 16% from the values of other facilities. For XCTNs corresponding to low-density immobilization foam, the maximum to minimum values differed by as much as 40%. Conclusions: The new method greatly simplifies determination of the conversion function, reduces ambiguity, and in the future could promote
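
Conversion functions of the kind compared above are typically piecewise-linear curves through calibration points; in the simplified standard-phantom method, air, water, and aluminum anchor the curve. A minimal sketch with purely illustrative calibration values (not the paper's fitted numbers):

```python
import numpy as np

def rlsp_from_xctn(xctn, calib_xctn, calib_rlsp):
    """Convert CT numbers (XCTN) to relative linear stopping powers
    (RLSP) by piecewise-linear interpolation through calibration
    points.  (The calibration values used below are hypothetical.)"""
    return np.interp(xctn, calib_xctn, calib_rlsp)

# Hypothetical calibration anchors: air, water, an aluminum-like point
calib_xctn = np.array([-1000.0, 0.0, 1200.0])
calib_rlsp = np.array([0.001, 1.0, 1.95])
assert rlsp_from_xctn(0.0, calib_xctn, calib_rlsp) == 1.0   # water -> 1.0
```

Inter-facility variance of the kind reported in the study corresponds to each facility carrying slightly different anchor points, especially at high density/high atomic number.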

  20. Comparison of x ray computed tomography number to proton relative linear stopping power conversion functions using a standard phantom

    PubMed Central

    Moyers, M. F.

    2014-01-01

    Purpose: Adequate evaluation of the results from multi-institutional trials involving light ion beam treatments requires consideration of the planning margins applied to both targets and organs at risk. A major uncertainty that affects the size of these margins is the conversion of x ray computed tomography numbers (XCTNs) to relative linear stopping powers (RLSPs). Various facilities engaged in multi-institutional clinical trials involving proton beams have been applying significantly different margins in their patient planning. This study was performed to determine the variance in the conversion functions used at proton facilities in the U.S.A. wishing to participate in National Cancer Institute sponsored clinical trials. Methods: A simplified method of determining the conversion function was developed using a standard phantom containing only water and aluminum. The new method was based on the premise that all scanners have their XCTNs for air and water calibrated daily to constant values but that the XCTNs for high density/high atomic number materials are variable with different scanning conditions. The standard phantom was taken to 10 different proton facilities and scanned with the local protocols resulting in 14 derived conversion functions which were compared to the conversion functions used at the local facilities. Results: For tissues within ±300 XCTN of water, all facility functions produced converted RLSP values within ±6% of the values produced by the standard function and within 8% of the values from any other facility's function. For XCTNs corresponding to lung tissue, converted RLSP values differed by as much as ±8% from the standard and up to 16% from the values of other facilities. For XCTNs corresponding to low-density immobilization foam, the maximum to minimum values differed by as much as 40%. Conclusions: The new method greatly simplifies determination of the conversion function, reduces ambiguity, and in the future could promote

  1. An Instructional Model for Preparing Accounting/Computing Clerks in Michigan Secondary School Office Education Programs, Part I and Part II.

    ERIC Educational Resources Information Center

    Moskovis, L. Michael; McKitrick, Max O.

    Outlined in this two-part document is a model for the implementation of a business-industry oriented program designed to provide high school seniors with updated training in the skills and concepts necessary for developing competencies in entry-level and second-level accounting jobs that involve accounts receivable, accounts payable, and payroll…

  2. Clinical Validation of 4-Dimensional Computed Tomography Ventilation With Pulmonary Function Test Data

    SciTech Connect

    Brennan, Douglas; Schubert, Leah; Diot, Quentin; Castillo, Richard; Castillo, Edward; Guerrero, Thomas; Martel, Mary K.; Linderman, Derek; Gaspar, Laurie E.; Miften, Moyed; Kavanagh, Brian D.; Vinogradskiy, Yevgeniy

    2015-06-01

    Purpose: A new form of functional imaging has been proposed in the form of 4-dimensional computed tomography (4DCT) ventilation. Because 4DCTs are acquired as part of routine care for lung cancer patients, calculating ventilation maps from 4DCTs provides spatial lung function information without added dosimetric or monetary cost to the patient. Before 4DCT-ventilation is implemented it needs to be clinically validated. Pulmonary function tests (PFTs) provide a clinically established way of evaluating lung function. The purpose of our work was to perform a clinical validation by comparing 4DCT-ventilation metrics with PFT data. Methods and Materials: Ninety-eight lung cancer patients with pretreatment 4DCT and PFT data were included in the study. Pulmonary function test metrics used to diagnose obstructive lung disease were recorded: forced expiratory volume in 1 second (FEV1) and FEV1/forced vital capacity. Four-dimensional CT data sets and spatial registration were used to compute 4DCT-ventilation images using a density change–based and a Jacobian-based model. The ventilation maps were reduced to single metrics intended to reflect the degree of ventilation obstruction. Specifically, we computed the coefficient of variation (SD/mean) and the ventilation V20 (volume of lung with ≤20% ventilation), and correlated these metrics with PFT data. Regression analysis was used to determine whether 4DCT ventilation data could predict for normal versus abnormal lung function using PFT thresholds. Results: Correlation coefficients comparing 4DCT-ventilation with PFT data ranged from 0.63 to 0.72, with the best agreement between FEV1 and coefficient of variation. Four-dimensional CT ventilation metrics were able to significantly delineate between clinically normal and abnormal PFT results. Conclusions: Validation of 4DCT ventilation with clinically relevant metrics is essential. We demonstrate good global agreement between PFTs and 4DCT-ventilation, indicating that 4DCT
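
The map-to-metric reduction described above is simple to sketch. Note the V20 normalization below (fraction of voxels at or below 20% of the map maximum) is one plausible reading of "volume of lung with ≤20% ventilation"; the paper's exact definition may differ:

```python
import numpy as np

def ventilation_metrics(vent):
    """Reduce a 4DCT-ventilation map to the study's two global metrics:
    the coefficient of variation (SD/mean) and V20, here taken as the
    fraction of voxels at or below 20% of the map maximum.
    (Hypothetical normalization for V20.)"""
    vent = np.asarray(vent, dtype=float)
    cv = vent.std() / vent.mean()
    v20 = float(np.mean(vent <= 0.2 * vent.max()))
    return cv, v20

# A uniform (well-ventilated) map vs. one with poorly ventilated voxels
cv_u, v20_u = ventilation_metrics([1.0, 1.0, 1.0, 1.0])
cv_o, v20_o = ventilation_metrics([0.1, 0.1, 1.0, 1.0])
assert cv_u == 0.0 and v20_u == 0.0
assert cv_o > cv_u and v20_o == 0.5
```

Higher CV and V20 both indicate more heterogeneous, more obstructed ventilation, which is why they can be regressed against FEV1-based PFT thresholds.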

  3. Computer-Based Cognitive Training for Executive Functions after Stroke: A Systematic Review

    PubMed Central

    van de Ven, Renate M.; Murre, Jaap M. J.; Veltman, Dick J.; Schmand, Ben A.

    2016-01-01

    Background: Stroke commonly results in cognitive impairments in working memory, attention, and executive function, which may be restored with appropriate training programs. Our aim was to systematically review the evidence for computer-based cognitive training of executive dysfunctions. Methods: Studies were included if they concerned adults who had suffered stroke or other types of acquired brain injury, if the intervention was computer training of executive functions, and if the outcome was related to executive functioning. We searched in MEDLINE, PsycINFO, Web of Science, and The Cochrane Library. Study quality was evaluated based on the CONSORT Statement. Treatment effect was evaluated based on differences compared to pre-treatment and/or to a control group. Results: Twenty studies were included. Two were randomized controlled trials that used an active control group. The other studies included multiple baselines, a passive control group, or were uncontrolled. Improvements were observed in tasks similar to the training (near transfer) and in tasks dissimilar to the training (far transfer). However, these effects were not larger in trained than in active control groups. Two studies evaluated neural effects and found changes in both functional and structural connectivity. Most studies suffered from methodological limitations (e.g., lack of an active control group and no adjustment for multiple testing) hampering differentiation of training effects from spontaneous recovery, retest effects, and placebo effects. Conclusions: The positive findings of most studies, including neural changes, warrant continuation of research in this field, but only if its methodological limitations are addressed. PMID:27148007

  4. Using computational fluid dynamics to test functional and ecological hypotheses in fossil taxa

    NASA Astrophysics Data System (ADS)

    Rahman, Imran

    2016-04-01

    Reconstructing how ancient organisms moved and fed is a major focus of study in palaeontology. Traditionally, this has been hampered by a lack of objective data on the functional morphology of extinct species, especially those without a clear modern analogue. However, cutting-edge techniques for characterizing specimens digitally and in three dimensions, coupled with state-of-the-art computer models, now provide a robust framework for testing functional and ecological hypotheses even in problematic fossil taxa. One such approach is computational fluid dynamics (CFD), a method for simulating fluid flows around objects that has primarily been applied to complex engineering-design problems. Here, I will present three case studies of CFD applied to fossil taxa, spanning a range of specimen sizes, taxonomic groups and geological ages. First, I will show how CFD enabled a rigorous test of hypothesized feeding modes in an enigmatic Ediacaran organism with three-fold symmetry, revealing previously unappreciated complexity of pre-Cambrian ecosystems. Second, I will show how CFD was used to evaluate hydrodynamic performance and feeding in Cambrian stem-group echinoderms, shedding light on the probable feeding strategy of the latest common ancestor of all deuterostomes. Third, I will show how CFD allowed us to explore the link between form and function in Mesozoic ichthyosaurs. These case studies serve to demonstrate the enormous potential of CFD for addressing long-standing hypotheses for a variety of fossil taxa, opening up an exciting new avenue in palaeontological studies of functional morphology.

  5. Ab initio quasi-particle approximation bandgaps of silicon nanowires calculated at density functional theory/local density approximation computational effort

    SciTech Connect

    Ribeiro, M.

    2015-06-21

    Ab initio calculations of hydrogen-passivated Si nanowires were performed using density functional theory within LDA-1/2 to account for excited-state properties. A range of diameters was calculated to draw conclusions about the ability of the method to correctly describe the main trends of bandgap, quantum confinement, and self-energy corrections versus the diameter of the nanowire. Bandgaps are predicted with excellent accuracy compared with other theoretical approaches such as GW, and with experiment as well, but at a much lower computational cost.

  6. Quantum Computation

    NASA Astrophysics Data System (ADS)

    Ekert, Artur

    1994-08-01

    As computers become faster they must become smaller because of the finiteness of the speed of light. The history of computer technology has involved a sequence of changes from one type of physical realisation to another - from gears to relays to valves to transistors to integrated circuits and so on. Quantum mechanics is already important in the design of microelectronic components. Soon it will be necessary to harness quantum mechanics rather than simply take it into account, and at that point it will be possible to give data processing devices new functionality.

  7. CAP: A Computer Code for Generating Tabular Thermodynamic Functions from NASA Lewis Coefficients. Revised

    NASA Technical Reports Server (NTRS)

    Zehe, Michael J.; Gordon, Sanford; McBride, Bonnie J.

    2002-01-01

    For several decades the NASA Glenn Research Center has been providing a file of thermodynamic data for use in several computer programs. These data are in the form of least-squares coefficients that have been calculated from tabular thermodynamic data by means of the NASA Properties and Coefficients (PAC) program. The source thermodynamic data are obtained from the literature or from standard compilations. Most gas-phase thermodynamic functions are calculated by the authors from molecular constant data using ideal gas partition functions. The Coefficients and Properties (CAP) program described in this report permits the generation of tabulated thermodynamic functions from the NASA least-squares coefficients. CAP provides considerable flexibility in the output format, the number of temperatures to be tabulated, and the energy units of the calculated properties. This report provides a detailed description of input preparation, examples of input and output for several species, and a listing of all species in the current NASA Glenn thermodynamic data file.
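
The tabulation CAP performs starts from polynomial fits of the dimensionless thermodynamic functions. As an illustration, the NASA Glenn files use a 9-constant form whose heat-capacity term is Cp°/R = a1·T⁻² + a2·T⁻¹ + a3 + a4·T + a5·T² + a6·T³ + a7·T⁴; a minimal sketch of evaluating it (coefficients below are hypothetical, chosen for an ideal constant-Cp gas, and CAP also tabulates enthalpy and entropy):

```python
def cp_over_R(T, a):
    """Dimensionless heat capacity Cp/R from NASA Glenn 9-constant
    coefficients a[0..6]:
    Cp/R = a1/T^2 + a2/T + a3 + a4*T + a5*T^2 + a6*T^3 + a7*T^4.
    (Only the Cp evaluation is sketched here.)"""
    return (a[0] / T**2 + a[1] / T + a[2] + a[3] * T
            + a[4] * T**2 + a[5] * T**3 + a[6] * T**4)

# Hypothetical coefficient set: an ideal monatomic gas, Cp/R = 2.5
a = [0.0, 0.0, 2.5, 0.0, 0.0, 0.0, 0.0]
assert cp_over_R(298.15, a) == 2.5
assert cp_over_R(1000.0, a) == 2.5
```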

  8. CAP: A Computer Code for Generating Tabular Thermodynamic Functions from NASA Lewis Coefficients

    NASA Technical Reports Server (NTRS)

    Zehe, Michael J.; Gordon, Sanford; McBride, Bonnie J.

    2001-01-01

    For several decades the NASA Glenn Research Center has been providing a file of thermodynamic data for use in several computer programs. These data are in the form of least-squares coefficients that have been calculated from tabular thermodynamic data by means of the NASA Properties and Coefficients (PAC) program. The source thermodynamic data are obtained from the literature or from standard compilations. Most gas-phase thermodynamic functions are calculated by the authors from molecular constant data using ideal gas partition functions. The Coefficients and Properties (CAP) program described in this report permits the generation of tabulated thermodynamic functions from the NASA least-squares coefficients. CAP provides considerable flexibility in the output format, the number of temperatures to be tabulated, and the energy units of the calculated properties. This report provides a detailed description of input preparation, examples of input and output for several species, and a listing of all species in the current NASA Glenn thermodynamic data file.

  9. Effective electron displacements: A tool for time-dependent density functional theory computational spectroscopy

    SciTech Connect

    Guido, Ciro A.; Cortona, Pietro; Adamo, Carlo

    2014-03-14

    We extend our previous definition of the metric Δr for electronic excitations in the framework of the time-dependent density functional theory [C. A. Guido, P. Cortona, B. Mennucci, and C. Adamo, J. Chem. Theory Comput. 9, 3118 (2013)], by including a measure of the difference of electronic position variances in passing from occupied to virtual orbitals. This new definition, called Γ, permits applications in those situations where the Δr-index is not helpful: transitions in centrosymmetric systems and Rydberg excitations. The Γ-metric is then extended by using the Natural Transition Orbitals, thus providing an intuitive picture of how locally the electron density changes during the electronic transitions. Furthermore, the Γ values give insight into how well functionals reproduce different types of transitions, and allow one to define a “confidence radius” for GGA and hybrid functionals.

  10. ACCOUNTING FOR THE ENDOGENEITY OF HEALTH AND ENVIRONMENTAL TOBACCO SMOKE EXPOSURE IN CHILDREN: AN APPLICATION TO CONTINUOUS LUNG FUNCTION

    EPA Science Inventory

    The goal of this study is to estimate an unbiased exposure effect of environmental tobacco smoke (ETS) exposure on children's continuous lung function. A majority of the evidence from health studies suggests that ETS exposure in early life contributes significantly to childhood ...

  11. Do Children's Executive Functions Account for Associations between Early Autonomy-Supportive Parenting and Achievement through High School?

    ERIC Educational Resources Information Center

    Bindman, Samantha W.; Pomerantz, Eva M.; Roisman, Glenn I.

    2015-01-01

    This study evaluated whether the positive association between early autonomy-supportive parenting and children's subsequent achievement is mediated by children's executive functions. Using observations of mothers' parenting from the National Institute of Child Health and Human Development (NICHD) Study of Early Child Care and Youth Development (N…

  12. Computing the three-point correlation function of galaxies in O(N^2) time

    NASA Astrophysics Data System (ADS)

    Slepian, Zachary; Eisenstein, Daniel J.

    2015-12-01

    We present an algorithm that computes the multipole coefficients of the galaxy three-point correlation function (3PCF) without explicitly considering triplets of galaxies. Rather, centring on each galaxy in the survey, it expands the radially binned density field in spherical harmonics and combines these to form the multipoles without ever requiring the relative angle between a pair of galaxies about the central one. This approach scales with number and number density in the same way as the two-point correlation function, allowing run-times that are comparable to it and 500 times faster than a naive triplet count. It is exact in angle and easily handles edge correction. We demonstrate the algorithm on the LasDamas SDSS-DR7 mock catalogues, computing an edge-corrected 3PCF out to 90 Mpc h⁻¹ in under an hour on modest computing resources. We expect this algorithm will make it possible to obtain the large-scale 3PCF for upcoming surveys such as Euclid, the Large Synoptic Survey Telescope (LSST), and the Dark Energy Spectroscopic Instrument.
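
    The trick that avoids explicit triplets is the spherical-harmonic addition theorem: the Legendre polynomial of the relative angle between two directions factorizes into products of harmonics evaluated on each direction separately, so multipoles can be accumulated per galaxy rather than per pair. A minimal numerical check of that identity (a sketch using SciPy, not the authors' code):

```python
import numpy as np
from scipy.special import sph_harm, eval_legendre

# Addition theorem: P_l(cos gamma) = 4*pi/(2l+1) * sum_m Y_lm(n1) * conj(Y_lm(n2)),
# where gamma is the angle between unit vectors n1 and n2.
def direction_angles(v):
    v = v / np.linalg.norm(v)
    return np.arctan2(v[1], v[0]), np.arccos(v[2])  # azimuth, colatitude

n1 = np.array([0.3, -0.5, 0.8])
n2 = np.array([-0.2, 0.9, 0.4])
az1, pol1 = direction_angles(n1)
az2, pol2 = direction_angles(n2)
cos_gamma = n1 @ n2 / (np.linalg.norm(n1) * np.linalg.norm(n2))

ell = 3
pair_sum = sum(sph_harm(m, ell, az1, pol1) * np.conj(sph_harm(m, ell, az2, pol2))
               for m in range(-ell, ell + 1))
lhs = eval_legendre(ell, cos_gamma)
rhs = 4 * np.pi / (2 * ell + 1) * pair_sum.real
assert abs(lhs - rhs) < 1e-10
```

    Because the right-hand side is a product of one-direction factors, summing it over all pairs about a central galaxy costs O(N) per centre instead of O(N²), which is the source of the overall O(N²) scaling.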

  13. Computation of diffusion function measures in q-space using magnetic resonance hybrid diffusion imaging.

    PubMed

    Wu, Yu-Chien; Field, Aaron S; Alexander, Andrew L

    2008-06-01

    The distribution of water diffusion in biological tissues may be estimated by a 3-D Fourier transform (FT) of diffusion-weighted measurements in q-space. In this study, methods for estimating diffusion spectrum measures (the zero-displacement probability, the mean-squared displacement, and the orientation distribution function) directly from the q-space signals are described. These methods were evaluated using both computer simulations and hybrid diffusion imaging (HYDI) measurements on a human brain. The HYDI method obtains diffusion-weighted measurements on concentric spheres in q-space. Monte Carlo computer simulations were performed to investigate effects of noise, q-space truncation, and sampling interval on the measures. This new direct computation approach reduces HYDI data processing time and image artifacts arising from 3-D FT and regridding interpolation. In addition, it is less sensitive to the noise and q-space truncation effects than the conventional approach. Although this study focused on data using the HYDI scheme, this computation approach may be applied to other diffusion sampling schemes including Cartesian diffusion spectrum imaging. PMID:18541492
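
    For Gaussian diffusion the zero-displacement probability has a closed form, which makes the "direct from q-space" idea easy to sanity-check: P(0) is the integral of the q-space signal E(q), and for E(q) = exp(−4π²q²Dτ) this equals (4πDτ)^(−3/2). A sketch under that Gaussian assumption (the parameter values are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical Gaussian diffusion signal: E(q) = exp(-4*pi^2 * q^2 * D * tau).
D, tau = 1.0e-3, 0.05                  # assumed diffusivity (mm^2/s), diffusion time (s)
q = np.linspace(-150.0, 150.0, 2001)   # q in 1/mm; E has fully decayed at the edges
dq = q[1] - q[0]
E = np.exp(-4 * np.pi**2 * q**2 * D * tau)

# Zero-displacement probability: P(0) = integral of E(q) over all of q-space.
# The Gaussian factorizes, so the 3-D integral is the 1-D integral cubed.
p0_numeric = (E.sum() * dq) ** 3
p0_analytic = (4 * np.pi * D * tau) ** -1.5
assert abs(p0_numeric / p0_analytic - 1) < 1e-6
```

    The same summation over the measured q-space samples is what lets such measures be computed without an explicit 3-D FT and regridding step.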

  14. Choosing a proper exchange-correlation functional for the computational catalysis on surface.

    PubMed

    Teng, Bo-Tao; Wen, Xiao-Dong; Fan, Maohong; Wu, Feng-Min; Zhang, Yulong

    2014-09-14

    Choosing a proper functional among the diverse density functional approximations to the electronic exchange-correlation energy is the basis for obtaining accurate theoretical results for a given system. In this work, we first propose an approach of comparing the calculated ΔE0 with theoretical reference data based on the corresponding experimental results for a gas-phase reaction. With ΔE0 as a criterion, the three most typical and popular exchange-correlation functionals (PW91, PBE, and RPBE) were systematically compared in terms of the typical Fischer-Tropsch synthesis reactions in the gas phase. In addition, verification of the geometrical and electronic properties of modeling catalysts, as well as the adsorption behavior of a typical probe molecule on modeling catalysts, is also suggested for further screening of proper functionals. After a systematic comparison of CO adsorption behavior on Co(0001) calculated by PW91, PBE, and RPBE, the RPBE functional was found to be better than the other two in view of FTS reactions in the gas phase and CO adsorption behaviors on a cobalt surface. The present work has general implications for choosing a reliable exchange-correlation functional in computational surface catalysis. PMID:25072632

  15. Computational Study of Acidic and Basic Functionalized Crystalline Silica Surfaces as a Model for Biomaterial Interfaces.

    PubMed

    Corno, Marta; Delle Piane, Massimo; Monti, Susanna; Moreno-Couranjou, Maryline; Choquet, Patrick; Ugliengo, Piero

    2015-06-16

    In silico modeling of acidic (CH2COOH) or basic (CH2NH2) functionalized silica surfaces has been carried out by means of a density functional approach based on a gradient-corrected functional to provide insight into the characterization of experimentally functionalized surfaces via a plasma method. Hydroxylated surfaces of crystalline cristobalite (sporting 4.8 OH/nm²) mimic an amorphous silica interface as unsubstituted material. To functionalize the silica surface we transformed the surface Si-OH groups into Si-CH2COOH and Si-CH2NH2 moieties to represent acidic/basic chemical character for the substitution. Structures, energetics, electronic, and vibrational properties were computed and compared as a function of the increasing loading of the functional groups (from 1 to 4 per surface unit cell). Classical molecular dynamics simulations of selected cases have been performed through a Reax-FF reactive force field to assess the mobility of the surface added chains. Both DFT and force field calculations identify the CH2NH2 moderate surface loading (1 group per unit cell) as the most stable functionalization, at variance with the case of the CH2COOH group, where higher loadings are preferred (2 groups per unit cell). The vibrational fingerprints of the surface functionalities, which are the ν(C═O) stretching and δ(NH2) bending modes for acidic/basic cases, have been characterized as a function of substitution percentage in order to guide the assignment of the experimental data. The final results highlighted the different behavior of the two types of functionalization. On the one hand, the frequency associated with the ν(C═O) mode shifts to lower wavenumbers as a function of the H-bond strength between the surface functionalities (both COOH and SiOH groups), and on the other hand, the δ(NH2) frequency shift seems to be caused by a subtle balance between the H-bond donor and acceptor abilities of the NH2 moiety. Both sets of data are in general agreement with

  16. Studying the Chemistry of Cationized Triacylglycerols Using Electrospray Ionization Mass Spectrometry and Density Functional Theory Computations

    NASA Astrophysics Data System (ADS)

    Grossert, J. Stuart; Herrera, Lisandra Cubero; Ramaley, Louis; Melanson, Jeremy E.

    2014-08-01

    Analysis of triacylglycerols (TAGs), found as complex mixtures in living organisms, is typically accomplished using liquid chromatography, often coupled to mass spectrometry. TAGs, weak bases not protonated using electrospray ionization, are usually ionized by adduct formation with a cation, including those present in the solvent (e.g., Na+). There are relatively few reports on the binding of TAGs with cations or on the mechanisms by which cationized TAGs fragment. This work examines binding efficiencies, determined by mass spectrometry and computations, for the complexation of TAGs to a range of cations (Na+, Li+, K+, Ag+, NH4+). While most cations bind to oxygen, Ag+ binding to unsaturation in the acid side chains is significant. The importance of dimer formation, [2TAG + M]+, was demonstrated using several different types of mass spectrometers. From breakdown curves, it became apparent that two or three acid side chains must be attached to glycerol for strong cationization. Possible mechanisms for fragmentation of lithiated TAGs were modeled by computations on tripropionylglycerol. Viable pathways were found for losses of neutral acids and lithium salts of acids from different positions on the glycerol moiety. Novel lactone structures were proposed for the loss of a neutral acid from one position of the glycerol moiety. These were studied further using triple-stage mass spectrometry (MS3). These lactones can account for all the major product ions in the MS3 spectra in both this work and the literature, which should allow for new insights into the challenging analytical methods needed for naturally occurring TAGs.

  17. An evolutionary computational theory of prefrontal executive function in decision-making.

    PubMed

    Koechlin, Etienne

    2014-11-01

    The prefrontal cortex subserves executive control and decision-making, that is, the coordination and selection of thoughts and actions in the service of adaptive behaviour. We present here a computational theory describing the evolution of the prefrontal cortex from rodents to humans as gradually adding new inferential Bayesian capabilities for dealing with a computationally intractable decision problem: exploring and learning new behavioural strategies versus exploiting and adjusting previously learned ones through reinforcement learning (RL). We provide a principled account identifying three inferential steps optimizing this arbitration through the emergence of (i) factual reactive inferences in paralimbic prefrontal regions in rodents; (ii) factual proactive inferences in lateral prefrontal regions in primates and (iii) counterfactual reactive and proactive inferences in human frontopolar regions. The theory clarifies the integration of model-free and model-based RL through the notion of strategy creation. The theory also shows that counterfactual inferences in humans yield to the notion of hypothesis testing, a critical reasoning ability for approximating optimal adaptive processes and presumably endowing humans with a qualitative evolutionary advantage in adaptive behaviour. PMID:25267817

  18. Beyond localized and distributed accounts of brain functions. Comment on “Understanding brain networks and brain organization” by Pessoa

    NASA Astrophysics Data System (ADS)

    Cauda, Franco; Costa, Tommaso; Tamietto, Marco

    2014-09-01

    Recent evidence in cognitive neuroscience lends support to the idea that network models of brain architecture provide a privileged access to the understanding of the relation between brain organization and cognitive processes [1]. The core perspective holds that cognitive processes depend on the interactions among distributed neuronal populations and brain structures, and that the impact of a given region on behavior largely depends on its pattern of anatomical and functional connectivity [2,3].

  19. On fast computation of finite-time coherent sets using radial basis functions

    NASA Astrophysics Data System (ADS)

    Froyland, Gary; Junge, Oliver

    2015-08-01

    Finite-time coherent sets inhibit mixing over finite times. The most expensive part of the transfer operator approach to detecting coherent sets is the construction of the operator itself. We present a numerical method based on radial basis function collocation and apply it to a recent transfer operator construction [G. Froyland, "Dynamic isoperimetry and the geometry of Lagrangian coherent structures," Nonlinearity (unpublished); preprint arXiv:1411.7186] that has been designed specifically for purely advective dynamics. The construction [G. Froyland, "Dynamic isoperimetry and the geometry of Lagrangian coherent structures," Nonlinearity (unpublished); preprint arXiv:1411.7186] is based on a "dynamic" Laplace operator and minimises the boundary size of the coherent sets relative to their volume. The main advantage of our new approach is a substantial reduction in the number of Lagrangian trajectories that need to be computed, leading to large speedups in the transfer operator analysis when this computation is costly.
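
    The key ingredient, radial basis function collocation, lets a field known only at scattered trajectory points be evaluated anywhere. A generic sketch of that ingredient using SciPy's interpolator (not the authors' operator construction; the test function and point counts are arbitrary):

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Interpolate a smooth field sampled at scattered 2-D points with
# thin-plate-spline radial basis functions.
rng = np.random.default_rng(1)
pts = rng.uniform(-1, 1, size=(300, 2))        # scattered sample locations
vals = np.sin(np.pi * pts[:, 0]) * pts[:, 1]   # sampled field values
rbf = RBFInterpolator(pts, vals, kernel='thin_plate_spline')

# Evaluate away from the samples and compare to the exact field.
query = np.array([[0.25, 0.5]])
exact = np.sin(np.pi * 0.25) * 0.5
assert abs(rbf(query)[0] - exact) < 0.1
```

    In the transfer operator setting the payoff is that far fewer Lagrangian trajectories are needed, since the collocation representation fills in between them.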

  20. On fast computation of finite-time coherent sets using radial basis functions.

    PubMed

    Froyland, Gary; Junge, Oliver

    2015-08-01

    Finite-time coherent sets inhibit mixing over finite times. The most expensive part of the transfer operator approach to detecting coherent sets is the construction of the operator itself. We present a numerical method based on radial basis function collocation and apply it to a recent transfer operator construction [G. Froyland, "Dynamic isoperimetry and the geometry of Lagrangian coherent structures," Nonlinearity (unpublished); preprint arXiv:1411.7186] that has been designed specifically for purely advective dynamics. The construction [G. Froyland, "Dynamic isoperimetry and the geometry of Lagrangian coherent structures," Nonlinearity (unpublished); preprint arXiv:1411.7186] is based on a "dynamic" Laplace operator and minimises the boundary size of the coherent sets relative to their volume. The main advantage of our new approach is a substantial reduction in the number of Lagrangian trajectories that need to be computed, leading to large speedups in the transfer operator analysis when this computation is costly. PMID:26328580

  1. Boolean Combinations of Implicit Functions for Model Clipping in Computer-Assisted Surgical Planning

    PubMed Central

    2016-01-01

    This paper proposes an interactive method of model clipping for computer-assisted surgical planning. The model is separated by a data filter defined by the implicit function of the clipping path. The clipping path, composed of interactive plane widgets, can be manually repositioned by surgeons along the desired presurgical path, which means that surgeons can produce any accurate shape of the clipped model. The implicit function is acquired through a recursive algorithm based on the Boolean combinations (including Boolean union and Boolean intersection) of the implicit functions of a series of plane widgets. The algorithm is highly efficient: its best time performance is linear, which applies to most cases in computer-assisted surgical planning. Based on this algorithm, a user-friendly module named SmartModelClip is developed on the Slicer platform with VTK. A number of arbitrary clipping paths have been tested. Experimental results of presurgical planning for three types of Le Fort fractures and for tumor removal demonstrate the high reliability and efficiency of the recursive algorithm and the robustness of the module. PMID:26751685
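
    Boolean combinations of signed implicit functions reduce to min/max composition, a standard implicit-surface identity. A minimal sketch (the function names and the inside-negative sign convention are assumptions, not the paper's code):

```python
import numpy as np

# Signed implicit function of a plane widget: f(p) = n . (p - p0),
# with f < 0 taken as the kept half-space (assumed convention).
def plane(p0, n):
    n = np.asarray(n, dtype=float)
    n /= np.linalg.norm(n)
    p0 = np.asarray(p0, dtype=float)
    return lambda p: float(np.dot(np.asarray(p, dtype=float) - p0, n))

def union(f, g):         # inside either region
    return lambda p: min(f(p), g(p))

def intersection(f, g):  # inside both regions
    return lambda p: max(f(p), g(p))

# Clip region: (below z = 1 AND left of x = 0) OR below z = -1.
f1 = plane([0, 0, 1], [0, 0, 1])
f2 = plane([0, 0, 0], [1, 0, 0])
clip = union(intersection(f1, f2), plane([0, 0, -1], [0, 0, 1]))

assert clip([-1, 0, 0]) < 0   # inside: x < 0 and z < 1
assert clip([1, 0, 0]) > 0    # outside both lobes
assert clip([1, 0, -2]) < 0   # inside the z < -1 lobe
```

    A recursive evaluation over such a composition tree is what gives the linear best-case cost the abstract mentions.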

  2. Computing single step operators of logic programming in radial basis function neural networks

    NASA Astrophysics Data System (ADS)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-01

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single step operator of any logic program is defined as a function (Tp:I→I). Logic programming is well suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in radial basis function neural networks. To do so, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.
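
    The single step operator Tp itself is simple to state: given an interpretation I, it collects every clause head whose positive body atoms are all in I and whose negated body atoms are all absent. A minimal sketch with a hypothetical three-clause program (not the paper's neural-network encoding):

```python
# Clause representation: (head, positive_body_atoms, negated_body_atoms).
def tp(program, interpretation):
    """One application of the single step operator T_P."""
    return {head for head, pos, neg in program
            if pos <= interpretation and not (neg & interpretation)}

# Hypothetical program:  a.   b :- a.   c :- b, not d.
program = [('a', set(), set()),
           ('b', {'a'}, set()),
           ('c', {'b'}, {'d'})]

# Iterate T_P from the empty interpretation up to a fixed point --
# the steady state the recurrent network is trained to reach.
I = set()
while tp(program, I) != I:
    I = tp(program, I)
assert I == {'a', 'b', 'c'}
```

    The neural-network approach in the abstract learns this mapping from training pairs (I, Tp(I)) and recurs the network until its output stabilizes at the same fixed point.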

  3. Computer-aided analyses of transport protein sequences: gleaning evidence concerning function, structure, biogenesis, and evolution.

    PubMed Central

    Saier, M H

    1994-01-01

    Three-dimensional structures have been elucidated for very few integral membrane proteins. Computer methods can be used as guides for estimation of solute transport protein structure, function, biogenesis, and evolution. In this paper the application of currently available computer programs to over a dozen distinct families of transport proteins is reviewed. The reliability of sequence-based topological and localization analyses and the importance of sequence and residue conservation to structure and function are evaluated. Evidence concerning the nature and frequency of occurrence of domain shuffling, splicing, fusion, deletion, and duplication during evolution of specific transport protein families is also evaluated. Channel proteins are proposed to be functionally related to carriers. It is argued that energy coupling to transport was a late occurrence, superimposed on preexisting mechanisms of solute facilitation. It is shown that several transport protein families have evolved independently of each other, employing different routes, at different times in evolutionary history, to give topologically similar transmembrane protein complexes. The possible significance of this apparent topological convergence is discussed. PMID:8177172

  4. Computing single step operators of logic programming in radial basis function neural networks

    SciTech Connect

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-10

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single step operator of any logic program is defined as a function (Tp:I→I). Logic programming is well suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in radial basis function neural networks. To do so, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used recurrent radial basis function neural networks to reach the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.

  5. An effective method to verify line and point spread functions measured in computed tomography

    SciTech Connect

    Ohkubo, Masaki; Wada, Sinichi; Matsumoto, Toru; Nishizawa, Kanae

    2006-08-15

    This study describes an effective method for verifying the line spread function (LSF) and point spread function (PSF) measured in computed tomography (CT). The CT image of an assumed object function is known to be calculable using the LSF or PSF, based on a model for the spatial resolution of a linear imaging system. Therefore, the validity of the LSF and PSF can be confirmed by comparing the computed images with images obtained by scanning phantoms corresponding to the object function. Differences between computed and measured images will depend on the accuracy of the LSF and PSF used in the calculations. First, we measured the LSF in our scanner and derived the two-dimensional PSF in the scan plane from the LSF. Second, we scanned a phantom including uniform cylindrical objects parallel to the long axis of a patient's body (z direction). Measured images of such a phantom are characterized by the spatial resolution in the scan plane and do not depend on the spatial resolution in the z direction. Third, images were calculated by two-dimensionally convolving the true object as a function of space with the PSF. Comparing computed images with measured ones, good agreement was found and was demonstrated by image subtraction. As a criterion for evaluating the overall differences between images quantitatively, we defined the normalized standard deviation (SD) of the differences between computed and measured images. These normalized SDs were less than 5.0% (ranging from 1.3% to 4.8%) for three types of image reconstruction kernels and for various diameters of cylindrical objects, indicating the high accuracy of the PSF and LSF that resulted from successful measurements. Further, we also obtained another LSF using an inappropriate measurement procedure and calculated the images as above. This time, the computed images did not agree with the measured ones. The normalized SDs were 6.0% or more (ranging from 6.0% to 13.8%), indicating the inaccuracy of the PSF and LSF. We
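
    The verification pipeline — convolve the assumed object with the PSF, subtract the measured image, score the residual — can be sketched in a few lines. The names, the synthetic noise level, and the exact normalization of the SD are assumptions for illustration, not the paper's definitions:

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)

def gaussian_psf(size=21, sigma=2.0):
    """A simple normalized 2-D Gaussian standing in for the measured PSF."""
    ax = np.arange(size) - size // 2
    g = np.exp(-(ax[:, None]**2 + ax[None, :]**2) / (2 * sigma**2))
    return g / g.sum()

# Object function: a uniform cylinder cross-section of 100 HU-like units.
obj = np.zeros((64, 64))
obj[24:40, 24:40] = 100.0

computed = fftconvolve(obj, gaussian_psf(), mode='same')          # model image
measured = computed + rng.normal(0.0, 1.0, computed.shape)        # synthetic "scan"

# Normalized SD of the difference image, in percent (normalization assumed).
norm_sd = np.std(computed - measured) / measured.max() * 100.0
assert norm_sd < 5.0   # below the paper's ~5% agreement band
```

    With an accurate PSF the residual is just measurement noise, so the normalized SD stays small; an inaccurate PSF leaves structured residuals and pushes the score up, as the abstract reports.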

  6. Computer simulation on the cooperation of functional molecules during the early stages of evolution.

    PubMed

    Ma, Wentao; Hu, Jiming

    2012-01-01

    It is very likely that life began with some RNA (or RNA-like) molecules, self-replicating by base-pairing and exhibiting enzyme-like functions that favored the self-replication. Different functional molecules may have emerged by favoring their own self-replication at different aspects. Then, a direct route towards complexity/efficiency may have been through the coexistence/cooperation of these molecules. However, the likelihood of this route remains quite unclear, especially because the molecules would be competing for limited common resources. By computer simulation using a Monte-Carlo model (with "micro-resolution" at the level of nucleotides and membrane components), we show that the coexistence/cooperation of these molecules can occur naturally, both in a naked form and in a protocell form. The results of the computer simulation also lead to quite a few deductions concerning the environment and history in the scenario. First, a naked stage (with functional molecules catalyzing template-replication and metabolism) may have occurred early in evolution but required high concentration and limited dispersal of the system (e.g., on some mineral surface); the emergence of protocells enabled a "habitat-shift" into bulk water. Second, the protocell stage started with a substage of "pseudo-protocells", with functional molecules catalyzing template-replication and metabolism, but still missing the function involved in the synthesis of membrane components, the emergence of which would lead to a subsequent "true-protocell" substage. Third, the initial unstable membrane, composed of prebiotically available fatty acids, should have been superseded quite early by a more stable membrane (e.g., composed of phospholipids, like modern cells). Additionally, the membrane-takeover probably occurred at the transition of the two substages of the protocells. The scenario described in the present study should correspond to an episode in early evolution, after the emergence of single

  7. Distinct Quantitative Computed Tomography Emphysema Patterns Are Associated with Physiology and Function in Smokers

    PubMed Central

    San José Estépar, Raúl; Mendoza, Carlos S.; Hersh, Craig P.; Laird, Nan; Crapo, James D.; Lynch, David A.; Silverman, Edwin K.; Washko, George R.

    2013-01-01

    Rationale: Emphysema occurs in distinct pathologic patterns, but little is known about the epidemiologic associations of these patterns. Standard quantitative measures of emphysema from computed tomography (CT) do not distinguish between distinct patterns of parenchymal destruction. Objectives: To study the epidemiologic associations of distinct emphysema patterns with measures of lung-related physiology, function, and health care use in smokers. Methods: Using a local histogram-based assessment of lung density, we quantified distinct patterns of low attenuation in 9,313 smokers in the COPDGene Study. To determine if such patterns provide novel insights into chronic obstructive pulmonary disease epidemiology, we tested for their association with measures of physiology, function, and health care use. Measurements and Main Results: Compared with percentage of low-attenuation area less than −950 Hounsfield units (%LAA-950), local histogram-based measures of distinct CT low-attenuation patterns are more predictive of measures of lung function, dyspnea, quality of life, and health care use. These patterns are strongly associated with a wide array of measures of respiratory physiology and function, and most of these associations remain highly significant (P < 0.005) after adjusting for %LAA-950. In smokers without evidence of chronic obstructive pulmonary disease, the mild centrilobular disease pattern is associated with lower FEV1 and worse functional status (P < 0.005). Conclusions: Measures of distinct CT emphysema patterns provide novel information about the relationship between emphysema and key measures of physiology, physical function, and health care use. Measures of mild emphysema in smokers with preserved lung function can be extracted from CT scans and are significantly associated with functional measures. PMID:23980521
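
    The baseline measure the study compares against, %LAA-950, is simply the percentage of lung voxels with attenuation below −950 Hounsfield units. A minimal sketch on synthetic voxel data (the HU distribution is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic lung voxel attenuations in Hounsfield units (assumed distribution).
hu = rng.normal(loc=-870.0, scale=40.0, size=100_000)

# %LAA-950: percentage of low-attenuation voxels below -950 HU.
laa950 = 100.0 * np.mean(hu < -950.0)
assert 1.0 < laa950 < 4.0   # ~2% for this synthetic distribution
```

    The local histogram-based patterns in the paper go beyond this single global threshold by classifying the neighborhood density distribution around each voxel.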

  8. An accurate Fortran code for computing hydrogenic continuum wave functions at a wide range of parameters

    NASA Astrophysics Data System (ADS)

    Peng, Liang-You; Gong, Qihuang

    2010-12-01

    The accurate computation of hydrogenic continuum wave functions is very important in many branches of physics such as electron-atom collisions, cold atom physics, and atomic ionization in strong laser fields. Although various algorithms and codes already exist, most of them are reliable only in certain ranges of parameters. In some practical applications, accurate continuum wave functions need to be calculated at extremely low energies, large radial distances, and/or large angular momentum numbers. Here we provide such a code, which can generate accurate hydrogenic continuum wave functions and the corresponding Coulomb phase shifts over a wide range of parameters. Without any essential restriction on the angular momentum number, the present code is able to give reliable results over the electron energy range [10,10] eV for radial distances of [10,10] a.u. The present code is also very efficient and should find numerous applications in fields such as strong-field physics. Program summary: Program title: HContinuumGautchi. Catalogue identifier: AEHD_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHD_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 1233. No. of bytes in distributed program, including test data, etc.: 7405. Distribution format: tar.gz. Programming language: Fortran90 in fixed format. Computer: AMD Processors. Operating system: Linux. RAM: 20 MBytes. Classification: 2.7, 4.5. Nature of problem: The accurate computation of atomic continuum wave functions is very important in many research fields such as strong field physics and cold atom physics. Although various algorithms and codes already exist, most of them are applicable and reliable only in a certain range of parameters. We present here an accurate FORTRAN program for
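
    One of the quantities such codes tabulate, the Coulomb phase shift σ_l = arg Γ(l + 1 + iη), is easy to compute and check against the exact recurrence σ_l − σ_{l−1} = arctan(η/l). A sketch of that check only, not the article's Fortran algorithm:

```python
import numpy as np
from scipy.special import loggamma

def coulomb_phase(l, eta):
    """Coulomb phase shift sigma_l = arg Gamma(l + 1 + i*eta)."""
    return float(np.imag(loggamma(l + 1 + 1j * eta)))

# Without a Coulomb field (eta = 0) the phase shift vanishes for every l.
assert abs(coulomb_phase(0, 0.0)) < 1e-12
assert abs(coulomb_phase(5, 0.0)) < 1e-12

# Recurrence from Gamma(z + 1) = z * Gamma(z): sigma_l - sigma_{l-1} = arctan(eta/l).
eta, l = 1.5, 3
step = coulomb_phase(l, eta) - coulomb_phase(l - 1, eta)
assert abs(step - np.arctan2(eta, l)) < 1e-12
```

    The hard part that such codes actually solve is the radial wave function itself at extreme parameter values, where naive series or recursions lose accuracy.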

  9. Talking while Computing in Groups: The Not-so-Private Functions of Computational Private Speech in Mathematical Discussions

    ERIC Educational Resources Information Center

    Zahner, William; Moschkovich, Judit

    2010-01-01

    Students often voice computations during group discussions of mathematics problems. Yet, this type of private speech has received little attention from mathematics educators or researchers. In this article, we use excerpts from middle school students' group mathematical discussions to illustrate and describe "computational private speech." We…

  10. Using an iterative eigensolver to compute vibrational energies with phase-space localized basis functions

    SciTech Connect

    Brown, James Carrington, Tucker

    2015-07-28

    Although phase-space localized Gaussians are themselves poor basis functions, they can be used to effectively contract a discrete variable representation basis [A. Shimshovitz and D. J. Tannor, Phys. Rev. Lett. 109, 070402 (2012)]. This works despite the fact that elements of the Hamiltonian and overlap matrices labelled by discarded Gaussians are not small. By formulating the matrix problem as a regular (i.e., not a generalized) matrix eigenvalue problem, we show that it is possible to use an iterative eigensolver to compute vibrational energy levels in the Gaussian basis.
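
    The abstract's central move is recasting a generalized eigenproblem H c = E S c as a regular one so that iterative eigensolvers apply. One standard way to do this (symmetric orthogonalization, M = S^(−1/2) H S^(−1/2); not necessarily the authors' exact formulation, and with random matrices standing in for the Hamiltonian and overlap):

```python
import numpy as np
from scipy.sparse.linalg import eigsh

rng = np.random.default_rng(3)
n = 60
A = rng.standard_normal((n, n))
H = (A + A.T) / 2                    # symmetric stand-in "Hamiltonian"
B = rng.standard_normal((n, n))
S = B @ B.T + n * np.eye(n)          # symmetric positive-definite "overlap"

# Regular problem: S^(-1/2) H S^(-1/2) y = E y, with c = S^(-1/2) y.
w, V = np.linalg.eigh(S)
S_inv_half = V @ np.diag(w ** -0.5) @ V.T
M = S_inv_half @ H @ S_inv_half

# Lowest eigenvalues from an iterative (Lanczos-type) solver vs. dense reference.
lowest_iter = np.sort(eigsh(M, k=3, which='SA')[0])
lowest_dense = np.sort(np.linalg.eigvalsh(M))[:3]
assert np.allclose(lowest_iter, lowest_dense, atol=1e-8)
```

    In practice M is never formed densely; the iterative solver only needs matrix-vector products, which is what makes large contracted bases tractable.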

  11. Using an iterative eigensolver to compute vibrational energies with phase-space localized basis functions.

    PubMed

    Brown, James; Carrington, Tucker

    2015-07-28

    Although phase-space localized Gaussians are themselves poor basis functions, they can be used to effectively contract a discrete variable representation basis [A. Shimshovitz and D. J. Tannor, Phys. Rev. Lett. 109, 070402 (2012)]. This works despite the fact that elements of the Hamiltonian and overlap matrices labelled by discarded Gaussians are not small. By formulating the matrix problem as a regular (i.e., not a generalized) matrix eigenvalue problem, we show that it is possible to use an iterative eigensolver to compute vibrational energy levels in the Gaussian basis. PMID:26233104

  12. Using an iterative eigensolver to compute vibrational energies with phase-space localized basis functions

    NASA Astrophysics Data System (ADS)

    Brown, James; Carrington, Tucker

    2015-07-01

    Although phase-space localized Gaussians are themselves poor basis functions, they can be used to effectively contract a discrete variable representation basis [A. Shimshovitz and D. J. Tannor, Phys. Rev. Lett. 109, 070402 (2012)]. This works despite the fact that elements of the Hamiltonian and overlap matrices labelled by discarded Gaussians are not small. By formulating the matrix problem as a regular (i.e., not a generalized) matrix eigenvalue problem, we show that it is possible to use an iterative eigensolver to compute vibrational energy levels in the Gaussian basis.

  13. Computational solution of the defect stream-function equation for nonequilibrium turbulent boundary layers

    NASA Technical Reports Server (NTRS)

    Barnwell, Richard W.

    1993-01-01

    The derivation of the accurate, second-order, almost linear, approximate equation governing the defect stream function for nonequilibrium compressible turbulent boundary layers is reviewed. The similarity of this equation to the heat conduction equation is exploited in the development of an unconditionally stable, tridiagonal computational method which is second-order accurate in the marching direction and fourth-order accurate in the surface-normal direction. Results compare well with experimental data. Nonlinear effects are shown to be small. This two-dimensional method is simple and has been implemented on a programmable calculator.
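
    The unconditionally stable tridiagonal marching the abstract describes rests on the Thomas algorithm applied to a heat-conduction-like implicit step. A generic sketch of that solver on one backward-Euler step of u_t = u_xx (not the paper's compressible boundary-layer discretization):

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal."""
    n = len(b)
    cp, dp = np.zeros(n), np.zeros(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# One backward-Euler step of u_t = u_xx on [0, 1] with u = 0 at both ends.
n, dt = 50, 1e-3
dx = 1.0 / (n + 1)
r = dt / dx**2
a = np.full(n, -r); b = np.full(n, 1 + 2 * r); c = np.full(n, -r)
u0 = np.sin(np.pi * dx * np.arange(1, n + 1))   # discrete eigenmode
u1 = thomas(a, b, c, u0)

# The sine mode decays exactly by 1 / (1 + r*(2 - 2*cos(pi*dx))) per step.
decay = 1.0 / (1 + r * (2 - 2 * np.cos(np.pi * dx)))
assert np.allclose(u1, decay * u0, atol=1e-10)
```

    Because the implicit step is solved exactly in O(n) operations per march, the overall method stays cheap enough to run even on very modest hardware, consistent with the programmable-calculator implementation noted above.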

  14. Functional Priorities, Assistive Technology, and Brain-Computer Interfaces after Spinal Cord Injury

    PubMed Central

    Collinger, Jennifer L.; Boninger, Michael L.; Bruns, Tim M.; Curley, Kenneth; Wang, Wei; Weber, Douglas J.

    2012-01-01

    Spinal cord injury often impacts a person’s ability to perform critical activities of daily living and can have a negative impact on their quality of life. Assistive technology aims to bridge this gap to augment function and increase independence. It is critical to involve consumers in the design and evaluation process as new technologies, like brain-computer interfaces (BCIs), are developed. In a survey study of fifty-seven veterans with spinal cord injury who were participating in the National Veterans Wheelchair Games, we found that restoration of bladder/bowel control, walking, and arm/hand function (tetraplegia only) were all high priorities for improving quality of life. Many of the participants had not used or heard of some currently available technologies designed to improve function or the ability to interact with their environment. The majority of individuals in this study were interested in using a BCI, particularly for controlling functional electrical stimulation to restore lost function. Independent operation was considered to be the most important design criteria. Interestingly, many participants reported that they would be willing to consider surgery to implant a BCI even though non-invasiveness was a high priority design requirement. This survey demonstrates the interest of individuals with spinal cord injury in receiving and contributing to the design of BCI. PMID:23760996

  15. Distribution of computer functionality for accelerator control at the Brookhaven AGS

    SciTech Connect

    Stevens, A.; Clifford, T.; Frankel, R.

    1985-01-01

    A set of physical and functional system components and their interconnection protocols have been established for all controls work at the AGS. Portions of these designs were tested as part of enhanced operation of the AGS as a source of polarized protons, and additional segments will be implemented during the continuing construction efforts that are adding heavy-ion capability to our facility. Our efforts include the following computer and control system elements: a broadband local area network, which embodies modems, transmission systems, and branch interface units; a hierarchical layer, which performs certain database and watchdog/alarm functions; a group of workstation processors (Apollos), which perform the function of traditional minicomputer hosts; and a layer which provides both real-time control and standardization functions for accelerator devices and instrumentation. Database and other accelerator functionality is assigned to the most appropriate level within our network for real-time performance, long-term utility, and orderly growth.

  16. Distribution of computer functionality for accelerator control at the Brookhaven AGS

    SciTech Connect

    Stevens, A.; Clifford, T.; Frankel, R.

    1985-10-01

    A set of physical and functional system components and their interconnection protocols have been established for all controls work at the AGS. Portions of these designs were tested as part of enhanced operation of the AGS as a source of polarized protons, and additional segments will be implemented during the continuing construction efforts that are adding heavy-ion capability to our facility. Our efforts include the following computer and control system elements: a broadband local area network, which embodies modems, transmission systems, and branch interface units; a hierarchical layer, which performs certain database and watchdog/alarm functions; a group of workstation processors (Apollos), which perform the function of traditional minicomputer hosts; and a layer which provides both real-time control and standardization functions for accelerator devices and instrumentation. Database and other accelerator functionality is assigned to the most appropriate level within our network for real-time performance, long-term utility, and orderly growth.

  17. Understanding entangled cerebral networks: a prerequisite for restoring brain function with brain-computer interfaces

    PubMed Central

    Mandonnet, Emmanuel; Duffau, Hugues

    2014-01-01

    Historically, cerebral processing has been conceptualized as a framework based on statically localized functions. However, a growing body of evidence supports a hodotopical (delocalized) and flexible organization. A number of studies have reported absence of a permanent neurological deficit after massive surgical resections of eloquent brain tissue. These results highlight the tremendous plastic potential of the brain. Understanding anatomo-functional correlates underlying this cerebral reorganization is a prerequisite to restore brain functions through brain-computer interfaces (BCIs) in patients with cerebral diseases, or even to potentiate brain functions in healthy individuals. Here, we review current knowledge of neural networks that could be utilized in BCIs that enable movements and language. To this end, intraoperative electrical stimulation in awake patients provides valuable information on the cerebral functional maps, their connectomics and plasticity. Overall, these studies indicate that the complex cerebral circuitry that underpins interactions between action, cognition and behavior should be thoroughly investigated before progress in BCI approaches can be achieved. PMID:24834030

  18. Study of dust particle charging in weakly ionized inert gases taking into account the nonlocality of the electron energy distribution function

    SciTech Connect

    Filippov, A. V.; Dyatko, N. A.; Kostenko, A. S.

    2014-11-15

    The charging of dust particles in weakly ionized inert gases at atmospheric pressure has been investigated. The conditions under which the gas is ionized by an external source, a beam of fast electrons, are considered. The electron energy distribution function in argon, krypton, and xenon has been calculated for three rates of gas ionization by fast electrons: 10^13, 10^14, and 10^15 cm^−1. A model of dust particle charging with allowance for the nonlocal formation of the electron energy distribution function in the region of strong plasma quasi-neutrality violation around the dust particle is described. The nonlocality is taken into account in an approximation where the distribution function is a function of only the total electron energy. Comparative calculations of the dust particle charge with and without allowance for the nonlocality of the electron energy distribution function have been performed. Allowance for the nonlocality is shown to lead to a noticeable increase in the dust particle charge due to the influence of the group of hot electrons from the tail of the distribution function. It has been established that the screening constant virtually coincides with the smallest screening constant determined according to the asymptotic theory of screening with the electron transport and recombination coefficients in an unperturbed plasma.

  19. [Respiratory function evaluation in welders, taking into account the technological evolution of individual protection devices and risk-specific information].

    PubMed

    Boggia, B; Graziuso, G; Carbone, U

    2011-01-01

    The aim of the study is to evaluate the effect of a specific information program on the use of individual protection devices (DPI) on functional respiratory parameters in a group of 15 welders, compared with 18 welders not included in the program and 18 workers from the industrial sector. Spirometric parameters were recorded and compared; the results showed a significant increase in FEV1 and FVC in the study group compared with the welders outside the study, while no differences were observed between the study group and the industrial workers. The results show that correct use of DPI could reduce the effects of welding fumes on the respiratory tract, making these effects comparable to those of exposure to industrial dusts. PMID:23393790

  20. Study of space shuttle orbiter system management computer function. Volume 1: Analysis, baseline design

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A system analysis of the shuttle orbiter baseline system management (SM) computer function is performed. This analysis results in an alternative SM design which is also described. The alternative design exhibits several improvements over the baseline, some of which are increased crew usability, improved flexibility, and improved growth potential. The analysis consists of two parts: an application assessment and an implementation assessment. The former is concerned with the SM user needs and design functional aspects. The latter is concerned with design flexibility, reliability, growth potential, and technical risk. The system analysis is supported by several topical investigations. These include: treatment of false alarms, treatment of off-line items, significant interface parameters, and a design evaluation checklist. An in-depth formulation of techniques, concepts, and guidelines for design of automated performance verification is discussed.

  1. Computing exact p-values for a cross-correlation shotgun proteomics score function.

    PubMed

    Howbert, J Jeffry; Noble, William Stafford

    2014-09-01

    The core of every protein mass spectrometry analysis pipeline is a function that assesses the quality of a match between an observed spectrum and a candidate peptide. We describe a procedure for computing exact p-values for the oldest and still widely used score function, SEQUEST XCorr. The procedure uses dynamic programming to enumerate efficiently the full distribution of scores for all possible peptides whose masses are close to that of the spectrum precursor mass. Ranking identified spectra by p-value rather than XCorr significantly reduces variance because of spectrum-specific effects on the score. In combination with the Percolator postprocessor, the XCorr p-value yields more spectrum and peptide identifications at a fixed false discovery rate than Mascot, X!Tandem, Comet, and MS-GF+ across a variety of data sets. PMID:24895379
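    The dynamic-programming idea described, enumerating the exact score distribution over all peptides whose mass matches the precursor, can be sketched in miniature. The residue alphabet and integer score contributions below are toy assumptions, not SEQUEST's actual residue masses or XCorr scoring.

```python
from collections import defaultdict

# Hypothetical miniature alphabet: (integer mass, integer score contribution).
RESIDUES = [(57, 2), (71, 1), (87, 3)]

def score_distribution(target_mass):
    """dp[m] maps score -> number of peptides of total mass m with that score;
    dp is filled by convolving over the residue alphabet (dynamic programming)."""
    dp = [defaultdict(int) for _ in range(target_mass + 1)]
    dp[0][0] = 1  # the empty peptide
    for m in range(1, target_mass + 1):
        for rmass, rscore in RESIDUES:
            if m >= rmass:
                for s, count in dp[m - rmass].items():
                    dp[m][s + rscore] += count
    return dict(dp[target_mass])

def exact_p_value(target_mass, observed_score):
    """Fraction of all mass-matched peptides scoring at least observed_score."""
    dist = score_distribution(target_mass)
    total = sum(dist.values())
    extreme = sum(c for s, c in dist.items() if s >= observed_score)
    return extreme / total if total else None
```

    Ranking by this tail probability, rather than by the raw score, is what removes spectrum-specific score offsets in the approach the abstract describes.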

  2. Perturbation method to compute 1-D anisotropic P and S receiver functions

    NASA Astrophysics Data System (ADS)

    Çakır, Özcan

    2013-09-01

    We propose a new algorithm to compute teleseismic P and S receiver-function synthetics for a multilayered Cartesian structure with anisotropic flat layers. The algorithm is based on first-order perturbation theory, in which the layered background structure is assumed to be one-dimensional with isotropic variations in the vertical direction. Anisotropic velocity perturbations acting as secondary sources constitute the heterogeneities in the medium. The total wavefield is solved using a convolutional-type integral equation along with the Green's function of the one-dimensional reference medium, extracted using the reflectivity method. The integral equation involves a five-fold integration over the space and wavenumber domains. Four of these integrals are evaluated analytically; the fifth, a spatial integral in the vertical direction, is performed numerically, for which the Born single-scattering approximation suffices. The proposed algorithm is demonstrated on selected numerical examples adapted from published work in the literature.

  3. Distributed Accounting on the Grid

    NASA Technical Reports Server (NTRS)

    Thigpen, William; Hacker, Thomas J.; McGinnis, Laura F.; Athey, Brian D.

    2001-01-01

    By the late 1990s, the Internet was adequately equipped to move vast amounts of data between HPC (High Performance Computing) systems, and efforts were initiated to link the national infrastructure of high-performance computational and data storage resources into a general computational utility 'grid', analogous to the national electrical power grid. The purpose of the computational grid is to provide dependable, consistent, pervasive, and inexpensive access to computational resources for the computing community in the form of a computing utility. This paper presents a fully distributed view of Grid usage accounting and a methodology for allocating Grid computational resources for use on a Grid computing system.
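    As a loose illustration of usage accounting against allocations (a toy sketch under assumed semantics, not the paper's actual methodology), a ledger might debit per-account allocations as sites report usage records:

```python
class GridAllocationLedger:
    """Toy ledger: sites report per-job usage records, and CPU-hour
    allocations are debited per account. Names are illustrative."""

    def __init__(self):
        self.allocations = {}   # account -> remaining CPU-hours
        self.usage = []         # (account, site, cpu_hours)

    def grant(self, account, cpu_hours):
        self.allocations[account] = self.allocations.get(account, 0.0) + cpu_hours

    def record_usage(self, account, site, cpu_hours):
        if self.allocations.get(account, 0.0) < cpu_hours:
            raise ValueError(f"account {account!r} lacks allocation")
        self.allocations[account] -= cpu_hours
        self.usage.append((account, site, cpu_hours))

    def site_totals(self):
        """Aggregate reported usage by site, as a distributed accounting
        view would when reconciling records from many resources."""
        totals = {}
        for _, site, hours in self.usage:
            totals[site] = totals.get(site, 0.0) + hours
        return totals
```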

  4. Finite difference computation of head-related transfer function for human hearing

    NASA Astrophysics Data System (ADS)

    Xiao, Tian; Huo Liu, Qing

    2003-05-01

    Modeling the head-related transfer function (HRTF) is a key to many applications in spatial audio. To understand and predict the effects of head geometry and the surrounding environment on the HRTF, a three-dimensional finite-difference time domain model (3D FDTD) has been developed to simulate acoustic wave interaction with a human head. A perfectly matched layer (PML) is used to absorb outgoing waves at the truncated boundary of an unbounded medium. An external source is utilized to reduce the computational domain size through the scattered-field/total-field formulation. This numerical model has been validated by analytical solutions for a spherical head model. The 3D FDTD code is then used as a computational tool to predict the HRTF for various scenarios. In particular, a simplified spherical head model is compared to a realistic head model up to about 7 kHz. The HRTF is also computed for a realistic head model in the presence of a wall. It is demonstrated that this 3D FDTD model can be a useful tool for spatial audio applications.
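    The time-stepping at the core of an FDTD solver can be illustrated in one dimension (a minimal sketch of the leapfrog update for the scalar wave equation; the paper's model is a full 3D acoustic code with PML absorbing boundaries, which this does not reproduce):

```python
def step_wave(u_prev, u, courant=1.0):
    """One leapfrog step of the 1-D wave equation u_tt = c^2 u_xx with
    fixed ends. With Courant number 1 the update reduces to
    u_new[i] = u[i-1] + u[i+1] - u_prev[i], which is exact on the grid."""
    n = len(u)
    c2 = courant * courant
    u_new = [0.0] * n
    for i in range(1, n - 1):
        u_new[i] = 2 * u[i] - u_prev[i] + c2 * (u[i + 1] - 2 * u[i] + u[i - 1])
    return u_new
```

    In 3D the same idea applies field-point by field-point, which is why domain size (and hence the scattered-field/total-field reduction mentioned in the abstract) dominates the cost.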

  5. Finite difference computation of head-related transfer function for human hearing.

    PubMed

    Xiao, Tian; Liu, Qing Huo

    2003-05-01

    Modeling the head-related transfer function (HRTF) is a key to many applications in spatial audio. To understand and predict the effects of head geometry and the surrounding environment on the HRTF, a three-dimensional finite-difference time domain model (3D FDTD) has been developed to simulate acoustic wave interaction with a human head. A perfectly matched layer (PML) is used to absorb outgoing waves at the truncated boundary of an unbounded medium. An external source is utilized to reduce the computational domain size through the scattered-field/total-field formulation. This numerical model has been validated by analytical solutions for a spherical head model. The 3D FDTD code is then used as a computational tool to predict the HRTF for various scenarios. In particular, a simplified spherical head model is compared to a realistic head model up to about 7 kHz. The HRTF is also computed for a realistic head model in the presence of a wall. It is demonstrated that this 3D FDTD model can be a useful tool for spatial audio applications. PMID:12765362

  6. Morphological and Functional Evaluation of Quadricuspid Aortic Valves Using Cardiac Computed Tomography

    PubMed Central

    Song, Inyoung; Park, Jung Ah; Choi, Bo Hwa; Shin, Je Kyoun; Chee, Hyun Keun; Kim, Jun Seok

    2016-01-01

    Objective The aim of this study was to identify the morphological and functional characteristics of quadricuspid aortic valves (QAV) on cardiac computed tomography (CCT). Materials and Methods We retrospectively enrolled 11 patients with QAV. All patients underwent CCT and transthoracic echocardiography (TTE), and 7 patients underwent cardiovascular magnetic resonance (CMR). The presence and classification of QAV assessed by CCT was compared with that of TTE and intraoperative findings. The regurgitant orifice area (ROA) measured by CCT was compared with the severity of aortic regurgitation (AR) by TTE and the regurgitant fraction (RF) by CMR. Results All of the patients had AR; 9 had pure AR, 1 had combined aortic stenosis and regurgitation, and 1 had combined subaortic stenosis and regurgitation. Two patients had a subaortic fibrotic membrane, and 1 of them showed subaortic stenosis. One QAV was misdiagnosed as a tricuspid aortic valve on TTE. In accordance with the Hurwitz and Roberts classification, consensus was reached on the QAV classification between the CCT and TTE findings in 7 of 10 patients. The patients were classified as type A (n = 1), type B (n = 3), type C (n = 1), type D (n = 4), and type F (n = 2) on CCT. A very high correlation existed between ROA by CCT and RF by CMR (r = 0.99), while a good correlation existed between ROA by CCT and regurgitant severity by TTE (r = 0.62). Conclusion Cardiac computed tomography provides comprehensive anatomical and functional information about the QAV. PMID:27390538

  7. Poisson Green's function method for increased computational efficiency in numerical calculations of Coulomb coupling elements

    NASA Astrophysics Data System (ADS)

    Zimmermann, Anke; Kuhn, Sandra; Richter, Marten

    2016-01-01

    Often, the calculation of Coulomb coupling elements for quantum dynamical treatments, e.g., in cluster or correlation expansion schemes, requires the evaluation of a six dimensional spatial integral. Therefore, it represents a significant limiting factor in quantum mechanical calculations. If the size or the complexity of the investigated system increases, many coupling elements need to be determined. The resulting computational constraints require an efficient method for a fast numerical calculation of the Coulomb coupling. We present a computational method to reduce the numerical complexity by decreasing the number of spatial integrals for arbitrary geometries. We use a Green's function formulation of the Coulomb coupling and introduce a generalized scalar potential as solution of a generalized Poisson equation with a generalized charge density as the inhomogeneity. That enables a fast calculation of Coulomb coupling elements and, additionally, a straightforward inclusion of boundary conditions and arbitrarily spatially dependent dielectrics through the Coulomb Green's function. Particularly, if many coupling elements are included, the presented method, which is not restricted to specific symmetries of the model, presents a promising approach for increasing the efficiency of numerical calculations of the Coulomb interaction. To demonstrate the wide range of applications, we calculate internanostructure couplings, such as the Förster coupling, and illustrate the inclusion of symmetry considerations in the method for the Coulomb coupling between bound quantum dot states and unbound continuum states.
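    The reduction described, replacing the six-dimensional double integral with one Poisson solve plus a single three-dimensional integral, can be sketched on a periodic grid with an FFT-based Poisson solver (an illustrative simplification; the paper's method handles arbitrary boundary conditions and spatially dependent dielectrics through the Coulomb Green's function, which this periodic sketch does not):

```python
import numpy as np

def coulomb_coupling(rho1, rho2, L):
    """Coupling V = ∫ rho1(r) φ2(r) d³r, where φ2 solves the periodic
    Poisson equation ∇²φ2 = -4π rho2 (Gaussian units) on an n³ grid of
    box length L. One FFT Poisson solve replaces the 6-D double integral."""
    n = rho1.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    rho2_k = np.fft.fftn(rho2)
    # Drop the k = 0 (mean-field) mode, as is standard for periodic Coulomb.
    phi_k = np.where(k2 > 0, 4 * np.pi * rho2_k / np.maximum(k2, 1e-300), 0.0)
    phi = np.real(np.fft.ifftn(phi_k))
    dV = (L / n) ** 3
    return float(np.sum(rho1 * phi) * dV)
```

    Because the Poisson solve is reused, computing couplings of many densities `rho1` against one `rho2` costs one 3-D integral each instead of a 6-D integral each.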

  8. Acidity of the amidoxime functional group in aqueous solution: a combined experimental and computational study.

    PubMed

    Mehio, Nada; Lashely, Mark A; Nugent, Joseph W; Tucker, Lyndsay; Correia, Bruna; Do-Thanh, Chi-Linh; Dai, Sheng; Hancock, Robert D; Bryantsev, Vyacheslav S

    2015-02-26

    Poly(acrylamidoxime) adsorbents are often invoked in discussions of mining uranium from seawater. While the amidoxime-uranyl chelation mode has been established, a number of essential binding constants remain unclear. This is largely due to the wide range of conflicting pKa values that have been reported for the amidoxime functional group. To resolve this existing controversy we investigated the pKa values of the amidoxime functional group using a combination of experimental and computational methods. Experimentally, we used spectroscopic titrations to measure the pKa values of representative amidoximes, acetamidoxime, and benzamidoxime. Computationally, we report on the performance of several protocols for predicting the pKa values of aqueous oxoacids. Calculations carried out at the MP2 or M06-2X levels of theory combined with solvent effects calculated using the SMD model provide the best overall performance, with a root-mean-square deviation of 0.46 pKa units and 0.45 pKa units, respectively. Finally, we employ our two best methods to predict the pKa values of promising, uncharacterized amidoxime ligands, which provides a convenient means for screening suitable amidoxime monomers for future generations of poly(acrylamidoxime) adsorbents. PMID:25621618

  9. Rsite2: an efficient computational method to predict the functional sites of noncoding RNAs.

    PubMed

    Zeng, Pan; Cui, Qinghua

    2016-01-01

    Noncoding RNAs (ncRNAs) represent a big class of important RNA molecules. Given the large number of ncRNAs, identifying their functional sites is becoming one of the most important topics in the post-genomic era, but available computational methods are limited. For the above purpose, we previously presented a tertiary structure based method, Rsite, which first calculates the distance metrics defined in Methods with the tertiary structure of an ncRNA and then identifies the nucleotides located within the extreme points in the distance curve as the functional sites of the given ncRNA. However, the application of Rsite is largely limited because of limited RNA tertiary structures. Here we present a secondary structure based computational method, Rsite2, based on the observation that the secondary structure based nucleotide distance is strongly positively correlated with that derived from tertiary structure. This makes it reasonable to replace tertiary structure with secondary structure, which is much easier to obtain and process. Moreover, we applied Rsite2 to three ncRNAs (tRNA (Lys), Diels-Alder ribozyme, and RNase P) and a list of human mitochondria transcripts. The results show that Rsite2 works well with nearly equivalent accuracy as Rsite but is much more feasible and efficient. Finally, a web-server, the source codes, and the dataset of Rsite2 are available at http://www.cuialb.cn/rsite2. PMID:26751501
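    The extreme-point idea, flagging nucleotides at local extrema of a distance curve as candidate functional sites, can be sketched as follows; `functional_site_candidates` and the toy distance values are illustrative names and data, not Rsite2's actual metrics:

```python
def functional_site_candidates(dist, window=1):
    """Return (index, kind) pairs for local extreme points of a
    nucleotide-distance curve; a toy stand-in for the extreme-point
    detection that Rsite/Rsite2 use to flag candidate functional sites."""
    sites = []
    for i in range(window, len(dist) - window):
        left = dist[i - window:i]
        right = dist[i + 1:i + 1 + window]
        if all(dist[i] > v for v in left + right):
            sites.append((i, "max"))
        elif all(dist[i] < v for v in left + right):
            sites.append((i, "min"))
    return sites
```

    The abstract's key observation is that this curve can be computed from secondary structure alone, since the secondary-structure distance correlates strongly with the tertiary-structure distance.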

  10. Functional neuroanatomy of remote episodic, semantic and spatial memory: a unified account based on multiple trace theory

    PubMed Central

    Moscovitch, Morris; Rosenbaum, R Shayna; Gilboa, Asaf; Addis, Donna Rose; Westmacott, Robyn; Grady, Cheryl; McAndrews, Mary Pat; Levine, Brian; Black, Sandra; Winocur, Gordon; Nadel, Lynn

    2005-01-01

    We review lesion and neuroimaging evidence on the role of the hippocampus, and other structures, in retention and retrieval of recent and remote memories. We examine episodic, semantic and spatial memory, and show that important distinctions exist among different types of these memories and the structures that mediate them. We argue that retention and retrieval of detailed, vivid autobiographical memories depend on the hippocampal system no matter how long ago they were acquired. Semantic memories, on the other hand, benefit from hippocampal contribution for some time before they can be retrieved independently of the hippocampus. Even semantic memories, however, can have episodic elements associated with them that continue to depend on the hippocampus. Likewise, we distinguish between experientially detailed spatial memories (akin to episodic memory) and more schematic memories (akin to semantic memory) that are sufficient for navigation but not for re-experiencing the environment in which they were acquired. Like their episodic and semantic counterparts, the former type of spatial memory is dependent on the hippocampus no matter how long ago it was acquired, whereas the latter can survive independently of the hippocampus and is represented in extra-hippocampal structures. In short, the evidence reviewed suggests strongly that the function of the hippocampus (and possibly that of related limbic structures) is to help encode, retain, and retrieve experiences, no matter how long ago the events comprising the experience occurred, and no matter whether the memories are episodic or spatial. We conclude that the evidence favours a multiple trace theory (MTT) of memory over two other models: (1) traditional consolidation models which posit that the hippocampus is a time-limited memory structure for all forms of memory; and (2) versions of cognitive map theory which posit that the hippocampus is needed for representing all forms of allocentric space in memory. PMID

  11. Accounting Systems for School Districts.

    ERIC Educational Resources Information Center

    Atwood, E. Barrett, Jr.

    1983-01-01

    Advises careful analysis and improvement of existing school district accounting systems prior to investment in new ones. Emphasizes the importance of attracting and maintaining quality financial staffs, developing an accounting policies and procedures manual, and designing a good core accounting system before purchasing computer hardware and…

  12. Model Accounting Program. Adopters Guide.

    ERIC Educational Resources Information Center

    Beaverton School District 48, OR.

    The accounting cluster demonstration project conducted at Aloha High School in the Beaverton, Oregon, school district developed a model curriculum for high school accounting. The curriculum is based on interviews with professionals in the accounting field and emphasizes the use of computers. It is suitable for use with special needs students as…

  13. Multiscale Theoretical and Computational Modeling of the Synthesis, Structure and Performance of Functional Carbon Materials

    NASA Astrophysics Data System (ADS)

    Mushrif, Samir Hemant

    2010-09-01

    Functional carbon-based/supported materials, including those doped with transition metal, are widely applied in hydrogen mediated catalysis and are currently being designed for hydrogen storage applications. This thesis focuses on acquiring a fundamental understanding and quantitative characterization of: (i) the chemistry of their synthesis procedure, (ii) their microstructure and chemical composition and (iii) their functionality, using multiscale modeling and simulation methodologies. Palladium and palladium(II) acetylacetonate are the transition metal and its precursor of interest, respectively. A first-principles modeling approach consisting of the planewave-pseudopotential implementation of the Kohn-Sham density functional theory, combined with the Car-Parrinello molecular dynamics, is implemented to model the palladium doping step in the synthesis of carbon-based/supported material and its interaction with hydrogen. The electronic structure is analyzed using the electron localization function and, when required, the hydrogen interaction dynamics are accelerated and the energetics are computed using the metadynamics technique. Palladium pseudopotentials are tested and validated for their use in a hydrocarbon environment by successfully computing the experimentally observed crystal structure of palladium(II) acetylacetonate. Long-standing hypotheses related to the palladium doping process are confirmed and new fundamental insights about its molecular chemistry are revealed. The dynamics, mechanism and energy landscape and barriers of hydrogen adsorption and migration on and desorption from the carbon-based/supported palladium clusters are reported for the first time. The effects of palladium doping and of the synthesis procedure on the pore structure of palladium-doped activated carbon fibers are quantified by applying novel statistical mechanical based methods to the experimental physisorption isotherms. 
The drawbacks of the conventional adsorption-based pore

  14. A Computational Method Designed to Aid in the Teaching of Copolymer Composition and Microstructure as a Function of Conversion.

    ERIC Educational Resources Information Center

    Coleman, M. M.; Varnell, W. D.

    1982-01-01

    Describes a computer program (FORTRAN and APPLESOFT) demonstrating the effect of copolymer composition as a function of conversion, providing theoretical background and examples of types of information gained from computer calculations. Suggests that the program enhances undergraduate students' understanding of basic copolymerization theory.…
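    The underlying chemistry, instantaneous copolymer composition from the Mayo-Lewis equation and feed drift with conversion, can be sketched in Python (a simple Euler integration of the standard drift equation df1/dX = (f1 − F1)/(1 − X); this is an independent sketch, not the original FORTRAN/Applesoft program):

```python
def copolymer_drift(f1_0, r1, r2, steps=1000, x_max=0.9):
    """Track monomer-1 feed fraction f1 and instantaneous copolymer
    fraction F1 versus total molar conversion X, using the Mayo-Lewis
    equation and the feed-drift ODE df1/dX = (f1 - F1)/(1 - X)."""
    f1 = f1_0
    dx = x_max / steps
    trajectory = []
    for i in range(steps):
        x = i * dx
        f2 = 1.0 - f1
        # Mayo-Lewis instantaneous copolymer composition
        F1 = (r1 * f1 * f1 + f1 * f2) / (r1 * f1 * f1 + 2 * f1 * f2 + r2 * f2 * f2)
        trajectory.append((x, f1, F1))
        f1 += (f1 - F1) / (1.0 - x) * dx  # Euler step of the drift ODE
        f1 = min(max(f1, 0.0), 1.0)
    return trajectory
```

    A quick sanity check: for an ideal random copolymerization (r1 = r2 = 1) the Mayo-Lewis equation gives F1 = f1, so the feed composition should not drift with conversion.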

  15. Application of the new neutron monitor yield function computed for different altitudes to an analysis of GLEs

    NASA Astrophysics Data System (ADS)

    Mishev, Alexander; Usoskin, Ilya

    2016-07-01

    A precise analysis of SEP (solar energetic particle) spectral and angular characteristics using neutron monitor (NM) data requires realistic modeling of the propagation of those particles in the Earth's magnetosphere and atmosphere. On the basis of a method comprising a sequence of consecutive steps, namely a detailed computation of the SEP asymptotic cones of acceptance and application of a neutron monitor yield function with a convenient optimization procedure, we derived the rigidity spectra and anisotropy characteristics of several major GLEs. Here we present several major GLEs of solar cycle 23: the Bastille Day event on 14 July 2000 (GLE 59), GLE 69 on 20 January 2005, and GLE 70 on 13 December 2006. The SEP spectra and pitch-angle distributions were computed in their dynamical development. For the computation we use the newly computed yield function of the standard 6NM64 neutron monitor for primary proton and alpha CR nuclei. In addition, we present new computations of the NM yield function for altitudes of 3000 m and 5000 m above sea level. The computations were carried out with the Planetocosmics and CORSIKA codes as standardized Monte Carlo tools for atmospheric cascade simulations. The flux of secondary neutrons and protons was computed using the Planetocosmics code, applying a realistic curved atmosphere. Updated information concerning the NM registration efficiency for secondary neutrons and protons was used. The derived results for spectral and angular characteristics using the newly computed NM yield function at several altitudes are compared with those previously obtained using the double attenuation method.

  16. Intersections between the Autism Spectrum and the Internet: Perceived Benefits and Preferred Functions of Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Gillespie-Lynch, Kristen; Kapp, Steven K.; Shane-Simpson, Christina; Smith, David Shane; Hutman, Ted

    2014-01-01

    An online survey compared the perceived benefits and preferred functions of computer-mediated communication of participants with (N = 291) and without ASD (N = 311). Participants with autism spectrum disorder (ASD) perceived benefits of computer-mediated communication in terms of increased comprehension and control over communication, access to…

  17. Computer Simulations Reveal Multiple Functions for Aromatic Residues in Cellulase Enzymes (Fact Sheet)

    SciTech Connect

    Not Available

    2012-07-01

    NREL researchers use high-performance computing to demonstrate fundamental roles of aromatic residues in cellulase enzyme tunnels. National Renewable Energy Laboratory (NREL) computer simulations of a key industrial enzyme, the Trichoderma reesei Family 6 cellulase (Cel6A), predict that aromatic residues near the enzyme's active site and at the entrance and exit tunnel perform different functions in substrate binding and catalysis, depending on their location in the enzyme. These results suggest that nature employs aromatic-carbohydrate interactions with a wide variety of binding affinities for diverse functions. Outcomes also suggest that protein engineering strategies in which mutations are made around the binding sites may require tailoring specific to the enzyme family. Cellulase enzymes ubiquitously exhibit tunnels or clefts lined with aromatic residues for processing carbohydrate polymers to monomers, but the molecular-level role of these aromatic residues remains unknown. In silico mutation of the aromatic residues near the catalytic site of Cel6A has little impact on the binding affinity, but simulation suggests that these residues play a major role in the glucopyranose ring distortion necessary for cleaving glycosidic bonds to produce fermentable sugars. Removal of aromatic residues at the entrance and exit of the cellulase tunnel, however, dramatically impacts the binding affinity. This suggests that these residues play a role in acquiring cellulose chains from the cellulose crystal and stabilizing the reaction product, respectively. These results illustrate that the role of aromatic-carbohydrate interactions varies dramatically depending on the position in the enzyme tunnel. 
As aromatic-carbohydrate interactions are present in all carbohydrate-active enzymes, the results have implications for understanding protein structure-function relationships in carbohydrate metabolism and recognition, carbon turnover in nature, and protein engineering strategies for

  18. Acidity of the amidoxime functional group in aqueous solution. A combined experimental and computational study

    SciTech Connect

    Mehio, Nada; Lashely, Mark A.; Nugent, Joseph W.; Tucker, Lyndsay; Correia, Bruna; Do-Thanh, Chi-Linh; Dai, Sheng; Hancock, Robert D.; Bryantsev, Vyacheslav S.

    2015-01-26

    Poly(acrylamidoxime) adsorbents are often invoked in discussions of mining uranium from seawater. It has been demonstrated repeatedly in the literature that the success of these materials is due to the amidoxime functional group. While the amidoxime-uranyl chelation mode has been established, a number of essential binding constants remain unclear. This is largely due to the wide range of conflicting pKa values that have been reported for the amidoxime functional group in the literature. To resolve this existing controversy we investigated the pKa values of the amidoxime functional group using a combination of experimental and computational methods. Experimentally, we used spectroscopic titrations to measure the pKa values of representative amidoximes, acetamidoxime and benzamidoxime. Computationally, we report on the performance of several protocols for predicting the pKa values of aqueous oxoacids. Calculations carried out at the MP2 or M06-2X levels of theory combined with solvent effects calculated using the SMD model provide the best overall performance with a mean absolute error of 0.33 pKa units and 0.35 pKa units, respectively, and a root mean square deviation of 0.46 pKa units and 0.45 pKa units, respectively. Finally, we employ our two best methods to predict the pKa values of promising, uncharacterized amidoxime ligands. Hence, our study provides a convenient means for screening suitable amidoxime monomers for future generations of poly(acrylamidoxime) adsorbents used to mine uranium from seawater.

  19. Acidity of the amidoxime functional group in aqueous solution. A combined experimental and computational study

    DOE PAGESBeta

    Mehio, Nada; Lashely, Mark A.; Nugent, Joseph W.; Tucker, Lyndsay; Correia, Bruna; Do-Thanh, Chi-Linh; Dai, Sheng; Hancock, Robert D.; Bryantsev, Vyacheslav S.

    2015-01-26

Poly(acrylamidoxime) adsorbents are often invoked in discussions of mining uranium from seawater. It has been demonstrated repeatedly in the literature that the success of these materials is due to the amidoxime functional group. While the amidoxime-uranyl chelation mode has been established, a number of essential binding constants remain unclear. This is largely due to the wide range of conflicting pKa values that have been reported for the amidoxime functional group in the literature. To resolve this existing controversy we investigated the pKa values of the amidoxime functional group using a combination of experimental and computational methods. Experimentally, we used spectroscopic titrations to measure the pKa values of representative amidoximes, acetamidoxime and benzamidoxime. Computationally, we report on the performance of several protocols for predicting the pKa values of aqueous oxoacids. Calculations carried out at the MP2 or M06-2X levels of theory combined with solvent effects calculated using the SMD model provide the best overall performance with a mean absolute error of 0.33 pKa units and 0.35 pKa units, respectively, and a root mean square deviation of 0.46 pKa units and 0.45 pKa units, respectively. Finally, we employ our two best methods to predict the pKa values of promising, uncharacterized amidoxime ligands. Hence, our study provides a convenient means for screening suitable amidoxime monomers for future generations of poly(acrylamidoxime) adsorbents used to mine uranium from seawater.

  20. ABINIT: Plane-Wave-Based Density-Functional Theory on High Performance Computers

    NASA Astrophysics Data System (ADS)

    Torrent, Marc

    2014-03-01

For several years, a continuous effort has been made to adapt electronic structure codes based on Density-Functional Theory to future computing architectures. Among these codes, ABINIT is based on a plane-wave description of the wave functions, which allows it to treat systems of any kind. Porting such a code to petascale architectures poses difficulties related to the many-body nature of the DFT equations. To improve the performance of ABINIT - especially for standard LDA/GGA ground-state and response-function calculations - several strategies have been followed: A full multi-level MPI parallelisation scheme has been implemented, exploiting all possible levels and distributing both computation and memory. It increases the number of distributed processes and could not have been achieved without a strong restructuring of the code. The core algorithm used to solve the eigenproblem (``Locally Optimal Blocked Conjugate Gradient''), a Blocked-Davidson-like algorithm, is based on a distribution of processes combining plane waves and bands. In addition to the distributed-memory parallelization, a full hybrid scheme has been implemented, using standard shared-memory directives (OpenMP/OpenACC) or porting some time-consuming code sections to Graphics Processing Units (GPUs). As no simple performance model exists, the complexity of use has increased; the code efficiency strongly depends on the distribution of processes among the numerous levels. ABINIT is able to predict the performance of several process distributions and automatically choose the most favourable one. In parallel, a major effort has been carried out to analyse the performance of the code on petascale architectures, showing which sections of code have to be improved; they are all related to matrix algebra (diagonalisation, orthogonalisation). The different strategies employed to improve the code's scalability will be described.
They are based on an exploration of new diagonalization

  1. Comparison of measured and computed phase functions of individual tropospheric ice crystals

    NASA Astrophysics Data System (ADS)

    Stegmann, Patrick G.; Tropea, Cameron; Järvinen, Emma; Schnaiter, Martin

    2016-07-01

Airplanes passing through the incuda (Latin: anvils) regions of tropical cumulonimbus clouds are at risk of suffering an engine power-loss event and engine damage due to ice ingestion (Mason et al., 2006 [1]). Research in this field relies on optical measurement methods to characterize ice crystals; however, the design and implementation of such methods presently suffer from the lack of reliable and efficient means of predicting the light scattering from ice crystals. The nascent discipline of direct measurement of phase functions of ice crystals, in conjunction with particle imaging and forward modelling through geometrical-optics-derived and transition-matrix codes, for the first time allows us to obtain a deeper understanding of the optical properties of real tropospheric ice crystals. In this manuscript, a sample phase function obtained via the Particle Habit Imaging and Polar Scattering (PHIPS) probe during a measurement campaign in flight over Brazil will be compared to three different light scattering codes. These include a newly developed first-order geometrical optics code taking into account the influence of the Gaussian beam illumination used in the PHIPS device, as well as the reference ray-tracing code of Macke and the T-matrix code of Kahnert.

  2. [Computer-assisted phonetography as a diagnostic aid in functional dysphonia].

    PubMed

    Airainer, R; Klingholz, F

    1991-07-01

    A total of 160 voice-trained and untrained subjects with functional dysphonia were given a "clinical rating" according to their clinical findings. This was a certain value on a scale that recorded the degree of functional voice disorder ranging from a marked hypofunction to an extreme hyperfunction. The phonetograms of these patients were approximated by ellipses, whereby the definition and quantitative recording of several phonetogram parameters were rendered possible. By means of a linear combination of phonetogram parameters, a "calculated assessment" was obtained for each patient that was expected to tally with the "clinical rating". This paper demonstrates that a graduation of the dysphonic clinical picture with regard to the presence of hypofunctional or hyperfunctional components is possible via computerised phonetogram evaluation. In this case, the "calculated assessments" for both male and female singers and non-singers must be computed using different linear combinations. The method can be introduced as a supplementary diagnostic procedure in the diagnosis of functional dysphonia. PMID:1910366

  3. Novel hold-release functionality in a P300 brain-computer interface

    NASA Astrophysics Data System (ADS)

    Alcaide-Aguirre, R. E.; Huggins, J. E.

    2014-12-01

    Assistive technology control interface theory describes interface activation and interface deactivation as distinct properties of any control interface. Separating control of activation and deactivation allows precise timing of the duration of the activation. Objective. We propose a novel P300 brain-computer interface (BCI) functionality with separate control of the initial activation and the deactivation (hold-release) of a selection. Approach. Using two different layouts and off-line analysis, we tested the accuracy with which subjects could (1) hold their selection and (2) quickly change between selections. Main results. Mean accuracy across all subjects for the hold-release algorithm was 85% with one hold-release classification and 100% with two hold-release classifications. Using a layout designed to lower perceptual errors, accuracy increased to a mean of 90% and the time subjects could hold a selection was 40% longer than with the standard layout. Hold-release functionality provides improved response time (6-16 times faster) over the initial P300 BCI selection by allowing the BCI to make hold-release decisions from very few flashes instead of after multiple sequences of flashes. Significance. For the BCI user, hold-release functionality allows for faster, more continuous control with a P300 BCI, creating new options for BCI applications.

  4. Accountability Overboard

    ERIC Educational Resources Information Center

    Chieppo, Charles D.; Gass, James T.

    2009-01-01

    This article reports that special interest groups opposed to charter schools and high-stakes testing have hijacked Massachusetts's once-independent board of education and stand poised to water down the Massachusetts Comprehensive Assessment System (MCAS) tests and the accountability system they support. President Barack Obama and Massachusetts…

  5. Computational genomic identification and functional reconstitution of plant natural product biosynthetic pathways.

    PubMed

    Medema, Marnix H; Osbourn, Anne

    2016-08-27

Covering: 2003 to 2016. The last decade has seen the first major discoveries regarding the genomic basis of plant natural product biosynthetic pathways. Four key computationally driven strategies have been developed to identify such pathways, which make use of physical clustering, co-expression, evolutionary co-occurrence and epigenomic co-regulation of the genes involved in producing a plant natural product. Here, we discuss how these approaches can be used for the discovery of plant biosynthetic pathways encoded by both chromosomally clustered and non-clustered genes. Additionally, we will discuss opportunities to prioritize plant gene clusters for experimental characterization, and end with a forward-looking perspective on how synthetic biology technologies will allow effective functional reconstitution of candidate pathways using a variety of genetic systems. PMID:27321668

  6. Computing frequency by using generalized zero-crossing applied to intrinsic mode functions

    NASA Technical Reports Server (NTRS)

    Huang, Norden E. (Inventor)

    2006-01-01

This invention presents a method for computing Instantaneous Frequency by applying Empirical Mode Decomposition to a signal and using Generalized Zero-Crossing (GZC) and Extrema Sifting. The GZC approach is the most direct, local, and also the most accurate in the mean. Furthermore, this approach will also give a statistical measure of the scattering of the frequency value. For most practical applications, this mean frequency localized down to a quarter of a wave period is already a well-accepted result. As this method physically measures the period, or part of it, the values obtained can serve as the best local mean over the period to which it applies. Through Extrema Sifting, instead of the cubic spline fitting, this invention constructs the upper envelope and the lower envelope by connecting local maxima points and local minima points of the signal with straight lines, respectively, when extracting a collection of Intrinsic Mode Functions (IMFs) from a signal under consideration.
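As a rough illustration of the zero-crossing idea behind this method, the sketch below estimates a mean frequency from sign changes in a sampled oscillatory signal. The function name and the simple mean-half-period rule are illustrative assumptions, not the patented GZC algorithm itself:

```python
import numpy as np

def zero_crossing_frequency(signal, fs):
    """Estimate the mean frequency of an oscillatory signal (e.g. an IMF)
    from its zero crossings: successive crossings span half a period.
    A simplified sketch of the generalized zero-crossing idea."""
    s = np.asarray(signal, dtype=float)
    # indices where the sign changes between consecutive samples
    crossings = np.nonzero(np.diff(np.signbit(s).astype(np.int8)))[0]
    if len(crossings) < 2:
        return 0.0
    # mean half-period in samples -> frequency in Hz
    half_period = np.mean(np.diff(crossings))
    return fs / (2.0 * half_period)

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
f_est = zero_crossing_frequency(np.sin(2 * np.pi * 5 * t), fs)  # ~5 Hz
```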

  7. Localization of functional adrenal tumors by computed tomography and venous sampling

    SciTech Connect

    Dunnick, N.R.; Doppman, J.L.; Gill, J.R. Jr.; Strott, C.A.; Keiser, H.R.; Brennan, M.F.

    1982-02-01

Fifty-eight patients with functional lesions of the adrenal glands underwent radiographic evaluation. Twenty-eight patients had primary aldosteronism (Conn syndrome), 20 had Cushing syndrome, and 10 had pheochromocytoma. Computed tomography (CT) correctly identified adrenal tumors in 11 (61%) of 18 patients with aldosteronomas, 6 of 6 patients with benign cortisol-producing adrenal tumors, and 5 (83%) of 6 patients with pheochromocytomas. No false-positive diagnoses were encountered among patients with adrenal adenomas. Bilateral adrenal hyperplasia appeared on CT scans as normal or prominent adrenal glands with a normal configuration; however, CT was not able to exclude the presence of small adenomas. Adrenal venous sampling was correct in each case, and reliably distinguished adrenal tumors from hyperplasia. Recurrent pheochromocytomas were the most difficult to localize on CT due to the surgical changes in the region of the adrenals and the frequent extra-adrenal locations.

  8. Computational design of intrinsic molecular rectifiers based on asymmetric functionalization of N-phenylbenzamide

    DOE PAGESBeta

    Ding, Wendu; Koepf, Matthieu; Koenigsmann, Christopher; Batra, Arunabh; Venkataraman, Latha; Negre, Christian F. A.; Brudvig, Gary W.; Crabtree, Robert H.; Schmuttenmaer, Charles A.; Batista, Victor S.

    2015-11-03

Here, we report a systematic computational search of molecular frameworks for intrinsic rectification of electron transport. The screening of molecular rectifiers includes 52 molecules and conformers spanning over 9 series of structural motifs. N-Phenylbenzamide is found to be a promising framework with both suitable conductance and rectification properties. A targeted screening performed on 30 additional derivatives and conformers of N-phenylbenzamide yielded enhanced rectification based on asymmetric functionalization. We demonstrate that electron-donating substituent groups that maintain an asymmetric distribution of charge in the dominant transport channel (e.g., HOMO) enhance rectification by raising the channel closer to the Fermi level. These findings are particularly valuable for the design of molecular assemblies that could ensure directionality of electron transport in a wide range of applications, from molecular electronics to catalytic reactions.

  9. Computational design of intrinsic molecular rectifiers based on asymmetric functionalization of N-phenylbenzamide

    SciTech Connect

    Ding, Wendu; Koepf, Matthieu; Koenigsmann, Christopher; Batra, Arunabh; Venkataraman, Latha; Negre, Christian F. A.; Brudvig, Gary W.; Crabtree, Robert H.; Schmuttenmaer, Charles A.; Batista, Victor S.

    2015-11-03

    Here, we report a systematic computational search of molecular frameworks for intrinsic rectification of electron transport. The screening of molecular rectifiers includes 52 molecules and conformers spanning over 9 series of structural motifs. N-Phenylbenzamide is found to be a promising framework with both suitable conductance and rectification properties. A targeted screening performed on 30 additional derivatives and conformers of N-phenylbenzamide yielded enhanced rectification based on asymmetric functionalization. We demonstrate that electron-donating substituent groups that maintain an asymmetric distribution of charge in the dominant transport channel (e.g., HOMO) enhance rectification by raising the channel closer to the Fermi level. These findings are particularly valuable for the design of molecular assemblies that could ensure directionality of electron transport in a wide range of applications, from molecular electronics to catalytic reactions.

  10. Computed myography: three-dimensional reconstruction of motor functions from surface EMG data

    NASA Astrophysics Data System (ADS)

    van den Doel, Kees; Ascher, Uri M.; Pai, Dinesh K.

    2008-12-01

We describe a methodology called computed myography to qualitatively and quantitatively determine the activation level of individual muscles by voltage measurements from an array of voltage sensors on the skin surface. A finite element model for electrostatics simulation is constructed from morphometric data. For the inverse problem, we utilize a generalized Tikhonov regularization. This imposes smoothness on the reconstructed sources inside the muscles and suppresses sources outside the muscles using a penalty term. Results from experiments with simulated and human data are presented for activation reconstructions of three muscles in the upper arm (biceps brachii, brachialis and triceps). This approach potentially offers a new clinical tool to sensitively assess muscle function in patients suffering from neurological disorders (e.g., spinal cord injury), and could more accurately guide advances in the evaluation of specific rehabilitation training regimens.
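The generalized Tikhonov step described above can be illustrated on a toy linear problem. Here `tikhonov_solve`, the random forward operator, and the identity regularizer are all hypothetical stand-ins for the finite-element forward model and muscle-specific penalty used in the paper:

```python
import numpy as np

def tikhonov_solve(A, b, L, lam):
    """Generalized Tikhonov regularization: minimize
    ||A x - b||^2 + lam * ||L x||^2, solved via the normal equations.
    A maps internal sources to surface voltages in this toy setup."""
    lhs = A.T @ A + lam * (L.T @ L)
    rhs = A.T @ b
    return np.linalg.solve(lhs, rhs)

# Toy problem: identity regularizer, random overdetermined forward operator
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = rng.standard_normal(10)
b = A @ x_true                     # noiseless "surface measurements"
x_hat = tikhonov_solve(A, b, np.eye(10), lam=1e-6)
```

With noisy data, `lam` trades data fit against smoothness of the reconstruction.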

  11. Functional near-infrared spectroscopy for adaptive human-computer interfaces

    NASA Astrophysics Data System (ADS)

    Yuksel, Beste F.; Peck, Evan M.; Afergan, Daniel; Hincks, Samuel W.; Shibata, Tomoki; Kainerstorfer, Jana; Tgavalekos, Kristen; Sassaroli, Angelo; Fantini, Sergio; Jacob, Robert J. K.

    2015-03-01

We present a brain-computer interface (BCI) that detects, analyzes and responds to user cognitive state in real-time using machine learning classifications of functional near-infrared spectroscopy (fNIRS) data. Our work is aimed at increasing the narrow communication bandwidth between the human and computer by implicitly measuring users' cognitive state without any additional effort on the part of the user. Traditionally, BCIs have been designed to explicitly send signals as the primary input. However, such systems are usually designed for people with severe motor disabilities and are too slow and inaccurate for the general population. In this paper, we demonstrate with previous work [1] that a BCI that implicitly measures cognitive workload can improve user performance and awareness compared to a control condition by adapting to user cognitive state in real-time. We also discuss some of the other applications we have used in this field to measure and respond to cognitive states such as cognitive workload, multitasking, and user preference.

  12. Training Older Adults to Use Tablet Computers: Does It Enhance Cognitive Function?

    PubMed Central

    Chan, Micaela Y.; Haber, Sara; Drew, Linda M.; Park, Denise C.

    2016-01-01

Purpose of the Study: Recent evidence shows that engaging in learning new skills improves episodic memory in older adults. In this study, older adults who were computer novices were trained to use a tablet computer and associated software applications. We hypothesize that sustained engagement in this mentally challenging training would yield a dual benefit of improved cognition and enhancement of everyday function by introducing useful skills. Design and Methods: A total of 54 older adults (age 60-90) committed 15 hr/week for 3 months. Eighteen participants received extensive iPad training, learning a broad range of practical applications. The iPad group was compared with 2 separate controls: a Placebo group that engaged in passive tasks requiring little new learning; and a Social group that had regular social interaction, but no active skill acquisition. All participants completed the same cognitive battery pre- and post-engagement. Results: Compared with both controls, the iPad group showed greater improvements in episodic memory and processing speed but did not differ in mental control or visuospatial processing. Implications: iPad training improved cognition relative to engaging in social or nonchallenging activities. Mastering relevant technological devices has the added advantage of providing older adults with technological skills useful in facilitating everyday activities (e.g., banking). This work informs the selection of targeted activities for future interventions and community programs. PMID:24928557

  13. Development of computer-aided functions in clinical neurosurgery with PACS

    NASA Astrophysics Data System (ADS)

    Mukasa, Minoru; Aoki, Makoto; Satoh, Minoru; Kowada, Masayoshi; Kikuchi, K.

    1991-07-01

The introduction of the Picture Archiving and Communications System (PACS) provides many benefits, including the application of CAD (computer-aided diagnosis). Clinically, this allows the measurement and design of an operation to be completed easily on the CRT monitors of PACS rather than on film, as has been customary in the past. Under the leadership of the Department of Neurosurgery, Akita University School of Medicine, and the Southern Tohoku Research Institute for Neuroscience, Koriyama, new computer-aided functions have been developed with EFPACS (Fuji Electric's PACS) for use in clinical neurosurgery. This image processing is composed of three parts: (1) automatic mapping of small lesions depicted on Magnetic Resonance (MR) images onto a brain atlas; (2) superimposition of two angiographic films into a single synthesized image; (3) automatic mapping of a lesion's position (as shown on CT images) onto the image produced in part (2). The processing in part (1) provides a reference for anatomical estimation. The processing in part (2) is used for general analysis of the condition of a disease. The processing in part (3) is used to design the operation. This image processing is currently being used with good results.

  14. A computational approach to identify genes for functional RNAs in genomic sequences

    PubMed Central

    Carter, Richard J.; Dubchak, Inna; Holbrook, Stephen R.

    2001-01-01

Currently there is no successful computational approach for identification of genes encoding novel functional RNAs (fRNAs) in genomic sequences. We have developed a machine learning approach using neural networks and support vector machines to extract common features among known RNAs for prediction of new RNA genes in the unannotated regions of prokaryotic and archaeal genomes. The Escherichia coli genome was used for development, but we have applied this method to several other bacterial and archaeal genomes. Networks based on nucleotide composition were 80-90% accurate in jackknife testing experiments for bacteria and 90-99% for hyperthermophilic archaea. We also achieved a significant improvement in accuracy by combining these predictions with those obtained using a second set of parameters consisting of known RNA sequence motifs and the calculated free energy of folding. Several known fRNAs not included in the training datasets were identified, as well as several hundred predicted novel RNAs. These studies indicate that there are many unidentified RNAs in simple genomes that can be predicted computationally as a precursor to experimental study. Public access to our RNA gene predictions and an interface for user predictions is available via the web. PMID:11574674
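A minimal sketch of the kind of nucleotide-composition features such a classifier might consume; the function and the mono/dinucleotide feature choice are illustrative assumptions, not the authors' exact parameter set:

```python
def nucleotide_composition(seq):
    """Mono- and dinucleotide frequency features of the sort used as
    neural-network inputs for RNA gene prediction (simplified sketch)."""
    seq = seq.upper()
    n = len(seq)
    # single-base frequencies
    mono = {b: seq.count(b) / n for b in "ACGT"}
    # dinucleotide frequencies over overlapping windows
    dinucs = [seq[i:i + 2] for i in range(n - 1)]
    di = {a + b: dinucs.count(a + b) / (n - 1)
          for a in "ACGT" for b in "ACGT"}
    return mono, di

mono, di = nucleotide_composition("ACGTACGT")
```

The resulting 4 + 16 frequencies would then be concatenated into a fixed-length feature vector per candidate window.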

  15. Technical Report: Toward a Scalable Algorithm to Compute High-Dimensional Integrals of Arbitrary Functions

    SciTech Connect

    Snyder, Abigail C.; Jiao, Yu

    2010-10-01

Neutron experiments at the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory (ORNL) frequently generate large amounts of data (on the order of 10^6-10^12 data points). Hence, traditional data analysis tools run on a single CPU take too long to be practical and scientists are unable to efficiently analyze all data generated by experiments. Our goal is to develop a scalable algorithm to efficiently compute high-dimensional integrals of arbitrary functions. This algorithm can then be used to integrate the four-dimensional integrals that arise as part of modeling intensity from the experiments at the SNS. Here, three different one-dimensional numerical integration solvers from the GNU Scientific Library were modified and implemented to solve four-dimensional integrals. The results of these solvers on a final integrand provided by scientists at the SNS can be compared to the results of other methods, such as quasi-Monte Carlo methods, computing the same integral. A parallelized version of the most efficient method can allow scientists the opportunity to more effectively analyze all experimental data.
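Nesting one-dimensional rules to evaluate a four-dimensional integral, as described above, amounts to a tensor-product quadrature. This toy version uses Gauss-Legendre nodes rather than the GSL solvers named in the report:

```python
from itertools import product
import numpy as np

def integrate_4d(f, n=8):
    """Integrate f(x, y, z, w) over [0,1]^4 by nesting a one-dimensional
    Gauss-Legendre rule in each dimension (illustrative sketch only)."""
    # nodes/weights on [-1, 1], mapped to [0, 1]
    x, w = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (x + 1.0)
    w = 0.5 * w
    total = 0.0
    for i, j, k, l in product(range(n), repeat=4):
        total += w[i] * w[j] * w[k] * w[l] * f(x[i], x[j], x[k], x[l])
    return total

# integral of x*y*z*w over the unit hypercube is (1/2)^4 = 0.0625
val = integrate_4d(lambda a, b, c, d: a * b * c * d)
```

The cost grows as n^4, which is exactly why the report pursues a scalable, parallelized alternative.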

  16. Educational Accounting Procedures.

    ERIC Educational Resources Information Center

    Tidwell, Sam B.

    This chapter of "Principles of School Business Management" reviews the functions, procedures, and reports with which school business officials must be familiar in order to interpret and make decisions regarding the school district's financial position. Among the accounting functions discussed are financial management, internal auditing, annual…

  17. Influence of non-invasive X-ray computed tomography (XRCT) on the microbial community structure and function in soil.

    PubMed

    Fischer, Doreen; Pagenkemper, Sebastian; Nellesen, Jens; Peth, Stephan; Horn, Rainer; Schloter, Michael

    2013-05-01

    In this study the influence of X-ray computed tomography (XRCT) on the microbial community structure and function in soils has been investigated. Our results clearly indicate that XRCT of soil samples has a strong impact on microbial communities and changes structure and function significantly due to the death of selected microbial groups as a result of the treatment. PMID:23499670

  18. Computing many-body wave functions with guaranteed precision: the first-order Møller-Plesset wave function for the ground state of helium atom.

    PubMed

    Bischoff, Florian A; Harrison, Robert J; Valeev, Edward F

    2012-09-14

    We present an approach to compute accurate correlation energies for atoms and molecules using an adaptive discontinuous spectral-element multiresolution representation for the two-electron wave function. Because of the exponential storage complexity of the spectral-element representation with the number of dimensions, a brute-force computation of two-electron (six-dimensional) wave functions with high precision was not practical. To overcome the key storage bottlenecks we utilized (1) a low-rank tensor approximation (specifically, the singular value decomposition) to compress the wave function, and (2) explicitly correlated R12-type terms in the wave function to regularize the Coulomb electron-electron singularities of the Hamiltonian. All operations necessary to solve the Schrödinger equation were expressed so that the reconstruction of the full-rank form of the wave function is never necessary. Numerical performance of the method was highlighted by computing the first-order Møller-Plesset wave function of a helium atom. The computed second-order Møller-Plesset energy is precise to ~2 microhartrees, which is at the precision limit of the existing general atomic-orbital-based approaches. Our approach does not assume special geometric symmetries, hence application to molecules is straightforward. PMID:22979846
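The low-rank compression step via the singular value decomposition can be illustrated on an ordinary matrix; the rank-3 test matrix below is a toy stand-in for the discretized two-electron wave function, and `truncated_svd` is an illustrative helper, not the paper's code:

```python
import numpy as np

def truncated_svd(M, rank):
    """Compress a matrix by keeping only its leading singular triplets,
    as in the low-rank tensor approximation described above."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U[:, :rank], s[:rank], Vt[:rank, :]

rng = np.random.default_rng(1)
# build an exactly rank-3 matrix so truncation at rank 3 is lossless
B = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))
U, s, Vt = truncated_svd(B, rank=3)
B_approx = U @ np.diag(s) @ Vt
```

Storage drops from 50 x 40 entries to rank x (50 + 40 + 1), the kind of saving that makes six-dimensional representations tractable.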

  19. Arkansas' Curriculum Guide. Competency Based Computerized Accounting.

    ERIC Educational Resources Information Center

    Arkansas State Dept. of Education, Little Rock. Div. of Vocational, Technical and Adult Education.

    This guide contains the essential parts of a total curriculum for a one-year secondary-level course in computerized accounting. Addressed in the individual sections of the guide are the following topics: the complete accounting cycle, computer operations for accounting, computerized accounting and general ledgers, computerized accounts payable,…

  20. Estimation of the shadow prices of pollutants with production/environment inefficiency taken into account: a nonparametric directional distance function approach.

    PubMed

    Lee, Jeong-Dong; Park, Jong-Bok; Kim, Tai-Yoo

    2002-04-01

    This paper deals with the estimation of the shadow prices of pollutants with a nonparametric directional distance function approach, where the inefficiency involved in the production process is taken into account unlike the previous studies. The directional vector, which is critical to the estimation and subject to the criterion for an appropriate efficiency rule proposed here, is calculated by using the annual plans of power plants in terms of production and environment. In the empirical study for Korea's electric power industry during the period of 1990-1995, we find that the average shadow prices of sulfur oxides (SOx), nitrogen oxides (NOx), and total suspended particulates (TSP) are approximately 10% lower than those calculated under the assumption of full efficiency. The methodology we propose and the findings obtained in the empirical study allow us to undertake better decision-making over a broad range of environmental policy issues. PMID:12141157

  1. Feasibility of a Hybrid Brain-Computer Interface for Advanced Functional Electrical Therapy

    PubMed Central

    Savić, Andrej M.; Malešević, Nebojša M.; Popović, Mirjana B.

    2014-01-01

    We present a feasibility study of a novel hybrid brain-computer interface (BCI) system for advanced functional electrical therapy (FET) of grasp. FET procedure is improved with both automated stimulation pattern selection and stimulation triggering. The proposed hybrid BCI comprises the two BCI control signals: steady-state visual evoked potentials (SSVEP) and event-related desynchronization (ERD). The sequence of the two stages, SSVEP-BCI and ERD-BCI, runs in a closed-loop architecture. The first stage, SSVEP-BCI, acts as a selector of electrical stimulation pattern that corresponds to one of the three basic types of grasp: palmar, lateral, or precision. In the second stage, ERD-BCI operates as a brain switch which activates the stimulation pattern selected in the previous stage. The system was tested in 6 healthy subjects who were all able to control the device with accuracy in a range of 0.64–0.96. The results provided the reference data needed for the planned clinical study. This novel BCI may promote further restoration of the impaired motor function by closing the loop between the “will to move” and contingent temporally synchronized sensory feedback. PMID:24616644

  2. Use of time space Green's functions in the computation of transient eddy current fields

    SciTech Connect

    Davey, K.; Turner, L.

    1988-12-01

The utility of integral equations to solve eddy current problems has been borne out by numerous computations in the past few years, principally in sinusoidal steady-state problems. This paper attempts to examine the applicability of the integral approaches in both time and space for the more generic transient problem. The basic formulation for the time space Green's function approach is laid out. A technique employing Gauss-Laguerre integration is employed to realize the temporal solution, while Gauss-Legendre integration is used to resolve the spatial field character. The technique is then applied to the fusion electromagnetic induction experiments (FELIX) cylinder experiments in both two and three dimensions. It is found that quite accurate solutions can be obtained using rather coarse time steps and very few unknowns; the three-dimensional field solution worked out in this context used basically only four unknowns. The solution appears to be somewhat sensitive to the choice of time step, a consequence of a numerical instability imbedded in the Green's function near the origin.
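The Gauss-Laguerre rule used above for the temporal solution integrates functions against the weight e^(-t) on [0, inf). A minimal sketch, with an illustrative integrand rather than the paper's Green's-function kernel:

```python
import numpy as np

def gauss_laguerre_integral(g, n=10):
    """Approximate the semi-infinite integral
        int_0^inf e^(-t) g(t) dt  ~  sum_i w_i g(t_i)
    with an n-point Gauss-Laguerre rule, the quadrature family used
    for the temporal part of the Green's-function solution."""
    t, w = np.polynomial.laguerre.laggauss(n)
    return float(np.dot(w, g(t)))

# int_0^inf e^(-t) t^2 dt = Gamma(3) = 2
val = gauss_laguerre_integral(lambda t: t**2)
```

An n-point rule is exact for polynomial g up to degree 2n - 1, which is why few temporal unknowns suffice when the decay is well captured by the weight.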

  3. The use of time space Green's functions in the computation of transient eddy current fields

    NASA Astrophysics Data System (ADS)

    Davey, Kent; Turner, Larry

    1988-12-01

    The utility of integral equations to solve eddy current problems has been borne out by numerous computations in the past few years, principally in sinusoidal steady-state problems. This paper attempts to examine the applicability of the integral approaches in both time and space for the more generic transient problem. The basic formulation for the time space Green's function approach is laid out. A technique employing Gauss-Laguerre integration is employed to realize the temporal solution, while Gauss-Legendre integration is used to resolve the spatial field character. The technique is then applied to the fusion electromagnetic induction experiments (FELIX) cylinder experiments in both two and three dimensions. It is found that quite accurate solutions can be obtained using rather coarse time steps and very few unknowns; the three-dimensional field solution worked out in this context used basically only four unknowns. The solution appears to be somewhat sensitive to the choice of time step, a consequence of a numerical instability imbedded in the Green's function near the origin.

  4. Brain-computer interface using a simplified functional near-infrared spectroscopy system.

    PubMed

    Coyle, Shirley M; Ward, Tomás E; Markham, Charles M

    2007-09-01

    A brain-computer interface (BCI) is a device that allows a user to communicate with external devices through thought processes alone. A novel signal acquisition tool for BCIs is near-infrared spectroscopy (NIRS), an optical technique to measure localized cortical brain activity. The benefits of using this non-invasive modality are safety, portability and accessibility. A number of commercial multi-channel NIRS systems are available; however, we have developed a straightforward custom-built system to investigate the functionality of a fNIRS-BCI system. This work describes the construction of the device, the principles of operation and the implementation of a fNIRS-BCI application, 'Mindswitch', which harnesses motor imagery for control. Analysis is performed online and feedback of performance is presented to the user. Mindswitch presents a basic 'on/off' switching option to the user, where selection of either state takes 1 min. Initial results show that fNIRS can support simple BCI functionality and holds much potential. Although performance may currently be inferior to many EEG systems, there is much scope for development, particularly with more sophisticated signal processing and classification techniques. We hope that by presenting fNIRS as an accessible and affordable option, a new avenue of exploration will open within the BCI research community and stimulate further research in fNIRS-BCIs. PMID:17873424

  5. Computational identification of riboswitches based on RNA conserved functional sequences and conformations.

    PubMed

    Chang, Tzu-Hao; Huang, Hsien-Da; Wu, Li-Ching; Yeh, Chi-Ta; Liu, Baw-Jhiune; Horng, Jorng-Tzong

    2009-07-01

    Riboswitches are cis-acting genetic regulatory elements within a specific mRNA that can regulate both transcription and translation by interacting with their corresponding metabolites. Recently, an increasing number of riboswitches have been identified in different species and investigated for their roles in regulatory functions. Both the sequence contexts and structural conformations are important characteristics of riboswitches. None of the previously developed tools, such as covariance models (CMs), Riboswitch finder, and RibEx, provides a web server for efficiently searching homologous instances of known riboswitches or considers two crucial characteristics of each riboswitch: the structural conformations and sequence contexts of functional regions. Therefore, we developed a systematic method for identifying 12 kinds of riboswitches. The method is implemented and provided as a web server, RiboSW, to efficiently and conveniently identify riboswitches within messenger RNA sequences. The predictive accuracy of the proposed method is comparable with that of previous tools. The efficiency of the proposed method for identifying riboswitches was improved in order to achieve a reasonable computational time for the prediction, making it possible to provide an accurate and convenient web server for biologists to obtain the results of their analysis of a given mRNA sequence. RiboSW is now available on the web at http://RiboSW.mbc.nctu.edu.tw/. PMID:19460868

  6. Advancing Understanding and Design of Functional Materials Through Theoretical and Computational Chemical Physics

    SciTech Connect

    Fuentes-Cabrera, Miguel A; Huang, Jingsong; Jakowski, Jacek; Meunier, V.; Lopez-Benzanilla, Alejandro; Cruz Silva, Eduardo; Sumpter, Bobby G; Beste, Ariana

    2012-01-01

    Theoretical and computational chemical physics and materials science offer great opportunities for helping to solve some of the grand challenges in science and engineering, because the structure and properties of molecules, solids, and liquids are direct reflections of the underlying quantum motion of their electrons. With the advent of semilocal and especially nonlocal descriptions of exchange and correlation effects, density functional theory (DFT) can now describe bonding in molecules and solids with an accuracy which, for many classes of systems, is sufficient to compare quantitatively to experiments. It is therefore becoming possible to develop a semiquantitative description of a large number of systems and processes. In this chapter, we briefly review DFT and its various extensions to include nonlocal terms that are important for long-range dispersion interactions that dominate many self-assembly processes, molecular surface adsorption processes, solution processes, and biological and polymeric materials. Applications of DFT toward problems relevant to energy systems, including energy storage materials, functional nanoelectronics/optoelectronics, and energy conversion, are highlighted.

  7. Functional source separation and hand cortical representation for a brain–computer interface feature extraction

    PubMed Central

    Tecchio, Franca; Porcaro, Camillo; Barbati, Giulia; Zappasodi, Filippo

    2007-01-01

    A brain–computer interface (BCI) can be defined as any system that can track a person's intent, which is embedded in his/her brain activity, and, from it alone, translate the intention into commands of a computer. Among the brain signal monitoring systems best suited for this challenging task, electroencephalography (EEG) and magnetoencephalography (MEG) are the most realistic, since both are non-invasive, EEG is portable, and MEG could provide more specific information that could later be exploited also through EEG signals. The first two BCI steps require setting up the appropriate experimental protocol while recording the brain signal and then extracting interesting features from the recorded cerebral activity. To provide information useful in these BCI stages, our aim is to give an overview of a new procedure we recently developed, named functional source separation (FSS). As it derives from blind source separation algorithms, it exploits the most valuable information provided by the electrophysiological techniques, i.e. the waveform signal properties, while remaining blind to the biophysical nature of the signal sources. FSS returns the single-trial source activity, estimates the time course of a neuronal pool across different experimental states on the basis of a specific functional requirement in a specific time period, and uses simulated annealing as the optimization procedure, allowing the use of non-differentiable functional constraints. Moreover, a minor section is included, devoted to information acquired by MEG in stroke patients, to guide BCI applications aiming at sustaining motor behaviour in these patients. Relevant BCI features – spatial and time-frequency properties – are in fact altered by a stroke in the regions devoted to hand control. Moreover, a method to investigate the relationship between sensory and motor hand cortical network activities is described, providing information useful to develop BCI feedback control systems. This

  8. Computational Diffusion Magnetic Resonance Imaging Based on Time-Dependent Bloch NMR Flow Equation and Bessel Functions.

    PubMed

    Awojoyogbe, Bamidele O; Dada, Michael O; Onwu, Samuel O; Ige, Taofeeq A; Akinwande, Ninuola I

    2016-04-01

    Magnetic resonance imaging (MRI) uses a powerful magnetic field along with radio waves and a computer to produce highly detailed "slice-by-slice" pictures of virtually all internal structures of matter. The results enable physicians to examine parts of the body in minute detail and identify diseases in ways that are not possible with other techniques. For example, MRI is one of the few imaging tools that can see through bones, making it an excellent tool for examining the brain and other soft tissues. Pulsed-field gradient experiments provide a straightforward means of obtaining information on the translational motion of nuclear spins. However, the interpretation of the data is complicated by the effects of restricting geometries, as in the case of most cancerous tissues, and the mathematical concepts required to account for this become very difficult. Most diffusion magnetic resonance techniques are based on the Stejskal-Tanner formulation usually derived from the Bloch-Torrey partial differential equation by including additional terms to accommodate the diffusion effect. Despite the early success of this technique, it has been shown that it has important limitations, the most important of which occurs when there is orientation heterogeneity of the fibers in the voxel of interest (VOI). Overcoming this difficulty requires the specification of diffusion coefficients as a function of spatial coordinate(s), and such a phenomenon is an indication of non-uniform compartmental conditions which can be analyzed accurately by solving the time-dependent Bloch NMR flow equation analytically. In this study, a mathematical formulation of magnetic resonance flow sequence in restricted geometry is developed based on a general second order partial differential equation derived directly from the fundamental Bloch NMR flow equations. The NMR signal is obtained completely in terms of NMR experimental parameters. The process is described based on Bessel functions and properties that can make it
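The solution above is expressed in terms of Bessel functions; as a minimal illustration of that machinery (standard mathematics, not the authors' derivation), the zeroth-order Bessel function J0 can be evaluated directly from its power series:

```python
import math

def bessel_j0(x, terms=30):
    """J_0(x) = sum_{m>=0} (-1)^m (x/2)^(2m) / (m!)^2, truncated power series.

    J_0 arises naturally when separating a diffusion-type PDE in
    cylindrical coordinates, the typical restricted-geometry setting.
    """
    return sum((-1) ** m * (x / 2) ** (2 * m) / math.factorial(m) ** 2
               for m in range(terms))

j0_at_zero = bessel_j0(0.0)       # 1.0 by definition
j0_at_root = bessel_j0(2.404826)  # near the first zero of J_0
print(j0_at_zero, j0_at_root)
```

For production work one would use a library routine (e.g. `scipy.special.jv`); the series form is shown only to make the function concrete.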

  9. Functional requirements of computer systems for the U.S. Geological Survey, Water Resources Division, 1988-97

    USGS Publications Warehouse

    Hathaway, R.M.; McNellis, J.M.

    1989-01-01

    Investigating the occurrence, quantity, quality, distribution, and movement of the Nation's water resources is the principal mission of the U.S. Geological Survey's Water Resources Division. Reports of these investigations are published and available to the public. To accomplish this mission, the Division requires substantial computer technology to process, store, and analyze data from more than 57,000 hydrologic sites. The Division's computer resources are organized through the Distributed Information System Program Office that manages the nationwide network of computers. The contract that provides the major computer components for the Water Resources Division's Distributed Information System expires in 1991. Five work groups were organized to collect the information needed to procure a new generation of computer systems for the U.S. Geological Survey, Water Resources Division. Each group was assigned a major Division activity and asked to describe its functional requirements of computer systems for the next decade. The work groups and major activities are: (1) hydrologic information; (2) hydrologic applications; (3) geographic information systems; (4) reports and electronic publishing; and (5) administrative. The work groups identified 42 functions and described their functional requirements for 1988, 1992, and 1997. A few new functions, such as Decision Support Systems and Executive Information Systems, were identified, but most are the same as performed today. Although the number of functions will remain about the same, steady growth in the size, complexity, and frequency of many functions is predicted for the next decade. No compensating increase in the Division's staff is anticipated during this period. To handle the increased workload and perform these functions, new approaches will be developed that use advanced computer technology. The advanced technology is required in a unified, tightly coupled system that will support all functions simultaneously.

  10. EUPDF: Eulerian Monte Carlo Probability Density Function Solver for Applications With Parallel Computing, Unstructured Grids, and Sprays

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    1998-01-01

    The success of any solution methodology used in the study of gas-turbine combustor flows depends a great deal on how well it can model the various complex and rate controlling processes associated with the spray's turbulent transport, mixing, chemical kinetics, evaporation, and spreading rates, as well as convective and radiative heat transfer and other phenomena. The phenomena to be modeled, which are controlled by these processes, often strongly interact with each other at different times and locations. In particular, turbulence plays an important role in determining the rates of mass and heat transfer, chemical reactions, and evaporation in many practical combustion devices. The influence of turbulence in a diffusion flame manifests itself in several forms, ranging from the so-called wrinkled, or stretched, flamelets regime to the distributed combustion regime, depending upon how turbulence interacts with various flame scales. Conventional turbulence models have difficulty treating highly nonlinear reaction rates. A solution procedure based on the composition joint probability density function (PDF) approach holds the promise of modeling various important combustion phenomena relevant to practical combustion devices (such as extinction, blowoff limits, and emissions predictions) because it can account for nonlinear chemical reaction rates without making approximations. In an attempt to advance the state-of-the-art in multidimensional numerical methods, we at the NASA Lewis Research Center extended our previous work on the PDF method to unstructured grids, parallel computing, and sprays. EUPDF, which was developed by M.S. Raju of Nyma, Inc., was designed to be massively parallel and could easily be coupled with any existing gas-phase and/or spray solvers. EUPDF can use an unstructured mesh with mixed triangular, quadrilateral, and/or tetrahedral elements. The application of the PDF method showed favorable results when applied to several supersonic

  11. Indices of cognitive function measured in rugby union players using a computer-based test battery.

    PubMed

    MacDonald, Luke A; Minahan, Clare L

    2016-09-01

    The purpose of this study was to investigate the intra- and inter-day reliability of cognitive performance using a computer-based test battery in team-sport athletes. Eighteen elite male rugby union players (age: 19 ± 0.5 years) performed three experimental trials (T1, T2 and T3) of the test battery: T1 and T2 on the same day and T3, on the following day, 24 h later. The test battery comprised four cognitive tests assessing the cognitive domains of executive function (Groton Maze Learning Task), psychomotor function (Detection Task), vigilance (Identification Task), visual learning and memory (One Card Learning Task). The intraclass correlation coefficients (ICCs) for the Detection Task, the Identification Task and the One Card Learning Task performance variables ranged from 0.75 to 0.92 when comparing T1 to T2 to assess intraday reliability, and 0.76 to 0.83 when comparing T1 and T3 to assess inter-day reliability. The ICCs for the Groton Maze Learning Task intra- and inter-day reliability were 0.67 and 0.57, respectively. We concluded that the Detection Task, the Identification Task and the One Card Learning Task are reliable measures of psychomotor function, vigilance, visual learning and memory in rugby union players. The reliability of the Groton Maze Learning Task is questionable (mean coefficient of variation (CV) = 19.4%) and, therefore, results should be interpreted with caution. PMID:26756946
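The intraclass correlations reported above come from a subjects-by-trials design; a minimal sketch of a one-way random-effects ICC(1,1) computed from ANOVA mean squares (the abstract does not state which ICC form was used, so treat this variant and the data as illustrative):

```python
import numpy as np

def icc_oneway(scores):
    """One-way random-effects ICC(1,1) for a (subjects x trials) array."""
    n, k = scores.shape
    grand = scores.mean()
    subj_means = scores.mean(axis=1)
    # Between-subjects and within-subject mean squares from one-way ANOVA.
    ms_between = k * np.sum((subj_means - grand) ** 2) / (n - 1)
    ms_within = np.sum((scores - subj_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical reaction-time scores: two identical trials per subject
# give perfect test-retest reliability (ICC = 1).
trial1 = np.array([10.0, 12.0, 9.0, 15.0, 11.0])
print(icc_oneway(np.column_stack([trial1, trial1])))  # 1.0
```

Adding measurement noise to the second trial drives the ICC below 1, which is the effect the 0.57-0.92 range in the study quantifies.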

  12. Progressive adaptation in regional parenchyma mechanics following extensive lung resection assessed by functional computed tomography

    PubMed Central

    Yilmaz, Cuneyt; Tustison, Nicholas J.; Dane, D. Merrill; Ravikumar, Priya; Takahashi, Masaya; Gee, James C.

    2011-01-01

    In adult canines following major lung resection, the remaining lobes expand asymmetrically, associated with alveolar tissue regrowth, remodeling, and progressive functional compensation over many months. To permit noninvasive longitudinal assessment of regional growth and function, we performed serial high-resolution computed tomography (HRCT) on six male dogs (∼9 mo old, 25.0 ± 4.5 kg, ±SD) at 15 and 30 cmH2O transpulmonary pressure (Ptp) before resection (PRE) and 3 and 15 mo postresection (POST3 and POST15, respectively) of 65–70% of lung units. At POST3, lobar air volume increased 83–148% and tissue (including microvascular blood) volume 120–234% above PRE values without further changes at POST15. Lobar-specific compliance (Cs) increased 52–137% from PRE to POST3 and 28–79% from POST3 to POST15. Inflation-related parenchyma strain and shear were estimated by detailed registration of corresponding anatomical features at each Ptp. Within each lobe, regional displacement was most pronounced at the caudal region, whereas strain was pronounced in the periphery. Regional three-dimensional strain magnitudes increased heterogeneously from PRE to POST3, with further medial-lateral increases from POST3 to POST15. Lobar principal strains (PSs) were unchanged or modestly elevated postresection; changes in lobar maximum PS correlated inversely with changes in lobar air and tissue volumes. Lobar shear distortion increased in coronal and transverse planes at POST3 without further changes thereafter. These results establish a novel use of functional HRCT to map heterogeneous regional deformation during compensatory lung growth and illustrate a stimulus-response feedback loop whereby postresection mechanical stress initiates differential lobar regrowth and sustained remodeling, which in turn, relieves parenchyma stress and strain, resulting in progressive increases in lobar Cs and a delayed increase in whole lung Cs. PMID:21799134

  13. COPD phenotypes on computed tomography and its correlation with selected lung function variables in severe patients

    PubMed Central

    da Silva, Silvia Maria Doria; Paschoal, Ilma Aparecida; De Capitani, Eduardo Mello; Moreira, Marcos Mello; Palhares, Luciana Campanatti; Pereira, Mônica Corso

    2016-01-01

    Background Computed tomography (CT) phenotypic characterization helps in understanding the clinical diversity of chronic obstructive pulmonary disease (COPD) patients, but its clinical relevance and its relationship with functional features have not been clarified. Volumetric capnography (VC) uses the principle of gas washout and analyzes the pattern of CO2 elimination as a function of expired volume. The main variables analyzed were end-tidal concentration of carbon dioxide (ETCO2), Slope of phase 2 (Slp2), and Slope of phase 3 (Slp3) of capnogram, the curve which represents the total amount of CO2 eliminated by the lungs during each breath. Objective To investigate, in a group of patients with severe COPD, whether the phenotypic analysis by CT could identify different subsets of patients, and whether there was an association between CT findings and functional variables. Subjects and methods Sixty-five patients with COPD Gold III–IV were admitted for clinical evaluation, high-resolution CT, and functional evaluation (spirometry, 6-minute walk test [6MWT], and VC). The presence and profusion of tomography findings were evaluated, and later, the patients were identified as having emphysema (EMP) or airway disease (AWD) phenotype. EMP and AWD groups were compared; tomography findings scores were evaluated versus spirometric, 6MWT, and VC variables. Results Bronchiectasis was found in 33.8% and peribronchial thickening in 69.2% of the 65 patients. Structural findings of airways had no significant correlation with spirometric variables. Air trapping and EMP were strongly correlated with VC variables, but in opposite directions. There was some overlap between the EMP and AWD groups, but EMP patients had significantly lower body mass index, worse obstruction, and shorter walked distance on 6MWT. Concerning VC, EMP patients had significantly lower ETCO2, Slp2 and Slp3. Increases in Slp3 characterize heterogeneous involvement of the distal air spaces, as in AWD. Conclusion Visual assessment and
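The capnographic slopes described above are, in essence, linear fits over segments of the CO2-versus-expired-volume curve; a sketch of extracting a phase-3 slope from a synthetic breath (the signal shape and numbers are assumed for illustration, not patient data):

```python
import numpy as np

# Synthetic single-breath capnogram: expired volume (L) vs CO2 fraction (%).
volume = np.linspace(0.0, 0.5, 100)
true_slp3 = 4.0  # %/L, assumed alveolar-plateau slope
co2 = np.where(volume < 0.2,
               25.0 * volume,                      # phases 1-2: rapid rise
               5.0 + true_slp3 * (volume - 0.2))   # phase 3: gentle plateau

# Slp3: least-squares slope over the phase-3 segment of the curve.
phase3 = volume >= 0.2
slp3, intercept = np.polyfit(volume[phase3], co2[phase3], 1)
print(slp3)  # recovers ~4.0
```

On real capnograms the phase boundaries themselves must be detected first; here the 0.2 L cutoff is simply part of the synthetic construction.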

  14. Analysis of the Uncertainty in the Computation of Receiver Functions and Improvement in the Estimation of Receiver, PP and SS functions

    NASA Astrophysics Data System (ADS)

    Huang, X.; Gurrola, H.

    2013-12-01

    methods. All of these methods performed well in terms of stdev but we chose ARU for its high quality data and low signal to noise ratios (the average S/N ratio for these data was 4%). With real data, we tend to assume the method that has the lowest stdev is the best. But stdev does not account for a systematic bias toward incorrect values. In this case the LSD once again had the lowest stdev in computed amplitudes of Pds phases but it had the smallest values. But the FID, FWLD and MID tended to produce the largest amplitudes while the LSD and TID tended toward the lower amplitudes. Considering that in the synthetics all these methods showed bias toward low amplitude, we believe that with real data those methods producing the largest amplitudes will be closest to the 'true values', and that this is a better measure of the better method than a small stdev in amplitude estimates. We will also present results for applying TID and FID methods to the production of PP and SS precursor functions. When applied to these data, it is possible to moveout correct the cross-correlation functions before extracting the signal from each PdP (or SdS) phase in these data. As a result, a much cleaner Earth function is produced and frequency content is significantly improved.
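The point that a small standard deviation can mask a systematic bias is easy to demonstrate numerically; a hypothetical two-estimator comparison (the damping factor and noise level are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
true_amplitude = 1.0

# 1000 repeated "experiments", each estimating an amplitude from 10 noisy picks.
samples = rng.normal(true_amplitude, 0.2, size=(1000, 10))
unbiased = samples.mean(axis=1)   # noisy but centered on the truth
shrunk = 0.6 * unbiased           # damped estimator: smaller stdev, biased low

# The damped estimator wins on stdev yet loses badly on accuracy.
print(np.std(shrunk) < np.std(unbiased))                 # True
print(abs(np.mean(shrunk) - true_amplitude) >
      abs(np.mean(unbiased) - true_amplitude))           # True
```

This is exactly the trap described above: ranking deconvolution methods by stdev alone would favor the estimator whose amplitudes are systematically too small.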

  15. Computer simulation of respiratory impedance and flow transfer functions during high frequency oscillations.

    PubMed

    Peslin, R

    1989-01-01

    The usefulness of measuring respiratory flow in the airway and at the chest wall and of measuring respiratory input impedance (Z) to monitor high frequency ventilation was investigated by computer simulation using a monoalveolar 10-coefficient model. The latter included a central airway with its resistance (Rc) and inertance (lc), a resistive peripheral airway (Rp), a lumped bronchial compliance (Cb), alveolar gas compliance (Cgas), lung tissue with its resistance (RL) and compliance (CL), and chest wall resistance (RW), inertance (lw) and compliance (Cw). Gas flow in the peripheral airway (Vp), shunt flow through Cb (Vb), gas compression flow (Vgas) and rate of volume change of the lung (VL) and of the chest (VW) were computed and expressed as a function of gas flow in the central airway (Vc). For normal values of the coefficients, Vp/Vc was found to decrease moderately with increasing frequency and was still 0.75 at 20 Hz. Peripheral airway obstruction (Rp x 5) considerably decreased Vp/Vc, particularly at high frequency. It did not change the relationship between the two measurable flows, Vc and Vw, but increased the effective resistance at low frequency and shifted the reactance curve to the right. A reduced lung or chest wall compliance produced little change in Vp/Vc and Z except at very low frequencies; however, it decreased the phase lag between Vw and Vc. Finally, an increased airway wall compliance decreased Vp/Vc, but had little effect on Z and Vw/Vc. It is concluded that measuring respiratory impedance may help in detecting some, but not all of the conditions in which peripheral flow convection is decreased during high frequency oscillations. PMID:2611083
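The impedance of a lumped model like the one above follows directly from its circuit elements; a simplified single-branch sketch of resistance, inertance, and compliance in series (the coefficient values are hypothetical, and the full 10-coefficient model has several such branches in parallel):

```python
import numpy as np

# Series resistance-inertance-compliance branch:
# Z(f) = R + j * (2*pi*f*I - 1 / (2*pi*f*C))
R = 2.0    # cmH2O.s/L, resistance (assumed)
I = 0.01   # cmH2O.s^2/L, inertance (assumed)
C = 0.1    # L/cmH2O, compliance (assumed)

f = np.linspace(1.0, 30.0, 300)   # frequency range, Hz
omega = 2 * np.pi * f
Z = R + 1j * (omega * I - 1.0 / (omega * C))

# Reactance is compliance-dominated (negative) at low frequency,
# inertance-dominated (positive) at high frequency, and crosses zero
# at the resonant frequency f0 = 1 / (2*pi*sqrt(I*C)).
f0 = 1.0 / (2 * np.pi * np.sqrt(I * C))
print(f0)  # ~5.03 Hz for these coefficients
```

Shifts of this zero-crossing ("the reactance curve shifted to the right") are one of the impedance signatures of peripheral obstruction discussed in the abstract.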

  16. Density functional theory computation of Nuclear Magnetic Resonance parameters in light and heavy nuclei

    NASA Astrophysics Data System (ADS)

    Sutter, Kiplangat

    This thesis illustrates the utilization of Density functional theory (DFT) in calculations of gas and solution phase Nuclear Magnetic Resonance (NMR) properties of light and heavy nuclei. Computing NMR properties is still a challenge, and there are many unknown factors that are still being explored, for instance the influence of hydrogen bonding, thermal motion, vibration, rotation and solvent effects. In the theoretical studies of the 195Pt NMR chemical shift in cisplatin and its derivatives illustrated in Chapters 2 and 3 of this thesis, the importance of representing solvent molecules explicitly around the Pt center in cisplatin complexes was outlined. In the same complexes, the solvent effect contributed about half of the J(Pt-N) coupling constant, indicating the significance of considering the surrounding solvent molecules in elucidating the NMR measurements of cisplatin binding to DNA. In Chapter 4, we explore the Spin-Orbit (SO) effects on the 29Si and 13C chemical shifts induced by surrounding metal and ligands. The unusual Ni, Pd, Pt trends in SO effects on the 29Si in metallasilatrane complexes X-Si-(mu-mt)4-M-Y were interpreted based on electronic and relativistic effects rather than on structural differences between the complexes. In addition, we develop a non-linear model for predicting NMR SO effects in a series of organics bonded to heavy nuclei halides. In Chapter 5, we extend the idea of "Chemist's orbitals" LMO analysis to the quantum chemical proton NMR computation of systems with internal resonance-assisted hydrogen bonds. Consequently, we explicitly link the NMR parameters of H-bonded systems to the intuitive picture of a chemical bond from quantum calculations. The analysis shows how NMR signatures characteristic of H-bonds can be explained by local bonding and electron delocalization concepts.
One shortcoming of some of the anti-cancer agents like cisplatin is that they are toxic and researchers are looking for

  17. Computerized accounting methods. Final report

    SciTech Connect

    1994-12-31

    This report summarizes the results of the research performed under the Task Order on computerized accounting methods in a period from 03 August to 31 December 1994. Computerized nuclear material accounting methods are analyzed and evaluated. Selected methods are implemented in a hardware-software complex developed as a prototype of the local network-based CONMIT system. This complex has been put into trial operation for test and evaluation of the selected methods at two selected "Kurchatov Institute" Russian Research Center ("KI" RRC) nuclear facilities. Trial operation has been carried out since the beginning of Initial Physical Inventory Taking at these facilities, performed in November 1994. Operation of the CONMIT prototype system was demonstrated in the middle of December 1994. Results of evaluation of the CONMIT prototype system's features and functioning under real operating conditions are considered. Conclusions are formulated on the ways of further development of computerized nuclear material accounting methods. The most important conclusion is a need to strengthen computer and information security features supported by the operating environment. Security provisions, as well as other LANL Client/Server System approaches being developed by Los Alamos National Laboratory, are recommended for selection of software and hardware components to be integrated into the production version of the CONMIT system for KI RRC.

  18. Response functions for computing absorbed dose to skeletal tissues from neutron irradiation.

    PubMed

    Bahadori, Amir A; Johnson, Perry; Jokisch, Derek W; Eckerman, Keith F; Bolch, Wesley E

    2011-11-01

    Spongiosa in the adult human skeleton consists of three tissues: active marrow (AM), inactive marrow (IM) and trabecularized mineral bone (TB). AM is considered to be the target tissue for assessment of both long-term leukemia risk and acute marrow toxicity following radiation exposure. The total shallow marrow (TM(50)), defined as all tissues lying within the first 50 µm of the bone surfaces, is considered to be the radiation target tissue of relevance for radiogenic bone cancer induction. For irradiation by sources external to the body, kerma to homogeneous spongiosa has been used as a surrogate for absorbed dose to both of these tissues, as direct dose calculations are not possible using computational phantoms with homogenized spongiosa. Recent micro-CT imaging of a 40 year old male cadaver has allowed for the accurate modeling of the fine microscopic structure of spongiosa in many regions of the adult skeleton (Hough et al 2011 Phys. Med. Biol. 56 2309-46). This microstructure, along with associated masses and tissue compositions, was used to compute specific absorbed fraction (SAF) values for protons originating in axial and appendicular bone sites (Jokisch et al 2011 Phys. Med. Biol. 56 6857-72). These proton SAFs, bone masses, tissue compositions and proton production cross sections, were subsequently used to construct neutron dose-response functions (DRFs) for both AM and TM(50) targets in each bone of the reference adult male. Kerma conditions were assumed for other resultant charged particles. For comparison, AM, TM(50) and spongiosa kerma coefficients were also calculated. At low incident neutron energies, AM kerma coefficients for neutrons correlate well with values of the AM DRF, while total marrow (TM) kerma coefficients correlate well with values of the TM(50) DRF. At high incident neutron energies, all kerma coefficients and DRFs tend to converge as charged-particle equilibrium is established across the bone site. In the range of 10 eV to 100 Me

  19. Response functions for computing absorbed dose to skeletal tissues from neutron irradiation

    NASA Astrophysics Data System (ADS)

    Bahadori, Amir A.; Johnson, Perry; Jokisch, Derek W.; Eckerman, Keith F.; Bolch, Wesley E.

    2011-11-01

    Spongiosa in the adult human skeleton consists of three tissues—active marrow (AM), inactive marrow (IM) and trabecularized mineral bone (TB). AM is considered to be the target tissue for assessment of both long-term leukemia risk and acute marrow toxicity following radiation exposure. The total shallow marrow (TM50), defined as all tissues lying within the first 50 µm of the bone surfaces, is considered to be the radiation target tissue of relevance for radiogenic bone cancer induction. For irradiation by sources external to the body, kerma to homogeneous spongiosa has been used as a surrogate for absorbed dose to both of these tissues, as direct dose calculations are not possible using computational phantoms with homogenized spongiosa. Recent micro-CT imaging of a 40 year old male cadaver has allowed for the accurate modeling of the fine microscopic structure of spongiosa in many regions of the adult skeleton (Hough et al 2011 Phys. Med. Biol. 56 2309-46). This microstructure, along with associated masses and tissue compositions, was used to compute specific absorbed fraction (SAF) values for protons originating in axial and appendicular bone sites (Jokisch et al 2011 Phys. Med. Biol. 56 6857-72). These proton SAFs, bone masses, tissue compositions and proton production cross sections, were subsequently used to construct neutron dose-response functions (DRFs) for both AM and TM50 targets in each bone of the reference adult male. Kerma conditions were assumed for other resultant charged particles. For comparison, AM, TM50 and spongiosa kerma coefficients were also calculated. At low incident neutron energies, AM kerma coefficients for neutrons correlate well with values of the AM DRF, while total marrow (TM) kerma coefficients correlate well with values of the TM50 DRF. At high incident neutron energies, all kerma coefficients and DRFs tend to converge as charged-particle equilibrium is established across the bone site. In the range of 10 eV to 100 Me

  20. How to teach mono-unary algebras and functional graphs with the use of computers in secondary schools

    NASA Astrophysics Data System (ADS)

    Binterová, Helena; Fuchs, Eduard

    2014-07-01

    In this paper, alternative descriptions of functions are demonstrated with the use of a computer. If we understand functions as mono-unary algebras or functional graphs, it is possible, even at the school level, to suitably present many of their characteristics. First, we describe cyclic graphs of constant and linear functions, which are a part of the upper-secondary level educational curriculum. Students are usually surprised by the unexpected characteristics of such simple functions which cannot be revealed using traditional Cartesian graphing. The next part of the paper deals with the characteristics of functional graphs of quadratic functions, which play an important role in school mathematics and in applications, for instance, in the description of non-linear processes. We show that their description is much more complicated. In contrast to functional graphs of linear functions, it is necessary to use computers. Students can find space for their own individual exploration to reveal lines of interesting characteristics of quadratic functions, which give students a new view of this part of school mathematics.
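The "functional graph" view treats a function on a finite set as a directed graph with exactly one outgoing edge per element, so iterating from any starting point must eventually enter a cycle. A sketch for a quadratic map on a finite domain (the particular map, squaring mod 13, is chosen for illustration):

```python
def find_cycle(f, start):
    """Iterate x -> f(x) from start until a value repeats; return the cycle."""
    seen = {}
    x = start
    step = 0
    while x not in seen:
        seen[x] = step
        x = f(x)
        step += 1
    # x is the first repeated value, i.e. the entry point of the cycle.
    cycle = [x]
    y = f(x)
    while y != x:
        cycle.append(y)
        y = f(y)
    return cycle

# Quadratic map x -> x^2 mod 13 viewed as a functional graph on {0,...,12}.
f = lambda x: (x * x) % 13
for start in range(13):
    print(start, "->", find_cycle(f, start))
```

Every starting value lands in one of a few small cycles, which is the kind of structure Cartesian graphing hides but a functional-graph picture makes immediately visible.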

  1. Functional analysis of metabolic channeling and regulation in lignin biosynthesis: a computational approach.

    PubMed

    Lee, Yun; Escamilla-Treviño, Luis; Dixon, Richard A; Voit, Eberhard O

    2012-01-01

    Lignin is a polymer in secondary cell walls of plants that is known to have negative impacts on forage digestibility, pulping efficiency, and sugar release from cellulosic biomass. While targeted modifications of different lignin biosynthetic enzymes have permitted the generation of transgenic plants with desirable traits, such as improved digestibility or reduced recalcitrance to saccharification, some of the engineered plants exhibit monomer compositions that are clearly at odds with the expected outcomes when the biosynthetic pathway is perturbed. In Medicago, such discrepancies were partly reconciled by the recent finding that certain biosynthetic enzymes may be spatially organized into two independent channels for the synthesis of guaiacyl (G) and syringyl (S) lignin monomers. Nevertheless, the mechanistic details, as well as the biological function of these interactions, remain unclear. To decipher the working principles of this and similar control mechanisms, we propose and employ here a novel computational approach that permits an expedient and exhaustive assessment of hundreds of minimal designs that could arise in vivo. Interestingly, this comparative analysis not only helps distinguish two most parsimonious mechanisms of crosstalk between the two channels by formulating a targeted and readily testable hypothesis, but also suggests that the G lignin-specific channel is more important for proper functioning than the S lignin-specific channel. While the proposed strategy of analysis in this article is tightly focused on lignin synthesis, it is likely to be of similar utility in extracting unbiased information in a variety of situations, where the spatial organization of molecular components is critical for coordinating the flow of cellular information, and where initially various control designs seem equally valid. PMID:23144605

  3. Computed Tomography-Derived Parameters of Myocardial Morphology and Function in Black and White Patients With Acute Chest Pain.

    PubMed

    Takx, Richard A P; Vliegenthart, Rozemarijn; Schoepf, U Joseph; Abro, Joseph A; Nance, John W; Ebersberger, Ullrich; Bamberg, Fabian; Carr, Christine M; Apfaltrer, Paul

    2016-02-01

    Blacks have higher mortality and hospitalization rates because of congestive heart failure compared with white counterparts. Differences in cardiac structure and function may contribute to the racial disparity in cardiovascular outcomes. Our aim was to compare computed tomography (CT)-derived cardiac measurements between black patients with acute chest pain and age- and gender-matched white patients. We performed a retrospective analysis under an institutional review board waiver and in Health Insurance Portability and Accountability Act compliance. We investigated patients who underwent cardiac dual-source CT for acute chest pain. Myocardial mass, left ventricular (LV) ejection fraction, LV end-systolic volume, and LV end-diastolic volume were quantified using an automated analysis algorithm. Septal wall thickness and cardiac chamber diameters were manually measured. Measurements were compared by independent t test and linear regression. The study population consisted of 300 patients (150 black: mean age 54 ± 12 years, 46% men; 150 white: mean age 55 ± 11 years, 46% men). Myocardial mass was larger for blacks compared with whites (176.1 ± 58.4 vs 155.9 ± 51.7 g, p = 0.002), which remained significant after adjusting for age, gender, body mass index, and hypertension. Septal wall thickness was slightly greater (11.9 ± 2.7 vs 11.2 ± 3.1 mm, p = 0.036). The LV inner diameter was moderately larger in black patients in systole (32.3 ± 9.0 vs 30.1 ± 5.4 mm, p = 0.010) and in diastole (50.1 ± 7.8 vs 48.9 ± 5.2 mm, p = 0.137), as was LV end-diastolic volume (134.5 ± 42.7 vs 128.2 ± 30.6 ml, p = 0.143). Ejection fraction was nonsignificantly lower in blacks (67.1 ± 13.5% vs 69.0 ± 9.6%, p = 0.169). In conclusion, CT-derived myocardial mass was larger in blacks compared with whites, whereas LV functional parameters were generally not statistically different, suggesting that LV mass might be a possible contributing factor to the higher rate of cardiac events
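
    The group comparison can be reproduced from the reported summary statistics alone. The sketch below (illustrative; it assumes Welch's unequal-variance form of the independent t test, which coincides with the pooled Student form for equal group sizes) recovers a t statistic consistent with the reported p = 0.002 for myocardial mass.

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic and degrees of freedom from summary statistics."""
    v1, v2 = sd1 ** 2 / n1, sd2 ** 2 / n2
    t = (mean1 - mean2) / math.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# Myocardial mass: 176.1 +/- 58.4 g (n = 150) vs 155.9 +/- 51.7 g (n = 150).
t, df = welch_t(176.1, 58.4, 150, 155.9, 51.7, 150)
print(round(t, 2))  # about 3.2, consistent with the reported p = 0.002
```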

  4. Insights into the function of ion channels by computational electrophysiology simulations.

    PubMed

    Kutzner, Carsten; Köpfer, David A; Machtens, Jan-Philipp; de Groot, Bert L; Song, Chen; Zachariae, Ulrich

    2016-07-01

    Ion channels are of universal importance for all cell types and play key roles in cellular physiology and pathology. Increased insight into their functional mechanisms is crucial to enable drug design on this important class of membrane proteins, and to enhance our understanding of some of the fundamental features of cells. This review presents the concepts behind the recently developed simulation protocol Computational Electrophysiology (CompEL), which facilitates the atomistic simulation of ion channels in action. In addition, the review provides guidelines for its application in conjunction with the molecular dynamics software package GROMACS. We first lay out the rationale for designing CompEL as a method that models the driving force for ion permeation through channels the way it is established in cells, i.e., by electrochemical ion gradients across the membrane. This is followed by an outline of its implementation and a description of key settings and parameters helpful to users wishing to set up and conduct such simulations. In recent years, key mechanistic and biophysical insights have been obtained by employing the CompEL protocol to address a wide range of questions on ion channels and permeation. We summarize these recent findings on membrane proteins, which span a spectrum from highly ion-selective, narrow channels to wide diffusion pores. Finally we discuss the future potential of CompEL in light of its limitations and strengths. This article is part of a Special Issue entitled: Membrane Proteins edited by J.C. Gumbart and Sergei Noskov. PMID:26874204

  5. Effects of a computer-based intervention program on the communicative functions of children with autism.

    PubMed

    Hetzroni, Orit E; Tannous, Juman

    2004-04-01

    This study investigated the use of computer-based intervention for enhancing communication functions of children with autism. The software program was developed based on daily life activities in the areas of play, food, and hygiene. The following variables were investigated: delayed echolalia, immediate echolalia, irrelevant speech, relevant speech, and communicative initiations. Multiple-baseline design across settings was used to examine the effects of the exposure of five children with autism to activities in a structured and controlled simulated environment on the communication manifested in their natural environment. Results indicated that after exposure to the simulations, all children produced fewer sentences with delayed and irrelevant speech. Most of the children engaged in fewer sentences involving immediate echolalia and increased the number of communication intentions and the amount of relevant speech they produced. Results indicated that after practicing in a controlled and structured setting that provided the children with opportunities to interact in play, food, and hygiene activities, the children were able to transfer their knowledge to the natural classroom environment. Implications and future research directions are discussed. PMID:15162930

  6. Synchrotron-based dynamic computed tomography of tissue motion for regional lung function measurement

    PubMed Central

    Dubsky, Stephen; Hooper, Stuart B.; Siu, Karen K. W.; Fouras, Andreas

    2012-01-01

    During breathing, lung inflation is a dynamic process involving a balance of mechanical factors, including trans-pulmonary pressure gradients, tissue compliance and airway resistance. Current techniques lack the capacity for dynamic measurement of ventilation in vivo at sufficient spatial and temporal resolution to allow the spatio-temporal patterns of ventilation to be precisely defined. As a result, little is known of the regional dynamics of lung inflation, in either health or disease. Using fast synchrotron-based imaging (up to 60 frames s−1), we have combined dynamic computed tomography (CT) with cross-correlation velocimetry to measure regional time constants and expansion within the mammalian lung in vivo. Additionally, our new technique provides estimation of the airflow distribution throughout the bronchial tree during the ventilation cycle. Measurements of lung expansion and airflow in mice and rabbit pups are shown to agree with independent measures. The ability to measure lung function at a regional level will provide invaluable information for studies into normal and pathological lung dynamics, and may provide new pathways for diagnosis of regional lung diseases. Although proof-of-concept data were acquired on a synchrotron, the methodology developed potentially lends itself to clinical CT scanning and therefore offers translational research opportunities. PMID:22491972
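
    The cross-correlation velocimetry step can be illustrated on synthetic 1-D data (the study itself works on 2-D/3-D CT frames, so this is only a conceptual sketch): the displacement between two frames is recovered from the location of the cross-correlation peak.

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.standard_normal(200)
shift_true = 7
frame_a = signal[:150]
frame_b = signal[shift_true:shift_true + 150]  # frame_a displaced by 7 samples

# c(lag) = sum_n frame_a[n + lag] * frame_b[n]; the peak sits at the true shift.
corr = np.correlate(frame_a, frame_b, mode="full")
lag = int(np.argmax(corr)) - (len(frame_b) - 1)
print(lag)  # 7
```

    In the paper this peak search is done on interrogation windows of image intensity between consecutive CT frames, yielding a local tissue-displacement field.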

  7. A Computational Model Quantifies the Effect of Anatomical Variability on Velopharyngeal Function

    PubMed Central

    Inouye, Joshua M.; Perry, Jamie L.; Lin, Kant Y.

    2015-01-01

    Purpose This study predicted the effects of velopharyngeal (VP) anatomical parameters on VP function to provide a greater understanding of speech mechanics and aid in the treatment of speech disorders. Method We created a computational model of the VP mechanism using dimensions obtained from magnetic resonance imaging measurements of 10 healthy adults. The model components included the levator veli palatini (LVP), the velum, and the posterior pharyngeal wall, and the simulations were based on material parameters from the literature. The outcome metrics were the VP closure force and LVP muscle activation required to achieve VP closure. Results Our average model compared favorably with experimental data from the literature. Simulations of 1,000 random anatomies reflected the large variability in closure forces observed experimentally. VP distance had the greatest effect on both outcome metrics when considering the observed anatomic variability. Other anatomical parameters were ranked by their predicted influences on the outcome metrics. Conclusions Our results support the implication that interventions for VP dysfunction that decrease anterior to posterior VP portal distance, increase velar length, and/or increase LVP cross-sectional area may be very effective. Future modeling studies will help to further our understanding of speech mechanics and optimize treatment of speech disorders. PMID:26049120

  8. Practical Steps toward Computational Unification: Helpful Perspectives for New Systems, Adding Functionality to Existing Ones

    NASA Astrophysics Data System (ADS)

    Troy, R. M.

    2005-12-01

    and functions may be integrated into a system efficiently, with minimal effort, and with an eye toward an eventual Computational Unification of the Earth Sciences. Fundamental to such systems is meta-data, which describe not only the content of data but also how intricate relationships are represented and used to good advantage. Retrieval techniques will be discussed, including trade-offs in using externally managed meta-data versus embedded meta-data, how the two may be integrated, and how "simplifying assumptions" may or may not actually be helpful. The perspectives presented in this talk or poster session are based upon the experience of the Sequoia 2000 and BigSur research projects at the University of California, Berkeley, which sought to unify NASA's Mission To Planet Earth's EOS-DIS, and on-going experience developed by Science Tools corporation, of which the author is a principal. NOTE: These ideas are most easily shared in the form of a talk, and we suspect that this session will generate a lot of interest. We would therefore prefer to have this session accepted as a talk as opposed to a poster session.

  9. Development of the Computer-Adaptive Version of the Late-Life Function and Disability Instrument

    PubMed Central

    Tian, Feng; Kopits, Ilona M.; Moed, Richard; Pardasaney, Poonam K.; Jette, Alan M.

    2012-01-01

    Background. Having psychometrically strong disability measures that minimize response burden is important in assessing older adults. Methods. Using the original 48 items from the Late-Life Function and Disability Instrument and newly developed items, a 158-item Activity Limitation and a 62-item Participation Restriction item pool were developed. The item pools were administered to a convenience sample of 520 community-dwelling adults 60 years or older. Confirmatory factor analysis and item response theory were employed to identify content structure, calibrate items, and build the computer-adaptive tests (CATs). We evaluated real-data simulations of 10-item CAT subscales. We collected data from 102 older adults to validate the 10-item CATs against the Veteran's Short Form-36 and assessed test–retest reliability in a subsample of 57 subjects. Results. Confirmatory factor analysis revealed a bifactor structure, and multi-dimensional item response theory was used to calibrate an overall Activity Limitation Scale (141 items) and an overall Participation Restriction Scale (55 items). Fit statistics were acceptable (Activity Limitation: comparative fit index = 0.95, Tucker Lewis Index = 0.95, root mean square error of approximation = 0.03; Participation Restriction: comparative fit index = 0.95, Tucker Lewis Index = 0.95, root mean square error of approximation = 0.05). Correlations of the 10-item CATs with the full item banks were substantial (Activity Limitation: r = .90; Participation Restriction: r = .95). Test–retest reliability estimates were high (Activity Limitation: r = .85; Participation Restriction: r = .80). Strength and pattern of correlations with Veteran's Short Form-36 subscales were as hypothesized. Each CAT, on average, took 3.56 minutes to administer. Conclusions. The Late-Life Function and Disability Instrument CATs demonstrated strong reliability, validity, accuracy, and precision. The Late-Life Function and Disability Instrument CAT can achieve
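
    A CAT of the kind described selects, at each step, the unused item that is most informative at the current ability estimate. A minimal sketch under a 2-parameter logistic IRT model, with made-up item parameters (not the instrument's calibrated values):

```python
import math

def p_correct(theta, a, b):
    """2PL probability of a positive response."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def information(theta, a, b):
    """Fisher information of a 2PL item at ability theta: a^2 * p * (1 - p)."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

# Hypothetical (discrimination a, difficulty b) pairs, for illustration only.
items = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.4), (1.0, 2.0)]

def next_item(theta, administered):
    """Index of the most informative not-yet-administered item."""
    candidates = [i for i in range(len(items)) if i not in administered]
    return max(candidates, key=lambda i: information(theta, *items[i]))

print(next_item(0.5, set()))  # the high-discrimination item near theta wins
```

    After each response the ability estimate is updated and the selection repeats, which is how a 10-item CAT can track a 141-item bank so closely.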

  10. GammaCHI: A package for the inversion and computation of the gamma and chi-square cumulative distribution functions (central and noncentral)

    NASA Astrophysics Data System (ADS)

    Gil, Amparo; Segura, Javier; Temme, Nico M.

    2015-06-01

    A Fortran 90 module GammaCHI for computing and inverting the gamma and chi-square cumulative distribution functions (central and noncentral) is presented. The main novelty of this package is its reliable and accurate inversion routines for the noncentral cumulative distribution functions. Additionally, the package also provides routines for computing the gamma function, the error function and other functions related to the gamma function. The module includes the routines cdfgamC, invcdfgamC, cdfgamNC, invcdfgamNC, errorfunction, inverfc, gamma, loggam, gamstar and quotgamm for, respectively: the computation of the central gamma distribution function (and its complementary function); the inversion of the central gamma distribution function; the computation of the noncentral gamma distribution function (and its complementary function); the inversion of the noncentral gamma distribution function; the computation of the error function and its complementary function; the inversion of the complementary error function; and the computation of the gamma function, the logarithm of the gamma function, the regulated gamma function and the ratio of two gamma functions.
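
    For readers without Fortran at hand, the two central-distribution operations (cdfgamC-style evaluation and invcdfgamC-style inversion, specialized to chi-square) can be sketched in Python. This illustrative code uses a power series for the regularized lower incomplete gamma function and plain bisection for the inverse; it is not the package's algorithm and is adequate only for moderate degrees of freedom.

```python
import math

def gammainc_lower(s, x):
    """Regularized lower incomplete gamma P(s, x) via its power series.
    Adequate for the moderate s and x used here."""
    if x <= 0:
        return 0.0
    if x > s + 50.0 * math.sqrt(s) + 100.0:
        return 1.0  # so deep in the right tail that P = 1 in double precision
    total = term = 1.0 / s
    n = 1
    while term > total * 1e-17:
        term *= x / (s + n)
        total += term
        n += 1
    return min(1.0, total * math.exp(-x + s * math.log(x) - math.lgamma(s)))

def chi2_cdf(x, k):
    """Central chi-square CDF with k degrees of freedom."""
    return gammainc_lower(k / 2.0, x / 2.0)

def chi2_inv(p, k):
    """Invert the central chi-square CDF by bisection."""
    lo, hi = 0.0, 10.0 * k + 200.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if chi2_cdf(mid, k) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(chi2_inv(0.95, 10), 3))  # 0.95 quantile for 10 dof, ~18.307
```

    The noncentral routines, the package's real novelty, additionally sum over a Poisson-weighted mixture of central terms and need the careful inversion machinery the abstract describes.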

  11. Evaluation of Coupled Perturbed and Density Functional Methods of Computing the Parity-Violating Energy Difference between Enantiomers

    NASA Astrophysics Data System (ADS)

    MacDermott, A. J.; Hyde, G. O.; Cohen, A. J.

    2009-03-01

    We present new coupled-perturbed Hartree-Fock (CPHF) and density functional theory (DFT) computations of the parity-violating energy difference (PVED) between enantiomers for H2O2 and H2S2. Our DFT PVED computations are the first for H2S2 and the first with the new HCTH and OLYP functionals. Like other “second generation” PVED computations, our results are an order of magnitude larger than the original “first generation” uncoupled-perturbed Hartree-Fock computations of Mason and Tranter. We offer an explanation for the dramatically larger size in terms of cancellation of contributions of opposing signs, which also explains the basis set sensitivity of the PVED, and its conformational hypersensitivity (addressed in the following paper). This paper also serves as a review of the different types of “second generation” PVED computations: we set our work in context, comparing our results with those of four other groups, and noting the good agreement between results obtained by very different methods. DFT PVEDs tend to be somewhat inflated compared to the CPHF values, but this is not a problem when only sign and order of magnitude are required. Our results with the new OLYP functional are less inflated than those with other functionals, and OLYP is also more efficient computationally. We therefore conclude that DFT computation offers a promising approach for low-cost extension to larger biosystems, especially polymers. The following two papers extend to terrestrial and extra-terrestrial amino acids respectively, and later work will extend to polymers.

  12. SARS: Safeguards Accounting and Reporting Software

    NASA Astrophysics Data System (ADS)

    Mohammedi, B.; Saadi, S.; Ait-Mohamed, S.

    In order to satisfy the requirements of the SSAC (State System for Accounting and Control of nuclear materials) for recording and reporting, this computer program bridges the gap between nuclear facility operators and the national inspectorate that verifies records and delivers reports. SARS maintains and generates at-facility safeguards accounting records and generates International Atomic Energy Agency (IAEA) safeguards reports based on accounting data input by the user at any nuclear facility. A database structure was built, and the BORLAND DELPHI programming language was used. The software is designed to be user-friendly, with extensive and flexible management of menus and graphs. SARS functions include basic physical inventory tracking, transaction histories and reporting. Access control is enforced through different passwords.

  13. Localized basis functions and other computational improvements in variational nonorthogonal basis function methods for quantum mechanical scattering problems involving chemical reactions

    NASA Technical Reports Server (NTRS)

    Schwenke, David W.; Truhlar, Donald G.

    1990-01-01

    The Generalized Newton Variational Principle for 3D quantum mechanical reactive scattering is briefly reviewed. Then three techniques are described which improve the efficiency of the computations. First, the fact that the Hamiltonian is Hermitian is used to reduce the number of integrals computed, and then the properties of localized basis functions are exploited in order to eliminate redundant work in the integral evaluation. A new type of localized basis function with desirable properties is suggested. It is shown how partitioned matrices can be used with localized basis functions to reduce the amount of work required to handle the complex boundary conditions. The new techniques do not introduce any approximations into the calculations, so they may be used to obtain converged solutions of the Schroedinger equation.

  14. Sensory processing during viewing of cinematographic material: computational modeling and functional neuroimaging.

    PubMed

    Bordier, Cecile; Puja, Francesco; Macaluso, Emiliano

    2013-02-15

    The investigation of brain activity using naturalistic, ecologically-valid stimuli is becoming an important challenge for neuroscience research. Several approaches have been proposed, primarily relying on data-driven methods (e.g. independent component analysis, ICA). However, data-driven methods often require some post-hoc interpretation of the imaging results to draw inferences about the underlying sensory, motor or cognitive functions. Here, we propose using a biologically-plausible computational model to extract (multi-)sensory stimulus statistics that can be used for standard hypothesis-driven analyses (general linear model, GLM). We ran two separate fMRI experiments, which both involved subjects watching an episode of a TV-series. In Exp 1, we manipulated the presentation by switching on-and-off color, motion and/or sound at variable intervals, whereas in Exp 2, the video was played in the original version, with all the consequent continuous changes of the different sensory features intact. Both for vision and audition, we extracted stimulus statistics corresponding to spatial and temporal discontinuities of low-level features, as well as a combined measure related to the overall stimulus saliency. Results showed that activity in occipital visual cortex and the superior temporal auditory cortex co-varied with changes of low-level features. Visual saliency was found to further boost activity in extra-striate visual cortex plus posterior parietal cortex, while auditory saliency was found to enhance activity in the superior temporal cortex. Data-driven ICA analyses of the same datasets also identified "sensory" networks comprising visual and auditory areas, but without providing specific information about the possible underlying processes, e.g., these processes could relate to modality, stimulus features and/or saliency. We conclude that the combination of computational modeling and GLM enables the tracking of the impact of bottom-up signals on brain activity
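
    The proposed hypothesis-driven analysis boils down to regressing each voxel's time course on model-derived stimulus regressors. A toy GLM with simulated data (regressor names, sizes and betas are illustrative only, not from the study):

```python
import numpy as np

rng = np.random.default_rng(42)
n_scans = 120
visual_saliency = rng.random(n_scans)    # model-derived regressor (illustrative)
auditory_saliency = rng.random(n_scans)  # model-derived regressor (illustrative)

# Simulated voxel that follows visual saliency (true beta = 2) plus noise.
voxel = 2.0 * visual_saliency + 0.1 * rng.standard_normal(n_scans)

# Design matrix: intercept plus the two stimulus regressors.
X = np.column_stack([np.ones(n_scans), visual_saliency, auditory_saliency])
betas, *_ = np.linalg.lstsq(X, voxel, rcond=None)
print(np.round(betas, 2))  # visual beta recovered near 2, auditory near 0
```

    In the actual analysis the regressors would first be convolved with a hemodynamic response function; the estimation step itself is the same least-squares fit.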

  15. PLATO Instruction for Elementary Accounting.

    ERIC Educational Resources Information Center

    McKeown, James C.

    A progress report of a study using computer assisted instruction (CAI) materials for an elementary course in accounting principles is presented. The study was based on the following objectives: (1) improvement of instruction in the elementary accounting sequence, and (2) help for transfer students from two-year institutions. The materials under…

  16. Computing alignment and orientation of non-linear molecules at room temperatures using random phase wave functions

    NASA Astrophysics Data System (ADS)

    Kallush, Shimshon; Fleischer, Sharly; Ultrafast terahertz molecular dynamics Collaboration

    2015-05-01

    Quantum simulation of large open systems is a hard task that demands huge computation and memory costs. The rotational dynamics of non-linear molecules at high temperature under external fields is such an example. At room temperature, the initial density matrix populates ~10^4 rotational states, and the whole coupled Hilbert space can reach ~10^6 states. Simulation with either the direct density matrix or the full basis set of populated wavefunctions is impossible. We employ the random phase wave function method to represent the initial state and compute several time-dependent and time-independent observables, such as the orientation and the alignment of the molecules. The error of the method was found to scale as N^(-1/2), where N is the number of wave function realizations employed. Scaling vs. the temperature was computed for weak and strong fields. As expected, the convergence of the method increases rapidly with the temperature and the field intensity.
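
    The quoted N^(-1/2) error scaling is the generic Monte Carlo rate, and can be demonstrated with a toy random-phase average (unrelated to the actual rotational-dynamics code): quadrupling the number of realizations should roughly halve the RMS error.

```python
import math
import random

random.seed(1)

def rms_error(n_samples, n_trials=400):
    """RMS error of an n_samples-realization random-phase estimate of
    <cos^2(phi)> = 1/2, averaged over n_trials repetitions."""
    sq = 0.0
    for _ in range(n_trials):
        est = sum(math.cos(random.uniform(0.0, 2.0 * math.pi)) ** 2
                  for _ in range(n_samples)) / n_samples
        sq += (est - 0.5) ** 2
    return math.sqrt(sq / n_trials)

e100, e400 = rms_error(100), rms_error(400)
print(round(e100 / e400, 2))  # roughly 2 = sqrt(400 / 100)
```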

  17. INTERP3: A computer routine for linear interpolation of trivariate functions defined by nondistinct unequally spaced variables

    NASA Technical Reports Server (NTRS)

    Hill, D. C.; Morris, S. J., Jr.

    1979-01-01

    A report on the computer routine INTERP3 is presented. The routine is designed to linearly interpolate a variable which is a function of three independent variables. The variables within the parameter arrays do not have to be distinct or equally spaced, and the array variables can be in increasing or decreasing order.
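
    A hedged sketch of the kind of computation INTERP3 performs, namely trilinear interpolation on unequally spaced axes. This simplified version assumes strictly increasing, distinct axis values (the original routine also handles repeated values and descending order):

```python
import bisect

def bracket(axis, v):
    """Index i with axis[i] <= v <= axis[i+1], plus the local fraction."""
    i = min(max(bisect.bisect_right(axis, v) - 1, 0), len(axis) - 2)
    t = (v - axis[i]) / (axis[i + 1] - axis[i])
    return i, t

def interp3(xs, ys, zs, table, x, y, z):
    """table[i][j][k] = f(xs[i], ys[j], zs[k]); trilinear estimate at (x, y, z)."""
    i, tx = bracket(xs, x)
    j, ty = bracket(ys, y)
    k, tz = bracket(zs, z)
    val = 0.0
    # Weighted sum over the 8 corners of the enclosing grid cell.
    for di, wx in ((0, 1 - tx), (1, tx)):
        for dj, wy in ((0, 1 - ty), (1, ty)):
            for dk, wz in ((0, 1 - tz), (1, tz)):
                val += wx * wy * wz * table[i + di][j + dj][k + dk]
    return val
```

    For any function that is linear in each variable, this reproduces the tabulated function exactly at and between grid points.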

  18. Investigating the Potential of Computer Environments for the Teaching and Learning of Functions: A Double Analysis from Two Research Traditions

    ERIC Educational Resources Information Center

    Lagrange, Jean-Baptiste; Psycharis, Giorgos

    2014-01-01

    The general goal of this paper is to explore the potential of computer environments for the teaching and learning of functions. To address this, different theoretical frameworks and corresponding research traditions are available. In this study, we aim to network different frameworks by following a "double analysis" method to analyse two…

  19. Utilization of high resolution computed tomography to visualize the three dimensional structure and function of plant vasculature

    Technology Transfer Automated Retrieval System (TEKTRAN)

    High resolution x-ray computed tomography (HRCT) is a non-destructive diagnostic imaging technique with sub-micron resolution capability that is now being used to evaluate the structure and function of plant xylem network in three dimensions (3D). HRCT imaging is based on the same principles as medi...

  20. Density Functional Computations and Mass Spectrometric Measurements. Can this Coupling Enlarge the Knowledge of Gas-Phase Chemistry?

    NASA Astrophysics Data System (ADS)

    Marino, T.; Russo, N.; Sicilia, E.; Toscano, M.; Mineva, T.

    A series of gas-phase properties of the systems has been investigated by using different exchange-correlation potentials and basis sets of increasing size in the framework of Density Functional theory with the aim to determine a strategy able to give reliable results with reasonable computational efforts.

  1. Content Range and Precision of a Computer Adaptive Test of Upper Extremity Function for Children with Cerebral Palsy

    ERIC Educational Resources Information Center

    Montpetit, Kathleen; Haley, Stephen; Bilodeau, Nathalie; Ni, Pengsheng; Tian, Feng; Gorton, George, III; Mulcahey, M. J.

    2011-01-01

    This article reports on the content range and measurement precision of an upper extremity (UE) computer adaptive testing (CAT) platform of physical function in children with cerebral palsy. Upper extremity items representing skills of all abilities were administered to 305 parents. These responses were compared with two traditional standardized…

  2. Implementation of the AES as a Hash Function for Confirming the Identity of Software on a Computer System

    SciTech Connect

    Hansen, Randy R.; Bass, Robert B.; Kouzes, Richard T.; Mileson, Nicholas D.

    2003-01-20

    This paper provides a brief overview of the implementation of the Advanced Encryption Standard (AES) as a hash function for confirming the identity of software resident on a computer system. The PNNL Software Authentication team chose to use a hash function to confirm software identity on a system for situations where: (1) there is limited time to perform the confirmation and (2) access to the system is restricted to keyboard or thumbwheel input and output can only be displayed on a monitor. PNNL reviewed three popular algorithms: the Secure Hash Algorithm - 1 (SHA-1), the Message Digest - 5 (MD-5), and the Advanced Encryption Standard (AES), and selected the AES to incorporate into the software confirmation tool we developed. This paper gives a brief overview of the SHA-1, MD-5, and the AES and cites references for further detail. It then explains the overall processing steps of the AES to reduce a large amount of generic data (the plain text, such as is present in memory and other data storage media in a computer system) to a small amount of data (the hash digest), which is a mathematically unique representation or signature of the former that can be displayed on a computer's monitor. This paper starts with a simple definition and example to illustrate the use of a hash function. It concludes with a description of how the software confirmation tool uses the hash function to confirm the identity of software on a computer system.
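
    The confirmation workflow itself is independent of the chosen hash. The sketch below uses SHA-256 from Python's standard library as a stand-in for the paper's AES-based construction, purely to illustrate reducing a software image to a short digest that an inspector can compare against a known-good value on a monitor; the image contents are hypothetical.

```python
import hashlib

def software_digest(data: bytes) -> str:
    """Reduce a software/memory image to a short displayable signature."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical image contents, for illustration only.
known_good = software_digest(b"firmware v1.0 image bytes")
field_copy = software_digest(b"firmware v1.0 image bytes")
tampered = software_digest(b"firmware v1.0 image bytez")

print(field_copy == known_good)  # True: identical software, identical digest
print(tampered == known_good)    # False: a one-byte change alters the digest
```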

  3. On One Unusual Method of Computation of Limits of Rational Functions in the Program Mathematica[R]

    ERIC Educational Resources Information Center

    Hora, Jaroslav; Pech, Pavel

    2005-01-01

    Computing limits of functions is a traditional part of mathematical analysis which is very difficult for students. Now an algorithm for the elimination of quantifiers in the field of real numbers is implemented in the program Mathematica. This offers a non-traditional view on this classical theme. (Contains 1 table.)

  4. Using High Resolution Computed Tomography to Visualize the Three Dimensional Structure and Function of Plant Vasculature

    PubMed Central

    McElrone, Andrew J.; Choat, Brendan; Parkinson, Dilworth Y.; MacDowell, Alastair A.; Brodersen, Craig R.

    2013-01-01

    High resolution x-ray computed tomography (HRCT) is a non-destructive diagnostic imaging technique with sub-micron resolution capability that is now being used to evaluate the structure and function of the plant xylem network in three dimensions (3D) (e.g. Brodersen et al. 2010; 2011; 2012a,b). HRCT imaging is based on the same principles as medical CT systems, but a high intensity synchrotron x-ray source results in higher spatial resolution and decreased image acquisition time. Here, we demonstrate in detail how synchrotron-based HRCT (performed at the Advanced Light Source-LBNL Berkeley, CA, USA) in combination with Avizo software (VSG Inc., Burlington, MA, USA) is being used to explore plant xylem in excised tissue and living plants. This new imaging tool allows users to move beyond traditional static, 2D light or electron micrographs and study samples using virtual serial sections in any plane. An infinite number of slices in any orientation can be made on the same sample, a feature that is physically impossible using traditional microscopy methods. Results demonstrate that HRCT can be applied to both herbaceous and woody plant species, and a range of plant organs (i.e. leaves, petioles, stems, trunks, roots). Figures presented here help demonstrate both a range of representative plant vascular anatomy and the type of detail extracted from HRCT datasets, including scans ranging from coast redwood (Sequoia sempervirens), walnut (Juglans spp.), oak (Quercus spp.), and maple (Acer spp.) tree saplings to sunflowers (Helianthus annuus), grapevines (Vitis spp.), and ferns (Pteridium aquilinum and Woodwardia fimbriata). Excised and dried samples from woody species are easiest to scan and typically yield the best images. However, recent improvements (i.e. more rapid scans and sample stabilization) have made it possible to use this visualization technique on green tissues (e.g. petioles) and in living plants. On occasion some shrinkage of hydrated green plant tissues will cause

  5. Quantitative Functional Imaging Using Dynamic Positron Computed Tomography and Rapid Parameter Estimation Techniques

    NASA Astrophysics Data System (ADS)

    Koeppe, Robert Allen

    Positron computed tomography (PCT) is a diagnostic imaging technique that provides both three-dimensional imaging capability and quantitative measurements of local tissue radioactivity concentrations in vivo. This allows the development of non-invasive methods that employ the principles of tracer kinetics for determining physiological properties such as mass-specific blood flow, tissue pH, and rates of substrate transport or utilization. A physiologically based, two-compartment tracer kinetic model was derived to mathematically describe the exchange of a radioindicator between blood and tissue. The model was adapted for use with dynamic sequences of data acquired with a positron tomograph. Rapid estimation techniques were implemented to produce functional images of the model parameters by analyzing each individual pixel sequence of the image data. A detailed analysis of the performance characteristics of three different parameter estimation schemes was performed. The analysis included examination of errors caused by statistical uncertainties in the measured data, errors in the timing of the data, and errors caused by violation of various assumptions of the tracer kinetic model. Two specific radioindicators were investigated. (18)F-fluoromethane, an inert freely diffusible gas, was used for local quantitative determinations of both cerebral blood flow and tissue:blood partition coefficient. A method was developed that did not require direct sampling of arterial blood for the absolute scaling of flow values. The arterial input concentration time course was obtained by assuming that the alveolar or end-tidal expired breath radioactivity concentration is proportional to the arterial blood concentration. The scale of the input function was obtained from a series of venous blood concentration measurements. The method of absolute scaling using venous samples was validated in four studies, performed on normal volunteers, in which directly measured arterial concentrations
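The per-pixel estimation described above can be sketched numerically: a one-tissue compartment exchanges tracer with blood, and the two rate constants are recovered from each time course by nonlinear least squares. This is a minimal illustration, not the thesis's implementation; the rate-constant names (K1, k2), the synthetic gamma-variate input function, and the noise level are all assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

# Two-compartment (blood + tissue) kinetic model:
#   dCt/dt = K1*Ca(t) - k2*Ct(t)  =>  Ct(t) = K1 * (Ca convolved with exp(-k2*t))
t = np.linspace(0, 10, 200)          # minutes
dt = t[1] - t[0]
Ca = t * np.exp(-t)                  # synthetic arterial input function (assumed shape)

def tissue_curve(t, K1, k2):
    """Tissue activity: input function convolved with the impulse response."""
    irf = K1 * np.exp(-k2 * t)
    return np.convolve(Ca, irf)[: len(t)] * dt

# Simulate one pixel's measured time course, then fit it back.
rng = np.random.default_rng(0)
true_K1, true_k2 = 0.8, 0.4
Ct = tissue_curve(t, true_K1, true_k2) + rng.normal(0, 0.002, len(t))

# A parametric image repeats this fit for every pixel's time course.
(K1_hat, k2_hat), _ = curve_fit(tissue_curve, t, Ct, p0=[0.5, 0.5])
```

A functional image of blood flow would then map `K1_hat` (or a flow value derived from it) at each pixel position.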

  6. Using high resolution computed tomography to visualize the three dimensional structure and function of plant vasculature.

    PubMed

    McElrone, Andrew J; Choat, Brendan; Parkinson, Dilworth Y; MacDowell, Alastair A; Brodersen, Craig R

    2013-01-01

    High resolution x-ray computed tomography (HRCT) is a non-destructive diagnostic imaging technique with sub-micron resolution capability that is now being used to evaluate the structure and function of plant xylem networks in three dimensions (3D) (e.g. Brodersen et al. 2010; 2011; 2012a,b). HRCT imaging is based on the same principles as medical CT systems, but a high intensity synchrotron x-ray source results in higher spatial resolution and decreased image acquisition time. Here, we demonstrate in detail how synchrotron-based HRCT (performed at the Advanced Light Source-LBNL Berkeley, CA, USA) in combination with Avizo software (VSG Inc., Burlington, MA, USA) is being used to explore plant xylem in excised tissue and living plants. This new imaging tool allows users to move beyond traditional static, 2D light or electron micrographs and study samples using virtual serial sections in any plane. An infinite number of slices in any orientation can be made on the same sample, a feature that is physically impossible using traditional microscopy methods. Results demonstrate that HRCT can be applied to both herbaceous and woody plant species, and a range of plant organs (i.e. leaves, petioles, stems, trunks, roots). Figures presented here help demonstrate both a range of representative plant vascular anatomy and the type of detail extracted from HRCT datasets, including scans ranging from coast redwood (Sequoia sempervirens), walnut (Juglans spp.), oak (Quercus spp.), and maple (Acer spp.) tree saplings to sunflowers (Helianthus annuus), grapevines (Vitis spp.), and ferns (Pteridium aquilinum and Woodwardia fimbriata). Excised and dried samples from woody species are easiest to scan and typically yield the best images. However, recent improvements (i.e. more rapid scans and sample stabilization) have made it possible to use this visualization technique on green tissues (e.g. petioles) and in living plants. On occasion some shrinkage of hydrated green plant tissues will cause

  7. High-throughput optogenetic functional magnetic resonance imaging with parallel computations

    PubMed Central

    Fang, Zhongnan; Lee, Jin Hyung

    2013-01-01

    Optogenetic functional magnetic resonance imaging (ofMRI) technology enables cell-type specific, temporally precise neuronal control and accurate, in vivo readout of resulting activity across the whole brain. With the ability to precisely control excitation and inhibition parameters, and to accurately record the resulting activity, there is an increased need for a high-throughput method to bring ofMRI studies to their full potential. In this paper, an advanced system that allows real-time fMRI with interactive control and analysis in a fraction of the MRI acquisition repetition time (TR) is proposed. With such high processing speed, sufficient time will be available for integration of future developments that can further enhance ofMRI data quality or better streamline the study. We designed and implemented a highly optimized, massively parallel system using graphics processing units (GPUs) which achieves reconstruction, motion correction, and analysis of 3D volume data in approximately 12.80 ms. As a result, with a 750 ms TR and 4-interleave fMRI acquisition, we can now conduct sliding-window reconstruction, motion correction, analysis and display in approximately 1.7% of the TR. Therefore, a significant amount of time can now be allocated to integrating advanced but computationally intensive methods that can enable higher image quality and better analysis results, all within a TR. Utilizing the proposed high-throughput imaging platform with sliding-window reconstruction, we were also able to observe the much-debated initial dips in our ofMRI data. Combined with methods to further improve SNR, the proposed system will enable efficient real-time, interactive, high-throughput ofMRI studies. PMID:23747482
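The timing budget quoted in the abstract can be checked with a line of arithmetic: 12.80 ms of per-volume processing against a 750 ms TR is about 1.7%, leaving nearly the entire TR free for additional computation.

```python
# Per-volume processing time vs. repetition time, using the abstract's figures.
proc_ms = 12.80   # reconstruction + motion correction + analysis of one 3D volume
tr_ms = 750.0     # fMRI repetition time (TR)

fraction = proc_ms / tr_ms            # share of each TR consumed by processing
headroom_ms = tr_ms - proc_ms         # time left for extra methods within a TR
print(f"{fraction:.1%} of the TR used, {headroom_ms:.1f} ms of headroom")
```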

  8. Computational and functional analyses of a small-molecule binding site in ROMK.

    PubMed

    Swale, Daniel R; Sheehan, Jonathan H; Banerjee, Sreedatta; Husni, Afeef S; Nguyen, Thuy T; Meiler, Jens; Denton, Jerod S

    2015-03-10

    The renal outer medullary potassium channel (ROMK, or Kir1.1, encoded by KCNJ1) critically regulates renal tubule electrolyte and water transport and hence blood volume and pressure. The discovery of loss-of-function mutations in KCNJ1 underlying renal salt and water wasting and lower blood pressure has sparked interest in developing new classes of antihypertensive diuretics targeting ROMK. The recent development of nanomolar-affinity small-molecule inhibitors of ROMK creates opportunities for exploring the chemical and physical basis of ligand-channel interactions required for selective ROMK inhibition. We previously reported that the bis-nitro-phenyl ROMK inhibitor VU591 exhibits voltage-dependent knock-off at hyperpolarizing potentials, suggesting that the binding site is located within the ion-conduction pore. In this study, comparative molecular modeling and in silico ligand docking were used to interrogate the full-length ROMK pore for energetically favorable VU591 binding sites. Cluster analysis of 2498 low-energy poses resulting from 9900 Monte Carlo docking trajectories on each of 10 conformationally distinct ROMK comparative homology models identified two putative binding sites in the transmembrane pore that were subsequently tested for a role in VU591-dependent inhibition using site-directed mutagenesis and patch-clamp electrophysiology. Introduction of mutations into the lower site had no effect on the sensitivity of the channel to VU591. In contrast, mutations of Val(168) or Asn(171) in the upper site, which are unique to ROMK within the Kir channel family, led to a dramatic reduction in VU591 sensitivity. This study highlights the utility of computational modeling for defining ligand-ROMK interactions and proposes a mechanism for inhibition of ROMK. PMID:25762321

  9. Computational and Functional Analyses of a Small-Molecule Binding Site in ROMK

    PubMed Central

    Swale, Daniel R.; Sheehan, Jonathan H.; Banerjee, Sreedatta; Husni, Afeef S.; Nguyen, Thuy T.; Meiler, Jens; Denton, Jerod S.

    2015-01-01

    The renal outer medullary potassium channel (ROMK, or Kir1.1, encoded by KCNJ1) critically regulates renal tubule electrolyte and water transport and hence blood volume and pressure. The discovery of loss-of-function mutations in KCNJ1 underlying renal salt and water wasting and lower blood pressure has sparked interest in developing new classes of antihypertensive diuretics targeting ROMK. The recent development of nanomolar-affinity small-molecule inhibitors of ROMK creates opportunities for exploring the chemical and physical basis of ligand-channel interactions required for selective ROMK inhibition. We previously reported that the bis-nitro-phenyl ROMK inhibitor VU591 exhibits voltage-dependent knock-off at hyperpolarizing potentials, suggesting that the binding site is located within the ion-conduction pore. In this study, comparative molecular modeling and in silico ligand docking were used to interrogate the full-length ROMK pore for energetically favorable VU591 binding sites. Cluster analysis of 2498 low-energy poses resulting from 9900 Monte Carlo docking trajectories on each of 10 conformationally distinct ROMK comparative homology models identified two putative binding sites in the transmembrane pore that were subsequently tested for a role in VU591-dependent inhibition using site-directed mutagenesis and patch-clamp electrophysiology. Introduction of mutations into the lower site had no effect on the sensitivity of the channel to VU591. In contrast, mutations of Val168 or Asn171 in the upper site, which are unique to ROMK within the Kir channel family, led to a dramatic reduction in VU591 sensitivity. This study highlights the utility of computational modeling for defining ligand-ROMK interactions and proposes a mechanism for inhibition of ROMK. PMID:25762321

  10. The Reaction Coordinate of a Functional Model of Tyrosinase: Spectroscopic and Computational Characterization

    PubMed Central

    Op’t Holt, Bryan T.; Vance, Michael A.; Mirica, Liviu M.; Stack, T. Daniel P.; Solomon, Edward I.

    2009-01-01

    The μ-η2:η2-peroxodicopper(II) complex synthesized by reacting the Cu(I) complex of the bis-diamine ligand N,N′-di-tert-butyl-ethylenediamine (DBED) with O2 is a functional and spectroscopic model of the coupled binuclear copper protein tyrosinase. This complex reacts with 2,4-di-tert-butylphenolate at low temperature to produce a mixture of the catechol and quinone products, which proceeds through three intermediates (A – C) that have been characterized. A, stabilized at 153K, is characterized as a phenolate-bonded bis-μ-oxo dicopper(III) species, which proceeds at 193K to B, presumably a catecholate-bridged coupled bis-copper(II) species via an electrophilic aromatic substitution mechanism wherein aromatic ring distortion is the rate-limiting step. Isotopic labeling shows that the oxygen inserted into the aromatic substrate during hydroxylation derives from dioxygen, and a late-stage ortho-H+ transfer to an exogenous base is associated with C-O bond formation. Addition of a proton to B produces C, determined from resonance Raman spectra to be a Cu(II)-semiquinone complex. The formation of C (the oxidation of catecholate and reduction to Cu(I)) is governed by the protonation state of the distal bridging oxygen ligand of B. Parallels and contrasts are drawn between the spectroscopically and computationally supported mechanism of the DBED system, presented here, and the experimentally-derived mechanism of the coupled binuclear copper protein tyrosinase. PMID:19368383

  11. Complex functionality with minimal computation. Promise and pitfalls of reduced-tracer ocean biogeochemistry models

    SciTech Connect

    Galbraith, Eric D.; Dunne, John P.; Gnanadesikan, Anand; Slater, Richard D.; Sarmiento, Jorge L.; Dufour, Carolina O.; de Souza, Gregory F.; Bianchi, Daniele; Claret, Mariona; Rodgers, Keith B.; Marvasti, Seyedehsafoura Sedigh

    2015-12-21

    Earth System Models increasingly include ocean biogeochemistry models in order to predict changes in ocean carbon storage, hypoxia, and biological productivity under climate change. However, state-of-the-art ocean biogeochemical models include many advected tracers that significantly increase the computational resources required, forcing a trade-off with spatial resolution. Here, we compare a state-of-the-art model with 30 prognostic tracers (TOPAZ) with two reduced-tracer models, one with 6 tracers (BLING), and the other with 3 tracers (miniBLING). The reduced-tracer models employ parameterized, implicit biological functions, which nonetheless capture many of the most important processes resolved by TOPAZ. All three are embedded in the same coupled climate model. Despite the large difference in tracer number, the absence of tracers for living organic matter is shown to have a minimal impact on the transport of nutrient elements, and the three models produce similar mean annual preindustrial distributions of macronutrients, oxygen, and carbon. Significant differences do exist among the models, in particular the seasonal cycle of biomass and export production, but it does not appear that these are necessary consequences of the reduced tracer number. With increasing CO2, changes in dissolved oxygen and anthropogenic carbon uptake are very similar across the different models. Thus, while the reduced-tracer models do not explicitly resolve the diversity and internal dynamics of marine ecosystems, we demonstrate that such models are applicable to a broad suite of major biogeochemical concerns, including anthropogenic change. Lastly, these results are very promising for the further development and application of reduced-tracer biogeochemical models that incorporate ‘‘sub-ecosystem-scale’’ parameterizations.

  12. Complex functionality with minimal computation. Promise and pitfalls of reduced-tracer ocean biogeochemistry models

    DOE PAGESBeta

    Galbraith, Eric D.; Dunne, John P.; Gnanadesikan, Anand; Slater, Richard D.; Sarmiento, Jorge L.; Dufour, Carolina O.; de Souza, Gregory F.; Bianchi, Daniele; Claret, Mariona; Rodgers, Keith B.; et al

    2015-12-21

    Earth System Models increasingly include ocean biogeochemistry models in order to predict changes in ocean carbon storage, hypoxia, and biological productivity under climate change. However, state-of-the-art ocean biogeochemical models include many advected tracers that significantly increase the computational resources required, forcing a trade-off with spatial resolution. Here, we compare a state-of-the-art model with 30 prognostic tracers (TOPAZ) with two reduced-tracer models, one with 6 tracers (BLING), and the other with 3 tracers (miniBLING). The reduced-tracer models employ parameterized, implicit biological functions, which nonetheless capture many of the most important processes resolved by TOPAZ. All three are embedded in the same coupled climate model. Despite the large difference in tracer number, the absence of tracers for living organic matter is shown to have a minimal impact on the transport of nutrient elements, and the three models produce similar mean annual preindustrial distributions of macronutrients, oxygen, and carbon. Significant differences do exist among the models, in particular the seasonal cycle of biomass and export production, but it does not appear that these are necessary consequences of the reduced tracer number. With increasing CO2, changes in dissolved oxygen and anthropogenic carbon uptake are very similar across the different models. Thus, while the reduced-tracer models do not explicitly resolve the diversity and internal dynamics of marine ecosystems, we demonstrate that such models are applicable to a broad suite of major biogeochemical concerns, including anthropogenic change. Lastly, these results are very promising for the further development and application of reduced-tracer biogeochemical models that incorporate ‘‘sub-ecosystem-scale’’ parameterizations.

  13. Complex functionality with minimal computation: Promise and pitfalls of reduced-tracer ocean biogeochemistry models

    NASA Astrophysics Data System (ADS)

    Galbraith, Eric D.; Dunne, John P.; Gnanadesikan, Anand; Slater, Richard D.; Sarmiento, Jorge L.; Dufour, Carolina O.; de Souza, Gregory F.; Bianchi, Daniele; Claret, Mariona; Rodgers, Keith B.; Marvasti, Seyedehsafoura Sedigh

    2015-12-01

    Earth System Models increasingly include ocean biogeochemistry models in order to predict changes in ocean carbon storage, hypoxia, and biological productivity under climate change. However, state-of-the-art ocean biogeochemical models include many advected tracers that significantly increase the computational resources required, forcing a trade-off with spatial resolution. Here, we compare a state-of-the-art model with 30 prognostic tracers (TOPAZ) with two reduced-tracer models, one with 6 tracers (BLING), and the other with 3 tracers (miniBLING). The reduced-tracer models employ parameterized, implicit biological functions, which nonetheless capture many of the most important processes resolved by TOPAZ. All three are embedded in the same coupled climate model. Despite the large difference in tracer number, the absence of tracers for living organic matter is shown to have a minimal impact on the transport of nutrient elements, and the three models produce similar mean annual preindustrial distributions of macronutrients, oxygen, and carbon. Significant differences do exist among the models, in particular the seasonal cycle of biomass and export production, but it does not appear that these are necessary consequences of the reduced tracer number. With increasing CO2, changes in dissolved oxygen and anthropogenic carbon uptake are very similar across the different models. Thus, while the reduced-tracer models do not explicitly resolve the diversity and internal dynamics of marine ecosystems, we demonstrate that such models are applicable to a broad suite of major biogeochemical concerns, including anthropogenic change. These results are very promising for the further development and application of reduced-tracer biogeochemical models that incorporate "sub-ecosystem-scale" parameterizations.

  14. The application of computer assisted technologies (CAT) in the rehabilitation of cognitive functions in psychiatric disorders of childhood and adolescence.

    PubMed

    Srebnicki, Tomasz; Bryńska, Anita

    2016-01-01

    First applications of computer-assisted technologies (CAT) in the rehabilitation of cognitive deficits, including child and adolescent psychiatric disorders, date back to the 1980s. Recent developments in computer technologies, wide access to the Internet and the vast expansion of electronic devices have resulted in a dynamic increase in therapeutic software as well as supporting devices. The aim of computer-assisted technologies is the improvement of comfort and quality of life as well as the rehabilitation of impaired functions. The goal of this article is to present the most common computer-assisted technologies used in the therapy of children and adolescents with cognitive deficits, and to review the literature on their effectiveness, including the challenges and limitations of implementing such interventions. PMID:27556116

  15. Accelerating Scientific Discovery Through Computation and Visualization III. Tight-Binding Wave Functions for Quantum Dots

    PubMed Central

    Sims, James S.; George, William L.; Griffin, Terence J.; Hagedorn, John G.; Hung, Howard K.; Kelso, John T.; Olano, Marc; Peskin, Adele P.; Satterfield, Steven G.; Terrill, Judith Devaney; Bryant, Garnett W.; Diaz, Jose G.

    2008-01-01

    This is the third in a series of articles that describe, through examples, how the Scientific Applications and Visualization Group (SAVG) at NIST has utilized high performance parallel computing, visualization, and machine learning to accelerate scientific discovery. In this article we focus on the use of high performance computing and visualization for simulations of nanotechnology. PMID:27096116

  16. On the computation of structure functions and mass spectra in a relativistic Hamiltonian formalism: A lattice point of view

    NASA Astrophysics Data System (ADS)

    Scheu, Norbert

    1998-11-01

    A non-perturbative computation of hadronic structure functions for deep inelastic lepton-hadron scattering has not been achieved yet. In this thesis we investigate the viability of the Hamiltonian approach for computing hadronic structure functions. In the literature, the so-called front form (FF) approach is favoured over the instant form (IF), the conventional Hamiltonian approach, due to claims (a) that structure functions are related to light-like correlation functions and (b) that the front form is much simpler for numerical computations. We dispel both claims using general arguments as well as practical computations (for the scalar model and two-dimensional QED), demonstrating (a) that structure functions are related to space-like correlations and (b) that the IF is better suited for practical computations if appropriate approximations are introduced. Moreover, we show that the FF is unphysical in general, for the following reasons: (1) the FF constitutes an incomplete quantisation of field theories; (2) the FF 'predicts' an infinite speed of light in one space dimension, a complete breakdown of microcausality, and the ubiquity of time travel. Additionally, we demonstrate that the FF cannot be approached by so-called ɛ co-ordinates, which are but the instant form in disguise, and that the FF cannot be legitimated as an effective theory. Finally, we demonstrate that the so-called infinite momentum frame is neither physical nor equivalent to the FF.

  17. Accounting for the environment.

    PubMed

    Lutz, E; Munasinghe, M

    1991-03-01

    Environmental awareness in the 1980s has led to efforts to improve the current UN System of National Accounts (SNA) for better measurement of the value of environmental resources when estimating income. National governments, the UN, the International Monetary Fund, and the World Bank are interested in solving this issue. The World Bank relies heavily on national aggregates in income accounts compiled by means of the SNA, which was published in 1968 and stressed gross domestic product (GDP). GDP measures mainly market activity, but it does not consider the consumption of natural capital, and indirectly inhibits sustained development. The deficiencies of the current method of accounting are the inconsistent treatment of manmade and natural capital, the omission of natural resources and their depletion from balance sheets, and the omission of pollution cleanup costs from national income. In the calculation of GDP, pollution is overlooked, and beneficial environmental inputs are valued at zero. The calculation of environmentally adjusted net domestic product (EDP) and environmentally adjusted net income (ENI) would lower income and growth rates, as the World Resources Institute found with respect to Indonesia for 1971-84: when depreciation for oil, timber, and topsoil was included, growth of the net domestic product (NDP) was only 4%, compared with 7.1% for GDP. The World Bank has advocated environmental accounting since 1983 in SNA revisions. The 1989 revised Blue Book of the SNA takes environmental concerns into account. Relevant research is under way in Mexico and Papua New Guinea using the UN Statistical Office framework as a system for environmentally adjusted economic accounts that computes EDP and ENI and integrates environmental data with national accounts while preserving SNA concepts. PMID:12285741
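The adjustment the article describes reduces to a simple identity: environmentally adjusted product is conventional product minus depreciation of manmade capital and depletion of natural capital. The sketch below uses invented numbers purely for illustration; the function name and arguments are not from any accounting standard.

```python
# EDP = GDP - depreciation of manmade capital - depletion of natural capital
# (pollution damage could be subtracted the same way). All figures are invented.
def edp(gdp, manmade_depreciation, natural_depletion, pollution_cost=0.0):
    """Environmentally adjusted net domestic product (illustrative)."""
    return gdp - manmade_depreciation - natural_depletion - pollution_cost

gdp = 1000.0  # hypothetical national-accounts units
adjusted = edp(gdp, manmade_depreciation=80.0, natural_depletion=45.0)
print(adjusted)
```

Comparing growth rates of `gdp` and `adjusted` over time reproduces the kind of GDP-versus-NDP gap the WRI found for Indonesia.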

  18. [COMPUTER TECHNOLOGY FOR ACCOUNTING OF CONFOUNDERS IN THE RISK ASSESSMENT IN COMPARATIVE STUDIES ON THE BASE OF THE METHOD OF STANDARDIZATION].

    PubMed

    Shalaumova, Yu V; Varaksin, A N; Panov, V G

    2016-01-01

    An analysis was performed of how concomitant variables (confounders), which introduce a systematic error into the assessment of the impact of risk factors on the outcome variable, are accounted for. The analysis showed that standardization is an effective method for reducing the bias of risk estimates. The paper proposes an algorithm implementing standardization based on stratification, which minimizes the difference between the confounder distributions in the groups defined by risk factors. To automate the standardization procedure, software was developed that is available on the website of the Institute of Industrial Ecology, UB RAS. Using this software, numerical modeling was applied to determine the conditions of applicability of stratification-based standardization for the case of a normal distribution of the response and the confounder and a linear relationship between them. Comparison of the standardization results with statistical methods (logistic regression and analysis of covariance) on a problem in human ecology showed that close results are obtained only when the applicability conditions of the statistical methods are met exactly; standardization is less sensitive to violations of its applicability conditions. PMID:27266034
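Stratification-based standardization of the kind described above compares groups after reweighting confounder strata by one shared standard distribution. The toy data, stratum labels, and weights below are invented for illustration only; they are not from the paper or its software.

```python
import numpy as np

# Direct standardization: stratum-specific means of each group are reweighted
# by the same "standard" confounder distribution, so the group comparison is
# not distorted by differing confounder distributions.
age = np.array([30, 30, 50, 50, 70, 70])                # confounder stratum labels
y_exposed   = np.array([1.0, 1.2, 2.0, 2.2, 3.0, 3.2])  # response, exposed group
y_reference = np.array([0.9, 1.1, 1.9, 2.1, 2.9, 3.1])  # response, reference group
standard = {30: 0.5, 50: 0.3, 70: 0.2}                  # shared stratum weights

def standardized_mean(y, stratum, weights):
    """Weight each stratum's mean response by the standard distribution."""
    return sum(w * y[stratum == s].mean() for s, w in weights.items())

diff = standardized_mean(y_exposed, age, standard) - \
       standardized_mean(y_reference, age, standard)
print(round(diff, 3))   # confounder-adjusted difference between groups
```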

  19. FIT: Computer Program that Interactively Determines Polynomial Equations for Data which are a Function of Two Independent Variables

    NASA Technical Reports Server (NTRS)

    Arbuckle, P. D.; Sliwa, S. M.; Roy, M. L.; Tiffany, S. H.

    1985-01-01

    A computer program for interactively developing least-squares polynomial equations to fit user-supplied data is described. The program is characterized by the ability to compute the polynomial equations of a surface fit through data that are a function of two independent variables. The program utilizes the Langley Research Center graphics packages to display polynomial equation curves and data points, facilitating a qualitative evaluation of the effectiveness of the fit. An explanation of the fundamental principles and features of the program, as well as sample input and corresponding output, is included.
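A least-squares surface fit of the kind the program performs can be sketched with a design matrix of monomials in the two independent variables. The polynomial degree, test surface, and coefficients below are illustrative, not the program's actual input or output.

```python
import numpy as np

# Fit z = f(x, y) by linear least squares over a monomial basis.
rng = np.random.default_rng(2)
x, y = rng.uniform(-1, 1, (2, 100))            # samples of the two variables
z = 1.0 + 2.0 * x - 3.0 * y + 0.5 * x * y      # known bilinear test surface

# Design matrix for the basis [1, x, y, x*y]; higher degrees just add columns.
A = np.column_stack([np.ones_like(x), x, y, x * y])
coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
print(np.round(coeffs, 3))                      # recovers the surface coefficients
```

Plotting `z` against `A @ coeffs` gives the qualitative fit check that the program provides graphically.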

  20. [Home computer stabilography: technical level, functional potentialities and spheres of application].

    PubMed

    Sliva, S S

    2005-01-01

    The paper describes, compares, and analyzes data on stabilographic computer equipment serially manufactured by leading foreign and Russian companies. Potential spheres of application of stabilographic equipment are discussed. PMID:15757091

  1. Functional Assessment for Human-Computer Interaction: A Method for Quantifying Physical Functional Capabilities for Information Technology Users

    ERIC Educational Resources Information Center

    Price, Kathleen J.

    2011-01-01

    The use of information technology is a vital part of everyday life, but for a person with functional impairments, technology interaction may be difficult at best. Information technology is commonly designed to meet the needs of a theoretical "normal" user. However, there is no such thing as a "normal" user. A user's capabilities will vary over…

  2. Open and closed cortico-subcortical loops: A neuro-computational account of access to consciousness in the distractor-induced blindness paradigm.

    PubMed

    Ebner, Christian; Schroll, Henning; Winther, Gesche; Niedeggen, Michael; Hamker, Fred H

    2015-09-01

    How the brain decides which information to process 'consciously' has been debated for decades without a simple explanation at hand. While most experiments manipulate the perceptual energy of presented stimuli, the distractor-induced blindness task is a prototypical paradigm to investigate gating of information into consciousness without or with only minor visual manipulation. In this paradigm, subjects are asked to report intervals of coherent dot motion in a rapid serial visual presentation (RSVP) stream, whenever these are preceded by a particular color stimulus in a different RSVP stream. If distractors (i.e., intervals of coherent dot motion prior to the color stimulus) are shown, subjects' abilities to perceive and report intervals of target dot motion decrease, particularly with short delays between intervals of target color and target motion. We propose a biologically plausible neuro-computational model of how the brain controls access to consciousness to explain how distractor-induced blindness originates from information processing in the cortex and basal ganglia. The model suggests that conscious perception requires reverberation of activity in cortico-subcortical loops and that basal-ganglia pathways can either allow or inhibit this reverberation. In the distractor-induced blindness paradigm, inadequate distractor-induced response tendencies are suppressed by the inhibitory 'hyperdirect' pathway of the basal ganglia. If a target follows such a distractor closely, temporal aftereffects of distractor suppression prevent target identification. The model reproduces experimental data on how delays between target color and target motion affect the probability of target detection. PMID:25802010

  3. On one-dimensional stretching functions for finite-difference calculations. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Vinokur, M.

    1979-01-01

    The class of one-dimensional stretching functions used in finite-difference calculations is studied. For solutions containing a highly localized region of rapid variation, simple criteria for a stretching function are derived using a truncation error analysis. These criteria are used to investigate two types of stretching functions. One is an interior stretching function, for which the location and slope of an interior clustering region are specified. The simplest such function satisfying the criteria is found to be one based on the inverse hyperbolic sine. The other type of function is a two-sided stretching function, for which the arbitrary slopes at the two ends of the one-dimensional interval are specified. The simplest such general function is found to be one based on the inverse tangent.
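An interior clustering function of the hyperbolic-sine family discussed above can be sketched as follows; the clustering location x_c, strength b, and grid size are illustrative choices, not values from the paper.

```python
import numpy as np

# Interior stretching via sinh: mapping a uniform computational coordinate xi
# through sinh clusters grid points around x = x_c, because the local spacing
# dx/dxi ~ cosh(b*(xi - x_c)) is smallest there and grows toward the ends.
def sinh_stretch(n, x_c=0.5, b=5.0):
    xi = np.linspace(0.0, 1.0, n)
    raw = np.sinh(b * (xi - x_c))
    return (raw - raw[0]) / (raw[-1] - raw[0])   # normalize to [0, 1]

x = sinh_stretch(21)
spacing = np.diff(x)
print(spacing.argmin(), spacing.argmax())        # finest cells sit near x_c
```

Specifying instead the slopes at both ends, as in the two-sided case, leads to the inverse-tangent family the paper identifies.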

  4. Enhancing functionality and performance in the PVM network computing system. Period 1 progress report

    SciTech Connect

    Sunderam, V.

    1995-08-01

    The research funded by this grant is part of an ongoing research project in heterogeneous distributed computing with the PVM system, at Emory as well as at Oak Ridge Labs and the University of Tennessee. This grant primarily supports research at Emory that continues to evolve new concepts and systems in distributed computing, but it also includes the PI's ongoing interaction with the other groups in terms of collaborative research as well as software systems development and maintenance. The research effort at Emory has, in this first project period of the renewal (September 1994-June 1995), focused on (a) I/O frameworks for supporting data management in PVM; (b) evolution of a multithreaded concurrent computing model; and (c) responsive and portable graphical profiling tools for PVM.

  5. Fast Computation of Solvation Free Energies with Molecular Density Functional Theory: Thermodynamic-Ensemble Partial Molar Volume Corrections.

    PubMed

    Sergiievskyi, Volodymyr P; Jeanmairet, Guillaume; Levesque, Maximilien; Borgis, Daniel

    2014-06-01

Molecular density functional theory (MDFT) offers an efficient implicit-solvent method to estimate molecular solvation free energies while conserving a fully molecular representation of the solvent. Even within a second-order approximation for the free-energy functional, the so-called homogeneous reference fluid approximation, we show that the hydration free energies computed for a data set of 500 organic compounds are of similar quality to those obtained from molecular dynamics free-energy perturbation simulations, at a computational cost reduced by 2-3 orders of magnitude. This requires introducing the proper partial molar volume correction to transform the results from the grand canonical to the isobaric-isothermal ensemble that is pertinent to experiments. We show that this correction can be extended to 3D-RISM calculations, giving a sound theoretical justification to empirical partial molar volume corrections that have been proposed recently. PMID:26273876

  6. The use of computer graphic techniques for the determination of ventricular function.

    NASA Technical Reports Server (NTRS)

    Sandler, H.; Rasmussen, D.

    1972-01-01

    Description of computer techniques employed to increase the speed, accuracy, reliability, and scope of angiocardiographic analyses determining human heart dimensions. Chamber margins are traced with a Calma 303 digitizer from projections of the angiographic films. The digitized margins of the ventricular images are filed in a computer for subsequent analysis. The margins can be displayed on the television screen of a graphics unit for individual study or they can be viewed in real time (or at any selected speed) to study dynamic changes in the chamber outline. The construction of three dimensional images of the ventricle is described.

  7. Integrating computational modeling and functional assays to decipher the structure-function relationship of influenza virus PB1 protein

    PubMed Central

    Li, Chunfeng; Wu, Aiping; Peng, Yousong; Wang, Jingfeng; Guo, Yang; Chen, Zhigao; Zhang, Hong; Wang, Yongqiang; Dong, Jiuhong; Wang, Lulan; Qin, F. Xiao-Feng; Cheng, Genhong; Deng, Tao; Jiang, Taijiao

    2014-01-01

The influenza virus PB1 protein is the core subunit of the heterotrimeric polymerase complex (PA, PB1 and PB2) in which PB1 is responsible for catalyzing RNA polymerization and binding to the viral RNA promoter. Among the three subunits, PB1 is the least known subunit so far in terms of its structural information. In this work, by integrating a template-based structural modeling approach with all known sequence and functional information about the PB1 protein, we constructed a modeled structure of PB1. Based on this model, we performed mutagenesis analysis for the key residues that constitute the RNA template binding and catalytic (TBC) channel in an RNP reconstitution system. The results correlated well with the model and further identified new residues of PB1 that are critical for RNA synthesis. Moreover, we derived 5 peptides from the sequence of PB1 that form the TBC channel, 4 of which can inhibit the viral RNA polymerase activity. Interestingly, we found that one of them, named PB1(491–515), can inhibit influenza virus replication by disrupting the viral RNA promoter binding activity of the polymerase. Therefore, this study has not only deepened our understanding of the structure-function relationship of PB1, but also promoted the development of novel therapeutics against influenza virus. PMID:25424584

  8. Evaluating the Appropriateness of a New Computer-Administered Measure of Adaptive Function for Children and Youth with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Coster, Wendy J.; Kramer, Jessica M.; Tian, Feng; Dooley, Meghan; Liljenquist, Kendra; Kao, Ying-Chia; Ni, Pengsheng

    2016-01-01

    The Pediatric Evaluation of Disability Inventory-Computer Adaptive Test is an alternative method for describing the adaptive function of children and youth with disabilities using a computer-administered assessment. This study evaluated the performance of the Pediatric Evaluation of Disability Inventory-Computer Adaptive Test with a national…

  9. Effective thermionic work function measurements of zirconium carbide using a computer-processed image of a thermionic projection microscope pattern

    SciTech Connect

Mackie, W.A.; Hinrichs, C.H.; Cohen, I.M.; Alin, J.S.; Schnitzler, D.T.; Carleson, P.; Ginn, R.; Krueger, P.; Vetter, C.G.; Davis, P.R.

    1990-05-01

We report on a unique experimental method to determine thermionic work functions of major crystal planes of single crystal zirconium carbide. Applications for transition metal carbides could include cathodes for advanced thermionic energy conversion, radiation immune microcircuitry, β-SiC substrates or high current density field emission cathodes. The primary emphasis of this paper is the analytical method used, that of computer processing a digitized image. ZrC single crystal specimens were prepared by floating zone arc refinement from sintered stock, yielding an average bulk stoichiometry of C/Zr=0.92. A 0.075 cm hemispherical cathode was prepared and mounted in a thermionic projection microscope (TPM) tube. The imaged patterns of thermally emitted electrons taken at various extraction voltages were digitized and computer analyzed to yield currents and corresponding emitting areas for major crystallographic planes. These data were taken at pyrometrically measured temperatures in the range 1700 < T < 2200 K. Schottky plots were then used to determine effective thermionic work functions as a function of crystallographic direction and temperature. Work function ordering for various crystal planes is reported through the TPM image processing method. Comparisons are made with effective thermionic and absolute (FERP) work function methods. To support the TPM image processing method, clean tungsten surfaces were examined and results are listed with accepted values.

  10. Discourse Functions and Vocabulary Use in English Language Learners' Synchronous Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Rabab'ah, Ghaleb

    2013-01-01

    This study explores the discourse generated by English as a foreign language (EFL) learners using synchronous computer-mediated communication (CMC) as an approach to help English language learners to create social interaction in the classroom. It investigates the impact of synchronous CMC mode on the quantity of total words, lexical range and…

  11. Computational insights into function and inhibition of fatty acid amide hydrolase.

    PubMed

    Palermo, Giulia; Rothlisberger, Ursula; Cavalli, Andrea; De Vivo, Marco

    2015-02-16

    The Fatty Acid Amide Hydrolase (FAAH) enzyme is a membrane-bound serine hydrolase responsible for the deactivating hydrolysis of a family of naturally occurring fatty acid amides. FAAH is a critical enzyme of the endocannabinoid system, being mainly responsible for regulating the level of its main cannabinoid substrate anandamide. For this reason, pharmacological inhibition of FAAH, which increases the level of endogenous anandamide, is a promising strategy to cure a variety of diseases including pain, inflammation, and cancer. Much structural, mutagenesis, and kinetic data on FAAH has been generated over the last couple of decades. This has prompted several informative computational investigations to elucidate, at the atomic-level, mechanistic details on catalysis and inhibition of this pharmaceutically relevant enzyme. Here, we review how these computational studies - based on classical molecular dynamics, full quantum mechanics, and hybrid QM/MM methods - have clarified the binding and reactivity of some relevant substrates and inhibitors of FAAH. We also discuss the experimental implications of these computational insights, which have provided a thoughtful elucidation of the complex physical and chemical steps of the enzymatic mechanism of FAAH. Finally, we discuss how computations have been helpful for building structure-activity relationships of potent FAAH inhibitors. PMID:25240419

  12. Variability in Reading Ability Gains as a Function of Computer-Assisted Instruction Method of Presentation

    ERIC Educational Resources Information Center

    Johnson, Erin Phinney; Perry, Justin; Shamir, Haya

    2010-01-01

    This study examines the effects on early reading skills of three different methods of presenting material with computer-assisted instruction (CAI): (1) learner-controlled picture menu, which allows the student to choose activities, (2) linear sequencer, which progresses the students through lessons at a pre-specified pace, and (3) mastery-based…

  13. Computers, Mass Media, and Schooling: Functional Equivalence in Uses of New Media.

    ERIC Educational Resources Information Center

    Lieberman, Debra A.; And Others

    1988-01-01

    Presents a study of 156 California eighth grade students which contrasted their recreational and intellectual computer use in terms of academic performance and use of other media. Among the conclusions were that recreational users watched television heavily and performed poorly in school, whereas intellectual users watched less television,…

  14. Next-generation computers

    SciTech Connect

    Torrero, E.A.

    1985-01-01

Developments related to tomorrow's computers are discussed, taking into account advances toward the fifth generation in Japan, the challenge to U.S. supercomputers, plans concerning the creation of supersmart computers for the U.S. military, a U.S. industry response to the Japanese challenge, a survey of U.S. and European research, Great Britain, the European Common Market, codifying human knowledge for machine reading, software engineering, the next-generation software, plans for obtaining the million-transistor chip, and fabrication issues for next-generation circuits. Other topics explored are related to a status report regarding artificial intelligence, an assessment of the technical challenges, aspects of sociotechnology, and defense advanced research projects. Attention is also given to expert systems, speech recognition, computer vision, function-level programming and automated programming, computing at the speed limit, VLSI, and superpower computers.

  15. A Computation of the Frequency Dependent Dielectric Function for Energetic Materials

    NASA Astrophysics Data System (ADS)

    Zwitter, D. E.; Kuklja, M. M.; Kunz, A. B.

    1999-06-01

    The imaginary part of the dielectric function as a function of frequency is calculated for the solids RDX, TATB, ADN, and PETN. Calculations have been performed including the effects of isotropic and uniaxial pressure. Simple lattice defects are included in some of the calculations.

  16. Computer analysis of protein functional sites projection on exon structure of genes in Metazoa

    PubMed Central

    2015-01-01

Background Study of the relationship between the structural and functional organization of proteins and their coding genes is necessary for an understanding of the evolution of molecular systems and can provide new knowledge for many applications, such as designing proteins with improved medical and biological properties. It is well known that the functional properties of proteins are determined by their functional sites. Functional sites are usually represented by a small number of amino acid residues that are distantly located from each other in the amino acid sequence. They are highly conserved within their functional group and vary significantly in structure between such groups. Given these facts, analysis of the general properties of the structural organization of functional sites at the protein level and at the level of the exon-intron structure of the coding gene remains an open problem. Results One approach to this analysis is the projection of the amino acid residue positions of the functional sites, along with the exon boundaries, onto the gene structure. In this paper, we examined the discontinuity of the functional sites in the exon-intron structure of genes and the distribution of lengths and phases of the functional-site-encoding exons in vertebrate genes. We have shown that the DNA fragments coding the functional sites lie in the same exon or in closely spaced exons. This observed tendency of functional-site-coding exons to cluster could be considered a unit of protein evolution. We studied the characteristics of the structure of the exon boundaries that code, and do not code, functional sites in 11 Metazoa species. This is accompanied by a reduced frequency of intercodon gaps (phase 0) in exons encoding the amino acid residues of functional sites, which may be evidence of evolutionary limitations on exon shuffling. Conclusions These results characterize the features of the coding exon-intron structure that affect the
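The projection described above, mapping a residue position of a functional site onto the exon(s) of the coding sequence, can be sketched as follows; the exon lengths, residue index, and function names are hypothetical, with exon lengths given in coding-sequence (CDS) nucleotides:

```python
def residue_to_exons(residue_index, exon_lengths):
    """Return the 0-based indices of exons covering the codon of a
    0-based residue index, given exon lengths in CDS nucleotides.
    A codon split by an intron maps to two exons (discontinuity)."""
    start_nt = residue_index * 3              # first nucleotide of the codon
    codon_nts = {start_nt, start_nt + 1, start_nt + 2}
    exons = set()
    offset = 0
    for i, length in enumerate(exon_lengths):
        if any(offset <= nt < offset + length for nt in codon_nts):
            exons.add(i)
        offset += length
    return sorted(exons)

def exon_phase(exon_index, exon_lengths):
    """Phase of an exon's 5' boundary: upstream CDS length modulo 3.
    Phase 0 means the boundary falls exactly between two codons."""
    return sum(exon_lengths[:exon_index]) % 3

# hypothetical gene: three exons of 100, 50, and 200 CDS nucleotides;
# residue 33 occupies nucleotides 99-101, split across exons 0 and 1
lengths = [100, 50, 200]
```

A functional site whose residues all map into one exon index is continuous in the gene; residues spread over several indices mark a site interrupted by introns.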

  17. Cosmic Reionization on Computers: The Faint End of the Galaxy Luminosity Function

    NASA Astrophysics Data System (ADS)

    Gnedin, Nickolay Y.

    2016-07-01

Using numerical cosmological simulations completed under the “Cosmic Reionization On Computers” project, I explore theoretical predictions for the faint end of the galaxy UV luminosity functions at z ≳ 6. A commonly used Schechter function approximation with the magnitude cut at M_cut ~ -13 provides a reasonable fit to the actual luminosity function of simulated galaxies. When the Schechter functional form is forced on the luminosity functions from the simulations, the magnitude cut M_cut is found to vary between -12 and -14 with a mild redshift dependence. An analytical model of reionization from Madau et al., as used by Robertson et al., provides a good description of the simulated results, which can be improved even further by adding two physically motivated modifications to the original Madau et al. equation.
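A Schechter luminosity function in absolute magnitudes with a sharp faint-end cut, as used in the fits above, can be sketched as follows (the parameter values are illustrative placeholders, not the paper's fitted values):

```python
import numpy as np

def schechter_mag(M, M_star, phi_star, alpha, M_cut):
    """Schechter luminosity function in absolute magnitude, with the
    number density set to zero faintward of M_cut (galaxies fainter
    than M_cut, i.e. M > M_cut, are excluded). phi has units of
    number per magnitude per volume."""
    x = 10.0 ** (0.4 * (M_star - M))          # luminosity ratio L / L*
    phi = 0.4 * np.log(10.0) * phi_star * x ** (alpha + 1.0) * np.exp(-x)
    return np.where(M <= M_cut, phi, 0.0)

M = np.linspace(-22.0, -10.0, 121)
# illustrative high-z parameters: M* = -20.9, phi* = 1e-3, alpha = -1.9
phi = schechter_mag(M, M_star=-20.9, phi_star=1e-3, alpha=-1.9, M_cut=-13.0)
```

With a steep faint-end slope (alpha < -1) the number density rises toward fainter magnitudes until it is truncated at M_cut, which is the quantity the simulations constrain.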

  18. Cosmic reionization on computers: The faint end of the galaxy luminosity function

DOE PAGES Beta

    Gnedin, Nickolay Y.

    2016-07-01

Using numerical cosmological simulations completed under the “Cosmic Reionization On Computers” project, I explore theoretical predictions for the faint end of the galaxy UV luminosity functions at z ≳ 6. A commonly used Schechter function approximation with the magnitude cut at M_cut ~ -13 provides a reasonable fit to the actual luminosity function of simulated galaxies. When the Schechter functional form is forced on the luminosity functions from the simulations, the magnitude cut M_cut is found to vary between -12 and -14 with a mild redshift dependence. Here, an analytical model of reionization from Madau et al., as used by Robertson et al., provides a good description of the simulated results, which can be improved even further by adding two physically motivated modifications to the original Madau et al. equation.

  19. WAPA Daily Energy Accounting Activities

    1990-10-01

ISA (Interchange, Scheduling, & Accounting) is the interchange scheduling system used by the DOE Western Area Power Administration to perform energy accounting functions associated with the daily activities of the Watertown Operations Office (WOO). The system's primary role is to provide accounting functions for scheduled energy which is exchanged with other power companies and power operating organizations. The system has a secondary role of providing a historical record of all scheduled interchange transactions. The following major functions are performed by ISA: scheduled energy accounting for received and delivered energy; generation scheduling accounting for both fossil and hydro-electric power plants; metered energy accounting for received and delivered totals; energy accounting for Direct Current (D.C.) Ties; regulation accounting; automatic generation control set calculations; accounting summaries for Basin, Heartland Consumers Power District, and the Missouri Basin Municipal Power Agency; calculation of estimated generation for the Laramie River Station plant; daily and monthly reports; and dual control areas.
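The core scheduled-vs-metered accounting idea can be sketched as a toy model (this is not ISA's actual logic or data format; counterparty names and values are invented): accumulate scheduled and metered energy per counterparty and report the imbalance, conventionally called inadvertent interchange.

```python
def daily_account(transactions):
    """transactions: list of (counterparty, scheduled_mwh, metered_mwh),
    with positive values for received energy and negative for delivered.
    Returns per-counterparty totals plus the inadvertent interchange
    (metered minus scheduled). Toy sketch, not WAPA's ISA system."""
    totals = {}
    for party, sched, met in transactions:
        s, m = totals.get(party, (0.0, 0.0))
        totals[party] = (s + sched, m + met)
    return {party: {"scheduled": s, "metered": m, "inadvertent": m - s}
            for party, (s, m) in totals.items()}

# hypothetical day: two Basin transactions and one Heartland delivery
day = [("Basin", 120.0, 118.5), ("Basin", -40.0, -41.0),
       ("Heartland", 60.0, 60.0)]
report = daily_account(day)
```

Summing the daily reports over a month would give the historical record role the abstract describes.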

  20. MATERIAL CONTROL ACCOUNTING INMM

    SciTech Connect

    Hasty, T.

    2009-06-14

    Since 1996, the Mining and Chemical Combine (MCC - formerly known as K-26), and the United States Department of Energy (DOE) have been cooperating under the cooperative Nuclear Material Protection, Control and Accounting (MPC&A) Program between the Russian Federation and the U.S. Governments. Since MCC continues to operate a reactor for steam and electricity production for the site and city of Zheleznogorsk which results in production of the weapons grade plutonium, one of the goals of the MPC&A program is to support implementation of an expanded comprehensive nuclear material control and accounting (MC&A) program. To date MCC has completed upgrades identified in the initial gap analysis and documented in the site MC&A Plan and is implementing additional upgrades identified during an update to the gap analysis. The scope of these upgrades includes implementation of MCC organization structure relating to MC&A, establishing material balance area structure for special nuclear materials (SNM) storage and bulk processing areas, and material control functions including SNM portal monitors at target locations. Material accounting function upgrades include enhancements in the conduct of physical inventories, limit of error inventory difference procedure enhancements, implementation of basic computerized accounting system for four SNM storage areas, implementation of measurement equipment for improved accountability reporting, and both new and revised site-level MC&A procedures. This paper will discuss the implementation of MC&A upgrades at MCC based on the requirements established in the comprehensive MC&A plan developed by the Mining and Chemical Combine as part of the MPC&A Program.

  1. High-Throughput Computational Design of Advanced Functional Materials: Topological Insulators and Two-Dimensional Electron Gas Systems

    NASA Astrophysics Data System (ADS)

    Yang, Kesong

    As a rapidly growing area of materials science, high-throughput (HT) computational materials design is playing a crucial role in accelerating the discovery and development of novel functional materials. In this presentation, I will first introduce the strategy of HT computational materials design, and take the HT discovery of topological insulators (TIs) as a practical example to show the usage of such an approach. Topological insulators are one of the most studied classes of novel materials because of their great potential for applications ranging from spintronics to quantum computers. Here I will show that, by defining a reliable and accessible descriptor, which represents the topological robustness or feasibility of the candidate, and by searching the quantum materials repository aflowlib.org, we have automatically discovered 28 TIs (some of them already known) in five different symmetry families. Next, I will talk about our recent research work on the HT computational design of the perovskite-based two-dimensional electron gas (2DEG) systems. The 2DEG formed on the perovskite oxide heterostructure (HS) has potential applications in next-generation nanoelectronic devices. In order to achieve practical implementation of the 2DEG in the device design, desired physical properties such as high charge carrier density and mobility are necessary. Here I show that, using the same strategy with the HT discovery of TIs, by introducing a series of combinatorial descriptors, we have successfully identified a series of candidate 2DEG systems based on the perovskite oxides. This work provides another exemplar of applying HT computational design approach for the discovery of advanced functional materials.

  2. Head sinuses, melon, and jaws of bottlenose dolphins, Tursiops truncatus, observed with computed tomography structural and single photon emission computed tomography functional imaging

    NASA Astrophysics Data System (ADS)

    Ridgway, Sam; Houser, Dorian; Finneran, James J.; Carder, Don; van Bonn, William; Smith, Cynthia; Hoh, Carl; Corbeil, Jacqueline; Mattrey, Robert

    2003-04-01

The head sinuses, melon, and lower jaws of dolphins have been studied extensively with various methods including radiography, chemical analysis, and imaging of dead specimens. Here we report the first structural and functional imaging of live dolphins. Two animals were imaged, one male and one female. Computed tomography (CT) revealed extensive air cavities posterior and medial to the ear as well as between the ear and sound-producing nasal structures. Single photon emission computed tomography (SPECT) employing 50 mCi of the intravenously injected ligand technetium [Tc-99m] biscisate (Neurolite) revealed extensive uptake in the core of the melon as well as near the pan bone area of the lower jaw. Count density on SPECT images was four times greater in the melon than in the surrounding tissue and blubber layer, suggesting that the melon is an active rather than a passive tissue. Since the dolphin temporal bone is not attached to the skull except by fibrous suspensions, the air cavities medial and posterior to the ear, as well as the abutment of the temporal bone to the acoustic fat bodies of each lower jaw, should be considered in modeling the mechanism of sound transmission from the environment to the dolphin ear.

  3. Substrate Tunnels in Enzymes: Structure-Function Relationships and Computational Methodology

    PubMed Central

    Kingsley, Laura J.; Lill, Markus A.

    2015-01-01

    In enzymes, the active site is the location where incoming substrates are chemically converted to products. In some enzymes, this site is deeply buried within the core of the protein and in order to access the active site, substrates must pass through the body of the protein via a tunnel. In many systems, these tunnels act as filters and have been found to influence both substrate specificity and catalytic mechanism. Identifying and understanding how these tunnels exert such control has been of growing interest over the past several years due to implications in fields such as protein engineering and drug design. This growing interest has spurred the development of several computational methods to identify and analyze tunnels and how ligands migrate through these tunnels. The goal of this review is to outline how tunnels influence substrate specificity and catalytic efficiency in enzymes with tunnels and to provide a brief summary of the computational tools used to identify and evaluate these tunnels. PMID:25663659

  4. Computationally efficient approach for the minimization of volume constrained vector-valued Ginzburg-Landau energy functional

    NASA Astrophysics Data System (ADS)

    Tavakoli, Rouhollah

    2015-08-01

The minimization of a volume-constrained vector-valued Ginzburg-Landau energy functional is considered in the present study. It has many applications in computational science and engineering, like conservative phase separation in multiphase systems (such as spinodal decomposition), phase coarsening in multiphase systems, color image segmentation, and optimal space partitioning. A computationally efficient algorithm is presented to solve the space-discretized form of the original optimization problem. The algorithm is based on the constrained nonmonotone L2 gradient flow of the Ginzburg-Landau functional followed by a regularization step, which results from a Tikhonov regularization term added to the objective functional and lifts the solution from the L2 function space into the H1 space. The regularization step not only improves the convergence rate of the presented algorithm but also increases its stability bound. A step-size selection based on the Barzilai-Borwein approach is adapted to improve the convergence rate of the introduced algorithm. The success and performance of the presented approach are demonstrated through several numerical experiments. To make the results reproducible, the MATLAB implementation of the presented algorithm is provided as supplementary material.
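The Barzilai-Borwein step-size rule mentioned above can be sketched in isolation on a generic smooth objective (this is an illustrative sketch, not the paper's constrained gradient-flow solver; the quadratic test problem is ours):

```python
import numpy as np

def bb_gradient_descent(grad, x0, iters=100, step0=1e-2):
    """Gradient descent with the (first) Barzilai-Borwein step size
    alpha_k = (s^T s) / (s^T y), where s = x_k - x_{k-1} and
    y = g_k - g_{k-1}. The step adapts to local curvature without
    a line search, which is why BB iterations are nonmonotone."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    x_new = x - step0 * g                 # bootstrap with a fixed step
    for _ in range(iters):
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = float(s @ y)
        alpha = float(s @ s) / sy if sy > 1e-30 else step0
        x, g = x_new, g_new
        x_new = x - alpha * g_new
    return x_new

# example: strictly convex quadratic f(x) = 0.5 x^T A x, minimum at 0
A = np.diag([1.0, 10.0])
xmin = bb_gradient_descent(lambda x: A @ x, [5.0, -3.0], iters=100)
```

On ill-conditioned quadratics like this, the BB step typically converges far faster than a fixed-step gradient flow, which is the motivation for adapting it in the paper's algorithm.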

  5. Computer Equipment Repair Curriculum Guide.

    ERIC Educational Resources Information Center

    Reneau, Fred; And Others

    This guide is intended for use in a course to train students to repair computer equipment and perform related administrative and customer service tasks. Addressed in the individual units are the following topics (with selected subtopics in brackets): performing administrative functions (preparing service bills, maintaining accounts and labor…

  6. Use of 4-Dimensional Computed Tomography-Based Ventilation Imaging to Correlate Lung Dose and Function With Clinical Outcomes

    SciTech Connect

    Vinogradskiy, Yevgeniy; Castillo, Richard; Castillo, Edward; Department of Computational and Applied Mathematics, Rice University, Houston, Texas ; Tucker, Susan L.; Liao, Zhongxing; Guerrero, Thomas; Department of Computational and Applied Mathematics, Rice University, Houston, Texas ; Martel, Mary K.

    2013-06-01

Purpose: Four-dimensional computed tomography (4DCT)-based ventilation is an emerging imaging modality that can be used in the thoracic treatment planning process. The clinical benefit of using ventilation images in radiation treatment plans remains to be tested. The purpose of the current work was to test the potential benefit of using ventilation in treatment planning by evaluating whether dose to highly ventilated regions of the lung resulted in increased incidence of clinical toxicity. Methods and Materials: Pretreatment 4DCT data were used to compute pretreatment ventilation images for 96 lung cancer patients. Ventilation images were calculated using 4DCT data, deformable image registration, and a density-change based algorithm. Dose–volume and ventilation-based dose–function metrics were computed for each patient. The ability of the dose–volume and ventilation-based dose–function metrics to predict severe (grade 3+) radiation pneumonitis was assessed using logistic regression analysis, area under the curve (AUC) metrics, and bootstrap methods. Results: A specific patient example is presented that demonstrates how incorporating ventilation-based functional information can help separate patients with and without toxicity. The logistic regression significance values were all lower for the dose–function metrics (range, P=.093-.250) than for their dose–volume equivalents (range, P=.331-.580). The AUC values were all greater for the dose–function metrics (range, 0.569-0.620) than for their dose–volume equivalents (range, 0.500-0.544). Bootstrap results revealed an improvement in model fit using dose–function metrics compared to dose–volume metrics that approached significance (range, P=.118-.155). Conclusions: To our knowledge, this is the first study that attempts to correlate lung dose and 4DCT ventilation-based function to thoracic toxicity after radiation therapy. Although the results were not significant at the .05 level, our data suggest
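The AUC comparison above can be reproduced with the rank-based (Mann-Whitney) estimator of the area under the ROC curve; the metric values below are invented for illustration, not patient data:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve as the probability that a randomly
    chosen positive case (e.g., a patient with grade 3+ pneumonitis)
    scores higher than a randomly chosen negative case; ties count
    one half (the Mann-Whitney U estimator of AUC)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# hypothetical dose-function metric values for the two outcome groups
with_tox = [22.1, 18.4, 25.0, 19.9]
without_tox = [15.2, 21.0, 12.8, 17.5, 16.1]
```

An AUC of 0.5 means the metric cannot separate the groups at all, which is why the dose-volume values near 0.500 in the abstract indicate little predictive power.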

  7. Development of microgravity, full body functional reach envelope using 3-D computer graphic models and virtual reality technology

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1994-01-01

    In microgravity conditions mobility is greatly enhanced and body stability is difficult to achieve. Because of these difficulties, optimum placement and accessibility of objects and controls can be critical to required tasks on board shuttle flights or on the proposed space station. Anthropometric measurement of the maximum reach of occupants of a microgravity environment provide knowledge about maximum functional placement for tasking situations. Calculations for a full body, functional reach envelope for microgravity environments are imperative. To this end, three dimensional computer modeled human figures, providing a method of anthropometric measurement, were used to locate the data points that define the full body, functional reach envelope. Virtual reality technology was utilized to enable an occupant of the microgravity environment to experience movement within the reach envelope while immersed in a simulated microgravity environment.

  8. When can Empirical Green Functions be computed from Noise Cross-Correlations? Hints from different Geographical and Tectonic environments

    NASA Astrophysics Data System (ADS)

    Matos, Catarina; Silveira, Graça; Custódio, Susana; Domingues, Ana; Dias, Nuno; Fonseca, João F. B.; Matias, Luís; Krueger, Frank; Carrilho, Fernando

    2014-05-01

    Noise cross-correlations are now widely used to extract Green functions between station pairs. But, do all the cross-correlations routinely computed produce successful Green Functions? What is the relationship between noise recorded in a couple of stations and the cross-correlation between them? During the last decade, we have been involved in the deployment of several temporary dense broadband (BB) networks within the scope of both national projects and international collaborations. From 2000 to 2002, a pool of 8 BB stations continuously operated in the Azores in the scope of the Memorandum of Understanding COSEA (COordinated Seismic Experiment in the Azores). Thanks to the Project WILAS (West Iberia Lithosphere and Astenosphere Structure, PTDC/CTE-GIX/097946/2008) we temporarily increased the number of BB deployed in mainland Portugal to more than 50 (permanent + temporary) during the period 2010 - 2012. In 2011/12 a temporary pool of 12 seismometers continuously recorded BB data in the Madeira archipelago, as part of the DOCTAR (Deep Ocean Test Array Experiment) project. Project CV-PLUME (Investigation on the geometry and deep signature of the Cape Verde mantle plume, PTDC/CTE-GIN/64330/2006) covered the archipelago of Cape Verde, North Atlantic, with 40 temporary BB stations in 2007/08. Project MOZART (Mozambique African Rift Tomography, PTDC/CTE-GIX/103249/2008), covered Mozambique, East Africa, with 30 temporary BB stations in the period 2011 - 2013. These networks, located in very distinct geographical and tectonic environments, offer an interesting opportunity to study seasonal and spatial variations of noise sources and their impact on Empirical Green functions computed from noise cross-correlation. Seismic noise recorded at different seismic stations is evaluated by computation of the probability density functions of power spectral density (PSD) of continuous data. 
To assess seasonal variations of ambient noise sources in frequency content, time-series of
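The probability-density characterization of noise levels mentioned above can be sketched as follows: compute a tapered periodogram per time window, then summarize the distribution of PSD values at each frequency across windows. This is a simplified stand-in for the standard PSD-PDF analysis, with synthetic data and illustrative parameters:

```python
import numpy as np

def psd_distribution(signal, fs, win_len):
    """Split `signal` into non-overlapping windows of `win_len` samples,
    compute a Hann-tapered periodogram for each, and return the
    frequencies plus the 10th/50th/90th percentiles of PSD across
    windows. A simplified sketch of PSD probability-density analysis."""
    n_win = len(signal) // win_len
    taper = np.hanning(win_len)
    norm = fs * np.sum(taper ** 2)            # periodogram normalization
    psds = []
    for k in range(n_win):
        seg = signal[k * win_len:(k + 1) * win_len] * taper
        spec = np.fft.rfft(seg)
        psds.append((np.abs(spec) ** 2) / norm)
    psds = np.array(psds)
    freqs = np.fft.rfftfreq(win_len, d=1.0 / fs)
    pct = np.percentile(psds, [10, 50, 90], axis=0)
    return freqs, pct

rng = np.random.default_rng(0)
noise = rng.standard_normal(20 * 1024)        # 20 windows of synthetic noise
freqs, pct = psd_distribution(noise, fs=100.0, win_len=1024)
```

Tracking how these percentile curves shift between seasons, or between stations, is the kind of comparison the abstract describes for assessing ambient noise sources.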

  9. A first principle approach using Maximally Localized Wannier Functions for computing and understanding elasto-optic response

    NASA Astrophysics Data System (ADS)

    Liang, Xin; Ismail-Beigi, Sohrab

    Strain-induced changes of optical properties are of use in the design and functioning of devices that couple photons and phonons. The elasto-optic (or photo-elastic) effect describes a general materials property where strain induces a change in the dielectric tensor. Despite a number of experimental and computational works, it is fair to say that a basic physical understanding of the effect and its materials dependence is lacking: e.g., we know of no materials design rule for enhancing or suppressing elasto-optic response. Based on our previous work, we find that a real space representation, as opposed to a k-space description, is a promising way to understand this effect. We have finished the development of a method of computing the dielectric and elasto-optic tensors using Maximally Localized Wannier Functions (MLWFs). By analyzing responses to uniaxial strain, we find that both tensors respond in a localized manner to the perturbation: the dominant optical transitions are between local electronic states on nearby bonds. We describe the method, the resulting physical picture and computed results for semiconductors. This work is supported by the National Science Foundation through Grant NSF DMR-1104974.

  10. Partition function of interacting calorons ensemble

    NASA Astrophysics Data System (ADS)

    Deldar, S.; Kiamari, M.

    2016-01-01

    We present a method for computing the partition function of a caloron ensemble taking into account the interaction of calorons. We focus on the caloron-Dirac string interaction and show that the metric that Diakonov and Petrov proposed works well in the limit where this interaction occurs. We suggest computing the correlation function of two Polyakov loops by applying Ewald's method.

  11. An atomic orbital based real-time time-dependent density functional theory for computing electronic circular dichroism band spectra

    NASA Astrophysics Data System (ADS)

    Goings, Joshua J.; Li, Xiaosong

    2016-06-01

    One of the challenges of interpreting electronic circular dichroism (ECD) band spectra is that different states may have different rotatory strength signs, determined by their absolute configuration. If the states are closely spaced and opposite in sign, observed transitions may be washed out by nearby states, unlike absorption spectra, where transitions are always positive and additive. To accurately compute ECD bands, it is necessary to compute a large number of excited states, which may be prohibitively costly if one uses the linear-response time-dependent density functional theory (TDDFT) framework. Here we implement a real-time, atomic-orbital based TDDFT method for computing the entire ECD spectrum simultaneously. The method is advantageous for large systems with a high density of states. In contrast to previous implementations based on real-space grids, the method is variational, independent of nuclear orientation, and does not rely on pseudopotential approximations, making it suitable for computation of chiroptical properties well into the X-ray regime.

  12. Tetralogy of Fallot Cardiac Function Evaluation and Intelligent Diagnosis Based on Dual-Source Computed Tomography Cardiac Images.

    PubMed

    Cai, Ken; Rongqian, Yang; Li, Lihua; Xie, Zi; Ou, Shanxing; Chen, Yuke; Dou, Jianhong

    2016-05-01

    Tetralogy of Fallot (TOF) is the most common complex congenital heart disease (CHD) of the cyanotic type. Studies on ventricular function have received an increasing amount of attention as diagnosis and treatment technology for CHD continues to advance. Reasonable options for imaging examination and accurate assessment of preoperative and postoperative left ventricular function in TOF patients are important for improving the cure rate of radical TOF operations, therapeutic evaluation, and judging prognosis. Therefore, with the aid of dual-source computed tomography (DSCT), which provides cardiac images with high temporal resolution and high definition, we measured the left ventricular time-volume curve from the image data and calculated the left ventricular function parameters to conduct a preliminary evaluation of TOF patients. To comprehensively evaluate cardiac function, the segmental ventricular wall function parameters were measured, and the measurement results were mapped to a bull's eye diagram to standardize the evaluation of segmental ventricular wall function. Finally, we introduced a new clustering method based on auto-regression model parameters and combined it with Euclidean distance measurements to establish an intelligent diagnosis of TOF. The results of this experiment show that the TOF evaluation and intelligent diagnostic methods proposed in this article are feasible. PMID:26496001
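
    The left ventricular function parameters extracted from a time-volume curve reduce to a few standard definitions. A minimal sketch with illustrative volumes (not patient data):

```python
# toy left-ventricular time-volume curve over one cardiac cycle (mL; illustrative)
volumes = [120, 115, 95, 70, 55, 50, 60, 85, 105, 118]

edv = max(volumes)                       # end-diastolic volume
esv = min(volumes)                       # end-systolic volume
stroke_volume = edv - esv                # volume ejected per beat
ejection_fraction = stroke_volume / edv

print(edv, esv, stroke_volume, round(ejection_fraction, 3))  # 120 50 70 0.583
```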

  13. The van Hove distribution function for Brownian hard spheres: Dynamical test particle theory and computer simulations for bulk dynamics

    NASA Astrophysics Data System (ADS)

    Hopkins, Paul; Fortini, Andrea; Archer, Andrew J.; Schmidt, Matthias

    2010-12-01

    We describe a test particle approach based on dynamical density functional theory (DDFT) for studying the correlated time evolution of the particles that constitute a fluid. Our theory provides a means of calculating the van Hove distribution function by treating its self and distinct parts as the two components of a binary fluid mixture, with the "self" component having only one particle, the "distinct" component consisting of all the other particles, and using DDFT to calculate the time evolution of the density profiles for the two components. We apply this approach to a bulk fluid of Brownian hard spheres and compare to results for the van Hove function and the intermediate scattering function from Brownian dynamics computer simulations. We find good agreement at low and intermediate densities using the very simple Ramakrishnan-Yussouff [Phys. Rev. B 19, 2775 (1979)] approximation for the excess free energy functional. Since the DDFT is based on the equilibrium Helmholtz free energy functional, we can probe a free energy landscape that underlies the dynamics. Within the mean-field approximation we find that as the particle density increases, this landscape develops a minimum, while an exact treatment of a model confined situation shows that for an ergodic fluid this landscape should be monotonic. We discuss possible implications for slow, glassy, and arrested dynamics at high densities.

  14. The van Hove distribution function for Brownian hard spheres: dynamical test particle theory and computer simulations for bulk dynamics.

    PubMed

    Hopkins, Paul; Fortini, Andrea; Archer, Andrew J; Schmidt, Matthias

    2010-12-14

    We describe a test particle approach based on dynamical density functional theory (DDFT) for studying the correlated time evolution of the particles that constitute a fluid. Our theory provides a means of calculating the van Hove distribution function by treating its self and distinct parts as the two components of a binary fluid mixture, with the "self" component having only one particle, the "distinct" component consisting of all the other particles, and using DDFT to calculate the time evolution of the density profiles for the two components. We apply this approach to a bulk fluid of Brownian hard spheres and compare to results for the van Hove function and the intermediate scattering function from Brownian dynamics computer simulations. We find good agreement at low and intermediate densities using the very simple Ramakrishnan-Yussouff [Phys. Rev. B 19, 2775 (1979)] approximation for the excess free energy functional. Since the DDFT is based on the equilibrium Helmholtz free energy functional, we can probe a free energy landscape that underlies the dynamics. Within the mean-field approximation we find that as the particle density increases, this landscape develops a minimum, while an exact treatment of a model confined situation shows that for an ergodic fluid this landscape should be monotonic. We discuss possible implications for slow, glassy, and arrested dynamics at high densities. PMID:21171689
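
    The self part of the van Hove function, G_s(x, t), is simply the distribution of single-particle displacements after time t. For free (non-interacting) Brownian particles it is Gaussian with variance 2Dt, the limit any hard-sphere treatment must recover at low density. A sketch of that ideal limit (interactions deliberately omitted, so this is not the hard-sphere case of the paper):

```python
import random, math

random.seed(1)
D, dt, nsteps, nwalkers = 0.5, 0.01, 200, 2000
sigma = math.sqrt(2.0 * D * dt)   # step size of the Euler-Maruyama walk

# free 1-D Brownian walkers (hard-sphere interactions deliberately omitted)
displacements = []
for _ in range(nwalkers):
    x = 0.0
    for _ in range(nsteps):
        x += random.gauss(0.0, sigma)
    displacements.append(x)

t = nsteps * dt
# second moment of the sampled G_s(x, t); for free diffusion this is 2 D t
msd = sum(d * d for d in displacements) / nwalkers
print(msd, "vs", 2.0 * D * t)
```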

  15. Comparing Two Computational Mechanisms for Explaining Functional Recovery in Robot-Therapy of Stroke Survivors

    PubMed Central

    Piovesan, Davide; Casadio, Maura; Mussa-Ivaldi, Ferdinando A.; Morasso, Pietro

    2015-01-01

    In this paper we discuss two possible strategies of movement control that can be used by stroke survivors during rehabilitation robotics training. To perform a reaching task in a minimally assistive force field, subjects can either move following the trajectory provided by the assistive force or use an internal representation of a minimum-jerk trajectory from their starting position to the target. We used the stiffness and damping values directly estimated from the experimental data to simulate the trajectories that result under each hypothesis. The comparison of the simulated results with the data collected on four hemiparetic subjects supports the hypothesis that the central nervous system (CNS) is still able to correctly plan the movement, although normal execution is impaired. PMID:26180655

  16. Computing zeros of analytic functions in the complex plane without using derivatives

    NASA Astrophysics Data System (ADS)

    Gillan, C. J.; Schuchinsky, A.; Spence, I.

    2006-08-01

    We present a package in Fortran 90 which solves f(z)=0, where z∈W⊂C, without requiring the evaluation of derivatives of f(z). W is bounded by a simple closed curve and f(z) must be holomorphic within W. We have developed and tested the package to support our work in the modeling of high-frequency and optical wave guiding and resonant structures. The respective eigenvalue problems are particularly challenging because they require the high-precision computation of all multiple complex roots of f(z) confined to the specified finite domain. Generally f(z), despite being holomorphic, does not have an explicit analytical form, thereby inhibiting evaluation of its derivatives. Program summary: Title of program: EZERO. Catalogue identifier: ADXY_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXY_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computer: IBM compatible desktop PC. Operating system: Fedora Core 2 Linux (with 2.6.5 kernel). Programming language used: Fortran 90. No. of bits in a word: 32. No. of processors used: one. Has the code been vectorized: no. No. of lines in distributed program, including test data, etc.: 21045. No. of bytes in distributed program, including test data, etc.: 223 756. Distribution format: tar.gz. Peripherals used: none. Method of solution: Our package uses the principle of the argument to count the number of zeros encompassed by a contour and then computes estimates for the zeros. Refined results for each zero are obtained by application of the derivative-free Halley method with or without Aitken acceleration, as the user wishes.
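
    The derivative-free zero counting via the principle of the argument can be sketched directly: walk a closed contour, accumulate the unwrapped change of arg f(z), and divide by 2π. A minimal illustration with a polynomial test function (this is not the EZERO implementation, just the underlying idea):

```python
import cmath, math

def count_zeros(f, corners, n=400):
    """Count zeros of f, holomorphic inside a polygonal contour, by the
    principle of the argument using only values of f (no derivatives):
    the count equals the winding number of f(z) about the origin."""
    pts = []
    for a, b in zip(corners, corners[1:] + corners[:1]):
        pts += [a + (b - a) * k / n for k in range(n)]
    total, prev = 0.0, cmath.phase(f(pts[0]))
    for z in pts[1:] + pts[:1]:          # walk the closed contour
        cur = cmath.phase(f(z))
        d = cur - prev
        if d > math.pi:                  # unwrap phase jumps
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        total += d
        prev = cur
    return round(total / (2 * math.pi))

f = lambda z: z * z + 1                               # zeros at +i and -i
upper = [-1 + 0.5j, 1 + 0.5j, 1 + 1.5j, -1 + 1.5j]    # box around +i only
both = [-2 - 2j, 2 - 2j, 2 + 2j, -2 + 2j]             # box around both zeros
print(count_zeros(f, upper), count_zeros(f, both))    # 1 2
```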

  17. Using brain-computer interfaces to induce neural plasticity and restore function

    NASA Astrophysics Data System (ADS)

    Grosse-Wentrup, Moritz; Mattia, Donatella; Oweiss, Karim

    2011-04-01

    Analyzing neural signals and providing feedback in realtime is one of the core characteristics of a brain-computer interface (BCI). As this feature may be employed to induce neural plasticity, utilizing BCI technology for therapeutic purposes is increasingly gaining popularity in the BCI community. In this paper, we discuss the state-of-the-art of research on this topic, address the principles of and challenges in inducing neural plasticity by means of a BCI, and delineate the problems of study design and outcome evaluation arising in this context. We conclude with a list of open questions and recommendations for future research in this field.

  18. Computer program for supersonic Kernel-function flutter analysis of thin lifting surfaces

    NASA Technical Reports Server (NTRS)

    Cunningham, H. J.

    1974-01-01

    This report describes a computer program (program D2180) that has been prepared to implement the analysis described in (N71-10866) for calculating the aerodynamic forces on a class of harmonically oscillating planar lifting surfaces in supersonic potential flow. The planforms treated are the delta and modified-delta (arrowhead) planforms with subsonic leading and supersonic trailing edges, and (essentially) pointed tips. The resulting aerodynamic forces are applied in a Galerkin modal flutter analysis. The required input data are the flow and planform parameters including deflection-mode data, modal frequencies, and generalized masses.

  19. Fast computation of the Gauss hypergeometric function with all its parameters complex with application to the Pöschl-Teller-Ginocchio potential wave functions

    NASA Astrophysics Data System (ADS)

    Michel, N.; Stoitsov, M. V.

    2008-04-01

    The fast computation of the Gauss hypergeometric function 2F1 with all its parameters complex is a difficult task. Although the 2F1 function verifies numerous analytical properties involving power series expansions whose implementation is apparently immediate, their use is thwarted by instabilities induced by cancellations between very large terms. Furthermore, small areas of the complex plane, in the vicinity of z = e^(±iπ/3), are inaccessible using 2F1 power series linear transformations. In order to solve these problems, a generalization of R.C. Forrey's transformation theory has been developed. The latter has been successful in treating the 2F1 function with real parameters. As in the real-case transformation theory, the large canceling terms occurring in 2F1 analytical formulas are rigorously dealt with, but by way of a new method, directly applicable to the complex plane. Taylor series expansions are employed to enter complex areas outside the domain of validity of power series analytical formulas. The proposed algorithm, however, becomes unstable in general when |a|, |b|, |c| are moderate or large. As a physical application, the calculation of the wave functions of the analytical Pöschl-Teller-Ginocchio potential involving 2F1 evaluations is considered. Program summary: Program title: hyp_2F1, PTG_wf. Catalogue identifier: AEAE_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAE_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 6839. No. of bytes in distributed program, including test data, etc.: 63 334. Distribution format: tar.gz. Programming language: C++, Fortran 90. Computer: Intel i686. Operating system: Linux, Windows. Word size: 64 bits. Classification: 4.7. Nature of problem: The Gauss hypergeometric function 2F1, with all its parameters complex, is uniquely
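
    The naive power series for 2F1, valid for |z| < 1, shows why more machinery is needed: it is immediate to implement, yet it is exactly such sums and their linear transformations that suffer the cancellations discussed above. A sketch, checked against the closed form 2F1(1,1;2;z) = -ln(1-z)/z:

```python
import cmath

def hyp2f1_series(a, b, c, z, tol=1e-15, max_terms=2000):
    """Gauss hypergeometric 2F1(a,b;c;z) by its defining power series.
    Only reliable well inside |z| < 1; a sketch, not the paper's algorithm."""
    term = 1.0 + 0j
    total = term
    for n in range(max_terms):
        term *= (a + n) * (b + n) / ((c + n) * (n + 1)) * z
        total += term
        if abs(term) < tol * abs(total):
            return total
    raise RuntimeError("series did not converge")

# identity check: 2F1(1, 1; 2; z) = -ln(1 - z) / z
z = 0.3 + 0.2j
print(hyp2f1_series(1, 1, 2, z), -cmath.log(1 - z) / z)
```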

  20. Accuracy and computational efficiency of real-time subspace propagation schemes for the time-dependent density functional theory

    NASA Astrophysics Data System (ADS)

    Russakoff, Arthur; Li, Yonghui; He, Shenglai; Varga, Kalman

    2016-05-01

    Time-dependent Density Functional Theory (TDDFT) has become successful for its balance of economy and accuracy. However, the application of TDDFT to large systems or long time scales remains computationally prohibitively expensive. In this paper, we investigate the numerical stability and accuracy of two subspace propagation methods to solve the time-dependent Kohn-Sham equations with finite and periodic boundary conditions. The bases considered are the Lánczos basis and the adiabatic eigenbasis. The results are compared to a benchmark fourth-order Taylor expansion of the time propagator. Our results show that it is possible to use larger time steps with the subspace methods, leading to computational speedups by a factor of 2-3 over Taylor propagation. Accuracy is found to be maintained for certain energy regimes and small time scales.

  1. A Computationally Inexpensive Optimal Guidance via Radial-Basis-Function Neural Network for Autonomous Soft Landing on Asteroids

    PubMed Central

    Zhang, Peng; Liu, Keping; Zhao, Bo; Li, Yuanchun

    2015-01-01

    Optimal guidance is essential for the soft landing task. However, due to its high computational complexity, it is rarely applied in autonomous guidance. In this paper, a computationally inexpensive optimal guidance algorithm based on the radial basis function neural network (RBFNN) is proposed. The optimization problem of the trajectory for soft landing on asteroids is formulated and transformed into a two-point boundary value problem (TPBVP). Combining the database of initial states with the corresponding initial co-states, an RBFNN is trained offline. The optimal trajectory of the soft landing is determined rapidly by applying the trained network in the online guidance. Monte Carlo simulations of soft landing on Eros 433 are performed to demonstrate the effectiveness of the proposed guidance algorithm. PMID:26367382
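
    The offline/online split at the heart of the scheme can be sketched with a one-dimensional Gaussian RBF interpolator: an expensive linear solve fixes the weights offline, after which online evaluation is a cheap weighted sum. The mapping learned here (sin) is only a stand-in for the paper's initial-state-to-co-state database; all names and numbers are illustrative:

```python
import math

def gauss_solve(A, b):
    """Plain Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fac = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= fac * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def rbf_fit(xs, ys, eps=2.0):
    """Offline step: solve the interpolation system for Gaussian RBF weights."""
    phi = lambda r: math.exp(-(eps * r) ** 2)
    A = [[phi(abs(xi - xj)) for xj in xs] for xi in xs]
    w = gauss_solve(A, ys)
    # online step: evaluation is just a cheap weighted sum of basis functions
    return lambda x: sum(wi * phi(abs(x - xi)) for wi, xi in zip(w, xs))

# stand-in for "initial state -> initial co-state": learn sin offline
xs = [i * 0.5 for i in range(13)]        # centers spanning [0, 6]
ys = [math.sin(x) for x in xs]
guess = rbf_fit(xs, ys)
print(guess(2.3), math.sin(2.3))
```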

  2. Reverse energy partitioning-An efficient algorithm for computing the density of states, partition functions, and free energy of solids.

    PubMed

    Do, Hainam; Wheatley, Richard J

    2016-08-28

    A robust and model free Monte Carlo simulation method is proposed to address the challenge in computing the classical density of states and partition function of solids. Starting from the minimum configurational energy, the algorithm partitions the entire energy range in the increasing energy direction ("upward") into subdivisions whose integrated density of states is known. When combined with the density of states computed from the "downward" energy partitioning approach [H. Do, J. D. Hirst, and R. J. Wheatley, J. Chem. Phys. 135, 174105 (2011)], the equilibrium thermodynamic properties can be evaluated at any temperature and in any phase. The method is illustrated in the context of the Lennard-Jones system and can readily be extended to other molecular systems and clusters for which the structures are known. PMID:27586913
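
    Once the density of states g(E) is known, equilibrium properties at any temperature follow from elementary sums. A sketch with an illustrative discrete g(E) (not data from the paper), taking k_B = 1 and shifting energies by the minimum for numerical safety:

```python
import math

# illustrative discrete density of states as (E, g(E)) pairs
levels = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]

def partition_function(levels, T):
    """Z(T) = sum_E g(E) exp(-(E - E_min)/T), with k_B = 1; shifting by
    E_min keeps the exponentials well-scaled at low temperature."""
    e0 = min(E for E, _ in levels)
    return sum(g * math.exp(-(E - e0) / T) for E, g in levels)

def mean_energy(levels, T):
    """<E> = sum_E E g(E) exp(-(E - E_min)/T) / Z(T); the shift cancels."""
    e0 = min(E for E, _ in levels)
    z = partition_function(levels, T)
    return sum(E * g * math.exp(-(E - e0) / T) for E, g in levels) / z

print(partition_function(levels, 1.0), mean_energy(levels, 1.0))
```

    At high temperature Z approaches the total number of states (here 16), and at low temperature it approaches the ground-state degeneracy (here 1), which makes convenient sanity checks.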

  3. A Computationally Inexpensive Optimal Guidance via Radial-Basis-Function Neural Network for Autonomous Soft Landing on Asteroids.

    PubMed

    Zhang, Peng; Liu, Keping; Zhao, Bo; Li, Yuanchun

    2015-01-01

    Optimal guidance is essential for the soft landing task. However, due to its high computational complexities, it is hardly applied to the autonomous guidance. In this paper, a computationally inexpensive optimal guidance algorithm based on the radial basis function neural network (RBFNN) is proposed. The optimization problem of the trajectory for soft landing on asteroids is formulated and transformed into a two-point boundary value problem (TPBVP). Combining the database of initial states with the relative initial co-states, an RBFNN is trained offline. The optimal trajectory of the soft landing is determined rapidly by applying the trained network in the online guidance. The Monte Carlo simulations of soft landing on the Eros433 are performed to demonstrate the effectiveness of the proposed guidance algorithm. PMID:26367382

  4. Density functional computational studies on the glucose and glycine Maillard reaction: Formation of the Amadori rearrangement products

    NASA Astrophysics Data System (ADS)

    Jalbout, Abraham F.; Roy, Amlan K.; Shipar, Abul Haider; Ahmed, M. Samsuddin

    Theoretical energy changes of various intermediates leading to the formation of the Amadori rearrangement products (ARPs) under different mechanistic assumptions have been calculated by using open-chain glucose (O-Glu)/closed-chain glucose (A-Glu and B-Glu) and glycine (Gly) as a model for the Maillard reaction. Density functional theory (DFT) computations have been applied to the proposed mechanisms under different pH conditions. Thus, the possibility of the formation of different compounds and the electronic energy changes for different steps in the proposed mechanisms have been evaluated. B-Glu has been found to be more efficient than A-Glu, and A-Glu more efficient than O-Glu in the reaction. The reaction under basic conditions is the most favorable for the formation of ARPs. Other reaction pathways have been computed and discussed in this work.

  5. Accuracy and computational efficiency of real-time subspace propagation schemes for the time-dependent density functional theory.

    PubMed

    Russakoff, Arthur; Li, Yonghui; He, Shenglai; Varga, Kalman

    2016-05-28

    Time-dependent Density Functional Theory (TDDFT) has become successful for its balance of economy and accuracy. However, the application of TDDFT to large systems or long time scales remains computationally prohibitively expensive. In this paper, we investigate the numerical stability and accuracy of two subspace propagation methods to solve the time-dependent Kohn-Sham equations with finite and periodic boundary conditions. The bases considered are the Lánczos basis and the adiabatic eigenbasis. The results are compared to a benchmark fourth-order Taylor expansion of the time propagator. Our results show that it is possible to use larger time steps with the subspace methods, leading to computational speedups by a factor of 2-3 over Taylor propagation. Accuracy is found to be maintained for certain energy regimes and small time scales. PMID:27250297
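
    The benchmark fourth-order Taylor propagator amounts to truncating exp(-iH dt) at fourth order each step. A sketch on a 2x2 model Hamiltonian, chosen traceless so the exact propagator is available in closed form (this is an illustration of the propagation scheme, not the TDDFT code):

```python
import math

# 2x2 traceless model Hamiltonian; H^2 = w^2 I with w = sqrt(1.25),
# so exp(-i H t) = cos(wt) I - i sin(wt) H / w gives an exact reference
H = [[1.0, 0.5], [0.5, -1.0]]
w = math.sqrt(1.25)

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def taylor_step(psi, dt, order=4):
    """psi(t + dt) ~= sum_{k=0..order} (-i H dt)^k / k! applied to psi(t)."""
    out, term = psi[:], psi[:]
    for k in range(1, order + 1):
        term = [(-1j) * dt / k * c for c in matvec(H, term)]
        out = [o + t for o, t in zip(out, term)]
    return out

def exact(psi, t):
    hv = matvec(H, psi)
    return [math.cos(w * t) * p - 1j * math.sin(w * t) / w * h
            for p, h in zip(psi, hv)]

psi = [1.0 + 0j, 0.0 + 0j]
dt, nsteps = 0.01, 100
for _ in range(nsteps):
    psi = taylor_step(psi, dt)
ref = exact([1.0 + 0j, 0.0 + 0j], dt * nsteps)
err = max(abs(a - b) for a, b in zip(psi, ref))
print("propagation error after", nsteps, "steps:", err)
```

    The subspace (Lanczos or adiabatic-eigenbasis) propagators discussed in the abstract replace this truncated polynomial with an exponential applied exactly within a small subspace, which is what permits the larger time steps.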

  6. Critical assessment of density functional theory for computing vibrational (hyper)polarizabilities

    NASA Astrophysics Data System (ADS)

    Zaleśny, R.; Bulik, I. W.; Mikołajczyk, M.; Bartkowiak, W.; Luis, J. M.; Kirtman, B.; Avramopoulos, A.; Papadopoulos, M. G.

    2012-12-01

    Despite the undisputed success of density functional theory (DFT) in various branches of chemistry and physics, the application of DFT to reliable predictions of nonlinear optical properties of molecules was questioned a decade ago. As shown by Champagne et al. [1, 2, 3], most conventional DFT schemes were unable to qualitatively predict the response of conjugated oligomers to a static electric field. Long-range corrected (LRC) functionals, like LC-BLYP or CAM-B3LYP, have been proposed to alleviate this deficiency. The reliability of LRC functionals for evaluating molecular (hyper)polarizabilities is studied for various groups of organic systems, with a special focus on vibrational corrections to the electric properties.
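
    Static (hyper)polarizabilities of the kind benchmarked here are often extracted by finite-field differentiation of the total energy, E(F) = E0 - μF - αF²/2 - βF³/6 - ... A sketch on a model energy with known coefficients (illustrative numbers, not DFT results), where central differences recover α and β exactly up to roundoff:

```python
# model total energy E(F) = E0 - mu F - alpha F^2/2 - beta F^3/6 with
# known coefficients (illustrative values, not DFT data)
mu, alpha, beta = 0.8, 12.0, 150.0

def energy(F):
    return 3.0 - mu * F - alpha * F**2 / 2.0 - beta * F**3 / 6.0

F = 1e-3
# even combination isolates alpha; odd combination isolates beta
alpha_ff = -(energy(F) - 2.0 * energy(0.0) + energy(-F)) / F**2
beta_ff = -(energy(2*F) - 2.0*energy(F) + 2.0*energy(-F) - energy(-2*F)) / (2.0 * F**3)
print(alpha_ff, beta_ff)  # ~12.0 and ~150.0
```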

  7. Evaluation of cardiac function and myocardial viability with 16- and 64-slice multidetector computed tomography.

    PubMed

    Kopp, Andreas F; Heuschmid, Martin; Reimann, Anja; Kuettner, Axel; Beck, Thorsten; Ohmer, Martin; Burgstahler, Christoph; Brodoefel, Harald; Claussen, Claus D; Schroeder, Stephen

    2005-11-01

    Retrospectively ECG-gated MDCT shows a high correlation and acceptable agreement of left-ventricular functional parameters compared to MR imaging. Thus, in addition to the non-invasive evaluation of the coronary arteries, important additional information on left-ventricular functional parameters with clinical and prognostic relevance can be obtained from a single MDCT examination. For assessment of myocardial viability, low-dose CT late-enhancement scanning is feasible, and preliminary results look promising. CT late enhancement adds valuable diagnostic information on the haemodynamic significance of coronary stenoses or prior to interventional procedures. PMID:16479639

  8. The Secrets of a Functional Synapse – From a Computational and Experimental Viewpoint

    PubMed Central

    Linial, Michal

    2006-01-01

    Background Neuronal communication is tightly regulated in time and in space. Neuronal transmission takes place in the nerve terminal, at a specialized structure called the synapse. Following neuronal activation, an electrical signal triggers neurotransmitter (NT) release at the active zone. The process starts with the signal reaching the synapse, followed by fusion of the synaptic vesicle and diffusion of the released NT in the synaptic cleft; the NT then binds to the appropriate receptor, and as a result, a potential change at the target cell membrane is induced. The entire process lasts for only a fraction of a millisecond. An essential property of the synapse is its capacity to undergo biochemical and morphological changes, a phenomenon that is referred to as synaptic plasticity. Results In this survey, we consider the mammalian brain synapse as our model. We take a cell biological and a molecular perspective to present fundamental properties of the synapse: (i) the accurate and efficient delivery of organelles and material to and from the synapse; (ii) the coordination of gene expression that underlies a particular NT phenotype; (iii) the induction of local protein expression in a subset of stimulated synapses. We describe the computational facet and the formulation of the problem for each of these topics. Conclusion Predicting the behavior of a synapse under changing conditions must incorporate genomics and proteomics information with new approaches in computational biology. PMID:16723009

  9. Density-functional computation of ⁹³Nb NMR chemical shifts.

    PubMed

    Bühl, Michael; Wrackmeyer, Bernd

    2010-12-01

    93Nb chemical shifts of [NbX6](-) (X = Cl, F, CO), [NbXCl4](-) (X = O, S), Nb2(OMe)10, Cp*2Nb(κ2-BH4), (Cp*Nb)2(µ-B2H6)2, CpNb(CO)4, and Cp2NbH3 are computed at the GIAO (gauge-including atomic orbitals)-, BPW91- and B3LYP-, and CSGT (continuous set of gauge transformations)-CAM-B3LYP, -ωB97, and -ωB97X levels, using BP86-optimized or experimental (X-ray) geometries. Experimental chemical shifts are best reproduced at the GIAO-BPW91 level when δ(93Nb) values of inorganic complexes are referenced directly relative to [NbCl6](-) and those of organometallic species are first calculated relative to [Nb(CO)6](-). An inadvertent error in the reported δ(93Nb) values of cyclopentadiene borane complexes (H. Brunner et al., J. Organomet. Chem.1992, 436, 313) is corrected. Trends in the observed 93Nb NMR linewidths for anionic niobates [Nb(CO)5](3-), [Nb(CO)5H](2-), and [Nb(CO)5(NH3)](-) are rationalized in terms of computed electric field gradients at the metal. PMID:20552575

  10. Charon Toolkit for Parallel, Implicit Structured-Grid Computations: Functional Design

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Kutler, Paul (Technical Monitor)

    1997-01-01

    In a previous report the design concepts of Charon were presented. Charon is a toolkit that aids engineers in developing scientific programs for structured-grid applications to be run on MIMD parallel computers. It constitutes an augmentation of the general-purpose MPI-based message-passing layer, and provides the user with a hierarchy of tools for rapid prototyping and validation of parallel programs, and subsequent piecemeal performance tuning. Here we describe the implementation of the domain decomposition tools used for creating data distributions across sets of processors. We also present the hierarchy of parallelization tools that allows smooth translation of legacy code (or a serial design) into a parallel program. Along with the actual tool descriptions, we will present the considerations that led to the particular design choices. Many of these are motivated by the requirement that Charon must be useful within the traditional computational environments of Fortran 77 and C. Only the Fortran 77 syntax will be presented in this report.
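
    The data-distribution step described above can be illustrated with a toy one-dimensional decomposition: split a structured grid as evenly as possible across processors. This is a stand-in sketch of the idea, not Charon's actual MPI-based API:

```python
def decompose(n_cells, n_procs):
    """Split a 1-D structured grid as evenly as possible across processors,
    returning (start, end) index ranges per processor; ghost-cell exchange
    and the MPI layer are omitted in this sketch."""
    base, extra = divmod(n_cells, n_procs)
    ranges, start = [], 0
    for p in range(n_procs):
        size = base + (1 if p < extra else 0)  # spread the remainder
        ranges.append((start, start + size))
        start += size
    return ranges

print(decompose(10, 3))  # [(0, 4), (4, 7), (7, 10)]
```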

  11. Computational functional genomics based analysis of pain-relevant micro-RNAs.

    PubMed

    Lötsch, Jörn; Niederberger, Ellen; Ultsch, Alfred

    2015-11-01

    Micro-ribonucleic acids (miRNAs) play a role in pain, based on studies on models of neuropathic or inflammatory pain and clinical evidence. The present analysis made extensive use of computational biology, knowledge discovery methods, publicly available databases and data mining tools to merge results from genetic and miRNA research into an analysis of the systems biological roles of miRNAs in pain. We identified that about one-third of miRNAs detected through nociceptive research have been associated with a mere 18 regulated genes. Substituting the missing genetic information by computational data mining and based on comprehensive current empirical evidence of gene versus miRNA interactions, we have identified a total of 130 pain genes as being probably regulated by a total of 167 different miRNAs. Particularly pain-relevant roles of miRNAs include the control of gene expression at any level and regulation of interleukin-6-related pain entities. Among the miRNAs regulating pain genes are seven that are brain specific, hinting at their therapeutic utility for modulating central nervous mechanisms of pain. PMID:26385553

  12. Evaluation of accountability measurements

    SciTech Connect

    Cacic, C.G.

    1988-01-01

    The New Brunswick Laboratory (NBL) is programmatically responsible to the U.S. Department of Energy (DOE) Office of Safeguards and Security (OSS) for providing independent review and evaluation of accountability measurement technology in DOE nuclear facilities. This function is addressed in part through the NBL Safeguards Measurement Evaluation (SME) Program. The SME Program utilizes both on-site review of measurement methods and material-specific measurement evaluation studies to provide information concerning the adequacy of subject accountability measurements. This paper reviews SME Program activities for the 1986-87 time period, with emphasis on noted improvements in measurement capabilities. Continued evolution of the SME Program to respond to changing safeguards concerns is discussed.

  13. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The findings of investigations on concepts and techniques in automated performance verification are presented. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  14. Charon Toolkit for Parallel, Implicit Structured-Grid Computations: Functional Design

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob F.; Kutler, Paul (Technical Monitor)

    1997-01-01

    Charon is a software toolkit that enables engineers to develop high-performing message-passing programs in a convenient and piecemeal fashion. Emphasis is on rapid program development and prototyping. In this report a detailed description of the functional design of the toolkit is presented. It is illustrated by the stepwise parallelization of two representative code examples.

  15. Computer Simulation for Calculating the Second-Order Correlation Function of Classical and Quantum Light

    ERIC Educational Resources Information Center

    Facao, M.; Lopes, A.; Silva, A. L.; Silva, P.

    2011-01-01

    We propose an undergraduate numerical project for simulating the results of the second-order correlation function as obtained by an intensity interference experiment for two kinds of light, namely bunched light with Gaussian or Lorentzian power density spectrum and antibunched light obtained from single-photon sources. While the algorithm for…
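
    The quantity simulated in such a project is the normalized second-order correlation, g2(τ) = ⟨I(t)I(t+τ)⟩/⟨I⟩²; at τ = 0 it distinguishes bunched thermal light (g2 = 2) from coherent light (g2 = 1) and antibunched single-photon light (g2 < 1). A sketch of the equal-time estimate for the first two cases (antibunched light needs a photon-level model not attempted here):

```python
import random

random.seed(2)
N = 200_000

# chaotic (thermal/bunched) light: complex Gaussian field amplitude, I = |E|^2
I_thermal = []
for _ in range(N):
    re, im = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    I_thermal.append(re * re + im * im)

def g2_zero(I):
    """Equal-time second-order correlation g2(0) = <I^2> / <I>^2."""
    m1 = sum(I) / len(I)
    m2 = sum(x * x for x in I) / len(I)
    return m2 / (m1 * m1)

I_coherent = [1.0] * N                  # ideal laser: constant intensity
g2_thermal = g2_zero(I_thermal)
g2_coherent = g2_zero(I_coherent)
print(g2_thermal, g2_coherent)          # ~2 (bunched) and exactly 1
```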

  16. Neural network models of cortical functions based on the computational properties of the cerebral cortex.

    PubMed

    Guigon, E; Grandguillaume, P; Otto, I; Boutkhil, L; Burnod, Y

    1994-01-01

    We describe a biologically plausible modelling framework based on the architectural and processing characteristics of the cerebral cortex. Its key feature is a multicellular processing unit (cortical column) reflecting the modular nature of cortical organization and function. In this framework, we describe a neural network model of the neuronal circuits of the cerebral cortex that learns different functions associated with different parts of the cortex: 1) visual integration for invariant pattern recognition, performed by a cooperation between temporal and parietal areas; 2) visual-to-motor transformation for 3D arm reaching movements, performed by parietal and motor areas; and 3) temporal integration and storage of sensorimotor programs, performed by networks linking the prefrontal cortex to associative sensory and motor areas. The architecture of the network is inspired by the features of the architecture of cortical pathways involved in these functions. We propose two rules which describe neural processing and plasticity in the network. The first rule (adaptive tuning of gating) is an analog of operant conditioning and permits learning to anticipate an action. The second rule (adaptive timing) is based on a bistable state of activity and permits learning of temporally separate events forming a behavioral sequence. PMID:7787829

  17. A Unit on Slope Functions--Using a Computer in Mathematics Class.

    ERIC Educational Resources Information Center

    Lappan, Glenda; Winter, M. J.

    1982-01-01

    An introductory unit on slope, designed to give students a chance to discover some of the basic relationships between functions and slopes, is described. Programs written in BASIC for PET microcomputers were used. It is felt that students have the background to understand derivatives after experiences with this unit. (MP)
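The relationship the unit aims at can be illustrated numerically: the slope of a secant line through two nearby points on a graph approaches the derivative as the points move together. A small sketch (in Python rather than the PET BASIC the unit used):

```python
def secant_slope(f, x, h):
    """Slope of the secant line through (x, f(x)) and (x + h, f(x + h))."""
    return (f(x + h) - f(x)) / h

# As h shrinks, secant slopes of f(x) = x^2 at x = 3 approach the
# derivative value 6.
for h in (1.0, 0.1, 0.01, 0.001):
    print(h, secant_slope(lambda x: x * x, 3.0, h))
```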

  18. A Mobile Computing Solution for Collecting Functional Analysis Data on a Pocket PC

    ERIC Educational Resources Information Center

    Jackson, James; Dixon, Mark R.

    2007-01-01

    The present paper provides a task analysis for creating a computerized data system using a Pocket PC and Microsoft Visual Basic. With Visual Basic software and any handheld device running the Windows Mobile operating system, this task analysis will allow behavior analysts to program and customize their own functional analysis data-collection…
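The core of such a data-collection system is a per-session log of timestamped behavior events that can be tallied by condition. The sketch below is a hypothetical illustration in Python (class and method names are invented here), not the paper's Visual Basic program:

```python
from collections import Counter
from datetime import datetime

class FASession:
    """Hypothetical sketch of a functional-analysis session logger:
    one session runs under one condition (e.g. attention, demand,
    alone, play) and records timestamped target behaviors."""

    def __init__(self, condition):
        self.condition = condition
        self.events = []                 # list of (timestamp, behavior)

    def record(self, behavior):
        """Log one occurrence of a target behavior."""
        self.events.append((datetime.now(), behavior))

    def counts(self):
        """Frequency of each behavior in this session."""
        return Counter(behavior for _, behavior in self.events)
```

A graphing step comparing `counts()` across conditions would then support identifying the function of the behavior.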

  19. Computational modeling to predict mechanical function of joints: application to the lower leg with simulation of two cadaver studies.

    PubMed

    Liacouras, Peter C; Wayne, Jennifer S

    2007-12-01

    Computational models of musculoskeletal joints and limbs can provide useful information about joint mechanics. Validated models can be used as predictive devices for understanding joint function and serve as clinical tools for predicting the outcome of surgical procedures. A new computational modeling approach was developed for simulating joint kinematics that are dictated by bone/joint anatomy, ligamentous constraints, and applied loading. Three-dimensional computational models of the lower leg were created to illustrate the application of this new approach. Model development began with generating three-dimensional surfaces of each bone from CT images and then importing them into the three-dimensional solid modeling software SOLIDWORKS and motion simulation package COSMOSMOTION. Through SOLIDWORKS and COSMOSMOTION, each bone surface file was filled to create a solid object and positioned, necessary components were added, and simulations were executed. Three-dimensional contacts were added to inhibit intersection of the bones during motion. Ligaments were represented as linear springs. Model predictions were then validated by comparison to two different cadaver studies, syndesmotic injury and repair and ankle inversion following ligament transection. The syndesmotic injury model was able to predict tibial rotation, fibular rotation, and anterior/posterior displacement. In the inversion simulation, calcaneofibular ligament extension and angles of inversion compared well. Some experimental data proved harder to simulate accurately, due to certain software limitations and lack of complete experimental data. Other parameters that could not be easily obtained experimentally can be predicted and analyzed by the computational simulations. In the syndesmotic injury study, the force generated in the tibionavicular and calcaneofibular ligaments reduced with the insertion of the staple, indicating how this repair technique changes joint function. After transection of the calcaneofibular…
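Representing a ligament as a linear spring, as the abstract describes, means the ligament produces tension proportional to its stretch beyond rest length and carries no load when slack. A minimal sketch of that force law (the function signature and parameter names are assumptions for illustration, not the paper's implementation):

```python
import math

def ligament_force(p_origin, p_insertion, rest_length, stiffness):
    """Tension-only linear spring model of a ligament:
    F = k * (L - L0) when the ligament is stretched (L > L0),
    and F = 0 when it is slack (ligaments cannot push)."""
    length = math.dist(p_origin, p_insertion)   # current origin-to-insertion distance
    stretch = length - rest_length
    return stiffness * stretch if stretch > 0.0 else 0.0
```

In a simulation like the one described, forces of this form at each ligament's attachment points would constrain the rigid-body motion computed by the dynamics package.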

  20. Computing the partition function and sampling for saturated secondary structures of RNA, with respect to the Turner energy model.

    PubMed

    Waldispühl, J; Clote, P

    2007-03-01

    An RNA secondary structure is saturated if no base pairs can be added without violating the definition of secondary structure. Here we describe a new algorithm, RNAsat, which for a given RNA sequence a and an integral temperature 0 ≤ T ≤ 100 °C computes the Boltzmann partition function Z_k^T(a) = Σ_{S ∈ SAT_k(a)} exp(-E(S)/RT), where the sum is over all saturated secondary structures of a which have exactly k base pairs, R is the universal gas constant and E(S) denotes the free energy with respect to the Turner nearest-neighbor energy model. By dynamic programming, we compute Z_k^T simultaneously for all values of k in time O(n^5) and space O(n^3). Additionally, RNAsat computes the partition function Q_k^T(a) = Σ_{S ∈ S_k(a)} exp(-E(S)/RT), where the sum is over all secondary structures of a which have k base pairs; the latter computation is performed simultaneously for all values of k in O(n^4) time and O(n^3) space. Lastly, using the partition function Z_k^T [resp. Q_k^T] with stochastic backtracking, RNAsat rigorously samples the collection of saturated secondary structures [resp. secondary structures] having k base pairs; for Q_k^T this provides a parametrized form of Sfold sampling (Ding and Lawrence, 2003). Using RNAsat, (i) we compute the ensemble free energy for saturated secondary structures having k base pairs, (ii) show cooperativity of the Turner model, (iii) demonstrate a temperature-dependent phase transition, (iv) illustrate the predictive advantage of RNAsat for precursor microRNA cel-mir-72 of C. elegans and for the pseudoknot PKB 00152 of Pseudobase (van Batenburg et al., 2001), (v) illustrate the RNA shapes (Giegerich et al., 2004) of sampled secondary structures [resp. saturated structures] having exactly k base pairs. A web server for RNAsat is under construction at bioinformatics.bc.edu/clotelab/RNAsat/. PMID:17456015
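The Boltzmann partition function Z = Σ exp(-E(S)/RT) can be illustrated on a small, explicitly enumerated set of structure energies; this toy sketch only shows what the quantity means, whereas RNAsat computes it by dynamic programming over all (exponentially many) structures:

```python
import math

R = 1.98717e-3  # universal gas constant in kcal/(mol*K)

def partition_function(energies, temp_kelvin):
    """Boltzmann partition function Z = sum_S exp(-E(S)/RT) over an
    explicitly listed set of free energies (in kcal/mol)."""
    RT = R * temp_kelvin
    return sum(math.exp(-e / RT) for e in energies)

def boltzmann_probability(energies, temp_kelvin, i):
    """Ensemble probability of structure i: exp(-E_i/RT) / Z."""
    RT = R * temp_kelvin
    return math.exp(-energies[i] / RT) / partition_function(energies, temp_kelvin)
```

Lower-energy structures dominate the ensemble at low temperature, which is what makes stochastic backtracking from the partition function a rigorous sampling procedure.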