Sample records for the query "computational analysis suggests"

  1. A Set of Computer Projects for an Electromagnetic Fields Class.

    ERIC Educational Resources Information Center

    Gleeson, Ronald F.

    1989-01-01

    Presented are three computer projects: vector analysis, electric field intensities at various distances, and the Biot-Savart law. Programming suggestions and project results are provided. One month is suggested for each project. (MVL)

  2. Graphics Flutter Analysis Methods, an interactive computing system at Lockheed-California Company

    NASA Technical Reports Server (NTRS)

    Radovcich, N. A.

    1975-01-01

    An interactive computer graphics system, Graphics Flutter Analysis Methods (GFAM), was developed to complement FAMAS, a matrix-oriented batch computing system, and other computer programs in performing complex numerical calculations using a fully integrated data management system. GFAM has many of the matrix operation capabilities found in FAMAS, but on a smaller scale, and is utilized when the analysis requires a high degree of interaction between the engineer and computer, and schedule constraints exclude the use of batch entry programs. Applications of GFAM to a variety of preliminary design, development design, and project modification programs suggest that interactive flutter analysis using matrix representations is a feasible and cost effective computing tool.

  3. Practical Use of Computationally Frugal Model Analysis Methods

    DOE PAGES

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; ...

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.
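
As a concrete illustration of the record above: a local, derivative-based sensitivity analysis is one of the frugal methods described, needing only p + 1 model runs for p parameters. The sketch below uses one-sided finite differences on a made-up two-parameter toy model; the function names and parameter values are assumptions for illustration, not taken from the cited paper.

```python
import numpy as np

def local_sensitivity(model, params, rel_step=0.01):
    """One-sided finite-difference sensitivities: one base run plus one
    perturbed run per parameter, i.e. p + 1 model runs in total."""
    params = np.asarray(params, dtype=float)
    base = model(params)
    sens = np.empty(len(params))
    for i, p in enumerate(params):
        step = rel_step * (abs(p) if p != 0 else 1.0)
        perturbed = params.copy()
        perturbed[i] += step
        sens[i] = (model(perturbed) - base) / step
    return sens

# Hypothetical toy "groundwater" model: head as a simple function of
# recharge and hydraulic conductivity (illustration only).
def toy_model(theta):
    recharge, conductivity = theta
    return recharge / conductivity

sens = local_sensitivity(toy_model, [2.0, 0.5])
```

Because each perturbed run is independent of the others, the loop parallelizes trivially, which is exactly the property that keeps such methods frugal.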

  4. Computational modeling of peripheral pain: a commentary.

    PubMed

    Argüello, Erick J; Silva, Ricardo J; Huerta, Mónica K; Avila, René S

    2015-06-11

    This commentary is intended to find possible explanations for the low impact of computational modeling on pain research. We discuss the main strategies that have been used in building computational models for the study of pain. The analysis suggests that traditional models lack biological plausibility at some levels, they do not provide clinically relevant results, and they cannot capture the stochastic character of neural dynamics. On this basis, we provide some suggestions that may be useful in building computational models of pain with a wider range of applications.

  5. Hybrid soft computing systems for electromyographic signals analysis: a review.

    PubMed

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. A hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.

  6. Hybrid soft computing systems for electromyographic signals analysis: a review

    PubMed Central

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. A hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  7. Categories of Computer Use and Their Relationships with Attitudes toward Computers.

    ERIC Educational Resources Information Center

    Mitra, Anandra

    1998-01-01

    Analysis of attitude and use questionnaires completed by undergraduates (n = 1,444) at Wake Forest University determined that computers were used most frequently for word processing. Other uses were e-mail for task and non-task activities and mathematical and statistical computation. Results suggest that the level of computer use was related to…

  8. The Human-Computer Interface and Information Literacy: Some Basics and Beyond.

    ERIC Educational Resources Information Center

    Church, Gary M.

    1999-01-01

    Discusses human/computer interaction research, human/computer interface, and their relationships to information literacy. Highlights include communication models; cognitive perspectives; task analysis; theory of action; problem solving; instructional design considerations; and a suggestion that human/information interface may be a more appropriate…

  9. Prediction of ball and roller bearing thermal and kinematic performance by computer analysis

    NASA Technical Reports Server (NTRS)

    Pirvics, J.; Kleckner, R. J.

    1983-01-01

    Characteristics of good computerized analysis software are suggested. These general remarks and an overview of representative software precede a more detailed discussion of load support system analysis program structure. Particular attention is directed at a recent cylindrical roller bearing analysis as an example of the available design tools. Selected software modules are then examined to reveal the detail inherent in contemporary analysis. This leads to a brief section on current design computation which seeks to suggest when and why computerized analysis is warranted. An example concludes the argument offered for such design methodology. Finally, remarks are made concerning needs for model development to address effects which are now considered to be secondary but are anticipated to emerge to primary status in the near future.

  10. Common Sense Planning for a Computer, or, What's It Worth to You?

    ERIC Educational Resources Information Center

    Crawford, Walt

    1984-01-01

    Suggests factors to be considered in planning for the purchase of a microcomputer, including budgets, benefits, costs, and decisions. Major uses of a personal computer are described--word processing, financial analysis, file and database management, programming and computer literacy, education, entertainment, and thrill of high technology. (EJS)

  11. Introducing computational thinking through hands-on projects using R with applications to calculus, probability and data analysis

    NASA Astrophysics Data System (ADS)

    Benakli, Nadia; Kostadinov, Boyan; Satyanarayana, Ashwin; Singh, Satyanand

    2017-04-01

    The goal of this paper is to promote computational thinking among mathematics, engineering, science and technology students, through hands-on computer experiments. These activities have the potential to empower students to learn, create and invent with technology, and they engage computational thinking through simulations, visualizations and data analysis. We present nine computer experiments, and suggest a few more, with applications to calculus, probability and data analysis. We are using the free (open-source) statistical programming language R. Our goal is to give a taste of what R offers rather than to present a comprehensive tutorial on the R language. In our experience, these kinds of interactive computer activities can be easily integrated into a smart classroom. Furthermore, these activities do tend to keep students motivated and actively engaged in the process of learning, problem solving and developing a better intuition for understanding complex mathematical concepts.

  12. Computers: from ethos and ethics to mythos and religion. Notes on the new frontier between computers and philosophy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitcham, C.

    This essay surveys recent studies concerning the social, cultural, ethical and religious dimensions of computers. The argument is that computers have certain cultural influences which call for ethical analysis. Further suggestions are that American culture is itself reflected in new ways in the high-technology computer milieu, and that ethical issues entail religious ones which are being largely ignored. 28 references.

  13. Should Computing Be Taught in Single-Sex Environments? An Analysis of the Computing Learning Environment of Upper Secondary Students

    ERIC Educational Resources Information Center

    Logan, Keri

    2007-01-01

    It has been well established in the literature that girls are turning their backs on computing courses at all levels of the education system. One reason given for this is that the computer learning environment is not conducive to girls, and it is often suggested that they would benefit from learning computing in a single-sex environment. The…

  14. Translational bioinformatics in the cloud: an affordable alternative

    PubMed Central

    2010-01-01

    With the continued exponential expansion of publicly available genomic data and access to low-cost, high-throughput molecular technologies for profiling patient populations, computational technologies and informatics are becoming vital considerations in genomic medicine. Although cloud computing technology is being heralded as a key enabling technology for the future of genomic research, available case studies are limited to applications in the domain of high-throughput sequence data analysis. The goal of this study was to evaluate the computational and economic characteristics of cloud computing in performing a large-scale data integration and analysis representative of research problems in genomic medicine. We find that the cloud-based analysis compares favorably in both performance and cost in comparison to a local computational cluster, suggesting that cloud computing technologies might be a viable resource for facilitating large-scale translational research in genomic medicine. PMID:20691073

  15. Attitudes toward computers: a new attitudinal dimension.

    PubMed

    Wang, Lei; Chen, Yang; Shi, Junqi

    2007-10-01

    The present study examined the reliability and the construct validity of a questionnaire designed to measure the attitudes toward computers in everyday life. A total of 2,050 participants responded to the questionnaire. Confirmatory factor analysis suggests that attitudes toward computers are composed of three dimensions: sense of benefit, sense of dependence, and sense of harm.

  16. Aerothermal Analysis of the Project Fire II Afterbody Flow

    NASA Technical Reports Server (NTRS)

    Wright, Michael J.; Loomis, Mark; Papadopoulos, Periklis; Arnold, James O. (Technical Monitor)

    2001-01-01

    Computational fluid dynamics (CFD) is used to simulate the wake flow and afterbody heating of the Project Fire II ballistic reentry to Earth at 11.4 km/sec. Laminar results are obtained over a portion of the trajectory between the initial heat pulse and peak afterbody heating. Although non-catalytic forebody convective heating results are in excellent agreement with previous computations, initial predictions of afterbody heating were about a factor of two below the experimental values. Further analysis suggests that significant catalysis may be occurring on the afterbody heat shield. Computations including finite-rate catalysis on the afterbody surface are in good agreement with the data over the early portion of the trajectory, but are conservative near the peak afterbody heating point, especially on the rear portion of the conical frustum. Further analysis of the flight data from Fire II shows that peak afterbody heating occurs before peak forebody heating, a result that contradicts computations and flight data from other entry vehicles. This result suggests that another mechanism, possibly pyrolysis, may be occurring during the later portion of the trajectory, resulting in less total heat transfer than the current predictions.

  17. Biomolecular dynamics by computer analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eilbeck, J.C.; Lomdahl, P.S.; Scott, A.C.

    1984-01-01

    As numerical tools (computers and display equipment) become more powerful and the atomic structures of important biological molecules become known, the importance of detailed computation of nonequilibrium biomolecular dynamics increases. In this manuscript we report results from a well developed study of the hydrogen bonded polypeptide crystal acetanilide, a model protein. Directions for future research are suggested. 9 references, 6 figures.

  18. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    NASA Astrophysics Data System (ADS)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high-performance computing are analyzed by bibliometric approaches. This study aims to provide computational physicists utilizing high-performance computing, and policy planners, with useful bibliometric results for an assessment of research activities. In order to achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers for high-performance computational physics as a case study. For this study, we used journal articles of the Scopus database from Elsevier covering the time period of 2004-2013. We extracted the author rank in the physics field utilizing high-performance computing by the number of papers published during ten years from 2004. Finally, we drew the co-authorship network for 45 top authors and their coauthors, and described some features of the co-authorship network in relation to the author rank. Suggestions for further studies are discussed.
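
The tallying behind such a study — author rank by paper count and weighted co-authorship edges — can be sketched in a few lines. The author names and paper lists below are invented placeholders, not Scopus data; the same counting would apply to real records.

```python
from itertools import combinations
from collections import Counter

def coauthorship_network(papers):
    """Build author rank (papers per author) and weighted co-authorship
    edges from a list of per-paper author lists."""
    rank = Counter(a for authors in papers for a in set(authors))
    edges = Counter()
    for authors in papers:
        # Each unordered author pair on a paper adds one edge weight.
        for a, b in combinations(sorted(set(authors)), 2):
            edges[(a, b)] += 1
    return rank, edges

# Hypothetical records standing in for 2004-2013 Scopus entries.
papers = [
    ["Kim", "Lee", "Park"],
    ["Kim", "Lee"],
    ["Park", "Choi"],
    ["Kim", "Choi"],
]
rank, edges = coauthorship_network(papers)
top = rank.most_common(1)[0]  # most prolific author and paper count
```

From `rank` one can take the top-45 authors, and `edges` gives the weighted network to visualize or analyze further.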

  19. 1/f Noise in the ``Game of Life''

    NASA Astrophysics Data System (ADS)

    Andrecut, Mircea

    Conway's celebrated ``game of life'' cellular automaton possesses computational universality. The Fourier analysis reported here shows that the power spectra of the ``game of life'' exhibit 1/f noise. The obtained result suggests a connection between 1/f noise and computational universality.
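
The analysis this record describes — simulate the automaton, track a global observable, examine its power spectrum — can be reproduced in miniature. The grid size, initial density, and step count below are arbitrary choices for illustration, not taken from the paper.

```python
import numpy as np

def life_step(grid):
    """One synchronous update of Conway's Game of Life on a toroidal grid."""
    n = sum(np.roll(np.roll(grid, i, 0), j, 1)
            for i in (-1, 0, 1) for j in (-1, 0, 1)
            if (i, j) != (0, 0))
    return ((n == 3) | ((grid == 1) & (n == 2))).astype(np.uint8)

def activity_spectrum(size=64, steps=512, seed=0):
    """Total live-cell count per generation and its power spectrum."""
    rng = np.random.default_rng(seed)
    grid = (rng.random((size, size)) < 0.3).astype(np.uint8)
    activity = []
    for _ in range(steps):
        grid = life_step(grid)
        activity.append(int(grid.sum()))
    x = np.asarray(activity, dtype=float)
    power = np.abs(np.fft.rfft(x - x.mean())) ** 2
    return np.fft.rfftfreq(steps), power

freqs, power = activity_spectrum()
# A log-log regression of power against frequency (excluding f = 0)
# estimates the spectral exponent; values near -1 indicate 1/f noise.
slope = np.polyfit(np.log(freqs[1:]), np.log(power[1:] + 1e-12), 1)[0]
```

A short run like this mainly shows the machinery; a meaningful exponent estimate would need larger grids, longer runs, and averaging over initial conditions.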

  20. Performance review using sequential sampling and a practice computer.

    PubMed

    Difford, F

    1988-06-01

    The use of sequential sample analysis for repeated performance review is described with examples from several areas of practice. The value of a practice computer in providing a random sample from a complete population, evaluating the parameters of a sequential procedure, and producing a structured worksheet is discussed. It is suggested that sequential analysis has advantages over conventional sampling in the area of performance review in general practice.
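
Sequential analysis of the kind described is commonly implemented as Wald's sequential probability ratio test: sampling stops as soon as the evidence crosses a decision boundary, typically well before a fixed-size sample would be exhausted. The sketch below is a generic SPRT; the audit rates, error levels, and outcome stream are all invented for illustration, not drawn from the article.

```python
import math

def sprt(outcomes, p0=0.05, p1=0.15, alpha=0.05, beta=0.2):
    """Wald's sequential probability ratio test on a stream of binary
    outcomes (e.g. 1 = care criterion missed in a sampled record).
    Returns a decision as soon as a boundary is crossed."""
    upper = math.log((1 - beta) / alpha)   # cross: failure rate >= p1
    lower = math.log(beta / (1 - alpha))   # cross: failure rate <= p0
    llr = 0.0
    for n, x in enumerate(outcomes, 1):
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "review indicated", n
        if llr <= lower:
            return "performance acceptable", n
    return "continue sampling", len(outcomes)

# Hypothetical audit stream: one deficiency among 25 sampled records.
decision, n_used = sprt([0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
                         0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
```

The early-stopping property is what makes the approach attractive for repeated performance review: a practice computer can draw the random sample and evaluate the boundaries automatically.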

  1. Antecedent Knowledge and Intelligent Computer Assisted Instruction.

    ERIC Educational Resources Information Center

    Woodward, John P.; Carnine, Douglas W.

    1988-01-01

    The article reviews Intelligent Computer Assisted Instruction (ICAI), an area of artificial intelligence and notes its shortcomings for learning disabled students. It is suggested that emphasis on antecedent knowledge (important facts, concepts, rules, and/or strategies for the content area) and content analysis and design techniques would make…

  2. In Praise of Numerical Computation

    NASA Astrophysics Data System (ADS)

    Yap, Chee K.

    Theoretical Computer Science has developed an almost exclusively discrete/algebraic persona. We have effectively shut ourselves off from half of the world of computing: a host of problems in Computational Science & Engineering (CS&E) are defined on the continuum, and, for them, the discrete viewpoint is inadequate. The computational techniques in such problems are well-known to numerical analysis and applied mathematics, but are rarely discussed in theoretical algorithms: iteration, subdivision and approximation. By various case studies, I will indicate how our discrete/algebraic view of computing has many shortcomings in CS&E. We want to embrace the continuous/analytic view, but in a new synthesis with the discrete/algebraic view. I will suggest a pathway, by way of an exact numerical model of computation, that allows us to incorporate iteration and approximation into our algorithms’ design. Some recent results give a peek into what this view of algorithmic development might look like, and its distinctive form suggests the name “numerical computational geometry” for such activities.

  3. The Children of the Computer Generation: An Analysis of the Family Computer Fad in Japan.

    ERIC Educational Resources Information Center

    Ishigaki, Emiko Hannah

    Results of a survey of grade school and junior high school students suggest that Japan is now caught up in a TV game fad called Family Computer (Fami-Com). Fami-Com is a household electric machine for video games that allows players to use more than 100 currently marketed software products. Since its introduction in 1983, the popularity of the…

  4. Development and validation of the computer technology literacy self-assessment scale for Taiwanese elementary school students.

    PubMed

    Chang, Chiung-Sui

    2008-01-01

    The purpose of this study was to describe the development and validation of an instrument to identify various dimensions of the computer technology literacy self-assessment scale (CTLS) for elementary school students. The instrument included five CTLS dimensions (subscales): technology operation skills, computer usage concepts, attitudes toward computer technology, learning with technology, and Internet operation skills. Participants were 1,539 elementary school students in Taiwan. Data analysis indicated that the instrument developed in the study had satisfactory validity and reliability. Correlation analysis supported the legitimacy of using multiple dimensions in representing students' computer technology literacy. Significant differences were found between male and female students, and between grades, on some CTLS dimensions. Suggestions are made for use of the instrument to examine complicated interplays between students' computer behaviors and their computer technology literacy.

  5. Application of microarray analysis on computer cluster and cloud platforms.

    PubMed

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
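
The parallelization this record relies on can be sketched directly: permutation iterations are computationally independent, so they split into chunks that any executor can process — threads here, but a process pool, a departmental cluster, or rented cloud instances would slot in the same way. All data below are synthetic, and the permutation test itself is a generic stand-in for the microarray procedures the authors benchmarked.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def perm_chunk(args):
    """Exceedance count from one independent chunk of permutations;
    chunks share no state, so they parallelize freely."""
    x, y, n_perm, seed = args
    rng = np.random.default_rng(seed)
    observed = x.mean() - y.mean()
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        stat = pooled[:len(x)].mean() - pooled[len(x):].mean()
        if abs(stat) >= abs(observed):
            count += 1
    return count

rng = np.random.default_rng(42)
x = rng.normal(2.0, 1.0, 50)   # synthetic "treatment" expression values
y = rng.normal(0.0, 1.0, 50)   # synthetic "control" expression values

n_chunks, per_chunk = 4, 250
jobs = [(x, y, per_chunk, seed) for seed in range(n_chunks)]
# Swapping in a process pool or cluster scheduler changes nothing but
# the executor: the chunks are independent by construction.
with ThreadPoolExecutor(max_workers=n_chunks) as ex:
    counts = list(ex.map(perm_chunk, jobs))
p_value = (sum(counts) + 1) / (n_chunks * per_chunk + 1)
```

Giving each chunk its own seed keeps results reproducible regardless of how the chunks are scheduled across workers.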

  6. K-Fold Crossvalidation in Canonical Analysis.

    ERIC Educational Resources Information Center

    Liang, Kun-Hsia; And Others

    1995-01-01

    A computer-assisted, K-fold cross-validation technique is discussed in the framework of canonical correlation analysis of randomly generated data sets. Analysis results suggest that this technique can effectively reduce the contamination of canonical variates and canonical correlations by sample-specific variance components. (Author/SLD)
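
A minimal sketch of the technique, assuming the standard QR/SVD formulation of canonical correlation analysis: fit the first pair of canonical weights on each training fold, then correlate the projected variates on the held-out fold. Shrinkage of the out-of-fold correlation relative to the in-sample value is the contamination by sample-specific variance that the record describes. The simulated data (a shared latent variable plus noise) are invented for illustration.

```python
import numpy as np

def cca_first_weights(X, Y):
    """First pair of canonical weight vectors via the QR/SVD method."""
    Xc = X - X.mean(0)
    Yc = Y - Y.mean(0)
    Qx, Rx = np.linalg.qr(Xc)
    Qy, Ry = np.linalg.qr(Yc)
    U, s, Vt = np.linalg.svd(Qx.T @ Qy)
    return np.linalg.solve(Rx, U[:, 0]), np.linalg.solve(Ry, Vt[0])

def kfold_canonical_corr(X, Y, k=5, seed=0):
    """Mean out-of-fold correlation of the first canonical variates."""
    idx = np.random.default_rng(seed).permutation(len(X))
    scores = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        a, b = cca_first_weights(X[train], Y[train])
        scores.append(np.corrcoef(X[fold] @ a, Y[fold] @ b)[0, 1])
    return float(np.mean(scores))

rng = np.random.default_rng(1)
z = rng.normal(size=(200, 1))            # shared latent signal
X = z + rng.normal(size=(200, 4))        # hypothetical variable set 1
Y = z + rng.normal(size=(200, 3))        # hypothetical variable set 2
cv_r = kfold_canonical_corr(X, Y)
```

With purely random (unrelated) data, the same `cv_r` hovers near zero even when the in-sample canonical correlation looks impressive — the effect the cited analysis demonstrates.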

  7. Automatic computation for optimum height planning of apartment buildings to improve solar access

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seong, Yoon-Bok; Kim, Yong-Yee; Seok, Ho-Tae

    2011-01-15

    The objective of this study is to suggest a mathematical model and an optimal algorithm for determining the height of apartment buildings to satisfy the solar rights of survey buildings or survey housing units. The objective is also to develop an automatic computation model for the optimum height of apartment buildings and then to clarify the performance and expected effects. To accomplish the objective of this study, the following procedures were followed: (1) The necessity of the height planning of obstruction buildings to satisfy the solar rights of survey buildings or survey housing units is demonstrated by analyzing, through a literature review, the recent trend of disputes related to solar rights and by examining the social requirements in terms of solar rights. In addition, the necessity of the automatic computation system for height planning of apartment buildings is demonstrated and a suitable analysis method for this system is chosen by investigating the characteristics of analysis methods for solar rights assessment. (2) A case study on the process of height planning of apartment buildings will be briefly described and the problems occurring in this process will then be examined carefully. (3) To develop an automatic computation model for height planning of apartment buildings, geometrical elements forming apartment buildings are defined by analyzing the geometrical characteristics of apartment buildings. In addition, design factors and regulations required in height planning of apartment buildings are investigated. Based on this knowledge, the methodology and mathematical algorithm to adjust the height of apartment buildings by automatic computation are suggested and probable problems and the ways to resolve these problems are discussed. Finally, the methodology and algorithm for the optimization are suggested.
(4) Based on the suggested methodology and mathematical algorithm, the automatic computation model for optimum height of apartment buildings is developed and the developed system is verified through the application of some cases. The effects of the suggested model are then demonstrated quantitatively and qualitatively. (author)

  8. Radiomic analysis in prediction of Human Papilloma Virus status.

    PubMed

    Yu, Kaixian; Zhang, Youyi; Yu, Yang; Huang, Chao; Liu, Rongjie; Li, Tengfei; Yang, Liuqing; Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Zhu, Hongtu

    2017-12-01

    Human Papilloma Virus (HPV) has been associated with oropharyngeal cancer prognosis. Traditionally, HPV status has been tested through an invasive lab test. Recently, the rapid development of statistical image analysis techniques has enabled precise quantitative analysis of medical images. The quantitative analysis of Computed Tomography (CT) provides a non-invasive way to assess HPV status for oropharynx cancer patients. We designed a statistical radiomics approach analyzing CT images to predict HPV status. Various radiomics features were extracted from CT scans, and analyzed using statistical feature selection and prediction methods. Our approach ranked the highest in the 2016 Medical Image Computing and Computer Assisted Intervention (MICCAI) grand challenge: Oropharynx Cancer (OPC) Radiomics Challenge, Human Papilloma Virus (HPV) Status Prediction. Further analysis on the most relevant radiomic features distinguishing HPV positive and negative subjects suggested that HPV positive patients usually have smaller and simpler tumors.

  9. A National Study of the Relationship between Home Access to a Computer and Academic Performance Scores of Grade 12 U.S. Science Students: An Analysis of the 2009 NAEP Data

    NASA Astrophysics Data System (ADS)

    Coffman, Mitchell Ward

    The purpose of this dissertation was to examine the relationship between student access to a computer at home and academic achievement. The 2009 National Assessment of Educational Progress (NAEP) dataset was probed using the National Data Explorer (NDE) to investigate correlations in the subsets of SES, Parental Education, Race, and Gender as it relates to access of a home computer and improved performance scores for U.S. public school grade 12 science students. A causal-comparative approach was employed seeking clarity on the relationship between home access and performance scores. The influence of home access cannot overcome the challenges students of lower SES face. The achievement gap, or a second digital divide, for underprivileged classes of students, including minorities, does not appear to contract via student access to a home computer. Nonetheless, in tests for significance, statistically significant improvement in science performance scores was reported for those having access to a computer at home compared to those not having access. Additionally, regression models reported evidence of correlations between and among subsets of controls for the demographic factors gender, race, and socioeconomic status. Variability in these correlations was high, suggesting influence from unobserved factors may have more impact upon the dependent variable. Having access to a computer at home increases performance scores for grade 12 general science students of all races, genders and socioeconomic levels. However, the performance gap is roughly equivalent to the existing performance gap of the national average for science scores, suggesting little influence from access to a computer on academic achievement. The variability of scores reported in the regression analysis models reflects a moderate to low effect, suggesting an absence of causation. 
These statistical results are accurate and confirm the literature review, whereby having access to a computer at home and the predictor variables were found to have a significant impact on performance scores, although the data presented suggest computer access at home is less influential upon performance scores than poverty and its correlates.

  10. Computational and Experimental Analysis of the Secretome of Methylococcus capsulatus (Bath)

    PubMed Central

    Indrelid, Stine; Mathiesen, Geir; Jacobsen, Morten; Lea, Tor; Kleiveland, Charlotte R.

    2014-01-01

    The Gram-negative methanotroph Methylococcus capsulatus (Bath) was recently demonstrated to abrogate inflammation in a murine model of inflammatory bowel disease, suggesting interactions with cells involved in maintaining mucosal homeostasis and emphasizing the importance of understanding the many properties of M. capsulatus. Secreted proteins determine how bacteria may interact with their environment, and a comprehensive knowledge of such proteins is therefore vital to understand bacterial physiology and behavior. The aim of this study was to systematically analyze protein secretion in M. capsulatus (Bath) by identifying the secretion systems present and the respective secreted substrates. Computational analysis revealed that in addition to previously recognized type II secretion systems and a type VII secretion system, a type Vb (two-partner) secretion system and putative type I secretion systems are present in M. capsulatus (Bath). In silico analysis suggests that the diverse secretion systems in M. capsulatus transport proteins likely to be involved in adhesion, colonization, nutrient acquisition and homeostasis maintenance. Results of the computational analysis were verified and extended by an experimental approach showing that, in addition, an uncharacterized protein and putative moonlighting proteins are released to the medium during exponential growth of M. capsulatus (Bath). PMID:25479164

  11. Computer Administering of the Psychological Investigations: Set-Relational Representation

    NASA Astrophysics Data System (ADS)

    Yordzhev, Krasimir

    Computer administering of a psychological investigation is the computer representation of the entire procedure of psychological assessments - test construction, test implementation, results evaluation, storage and maintenance of the developed database, its statistical processing, analysis and interpretation. A mathematical description of psychological assessment with the aid of personality tests is discussed in this article. The set theory and the relational algebra are used in this description. A relational model of data, needed to design a computer system for automation of certain psychological assessments is given. Some finite sets and relation on them, which are necessary for creating a personality psychological test, are described. The described model could be used to develop real software for computer administering of any psychological test and there is full automation of the whole process: test construction, test implementation, result evaluation, storage of the developed database, statistical implementation, analysis and interpretation. A software project for computer administering personality psychological tests is suggested.

  12. Computational Investigation of the Performance and Back-Pressure Limits of a Hypersonic Inlet

    NASA Technical Reports Server (NTRS)

    Smart, Michael K.; White, Jeffery A.

    2002-01-01

    A computational analysis of Mach 6.2 operation of a hypersonic inlet with rectangular-to-elliptical shape transition has been performed. The results of the computations are compared with experimental data for cases with and without a manually imposed back-pressure. While the no-back-pressure numerical solutions match the general trends of the data, certain features observed in the experiments did not appear in the computational solutions. The reasons for these discrepancies are discussed and possible remedies are suggested. Most importantly, however, the computational analysis increased the understanding of the consequences of certain aspects of the inlet design. This will enable the performance of future inlets of this class to be improved. Computational solutions with back-pressure under-estimated the back-pressure limit observed in the experiments, but did supply significant insight into the character of highly back-pressured inlet flows.

  13. Reading Emotion From Mouse Cursor Motions: Affective Computing Approach.

    PubMed

    Yamauchi, Takashi; Xiao, Kunchen

    2018-04-01

    Affective computing research has advanced emotion recognition systems using facial expressions, voices, gaits, and physiological signals, yet these methods are often impractical. This study integrates mouse cursor motion analysis into affective computing and investigates the idea that movements of the computer cursor can provide information about the emotion of the computer user. We extracted 16-26 trajectory features during a choice-reaching task and examined the link between emotion and cursor motions. Positive or negative emotions were induced in participants by music, film clips, or emotional pictures, and participants reported their emotions with questionnaires. Our 10-fold cross-validation analysis shows that statistical models formed from "known" participants (training data) could predict nearly 10%-20% of the variance in the positive affect and attentiveness ratings of "unknown" participants, suggesting that cursor movement patterns such as the area under the curve and direction changes help infer the emotions of computer users. Copyright © 2017 Cognitive Science Society, Inc.
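    Two of the named trajectory features can be sketched from a list of cursor samples. The exact feature definitions used in the study are not given in the abstract; the versions below (area as summed deviation from the start-end line, direction changes as sign flips along one axis) are common conventions and should be read as assumptions.

```python
# Illustrative cursor-trajectory features: area under curve and
# direction changes, computed from (x, y) samples of a mouse path.

def area_under_curve(points):
    """Sum of perpendicular distances of each point from the start-end line."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    length = (dx * dx + dy * dy) ** 0.5 or 1.0
    return sum(abs(dy * (x - x0) - dx * (y - y0)) / length for x, y in points)

def direction_changes(points, axis=0):
    """Count sign flips of movement along one axis (0 = x, 1 = y)."""
    deltas = [b[axis] - a[axis] for a, b in zip(points, points[1:])]
    signs = [d for d in deltas if d != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if (a > 0) != (b > 0))

path = [(0, 0), (1, 2), (2, 1), (3, 3), (4, 4)]
print(round(area_under_curve(path), 2))   # 1.41
print(direction_changes(path, axis=1))    # y goes up, down, up -> 2
```

    Features like these, computed per trial, would form the predictor columns of the statistical models mentioned above.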

  14. The Sensitivity of Memory Consolidation and Reconsolidation to Inhibitors of Protein Synthesis and Kinases: Computational Analysis

    ERIC Educational Resources Information Center

    Zhang, Yili; Smolen, Paul; Baxter, Douglas A.; Byrne, John H.

    2010-01-01

    Memory consolidation and reconsolidation require kinase activation and protein synthesis. Blocking either process during or shortly after training or recall disrupts memory stabilization, which suggests the existence of a critical time window during which these processes are necessary. Using a computational model of kinase synthesis and…

  15. A Future of Reversals: Dyslexic Talents in a World of Computer Visualization.

    ERIC Educational Resources Information Center

    West, Thomas G.

    1992-01-01

    This paper proposes that those traits which handicap visually oriented dyslexics in a verbally oriented educational system may confer advantages in new fields which rely on visual methods of analysis, especially those in computer applications. It is suggested that such traits also characterized Albert Einstein, Michael Faraday, James Maxwell, and…

  16. Human vs. Computer Diagnosis of Students' Natural Selection Knowledge: Testing the Efficacy of Text Analytic Software

    NASA Astrophysics Data System (ADS)

    Nehm, Ross H.; Haertig, Hendrik

    2012-02-01

    Our study examines the efficacy of Computer Assisted Scoring (CAS) of open-response text relative to expert human scoring within the complex domain of evolutionary biology. Specifically, we explored whether CAS can diagnose the explanatory elements (or Key Concepts) that comprise undergraduate students' explanatory models of natural selection with fidelity equal to that of expert human scorers in a sample of >1,000 essays. We used SPSS Text Analysis 3.0 to perform our CAS and measured Kappa values (inter-rater reliability) of KC detection (i.e., computer-human rating correspondence). Our first analysis indicated that the text analysis functions (or extraction rules) developed and deployed in SPSS Text Analysis to extract individual Key Concepts (KCs) from three different items differing in several surface features (e.g., taxon, trait, type of evolutionary change) produced "substantial" (Kappa 0.61-0.80) or "almost perfect" (0.81-1.00) agreement. The second analysis explored the measurement of human-computer correspondence for KC diversity (the number of different accurate knowledge elements) in the combined sample of all 827 essays. Here we found outstanding correspondence; extraction rules generated using one prompt type are broadly applicable to other evolutionary scenarios (e.g., bacterial resistance, cheetah running speed, etc.). This result is encouraging, as it suggests that the development of new item sets may not necessitate the development of new text analysis rules. Overall, our findings suggest that CAS tools such as SPSS Text Analysis may compensate for some of the intrinsic limitations of currently used multiple-choice Concept Inventories designed to measure student knowledge of natural selection.
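    The agreement statistic used above is Cohen's kappa, which corrects raw computer-human agreement for agreement expected by chance. A minimal sketch, with made-up ratings for illustration:

```python
# Cohen's kappa for two raters (e.g., a human scorer and a CAS tool)
# rating the same items: (observed - expected) / (1 - expected).

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical presence/absence codes for one Key Concept in ten essays.
human =    [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
computer = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
print(round(cohens_kappa(human, computer), 2))  # 0.58
```

    On the scale quoted in the abstract, 0.58 would count as "moderate" agreement, just below the "substantial" 0.61-0.80 band.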

  17. The Effects of Variability and Risk in Selection Utility Analysis: An Empirical Comparison.

    ERIC Educational Resources Information Center

    Rich, Joseph R.; Boudreau, John W.

    1987-01-01

    Investigated utility estimate variability for the selection utility of using the Programmer Aptitude Test to select computer programmers. Comparison of Monte Carlo results to other risk assessment approaches (sensitivity analysis, break-even analysis, algebraic derivation of the distribtion) suggests that distribution information provided by Monte…

  18. Recent developments in rotary-wing aerodynamic theory

    NASA Technical Reports Server (NTRS)

    Johnson, W.

    1986-01-01

    Current progress in the computational analysis of rotary-wing flowfields is surveyed, and some typical results are presented in graphs. Topics examined include potential theory, rotating coordinate systems, lifting-surface theory (moving singularity, fixed wing, and rotary wing), panel methods (surface singularity representations, integral equations, and compressible flows), transonic theory (the small-disturbance equation), wake analysis (hovering rotor-wake models and transonic blade-vortex interaction), limitations on computational aerodynamics, and viscous-flow methods (dynamic-stall theories and lifting-line theory). It is suggested that the present algorithms and advanced computers make it possible to begin working toward the ultimate goal of turbulent Navier-Stokes calculations for an entire rotorcraft.

  19. The use of ERTS imagery in reservoir management and operation

    NASA Technical Reports Server (NTRS)

    Cooper, S. (Principal Investigator)

    1973-01-01

    There are no author-identified significant results in this report. Preliminary analysis of ERTS-1 imagery suggests that the configuration and areal coverage of surface waters, as well as other hydrologically related terrain features, may be obtained from ERTS-1 imagery to an extent that would be useful. Computer-oriented pattern recognition techniques are being developed to help automate the identification and analysis of hydrologic features. Considerable man-machine interaction is required while training the computer for these tasks.

  20. A Meta-Analysis of Suggestopedia, Suggestology, Suggestive-accelerative Learning and Teaching (SALT), and Super-learning.

    ERIC Educational Resources Information Center

    Moon, Charles E.; And Others

    Forty studies using one or more components of Lozanov's method of suggestive-accelerative learning and teaching were identified from a search of all issues of the "Journal of Suggestive-Accelerative Learning and Teaching." Fourteen studies contained sufficient statistics to compute effect sizes. The studies were coded according to substantive and…

  1. An Evaluation of a Computer-Based Training on the Visual Analysis of Single-Subject Data

    ERIC Educational Resources Information Center

    Snyder, Katie

    2013-01-01

    Visual analysis is the primary method of analyzing data in single-subject methodology, which is the predominant research method used in the fields of applied behavior analysis and special education. Previous research on the reliability of visual analysis suggests that judges often disagree about what constitutes an intervention effect. Considering…

  2. Computer use and carpal tunnel syndrome: A meta-analysis.

    PubMed

    Shiri, Rahman; Falah-Hassani, Kobra

    2015-02-15

    Studies have reported contradictory results on the role of keyboard or mouse use in carpal tunnel syndrome (CTS). This meta-analysis aimed to assess whether computer use causes CTS. Literature searches were conducted in several databases until May 2014. Twelve studies qualified for a random-effects meta-analysis. Heterogeneity and publication bias were assessed. In a meta-analysis of six studies (N=4964) that compared computer workers with the general population or other occupational populations, computer/typewriter use (pooled odds ratio (OR)=0.72, 95% confidence interval (CI) 0.58-0.90), computer/typewriter use ≥1 vs. <1h/day (OR=0.63, 95% CI 0.38-1.04) and computer/typewriter use ≥4 vs. <4h/day (OR=0.68, 95% CI 0.54-0.87) were inversely associated with CTS. Conversely, in a meta-analysis of six studies (N=5202) conducted among office workers, CTS was positively associated with computer/typewriter use (pooled OR=1.34, 95% CI 1.08-1.65), mouse use (OR=1.93, 95% CI 1.43-2.61), frequent computer use (OR=1.89, 95% CI 1.15-3.09), frequent mouse use (OR=1.84, 95% CI 1.18-2.87) and with years of computer work (OR=1.92, 95% CI 1.17-3.17 for long vs. short). There was no evidence of publication bias for either type of study. Studies that compared computer workers with the general population or several occupational groups did not control their estimates for occupational risk factors. Thus, office workers with no or little computer use are a more appropriate comparison group than the general population or several occupational groups. This meta-analysis suggests that excessive computer use, particularly mouse use, might be a minor occupational risk factor for CTS. Further prospective studies among office workers with objectively assessed keyboard and mouse use, and CTS symptoms or signs confirmed by a nerve conduction study, are needed. Copyright © 2014 Elsevier B.V. All rights reserved.
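    The pooled ORs above come from random-effects meta-analysis, which pools log odds ratios with weights that include a between-study variance term. A sketch of the standard DerSimonian-Laird computation follows; the study ORs and confidence intervals in the example are invented for illustration, not taken from the paper.

```python
# DerSimonian-Laird random-effects pooling of odds ratios on the log scale.
# Standard errors are recovered from each study's 95% CI width.

import math

def pooled_or(ors, cis):
    """Pool odds ratios given (lower, upper) 95% CI bounds per study."""
    logs = [math.log(o) for o in ors]
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in cis]
    w = [1 / se ** 2 for se in ses]                      # fixed-effect weights
    mean_fixed = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    q = sum(wi * (li - mean_fixed) ** 2 for wi, li in zip(w, logs))
    df = len(ors) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_re = [1 / (se ** 2 + tau2) for se in ses]          # random-effects weights
    mean_re = sum(wi * li for wi, li in zip(w_re, logs)) / sum(w_re)
    return math.exp(mean_re)

# Three hypothetical studies: OR with 95% CI each.
print(round(pooled_or([1.4, 2.0, 1.1], [(1.0, 1.9), (1.2, 3.3), (0.7, 1.8)]), 2))
```

    When the studies disagree more than sampling error explains (large Q), tau-squared grows and the weights even out, widening the pooled interval.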

  3. Computer Assisted Thermography And Its Application In Ovulation Detection

    NASA Astrophysics Data System (ADS)

    Rao, K. H.; Shah, A. V.

    1984-08-01

    Hardware and software of a computer-assisted image analyzing system used for infrared images in medical applications are discussed. The application of computer-assisted thermography (CAT) as a complementary diagnostic tool in centralized diagnostic management is proposed. The authors adopted computer-assisted thermography to study physiological changes in the breasts related to the hormones characterizing the menstrual cycle of a woman. Based on clinical experiments followed by thermal image analysis, they suggest that the 'differential skin temperature (DST)' be measured to detect the fertility interval in the menstrual cycle of a woman.

  4. Exploration of the Attitudes of Freshman Foreign Language Students toward Using Computers at a Turkish State University

    ERIC Educational Resources Information Center

    Akbulut, Yavuz

    2008-01-01

    The present study expands the design of Warschauer (1996) surveying freshman foreign language students at a Turkish university. Motivating aspects of computer assisted instruction in terms of writing and e-mailing are explored through an exploratory factor analysis conducted on the survey developed by Warschauer (1996). Findings suggest that…

  5. Computerized content analysis of some adolescent writings of Napoleon Bonaparte: a test of the validity of the method.

    PubMed

    Gottschalk, Louis A; DeFrancisco, Don; Bechtel, Robert J

    2002-08-01

    The aim of this study was to test the validity of a computer software program previously demonstrated to be capable of making DSM-IV neuropsychiatric diagnoses from the content analysis of speech or verbal texts. In this report, the computer program was applied to three personal writings of Napoleon Bonaparte when he was 12 to 16 years of age. The accuracy of the neuropsychiatric evaluations derived from the computerized content analysis of these writings of Napoleon was independently corroborated by two biographers who have described pertinent details concerning his life situations, moods, and other emotional reactions during this adolescent period of his life. The relevance of this type of computer technology to psychohistorical research and clinical psychiatry is suggested.

  6. Discrete Fourier Transform Analysis in a Complex Vector Space

    NASA Technical Reports Server (NTRS)

    Dean, Bruce H.

    2009-01-01

    Alternative computational strategies for the Discrete Fourier Transform (DFT) have been developed using analysis of geometric manifolds. This approach provides a general framework for performing DFT calculations, and suggests a more efficient implementation of the DFT for applications using iterative transform methods, particularly phase retrieval. The DFT can thus be implemented using fewer operations than the usual implementation. The software decreases the run time of the DFT in applications, such as phase retrieval, that iteratively call the DFT function. The algorithm exploits a special computational approach based on analysis of the DFT as a transformation in a complex vector space. As such, this approach has the potential to realize a DFT computation that approaches N operations, versus Nlog(N) operations for the equivalent Fast Fourier Transform (FFT) calculation.
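    For reference, the baseline being improved upon is the DFT written directly as a linear transformation in a complex vector space. The sketch below is the plain O(N²) definition, not the reduced-operation variant the abstract describes.

```python
# Direct DFT as a linear map on a complex vector space:
# X[k] = sum_n x[n] * exp(-2*pi*i*k*n/N).

import cmath

def dft(x):
    """Return the N-point DFT of a sequence as a list of complex values."""
    n_pts = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / n_pts)
                for n in range(n_pts))
            for k in range(n_pts)]

signal = [1, 0, 0, 0]   # unit impulse
spectrum = dft(signal)  # impulse transforms to a flat spectrum
print([round(abs(c), 6) for c in spectrum])  # [1.0, 1.0, 1.0, 1.0]
```

    An iterative phase-retrieval loop calls a transform like this (and its inverse) on every iteration, which is why even a modest per-call saving matters.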

  7. Descent graphs in pedigree analysis: applications to haplotyping, location scores, and marker-sharing statistics.

    PubMed Central

    Sobel, E.; Lange, K.

    1996-01-01

    The introduction of stochastic methods in pedigree analysis has enabled geneticists to tackle computations intractable by standard deterministic methods. Until now these stochastic techniques have worked by running a Markov chain on the set of genetic descent states of a pedigree. Each descent state specifies the paths of gene flow in the pedigree and the founder alleles dropped down each path. The current paper follows up on a suggestion by Elizabeth Thompson that genetic descent graphs offer a more appropriate space for executing a Markov chain. A descent graph specifies the paths of gene flow but not the particular founder alleles traveling down the paths. This paper explores algorithms for implementing Thompson's suggestion for codominant markers in the context of automatic haplotyping, estimating location scores, and computing gene-clustering statistics for robust linkage analysis. Realistic numerical examples demonstrate the feasibility of the algorithms. PMID:8651310

  8. Scholarly literature and the press: scientific impact and social perception of physics computing

    NASA Astrophysics Data System (ADS)

    Pia, M. G.; Basaglia, T.; Bell, Z. W.; Dressendorfer, P. V.

    2014-06-01

    The broad coverage of the search for the Higgs boson in the mainstream media is a relative novelty for high energy physics (HEP) research, whose achievements have traditionally been limited to scholarly literature. This paper illustrates the results of a scientometric analysis of HEP computing in scientific literature, institutional media and the press, and a comparative overview of similar metrics concerning representative particle physics measurements. The picture emerging from these scientometric data documents the relationship between the scientific impact and the social perception of HEP physics research versus that of HEP computing. The results of this analysis suggest that improved communication of the scientific and social role of HEP computing via press releases from the major HEP laboratories would be beneficial to the high energy physics community.

  9. Satellite on-board processing for earth resources data

    NASA Technical Reports Server (NTRS)

    Bodenheimer, R. E.; Gonzalez, R. C.; Gupta, J. N.; Hwang, K.; Rochelle, R. W.; Wilson, J. B.; Wintz, P. A.

    1975-01-01

    Results of a survey of earth resources user applications and their data requirements, earth resources multispectral scanner sensor technology, and preprocessing algorithms for correcting the sensor outputs and for data bulk reduction are presented, along with a candidate data format. The computational requirements for implementing the data analysis algorithms are included, along with a review of computer architectures and organizations. Computer architectures capable of handling the algorithms' computational requirements are suggested, and the environmental effects of an on-board processor are discussed. By relating performance parameters to the system requirements of each user application, the feasibility of on-board processing is determined for each user. A tradeoff analysis is performed to determine the sensitivity of results to each of the system parameters. Significant results and conclusions are discussed, and recommendations are presented.

  10. Space lab system analysis

    NASA Technical Reports Server (NTRS)

    Ingels, F. M.; Rives, T. B.

    1987-01-01

    An analytical analysis of the HOSC Generic Peripheral processing system was conducted. The results are summarized; they indicate that the maximum delay in performing screen change requests should be less than 2.5 sec, occurring for a slow VAX host-to-video-screen I/O rate of 50 KBps. This delay is due to the average I/O rate from the video terminals to their host computer. The software structure of the main computers and the host computers will have greater impact on screen change or refresh response times. The HOSC data system model was updated by a newly coded PASCAL-based simulation program which was installed on the HOSC VAX system. This model is described and documented. Suggestions are offered to fine-tune the performance of the Ethernet interconnection network. Suggestions for using the Nutcracker by Excelan to trace itinerant packets which appear on the network from time to time were offered in discussions with the HOSC personnel. Several visits were made to the HOSC facility to install and demonstrate the simulation model.

  11. Automated analysis and classification of melanocytic tumor on skin whole slide images.

    PubMed

    Xu, Hongming; Lu, Cheng; Berendt, Richard; Jha, Naresh; Mandal, Mrinal

    2018-06-01

    This paper presents a computer-aided technique for automated analysis and classification of melanocytic tumor on skin whole slide biopsy images. The proposed technique consists of four main modules. First, skin epidermis and dermis regions are segmented by a multi-resolution framework. Next, epidermis analysis is performed, where a set of epidermis features reflecting nuclear morphologies and spatial distributions is computed. In parallel with epidermis analysis, dermis analysis is also performed, where dermal cell nuclei are segmented and a set of textural and cytological features are computed. Finally, the skin melanocytic image is classified into different categories such as melanoma, nevus or normal tissue by using a multi-class support vector machine (mSVM) with extracted epidermis and dermis features. Experimental results on 66 skin whole slide images indicate that the proposed technique achieves more than 95% classification accuracy, which suggests that the technique has the potential to be used for assisting pathologists on skin biopsy image analysis and classification. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Inhalation toxicity of indoor air pollutants in Drosophila melanogaster using integrated transcriptomics and computational behavior analyses

    NASA Astrophysics Data System (ADS)

    Eom, Hyun-Jeong; Liu, Yuedan; Kwak, Gyu-Suk; Heo, Muyoung; Song, Kyung Seuk; Chung, Yun Doo; Chon, Tae-Soo; Choi, Jinhee

    2017-06-01

    We conducted an inhalation toxicity test on the alternative animal model, Drosophila melanogaster, to investigate potential hazards of indoor air pollution. The inhalation toxicity of toluene and formaldehyde was investigated using comprehensive transcriptomics and computational behavior analyses. The ingenuity pathway analysis (IPA) based on microarray data suggests the involvement of pathways related to immune response, stress response, and metabolism in formaldehyde and toluene exposure based on hub molecules. We conducted a toxicity test using mutants of the representative genes in these pathways to explore the toxicological consequences of alterations of these pathways. Furthermore, extensive computational behavior analysis showed that exposure to either toluene or formaldehyde reduced most of the behavioral parameters of both wild-type and mutants. Interestingly, behavioral alteration caused by toluene or formaldehyde exposure was most severe in the p38b mutant, suggesting that the defects in the p38 pathway underlie behavioral alteration. Overall, the results indicate that exposure to toluene and formaldehyde via inhalation causes severe toxicity in Drosophila, by inducing significant alterations in gene expression and behavior, suggesting that Drosophila can be used as a potential alternative model in inhalation toxicity screening.

  13. Inhalation toxicity of indoor air pollutants in Drosophila melanogaster using integrated transcriptomics and computational behavior analyses

    PubMed Central

    Eom, Hyun-Jeong; Liu, Yuedan; Kwak, Gyu-Suk; Heo, Muyoung; Song, Kyung Seuk; Chung, Yun Doo; Chon, Tae-Soo; Choi, Jinhee

    2017-01-01

    We conducted an inhalation toxicity test on the alternative animal model, Drosophila melanogaster, to investigate potential hazards of indoor air pollution. The inhalation toxicity of toluene and formaldehyde was investigated using comprehensive transcriptomics and computational behavior analyses. The ingenuity pathway analysis (IPA) based on microarray data suggests the involvement of pathways related to immune response, stress response, and metabolism in formaldehyde and toluene exposure based on hub molecules. We conducted a toxicity test using mutants of the representative genes in these pathways to explore the toxicological consequences of alterations of these pathways. Furthermore, extensive computational behavior analysis showed that exposure to either toluene or formaldehyde reduced most of the behavioral parameters of both wild-type and mutants. Interestingly, behavioral alteration caused by toluene or formaldehyde exposure was most severe in the p38b mutant, suggesting that the defects in the p38 pathway underlie behavioral alteration. Overall, the results indicate that exposure to toluene and formaldehyde via inhalation causes severe toxicity in Drosophila, by inducing significant alterations in gene expression and behavior, suggesting that Drosophila can be used as a potential alternative model in inhalation toxicity screening. PMID:28621308

  14. Decision Analysis Using Spreadsheets.

    ERIC Educational Resources Information Center

    Sounderpandian, Jayavel

    1989-01-01

    Discussion of decision analysis and its importance in a business curriculum focuses on the use of spreadsheets instead of commercial software packages for computer assisted instruction. A hypothetical example is given of a company drilling for oil, and suggestions are provided for classroom exercises using spreadsheets. (seven references) (LRW)

  15. Wait, are you sad or angry? Large exposure time differences required for the categorization of facial expressions of emotion

    PubMed Central

    Du, Shichuan; Martinez, Aleix M.

    2013-01-01

    Facial expressions of emotion are essential components of human behavior, yet little is known about the hierarchical organization of their cognitive analysis. We study the minimum exposure time needed to successfully classify the six classical facial expressions of emotion (joy, surprise, sadness, anger, disgust, fear) plus neutral as seen at different image resolutions (240 × 160 to 15 × 10 pixels). Our results suggest a consistent hierarchical analysis of these facial expressions regardless of the resolution of the stimuli. Happiness and surprise can be recognized after very short exposure times (10–20 ms), even at low resolutions. Fear and anger are recognized the slowest (100–250 ms), even in high-resolution images, suggesting a later computation. Sadness and disgust are recognized in between (70–200 ms). The minimum exposure time required for successful classification of each facial expression correlates with the ability of a human subject to identify it correctly at low resolutions. These results suggest a fast, early computation of expressions represented mostly by low spatial frequencies or global configural cues and a later, slower process for those categories requiring a more fine-grained analysis of the image. We also demonstrate that those expressions that are mostly visible in higher-resolution images are not recognized as accurately. We summarize implications for current computational models. PMID:23509409

  16. Dental application of novel finite element analysis software for three-dimensional finite element modeling of a dentulous mandible from its computed tomography images.

    PubMed

    Nakamura, Keiko; Tajima, Kiyoshi; Chen, Ker-Kong; Nagamatsu, Yuki; Kakigawa, Hiroshi; Masumi, Shin-ich

    2013-12-01

    This study focused on the application of novel finite-element analysis software for constructing a finite-element model from the computed tomography data of a human dentulous mandible. The finite-element model is necessary for evaluating the mechanical response of the alveolar part of the mandible, resulting from occlusal force applied to the teeth during biting. Commercially available patient-specific general computed tomography-based finite-element analysis software was solely applied to the finite-element analysis for the extraction of computed tomography data. The mandibular bone with teeth was extracted from the original images. Both the enamel and the dentin were extracted after image processing, and the periodontal ligament was created from the segmented dentin. The constructed finite-element model was reasonably accurate using a total of 234,644 nodes and 1,268,784 tetrahedral and 40,665 shell elements. The elastic moduli of the heterogeneous mandibular bone were determined from the bone density data of the computed tomography images. The results suggested that the software applied in this study is both useful and powerful for creating a more accurate three-dimensional finite-element model of a dentulous mandible from the computed tomography data without the need for any other software.

  17. Programmers, professors, and parasites: credit and co-authorship in computer science.

    PubMed

    Solomon, Justin

    2009-12-01

    This article presents an in-depth analysis of past and present publishing practices in academic computer science to suggest the establishment of a more consistent publishing standard. Historical precedent for academic publishing in computer science is established through the study of anecdotes as well as statistics collected from databases of published computer science papers. After examining these facts alongside information about analogous publishing situations and standards in other scientific fields, the article concludes with a list of basic principles that should be adopted in any computer science publishing standard. These principles would contribute to the reliability and scientific nature of academic publications in computer science and would allow for more straightforward discourse in future publications.

  18. Bacterial computing with engineered populations.

    PubMed

    Amos, Martyn; Axmann, Ilka Maria; Blüthgen, Nils; de la Cruz, Fernando; Jaramillo, Alfonso; Rodriguez-Paton, Alfonso; Simmel, Friedrich

    2015-07-28

    We describe strategies for the construction of bacterial computing platforms by describing a number of results from the recently completed bacterial computing with engineered populations project. In general, the implementation of such systems requires a framework containing various components such as intracellular circuits, single cell input/output and cell-cell interfacing, as well as extensive analysis. In this overview paper, we describe our approach to each of these, and suggest possible areas for future research. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  19. A general concept for consistent documentation of computational analyses

    PubMed Central

    Müller, Fabian; Nordström, Karl; Lengauer, Thomas; Schulz, Marcel H.

    2015-01-01

    The ever-growing amount of data in the field of life sciences demands standardized ways of high-throughput computational analysis. This standardization requires a thorough documentation of each step in the computational analysis to enable researchers to understand and reproduce the results. However, due to the heterogeneity in software setups and the high rate of change during tool development, reproducibility is hard to achieve. One reason is that there is no common agreement in the research community on how to document computational studies. In many cases, simple flat files or other unstructured text documents are provided by researchers as documentation, which are often missing software dependencies, versions and sufficient documentation to understand the workflow and parameter settings. As a solution we suggest a simple and modest approach for documenting and verifying computational analysis pipelines. We propose a two-part scheme that defines a computational analysis using a Process and an Analysis metadata document, which jointly describe all necessary details to reproduce the results. In this design we separate the metadata specifying the process from the metadata describing an actual analysis run, thereby reducing the effort of manual documentation to an absolute minimum. Our approach is independent of a specific software environment, results in human readable XML documents that can easily be shared with other researchers and allows an automated validation to ensure consistency of the metadata. Because our approach has been designed with little to no assumptions concerning the workflow of an analysis, we expect it to be applicable in a wide range of computational research fields. Database URL: http://deep.mpi-inf.mpg.de/DAC/cmds/pub/pyvalid.zip PMID:26055099
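    A toy version of the two-part scheme can illustrate the separation: a Process document declaring the pipeline and its parameters, and an Analysis document recording one concrete run with its parameter values. Element and attribute names below are invented for illustration; the actual schema is defined by the DEEP/DAC materials at the database URL.

```python
# Hypothetical two-part metadata sketch: Process (what the pipeline is)
# vs. Analysis (how one run of it was executed).

import xml.etree.ElementTree as ET

# Process document: declares a step, its tool/version, and a parameter slot.
process = ET.Element("process", name="read-alignment")
step = ET.SubElement(process, "step", tool="bwa", version="0.7.17")
ET.SubElement(step, "parameter", name="threads")  # declared, value set per run

# Analysis document: records one run and binds the declared parameter.
analysis = ET.Element("analysis", process="read-alignment")
run = ET.SubElement(analysis, "run", date="2015-01-01")
ET.SubElement(run, "parameter", name="threads", value="8")

print(ET.tostring(process, encoding="unicode"))
print(ET.tostring(analysis, encoding="unicode"))
```

    Keeping the run-specific values out of the Process document is what reduces per-analysis documentation effort to near zero: the same Process file is reused, and each run emits only a small Analysis record.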

  20. Home care nurses' attitudes toward computers. A confirmatory factor analysis of the Stronge and Brodt instrument.

    PubMed

    Stricklin, Mary Lou; Bierer, S Beth; Struk, Cynthia

    2003-01-01

    Point-of-care technology for home care use will be the final step in enterprise-wide healthcare electronic communications. Successful implementation of home care point-of-care technology hinges upon nurses' attitudes toward point-of-care technology and its use in clinical practice. This study addresses the factors associated with home care nurses' attitudes using Stronge and Brodt's Nurses' Attitudes Toward Computers instrument. In this study, the Nurses' Attitudes Toward Computers instrument was administered to a convenience sample of 138 nurses employed by a large midwestern home care agency, with an 88% response rate. Confirmatory factor analysis corroborated the instrument's 3-dimensional factor structure for practicing nurses, with factors labeled as nurses' work, security issues, and perceived barriers. Results from the confirmatory factor analysis also suggest that these 3 factors are internally correlated and represent multiple dimensions of a higher-order construct labeled as nurses' attitudes toward computers. Additionally, two of these factors, nurses' work and perceived barriers, each appears to explain more variance in nurses' attitudes toward computers than security issues. Instrument reliability was high for the sample (.90), with subscale reliabilities ranging from .86 to .70.

  1. Rapid Geometry Creation for Computer-Aided Engineering Parametric Analyses: A Case Study Using ComGeom2 for Launch Abort System Design

    NASA Technical Reports Server (NTRS)

    Hawke, Veronica; Gage, Peter; Manning, Ted

    2007-01-01

    ComGeom2, a tool developed to generate Common Geometry representation for multidisciplinary analysis, has been used to create a large set of geometries for use in a design study requiring analysis by two computational codes. This paper describes the process used to generate the large number of configurations and suggests ways to further automate the process and make it more efficient for future studies. The design geometry for this study is the launch abort system of the NASA Crew Launch Vehicle.

  2. An evaluation of a computer code based on linear acoustic theory for predicting helicopter main rotor noise

    NASA Astrophysics Data System (ADS)

    Davis, S. J.; Egolf, T. A.

    1980-07-01

    Acoustic characteristics predicted using a recently developed computer code were correlated with measured acoustic data for two helicopter rotors. The analysis is based on a solution of the Ffowcs Williams-Hawkings (FW-H) equation and includes terms accounting for both the thickness and loading components of the rotational noise. Computations are carried out in the time domain and assume free-field conditions. Results of the correlation show that the Farassat/Nystrom analysis, when using predicted airload data as input, yields fair but encouraging correlation for the first 6 harmonics of blade passage. It also suggests that although the analysis represents a valuable first step towards developing a truly comprehensive helicopter rotor noise prediction capability, further work remains to be done in identifying and incorporating additional noise mechanisms into the code.
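    The thickness and loading terms referred to above are the first two source terms of the FW-H equation, whose standard form (as given in the general acoustics literature, not quoted from this paper) is:

```latex
% Generic FW-H form; the monopole (thickness) and dipole (loading) terms are
% the ones a linear-acoustics rotor noise code retains:
\Box^{2} p'(\mathbf{x},t)
  = \frac{\partial}{\partial t}\!\left[\rho_{0} v_{n}\,\delta(f)\right]
  - \frac{\partial}{\partial x_{i}}\!\left[\ell_{i}\,\delta(f)\right]
  + \frac{\partial^{2}}{\partial x_{i}\,\partial x_{j}}\!\left[T_{ij}\,H(f)\right]
```

    Here f = 0 defines the moving blade surface, v_n is the surface normal velocity (the thickness source), \ell_i the local surface loading, and T_{ij} the Lighthill stress tensor, whose quadrupole term is typically neglected in linear analyses such as the one evaluated here.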

  3. Prosodic analysis by rule

    NASA Astrophysics Data System (ADS)

    Lindsay, D.

    1985-02-01

    Research on the automatic computer analysis of intonation using linguistic knowledge is described. The use of computer programs to analyze and classify fundamental frequency (F0) contours, and work on the psychophysics of British English intonation and on the phonetics of F0 contours, are described. Results suggest that F0 can be conveniently tracked to represent intonation through time and can subsequently be used by a computer program as the basis for analysis. Nuclear intonation was studied, where the intonational nucleus is the region of auditory prominence, or information focus, found in all spoken sentences. The main mechanism behind such prominence is the perception of an extensive F0 movement on the nuclear syllable. A classification of the nuclear contour shape is a classification of the sentence type, often into categories that cannot be readily determined from the segmental phonemes of the utterance alone.
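    The F0 tracking mentioned above can be sketched with a minimal autocorrelation-based estimator; this is an assumed, generic method, since the abstract does not specify the tracker actually used.

```python
import numpy as np

# Minimal sketch of autocorrelation-based F0 estimation for one voiced frame:
# the F0 is read off from the autocorrelation peak inside a plausible pitch range.
def estimate_f0(frame, fs, fmin=50.0, fmax=500.0):
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]  # lags >= 0
    lo, hi = int(fs / fmax), int(fs / fmin)      # lag range for fmin..fmax
    lag = lo + np.argmax(ac[lo:hi])
    return fs / lag

fs = 8000
t = np.arange(0, 0.04, 1 / fs)          # one 40 ms analysis frame
frame = np.sin(2 * np.pi * 200 * t)     # synthetic voiced frame at 200 Hz
print(round(estimate_f0(frame, fs)))    # → 200
```

    Running such an estimator frame by frame yields the F0-through-time contour that the classification programs take as input.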

  4. Using computer simulations to facilitate conceptual understanding of electromagnetic induction

    NASA Astrophysics Data System (ADS)

    Lee, Yu-Fen

    This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education, categorized by three different learning frameworks, and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of the reviewed literature, I proposed effective approaches to integrating computer simulations in physics education. These approaches are consistent with well-established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). The research-based approaches to integrating computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research-based computer simulations developed by the physics education research group at the University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. 
After receiving model reasoning online, students were asked to submit their revised answers electronically. Students in the TRAD group were not granted access to the CLCS material and followed their normal classroom routine. At the end of the study, both the CLCS and TRAD students took a post-test. Questions on the post-test were divided into "what" questions, "how" questions, and an open response question. Analysis of students' post-test performance showed mixed results. While the TRAD students scored higher on the "what" questions, the CLCS students scored higher on the "how" questions and the one open response question. This result suggested that more TRAD students knew what kinds of conditions may or may not cause electromagnetic induction without understanding how electromagnetic induction works. Analysis of the CLCS students' learning also suggested that frequent disruption and technical trouble might pose threats to the effectiveness of the CLCS learning framework. Despite the mixed results of students' post-test performance, the CLCS learning framework revealed some limitations in promoting conceptual understanding in physics. Improvement can be made by providing students with the background knowledge necessary to understand model reasoning and by incorporating the CLCS learning framework with other learning frameworks to promote integration of various physics concepts. In addition, the reflective questions in the CLCS learning framework may be refined to better address students' difficulties. Limitations of the study, as well as suggestions for future research, are also presented.

  5. High performance computing enabling exhaustive analysis of higher order single nucleotide polymorphism interaction in Genome Wide Association Studies.

    PubMed

    Goudey, Benjamin; Abedini, Mani; Hopper, John L; Inouye, Michael; Makalic, Enes; Schmidt, Daniel F; Wagner, John; Zhou, Zeyu; Zobel, Justin; Reumann, Matthias

    2015-01-01

    Genome-wide association studies (GWAS) are a common approach for systematic discovery of single nucleotide polymorphisms (SNPs) associated with a given disease. The univariate analysis approaches commonly employed may miss important SNP associations in complex diseases that only appear through multivariate analysis. However, multivariate SNP analysis is currently limited by its inherent computational complexity. In this work, we present a computational framework that harnesses supercomputers. Based on our results, we estimate that a three-way interaction analysis of 1.1 million-SNP GWAS data would require over 5.8 years on the full "Avoca" IBM Blue Gene/Q installation at the Victorian Life Sciences Computation Initiative. This is hundreds of times faster than estimates for other CPU-based methods and four times faster than runtimes estimated for GPU methods, indicating how the improvement in the level of hardware applied to interaction analysis may alter the types of analysis that can be performed. Furthermore, the same analysis would take under 3 months on the currently largest IBM Blue Gene/Q supercomputer, "Sequoia", at the Lawrence Livermore National Laboratory, assuming linear scaling is maintained, as our results suggest. Given that the implementation used in this study can be further optimised, this runtime means it is becoming feasible to carry out exhaustive analysis of higher-order interaction studies on large modern GWAS.
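    The scale of the exhaustive search is easy to reproduce: testing every SNP triple means evaluating C(n, 3) combinations, which for the 1.1 million SNPs quoted above is on the order of 10^17.

```python
import math

# Number of SNP triples an exhaustive three-way interaction analysis must test.
n = 1_100_000
triples = math.comb(n, 3)
print(f"{triples:.3e}")   # ≈ 2.218e+17 three-way combinations

# At the paper's estimate of 5.8 years on Avoca, the implied throughput is
# roughly a billion triples per second.
seconds = 5.8 * 365 * 24 * 3600
print(f"{triples / seconds:.1e} triples/s")
```

    This combinatorial growth, cubic in the number of SNPs, is why the abstract treats supercomputer-scale hardware as a prerequisite rather than a convenience.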

  6. Bayesian Computation Emerges in Generic Cortical Microcircuits through Spike-Timing-Dependent Plasticity

    PubMed Central

    Nessler, Bernhard; Pfeiffer, Michael; Buesing, Lars; Maass, Wolfgang

    2013-01-01

    The principles by which networks of neurons compute, and how spike-timing-dependent plasticity (STDP) of synaptic weights generates and maintains their computational function, are unknown. Preceding work has shown that soft winner-take-all (WTA) circuits, where pyramidal neurons inhibit each other via interneurons, are a common motif of cortical microcircuits. We show through theoretical analysis and computer simulations that Bayesian computation is induced in these network motifs through STDP in combination with activity-dependent changes in the excitability of neurons. The fundamental components of this emergent Bayesian computation are priors that result from adaptation of neuronal excitability and implicit generative models for hidden causes that are created in the synaptic weights through STDP. In fact, a surprising result is that STDP is able to approximate a powerful principle for fitting such implicit generative models to high-dimensional spike inputs: Expectation Maximization. Our results suggest that the experimentally observed spontaneous activity and trial-to-trial variability of cortical neurons are essential features of their information processing capability, since their functional role is to represent probability distributions rather than static neural codes. Furthermore, they suggest networks of Bayesian computation modules as a new model for distributed information processing in the cortex. PMID:23633941

  7. Application of NASTRAN to propeller-induced ship vibration

    NASA Technical Reports Server (NTRS)

    Liepins, A. A.; Conaway, J. H.

    1975-01-01

    An application of the NASTRAN program to the analysis of propeller-induced ship vibration is presented. The essentials of the model, the computational procedure, and experience are described. Desirable program enhancements are suggested.

  8. A dictionary based informational genome analysis

    PubMed Central

    2012-01-01

    Background In the post-genomic era several methods of computational genomics are emerging to understand how the whole information is structured within genomes. The literature of the last five years accounts for several alignment-free methods, which have arisen as alternative metrics for the dissimilarity of biological sequences. Among the others, recent approaches are based on empirical frequencies of DNA k-mers in whole genomes. Results Any set of words (factors) occurring in a genome provides a genomic dictionary. About sixty genomes were analyzed by means of informational indexes based on genomic dictionaries, where a systemic view replaces a local sequence analysis. A software prototype applying the methodology outlined here carried out some computations on genomic data. We computed informational indexes and built genomic dictionaries of different sizes, along with their frequency distributions. The software performed three main tasks: computation of informational indexes, storage of these in a database, and index analysis and visualization. The validation was done by investigating genomes of various organisms. A systematic analysis of genomic repeats of several lengths, which is of great interest in biology (for example, to identify over-represented functional sequences, such as promoters), was discussed, and a method to define synthetic genetic networks was suggested. Conclusions We introduced a methodology based on dictionaries, and an efficient motif-finding software application for comparative genomics. This approach could be extended along many investigation lines, namely exported into other contexts of computational genomics, as a basis for the discrimination of genomic pathologies. PMID:22985068
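    The core objects above, a genomic dictionary of k-mers and an informational index computed over it, can be sketched in a few lines. The entropy index below is one plausible example of such an index, not necessarily one of those used by the authors.

```python
from collections import Counter
import math

# Build the k-mer dictionary of a sequence: every length-k factor with its count.
def kmer_dictionary(seq: str, k: int) -> Counter:
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

# One simple informational index: empirical entropy (bits) of the k-mer
# frequency distribution.
def kmer_entropy(seq: str, k: int) -> float:
    counts = kmer_dictionary(seq, k)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# A repetitive sequence carries less k-mer information than a mixed one.
print(kmer_entropy("ATATATATA", 2))          # → 1.0 ('AT' and 'TA', equally frequent)
print(kmer_entropy("ACGTACGGTCAG", 2) > 1.0) # → True
```

    Highly repetitive regions depress such indexes, which is why dictionary-based statistics are a natural probe for the genomic repeats the abstract discusses.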

  9. Bayesian Latent Class Analysis Tutorial.

    PubMed

    Li, Yuelin; Lord-Bessen, Jennifer; Shiyko, Mariya; Loeb, Rebecca

    2018-01-01

    This article is a how-to guide on Bayesian computation using Gibbs sampling, demonstrated in the context of Latent Class Analysis (LCA). It is written for students in quantitative psychology or related fields who have a working knowledge of Bayes' Theorem and conditional probability and have experience in writing computer programs in the statistical language R. The overall goals are to provide an accessible and self-contained tutorial, along with a practical computation tool. We begin with how Bayesian computation is typically described in academic articles. Technical difficulties are addressed by a hypothetical, worked-out example. We show how Bayesian computation can be broken down into a series of simpler calculations, which can then be assembled together to complete a computationally more complex model. The details are described much more explicitly than what is typically available in elementary introductions to Bayesian modeling so that readers are not overwhelmed by the mathematics. Moreover, the provided computer program shows how Bayesian LCA can be implemented with relative ease. The computer program is then applied to a large, real-world data set and explained line-by-line. We outline the general steps in how to extend these considerations to other methodological applications. We conclude with suggestions for further readings.
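    The "series of simpler calculations" the tutorial describes can be illustrated with a compact Gibbs sampler for a two-class LCA with binary items. This is an independent sketch in the same spirit, not the article's R program: conjugacy makes every conditional draw standard (categorical for memberships, Beta for item probabilities, Dirichlet for class weights).

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: two latent classes with high/low endorsement probabilities.
N, J, K = 300, 6, 2
true_z = rng.integers(0, K, N)
true_p = np.array([[0.9] * J, [0.1] * J])
X = (rng.random((N, J)) < true_p[true_z]).astype(float)

pi = np.full(K, 1.0 / K)              # class weights
p = rng.uniform(0.25, 0.75, (K, J))   # item-response probabilities
for _ in range(500):
    # 1) Sample class memberships from their full conditional.
    logw = np.log(pi) + X @ np.log(p).T + (1 - X) @ np.log(1 - p).T
    w = np.exp(logw - logw.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    u = rng.random(N)
    z = (u[:, None] < np.cumsum(w, axis=1)).argmax(axis=1)
    # 2) Sample parameters from their Beta/Dirichlet full conditionals.
    nk = np.array([(z == k).sum() for k in range(K)])
    pi = rng.dirichlet(1 + nk)
    for k in range(K):
        s = X[z == k].sum(axis=0)
        p[k] = rng.beta(1 + s, 1 + nk[k] - s)

# Classes are exchangeable (label switching), so compare sorted class profiles.
print(np.sort(p.mean(axis=1)).round(1))   # ≈ [0.1 0.9]
```

    Each sweep touches only standard distributions, which is exactly the pedagogical point: assembling simple conditional draws yields inference for a model that has no closed-form posterior.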

  10. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    NASA Astrophysics Data System (ADS)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing on, and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build the multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools for other domains with spatial properties. 
We tested the performance of the platform based on taxi trajectory analysis. Results suggested that GISpark achieves excellent run time performance in spatiotemporal big data applications.

  11. Nonreflective Conditions for Perfectly Matched Layer in Computational Aeroacoustics

    NASA Astrophysics Data System (ADS)

    Choung, Hanahchim; Jang, Seokjong; Lee, Soogab

    2018-05-01

    In computational aeroacoustics, boundary conditions such as radiation, outflow, or absorbing boundary conditions are critical issues in that they can affect the entire solution of the computation. Among these types of boundary conditions, the perfectly matched layer (PML) boundary condition, which has been widely used in computational fluid dynamics and computational aeroacoustics, is constructed by augmenting the original governing equations with additional terms built from an absorption function, so as to stably absorb the outgoing waves. Even though the perfectly matched layer is analytically a perfectly nonreflective boundary condition, spurious waves occur at the interface, since the analysis is performed in discretized space. Hence, this study focuses on the factors that affect numerical errors from the perfectly matched layer, in order to find the optimum conditions for a nonreflective PML. Through a mathematical approach, a minimum width of the perfectly matched layer and an optimum absorption coefficient are suggested. To validate the prediction of the analysis, numerical simulations are performed in a generalized coordinate system, as well as in a Cartesian coordinate system.
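    As a generic illustration of the augmentation described above (not the paper's exact formulation), a 1-D PML adds an absorption term to the governing equations, with the absorption function graded polynomially across the layer:

```latex
% Illustrative 1-D PML augmentation for a hyperbolic system q_t + A q_x = 0:
\frac{\partial q}{\partial t} + A\,\frac{\partial q}{\partial x} + \sigma(x)\,q = 0,
\qquad
\sigma(x) = \sigma_{\max}\left(\frac{x - x_{0}}{D}\right)^{\beta},
% where x_0 marks the PML interface and D the layer width.
```

    The layer width D and the absorption coefficient \sigma_{\max} are exactly the two quantities the study optimizes: analytically any \sigma absorbs without reflection, but in discretized space a too-thin layer or too-steep \sigma profile produces the spurious interface waves the abstract describes.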

  12. An evaluation of a computer code based on linear acoustic theory for predicting helicopter main rotor noise. [CH-53A and S-76 helicopters

    NASA Technical Reports Server (NTRS)

    Davis, S. J.; Egolf, T. A.

    1980-01-01

    Acoustic characteristics predicted using a recently developed computer code were correlated with measured acoustic data for two helicopter rotors. The analysis is based on a solution of the Ffowcs Williams-Hawkings (FW-H) equation and includes terms accounting for both the thickness and loading components of the rotational noise. Computations are carried out in the time domain and assume free-field conditions. Results of the correlation show that the Farassat/Nystrom analysis, when using predicted airload data as input, yields fair but encouraging correlation for the first 6 harmonics of blade passage. It also suggests that although the analysis represents a valuable first step towards developing a truly comprehensive helicopter rotor noise prediction capability, further work remains to be done in identifying and incorporating additional noise mechanisms into the code.

  13. An automatic step adjustment method for average power analysis technique used in fiber amplifiers

    NASA Astrophysics Data System (ADS)

    Liu, Xue-Ming

    2006-04-01

    An automatic step adjustment (ASA) method for the average power analysis (APA) technique used in fiber amplifiers is proposed in this paper for the first time. In comparison with the traditional APA technique, the proposed method offers two distinct merits, higher-order accuracy and an ASA mechanism, so that it can significantly shorten the computing time and improve the solution accuracy. A test example demonstrates that, compared to the APA technique, the proposed method increases the computing speed by more than a hundredfold at the same error level. In computing the model equations of erbium-doped fiber amplifiers, the numerical results show that our method can improve the solution accuracy by over two orders of magnitude for the same number of amplifying sections. The proposed method has the capacity to rapidly and effectively compute the model equations of fiber Raman amplifiers and semiconductor lasers.
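    The general idea of automatic step adjustment can be sketched with generic step doubling: each step is computed once with step h and once with two h/2 substeps, and their difference estimates the local error that drives the next step size. This is a textbook scheme standing in for the paper's ASA mechanism, which is not reproduced here; the toy gain equation is likewise illustrative.

```python
def rk4_step(f, z, P, h):
    """One classical fourth-order Runge-Kutta step for dP/dz = f(z, P)."""
    k1 = f(z, P)
    k2 = f(z + h / 2, P + h / 2 * k1)
    k3 = f(z + h / 2, P + h / 2 * k2)
    k4 = f(z + h, P + h * k3)
    return P + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate_adaptive(f, z0, z1, P0, tol=1e-10, h=1e-2):
    z, P = z0, P0
    while z < z1:
        h = min(h, z1 - z)
        full = rk4_step(f, z, P, h)
        half = rk4_step(f, z + h / 2, rk4_step(f, z, P, h / 2), h / 2)
        if abs(half - full) < tol:    # accept, then try a larger step
            z, P = z + h, half
            h *= 1.5
        else:                         # reject and shrink the step
            h *= 0.5
    return P

# Toy gain equation dP/dz = P / (1 + z), exact solution P0 * (1 + z).
P = integrate_adaptive(lambda z, P: P / (1 + z), 0.0, 9.0, 1.0)
print(P)   # ≈ 10.0
```

    The payoff is the same as claimed for the ASA method: steps stretch automatically where the solution is smooth and shrink where it is not, so accuracy is bought with far fewer evaluations than a fixed fine step.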

  14. The Historical and Situated Nature of Design Experiments--Implications for Data Analysis

    ERIC Educational Resources Information Center

    Krange, I.; Ludvigsen, Sten

    2009-01-01

    This article is a methodological contribution to the use of design experiments in educational research. We will discuss the implications of a historical and situated interpretation of design experiments, the consequences this has for the analysis of the collected data, and empirically based suggestions to improve the designs of the computer-based…

  15. Analysis of multigrid methods on massively parallel computers: Architectural implications

    NASA Technical Reports Server (NTRS)

    Matheson, Lesley R.; Tarjan, Robert E.

    1993-01-01

    We study the potential performance of multigrid algorithms running on massively parallel computers with the intent of discovering whether presently envisioned machines will provide an efficient platform for such algorithms. We consider the domain parallel version of the standard V-cycle algorithm on model problems, discretized using finite difference techniques in two and three dimensions on block-structured grids of size 10^6 and 10^9, respectively. Our models of parallel computation were developed to reflect the computing characteristics of the current generation of massively parallel multicomputers. These models are based on an interconnection network of 256 to 16,384 message-passing, 'workstation-size' processors executing in an SPMD mode. The first model accomplishes interprocessor communications through a multistage permutation network. The communication cost is a logarithmic function which is similar to the costs in a variety of different topologies. The second model allows single-stage communication costs only. Both models were designed with information provided by machine developers and utilize implementation-derived parameters. With the medium-grain parallelism of the current generation and the high fixed cost of an interprocessor communication, our analysis suggests that an efficient implementation requires the machine to support the efficient transmission of long messages (up to 1000 words), or the high initiation cost of a communication must be significantly reduced through an alternative optimization technique. Furthermore, with variable-length message capability, our analysis suggests that the low-diameter multistage networks provide little or no advantage over a simple single-stage communications network.
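    The long-message argument follows from a standard latency/bandwidth cost model; the parameter values below are illustrative, not taken from the paper's machine models.

```python
# Toy latency/bandwidth model: with a high fixed message-initiation cost,
# one long message amortizes startup far better than many short ones.
def comm_time(n_messages, words_per_message, alpha=50e-6, beta=0.1e-6):
    # alpha: per-message startup cost (s); beta: per-word transfer cost (s)
    return n_messages * (alpha + beta * words_per_message)

total_words = 1000
one_long = comm_time(1, total_words)    # a single 1000-word message
many_short = comm_time(total_words, 1)  # 1000 one-word messages
print(one_long, many_short)             # startup dominates the short-message case
print(many_short / one_long)            # hundreds of times slower
```

    Whenever alpha dominates beta, as it did on the machines modeled, aggregating grid data into long messages is the only way to keep communication from swamping the V-cycle's computation.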

  16. User's manual for MASTER: Modeling of aerodynamic surfaces by 3-dimensional explicit representation. [input to three-dimensional computational fluid dynamics]

    NASA Technical Reports Server (NTRS)

    Gibson, S. G.

    1983-01-01

    A system of computer programs was developed to model general three dimensional surfaces. Surfaces are modeled as sets of parametric bicubic patches. There are also capabilities to transform coordinates, to compute mesh/surface intersection normals, and to format input data for a transonic potential flow analysis. A graphical display of surface models and intersection normals is available. There are additional capabilities to regulate point spacing on input curves and to compute surface/surface intersection curves. Input and output data formats are described; detailed suggestions are given for user input. Instructions for execution are given, and examples are shown.
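    The parametric bicubic patches mentioned above can be illustrated with a minimal evaluator. Cubic Bernstein (Bezier) polynomials are assumed here for concreteness; the manual does not specify MASTER's actual patch basis.

```python
import numpy as np

def bernstein3(t):
    """The four cubic Bernstein basis polynomials at parameter t."""
    return np.array([(1 - t) ** 3, 3 * t * (1 - t) ** 2,
                     3 * t ** 2 * (1 - t), t ** 3])

def eval_patch(P, u, v):
    """P: 4x4x3 control points; returns the surface point at (u, v) in [0,1]^2."""
    return np.einsum("i,ijk,j->k", bernstein3(u), P, bernstein3(v))

# Control grid over the unit square with height z = x * y.
xs, ys = np.meshgrid(np.linspace(0, 1, 4), np.linspace(0, 1, 4), indexing="ij")
P = np.dstack([xs, ys, xs * ys])

print(eval_patch(P, 0, 0))   # → [0. 0. 0.]  (a Bezier patch interpolates its corners)
print(eval_patch(P, 1, 1))   # → [1. 1. 1.]
```

    A surface model in this style is just a quilt of such 4x4 control grids; the mesh/surface intersection and normal computations the manual describes operate on exactly this kind of parametric evaluation.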

  17. Constructing storyboards based on hierarchical clustering analysis

    NASA Astrophysics Data System (ADS)

    Hasebe, Satoshi; Sami, Mustafa M.; Muramatsu, Shogo; Kikuchi, Hisakazu

    2005-07-01

    There are growing needs for quick previews of video content, both to improve the accessibility of video archives and to reduce network traffic. In this paper, a storyboard that contains a user-specified number of keyframes is produced from a given video sequence. It is based on hierarchical cluster analysis of feature vectors that are derived from wavelet coefficients of video frames. Consistent reuse of the extracted feature vectors is the key to avoiding repeated, computationally intensive parsing of the same video sequence. Experimental results suggest that a significant reduction in computational time is gained by this strategy.
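    The keyframe-selection idea can be sketched as follows: agglomeratively cluster the per-frame feature vectors into the user-specified number of groups, then keep the frame closest to each group centroid. Centroid linkage and random stand-in features are assumptions here; the paper's wavelet features and exact linkage are not reproduced.

```python
import numpy as np

def storyboard(features, n_keyframes):
    """Pick one representative frame index per cluster of feature vectors."""
    clusters = [[i] for i in range(len(features))]
    while len(clusters) > n_keyframes:
        # merge the two clusters whose centroids are closest
        cents = np.array([features[c].mean(axis=0) for c in clusters])
        d = np.linalg.norm(cents[:, None] - cents[None, :], axis=-1)
        d[np.diag_indices(len(clusters))] = np.inf
        a, b = np.unravel_index(np.argmin(d), d.shape)
        i, j = sorted((int(a), int(b)))
        clusters[i] = clusters[i] + clusters.pop(j)
    keyframes = []
    for c in clusters:
        cent = features[c].mean(axis=0)
        keyframes.append(c[np.argmin(np.linalg.norm(features[c] - cent, axis=1))])
    return sorted(keyframes)

rng = np.random.default_rng(0)
# Two synthetic "shots": frames 0-9 near one feature vector, frames 10-19 near another.
features = np.vstack([rng.normal(0, 0.1, (10, 8)), rng.normal(5, 0.1, (10, 8))])
print(storyboard(features, 2))   # one keyframe index from each shot
```

    Because the clustering consumes only the cached feature vectors, regenerating a storyboard with a different keyframe count never requires re-parsing the video, which is the paper's stated source of speedup.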

  18. Transportation Impact Evaluation System

    DOT National Transportation Integrated Search

    1979-11-01

    This report specifies a framework for spatial analysis and the general modelling steps required. It also suggests available urban and regional data sources, along with some typical existing urban and regional models. The goal is to develop a computer...

  19. Communication: Symmetrical quasi-classical analysis of linear optical spectroscopy

    NASA Astrophysics Data System (ADS)

    Provazza, Justin; Coker, David F.

    2018-05-01

    The symmetrical quasi-classical approach for propagation of a many degree of freedom density matrix is explored in the context of computing linear spectra. Calculations on a simple two state model for which exact results are available suggest that the approach gives a qualitative description of peak positions, relative amplitudes, and line broadening. Short time details in the computed dipole autocorrelation function result in exaggerated tails in the spectrum.
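    The linear spectrum described above is, up to constants, the Fourier transform of the dipole autocorrelation function, which is why errors in the short-time correlation show up as distorted tails in the spectrum. A toy damped-oscillator correlation function illustrates the relationship; the parameters are illustrative and unrelated to the paper's two-state model.

```python
import numpy as np

dt, n = 0.01, 4096
t = np.arange(n) * dt
f0, T = 5.0, 1.0                                   # oscillation frequency, dephasing time
C = np.exp(-t / T) * np.cos(2 * np.pi * f0 * t)    # model dipole autocorrelation

# Fourier transforming the correlation function yields the line shape:
# the damping time T sets the (Lorentzian) line broadening.
spectrum = np.abs(np.fft.rfft(C))
freqs = np.fft.rfftfreq(n, dt)
print(freqs[np.argmax(spectrum)])   # peak sits at ≈ f0 = 5.0
```

    In this picture, peak positions come from the oscillation frequencies of C(t), amplitudes from its initial weights, and widths from its decay, precisely the three features the abstract says the symmetrical quasi-classical approach captures qualitatively.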

  20. Some Observations on the Current Status of Performing Finite Element Analyses

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Knight, Norman F., Jr; Shivakumar, Kunigal N.

    2015-01-01

    Aerospace structures are complex high-performance structures. Advances in reliable and efficient computing and modeling tools are enabling analysts to consider complex configurations, build complex finite element models, and perform analysis rapidly. Many of the early-career engineers of today are very proficient in the usage of modern computers, computing engines, complex software systems, and visualization tools. These young engineers are becoming increasingly efficient in building complex 3D models of complicated aerospace components. However, current trends demonstrate blind acceptance of finite element analysis results. This paper is aimed at raising awareness of this situation. Examples of common encounters are presented. To overcome the current trends, some guidelines and suggestions for analysts, senior engineers, and educators are offered.

  1. Hierarchical Parallelization of Gene Differential Association Analysis

    PubMed Central

    2011-01-01

    Background Microarray gene differential expression analysis is a widely used technique that deals with high dimensional data and is computationally intensive for permutation-based procedures. Microarray gene differential association analysis is even more computationally demanding and must take advantage of multicore computing technology, which is the driving force behind increasing compute power in recent years. In this paper, we present a two-layer hierarchical parallel implementation of gene differential association analysis. It takes advantage of both fine- and coarse-grain (with granularity defined by the frequency of communication) parallelism in order to effectively leverage the non-uniform nature of parallel processing available in the cutting-edge systems of today. Results Our results show that this hierarchical strategy matches data sharing behavior to the properties of the underlying hardware, thereby reducing the memory and bandwidth needs of the application. The resulting improved efficiency reduces computation time and allows the gene differential association analysis code to scale its execution with the number of processors. The code and biological data used in this study are downloadable from http://www.urmc.rochester.edu/biostat/people/faculty/hu.cfm. Conclusions The performance sweet spot occurs when using a number of threads per MPI process that allows the working sets of the corresponding MPI processes running on the multicore to fit within the machine cache. Hence, we suggest that practitioners follow this principle in selecting the appropriate number of MPI processes and threads within each MPI process for their cluster configurations. We believe that the principles of this hierarchical approach to parallelization can be utilized in the parallelization of other computationally demanding kernels. PMID:21936916
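    The "sweet spot" rule in the conclusions reduces to simple arithmetic: choose the MPI process count per node so the processes' combined working sets still fit in the shared cache, then give each process the remaining cores as threads. All sizes below are hypothetical, purely to make the rule concrete.

```python
def threads_per_process(cache_bytes, working_set_bytes, cores):
    """Threads per MPI process so that co-resident working sets fit in cache."""
    fit = cache_bytes // working_set_bytes   # MPI processes that fit in cache
    fit = max(1, min(fit, cores))
    return cores // fit                      # threads each process receives

# e.g. a 32 MB shared cache, an 8 MB per-process working set, 16 cores:
print(threads_per_process(32 * 2**20, 8 * 2**20, 16))   # → 4
```

    Four processes of four threads each would then share the node: enough processes to use all cores, but few enough that their working sets do not thrash the cache.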

  2. Hierarchical parallelization of gene differential association analysis.

    PubMed

    Needham, Mark; Hu, Rui; Dwarkadas, Sandhya; Qiu, Xing

    2011-09-21

    Microarray gene differential expression analysis is a widely used technique that deals with high dimensional data and is computationally intensive for permutation-based procedures. Microarray gene differential association analysis is even more computationally demanding and must take advantage of multicore computing technology, which is the driving force behind increasing compute power in recent years. In this paper, we present a two-layer hierarchical parallel implementation of gene differential association analysis. It takes advantage of both fine- and coarse-grain (with granularity defined by the frequency of communication) parallelism in order to effectively leverage the non-uniform nature of parallel processing available in the cutting-edge systems of today. Our results show that this hierarchical strategy matches data sharing behavior to the properties of the underlying hardware, thereby reducing the memory and bandwidth needs of the application. The resulting improved efficiency reduces computation time and allows the gene differential association analysis code to scale its execution with the number of processors. The code and biological data used in this study are downloadable from http://www.urmc.rochester.edu/biostat/people/faculty/hu.cfm. The performance sweet spot occurs when using a number of threads per MPI process that allows the working sets of the corresponding MPI processes running on the multicore to fit within the machine cache. Hence, we suggest that practitioners follow this principle in selecting the appropriate number of MPI processes and threads within each MPI process for their cluster configurations. We believe that the principles of this hierarchical approach to parallelization can be utilized in the parallelization of other computationally demanding kernels.

  3. A NEW FACTOR ANALYSIS OF THE SVIB--SUGGESTED MODIFICATION OF EXISTING GROUPS AND IMPLICATIONS FOR COUNSELING.

    ERIC Educational Resources Information Center

    SMITH, STUART E.; AND OTHERS

    FACTOR ANALYSIS WAS CARRIED OUT TO ASCERTAIN THE BEST OCCUPATIONAL GROUP LOCATION FOR EACH OF FOUR STRONG VOCATIONAL INTEREST BLANK (SVIB) SCALES--VETERINARIAN, SENIOR CPA, PHARMACIST, AND MORTICIAN. THE SVIB WAS ADMINISTERED TO 125 MALE LIBERAL ARTS FRESHMEN. MEANS, STANDARD DEVIATIONS, AND INTERCORRELATIONS WERE COMPUTED. THIS FACTOR ANALYSIS…

  4. Users' Perceptions of the Web As Revealed by Transaction Log Analysis.

    ERIC Educational Resources Information Center

    Moukdad, Haidar; Large, Andrew

    2001-01-01

    Describes the results of a transaction log analysis of a Web search engine, WebCrawler, to analyze user's queries for information retrieval. Results suggest most users do not employ advanced search features, and the linguistic structure often resembles a human-human communication model that is not always successful in human-computer communication.…

  5. A study of partial coherence for identifying interior noise sources and paths on general aviation aircraft

    NASA Technical Reports Server (NTRS)

    Howlett, J. T.

    1979-01-01

    The partial coherence analysis method for noise source/path determination is summarized, and its application to a two-input, single-output system with coherence between the inputs is illustrated. The augmentation of the calculations on a digital computer interfaced with a two-channel, real-time analyzer is also discussed. The results indicate possible sources of error in the computations and suggest procedures for avoiding these errors.

  6. Whole-Volume Clustering of Time Series Data from Zebrafish Brain Calcium Images via Mixture Modeling.

    PubMed

    Nguyen, Hien D; Ullmann, Jeremy F P; McLachlan, Geoffrey J; Voleti, Venkatakaushik; Li, Wenze; Hillman, Elizabeth M C; Reutens, David C; Janke, Andrew L

    2018-02-01

    Calcium is a ubiquitous messenger in neural signaling events. An increasing number of techniques are enabling visualization of neurological activity in animal models via luminescent proteins that bind to calcium ions. These techniques generate large volumes of spatially correlated time series. A model-based functional data analysis methodology via Gaussian mixtures is proposed for the clustering of data from such visualizations. The methodology is theoretically justified, and a computationally efficient approach to estimation is suggested. An example analysis of a zebrafish imaging experiment is presented.
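    The abstract does not give the estimation details of the mixture fit. Purely as an illustration of the general idea, here is a minimal EM fit of a two-component 1-D Gaussian mixture of the kind that could cluster per-voxel summary statistics; the component count, data, and initialization below are invented for the sketch, not taken from the paper.

    ```python
    import math

    def normal_pdf(x, mu, var):
        return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

    def em_gmm_1d(data, mus, n_iter=50, var_floor=1e-3):
        """Minimal EM for a k-component 1-D Gaussian mixture."""
        k = len(mus)
        vars_ = [1.0] * k
        weights = [1.0 / k] * k
        for _ in range(n_iter):
            # E-step: posterior responsibility of each component for each point
            resp = []
            for x in data:
                ps = [weights[j] * normal_pdf(x, mus[j], vars_[j]) for j in range(k)]
                s = sum(ps)
                resp.append([p / s for p in ps])
            # M-step: re-estimate weights, means, and (floored) variances
            for j in range(k):
                nj = sum(r[j] for r in resp)
                weights[j] = nj / len(data)
                mus[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
                vars_[j] = max(var_floor,
                               sum(r[j] * (x - mus[j]) ** 2
                                   for r, x in zip(resp, data)) / nj)
        return mus, vars_, weights
    ```

    On two well-separated synthetic clusters, the recovered means converge to the cluster centers regardless of a mediocre initialization.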

  7. Understanding and enhancing user acceptance of computer technology

    NASA Technical Reports Server (NTRS)

    Rouse, William B.; Morris, Nancy M.

    1986-01-01

    Technology-driven efforts to implement computer technology often encounter problems due to lack of acceptance or begrudging acceptance of the personnel involved. It is argued that individuals' acceptance of automation, in terms of either computerization or computer aiding, is heavily influenced by their perceptions of the impact of the automation on their discretion in performing their jobs. It is suggested that desired levels of discretion reflect needs to feel in control and achieve self-satisfaction in task performance, as well as perceptions of inadequacies of computer technology. Discussion of these factors leads to a structured set of considerations for performing front-end analysis, deciding what to automate, and implementing the resulting changes.

  8. HBonanza: A Computer Algorithm for Molecular-Dynamics-Trajectory Hydrogen-Bond Analysis

    PubMed Central

    Durrant, Jacob D.; McCammon, J. Andrew

    2011-01-01

    In the current work, we present a hydrogen-bond analysis of 2,673 ligand-receptor complexes that suggests the total number of hydrogen bonds formed between a ligand and its protein receptor is a poor predictor of ligand potency; furthermore, even that poor prediction does not suggest a statistically significant correlation between hydrogen-bond formation and potency. While we are not the first to suggest that hydrogen bonds on average do not generally contribute to ligand binding affinities, this additional evidence is nevertheless interesting. The primary role of hydrogen bonds may instead be to ensure specificity, to correctly position the ligand within the active site, and to hold the protein active site in a ligand-friendly conformation. We also present a new computer program called HBonanza (hydrogen-bond analyzer) that aids the analysis and visualization of hydrogen-bond networks. HBonanza, which can be used to analyze single structures or the many structures of a molecular dynamics trajectory, is open source and implemented in Python, making it easily editable, customizable, and platform independent. Unlike many other freely available hydrogen-bond analysis tools, HBonanza provides not only a text-based table describing the hydrogen-bond network, but also a Tcl script to facilitate visualization in VMD, a popular molecular visualization program. Visualization in other programs is also possible. A copy of HBonanza can be obtained free of charge from http://www.nbcr.net/hbonanza. PMID:21880522
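    Tools of this kind apply a geometric test of roughly the following form. The 3.5 Å donor-acceptor cutoff and 150° donor-hydrogen-acceptor angle below are typical literature values, not necessarily HBonanza's defaults, which the abstract does not state.

    ```python
    import math

    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    def angle_deg(center, p1, p2):
        """Angle at `center` between rays center->p1 and center->p2, in degrees."""
        v1 = [x - c for x, c in zip(p1, center)]
        v2 = [x - c for x, c in zip(p2, center)]
        dot = sum(a * b for a, b in zip(v1, v2))
        n1 = math.sqrt(sum(a * a for a in v1))
        n2 = math.sqrt(sum(a * a for a in v2))
        return math.degrees(math.acos(dot / (n1 * n2)))

    def is_hbond(donor, hydrogen, acceptor, max_da=3.5, min_angle=150.0):
        """Geometric hydrogen-bond test: the donor-acceptor distance and the
        donor-H-acceptor angle must both satisfy their cutoffs."""
        return (dist(donor, acceptor) <= max_da
                and angle_deg(hydrogen, donor, acceptor) >= min_angle)
    ```

    Running the test over every donor/acceptor pairing in every trajectory frame yields the per-frame hydrogen-bond table such programs report.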

  9. The Effects of Social Environments on Time Spent Gaming: Focusing on the Effects of Communities and Neighborhoods.

    PubMed

    Lim, Tee Teng; Jung, Sun Young; Kim, Eunyi

    2018-04-01

    This study examined the impact of community and neighborhood on time spent computer gaming. Computer gaming for over 20 hours a week was set as the cutoff line for "engaged use" of computer games. For the analysis, this study analyzed data for about 1,800 subjects who participated in the Korean Children and Youth Panel Survey. The main findings are as follows: first, structural community characteristics and neighborhood social capital affected the engaged use of computer games. Second, adolescents who reside in regions with a higher divorce rate or higher residential mobility were more likely to exhibit engaged use of computer games. Third, adolescents who perceived high neighborhood social capital were less likely to exhibit engaged use of computer games. Based on these findings, practical implications and directions for further study are suggested.

  10. Rater reliability and concurrent validity of the Keyboard Personal Computer Style instrument (K-PeCS).

    PubMed

    Baker, Nancy A; Cook, James R; Redfern, Mark S

    2009-01-01

    This paper describes the inter-rater and intra-rater reliability, and the concurrent validity of an observational instrument, the Keyboard Personal Computer Style instrument (K-PeCS), which assesses stereotypical postures and movements associated with computer keyboard use. Three trained raters independently rated the video clips of 45 computer keyboard users to ascertain inter-rater reliability, and then re-rated a sub-sample of 15 video clips to ascertain intra-rater reliability. Concurrent validity was assessed by comparing the ratings obtained using the K-PeCS to scores developed from a 3D motion analysis system. The overall K-PeCS had excellent reliability [inter-rater: intra-class correlation coefficients (ICC)=.90; intra-rater: ICC=.92]. Most individual items on the K-PeCS had from good to excellent reliability, although six items fell below ICC=.75. Those K-PeCS items that were assessed for concurrent validity compared favorably to the motion analysis data for all but two items. These results suggest that most items on the K-PeCS can be used to reliably document computer keyboarding style.

  11. Study of Geometric Porosity on Static Stability and Drag Using Computational Fluid Dynamics for Rigid Parachute Shapes

    NASA Technical Reports Server (NTRS)

    Greathouse, James S.; Schwing, Alan M.

    2015-01-01

    This paper explores the use of computational fluid dynamics to study the effect of geometric porosity on static stability and drag for NASA's Multi-Purpose Crew Vehicle main parachute. Both of these aerodynamic characteristics are of interest in parachute design, and computational methods promise designers the ability to perform detailed parametric studies and other design iterations with a level of control previously unobtainable using ground or flight testing. The approach presented here uses a canopy structural analysis code to define the inflated parachute shapes on which structured computational grids are generated. These grids are used by the computational fluid dynamics code OVERFLOW and are modeled as rigid, impermeable bodies for this analysis. Comparisons to Apollo drop test data are shown as preliminary validation of the technique. Results include several parametric sweeps through design variables in order to better understand the trade between static stability and drag. Finally, designs that maximize static stability with a minimal loss in drag are suggested for further study in subscale ground and flight testing.

  12. Molecular Modeling in Drug Design for the Development of Organophosphorus Antidotes/Prophylactics.

    DTIC Science & Technology

    1986-06-01

    multidimensional statistical QSAR analysis techniques to suggest new structures for synthesis and evaluation. C. Application of quantum chemical techniques to...compounds for synthesis and testing for antidotal potency. E. Use of computer-assisted methods to determine the steric constraints at the active site...modeling techniques to model the enzyme acetylcholinesterase. H. Suggestion of some novel compounds for synthesis and testing for reactivating

  13. Indications for quantum computation requirements from comparative brain analysis

    NASA Astrophysics Data System (ADS)

    Bernroider, Gustav; Baer, Wolfgang

    2010-04-01

    Whether or not neuronal signal properties can engage 'non-trivial', i.e. functionally significant, quantum properties is the subject of an ongoing debate. Here we provide evidence that quantum coherence dynamics can play a functional role in ion conduction mechanisms, with consequences on the shape and associative character of classical membrane signals. In particular, these new perspectives predict that a specific neuronal topology (e.g. the connectivity pattern of cortical columns in the primate brain) is less important and not really required to explain abilities in perception and sensory-motor integration. Instead, this evidence suggests a decisive role of the number and functional segregation of ion channel proteins that can be engaged in a particular neuronal constellation. We provide evidence from comparative brain studies and estimates of the computational capacity behind visual flight functions that suggest a possible role of quantum computation in biological systems.

  14. Reliability analysis of a robotic system using hybridized technique

    NASA Astrophysics Data System (ADS)

    Kumar, Naveen; Komal; Lather, J. S.

    2017-09-01

    In this manuscript, the reliability of a robotic system has been analyzed using the available data (containing vagueness, uncertainty, etc.). Quantification of the involved uncertainties is done through data fuzzification using triangular fuzzy numbers with known spreads, as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, the computed reliability parameters have a wide range of predictions. Therefore, the decision-maker cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently to improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique. With this technique, fuzzy set theory is utilized to quantify uncertainties, a fault tree is utilized for the system modeling, the lambda-tau method is utilized to formulate mathematical expressions for failure/repair rates of the system, and a genetic algorithm is utilized to solve the established nonlinear programming problem. Different reliability parameters of a robotic system are computed and the results are compared with the existing technique. The components of the robotic system follow the exponential distribution, i.e., constant failure/repair rates. Sensitivity analysis is also performed, and the impact on system mean time between failures (MTBF) is addressed by varying other reliability parameters. Based on this analysis, some influential suggestions are given to improve the system performance.
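    The lambda-tau formulation itself is not reproduced in the abstract. As a toy illustration of the fuzzification step only, a triangular fuzzy failure rate can be propagated through the exponential reliability model R(t) = exp(-λt) using alpha-cuts; the numbers below are invented, not the paper's data.

    ```python
    import math

    def alpha_cut(tfn, alpha):
        """Interval of a triangular fuzzy number (l, m, u) at membership level alpha."""
        l, m, u = tfn
        return (l + alpha * (m - l), u - alpha * (u - m))

    def fuzzy_reliability(tfn_lambda, t, alpha):
        """Alpha-cut of R(t) = exp(-lambda * t) for a fuzzy failure rate.
        R is decreasing in lambda, so the interval endpoints swap."""
        lo, hi = alpha_cut(tfn_lambda, alpha)
        return (math.exp(-hi * t), math.exp(-lo * t))
    ```

    At alpha = 1 the interval collapses to the crisp value exp(-m t); at alpha = 0 it spans the full spread, which is the "wide range of predictions" the abstract refers to.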

  15. Deformations of thick two-material cylinder under axially varying radial pressure

    NASA Technical Reports Server (NTRS)

    Patel, Y. A.

    1976-01-01

    Stresses and deformations in a thick, short composite cylinder subjected to axially varying radial pressure are studied. The effect of slippage at the interface is examined. In the NASTRAN finite element model, the multipoint constraint feature is utilized. Results are compared with theoretical analysis and the SAP-IV computer code. Results from the NASTRAN computer code are in good agreement with the analytical solutions. Results suggest a considerable influence of interfacial slippage on the axial bending stresses in the cylinder.

  16. Computer-aided classification of lung nodules on computed tomography images via deep learning technique

    PubMed Central

    Hua, Kai-Lung; Hsu, Che-Hao; Hidayati, Shintami Chusnul; Cheng, Wen-Huang; Chen, Yu-Jen

    2015-01-01

    Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scan is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning of classification performance in a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and performance tuning in a seamless fashion. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods could achieve better discriminative results and hold promise in the CAD application domain. PMID:26346558

  17. Computer-aided classification of lung nodules on computed tomography images via deep learning technique.

    PubMed

    Hua, Kai-Lung; Hsu, Che-Hao; Hidayati, Shintami Chusnul; Cheng, Wen-Huang; Chen, Yu-Jen

    2015-01-01

    Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scan is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning of classification performance in a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and performance tuning in a seamless fashion. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods could achieve better discriminative results and hold promise in the CAD application domain.

  18. Principles of Experimental Design for Big Data Analysis.

    PubMed

    Drovandi, Christopher C; Holmes, Christopher; McGree, James M; Mengersen, Kerrie; Richardson, Sylvia; Ryan, Elizabeth G

    2017-08-01

    Big Datasets are endemic, but are often notoriously difficult to analyse because of their size, heterogeneity and quality. The purpose of this paper is to open a discourse on the potential for modern decision theoretic optimal experimental design methods, which by their very nature have traditionally been applied prospectively, to improve the analysis of Big Data through retrospective designed sampling in order to answer particular questions of interest. By appealing to a range of examples, it is suggested that this perspective on Big Data modelling and analysis has the potential for wide generality and advantageous inferential and computational properties. We highlight current hurdles and open research questions surrounding efficient computational optimisation in using retrospective designs, and in part this paper is a call to the optimisation and experimental design communities to work together in the field of Big Data analysis.

  19. Principles of Experimental Design for Big Data Analysis

    PubMed Central

    Drovandi, Christopher C; Holmes, Christopher; McGree, James M; Mengersen, Kerrie; Richardson, Sylvia; Ryan, Elizabeth G

    2016-01-01

    Big Datasets are endemic, but are often notoriously difficult to analyse because of their size, heterogeneity and quality. The purpose of this paper is to open a discourse on the potential for modern decision theoretic optimal experimental design methods, which by their very nature have traditionally been applied prospectively, to improve the analysis of Big Data through retrospective designed sampling in order to answer particular questions of interest. By appealing to a range of examples, it is suggested that this perspective on Big Data modelling and analysis has the potential for wide generality and advantageous inferential and computational properties. We highlight current hurdles and open research questions surrounding efficient computational optimisation in using retrospective designs, and in part this paper is a call to the optimisation and experimental design communities to work together in the field of Big Data analysis. PMID:28883686

  20. Parameter estimation and sensitivity analysis in an agent-based model of Leishmania major infection

    PubMed Central

    Jones, Douglas E.; Dorman, Karin S.

    2009-01-01

    Computer models of disease take a systems biology approach toward understanding host-pathogen interactions. In particular, data driven computer model calibration is the basis for inference of immunological and pathogen parameters, assessment of model validity, and comparison between alternative models of immune or pathogen behavior. In this paper we describe the calibration and analysis of an agent-based model of Leishmania major infection. A model of macrophage loss following uptake of necrotic tissue is proposed to explain macrophage depletion following peak infection. Using Gaussian processes to approximate the computer code, we perform a sensitivity analysis to identify important parameters and to characterize their influence on the simulated infection. The analysis indicates that increasing growth rate can favor or suppress pathogen loads, depending on the infection stage and the pathogen’s ability to avoid detection. Subsequent calibration of the model against previously published biological observations suggests that L. major has a relatively slow growth rate and can replicate for an extended period of time before damaging the host cell. PMID:19837088

  1. Computational Aeroacoustic Analysis of Slat Trailing-Edge Flow

    NASA Technical Reports Server (NTRS)

    Singer, Bart A.; Lockard, David P.; Brentner, Kenneth S.; Khorrami, Mehdi R.; Berkman, Mert E.; Choudhari, Meelan

    2000-01-01

    An acoustic analysis based on the Ffowcs Williams and Hawkings equation was performed for a high-lift system. As input, the acoustic analysis used unsteady flow data obtained from a highly resolved, time-dependent, Reynolds-averaged Navier-Stokes calculation. The analysis strongly suggests that vortex shedding from the trailing edge of the slat results in a high-amplitude, high-frequency acoustic signal, similar to that which was observed in a corresponding experimental study of the high-lift system.

  2. Organic chemistry as a language and the implications of chemical linguistics for structural and retrosynthetic analyses.

    PubMed

    Cadeddu, Andrea; Wylie, Elizabeth K; Jurczak, Janusz; Wampler-Doty, Matthew; Grzybowski, Bartosz A

    2014-07-28

    Methods of computational linguistics are used to demonstrate that a natural language such as English and organic chemistry have the same structure in terms of the frequency of, respectively, text fragments and molecular fragments. This quantitative correspondence suggests that it is possible to extend the methods of computational corpus linguistics to the analysis of organic molecules. It is shown that within organic molecules bonds that have highest information content are the ones that 1) define repeat/symmetry subunits and 2) in asymmetric molecules, define the loci of potential retrosynthetic disconnections. Linguistics-based analysis appears well-suited to the analysis of complex structural and reactivity patterns within organic molecules. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
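    The claim above rests on rank-frequency statistics of fragments. Purely as an illustration of the counting step (shown here on text tokens rather than molecular fragments), one can tabulate overlapping n-gram frequencies and inspect the resulting rank-frequency curve; the tokenization is an invented example.

    ```python
    from collections import Counter

    def fragment_frequencies(tokens, n=2):
        """Count overlapping n-token fragments and return them ranked by
        frequency, the raw material for a Zipf-style rank-frequency plot."""
        frags = [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]
        return Counter(frags).most_common()
    ```

    The same counter applied to bond-delimited molecular fragments (instead of word tokens) is the kind of statistic the paper compares against natural-language corpora.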

  3. Preliminary Evaluation of MapReduce for High-Performance Climate Data Analysis

    NASA Technical Reports Server (NTRS)

    Duffy, Daniel Q.; Schnase, John L.; Thompson, John H.; Freeman, Shawn M.; Clune, Thomas L.

    2012-01-01

    MapReduce is an approach to high-performance analytics that may be useful to data intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. We are particularly interested in the potential of MapReduce to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we are prototyping a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. Our initial focus has been on averaging operations over arbitrary spatial and temporal extents within Modern-Era Retrospective Analysis for Research and Applications (MERRA) data. Preliminary results suggest this approach can improve efficiencies within data intensive analytic workflows.
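    The canonical averaging operation mentioned above has a simple MapReduce shape (this is a generic sketch, not NASA's MERRA code): map emits (cell, (sum, count)) pairs and reduce combines them, so averages over arbitrary extents never require shipping raw values between nodes.

    ```python
    from collections import defaultdict

    def map_phase(records):
        """Each record is (cell_key, value); emit partial (sum, count) pairs."""
        for key, value in records:
            yield key, (value, 1)

    def reduce_phase(mapped):
        """Combine partial sums per key; because (sum, count) addition is
        associative, the same code works as a combiner on each node."""
        acc = defaultdict(lambda: (0.0, 0))
        for key, (s, c) in mapped:
            ts, tc = acc[key]
            acc[key] = (ts + s, tc + c)
        return {key: s / c for key, (s, c) in acc.items()}
    ```

    Keying records by (region, time-bin) instead of a single cell id gives averages over arbitrary spatial and temporal extents with the same two phases.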

  4. Thermal radiation view factor: Methods, accuracy and computer-aided procedures

    NASA Technical Reports Server (NTRS)

    Kadaba, P. V.

    1982-01-01

    The computer-aided thermal analysis programs that predict whether orbiting equipment will remain within a predetermined acceptable temperature range, in various attitudes with respect to the Sun and the Earth, are examined. The complexity of the surface geometries suggests the use of numerical schemes for the determination of these view factors. Basic definitions and the standard methods that form the basis for various digital computer methods, as well as various numerical methods, are presented. The physical model and the mathematical methods on which a number of available programs are built are summarized. The strengths and the weaknesses of the methods employed, the accuracy of the calculations, and the time required for computations are evaluated. The situations where accuracies are important for energy calculations are identified, and methods to save computational time are proposed. A guide to the best use of the available programs at several centers and future choices for efficient use of digital computers are included in the recommendations.
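    For context on the standard methods the survey covers: one of the few exact results is the closed-form view factor between two coaxial parallel disks (radii r1, r2, separation h), F12 = (1/2)[S - sqrt(S^2 - 4(R2/R1)^2)] with Ri = ri/h and S = 1 + (1 + R2^2)/R1^2, a classic formula from radiation heat transfer texts. A direct transcription:

    ```python
    import math

    def viewfactor_coaxial_disks(r1, r2, h):
        """View factor from disk 1 to a coaxial parallel disk 2 at
        separation h (standard closed-form radiation result)."""
        R1, R2 = r1 / h, r2 / h
        S = 1.0 + (1.0 + R2 ** 2) / R1 ** 2
        return 0.5 * (S - math.sqrt(S ** 2 - 4.0 * (R2 / R1) ** 2))
    ```

    Exact cases like this are what numerical view-factor schemes are validated against: the result tends to 1 as the disks approach, to 0 as they separate, and satisfies the reciprocity relation A1·F12 = A2·F21.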

  5. Theoretical and experimental study of polycyclic aromatic compounds as β-tubulin inhibitors.

    PubMed

    Olazarán, Fabian E; García-Pérez, Carlos A; Bandyopadhyay, Debasish; Balderas-Rentería, Isaias; Reyes-Figueroa, Angel D; Henschke, Lars; Rivera, Gildardo

    2017-03-01

    In this work, through a docking analysis of compounds from the ZINC chemical library on human β-tubulin using a high-performance computer cluster, we report new polycyclic aromatic compounds that bind with high energy on the colchicine binding site of β-tubulin, suggesting three new key amino acids. However, molecular dynamics analysis showed low stability in the interaction between ligand and receptor. Results were confirmed experimentally in in vitro and in vivo models, suggesting that molecular dynamics simulation is the best option to find new potential β-tubulin inhibitors. Graphical abstract: Bennett's acceptance ratio (BAR) method.

  6. Retention in a Computer-based Outreach Intervention For Chronically Ill Rural Women

    PubMed Central

    Weinert, Clarann; Cudney, Shirley; Hill, Wade G.

    2009-01-01

    The study's purpose was to examine retention factors in a computer intervention with 158 chronically ill rural women. After a 22-week intervention, 18.9 percent of the women had dropped out. A Cox regression survival analysis was performed to assess the effects of selected covariates on retention. Reasons for dropping were tallied and categorized. Major reasons for dropping were lack of time, decline in health status, and non-participation in study activities. Four covariates predicted survival time: level of computer skills, marital status, work outside of home, and impact of social events on participants' lives. Retention-enhancing strategies are suggested for implementation. PMID:18226760

  7. Numerical information processing under the global rule expressed by the Euler-Riemann ζ function defined in the complex plane

    NASA Astrophysics Data System (ADS)

    Chatelin, Françoise

    2010-09-01

    When nonzero, the ζ function is intimately connected with numerical information processing. Two other functions play a key role, namely, η(s) = Σn≥1 (-1)^(n+1)/n^s and λ(s) = Σn≥0 1/(2n+1)^s. The paper opens on a survey of some of the seminal work of Euler [Mémoires Acad. Sci., Berlin 1768, 83 (1749)] and of the amazing theorem by Voronin [Math. USSR, Izv. 9, 443 (1975)]. Then, as a follow-up of Chatelin [Qualitative Computing. A Computational Journey into Nonlinearity (World Scientific, Singapore, in press)], we present a fresh look at the triple (η, ζ, λ), which suggests an elementary analysis based on the distances of the three complex numbers z, z/2, and 2/z to 0 and 1. This metric approach is used to contextualize any nonlinear computation when it is observed at a point describing a complex plane. The results applied to ζ, η, and λ shed a new epistemological light on the critical line. The suggested interpretation related to ζ carries computational significance.
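    The two auxiliary series relate to ζ through the standard identities η(s) = (1 - 2^(1-s))·ζ(s) and λ(s) = (1 - 2^(-s))·ζ(s), so the triple is pinned down by ζ alone. A quick numerical check at s = 2, where ζ(2) = π²/6, η(2) = π²/12, and λ(2) = π²/8:

    ```python
    import math

    def eta(s, terms=200000):
        """Partial sum of the alternating series eta(s) = sum (-1)^(n+1) / n^s."""
        return sum((-1) ** (n + 1) / n ** s for n in range(1, terms + 1))

    def lam(s, terms=200000):
        """Partial sum of lambda(s) = sum over n >= 0 of 1 / (2n+1)^s."""
        return sum(1.0 / (2 * n + 1) ** s for n in range(terms))

    # From the identities: zeta(2) can be recovered as eta(2) / (1 - 2**(1-2)).
    ```

    Partial sums suffice here because the alternating η series has error bounded by its first omitted term, while the λ tail decays like 1/(4N).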

  8. Second-order shaped pulses for solid-state quantum computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Pinaki

    2008-01-01

    We present the construction and detailed analysis of highly optimized self-refocusing pulse shapes for several rotation angles. We characterize the constructed pulses by the coefficients appearing in the Magnus expansion up to second order. This allows a semianalytical analysis of the performance of the constructed shapes in sequences and composite pulses by computing the corresponding leading-order error operators. Higher orders can be analyzed with the numerical technique suggested by us previously. We illustrate the technique by analyzing several composite pulses designed to protect against pulse amplitude errors, and on decoupling sequences for potentially long chains of qubits with on-site and nearest-neighbor couplings.

  9. An Expert Assistant for Computer Aided Parallelization

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Chun, Robert; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    A prototype expert system was developed to assist the user in the computer-aided parallelization process. The system interfaces to tools for automatic parallelization and performance analysis. By fusing static program structure information and dynamic performance analysis data, the expert system can help the user to filter, correlate, and interpret the data gathered by the existing tools. Sections of the code that show poor performance and require further attention are rapidly identified, and suggestions for improvements are presented to the user. In this paper we describe the components of the expert system and discuss its interface to the existing tools. We present a case study to demonstrate its successful use in full-scale scientific applications.

  10. Phantom bite: a real or a phantom diagnosis? A case report.

    PubMed

    Sutter, Ben A

    2017-01-01

    This case report describes computer-guided occlusal therapy in a patient who met the unified diagnostic criteria for phantom bite. After a review of the patient's medical history, along with a diagnostic work-up that included cone beam computed tomography, temporomandibular joint vibration analysis, and digital occlusal analysis, problematic dental components were discovered (including prolonged disclusion time and imbalanced bite force). A digital occlusal analyzer evaluated the patient's occlusion and systematically guided the necessary changes. After reduction of the disclusion time and correction of the occlusal force imbalance, the patient reported significant improvement in comfort. The results suggest that phantom bite could be an abnormal occlusal condition and not a psychological or neurologic somatoform disorder.

  11. Flexion-relaxation ratio in computer workers with and without chronic neck pain.

    PubMed

    Pinheiro, Carina Ferreira; dos Santos, Marina Foresti; Chaves, Thais Cristina

    2016-02-01

    This study evaluated the flexion-relaxation phenomenon (FRP) and flexion-relaxation ratios (FR-ratios) using surface electromyography (sEMG) of the cervical extensor muscles of computer workers with and without chronic neck pain, as well as of healthy subjects who were not computer users. This study comprised 60 subjects 20-45 years of age, of which 20 were computer workers with chronic neck pain (CPG), 20 were computer workers without neck pain (NPG), and 20 were control individuals who do not use computers for work and use them less than 4 h/day for other purposes (CG). FRP and FR-ratios were analyzed using sEMG of the cervical extensors. Analysis of FR-ratios showed smaller values in the semispinalis capitis muscles of the two groups of workers compared to the control group. The reference FR-ratio (flexion relaxation ratio [FRR], defined as the maximum activity in 1 s of the re-extension/full flexion sEMG activity) was significantly higher in the computer workers with neck pain compared to the CG (CPG: 3.10, 95% confidence interval [CI95%] 2.50-3.70; NPG: 2.33, CI95% 1.93-2.74; CG: 1.99, CI95% 1.81-2.17; p<0.001). The FR-ratios and FRR of sEMG in this study suggested that computer use could increase recruitment of the semispinalis capitis during neck extension (concentric and eccentric phases), which could explain our results. These results also suggest that the FR-ratios of the semispinalis may be a potential functional predictive neuromuscular marker of asymptomatic neck musculoskeletal disorders, since even asymptomatic computer workers showed altered values. On the other hand, the FRR values of the semispinalis capitis demonstrated a good discriminative ability to detect neck pain, and such results suggest that each FR-ratio could have a different application. Copyright © 2016 Elsevier Ltd. All rights reserved.
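    Using the FRR definition quoted above (maximum activity in any 1-s window of re-extension divided by full-flexion activity), the computation can be sketched as follows. Mean rectified amplitude is used as the activity measure here; the paper's exact amplitude processing (e.g. RMS versus averaged rectified EMG) is an assumption, as are all names and indices.

    ```python
    def mean_rectified(samples):
        """Mean rectified amplitude of a slice of sEMG samples."""
        return sum(abs(s) for s in samples) / len(samples)

    def flexion_relaxation_ratio(emg, fs, flexion, reextension):
        """emg: list of sEMG samples; fs: sampling rate in Hz;
        flexion / reextension: (start, end) sample indices of each phase.
        FRR = max activity over any 1-s re-extension window
              / full-flexion activity."""
        win = int(fs)  # samples per 1-s window
        r0, r1 = reextension
        peaks = [mean_rectified(emg[i:i + win])
                 for i in range(r0, r1 - win + 1)]
        f0, f1 = flexion
        return max(peaks) / mean_rectified(emg[f0:f1])
    ```

    On a synthetic signal whose re-extension amplitude is three times the flexion amplitude, the function returns 3.0, matching the intuition behind the group differences reported above.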

  12. Computational neurobiology is a useful tool in translational neurology: the example of ataxia

    PubMed Central

    Brown, Sherry-Ann; McCullough, Louise D.; Loew, Leslie M.

    2014-01-01

    Hereditary ataxia, or motor incoordination, affects approximately 150,000 Americans and hundreds of thousands of individuals worldwide, with onset from as early as mid-childhood. Affected individuals exhibit dysarthria, dysmetria, action tremor, and dysdiadochokinesia. In this review, we consider an array of computational studies derived from experimental observations relevant to human neuropathology. A survey of related studies illustrates the impact of integrating clinical evidence with data from mouse models and computational simulations. Results from these studies may help explain findings in mice, and after extensive laboratory study, may ultimately be translated to ataxic individuals. This inquiry lays a foundation for using computation to understand neurobiochemical and electrophysiological pathophysiology of spinocerebellar ataxias and may contribute to development of therapeutics. The interdisciplinary analysis suggests that computational neurobiology can be an important tool for translational neurology. PMID:25653585

  13. Identification and addressing reduction-related misconceptions

    NASA Astrophysics Data System (ADS)

    Gal-Ezer, Judith; Trakhtenbrot, Mark

    2016-07-01

    Reduction is one of the key techniques used for problem-solving in computer science. In particular, in the theory of computation and complexity (TCC), mapping and polynomial reductions are used for analysis of decidability and computational complexity of problems, including the core concept of NP-completeness. Reduction is a highly abstract technique that involves revealing close non-trivial connections between problems that often seem to have nothing in common. As a result, proper understanding and application of reduction is a serious challenge for students and a source of numerous misconceptions. The main contribution of this paper is detection of such misconceptions, analysis of their roots, and proposing a way to address them in an undergraduate TCC course. Our observations suggest that the main source of the misconceptions is the false intuitive rule "the bigger is a set/problem, the harder it is to solve". Accordingly, we developed a series of exercises for proactive prevention of these misconceptions.

  14. Exploring the use of optical flow for the study of functional NIRS signals

    NASA Astrophysics Data System (ADS)

    Fernandez Rojas, Raul; Huang, Xu; Ou, Keng-Liang; Hernandez-Juarez, Jesus

    2017-03-01

    Near infrared spectroscopy (NIRS) is an optical imaging technique that allows real-time measurements of Oxy- and Deoxy-hemoglobin concentrations in human body tissue. In functional NIRS (fNIRS), this technique is used to study cortical activation in response to changes in neural activity. However, analysis of activation regions using NIRS is a challenging task in the field of medical image analysis and, despite existing solutions, no homogeneous analysis method has yet been determined. For that reason, the aim of the present study is to report the use of an optical flow method for the analysis of cortical activation using near-infrared spectroscopy signals. We used real fNIRS data recorded from a noxious stimulation experiment as the basis of our implementation. To compute the optical flow algorithm, we first arranged the NIRS signals (Oxy-hemoglobin) following our 24-channel (12 channels per hemisphere) head-probe configuration to create image-like samples. We then used two consecutive fNIRS samples per hemisphere as input frames for the optical flow algorithm, making one computation per hemisphere. The output of these two computations is the velocity field representing cortical activation in each hemisphere. The experimental results showed that the radial structure of the flow vectors revealed the origin of cortical activity, that the development of stimulation appeared as expansion or contraction of these flow vectors, and that the flow of activation patterns may allow prediction of cortical activity. The present study demonstrates that optical flow provides a powerful tool for the analysis of NIRS signals. Finally, we suggest a novel idea for identifying pain status in nonverbal patients by using optical flow motion vectors; this idea will be studied further in our future research.
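
The pipeline described above (channel grid, image-like frames, optical flow between consecutive samples) can be sketched with a single-window Lucas-Kanade estimate; the 16x16 grid and the synthetic "activation bump" below are illustrative assumptions, not the authors' data or implementation:

```python
import numpy as np

def global_flow(frame1, frame2):
    """Single-window Lucas-Kanade: least-squares solution of the
    optical-flow constraint Ix*vx + Iy*vy + It = 0 pooled over the
    whole frame, giving one velocity vector per frame pair."""
    i1 = frame1.astype(float)
    Iy, Ix = np.gradient(i1)                      # spatial gradients
    It = frame2.astype(float) - i1                # temporal difference
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    v, *_ = np.linalg.lstsq(A, -It.ravel(), rcond=None)
    return v                                      # estimated (vx, vy)

# Two synthetic image-like samples: a smooth activation bump on a
# 16x16 grid (a stand-in for an interpolated channel layout) that
# drifts 0.3 grid units along x between consecutive samples.
yy, xx = np.mgrid[0:16, 0:16]

def bump(cx):
    return np.exp(-((xx - cx) ** 2 + (yy - 8.0) ** 2) / 10.0)

vx, vy = global_flow(bump(7.0), bump(7.3))        # vx comes out near +0.3
```

A per-pixel velocity field, as in the paper, would repeat the same least-squares solve in small local windows rather than over the whole frame.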

  15. Combination of Thin Lenses--A Computer Oriented Method.

    ERIC Educational Resources Information Center

    Flerackers, E. L. M.; And Others

    1984-01-01

    Suggests a method treating geometric optics using a microcomputer to do the calculations of image formation. Calculations are based on the connection between the composition of lenses and the mathematics of fractional linear equations. Logic of the analysis and an example problem are included. (JM)

  16. Does albendazole affect seizure remission and computed tomography response in children with neurocysticercosis? A Systematic review and meta-analysis.

    PubMed

    Mazumdar, Maitreyi; Pandharipande, Pari; Poduri, Annapurna

    2007-02-01

    A recent trial suggested that albendazole reduces seizures in adults with neurocysticercosis. There is still no consensus regarding optimal management of neurocysticercosis in children. The authors conducted a systematic review and meta-analysis to assess the efficacy of albendazole in children with neurocysticercosis, by searching the Cochrane Databases, MEDLINE, EMBASE, and LILACS. Three reviewers extracted data using an intent-to-treat analysis. Random effects models were used to estimate relative risks. Four randomized trials were selected for meta-analysis, and 10 observational studies were selected for qualitative review. The relative risk of seizure remission in treatment versus control was 1.26 (1.09, 1.46). The relative risk of improvement in computed tomography in these trials was 1.15 (0.97, 1.36). Review of observational studies showed conflicting results, likely owing to preferential administration of albendazole to sicker children.
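
Random-effects pooling of relative risks, as used in this meta-analysis, can be sketched with the DerSimonian-Laird estimator; the event counts below are invented for illustration and are not the trial data:

```python
import numpy as np

def dersimonian_laird_rr(events):
    """Random-effects pooled relative risk (DerSimonian-Laird).
    events: list of (a, n1, c, n2) = events/total in treatment
    and control arms. Returns (RR, lower 95% CI, upper 95% CI)."""
    a, n1, c, n2 = (np.array(col, dtype=float) for col in zip(*events))
    y = np.log((a / n1) / (c / n2))                 # study log relative risks
    v = 1 / a - 1 / n1 + 1 / c - 1 / n2             # their sampling variances
    w = 1 / v                                       # fixed-effect weights
    q = np.sum(w * (y - np.sum(w * y) / np.sum(w)) ** 2)
    tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    ws = 1 / (v + tau2)                             # random-effects weights
    mu = np.sum(ws * y) / np.sum(ws)
    se = 1 / np.sqrt(np.sum(ws))
    return np.exp(mu), np.exp(mu - 1.96 * se), np.exp(mu + 1.96 * se)

# Four hypothetical trials (treated events/total, control events/total):
rr, lo, hi = dersimonian_laird_rr([(30, 50, 22, 48), (25, 40, 20, 42),
                                   (18, 35, 15, 36), (28, 45, 21, 44)])
```

With these made-up counts the pooled RR lands near 1.3, the same order as the seizure-remission estimate quoted above; when between-study heterogeneity Q is small, tau2 collapses to zero and the result matches a fixed-effect analysis.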

  17. Manual vs. computer-assisted sperm analysis: can CASA replace manual assessment of human semen in clinical practice?

    PubMed

    Talarczyk-Desole, Joanna; Berger, Anna; Taszarek-Hauke, Grażyna; Hauke, Jan; Pawelczyk, Leszek; Jedrzejczak, Piotr

    2017-01-01

    The aim of the study was to assess the quality of a computer-assisted sperm analysis (CASA) system in comparison to the reference manual method, as well as the standardization of computer-assisted semen assessment. The study was conducted between January and June 2015 at the Andrology Laboratory of the Division of Infertility and Reproductive Endocrinology, Poznań University of Medical Sciences, Poland. The study group consisted of 230 men who gave sperm samples for the first time in our center as part of an infertility investigation. The samples underwent manual and computer-assisted assessment of concentration, motility and morphology. A total of 184 samples were examined twice: manually, according to the 2010 WHO recommendations, and with CASA, using the program settings provided by the manufacturer. Additionally, 46 samples underwent two manual analyses and two computer-assisted analyses. A p-value of p < 0.05 was considered statistically significant. Statistically significant differences were found between all of the investigated sperm parameters, except for non-progressive motility, measured with CASA and manually. In the group of patients where all analyses were performed twice on the same sample with each method, we found no significant differences between the two assessments of the same sample, whether analyzed manually or with CASA, although the standard deviation was higher in the CASA group. Our results suggest that computer-assisted sperm analysis requires further improvement for a wider application in clinical practice.

  18. Observations Regarding Use of Advanced CFD Analysis, Sensitivity Analysis, and Design Codes in MDO

    NASA Technical Reports Server (NTRS)

    Newman, Perry A.; Hou, Gene J. W.; Taylor, Arthur C., III

    1996-01-01

    Observations regarding the use of advanced computational fluid dynamics (CFD) analysis, sensitivity analysis (SA), and design codes in gradient-based multidisciplinary design optimization (MDO) reflect our perception of the interactions required of CFD and our experience in recent aerodynamic design optimization studies using CFD. Sample results from these latter studies are summarized for conventional optimization (analysis - SA codes) and simultaneous analysis and design optimization (design code) using both Euler and Navier-Stokes flow approximations. The amount of computational resources required for aerodynamic design using CFD via analysis - SA codes is greater than that required for design codes. Thus, an MDO formulation that utilizes the more efficient design codes where possible is desired. However, in the aerovehicle MDO problem, the various disciplines that are involved have different design points in the flight envelope; therefore, CFD analysis - SA codes are required at the aerodynamic 'off design' points. The suggested MDO formulation is a hybrid multilevel optimization procedure that consists of both multipoint CFD analysis - SA codes and multipoint CFD design codes that perform suboptimizations.

  19. Interfacing Email Tutoring: Shaping an Emergent Literate Practice.

    ERIC Educational Resources Information Center

    Anderson, Dana

    2002-01-01

    Presents a descriptive analysis of 29 online writing lab sites for email tutoring, currently the most popular mode of computer-mediated collaboration. Considers how email tutoring interfaces represent the literate practice of email tutoring, shaping expectations and experiences consistent with its literate aims. Suggests that email tutoring…

  20. Effect of Premolar Axial Wall Height on Computer-Aided Design/Computer-Assisted Manufacture Crown Retention.

    PubMed

    Martin, Curt; Harris, Ashley; DuVall, Nicholas; Wajdowicz, Michael; Roberts, Howard Wayne

    2018-03-28

    To evaluate the effect of premolar axial wall height on the retention of adhesive, full-coverage, computer-aided design/computer-assisted manufacture (CAD/CAM) restorations. A total of 48 premolar teeth randomized into four groups (n = 12 per group) received all-ceramic CAD/CAM restorations with axial wall heights (AWH) of 3, 2, 1, and 0 mm and 16-degree total occlusal convergence (TOC). Specimens were restored with lithium disilicate material and cemented with self-adhesive resin cement. Specimens were loaded to failure after 24 hours. The 3- and 2-mm AWH specimens demonstrated significantly greater failure load. Failure analysis suggests a 2-mm minimum AWH for premolars with a TOC of 16 degrees. Adhesive technology may compensate for compromised AWH.

  1. Approach to recognition of flexible form for credit card expiration date recognition as example

    NASA Astrophysics Data System (ADS)

    Sheshkus, Alexander; Nikolaev, Dmitry P.; Ingacheva, Anastasia; Skoryukina, Natalya

    2015-12-01

    In this paper we consider the task of finding information fields within a document with a flexible form, using the credit card expiration date field as an example. We discuss the main difficulties and suggest possible solutions. In our case this task has to be solved on mobile devices; therefore, the computational complexity has to be as low as possible. In this paper we provide results of the analysis of the suggested algorithm. The error distribution of the recognition system shows that the suggested algorithm solves the task with the required accuracy.

  2. Comparative Evaluation of a Four-Implant-Supported Polyetherketoneketone Framework Prosthesis: A Three-Dimensional Finite Element Analysis Based on Cone Beam Computed Tomography and Computer-Aided Design.

    PubMed

    Lee, Ki-Sun; Shin, Sang-Wan; Lee, Sang-Pyo; Kim, Jong-Eun; Kim, Jee-Hwan; Lee, Jeong-Yol

    The purpose of this pilot study was to evaluate and compare polyetherketoneketone (PEKK) with different framework materials for implant-supported prostheses by means of a three-dimensional finite element analysis (3D-FEA) based on cone beam computed tomography (CBCT) and computer-aided design (CAD) data. A geometric model that consisted of four maxillary implants supporting a prosthesis framework was constructed from CBCT and CAD data of a treated patient. Three different materials (zirconia, titanium, and PEKK) were selected, and their material properties were simulated using FEA software in the generated geometric model. In the PEKK framework (ie, low elastic modulus) group, the stress transferred to the implant and simulated adjacent tissue was reduced when compressive stress was dominant, but increased when tensile stress was dominant. This study suggests that the shock-absorbing effects of a resilient implant-supported framework are limited in some areas and that rigid framework material shows a favorable stress distribution and safety of overall components of the prosthesis.

  3. Natural language processing, pragmatics, and verbal behavior

    PubMed Central

    Cherpas, Chris

    1992-01-01

    Natural Language Processing (NLP) is that part of Artificial Intelligence (AI) concerned with endowing computers with verbal and listener repertoires, so that people can interact with them more easily. Most attention has been given to accurately parsing and generating syntactic structures, although NLP researchers are finding ways of handling the semantic content of language as well. It is increasingly apparent that understanding the pragmatic (contextual and consequential) dimension of natural language is critical for producing effective NLP systems. While there are some techniques for applying pragmatics in computer systems, they are piecemeal, crude, and lack an integrated theoretical foundation. Unfortunately, there is little awareness that Skinner's (1957) Verbal Behavior provides an extensive, principled pragmatic analysis of language. The implications of Skinner's functional analysis for NLP and for verbal aspects of epistemology lead to a proposal for a "user expert"—a computer system whose area of expertise is the long-term computer user. The evolutionary nature of behavior suggests an AI technology known as genetic algorithms/programming for implementing such a system. PMID:22477052

  4. Computer-aided design of the human aortic root.

    PubMed

    Ovcharenko, E A; Klyshnikov, K U; Vlad, A R; Sizova, I N; Kokov, A N; Nushtaev, D V; Yuzhalin, A E; Zhuravleva, I U

    2014-11-01

    The development of computer-based 3D models of the aortic root is one of the most important problems in constructing the prostheses for transcatheter aortic valve implantation. In the current study, we analyzed data from 117 patients with and without aortic valve disease and computed tomography data from 20 patients without aortic valvular diseases in order to estimate the average values of the diameter of the aortic annulus and other aortic root parameters. Based on these data, we developed a 3D model of human aortic root with unique geometry. Furthermore, in this study we show that by applying different material properties to the aortic annulus zone in our model, we can significantly improve the quality of the results of finite element analysis. To summarize, here we present four 3D models of human aortic root with unique geometry based on computational analysis of ECHO and CT data. We suggest that our models can be utilized for the development of better prostheses for transcatheter aortic valve implantation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. A review on recent contribution of meshfree methods to structure and fracture mechanics applications.

    PubMed

    Daxini, S D; Prajapati, J M

    2014-01-01

    Meshfree methods are viewed as next-generation computational techniques. With evident limitations of conventional grid-based methods, like FEM, in dealing with problems of fracture mechanics, large deformation, and simulation of manufacturing processes, meshfree methods have gained much attention from researchers. A number of meshfree methods have been proposed to date for analyzing complex problems in various fields of engineering. The present work attempts to review recent developments and some earlier applications of well-known meshfree methods like EFG and MLPG to various types of structural mechanics and fracture mechanics applications like bending, buckling, free vibration analysis, sensitivity analysis and topology optimization, single and mixed mode crack problems, fatigue crack growth, and dynamic crack analysis, and some typical applications like vibration of cracked structures, thermoelastic crack problems, and failure transition in impact problems. Due to the complex nature of meshfree shape functions and the evaluation of domain integrals, meshless methods are computationally expensive compared with conventional mesh-based methods. Some improved versions of the original meshfree methods and other techniques suggested by researchers to improve the computational efficiency of meshfree methods are also reviewed here.

  6. Computing tools for implementing standards for single-case designs.

    PubMed

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook (the WWC standards). These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/Macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were found to be accurate. When missing data were present, most tools either gave an error message or conducted the analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.
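
As one concrete example of the kind of SCD analysis such tools implement, the Nonoverlap of All Pairs (NAP) index can be computed in a few lines; treating None as missing and dropping the affected observations mirrors the available-data behaviour most tools showed (the data are hypothetical):

```python
def nap(baseline, treatment):
    """Nonoverlap of All Pairs: the share of (baseline, treatment)
    pairs in which the treatment-phase point exceeds the baseline
    point (ties count 0.5). None marks a missing observation and is
    dropped, i.e. an available-data analysis rather than imputation."""
    a = [x for x in baseline if x is not None]
    b = [x for x in treatment if x is not None]
    pairs = [(x, y) for x in a for y in b]
    score = sum(1.0 if y > x else 0.5 if y == x else 0.0 for x, y in pairs)
    return score / len(pairs)

print(nap([2, 3, None, 3], [5, 6, 7]))  # 1.0: every treatment point beats baseline
```

NAP is only one of several effect sizes discussed in the SCD literature; a WWC-aligned tool would report it alongside visual-analysis summaries and other indices.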

  7. The semantic system is involved in mathematical problem solving.

    PubMed

    Zhou, Xinlin; Li, Mengyi; Li, Leinian; Zhang, Yiyun; Cui, Jiaxin; Liu, Jie; Chen, Chuansheng

    2018-02-01

    Numerous studies have shown that the brain regions around bilateral intraparietal cortex are critical for number processing and arithmetical computation. However, the neural circuits for more advanced mathematics such as mathematical problem solving (with little routine arithmetical computation) remain unclear. Using functional magnetic resonance imaging (fMRI), this study (N = 24 undergraduate students) compared neural bases of mathematical problem solving (i.e., number series completion, mathematical word problem solving, and geometric problem solving) and arithmetical computation. Direct subject- and item-wise comparisons revealed that mathematical problem solving typically had greater activation than arithmetical computation in all 7 regions of the semantic system (which was based on a meta-analysis of 120 functional neuroimaging studies on semantic processing). Arithmetical computation typically had greater activation in the supplementary motor area and left precentral gyrus. The results suggest that the semantic system in the brain supports mathematical problem solving. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Computational modelling of cellular level metabolism

    NASA Astrophysics Data System (ADS)

    Calvetti, D.; Heino, J.; Somersalo, E.

    2008-07-01

    The steady and stationary state inverse problems consist of estimating the reaction and transport fluxes, blood concentrations, and possibly the rates of change of some of the concentrations based on data which are often scarce, noisy, and sampled over a population. The Bayesian framework provides a natural setting for the solution of this inverse problem, because a priori knowledge about the system itself and the unknown reaction fluxes and transport rates can compensate for the insufficiency of measured data, provided that the computational costs do not become prohibitive. This article identifies the computational challenges which have to be met when analyzing the steady and stationary states of a multicompartment model for cellular metabolism and suggests stable and efficient ways to handle the computations. The outline of a computational tool based on the Bayesian paradigm for the simulation and analysis of complex cellular metabolic systems is also presented.
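
In the linear-Gaussian special case, the Bayesian compensation for scarce data reduces to a regularized least-squares solve: the prior pins down the directions the measurements leave undetermined. The toy stoichiometry and numbers below are invented for illustration, not a real metabolic model:

```python
import numpy as np

# Underdetermined steady-state balance S @ f = y: two measurements,
# three unknown fluxes (an invented toy stoichiometry).
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])
y = np.array([0.2, 0.1])

mu0 = np.array([1.0, 1.0, 1.0])   # prior mean flux
g2, s2 = 0.5**2, 0.05**2          # prior and measurement-noise variances

# Posterior mean for f ~ N(mu0, g2*I), y = S f + e, e ~ N(0, s2*I):
# the minimizer of ||S f - y||^2 / s2 + ||f - mu0||^2 / g2.
P = S.T @ S / s2 + np.eye(3) / g2
f_post = np.linalg.solve(P, S.T @ y / s2 + mu0 / g2)
```

The two balance equations constrain only flux differences; the prior mean supplies the missing absolute level, which is exactly the role a priori knowledge plays in the article's framework.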

  9. STEPS: A Simulated, Tutorable Physics Student.

    ERIC Educational Resources Information Center

    Ur, Sigalit; VanLehn, Kurt

    1995-01-01

    Describes a simulated student that learns by interacting with a human tutor. Tests suggest that simulated students, when developed past the prototype stage, could be valuable for training human tutors. Provides a computational cognitive task analysis of the skill of learning from a tutor that is useful for designing intelligent tutoring systems.…

  10. Empathy in Distance Learning Design Practice

    ERIC Educational Resources Information Center

    Matthews, Michael T.; Williams, Gregory S.; Yanchar, Stephen C.; McDonald, Jason K.

    2017-01-01

    The notion of designer empathy has become a cornerstone of design philosophy in fields such as product design, human-computer interaction, and service design. But the literature on instructional designer empathy and learner analysis suggests that distance learning designers are generally quite removed from the learners with whom they could be…

  11. B and V photometry and analysis of the eclipsing binary RZ CAS

    NASA Astrophysics Data System (ADS)

    Riazi, N.; Bagheri, M. R.; Faghihi, F.

    1994-01-01

    Photoelectric light curves of the eclipsing binary RZ Cas are presented for B and V filters. The light curves are analyzed for light and geometrical elements, starting with a previously suggested preliminary method. The approximate results thus obtained are then optimised through the Wilson-Devinney computer programs.

  12. Synchronous Computer-Mediated Communication and Interaction: A Research Synthesis and Meta-Analysis

    ERIC Educational Resources Information Center

    Ziegler, Nicole

    2013-01-01

    The interaction approach to second language acquisition (SLA) suggests that changes that occur during conversation facilitate second language development by providing learners with opportunities to receive modified comprehensible input and interactional feedback, to produce output, and to notice gaps between their interlanguage and the target…

  13. The computational neurobiology of learning and reward.

    PubMed

    Daw, Nathaniel D; Doya, Kenji

    2006-04-01

    Following the suggestion that midbrain dopaminergic neurons encode a signal, known as a 'reward prediction error', used by artificial intelligence algorithms for learning to choose advantageous actions, the study of the neural substrates for reward-based learning has been strongly influenced by computational theories. In recent work, such theories have been increasingly integrated into experimental design and analysis. Such hybrid approaches have offered detailed new insights into the function of a number of brain areas, especially the cortex and basal ganglia. In part this is because these approaches enable the study of neural correlates of subjective factors (such as a participant's beliefs about the reward to be received for performing some action) that the computational theories purport to quantify.
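
The reward prediction error referred to here is the temporal-difference error delta = r + gamma*V(s') - V(s) of TD learning; a minimal TD(0) sketch on an invented two-state chain (all numbers are for illustration only):

```python
# Minimal TD(0) on a two-state chain s0 -> s1 -> terminal, with a
# reward of 1 delivered on termination. delta is the quantity the
# midbrain dopamine signal is proposed to encode.
gamma, alpha = 0.9, 0.1
V = [0.0, 0.0]                          # value estimates for s0, s1

for _ in range(1000):
    delta = 0.0 + gamma * V[1] - V[0]   # step s0 -> s1: no reward yet
    V[0] += alpha * delta
    delta = 1.0 + gamma * 0.0 - V[1]    # step s1 -> terminal: reward 1
    V[1] += alpha * delta
# V converges to [0.9, 1.0]: earlier states discount the future reward.
```

As learning converges, delta shrinks toward zero for fully predicted rewards, mirroring the observation that dopaminergic responses transfer from the reward itself to its earliest predictor.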

  14. Neural correlates of auditory scene analysis and perception

    PubMed Central

    Cohen, Yale E.

    2014-01-01

    The auditory system is designed to transform acoustic information from low-level sensory representations into perceptual representations. These perceptual representations are the computational result of the auditory system's ability to group and segregate spectral, spatial and temporal regularities in the acoustic environment into stable perceptual units (i.e., sounds or auditory objects). Current evidence suggests that the cortex--specifically, the ventral auditory pathway--is responsible for the computations most closely related to perceptual representations. Here, we discuss how the transformations along the ventral auditory pathway relate to auditory percepts, with special attention paid to the processing of vocalizations and categorization, and explore recent models of how these areas may carry out these computations. PMID:24681354

  15. Requirements and principles for the implementation and construction of large-scale geographic information systems

    NASA Technical Reports Server (NTRS)

    Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.

    1987-01-01

    This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.
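
The efficient retrieval of spatially-indexed data discussed above can be illustrated with a minimal uniform-grid index; the cell size and API are invented for illustration, and production GIS rely on richer structures such as quadtrees and R-trees:

```python
from collections import defaultdict

class GridIndex:
    """Uniform-grid spatial index: bucket points by cell, then answer
    rectangular range queries by scanning only the overlapping cells."""
    def __init__(self, cell=10.0):
        self.cell = cell
        self.buckets = defaultdict(list)

    def _key(self, x, y):
        return (int(x // self.cell), int(y // self.cell))

    def insert(self, x, y, payload):
        self.buckets[self._key(x, y)].append((x, y, payload))

    def query(self, xmin, ymin, xmax, ymax):
        i0, j0 = self._key(xmin, ymin)
        i1, j1 = self._key(xmax, ymax)
        hits = []
        for i in range(i0, i1 + 1):
            for j in range(j0, j1 + 1):
                hits += [p for (x, y, p) in self.buckets.get((i, j), [])
                         if xmin <= x <= xmax and ymin <= y <= ymax]
        return hits

idx = GridIndex()
idx.insert(3, 4, "well")
idx.insert(55, 60, "station")
print(idx.query(0, 0, 10, 10))  # ['well']
```

The point is the access pattern, not the structure itself: a range query touches a number of buckets proportional to the query area rather than to the whole data set.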

  16. Personalized cloud-based bioinformatics services for research and education: use cases and the elasticHPC package

    PubMed Central

    2012-01-01

    Background: Bioinformatics services have been traditionally provided in the form of a web-server that is hosted at institutional infrastructure and serves multiple users. This model, however, is not flexible enough to cope with the increasing number of users, increasing data size, and new requirements in terms of speed and availability of service. The advent of cloud computing suggests a new service model that provides an efficient solution to these problems, based on the concepts of "resources-on-demand" and "pay-as-you-go". However, cloud computing has not yet been introduced within bioinformatics servers due to the lack of usage scenarios and software layers that address the requirements of the bioinformatics domain. Results: In this paper, we provide different use case scenarios for providing cloud computing based services, considering both the technical and financial aspects of the cloud computing service model. These scenarios are for individual users seeking computational power as well as bioinformatics service providers aiming at provision of personalized bioinformatics services to their users. We also present elasticHPC, a software package and a library that facilitates the use of high performance cloud computing resources in general and the implementation of the suggested bioinformatics scenarios in particular. Concrete examples that demonstrate the suggested use case scenarios with whole bioinformatics servers and major sequence analysis tools like BLAST are presented. Experimental results with large datasets are also included to show the advantages of the cloud model. Conclusions: Our use case scenarios and the elasticHPC package are steps towards the provision of cloud based bioinformatics services, which would help in overcoming the data challenge of recent biological research. All resources related to elasticHPC and its web-interface are available at http://www.elasticHPC.org. PMID:23281941

  18. Analysis of the knowledge and opinions of students and qualified dentists regarding the use of computers.

    PubMed

    Castelló Castañeda, Coral; Ríos Santos, Jose Vicente; Bullón, Pedro

    2008-01-01

    Dentists are currently required to make multiple diagnoses and treatment decisions every day, and the information necessary to achieve this satisfactorily doubles in volume every five years. Knowledge therefore rapidly becomes out of date, so that it is often impossible to remember established information and assimilate new concepts. This may result in a significant lack of knowledge in the future, which would jeopardize the success of treatments. To remedy and prevent this situation, we nowadays have access to modern computing systems, with extensive databases, which help us to retain the information necessary for daily practice and access it instantaneously. The objectives of this study are therefore to determine how widespread the use of computing is in this environment and to determine the opinion of students and qualified dentists as regards its use in Dentistry. 90 people were chosen to take part in the study, divided into three groups: students, newly qualified dentists, and experts. It was demonstrated that a high percentage (93.30%) use a computer, but that their level of computing knowledge is predominantly moderate. The place where a computer is used most is the home, which suggests that the majority own a computer. Analysis of the results obtained for the evaluation of computers in teaching showed that the participants thought that computers saved a great deal of time and had great potential for providing an image (in terms of marketing), and they considered them a very innovative and stimulating tool.

  19. Methodological considerations for the evaluation of EEG mapping data: a practical example based on a placebo/diazepam crossover trial.

    PubMed

    Jähnig, P; Jobert, M

    1995-01-01

    Quantitative EEG is a sensitive method for measuring pharmacological effects on the central nervous system. Nowadays, computers enable EEG data to be stored and spectral parameters to be computed for signals obtained from a large number of electrode locations. However, the statistical analysis of such vast amounts of EEG data is complicated due to the limited number of subjects usually involved in pharmacological studies. In the present study, data from a trial aimed at comparing diazepam and placebo were used to investigate different properties of EEG mapping data and to compare different methods of data analysis. Both the topography and the temporal changes of EEG activity were investigated using descriptive data analysis, which is based on an inspection of patterns of pd values (descriptive p values) assessed for all pair-wise tests for differences in time or treatment. An empirical measure (tri-mean) for the computation of group maps is suggested, allowing a better description of group effects with skewed data of small sample sizes. Finally, both the investigation of maps based on principal component analysis and the notion of distance between maps are discussed and applied to the analysis of the data collected under diazepam treatment, exemplifying the evaluation of pharmacodynamic drug effects.
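
The abstract does not define its "tri-mean"; assuming Tukey's trimean, (Q1 + 2*median + Q3)/4, a per-electrode group map robust to skewed small samples can be sketched as follows (the data are hypothetical):

```python
import numpy as np

def trimean(samples, axis=0):
    """Tukey's trimean: (Q1 + 2*median + Q3) / 4. Less sensitive to
    skewed small samples than the mean, which matters for group maps
    built from few subjects."""
    q1, q2, q3 = np.percentile(samples, [25, 50, 75], axis=axis)
    return (q1 + 2 * q2 + q3) / 4

# Hypothetical group of 7 subjects x 4 electrodes of spectral power,
# with one subject contributing outlying values.
power = np.array([[4.0, 5.0, 3.0, 6.0]] * 6 + [[40.0, 50.0, 30.0, 60.0]])
group_map = trimean(power)        # one robust value per electrode
```

With this sample the trimean stays at the typical subject's level while the arithmetic mean is dragged upward by the single outlying subject, which is the motivation the abstract gives for the measure.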

  20. Deep Learning in Medical Image Analysis.

    PubMed

    Shen, Dinggang; Wu, Guorong; Suk, Heung-Il

    2017-06-21

    This review covers computer-assisted analysis of images in the field of medical imaging. Recent advances in machine learning, especially with regard to deep learning, are helping to identify, classify, and quantify patterns in medical images. At the core of these advances is the ability to exploit hierarchical feature representations learned solely from data, instead of features designed by hand according to domain-specific knowledge. Deep learning is rapidly becoming the state of the art, leading to enhanced performance in various medical applications. We introduce the fundamentals of deep learning methods and review their successes in image registration, detection of anatomical and cellular structures, tissue segmentation, computer-aided disease diagnosis and prognosis, and so on. We conclude by discussing research issues and suggesting future directions for further improvement.

  1. Whole-genome CNV analysis: advances in computational approaches.

    PubMed

    Pirooznia, Mehdi; Goes, Fernando S; Zandi, Peter P

    2015-01-01

    Accumulating evidence indicates that DNA copy number variation (CNV) is likely to make a significant contribution to human diversity and also play an important role in disease susceptibility. Recent advances in genome sequencing technologies have enabled the characterization of a variety of genomic features, including CNVs. This has led to the development of several bioinformatics approaches to detect CNVs from next-generation sequencing data. Here, we review recent advances in CNV detection from whole genome sequencing. We discuss the informatics approaches and current computational tools that have been developed as well as their strengths and limitations. This review will assist researchers and analysts in choosing the most suitable tools for CNV analysis as well as provide suggestions for new directions in future development.
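    As a toy illustration of the read-depth principle behind many of the reviewed CNV callers, the sketch below flags windows whose normalized coverage departs from the sample median; the thresholds and depths are hypothetical, and real tools add GC correction, segmentation, and statistical testing:

```python
import statistics

def call_cnvs_by_depth(window_depths, gain=1.5, loss=0.5):
    """Toy read-depth CNV caller: normalize per-window coverage by the
    sample median and flag windows whose ratio crosses hypothetical
    gain/loss thresholds."""
    baseline = statistics.median(window_depths)
    calls = []
    for i, depth in enumerate(window_depths):
        ratio = depth / baseline
        if ratio >= gain:
            calls.append((i, "gain", round(ratio, 2)))
        elif ratio <= loss:
            calls.append((i, "loss", round(ratio, 2)))
    return calls

depths = [30, 32, 29, 61, 60, 31, 14, 30]  # windows 3-4 duplicated, 6 deleted
print(call_cnvs_by_depth(depths))
```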

  2. A VLBI variance-covariance analysis interactive computer program. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Bock, Y.

    1980-01-01

    An interactive computer program (in FORTRAN) for the variance-covariance analysis of VLBI experiments is presented for use in experiment planning, simulation studies and optimal design problems. The interactive mode is especially suited to these types of analyses, providing ease of operation as well as savings in time and cost. The geodetic parameters include baseline vector parameters and variations in polar motion and Earth rotation. A discussion of the theory on which the program is based provides an overview of the VLBI process, emphasizing the areas of interest to geodesy. Special emphasis is placed on the problem of determining correlations between simultaneous observations from a network of stations. A model suitable for covariance analyses is presented. Suggestions towards developing optimal observation schedules are included.
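    Variance-covariance analysis of this kind rests on the least-squares relation C = sigma^2 (A^T W A)^(-1); a minimal sketch with unit weights and a hypothetical two-parameter schedule (offset plus rate) illustrates how parameter uncertainty can be predicted before any observations are made:

```python
def lsq_covariance(A, sigma2=1.0):
    """Covariance of least-squares estimates, C = sigma2 * (A^T A)^{-1},
    for a two-parameter design matrix A with unit weights. This is the
    core computation of a covariance analysis: parameter uncertainty
    follows from the observation schedule alone."""
    # Form the 2x2 normal matrix N = A^T A.
    n00 = sum(r[0] * r[0] for r in A)
    n01 = sum(r[0] * r[1] for r in A)
    n11 = sum(r[1] * r[1] for r in A)
    det = n00 * n11 - n01 * n01
    # Invert N analytically (2x2 case).
    return [[sigma2 * n11 / det, -sigma2 * n01 / det],
            [-sigma2 * n01 / det, sigma2 * n00 / det]]

# Hypothetical schedule: observations at epochs t, estimating offset + rate.
epochs = [0.0, 1.0, 2.0, 3.0]
A = [[1.0, t] for t in epochs]
C = lsq_covariance(A, sigma2=0.04)
print(C[1][1])  # variance of the rate estimate
```

    Adding or moving epochs changes C directly, which is how such a program supports optimal design studies.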

  3. CFD Analysis of Turbo Expander for Cryogenic Refrigeration and Liquefaction Cycles

    NASA Astrophysics Data System (ADS)

    Verma, Rahul; Sam, Ashish Alex; Ghosh, Parthasarathi

    Computational Fluid Dynamics analysis has emerged as a necessary tool for the design of turbomachinery. It helps to understand the various sources of inefficiency through investigation of the flow physics of the turbine. In this paper, 3D turbulent flow analysis of a cryogenic turboexpander for small-scale air separation was performed using Ansys CFX®. The turboexpander has been designed following assumptions based on the meanline blade generation procedure provided in the open literature and good engineering judgement. Through analysis of the flow field, modifications and further analyses required to evolve a more robust design procedure have been suggested.

  4. Sequence-structure mapping errors in the PDB: OB-fold domains

    PubMed Central

    Venclovas, Česlovas; Ginalski, Krzysztof; Kang, Chulhee

    2004-01-01

    The Protein Data Bank (PDB) is the single most important repository of structural data for proteins and other biologically relevant molecules. Therefore, it is critically important to keep the PDB data, as much as possible, error-free. In this study, we have analyzed PDB crystal structures possessing oligonucleotide/oligosaccharide binding (OB)-fold, one of the highly populated folds, for the presence of sequence-structure mapping errors. Using energy-based structure quality assessment coupled with sequence analyses, we have found that there are at least five OB-structures in the PDB that have regions where sequences have been incorrectly mapped onto the structure. We have demonstrated that the combination of these computation techniques is effective not only in detecting sequence-structure mapping errors, but also in providing guidance to correct them. Namely, we have used results of computational analysis to direct a revision of X-ray data for one of the PDB entries containing a fairly inconspicuous sequence-structure mapping error. The revised structure has been deposited with the PDB. We suggest use of computational energy assessment and sequence analysis techniques to facilitate structure determination when homologs having known structure are available to use as a reference. Such computational analysis may be useful in either guiding the sequence-structure assignment process or verifying the sequence mapping within poorly defined regions. PMID:15133161

  5. Eye movement analysis of reading from computer displays, eReaders and printed books.

    PubMed

    Zambarbieri, Daniela; Carniglia, Elena

    2012-09-01

    To compare eye movements during silent reading of three eBooks and a printed book. The three different eReading tools were a desktop PC, iPad tablet and Kindle eReader. Video-oculographic technology was used for recording eye movements. In the case of reading from the computer display the recordings were made by a video camera placed below the computer screen, whereas for reading from the iPad tablet, eReader and printed book the recording system was worn by the subject and had two cameras: one for recording the movement of the eyes and the other for recording the scene in front of the subject. Data analysis provided quantitative information in terms of number of fixations, their duration, and the direction of the movement, the latter to distinguish between fixations and regressions. Mean fixation duration was different only in reading from the computer display, and was similar for the Tablet, eReader and printed book. The percentage of regressions with respect to the total amount of fixations was comparable for eReading tools and the printed book. The analysis of eye movements during reading an eBook from different eReading tools suggests that subjects' reading behaviour is similar to reading from a printed book. © 2012 The College of Optometrists.
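    The fixation/regression split described above can be sketched as a direction test on successive fixation positions; the coordinates and the line-change threshold below are hypothetical:

```python
def regression_rate(fix_x):
    """Fraction of saccades between successive fixations that move
    backward (leftward for left-to-right text) -- a toy version of the
    fixation/regression split used in reading studies. Large leftward
    jumps (line changes) are excluded via a hypothetical threshold."""
    saccades = [b - a for a, b in zip(fix_x, fix_x[1:])]
    regressions = [s for s in saccades if -200 < s < 0]  # small leftward moves
    forward = [s for s in saccades if s >= 0]
    total = len(regressions) + len(forward)
    return len(regressions) / total if total else 0.0

# Fixation x-positions (pixels) along one line of text:
xs = [100, 150, 210, 180, 260, 330]  # one regression: 210 -> 180
print(regression_rate(xs))
```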

  6. Novel Regulatory Small RNAs in Streptococcus pyogenes

    PubMed Central

    Tesorero, Rafael A.; Yu, Ning; Wright, Jordan O.; Svencionis, Juan P.; Cheng, Qiang; Kim, Jeong-Ho; Cho, Kyu Hong

    2013-01-01

    Streptococcus pyogenes (Group A Streptococcus or GAS) is a Gram-positive bacterial pathogen that has shown complex modes of regulation of its virulence factors to cause diverse diseases. Bacterial small RNAs are regarded as novel widespread regulators of gene expression in response to environmental signals. Recent studies have revealed that several small RNAs (sRNAs) have an important role in S. pyogenes physiology and pathogenesis by regulating gene expression at the translational level. To search for new sRNAs in S. pyogenes, we performed a genome-wide analysis through computational prediction followed by experimental verification. To overcome the limitation of low accuracy in computational prediction, we employed a combination of three different computational algorithms (sRNAPredict, eQRNA and RNAz). A total of 45 candidates were chosen based on the computational analysis, and their transcription was analyzed by reverse-transcriptase PCR and Northern blot. Through this process, we discovered 7 putative novel trans-acting sRNAs. Their abundance varied between different growth phases, suggesting that their expression is influenced by environmental or internal signals. Further, to screen target mRNAs of an sRNA, we employed differential RNA sequencing analysis. This study provides a significant resource for future study of small RNAs and their roles in physiology and pathogenesis of S. pyogenes. PMID:23762235
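    The multi-algorithm screening strategy can be illustrated as simple voting over candidate loci; the program names match those in the abstract, but the loci and the two-vote threshold are hypothetical (the study's actual selection criteria were more involved):

```python
def consensus_candidates(predictions, min_votes=2):
    """Combine sRNA candidate loci from several prediction programs by
    simple voting: keep loci called by at least `min_votes` programs.
    A toy version of multi-algorithm filtering to offset the low
    accuracy of any single predictor."""
    votes = {}
    for program, loci in predictions.items():
        for locus in loci:
            votes.setdefault(locus, set()).add(program)
    return sorted(l for l, p in votes.items() if len(p) >= min_votes)

preds = {
    "sRNAPredict": {"igr_12", "igr_45", "igr_77"},
    "eQRNA":       {"igr_12", "igr_45", "igr_90"},
    "RNAz":        {"igr_45", "igr_90", "igr_33"},
}
print(consensus_candidates(preds))
```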

  7. Resources and costs for microbial sequence analysis evaluated using virtual machines and cloud computing.

    PubMed

    Angiuoli, Samuel V; White, James R; Matalka, Malcolm; White, Owen; Fricke, W Florian

    2011-01-01

    The widespread popularity of genomic applications is threatened by the "bioinformatics bottleneck" resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. 
Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S rRNA amplicon sequencing, microbial single-genome and metagenomics WGS projects can achieve cost-efficient bioinformatics support using CloVR in combination with Amazon EC2 as an alternative to local computing centers.
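The dollar figures above come down to instance-hour arithmetic; a hedged sketch with hypothetical node counts and hourly rates (not current AWS prices):

```python
import math

def ec2_run_cost(n_instances, hours, price_per_hour):
    """Back-of-the-envelope cloud cost: instances x billed hours x hourly
    rate. EC2 historically billed by whole instance-hours, so runtime is
    rounded up. The rate used below is hypothetical."""
    billed_hours = math.ceil(hours)
    return n_instances * billed_hours * price_per_hour

# e.g. 15 nodes (~120 CPUs at 8 CPUs/node) running 20.5 h at $0.17/h:
print(round(ec2_run_cost(15, 20.5, 0.17), 2))
```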

  8. Resources and Costs for Microbial Sequence Analysis Evaluated Using Virtual Machines and Cloud Computing

    PubMed Central

    Angiuoli, Samuel V.; White, James R.; Matalka, Malcolm; White, Owen; Fricke, W. Florian

    2011-01-01

    Background The widespread popularity of genomic applications is threatened by the “bioinformatics bottleneck” resulting from uncertainty about the cost and infrastructure needed to meet increasing demands for next-generation sequence analysis. Cloud computing services have been discussed as potential new bioinformatics support systems but have not been evaluated thoroughly. Results We present benchmark costs and runtimes for common microbial genomics applications, including 16S rRNA analysis, microbial whole-genome shotgun (WGS) sequence assembly and annotation, WGS metagenomics and large-scale BLAST. Sequence dataset types and sizes were selected to correspond to outputs typically generated by small- to midsize facilities equipped with 454 and Illumina platforms, except for WGS metagenomics where sampling of Illumina data was used. Automated analysis pipelines, as implemented in the CloVR virtual machine, were used in order to guarantee transparency, reproducibility and portability across different operating systems, including the commercial Amazon Elastic Compute Cloud (EC2), which was used to attach real dollar costs to each analysis type. We found considerable differences in computational requirements, runtimes and costs associated with different microbial genomics applications. While all 16S analyses completed on a single-CPU desktop in under three hours, microbial genome and metagenome analyses utilized multi-CPU support of up to 120 CPUs on Amazon EC2, where each analysis completed in under 24 hours for less than $60. Representative datasets were used to estimate maximum data throughput on different cluster sizes and to compare costs between EC2 and comparable local grid servers. 
Conclusions Although bioinformatics requirements for microbial genomics depend on dataset characteristics and the analysis protocols applied, our results suggest that smaller sequencing facilities (up to three Roche/454 or one Illumina GAIIx sequencer) invested in 16S rRNA amplicon sequencing, microbial single-genome and metagenomics WGS projects can achieve cost-efficient bioinformatics support using CloVR in combination with Amazon EC2 as an alternative to local computing centers. PMID:22028928

  9. Guide to AERO2S and WINGDES Computer Codes for Prediction and Minimization of Drag Due to Lift

    NASA Technical Reports Server (NTRS)

    Carlson, Harry W.; Chu, Julio; Ozoroski, Lori P.; McCullers, L. Arnold

    1997-01-01

    The computer codes, AERO2S and WINGDES, are now widely used for the analysis and design of airplane lifting surfaces under conditions that tend to induce flow separation. These codes have undergone continued development to provide additional capabilities since the introduction of the original versions over a decade ago. This code development has been reported in a variety of publications (NASA technical papers, NASA contractor reports, and society journals). Some modifications have not been publicized at all. Users of these codes have suggested the desirability of combining in a single document the descriptions of the code development, an outline of the features of each code, and suggestions for effective code usage. This report is intended to supply that need.

  10. Providing security for automated process control systems at hydropower engineering facilities

    NASA Astrophysics Data System (ADS)

    Vasiliev, Y. S.; Zegzhda, P. D.; Zegzhda, D. P.

    2016-12-01

    This article suggests the concept of a cyberphysical system to manage computer security of automated process control systems at hydropower engineering facilities. According to the authors, this system consists of a set of information processing tools and computer-controlled physical devices. Examples of cyber attacks on power engineering facilities are provided, and a strategy of improving cybersecurity of hydropower engineering systems is suggested. The architecture of the multilevel protection of the automated process control system (APCS) of power engineering facilities is given, including security systems, control systems, access control, encryption, and a secure virtual private network of subsystems for monitoring and analysis of security events. The distinctive aspect of the approach is consideration of interrelations and cyber threats arising when SCADA is integrated with the unified enterprise information system.

  11. Analysis of basic clustering algorithms for numerical estimation of statistical averages in biomolecules.

    PubMed

    Anandakrishnan, Ramu; Onufriev, Alexey

    2008-03-01

    In statistical mechanics, the equilibrium properties of a physical system of particles can be calculated as the statistical average over accessible microstates of the system. In general, these calculations are computationally intractable since they involve summations over an exponentially large number of microstates. Clustering algorithms are one of the methods used to numerically approximate these sums. The most basic clustering algorithms first sub-divide the system into a set of smaller subsets (clusters). Then, interactions between particles within each cluster are treated exactly, while all interactions between different clusters are ignored. These smaller clusters have far fewer microstates, making the summation over these microstates tractable. These algorithms have been previously used for biomolecular computations, but remain relatively unexplored in this context. Presented here is a theoretical analysis of the error and computational complexity for the two most basic clustering algorithms that were previously applied in the context of biomolecular electrostatics. We derive a tight, computationally inexpensive, error bound for the equilibrium state of a particle computed via these clustering algorithms. For some practical applications, it is the root mean square error, which can be significantly lower than the error bound, that may be more important. We show that there is a strong empirical relationship between error bound and root mean square error, suggesting that the error bound could be used as a computationally inexpensive metric for predicting the accuracy of clustering algorithms for practical applications. An example of error analysis for such an application, the computation of the average charge of ionizable amino acids in proteins, is given, demonstrating that the clustering algorithm can be accurate enough for practical purposes.
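    The basic clustering idea, treating intra-cluster interactions exactly while dropping inter-cluster ones, can be demonstrated on a toy system small enough to enumerate; the couplings below are hypothetical:

```python
import itertools, math

def avg_energy(pairs, n, beta=1.0):
    """Exact Boltzmann-average energy of n binary sites with pairwise
    couplings `pairs` = [(i, j, J), ...], summing over all 2^n microstates."""
    Z = E = 0.0
    for state in itertools.product((0, 1), repeat=n):
        e = sum(J for (i, j, J) in pairs if state[i] and state[j])
        w = math.exp(-beta * e)
        Z += w
        E += w * e
    return E / Z

# Two clusters of two sites each, joined by one weak coupling.
intra_a = [(0, 1, 1.0)]   # cluster A: sites 0, 1
intra_b = [(2, 3, 1.0)]   # cluster B: sites 2, 3
inter = [(1, 2, 0.05)]    # weak inter-cluster term

exact = avg_energy(intra_a + intra_b + inter, 4)
# Basic clustering: drop the inter-cluster term, so the average becomes
# the sum of two independent 2-site problems (each re-indexed to 0, 1).
clustered = 2 * avg_energy([(0, 1, 1.0)], 2)
print(exact, clustered)  # close when the inter-cluster coupling is weak
```

    The clustered sum touches 2 * 2^2 = 8 states instead of 2^4 = 16; for realistic systems the gap between 2^n and the per-cluster sums is what makes the method tractable, at the cost of an error governed by the neglected couplings.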

  12. Influence of computer work under time pressure on cardiac activity.

    PubMed

    Shi, Ping; Hu, Sijung; Yu, Hongliu

    2015-03-01

    Computer users are often under stress when required to complete computer work within a required time. Work stress has repeatedly been associated with an increased risk for cardiovascular disease. The present study examined the effects of time pressure workload during computer tasks on cardiac activity in 20 healthy subjects. Heart rate, time domain and frequency domain indices of heart rate variability (HRV) and Poincaré plot parameters were compared among five computer tasks and two rest periods. Faster heart rate and decreased standard deviation of R-R interval were noted in response to computer tasks under time pressure. The Poincaré plot parameters showed significant differences between different levels of time pressure workload during computer tasks, and between computer tasks and the rest periods. In contrast, no significant differences were identified for the frequency domain indices of HRV. The results suggest that the quantitative Poincaré plot analysis used in this study was able to reveal the intrinsic nonlinear nature of the autonomically regulated cardiac rhythm. Specifically, heightened vagal tone occurred during the relaxation computer tasks without time pressure. In contrast, the stressful computer tasks with added time pressure stimulated cardiac sympathetic activity. Copyright © 2015 Elsevier Ltd. All rights reserved.
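    The Poincaré descriptors used in such studies, SD1 and SD2, have a standard closed form (standard deviations of the rotated R-R scatter); a sketch with hypothetical R-R intervals:

```python
import math, statistics

def poincare_sd1_sd2(rr_ms):
    """SD1/SD2 descriptors of a Poincare plot of successive R-R
    intervals: SD1 captures short-term (beat-to-beat) variability,
    SD2 the longer-term spread along the line of identity."""
    x = rr_ms[:-1]
    y = rr_ms[1:]
    # Rotate the (RR_n, RR_{n+1}) cloud by 45 degrees.
    d1 = [(b - a) / math.sqrt(2) for a, b in zip(x, y)]
    d2 = [(b + a) / math.sqrt(2) for a, b in zip(x, y)]
    return statistics.pstdev(d1), statistics.pstdev(d2)

rr = [800, 810, 805, 820, 815, 830, 825]  # hypothetical R-R intervals (ms)
sd1, sd2 = poincare_sd1_sd2(rr)
print(round(sd1, 1), round(sd2, 1))
```

    A shrinking SD1 relative to SD2 is the kind of shift that reflects reduced vagal modulation under stress; the interval series here is illustrative only.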

  13. Qualitative data analysis: conceptual and practical considerations.

    PubMed

    Liamputtong, Pranee

    2009-08-01

    Qualitative inquiry requires that collected data is organised in a meaningful way, and this is referred to as data analysis. Through analytic processes, researchers turn what can be voluminous data into understandable and insightful analysis. This paper sets out the different approaches that qualitative researchers can use to make sense of their data including thematic analysis, narrative analysis, discourse analysis and semiotic analysis and discusses the ways that qualitative researchers can analyse their data. I first discuss salient issues in performing qualitative data analysis, and then proceed to provide some suggestions on different methods of data analysis in qualitative research. Finally, I provide some discussion on the use of computer-assisted data analysis.

  14. The Relationship Between Computer Experience and Computerized Cognitive Test Performance Among Older Adults

    PubMed Central

    2013-01-01

    Objective. This study compared the relationship between computer experience and performance on computerized cognitive tests and a traditional paper-and-pencil cognitive test in a sample of older adults (N = 634). Method. Participants completed computer experience and computer attitudes questionnaires, three computerized cognitive tests (Useful Field of View (UFOV) Test, Road Sign Test, and Stroop task) and a paper-and-pencil cognitive measure (Trail Making Test). Multivariate analysis of covariance was used to examine differences in cognitive performance across the four measures between those with and without computer experience after adjusting for confounding variables. Results. Although computer experience had a significant main effect across all cognitive measures, the effect sizes were similar. After controlling for computer attitudes, the relationship between computer experience and UFOV was fully attenuated. Discussion. Findings suggest that computer experience is not uniquely related to performance on computerized cognitive measures compared with paper-and-pencil measures. Because the relationship between computer experience and UFOV was fully attenuated by computer attitudes, this may imply that motivational factors are more influential to UFOV performance than computer experience. Our findings support the hypothesis that computer use is related to cognitive performance, and this relationship is not stronger for computerized cognitive measures. Implications and directions for future research are provided. PMID:22929395

  15. Factors influencing exemplary science teachers' levels of computer use

    NASA Astrophysics Data System (ADS)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching Award (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, yielding a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. 
The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use is dependent on perceived abilities at using computers. The teachers' use of computer-related applications/tools during class, and their personal self-efficacy, age, and gender were highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and gender were related to their use of computer-related applications/tools during class and to the students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.

  16. Findings from an Organizational Network Analysis to Support Local Public Health Management

    PubMed Central

    Caldwell, Michael; Rockoff, Maxine L.; Gebbie, Kristine; Carley, Kathleen M.; Bakken, Suzanne

    2008-01-01

    We assessed the feasibility of using organizational network analysis in a local public health organization. The research setting was an urban/suburban county health department with 156 employees. The goal of the research was to study communication and information flow in the department and to assess the technique for public health management. Network data were derived from survey questionnaires. Computational analysis was performed with the Organizational Risk Analyzer. Analysis revealed centralized communication, limited interdependencies, potential knowledge loss through retirement, and possible informational silos. The findings suggested opportunities for more cross-program coordination but also suggested the presence of potentially efficient communication paths and potentially beneficial social connectedness. Managers found the findings useful to support decision making. Public health organizations must be effective in an increasingly complex environment. Network analysis can help build public health capacity for complex system management. PMID:18481183

  17. Patterns of computer usage among medical practitioners in rural and remote Queensland.

    PubMed

    White, Col; Sheedy, Vicki; Lawrence, Nicola

    2002-06-01

    As part of a more detailed needs analysis, patterns of computer usage among medical practitioners in rural and remote Queensland were investigated. A questionnaire approach was utilised, yielding a response rate of 23.82% (n = 131). Results suggest that medical practitioners in rural and remote Queensland are relatively sophisticated in their use of computer and information technologies and have embraced computerisation to a substantially higher extent compared with their urban counterparts and previously published estimates. Findings also indicate that a substantial number of rural and remote practitioners are utilising computer and information technologies for clinical purposes such as pathology, patient information sheets, prescribing, education, patient records and patient recalls. Despite barriers such as bandwidth limitations, cost and the sometimes unreliable quality of Internet service providers, a majority of rural and remote respondents rated an Internet site with continuing medical education information and services as being important or very important. Suggestions that "rural doctors are slow to adapt to new technologies" are questioned, with findings indicating that rural and remote medical practitioners in Queensland have adapted to, and utilise, information technology to a far higher extent than has been previously documented.

  18. Computational assessment of hemodynamics-based diagnostic tools using a database of virtual subjects: Application to three case studies.

    PubMed

    Willemet, Marie; Vennin, Samuel; Alastruey, Jordi

    2016-12-08

    Many physiological indexes and algorithms based on pulse wave analysis have been suggested in order to better assess cardiovascular function. Because these tools are often computed from in-vivo hemodynamic measurements, their validation is time-consuming, challenging, and biased by measurement errors. Recently, a new methodology has been suggested to assess theoretically these computed tools: a database of virtual subjects generated using numerical 1D-0D modeling of arterial hemodynamics. The generated set of simulations encloses a wide selection of healthy cases that could be encountered in a clinical study. We applied this new methodology to three different case studies that demonstrate the potential of our new tool, and illustrated each of them with a clinically relevant example: (i) we assessed the accuracy of indexes estimating pulse wave velocity; (ii) we validated and refined an algorithm that computes central blood pressure; and (iii) we investigated theoretical mechanisms behind the augmentation index. Our database of virtual subjects is a new tool to assist the clinician: it provides insight into the physical mechanisms underlying the correlations observed in clinical practice. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
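    As an example of the kind of index assessed in case study (i), foot-to-foot pulse wave velocity is simply path length over transit time; the measurement values below are hypothetical:

```python
def foot_to_foot_pwv(path_length_m, t_prox_s, t_dist_s):
    """Foot-to-foot pulse wave velocity: distance between two
    measurement sites divided by the pulse transit time (arrival of
    the waveform 'foot' at each site). Virtual-subject databases let
    such estimates be checked against a known ground-truth PWV."""
    transit = t_dist_s - t_prox_s
    if transit <= 0:
        raise ValueError("distal foot must arrive after proximal foot")
    return path_length_m / transit

# Hypothetical carotid-femoral measurement: sites 0.55 m apart, 65 ms delay.
print(foot_to_foot_pwv(0.55, 0.000, 0.065))  # m/s
```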

  19. The Influence of Reconstruction Kernel on Bone Mineral and Strength Estimates Using Quantitative Computed Tomography and Finite Element Analysis.

    PubMed

    Michalski, Andrew S; Edwards, W Brent; Boyd, Steven K

    2017-10-17

    Quantitative computed tomography has been posed as an alternative imaging modality to investigate osteoporosis. We examined the influence of computed tomography convolution back-projection reconstruction kernels on the analysis of bone quantity and estimated mechanical properties in the proximal femur. Eighteen computed tomography scans of the proximal femur were reconstructed using both a standard smoothing reconstruction kernel and a bone-sharpening reconstruction kernel. Following phantom-based density calibration, we calculated typical bone quantity outcomes of integral volumetric bone mineral density, bone volume, and bone mineral content. Additionally, we performed finite element analysis in a standard sideways fall on the hip loading configuration. Significant differences for all outcome measures, except integral bone volume, were observed between the 2 reconstruction kernels. Volumetric bone mineral density measured using images reconstructed by the standard kernel was significantly lower (6.7%, p < 0.001) when compared with images reconstructed using the bone-sharpening kernel. Furthermore, the whole-bone stiffness and the failure load measured in images reconstructed by the standard kernel were significantly lower (16.5%, p < 0.001, and 18.2%, p < 0.001, respectively) when compared with the image reconstructed by the bone-sharpening kernel. These data suggest that for future quantitative computed tomography studies, a standardized reconstruction kernel will maximize reproducibility, independent of the use of a quantitative calibration phantom. Copyright © 2017 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.
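    Phantom-based density calibration and the kernel comparison reduce to a linear mapping and a relative difference; the slope, intercept, and Hounsfield values below are hypothetical, chosen only to echo the order of magnitude reported:

```python
def calibrate_bmd(hu, slope, intercept):
    """Phantom-based density calibration: map CT Hounsfield units to
    volumetric BMD (mg/cm^3) using the linear fit obtained from a
    calibration phantom. The slope/intercept here are hypothetical."""
    return slope * hu + intercept

def percent_difference(standard, sharp):
    """Relative difference between outcomes from the two kernels."""
    return 100.0 * (sharp - standard) / sharp

vbmd_standard = calibrate_bmd(210.0, 0.75, 5.0)  # standard kernel image
vbmd_sharp = calibrate_bmd(225.0, 0.75, 5.0)     # bone-sharpening kernel
print(round(percent_difference(vbmd_standard, vbmd_sharp), 1))
```

    Because the kernel shifts the underlying HU values, the same calibration yields systematically different vBMD, which is why standardizing the kernel matters for reproducibility.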

  20. Comparison of missing value imputation methods in time series: the case of Turkish meteorological data

    NASA Astrophysics Data System (ADS)

    Yozgatligil, Ceylan; Aslan, Sipan; Iyigun, Cem; Batmaz, Inci

    2013-04-01

    This study aims to compare several imputation methods to complete the missing values of spatio-temporal meteorological time series. To this end, six imputation methods are assessed with respect to various criteria including accuracy, robustness, precision, and efficiency for artificially created missing data in monthly total precipitation and mean temperature series obtained from the Turkish State Meteorological Service. Of these methods, simple arithmetic average, normal ratio (NR), and NR weighted with correlations comprise the simple ones, whereas multilayer perceptron type neural network and multiple imputation strategy adopted by Monte Carlo Markov Chain based on expectation-maximization (EM-MCMC) are computationally intensive ones. In addition, we propose a modification on the EM-MCMC method. Besides using a conventional accuracy measure based on squared errors, we also suggest the correlation dimension (CD) technique of nonlinear dynamic time series analysis which takes spatio-temporal dependencies into account for evaluating imputation performances. Based on the detailed graphical and quantitative analysis, it can be said that although computational methods, particularly the EM-MCMC method, are computationally inefficient, they seem favorable for imputation of meteorological time series with respect to different missingness periods considering both measures and both series studied. To conclude, using the EM-MCMC algorithm for imputing missing values before conducting any statistical analyses of meteorological data will definitely decrease the amount of uncertainty and give more robust results. Moreover, the CD measure can be suggested for the performance evaluation of missing data imputation particularly with computational methods since it gives more precise results in meteorological time series.
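    Of the simple methods compared, the normal ratio (NR) approach has a compact form: scale each neighboring station's observation by the ratio of long-term means, then average. A sketch with hypothetical precipitation values:

```python
def normal_ratio_impute(target_mean, neighbors):
    """Normal ratio (NR) imputation: estimate a missing monthly value
    at a target station from surrounding stations, scaling each
    neighbor's observation by the ratio of long-term means.
    `neighbors` holds (observed_value, neighbor_long_term_mean) pairs;
    all values below are hypothetical precipitation totals (mm)."""
    estimates = [obs * target_mean / nb_mean for obs, nb_mean in neighbors]
    return sum(estimates) / len(estimates)

# Target station long-term mean 50 mm; three nearby stations report:
neighbors = [(42.0, 48.0), (55.0, 60.0), (38.0, 40.0)]
print(round(normal_ratio_impute(50.0, neighbors), 2))
```

    The correlation-weighted NR variant replaces the plain average with weights derived from each neighbor's correlation with the target series.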

  1. A Simple and Computationally Efficient Sampling Approach to Covariate Adjustment for Multifactor Dimensionality Reduction Analysis of Epistasis

    PubMed Central

    Gui, Jiang; Andrew, Angeline S.; Andrews, Peter; Nelson, Heather M.; Kelsey, Karl T.; Karagas, Margaret R.; Moore, Jason H.

    2010-01-01

Epistasis or gene-gene interaction is a fundamental component of the genetic architecture of complex traits such as disease susceptibility. Multifactor dimensionality reduction (MDR) was developed as a nonparametric and model-free method to detect epistasis when there are no significant marginal genetic effects. However, in many studies of complex disease, other covariates like age of onset and smoking status could have a strong main effect and may potentially interfere with MDR's ability to achieve its goal. In this paper, we present a simple and computationally efficient sampling method to adjust for covariate effects in MDR. We use simulation to show that after adjustment, MDR has sufficient power to detect true gene-gene interactions. We also compare our method with the state-of-the-art technique in covariate adjustment. The results suggest that our proposed method performs similarly, but is more computationally efficient. We then apply this new method to an analysis of a population-based bladder cancer study in New Hampshire. PMID:20924193

  2. Computational Analysis of the G-III Laminar Flow Glove

    NASA Technical Reports Server (NTRS)

    Malik, Mujeeb R.; Liao, Wei; Lee-Rausch, Elizabeth M.; Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan

    2011-01-01

Under NASA's Environmentally Responsible Aviation Project, flight experiments are planned with the primary objective of demonstrating the Discrete Roughness Elements (DRE) technology for passive laminar flow control at chord Reynolds numbers relevant to transport aircraft. In this paper, we present a preliminary computational assessment of the Gulfstream-III (G-III) aircraft wing-glove designed to attain natural laminar flow for the leading-edge sweep angle of 34.6deg. Analysis for a flight Mach number of 0.75 shows that it should be possible to achieve natural laminar flow for twice the transition Reynolds number ever achieved at this sweep angle. However, the wing-glove needs to be redesigned to effectively demonstrate passive laminar flow control using DREs. As a by-product of the computational assessment, the effect of surface curvature on stationary crossflow disturbances is found to be strongly stabilizing for the current design, and it is suggested that convex surface curvature could be used as a control parameter for natural laminar flow design, provided transition occurs via stationary crossflow disturbances.

  3. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    PubMed

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

Cloud computing is an essential technology to Big Data analytics and services. A cloud computing system is often comprised of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues. But profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visualization-based analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear suggesting particular types of performance bottlenecks. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visualization-based approach is effective in identifying trends and anomalies of the systems.

  4. Estimation of the failure risk of a maxillary premolar with different crack depths with endodontic treatment by computer-aided design/computer-aided manufacturing ceramic restorations.

    PubMed

    Lin, Chun-Li; Chang, Yen-Hsiang; Hsieh, Shih-Kai; Chang, Wen-Jen

    2013-03-01

This study evaluated the risk of failure for an endodontically treated premolar with different crack depths, shearing toward the pulp chamber, restored using 3 different computer-aided design/computer-aided manufacturing ceramic restoration configurations. Three 3-dimensional finite element models designed with computer-aided design/computer-aided manufacturing ceramic onlay, endocrown, and conventional crown restorations were constructed to perform simulations. The Weibull function was incorporated with finite element analysis to calculate the long-term failure probability relative to different load conditions. The results indicated that the stress values on the enamel, dentin, and luting cement for endocrown restorations exhibited the lowest values relative to the other 2 restoration methods. Weibull analysis revealed that the overall failure probabilities in a shallowly cracked premolar were 27%, 2%, and 1% for the onlay, endocrown, and conventional crown restorations, respectively, in the normal occlusal condition. The corresponding values were 70%, 10%, and 2% for the deeply cracked premolar. This numeric investigation suggests that the endocrown provides sufficient fracture resistance only in a shallowly cracked premolar with endodontic treatment. The conventional crown treatment can immobilize the premolar for different crack depths with lower failure risk.
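The Weibull step of such an analysis can be illustrated with a two-parameter Weibull strength model mapping a peak stress from the finite element solution to a failure probability. The characteristic strength, modulus, and stress values below are made-up placeholders, not the paper's FE results:

```python
import math

def weibull_failure_probability(stress_mpa, sigma_0=200.0, m=10.0):
    # Two-parameter Weibull: P_f = 1 - exp(-(sigma / sigma_0)^m),
    # where sigma_0 is the characteristic strength (MPa) and
    # m is the Weibull modulus of the ceramic.
    return 1.0 - math.exp(-((stress_mpa / sigma_0) ** m))

# Hypothetical peak tensile stresses (MPa) for three restoration designs.
stresses = {"onlay": 180.0, "endocrown": 130.0, "crown": 120.0}
probs = {name: weibull_failure_probability(s) for name, s in stresses.items()}
for name, p in probs.items():
    print(f"{name:10s} P_f = {p:.1%}")
```

Because the exponent m is large for brittle ceramics, modest reductions in peak stress produce large drops in predicted failure probability, which is why the three restorations diverge so sharply in the study's results.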

  5. Secondary Education Students' Difficulties in Algorithmic Problems with Arrays: An Analysis Using the SOLO Taxonomy

    ERIC Educational Resources Information Center

    Vrachnos, Euripides; Jimoyiannis, Athanassios

    2017-01-01

Developing students' algorithmic and computational thinking is currently a major objective for primary and secondary education in many countries around the globe. Literature suggests that students face various difficulties in programming processes because of their mental models of basic programming constructs. Arrays constitute the first…

  6. How Learners Use Automated Computer-Based Feedback to Produce Revised Drafts of Essays

    ERIC Educational Resources Information Center

    Laing, Jonny; El Ebyary, Khaled; Windeatt, Scott

    2012-01-01

    Our previous results suggest that the use of "Criterion", an automatic writing evaluation (AWE) system, is particularly successful in encouraging learners to produce amended drafts of their essays, and that those amended drafts generally represent an improvement on the original submission. Our analysis of the submitted essays and the…

  7. Effects of Response-Driven Feedback in Computer Science Learning

    ERIC Educational Resources Information Center

    Fernandez Aleman, J. L.; Palmer-Brown, D.; Jayne, C.

    2011-01-01

    This paper presents the results of a project on generating diagnostic feedback for guided learning in a first-year course on programming and a Master's course on software quality. An online multiple-choice questions (MCQs) system is integrated with neural network-based data analysis. Findings about how students use the system suggest that the…

  8. Integrating Statistical Visualization Research into the Political Science Classroom

    ERIC Educational Resources Information Center

    Draper, Geoffrey M.; Liu, Baodong; Riesenfeld, Richard F.

    2011-01-01

    The use of computer software to facilitate learning in political science courses is well established. However, the statistical software packages used in many political science courses can be difficult to use and counter-intuitive. We describe the results of a preliminary user study suggesting that visually-oriented analysis software can help…

  9. Functional Connectivity Parcellation of the Human Thalamus by Independent Component Analysis.

    PubMed

    Zhang, Sheng; Li, Chiang-Shan R

    2017-11-01

As a key structure to relay and integrate information, the thalamus supports multiple cognitive and affective functions through the connectivity between its subnuclei and cortical and subcortical regions. Although extant studies have largely described thalamic regional functions in anatomical terms, evidence accumulates to suggest a more complex picture of subareal activities and connectivities of the thalamus. In this study, we aimed to parcellate the thalamus and examine whole-brain connectivity of its functional clusters. With resting state functional magnetic resonance imaging data from 96 adults, we used independent component analysis (ICA) to parcellate the thalamus into 10 components. On the basis of the independence assumption, ICA helps to identify how subclusters overlap spatially. Whole brain functional connectivity of each subdivision was computed for each independent component's time course (ICtc), a unique time series representing that IC. For comparison, we computed seed-region-based functional connectivity using the averaged time course across all voxels within a thalamic subdivision. The results showed that, at p < 10^-6, corrected, 49% of voxels on average overlapped among subdivisions. Compared with seed-region analysis, ICtc analysis revealed patterns of connectivity that were more distinguished between thalamic clusters. ICtc analysis demonstrated thalamic connectivity to the primary motor cortex, which eluded the seed-region analysis as well as previous studies based on averaged time series, and clarified thalamic connectivity to the hippocampus, caudate nucleus, and precuneus. The new findings elucidate the functional organization of the thalamus and suggest that ICA clustering in combination with ICtc, rather than seed-region analysis, better distinguishes whole-brain connectivities among functional clusters of a brain region.
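Why an IC time course can outperform a seed average when subdivisions overlap can be sketched with a toy two-source mixture. The sources, spatial weights, and noise levels below are illustrative assumptions, not the fMRI data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_t, n_vox = 200, 50

# Two overlapping "subnuclei" sources mixed across thalamic voxels.
s1, s2 = rng.normal(size=(2, n_t))
w1 = np.linspace(1.0, 0.0, n_vox)   # source 1 loads on one end
w2 = np.linspace(0.0, 1.0, n_vox)   # source 2 on the other
voxels = np.outer(s1, w1) + np.outer(s2, w2) \
    + 0.5 * rng.normal(size=(n_t, n_vox))

# A cortical region driven by source 1 only.
cortical = s1 + 0.5 * rng.normal(size=n_t)

def corr(a, b):
    return float(np.corrcoef(a, b)[0, 1])

# Seed-region approach: average over all voxels, mixing both sources.
seed_r = corr(voxels.mean(axis=1), cortical)

# ICtc-style approach: use the source time course itself; in practice
# this is the time course recovered by the ICA unmixing step.
ic_r = corr(s1, cortical)

print(f"seed-average r = {seed_r:.2f}, IC time-course r = {ic_r:.2f}")
```

Averaging over spatially overlapping components dilutes the signal of each, so the seed correlation drops while the per-component time course keeps it, mirroring the study's finding for the primary motor cortex.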

  10. Improving finite element results in modeling heart valve mechanics.

    PubMed

    Earl, Emily; Mohammadi, Hadi

    2018-06-01

    Finite element analysis is a well-established computational tool which can be used for the analysis of soft tissue mechanics. Due to the structural complexity of the leaflet tissue of the heart valve, the currently available finite element models do not adequately represent the leaflet tissue. A method of addressing this issue is to implement computationally expensive finite element models, characterized by precise constitutive models including high-order and high-density mesh techniques. In this study, we introduce a novel numerical technique that enhances the results obtained from coarse mesh finite element models to provide accuracy comparable to that of fine mesh finite element models while maintaining a relatively low computational cost. Introduced in this study is a method by which the computational expense required to solve linear and nonlinear constitutive models, commonly used in heart valve mechanics simulations, is reduced while continuing to account for large and infinitesimal deformations. This continuum model is developed based on the least square algorithm procedure coupled with the finite difference method adhering to the assumption that the components of the strain tensor are available at all nodes of the finite element mesh model. The suggested numerical technique is easy to implement, practically efficient, and requires less computational time compared to currently available commercial finite element packages such as ANSYS and/or ABAQUS.

  11. Linguistic analysis of project ownership for undergraduate research experiences.

    PubMed

    Hanauer, D I; Frederick, J; Fotinakes, B; Strobel, S A

    2012-01-01

    We used computational linguistic and content analyses to explore the concept of project ownership for undergraduate research. We used linguistic analysis of student interview data to develop a quantitative methodology for assessing project ownership and applied this method to measure degrees of project ownership expressed by students in relation to different types of educational research experiences. The results of the study suggest that the design of a research experience significantly influences the degree of project ownership expressed by students when they describe those experiences. The analysis identified both positive and negative aspects of project ownership and provided a working definition for how a student experiences his or her research opportunity. These elements suggest several features that could be incorporated into an undergraduate research experience to foster a student's sense of project ownership.

  12. Exact Rayleigh scattering calculations for use with the Nimbus-7 Coastal Zone Color Scanner

    NASA Technical Reports Server (NTRS)

    Gordon, Howard R.; Brown, James W.; Evans, Robert H.

    1988-01-01

    The radiance reflected from a plane-parallel atmosphere and flat sea surface in the absence of aerosols has been determined with an exact multiple scattering code to improve the analysis of Nimbus-7 CZCS imagery. It is shown that the single scattering approximation normally used to compute this radiance can result in errors of up to 5 percent for small and moderate solar zenith angles. A scheme to include the effect of variations in the surface pressure in the exact computation of the Rayleigh radiance is discussed. The results of an application of these computations to CZCS imagery suggest that accurate atmospheric corrections can be obtained for solar zenith angles at least as large as 65 deg.

  13. Small-Noise Analysis and Symmetrization of Implicit Monte Carlo Samplers

    DOE PAGES

    Goodman, Jonathan; Lin, Kevin K.; Morzfeld, Matthias

    2015-07-06

    Implicit samplers are algorithms for producing independent, weighted samples from multivariate probability distributions. These are often applied in Bayesian data assimilation algorithms. We use Laplace asymptotic expansions to analyze two implicit samplers in the small noise regime. Our analysis suggests a symmetrization of the algorithms that leads to improved implicit sampling schemes at a relatively small additional cost. Here, computational experiments confirm the theory and show that symmetrization is effective for small noise sampling problems.

  14. A Nonparametric Statistical Approach to the Validation of Computer Simulation Models

    DTIC Science & Technology

    1985-11-01

Ballistic Research Laboratory, the Experimental Design and Analysis Branch of the Systems Engineering and Concepts Analysis Division was funded to... Winter, E. M., Wisemiler, D. P., and Ujihara, J. K., "Verification and Validation of Engineering Simulations with Minimal Data," Proceedings of the 1976 Summer... used by numerous authors. Law has augmented their approach with specific suggestions for each of the three stages: 1. develop high face-validity

  15. The Application of Systems Analysis and Mathematical Models to the Study of Erythropoiesis During Space Flight

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1974-01-01

Included in the report are: (1) review of the erythropoietic mechanisms; (2) an evaluation of existing models for the control of erythropoiesis; (3) a computer simulation of the model's response to hypoxia; (4) a hypothesis to explain observed decreases in red blood cell mass during weightlessness; (5) suggestions for further research; and (6) an assessment of the role that systems analysis can play in the Skylab hematological program.

  16. Cost-Benefit Analysis of Computer Resources for Machine Learning

    USGS Publications Warehouse

    Champion, Richard A.

    2007-01-01

    Machine learning describes pattern-recognition algorithms - in this case, probabilistic neural networks (PNNs). These can be computationally intensive, in part because of the nonlinear optimizer, a numerical process that calibrates the PNN by minimizing a sum of squared errors. This report suggests efficiencies that are expressed as cost and benefit. The cost is computer time needed to calibrate the PNN, and the benefit is goodness-of-fit, how well the PNN learns the pattern in the data. There may be a point of diminishing returns where a further expenditure of computer resources does not produce additional benefits. Sampling is suggested as a cost-reduction strategy. One consideration is how many points to select for calibration and another is the geometric distribution of the points. The data points may be nonuniformly distributed across space, so that sampling at some locations provides additional benefit while sampling at other locations does not. A stratified sampling strategy can be designed to select more points in regions where they reduce the calibration error and fewer points in regions where they do not. Goodness-of-fit tests ensure that the sampling does not introduce bias. This approach is illustrated by statistical experiments for computing correlations between measures of roadless area and population density for the San Francisco Bay Area. The alternative to training efficiencies is to rely on high-performance computer systems. These may require specialized programming and algorithms that are optimized for parallel performance.

  17. Computer-Based Image Analysis for Plus Disease Diagnosis in Retinopathy of Prematurity: Performance of the "i-ROP" System and Image Features Associated With Expert Diagnosis.

    PubMed

    Ataer-Cansizoglu, Esra; Bolon-Canedo, Veronica; Campbell, J Peter; Bozkurt, Alican; Erdogmus, Deniz; Kalpathy-Cramer, Jayashree; Patel, Samir; Jonas, Karyn; Chan, R V Paul; Ostmo, Susan; Chiang, Michael F

    2015-11-01

We developed and evaluated the performance of a novel computer-based image analysis system for grading plus disease in retinopathy of prematurity (ROP), and identified the image features, shapes, and sizes that best correlate with expert diagnosis. A dataset of 77 wide-angle retinal images from infants screened for ROP was collected. A reference standard diagnosis was determined for each image by combining image grading from 3 experts with the clinical diagnosis from ophthalmoscopic examination. Manually segmented images were cropped into a range of shapes and sizes, and a computer algorithm was developed to extract tortuosity and dilation features from arteries and veins. Each feature was fed into our system to identify the set of characteristics that yielded the highest-performing system compared to the reference standard, which we refer to as the "i-ROP" system. Among the tested crop shapes, sizes, and measured features, point-based measurements of arterial and venous tortuosity (combined), and a large circular cropped image (with radius 6 times the disc diameter), provided the highest diagnostic accuracy. The i-ROP system achieved 95% accuracy for classifying preplus and plus disease compared to the reference standard. This was comparable to the performance of the 3 individual experts (96%, 94%, 92%), and significantly higher than the mean performance of 31 nonexperts (81%). This comprehensive analysis of computer-based plus disease diagnosis suggests that it may be feasible to develop a fully-automated system based on wide-angle retinal images that performs comparably to expert graders at three-level plus disease discrimination. Computer-based image analysis, using objective and quantitative retinal vascular features, has potential to complement clinical ROP diagnosis by ophthalmologists.

  18. Representational Similarity Analysis – Connecting the Branches of Systems Neuroscience

    PubMed Central

    Kriegeskorte, Nikolaus; Mur, Marieke; Bandettini, Peter

    2008-01-01

    A fundamental challenge for systems neuroscience is to quantitatively relate its three major branches of research: brain-activity measurement, behavioral measurement, and computational modeling. Using measured brain-activity patterns to evaluate computational network models is complicated by the need to define the correspondency between the units of the model and the channels of the brain-activity data, e.g., single-cell recordings or voxels from functional magnetic resonance imaging (fMRI). Similar correspondency problems complicate relating activity patterns between different modalities of brain-activity measurement (e.g., fMRI and invasive or scalp electrophysiology), and between subjects and species. In order to bridge these divides, we suggest abstracting from the activity patterns themselves and computing representational dissimilarity matrices (RDMs), which characterize the information carried by a given representation in a brain or model. Building on a rich psychological and mathematical literature on similarity analysis, we propose a new experimental and data-analytical framework called representational similarity analysis (RSA), in which multi-channel measures of neural activity are quantitatively related to each other and to computational theory and behavior by comparing RDMs. We demonstrate RSA by relating representations of visual objects as measured with fMRI in early visual cortex and the fusiform face area to computational models spanning a wide range of complexities. The RDMs are simultaneously related via second-level application of multidimensional scaling and tested using randomization and bootstrap techniques. We discuss the broad potential of RSA, including novel approaches to experimental design, and argue that these ideas, which have deep roots in psychology and neuroscience, will allow the integrated quantitative analysis of data from all three branches, thus contributing to a more unified systems neuroscience. PMID:19104670
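The core RSA move, abstracting from activity patterns to representational dissimilarity matrices so that a brain region and a model with different channel counts become comparable, can be sketched as follows. The shared latent structure, channel counts, and noise levels are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n_stim = 12

# Shared 5-dimensional stimulus structure, read out by a "brain" region
# (100 channels) and a "model" (40 units) -- channel counts need not match.
structure = rng.normal(size=(n_stim, 5))
brain = structure @ rng.normal(size=(5, 100)) \
    + 0.5 * rng.normal(size=(n_stim, 100))
model = structure @ rng.normal(size=(5, 40)) \
    + 0.5 * rng.normal(size=(n_stim, 40))

def rdm(patterns):
    # Representational dissimilarity matrix: 1 - Pearson correlation
    # between each pair of stimulus patterns; keep the upper triangle.
    d = 1.0 - np.corrcoef(patterns)
    iu = np.triu_indices(len(patterns), k=1)
    return d[iu]

def spearman(a, b):
    # Rank correlation via double argsort (no ties in continuous data).
    ra = a.argsort().argsort()
    rb = b.argsort().argsort()
    return float(np.corrcoef(ra, rb)[0, 1])

rho = spearman(rdm(brain), rdm(model))
print(f"RDM Spearman rho = {rho:.2f}")
```

Because each RDM lives in stimulus-pair space rather than channel space, the same comparison works across fMRI voxels, electrode recordings, model units, or behavioral judgments, which is the correspondency problem the framework dissolves.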

  19. Computational analysis of microRNA function in heart development.

    PubMed

    Liu, Ganqiang; Ding, Min; Chen, Jiajia; Huang, Jinyan; Wang, Haiyun; Jing, Qing; Shen, Bairong

    2010-09-01

    Emerging evidence suggests that specific spatio-temporal microRNA (miRNA) expression is required for heart development. In recent years, hundreds of miRNAs have been discovered. In contrast, functional annotations are available only for a very small fraction of these regulatory molecules. In order to provide a global perspective for the biologists who study the relationship between differentially expressed miRNAs and heart development, we employed computational analysis to uncover the specific cellular processes and biological pathways targeted by miRNAs in mouse heart development. Here, we utilized Gene Ontology (GO) categories, KEGG Pathway, and GeneGo Pathway Maps as a gene functional annotation system for miRNA target enrichment analysis. The target genes of miRNAs were found to be enriched in functional categories and pathway maps in which miRNAs could play important roles during heart development. Meanwhile, we developed miRHrt (http://sysbio.suda.edu.cn/mirhrt/), a database aiming to provide a comprehensive resource of miRNA function in regulating heart development. These computational analysis results effectively illustrated the correlation of differentially expressed miRNAs with cellular functions and heart development. We hope that the identified novel heart development-associated pathways and the database presented here would facilitate further understanding of the roles and mechanisms of miRNAs in heart development.

  20. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    PubMed

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately and most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes.

  1. A Mobile Computing Solution for Collecting Functional Analysis Data on a Pocket PC

    PubMed Central

    Jackson, James; Dixon, Mark R

    2007-01-01

The present paper provides a task analysis for creating a computerized data system using a Pocket PC and Microsoft Visual Basic. With Visual Basic software and any handheld device running the Windows Mobile operating system, this task analysis will allow behavior analysts to program and customize their own functional analysis data-collection system. The program will allow the user to select the type of behavior to be recorded, choose between interval and frequency data collection, and summarize data for graphing and analysis. We also provide suggestions for customizing the data-collection system for idiosyncratic research and clinical needs. PMID:17624078

  2. Computational Analysis of the Flow and Acoustic Effects of Jet-Pylon Interaction

    NASA Technical Reports Server (NTRS)

    Hunter, Craig A.; Thomas, Russell H.; Abdol-Hamid, K. S.; Pao, S. Paul; Elmiligui, Alaa A.; Massey, Steven J.

    2005-01-01

Computational simulation and prediction tools were used to understand the jet-pylon interaction effect in a set of bypass-ratio five core/fan nozzles. Results suggest that the pylon acts as a large scale mixing vane that perturbs the jet flow and jump starts the jet mixing process. The enhanced mixing and associated secondary flows from the pylon result in a net increase of noise in the first 10 diameters of the jet's development, but there is a sustained reduction in noise from that point downstream. This is likely the reason the pylon nozzle is quieter overall than the baseline round nozzle in this case. The present work suggests that focused pylon design could lead to advanced pylon shapes and nozzle configurations that take advantage of propulsion-airframe integration to provide additional noise reduction capabilities.

  3. Experimental and Computational Analysis of Polyglutamine-Mediated Cytotoxicity

    PubMed Central

    Tang, Matthew Y.; Proctor, Carole J.; Woulfe, John; Gray, Douglas A.

    2010-01-01

Expanded polyglutamine (polyQ) proteins are known to be the causative agents of a number of human neurodegenerative diseases but the molecular basis of their cytotoxicity is still poorly understood. PolyQ tracts may impede the activity of the proteasome, and evidence from single cell imaging suggests that the sequestration of polyQ into inclusion bodies can reduce the proteasomal burden and promote cell survival, at least in the short term. The presence of misfolded protein also leads to activation of stress kinases such as p38MAPK, which can be cytotoxic. The relationships of these systems are not well understood. We have used fluorescent reporter systems imaged in living cells, and stochastic computer modeling to explore the relationships of polyQ, p38MAPK activation, generation of reactive oxygen species (ROS), proteasome inhibition, and inclusion body formation. In cells expressing a polyQ protein, inclusion body formation was preceded by proteasome inhibition but cytotoxicity was greatly reduced by administration of a p38MAPK inhibitor. Computer simulations suggested that without the generation of ROS, the proteasome inhibition and activation of p38MAPK would have significantly reduced toxicity. Our data suggest a vicious cycle of stress kinase activation and proteasome inhibition that is ultimately lethal to cells. There was close agreement between experimental data and the predictions of a stochastic computer model, supporting a central role for proteasome inhibition and p38MAPK activation in inclusion body formation and ROS-mediated cell death. PMID:20885783

  4. Syntactic Computations in the Language Network: Characterizing Dynamic Network Properties Using Representational Similarity Analysis

    PubMed Central

    Tyler, Lorraine K.; Cheung, Teresa P. L.; Devereux, Barry J.; Clarke, Alex

    2013-01-01

The core human capacity of syntactic analysis involves a left hemisphere network involving left inferior frontal gyrus (LIFG) and left posterior middle temporal gyrus (LpMTG) and the anatomical connections between them. Here we use magnetoencephalography (MEG) to determine the spatio-temporal properties of syntactic computations in this network. Listeners heard spoken sentences containing a local syntactic ambiguity (e.g., “… landing planes …”), at the offset of which they heard a disambiguating verb and decided whether it was an acceptable/unacceptable continuation of the sentence. We charted the time-course of processing and resolving syntactic ambiguity by measuring MEG responses from the onset of each word in the ambiguous phrase and the disambiguating word. We used representational similarity analysis (RSA) to characterize syntactic information represented in the LIFG and LpMTG over time and to investigate their relationship to each other. Testing a variety of lexico-syntactic and ambiguity models against the MEG data, our results suggest early lexico-syntactic responses in the LpMTG and later effects of ambiguity in the LIFG, pointing to a clear differentiation in the functional roles of these two regions. Our results suggest the LpMTG represents and transmits lexical information to the LIFG, which responds to and resolves the ambiguity. PMID:23730293

  5. Computer Simulation Is an Undervalued Tool for Genetic Analysis: A Historical View and Presentation of SHIMSHON – A Web-Based Genetic Simulation Package

    PubMed Central

    Greenberg, David A.

    2011-01-01

Computer simulation methods are under-used tools in genetic analysis because simulation approaches have been portrayed as inferior to analytic methods. Even when simulation is used, its advantages are not fully exploited. Here, I present SHIMSHON, our package of genetic simulation programs that have been developed, tested, used for research, and used to generate data for Genetic Analysis Workshops (GAW). These simulation programs, now web-accessible, can be used by anyone to answer questions about designing and analyzing genetic disease studies for locus identification. This work has three foci: (1) the historical context of SHIMSHON's development, suggesting why simulation has not been more widely used so far. (2) Advantages of simulation: computer simulation helps us to understand how genetic analysis methods work. It has advantages for understanding disease inheritance and methods for gene searches. Furthermore, simulation methods can be used to answer fundamental questions that either cannot be answered by analytical approaches or cannot even be defined until the problems are identified and studied, using simulation. (3) I argue that, because simulation was not accepted, there was a failure to grasp the meaning of some simulation-based studies of linkage. This may have contributed to perceived weaknesses in linkage analysis; weaknesses that did not, in fact, exist. PMID:22189467

  6. Using Computer-Adaptive Quizzing as a Tool for National Council Licensure Examination Success.

    PubMed

    Pence, Jill; Wood, Felecia

    This study examined the relationship between using computer-adaptive quizzing (CAQ) and first-time National Council Licensure Examination (NCLEX) success. A retrospective, descriptive, correlational design was used to analyze the relationship between use of a CAQ program and first-time NCLEX results of 194 baccalaureate graduates. Chi-square analysis suggested that there was an association between using the software and NCLEX success (p < .001, df = 1), with 16 percent of those without access compared to 1 percent with access being unsuccessful on the licensure exam. Results support using CAQ as formative preparation for the NCLEX.

  7. Longitudinal effects of college type and selectivity on degrees conferred upon undergraduate females in physical science, life science, math and computer science, and social science

    NASA Astrophysics Data System (ADS)

    Stevens, Stacy Mckimm

    There has been much research to suggest that a single-sex college experience for female undergraduate students can increase self-confidence and leadership ability during the college years and beyond. The results of previous studies also suggest that these students achieve in the workforce and enter graduate school at higher rates than their female peers graduating from coeducational institutions. However, some researchers have questioned these findings, suggesting that it is the selectivity level of the colleges rather than the gender composition of the student body that causes these differences. The purpose of this study was to justify the continuation of single-sex educational opportunities for females at the post-secondary level by examining the effects that college selectivity, college type, and time have on the rate of undergraduate females pursuing majors in non-traditional fields. The study examined the percentage of physical science, life science, math and computer science, and social science degrees conferred upon females graduating from women's colleges from 1985-2001, as compared to those at comparable coeducational colleges. Sampling for this study consisted of 42 liberal arts women's (n = 21) and coeducational (n = 21) colleges. Variables included the type of college, the selectivity level of the college, and the effect of time on the percentage of female graduates. Doubly multivariate repeated measures analysis of variance testing revealed significant main effects for college selectivity on social science graduates, and time on both life science and math and computer science graduates. Significant interaction was also found between the college type and time on social science graduates, as well as the college type, selectivity level, and time on math and computer science graduates. Implications of the results and suggestions for further research are discussed.

  8. The Trapping Index: How to integrate the Eulerian and the Lagrangian approach for the computation of the transport time scales of semi-enclosed basins.

    PubMed

    Cucco, Andrea; Umgiesser, Georg

    2015-09-15

    In this work, we investigated whether the Eulerian and the Lagrangian approaches for the computation of the Transport Time Scales (TTS) of semi-enclosed water bodies can be used univocally to define the spatial variability of basin flushing features. The Eulerian and Lagrangian TTS were computed for both simplified test cases and a realistic domain: the Venice Lagoon. The results confirmed that the two approaches cannot be adopted univocally and that the spatial variability of the water renewal capacity can be investigated only through the computation of both TTS. A specific analysis, based on the computation of a so-called Trapping Index, was then suggested to integrate the information provided by the two different approaches. The obtained results proved the Trapping Index useful for avoiding misleading interpretations that arise from evaluating basin renewal features from an Eulerian-only or a Lagrangian-only perspective. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Integrating the Apache Big Data Stack with HPC for Big Data

    NASA Astrophysics Data System (ADS)

    Fox, G. C.; Qiu, J.; Jha, S.

    2014-12-01

    There is perhaps a broad consensus as to the important issues in practical parallel computing as applied to large-scale simulations; this is reflected in supercomputer architectures, algorithms, libraries, languages, compilers, and best practice for application development. However, the same is not so true for data-intensive computing, even though commercial clouds devote much more resources to data analytics than supercomputers devote to simulations. We look at a sample of over 50 big data applications to identify characteristics of data-intensive applications and to deduce needed runtimes and architectures. We suggest a big data version of the famous Berkeley dwarfs and NAS parallel benchmarks and use these to identify a few key classes of hardware/software architectures. Our analysis builds on combining HPC with ABDS, the Apache big data software stack that is widely used in modern cloud computing. Initial results on clouds and HPC systems are encouraging. We propose the development of SPIDAL (Scalable Parallel Interoperable Data Analytics Library), built on system and data abstractions suggested by the HPC-ABDS architecture. We discuss how it can be used in several application areas, including Polar Science.

  10. Computers and Young Children: New Frontiers in Computer Hardware and Software or What Computer Should I Buy?

    ERIC Educational Resources Information Center

    Shade, Daniel D.

    1994-01-01

    Provides advice and suggestions for educators or parents who are trying to decide what type of computer to buy to run the latest computer software for children. Suggests that purchasers should buy a computer with as large a hard drive as possible, at least 10 megabytes of RAM, and a CD-ROM drive. (MDM)

  11. Data Processing: Fifteen Suggestions for Computer Training in Your Business Education Classes.

    ERIC Educational Resources Information Center

    Barr, Lowell L.

    1980-01-01

    Presents 15 suggestions for training business education students in the use of computers. Suggestions involve computer language, method of presentation, laboratory time, programing assignments, instructions and handouts, problem solving, deadlines, reviews, programming concepts, programming logic, documentation, and defensive programming. (CT)

  12. Grouping Inhibits Motion Fading by Giving Rise to Virtual Trackable Features

    ERIC Educational Resources Information Center

    Hsieh, P. -J.; Tse, P. U.

    2007-01-01

    After prolonged viewing of a slowly drifting or rotating pattern under strict fixation, the pattern appears to slow down and then momentarily stop. The authors show that grouping can slow down the process of "motion fading," suggesting that cortical configural form analysis interacts with the computation of motion signals during motion fading. The…

  13. Capital Budgeting Guidelines: How to Decide Whether to Fund a New Dorm or an Upgraded Computer Lab.

    ERIC Educational Resources Information Center

    Swiger, John; Klaus, Allen

    1996-01-01

    A process for college and university decision making and budgeting for capital outlays that focuses on evaluating the qualitative and quantitative benefits of each proposed project is described and illustrated. The process provides a means to solicit suggestions from those involved and provide detailed information for cost-benefit analysis. (MSE)

  14. A Theoretical Analysis of Learning with Graphics--Implications for Computer Graphics Design.

    ERIC Educational Resources Information Center

    ChanLin, Lih-Juan

    This paper reviews the literature pertinent to learning with graphics. Dual coding theory explains how graphics are stored and processed in semantic memory. The level-of-processing theory suggests how graphics can be employed in learning to encourage deeper processing. In addition to dual coding theory and level of processing…

  15. Mentalistic Basis of Core Social Cognition: Experiments in Preverbal Infants and a Computational Model

    ERIC Educational Resources Information Center

    Hamlin, J. Kiley; Ullman, Tomer; Tenenbaum, Josh; Goodman, Noah; Baker, Chris

    2013-01-01

    Evaluating individuals based on their pro- and anti-social behaviors is fundamental to successful human interaction. Recent research suggests that even preverbal infants engage in social evaluation; however, it remains an open question whether infants' judgments are driven uniquely by an analysis of the mental states that motivate others' helpful…

  16. Sensitivities to Early Exchange in Synchronous Computer-Supported Collaborative Learning (CSCL) Groups

    ERIC Educational Resources Information Center

    Kapur, Manu; Voiklis, John; Kinzer, Charles K.

    2008-01-01

    This study reports the impact of high sensitivity to early exchange in 11th-grade, CSCL triads solving well- and ill-structured problems in Newtonian Kinematics. A mixed-method analysis of the evolution of participation inequity (PI) in group discussions suggested that participation levels tended to get locked-in relatively early on in the…

  17. Monitoring the vernal advancement and retrogradation (green wave effect) of natural vegetation

    NASA Technical Reports Server (NTRS)

    Rouse, J. W., Jr. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Preliminary evaluation of autumnal phase ground truth data suggests that the sampling procedures at the Great Plains Corridor network test sites are adequate to show relatively small temporal changes in above-ground vegetation biomass and vegetation condition. Vegetation changes measured August through December reflect grazing intensity and environmental conditions at the test sites. Preliminary analysis of black and white imagery suggests that detail in vegetation patterns is much greater than originally anticipated. A first-look analysis of single-band imagery and digital data at two locations shows that woodland, grassland, and cropland areas are easily delineated. Computer-derived grey-scale maps from MSS digital data were shown to be useful in identifying the location of small fields and features of the natural and cultivated lands. Single-band imagery and digital data are believed to have important application for synoptic land use mapping and inventory. Initial ratio analysis, using band 5 and band 7 data, suggests its applicability as a measure of the greenness of a vegetative scene.
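
    The band-5 (red) and band-7 (near-infrared) ratio described in this record can be illustrated with a short sketch. The radiance values below are hypothetical, and the normalized-difference form shown alongside is the formulation this line of work later popularized; neither is taken from the report itself.

```python
# Hedged sketch of a red/near-infrared band-ratio greenness measure.
# The input radiances are illustrative, not data from the study.

def ratio_index(band5, band7):
    """Simple NIR/red ratio; greener scenes give larger values."""
    return band7 / band5

def normalized_difference(band5, band7):
    """Normalized-difference form of the same contrast, bounded in [-1, 1]."""
    return (band7 - band5) / (band7 + band5)

# A vigorous canopy reflects strongly in the NIR and absorbs red light.
canopy = ratio_index(band5=10.0, band7=45.0)       # 4.5
soil = ratio_index(band5=30.0, band7=35.0)         # ~1.17
canopy_nd = normalized_difference(10.0, 45.0)      # ~0.64
```

    Both forms carry the same red/NIR contrast; the normalized difference is simply bounded, which makes scenes easier to compare.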

  18. Error and Symmetry Analysis of Misner's Algorithm for Spherical Harmonic Decomposition on a Cubic Grid

    NASA Technical Reports Server (NTRS)

    Fiske, David R.

    2004-01-01

    In an earlier paper, Misner (2004, Class. Quant. Grav., 21, S243) presented a novel algorithm for computing the spherical harmonic components of data represented on a cubic grid. I extend Misner's original analysis by making detailed error estimates of the numerical errors accrued by the algorithm, by using symmetry arguments to suggest a more efficient implementation scheme, and by explaining how the algorithm can be applied efficiently on data with explicit reflection symmetries.
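
    To make the notion of "spherical harmonic components of gridded data" concrete, the sketch below projects a sampled function onto Y_{1,0} by plain midpoint-rule quadrature. This is a generic illustration, not Misner's cubic-grid algorithm, and the grid resolution is an arbitrary choice.

```python
import math

def y10(theta):
    # Real spherical harmonic Y_{1,0}(theta) = sqrt(3 / (4*pi)) * cos(theta)
    return math.sqrt(3.0 / (4.0 * math.pi)) * math.cos(theta)

def project_y10(f, n_theta=200, n_phi=200):
    """Midpoint-rule estimate of the Y_{1,0} component of f(theta, phi):
    c = integral of f * Y_{1,0} * sin(theta) dtheta dphi over the sphere."""
    dt = math.pi / n_theta
    dp = 2.0 * math.pi / n_phi
    total = 0.0
    for i in range(n_theta):
        theta = (i + 0.5) * dt
        weight = math.sin(theta) * dt * dp
        for j in range(n_phi):
            phi = (j + 0.5) * dp
            total += f(theta, phi) * y10(theta) * weight
    return total

# By orthonormality, projecting Y_{1,0} onto itself gives a coefficient near 1.
coeff = project_y10(lambda theta, phi: y10(theta))
```

    Misner's contribution, and the subject of this record's error analysis, is doing this kind of projection accurately and efficiently when the samples sit on a cubic grid rather than on a spherical one.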

  19. Kinematic Analysis of Cpm Machine Supporting to Rehabilitation Process after Surgical Knee Arthroscopy and Arthroplasty

    NASA Astrophysics Data System (ADS)

    Trochimczuk, R.; Kuźmierowski, T.

    2014-11-01

    Existing commercial solutions of CPM (Continuous Passive Motion) machines are described in the paper. Based on the analysis of existing solutions, we present our conceptual solution to support the process of rehabilitation of the knee joint, which is necessary after arthroscopic surgery. For the given novel structure we analyze and present proprietary algorithms and the computer application to simulate the operation of our CPM device. In addition, we suggest directions for further research.

  20. Teaching NMR spectra analysis with nmr.cheminfo.org.

    PubMed

    Patiny, Luc; Bolaños, Alejandro; Castillo, Andrés M; Bernal, Andrés; Wist, Julien

    2018-06-01

    Teaching spectra analysis and structure elucidation requires students to get trained on real problems. This involves solving exercises of increasing complexity and, when necessary, using computational tools. Although desktop software packages exist for this purpose, the nmr.cheminfo.org platform offers students an online alternative. It provides a set of exercises and tools to help solve them. Only a small number of exercises are currently available, but contributors are invited to submit new ones and suggest new types of problems. Copyright © 2018 John Wiley & Sons, Ltd.

  1. Typing of Human Mycobacterium avium Isolates in Italy by IS1245-Based Restriction Fragment Length Polymorphism Analysis

    PubMed Central

    Lari, Nicoletta; Cavallini, Michela; Rindi, Laura; Iona, Elisabetta; Fattorini, Lanfranco; Garzelli, Carlo

    1998-01-01

    All but 2 of 63 Mycobacterium avium isolates from distinct geographic areas of Italy exhibited markedly polymorphic, multibanded IS1245 restriction fragment length polymorphism (RFLP) patterns; 2 isolates showed the low-number banding pattern typical of bird isolates. By computer analysis, 41 distinct IS1245 patterns and 10 clusters of essentially identical strains were detected; 40% of the 63 isolates showed genetic relatedness, suggesting the existence of a predominant AIDS-associated IS1245 RFLP pattern. PMID:9817900

  2. Secure distributed genome analysis for GWAS and sequence comparison computation.

    PubMed

    Zhang, Yihua; Blanton, Marina; Almashaqbeh, Ghada

    2015-01-01

    The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice.
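
    For readers unfamiliar with the underlying statistics, the sketch below computes the two quantities the protocols protect, minor allele frequency and a Pearson chi-squared statistic, in the clear. It is a plain illustration of the math, not the secret-sharing protocol itself, and the counts are made up.

```python
def minor_allele_frequency(n_aa, n_ab, n_bb):
    """MAF from genotype counts (AA, AB, BB) at a biallelic site."""
    a_count = 2 * n_aa + n_ab
    b_count = 2 * n_bb + n_ab
    return min(a_count, b_count) / (a_count + b_count)

def chi_squared_2x2(a, b, c, d):
    """Pearson chi-squared statistic for a 2x2 allele-count table
    [[a, b], [c, d]], e.g. case/control rows and minor/major columns."""
    n = a + b + c + d
    numerator = n * (a * d - b * c) ** 2
    denominator = (a + b) * (c + d) * (a + c) * (b + d)
    return numerator / denominator

maf = minor_allele_frequency(n_aa=50, n_ab=40, n_bb=10)   # 0.3
stat = chi_squared_2x2(10, 20, 20, 10)                    # ~6.67
```

    The secure versions in the record evaluate exactly such arithmetic, but over secret-shared inputs so that no party sees the raw genotype counts.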

  3. Secure distributed genome analysis for GWAS and sequence comparison computation

    PubMed Central

    2015-01-01

    Background The rapid increase in the availability and volume of genomic data makes significant advances in biomedical research possible, but sharing of genomic data poses challenges due to the highly sensitive nature of such data. To address the challenges, a competition for secure distributed processing of genomic data was organized by the iDASH research center. Methods In this work we propose techniques for securing computation with real-life genomic data for minor allele frequency and chi-squared statistics computation, as well as distance computation between two genomic sequences, as specified by the iDASH competition tasks. We put forward novel optimizations, including a generalization of a version of mergesort, which might be of independent interest. Results We provide implementation results of our techniques based on secret sharing that demonstrate practicality of the suggested protocols and also report on performance improvements due to our optimization techniques. Conclusions This work describes our techniques, findings, and experimental results developed and obtained as part of iDASH 2015 research competition to secure real-life genomic computations and shows feasibility of securely computing with genomic data in practice. PMID:26733307

  4. Uncertainty quantification based on pillars of experiment, theory, and computation. Part I: Data analysis

    NASA Astrophysics Data System (ADS)

    Elishakoff, I.; Sarlin, N.

    2016-06-01

    In this paper we provide a general methodology for the analysis and design of systems involving uncertainties. Available experimental data are enclosed by one of several geometric figures (triangle, rectangle, ellipse, parallelogram, super ellipse) of minimum area. These areas are then inflated, resorting to the Chebyshev inequality, in order to take into account the forecasted data. The next step consists of evaluating the response of the system when uncertainties are confined to one of the above five suitably inflated geometric figures. This step involves a combined theoretical and computational analysis. We evaluate the maximum response of the system subjected to variation of uncertain parameters in each hypothesized region. The results of triangular, interval, ellipsoidal, parallelogram, and super ellipsoidal calculi are compared with the view of identifying the region that leads to the minimum of the maximum response. That response is identified as a result of the suggested predictive inference. The methodology thus synthesizes the probabilistic notion with each of the five calculi. Using the term "pillar" in the title was inspired by the News Release (2013) on according the Honda Prize to J. Tinsley Oden, stating, among others, that "Dr. Oden refers to computational science as the 'third pillar' of scientific inquiry, standing beside theoretical and experimental science. Computational science serves as a new paradigm for acquiring knowledge and informing decisions important to humankind". Analysis of systems with uncertainties necessitates employment of all three pillars. The analysis is based on the assumption that the five shapes are each different conservative estimates of the true bounding region. The smallest of the maximal displacements in the x and y directions (for a 2D system) therefore provides the closest estimate of the true displacements based on the above assumption.
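
    The inflation step can be sketched for the simplest enclosing figure, an axis-aligned rectangle. Chebyshev's inequality guarantees P(|X - mu| >= k*sigma) <= 1/k^2 for any distribution, so a half-width of k*sigma with k = 1/sqrt(1 - p) covers at least a fraction p of forecasted data. The sample data and coverage level below are illustrative; the paper's other figures (triangle, ellipse, parallelogram, super ellipse) would need analogous but shape-specific inflation.

```python
import math

def chebyshev_halfwidth(sigma, coverage):
    """Half-width k*sigma that, by Chebyshev's inequality, contains at
    least `coverage` of any distribution with standard deviation sigma."""
    k = 1.0 / math.sqrt(1.0 - coverage)
    return k * sigma

def inflated_rectangle(xs, ys, coverage=0.75):
    """Axis-aligned rectangle centered on the sample means, inflated via
    Chebyshev so each axis covers at least `coverage` of future samples."""
    def mean(v):
        return sum(v) / len(v)

    def std(v):
        m = mean(v)
        return math.sqrt(sum((u - m) ** 2 for u in v) / len(v))

    hx = chebyshev_halfwidth(std(xs), coverage)
    hy = chebyshev_halfwidth(std(ys), coverage)
    mx, my = mean(xs), mean(ys)
    return (mx - hx, mx + hx), (my - hy, my + hy)

# 75% coverage gives k = 2: the familiar "two sigma" Chebyshev bound.
x_bounds, y_bounds = inflated_rectangle([0, 1, 2, 3, 4], [0, 2, 4, 6, 8])
```

    Because Chebyshev holds for any distribution, the resulting region is conservative, which matches the paper's framing of the five shapes as conservative estimates of the true bounding region.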

  5. A Novel Quantitative Computed Tomographic Analysis Suggests How Sirolimus Stabilizes Progressive Air Trapping in Lymphangioleiomyomatosis

    PubMed Central

    Kokosi, Maria; Lo, Pechin; Kim, Hyun J.; Ravenel, James G.; Meyer, Cristopher; Goldin, Jonathan; Lee, Hye-Seung; Strange, Charlie; McCormack, Francis X.

    2016-01-01

    Rationale: The Multicenter International Lymphangioleiomyomatosis Efficacy and Safety of Sirolimus (MILES) trial demonstrated that sirolimus stabilized lung function and improved measures of functional performance and quality of life in patients with lymphangioleiomyomatosis. The physiologic mechanisms of these beneficial actions of sirolimus are incompletely understood. Objectives: To prospectively determine the longitudinal computed tomographic lung imaging correlates of lung function change in MILES patients treated with placebo or sirolimus. Methods: We determined the baseline to 12-month change in computed tomographic image–derived lung volumes and the volume of the lung occupied by cysts in the 31 MILES participants (17 in sirolimus group, 14 in placebo group) with baseline and 12-month scans. Measurements and Main Results: There was a trend toward an increase in median expiratory cyst volume percentage in the placebo group and a reduction in the sirolimus group (+2.68% vs. +0.97%, respectively; P = 0.10). The computed tomographic image–derived residual volume and the ratio of residual volume to total lung capacity increased more in the placebo group than in the sirolimus group (+214.4 ml vs. +2.9 ml [P = 0.054] and +0.05 ml vs. −0.01 ml [P = 0.0498], respectively). A Markov transition chain analysis of respiratory cycle cyst volume changes revealed greater dynamic variation in the sirolimus group than in the placebo group at the 12-month time point. Conclusions: Collectively, these data suggest that sirolimus attenuates progressive gas trapping in lymphangioleiomyomatosis, consistent with a beneficial effect of the drug on airflow obstruction. We speculate that a reduction in lymphangioleiomyomatosis cell burden around small airways and cyst walls alleviates progressive airflow limitation and facilitates cyst emptying. PMID:26799509

  6. A Novel Quantitative Computed Tomographic Analysis Suggests How Sirolimus Stabilizes Progressive Air Trapping in Lymphangioleiomyomatosis.

    PubMed

    Argula, Rahul G; Kokosi, Maria; Lo, Pechin; Kim, Hyun J; Ravenel, James G; Meyer, Cristopher; Goldin, Jonathan; Lee, Hye-Seung; Strange, Charlie; McCormack, Francis X

    2016-03-01

    The Multicenter International Lymphangioleiomyomatosis Efficacy and Safety of Sirolimus (MILES) trial demonstrated that sirolimus stabilized lung function and improved measures of functional performance and quality of life in patients with lymphangioleiomyomatosis. The physiologic mechanisms of these beneficial actions of sirolimus are incompletely understood. To prospectively determine the longitudinal computed tomographic lung imaging correlates of lung function change in MILES patients treated with placebo or sirolimus. We determined the baseline to 12-month change in computed tomographic image-derived lung volumes and the volume of the lung occupied by cysts in the 31 MILES participants (17 in sirolimus group, 14 in placebo group) with baseline and 12-month scans. There was a trend toward an increase in median expiratory cyst volume percentage in the placebo group and a reduction in the sirolimus group (+2.68% vs. +0.97%, respectively; P = 0.10). The computed tomographic image-derived residual volume and the ratio of residual volume to total lung capacity increased more in the placebo group than in the sirolimus group (+214.4 ml vs. +2.9 ml [P = 0.054] and +0.05 ml vs. -0.01 ml [P = 0.0498], respectively). A Markov transition chain analysis of respiratory cycle cyst volume changes revealed greater dynamic variation in the sirolimus group than in the placebo group at the 12-month time point. Collectively, these data suggest that sirolimus attenuates progressive gas trapping in lymphangioleiomyomatosis, consistent with a beneficial effect of the drug on airflow obstruction. We speculate that a reduction in lymphangioleiomyomatosis cell burden around small airways and cyst walls alleviates progressive airflow limitation and facilitates cyst emptying.

  7. A Simple Method for Automated Equilibration Detection in Molecular Simulations.

    PubMed

    Chodera, John D

    2016-04-12

    Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure and demonstrate its utility on typical molecular simulation data.
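
    The procedure lends itself to a compact sketch: scan candidate discard times t0, estimate the statistical inefficiency g of the remaining data, and keep the t0 that maximizes N_eff = (T - t0) / g. The truncated-autocorrelation estimator of g below is a common textbook choice; the author's reference implementation may differ in its estimator and scan strategy.

```python
def statistical_inefficiency(x):
    """g = 1 + 2 * sum of normalized autocorrelations, truncated at the
    first non-positive lag (a simple, common estimator)."""
    n = len(x)
    mean = sum(x) / n
    dev = [v - mean for v in x]
    var = sum(v * v for v in dev) / n
    if var == 0.0:
        return 1.0
    g = 1.0
    for lag in range(1, n):
        c = sum(dev[i] * dev[i + lag] for i in range(n - lag))
        c /= (n - lag) * var
        if c <= 0.0:
            break
        g += 2.0 * c
    return max(g, 1.0)

def detect_equilibration(x, n_candidates=20):
    """Pick the discard time t0 maximizing the number of effectively
    uncorrelated samples N_eff = (n - t0) / g(t0) in the remainder."""
    n = len(x)
    best_t0, best_neff = 0, 0.0
    for t0 in range(0, n - 2, max(1, n // n_candidates)):
        neff = (n - t0) / statistical_inefficiency(x[t0:])
        if neff > best_neff:
            best_t0, best_neff = t0, neff
    return best_t0, best_neff

# A decaying 10-sample transient followed by an equilibrated stretch:
# the scan discards the transient because it inflates g.
t0, neff = detect_equilibration([10.0 - i for i in range(10)] + [0.0] * 90)
```

    Discarding too little leaves the correlated transient in the average; discarding too much throws away usable samples. Maximizing N_eff balances the two without assuming a particular distribution for the observable.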

  8. A simple method for automated equilibration detection in molecular simulations

    PubMed Central

    Chodera, John D.

    2016-01-01

    Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest, in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure, and demonstrate its utility on typical molecular simulation data. PMID:26771390

  9. Exact Rayleigh scattering calculations for use with the Nimbus-7 Coastal Zone Color Scanner.

    PubMed

    Gordon, H R; Brown, J W; Evans, R H

    1988-03-01

    For improved analysis of Coastal Zone Color Scanner (CZCS) imagery, the radiance reflected from a plane-parallel atmosphere and flat sea surface in the absence of aerosols (Rayleigh radiance) has been computed with an exact multiple scattering code, i.e., including polarization. The results indicate that the single scattering approximation normally used to compute this radiance can cause errors of up to 5% for small and moderate solar zenith angles. At large solar zenith angles, such as encountered in the analysis of high-latitude imagery, the errors can become much larger, e.g., >10% in the blue band. The single scattering error also varies along individual scan lines. Comparison with multiple scattering computations using scalar transfer theory, i.e., ignoring polarization, shows that scalar theory can yield errors of approximately the same magnitude as single scattering when compared with exact computations at small to moderate values of the solar zenith angle. The exact computations can be easily incorporated into CZCS processing algorithms, and, for application to future instruments with higher radiometric sensitivity, a scheme is developed with which the effect of variations in the surface pressure could be easily and accurately included in the exact computation of the Rayleigh radiance. Direct application of these computations to CZCS imagery indicates that accurate atmospheric corrections can be made with solar zenith angles at least as large as 65 degrees and probably up to at least 70 degrees with a more sensitive instrument. This suggests that the new Rayleigh radiance algorithm should produce more consistent pigment retrievals, particularly at high latitudes.
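
    As background for the comparison the record draws, the sketch below evaluates the textbook single-scattering approximation for the Rayleigh path reflectance over a black, flat surface, ignoring polarization and surface reflection; this is the kind of approximation the exact multiple-scattering results show to be in error by 5-10% or more. The optical thickness and viewing geometry are arbitrary illustrative values, and sign conventions for the scattering angle vary across texts.

```python
import math

def rayleigh_phase(cos_scatter):
    """Rayleigh phase function P(Theta) = (3/4) * (1 + cos^2 Theta),
    normalized so its average over all directions is 1."""
    return 0.75 * (1.0 + cos_scatter ** 2)

def single_scatter_reflectance(tau_r, sun_zenith_deg, view_zenith_deg,
                               rel_azimuth_deg):
    """Single-scattering Rayleigh path reflectance (no polarization,
    no surface term): tau_r * P(Theta) / (4 * cos(sun) * cos(view))."""
    ts = math.radians(sun_zenith_deg)
    tv = math.radians(view_zenith_deg)
    dphi = math.radians(rel_azimuth_deg)
    # Cosine of the scattering angle for the upward-scattered path.
    cos_scatter = (-math.cos(ts) * math.cos(tv)
                   + math.sin(ts) * math.sin(tv) * math.cos(dphi))
    return (tau_r * rayleigh_phase(cos_scatter)
            / (4.0 * math.cos(ts) * math.cos(tv)))

rho = single_scatter_reflectance(0.1, sun_zenith_deg=30.0,
                                 view_zenith_deg=30.0, rel_azimuth_deg=90.0)
```

    The 1/cos(sun) factor makes clear why the approximation degrades at large solar zenith angles, the regime where the record reports the largest errors.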

  10. An efficient, large-scale, non-lattice-detection algorithm for exhaustive structural auditing of biomedical ontologies.

    PubMed

    Zhang, Guo-Qiang; Xing, Guangming; Cui, Licong

    2018-04-01

    One of the basic challenges in developing structural methods for systematic auditing of the quality of biomedical ontologies is the computational cost usually involved in exhaustive sub-graph analysis. We introduce ANT-LCA, a new algorithm for computing all non-trivial lowest common ancestors (LCAs) of each pair of concepts in the hierarchical order induced by an ontology. The computation of LCAs is a fundamental step in the non-lattice approach to ontology quality assurance. Distinct from existing approaches, ANT-LCA only computes LCAs for non-trivial pairs, those having at least one common ancestor. To skip all trivial pairs that may be of no practical interest, ANT-LCA employs a simple but innovative algorithmic strategy combining topological order and dynamic programming to keep track of non-trivial pairs. We provide correctness proofs and demonstrate a substantial reduction in computational time for the two largest biomedical ontologies: SNOMED CT and the Gene Ontology (GO). ANT-LCA achieved an average computation time of 30 and 3 seconds per version for SNOMED CT and GO, respectively, about 2 orders of magnitude faster than the best known approaches. Our algorithm overcomes a fundamental computational barrier in sub-graph-based structural analysis of large ontological systems. It enables the implementation of a new breed of structural auditing methods that not only identify potential problematic areas but also automatically suggest changes to fix the issues. Such structural auditing methods can lead to more effective tools supporting ontology quality assurance work. Copyright © 2018 Elsevier Inc. All rights reserved.
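
    The core idea, restricting LCA computation to non-trivial pairs and flagging pairs with more than one LCA as non-lattice fragments, can be sketched with a small brute-force example. ANT-LCA itself uses a topological-order dynamic program and scales to SNOMED CT; the ancestor-set approach below is only meant to make the definitions concrete.

```python
from itertools import combinations

def ancestors(dag, node, memo):
    """All ancestors of `node`, itself included, in a DAG given as a
    child -> list-of-parents mapping."""
    if node not in memo:
        result = {node}
        for parent in dag.get(node, []):
            result |= ancestors(dag, parent, memo)
        memo[node] = result
    return memo[node]

def nontrivial_lcas(dag):
    """For each pair of concepts sharing at least one common ancestor,
    return its lowest common ancestors: common ancestors with no other
    common ancestor strictly below them. A pair with two or more LCAs
    marks a non-lattice fragment."""
    nodes = set(dag) | {p for parents in dag.values() for p in parents}
    memo = {}
    anc = {n: ancestors(dag, n, memo) for n in nodes}
    result = {}
    for u, v in combinations(sorted(nodes), 2):
        common = anc[u] & anc[v]
        if not common:
            continue  # trivial pair: skipped, as in ANT-LCA
        result[(u, v)] = {c for c in common
                          if not any(c != d and c in anc[d] for d in common)}
    return result

# "x" and "y" share two incomparable LCAs, a classic non-lattice fragment.
dag = {"a": ["root"], "b": ["root"], "x": ["a", "b"], "y": ["a", "b"]}
lcas = nontrivial_lcas(dag)
```

    This brute force is quadratic in the number of concepts with set intersections on top, which is exactly the cost barrier the record's topological-order dynamic programming is designed to overcome.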

  11. Comparison of bias analysis strategies applied to a large data set.

    PubMed

    Lash, Timothy L; Abrams, Barbara; Bodnar, Lisa M

    2014-07-01

    Epidemiologic data sets continue to grow larger. Probabilistic-bias analyses, which simulate hundreds of thousands of replications of the original data set, may challenge desktop computational resources. We implemented a probabilistic-bias analysis to evaluate the direction, magnitude, and uncertainty of the bias arising from misclassification of prepregnancy body mass index when studying its association with early preterm birth in a cohort of 773,625 singleton births. We compared 3 bias analysis strategies: (1) using the full cohort, (2) using a case-cohort design, and (3) weighting records by their frequency in the full cohort. Underweight and overweight mothers were more likely to deliver early preterm. A validation substudy demonstrated misclassification of prepregnancy body mass index derived from birth certificates. Probabilistic-bias analyses suggested that the association between underweight and early preterm birth was overestimated by the conventional approach, whereas the associations between overweight categories and early preterm birth were underestimated. The 3 bias analyses yielded equivalent results and challenged our typical desktop computing environment. Analyses applied to the full cohort, case cohort, and weighted full cohort required 7.75 days and 4 terabytes, 15.8 hours and 287 gigabytes, and 8.5 hours and 202 gigabytes, respectively. Large epidemiologic data sets often include variables that are imperfectly measured, often because data were collected for other purposes. Probabilistic-bias analysis allows quantification of errors but may be difficult in a desktop computing environment. Solutions that allow these analyses in this environment can be achieved without new hardware and within reasonable computational time frames.
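
    A minimal version of the misclassification bias analysis described here can be sketched as follows. The 2x2 table layout, the uniform priors on sensitivity and specificity, and the counts are all illustrative stand-ins, not values or methods from the study.

```python
import random

def bias_corrected_rr(a, b, c, d, sens, spec):
    """Correct nondifferential exposure misclassification in a 2x2 table
    [[a, b], [c, d]] = [[exposed cases, unexposed cases],
                        [exposed noncases, unexposed noncases]]
    and return the corrected risk ratio."""
    def true_exposed(observed, total):
        t = (observed - total * (1.0 - spec)) / (sens - (1.0 - spec))
        return min(max(t, 1e-9), total - 1e-9)  # clamp to a valid count

    a1 = true_exposed(a, a + b)
    c1 = true_exposed(c, c + d)
    b1, d1 = (a + b) - a1, (c + d) - c1
    return (a1 / (a1 + c1)) / (b1 / (b1 + d1))

def probabilistic_bias_analysis(a, b, c, d, n_sims=10000, seed=1):
    """Draw sensitivity and specificity from (illustrative) uniform
    priors, recompute the corrected risk ratio each time, and return
    the median of the simulated distribution."""
    rng = random.Random(seed)
    sims = sorted(
        bias_corrected_rr(a, b, c, d,
                          rng.uniform(0.85, 0.99), rng.uniform(0.90, 0.99))
        for _ in range(n_sims))
    return sims[len(sims) // 2]
```

    With perfect classification (sens = spec = 1) the correction reduces to the crude risk ratio, a convenient sanity check. Scaling this loop to hundreds of thousands of replications over 773,625 records is what strained the desktop environment described in the record.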

  12. Spatial computation of intratumoral T cells correlates with survival of patients with pancreatic cancer.

    PubMed

    Carstens, Julienne L; Correa de Sampaio, Pedro; Yang, Dalu; Barua, Souptik; Wang, Huamin; Rao, Arvind; Allison, James P; LeBleu, Valerie S; Kalluri, Raghu

    2017-04-27

    The exact nature and dynamics of pancreatic ductal adenocarcinoma (PDAC) immune composition remains largely unknown. Desmoplasia is suggested to polarize PDAC immunity. Therefore, a comprehensive evaluation of the composition and distribution of desmoplastic elements and T-cell infiltration is necessary to delineate their roles. Here we develop a novel computational imaging technology for the simultaneous evaluation of eight distinct markers, allowing for spatial analysis of distinct populations within the same section. We report a heterogeneous population of infiltrating T lymphocytes. Spatial distribution of cytotoxic T cells in proximity to cancer cells correlates with increased overall patient survival. Collagen-I and αSMA+ fibroblasts do not correlate with paucity in T-cell accumulation, suggesting that PDAC desmoplasia may not be a simple physical barrier. Further exploration of this technology may improve our understanding of how specific stromal composition could impact T-cell activity, with potential impact on the optimization of immune-modulatory therapies.

  13. The direction of cloud computing for Malaysian education sector in 21st century

    NASA Astrophysics Data System (ADS)

    Jaafar, Jazurainifariza; Rahman, M. Nordin A.; Kadir, M. Fadzil A.; Shamsudin, Syadiah Nor; Saany, Syarilla Iryani A.

    2017-08-01

    In the 21st century, technology has transformed the learning environment, making education systems more effective and systematic. Education institutions today face many challenges in keeping the teaching and learning process running smoothly and manageably. Challenges in current education management include a lack of integrated systems, high maintenance costs, difficult configuration and deployment, and complex storage provision. Digital learning is an instructional practice that uses technology to make the learning experience more effective and the education process more systematic and attractive. Digital learning can be considered one of the prominent applications implemented in a cloud computing environment. Cloud computing is a model of networked resource provision that offers on-demand services, letting users access applications from any location at any time. It also promises to minimize maintenance costs and provides flexible data storage capacity. The aim of this article is to review the definition and types of cloud computing for improving digital learning management as required by 21st century education. The analysis of the digital learning context focuses on primary schools in Malaysia. Types of cloud applications and services in the education sector are also discussed. Finally, a gap analysis and directions for cloud computing in the education sector to meet 21st century challenges are suggested.

  14. Concentrating on beauty: sexual selection and sociospatial memory.

    PubMed

    Becker, D Vaughn; Kenrick, Douglas T; Guerin, Stephen; Maner, Jon K

    2005-12-01

    In three experiments, location memory for faces was examined using a computer version of the matching game Concentration. Findings suggested that physical attractiveness led to more efficient matching for female faces but not for male faces. Study 3 revealed this interaction despite allowing participants to initially see, attend to, and match the attractive male faces in the first few turns. Analysis of matching errors suggested that, compared to other targets, attractive women were less confusable with one another. Results are discussed in terms of the different functions that attractiveness serves for men and women.

  15. Encoding of Natural Sounds at Multiple Spectral and Temporal Resolutions in the Human Auditory Cortex

    PubMed Central

    Santoro, Roberta; Moerel, Michelle; De Martino, Federico; Goebel, Rainer; Ugurbil, Kamil; Yacoub, Essa; Formisano, Elia

    2014-01-01

    Functional neuroimaging research provides detailed observations of the response patterns that natural sounds (e.g. human voices and speech, animal cries, environmental sounds) evoke in the human brain. The computational and representational mechanisms underlying these observations, however, remain largely unknown. Here we combine high spatial resolution (3 and 7 Tesla) functional magnetic resonance imaging (fMRI) with computational modeling to reveal how natural sounds are represented in the human brain. We compare competing models of sound representations and select the model that most accurately predicts fMRI response patterns to natural sounds. Our results show that the cortical encoding of natural sounds entails the formation of multiple representations of sound spectrograms with different degrees of spectral and temporal resolution. The cortex derives these multi-resolution representations through frequency-specific neural processing channels and through the combined analysis of the spectral and temporal modulations in the spectrogram. Furthermore, our findings suggest that a spectral-temporal resolution trade-off may govern the modulation tuning of neuronal populations throughout the auditory cortex. Specifically, our fMRI results suggest that neuronal populations in posterior/dorsal auditory regions preferably encode coarse spectral information with high temporal precision. Vice-versa, neuronal populations in anterior/ventral auditory regions preferably encode fine-grained spectral information with low temporal precision. We propose that such a multi-resolution analysis may be crucially relevant for flexible and behaviorally-relevant sound processing and may constitute one of the computational underpinnings of functional specialization in auditory cortex. PMID:24391486

  16. Space Debris Surfaces (Computer Code): Probability of No Penetration Versus Impact Velocity and Obliquity

    NASA Technical Reports Server (NTRS)

    Elfer, N.; Meibaum, R.; Olsen, G.

    1995-01-01

    A unique collection of computer codes, Space Debris Surfaces (SD_SURF), has been developed to assist in the design and analysis of space debris protection systems. SD_SURF calculates and summarizes a vehicle's vulnerability to space debris as a function of impact velocity and obliquity. An SD_SURF analysis will show which velocities and obliquities are most likely to cause a penetration. This determination can help the analyst select a shield design that is best suited to the predominant penetration mechanism. The analysis also suggests the most suitable parameters for development or verification testing. The SD_SURF programs offer the option of either FORTRAN programs or Microsoft-EXCEL spreadsheets and macros. The FORTRAN programs work with BUMPERII. The EXCEL spreadsheets and macros can be used independently or with selected output from the SD_SURF FORTRAN programs. Examples will be presented of the interaction between space vehicle geometry, the space debris environment, and the penetration and critical damage ballistic limit surfaces of the shield under consideration.
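
    The probability-of-no-penetration bookkeeping that such tools tabulate over velocity/obliquity bins can be sketched under a standard Poisson impact assumption (a generic illustration, not BUMPER's or SD_SURF's actual formulation; the bin values are made up):

```python
import math

def pnp_from_bins(expected_penetrations):
    """Probability of no penetration when penetrating impacts in each
    velocity/obliquity bin are independent Poisson events: PNP = exp(-sum(lambda))."""
    return math.exp(-sum(expected_penetrations))

# Hypothetical expected penetrations per (velocity, obliquity) bin
bins = [0.001, 0.004, 0.002]
total_pnp = pnp_from_bins(bins)
```

    Summing expected penetrations bin by bin is what makes it possible to report which velocities and obliquities dominate the overall risk.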

  17. Image analysis of pubic bone for age estimation in a computed tomography sample.

    PubMed

    López-Alcaraz, Manuel; González, Pedro Manuel Garamendi; Aguilera, Inmaculada Alemán; López, Miguel Botella

    2015-03-01

    Radiology has demonstrated great utility for age estimation, but most studies are based on metrical and morphological methods for building an identification profile. A simple image analysis-based method is presented, aimed at correlating bony tissue ultrastructure with several variables obtained from the grey-level histogram (GLH) of computed tomography (CT) sagittal sections of the pubic symphysis surface and the pubic body, and relating them to age. The CT sample consisted of 169 hospital Digital Imaging and Communications in Medicine (DICOM) archives of known sex and age. The calculated multiple regression models showed a maximum R² of 0.533 for females and 0.726 for males, with high intra- and inter-observer agreement. The suggested method is considered useful not only for constructing an identification profile during virtopsy, but also for further studies aiming to quantify tissue ultrastructure characteristics without methods more complex or expensive than image analysis.
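
    Grey-level-histogram variables of the kind such studies regress against age can be sketched as follows (an illustrative sketch only; the specific variables and regression models of this paper are not reproduced, and the function name is hypothetical):

```python
import numpy as np

def glh_features(img, levels=256):
    """Summary variables from the grey-level histogram (GLH) of an image slice."""
    hist, _ = np.histogram(img.ravel(), bins=levels, range=(0, levels))
    p = hist / hist.sum()                       # normalized histogram
    g = np.arange(levels)
    mean = float((g * p).sum())
    std = float(np.sqrt((((g - mean) ** 2) * p).sum()))
    nz = p[p > 0]
    entropy = float(-(nz * np.log2(nz)).sum())  # Shannon entropy in bits
    return {"mean": mean, "std": std, "entropy": entropy}
```

    Variables like these, computed per CT section, would then enter a multiple regression with age as the response.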

  18. A computational study on outliers in world music.

    PubMed

    Panteli, Maria; Benetos, Emmanouil; Dixon, Simon

    2017-01-01

    The comparative analysis of world music cultures has been the focus of several ethnomusicological studies in the last century. With the advances of Music Information Retrieval and the increased accessibility of sound archives, large-scale analysis of world music with computational tools is today feasible. We investigate music similarity in a corpus of 8200 recordings of folk and traditional music from 137 countries around the world. In particular, we aim to identify music recordings that are most distinct compared to the rest of our corpus. We refer to these recordings as 'outliers'. We use signal processing tools to extract music information from audio recordings, data mining to quantify similarity and detect outliers, and spatial statistics to account for geographical correlation. Our findings suggest that Botswana is the country with the most distinct recordings in the corpus and China is the country with the most distinct recordings when considering spatial correlation. Our analysis includes a comparison of musical attributes and styles that contribute to the 'uniqueness' of the music of each country.
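
    The outlier-ranking step can be sketched with a Mahalanobis-distance criterion (an illustrative stand-in: the paper's actual audio features and outlier method are richer, and the spatial-correlation adjustment is not handled here):

```python
import numpy as np

def rank_outliers(X):
    """Order recordings (rows of feature matrix X) from most to least distinct,
    by squared Mahalanobis distance from the corpus mean."""
    mu = X.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
    diffs = X - mu
    d = np.einsum('ij,jk,ik->i', diffs, inv_cov, diffs)
    return np.argsort(d)[::-1]

# Tiny toy corpus: five similar 2-D feature vectors plus one distinct one
X = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.], [0.5, 0.5], [8., -7.]])
ranking = rank_outliers(X)
```

    The most distinct recording appears first in the ranking; a country-level score would then aggregate distances over that country's recordings.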

  19. A computational study on outliers in world music

    PubMed Central

    Panteli, Maria; Benetos, Emmanouil; Dixon, Simon

    2017-01-01

    The comparative analysis of world music cultures has been the focus of several ethnomusicological studies in the last century. With the advances of Music Information Retrieval and the increased accessibility of sound archives, large-scale analysis of world music with computational tools is today feasible. We investigate music similarity in a corpus of 8200 recordings of folk and traditional music from 137 countries around the world. In particular, we aim to identify music recordings that are most distinct compared to the rest of our corpus. We refer to these recordings as ‘outliers’. We use signal processing tools to extract music information from audio recordings, data mining to quantify similarity and detect outliers, and spatial statistics to account for geographical correlation. Our findings suggest that Botswana is the country with the most distinct recordings in the corpus and China is the country with the most distinct recordings when considering spatial correlation. Our analysis includes a comparison of musical attributes and styles that contribute to the ‘uniqueness’ of the music of each country. PMID:29253027

  20. Sensitivity analysis of a pulse nutrient addition technique for estimating nutrient uptake in large streams

    Treesearch

    Laurence Lin; J.R. Webster

    2012-01-01

    The constant nutrient addition technique has been used extensively to measure nutrient uptake in streams. However, this technique is impractical for large streams, and the pulse nutrient addition (PNA) has been suggested as an alternative. We developed a computer model to simulate Monod kinetics nutrient uptake in large rivers and used this model to evaluate the...
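
    The Monod-kinetics uptake at the heart of such a simulation can be sketched as follows (a minimal illustration with made-up parameter values, not the authors' model, which also has to handle advection and dispersion in a large river):

```python
def monod_uptake(c, u_max, k_s):
    """Areal uptake rate as a saturating (Monod) function of concentration c."""
    return u_max * c / (k_s + c)

def simulate_pulse(c0, u_max, k_s, depth, dt, steps):
    """Euler integration of nutrient concentration in a water parcel as a pulse
    travels downstream and benthic uptake removes nutrient from the column."""
    series = [c0]
    c = c0
    for _ in range(steps):
        c = max(c - dt * monod_uptake(c, u_max, k_s) / depth, 0.0)
        series.append(c)
    return series

# Hypothetical values: c0 in mg/L, u_max in mg/m^2/min, k_s in mg/L, depth in m
trace = simulate_pulse(c0=0.5, u_max=0.02, k_s=0.1, depth=1.0, dt=1.0, steps=500)
```

    Fitting u_max and k_s to the observed decline of a pulse as it passes a downstream station is the basic idea behind estimating uptake from a PNA experiment.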

  1. An iterative transformation procedure for numerical solution of flutter and similar characteristics-value problems

    NASA Technical Reports Server (NTRS)

    Gossard, Myron L

    1952-01-01

    An iterative transformation procedure suggested by H. Wielandt for numerical solution of flutter and similar characteristic-value problems is presented. Application of this procedure to ordinary natural-vibration problems and to flutter problems is shown by numerical examples. Comparisons of computed results with experimental values and with results obtained by other methods of analysis are made.

  2. Environmental Studies: Mathematical, Computational and Statistical Analyses

    DTIC Science & Technology

    1993-03-03

    mathematical analysis addresses the seasonally and longitudinally averaged circulation which is under the influence of a steady forcing located asymmetrically...employed, as has been suggested for some situations. A general discussion of how interfacial phenomena influence both the original contamination process...describing the large-scale advective and dispersive behaviour of contaminants transported by groundwater and the uncertainty associated with field-scale

  3. The Five Families of Cognitive Learning: A Context in Which To Conduct Cognitive Demands Analyses of Innovative Technologies.

    ERIC Educational Resources Information Center

    Klein, Davina C. D.; O'Neil, Harold F., Jr.; Dennis, Robert A.; Baker, Eva L.

    A cognitive demands analysis of a learning technology, a term that includes the hardware and the computer software products that form learning environments, attempts to describe the types of cognitive learning expected of the individual by the technology. This paper explores the context of cognitive learning, suggesting five families of cognitive…

  4. Supercomputing with toys: harnessing the power of NVIDIA 8800GTX and playstation 3 for bioinformatics problem.

    PubMed

    Wilson, Justin; Dai, Manhong; Jakupovic, Elvis; Watson, Stanley; Meng, Fan

    2007-01-01

    Modern video cards and game consoles typically have much better performance to price ratios than that of general purpose CPUs. The parallel processing capabilities of game hardware are well-suited for high throughput biomedical data analysis. Our initial results suggest that game hardware is a cost-effective platform for some computationally demanding bioinformatics problems.

  5. MSSA de-noising of horizon time structure to improve the curvature attribute analysis

    NASA Astrophysics Data System (ADS)

    Tiwari, R. K.; Rekapalli, R.; Vedanti, N.

    2017-12-01

    Although seismic attributes are useful for identifying sub-surface structural features such as faults, fractures, lineaments and sharp stratigraphy, the various kinds of noise introduced by unknown physical sources during data acquisition and processing create acute problems in the physical interpretation of complex crustal structures. We therefore study the effect of noise on curvature attribute analysis of seismic time structure data, and propose a Multichannel Singular Spectrum Analysis (MSSA) de-noising algorithm as a pre-filtering scheme to reduce it. To demonstrate the procedure, we first compute the most positive and most negative curvatures on a synthetic time structure with surface features resembling anticlines, synclines and faults, and then add a known percentage of noise. The curvatures estimated from the noisy data deviate considerably from those of the pure synthetic data, indicating a strong impact of noise on the curvature estimates. We then filter the noisy time structure with both a 2D median filter and MSSA and recompute the curvatures. The comparison shows that curvatures estimated from MSSA de-noised data match the curvatures of the pure synthetic data well. Finally, we present a real-data example from the Utsira Top (UT) horizon of the Southern Viking Graben, Norway, to identify time-lapse changes in the UT horizon after CO2 injection. We applied the MSSA de-noising algorithm to the UT horizon time structure and amplitude data before and after CO2 injection. Our analyses suggest modest but clearly visible structural changes in the UT horizon after CO2 injection at a few locations, which appear to be associated with locations of change in seismic amplitudes. Thus, the results from both the synthetic and the real field data suggest that the MSSA-based de-noising algorithm is robust for filtering horizon time structures, enabling accurate curvature attribute analysis and better interpretation of structural changes in geological features. Key words: curvature attributes, MSSA, seismic horizon, 2D median filter, Utsira horizon.

  6. Mathematical Model for Dengue Epidemics with Differential Susceptibility and Asymptomatic Patients Using Computer Algebra

    NASA Astrophysics Data System (ADS)

    Saldarriaga Vargas, Clarita

    When a disease affects large populations in regions with significant social, economic and cultural diversity, the biological parameters that govern the dispersion of the disease are affected by the selection of different individuals. The variety and magnitude of the communities at risk of contracting dengue around the world therefore suggest defining differentiated populations, each with its own contribution to the dispersion analysis. In this paper those conditions were taken into account in the analysis of several epidemiologic models. Initially, a stability analysis was performed for a SEIR mathematical model of dengue without differential susceptibility. Both the disease-free and the endemic equilibrium states were found in terms of the basic reproduction number and are stated in Theorem (3.1). A DSEIR model was then solved in which a new susceptible group was introduced, to capture the effects of important biological parameters of non-homogeneous populations on the spreading analysis; the results are compiled in Theorem (3.2). Finally, Theorems (3.3) and (3.4) summarize the basic reproduction numbers for three and for n different susceptible groups, respectively, giving an idea of how differential susceptibility affects the equilibrium states. The computations were done with an algorithmic method implemented in Maple 11, a general-purpose computer algebra system.
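
    The disease-free vs. endemic dichotomy driven by the basic reproduction number can be sketched numerically (a generic SEIR integration with made-up parameters, not the paper's symbolic Maple computation or its DSEIR extension):

```python
def seir(beta, sigma, gamma, days, dt=0.05):
    """Forward-Euler SEIR integration on normalized compartments (S+E+I+R = 1)."""
    s, e, i, r = 0.99, 0.0, 0.01, 0.0
    for _ in range(int(days / dt)):
        new_e = beta * s * i * dt        # new exposures
        new_i = sigma * e * dt           # incubation completions
        new_r = gamma * i * dt           # recoveries
        s, e, i, r = s - new_e, e + new_e - new_i, i + new_i - new_r, r + new_r
    return s, e, i, r

# R0 = beta / gamma for this SEIR parameterization
beta, sigma, gamma = 0.4, 1 / 5, 1 / 7
r0 = beta / gamma                        # 2.8 > 1: expect an epidemic
s_end, e_end, i_end, r_end = seir(beta, sigma, gamma, days=365)
```

    With R0 above 1 the susceptible fraction is depleted and a large recovered class accumulates; splitting S into differentially susceptible groups, as in the DSEIR model, changes how R0 is assembled but not this basic threshold behavior.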

  7. Space station integrated wall design and penetration damage control

    NASA Technical Reports Server (NTRS)

    Coronado, A. R.; Gibbins, M. N.; Wright, M. A.; Stern, P. H.

    1987-01-01

    The analysis code BUMPER executes a numerical solution to the problem of calculating the probability of no penetration (PNP) of a spacecraft subject to man-made orbital debris or meteoroid impact. The code was developed on a DEC VAX 11/780 computer running the Virtual Memory System (VMS) operating system, and is written in FORTRAN 77 with no VAX extensions. To help illustrate the steps involved, a single sample analysis is performed. The example used is the space station reference configuration. The finite element model (FEM) of this configuration is relatively complex but demonstrates many BUMPER features. The computer tools and guidelines are described for constructing a FEM of the space station under consideration. The methods used to analyze the sensitivity of PNP to variations in design are described. Ways are suggested for developing contour plots of the sensitivity study data. Additional BUMPER analysis examples are provided, including FEMs, command inputs, and data outputs. The mathematical theory used as the basis for the code is described, along with the data flow within the analysis.

  8. Fels-Rand: an Xlisp-Stat program for the comparative analysis of data under phylogenetic uncertainty.

    PubMed

    Blomberg, S

    2000-11-01

    Currently available programs for the comparative analysis of phylogenetic data do not perform optimally when the phylogeny is not completely specified (i.e. the phylogeny contains polytomies). Recent literature suggests that a better way to analyse the data would be to create random trees from the known phylogeny that are fully-resolved but consistent with the known tree. A computer program is presented, Fels-Rand, that performs such analyses. A randomisation procedure is used to generate trees that are fully resolved but whose structure is consistent with the original tree. Statistics are then calculated on a large number of these randomly-generated trees. Fels-Rand uses the object-oriented features of Xlisp-Stat to manipulate internal tree representations. Xlisp-Stat's dynamic graphing features are used to provide heuristic tools to aid in analysis, particularly outlier analysis. The usefulness of Xlisp-Stat as a system for phylogenetic computation is discussed. Available from the author or at http://www.uq.edu.au/~ansblomb/Fels-Rand.sit.hqx. Xlisp-Stat is available from http://stat.umn.edu/~luke/xls/xlsinfo/xlsinfo.html. s.blomberg@abdn.ac.uk
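
    The randomisation idea, generating fully resolved trees consistent with a polytomous input tree, can be sketched as follows (a language-agnostic toy using nested tuples; Fels-Rand itself is written in Xlisp-Stat and its actual procedure may differ):

```python
import random

def resolve_polytomies(tree, rng):
    """Randomly resolve multifurcations into bifurcations while keeping the
    original tree's groupings. A tree is a leaf label (str) or a tuple of subtrees."""
    if isinstance(tree, str):
        return tree
    children = [resolve_polytomies(c, rng) for c in tree]
    while len(children) > 2:
        a, b = sorted(rng.sample(range(len(children)), 2))
        pair = (children[a], children[b])
        children = [c for k, c in enumerate(children) if k not in (a, b)]
        children.append(pair)
    return tuple(children) if len(children) > 1 else children[0]

# A four-way polytomy nested inside a known bifurcation
known = (("A", "B", "C", "D"), "E")
resolved = resolve_polytomies(known, random.Random(42))
```

    Repeating this many times and computing comparative statistics on each random resolution gives the distribution of results over trees consistent with the known phylogeny.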

  9. A communications model for an ISAS to NASA span link

    NASA Technical Reports Server (NTRS)

    Green, James L.; Mcguire, Robert E.; Lopez-Swafford, Brian

    1987-01-01

    The authors propose that an initial computer-to-computer communication link use the public packet switched networks (PPSN) Venus-P in Japan and TELENET in the U.S. When the traffic warrants it, this link would then be upgraded to a dedicated leased line that connects directly into the Space Physics Analysis Network (SPAN). The proposed system of hardware and software will easily support migration to such a dedicated link, and therefore provides a cost-effective approach to the network problem. Once a dedicated line becomes operational, it is suggested that the public-network link continue to coexist with it, providing a backup capability.

  10. The neural circuits for arithmetic principles.

    PubMed

    Liu, Jie; Zhang, Han; Chen, Chuansheng; Chen, Hui; Cui, Jiaxin; Zhou, Xinlin

    2017-02-15

    Arithmetic principles are the regularities underlying arithmetic computation. Little is known about how the brain supports the processing of arithmetic principles. The current fMRI study examined neural activation and functional connectivity during the processing of verbalized arithmetic principles, as compared to numerical computation and general language processing. As expected, arithmetic principles elicited stronger activation in bilateral horizontal intraparietal sulcus and right supramarginal gyrus than did language processing, and stronger activation in left middle temporal lobe and left orbital part of inferior frontal gyrus than did computation. In contrast, computation elicited greater activation in bilateral horizontal intraparietal sulcus (extending to posterior superior parietal lobule) than did either arithmetic principles or language processing. Functional connectivity analysis with the psychophysiological interaction approach (PPI) showed that left temporal-parietal (MTG-HIPS) connectivity was stronger during the processing of arithmetic principle and language than during computation, whereas parietal-occipital connectivities were stronger during computation than during the processing of arithmetic principles and language. Additionally, the left fronto-parietal (orbital IFG-HIPS) connectivity was stronger during the processing of arithmetic principles than during computation. The results suggest that verbalized arithmetic principles engage a neural network that overlaps but is distinct from the networks for computation and language processing. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Uncertainty Analysis for a Jet Flap Airfoil

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Cruz, Josue

    2006-01-01

    An analysis of variance (ANOVA) study was performed to quantify the potential uncertainties of lift and pitching moment coefficient calculations from a computational fluid dynamics code, relative to an experiment, for a jet flap airfoil configuration. Uncertainties due to a number of factors, including grid density, angle of attack, and jet flap blowing coefficient, were examined. The ANOVA software produced a numerical model of the input coefficient data, as functions of the selected factors, to a user-specified order (linear, 2-factor interaction, quadratic, or cubic). Residuals between the model and actual data were also produced at each of the input conditions, and uncertainty confidence intervals (in the form of Least Significant Differences, or LSD) for experimental, computational, and combined experimental/computational data sets were computed. The LSD bars indicate the smallest resolvable differences in the functional values (lift or pitching moment coefficient) attributable solely to changes in the independent variables, given just the input data points from selected data sets. The software also provided a collection of diagnostics which evaluate the suitability of the input data set for use within the ANOVA process, and which examine the behavior of the resultant data, possibly suggesting transformations which should be applied to the data to reduce the LSD. The results illustrate some of the key features of, and results from, the uncertainty analysis studies, including the use of both numerical (continuous) and categorical (discrete) factors, the effects of the number and range of the input data points, and the effects of the number of factors considered simultaneously.

  12. Synthesizing Results From Empirical Research on Computer-Based Scaffolding in STEM Education: A Meta-Analysis.

    PubMed

    Belland, Brian R; Walker, Andrew E; Kim, Nam Ju; Lefler, Mason

    2017-04-01

    Computer-based scaffolding assists students as they generate solutions to complex problems, goals, or tasks, helping increase and integrate their higher order skills in the process. However, despite decades of research on scaffolding in STEM (science, technology, engineering, and mathematics) education, no existing comprehensive meta-analysis has synthesized the results of these studies. This review addresses that need by synthesizing the results of 144 experimental studies (333 outcomes) on the effects of computer-based scaffolding designed to assist the full range of STEM learners (primary through adult education) as they navigated ill-structured, problem-centered curricula. Results of our random effect meta-analysis (a) indicate that computer-based scaffolding showed a consistently positive (ḡ = 0.46) effect on cognitive outcomes across various contexts of use, scaffolding characteristics, and levels of assessment and (b) shed light on many scaffolding debates, including the roles of customization (i.e., fading and adding) and context-specific support. Specifically, scaffolding's influence on cognitive outcomes did not vary on the basis of context-specificity, presence or absence of scaffolding change, and logic by which scaffolding change is implemented. Scaffolding's influence was greatest when measured at the principles level and among adult learners. Still scaffolding's effect was substantial and significantly greater than zero across all age groups and assessment levels. These results suggest that scaffolding is a highly effective intervention across levels of different characteristics and can largely be designed in many different ways while still being highly effective.
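
    The random-effects pooling behind an overall effect like ḡ = 0.46 can be sketched with the standard DerSimonian-Laird estimator (an illustrative implementation with toy numbers; the authors' actual model and moderator analyses are more involved):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird tau^2 estimator."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v
    fixed = np.sum(w * y) / np.sum(w)                 # fixed-effect estimate
    q = np.sum(w * (y - fixed) ** 2)                  # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)           # between-study variance
    w_star = 1.0 / (v + tau2)                         # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2
```

    With homogeneous effects tau^2 collapses to zero and the estimator reduces to the fixed-effect pooled mean; heterogeneity widens the weights and the pooled standard error.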

  13. Induction of Social Behavior in Zebrafish: Live Versus Computer Animated Fish as Stimuli

    PubMed Central

    Qin, Meiying; Wong, Albert; Seguin, Diane

    2014-01-01

    The zebrafish offers an excellent compromise between system complexity and practical simplicity and has been suggested as a translational research tool for the analysis of human brain disorders associated with abnormalities of social behavior. Unlike laboratory rodents zebrafish are diurnal, thus visual cues may be easily utilized in the analysis of their behavior and brain function. Visual cues, including the sight of conspecifics, have been employed to induce social behavior in zebrafish. However, the method of presentation of these cues and the question of whether computer animated images versus live stimulus fish have differential effects have not been systematically analyzed. Here, we compare the effects of five stimulus presentation types: live conspecifics in the experimental tank or outside the tank, playback of video-recorded live conspecifics, computer animated images of conspecifics presented by two software applications, the previously employed General Fish Animator, and a new application Zebrafish Presenter. We report that all stimuli were equally effective and induced a robust social response (shoaling) manifesting as reduced distance between stimulus and experimental fish. We conclude that presentation of live stimulus fish, or 3D images, is not required and 2D computer animated images are sufficient to induce robust and consistent social behavioral responses in zebrafish. PMID:24575942

  14. Induction of social behavior in zebrafish: live versus computer animated fish as stimuli.

    PubMed

    Qin, Meiying; Wong, Albert; Seguin, Diane; Gerlai, Robert

    2014-06-01

    The zebrafish offers an excellent compromise between system complexity and practical simplicity and has been suggested as a translational research tool for the analysis of human brain disorders associated with abnormalities of social behavior. Unlike laboratory rodents zebrafish are diurnal, thus visual cues may be easily utilized in the analysis of their behavior and brain function. Visual cues, including the sight of conspecifics, have been employed to induce social behavior in zebrafish. However, the method of presentation of these cues and the question of whether computer animated images versus live stimulus fish have differential effects have not been systematically analyzed. Here, we compare the effects of five stimulus presentation types: live conspecifics in the experimental tank or outside the tank, playback of video-recorded live conspecifics, computer animated images of conspecifics presented by two software applications, the previously employed General Fish Animator, and a new application Zebrafish Presenter. We report that all stimuli were equally effective and induced a robust social response (shoaling) manifesting as reduced distance between stimulus and experimental fish. We conclude that presentation of live stimulus fish, or 3D images, is not required and 2D computer animated images are sufficient to induce robust and consistent social behavioral responses in zebrafish.

  15. The interactome of CCT complex - A computational analysis.

    PubMed

    Narayanan, Aswathy; Pullepu, Dileep; Kabir, M Anaul

    2016-10-01

    The eukaryotic chaperonin CCT (Chaperonin Containing TCP1, or TRiC, TCP-1 Ring Complex) has been subjected to physical and genetic analyses in S. cerevisiae, which can be extrapolated to human CCT (hCCT) owing to its structural and functional similarities with yeast CCT (yCCT). Studies on hCCT and its interactome acquire an additional dimension, as it has been implicated in several disease conditions such as neurodegeneration and cancer. We attempt to study its stress-response role in general, as reflected in aspects of human disease and yeast physiology, through computational analysis of the interactome. Towards consolidating and analysing the interactome data, we prepared and compared the unique CCT-interacting protein lists for S. cerevisiae and H. sapiens, and performed GO term classification and enrichment studies, which provide information on the diversity of the CCT interactome in terms of the protein classes in the data set. Enrichment with disease-associated proteins and pathways highlights the medical importance of CCT. Different analyses converge in suggesting the significance of WD-repeat proteins, protein kinases and cytoskeletal proteins in the interactome. The prevalence of proteasomal subunits and ribosomal proteins suggests a possible cross-talk between the protein-synthesis, folding and degradation machinery. A network of chaperones and chaperonins that function in combination can also be envisaged from a comparison of the CCT and Hsp70 interactomes. Copyright © 2016 Elsevier Ltd. All rights reserved.
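
    The enrichment computations such a study relies on reduce to a hypergeometric tail test; a minimal sketch (generic, with toy counts; the actual tools, term sets, and multiple-testing corrections used in the paper are not reproduced here):

```python
from math import comb

def enrichment_p(k, n, K, N):
    """One-sided hypergeometric p-value: probability of drawing >= k annotated
    proteins in a sample of n interactors, from a background of N proteins of
    which K carry the annotation (e.g. a GO term)."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)

# Toy example: 5 of 10 background proteins carry the term; all 5 sampled carry it
p = enrichment_p(k=5, n=5, K=5, N=10)
```

    A small p-value indicates the annotation is over-represented among the interactors relative to the background proteome.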

  16. Multifractal analysis of information processing in hippocampal neural ensembles during working memory under Δ9-tetrahydrocannabinol administration

    PubMed Central

    Fetterhoff, Dustin; Opris, Ioan; Simpson, Sean L.; Deadwyler, Sam A.; Hampson, Robert E.; Kraft, Robert A.

    2014-01-01

    Background: Multifractal analysis quantifies the time-scale-invariant properties in data by describing the structure of variability over time. By applying this analysis to hippocampal interspike interval sequences recorded during performance of a working memory task, a measure of long-range temporal correlations and multifractal dynamics can reveal single neuron correlates of information processing. New method: Wavelet leaders-based multifractal analysis (WLMA) was applied to hippocampal interspike intervals recorded during a working memory task. WLMA can be used to identify neurons likely to exhibit information processing relevant to operation of brain–computer interfaces and nonlinear neuronal models. Results: Neurons involved in memory processing (“Functional Cell Types” or FCTs) showed a greater degree of multifractal firing properties than neurons without task-relevant firing characteristics. In addition, previously unidentified FCTs were revealed because multifractal analysis suggested further functional classification. The cannabinoid-type 1 receptor partial agonist, tetrahydrocannabinol (THC), selectively reduced multifractal dynamics in FCT neurons compared to non-FCT neurons. Comparison with existing methods: WLMA is an objective tool for quantifying the memory-correlated complexity represented by FCTs that reveals additional information compared to classification of FCTs using traditional z-scores to identify neuronal correlates of behavioral events. Conclusion: z-Score-based FCT classification provides limited information about the dynamical range of neuronal activity characterized by WLMA. Increased complexity, as measured with multifractal analysis, may be a marker of functional involvement in memory processing. The level of multifractal attributes can be used to differentially emphasize neural signals to improve computational models and algorithms underlying brain–computer interfaces. PMID:25086297

  17. Evaluating Computer Screen Time and Its Possible Link to Psychopathology in the Context of Age: A Cross-Sectional Study of Parents and Children

    PubMed Central

    Ross, Sharon; Silman, Zmira; Maoz, Hagai; Bloch, Yuval

    2015-01-01

    Background Several studies have suggested that high levels of computer use are linked to psychopathology. However, there is ambiguity about what should be considered normal or over-use of computers. Furthermore, the nature of the link between computer usage and psychopathology is controversial. The current study utilized the context of age to address these questions. Our hypothesis was that the context of age will be paramount for differentiating normal from excessive use, and that this context will allow a better understanding of the link to psychopathology. Methods In a cross-sectional study, 185 parents and children aged 3–18 years were recruited in clinical and community settings. They were asked to fill out questionnaires regarding demographics, functional and academic variables, computer use as well as psychiatric screening questionnaires. Using a regression model, we identified 3 groups of normal-use, over-use and under-use and examined known factors as putative differentiators between the over-users and the other groups. Results After modeling computer screen time according to age, factors linked to over-use were: decreased socialization (OR 3.24, Confidence interval [CI] 1.23–8.55, p = 0.018), difficulty to disengage from the computer (OR 1.56, CI 1.07–2.28, p = 0.022) and age, though borderline-significant (OR 1.1 each year, CI 0.99–1.22, p = 0.058). While psychopathology was not linked to over-use, post-hoc analysis revealed that the link between increased computer screen time and psychopathology was age-dependent and solidified as age progressed (p = 0.007). Unlike computer usage, the use of small-screens and smartphones was not associated with psychopathology. Conclusions The results suggest that computer screen time follows an age-based course. We conclude that differentiating normal from over-use as well as defining over-use as a possible marker for psychiatric difficulties must be performed within the context of age. If verified by additional studies, future research should integrate those views in order to better understand the intricacies of computer over-use. PMID:26536037

  18. Evaluating Computer Screen Time and Its Possible Link to Psychopathology in the Context of Age: A Cross-Sectional Study of Parents and Children.

    PubMed

    Segev, Aviv; Mimouni-Bloch, Aviva; Ross, Sharon; Silman, Zmira; Maoz, Hagai; Bloch, Yuval

    2015-01-01

    Several studies have suggested that high levels of computer use are linked to psychopathology. However, there is ambiguity about what should be considered normal or over-use of computers. Furthermore, the nature of the link between computer usage and psychopathology is controversial. The current study utilized the context of age to address these questions. Our hypothesis was that the context of age will be paramount for differentiating normal from excessive use, and that this context will allow a better understanding of the link to psychopathology. In a cross-sectional study, 185 parents and children aged 3-18 years were recruited in clinical and community settings. They were asked to fill out questionnaires regarding demographics, functional and academic variables, computer use as well as psychiatric screening questionnaires. Using a regression model, we identified 3 groups of normal-use, over-use and under-use and examined known factors as putative differentiators between the over-users and the other groups. After modeling computer screen time according to age, factors linked to over-use were: decreased socialization (OR 3.24, Confidence interval [CI] 1.23-8.55, p = 0.018), difficulty to disengage from the computer (OR 1.56, CI 1.07-2.28, p = 0.022) and age, though borderline-significant (OR 1.1 each year, CI 0.99-1.22, p = 0.058). While psychopathology was not linked to over-use, post-hoc analysis revealed that the link between increased computer screen time and psychopathology was age-dependent and solidified as age progressed (p = 0.007). Unlike computer usage, the use of small-screens and smartphones was not associated with psychopathology. The results suggest that computer screen time follows an age-based course. We conclude that differentiating normal from over-use as well as defining over-use as a possible marker for psychiatric difficulties must be performed within the context of age. If verified by additional studies, future research should integrate those views in order to better understand the intricacies of computer over-use.

  19. Development and verification of a cementless novel tapered wedge stem for total hip arthroplasty.

    PubMed

    Faizan, Ahmad; Wuestemann, Thies; Nevelos, Jim; Bastian, Adam C; Collopy, Dermot

    2015-02-01

    Most current tapered wedge hip stems were designed based upon the original Mueller straight stem design introduced in 1977. These stems were designed to have a single medial curvature and to grow laterally to accommodate different sizes. In this preclinical study, the design and verification of a tapered wedge stem using computed tomography scans of 556 patients are presented. The computer simulation demonstrated that the novel stem, designed for proximal engagement, allowed for reduced distal fixation, particularly in the 40-60 year male population. Moreover, the physical micromotion testing and finite element analysis demonstrated that the novel stem allowed for reduced micromotion. In summary, preclinical data suggest that the computed tomography based stem design described here may offer enhanced implant fit and reduced micromotion. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Proactive health computing.

    PubMed

    Timpka, T

    2001-08-01

    In an analysis departing from the global health situation, the foundation for a change of paradigm in health informatics based on socially embedded information infrastructures and technologies is identified and discussed. It is shown how an increasing computing and data transmitting capacity can be employed for proactive health computing. As a foundation for ubiquitous health promotion and prevention of disease and injury, proactive health systems use data from multiple sources to supply individuals and communities evidence-based information on means to improve their state of health and avoid health risks. The systems are characterised by: (1) being profusely connected to the world around them, using perceptual interfaces, sensors and actuators; (2) responding to external stimuli at faster than human speeds; (3) networked feedback loops; and (4) humans remaining in control, while being left outside the primary computing loop. The extended scientific mission of this new partnership between computer science, electrical engineering and social medicine is suggested to be the investigation of how the dissemination of information and communication technology on democratic grounds can be made even more important for global health than sanitation and urban planning became a century ago.

  1. A psychosocial comparison of computer-mediated and face-to-face language use among severely disturbed adolescents.

    PubMed

    Zimmerman, D P

    1987-01-01

    This study analyzes the content of communications among 18 severely disturbed adolescents. Interactions were recorded from two sources: computer-based "conferences" for the group, and small group face-to-face sessions which addressed similar topics. The purpose was to determine whether there are important differences in indications of psychological state, interpersonal interest, and expressive style. The research was significant, given the strong attraction of many adolescents to computers and the paucity of research on the social-psychological effects of this technology. A content analysis based on a total sample of 10,224 words was performed using the Harvard IV Psychosociological Dictionary. Results indicated that computer-mediated communication was more expressive of feelings and made more frequent mention of interpersonal issues. Further, it displayed a more positive object-relations stance, was less negative in expressive style, and appeared to diminish certain traditional gender differences in group communication. These findings suggest that the computer may have an interesting adjunct role to play in reducing communication deficits commonly observed in severely disturbed adolescent clinical populations.

  2. Toward structure prediction of cyclic peptides.

    PubMed

    Yu, Hongtao; Lin, Yu-Shan

    2015-02-14

    Cyclic peptides are a promising class of molecules that can be used to target specific protein-protein interactions. A computational method to accurately predict their structures would substantially advance the development of cyclic peptides as modulators of protein-protein interactions. Here, we develop a computational method that integrates bias-exchange metadynamics simulations, a Boltzmann reweighting scheme, dihedral principal component analysis and a modified density peak-based cluster analysis to provide a converged structural description for cyclic peptides. Using this method, we evaluate the performance of a number of popular protein force fields on a model cyclic peptide. All the tested force fields seem to over-stabilize the α-helix and PPII/β regions in the Ramachandran plot, commonly populated by linear peptides and proteins. Our findings suggest that re-parameterization of a force field that well describes the full Ramachandran plot is necessary to accurately model cyclic peptides.
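
    The Boltzmann reweighting step named in this abstract can be illustrated in isolation with a minimal sketch (not the authors' code): samples collected under a bias potential V_b are weighted by exp(+V_b/kT) so that unbiased ensemble averages are recovered. The function name and the kT value below are illustrative assumptions.

```python
import math

KT = 2.494  # k_B * T in kJ/mol at ~300 K, illustrative value


def reweighted_average(values, bias_energies, kT=KT):
    """Boltzmann reweighting: a sample collected under bias energy V_b
    receives weight exp(+V_b / kT); the weighted mean of an observable
    then estimates its unbiased ensemble average."""
    weights = [math.exp(b / kT) for b in bias_energies]
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, values)) / total
```

    With zero bias on every sample this reduces to the plain mean, which is a useful sanity check before applying it to real metadynamics output.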

  3. Multiscale Modeling of Carbon/Phenolic Composite Thermal Protection Materials: Atomistic to Effective Properties

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Murthy, Pappu L.; Bednarcyk, Brett A.; Lawson, John W.; Monk, Joshua D.; Bauschlicher, Charles W., Jr.

    2016-01-01

    Next generation ablative thermal protection systems are expected to consist of 3D woven composite architectures. It is well known that composites can be tailored to achieve desired mechanical and thermal properties in various directions and thus can be made fit-for-purpose if the proper combination of constituent materials and microstructures can be realized. In the present work, the first, multiscale, atomistically-informed, computational analysis of mechanical and thermal properties of a present day - Carbon/Phenolic composite Thermal Protection System (TPS) material is conducted. Model results are compared to measured in-plane and out-of-plane mechanical and thermal properties to validate the computational approach. Results indicate that given sufficient microstructural fidelity, along with lowerscale, constituent properties derived from molecular dynamics simulations, accurate composite level (effective) thermo-elastic properties can be obtained. This suggests that next generation TPS properties can be accurately estimated via atomistically informed multiscale analysis.

  4. Grain Boundary Plane Orientation Fundamental Zones and Structure-Property Relationships

    PubMed Central

    Homer, Eric R.; Patala, Srikanth; Priedeman, Jonathan L.

    2015-01-01

    Grain boundary plane orientation is a profoundly important determinant of character in polycrystalline materials that is not well understood. This work demonstrates how boundary plane orientation fundamental zones, which capture the natural crystallographic symmetries of a grain boundary, can be used to establish structure-property relationships. Using the fundamental zone representation, trends in computed energy, excess volume at the grain boundary, and temperature-dependent mobility naturally emerge and show a strong dependence on the boundary plane orientation. Analysis of common misorientation axes even suggests broader trends of grain boundary energy as a function of misorientation angle and plane orientation. Due to the strong structure-property relationships that naturally emerge from this work, boundary plane fundamental zones are expected to simplify analysis of both computational and experimental data. This standardized representation has the potential to significantly accelerate research in the topologically complex and vast five-dimensional phase space of grain boundaries. PMID:26498715

  5. PC-based note taking in patient-centred diagnostic interviews: a thematic analysis of patient opinion elicited using a pilot survey instrument.

    PubMed

    Barker, Fiona; Court, Gemma

    2011-01-01

    Computers are used increasingly in patient-clinician consultations. There is the potential for PC use to have an effect on the communication process. The aim of this preliminary study was to investigate patient opinion regarding the use of PC-based note taking during diagnostic vestibular assessments. We gave a simple four-item questionnaire to 100 consecutive patients attending for vestibular assessment at a secondary referral level primary care trust audiology service. Written responses to two of the questionnaire items were subjected to an inductive thematic analysis. The questionnaire was acceptable to patients; none refused to complete it. Dominant themes identified suggest that patients do perceive consistent positive benefits from the use of PC-based note taking. This pilot study's short survey instrument is usable and may provide insights into patients' perceptions of computer use in a clinical setting.

  6. Deep Learning in Medical Image Analysis

    PubMed Central

    Shen, Dinggang; Wu, Guorong; Suk, Heung-Il

    2016-01-01

    Computer-assisted analysis for better interpreting images has been a longstanding issue in the medical imaging field. On the image-understanding front, recent advances in machine learning, especially in the way of deep learning, have made a big leap to help identify, classify, and quantify patterns in medical images. Specifically, exploiting hierarchical feature representations learned solely from data, instead of handcrafted features mostly designed based on domain-specific knowledge, lies at the core of the advances. In that way, deep learning is rapidly proving to be the state-of-the-art foundation, achieving enhanced performances in various medical applications. In this article, we introduce the fundamentals of deep learning methods; review their successes in image registration, anatomical/cell structure detection, tissue segmentation, computer-aided disease diagnosis or prognosis, and so on. We conclude by raising research issues and suggesting future directions for further improvements. PMID:28301734

  7. Monte Carlo investigation of transient acoustic fields in partially or completely bounded medium. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Thanedar, B. D.

    1972-01-01

    A simple repetitive calculation was used to investigate what happens to the field in terms of the signal paths of disturbances originating from the energy source. The computation allowed the field to be reconstructed as a function of space and time on a statistical basis. The suggested Monte Carlo method is in response to the need for a numerical method to supplement analytical methods of solution which are only valid when the boundaries have simple shapes, rather than for a medium that is bounded. For the analysis, a suitable model was created from which was developed an algorithm for the estimation of acoustic pressure variations in the region under investigation. The validity of the technique was demonstrated by analysis of simple physical models with the aid of a digital computer. The Monte Carlo method is applicable to a medium which is homogeneous and is enclosed by either rectangular or curved boundaries.

  8. Wash load and bed-material load transport in the Yellow River

    USGS Publications Warehouse

    Yang, C.T.; Simoes, F.J.M.

    2005-01-01

    It has been the conventional assumption that wash load is supply limited and is only indirectly related to the hydraulics of a river. Hydraulic engineers also assumed that bed-material load concentration is independent of wash load concentration. This paper provides a detailed analysis of the Yellow River sediment transport data to determine whether the above assumptions are true and whether wash load concentration can be computed from the original unit stream power formula and the modified unit stream power formula for sediment-laden flows. A systematic and thorough analysis of 1,160 sets of data collected from 9 gauging stations along the Middle and Lower Yellow River confirmed that the method suggested by the conjunctive use of the two formulas can be used to compute wash load, bed-material load, and total load in the Yellow River with accuracy. Journal of Hydraulic Engineering © ASCE.

  9. Grain boundary plane orientation fundamental zones and structure-property relationships

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Homer, Eric R.; Patala, Srikanth; Priedeman, Jonathan L.

    2015-10-26

    Grain boundary plane orientation is a profoundly important determinant of character in polycrystalline materials that is not well understood. This work demonstrates how boundary plane orientation fundamental zones, which capture the natural crystallographic symmetries of a grain boundary, can be used to establish structure-property relationships. Using the fundamental zone representation, trends in computed energy, excess volume at the grain boundary, and temperature-dependent mobility naturally emerge and show a strong dependence on the boundary plane orientation. Analysis of common misorientation axes even suggests broader trends of grain boundary energy as a function of misorientation angle and plane orientation. Due to the strong structure-property relationships that naturally emerge from this work, boundary plane fundamental zones are expected to simplify analysis of both computational and experimental data. This standardized representation has the potential to significantly accelerate research in the topologically complex and vast five-dimensional phase space of grain boundaries.

  10. Assessing the psychological effects of an exercise training programme for patients following myocardial infarction: a pilot study.

    PubMed

    Prosser, G; Carson, P; Gelson, A; Tucker, H; Neophytou, M; Phillips, R; Simpson, T

    1978-03-01

    During a study investigating physiological and other effects of an exercise programme for coronary patients, a questionnaire was administered. Preliminary analysis had suggested some improvement in the patients' morale, but in view of the possible relevance of a number of psychological variables it was decided to carry out further analysis on the available data. The coefficient of discrimination was computed for 32 patients. For 19 patients correlations were computed between scores on subjective fitness, symptoms, anxiety, interest in sex, if at work, age, weight, and workload achieved at a given heart rate. The questionnaire appeared to have satisfactorily high internal and external validity. Patients with a high 'morale' score tended to achieve a greater increase in workload over the course. Although cause and effect cannot be unequivocally assigned, the association is felt to be important, and research is continuing.

  11. Saccadic eye movements analysis as a measure of drug effect on central nervous system function.

    PubMed

    Tedeschi, G; Quattrone, A; Bonavita, V

    1986-04-01

    Peak velocity (PSV) and duration (SD) of horizontal saccadic eye movements are demonstrably under the control of specific brain stem structures. Experimental and clinical evidence suggest the existence of an immediate premotor system for saccade generation located in the paramedian pontine reticular formation (PPRF). Effects on saccadic eye movements have been studied in normal volunteers with barbiturates, benzodiazepines, amphetamine and ethanol. On two occasions computer analysis of PSV, SD, saccade reaction time (SRT) and saccade accuracy (SA) was carried out in comparison with more traditional methods of assessment of human psychomotor performance like choice reaction time (CRT) and critical flicker fusion threshold (CFFT). The computer system proved to be a highly sensitive and objective method for measuring drug effect on central nervous system (CNS) function. It allows almost continuous sampling of data and appears to be particularly suitable for studying rapidly changing drug effects on the CNS.

  12. A computational study of liposome logic: towards cellular computing from the bottom up

    PubMed Central

    Smaldon, James; Romero-Campero, Francisco J.; Fernández Trillo, Francisco; Gheorghe, Marian; Alexander, Cameron

    2010-01-01

    In this paper we propose a new bottom-up approach to cellular computing, in which computational chemical processes are encapsulated within liposomes. This “liposome logic” approach (also called vesicle computing) makes use of supra-molecular chemistry constructs, e.g. protocells, chells, etc. as minimal cellular platforms to which logical functionality can be added. Modeling and simulations feature prominently in “top-down” synthetic biology, particularly in the specification, design and implementation of logic circuits through bacterial genome reengineering. The second contribution in this paper is the demonstration of a novel set of tools for the specification, modelling and analysis of “bottom-up” liposome logic. In particular, simulation and modelling techniques are used to analyse some example liposome logic designs, ranging from relatively simple NOT gates and NAND gates to SR-Latches, D Flip-Flops all the way to 3 bit ripple counters. The approach we propose consists of specifying, by means of P systems, gene regulatory network-like systems operating inside proto-membranes. This P systems specification can be automatically translated and executed through a multiscaled pipeline composed of dissipative particle dynamics (DPD) simulator and Gillespie’s stochastic simulation algorithm (SSA). Finally, model selection and analysis can be performed through a model checking phase. This is the first paper we are aware of that brings to bear formal specifications, DPD, SSA and model checking to the problem of modeling target computational functionality in protocells. Potential chemical routes for the laboratory implementation of these simulations are also discussed thus for the first time suggesting a potentially realistic physiochemical implementation for membrane computing from the bottom-up. PMID:21886681
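
    The stochastic layer of the pipeline described above, Gillespie's stochastic simulation algorithm (SSA), can be sketched for the simplest possible system, an irreversible decay A → ∅. This is a generic illustration of the direct method under that single-reaction assumption, not the authors' multiscale implementation, and all names are hypothetical.

```python
import math
import random


def gillespie_decay(n0, k, t_max, seed=0):
    """Gillespie direct method for the single reaction A -> 0.
    With n molecules present the total propensity is a = k * n; the
    waiting time to the next firing is exponential with rate a, and
    each firing removes one molecule of A."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    trajectory = [(t, n)]
    while n > 0 and t < t_max:
        a = k * n
        t += -math.log(1.0 - rng.random()) / a  # exponential waiting time
        n -= 1                                  # the only channel fires
        trajectory.append((t, n))
    return trajectory
```

    With more than one reaction channel, the direct method additionally draws a second uniform variate to pick which channel fires, in proportion to its propensity.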

  13. A Semisupervised Support Vector Machines Algorithm for BCI Systems

    PubMed Central

    Qin, Jianzhao; Li, Yuanqing; Sun, Wei

    2007-01-01

    As an emerging technology, brain-computer interfaces (BCIs) bring us new communication interfaces which translate brain activities into control signals for devices like computers, robots, and so forth. In this study, we propose a semisupervised support vector machine (SVM) algorithm for brain-computer interface (BCI) systems, aiming at reducing the time-consuming training process. In this algorithm, we apply a semisupervised SVM for translating the features extracted from the electrical recordings of brain into control signals. This SVM classifier is built from a small labeled data set and a large unlabeled data set. Meanwhile, to reduce the time for training semisupervised SVM, we propose a batch-mode incremental learning method, which can also be easily applied to the online BCI systems. Additionally, it is suggested in many studies that common spatial pattern (CSP) is very effective in discriminating two different brain states. However, CSP needs a sufficient labeled data set. In order to overcome the drawback of CSP, we suggest a two-stage feature extraction method for the semisupervised learning algorithm. We apply our algorithm to two BCI experimental data sets. The offline data analysis results demonstrate the effectiveness of our algorithm. PMID:18368141
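
    The batch-mode semisupervised idea in this abstract, train on the labeled set, then repeatedly absorb the most confidently classified unlabeled examples in small batches and retrain, can be sketched with a deliberately simple nearest-centroid classifier standing in for the paper's SVM and CSP features. All names and the confidence measure (centroid-distance margin) are illustrative assumptions.

```python
from collections import defaultdict


def centroids(labeled):
    """Mean point of each class among the currently labeled examples."""
    groups = defaultdict(list)
    for point, label in labeled:
        groups[label].append(point)
    return {lab: tuple(sum(p[i] for p in pts) / len(pts)
                       for i in range(len(pts[0])))
            for lab, pts in groups.items()}


def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))


def self_train(labeled, unlabeled, batch_size=2):
    """Batch-mode self-training: predict labels for the unlabeled pool,
    move the most confident batch (largest margin between nearest and
    second-nearest class centroid) into the training set, and repeat."""
    labeled, unlabeled = list(labeled), list(unlabeled)
    while unlabeled:
        cents = centroids(labeled)
        scored = []
        for p in unlabeled:
            d = sorted((dist2(p, c), lab) for lab, c in cents.items())
            margin = d[1][0] - d[0][0] if len(d) > 1 else float("inf")
            scored.append((margin, p, d[0][1]))  # confidence, point, label
        scored.sort(key=lambda s: -s[0])
        for _, p, lab in scored[:batch_size]:
            labeled.append((p, lab))
            unlabeled.remove(p)
    return labeled
```

    The batch structure is what makes the scheme incremental: each retraining step only has to account for a few new points, which is the property the authors exploit for online BCI use.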

  14. Weighted analysis of paired microarray experiments.

    PubMed

    Kristiansson, Erik; Sjögren, Anders; Rudemo, Mats; Nerman, Olle

    2005-01-01

    In microarray experiments quality often varies, for example between samples and between arrays. The need for quality control is therefore strong. A statistical model and a corresponding analysis method is suggested for experiments with pairing, including designs with individuals observed before and after treatment and many experiments with two-colour spotted arrays. The model is of mixed type with some parameters estimated by an empirical Bayes method. Differences in quality are modelled by individual variances and correlations between repetitions. The method is applied to three real and several simulated datasets. Two of the real datasets are of Affymetrix type with patients profiled before and after treatment, and the third dataset is of two-colour spotted cDNA type. In all cases, the patients or arrays had different estimated variances, leading to distinctly unequal weights in the analysis. We suggest also plots which illustrate the variances and correlations that affect the weights computed by our analysis method. For simulated data the improvement relative to previously published methods without weighting is shown to be substantial.
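
    The core weighting idea, downweighting arrays or patients with large estimated variances, reduces to inverse-variance weighting. The sketch below shows that computation in isolation; the paper's full method also estimates the variances and correlations themselves via empirical Bayes, which is not reproduced here, and the function name is hypothetical.

```python
def weighted_estimate(effects, variances):
    """Inverse-variance weighting: each observation gets weight 1/variance,
    so noisier observations contribute proportionally less to the estimate.
    Returns the weighted mean and the normalized weights."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * x for w, x in zip(weights, effects)) / total
    return mean, [w / total for w in weights]
```

    With equal variances this collapses to the unweighted mean, matching the unweighted analyses the authors compare against.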

  15. Information-processing differences and laterality of students from different colleges and disciplines.

    PubMed

    Monfort, M; Martin, S A; Frederickson, W

    1990-02-01

    1023 college students were assessed for hemispheric brain dominance using the paper-and-pencil test, the Human Information Processing Survey. Analysis of scores of students majoring in Advertising, Interior Design, Music, Journalism, Art, Oral Communication, and Architecture suggested a preference for right-brain hemispheric processing, while scores of students majoring in Accounting, Management, Finance, Computer Science, Mathematics, Nursing, Funeral Service, Criminal Justice, and Elementary Education suggested a preference for left-hemispheric strategies for processing information. The differential effects of hemispheric processing in an educational system emphasizing the left-hemispheric activities of structured logic and sequential processing suggest repression of the intellectual development of those students who may be genetically favorable to right-hemispheric processing.

  16. Numerosity as a topological invariant.

    PubMed

    Kluth, Tobias; Zetzsche, Christoph

    2016-01-01

    The ability to quickly recognize the number of objects in our environment is a fundamental cognitive function. However, it is far from clear which computations and which actual neural processing mechanisms are used to provide us with such a skill. Here we try to provide a detailed and comprehensive analysis of this issue, which comprises both the basic mathematical foundations and the peculiarities imposed by the structure of the visual system and by the neural computations provided by the visual cortex. We suggest that numerosity should be considered as a mathematical invariant. Making use of concepts from mathematical topology--like connectedness, Betti numbers, and the Gauss-Bonnet theorem--we derive the basic computations suited for the computation of this invariant. We show that the computation of numerosity is possible in a neurophysiologically plausible fashion using only computational elements which are known to exist in the visual cortex. We further show that a fundamental feature of numerosity perception, its Weber property, arises naturally, assuming noise in the basic neural operations. The model is tested on an extended data set (made publicly available). It is hoped that our results can provide a general framework for future research on the invariance properties of the numerosity system.
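
    The topological notion used here, numerosity as the number of connected components of the stimulus (the 0th Betti number), is easy to make concrete for binary images. The flood-fill sketch below, assuming 4-connectivity, is an illustrative computation of that invariant, not the neurally plausible mechanism the paper develops.

```python
def count_objects(grid):
    """Count connected components (the 0th Betti number) of the 'on'
    cells in a binary grid, using 4-connectivity and iterative flood fill."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and (r, c) not in seen:
                count += 1                  # found a new object
                stack = [(r, c)]
                while stack:                # flood-fill the whole component
                    y, x = stack.pop()
                    if (y, x) in seen:
                        continue
                    seen.add((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols and grid[ny][nx]:
                            stack.append((ny, nx))
    return count
```

    The count is invariant under translation, rotation, and size changes of the objects, which is exactly the property that makes a topological definition of numerosity attractive.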

  17. Bio and health informatics meets cloud : BioVLab as an example.

    PubMed

    Chae, Heejoon; Jung, Inuk; Lee, Hyungro; Marru, Suresh; Lee, Seong-Whan; Kim, Sun

    2013-01-01

    The exponential increase of genomic data brought by the advent of next- or third-generation sequencing (NGS) technologies and the dramatic drop in sequencing cost have driven biological and medical sciences to data-driven sciences. This revolutionary paradigm shift comes with challenges in terms of the transfer, storage, computation, and analysis of big bio/medical data. Cloud computing is a service model sharing a pool of configurable resources, which is a suitable workbench to address these challenges. From the medical or biological perspective, providing computing power and storage is the most attractive feature of cloud computing in handling the ever increasing biological data. As data increase in size, many research organizations start to experience a lack of computing power, which becomes a major hurdle in achieving research goals. In this paper, we review the features of publicly available bio and health cloud systems in terms of graphical user interface, external data integration, security and extensibility of features. We then discuss issues and limitations of current cloud systems and conclude by suggesting the concept of a biological cloud environment, which can be defined as a total workbench environment assembling computational tools and databases for analyzing bio/medical big data in particular application domains.

  18. On-line confidence monitoring during decision making.

    PubMed

    Dotan, Dror; Meyniel, Florent; Dehaene, Stanislas

    2018-02-01

    Humans can readily assess their degree of confidence in their decisions. Two models of confidence computation have been proposed: post hoc computation using post-decision variables and heuristics, versus online computation using continuous assessment of evidence throughout the decision-making process. Here, we arbitrate between these theories by continuously monitoring finger movements during a manual sequential decision-making task. Analysis of finger kinematics indicated that subjects kept separate online records of evidence and confidence: finger deviation continuously reflected the ongoing accumulation of evidence, whereas finger speed continuously reflected the momentary degree of confidence. Furthermore, end-of-trial finger speed predicted the post-decisional subjective confidence rating. These data indicate that confidence is computed on-line, throughout the decision process. Speed-confidence correlations were previously interpreted as a post-decision heuristics, whereby slow decisions decrease subjective confidence, but our results suggest an adaptive mechanism that involves the opposite causality: by slowing down when unconfident, participants gain time to improve their decisions. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Evolutionary-inspired probabilistic search for enhancing sampling of local minima in the protein energy surface

    PubMed Central

    2012-01-01

    Background Despite computational challenges, elucidating conformations that a protein system assumes under physiologic conditions for the purpose of biological activity is a central problem in computational structural biology. While these conformations are associated with low energies in the energy surface that underlies the protein conformational space, few existing conformational search algorithms focus on explicitly sampling low-energy local minima in the protein energy surface. Methods This work proposes a novel probabilistic search framework, PLOW, that explicitly samples low-energy local minima in the protein energy surface. The framework combines algorithmic ingredients from evolutionary computation and computational structural biology to effectively explore the subspace of local minima. A greedy local search maps a conformation sampled in conformational space to a nearby local minimum. A perturbation move jumps out of a local minimum to obtain a new starting conformation for the greedy local search. The process repeats in an iterative fashion, resulting in a trajectory-based exploration of the subspace of local minima. Results and conclusions The analysis of PLOW's performance shows that, by navigating only the subspace of local minima, PLOW is able to sample conformations near a protein's native structure, either more effectively or as well as state-of-the-art methods that focus on reproducing the native structure for a protein system. Analysis of the actual subspace of local minima shows that PLOW samples this subspace more effectively that a naive sampling approach. Additional theoretical analysis reveals that the perturbation function employed by PLOW is key to its ability to sample a diverse set of low-energy conformations. This analysis also suggests directions for further research and novel applications for the proposed framework. PMID:22759582
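
    PLOW's loop, a greedy local search that maps a sample to a nearby minimum followed by a perturbation jump out of that minimum, is an instance of iterated local search (basin hopping). The toy one-dimensional sketch below illustrates only that control flow, with a made-up double-well energy function standing in for a protein energy surface; all names and parameters are illustrative.

```python
import random


def f(x):
    return (x * x - 1.0) ** 2  # double well: local minima at x = -1 and x = +1


def greedy_descent(x, step=0.01, iters=2000):
    """Greedy local search: move to a neighboring point whenever it lowers f."""
    for _ in range(iters):
        for cand in (x - step, x + step):
            if f(cand) < f(x):
                x = cand
    return x


def plow_like_search(n_hops=20, seed=1):
    """Iterated local search in the spirit of PLOW: descend to a nearby
    local minimum, record it, perturb to escape the basin, and repeat."""
    rng = random.Random(seed)
    x = rng.uniform(-3.0, 3.0)
    minima = []
    for _ in range(n_hops):
        x = greedy_descent(x)
        minima.append(round(x, 2))
        x += rng.uniform(-2.0, 2.0)  # perturbation move out of the minimum
    return sorted(set(minima))
```

    The perturbation magnitude plays the role the paper's analysis highlights: too small and the search never leaves a basin, too large and it degenerates into random restarts.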

  20. Structural analysis of two different stent configurations.

    PubMed

    Simão, M; Ferreira, J M; Mora-Rodriguez, J; Ramos, H M

    2017-06-01

Two different stent configurations (the well-known Palmaz-Schatz (PS) design and a new stent configuration) are investigated mechanically. A finite element model was used to study the two geometries under combined loads, and a computational fluid dynamics model based on fluid-structure interaction was developed to investigate the reactions of the plaque and the artery wall in a stented arterial segment. These models determine the stress and displacement fields of the two stents under internal pressure conditions. Results suggested that stent designs cause alterations in vascular anatomy that adversely affect arterial stress distributions within the wall, which influence vessel responses such as restenosis. The hemodynamic analysis suggests that the new stent geometry yields a better biofluid-mechanical response, in terms of both deformation and the progressive amount of plaque growth.
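
As a back-of-the-envelope complement to finite element results of this kind, the thin-wall (Laplace) approximation gives a first-order estimate of circumferential wall stress under internal pressure. The sketch below uses illustrative dimensions, not the paper's geometry or its FEM/FSI model:

```python
def hoop_stress(pressure_pa, radius_m, thickness_m):
    """Thin-wall (Laplace) estimate of circumferential stress:
    sigma = p * r / t. Valid only when thickness << radius."""
    if radius_m <= 0 or thickness_m <= 0:
        raise ValueError("radius and thickness must be positive")
    return pressure_pa * radius_m / thickness_m

# Illustrative numbers (not from the paper): a 1.5 mm radius vessel with a
# 0.5 mm wall at 120 mmHg (~16 kPa) internal pressure.
MMHG_TO_PA = 133.322
sigma = hoop_stress(120 * MMHG_TO_PA, 1.5e-3, 0.5e-3)   # ~48 kPa
```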

  1. The Handicap Principle for Trust in Computer Security, the Semantic Web and Social Networking

    NASA Astrophysics Data System (ADS)

    Ma, Zhanshan (Sam); Krings, Axel W.; Hung, Chih-Cheng

Communication is a fundamental function of life, and it exists in almost all living things: from single-cell bacteria to human beings. Communication, together with competition and cooperation, are three fundamental processes in nature. Computer scientists are familiar with the study of competition or 'struggle for life' through Darwin's evolutionary theory, or even evolutionary computing. They may be equally familiar with the study of cooperation or altruism through the Prisoner's Dilemma (PD) game. However, they are likely to be less familiar with the theory of animal communication. The objective of this article is three-fold: (i) To suggest that the study of animal communication, especially the honesty (reliability) of animal communication, in which some significant advances in behavioral biology have been achieved in the last three decades, appears poised to spawn important cross-disciplinary research similar to that generated by the study of cooperation with the PD game. One of the far-reaching advances in the field is marked by the publication of "The Handicap Principle: a Missing Piece of Darwin's Puzzle" by Zahavi (1997). The 'Handicap' principle [34][35], which states that communication signals must be costly in some proper way to be reliable (honest), is best elucidated with evolutionary games, e.g., the Sir Philip Sidney (SPS) game [23]. Accordingly, we suggest that the Handicap principle may serve as a fundamental paradigm for trust research in computer science. (ii) To suggest to computer scientists that their expertise in modeling computer networks may help behavioral biologists in their study of the reliability of animal communication networks. This is largely due to the historical reason that, until the last decade, animal communication was studied with the dyadic paradigm (sender-receiver) rather than with the network paradigm. 
(iii) To pose several open questions, the answers to which may offer fresh insights into trust research in computer science, especially secure and resilient computing, the semantic web, and social networking. One important thread unifying the three aspects is evolutionary game theory modeling, or its extensions with survival analysis and agreement algorithms [19][20], which offer powerful game models for describing time-, space-, and covariate-dependent frailty (uncertainty and vulnerability) and deception (honesty).
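
The core of the Handicap principle, that signals stay honest only when faking them costs more than it pays, can be captured in a few lines. The toy check below is a drastic simplification of the Sir Philip Sidney game; the parameter names and values are illustrative assumptions, not a model from the article:

```python
def honest_signaling_stable(benefit_needy, benefit_healthy, signal_cost):
    """In a stripped-down costly-signaling setup, honesty is stable when
    the needy sender gains from signaling (benefit_needy > cost) while a
    healthy sender would lose by faking it (cost > benefit_healthy)."""
    return benefit_needy > signal_cost > benefit_healthy

# Costly signal: only truly needy senders profit from signaling.
assert honest_signaling_stable(benefit_needy=1.0, benefit_healthy=0.2,
                               signal_cost=0.5)
# Cost-free signal: deception pays too, so the signal is unreliable.
assert not honest_signaling_stable(benefit_needy=1.0, benefit_healthy=0.2,
                                   signal_cost=0.0)
```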

  2. A Neural Basis of Facial Action Recognition in Humans

    PubMed Central

    Srinivasan, Ramprakash; Golomb, Julie D.

    2016-01-01

    By combining different facial muscle actions, called action units, humans can produce an extraordinarily large number of facial expressions. Computational models and studies in cognitive science and social psychology have long hypothesized that the brain needs to visually interpret these action units to understand other people's actions and intentions. Surprisingly, no studies have identified the neural basis of the visual recognition of these action units. Here, using functional magnetic resonance imaging and an innovative machine learning analysis approach, we identify a consistent and differential coding of action units in the brain. Crucially, in a brain region thought to be responsible for the processing of changeable aspects of the face, multivoxel pattern analysis could decode the presence of specific action units in an image. This coding was found to be consistent across people, facilitating the estimation of the perceived action units on participants not used to train the multivoxel decoder. Furthermore, this coding of action units was identified when participants attended to the emotion category of the facial expression, suggesting an interaction between the visual analysis of action units and emotion categorization as predicted by the computational models mentioned above. These results provide the first evidence for a representation of action units in the brain and suggest a mechanism for the analysis of large numbers of facial actions and a loss of this capacity in psychopathologies. SIGNIFICANCE STATEMENT Computational models and studies in cognitive and social psychology propound that visual recognition of facial expressions requires an intermediate step to identify visible facial changes caused by the movement of specific facial muscles. Because facial expressions are indeed created by moving one's facial muscles, it is logical to assume that our visual system solves this inverse problem. 
Here, using an innovative machine learning method and neuroimaging data, we identify for the first time a brain region responsible for the recognition of actions associated with specific facial muscles. Furthermore, this representation is preserved across subjects. Our machine learning analysis does not require mapping the data to a standard brain and may serve as an alternative to hyperalignment. PMID:27098688
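
Multivoxel pattern decoding of the kind described can be illustrated with a nearest-centroid classifier over voxel response vectors. This is a minimal sketch, not the authors' pipeline; the action-unit labels and the synthetic "voxel" vectors are invented for illustration:

```python
def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def sqdist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def train_decoder(labeled_patterns):
    """labeled_patterns: {action_unit_label: [voxel_vector, ...]}.
    Returns one centroid per label."""
    return {label: centroid(vs) for label, vs in labeled_patterns.items()}

def decode(model, pattern):
    """Assign the pattern to the label of the nearest centroid."""
    return min(model, key=lambda label: sqdist(model[label], pattern))

# Synthetic training responses for two action units (AU12, AU4).
train = {
    "AU12": [[1.0, 0.1, 0.0], [0.9, 0.2, 0.1]],
    "AU4":  [[0.0, 0.9, 1.0], [0.1, 1.0, 0.8]],
}
model = train_decoder(train)
predicted = decode(model, [0.8, 0.15, 0.05])   # -> "AU12"
```

Cross-subject decoding, as in the study, would amount to training the centroids on patterns from some participants and calling `decode` on patterns from a held-out participant.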

  3. Allele-sharing models: LOD scores and accurate linkage tests.

    PubMed

    Kong, A; Cox, N J

    1997-11-01

    Starting with a test statistic for linkage analysis based on allele sharing, we propose an associated one-parameter model. Under general missing-data patterns, this model allows exact calculation of likelihood ratios and LOD scores and has been implemented by a simple modification of existing software. Most important, accurate linkage tests can be performed. Using an example, we show that some previously suggested approaches to handling less than perfectly informative data can be unacceptably conservative. Situations in which this model may not perform well are discussed, and an alternative model that requires additional computations is suggested.
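
For fully informative meioses, the LOD score reduces to a simple likelihood ratio in the recombination fraction θ. The sketch below is the textbook two-point calculation, not the paper's allele-sharing model:

```python
import math

def lod_score(recombinants, nonrecombinants, theta):
    """LOD = log10( L(theta) / L(0.5) ) for fully informative meioses,
    where L(t) = t^R * (1 - t)^NR."""
    if not 0.0 < theta < 0.5:
        raise ValueError("theta must lie in (0, 0.5)")
    n = recombinants + nonrecombinants
    loglik = (recombinants * math.log10(theta)
              + nonrecombinants * math.log10(1.0 - theta))
    return loglik - n * math.log10(0.5)

def max_lod(recombinants, nonrecombinants):
    """Evaluate the LOD at the MLE theta = R / N, clamped into (0, 0.5)."""
    n = recombinants + nonrecombinants
    theta = min(max(recombinants / n, 1e-6), 0.5 - 1e-6)
    return lod_score(recombinants, nonrecombinants, theta)

# 2 recombinants out of 20 meioses gives a LOD of about 3.2,
# just past the classical threshold of 3 for declaring linkage.
z = max_lod(2, 18)
```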

  4. Allele-sharing models: LOD scores and accurate linkage tests.

    PubMed Central

    Kong, A; Cox, N J

    1997-01-01

    Starting with a test statistic for linkage analysis based on allele sharing, we propose an associated one-parameter model. Under general missing-data patterns, this model allows exact calculation of likelihood ratios and LOD scores and has been implemented by a simple modification of existing software. Most important, accurate linkage tests can be performed. Using an example, we show that some previously suggested approaches to handling less than perfectly informative data can be unacceptably conservative. Situations in which this model may not perform well are discussed, and an alternative model that requires additional computations is suggested. PMID:9345087

  5. Performance evaluation of the Engineering Analysis and Data Systems (EADS) 2

    NASA Technical Reports Server (NTRS)

    Debrunner, Linda S.

    1994-01-01

The Engineering Analysis and Data System (EADS) II (1) was installed in March 1993 to provide high performance computing for science and engineering at Marshall Space Flight Center (MSFC). EADS II increased the computing capabilities over the existing EADS facility in the areas of throughput and mass storage. EADS II includes a Vector Processor Compute System (VPCS), a Virtual Memory Compute System (VMCS), a Common File System (CFS), and a Common Output System (COS), as well as Image Processing Stations, Mini Super Computers, and Intelligent Workstations. These facilities are interconnected by a sophisticated network system. This work considers only the performance of the VPCS and the CFS. The VPCS is a Cray YMP. The CFS is implemented on an RS 6000 using the UniTree Mass Storage System. To better meet the science and engineering computing requirements, EADS II must be monitored, its performance analyzed, and appropriate modifications for performance improvement made. Implementing this approach requires tool(s) to assist in performance monitoring and analysis. In Spring 1994, PerfStat 2.0 was purchased to meet these needs for the VPCS and the CFS. PerfStat (2) is a set of tools that can be used to analyze both historical and real-time performance data. Its flexible design allows significant user customization. The user identifies what data is collected, how it is classified, and how it is displayed for evaluation. Both graphical and tabular displays are supported. The capability of the PerfStat tool was evaluated, appropriate modifications to EADS II to optimize throughput and enhance productivity were suggested and implemented, and the effects of these modifications on the system's performance were observed. In this paper, the PerfStat tool is described, then its use with EADS II is outlined briefly. Next, the evaluation of the VPCS, as well as the modifications made to the system, are described. Finally, conclusions are drawn and recommendations for future work are outlined.

  6. The Role of Computer-Assisted Technology in Post-Traumatic Orbital Reconstruction: A PRISMA-driven Systematic Review.

    PubMed

    Wan, Kelvin H; Chong, Kelvin K L; Young, Alvin L

    2015-12-08

    Post-traumatic orbital reconstruction remains a surgical challenge and requires careful preoperative planning, sound anatomical knowledge and good intraoperative judgment. Computer-assisted technology has the potential to reduce error and subjectivity in the management of these complex injuries. A systematic review of the literature was conducted to explore the emerging role of computer-assisted technologies in post-traumatic orbital reconstruction, in terms of functional and safety outcomes. We searched for articles comparing computer-assisted procedures with conventional surgery and studied outcomes on diplopia, enophthalmos, or procedure-related complications. Six observational studies with 273 orbits at a mean follow-up of 13 months were included. Three out of 4 studies reported significantly fewer patients with residual diplopia in the computer-assisted group, while only 1 of the 5 studies reported better improvement in enophthalmos in the assisted group. Types and incidence of complications were comparable. Study heterogeneities limiting statistical comparison by meta-analysis will be discussed. This review highlights the scarcity of data on computer-assisted technology in orbital reconstruction. The result suggests that computer-assisted technology may offer potential advantage in treating diplopia while its role remains to be confirmed in enophthalmos. Additional well-designed and powered randomized controlled trials are much needed.

  7. The importance of employing computational resources for the automation of drug discovery.

    PubMed

    Rosales-Hernández, Martha Cecilia; Correa-Basurto, José

    2015-03-01

The application of computational tools to drug discovery helps researchers to design and evaluate new drugs swiftly and with reduced economic resources. To discover new potential drugs, computational chemistry incorporates automation for obtaining biological data such as absorption, distribution, metabolism, excretion and toxicity (ADMET), as well as drug mechanisms of action. This editorial looks at examples of these computational tools, including docking, molecular dynamics simulation, virtual screening, quantum chemistry, quantitative structure-activity relationship, principal component analysis and drug screening workflow systems. The authors then provide their perspectives on the importance of these techniques for drug discovery. Computational tools help researchers to design and discover new drugs for the treatment of several human diseases without side effects, thus allowing for the evaluation of millions of compounds with a reduced cost in both time and economic resources. The problem is that operating each program is difficult; one is required to use several programs and understand each of the properties being tested. In the future, it is possible that a single computer and software program will be capable of evaluating the complete set of properties (mechanisms of action and ADMET properties) of ligands. It is also possible that, after submitting one target, this computer software will be capable of suggesting potential compounds along with ways to synthesize them, and presenting biological models for testing.

  8. Electronic Circuit Analysis Language (ECAL)

    NASA Astrophysics Data System (ADS)

    Chenghang, C.

    1983-03-01

The computer aided design technique is an important development in computer applications and it is an important component of computer science. A special language for electronic circuit analysis is the foundation of computer aided design or computer aided circuit analysis (abbreviated as CACD and CACA) of simulated circuits. Electronic circuit analysis language (ECAL) is a comparatively simple and easy to use circuit analysis special language which uses the FORTRAN language to carry out interpretive execution. It is capable of conducting dc analysis, ac analysis, and transient analysis of a circuit. Furthermore, the results of the dc analysis can be used directly as the initial conditions for the ac and transient analyses.
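
The dc analysis such a tool performs amounts to solving the nodal equations G·v = i for the node voltages. A self-contained toy solver (ECAL itself is FORTRAN-based; this Python sketch only mirrors the idea, with node 0 taken as ground):

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def dc_nodal_analysis(num_nodes, resistors, current_sources):
    """Build and solve G*v = i for node voltages (node 0 is ground).
    resistors: (node_a, node_b, ohms); current_sources: (from, to, amps)."""
    n = num_nodes - 1                      # unknowns exclude ground
    G = [[0.0] * n for _ in range(n)]
    i = [0.0] * n
    for a, b, r in resistors:
        g = 1.0 / r                        # stamp the conductance
        for node, other in ((a, b), (b, a)):
            if node != 0:
                G[node - 1][node - 1] += g
                if other != 0:
                    G[node - 1][other - 1] -= g
    for src, dst, amps in current_sources:
        if src != 0:
            i[src - 1] -= amps
        if dst != 0:
            i[dst - 1] += amps
    return solve_linear(G, i)

# 1 A pushed into node 1; two 2-ohm resistors: node1 -- node2 -- ground.
v = dc_nodal_analysis(3, [(1, 2, 2.0), (2, 0, 2.0)], [(0, 1, 1.0)])
# v[0] = voltage at node 1 (4 V), v[1] = voltage at node 2 (2 V)
```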

  9. An Open Source Rapid Computer Aided Control System Design Toolchain Using Scilab, Scicos and RTAI Linux

    NASA Astrophysics Data System (ADS)

    Bouchpan-Lerust-Juéry, L.

    2007-08-01

Current and next generation on-board computer systems tend to implement real-time embedded control applications (e.g. Attitude and Orbit Control Subsystem (AOCS), Packet Utilization Standard (PUS), spacecraft autonomy . . . ) which must meet high standards of Reliability and Predictability as well as Safety. All these requirements demand a considerable amount of effort and cost from the Space Software Industry. This paper, in its first part, presents a free Open Source integrated solution for developing RTAI applications, covering analysis, design, simulation and direct implementation using code generation, and in its second part summarises this suggested approach, its results and the conclusions for further work.

  10. Mind, Machine, and Creativity: An Artist's Perspective.

    PubMed

    Sundararajan, Louise

    2014-06-01

    Harold Cohen is a renowned painter who has developed a computer program, AARON, to create art. While AARON has been hailed as one of the most creative AI programs, Cohen consistently rejects the claims of machine creativity. Questioning the possibility for AI to model human creativity, Cohen suggests in so many words that the human mind takes a different route to creativity, a route that privileges the relational, rather than the computational, dimension of cognition. This unique perspective on the tangled web of mind, machine, and creativity is explored by an application of three relational models of the mind to an analysis of Cohen's talks and writings, which are available on his website: www.aaronshome.com.

  11. Mind, Machine, and Creativity: An Artist's Perspective

    PubMed Central

    Sundararajan, Louise

    2014-01-01

    Harold Cohen is a renowned painter who has developed a computer program, AARON, to create art. While AARON has been hailed as one of the most creative AI programs, Cohen consistently rejects the claims of machine creativity. Questioning the possibility for AI to model human creativity, Cohen suggests in so many words that the human mind takes a different route to creativity, a route that privileges the relational, rather than the computational, dimension of cognition. This unique perspective on the tangled web of mind, machine, and creativity is explored by an application of three relational models of the mind to an analysis of Cohen's talks and writings, which are available on his website: www.aaronshome.com. PMID:25541564

  12. Computational search for hypotheses concerning the endocannabinoid contribution to the extinction of fear conditioning.

    PubMed

    Anastasio, Thomas J

    2013-01-01

    Fear conditioning, in which a cue is conditioned to elicit a fear response, and extinction, in which a previously conditioned cue no longer elicits a fear response, depend on neural plasticity occurring within the amygdala. Projection neurons in the basolateral amygdala (BLA) learn to respond to the cue during fear conditioning, and they mediate fear responding by transferring cue signals to the output stage of the amygdala. Some BLA projection neurons retain their cue responses after extinction. Recent work shows that activation of the endocannabinoid system is necessary for extinction, and it leads to long-term depression (LTD) of the GABAergic synapses that inhibitory interneurons make onto BLA projection neurons. Such GABAergic LTD would enhance the responses of the BLA projection neurons that mediate fear responding, so it would seem to oppose, rather than promote, extinction. To address this paradox, a computational analysis of two well-known conceptual models of amygdaloid plasticity was undertaken. The analysis employed exhaustive state-space search conducted within a declarative programming environment. The analysis reveals that GABAergic LTD actually increases the number of synaptic strength configurations that achieve extinction while preserving the cue responses of some BLA projection neurons in both models. The results suggest that GABAergic LTD helps the amygdala retain cue memory during extinction even as the amygdala learns to suppress the previously conditioned response. The analysis also reveals which features of both models are essential for their ability to achieve extinction with some cue memory preservation, and suggests experimental tests of those features.
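
The exhaustive state-space search can be illustrated with a deliberately tiny model: enumerate discrete synaptic strengths and count configurations that suppress the conditioned response while preserving the cue response. The weight levels, the threshold rule, and the LTD scaling factor below are illustrative assumptions, not the paper's declarative models:

```python
from itertools import product

def search_configurations(levels=(0.0, 0.5, 1.0), gaba_ltd=False):
    """Exhaustively enumerate strength configurations for three inputs to a
    toy BLA projection neuron: cue excitation (w_cue), interneuron
    inhibition (w_inh), and extinction-circuit inhibition (w_ext).
    A configuration 'achieves extinction with cue-memory preservation' if
    the neuron still responds to the cue alone but is silenced once the
    extinction circuit engages."""
    ltd_scale = 0.5 if gaba_ltd else 1.0   # GABAergic LTD weakens w_inh
    ok = []
    for w_cue, w_inh, w_ext in product(levels, repeat=3):
        drive_cue = w_cue - ltd_scale * w_inh
        drive_ext = drive_cue - w_ext
        if drive_cue > 0 and drive_ext <= 0:   # cue kept, output suppressed
            ok.append((w_cue, w_inh, w_ext))
    return ok

without_ltd = search_configurations(gaba_ltd=False)
with_ltd = search_configurations(gaba_ltd=True)
# Even in this toy model, weakening inhibition enlarges the set of
# configurations that satisfy both conditions, echoing the paper's finding.
```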

  13. Computational search for hypotheses concerning the endocannabinoid contribution to the extinction of fear conditioning

    PubMed Central

    Anastasio, Thomas J.

    2013-01-01

    Fear conditioning, in which a cue is conditioned to elicit a fear response, and extinction, in which a previously conditioned cue no longer elicits a fear response, depend on neural plasticity occurring within the amygdala. Projection neurons in the basolateral amygdala (BLA) learn to respond to the cue during fear conditioning, and they mediate fear responding by transferring cue signals to the output stage of the amygdala. Some BLA projection neurons retain their cue responses after extinction. Recent work shows that activation of the endocannabinoid system is necessary for extinction, and it leads to long-term depression (LTD) of the GABAergic synapses that inhibitory interneurons make onto BLA projection neurons. Such GABAergic LTD would enhance the responses of the BLA projection neurons that mediate fear responding, so it would seem to oppose, rather than promote, extinction. To address this paradox, a computational analysis of two well-known conceptual models of amygdaloid plasticity was undertaken. The analysis employed exhaustive state-space search conducted within a declarative programming environment. The analysis reveals that GABAergic LTD actually increases the number of synaptic strength configurations that achieve extinction while preserving the cue responses of some BLA projection neurons in both models. The results suggest that GABAergic LTD helps the amygdala retain cue memory during extinction even as the amygdala learns to suppress the previously conditioned response. The analysis also reveals which features of both models are essential for their ability to achieve extinction with some cue memory preservation, and suggests experimental tests of those features. PMID:23761759

  14. Computational Analysis of Pharyngeal Swallowing Mechanics in Patients with Motor Neuron Disease: A Pilot Investigation.

    PubMed

    Garand, K L; Schwertner, Ryan; Chen, Amy; Pearson, William G

    2018-04-01

Swallowing impairment (dysphagia) is a common sequela in patients with motor neuron disease (MND). The purpose of this retrospective, observational pilot investigation was to characterize how pharyngeal swallowing mechanics are impacted in patients with MND using a comparison with a healthy, non-dysphagic control group. Computational analysis of swallowing mechanics (CASM) was used to determine covariate biomechanics of pharyngeal swallowing from videofluoroscopic assessment in 15 patients with MND and 15 age- and sex-matched healthy controls. Canonical variate analysis with post hoc discriminant function analysis (DFA) was performed on coordinate data mapping functional muscle groups underlying pharyngeal swallowing. Differences in swallowing mechanics associated with group (MND; control), motor neuron predominance (upper; lower), onset (bulbar; spinal), and swallow task (thin, pudding) were evaluated and visualized. Pharyngeal swallowing mechanics differed significantly in patients with MND compared with healthy controls (D = 2.01, p < 0.0001). Post hoc DFA pairwise comparisons suggest differences in pharyngeal swallow mechanics by motor neuron predominance (D = 5.03, p < 0.0001), onset (D = 2.03, p < 0.0001), and swallow task (D = 1.04, p < 0.0001). Pharyngeal swallowing mechanics of patients with MND differ from, and are more heterogeneous than, those of healthy controls. These findings suggest patients with MND may compensate for reductions in pharyngeal shortening and tongue base retraction by extending the head and neck and increasing hyolaryngeal excursion. This work and further CASM investigations will lead to further insights into the development and evaluation of targeted clinical treatments designed to prolong safe and efficient swallowing function in patients with MND.

15. Achieving benefit for patients in primary care informatics: the report of an international consensus workshop at Medinfo 2007.

    PubMed

    de Lusignan, Simon; Teasdale, Sheila

    2007-01-01

Landmark reports suggest that sharing health data between clinical computer systems should improve patient safety and the quality of care. Enhancing the use of informatics in primary care is usually a key part of these strategies. The aim was to synthesise the learning from the international use of informatics in primary care. The workshop was attended by 21 delegates drawn from all continents. There were presentations from the USA, the UK and the Netherlands, and informal updates from Australia, Argentina, Sweden and the Nordic countries. These presentations were discussed in a workshop setting to identify common issues. Key principles were synthesised through a post-workshop analysis and then sorted into themes. Themes emerged about the deployment of informatics which can be applied at health service, practice and individual clinical consultation level: (1) At the health service or provider level, success appeared proportional to the extent of collaboration between a broad range of stakeholders and identification of leaders. (2) Within the practice, much is currently being achieved with legacy computer systems and apparently outdated coding systems. This includes prescribing safety alerts, clinical audit and promoting computer data recording and quality. (3) In the consultation, the computer is a 'big player' and may make traditional models of the consultation redundant. We should make more efforts to share learning; develop clear internationally acceptable definitions; highlight gaps between pockets of excellence and real-world practice, and most importantly suggest how they might be bridged. Knowledge synthesis from different health systems may provide a greater understanding of how the third actor (the computer) is best used in primary care.

  16. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  17. Noninvasive Computed Tomography–based Risk Stratification of Lung Adenocarcinomas in the National Lung Screening Trial

    PubMed Central

    Maldonado, Fabien; Duan, Fenghai; Raghunath, Sushravya M.; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Garg, Kavita; Greco, Erin; Nath, Hrudaya; Robb, Richard A.; Bartholmai, Brian J.

    2015-01-01

Rationale: Screening for lung cancer using low-dose computed tomography (CT) reduces lung cancer mortality. However, in addition to a high rate of benign nodules, lung cancer screening detects a large number of indolent cancers that generally belong to the adenocarcinoma spectrum. Individualized management of screen-detected adenocarcinomas would be facilitated by noninvasive risk stratification. Objectives: To validate that Computer-Aided Nodule Assessment and Risk Yield (CANARY), a novel image analysis software, successfully risk stratifies screen-detected lung adenocarcinomas based on clinical disease outcomes. Methods: We retrospectively identified 294 eligible patients diagnosed with lung adenocarcinoma spectrum lesions in the low-dose CT arm of the National Lung Screening Trial. The last low-dose CT scan before the diagnosis of lung adenocarcinoma was analyzed using CANARY, blinded to clinical data. Based on their parametric CANARY signatures, all the lung adenocarcinoma nodules were risk stratified into three groups. CANARY risk groups were compared using survival analysis for progression-free survival. Measurements and Main Results: A total of 294 patients were included in the analysis. Kaplan-Meier analysis of all the 294 adenocarcinoma nodules stratified into the Good, Intermediate, and Poor CANARY risk groups yielded distinct progression-free survival curves (P < 0.0001). This observation was confirmed in the unadjusted and adjusted (age, sex, race, and smoking status) progression-free survival analysis of all stage I cases. Conclusions: CANARY allows the noninvasive risk stratification of lung adenocarcinomas into three groups with distinct post-treatment progression-free survival. Our results suggest that CANARY could ultimately facilitate individualized management of incidentally or screen-detected lung adenocarcinomas. PMID:26052977

  18. Noninvasive Computed Tomography-based Risk Stratification of Lung Adenocarcinomas in the National Lung Screening Trial.

    PubMed

    Maldonado, Fabien; Duan, Fenghai; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Karwoski, Ronald A; Garg, Kavita; Greco, Erin; Nath, Hrudaya; Robb, Richard A; Bartholmai, Brian J; Peikert, Tobias

    2015-09-15

Screening for lung cancer using low-dose computed tomography (CT) reduces lung cancer mortality. However, in addition to a high rate of benign nodules, lung cancer screening detects a large number of indolent cancers that generally belong to the adenocarcinoma spectrum. Individualized management of screen-detected adenocarcinomas would be facilitated by noninvasive risk stratification. To validate that Computer-Aided Nodule Assessment and Risk Yield (CANARY), a novel image analysis software, successfully risk stratifies screen-detected lung adenocarcinomas based on clinical disease outcomes. We retrospectively identified 294 eligible patients diagnosed with lung adenocarcinoma spectrum lesions in the low-dose CT arm of the National Lung Screening Trial. The last low-dose CT scan before the diagnosis of lung adenocarcinoma was analyzed using CANARY, blinded to clinical data. Based on their parametric CANARY signatures, all the lung adenocarcinoma nodules were risk stratified into three groups. CANARY risk groups were compared using survival analysis for progression-free survival. A total of 294 patients were included in the analysis. Kaplan-Meier analysis of all the 294 adenocarcinoma nodules stratified into the Good, Intermediate, and Poor CANARY risk groups yielded distinct progression-free survival curves (P < 0.0001). This observation was confirmed in the unadjusted and adjusted (age, sex, race, and smoking status) progression-free survival analysis of all stage I cases. CANARY allows the noninvasive risk stratification of lung adenocarcinomas into three groups with distinct post-treatment progression-free survival. Our results suggest that CANARY could ultimately facilitate individualized management of incidentally or screen-detected lung adenocarcinomas.
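
The progression-free survival curves compared in these records come from the Kaplan-Meier product-limit estimator, which is straightforward to compute directly. A minimal pure-Python version on a made-up five-patient cohort (the data are invented for illustration):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up times; events: 1 = progression observed, 0 = censored.
    Returns [(event_time, survival_probability)] at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)        # events at time t
        removed = sum(1 for tt, _ in data if tt == t)  # events + censored
        if d > 0:
            surv *= 1.0 - d / n_at_risk                # product-limit step
            curve.append((t, round(surv, 4)))
        n_at_risk -= removed
        i += removed
    return curve

# Toy cohort: progression at months 2, 4, 4; censored at months 3 and 5.
curve = kaplan_meier([2, 3, 4, 4, 5], [1, 0, 1, 1, 0])
# -> [(2, 0.8), (4, 0.2667)]
```

Comparing risk groups, as the study does, would mean computing one such curve per CANARY group and testing whether they differ (e.g., with a log-rank test).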

  19. Deep Learning in Gastrointestinal Endoscopy.

    PubMed

    Patel, Vivek; Armstrong, David; Ganguli, Malika; Roopra, Sandeep; Kantipudi, Neha; Albashir, Siwar; Kamath, Markad V

    2016-01-01

Gastrointestinal (GI) endoscopy is used to inspect the lumen or interior of the GI tract for several purposes, including, (1) making a clinical diagnosis, in real time, based on the visual appearances; (2) taking targeted tissue samples for subsequent histopathological examination; and (3) in some cases, performing therapeutic interventions targeted at specific lesions. GI endoscopy is therefore predicated on the assumption that the operator-the endoscopist-is able to identify and characterize abnormalities or lesions accurately and reproducibly. However, as in other areas of clinical medicine, such as histopathology and radiology, many studies have documented marked interobserver and intraobserver variability in lesion recognition. Thus, there is a clear need and opportunity for techniques or methodologies that will enhance the quality of lesion recognition and diagnosis and improve the outcomes of GI endoscopy. Deep learning models provide a basis to make better clinical decisions in medical image analysis. Biomedical image segmentation, classification, and registration can be improved with deep learning. Recent evidence suggests that the application of deep learning methods to medical image analysis can contribute significantly to computer-aided diagnosis. Deep learning models are usually considered to be more flexible and provide reliable solutions for image analysis problems compared to conventional computer vision models. The use of fast computers offers the possibility of real-time support, which is essential because endoscopic diagnoses must be made during the procedure. Advanced graphics processing units and cloud computing have also favored the use of machine learning, and more particularly, deep learning for patient care. This paper reviews the rapidly evolving literature on the feasibility of applying deep learning algorithms to endoscopic imaging.
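
A full deep network is beyond a short sketch, but the core supervised-learning loop these models share can be shown with a single-neuron logistic classifier trained by gradient descent. The two-feature "lesion" vectors below are invented toy data, not endoscopic features:

```python
import math

def train_logistic(samples, labels, lr=0.5, epochs=500):
    """Single-neuron classifier sigmoid(w . x + b), trained by
    gradient descent on the cross-entropy loss."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y                       # dLoss/dz for cross-entropy
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z > 0 else 0

# Made-up 2-feature frames: [redness, texture]; 1 = lesion, 0 = normal.
X = [[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]]
y = [1, 1, 0, 0]
w, b = train_logistic(X, y)
assert all(predict(w, b, x) == t for x, t in zip(X, y))
```

A deep model replaces the single linear unit with many stacked nonlinear layers, but trains with the same gradient-descent pattern.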

  20. How should a speech recognizer work?

    PubMed

    Scharenborg, Odette; Norris, Dennis; Bosch, Louis; McQueen, James M

    2005-11-12

    Although researchers studying human speech recognition (HSR) and automatic speech recognition (ASR) share a common interest in how information processing systems (human or machine) recognize spoken language, there is little communication between the two disciplines. We suggest that this lack of communication follows largely from the fact that research in these related fields has focused on the mechanics of how speech can be recognized. In Marr's (1982) terms, emphasis has been on the algorithmic and implementational levels rather than on the computational level. In this article, we provide a computational-level analysis of the task of speech recognition, which reveals the close parallels between research concerned with HSR and ASR. We illustrate this relation by presenting a new computational model of human spoken-word recognition, built using techniques from the field of ASR, that, in contrast to existing models of HSR, recognizes words from real speech input.

  1. Quantification of Energy Release in Composite Structures

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon

    2003-01-01

    Energy release rate is usually suggested as a quantifier for assessing structural damage tolerance. Computational prediction of energy release rate is based on composite mechanics with micro-stress level damage assessment, finite element structural analysis and damage progression tracking modules. This report examines several issues associated with energy release rates in composite structures as follows: Chapter I demonstrates computational simulation of an adhesively bonded composite joint and validates the computed energy release rates by comparison with acoustic emission signals in the overall sense. Chapter II investigates the effect of crack plane orientation with respect to fiber direction on the energy release rates. Chapter III quantifies the effects of contiguous constraint plies on the residual stiffness of a 90 deg ply subjected to transverse tensile fractures. Chapter IV compares ICAN and ICAN/JAVA solutions of composites. Chapter V examines the effects of composite structural geometry and boundary conditions on damage progression characteristics.

  2. Quantification of Energy Release in Composite Structures

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos C. (Technical Monitor)

    2003-01-01

    Energy release rate is usually suggested as a quantifier for assessing structural damage tolerance. Computational prediction of energy release rate is based on composite mechanics with micro-stress level damage assessment, finite element structural analysis and damage progression tracking modules. This report examines several issues associated with energy release rates in composite structures as follows: Chapter I demonstrates computational simulation of an adhesively bonded composite joint and validates the computed energy release rates by comparison with acoustic emission signals in the overall sense. Chapter II investigates the effect of crack plane orientation with respect to fiber direction on the energy release rates. Chapter III quantifies the effects of contiguous constraint plies on the residual stiffness of a 90 deg ply subjected to transverse tensile fractures. Chapter IV compares ICAN and ICAN/JAVA solutions of composites. Chapter V examines the effects of composite structural geometry and boundary conditions on damage progression characteristics.

  3. Computational tools for copy number variation (CNV) detection using next-generation sequencing data: features and perspectives.

    PubMed

    Zhao, Min; Wang, Qingguo; Wang, Quan; Jia, Peilin; Zhao, Zhongming

    2013-01-01

    Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) or genotyping arrays were the standard technologies for detecting large regions subject to copy number changes in genomes until recently, when high-resolution sequence data produced by next-generation sequencing (NGS) became available for analysis. During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled the development of numerous computational methods and tools for CNV detection. In this article, we review the recent advances in computational methods pertaining to CNV detection using whole genome and whole exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development.

  4. Computational tools for copy number variation (CNV) detection using next-generation sequencing data: features and perspectives

    PubMed Central

    2013-01-01

    Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) or genotyping arrays were the standard technologies for detecting large regions subject to copy number changes in genomes until recently, when high-resolution sequence data produced by next-generation sequencing (NGS) became available for analysis. During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled the development of numerous computational methods and tools for CNV detection. In this article, we review the recent advances in computational methods pertaining to CNV detection using whole genome and whole exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development. PMID:24564169
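    As an illustrative sketch of the simplest read-depth approach such tools build on (the function name, window depths, and diploid baseline here are hypothetical, not drawn from any reviewed tool):

```python
import numpy as np

# Hypothetical read-depth sketch: infer a copy-number state per genomic window
# by comparing observed coverage to the genome-wide median (diploid baseline).
def call_copy_number(depths, diploid_copy=2):
    depths = np.asarray(depths, dtype=float)
    baseline = np.median(depths)  # assumes most windows are copy-neutral
    return np.round(diploid_copy * depths / baseline).astype(int)

# Mean read depth per window: a duplication (~2x) and a deletion (~0.5x)
windows = [100, 98, 103, 205, 199, 101, 48, 99]
states = call_copy_number(windows)  # 2 = normal, 4 = gain, 1 = loss
```

    Real tools add GC-content correction, mappability filters, and segmentation across windows, which this sketch omits.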

  5. Children's strategies to solving additive inverse problems: a preliminary analysis

    NASA Astrophysics Data System (ADS)

    Ding, Meixia; Auxter, Abbey E.

    2017-03-01

    Prior studies show that elementary school children generally "lack" formal understanding of inverse relations. This study goes beyond lack to explore what children might "have" in their existing conception. A total of 281 students, kindergarten to third grade, were recruited to respond to a questionnaire that involved both contextual and non-contextual tasks on inverse relations, requiring both computational and explanatory skills. Results showed that children demonstrated better performance in computation than explanation. However, many students' explanations indicated that they did not necessarily utilize inverse relations for computation. Rather, they appeared to possess partial understanding, as evidenced by their use of part-whole structure, which is a key to understanding inverse relations. A close inspection of children's solution strategies further revealed that the sophistication of children's conception of part-whole structure varied in representation use and unknown quantity recognition, which suggests rich opportunities to develop students' understanding of inverse relations in lower elementary classrooms.

  6. Correlative multiple porosimetries for reservoir sandstones with adoption of a new reference-sample-guided computed-tomographic method.

    PubMed

    Jin, Jae Hwa; Kim, Junho; Lee, Jeong-Yil; Oh, Young Min

    2016-07-22

    One of the main interests in petroleum geology and reservoir engineering is to quantify the porosity of reservoir beds as accurately as possible. A variety of direct measurements, including methods of mercury intrusion, helium injection and petrographic image analysis, have been developed; however, their application frequently yields equivocal results because these methods differ in their theoretical bases, means of measurement, and causes of measurement error. Here, we present a set of porosities measured in Berea Sandstone samples by the multiple methods, in particular with adoption of a new method using computed tomography and reference samples. The multiple porosimetric data show a marked correlation among the different methods, suggesting that these methods are compatible with each other. The new method of reference-sample-guided computed tomography is more effective than the previous methods when accompanying merits such as experimental convenience are taken into account.

  7. Correlative multiple porosimetries for reservoir sandstones with adoption of a new reference-sample-guided computed-tomographic method

    PubMed Central

    Jin, Jae Hwa; Kim, Junho; Lee, Jeong-Yil; Oh, Young Min

    2016-01-01

    One of the main interests in petroleum geology and reservoir engineering is to quantify the porosity of reservoir beds as accurately as possible. A variety of direct measurements, including methods of mercury intrusion, helium injection and petrographic image analysis, have been developed; however, their application frequently yields equivocal results because these methods differ in their theoretical bases, means of measurement, and causes of measurement error. Here, we present a set of porosities measured in Berea Sandstone samples by the multiple methods, in particular with adoption of a new method using computed tomography and reference samples. The multiple porosimetric data show a marked correlation among the different methods, suggesting that these methods are compatible with each other. The new method of reference-sample-guided computed tomography is more effective than the previous methods when accompanying merits such as experimental convenience are taken into account. PMID:27445105
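    A minimal sketch of the CT side of such an analysis, assuming a simple global attenuation threshold separates pore space from grain (the threshold and names are illustrative, not the paper's reference-sample procedure):

```python
import numpy as np

def porosity_fraction(ct_volume, pore_threshold):
    """Porosity = fraction of voxels whose attenuation falls below threshold."""
    vol = np.asarray(ct_volume, dtype=float)
    return float((vol < pore_threshold).mean())

# Synthetic volume: half low-attenuation (pore), half high-attenuation (grain)
volume = np.zeros((10, 10, 10))
volume[5:] = 100.0
phi = porosity_fraction(volume, pore_threshold=50.0)
```

    The reference samples in the paper serve to calibrate this kind of threshold against materials of known porosity, which a fixed global cutoff cannot do on its own.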

  8. 2D problems of surface growth theory with applications to additive manufacturing

    NASA Astrophysics Data System (ADS)

    Manzhirov, A. V.; Mikhin, M. N.

    2018-04-01

    We study 2D problems of surface growth theory of deformable solids and their applications to the analysis of the stress-strain state of AM fabricated products and structures. Statements of the problems are given, and a solution method based on the approaches of the theory of functions of a complex variable is suggested. Computations are carried out for model problems. Qualitative and quantitative results are discussed.

  9. Synoptic and Taxonomic Analysis of Form Perception Data and Theory.

    DTIC Science & Technology

    1987-11-01

    project, I suggested that we may, in fact, also not be on the right road. Heavily influenced by cellular neurophysiology and conventional computer...reductionistic approach has evolved from one stressing explanations that were patently and exclusively neurophysiological to one reflecting an equal...disappointing developments associated with this new enlightenment , however valid it may be, is that we are now beginning to see some signs of limits or

  10. Reinforcement learning with Marr.

    PubMed

    Niv, Yael; Langdon, Angela

    2016-10-01

    To many, the poster child for David Marr's famous three levels of scientific inquiry is reinforcement learning: a computational theory of reward optimization, which readily prescribes algorithmic solutions that bear a striking resemblance to signals found in the brain, suggesting a straightforward neural implementation. Here we review questions that remain open at each level of analysis, concluding that the path forward to their resolution calls for inspiration across levels, rather than a focus on mutual constraints.
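    As an illustrative sketch (not taken from the review itself), the algorithmic level referred to above can be captured by a tabular TD(0) update, whose prediction-error term is the signal commonly compared to dopaminergic responses:

```python
# Hypothetical sketch of a temporal-difference (TD) learning step.
def td_update(V, s, r, s_next, alpha=0.1, gamma=0.9):
    delta = r + gamma * V[s_next] - V[s]  # reward-prediction error
    V[s] += alpha * delta                 # nudge the value estimate
    return delta

V = {0: 0.0, 1: 0.0}
delta = td_update(V, s=0, r=1.0, s_next=1)  # first reward: fully surprising
```

    The open questions the authors review sit around this loop: what the states are (computational level) and how delta is carried by neural circuitry (implementational level).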

  11. Evaluation of Computational Codes for Underwater Hull Analysis Model Applications

    DTIC Science & Technology

    2014-02-05

    desirable that the code can be run on a Windows operating system on the laptop, desktop, or workstation. The focus on Windows machines allows for...transition to such systems as operated on the Navy-Marine Corp Internet (NMCI). For each code the initial cost and yearly maintenance are identified...suggestions for reducing this burden to Department of Defense, Washington Headquarters Services, Directorate for Information Operations and Reports

  12. Direct tubulin polymerization perturbation contributes significantly to the induction of micronuclei in vivo.

    PubMed

    ter Haar, E; Day, B W; Rosenkranz, H S

    1996-03-09

    The computational analysis data presented indicate a significant mechanistic association between the ability of a chemical to cause tubulin polymerization perturbation (TPP), via direct interaction with the protein, and the in vivo induction of micronuclei (MN). Since it is known that TPP is not a genotoxic event, the analyses suggest that the induction of MN by a non-genotoxic mechanism is a significant alternate pathway.

  13. Integrals for IBS and beam cooling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burov, A.; /Fermilab

    Simulation of beam cooling usually requires performing certain integral transformations every time step or so, which is a significant burden on the CPU. Examples are the dispersion integrals (Hilbert transforms) in the stochastic cooling, wake fields and IBS integrals. An original method is suggested for fast and sufficiently accurate computation of the integrals. This method is applied for the dispersion integral. Some methodical aspects of the IBS analysis are discussed.

  14. Integrals for IBS and Beam Cooling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burov, A.

    Simulation of beam cooling usually requires performing certain integral transformations every time step or so, which is a significant burden on the CPU. Examples are the dispersion integrals (Hilbert transforms) in the stochastic cooling, wake fields and IBS integrals. An original method is suggested for fast and sufficiently accurate computation of the integrals. This method is applied for the dispersion integral. Some methodical aspects of the IBS analysis are discussed.
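    A hedged sketch of one standard way to evaluate a Hilbert-transform-type dispersion integral quickly, via the FFT analytic-signal construction (a generic technique, not necessarily the original method the report proposes):

```python
import numpy as np

def hilbert_transform(x):
    """Discrete Hilbert transform via the FFT analytic-signal construction."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0   # double positive frequencies, zero negative ones
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.imag(np.fft.ifft(X * h))

# For a sampled cosine with a whole number of periods, the Hilbert
# transform is the matching sine.
t = np.linspace(0.0, 1.0, 256, endpoint=False)
x = np.cos(2 * np.pi * 4 * t)
ht = hilbert_transform(x)
```

    Cost per evaluation is O(n log n) rather than the O(n^2) of direct quadrature, which is the kind of CPU saving the abstract is concerned with.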

  15. Laptop Use, Interactive Science Software, and Science Learning Among At-Risk Students

    NASA Astrophysics Data System (ADS)

    Zheng, Binbin; Warschauer, Mark; Hwang, Jin Kyoung; Collins, Penelope

    2014-08-01

    This year-long, quasi-experimental study investigated the impact of the use of netbook computers and interactive science software on fifth-grade students' science learning processes, academic achievement, and interest in further science, technology, engineering, and mathematics (STEM) study within a linguistically diverse school district in California. Analysis of students' state standardized science test scores indicated that the program helped close gaps in scientific achievement between at-risk learners (i.e., English learners, Hispanics, and free/reduced-lunch recipients) and their counterparts. Teacher and student interviews and classroom observations suggested that computer-supported visual representations and interactions supported diverse learners' scientific understanding and inquiry and enabled more individualized and differentiated instruction. Finally, interviews revealed that the program had a positive impact on students' motivation in science and on their interest in pursuing science-related careers. This study suggests that technology-facilitated science instruction is beneficial for improving at-risk students' science achievement, scaffolding students' scientific understanding, and strengthening students' motivation to pursue STEM-related careers.

  16. Nucleotide synthetase ribozymes may have emerged first in the RNA world

    PubMed Central

    Ma, Wentao; Yu, Chunwu; Zhang, Wentao; Hu, Jiming

    2007-01-01

    Though the “RNA world” hypothesis has gained a central role in ideas concerning the origin of life, the scenario concerning its emergence remains uncertain. It has been speculated that the first scene may have been the emergence of a template-dependent RNA synthetase ribozyme, which catalyzed its own replication: thus, “RNA replicase.” However, the speculation remains uncertain, primarily because of the large sequence length requirement of such a replicase and the lack of a convincing mechanism to ensure its self-favoring features. Instead, we propose a nucleotide synthetase ribozyme as an alternative candidate, especially considering recent experimental evidence suggesting the possibility of effective nonenzymatic template-directed synthesis of RNA. A computer simulation was conducted to support our proposal. The conditions for the emergence of the nucleotide synthetase ribozyme are discussed, based on dynamic analysis on a computer. We suggest the template-dependent RNA synthetase ribozyme emerged later, perhaps after the emergence of protocells. PMID:17878321

  17. Closed-cage tungsten oxide clusters in the gas phase.

    PubMed

    Singh, D M David Jeba; Pradeep, T; Thirumoorthy, Krishnan; Balasubramanian, Krishnan

    2010-05-06

    During the course of a study on the clustering of W-Se and W-S mixtures in the gas phase using laser desorption ionization (LDI) mass spectrometry, we observed several anionic W-O clusters. Three distinct species, W(6)O(19)(-), W(13)O(29)(-), and W(14)O(32)(-), stand out as intense peaks in the regular mass spectral pattern of tungsten oxide clusters, suggesting unusual stabilities for them. Moreover, these clusters do not fragment in the postsource decay analysis. While trying to understand the precursor material that produced these clusters, we found the presence of nanoscale forms of tungsten oxide. The structure and thermodynamic parameters of tungsten clusters have been explored using relativistic quantum chemical methods. Our computed results of atomization energy are consistent with the observed LDI mass spectra. The computational results suggest that the clusters observed have closed-cage structure. These distinct W(13) and W(14) clusters were observed for the first time in the gas phase.

  18. Unifying model of carpal mechanics based on computationally derived isometric constraints and rules-based motion - the stable central column theory.

    PubMed

    Sandow, M J; Fisher, T J; Howard, C Q; Papas, S

    2014-05-01

    This study was part of a larger project to develop a (kinetic) theory of carpal motion based on computationally derived isometric constraints. Three-dimensional models were created from computed tomography scans of the wrists of ten normal subjects and carpal spatial relationships at physiological motion extremes were assessed. Specific points on the surface of the various carpal bones and the radius that remained isometric through range of movement were identified. Analysis of the isometric constraints and intercarpal motion suggests that the carpus functions as a stable central column (lunate-capitate-hamate-trapezoid-trapezium) with a supporting lateral column (scaphoid), which behaves as a 'two gear four bar linkage'. The triquetrum functions as an ulnar translation restraint, as well as controlling lunate flexion. The 'trapezoid'-shaped trapezoid places the trapezium anterior to the transverse plane of the radius and ulna, and thus rotates the principal axis of the central column to correspond to that used in the 'dart thrower's motion'. This study presents a forward kinematic analysis of the carpus that provides the basis for the development of a unifying kinetic theory of wrist motion based on isometric constraints and rules-based motion.

  19. Comparison of computational results of the SABRE LMFBR pin bundle blockage code with data from well-instrumented out-of-pile test bundles (THORS bundles 3A and 5A)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dearing, J.F.

    The Subchannel Analysis of Blockages in Reactor Elements (SABRE) computer code, developed by the United Kingdom Atomic Energy Authority, is currently the only practical tool available for performing detailed analyses of velocity and temperature fields in the recirculating flow regions downstream of blockages in liquid-metal fast breeder reactor (LMFBR) pin bundles. SABRE is a subchannel analysis code; that is, it accurately represents the complex geometry of nuclear fuel pins arranged on a triangular lattice. The results of SABRE computational models are compared here with temperature data from two out-of-pile 19-pin test bundles from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) Facility at Oak Ridge National Laboratory. One of these bundles has a small central flow blockage (bundle 3A), while the other has a large edge blockage (bundle 5A). Values that give best agreement with experiment for the empirical thermal mixing correlation factor, FMIX, in SABRE are suggested. These values of FMIX are Reynolds-number dependent, however, indicating that the coded turbulent mixing correlation is not appropriate for wire-wrap pin bundles.

  20. Performance implications from sizing a VM on multi-core systems: A data analytic application's view

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Horey, James L; Begoli, Edmon

    In this paper, we present a quantitative performance analysis of data analytics applications running on multi-core virtual machines. Such environments form the core of cloud computing. In addition, data analytics applications, such as Cassandra and Hadoop, are becoming increasingly popular on cloud computing platforms. This convergence necessitates a better understanding of the performance and cost implications of such hybrid systems. For example, the very first step in hosting applications in virtualized environments requires the user to configure the number of virtual processors and the size of memory. To understand the performance implications of this step, we benchmarked three Yahoo Cloud Serving Benchmark (YCSB) workloads in a virtualized multi-core environment. Our measurements indicate that the performance of Cassandra for YCSB workloads does not heavily depend on the processing capacity of a system, while the size of the data set is critical to performance relative to allocated memory. We also identified a strong relationship between the running time of workloads and various hardware events (last level cache loads, misses, and CPU migrations). From this analysis, we provide several suggestions to improve the performance of data analytics applications running on cloud computing environments.

  1. Integrated control and health management. Orbit transfer rocket engine technology program

    NASA Technical Reports Server (NTRS)

    Holzmann, Wilfried A.; Hayden, Warren R.

    1988-01-01

    To ensure controllability of the baseline design for a 7500 pound thrust, 10:1 throttleable, dual expanded cycle, Hydrogen-Oxygen, orbit transfer rocket engine, an Integrated Controls and Health Monitoring concept was developed. This included: (1) Dynamic engine simulations using a TUTSIM derived computer code; (2) Analysis of various control methods; (3) Failure Modes Analysis to identify critical sensors; (4) Survey of applicable sensor technology; and (5) Study of Health Monitoring philosophies. The engine design was found to be controllable over the full throttling range by using 13 valves, including an oxygen turbine bypass valve to control mixture ratio, and a hydrogen turbine bypass valve, used in conjunction with the oxygen bypass to control thrust. Classic feedback control methods are proposed along with specific requirements for valves, sensors, and the controller. Expanding on the control system, a Health Monitoring system is proposed including suggested computing methods and the following recommended sensors: (1) Fiber optic and silicon bearing deflectometers; (2) Capacitive shaft displacement sensors; and (3) Hot spot thermocouple arrays. Further work is needed to refine and verify the dynamic simulations and control algorithms, to advance sensor capabilities, and to develop the Health Monitoring computational methods.

  2. Variational Methods in Sensitivity Analysis and Optimization for Aerodynamic Applications

    NASA Technical Reports Server (NTRS)

    Ibrahim, A. H.; Hou, G. J.-W.; Tiwari, S. N. (Principal Investigator)

    1996-01-01

    Variational methods (VM) sensitivity analysis, which is the continuous alternative to the discrete sensitivity analysis, is employed to derive the costate (adjoint) equations, the transversality conditions, and the functional sensitivity derivatives. In the derivation of the sensitivity equations, the variational methods use the generalized calculus of variations, in which the variable boundary is considered as the design function. The converged solution of the state equations together with the converged solution of the costate equations are integrated along the domain boundary to uniquely determine the functional sensitivity derivatives with respect to the design function. The determination of the sensitivity derivatives of the performance index or functional entails the coupled solutions of the state and costate equations. As the stable and converged numerical solution of the costate equations with their boundary conditions is a priori unknown, numerical stability analysis is performed on both the state and costate equations. Thereafter, based on the amplification factors obtained by solving the generalized eigenvalue equations, the stability behavior of the costate equations is discussed and compared with the state (Euler) equations. The stability analysis of the costate equations suggests that the converged and stable solution of the costate equation is possible only if the computational domain of the costate equations is transformed to take into account the reverse flow nature of the costate equations. The application of the variational methods to aerodynamic shape optimization problems is demonstrated for internal flow problems in the supersonic Mach number range. The study shows that, while maintaining the accuracy of the functional sensitivity derivatives within a reasonable range for engineering prediction purposes, the variational methods offer a substantial gain in computational efficiency, i.e., computer time and memory, compared with finite difference sensitivity analysis.

  3. Assessment of Situated Learning Using Computer Environments.

    ERIC Educational Resources Information Center

    Young, Michael

    1995-01-01

    Suggests that, based on a theory of situated learning, assessment must emphasize process as much as product. Several assessment examples are given, including a computer-based planning assistant for a mathematics and science video, suggestions for computer-based portfolio assessment, and speculations about embedded assessment of virtual situations.…

  4. Computer-based medical education in Benha University, Egypt: knowledge, attitude, limitations, and suggestions.

    PubMed

    Bayomy, Hanaa; El Awadi, Mona; El Araby, Eman; Abed, Hala A

    2016-12-01

    Computer-assisted medical education has been developed to enhance learning and enable high-quality medical care. This study aimed to assess computer knowledge and attitude toward the inclusion of computers in medical education among second-year medical students in Benha Faculty of Medicine, Egypt, to identify limitations, and obtain suggestions for successful computer-based learning. This was a one-group pre-post-test study, which was carried out on second-year students in Benha Faculty of Medicine. A structured self-administered questionnaire was used to compare students' knowledge, attitude, limitations, and suggestions toward computer usage in medical education before and after the computer course to evaluate the change in students' responses. The majority of students were familiar with use of the mouse and keyboard, basic word processing, internet and web searching, and e-mail both before and after the computer course. The proportion of students who were familiar with software programs other than the word processing and trouble-shoot software/hardware was significantly higher after the course (P<0.001). There was a significant increase in the proportion of students who agreed on owning a computer (P=0.008), the inclusion of computer skills course in medical education, downloading lecture handouts, and computer-based exams (P<0.001) after the course. After the course, there was a significant increase in the proportion of students who agreed that the lack of central computers limited the inclusion of computer in medical education (P<0.001). 
    Although the lack of computer labs, lack of Information Technology staff mentoring, large number of students, unclear course outline, and lack of internet access were more frequently reported before the course (P<0.001), the majority of students suggested the provision of computer labs, inviting Information Technology staff to support computer teaching, and the availability of free Wi-Fi internet access covering several areas in the university campus, all of which would support computer-assisted medical education. Medical students in Benha University are computer literate, which allows for computer-based medical education. Staff training, provision of computer labs, and internet access are essential requirements for enhancing computer usage in medical education in the university.

  5. The effects of home computer access and social capital on mathematics and science achievement among Asian-American high school students in the NELS:88 data set

    NASA Astrophysics Data System (ADS)

    Quigley, Mark Declan

    The purpose of this research was to examine specific environmental, educational, and demographic factors and their influence on mathematics and science achievement. In particular, the researcher ascertained the interconnections of home computer access and social capital among Asian American students and their effect on mathematics and science achievement. Coleman's theory on social capital and parental influence was used as a basis for the analysis of data. Subjects for this study were the base year students from the National Education Longitudinal Study of 1988 (NELS:88) and the subsequent follow-up survey data in 1990, 1992, and 1994. The approximate sample size for this study was 640 ethnic Asians from the NELS:88 database. The analysis was a longitudinal study based on the Student and Parent Base Year responses and the Second Follow-up survey of 1992, when the subjects were in 12th grade. Achievement test results from the NELS:88 data were used to measure achievement in mathematics and science. The NELS:88 test battery was developed to measure both individual status and a student's growth in a number of achievement areas. The subjects' responses were analyzed by principal components factor analysis, weights, effect sizes, hierarchical regression analysis, and PLSPath Analysis. The results of this study were that prior ability in mathematics and science is a major influence on the student's educational achievement. Findings from the study support the view that home computer access has a negative direct effect on mathematics and science achievement for both Asian American males and females. None of the social capital factors in the study had either a negative or positive direct effect on mathematics and science achievement, although some indirect effects were found. Suggestions were made toward increasing parental involvement in their children's academic endeavors. Computer access in the home should be considered in relation to television viewing and should be closely monitored by parents to promote educational uses.

  6. Computers as an Instrument for Data Analysis. Technical Report No. 11.

    ERIC Educational Resources Information Center

    Muller, Mervin E.

    A review of statistical data analysis involving computers as a multi-dimensional problem provides the perspective for consideration of the use of computers in statistical analysis and the problems associated with large data files. An overall description of STATJOB, a particular system for doing statistical data analysis on a digital computer,…

  7. Gaze distribution analysis and saliency prediction across age groups.

    PubMed

    Krishna, Onkar; Helo, Andrea; Rämä, Pia; Aizawa, Kiyoharu

    2018-01-01

    Knowledge of the human visual system helps to develop better computational models of visual attention. State-of-the-art models have been developed to mimic the visual attention system of young adults; however, they largely ignore the variations that occur with age. In this paper, we investigated how visual scene processing changes with age, and we propose an age-adapted framework that helps to develop a computational model that can predict saliency across different age groups. Our analysis uncovers how the explorativeness of an observer varies with age, how well saliency maps of an age group agree with fixation points of observers from the same or different age groups, and how age influences the center bias tendency. We analyzed the eye movement behavior of 82 observers belonging to four age groups while they explored visual scenes. Explorativeness was quantified in terms of the entropy of a saliency map, and the area under the curve (AUC) metric was used to quantify the agreement analysis and the center bias tendency. Analysis results were used to develop age-adapted saliency models. Our results suggest that the proposed age-adapted saliency model outperforms existing saliency models in predicting the regions of interest across age groups.

  8. MARS approach for global sensitivity analysis of differential equation models with applications to dynamics of influenza infection.

    PubMed

    Lee, Yeonok; Wu, Hulin

    2012-01-01

    Differential equation models are widely used for the study of natural phenomena in many fields. The study usually involves unknown factors such as initial conditions and/or parameters. It is important to investigate the impact of unknown factors (parameters and initial conditions) on model outputs in order to better understand the system the model represents. Apportioning the uncertainty (variation) of the output variables of a model according to the input factors is referred to as sensitivity analysis. In this paper, we focus on the global sensitivity analysis of ordinary differential equation (ODE) models over a time period using the multivariate adaptive regression spline (MARS) as a meta-model, based on the concept of the variance of conditional expectation (VCE). We suggest evaluating the VCE analytically using the MARS model structure of univariate tensor-product functions, which is more computationally efficient. Our simulation studies show that the MARS model approach performs very well and helps to significantly reduce the computational cost. We present an application example of sensitivity analysis of ODE models for influenza infection to further illustrate the usefulness of the proposed method.
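The VCE idea in the abstract above can be illustrated without the MARS meta-model. The sketch below is a simplified Monte Carlo binning estimator, not the authors' analytic MARS-based evaluation, and the two-factor additive test model is invented for illustration; it estimates first-order sensitivity indices S_i = Var(E[Y|X_i]) / Var(Y):

```python
import numpy as np

def first_order_vce(model, n_factors=2, n_samples=20000, n_bins=40, seed=0):
    """Estimate first-order indices S_i = Var(E[Y|X_i]) / Var(Y) by binning
    Monte Carlo samples on each input factor (unit-uniform inputs)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 1.0, size=(n_samples, n_factors))
    y = model(x)
    var_y = y.var()
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    indices = []
    for i in range(n_factors):
        which = np.clip(np.digitize(x[:, i], edges) - 1, 0, n_bins - 1)
        # E[Y | X_i in bin b], weighted by the share of samples in each bin
        cond_means = np.array([y[which == b].mean() for b in range(n_bins)])
        weights = np.bincount(which, minlength=n_bins) / n_samples
        vce = np.sum(weights * (cond_means - y.mean()) ** 2)
        indices.append(float(vce / var_y))
    return indices

# additive test model Y = X1 + 2*X2: the exact indices are 1/5 and 4/5
s1, s2 = first_order_vce(lambda x: x[:, 0] + 2.0 * x[:, 1])
```

For an ODE model, `model` would map sampled parameters/initial conditions to an output summary at a chosen time point; the MARS meta-model in the paper replaces the raw Monte Carlo conditioning with an analytic evaluation.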

  9. The Effectiveness of Computer-Assisted Instruction to Teach Physical Examination to Students and Trainees in the Health Sciences Professions: A Systematic Review and Meta-Analysis.

    PubMed

    Tomesko, Jennifer; Touger-Decker, Riva; Dreker, Margaret; Zelig, Rena; Parrott, James Scott

    2017-01-01

    To explore knowledge and skill acquisition outcomes related to learning physical examination (PE) through computer-assisted instruction (CAI) compared with a face-to-face (F2F) approach, a systematic review and meta-analysis of literature published between January 2001 and December 2016 was conducted. Databases searched included Medline, Cochrane, CINAHL, ERIC, Ebsco, Scopus, and Web of Science. Studies were synthesized by study design, intervention, and outcomes. Statistical analyses included the DerSimonian-Laird random-effects model. In total, 7 studies were included in the review, and 5 in the meta-analysis. There were no statistically significant differences for knowledge (mean difference [MD] = 5.39, 95% confidence interval [CI]: -2.05 to 12.84) or skill acquisition (MD = 0.35, 95% CI: -5.30 to 6.01). The evidence does not suggest a strong consistent preference for either CAI or F2F instruction to teach students/trainees PE. Further research is needed to identify conditions that examine knowledge and skill acquisition outcomes favoring one mode of instruction over the other.
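The DerSimonian-Laird pooling step named in the abstract can be sketched as follows; this is a minimal implementation with a normal-theory 95% CI, and the three studies' mean differences and within-study variances are invented for illustration, not taken from the review:

```python
import math

def dersimonian_laird(effects, variances):
    """Pool study effect sizes with the DerSimonian-Laird random-effects model.
    Returns (pooled_effect, ci_low, ci_high) with a 95% normal-theory CI."""
    w = [1.0 / v for v in variances]                    # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]      # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# hypothetical mean differences and within-study variances for three studies
md, lo, hi = dersimonian_laird([4.0, 6.5, 5.2], [2.0, 3.5, 1.8])
```

When the heterogeneity estimate tau² is zero (Q below its degrees of freedom), the pooled estimate coincides with the fixed-effect result.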

  10. Angry facial expressions bias gender categorization in children and adults: behavioral and computational evidence

    PubMed Central

    Bayet, Laurie; Pascalis, Olivier; Quinn, Paul C.; Lee, Kang; Gentaz, Édouard; Tanaka, James W.

    2015-01-01

    Angry faces are perceived as more masculine by adults. However, the developmental course and underlying mechanism (bottom-up stimulus driven or top-down belief driven) associated with the angry-male bias remain unclear. Here we report that anger biases face gender categorization toward “male” responding in children as young as 5–6 years. The bias is observed for both own- and other-race faces, and is remarkably unchanged across development (into adulthood) as revealed by signal detection analyses (Experiments 1–2). The developmental course of the angry-male bias, along with its extension to other-race faces, combine to suggest that it is not rooted in extensive experience, e.g., observing males engaging in aggressive acts during the school years. Based on several computational simulations of gender categorization (Experiment 3), we further conclude that (1) the angry-male bias results, at least partially, from a strategy of attending to facial features or their second-order relations when categorizing face gender, and (2) any single choice of computational representation (e.g., Principal Component Analysis) is insufficient to assess resemblances between face categories, as different representations of the very same faces suggest different bases for the angry-male bias. Our findings are thus consistent with both stimulus-driven and stereotyped-belief-driven accounts of the angry-male bias. Taken together, the evidence suggests considerable stability in the interaction between some facial dimensions in social categorization that is present prior to the onset of formal schooling. PMID:25859238

  11. Coupled Aerodynamic and Structural Sensitivity Analysis of a High-Speed Civil Transport

    NASA Technical Reports Server (NTRS)

    Mason, B. H.; Walsh, J. L.

    2001-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite-element structural analysis and computational fluid dynamics aerodynamic analysis. In a previous study, a multidisciplinary analysis system for a high-speed civil transport was formulated to integrate a set of existing discipline analysis codes, some of them computationally intensive. This paper is an extension of the previous study, in which the sensitivity analysis for the coupled aerodynamic and structural analysis problem is formulated and implemented. Uncoupled stress sensitivities computed with a constant load vector in a commercial finite element analysis code are compared to coupled aeroelastic sensitivities computed by finite differences. The computational expense of these sensitivity calculation methods is discussed.

  12. Developing Computer-Assisted Instruction Multimedia For Educational Technology Course of Coastal Area Students

    NASA Astrophysics Data System (ADS)

    Idris, Husni; Nurhayati, Nurhayati; Satriani, Satriani

    2018-05-01

    This research aims to a) identify instructional software (interactive multimedia CDs) by developing Computer-Assisted Instruction (CAI) multimedia that is eligible for use in the instruction of the Educational Technology course; and b) analyze the role of instructional software (interactive multimedia CDs) in the Educational Technology course through the development of Computer-Assisted Instruction (CAI) multimedia to improve the quality of education and instructional activities. This is Research and Development (R&D). It employed the descriptive procedural model of development, which outlines the steps to be taken to develop a product, in this case instructional multimedia. The number of subjects of the research trial or respondents for each stage was 20 people. To maintain development quality, an expert in materials outside the materials under study, an expert in materials who is also an Educational Technology lecturer, a small group of 3 students, a medium-sized group of 10 students, and 20 students participating in the field testing took part in this research. Data collection instruments were then developed in two stages, namely: a) developing the instruments; and b) trying out the instruments. Data on students’ responses were collected using questionnaires and analyzed using descriptive statistics with percentage and categorization techniques. Based on the data analysis results, the Computer-Assisted Instruction (CAI) multimedia developed and tried out among students during the preliminary field testing falls into the “Good” category, with the aspects of instruction, materials, and media all falling into the “Good” category. Subsequently, results of the main field testing among students also place it in the “Good” category, with the aspects of instruction, materials, and media falling into the “Good” category. Similarly, results of the operational field testing among students also place it in the “Good” category.
Thus, it can be concluded that quality of the Computer-Assisted Instruction (CAI) multimedia developed in this research falls into the “Good” category viewed from the aspects of instruction, materials, and media. In other words, overall, the quality of this multimedia belongs to the “Good” category.

  13. Spatial data analytics on heterogeneous multi- and many-core parallel architectures using python

    USGS Publications Warehouse

    Laura, Jason R.; Rey, Sergio J.

    2017-01-01

    Parallel vector spatial analysis concerns the application of parallel computational methods to facilitate vector-based spatial analysis. The history of parallel computation in spatial analysis is reviewed, and this work is placed into the broader context of high-performance computing (HPC) and parallelization research. The rise of cyber infrastructure and its manifestation in spatial analysis as CyberGIScience is seen as a main driver of renewed interest in parallel computation in the spatial sciences. Key problems in spatial analysis that have been the focus of parallel computing are covered. Chief among these are spatial optimization problems, computational geometric problems including polygonization and spatial contiguity detection, the use of Markov chain Monte Carlo simulation in spatial statistics, and parallel implementations of spatial econometric methods. Future directions for research on parallelization in computational spatial analysis are outlined.

  14. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network

    PubMed Central

    Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with the application of quantitative soft computing prediction models in the financial area, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine the ability to forecast exchange rate values for the horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model with autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can be helpful in eliminating the risk of making a bad decision in the decision-making process. PMID:26977450

  15. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network.

    PubMed

    Falat, Lukas; Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with the application of quantitative soft computing prediction models in the financial area, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine the ability to forecast exchange rate values for the horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model with autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can be helpful in eliminating the risk of making a bad decision in the decision-making process.
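The hybrid idea described above, an RBF approximator whose output is corrected by a moving average of its own recent errors, can be sketched minimally. This is not the authors' GA-optimized model: the RBF layer is fitted by least squares, the genetic-algorithm and K-means components are omitted, and a noisy sine series stands in for exchange-rate data:

```python
import numpy as np

def rbf_design(x, centers, width):
    # Gaussian basis activations of each input against each centre
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

def fit_rbf(x, y, n_centers=10, width=0.3):
    """Least-squares fit of output weights for fixed, evenly spaced centres."""
    centers = np.linspace(x.min(), x.max(), n_centers)
    phi = rbf_design(x, centers, width)
    weights, *_ = np.linalg.lstsq(phi, y, rcond=None)
    return centers, width, weights

def predict(x, centers, width, weights, residual_window=None):
    yhat = rbf_design(x, centers, width) @ weights
    if residual_window is not None:
        # moving-average error correction: add the mean of recent residuals
        yhat = yhat + residual_window.mean()
    return yhat

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
series = np.sin(2 * np.pi * t) + 0.05 * rng.standard_normal(200)

c, w, beta = fit_rbf(t, series)
fitted = predict(t, c, w, beta)
resid = series - fitted
# corrected prediction for the last point, using the 10 most recent residuals
corrected_last = predict(t[-1:], c, w, beta, residual_window=resid[-10:])
```

The moving-average term plays the role the abstract describes: it feeds the "error part" of the network back into the output.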

  16. A Small World of Neuronal Synchrony

    PubMed Central

    Yu, Shan; Huang, Debin; Singer, Wolf

    2008-01-01

    A small-world network has been suggested to be an efficient solution for achieving both modular and global processing—a property highly desirable for brain computations. Here, we investigated functional networks of cortical neurons using correlation analysis to identify functional connectivity. To reconstruct the interaction network, we applied the Ising model based on the principle of maximum entropy. This allowed us to assess the interactions by measuring pairwise correlations and to assess the strength of coupling from the degree of synchrony. Visual responses were recorded in visual cortex of anesthetized cats, simultaneously from up to 24 neurons. First, pairwise correlations captured most of the patterns in the population's activity and, therefore, provided a reliable basis for the reconstruction of the interaction networks. Second, and most importantly, the resulting networks had small-world properties; the average path lengths were as short as in simulated random networks, but the clustering coefficients were larger. Neurons differed considerably with respect to the number and strength of interactions, suggesting the existence of “hubs” in the network. Notably, there was no evidence for scale-free properties. These results suggest that cortical networks are optimized for the coexistence of local and global computations: feature detection and feature integration or binding. PMID:18400792
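The two small-world diagnostics used in this study, a high clustering coefficient alongside a short average path length, can be computed directly. Below is a minimal pure-Python sketch on a toy ring lattice with one shortcut edge (an illustrative graph, not the recorded neuronal data):

```python
from collections import deque

def clustering_coefficient(adj):
    """Average local clustering coefficient of an undirected graph
    given as {node: set(neighbours)}."""
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        # count edges among the neighbours of v (each pair once)
        links = sum(1 for u in nbrs for w in nbrs if u < w and w in adj[u])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def average_path_length(adj):
    """Mean shortest-path length over all connected node pairs, via BFS."""
    dist_sum, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            v = q.popleft()
            for u in adj[v]:
                if u not in dist:
                    dist[u] = dist[v] + 1
                    q.append(u)
        dist_sum += sum(d for n, d in dist.items() if n != src)
        pairs += len(dist) - 1
    return dist_sum / pairs

# ring lattice of 8 nodes, each linked to 2 neighbours on either side,
# plus one shortcut: clustering stays high while paths shorten
adj = {i: set() for i in range(8)}
for i in range(8):
    for d in (1, 2):
        adj[i].add((i + d) % 8)
        adj[(i + d) % 8].add(i)
adj[0].add(4)
adj[4].add(0)   # shortcut edge
cc = clustering_coefficient(adj)
apl = average_path_length(adj)
```

A small-world classification compares these two numbers against a degree-matched random graph: comparable path length, but much larger clustering.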

  17. Arbitrary Steady-State Solutions with the K-epsilon Model

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Pettersson Reif, B. A.; Gatski, Thomas B.

    2006-01-01

    Widely-used forms of the K-epsilon turbulence model are shown to yield arbitrary steady-state converged solutions that are highly dependent on numerical considerations such as initial conditions and solution procedure. These solutions contain pseudo-laminar regions of varying size. By applying a nullcline analysis to the equation set, it is possible to clearly demonstrate the reasons for the anomalous behavior. In summary, the degenerate solution acts as a stable fixed point under certain conditions, causing the numerical method to converge there. The analysis also suggests a methodology for preventing the anomalous behavior in steady-state computations.

  18. Technical Note: Guidelines for the digital computation of 2D and 3D enamel thickness in hominoid teeth.

    PubMed

    Benazzi, Stefano; Panetta, Daniele; Fornai, Cinzia; Toussaint, Michel; Gruppioni, Giorgio; Hublin, Jean-Jacques

    2014-02-01

    The study of enamel thickness has received considerable attention in regard to the taxonomic, phylogenetic and dietary assessment of human and non-human primates. Recent developments based on two-dimensional (2D) and three-dimensional (3D) digital techniques have facilitated accurate analyses, preserving the original object from invasive procedures. Various digital protocols have been proposed. These include several procedures based on manual handling of the virtual models and technical shortcomings, which prevent other scholars from confidently reproducing the entire digital protocol. There is a compelling need for standard, reproducible, and well-tailored protocols for the digital analysis of 2D and 3D dental enamel thickness. In this contribution we provide essential guidelines for the digital computation of 2D and 3D enamel thickness in hominoid molars, premolars, canines and incisors. We modify previous techniques suggested for 2D analysis and we develop a new approach for 3D analysis that can also be applied to premolars and anterior teeth. For each tooth class, the cervical line should be considered as the fundamental morphological feature both to isolate the crown from the root (for 3D analysis) and to define the direction of the cross-sections (for 2D analysis). Copyright © 2013 Wiley Periodicals, Inc.

  19. SSVEP recognition using common feature analysis in brain-computer interface.

    PubMed

    Zhang, Yu; Zhou, Guoxu; Jin, Jing; Wang, Xingyu; Cichocki, Andrzej

    2015-04-15

    Canonical correlation analysis (CCA) has been successfully applied to steady-state visual evoked potential (SSVEP) recognition for brain-computer interface (BCI) applications. Although the CCA method outperforms traditional power spectral density analysis through multi-channel detection, it additionally requires pre-constructed reference signals of sine-cosine waves. It is likely to encounter overfitting when using a short time window, since the reference signals include no features from the training data. We consider that a group of electroencephalogram (EEG) data trials recorded at a certain stimulus frequency on the same subject should share some common features that may bear the real SSVEP characteristics. This study therefore proposes a common feature analysis (CFA)-based method to exploit the latent common features as natural reference signals when using correlation analysis for SSVEP recognition. Good performance of the CFA method for SSVEP recognition is validated with EEG data recorded from ten healthy subjects, in contrast to CCA and a multiway extension of CCA (MCCA). Experimental results indicate that the CFA method significantly outperformed the CCA and MCCA methods for SSVEP recognition when using a short time window (i.e., less than 1 s). The superiority of the proposed CFA method suggests it is promising for the development of a real-time SSVEP-based BCI. Copyright © 2014 Elsevier B.V. All rights reserved.
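The baseline CCA detector that CFA is compared against can be sketched as follows, assuming sine-cosine references with harmonics and a QR/SVD-based canonical correlation; the two-channel "EEG" below is synthetic toy data, not the recorded subjects:

```python
import numpy as np

def max_canonical_corr(x, y):
    """Largest canonical correlation between the column spaces of x and y."""
    x = x - x.mean(axis=0)
    y = y - y.mean(axis=0)
    qx, _ = np.linalg.qr(x)
    qy, _ = np.linalg.qr(y)
    return float(np.linalg.svd(qx.T @ qy, compute_uv=False)[0])

def ssvep_score(eeg, freq, fs, n_harmonics=2):
    """Correlate multi-channel EEG (samples x channels) with sine-cosine
    references at a candidate stimulus frequency and its harmonics."""
    t = np.arange(eeg.shape[0]) / fs
    refs = np.column_stack([f(2 * np.pi * h * freq * t)
                            for h in range(1, n_harmonics + 1)
                            for f in (np.sin, np.cos)])
    return max_canonical_corr(eeg, refs)

# toy data: two channels carrying a 10 Hz response buried in noise
fs = 250
t = np.arange(fs) / fs          # a 1 s window
rng = np.random.default_rng(0)
eeg = np.column_stack([
    np.sin(2 * np.pi * 10 * t + 0.4) + 0.5 * rng.standard_normal(t.size),
    np.cos(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size),
])
scores = {f: ssvep_score(eeg, f, fs) for f in (8.0, 10.0, 12.0)}
```

Recognition picks the candidate frequency with the highest score; CFA replaces the fixed sine-cosine `refs` with common features learned from training trials.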

  20. Application of kernel principal component analysis and computational machine learning to exploration of metabolites strongly associated with diet.

    PubMed

    Shiokawa, Yuka; Date, Yasuhiro; Kikuchi, Jun

    2018-02-21

    Computer-based technological innovation provides advancements in sophisticated and diverse analytical instruments, enabling massive amounts of data collection with relative ease. This is accompanied by a fast-growing demand for technological progress in data mining methods for analysis of big data derived from chemical and biological systems. From this perspective, use of a general "linear" multivariate analysis alone limits interpretations due to "non-linear" variations in metabolic data from living organisms. Here we describe a kernel principal component analysis (KPCA)-incorporated analytical approach for extracting useful information from metabolic profiling data. To overcome the limitation of important variable (metabolite) determinations, we incorporated a random forest conditional variable importance measure into our KPCA-based analytical approach to demonstrate the relative importance of metabolites. Using a market basket analysis, hippurate, the most important variable detected in the importance measure, was associated with high levels of some vitamins and minerals present in foods eaten the previous day, suggesting a relationship between increased hippurate and intake of a wide variety of vegetables and fruits. Therefore, the KPCA-incorporated analytical approach described herein enabled us to capture input-output responses, and should be useful not only for metabolic profiling but also for profiling in other areas of biological and environmental systems.
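The kernel PCA step described above can be sketched with a Gaussian kernel on the classic concentric-rings example, where a "linear" PCA would fail but the first kernel components separate the groups by radius. This is illustrative toy data, not metabolic profiles, and the random-forest variable-importance step is omitted:

```python
import numpy as np

def rbf_kernel_pca(x, gamma, n_components=2):
    """Project data onto the leading components of Gaussian-kernel PCA."""
    sq = np.sum(x ** 2, axis=1)
    k = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * x @ x.T))
    n = k.shape[0]
    one = np.full((n, n), 1.0 / n)
    kc = k - one @ k - k @ one + one @ k @ one      # centre in feature space
    vals, vecs = np.linalg.eigh(kc)                 # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return kc @ alphas                              # projected coordinates

# two concentric rings: not linearly separable, but separable by radius
# in the leading kernel components
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2 * np.pi, 200)
r = np.concatenate([np.full(100, 1.0), np.full(100, 4.0)])
x = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
x = x + 0.05 * rng.standard_normal((200, 2))
z = rbf_kernel_pca(x, gamma=0.5)
```

The "non-linear" variation the abstract refers to is exactly what the kernel matrix captures: similarity in an implicit feature space rather than raw coordinates.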

  1. StrAuto: automation and parallelization of STRUCTURE analysis.

    PubMed

    Chhatre, Vikram E; Emerson, Kevin J

    2017-03-24

    Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa, including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational overload of this analysis, it does not fully automate the use of replicate STRUCTURE analysis runs required for downstream inference of optimal K. There is a pressing need for a tool that can deploy population structure analysis on high performance computing clusters. We present an updated version of the popular Python program StrAuto to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach in addition to implementing parallel computation, a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and is available for download from http://strauto.popgen.org.
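The replicate-distribution idea can be sketched generically. This is not StrAuto's code: the worker below is a hypothetical stand-in for invoking the STRUCTURE binary and parsing its ln P(D), and a real CPU-bound pipeline would use processes or a cluster scheduler rather than threads:

```python
from concurrent.futures import ThreadPoolExecutor
import itertools

def run_structure_replicate(k, replicate):
    """Hypothetical worker for one STRUCTURE run at a given K.
    A real worker would shell out to the structure binary (e.g. via
    subprocess) and parse the estimated ln P(D) from its output file;
    here we return a placeholder record instead."""
    return {"K": k, "rep": replicate, "lnP": -1000.0 * k - replicate}

def run_grid(k_values, n_replicates, max_workers=4):
    """Distribute the K x replicate grid of runs over a worker pool."""
    jobs = list(itertools.product(k_values, range(n_replicates)))
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = list(pool.map(lambda kr: run_structure_replicate(*kr), jobs))
    return results

# 3 values of K x 5 replicates = 15 independent runs, executed concurrently
results = run_grid([2, 3, 4], n_replicates=5)
```

Downstream Evanno ΔK inference then only needs the per-K spread of ln P(D) across replicates, which is why automating the replicate grid matters.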

  2. Using text analysis to quantify the similarity and evolution of scientific disciplines

    PubMed Central

    Dias, Laércio; Scharloth, Joachim

    2018-01-01

    We use an information-theoretic measure of linguistic similarity to investigate the organization and evolution of scientific fields. An analysis of almost 20 M papers from the past three decades reveals that the linguistic similarity is related to but different from expert- and citation-based classifications, leading to an improved view on the organization of science. A temporal analysis of the similarity of fields shows that some fields (e.g. computer science) are becoming increasingly central, but that on average the similarity between pairs of disciplines has not changed in the last decades. This suggests that tendencies of convergence (e.g. multi-disciplinarity) and divergence (e.g. specialization) of disciplines are in balance. PMID:29410857

  3. Using text analysis to quantify the similarity and evolution of scientific disciplines.

    PubMed

    Dias, Laércio; Gerlach, Martin; Scharloth, Joachim; Altmann, Eduardo G

    2018-01-01

    We use an information-theoretic measure of linguistic similarity to investigate the organization and evolution of scientific fields. An analysis of almost 20 M papers from the past three decades reveals that the linguistic similarity is related to but different from expert- and citation-based classifications, leading to an improved view on the organization of science. A temporal analysis of the similarity of fields shows that some fields (e.g. computer science) are becoming increasingly central, but that on average the similarity between pairs of disciplines has not changed in the last decades. This suggests that tendencies of convergence (e.g. multi-disciplinarity) and divergence (e.g. specialization) of disciplines are in balance.
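One standard information-theoretic measure of linguistic similarity is the Jensen-Shannon divergence between word-frequency distributions; the abstract does not specify the authors' exact measure, so the sketch below is illustrative, with invented toy "abstracts":

```python
import math
from collections import Counter

def jsd(p, q):
    """Jensen-Shannon divergence (base 2, so bounded in [0, 1]) between
    two word distributions given as {word: probability}."""
    words = set(p) | set(q)
    m = {w: 0.5 * (p.get(w, 0.0) + q.get(w, 0.0)) for w in words}
    def kl(a):
        # KL divergence of a from the mixture m, skipping zero-mass words
        return sum(a.get(w, 0.0) * math.log2(a.get(w, 0.0) / m[w])
                   for w in words if a.get(w, 0.0) > 0.0)
    return 0.5 * kl(p) + 0.5 * kl(q)

def word_dist(text):
    counts = Counter(text.lower().split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# toy "abstracts": two computing texts and one biology text
cs1 = word_dist("parallel computing methods for spatial data analysis")
cs2 = word_dist("parallel algorithms and computing methods for data")
bio = word_dist("gene expression in developing plant root tissue")
```

Lower divergence means more similar vocabularies, so disciplines can be clustered or tracked over time by their pairwise divergences.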

  4. Factors influencing health professions students' use of computers for data analysis at three Ugandan public medical schools: a cross-sectional survey.

    PubMed

    Munabi, Ian G; Buwembo, William; Bajunirwe, Francis; Kitara, David Lagoro; Joseph, Ruberwa; Peter, Kawungezi; Obua, Celestino; Quinn, John; Mwaka, Erisa S

    2015-02-25

    Effective utilization of computers and their applications in medical education and research is of paramount importance to students. The objective of this study was to determine the association between owning a computer and use of computers for research data analysis, and the other factors influencing health professions students' computer use for data analysis. We conducted a cross-sectional study among undergraduate health professions students at three public universities in Uganda using a self-administered questionnaire. The questionnaire was composed of questions on participant demographics, students' participation in research, computer ownership, and use of computers for data analysis. Descriptive and inferential statistics (univariable and multilevel logistic regression analysis) were used to analyse the data. The level of significance was set at 0.05. Six hundred (600) of 668 questionnaires were completed and returned (response rate 89.8%). A majority of respondents were male (68.8%), and 75.3% reported owning computers. Overall, 63.7% of respondents reported that they had ever done computer-based data analysis. The following factors were significant predictors of having ever done computer-based data analysis: ownership of a computer (adj. OR 1.80, p = 0.02), recently completed course in statistics (adj. OR 1.48, p = 0.04), and participation in research (adj. OR 2.64, p < 0.01). Owning a computer, participation in research, and undertaking courses in research methods influence undergraduate students' use of computers for research data analysis. Students are increasingly participating in research, and thus need to have competencies for the successful conduct of research. Medical training institutions should encourage both curricular and extra-curricular efforts to enhance research capacity in line with modern theories of adult learning.

  5. Bridge-scour analysis using the water surface profile (WSPRO) model

    USGS Publications Warehouse

    Mueller, David S.; ,

    1993-01-01

    A program was developed to extract hydraulic information required for bridge-scour computations from the Water-Surface Profile computation model (WSPRO). The program is written in compiled BASIC and is menu driven. Using only ground points, the program can compute the average ground elevation, the cross-sectional area below a specified datum, or create a Drawing Exchange Format (DXF) file of the cross section. Using both ground points and hydraulic information from the equal-conveyance tubes computed by WSPRO, the program can compute hydraulic parameters at a user-specified station or in a user-specified subsection of the cross section. The program can identify the maximum velocity in a cross section and the velocity and depth at a user-specified station. The program also can identify the maximum velocity in the cross section and the average velocity, average depth, average ground elevation, width perpendicular to the flow, cross-sectional area of flow, and discharge in a subsection of the cross section. This program does not include any help or suggestions as to what data should be extracted; therefore, the user must understand the scour equations and associated variables to be able to extract the proper information from the WSPRO output.
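The "cross-sectional area below a specified datum" computation such a program performs can be sketched from station/elevation ground points. This is not WSPRO's or the BASIC program's code, and the trapezoidal-channel data are invented; the sketch uses trapezoidal integration of depth, splitting segments where the ground line crosses the datum:

```python
def area_below_datum(stations, elevations, datum):
    """Cross-sectional area between a ground line and a horizontal datum,
    integrating depth = max(datum - elevation, 0) segment by segment."""
    area = 0.0
    for (x1, z1), (x2, z2) in zip(zip(stations, elevations),
                                  zip(stations[1:], elevations[1:])):
        d1, d2 = datum - z1, datum - z2
        if d1 <= 0.0 and d2 <= 0.0:
            continue                      # segment entirely above the datum
        if d1 > 0.0 and d2 > 0.0:
            area += 0.5 * (d1 + d2) * (x2 - x1)   # full trapezoid
        else:                             # segment crosses the datum
            frac = d1 / (d1 - d2)         # crossing point along the segment
            if d1 > 0.0:
                area += 0.5 * d1 * frac * (x2 - x1)          # wet at start
            else:
                area += 0.5 * d2 * (1.0 - frac) * (x2 - x1)  # wet at end
    return area

# trapezoidal channel: banks at elevation 10, flat bed at elevation 4;
# for a datum at 8 the exact area is 40/3 + 40 + 40/3 = 200/3
x = [0.0, 10.0, 20.0, 30.0]
z = [10.0, 4.0, 4.0, 10.0]
a = area_below_datum(x, z, datum=8.0)
```

Average depth and top width follow from the same segment bookkeeping, which is why a single pass over the ground points suffices for the hydraulic parameters listed in the abstract.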

  6. Measurements and computational analysis of heat transfer and flow in a simulated turbine blade internal cooling passage

    NASA Technical Reports Server (NTRS)

    Russell, Louis M.; Thurman, Douglas R.; Simonyi, Patricia S.; Hippensteele, Steven A.; Poinsatte, Philip E.

    1993-01-01

    Visual and quantitative information was obtained on heat transfer and flow in a branched-duct test section that had several significant features of an internal cooling passage of a turbine blade. The objective of this study was to generate a set of experimental data that could be used to validate computer codes for internal cooling systems. Surface heat transfer coefficients and entrance flow conditions were measured at entrance Reynolds numbers of 45,000, 335,000, and 726,000. The heat transfer data were obtained using an Inconel heater sheet attached to the surface and coated with liquid crystals. Visual and quantitative flow field results using particle image velocimetry were also obtained for a plane at mid channel height for a Reynolds number of 45,000. The flow was seeded with polystyrene particles and illuminated by a laser light sheet. Computational results were determined for the same configurations and at matching Reynolds numbers; these surface heat transfer coefficients and flow velocities were computed with a commercially available code. The experimental and computational results were compared. Although some general trends did agree, there were inconsistencies in the temperature patterns as well as in the numerical results. These inconsistencies strongly suggest the need for further computational studies on complicated geometries such as the one studied.

  7. Sensitivity Analysis for Coupled Aero-structural Systems

    NASA Technical Reports Server (NTRS)

    Giunta, Anthony A.

    1999-01-01

    A novel method has been developed for calculating gradients of aerodynamic force and moment coefficients for an aeroelastic aircraft model. This method uses the Global Sensitivity Equations (GSE) to account for the aero-structural coupling, and a reduced-order modal analysis approach to condense the coupling bandwidth between the aerodynamic and structural models. Parallel computing is applied to reduce the computational expense of the numerous high fidelity aerodynamic analyses needed for the coupled aero-structural system. Good agreement is obtained between aerodynamic force and moment gradients computed with the GSE/modal analysis approach and the same quantities computed using brute-force, computationally expensive, finite difference approximations. A comparison between the computational expense of the GSE/modal analysis method and a pure finite difference approach is presented. These results show that the GSE/modal analysis approach is the more computationally efficient technique if sensitivity analysis is to be performed for two or more aircraft design parameters.

  8. Efficient electronic structure theory via hierarchical scale-adaptive coupled-cluster formalism: I. Theory and computational complexity analysis

    NASA Astrophysics Data System (ADS)

    Lyakh, Dmitry I.

    2018-03-01

    A novel reduced-scaling, general-order coupled-cluster approach is formulated by exploiting hierarchical representations of many-body tensors, combined with the recently suggested formalism of scale-adaptive tensor algebra. Inspired by the hierarchical techniques from the renormalisation group approach, H/H²-matrix algebra and the fast multipole method, the computational scaling reduction in our formalism is achieved via coarsening of quantum many-body interactions at larger interaction scales, thus imposing a hierarchical structure on the many-body tensors of coupled-cluster theory. In our approach, the interaction scale can be defined on any appropriate Euclidean domain (spatial domain, momentum-space domain, energy domain, etc.). We show that the hierarchically resolved many-body tensors can reduce the storage requirements to O(N), where N is the number of simulated quantum particles. Subsequently, we prove that any connected many-body diagram consisting of a finite number of arbitrary-order tensors, e.g. an arbitrary coupled-cluster diagram, can be evaluated in O(N log N) floating-point operations. On top of that, we suggest an additional approximation to further reduce the computational complexity of higher order coupled-cluster equations, i.e. equations involving higher than double excitations, which otherwise would introduce a large prefactor into the formal O(N log N) scaling.

  9. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
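    A minimal sketch of this style of discrete event simulation, assuming a pool of identical servers with Poisson arrivals and exponential service times (the authors' two-part framework is richer; all parameters here are illustrative):

```python
import heapq
import random

def simulate(num_requests, arrival_rate, service_rate, servers, seed=1):
    """Toy discrete event simulation: Poisson arrivals of service requests
    contend FCFS for a fixed pool of identical servers; returns the mean
    time a request spends in the system (waiting plus service)."""
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(num_requests):
        t += rng.expovariate(arrival_rate)
        arrivals.append(t)
    free_at = [0.0] * servers          # min-heap of per-server next-free times
    heapq.heapify(free_at)
    total_time = 0.0
    for arr in arrivals:
        start = max(arr, heapq.heappop(free_at))  # earliest free server
        finish = start + rng.expovariate(service_rate)
        heapq.heappush(free_at, finish)
        total_time += finish - arr
    return total_time / num_requests
```

    Provisioning more servers shortens queueing delay at higher cost; quantifying that trade-off against a demand distribution is the kind of question the authors' simulations address.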

  10. Inferring hidden causal relations between pathway members using reduced Google matrix of directed biological networks

    PubMed Central

    2018-01-01

    Signaling pathways represent parts of the global biological molecular network which connects them into a seamless whole through complex direct and indirect (hidden) crosstalk whose structure can change during development or in pathological conditions. We suggest a novel methodology, called Googlomics, for the structural analysis of directed biological networks using spectral analysis of their Google matrices, drawing parallels with quantum scattering theory, developed for nuclear and mesoscopic physics and quantum chaos. We introduce an analytical “reduced Google matrix” method for the analysis of biological network structure. The method allows inferring hidden causal relations between the members of a signaling pathway or a functionally related group of genes. We investigate how the structure of hidden causal relations can be reprogrammed as a result of changes in the transcriptional network layer during carcinogenesis. The suggested Googlomics approach rigorously characterizes complex systemic changes in the wiring of large causal biological networks in a computationally efficient way. PMID:29370181
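    The Google matrix underlying the method is the standard column-stochastic construction G = αS + (1−α)/N from PageRank; a toy version with power iteration might look as follows (the reduced Google matrix itself involves further linear algebra on subnetwork blocks, not shown here):

```python
def google_matrix(adj, n, alpha=0.85):
    """Column-stochastic Google matrix G = alpha*S + (1 - alpha)/n, where S
    spreads each node's weight over its out-neighbours and dangling nodes
    spread uniformly; adj maps node -> list of out-neighbours."""
    G = [[(1.0 - alpha) / n] * n for _ in range(n)]
    for j in range(n):
        outs = adj.get(j, [])
        if outs:
            for i in outs:
                G[i][j] += alpha / len(outs)
        else:
            for i in range(n):
                G[i][j] += alpha / n
    return G

def pagerank(G, iters=100):
    """Power iteration p <- G p; converges to the leading eigenvector."""
    n = len(G)
    p = [1.0 / n] * n
    for _ in range(iters):
        p = [sum(G[i][j] * p[j] for j in range(n)) for i in range(n)]
    return p
```

    In a directed gene network, a node receiving many regulatory links accumulates rank; the reduced-matrix machinery then separates direct links from indirect (hidden) contributions.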

  11. Computer analysis of Holter electrocardiogram.

    PubMed

    Yanaga, T; Adachi, M; Sato, Y; Ichimaru, Y; Otsuka, K

    1994-10-01

    Computer analysis is indispensable for the interpretation of Holter ECG because of the large quantity of data involved. Computer analysis of Holter ECG is similar to that of the conventional ECG; however, it presents additional difficulties such as high noise levels, limited analysis time and voluminous data. The main topics in computer analysis of Holter ECG are arrhythmias, ST-T changes, heart rate variability, QT interval, late potentials and the construction of databases. Although many papers have been published on the computer analysis of Holter ECG, a selection of them is reviewed briefly in the present paper. We have studied computer analysis of VPCs, ST-T changes, heart rate variability, QT interval and Cheyne-Stokes respiration during 24-hour ambulatory ECG monitoring. Further, we have studied ambulatory palmar sweating for the evaluation of mental stress during the day. In the future, the development of "the integrated Holter system", which enables the evaluation of ventricular vulnerability and modulating factors such as psychoneural hypersensitivity, may be important.
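    Heart rate variability, one of the analysis topics listed, is commonly summarized by time-domain statistics over RR intervals. A minimal sketch of the standard SDNN and RMSSD measures (the RR values in the test are illustrative):

```python
from math import sqrt

def sdnn(rr):
    """Sample standard deviation of RR intervals (ms): a global HRV measure."""
    m = sum(rr) / len(rr)
    return sqrt(sum((x - m) ** 2 for x in rr) / (len(rr) - 1))

def rmssd(rr):
    """Root mean square of successive RR differences: a short-term HRV measure."""
    diffs = [rr[i + 1] - rr[i] for i in range(len(rr) - 1)]
    return sqrt(sum(d * d for d in diffs) / len(diffs))
```

    Over a 24-hour Holter recording these statistics are typically computed per segment after artifact and ectopic-beat rejection, which is where the noise difficulties mentioned above bite.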

  12. Impaired theta phase-resetting underlying auditory N1 suppression in chronic alcoholism.

    PubMed

    Fuentemilla, Lluis; Marco-Pallarés, Josep; Gual, Antoni; Escera, Carles; Polo, Maria Dolores; Grau, Carles

    2009-02-18

    It has been suggested that chronic alcoholism may lead to altered neural mechanisms related to inhibitory processes. Here, we studied the auditory N1 suppression phenomenon (i.e. amplitude reduction with repetitive stimuli) in chronic alcoholic patients as an early-stage information-processing brain function involving inhibition, through analysis of the N1 event-related potential and time-frequency computations (spectral power and phase-resetting). Our results showed enhanced neural theta oscillatory phase-resetting underlying N1 generation in the suppressed N1 event-related potential. The present findings suggest that chronic alcoholism alters neural oscillatory synchrony dynamics at very early stages of information processing.

  13. Thermal Performance Analysis of a Geologic Borehole Repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reagin, Lauren

    2016-08-16

    The Brazilian Nuclear Research Institute (IPEN) proposed a design for the disposal of Disused Sealed Radioactive Sources (DSRS) based on the IAEA Borehole Disposal of Sealed Radioactive Sources (BOSS) design that would allow the entirety of Brazil’s inventory of DSRS to be disposed of in a single borehole. The proposed IPEN design allows for 170 waste packages (WPs) containing DSRS (such as Co-60 and Cs-137) to be stacked on top of each other inside the borehole. The primary objective of this work was to evaluate the thermal performance of a conservative approach to the IPEN proposal with the equivalent of two WPs and two different inside configurations using Co-60 as the radioactive heat source. The current WP configuration (heterogeneous) for the IPEN proposal has 60% of the WP volume occupied by a nuclear radioactive heat source and the remaining 40% as vacant space. The second configuration (homogeneous) considered for this project was a homogeneous case where 100% of the WP volume was occupied by a nuclear radioactive heat source. The computational models for the thermal analyses of the WP configurations with the Co-60 heat source considered three different cooling mechanisms (conduction, radiation, and convection) and the effect of mesh size on the results from the thermal analysis. The results of the analyses yielded maximum temperatures inside the WPs for both of the WP configurations and various mesh sizes. The heterogeneous WP considered the cooling mechanisms of conduction, convection, and radiation. The temperature results from the heterogeneous WP analysis suggest that the model is cooled predominantly by conduction, with the effect of radiation and natural convection on cooling being negligible. From the thermal analysis comparing the two WP configurations, the results suggest that either WP configuration could be used for the design. The mesh sensitivity results verify the meshes used, and the results obtained from the thermal analyses were close to being independent of mesh size. The results from the computational case and the analytically calculated case for the homogeneous WP benchmarking were almost identical, which indicates that the computational approach used here was successfully verified against the analytical solution.

  14. Logic circuits from zero forcing.

    PubMed

    Burgarth, Daniel; Giovannetti, Vittorio; Hogben, Leslie; Severini, Simone; Young, Michael

    We design logic circuits based on the notion of zero forcing on graphs; each gate of the circuits is a gadget in which zero forcing is performed. We show that such circuits can evaluate every monotone Boolean function. By using two vertices to encode each logical bit, we obtain universal computation. We also highlight a phenomenon of "back forcing" as a property of each function. Such a phenomenon occurs in a circuit when the input of gates which have already been used at a given time step is further modified by a computation actually performed at a later stage. Finally, we show that zero forcing can also be used to implement reversible computation. The model introduced here provides a potentially new tool in the analysis of Boolean functions, with particular attention to monotonicity. Moreover, in the light of applications of zero forcing in quantum mechanics, the link with Boolean functions may suggest new directions in quantum control theory and in the study of engineered quantum spin systems. It is an open technical problem to verify whether there is a link between zero forcing and computation with contact circuits.
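    The colour-change rule behind zero forcing is simple to state: a filled vertex with exactly one unfilled neighbour forces that neighbour to become filled. A minimal sketch of iterating the rule to closure (the graphs and starting sets are illustrative, not the paper's gate gadgets):

```python
def zero_forcing_closure(adj, filled):
    """Iterate the zero forcing colour-change rule until no vertex can force:
    a filled vertex with exactly one unfilled neighbour fills that neighbour."""
    filled = set(filled)
    changed = True
    while changed:
        changed = False
        for v in list(filled):
            unfilled = [u for u in adj[v] if u not in filled]
            if len(unfilled) == 1:
                filled.add(unfilled[0])
                changed = True
    return filled

# On the path 0-1-2-3, a single endpoint is a zero forcing set.
PATH = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
```

    On a 4-cycle, by contrast, no single vertex can force (every filled vertex has two unfilled neighbours), so the zero forcing number of the cycle is 2; gate gadgets in the paper exploit exactly this kind of conditional propagation.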

  15. A real-time spike sorting method based on the embedded GPU.

    PubMed

    Zelan Yang; Kedi Xu; Xiang Tian; Shaomin Zhang; Xiaoxiang Zheng

    2017-07-01

    Microelectrode arrays with hundreds of channels have been widely used to acquire neuron population signals in neuroscience studies. Online spike sorting is becoming one of the most important challenges for high-throughput neural signal acquisition systems. Graphics processing units (GPUs), with their high parallel computing capability, might provide an alternative solution for meeting the growing real-time computational demands of spike sorting. This study reports a method of real-time spike sorting through the Compute Unified Device Architecture (CUDA), implemented on an embedded GPU (NVIDIA Jetson Tegra K1, TK1). The sorting approach is based on principal component analysis (PCA) and K-means. By analyzing the parallelism of each processing stage, the method was further optimized within the thread and memory model of the GPU. Our results showed that the GPU-based classifier on the TK1 is 37.92 times faster than the MATLAB-based classifier on a PC while achieving the same accuracy. The high-performance computing features of the embedded GPU demonstrated in our studies suggest that embedded GPUs provide a promising platform for real-time neural signal processing.
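    The PCA-plus-K-means pipeline can be sketched serially with NumPy; this illustrative CPU version is not the authors' CUDA implementation, and the two-template synthetic spike data is hypothetical:

```python
import numpy as np

def pca(X, k=2):
    """Project waveforms (rows of X) onto the top-k principal components."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / (len(X) - 1)
    w, v = np.linalg.eigh(cov)            # eigenvalues in ascending order
    return Xc @ v[:, -k:]                 # keep the k largest components

def kmeans(X, k, iters=50):
    """Plain Lloyd's algorithm with deterministic farthest-point seeding."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels

# Hypothetical synthetic spikes: two well-separated templates plus noise.
t = np.linspace(0.0, 1.0, 32)
rng = np.random.default_rng(0)
spikes = np.vstack(
    [np.sin(2 * np.pi * t) + 0.05 * rng.standard_normal(32) for _ in range(20)]
    + [-np.sin(2 * np.pi * t) + 0.05 * rng.standard_normal(32) for _ in range(20)])
labels = kmeans(pca(spikes, k=2), k=2)
```

    The GPU gains reported come from parallelising exactly these stages (covariance, projection, distance computation) across thousands of waveforms at once.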

  16. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    ERIC Educational Resources Information Center

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  17. Beyond Reason: Emotion

    NASA Astrophysics Data System (ADS)

    Suárez Araujo, Carmen Paz; Barahona da Fonseca, Isabel; Barahona da Fonseca, José; Simões da Fonseca, J.

    2004-08-01

    A theoretical approach aimed at identifying the information processing that may be responsible for the emotional dimensions of subjective experience is studied as an initial step in the construction of a neural net model of the affective dimensions of psychological experiences. In this paper it is suggested that oriented recombination of attributes may be present not only in perceptual processing but also in cognitive processing. We present an analysis of the most important emotion theories, show their neural organization, and propose the neural computation approach as an appropriate framework for generating knowledge about the neural basis of emotional experience. Finally, we present a scheme for the design of a computational neural multi-system for emotion (CONEMSE).

  18. Application of CFD codes to the design and development of propulsion systems

    NASA Technical Reports Server (NTRS)

    Lord, W. K.; Pickett, G. F.; Sturgess, G. J.; Weingold, H. D.

    1987-01-01

    The internal flows of aerospace propulsion engines have certain common features that are amenable to analysis through Computational Fluid Dynamics (CFD) computer codes. Although the application of CFD to engineering problems in engines was delayed by the complexities associated with internal flows, many codes with different capabilities are now being used as routine design tools. This is illustrated by examples of aircraft gas turbine engine flows calculated with potential flow, Euler, parabolized Navier-Stokes, and Navier-Stokes codes. Likely future directions of CFD applied to engine flows are described, and current barriers to continued progress are highlighted. The potential importance of the Numerical Aerodynamic Simulator (NAS) to the resolution of these difficulties is suggested.

  19. Attractor Dynamics and Semantic Neighborhood Density: Processing Is Slowed by Near Neighbors and Speeded by Distant Neighbors

    PubMed Central

    Mirman, Daniel; Magnuson, James S.

    2008-01-01

    The authors investigated semantic neighborhood density effects on visual word processing to examine the dynamics of activation and competition among semantic representations. Experiment 1 validated feature-based semantic representations as a basis for computing semantic neighborhood density and suggested that near and distant neighbors have opposite effects on word processing. Experiment 2 confirmed these results: Word processing was slower for dense near neighborhoods and faster for dense distant neighborhoods. Analysis of a computational model showed that attractor dynamics can produce this pattern of neighborhood effects. The authors argue for reconsideration of traditional models of neighborhood effects in terms of attractor dynamics, which allow both inhibitory and facilitative effects to emerge. PMID:18194055
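    Feature-based semantic neighborhood density can be illustrated by counting neighbours in similarity bands around a target word's feature vector; the vectors and thresholds below are hypothetical, not the authors' feature norms:

```python
from math import sqrt

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def neighborhood_density(word, lexicon, near=0.8, far_lo=0.3, far_hi=0.6):
    """Count near neighbours (similarity >= near) and distant neighbours
    (similarity in [far_lo, far_hi)) of a word's semantic feature vector."""
    near_n = dist_n = 0
    for other, vec in lexicon.items():
        if other == word:
            continue
        s = cosine(lexicon[word], vec)
        if s >= near:
            near_n += 1
        elif far_lo <= s < far_hi:
            dist_n += 1
    return near_n, dist_n

# Hypothetical binary feature vectors (not the authors' feature norms).
LEXICON = {
    "cat":  [1, 1, 0, 0],
    "dog":  [1, 1, 0, 1],
    "lion": [1, 1, 1, 0],
    "fish": [1, 0, 0, 1],
    "car":  [0, 0, 1, 1],
}
```

    In the attractor account, many near neighbours create competing basins that slow settling, while distant neighbours shape the energy landscape in a way that speeds it, matching the opposite effects reported in the experiments.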

  20. A SUGGESTED CURRICULUM GUIDE FOR ELECTRO-MECHANICAL TECHNOLOGY ORIENTED SPECIFICALLY TO THE COMPUTER AND BUSINESS MACHINE FIELDS. INTERIM REPORT.

    ERIC Educational Resources Information Center

    LESCARBEAU, ROLAND F.; AND OTHERS

    A suggested post-secondary curriculum guide for electro-mechanical technology oriented specifically to the computer and business machine fields was developed by a group of cooperating institutions, now incorporated as Technical Education Consortium, Incorporated. Specific needs of the computer and business machine industry were determined from…

  1. Improving Students' Understanding of Molecular Structure through Broad-Based Use of Computer Models in the Undergraduate Organic Chemistry Lecture

    ERIC Educational Resources Information Center

    Springer, Michael T.

    2014-01-01

    Several articles suggest how to incorporate computer models into the organic chemistry laboratory, but relatively few papers discuss how to incorporate these models broadly into the organic chemistry lecture. Previous research has suggested that "manipulating" physical or computer models enhances student understanding; this study…

  2. Advanced Computing for Medicine.

    ERIC Educational Resources Information Center

    Rennels, Glenn D.; Shortliffe, Edward H.

    1987-01-01

    Discusses contributions that computers and computer networks are making to the field of medicine. Emphasizes the computer's speed in storing and retrieving data. Suggests that doctors may soon be able to use computers to advise on diagnosis and treatment. (TW)

  3. Comparison and correlation of Simple Sequence Repeats distribution in genomes of Brucella species

    PubMed Central

    Kiran, Jangampalli Adi Pradeep; Chakravarthi, Veeraraghavulu Praveen; Kumar, Yellapu Nanda; Rekha, Somesula Swapna; Kruti, Srinivasan Shanthi; Bhaskar, Matcha

    2011-01-01

    Computational genomics is one of the important tools for understanding the distribution of closely related genomic elements, including simple sequence repeats (SSRs), in an organism, which gives valuable information regarding genetic variation. The central objective of the present study was to screen the SSRs distributed in coding and non-coding regions among different human Brucella species which are involved in a range of pathological disorders. Computational analysis of the SSRs in Brucella indicates few deviations from expected random models. Statistical analysis also reveals that tri-nucleotide SSRs are overrepresented and tetra-nucleotide SSRs underrepresented in Brucella genomes. From the data, it can be suggested that the overrepresented tri-nucleotide SSRs in genomic and coding regions might be responsible for generating functional variation in the expressed proteins, which in turn may lead to differences in the pathogenicity, virulence determinants, stress response genes, transcription regulators and host adaptation proteins of Brucella genomes. Abbreviations: SSRs, Simple Sequence Repeats; ORFs, Open Reading Frames. PMID:21738309
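    SSR screening of the kind described is often implemented with backreference regular expressions; a naive sketch (motif lengths and repeat thresholds are illustrative, and a run matching several motif lengths is reported once per length):

```python
import re

def find_ssrs(seq, unit_min=1, unit_max=6, min_repeats=3):
    """Naive SSR finder: report (start, motif, repeat_count) for every tandem
    run of a motif of length unit_min..unit_max repeated >= min_repeats times.
    Matches for each motif length are found independently and non-overlapping."""
    ssrs = []
    for k in range(unit_min, unit_max + 1):
        pattern = r"(([ACGT]{%d})\2{%d,})" % (k, min_repeats - 1)
        for m in re.finditer(pattern, seq):
            ssrs.append((m.start(), m.group(2), len(m.group(1)) // k))
    return ssrs
```

    Genome-scale studies then compare observed counts per motif class against random-sequence expectations, which is how over- and underrepresentation is assessed.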

  4. Computer adaptive test approach to the assessment of children and youth with brachial plexus birth palsy.

    PubMed

    Mulcahey, M J; Merenda, Lisa; Tian, Feng; Kozin, Scott; James, Michelle; Gogola, Gloria; Ni, Pengsheng

    2013-01-01

    This study examined the psychometric properties of item pools relevant to upper-extremity function and activity performance and evaluated simulated 5-, 10-, and 15-item computer adaptive tests (CATs). In a multicenter, cross-sectional study of 200 children and youth with brachial plexus birth palsy (BPBP), parents responded to upper-extremity (n = 52) and activity (n = 34) items using a 5-point response scale. We used confirmatory and exploratory factor analysis, ordinal logistic regression, item maps, and standard errors to evaluate the psychometric properties of the item banks. Validity was evaluated using analysis of variance and Pearson correlation coefficients. Results show that the two item pools had acceptable model fit, scaled well for children and youth with BPBP, and had good validity, content range, and precision. Simulated CATs performed comparably to the full item banks, suggesting that a reduced number of items provides similar information to the entire set of items. Copyright © 2013 by the American Occupational Therapy Association, Inc.

  5. Signalling maps in cancer research: construction and data analysis

    PubMed Central

    Kondratova, Maria; Sompairac, Nicolas; Barillot, Emmanuel; Zinovyev, Andrei

    2018-01-01

    Abstract Generation and usage of high-quality molecular signalling network maps can be augmented by standardizing notations, establishing curation workflows and application of computational biology methods to exploit the knowledge contained in the maps. In this manuscript, we summarize the major aims and challenges of assembling information in the form of comprehensive maps of molecular interactions. Mainly, we share our experience gained while creating the Atlas of Cancer Signalling Network. In the step-by-step procedure, we describe the map construction process and suggest solutions for map complexity management by introducing a hierarchical modular map structure. In addition, we describe the NaviCell platform, a computational technology using Google Maps API to explore comprehensive molecular maps similar to geographical maps and explain the advantages of semantic zooming principles for map navigation. We also provide the outline to prepare signalling network maps for navigation using the NaviCell platform. Finally, several examples of cancer high-throughput data analysis and visualization in the context of comprehensive signalling maps are presented. PMID:29688383

  6. The effects of computer-assisted instruction and locus of control upon preservice elementary teachers' acquisition of the integrated science process skills

    NASA Astrophysics Data System (ADS)

    Wesley, Beth Eddinger; Krockover, Gerald H.; Devito, Alfred

    The purpose of this study was to determine the effects of computer-assisted instruction (CAI) versus a text mode of programmed instruction (PI), and the cognitive style of locus of control, on preservice elementary teachers' achievement of the integrated science process skills. Eighty-one preservice elementary teachers in six sections of a science methods class were classified as internally or externally controlled. The sections were randomly assigned to receive instruction in the integrated science process skills via a microcomputer or printed text. The study used a pretest-posttest control group design. Before assessing main and interaction effects, analysis of covariance was used to adjust posttest scores using the pretest scores. Statistical analysis revealed that main effects were not significant. Additionally, no interaction effects between treatments and loci of control were demonstrated. The results suggest that printed PI and tutorial CAI are equally effective modes of instruction for teaching internally and externally oriented preservice elementary teachers the integrated science process skills.

  7. PTP-ε HAS A CRITICAL ROLE IN SIGNALING TRANSDUCTION PATHWAYS AND PHOSPHOPROTEIN NETWORK TOPOLOGY IN RED CELLS

    PubMed Central

    De Franceschi, Lucia; Biondani, Andrea; Carta, Franco; Turrini, Franco; Laudanna, Carlo; Deana, Renzo; Brunati, Anna Maria; Turretta, Loris; Iolascon, Achille; Perrotta, Silverio; Elson, Ari; Bulato, Cristina; Brugnara, Carlo

    2010-01-01

    Protein tyrosine phosphatases (PTPs) are crucial components of cellular signal transduction pathways. We report here that red blood cells (RBCs) from mice lacking PTPε (Ptpre−/−) exhibit abnormal morphology and increased Ca2+-activated-K+ channel activity, which was partially blocked by the Src-Family-Kinases (SFKs) inhibitor PP1. In Ptpre−/− mouse RBCs, the activity of Fyn and Yes, two SFKs, were increased, suggesting a functional relationship between SFKs, PTPε and Ca2+-activated-K+-channel. The absence of PTPε markedly affected the RBC membrane tyrosine (Tyr-) phosphoproteome, indicating a perturbation of RBCs signal transduction pathways. Using signaling network computational analysis of the Tyr-phosphoproteomic data, we identified 7 topological clusters. We studied cluster 1, containing Syk-Tyr-kinase: Syk-kinase activity was higher in wild-type than in Ptpre−/− RBCs, validating the network computational analysis and indicating a novel signaling pathway, which involves Fyn and Syk in regulation of red cell morphology. PMID:18924107

  8. Use of the internet to study the utility values of the public.

    PubMed Central

    Lenert, Leslie A.; Sturley, Ann E.

    2002-01-01

    One of the most difficult tasks in cost-effectiveness analysis is the measurement of quality weights (utilities) for health states. The task is difficult because subjects often lack familiarity with health states they are asked to rate and because utility measures, such as the standard gamble, ask subjects to perform tasks that are complex and far from everyday experience. A large body of research suggests that computer methods can play an important role in explaining health states and measuring utilities. However, administering computer surveys to a "general public" sample, the most relevant sample for cost-effectiveness analysis, is logistically difficult. In this paper, we describe a software system designed to allow the study of general population preferences in a volunteer Internet survey panel. The approach, which relied on oversampling of ethnic groups and older members of the panel, produced a data set with an ethnically, chronologically and geographically diverse group of respondents, but was not successful in replicating the joint distribution of demographic patterns in the population. PMID:12463862

  9. A novel spinal kinematic analysis using X-ray imaging and vicon motion analysis: a case study.

    PubMed

    Noh, Dong K; Lee, Nam G; You, Joshua H

    2014-01-01

    This study highlights a novel spinal kinematic analysis method and the feasibility of X-ray imaging measurements to accurately assess thoracic spine motion. The advanced X-ray Nash-Moe method and analysis were used to compute the segmental range of motion in thoracic vertebra pedicles in vivo. This Nash-Moe X-ray imaging method was compared with a standardized method using the Vicon 3-dimensional motion capture system. Linear regression analysis showed an excellent and significant correlation between the two methods (R2 = 0.99, p < 0.05), suggesting that the analysis of spinal segmental range of motion using X-ray imaging measurements was accurate and comparable to the conventional 3-dimensional motion analysis system. Clinically, this novel finding is compelling evidence demonstrating that measurements with X-ray imaging are useful to accurately decipher pathological spinal alignment and movement impairments in idiopathic scoliosis (IS).

  10. Experimental Investigation of Jet Impingement Heat Transfer Using Thermochromic Liquid Crystals

    NASA Technical Reports Server (NTRS)

    Dempsey, Brian Paul

    1997-01-01

    Jet impingement cooling of a hypersonic airfoil leading edge is experimentally investigated using thermochromic liquid crystals (TLCs) to measure surface temperature. The experiment uses computer data acquisition with digital imaging of the TLCs to determine heat transfer coefficients during a transient experiment. The data reduction relies on analysis of a coupled transient conduction - convection heat transfer problem that characterizes the experiment. The recovery temperature of the jet is accounted for by running two experiments with different heating rates, thereby generating a second equation that is used to solve for the recovery temperature. The resulting solution requires a complicated numerical iteration that is handled by a computer. Because the computational data reduction method is complex, special attention is paid to error assessment. The error analysis considers random and systematic errors generated by the instrumentation along with errors generated by the approximate nature of the numerical methods. Results of the error analysis show that the experimentally determined heat transfer coefficients are accurate to within 15%. The error analysis also shows that the recovery temperature data may be in error by more than 50%. The results show that the recovery temperature data is only reliable when the recovery temperature of the jet is greater than 5 C, i.e. the jet velocity is in excess of 100 m/s. Parameters that were investigated include nozzle width, distance from the nozzle exit to the airfoil surface, and jet velocity. Heat transfer data is presented in graphical and tabular forms. An engineering analysis of hypersonic airfoil leading edge cooling is performed using the results from these experiments. Several suggestions for the improvement of the experimental technique are discussed.
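    A common data-reduction model for transient TLC experiments treats the wall as a one-dimensional semi-infinite solid under a convective step, giving θ = 1 − exp(β²)·erfc(β) with β = h·√(t/ρck); h is then recovered numerically from the measured surface temperature rise. The sketch below assumes an acrylic-like thermal product ρck and may differ from the coupled conduction-convection model actually solved in the thesis:

```python
from math import erfc, exp, pi, sqrt

def _exp_erfc(b):
    """exp(b^2) * erfc(b), switching to an asymptotic form to avoid overflow."""
    if b < 5.0:
        return exp(b * b) * erfc(b)
    return (1.0 - 1.0 / (2.0 * b * b)) / (b * sqrt(pi))

def surface_response(h, t, rho_c_k):
    """Normalised surface temperature rise (Tw - Ti)/(Taw - Ti) of a
    semi-infinite solid after a convective step; beta = h*sqrt(t/(rho*c*k))."""
    beta = h * sqrt(t / rho_c_k)
    return 1.0 - _exp_erfc(beta)

def solve_h(theta, t, rho_c_k, lo=1.0, hi=5000.0):
    """Bisect for the heat transfer coefficient matching the measured rise
    theta; surface_response is monotonically increasing in h."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if surface_response(mid, t, rho_c_k) < theta:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Assumed acrylic-like thermal product rho*c*k in SI units (illustrative).
RHO_C_K = 3.3e5
```

    Running the experiment at two heating rates, as described above, yields two such equations, allowing the unknown recovery temperature to be solved for alongside h.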

  11. Comparison of different hip prosthesis shapes considering micro-level bone remodeling and stress-shielding criteria using three-dimensional design space topology optimization.

    PubMed

    Boyle, Christopher; Kim, Il Yong

    2011-06-03

    Since the late 1980s, computational analysis of total hip arthroplasty (THA) prosthesis components has been completed using macro-level bone remodeling algorithms. The utilization of macro-sized elements requires apparent bone densities to predict cancellous bone strength, thereby preventing visualization and analysis of realistic trabecular architecture. In this study, we utilized a recently developed structural optimization algorithm, design space optimization (DSO), to perform a micro-level three-dimensional finite element bone remodeling simulation on the human proximal femur pre- and post-THA. The computational simulation facilitated direct performance comparison between two commercially available prosthetic implant stems from Zimmer Inc.: the Alloclassic and the Mayo conservative. The novel micro-level approach allowed the unique ability to visualize the trabecular bone adaptation post-operation and to quantify the changes in bone mineral content by region. Stress-shielding and strain energy distribution were also quantified for the immediate post-operation and the stably fixated, post-remodeling conditions. Stress-shielding was highest in the proximal region and remained unchanged post-remodeling; conversely, the mid and distal portions show large increases in stress, suggesting a distal shift in the loadpath. The Mayo design conserves bone mass, while simultaneously reducing the incidence of stress-shielding compared to the Alloclassic, revealing a key benefit of the distinctive geometry. Several important factors for stable fixation, determined in clinical evaluations from the literature, were evident in both designs: high levels of proximal bone loss and distal bone densification. The results suggest this novel computational framework can be utilized for comparative hip prosthesis shape analysis, uniquely considering the post-operation bone remodeling as a design criterion. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Synthesis, molecular structure, spectral analysis and cytotoxic activity of two new aroylhydrazones

    NASA Astrophysics Data System (ADS)

    Singh, Ravindra Kumar; Singh, Ashok Kumar; Siddiqui, Sahabjada; Arshad, Mohammad; Jafri, Asif

    2017-05-01

    Two new aroylhydrazones viz 4-nitro-N′-(1-(pyridin-2-yl)ethylidene)benzohydrazide, NPHY (4) and 4-nitro-N′-(1-(thiophen-2-yl)ethylidene)benzohydrazide, NPHT (5) have been prepared and characterized by 1H NMR, 13C NMR, FT-IR, UV-Visible spectroscopy and mass spectrometry. All quantum calculations were performed at DFT level of theory using B3LYP functional and 6-31G (d,p) as basis set. TD-DFT calculated electronic transitions are found to be in good agreement with experimental findings. The assignments for normal vibrational modes have been done by computing Potential Energy Distribution (PED) using Gar2ped. HOMO-LUMO analysis was performed and reactivity descriptors were also computed. Global electrophilicity index (ω) of 6.12-6.26 eV shows these aroylhydrazones to be strong electrophiles. Intramolecular interactions were analyzed by 'Atoms in molecule' (AIM) approach. Also, the computed first static hyperpolarizabilities (β0) of these hydrazones indicate their future application as an attractive non-linear optical (NLO) material. Cytotoxicity evaluated by MTT assay, suggested that the synthesized aroylhydrazones significantly reduce the cell viability of breast cancer cell lines (MCF7) and human prostate adenocarcinoma (DU145) in a dose dependent manner. Cytotoxic potencies (IC50) of these hydrazones against MCF7 and DU145 cell lines were found in range of 54.67-85.67 μM. The result of ROS activity provides supportive data for molecular mechanism of these hydrazones, which is related to apoptotic cellular death. Nuclear condensation assay performed by DAPI staining shows fragmented and condensed nuclei in MCF7 cells, suggesting cell death by apoptosis.
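    The global electrophilicity index quoted (ω = 6.12-6.26 eV) is the standard conceptual-DFT quantity ω = μ²/(2η) computed from frontier orbital energies. The orbital energies below are illustrative values chosen to fall near the reported range, not the computed ones:

```python
def reactivity_descriptors(e_homo, e_lumo):
    """Conceptual-DFT global descriptors (eV) from frontier orbital energies:
    chemical potential mu = (E_HOMO + E_LUMO)/2, hardness eta =
    (E_LUMO - E_HOMO)/2, electrophilicity omega = mu^2 / (2*eta)."""
    mu = (e_homo + e_lumo) / 2.0
    eta = (e_lumo - e_homo) / 2.0
    omega = mu * mu / (2.0 * eta)
    return mu, eta, omega

# Illustrative orbital energies (eV); a low-lying LUMO and a moderate gap
# produce the large omega characteristic of a strong electrophile.
mu, eta, omega = reactivity_descriptors(-7.0, -3.0)
```

    A large ω reflects a deep chemical potential combined with low hardness, i.e. a molecule that readily accepts electron density, consistent with the strong-electrophile characterization above.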

  13. New computing systems, future computing environment, and their implications on structural analysis and design

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  14. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).

  15. Substructure analysis using NICE/SPAR and applications of force to linear and nonlinear structures. [spacecraft masts

    NASA Technical Reports Server (NTRS)

    Razzaq, Zia; Prasad, Venkatesh; Darbhamulla, Siva Prasad; Bhati, Ravinder; Lin, Cai

    1987-01-01

    Parallel computing studies are presented for a variety of structural analysis problems. Included are the substructure planar analysis of rectangular panels with and without a hole, the static analysis of a space mast using NICE/SPAR and FORCE, and the substructure analysis of plane rigid-jointed frames using FORCE. The computations are carried out on the Flex/32 MultiComputer using one to eighteen processors. The NICE/SPAR runstream samples are documented for the panel problem. For the substructure analysis of plane frames, a computer program is developed to demonstrate the effectiveness of a substructuring technique when FORCE is employed. Ongoing research activities for an elasto-plastic stability analysis problem using FORCE, and stability analysis of the focus problem using NICE/SPAR, are briefly summarized. Speedup curves for the panel, mast, and frame problems provide a basic understanding of the effectiveness of the parallel computing procedures utilized or developed, within the domain of the parameters considered. Although the speedup curves obtained exhibit various levels of computational efficiency, they clearly demonstrate the promise that parallel computing holds for structural analysis problems. Source code is given for the elasto-plastic stability problem and the FORCE program.

  16. Development of a New Data Tool for Computing Launch and Landing Availability with Respect to Surface Weather

    NASA Technical Reports Server (NTRS)

    Burns, K. Lee; Altino, Karen

    2008-01-01

    The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Its existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability given an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to give the Space Shuttle program the ability to estimate the impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing-site weather to produce landing availability, and concurrent analysis of multiple sites to assist in operational landing-site selection. In addition, the Constellation program has also expressed interest in the APRA tool and has requested several additional capabilities to address Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun on a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the currently existing tool.
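
    The core computation the abstract describes (the climatological probability of not exceeding a set of weather constraint limits) can be sketched as a simple empirical frequency over historical records. This is a minimal illustration, not APRA's or PACER's actual model; the record fields and constraint names below are hypothetical:

```python
def launch_availability(records, constraints):
    """Empirical launch availability: the fraction of historical
    weather records in which every constrained parameter stays at or
    below its limit. Record keys and constraint names are illustrative,
    not the tool's real schema."""
    ok = sum(
        all(rec[name] <= limit for name, limit in constraints.items())
        for rec in records
    )
    return ok / len(records)

# Hypothetical usage: three climatological records, two constraints.
records = [
    {"wind_kt": 10, "rain_mm": 0},
    {"wind_kt": 30, "rain_mm": 0},   # violates the wind limit
    {"wind_kt": 5, "rain_mm": 3},    # violates the rain limit
]
constraints = {"wind_kt": 20, "rain_mm": 1}
availability = launch_availability(records, constraints)
```

    A real tool would condition this frequency on season and time of day, which is what the abstract means by accounting for seasonal and diurnal variability.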

  17. Opportunities and Needs for Mobile-Computing Technology to Support U.S. Geological Survey Fieldwork

    USGS Publications Warehouse

    Wood, Nathan J.; Halsing, David L.

    2006-01-01

    To assess the opportunities and needs for mobile-computing technology at the U.S. Geological Survey (USGS), we conducted an internal, Internet-based survey of bureau scientists whose research includes fieldwork. In summer 2005, 144 survey participants answered 65 questions about fieldwork activities and conditions, technology to support field research, and postfieldwork data processing and analysis. Results suggest that some types of mobile-computing technology are already commonplace, such as digital cameras and Global Positioning System (GPS) receivers, whereas others are not, such as personal digital assistants (PDAs) and tablet-based personal computers (tablet PCs). The potential for PDA use in the USGS is high: 97 percent of respondents record field observations (primarily environmental conditions and water-quality data), and 87 percent take field samples (primarily water samples and sediment/soil samples). The potential for tablet PC use in the USGS is also high: 59 percent of respondents map environmental features in the field, primarily by sketching in field notebooks, on aerial photographs, or on topographic-map sheets. Results also suggest that efficient mobile-computing-technology solutions could benefit many USGS scientists because most respondents spend at least 1 week per year in the field, conduct field sessions that are at least 1 week in duration, have field crews of one to three people, and typically travel on foot about 1 mi from their field vehicles. By allowing researchers to enter data directly into digital databases while in the field, mobile-computing technology could also minimize postfieldwork data processing: 93 percent of respondents enter collected field data into their office computers, and more than 50 percent spend at least 1 week per year on postfieldwork data processing. Reducing postfieldwork data processing could free up additional time for researchers and result in cost savings for the bureau.
Generally, respondents support greater use of mobile-computing technology at the USGS and are interested in training opportunities and further discussions related to data archiving, access to additional digital data types, and technology development.

  18. Computed tomographic analysis of temporal maxillary stability and pterygomaxillary generate formation following pediatric Le Fort III distraction advancement.

    PubMed

    Hopper, Richard A; Sandercoe, Gavin; Woo, Albert; Watts, Robyn; Kelley, Patrick; Ettinger, Russell E; Saltzman, Babette

    2010-11-01

    Le Fort III distraction requires generation of bone in the pterygomaxillary region. The authors performed retrospective digital analysis on temporal fine-cut computed tomographic images to quantify both radiographic evidence of pterygomaxillary region bone formation and relative maxillary stability. Fifteen patients with syndromic midface hypoplasia were included in the study. The average age of the patients was 8.7 years; 11 had either Crouzon or Apert syndrome. The average displacement of the maxilla during distraction was 16.2 mm (range, 7 to 31 mm). Digital analysis was performed on fine-cut computed tomographic scans before surgery, at device removal, and at annual follow-up. Seven patients also had mid-consolidation computed tomographic scans. Relative maxillary stability and density of radiographic bone in the pterygomaxillary region were calculated between each scan. There was no evidence of clinically significant maxillary relapse, rotation, or growth between the end of consolidation and 1-year follow-up, other than a relatively small 2-mm subnasal maxillary vertical growth. There was an average radiographic ossification of 0.5 mm/mm advancement at the time of device removal, with a 25th percentile value of 0.3 mm/mm. The time during consolidation that each patient reached the 25th percentile of pterygomaxillary region bone density observed in this series of clinically stable advancements ranged from 1.3 to 9.8 weeks (average, 3.7 weeks). There was high variability in the amount of bone formed in the pterygomaxillary region associated with clinical stability of the advanced Le Fort III segment. These data suggest that a subsection of patients generate the minimal amount of pterygomaxillary region bone formation associated with advancement stability as early as 4 weeks into consolidation.

  19. Computational System For Rapid CFD Analysis In Engineering

    NASA Technical Reports Server (NTRS)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  20. Studies in nonlinear problems of energy. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matkowsky, B.J.

    1998-12-01

    The author completed a successful research program on Nonlinear Problems of Energy, with emphasis on combustion and flame propagation. A total of 183 papers associated with the grant have appeared in the literature, and the efforts have twice been recognized by DOE's Basic Science Division for Top Accomplishment. In the research program the author concentrated on modeling, analysis and computation of combustion phenomena, with particular emphasis on the transition from laminar to turbulent combustion. Thus he investigated the nonlinear dynamics and pattern formation in the successive stages of transition. He described the stability of combustion waves, and transitions to waves exhibiting progressively higher degrees of spatio-temporal complexity. Combustion waves are characterized by large activation energies, so that chemical reactions are significant only in thin layers, termed reaction zones. In the limit of infinite activation energy, the zones shrink to moving surfaces, termed fronts, which must be found during the course of the analysis, so that the problems are moving free boundary problems. The analytical studies were carried out for the limiting case with fronts, while the numerical studies were carried out for the case of finite, though large, activation energy. Accurate resolution of the solution in the reaction zone(s) is essential, otherwise false predictions of dynamical behavior are possible. Since the reaction zones move, and their location is not known a priori, the author has developed adaptive pseudo-spectral methods, which have proven to be very useful for the accurate, efficient computation of solutions of combustion, and other, problems. The approach is based on a combination of analytical and numerical methods. The numerical computations built on and extended the information obtained analytically.
Furthermore, the solutions obtained analytically served as benchmarks for testing the accuracy of the solutions determined computationally. Finally, the computational results suggested new analysis to be considered. A cumulative list of publications citing the grant makes up the contents of this report.

  1. On avoided words, absent words, and their application to biological sequence analysis.

    PubMed

    Almirantis, Yannis; Charalampopoulos, Panagiotis; Gao, Jia; Iliopoulos, Costas S; Mohamed, Manal; Pissis, Solon P; Polychronopoulos, Dimitris

    2017-01-01

    The deviation of the observed frequency of a word w from its expected frequency in a given sequence x is used to determine whether or not the word is avoided. This concept is particularly useful in DNA linguistic analysis. The value of the deviation of w, denoted by [Formula: see text], effectively characterises the extent of a word by its edge contrast in the context in which it occurs. A word w of length [Formula: see text] is a [Formula: see text]-avoided word in x if [Formula: see text], for a given threshold [Formula: see text]. Notice that such a word may be completely absent from x. Hence, computing all such words naïvely can be a very time-consuming procedure, in particular for large k. In this article, we propose an [Formula: see text]-time and [Formula: see text]-space algorithm to compute all [Formula: see text]-avoided words of length k in a given sequence of length n over a fixed-sized alphabet. We also present a time-optimal [Formula: see text]-time algorithm to compute all [Formula: see text]-avoided words (of any length) in a sequence of length n over an integer alphabet of size [Formula: see text]. In addition, we provide a tight asymptotic upper bound for the number of [Formula: see text]-avoided words over an integer alphabet and the expected length of the longest one. We make available an implementation of our algorithm. Experimental results, using both real and synthetic data, show the efficiency and applicability of our implementation in biological sequence analysis. The systematic search for avoided words is particularly useful for biological sequence analysis. We present a linear-time and linear-space algorithm for the computation of avoided words of length k in a given sequence x. We suggest a modification to this algorithm so that it computes all avoided words of x, irrespective of their length, within the same time complexity. We also present combinatorial results with regard to avoided words and absent words.
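
    The deviation-based definition can be illustrated with a brute-force sketch. This is not the authors' linear-time suffix-tree algorithm: expected counts here come from a standard maximal-order Markov estimate built from the word's two overlapping (k-1)-mers, the standardisation is a simplified assumption, and the naive scan only scores words that actually occur in x, whereas the paper also handles absent words.

```python
from collections import Counter

def avoided_words(x, k, threshold):
    """Brute-force sketch (assumes k >= 3): flag length-k words whose
    observed count falls short of a Markov expectation derived from
    their (k-1)-mer halves. O(n*k) time, unlike the paper's O(n)."""
    def count(m):
        return Counter(x[i:i + m] for i in range(len(x) - m + 1))

    f_k, f_km1, f_km2 = count(k), count(k - 1), count(k - 2)
    avoided = []
    for w in {x[i:i + k] for i in range(len(x) - k + 1)}:
        denom = f_km2[w[1:-1]]
        if denom == 0:
            continue
        # maximal-order Markov expectation: f(prefix)*f(suffix)/f(core)
        expected = f_km1[w[:-1]] * f_km1[w[1:]] / denom
        dev = (f_k[w] - expected) / max(expected ** 0.5, 1.0)
        if dev < threshold:
            avoided.append((w, dev))
    return avoided
```

    For example, in the run "AAAAAA" the trimer "AAA" occurs 4 times against an expectation of 25/6, so it is (slightly) avoided at threshold 0.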

  2. An insight to the molecular interactions of the FDA approved HIV PR drugs against L38L↑N↑L PR mutant

    NASA Astrophysics Data System (ADS)

    Sanusi, Zainab K.; Govender, Thavendran; Maguire, Glenn E. M.; Maseko, Sibusiso B.; Lin, Johnson; Kruger, Hendrik G.; Honarparvar, Bahareh

    2018-03-01

    The aspartate protease of the human immunodeficiency virus type 1 (HIV-1) has become a crucial antiviral target for which many useful antiretroviral inhibitors have been developed. However, the emergence of new HIV-1 PR mutations enhances drug resistance; hence, the available FDA-approved drugs show less activity towards the protease. A mutation and insertion designated L38L↑N↑L PR was recently reported from the C-SA HIV-1 subtype. An integrated two-layered ONIOM (QM:MM) method was employed in this study to examine the binding affinities of nine HIV PR inhibitors against this mutant. The computed binding free energies as well as experimental data revealed reduced inhibitory activity towards L38L↑N↑L PR in comparison with subtype C-SA HIV-1 PR. This observation suggests that the insertion and mutations significantly affect the binding affinities or characteristics of the HIV PIs and/or the parent PR. The same trend in the computational binding free energies was observed for eight of the nine inhibitors with respect to the experimental binding free energies. The outcome of this study shows that the ONIOM method can be used as a reliable computational approach to rationalize lead compounds against specific targets. The nature of the intermolecular interactions, in terms of host-guest hydrogen bond interactions, is discussed using atoms in molecules (AIM) analysis. Natural bond orbital (NBO) analysis was also used to determine the extent of charge transfer between the QM region of the L38L↑N↑L PR enzyme and the FDA-approved drugs. AIM analysis showed that the interactions between the QM region of L38L↑N↑L PR and the FDA-approved drugs are electrostatic dominant; the bond stability computed from the NBO analysis supports the results of the AIM application. Future studies will focus on improving the computational model by considering explicit water molecules in the active pocket.
We believe that this approach has the potential to provide information that will aid in the design of much improved HIV-1 PR antiviral drugs.

  3. Analysis and design of algorithm-based fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Nair, V. S. Sukumaran

    1990-01-01

    An important consideration in the design of high performance multiprocessor systems is to ensure the correctness of the results computed in the presence of transient and intermittent failures. Concurrent error detection and correction have been applied to such systems in order to achieve reliability. Algorithm Based Fault Tolerance (ABFT) was suggested as a cost-effective concurrent error detection scheme. The research was motivated by the complexity involved in the analysis and design of ABFT systems. To that end, a matrix-based model was developed and, based on that, algorithms for both the design and analysis of ABFT systems are formulated. These algorithms are less complex than the existing ones. In order to reduce the complexity further, a hierarchical approach is developed for the analysis of large systems.
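
    As an illustration of the style of scheme ABFT generalizes (this is the classic Huang-Abraham checksum encoding for matrix multiplication, not the paper's matrix-based model itself), a product computed on checksum-augmented operands can be verified cheaply: a faulty entry breaks a row or a column checksum, and intersecting the failing row and column would even locate it.

```python
def with_col_checksum(A):
    # append a row holding each column's sum
    cols = [sum(row[j] for row in A) for j in range(len(A[0]))]
    return [row[:] for row in A] + [cols]

def with_row_checksum(B):
    # append to each row the sum of its entries
    return [row + [sum(row)] for row in B]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def detect_error(Cf, tol=1e-9):
    """Cf is the full-checksum product: its last row must equal the
    column sums of the interior and its last column the row sums."""
    n, p = len(Cf) - 1, len(Cf[0]) - 1
    col_bad = any(abs(sum(Cf[i][j] for i in range(n)) - Cf[n][j]) > tol
                  for j in range(p + 1))
    row_bad = any(abs(sum(Cf[i][j] for j in range(p)) - Cf[i][p]) > tol
                  for i in range(n + 1))
    return col_bad or row_bad
```

    The check costs O(n^2) comparisons on top of the O(n^3) multiplication, which is the concurrent, low-overhead property that makes ABFT cost-effective.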

  4. Can natural proteins designed with 'inverted' peptide sequences adopt native-like protein folds?

    PubMed

    Sridhar, Settu; Guruprasad, Kunchur

    2014-01-01

    We have carried out a systematic computational analysis on a representative dataset of proteins of known three-dimensional structure in order to evaluate whether it would be possible to 'swap' certain short peptide sequences in naturally occurring proteins with their corresponding 'inverted' peptides and generate 'artificial' proteins that are predicted to retain a native-like protein fold. The analysis of 3,967 representative proteins from the Protein Data Bank revealed 102,677 unique identical inverted peptide sequence pairs that vary in sequence length between 5 and 18 amino acid residues. Our analysis illustrates with examples that such 'artificial' proteins may be generated by identifying peptides with a 'similar structural environment' and by using comparative protein modeling and validation studies. Our analysis suggests that natural proteins may be tolerant to accommodating such peptides.
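
    The first step the abstract describes, scanning a sequence set for identical inverted peptide pairs, can be sketched naively as follows; the structural-environment comparison and comparative modeling steps are beyond a few lines, and the dictionary format here is an assumption:

```python
def inverted_pairs(sequences, k):
    """Collect length-k peptides whose exact reversal also occurs
    somewhere in the dataset. sequences: {name: one-letter sequence}.
    Palindromic peptides (their own inversion) are skipped by w < r,
    which also reports each pair only once."""
    occ = {}
    for name, seq in sequences.items():
        for i in range(len(seq) - k + 1):
            occ.setdefault(seq[i:i + k], []).append((name, i))
    return [(w, w[::-1]) for w in occ
            if w[::-1] in occ and w < w[::-1]]
```

    A full scan would repeat this for each length in the reported 5-18 residue range and then filter pairs by structural context before any swap is modeled.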

  5. CSM research: Methods and application studies

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.

    1989-01-01

    Computational mechanics is that discipline of applied science and engineering devoted to the study of physical phenomena by means of computational methods based on mathematical modeling and simulation, utilizing digital computers. The discipline combines theoretical and applied mechanics, approximation theory, numerical analysis, and computer science. Computational mechanics has had a major impact on engineering analysis and design. When applied to structural mechanics, the discipline is referred to herein as computational structural mechanics. Complex structures being considered by NASA for the 1990's include composite primary aircraft structures and the space station. These structures will be much more difficult to analyze than today's structures and necessitate a major upgrade in computerized structural analysis technology. NASA has initiated a research activity in structural analysis called Computational Structural Mechanics (CSM). The broad objective of the CSM activity is to develop advanced structural analysis technology that will exploit modern and emerging computers, such as those with vector and/or parallel processing capabilities. Here, the current research directions for the Methods and Application Studies Team of the Langley CSM activity are described.

  6. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    PubMed

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.

  7. Comparison of Building Loads Analysis and System Thermodynamics (BLAST) Computer Program Simulations and Measured Energy Use for Army Buildings.

    DTIC Science & Technology

    1980-05-01

    Compares predictions of the Building Loads Analysis and System Thermodynamics (BLAST) computer program with measured energy use for Army buildings. A dental clinic and a battalion headquarters and classroom building were studied. Contents include building and HVAC system data, computer simulation, comparison of actual and simulated results, and analysis and findings.

  8. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 2; Preliminary Results

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Weston, R. P.; Samareh, J. A.; Mason, B. H.; Green, L. L.; Biedron, R. T.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity finite-element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes both the preliminary results from implementing and validating the multidisciplinary analysis and the results from an aerodynamic optimization. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture compliant software product. A companion paper describes the formulation of the multidisciplinary analysis and optimization system.

  9. Multiphase fluid-solid coupled analysis of shock-bubble-stone interaction in shockwave lithotripsy.

    PubMed

    Wang, Kevin G

    2017-10-01

    A novel multiphase fluid-solid-coupled computational framework is applied to investigate the interaction of a kidney stone immersed in liquid with a lithotripsy shock wave (LSW) and a gas bubble near the stone. The main objective is to elucidate the effects of a bubble in the shock path on the elastic and fracture behaviors of the stone. The computational framework couples a finite volume 2-phase computational fluid dynamics solver with a finite element computational solid dynamics solver. The surface of the stone is represented as a dynamic embedded boundary in the computational fluid dynamics solver. The evolution of the bubble surface is captured by solving the level set equation. The interface conditions at the surfaces of the stone and the bubble are enforced through the construction and solution of local fluid-solid and 2-fluid Riemann problems. This computational framework is first verified for 3 example problems including a 1D multimaterial Riemann problem, a 3D shock-stone interaction problem, and a 3D shock-bubble interaction problem. Next, a series of shock-bubble-stone-coupled simulations are presented. This study suggests that the dynamic response of a bubble to LSW varies dramatically depending on its initial size. Bubbles with an initial radius smaller than a threshold collapse within 1 μs after the passage of LSW, whereas larger bubbles do not. For a typical LSW generated by an electrohydraulic lithotripter (pmax = 35.0 MPa, pmin = -10.1 MPa), this threshold is approximately 0.12 mm. Moreover, this study suggests that a noncollapsing bubble imposes a negative effect on stone fracture as it shields part of the LSW from the stone. On the other hand, a collapsing bubble may promote fracture on the proximal surface of the stone, yet hinder fracture from the stone interior. Copyright © 2016 John Wiley & Sons, Ltd.

  10. Estimation of Fine-Scale Histologic Features at Low Magnification.

    PubMed

    Zarella, Mark D; Quaschnick, Matthew R; Breen, David E; Garcia, Fernando U

    2018-06-18

    - Whole-slide imaging has ushered in a new era of technology that has fostered the use of computational image analysis for diagnostic support and has begun to transfer the act of analyzing a slide to computer monitors. Due to the overwhelming amount of detail available in whole-slide images, analytic procedures, whether computational or visual, often operate at magnifications lower than the magnification at which the image was acquired. As a result, a corresponding reduction in image resolution occurs. It is unclear how much information is lost when magnification is reduced, and whether the rich color attributes of histologic slides can aid in reconstructing some of that information. - To examine the correspondence between the color and spatial properties of whole-slide images to elucidate the impact of resolution reduction on the histologic attributes of the slide. - We simulated image resolution reduction and modeled its effect on classification of the underlying histologic structure. By harnessing measured histologic features and the intrinsic spatial relationships between histologic structures, we developed a predictive model to estimate the histologic composition of tissue in a manner that exceeds the resolution of the image. - Reduction in resolution resulted in a significant loss of the ability to accurately characterize histologic components at magnifications less than ×10. By utilizing pixel color, this ability was improved at all magnifications. - Multiscale analysis of histologic images requires an adequate understanding of the limitations imposed by image resolution. Our findings suggest that some of these limitations may be overcome with computational modeling.

  11. Computer network environment planning and analysis

    NASA Technical Reports Server (NTRS)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems, thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid, with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  12. Synthesizing Results From Empirical Research on Computer-Based Scaffolding in STEM Education

    PubMed Central

    Belland, Brian R.; Walker, Andrew E.; Kim, Nam Ju; Lefler, Mason

    2016-01-01

    Computer-based scaffolding assists students as they generate solutions to complex problems, goals, or tasks, helping increase and integrate their higher order skills in the process. However, despite decades of research on scaffolding in STEM (science, technology, engineering, and mathematics) education, no existing comprehensive meta-analysis has synthesized the results of these studies. This review addresses that need by synthesizing the results of 144 experimental studies (333 outcomes) on the effects of computer-based scaffolding designed to assist the full range of STEM learners (primary through adult education) as they navigated ill-structured, problem-centered curricula. Results of our random-effects meta-analysis (a) indicate that computer-based scaffolding showed a consistently positive (ḡ = 0.46) effect on cognitive outcomes across various contexts of use, scaffolding characteristics, and levels of assessment and (b) shed light on many scaffolding debates, including the roles of customization (i.e., fading and adding) and context-specific support. Specifically, scaffolding’s influence on cognitive outcomes did not vary on the basis of context-specificity, presence or absence of scaffolding change, and logic by which scaffolding change is implemented. Scaffolding’s influence was greatest when measured at the principles level and among adult learners. Still, scaffolding’s effect was substantial and significantly greater than zero across all age groups and assessment levels. These results suggest that scaffolding is a highly effective intervention across levels of different characteristics and can largely be designed in many different ways while still being highly effective. PMID:28344365
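
    For readers unfamiliar with the machinery behind a pooled effect such as ḡ = 0.46, a generic DerSimonian-Laird random-effects pooling can be sketched as follows. This is a textbook estimator, not the authors' exact analysis (which also examines moderators):

```python
def dersimonian_laird(effects, variances):
    """Pooled random-effects estimate with DerSimonian-Laird
    between-study variance (tau^2). effects: per-study effect sizes
    (e.g. Hedges' g); variances: their sampling variances."""
    k = len(effects)
    w = [1.0 / v for v in variances]
    sw = sum(w)
    # fixed-effect pooled mean, used to form the heterogeneity statistic Q
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    Q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (Q - (k - 1)) / c)  # truncated at zero
    # re-weight each study by 1/(v_i + tau^2) and pool again
    w_re = [1.0 / (v + tau2) for v in variances]
    return sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
```

    When the studies agree perfectly, tau^2 truncates to zero and the estimate reduces to the fixed-effect mean.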

  13. Modification and evaluation of a Barnes-type objective analysis scheme for surface meteorological data

    NASA Technical Reports Server (NTRS)

    Smith, D. R.

    1982-01-01

    The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a Barnes-type scheme for the analysis of surface meteorological data. Modifications are introduced to the original version in order to increase its flexibility and to permit greater ease of usage. The code was rewritten for an interactive computer environment. Furthermore, a multiple iteration technique suggested by Barnes was implemented for greater accuracy. PROAM was subjected to a series of experiments in order to evaluate its performance under a variety of analysis conditions. The tests include use of a known analytic temperature distribution in order to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple iteration technique increases the accuracy of the analysis. Furthermore, the tests verify appropriate values for the analysis parameters in resolving meso-beta scale phenomena.
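
    The successive-correction idea behind a Barnes-type scheme, including the multiple-iteration refinement mentioned above, can be sketched as Gaussian-weighted interpolation of station values followed by residual passes at a reduced length scale. The parameter names and defaults here are illustrative, not PROAM's:

```python
import math

def barnes_analysis(obs, grid, kappa, gamma=0.3, passes=2):
    """Barnes successive-correction sketch.
    obs: list of (x, y, value) station reports; grid: list of (x, y)
    target points; kappa: Gaussian length-scale parameter. Later
    passes re-analyze the station residuals with the reduced scale
    gamma*kappa, restoring detail smoothed out by the first pass."""
    def interp(points, values, targets, k):
        out = []
        for tx, ty in targets:
            ws = [math.exp(-((tx - px) ** 2 + (ty - py) ** 2) / k)
                  for px, py in points]
            out.append(sum(w * v for w, v in zip(ws, values)) / sum(ws))
        return out

    pts = [(x, y) for x, y, _ in obs]
    vals = [v for _, _, v in obs]
    g = interp(pts, vals, grid, kappa)   # first-pass analysis on grid
    a = interp(pts, vals, pts, kappa)    # analysis at the stations
    k = kappa
    for _ in range(passes - 1):
        k *= gamma
        resid = [v - av for v, av in zip(vals, a)]
        g = [gi + ci for gi, ci in zip(g, interp(pts, resid, grid, k))]
        a = [ai + ci for ai, ci in zip(a, interp(pts, resid, pts, k))]
    return g
```

    A quick sanity check: a spatially constant field is reproduced exactly, since every weighted average of equal values returns that value and all residuals vanish.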

  14. Initiation and Modification of Reaction by Energy Addition: Kinetic and Transport Phenomena

    DTIC Science & Technology

    1990-10-01

    The ignition-delay time ranges from about 2 to 100 ps. The results of a computer-modeling calculation of the chemical kinetics suggest that the... Contents include research objectives, analysis, experiment, references, and two appendices evaluating a simple model for laminar-flame-propagation rates (I: planar geometry; II: spherical geometry).

  15. A design study for an advanced ocean color scanner system. [spaceborne equipment

    NASA Technical Reports Server (NTRS)

    Kim, H. H.; Fraser, R. S.; Thompson, L. L.; Bahethi, O.

    1980-01-01

    Along with a colorimetric data analysis scheme, the instrumental parameters which need to be optimized in future spaceborne ocean color scanner systems are outlined. With regard to assessing atmospheric effects from ocean colorimetry, attention is given to computing size parameters of the aerosols in the atmosphere, total optical depth measurement, and the aerosol optical thickness. It is suggested that sensors based on the use of linear array technology will meet hardware objectives.

  16. Pre-incident Analysis using Multigraphs and Faceted Ontologies

    DTIC Science & Technology

    2013-08-01

    ontology for beverages, part of which is shown in the form of an entity-relationship (ER) graph in Figure 4. The entities Beer, Wine, etc. have is a...another from Beer to Grains. The terminology is suggestive: The is a type of link has already been defined (informally). The made from link...expressions derived from natural language such as Beer, is a, Grains and made from. Labels alone are insufficient for a computer system for ontology and

  17. Analysis of Special Forces Medic (18D) Attrition

    DTIC Science & Technology

    1994-08-01

    130 students per year, 18D should reach 100% strength in the second quarter of FY95. Training Issues The biggest single reason for the high attrition...including a library and 24-hour study rooms, SOMED and SOMTC should consider incorporating computer-based training into the 18D training. The PA...particularly in the sciences. Given this, the high SOMED attrition rate is to be expected. Some have suggested that a greater effort should be made to recruit

  18. An evaluation of superminicomputers for thermal analysis

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Vidal, J. B.; Jones, G. K.

    1962-01-01

    The feasibility and cost effectiveness of solving thermal analysis problems on superminicomputers is demonstrated. Conventional thermal analysis and the changing computer environment, computer hardware and software used, six thermal analysis test problems, performance of superminicomputers (CPU time, accuracy, turnaround, and cost) and comparison with large computers are considered. Although the CPU times for superminicomputers were 15 to 30 times greater than the fastest mainframe computer, the minimum cost to obtain the solutions on superminicomputers was from 11 percent to 59 percent of the cost of mainframe solutions. The turnaround (elapsed) time is highly dependent on the computer load, but for large problems, superminicomputers produced results in less elapsed time than a typically loaded mainframe computer.

  19. As above, so below? Towards understanding inverse models in BCI

    NASA Astrophysics Data System (ADS)

    Lindgren, Jussi T.

    2018-02-01

    Objective. In brain-computer interfaces (BCI), measurements of the user’s brain activity are classified into commands for the computer. With EEG-based BCIs, the origins of the classified phenomena are often considered to be spatially localized in the cortical volume and mixed in the EEG. We investigate if more accurate BCIs can be obtained by reconstructing the source activities in the volume. Approach. We contrast the physiology-driven source reconstruction with data-driven representations obtained by statistical machine learning. We explain these approaches in a common linear dictionary framework and review the different ways to obtain the dictionary parameters. We consider the effect of source reconstruction on some major difficulties in BCI classification, namely information loss, feature selection and nonstationarity of the EEG. Main results. Our analysis suggests that the approaches differ mainly in their parameter estimation. Physiological source reconstruction may thus be expected to improve BCI accuracy if machine learning is not used or where it produces less optimal parameters. We argue that the considered difficulties of surface EEG classification can remain in the reconstructed volume and that data-driven techniques are still necessary. Finally, we provide some suggestions for comparing approaches. Significance. The present work illustrates the relationships between source reconstruction and machine learning-based approaches for EEG data representation. The provided analysis and discussion should help in understanding, applying, comparing and improving such techniques in the future.
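The common linear dictionary framework discussed above models surface EEG as a linear mixture of source activities. A minimal sketch, with a random mixing dictionary standing in for either a physiological head model or a learned one (all sizes and names here are illustrative):

```python
import numpy as np

# Linear dictionary view of EEG: surface channels x are a mixture
# D @ s of source activities s. D here is a random stand-in; a
# physiological approach would derive D from a head model, while
# machine learning would estimate it from data (the contrast drawn
# in the abstract above).
rng = np.random.default_rng(0)
D = rng.standard_normal((8, 4))     # 8 channels, 4 sources (toy sizes)
s_true = rng.standard_normal(4)     # source activities
x = D @ s_true                      # noiseless surface observation
s_hat = np.linalg.pinv(D) @ x       # least-squares source estimate
```

With more channels than sources and no noise, the least-squares estimate recovers the sources exactly; noise, rank deficiency, and nonstationarity are where the parameter-estimation choices contrasted above start to matter.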

  20. Computational insights of K1444N substitution in GAP-related domain of NF1 gene associated with neurofibromatosis type 1 disease: a molecular modeling and dynamics approach.

    PubMed

    Agrahari, Ashish Kumar; Muskan, Meghana; George Priya Doss, C; Siva, R; Zayed, Hatem

    2018-05-27

    The NF1 gene encodes the neurofibromin protein, which is ubiquitously expressed, but most highly in the central nervous system. Non-synonymous SNPs (nsSNPs) in the NF1 gene were found to be associated with neurofibromatosis type 1 disease, which is characterized by the growth of tumors along nerves in the skin, brain, and other parts of the body. In this study, we used several in silico prediction tools to analyze 16 nsSNPs in the RAS-GAP domain of neurofibromin; the K1444N (K1423N) mutation was predicted to be the most pathogenic. A comparative molecular dynamics simulation (MDS; 50 ns) between the wild type and the K1444N (K1423N) mutant suggested a significant change in the electrostatic potential. In addition, the RMSD, RMSF, Rg, hydrogen bond, and PCA analyses confirmed the loss of flexibility and increase in compactness of the mutant protein. Further, SASA analysis revealed an exchange between hydrophobic and hydrophilic residues from the core of the RAS-GAP domain to the surface of the mutant domain, consistent with the secondary structure analysis that showed significant alteration in the mutant protein conformation. Our data indicate that the K1444N (K1423N) mutation increases the rigidity and compactness of the protein. This study provides evidence of the benefits of computational tools in predicting the pathogenicity of genetic mutations and suggests the application of MDS and different in silico prediction tools for variant assessment and classification in genetic clinics.

  1. Computed Tomographic Analysis of Ventral Atlantoaxial Optimal Safe Implantation Corridors in 27 Dogs.

    PubMed

    Leblond, Guillaume; Gaitero, Luis; Moens, Noel M M; Zur Linden, Alex; James, Fiona M K; Monteith, Gabrielle J; Runciman, John

    2017-11-01

    Objectives  Ventral atlantoaxial stabilization techniques are challenging surgical procedures in dogs. Available surgical guidelines are based upon subjective anatomical landmarks, and limited radiographic and computed tomographic data. The aims of this study were (1) to provide detailed anatomical descriptions of atlantoaxial optimal safe implantation corridors to generate objective recommendations for optimal implant placements and (2) to compare anatomical data obtained in non-affected Toy breed dogs, affected Toy breed dogs suffering from atlantoaxial instability and non-affected Beagle dogs. Methods  Anatomical data were collected from a prospectively recruited population of 27 dogs using a previously validated method of optimal safe implantation corridor analysis using computed tomographic images. Results  Optimal implant positions and three-dimensional numerical data were generated successfully in all cases. Anatomical landmarks could be used to generate objective definitions of optimal insertion points which were applicable across all three groups. Overall the geometrical distribution of all implant sites was similar in all three groups with a few exceptions. Clinical Significance  This study provides extensive anatomical data available to facilitate surgical planning of implant placement for atlantoaxial stabilization. Our data suggest that non-affected Toy breed dogs and non-affected Beagle dogs constitute reasonable research models to study atlantoaxial stabilization constructs. Schattauer GmbH Stuttgart.

  2. Striatal dopamine in Parkinson disease: A meta-analysis of imaging studies.

    PubMed

    Kaasinen, Valtteri; Vahlberg, Tero

    2017-12-01

    A meta-analysis of 142 positron emission tomography and single photon emission computed tomography studies that have investigated striatal presynaptic dopamine function in Parkinson disease (PD) was performed. Subregional estimates of striatal dopamine metabolism are presented. The aromatic L-amino-acid decarboxylase (AADC) defect appears to be consistently smaller than the dopamine transporter and vesicular monoamine transporter 2 defects, suggesting upregulation of AADC function in PD. The correlation between disease severity and dopamine loss appears linear, but the majority of longitudinal studies point to a negative exponential progression pattern of dopamine loss in PD. Ann Neurol 2017;82:873-882. © 2017 American Neurological Association.

  3. Probabilistic Analysis of a SiC/SiC Ceramic Matrix Composite Turbine Vane

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Nemeth, Noel N.; Brewer, David N.; Mital, Subodh

    2004-01-01

    To demonstrate the advanced composite materials technology under development within the Ultra-Efficient Engine Technology (UEET) Program, it was planned to fabricate, test, and analyze a turbine vane made entirely of silicon carbide-fiber-reinforced silicon carbide matrix composite (SiC/SiC CMC) material. The objective was to utilize a five-harness satin weave melt-infiltrated (MI) SiC/SiC composite material developed under this program to design and fabricate a stator vane that can endure 1000 hours of engine service conditions. The vane was designed such that the expected maximum stresses were kept within the proportional limit strength of the material. Any violation of this design requirement was considered a failure. This report presents results of a probabilistic analysis and reliability assessment of the vane. Probability of failure to meet the design requirements was computed. In the analysis, material properties, strength, and pressure loading were considered as random variables. The pressure loads were considered normally distributed with a nominal variation. A temperature profile on the vane was obtained by performing a computational fluid dynamics (CFD) analysis and was assumed to be deterministic. The results suggest that for the current vane design, the chance of not meeting design requirements is about 1.6 percent.
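At its core, the probability-of-failure computation described above compares random stress against random strength. A hedged Monte Carlo sketch of that idea (all distributions and numbers below are invented for illustration and are not the vane's actual data):

```python
import numpy as np

# Monte Carlo estimate of the probability that peak stress exceeds the
# proportional-limit strength, the kind of reliability number quoted
# above. All distributions and values are invented for illustration;
# they are not the vane's actual data.
rng = np.random.default_rng(1)
n = 100_000
strength = rng.normal(100.0, 5.0, n)   # hypothetical strength samples
stress = rng.normal(85.0, 6.0, n)      # hypothetical peak-stress samples
p_fail = float(np.mean(stress > strength))
print(f"estimated probability of failure: {p_fail:.3%}")
```

With these toy normals the analytic failure probability is about 2.7 percent, so the sampled estimate should land nearby; a real analysis would propagate the full set of material, strength, and load variables through the structural model.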

  4. Computer Instrumentation and the New Tools of Science.

    ERIC Educational Resources Information Center

    Snyder, H. David

    1990-01-01

    The impact and uses of new technologies in science teaching are discussed. Included are computers, software, sensors, integrated circuits, computer signal access, and computer interfaces. Uses and advantages of these new technologies are suggested. (CW)

  5. Modern Computational Techniques for the HMMER Sequence Analysis

    PubMed Central

    2013-01-01

    This paper focuses on the latest research and critical reviews on modern computing architectures, software and hardware accelerated algorithms for bioinformatics data analysis with an emphasis on one of the most important sequence analysis applications—hidden Markov models (HMM). We show the detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics society. The characteristics of the sequence analysis, such as data and compute-intensive natures, make it very attractive to optimize and parallelize by using both traditional software approach and innovated hardware acceleration technologies. PMID:25937944
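As context for the HMM sequence analysis surveyed above, the core decoding step in profile-HMM tools is Viterbi dynamic programming. A generic log-space sketch (a textbook routine, not HMMER's optimized implementation):

```python
import numpy as np

def viterbi(obs, start, trans, emit):
    """Log-space Viterbi decoding for a discrete HMM: the dynamic
    program at the core of profile-HMM sequence search. A generic
    textbook routine, not HMMER's optimized implementation."""
    logd = np.log(start) + np.log(emit[:, obs[0]])   # initial scores
    back = []
    for o in obs[1:]:
        scores = logd[:, None] + np.log(trans)       # [prev, next]
        back.append(scores.argmax(axis=0))           # best predecessor
        logd = scores.max(axis=0) + np.log(emit[:, o])
    path = [int(logd.argmax())]
    for bp in reversed(back):                        # backtrack
        path.append(int(bp[path[-1]]))
    return path[::-1]

# Two sticky states with near-faithful emissions decode to the
# observation sequence itself:
trans = np.array([[0.9, 0.1], [0.1, 0.9]])
emit = np.array([[0.9, 0.1], [0.1, 0.9]])
print(viterbi([0, 0, 1, 1], np.array([0.5, 0.5]), trans, emit))
```

The data-parallel structure of the per-step score update (a small matrix of additions followed by a max-reduction) is exactly what makes this kernel attractive for the hardware acceleration the paper reviews.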

  6. Summary of research in applied mathematics, numerical analysis, and computer sciences

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  7. HCN1 channels in cerebellar Purkinje cells promote late stages of learning and constrain synaptic inhibition

    PubMed Central

    Rinaldi, Arianna; Defterali, Cagla; Mialot, Antoine; Garden, Derek L F; Beraneck, Mathieu; Nolan, Matthew F

    2013-01-01

    Neural computations rely on ion channels that modify neuronal responses to synaptic inputs. While single cell recordings suggest diverse and neurone type-specific computational functions for HCN1 channels, their behavioural roles in any single neurone type are not clear. Using a battery of behavioural assays, including analysis of motor learning in vestibulo-ocular reflex and rotarod tests, we find that deletion of HCN1 channels from cerebellar Purkinje cells selectively impairs late stages of motor learning. Because deletion of HCN1 modifies only a subset of behaviours involving Purkinje cells, we asked whether the channel also has functional specificity at a cellular level. We find that HCN1 channels in cerebellar Purkinje cells reduce the duration of inhibitory synaptic responses but, in the absence of membrane hyperpolarization, do not affect responses to excitatory inputs. Our results indicate that manipulation of subthreshold computation in a single neurone type causes specific modifications to behaviour. PMID:24000178

  8. Unraveling the electrolyte properties of Na3SbS4 through computation and experiment

    NASA Astrophysics Data System (ADS)

    Rush, Larry E.; Hood, Zachary D.; Holzwarth, N. A. W.

    2017-12-01

    Solid-state sodium electrolytes are expected to improve next-generation batteries on the basis of favorable energy density and reduced cost. Na3SbS4 represents a new solid-state ion conductor with high ionic conductivities in the mS/cm range. Here, we explore the tetragonal phase of Na3SbS4 and its interface with a metallic sodium anode using a combination of experiments and first-principles calculations. The computed Na-ion vacancy migration energies of 0.1 eV are smaller than the value inferred from experiment, suggesting that grain boundaries or other factors dominate the experimental systems. Analysis of symmetric cells of the electrolyte (Na/Na3SbS4/Na) shows that a conductive solid electrolyte interphase forms. Computer simulations infer that the interface is likely to be related to Na3SbS3, involving the conversion of the tetrahedral SbS4^3- ions of the bulk electrolyte into trigonal pyramidal SbS3^3- ions at the interface.

  9. A study of kinematic cues and anticipatory performance in tennis using computational manipulation and computer graphics.

    PubMed

    Ida, Hirofumi; Fukuhara, Kazunobu; Kusubori, Seiji; Ishii, Motonobu

    2011-09-01

    Computer graphics of digital human models can be used to display human motions as visual stimuli. This study presents our technique for manipulating human motion with a forward kinematics calculation without violating anatomical constraints. A motion modulation of the upper extremity was conducted by proportionally modulating the anatomical joint angular velocity calculated by motion analysis. The effect of this manipulation was examined in a tennis situation--that is, the receiver's performance of predicting ball direction when viewing a digital model of the server's motion derived by modulating the angular velocities of the forearm or that of the elbow during the forward swing. The results showed that the faster the server's forearm pronated, the more the receiver's anticipation of the ball direction tended to the left side of the serve box. In contrast, the faster the server's elbow extended, the more the receiver's anticipation of the ball direction tended to the right. This suggests that tennis players are sensitive to the motion modulation of their opponent's racket-arm.
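The stimulus manipulation above rests on a forward-kinematics calculation: joint angles (here, obtained by rescaling joint angular velocities before integrating them) are propagated down the limb chain to segment positions. A minimal planar sketch (a generic two-link chain, not the study's full arm model):

```python
import numpy as np

def planar_fk(lengths, angles):
    """Joint positions of a planar kinematic chain via forward
    kinematics: each segment is rotated by the cumulative joint angle.
    A generic two-link sketch, not the study's model; scaling a
    joint's angular velocity before integrating it to an angle speeds
    up that joint's motion without breaking the chain."""
    positions = [np.zeros(2)]
    theta = 0.0
    for length, angle in zip(lengths, angles):
        theta += angle                    # cumulative rotation
        positions.append(positions[-1] +
                         length * np.array([np.cos(theta), np.sin(theta)]))
    return np.array(positions)
```

Because each joint only contributes a rotation relative to its parent segment, modulating one joint (the forearm or the elbow, as in the experiment) leaves the rest of the chain anatomically consistent.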

  10. A robust method of computing finite difference coefficients based on Vandermonde matrix

    NASA Astrophysics Data System (ADS)

    Zhang, Yijie; Gao, Jinghuai; Peng, Jigen; Han, Weimin

    2018-05-01

    When the finite difference (FD) method is employed to simulate wave propagation, a high-order FD method is preferred in order to achieve better accuracy. However, if the order of the FD scheme is high enough, the coefficient matrix of the formula for calculating finite difference coefficients is close to singular. In this case, when the FD coefficients are computed by the matrix inverse operator of MATLAB, inaccuracy can be produced. In order to overcome this problem, we suggest an algorithm based on the Vandermonde matrix. After a specified mathematical transformation, the coefficient matrix is transformed into a Vandermonde matrix. The FD coefficients of the high-order FD method can then be computed by a dedicated Vandermonde-matrix algorithm, which avoids inverting the near-singular matrix. The dispersion analysis and numerical results for a homogeneous elastic model and a geophysical model of an oil and gas reservoir demonstrate that the algorithm based on the Vandermonde matrix has better accuracy than the matrix inverse operator of MATLAB.
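The textbook route to FD coefficients is already a Vandermonde system. The sketch below sets it up and solves it directly (illustrative only: a generic solve is fine at modest orders, whereas the paper's point is that very high orders call for a specialized Vandermonde algorithm rather than a generic inverse):

```python
import numpy as np
from math import factorial

def fd_coefficients(offsets, order):
    """Finite-difference weights for the derivative of a given order
    on the given stencil offsets, obtained by solving the Vandermonde
    system sum_j c_j * x_j**i = order! * delta(i, order). Illustrative
    textbook setup, not the paper's dedicated algorithm."""
    x = np.asarray(offsets, float)
    n = len(x)
    V = np.vander(x, n, increasing=True).T   # V[i, j] = x_j ** i
    b = np.zeros(n)
    b[order] = factorial(order)              # match the Taylor moment
    return np.linalg.solve(V, b)
```

On the 3-point stencil {-1, 0, 1} this reproduces the familiar second-derivative weights 1, -2, 1; as the stencil grows, the system's conditioning deteriorates, which is exactly the regime the paper targets.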

  11. The effectiveness of computer-managed instruction versus traditional classroom lecture on achievement outcomes.

    PubMed

    Schmidt, S M; Arndt, M J; Gaston, S; Miller, B J

    1991-01-01

    This controlled experimental study examines the effect of two teaching methods on achievement outcomes from a 15-week, 2 credit hour semester course taught at two midwestern universities. Students were randomly assigned to either computer-managed instruction in which faculty function as tutors or the traditional classroom course of study. In addition, the effects of age, grade point average, attitudes toward computers, and satisfaction with the course on teaching method were analyzed using analysis of covariance. Younger students achieved better scores than did older students. Regardless of teaching method, however, neither method appeared to be better than the other for teaching course content. Students did not prefer one method over the other as indicated by their satisfaction scores. With demands upon university faculty to conduct research and publish, alternative methods of teaching that free faculty from the classroom should be considered. This study suggests that educators can select such an alternative teaching method to traditional classroom teaching without sacrificing quality education for certain courses.

  12. Energy and time determine scaling in biological and computer designs

    PubMed Central

    Bezerra, George; Edwards, Benjamin; Brown, James; Forrest, Stephanie

    2016-01-01

    Metabolic rate in animals and power consumption in computers are analogous quantities that scale similarly with size. We analyse vascular systems of mammals and on-chip networks of microprocessors, where natural selection and human engineering, respectively, have produced systems that minimize both energy dissipation and delivery times. Using a simple network model that simultaneously minimizes energy and time, our analysis explains empirically observed trends in the scaling of metabolic rate in mammals and power consumption and performance in microprocessors across several orders of magnitude in size. Just as the evolutionary transitions from unicellular to multicellular animals in biology are associated with shifts in metabolic scaling, our model suggests that the scaling of power and performance will change as computer designs transition to decentralized multi-core and distributed cyber-physical systems. More generally, a single energy–time minimization principle may govern the design of many complex systems that process energy, materials and information. This article is part of the themed issue ‘The major synthetic evolutionary transitions’. PMID:27431524

  13. Energy and time determine scaling in biological and computer designs.

    PubMed

    Moses, Melanie; Bezerra, George; Edwards, Benjamin; Brown, James; Forrest, Stephanie

    2016-08-19

    Metabolic rate in animals and power consumption in computers are analogous quantities that scale similarly with size. We analyse vascular systems of mammals and on-chip networks of microprocessors, where natural selection and human engineering, respectively, have produced systems that minimize both energy dissipation and delivery times. Using a simple network model that simultaneously minimizes energy and time, our analysis explains empirically observed trends in the scaling of metabolic rate in mammals and power consumption and performance in microprocessors across several orders of magnitude in size. Just as the evolutionary transitions from unicellular to multicellular animals in biology are associated with shifts in metabolic scaling, our model suggests that the scaling of power and performance will change as computer designs transition to decentralized multi-core and distributed cyber-physical systems. More generally, a single energy-time minimization principle may govern the design of many complex systems that process energy, materials and information. This article is part of the themed issue 'The major synthetic evolutionary transitions'. © 2016 The Author(s).

  14. Noise facilitation in associative memories of exponential capacity.

    PubMed

    Karbasi, Amin; Salavati, Amir Hesam; Shokrollahi, Amin; Varshney, Lav R

    2014-11-01

    Recent advances in associative memory design through structured pattern sets and graph-based inference algorithms have allowed reliable learning and recall of an exponential number of patterns that satisfy certain subspace constraints. Although these designs correct external errors in recall, they assume neurons that compute noiselessly, in contrast to the highly variable neurons in brain regions thought to operate associatively, such as hippocampus and olfactory cortex. Here we consider associative memories with boundedly noisy internal computations and analytically characterize performance. As long as the internal noise level is below a specified threshold, the error probability in the recall phase can be made exceedingly small. More surprising, we show that internal noise improves the performance of the recall phase while the pattern retrieval capacity remains intact: the number of stored patterns does not reduce with noise (up to a threshold). Computational experiments lend additional support to our theoretical analysis. This work suggests a functional benefit to noisy neurons in biological neuronal networks.

  15. Ad Hoc modeling, expert problem solving, and R&T program evaluation

    NASA Technical Reports Server (NTRS)

    Silverman, B. G.; Liebowitz, J.; Moustakis, V. S.

    1983-01-01

    A simplified cost and time (SCAT) analysis program utilizing personal-computer technology is presented and demonstrated in the case of the NASA-Goddard end-to-end data system. The difficulties encountered in implementing complex program-selection and evaluation models in the research and technology field are outlined. The prototype SCAT system described here is designed to allow user-friendly ad hoc modeling in real time and at low cost. A worksheet constructed on the computer screen displays the critical parameters and shows how each is affected when one is altered experimentally. In the NASA case, satellite data-output and control requirements, ground-facility data-handling capabilities, and project priorities are intricately interrelated. Scenario studies of the effects of spacecraft phaseout or new spacecraft on throughput and delay parameters are shown. The use of a network of personal computers for higher-level coordination of decision-making processes is suggested, as a complement or alternative to complex large-scale modeling.

  16. In vitro protease cleavage and computer simulations reveal the HIV-1 capsid maturation pathway

    NASA Astrophysics Data System (ADS)

    Ning, Jiying; Erdemci-Tandogan, Gonca; Yufenyuy, Ernest L.; Wagner, Jef; Himes, Benjamin A.; Zhao, Gongpu; Aiken, Christopher; Zandi, Roya; Zhang, Peijun

    2016-12-01

    HIV-1 virions assemble as immature particles containing Gag polyproteins that are processed by the viral protease into individual components, resulting in the formation of mature infectious particles. There are two competing models for the process of forming the mature HIV-1 core: the disassembly and de novo reassembly model and the non-diffusional displacive model. To study the maturation pathway, we simulate HIV-1 maturation in vitro by digesting immature particles and assembled virus-like particles with recombinant HIV-1 protease and monitor the process with biochemical assays and cryoEM structural analysis in parallel. Processing of Gag in vitro is accurate and efficient and results in both soluble capsid protein and conical or tubular capsid assemblies, seemingly converted from immature Gag particles. Computer simulations further reveal probable assembly pathways of HIV-1 capsid formation. Combining the experimental data and computer simulations, our results suggest a sequential combination of both displacive and disassembly/reassembly processes for HIV-1 maturation.

  17. Artificial Intelligence in Medical Practice: The Question to the Answer?

    PubMed

    Miller, D Douglas; Brown, Eric W

    2018-02-01

    Computer science advances and ultra-fast computing speeds find artificial intelligence (AI) broadly benefitting modern society-forecasting weather, recognizing faces, detecting fraud, and deciphering genomics. AI's future role in medical practice remains an unanswered question. Machines (computers) learn to detect patterns not decipherable using biostatistics by processing massive datasets (big data) through layered mathematical models (algorithms). Correcting algorithm mistakes (training) adds to AI predictive model confidence. AI is being successfully applied for image analysis in radiology, pathology, and dermatology, with diagnostic speed exceeding, and accuracy paralleling, medical experts. While diagnostic confidence never reaches 100%, combining machines plus physicians reliably enhances system performance. Cognitive programs are impacting medical practice by applying natural language processing to read the rapidly expanding scientific literature and collate years of diverse electronic medical records. In this and other ways, AI may optimize the care trajectory of chronic disease patients, suggest precision therapies for complex illnesses, reduce medical errors, and improve subject enrollment into clinical trials. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Computational Model of Population Dynamics Based on the Cell Cycle and Local Interactions

    NASA Astrophysics Data System (ADS)

    Oprisan, Sorinel Adrian; Oprisan, Ana

    2005-03-01

    Our study bridges cellular (mesoscopic) level interactions and global population (macroscopic) dynamics of carcinoma. The morphological differences and transitions between well-defined, smooth benign tumors and tentacular malignant tumors suggest a theoretical analysis of tumor invasion based on the development of mathematical models exhibiting bifurcations of spatial patterns in the density of tumor cells. Our computational model views the most representative and clinically relevant features of oncogenesis as a fight between two distinct sub-systems: the immune system of the host and the neoplastic system. We implemented the neoplastic sub-system using a three-stage cell cycle: active, dormant, and necrotic. The second sub-system consists of cytotoxic active (effector) cells (EC), with a very broad phenotype ranging from NK cells to CTL cells, macrophages, etc. Based on extensive numerical simulations, we correlated the fractal dimensions for carcinoma, which could be obtained from tumor imaging, with the malignant stage. Our computational model was also able to simulate the effects of surgical, chemotherapeutic, and radiotherapeutic treatments.
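The three-stage neoplastic sub-system can be caricatured as a compartment update (the rates and functional form below are invented for illustration; the paper's model is spatial and also includes the immune, effector-cell sub-system):

```python
def step(state, p_div, p_dorm, p_wake, p_necr):
    """One synchronous update of a three-compartment caricature of the
    neoplastic sub-system (active, dormant, necrotic). All transition
    rates are invented for illustration; the paper's model is spatial
    and also includes the immune (effector-cell) sub-system."""
    active, dormant, necrotic = state
    new_active = active * (1 + p_div - p_dorm) + dormant * p_wake
    new_dormant = dormant * (1 - p_wake - p_necr) + active * p_dorm
    new_necrotic = necrotic + dormant * p_necr
    return new_active, new_dormant, new_necrotic
```

Active cells divide or go dormant, dormant cells wake or die, and necrotic mass only accumulates; iterating such an update over a spatial grid with local interactions is what produces the pattern bifurcations and fractal boundaries the study measures.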

  19. Stream network analysis from orbital and suborbital imagery, Colorado River Basin, Texas

    NASA Technical Reports Server (NTRS)

    Baker, V. R. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results. Orbital SL-2 imagery (earth terrain camera S-190B), received September 5, 1973, was subjected to quantitative network analysis and compared to 7.5 minute topographic mapping (scale: 1/24,000) and U.S.D.A. conventional black and white aerial photography (scale: 1/22,200). Results can only be considered suggestive because detail on the SL-2 imagery was badly obscured by heavy cloud cover. The upper Bee Creek basin was chosen for analysis because it appeared in a relatively cloud-free portion of the orbital imagery. Drainage maps were drawn from the three sources digitized into a computer-compatible format, and analyzed by the WATER system computer program. Even at its small scale (1/172,000) and with bad haze the orbital photo showed much drainage detail. The contour-like character of the Glen Rose Formation's resistant limestone units allowed channel definition. The errors in pattern recognition can be attributed to local areas of dense vegetation and to other areas of very high albedo caused by surficial exposure of caliche. The latter effect caused particular difficulty in the determination of drainage divides.

  20. [Development and application of a web-based expert system using artificial intelligence for management of mental health by Korean emigrants].

    PubMed

    Bae, Jeongyee

    2013-04-01

    The purpose of this project was to develop an international web-based expert system using principles of artificial intelligence and user-centered design for management of mental health by Korean emigrants. Using this system, anyone can access the system via the web from a personal computer. Our design process utilized principles of user-centered design with 4 phases: needs assessment, analysis, design/development/testing, and application release. A survey was done with 3,235 Korean emigrants. Focus group interviews were also conducted. Survey and analysis results guided the design of the web-based expert system. With this system, anyone can check their mental health status by themselves using a personal computer. The system analyzes facts based on answers to automated questions, and suggests solutions accordingly. A history tracking mechanism enables monitoring and future analysis. In addition, this system will include intervention programs to promote mental health status. This system is interactive and accessible to anyone in the world. It is expected that this management system will contribute to Korean emigrants' mental health promotion and allow researchers and professionals to share information on mental health.

  1. The Effectiveness of Computer-Assisted Instruction to Teach Physical Examination to Students and Trainees in the Health Sciences Professions: A Systematic Review and Meta-Analysis

    PubMed Central

    Tomesko, Jennifer; Touger-Decker, Riva; Dreker, Margaret; Zelig, Rena; Parrott, James Scott

    2017-01-01

    Purpose: To explore knowledge and skill acquisition outcomes related to learning physical examination (PE) through computer-assisted instruction (CAI) compared with a face-to-face (F2F) approach. Method: A systematic review and meta-analysis of literature published between January 2001 and December 2016 was conducted. Databases searched included Medline, Cochrane, CINAHL, ERIC, Ebsco, Scopus, and Web of Science. Studies were synthesized by study design, intervention, and outcomes. Statistical analyses included the DerSimonian-Laird random-effects model. Results: In total, 7 studies were included in the review, and 5 in the meta-analysis. There were no statistically significant differences for knowledge (mean difference [MD] = 5.39, 95% confidence interval [CI]: −2.05 to 12.84) or skill acquisition (MD = 0.35, 95% CI: −5.30 to 6.01). Conclusions: The evidence does not suggest a strong consistent preference for either CAI or F2F instruction to teach students/trainees PE. Further research is needed to identify conditions which examine knowledge and skill acquisition outcomes that favor one mode of instruction over the other. PMID:29349338
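The pooling step named above, the DerSimonian-Laird random-effects model, can be sketched as follows (a generic implementation of the estimator, not the authors' analysis code):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling of study effect sizes:
    estimate between-study variance tau^2 from Cochran's Q, then
    inverse-variance weight with it. A generic implementation of the
    estimator named above, not the authors' analysis code."""
    y = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                            # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)     # Cochran's Q
    df = len(y) - 1
    C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / C)          # between-study variance
    w_re = 1.0 / (v + tau2)                # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2
```

When studies are perfectly homogeneous, Q falls below its degrees of freedom, tau^2 is truncated to zero, and the estimator collapses to the fixed-effect mean; heterogeneity widens the pooled confidence interval, as in the nonsignificant MDs reported above.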

  2. Flutter Analysis for Turbomachinery Using Volterra Series

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing; Yao, Weigang

    2014-01-01

    The objective of this paper is to describe an accurate and efficient reduced order modeling method for aeroelastic (AE) analysis and for determining the flutter boundary. Without losing accuracy, we develop a reduced order model based on the Volterra series to achieve significant savings in computational cost. The aerodynamic force is provided by a high-fidelity solution from the Reynolds-averaged Navier-Stokes (RANS) equations; the structural mode shapes are determined from the finite element analysis. The fluid-structure coupling is then modeled by the state-space formulation with the structural displacement as input and the aerodynamic force as output, which in turn acts as an external force to the aeroelastic displacement equation for providing the structural deformation. NASA's rotor 67 blade is used to study its aeroelastic characteristics under the designated operating condition. First, the CFD results are validated against measured data available for the steady-state condition. Then, the accuracy of the developed reduced order model is compared with the full-order solutions. Finally, the aeroelastic solutions of the blade are computed and a flutter boundary is identified, suggesting that the rotor, with the material property chosen for the study, is structurally stable at the operating condition and free of flutter.
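
    The reduced order model described above rests on the Volterra series, which represents a nonlinear system's response as a hierarchy of convolution kernels. As a toy illustration of the idea only, not the authors' RANS-based implementation, a discrete-time Volterra series truncated at second order can be evaluated as:

```python
def volterra_response(u, h1, h2):
    """Output of a discrete-time Volterra series truncated at second order:
    y[n] = sum_k h1[k]*u[n-k] + sum_{k1,k2} h2[k1][k2]*u[n-k1]*u[n-k2]."""
    m1, m2 = len(h1), len(h2)
    y = []
    for n in range(len(u)):
        linear = sum(h1[k] * u[n - k] for k in range(m1) if n - k >= 0)
        quadratic = sum(h2[k1][k2] * u[n - k1] * u[n - k2]
                        for k1 in range(m2) for k2 in range(m2)
                        if n - k1 >= 0 and n - k2 >= 0)
        y.append(linear + quadratic)
    return y
```

    Here `h1` is the linear (first-order) kernel and `h2` the second-order kernel; in the paper's setting the kernels would be identified from high-fidelity CFD responses rather than specified by hand.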

  3. Automated Computerized Analysis of Speech in Psychiatric Disorders

    PubMed Central

    Cohen, Alex S.; Elvevåg, Brita

    2014-01-01

    Purpose of Review: Disturbances in communication are a hallmark of severe mental illness (SMI). Recent technological advances have paved the way for objectifying communication using automated computerized linguistic and acoustic analysis. We review recent studies applying various computer-based assessments to the natural language produced by adult patients with severe mental illness. Recent Findings: Automated computerized methods afford tools with which patients can be evaluated objectively in a reliable, valid, and efficient manner that complements human ratings. Crucially, these measures correlate with important clinical measures. The clinical relevance of these novel metrics has been demonstrated by their relationship to functional outcome measures, their in vivo link to classic ‘language’ regions in the brain, and, in the case of linguistic analysis, their relationship to candidate genes for severe mental illness. Summary: Computer-based assessments of natural language afford a framework with which to measure communication disturbances in adults with SMI. Emerging evidence suggests that they can be reliable and valid, and can overcome many practical limitations of more traditional assessment methods. The advancement of these technologies offers unprecedented potential for measuring and understanding some of the most crippling symptoms of some of the most debilitating illnesses known to humankind. PMID:24613984

  4. HITCal: a software tool for analysis of video head impulse test responses.

    PubMed

    Rey-Martinez, Jorge; Batuecas-Caletrio, Angel; Matiño, Eusebi; Perez Fernandez, Nicolás

    2015-09-01

    The developed software (HITCal) may be a useful tool for the analysis and measurement of saccadic video head impulse test (vHIT) responses, and based on experience gained during its use the authors suggest that HITCal is an excellent method for enhanced exploration of vHIT outputs. The objective was to develop a software method to analyze and explore vHIT responses, mainly saccades. HITCal was written using a computational development program; a function to access a vHIT file was programmed; extended head-impulse exploration and measurement tools were created; and an automated saccade analysis was developed using an experimental algorithm. For pre-release laboratory tests of HITCal, a database of head impulse tests (HITs) was created from data collected retrospectively in three reference centers. This HIT database was evaluated by human observers and was also processed with HITCal. The authors successfully built HITCal and have released it as open-source software; the software was fully operative, and all the proposed features were incorporated in the released version. The automated saccade algorithm implemented in HITCal shows good concordance with assessment by human observers (Cohen's kappa coefficient = 0.7).
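
    The concordance figure quoted above is Cohen's kappa, which corrects raw rater agreement for the agreement expected by chance. A minimal sketch of the statistic (the function name is illustrative, not from HITCal):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # agreement expected by chance, from each rater's label frequencies
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / n**2
    return (observed - expected) / (1.0 - expected)
```

    A kappa of 1 indicates perfect agreement, 0 agreement no better than chance; values around 0.7, as reported here, are conventionally read as substantial agreement.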

  5. Detailed Analysis of the Binding Mode of Vanilloids to Transient Receptor Potential Vanilloid Type I (TRPV1) by a Mutational and Computational Study

    PubMed Central

    Mori, Yoshikazu; Ogawa, Kazuo; Warabi, Eiji; Yamamoto, Masahiro; Hirokawa, Takatsugu

    2016-01-01

    Transient receptor potential vanilloid type 1 (TRPV1) is a non-selective cation channel and a multimodal sensor protein. Since the precise structure of TRPV1 was obtained by electron cryo-microscopy, the binding mode of representative agonists such as capsaicin and resiniferatoxin (RTX) has been extensively characterized; however, detailed information on the binding mode of other vanilloids remains lacking. In this study, mutational analysis of human TRPV1 was performed, and four agonists (capsaicin, RTX, [6]-shogaol and [6]-gingerol) were used to identify amino acid residues involved in ligand binding and/or modulation of proton sensitivity. The detailed binding mode of each ligand was then simulated by computational analysis. As a result, three amino acids (L518, F591 and L670) were newly identified as being involved in ligand binding and/or modulation of proton sensitivity. In addition, in silico docking simulation and a subsequent mutational study suggested that [6]-gingerol might bind to and activate TRPV1 in a unique manner. These results provide novel insights into the binding mode of various vanilloids to the channel and will be helpful in developing a TRPV1 modulator. PMID:27606946

  6. Henry Gray, plagiarist.

    PubMed

    Richardson, Ruth

    2016-03-01

    The first edition of Anatomy Descriptive and Surgical (1858) was greeted with accolades, but also provoked serious controversy concerning Henry Gray's failure to acknowledge the work of earlier anatomists. A review in the Medical Times (1859) accused Gray of intellectual theft. The journal took the unusual step of substantiating its indictment by publishing twenty parallel texts from Gray and from a pre-existing textbook, Quain's Anatomy. At the recent "Vesalius Continuum" conference in Zakynthos, Greece (2014) Professor Brion Benninger disputed the theft by announcing from the floor the results of a computer analysis of both texts, which he reported exonerated Gray by revealing no evidence of plagiarism. The analysis has not been forthcoming, however, despite requests. Here the historian of Gray's Anatomy supplements the argument set out in the Medical Times 150 years ago with data suggesting unwelcome personality traits in Henry Gray, and demonstrating the utility of others' work to his professional advancement. Fair dealing in the world of anatomy and indeed the genuineness of the lustre of medical fame are important matters, but whether quantitative evidence has anything to add to the discussion concerning Gray's probity can be assessed only if Benninger makes public his computer analysis. © 2015 Wiley Periodicals, Inc.

  7. Computer Training for Entrepreneurial Meteorologists.

    NASA Astrophysics Data System (ADS)

    Koval, Joseph P.; Young, George S.

    2001-05-01

    Computer applications of increasing diversity form a growing part of the undergraduate education of meteorologists in the early twenty-first century. The advent of the Internet economy, as well as a waning demand for traditional forecasters brought about by better numerical models and statistical forecasting techniques has greatly increased the need for operational and commercial meteorologists to acquire computer skills beyond the traditional techniques of numerical analysis and applied statistics. Specifically, students with the skills to develop data distribution products are in high demand in the private sector job market. Meeting these demands requires greater breadth, depth, and efficiency in computer instruction. The authors suggest that computer instruction for undergraduate meteorologists should include three key elements: a data distribution focus, emphasis on the techniques required to learn computer programming on an as-needed basis, and a project orientation to promote management skills and support student morale. In an exploration of this approach, the authors have reinvented the Applications of Computers to Meteorology course in the Department of Meteorology at The Pennsylvania State University to teach computer programming within the framework of an Internet product development cycle. Because the computer skills required for data distribution programming change rapidly, specific languages are valuable for only a limited time. A key goal of this course was therefore to help students learn how to retrain efficiently as technologies evolve. The crux of the course was a semester-long project during which students developed an Internet data distribution product. As project management skills are also important in the job market, the course teamed students in groups of four for this product development project. 
The successes, failures, and lessons learned from this experiment are discussed, and conclusions are drawn concerning undergraduate instructional methods for computer applications in meteorology.

  8. GATE Monte Carlo simulation in a cloud computing environment

    NASA Astrophysics Data System (ADS)

    Rowedder, Blake Austin

    The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform that provides a single code library for simulating specific medical physics applications, e.g., PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic due to its lengthy calculation time. By accessing the powerful computational resources of a cloud computing environment, GATE's runtime can be significantly reduced to clinically feasible levels without the sizable investment of a local high-performance cluster. This study investigated reliable and efficient execution of GATE MC simulations using a commercial cloud computing service. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data were initially broken up on the local computer and then uploaded to the worker nodes on the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Ultimately, increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to the published values and error margins indicated that the simulation results were not affected by cluster size and thus that the integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53-minute simulation was decreased to 3.11 minutes when run on a 20-node cluster. The ability to improve the speed of simulation suggests that fast MC simulations are viable for imaging and radiotherapy applications. As high-performance computing continues to fall in price and grow in accessibility, implementing Monte Carlo techniques with cloud computing for clinical applications will become increasingly attractive.
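
    The inverse power model mentioned above, t(n) = a·n^(−b), becomes linear after taking logarithms, log t = log a − b·log n, so its parameters can be recovered by ordinary least squares in log-log space. A sketch under that assumption (illustrative code, not the study's actual fitting procedure):

```python
import math

def fit_inverse_power(nodes, runtimes):
    """Fit t(n) = a * n**(-b) by least squares on log t = log a - b*log n."""
    xs = [math.log(n) for n in nodes]
    ys = [math.log(t) for t in runtimes]
    m = len(xs)
    xbar = sum(xs) / m
    ybar = sum(ys) / m
    # slope of the log-log regression line; b is its negation
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    a = math.exp(ybar - slope * xbar)
    return a, -slope
```

    Feeding in the measured runtimes per cluster size would yield the coefficients a (single-node runtime scale) and b (parallel scaling exponent, with b = 1 for ideal speedup).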

  9. Social network modulation of reward-related signals

    PubMed Central

    Fareri, Dominic S.; Niznikiewicz, Michael A.; Lee, Victoria K.; Delgado, Mauricio R.

    2012-01-01

    Everyday goals and experiences are often shared with others who may hold different places within our social networks. We investigated whether the experience of sharing a reward differs with respect to social network. Twenty human participants played a card guessing game for shared monetary outcomes with three partners: a computer, a confederate (out-of-network), and a friend (in-network). Participants subjectively rated the experience of sharing a reward more positively with their friend than the other partners. Neuroimaging results support participants’ subjective reports, as ventral striatal BOLD responses were more robust when sharing monetary gains with a friend, as compared to with the confederate or computer, suggesting a higher value for sharing with an in-network partner. Interestingly, ratings of social closeness co-varied with this activity, resulting in a significant partner × closeness interaction: exploratory analysis showed that only participants reporting higher levels of closeness demonstrated partner-related differences in striatal BOLD response. These results suggest that reward valuation in social contexts is sensitive to distinctions of social network, such that sharing positive experiences with in-network others may carry higher value. PMID:22745503

  10. For operation of the Computer Software Management and Information Center (COSMIC)

    NASA Technical Reports Server (NTRS)

    Carmon, J. L.

    1983-01-01

    Computer programs are summarized for large systems of normal equations, interactive digital signal processing, structural analysis of cylindrical thrust chambers, swirling turbulent axisymmetric recirculating flows in practical isothermal combustor geometries, computation of three-dimensional combustor performance, a thermal radiation analysis system, transient response analysis, and software design analysis.

  11. Consumer and provider responses to a computerized version of the Illness Management and Recovery Program.

    PubMed

    Wright-Berryman, Jennifer L; Salyers, Michelle P; O'Halloran, James P; Kemp, Aaron S; Mueser, Kim T; Diazoni, Amanda J

    2013-12-01

    To explore mental health consumer and provider responses to a computerized version of the Illness Management and Recovery (IMR) program. Semistructured interviews were conducted to gather data from 6 providers and 12 consumers who participated in a computerized prototype of the IMR program. An inductive, consensus-based approach was used to analyze the interview responses. Qualitative analysis revealed that consumers perceived various personal benefits and ease of use afforded by the new technology platform. Consumers also highly valued provider assistance and offered several suggestions to improve the program. The largest perceived barriers to future implementation were lack of computer skills and access to computers. Similarly, IMR providers commented on the program's ease and convenience and the reduction of time-intensive material preparation. Providers also expressed that the use of technology creates more options for the consumer to access treatment. The technology was acceptable, easy to use, and well liked by consumers and providers. Clinician assistance with technology was viewed as helpful for getting clients started with the program, as lack of computer skills and access to computers was a concern. Access to materials between sessions appears to be desired; however, given the perceived barriers of computer skills and computer access, additional supports may be needed for consumers to achieve the full benefits of a computerized version of IMR. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  12. Real longitudinal data analysis for real people: building a good enough mixed model.

    PubMed

    Cheng, Jing; Edwards, Lloyd J; Maldonado-Molina, Mildred M; Komro, Kelli A; Muller, Keith E

    2010-02-20

    Mixed effects models have become very popular, especially for the analysis of longitudinal data. One challenge is how to build a good enough mixed effects model. In this paper, we suggest a systematic strategy for addressing this challenge and introduce easily implemented practical advice for building mixed effects models. A general discussion of the scientific strategies motivates the recommended five-step procedure for model fitting. The need to model both the mean structure (the fixed effects) and the covariance structure (the random effects and residual error) creates the fundamental flexibility and complexity, and some very practical recommendations help to conquer that complexity. Centering, scaling, and full-rank coding of all the predictor variables radically improve the chances of convergence, computing speed, and numerical accuracy. Applying computational and assumption diagnostics from univariate linear models to mixed-model data greatly helps to detect and solve the related computational problems. The approach helps to fit more general covariance models, a crucial step in selecting a credible covariance model needed for defensible inference. A detailed demonstration of the recommended strategy is based on data from a published study of a randomized trial of a multicomponent intervention to prevent young adolescents' alcohol use. The discussion highlights a need for additional covariance and inference tools for mixed models, as well as the need to improve how scientists and statisticians teach and review the process of finding a good enough mixed model. (c) 2009 John Wiley & Sons, Ltd.
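
    The centering and scaling step recommended above is straightforward to implement; a minimal sketch (illustrative function name) that standardizes each predictor column to mean 0 and unit sample standard deviation:

```python
def center_and_scale(columns):
    """Standardize each predictor column: subtract its mean, then divide
    by its sample standard deviation -- the preprocessing recommended to
    improve mixed-model convergence and numerical accuracy."""
    out = []
    for col in columns:
        n = len(col)
        mean = sum(col) / n
        sd = (sum((x - mean) ** 2 for x in col) / (n - 1)) ** 0.5
        out.append([(x - mean) / sd for x in col])
    return out
```

    Standardized predictors keep the design matrix well conditioned, which is one reason the paper reports better convergence and computing speed after this step.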

  13. Dynamical analysis of Parkinsonian state emulated by hybrid Izhikevich neuron models

    NASA Astrophysics Data System (ADS)

    Liu, Chen; Wang, Jiang; Yu, Haitao; Deng, Bin; Wei, Xile; Li, Huiyan; Loparo, Kenneth A.; Fietkiewicz, Chris

    2015-11-01

    Computational models play a significant role in exploring novel theories to complement the findings of physiological experiments. Various computational models have been developed to reveal the mechanisms underlying brain functions. Particularly, in the development of therapies to modulate behavioral and pathological abnormalities, computational models provide the basic foundations to exhibit transitions between physiological and pathological conditions. Considering the significant roles of the intrinsic properties of the globus pallidus and the coupling connections between neurons in determining the firing patterns and the dynamical activities of the basal ganglia neuronal network, we propose a hypothesis that pathological behaviors under the Parkinsonian state may originate from combined effects of intrinsic properties of globus pallidus neurons and synaptic conductances in the whole neuronal network. In order to establish a computationally efficient network model, the hybrid Izhikevich neuron model is used for its capacity to capture the dynamical characteristics of biological neuronal activity. Detailed analysis of the individual Izhikevich neuron model can assist in understanding the roles of model parameters, which then facilitates the establishment of the basal ganglia-thalamic network model and contributes to a further exploration of the underlying mechanisms of the Parkinsonian state. Simulation results show that the hybrid Izhikevich neuron model is capable of capturing many of the dynamical properties of the basal ganglia-thalamic neuronal network, such as variations of the firing rates and the emergence of synchronous oscillations under the Parkinsonian condition, despite the simplicity of the two-dimensional neuronal model. This suggests that the computationally efficient hybrid Izhikevich neuron model can be used to explore basal ganglia normal and abnormal functions. In particular, it provides an efficient way of emulating large-scale neuron networks and may contribute to the development of improved therapies for neurological disorders such as Parkinson's disease.
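
    The network above is built from hybrid variants of the Izhikevich model; the standard two-dimensional Izhikevich neuron they extend can be simulated in a few lines. A forward-Euler sketch with textbook regular-spiking parameters (a = 0.02, b = 0.2, c = −65, d = 8), not the authors' hybrid variant:

```python
def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, i_ext=10.0, dt=0.5, steps=2000):
    """Forward-Euler simulation of the two-dimensional Izhikevich neuron
    under constant input current; returns the spike times in ms."""
    v = -65.0          # membrane potential (mV)
    u = b * v          # recovery variable
    spikes = []
    for step in range(steps):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_ext)
        u += dt * a * (b * v - u)
        if v >= 30.0:  # spike: reset v, bump the recovery variable
            spikes.append(step * dt)
            v = c
            u += d
    return spikes
```

    With a constant input current this parameter set produces regular spiking; other choices of a, b, c, and d reproduce the bursting and fast-spiking patterns that make the model attractive for large-scale network emulation.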

  14. Further Development of Rotating Rake Mode Measurement Data Analysis

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.; Hixon, Ray; Sutliff, Daniel L.

    2013-01-01

    The Rotating Rake mode measurement system was designed to measure acoustic duct modes generated by a fan stage. After analysis of the measured data, the mode amplitudes and phases were quantified. For low-speed fans within axisymmetric ducts, mode power levels computed from rotating rake measured data would agree with the far-field power levels on a tone by tone basis. However, this agreement required that the sound from the noise sources within the duct propagated outward from the duct exit without reflection at the exit and previous studies suggested conditions could exist where significant reflections could occur. To directly measure the modes propagating in both directions within a duct, a second rake was mounted to the rotating system with an offset in both the axial and the azimuthal directions. The rotating rake data analysis technique was extended to include the data measured by the second rake. The analysis resulted in a set of circumferential mode levels at each of the two rake microphone locations. Radial basis functions were then least-squares fit to this data to obtain the radial mode amplitudes for the modes propagating in both directions within the duct. The fit equations were also modified to allow evanescent mode amplitudes to be computed. This extension of the rotating rake data analysis technique was tested using simulated data, numerical code produced data, and preliminary in-duct measured data.

  15. Suggested Approaches to the Measurement of Computer Anxiety.

    ERIC Educational Resources Information Center

    Toris, Carol

    Psychologists can gain insight into human behavior by examining what people feel about, know about, and do with, computers. Two extreme reactions to computers are computer phobia, or anxiety, and computer addiction, or "hacking". A four-part questionnaire was developed to measure computer anxiety. The first part is a projective technique which…

  16. On the origin of the cation templated self-assembly of uranyl-peroxide nanoclusters.

    PubMed

    Miró, Pere; Pierrefixe, Simon; Gicquel, Mickaël; Gil, Adrià; Bo, Carles

    2010-12-22

    Uranyl-peroxide nanoclusters display different topologies based on square, pentagonal, and hexagonal building blocks. Computed complexation energies of different cations (Li(+), Na(+), K(+), Rb(+), and Cs(+)) with [UO(2)(O(2))(H(2)O)](n) (n = 4, 5, and 6) macrocycles suggest a strong cation templating effect. The inherent bent structure of a U-O(2)-U model dimer is demonstrated and justified through analysis of its electronic structure, as is the inherent curvature of the four-, five-, and six-uranyl macrocycles. The curvature is enhanced by cation coordination, which is suggested to be the driving force for the self-assembly of the nanocapsules.

  17. Opportunities and choice in a new vector era

    NASA Astrophysics Data System (ADS)

    Nowak, A.

    2014-06-01

    This work discusses the significant changes in the computing landscape related to the progression of Moore's Law, and their implications for scientific computing. Particular attention is devoted to the High Energy Physics (HEP) domain, which has always made good use of threading, but in which levels of parallelism closer to the hardware were often left underutilized. Findings of the CERN openlab Platform Competence Center are reported in the context of expanding "performance dimensions", and especially the resurgence of vectors. These suggest that data-oriented designs are feasible in HEP and have considerable potential for performance improvements on multiple levels, but will rarely trump algorithmic enhancements. Finally, an analysis of upcoming hardware and software technologies identifies heterogeneity as a major challenge for software, which will require more emphasis on scalable, efficient design.

  18. Applications of a General Finite-Difference Method for Calculating Bending Deformations of Solid Plates

    NASA Technical Reports Server (NTRS)

    Walton, William C., Jr.

    1960-01-01

    This paper reports the findings of an investigation of a finite-difference method directly applicable to calculating static or simple harmonic flexures of solid plates and potentially useful in other problems of structural analysis. The method, which was proposed in a doctoral thesis by John C. Houbolt, is based on linear theory and incorporates the principle of minimum potential energy. Full realization of its advantages requires use of high-speed computing equipment. After a review of Houbolt's method, results of some applications are presented and discussed. The applications consisted of calculations of the natural modes and frequencies of several uniform-thickness cantilever plates and, as a special case of interest, calculations of the modes and frequencies of the uniform free-free beam. Computed frequencies and nodal patterns for the first five or six modes of each plate are compared with existing experiments, and those for one plate are compared with another approximate theory. Beam computations are compared with exact theory. On the basis of the comparisons it is concluded that the method is accurate and general in predicting plate flexures, and additional applications are suggested. An appendix is devoted to computing procedures which evolved in the progress of the applications and which facilitate use of the method in conjunction with high-speed computing equipment.

  19. Methane Adsorption in Zr-Based MOFs: Comparison and Critical Evaluation of Force Fields

    PubMed Central

    2017-01-01

    The search for nanoporous materials that are highly performing for gas storage and separation is one of the contemporary challenges in material design. The computational tools to aid these experimental efforts are widely available, and adsorption isotherms are routinely computed for huge sets of (hypothetical) frameworks. Clearly the computational results depend on the interactions between the adsorbed species and the adsorbent, which are commonly described using force fields. In this paper, an extensive comparison and in-depth investigation of several force fields from literature is reported for the case of methane adsorption in the Zr-based Metal–Organic Frameworks UiO-66, UiO-67, DUT-52, NU-1000, and MOF-808. Significant quantitative differences in the computed uptake are observed when comparing different force fields, but most qualitative features are common which suggests some predictive power of the simulations when it comes to these properties. More insight into the host–guest interactions is obtained by benchmarking the force fields with an extensive number of ab initio computed single molecule interaction energies. This analysis at the molecular level reveals that especially ab initio derived force fields perform well in reproducing the ab initio interaction energies. Finally, the high sensitivity of uptake predictions on the underlying potential energy surface is explored. PMID:29170687

  20. BIOCOMPUTATION: some history and prospects.

    PubMed

    Cull, Paul

    2013-06-01

    At first glance, biology and computer science are diametrically opposed sciences. Biology deals with carbon based life forms shaped by evolution and natural selection. Computer Science deals with electronic machines designed by engineers and guided by mathematical algorithms. In this brief paper, we review biologically inspired computing. We discuss several models of computation which have arisen from various biological studies. We show what these have in common, and conjecture how biology can still suggest answers and models for the next generation of computing problems. We discuss computation and argue that these biologically inspired models do not extend the theoretical limits on computation. We suggest that, in practice, biological models may give more succinct representations of various problems, and we mention a few cases in which biological models have proved useful. We also discuss the reciprocal impact of computer science on biology and cite a few significant contributions to biological science. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  1. Computation of transonic potential flow about 3 dimensional inlets, ducts, and bodies

    NASA Technical Reports Server (NTRS)

    Reyhner, T. A.

    1982-01-01

    An analysis was developed and a computer code, P465 Version A, written for the prediction of transonic potential flow about three dimensional objects including inlet, duct, and body geometries. Finite differences and line relaxation are used to solve the complete potential flow equation. The coordinate system used for the calculations is independent of body geometry. Cylindrical coordinates are used for the computer code. The analysis is programmed in extended FORTRAN 4 for the CYBER 203 vector computer. The programming of the analysis is oriented toward taking advantage of the vector processing capabilities of this computer. Comparisons of computed results with experimental measurements are presented to verify the analysis. Descriptions of program input and output formats are also presented.

  2. Generic Hypersonic Inlet Module Analysis

    NASA Technical Reports Server (NTRS)

    Cockrell, Chares E., Jr.; Huebner, Lawrence D.

    2004-01-01

    A computational study associated with an internal inlet drag analysis was performed for a generic hypersonic inlet module. The purpose of this study was to determine the feasibility of computing the internal drag force for a generic scramjet engine module using computational methods. The computational study consisted of obtaining two-dimensional (2D) and three-dimensional (3D) computational fluid dynamics (CFD) solutions using the Euler and parabolized Navier-Stokes (PNS) equations. The solution accuracy was assessed by comparisons with experimental pitot pressure data. The CFD analysis indicates that the 3D PNS solutions show the best agreement with experimental pitot pressure data. The internal inlet drag analysis consisted of obtaining drag force predictions based on experimental data and 3D CFD solutions. A comparative assessment of each of the drag prediction methods is made and the sensitivity of CFD drag values to computational procedures is documented. The analysis indicates that the CFD drag predictions are highly sensitive to the computational procedure used.

  3. Composite Failures: A Comparison of Experimental Test Results and Computational Analysis Using XFEM

    DTIC Science & Technology

    2016-09-30

    NUWC-NPT Technical Report 12,218, 30 September 2016. Due to the availability of measurement techniques, experimental testing of composite materials has largely outpaced computational modeling ability, forcing ...

  4. The national survey of health administration program graduates on management information systems education.

    PubMed

    Zalkind, D; Malec, B

    1988-01-01

    A national survey of alumni of AUPHA programs from the classes of 1983, 1984, and 1985 was undertaken to assess their experiences in management information systems education, both formally and on the job. The survey covered 38 AUPHA graduate member programs and resulted in 1,181 responses. Over 40 percent of the alumni indicated that they had had an introductory management information systems (MIS) course in a health administration program. Since graduation, almost 90 percent have had some significant on-the-job involvement with computers, computer-generated information, or MIS. More than one-third of the respondents felt that their MIS course work did not adequately prepare them for what was expected on the job. Alumni stressed that microcomputer software applications, such as spreadsheets and data bases, are important areas for student hands-on experiences. When asked the importance of certain areas to be included in a required introductory MIS course, the alumni also recommended spreadsheet analysis and design, report writing and data presentation, and other management areas. Additional comments suggested more access to personal computers (PCs), more relevance in the curriculum to the "real world," and the importance of MIS to the career paths of alumni. Faculty suggestions from a 1984-85 survey are compared with alumni responses in order to identify curricular changes needed. Recommendations are outlined for consideration.

  5. Topographical Organization of Attentional, Social, and Memory Processes in the Human Temporoparietal Cortex

    PubMed Central

    Webb, Taylor W.; Kelly, Yin T.; Graziano, Michael S. A.

    2016-01-01

    Abstract The temporoparietal junction (TPJ) is activated in association with a large range of functions, including social cognition, episodic memory retrieval, and attentional reorienting. An ongoing debate is whether the TPJ performs an overarching, domain-general computation, or whether functions reside in domain-specific subdivisions. We scanned subjects with fMRI during five tasks known to activate the TPJ, probing social, attentional, and memory functions, and used data-driven parcellation (independent component analysis) to isolate task-related functional processes in the bilateral TPJ. We found that one dorsal component in the right TPJ, which was connected with the frontoparietal control network, was activated in all of the tasks. Other TPJ subregions were specific for attentional reorienting, oddball target detection, or social attribution of belief. The TPJ components that participated in attentional reorienting and oddball target detection appeared spatially separated, but both were connected with the ventral attention network. The TPJ component that participated in the theory-of-mind task was part of the default-mode network. Further, we found that the BOLD response in the domain-general dorsal component had a longer latency than responses in the domain-specific components, suggesting an involvement in distinct, perhaps postperceptual, computations. These findings suggest that the TPJ performs both domain-general and domain-specific computations that reside within spatially distinct functional components. PMID:27280153

  6. Digital processing of mesoscale analysis and space sensor data

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The mesoscale analysis and space sensor (MASS) data base management and analysis system is presented. It was implemented on a research computer system that provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, data base management, and display capabilities of the research computer system are described in terms of providing a highly effective interactive research tool for the digital processing of mesoscale analysis and space sensor data.

  7. Restructuring the CS 1 classroom: Examining the effect of open laboratory-based classes vs. closed laboratory-based classes on Computer Science 1 students' achievement and attitudes toward computers and computer courses

    NASA Astrophysics Data System (ADS)

    Henderson, Jean Foster

    The purpose of this study was to assess the effect of classroom restructuring involving computer laboratories on student achievement and student attitudes toward computers and computer courses. The effects of the targeted student attributes of gender, previous programming experience, math background, and learning style were also examined. The open lab-based class structure consisted of a traditional lecture class with a separate, unscheduled lab component in which lab assignments were completed outside of class; the closed lab-based class structure integrated a lab component within the lecture class so that half the class was reserved for lecture and half the class was reserved for students to complete lab assignments by working cooperatively with each other and under the supervision and guidance of the instructor. The sample consisted of 71 students enrolled in four intact classes of Computer Science I during the fall and spring semesters of the 2006--2007 school year at two southern universities: two classes were held in the fall (one at each university) and two classes were held in the spring (one at each university). A counterbalanced repeated measures design was used in which all students experienced both class structures for half of each semester. The order of control and treatment was rotated among the four classes. All students received the same amount of class and instructor time. A multivariate analysis of variance (MANOVA) via a multiple regression strategy was used to test the study's hypotheses. Although the overall MANOVA model was statistically significant, independent follow-up univariate analyses relative to each dependent measure found that the only significant research factor was math background: Students whose mathematics background was at the level of Calculus I or higher had significantly higher student achievement than students whose mathematics background was less than Calculus I. 
The results suggest that classroom structures that incorporate an open laboratory setting are just as effective on student achievement and attitudes as classroom structures that incorporate a closed laboratory setting. The results also suggest that math background is a strong predictor of student achievement in CS 1.

  8. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April, 1986 through September 30, 1986 is summarized.

  9. Metal–Metal Bonding in Uranium–Group 10 Complexes

    PubMed Central

    2016-01-01

    Heterobimetallic complexes containing short uranium–group 10 metal bonds have been prepared from monometallic IUIV(OArP-κ2O,P)3 (2) {[ArPO]− = 2-tert-butyl-4-methyl-6-(diphenylphosphino)phenolate}. The U–M bond in IUIV(μ-OArP-1κ1O,2κ1P)3M0, M = Ni (3–Ni), Pd (3–Pd), and Pt (3–Pt), has been investigated by experimental and DFT computational methods. Comparisons of 3–Ni with two further U–Ni complexes XUIV(μ-OArP-1κ1O,2κ1P)3Ni0, X = Me3SiO (4) and F (5), were also possible via iodide substitution. All complexes were characterized by variable-temperature NMR spectroscopy, electrochemistry, and single crystal X-ray diffraction. The U–M bonds are significantly shorter than those in any other crystallographically characterized d–f-block bimetallic complex, even though the ligand flexes to allow a variable U–M separation. Excellent agreement is found between the experimental and computed structures for 3–Ni and 3–Pd. Natural population analysis and natural localized molecular orbital (NLMO) compositions indicate that U employs both 5f and 6d orbitals in covalent bonding to a significant extent. Quantum theory of atoms-in-molecules analysis reveals U–M bond critical point properties typical of metallic bonding and a larger delocalization index (bond order) for the less polar U–Ni bond than U–Pd. Electrochemical studies agree with the computational analyses and the X-ray structural data for the U–X adducts 3–Ni, 4, and 5. The data show a trend in uranium–metal bond strength that decreases from 3–Ni down to 3–Pt and suggest that exchanging the iodide for a fluoride strengthens the metal–metal bond. Despite short U–TM (transition metal) distances, four other computational approaches also suggest low U–TM bond orders, reflecting highly transition metal localized valence NLMOs. These are more so for 3–Pd than 3–Ni, consistent with slightly larger U–TM bond orders in the latter. 
Computational studies of the model systems (PH3)3MU(OH)3I (M = Ni, Pd) reveal longer and weaker unsupported U–TM bonds vs 3. PMID:26942560

  10. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.

  11. Introduction to the computational structural mechanics testbed

    NASA Technical Reports Server (NTRS)

    Lotts, C. G.; Greene, W. H.; Mccleary, S. L.; Knight, N. F., Jr.; Paulson, S. S.; Gillian, R. E.

    1987-01-01

    The Computational Structural Mechanics (CSM) testbed software system, based on the SPAR finite element code and the NICE system, is described. This software is denoted NICE/SPAR. NICE was developed at Lockheed Palo Alto Research Laboratory and contains data management utilities, a command language interpreter, and a command language definition for integrating engineering computational modules. SPAR is a system of programs for finite element structural analysis developed for NASA by Lockheed and Engineering Information Systems, Inc. It includes many complementary structural analysis, thermal analysis, and utility functions that communicate through a common database. The work on NICE/SPAR was motivated by requirements for a highly modular and flexible structural analysis system to use as a tool in carrying out research in computational methods and exploring computer hardware. Analysis examples are presented which demonstrate the benefits gained from combining the NICE command language with SPAR computational modules.

  12. Customizable Computer-Based Interaction Analysis for Coaching and Self-Regulation in Synchronous CSCL Systems

    ERIC Educational Resources Information Center

    Lonchamp, Jacques

    2010-01-01

    Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…

  13. Petascale Computing: Impact on Future NASA Missions

    NASA Technical Reports Server (NTRS)

    Brooks, Walter

    2006-01-01

    This slide presentation reviews NASA's use of a new supercomputer, called Columbia, capable of operating at 62 teraflops. At the time of the presentation it was the fourth fastest computer in the world. Columbia serves all mission directorates; its applications include aerospace analysis and design, propulsion subsystem analysis, climate modeling, hurricane prediction, and astrophysics and cosmology.

  14. A review of intelligent systems for heart sound signal analysis.

    PubMed

    Nabih-Ali, Mohammed; El-Dahshan, El-Sayed A; Yahia, Ashraf S

    2017-10-01

    Intelligent computer-aided diagnosis (CAD) systems can enhance the diagnostic capabilities of physicians and reduce the time required for accurate diagnosis. CAD systems can provide physicians with a suggested diagnosis of heart diseases. The objective of this paper is to review recently published preprocessing, feature extraction and classification techniques for phonocardiogram (PCG) signal analysis and their state of the art. The literature reviewed in this paper shows the potential of machine learning techniques as a design tool for PCG CAD systems and reveals that CAD systems for PCG signal analysis are still an open problem. Related studies are compared with respect to their datasets, feature extraction techniques and the classifiers they used. Current achievements and limitations in developing CAD systems for PCG signal analysis using machine learning techniques are presented and discussed. In the light of this review, a number of future research directions for PCG signal analysis are provided.

  15. Modeling intelligent adversaries for terrorism risk assessment: some necessary conditions for adversary models.

    PubMed

    Guikema, Seth

    2012-07-01

    Intelligent adversary modeling has become increasingly important for risk analysis, and a number of different approaches have been proposed for incorporating intelligent adversaries in risk analysis models. However, these approaches are based on a range of often-implicit assumptions about the desirable properties of intelligent adversary models. This "Perspective" paper aims to further risk analysis for situations involving intelligent adversaries by fostering a discussion of the desirable properties for these models. A set of four basic necessary conditions for intelligent adversary models is proposed and discussed. These are: (1) behavioral accuracy to the degree possible, (2) computational tractability to support decision making, (3) explicit consideration of uncertainty, and (4) ability to gain confidence in the model. It is hoped that these suggested necessary conditions foster discussion about the goals and assumptions underlying intelligent adversary modeling in risk analysis. © 2011 Society for Risk Analysis.

  16. The Effects of Computer Instruction on College Students' Reading Skills.

    ERIC Educational Resources Information Center

    Kuehner, Alison V.

    1999-01-01

    Reviews research concerning computer-based reading instruction for college students. Finds that most studies suggest that computers can provide motivating and efficient learning, but it is not clear whether the computer, or the instruction via computer, accounts for student gains. Notes many methodological flaws in the studies. Suggests…

  17. Sampling factors influencing accuracy of sperm kinematic analysis.

    PubMed

    Owen, D H; Katz, D F

    1993-01-01

    Sampling conditions that influence the accuracy of experimental measurement of sperm head kinematics were studied by computer simulation methods. Several archetypal sperm trajectories were studied. First, mathematical models of typical flagellar beats were input to hydrodynamic equations of sperm motion. The instantaneous swimming velocities of such sperm were computed over sequences of flagellar beat cycles, from which the resulting trajectories were determined. In a second, idealized approach, direct mathematical models of trajectories were utilized, based upon similarities to the previous hydrodynamic constructs. In general, it was found that analyses of sampling factors produced similar results for the hydrodynamic and idealized trajectories. A number of experimental sampling factors were studied, including the number of sperm head positions measured per flagellar beat, and the time interval over which these measurements are taken. It was found that when one flagellar beat is sampled, values of amplitude of lateral head displacement (ALH) and linearity (LIN) approached their actual values when five or more sample points per beat were taken. Mean angular displacement (MAD) values, however, remained sensitive to sampling rate even when large sampling rates were used. Values of MAD were also much more sensitive to the initial starting point of the sampling procedure than were ALH or LIN. On the basis of these analyses of measurement accuracy for individual sperm, simulations were then performed of cumulative effects when studying entire populations of motile cells. It was found that substantial (double digit) errors occurred in the mean values of curvilinear velocity (VCL), LIN, and MAD under the conditions of 30 video frames per second and 0.5 seconds of analysis time. Increasing the analysis interval to 1 second did not appreciably improve the results. However, increasing the analysis rate to 60 frames per second significantly reduced the errors. 
These findings thus suggest that computer-aided sperm analysis (CASA) application at 60 frames per second will significantly improve the accuracy of kinematic analysis in most applications to human and other mammalian sperm.
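    The sensitivity of kinematic measures to frame rate can be illustrated with a toy trajectory. The beat frequency, amplitude, and progressive velocity below are hypothetical values chosen only to demonstrate the undersampling effect, not parameters taken from the study:

```python
import math

def sample_trajectory(fps, duration=1.0, beat_hz=14.0, vsl=50.0, amp=3.0):
    """Sinusoidal sperm-head path: forward drift at vsl (um/s) with a
    lateral beat of amplitude amp (um) at beat_hz (Hz); illustrative only."""
    n = int(duration * fps) + 1
    return [(vsl * t, amp * math.sin(2 * math.pi * beat_hz * t))
            for t in (i / fps for i in range(n))]

def vcl(points, dt):
    """Curvilinear velocity: summed point-to-point path length / duration."""
    path = sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(points, points[1:]))
    return path / (dt * (len(points) - 1))

# Undersampling the beat cuts corners off the true path, biasing VCL low.
for fps in (30, 60, 120):
    pts = sample_trajectory(fps)
    print(f"{fps:3d} frames/s -> VCL = {vcl(pts, 1.0 / fps):6.1f} um/s")
```

    The estimate rises toward the true curvilinear path length as the frame rate increases, mirroring the finding that 60 frames per second significantly reduces the error seen at 30 frames per second.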

  18. R/S analysis of reaction time in Neuron Type Test for human activity in civil aviation

    NASA Astrophysics Data System (ADS)

    Zhang, Hong-Yan; Kang, Ming-Cui; Li, Jing-Qiang; Liu, Hai-Tao

    2017-03-01

    Human factors have become the most serious problem leading to accidents in civil aviation, which motivates the design and analysis of the Neuron Type Test (NTT) system to explore the intrinsic properties and patterns behind the behaviors of professionals and students in civil aviation. In the experiment, normal practitioners' reaction time sequences collected from the NTT approximately follow a log-normal distribution. We apply the χ² test to compute the goodness-of-fit after transforming the time sequences with the Box-Cox transformation, and use the results to cluster practitioners. The long-term correlation of each individual practitioner's time sequence is characterized by the Hurst exponent obtained via Rescaled Range Analysis, also called Range/Standard deviation (R/S) Analysis. Differences in Hurst exponents suggest the existence of different collective behaviors and different intrinsic patterns of human factors in civil aviation.
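    The Hurst-exponent estimate used here can be sketched in a few lines. This is a generic R/S implementation run on illustrative white noise, not on the NTT reaction-time data:

```python
import math
import random

def rescaled_range(window):
    """R/S for one window: range of cumulative mean-adjusted deviations
    divided by the window's standard deviation."""
    n = len(window)
    mean = sum(window) / n
    cum, total = [], 0.0
    for x in window:
        total += x - mean
        cum.append(total)
    s = math.sqrt(sum((x - mean) ** 2 for x in window) / n)
    return (max(cum) - min(cum)) / s if s > 0 else 0.0

def hurst(series, min_window=8):
    """Estimate the Hurst exponent: regress log(mean R/S) on log(window
    size) over successively halved windows."""
    xs, ys = [], []
    size = len(series)
    while size >= min_window:
        rs = [rescaled_range(series[i:i + size])
              for i in range(0, len(series) - size + 1, size)]
        xs.append(math.log(size))
        ys.append(math.log(sum(rs) / len(rs)))
        size //= 2
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

random.seed(1)
white = [random.gauss(0.0, 1.0) for _ in range(4096)]
# Uncorrelated noise: H near 0.5 (finite-sample R/S estimates run a bit high)
print(f"Hurst exponent of white noise ~ {hurst(white):.2f}")
```

    An exponent above 0.5 indicates persistent long-term correlation and one below 0.5 anti-persistence; differences in this value are what separate practitioners in the study.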

  19. Computers in My Curriculum? 18 Lesson Plans for Teaching Computer Awareness without a Computer. Adaptable Grades 3-12.

    ERIC Educational Resources Information Center

    Bailey, Suzanne Powers; Jeffers, Marcia

    Eighteen interrelated, sequential lesson plans and supporting materials for teaching computer literacy at the elementary and secondary levels are presented. The activities, intended to be infused into the regular curriculum, do not require the use of a computer. The introduction presents background information on computer literacy, suggests a…

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beauchamp, R.O. Jr.

    A preliminary examination of chemical-substructure analysis (CSA) demonstrates the effective use of the Chemical Abstracts compound connectivity file in conjunction with the bibliographic file for relating chemical structures to biological activity. The importance of considering the role of metabolic intermediates under a variety of conditions is illustrated, suggesting structures that should be examined that may exhibit potential activity. This CSA technique, which utilizes existing large files accessible with online personal computers, is recommended for use as another tool in examining chemicals in drugs. 2 refs., 4 figs.

  1. Berberine cation: A fluorescent chemosensor for alkanes and other low-polarity compounds. An explanation of this phenomenon

    PubMed

    Cossio; Arrieta; Cebolla; Membrado; Vela; Garriga; Domingo

    2000-07-27

    Alkanes in the presence of berberine sulfate produce an enhanced fluorescence signal, which depends on alkane concentration and structure, when the system is irradiated with monochromatic UV light. Computational analysis suggests that an ion-induced dipole interaction between alkanes and berberine sulfate is responsible for this phenomenon. This interaction properly models the experimentally obtained fluorescence response. The proposed explanation allows other interacting systems to be designed, which has been experimentally confirmed.

  2. Evolution of statistical properties for a nonlinearly propagating sinusoid.

    PubMed

    Shepherd, Micah R; Gee, Kent L; Hanford, Amanda D

    2011-07-01

    The nonlinear propagation of a pure sinusoid is considered using time domain statistics. The probability density function, standard deviation, skewness, kurtosis, and crest factor are computed for both the amplitude and amplitude time derivatives as a function of distance. The amplitude statistics vary only in the postshock realm, while the amplitude derivative statistics vary rapidly in the preshock realm. The statistical analysis also suggests that the sawtooth onset distance can be considered to be earlier than previously realized. © 2011 Acoustical Society of America
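    The time-domain statistics named above are straightforward to compute from a sampled waveform. The sketch below evaluates them for an unshocked pure sinusoid, for which the theoretical values are known (skewness 0, kurtosis 1.5, crest factor √2):

```python
import math

def waveform_stats(x):
    """Standard deviation, skewness, kurtosis, and crest factor of a
    sampled waveform (population moments; Gaussian kurtosis = 3)."""
    n = len(x)
    mean = sum(x) / n
    m2 = sum((v - mean) ** 2 for v in x) / n
    m3 = sum((v - mean) ** 3 for v in x) / n
    m4 = sum((v - mean) ** 4 for v in x) / n
    std = math.sqrt(m2)
    rms = math.sqrt(sum(v * v for v in x) / n)
    return std, m3 / std ** 3, m4 / m2 ** 2, max(abs(v) for v in x) / rms

# One full period of a unit sinusoid: skew 0, kurtosis 1.5, crest sqrt(2)
sine = [math.sin(2 * math.pi * i / 1000) for i in range(1000)]
std, skew, kurt, crest = waveform_stats(sine)
print(f"std={std:.3f}  skew={skew:.3f}  kurt={kurt:.3f}  crest={crest:.3f}")
```

    As nonlinear propagation steepens the waveform toward a sawtooth, these quantities (especially those of the amplitude derivative) drift away from the sinusoidal values, which is the evolution tracked in the paper.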

  3. Nonlinear mechanical behavior of thermoplastic matrix materials for advanced composites

    NASA Technical Reports Server (NTRS)

    Arenz, R. J.; Landel, R. F.

    1989-01-01

    Two recent theories of nonlinear mechanical response are quantitatively compared and related to experimental data. Computer techniques are formulated to handle the numerical integration and iterative procedures needed to solve the associated sets of coupled nonlinear differential equations. Problems encountered during these formulations are discussed and some open questions described. Bearing in mind these cautions, the consequences of changing parameters that appear in the formulations on the resulting engineering properties are discussed. Hence, engineering approaches to the analysis of thermoplastic matrix material can be suggested.

  4. A Qualitative Analysis of NASA’s Human Computer Interaction Group Examining the Root Causes of Focusing on Derivative System Improvements Versus Core User Needs

    DTIC Science & Technology

    2017-12-01

    emphasis on meeting deliverable dates over a focus on customer service and user experience, a common finding in McGrath and MacMillan's (2000) research...structure, process improvements, and training needs as the group prepares to support the retirement of the International Space Station in the 2020s and

  5. Extracting falsifiable predictions from sloppy models.

    PubMed

    Gutenkunst, Ryan N; Casey, Fergal P; Waterfall, Joshua J; Myers, Christopher R; Sethna, James P

    2007-12-01

    Successful predictions are among the most compelling validations of any model. Extracting falsifiable predictions from nonlinear multiparameter models is complicated by the fact that such models are commonly sloppy, possessing sensitivities to different parameter combinations that range over many decades. Here we discuss how sloppiness affects the sorts of data that best constrain model predictions, makes linear uncertainty approximations dangerous, and introduces computational difficulties in Monte-Carlo uncertainty analysis. We also present a useful test problem and suggest refinements to the standards by which models are communicated.

  6. Solving a Class of Stochastic Mixed-Integer Programs With Branch and Price

    DTIC Science & Technology

    2006-01-01

    a two-dimensional knapsack problem, but for a given m, the objective value gi does not depend on the variance index v. This will be used in a final...for solution by a branch-and-price algorithm (B&P). We then survey a number of examples, and use a stochastic facility-location problem (SFLP) for a

  7. Urban and regional land use analysis: CARETS and census cities experiment package

    NASA Technical Reports Server (NTRS)

    Alexander, R. (Principal Investigator); Pease, R. W.; Lins, H. F., Jr.

    1975-01-01

    The author has identified the following significant results. Successful tentative calibration permits computer programs to be written to convert Skylab thermal tapes into line-printed graymaps showing actual surface radiation temperature distributions at the time of imaging. The calibrations will be further checked when atmospheric soundings are available. The success of the Skylab calibration suggests that satellites are feasible platforms for thermal scanning and provide a much broader geographical field of view than is possible with airborne platforms.

  8. Increasingly mobile: How new technologies can enhance qualitative research

    PubMed Central

    Moylan, Carrie Ann; Derr, Amelia Seraphia; Lindhorst, Taryn

    2015-01-01

    Advances in technology, such as the growth of smart phones, tablet computing, and improved access to the internet have resulted in many new tools and applications designed to increase efficiency and improve workflow. Some of these tools will assist scholars using qualitative methods with their research processes. We describe emerging technologies for use in data collection, analysis, and dissemination that each offer enhancements to existing research processes. Suggestions for keeping pace with the ever-evolving technological landscape are also offered. PMID:25798072

  9. CREATING AN IPHONE APPLICATION FOR COLLECTING CONTINUOUS ABC DATA

    PubMed Central

    Whiting, Seth W; Dixon, Mark R

    2012-01-01

    This paper provides an overview and task analysis for creating a continuous ABC data-collection application using Xcode on a Mac computer. Behavior analysts can program an ABC data collection system, complete with a customized list of target clients, antecedents, behaviors, and consequences to be recorded, and have the data automatically sent to an e-mail account after observations have concluded. Further suggestions are provided to customize the ABC data-collection system for individual preferences and clinical needs. PMID:23060682

  11. Theoretical and experimental analysis of the impacts of removable storage media and antivirus software on viral spread

    NASA Astrophysics Data System (ADS)

    Gan, Chenquan; Yang, Xiaofan

    2015-05-01

    In this paper, a new computer virus propagation model, which incorporates the effects of removable storage media and antivirus software, is proposed and analyzed. The global stability of the unique equilibrium of the model is independent of system parameters. Numerical simulations not only verify this result, but also illustrate the influences of removable storage media and antivirus software on viral spread. On this basis, some applicable measures for suppressing virus prevalence are suggested.
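    The qualitative effect of the two mechanisms can be explored with a deliberately simplified compartmental model. The equations and parameter values below are illustrative stand-ins, not the model analyzed in the paper:

```python
# Deliberately simplified virus model (illustrative, not the paper's):
# susceptible fraction s, infected fraction i; infection spreads by
# network contact (beta) and via removable media (theta); antivirus
# software cures infected nodes at rate gamma.
def simulate(beta=0.3, theta=0.05, gamma=0.2, i0=0.01, dt=0.01, steps=5000):
    s, i = 1.0 - i0, i0
    for _ in range(steps):
        infected = beta * s * i + theta * s   # contact + media infection
        cured = gamma * i                     # antivirus cleaning
        s += dt * (cured - infected)          # forward-Euler update
        i += dt * (infected - cured)
    return i

print(f"steady-state infected fraction          ~ {simulate():.3f}")
print(f"same model with stronger antivirus rate ~ {simulate(gamma=0.6):.3f}")
```

    Because the media term theta*s keeps reinfecting clean machines, the infection settles at a positive equilibrium rather than dying out, but raising the cure rate sharply lowers that level, in the spirit of the paper's suggested suppression measures.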

  12. DFT calculations on molecular structure, spectral analysis, multiple interactions, reactivity, NLO property and molecular docking study of flavanol-2,4-dinitrophenylhydrazone

    NASA Astrophysics Data System (ADS)

    Singh, Ravindra Kumar; Singh, Ashok Kumar

    2017-02-01

    A new flavanol-2,4-dinitrophenylhydrazone (FDNP) was synthesized and its structure was confirmed by FT-IR, FT-Raman, 1H NMR, mass spectrometry and elemental analysis. All quantum chemical calculations were carried out at the density functional theory (DFT) level with the B3LYP functional using the 6-311++G(d,p) atomic basis set. The UV-Vis absorption spectrum for the singlet-singlet transition, computed for the fully optimized ground-state geometry using Time-Dependent Density Functional Theory (TD-DFT) with the CAM-B3LYP functional, was found to be consistent with the experimental findings. Analysis of the vibrational (FT-IR and FT-Raman) spectra and their assignments has been done by computing the Potential Energy Distribution (PED) using Gar2ped. HOMO-LUMO analysis was performed and reactivity descriptors were calculated. The calculated global electrophilicity index (ω = 7.986 eV) shows the molecule to be a strong electrophile. The 1H NMR chemical shifts calculated with the gauge-including atomic orbital (GIAO) approach agree with the experimental data. Various intramolecular interactions were analysed by the AIM approach. The DFT-computed total first static hyperpolarizability (β0 = 189.03 × 10-30 esu) indicates that the title molecule can be an attractive future NLO material. Solvent-induced effects on the NLO properties, studied using the self-consistent reaction field (SCRF) method, show that the β0 value increases with increasing solvent polarity. To study the thermal behaviour of the title molecule, thermodynamic properties such as heat capacity, entropy and enthalpy change at various temperatures have been calculated and reported. Molecular docking results suggest that the title molecule is a potential kinase inhibitor and might be used in future for designing new anticancer drugs.
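    The global reactivity descriptors quoted in such studies follow from the frontier orbital energies through standard conceptual-DFT relations. A minimal sketch with hypothetical orbital energies (not the FDNP values reported here):

```python
def reactivity_descriptors(e_homo, e_lumo):
    """Conceptual-DFT global descriptors from frontier orbital energies
    (eV), using the usual Koopmans-type approximations."""
    ionization = -e_homo                  # I ~ -E(HOMO)
    affinity = -e_lumo                    # A ~ -E(LUMO)
    mu = -(ionization + affinity) / 2     # chemical potential
    eta = (ionization - affinity) / 2     # chemical hardness
    omega = mu ** 2 / (2 * eta)           # global electrophilicity index
    return mu, eta, omega

# Hypothetical frontier energies in eV (illustrative, not the FDNP values)
mu, eta, omega = reactivity_descriptors(e_homo=-7.2, e_lumo=-3.1)
print(f"mu = {mu:.2f} eV, eta = {eta:.2f} eV, omega = {omega:.2f} eV")
```

    A larger ω marks a stronger electrophile, which is the sense in which the paper's ω = 7.986 eV is interpreted.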

  13. Holistic Approaches to Reading (The Printout).

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    1989-01-01

    Presents eight guidelines to consider when using computers for language instruction, emphasizing computer use in a social and purposeful context. Suggests computer software which adheres to these guidelines. (MM)

  14. Computer Technology: State of the Art.

    ERIC Educational Resources Information Center

    Withington, Frederic G.

    1981-01-01

    Describes the nature of modern general-purpose computer systems, including hardware, semiconductor electronics, microprocessors, computer architecture, input output technology, and system control programs. Seven suggested readings are cited. (FM)

  15. Computational Aspects of Heat Transfer in Structures

    NASA Technical Reports Server (NTRS)

    Adelman, H. M. (Compiler)

    1982-01-01

    Techniques for the computation of heat transfer and associated phenomena in complex structures are examined with an emphasis on reentry flight vehicle structures. Analysis methods, computer programs, thermal analysis of large space structures and high speed vehicles, and the impact of computer systems are addressed.

  16. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April l, 1988 through September 30, 1988.

  17. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  18. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.

  19. Ship Trim Optimization: Assessment of Influence of Trim on Resistance of MOERI Container Ship

    PubMed Central

    Duan, Wenyang

    2014-01-01

    Environmental issues and rising fuel prices necessitate better energy efficiency in all sectors. The shipping industry is a stakeholder in environmental issues, being responsible for approximately 3% of global CO2 emissions, 14-15% of global NOX emissions, and 16% of global SOX emissions. Ship trim optimization has gained enormous momentum in recent years as an effective operational measure for better energy efficiency and reduced emissions. Trim optimization analysis has traditionally been done through tow-tank testing for a specific hull form, but computational techniques are increasingly popular in ship hydrodynamics applications. The purpose of this study is to present trim optimization of the MOERI container ship (KCS) hull by employing computational methods. Computed values of total resistance, trim, and sinkage for the KCS hull in the even-keel condition are compared with experimental values and found to be in reasonable agreement, validating that the mesh, boundary conditions, and solution techniques are correct. The same mesh, boundary conditions, and solution techniques are then used to obtain resistance values in different trim conditions at Fn = 0.2274, and an optimum trim is suggested based on the attained results. This research serves as a foundation for employing computational techniques in ship trim optimization. PMID:24578649

  20. Expert overseer for mass spectrometer system

    DOEpatents

    Filby, Evan E.; Rankin, Richard A.

    1991-01-01

    An expert overseer for the operation and real-time management of a mass spectrometer and associated laboratory equipment. The overseer is a computer-based expert diagnostic system implemented on a computer separate from the dedicated computer used to control the mass spectrometer and produce the analysis results. An interface links the overseer to components of the mass spectrometer, components of the laboratory support system, and the dedicated control computer. Periodically, the overseer polls these devices as well as itself. These data are fed into an expert portion of the system for real-time evaluation. A knowledge base used for the evaluation includes both heuristic rules and precise operation parameters. The overseer also compares current readings to a long-term database to detect any developing trends, using a combination of statistical and heuristic rules to evaluate the results. The overseer can alert lab personnel whenever questionable readings or trends are observed, provide a background review of the problem, and suggest root causes, potential solutions, or appropriate additional tests that could be performed. The overseer can also change the sequence or frequency of the polling in response to an observation in the current data.

  1. Grammatical Analysis as a Distributed Neurobiological Function

    PubMed Central

    Bozic, Mirjana; Fonteneau, Elisabeth; Su, Li; Marslen-Wilson, William D

    2015-01-01

    Language processing engages large-scale functional networks in both hemispheres. Although it is widely accepted that left perisylvian regions have a key role in supporting complex grammatical computations, patient data suggest that some aspects of grammatical processing could be supported bilaterally. We investigated the distribution and the nature of grammatical computations across language processing networks by comparing two types of combinatorial grammatical sequences—inflectionally complex words and minimal phrases—and contrasting them with grammatically simple words. Novel multivariate analyses revealed that they engage a coalition of separable subsystems: inflected forms triggered left-lateralized activation, dissociable into dorsal processes supporting morphophonological parsing and ventral, lexically driven morphosyntactic processes. In contrast, simple phrases activated a consistently bilateral pattern of temporal regions, overlapping with inflectional activations in L middle temporal gyrus. These data confirm the role of the left-lateralized frontotemporal network in supporting complex grammatical computations. Critically, they also point to the capacity of bilateral temporal regions to support simple, linear grammatical computations. This is consistent with a dual neurobiological framework where phylogenetically older bihemispheric systems form part of the network that supports language function in the modern human, and where significant capacities for language comprehension remain intact even following severe left hemisphere damage. PMID:25421880

  2. Structural synthesis: Precursor and catalyst

    NASA Technical Reports Server (NTRS)

    Schmit, L. A.

    1984-01-01

    More than twenty-five years have elapsed since it was recognized that a rather general class of structural design optimization tasks could properly be posed as an inequality-constrained minimization problem. It is suggested that, independent of primary discipline area, it will be useful to think about: (1) posing design problems in terms of an objective function and inequality constraints; (2) generating design-oriented approximate analysis methods (giving special attention to behavior sensitivity analysis); (3) distinguishing between decisions that lead to an analysis model and those that lead to a design model; (4) finding ways to generate a sequence of approximate design optimization problems that capture the essential characteristics of the primary problem, while still having an explicit algebraic form that is matched to one or more of the established optimization algorithms; (5) examining the potential of optimum design sensitivity analysis to facilitate quantitative trade-off studies as well as participation in multilevel design activities. It should be kept in mind that multilevel methods are inherently well suited to a parallel mode of operation in computer terms, or to a division of labor between task groups in organizational terms. Based on structural experience with multilevel methods, general guidelines are suggested.

  3. Age determination by teeth examination: a comparison between different morphologic and quantitative analyses.

    PubMed

    Amariti, M L; Restori, M; De Ferrari, F; Paganelli, C; Faglia, R; Legnani, G

    1999-06-01

    Age determination by teeth examination is one of the main means of establishing personal identification. Current studies have suggested different techniques for determining the age of a subject through analysis of the microscopic and macroscopic structural modifications of the tooth with ageing. The histological approach is useful among the various methodologies utilized for this purpose. It is still unclear which technique is best, as almost all authors suggest the use of the approach they themselves have tested. In the present study, age determination by means of microscopic techniques has been based on the quantitative analysis of three parameters, all well recognized in the specialized literature: 1. dentinal tubule density/sclerosis; 2. tooth translucency; 3. cementum thickness. After a description of the three methodologies (with automatic image processing of the dentinal sclerosis using a computer program developed by the authors), the results obtained on cases using the three different approaches are presented, and the merits and failings of each technique are discussed with the aim of identifying the one offering the least degree of error in age determination.

  4. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  5. CloVR: a virtual machine for automated and portable sequence analysis from the desktop using cloud computing.

    PubMed

    Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian

    2011-08-30

    Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines that distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome, and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources, and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lower the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arimura, Hidetaka, E-mail: arimurah@med.kyushu-u.ac.jp; Kamezawa, Hidemi; Jin, Ze

    Computational image analysis and radiological physics have developed a productive relationship that increases the accuracy of medical diagnostic imaging and radiation therapy. Computational image analysis is built on applied mathematics, physics, and engineering. This review paper introduces how computational image analysis is useful in radiation therapy with respect to radiological physics.

  7. The analysis of delays in simulator digital computing systems. Volume 1: Formulation of an analysis approach using a central example simulator model

    NASA Technical Reports Server (NTRS)

    Heffley, R. K.; Jewell, W. F.; Whitbeck, R. F.; Schulman, T. M.

    1980-01-01

    The effects of spurious delays in real time digital computing systems are examined. Various sources of spurious delays are defined and analyzed using an extant simulator system as an example. A specific analysis procedure is set forth and four cases are viewed in terms of their time and frequency domain characteristics. Numerical solutions are obtained for three single rate one- and two-computer examples, and the analysis problem is formulated for a two-rate, two-computer example.

  8. Fast Virtual Fractional Flow Reserve Based Upon Steady-State Computational Fluid Dynamics Analysis: Results From the VIRTU-Fast Study.

    PubMed

    Morris, Paul D; Silva Soto, Daniel Alejandro; Feher, Jeroen F A; Rafiroiu, Dan; Lungu, Angela; Varma, Susheel; Lawford, Patricia V; Hose, D Rodney; Gunn, Julian P

    2017-08-01

    Fractional flow reserve (FFR)-guided percutaneous intervention is superior to standard assessment but remains underused. The authors have developed a novel "pseudotransient" analysis protocol for computing virtual fractional flow reserve (vFFR) based upon angiographic images and steady-state computational fluid dynamics. This protocol generates vFFR results in 189 s (cf. >24 h for transient analysis) using a desktop PC, with <1% error relative to full-transient computational fluid dynamics analysis. Sensitivity analysis demonstrated that physiological lesion significance was influenced less by coronary or lesion anatomy (33%) and more by microvascular physiology (59%). If coronary microvascular resistance can be estimated, vFFR can be computed accurately in less time than it takes to make invasive measurements.
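    The sensitivity of lesion significance to microvascular physiology noted above can be illustrated with a minimal lumped-resistance sketch. This is a simplification for intuition only, not the study's CFD protocol, and the resistance values are hypothetical:

    ```python
    # Minimal lumped-parameter sketch of fractional flow reserve (FFR).
    # FFR = Pd / Pa: mean distal pressure over mean aortic pressure at hyperemia.
    # Treat the stenosis and the microvasculature as two resistances in series,
    # with venous pressure taken as zero for simplicity.

    def ffr(stenosis_resistance, microvascular_resistance):
        """Pressure divider: Pd/Pa = Rm / (Rs + Rm)."""
        rs, rm = stenosis_resistance, microvascular_resistance
        return rm / (rs + rm)

    # Hypothetical values (arbitrary units): the same lesion under two
    # different microvascular states gives different FFR values.
    print(ffr(20.0, 80.0))   # 0.8 -> borderline significant
    print(ffr(20.0, 180.0))  # 0.9 -> same lesion, higher Rm, less significant
    ```

    The same stenosis resistance yields a different FFR when the microvascular resistance changes, which is one way to see why microvascular physiology dominates the sensitivity analysis.
    
    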

  9. Flexible Launch Vehicle Stability Analysis Using Steady and Unsteady Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Bartels, Robert E.

    2012-01-01

    Launch vehicles frequently experience a reduced stability margin through the transonic Mach number range. This reduced stability margin can be caused by the aerodynamic undamping of one of the lower-frequency flexible or rigid body modes. Analysis of the behavior of a flexible vehicle is routinely performed with quasi-steady aerodynamic line loads derived from steady rigid aerodynamics. However, a quasi-steady aeroelastic stability analysis can be unconservative at the critical Mach numbers, where experiment or unsteady computational aeroelastic analysis shows a reduced or even negative aerodynamic damping. A method of enhancing the quasi-steady aeroelastic stability analysis of a launch vehicle with unsteady aerodynamics is developed that uses unsteady computational fluid dynamics to compute the response of selected lower-frequency modes. The response is contained in a time history of the vehicle line loads. A proper orthogonal decomposition of the unsteady aerodynamic line-load response is used to reduce the data volume, and system identification is used to derive the aerodynamic stiffness, damping, and mass matrices. The results are compared with the damping and frequency computed from unsteady computational aeroelasticity and from a quasi-steady analysis. The results show that incorporating unsteady aerodynamics in this way brings the enhanced quasi-steady aeroelastic stability analysis into close agreement with the unsteady computational aeroelastic results.
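    The data-reduction step described above (proper orthogonal decomposition of a line-load time history) can be sketched via the SVD of a snapshot matrix. The synthetic two-mode signal below is illustrative, not the vehicle's actual loads:

    ```python
    import numpy as np

    # Sketch of proper orthogonal decomposition (POD) of an unsteady line-load
    # time history. Rows: spanwise load stations; columns: time snapshots.
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 50)          # load stations
    t = np.linspace(0.0, 10.0, 200)        # snapshot times
    snapshots = (np.outer(np.sin(np.pi * x), np.cos(2.0 * t))
                 + 0.3 * np.outer(np.sin(2 * np.pi * x), np.sin(3.0 * t))
                 + 0.01 * rng.standard_normal((50, 200)))

    # POD modes are the left singular vectors; squared singular values rank
    # the "energy" each mode carries.
    U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
    energy = s**2 / np.sum(s**2)
    print("energy captured by 2 modes:", energy[:2].sum())

    # Reduced-order reconstruction with the two dominant modes; the residual
    # is essentially the added noise.
    recon = U[:, :2] @ np.diag(s[:2]) @ Vt[:2, :]
    print("relative reconstruction error:",
          np.linalg.norm(snapshots - recon) / np.linalg.norm(snapshots))
    ```

    In the paper's setting the retained modal coordinates, rather than the full line-load history, would then be passed to system identification.
    
    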

  10. Computer Analogies: Teaching Molecular Biology and Ecology.

    ERIC Educational Resources Information Center

    Rice, Stanley; McArthur, John

    2002-01-01

    Suggests that computer science analogies can aid the understanding of gene expression, including the storage of genetic information on chromosomes. Presents a matrix of biology and computer science concepts. (DDR)

  11. Children's Computers.

    ERIC Educational Resources Information Center

    Samaras, Anastasia P.

    1996-01-01

    Suggests that teachers and social context determine what young children acquire from computer experiences. Provides anecdotes of teachers working with children who are using a computer program to complete a picture puzzle. The computer allowed teachers to present a problem, witness children's cognitive capabilities, listen to their metacognitive…

  12. The DYNAMO Simulation Language--An Alternate Approach to Computer Science Education.

    ERIC Educational Resources Information Center

    Bronson, Richard

    1986-01-01

    Suggests the use of computer simulation of continuous systems as a problem solving approach to computer languages. Outlines the procedures that the system dynamics approach employs in computer simulations. Explains the advantages of the special purpose language, DYNAMO. (ML)

  13. A Modeling and Data Analysis of Laser Beam Propagation in the Maritime Domain

    DTIC Science & Technology

    2015-05-18

    …approach to computing pdfs is the Kernel Density Method (Reference [9] has an introduction to the method), which we will apply to compute the pdf of our… The project has two parts to it: 1) we present a computational analysis of different probability density function approximation techniques; and 2) we introduce preliminary steps towards developing a…
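    The Kernel Density Method mentioned in this record can be sketched as follows; the synthetic standard-normal data and the rule-of-thumb bandwidth are assumptions for illustration, not the report's choices:

    ```python
    import numpy as np

    # Sketch of a Gaussian kernel density estimate (KDE): approximate a pdf
    # by averaging one Gaussian "bump" per sample.
    rng = np.random.default_rng(1)
    samples = rng.standard_normal(2000)

    def gaussian_kde(samples, grid):
        n = samples.size
        # Silverman's rule-of-thumb bandwidth for Gaussian kernels.
        h = 1.06 * samples.std(ddof=1) * n ** (-1 / 5)
        # Evaluate every kernel on the grid and average.
        z = (grid[:, None] - samples[None, :]) / h
        return np.exp(-0.5 * z**2).sum(axis=1) / (n * h * np.sqrt(2 * np.pi))

    grid = np.linspace(-4.0, 4.0, 401)
    pdf = gaussian_kde(samples, grid)
    print("integral of estimated pdf:", np.trapz(pdf, grid))  # close to 1
    ```
    
    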

  14. The Computer Revolution. An Introduction to Computers. A Good Apple Activity Book for Grades 4-8.

    ERIC Educational Resources Information Center

    Colgren, John

    This booklet is designed to introduce computers to children. A letter to parents is provided, explaining that a unit on computers will be taught which will discuss the major parts of the computer and programming in the computer language BASIC. Suggestions for teachers provide information on starting, the binary system, base two worksheet, binary…

  15. Data-mining of potential antitubercular activities from molecular ingredients of traditional Chinese medicines.

    PubMed

    Jamal, Salma; Scaria, Vinod

    2014-01-01

    Background. Traditional Chinese medicine encompasses a well-established alternative system of medicine based on a broad range of herbal formulations and is practiced extensively in the region for the treatment of a wide variety of diseases. In recent years, several reports have described in-depth studies of the biological activities, including anti-bacterial activities, of the molecular ingredients of traditional Chinese medicines. The availability of a well-curated dataset of molecular ingredients of traditional Chinese medicines, accurate in-silico cheminformatics models for mining for antitubercular agents, and computational filters to prioritize molecules prompted us to search for potential hits in these datasets. Results. We used a consensus approach to predict molecules with potential antitubercular activities from a large dataset of molecular ingredients of traditional Chinese medicines available in the public domain. We further prioritized 160 molecules based on five computational filters (SMARTSfilter) so as to avoid potentially undesirable molecules. We also examined the molecules for permeability across the Mycobacterial cell wall and for potential activities against non-replicating and drug-tolerant Mycobacteria. Additional in-depth literature surveys of the reported antitubercular activities of the molecular ingredients and their sources were used to support the prioritization. Conclusions. Our analysis suggests that datasets of molecular ingredients of traditional Chinese medicines offer a new opportunity to mine for potential biological activities. In this report, we suggest a proof-of-concept methodology for prioritizing molecules for further experimental assays using a variety of computational tools. We also suggest that a subset of the prioritized molecules could be evaluated for tuberculosis, given their additional effect against non-replicating Mycobacteria and the hepato-protection offered by the sources of these ingredients.

  16. Data-mining of potential antitubercular activities from molecular ingredients of traditional Chinese medicines

    PubMed Central

    Jamal, Salma

    2014-01-01

    Background. Traditional Chinese medicine encompasses a well-established alternative system of medicine based on a broad range of herbal formulations and is practiced extensively in the region for the treatment of a wide variety of diseases. In recent years, several reports have described in-depth studies of the biological activities, including anti-bacterial activities, of the molecular ingredients of traditional Chinese medicines. The availability of a well-curated dataset of molecular ingredients of traditional Chinese medicines, accurate in-silico cheminformatics models for mining for antitubercular agents, and computational filters to prioritize molecules prompted us to search for potential hits in these datasets. Results. We used a consensus approach to predict molecules with potential antitubercular activities from a large dataset of molecular ingredients of traditional Chinese medicines available in the public domain. We further prioritized 160 molecules based on five computational filters (SMARTSfilter) so as to avoid potentially undesirable molecules. We also examined the molecules for permeability across the Mycobacterial cell wall and for potential activities against non-replicating and drug-tolerant Mycobacteria. Additional in-depth literature surveys of the reported antitubercular activities of the molecular ingredients and their sources were used to support the prioritization. Conclusions. Our analysis suggests that datasets of molecular ingredients of traditional Chinese medicines offer a new opportunity to mine for potential biological activities. In this report, we suggest a proof-of-concept methodology for prioritizing molecules for further experimental assays using a variety of computational tools. We also suggest that a subset of the prioritized molecules could be evaluated for tuberculosis, given their additional effect against non-replicating Mycobacteria and the hepato-protection offered by the sources of these ingredients. PMID:25081126

  17. Analysis of Software Systems for Specialized Computers,

    DTIC Science & Technology

    …computer) with given computer hardware and software. The object of study is the software system of a computer, designed for solving a fixed complex of… The purpose of the analysis is to find parameters that characterize the system and its elements during operation, i.e., when servicing the given requirement flow. (Author)

  18. APPLE - An aeroelastic analysis system for turbomachines and propfans

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Bakhle, Milind A.; Srivastava, R.; Mehmed, Oral

    1992-01-01

    This paper reviews aeroelastic analysis methods for propulsion elements (advanced propellers, compressors and turbines) being developed and used at NASA Lewis Research Center. These aeroelastic models include both structural and aerodynamic components. The structural models include the typical section model, the beam model with and without disk flexibility, and the finite element blade model with plate bending elements. The aerodynamic models are based on the solution of equations ranging from the two-dimensional linear potential equation for a cascade to the three-dimensional Euler equations for multi-blade configurations. Typical results are presented for each aeroelastic model. Suggestions for further research are indicated. All the available aeroelastic models and analysis methods are being incorporated into a unified computer program named APPLE (Aeroelasticity Program for Propulsion at LEwis).

  19. Simulating and mapping spatial complexity using multi-scale techniques

    USGS Publications Warehouse

    De Cola, L.

    1994-01-01

    A central problem in spatial analysis is the mapping of data for complex spatial fields using relatively simple data structures, such as those of a conventional GIS. This complexity can be measured using such indices as multi-scale variance, which reflects spatial autocorrelation, and multi-fractal dimension, which characterizes the values of fields. These indices are computed for three spatial processes: Gaussian noise, a simple mathematical function, and data for a random walk. Fractal analysis is then used to produce a vegetation map of the central region of California based on a satellite image. This analysis suggests that real world data lie on a continuum between the simple and the random, and that a major GIS challenge is the scientific representation and understanding of rapidly changing multi-scale fields. -Author
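    The fractal-dimension index discussed in this record is commonly estimated by box counting. A minimal sketch on one of the paper's example processes, a random walk (the data and box sizes here are illustrative, not the paper's California imagery):

    ```python
    import numpy as np

    # Sketch of a box-counting fractal dimension estimate for a 2-D point set.
    # Data: a planar random-walk trace, rescaled to the unit square.
    rng = np.random.default_rng(2)
    path = np.cumsum(rng.standard_normal((20000, 2)), axis=0)
    path = (path - path.min(axis=0)) / (path.max(axis=0) - path.min(axis=0))

    sizes = [2**k for k in range(2, 8)]            # boxes per side: 4 .. 128
    counts = []
    for n in sizes:
        # Assign each point to a grid cell and count occupied cells.
        cells = np.floor(path * n).clip(0, n - 1).astype(int)
        counts.append(len(set(map(tuple, cells))))

    # Dimension = slope of log(count) versus log(boxes per side).
    dim = np.polyfit(np.log(sizes), np.log(counts), 1)[0]
    print("box-counting dimension:", dim)  # typically between 1 and 2 here
    ```

    A smooth curve would give a slope near 1 and a space-filling field a slope near 2; the index places real-world data on the continuum between the simple and the random, as the abstract describes.
    
    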

  20. Numerical Analysis of Intra-Cavity and Power-Stream Flow Interaction in Multiple Gas-Turbine Disk-Cavities

    NASA Technical Reports Server (NTRS)

    Athavale, M. M.; Przekwas, A. J.; Hendricks, R. C.; Steinetz, B. M.

    1995-01-01

    A numerical analysis methodology and solutions of the interaction between the power stream and multiply-connected, multi-cavity sealed secondary flow fields are presented. Flow solutions for a multi-cavity experimental rig were computed and compared with the experimental data of Daniels and Johnson. The flow solutions illustrate the complex coupling between the main-path and cavity flows, and outline the flow thread that exists throughout the subplatform multiple cavities and seals. The analysis also shows that de-coupled solutions on single cavities are inadequate. The present results show trends similar to the T-700 engine data, which suggest that the changes in the CDP seal altered the flow fields throughout the engine and affected engine performance.

  1. Implementation of a computer database testing and analysis program.

    PubMed

    Rouse, Deborah P

    2007-01-01

    The author is the coordinator of a computer software database testing and analysis program implemented in an associate degree nursing program. Computer software database programs help support the test development and analysis process, and critical thinking is measurable and promoted with their use. The reader of this article will learn what is involved in procuring and implementing a computer database testing and analysis program in an academic nursing program. The use of the computerized database for testing and analysis is approached as a method to promote and evaluate nursing students' critical thinking skills and to prepare them for the National Council Licensure Examination.

  2. The association between blood pressure and incident Alzheimer disease: a systematic review and meta-analysis

    PubMed Central

    Power, Melinda C.; Weuve, Jennifer; Gagne, Joshua J.; McQueen, Matthew B.; Viswanathan, Anand; Blacker, Deborah

    2013-01-01

    Background Many epidemiologic studies have considered the association between blood pressure (BP) and Alzheimer disease, yet the relationship remains poorly understood. Methods In parallel with work on the AlzRisk online database (www.alzrisk.org), we conducted a systematic review to identify all epidemiologic studies meeting pre-specified criteria reporting on the association between hypertension, systolic BP, or diastolic BP and incident Alzheimer disease. When possible, we computed summary measures using random-effects models and explored potential heterogeneity related to age at BP assessment. Results Eighteen studies reporting on 19 populations met the eligibility criteria. We computed summary relative risks (RRΣ) for three measures of BP: hypertension (RRΣ=0.97 [95% confidence interval= 0.80–1.16]); a 10 mm Hg-increase in systolic BP (RRΣ=0.95 [0.91–1.00]); and a 10 mm Hg-increase in diastolic BP (RRΣ=0.94 [0.85–1.04]). We were unable to compute summary estimates for the association between categories of systolic or diastolic BP and Alzheimer disease; however, there did not appear to be a consistent pattern across studies. After stratifying on age at BP assessment, we found a suggestion of an inverse association between late-life hypertension and Alzheimer disease and a suggestion of an adverse association between midlife diastolic hypertension and Alzheimer disease. Conclusions Based on existing epidemiologic research, we cannot determine whether there is a causal association between BP and Alzheimer disease. Selection bias and reverse causation may account for the suggested inverse association between late-life hypertension and Alzheimer disease, but, given the expected direction of these biases, they are less likely to account for the suggestion that midlife hypertension increases risk. We advocate continuing systematic review; the AlzRisk database entry on this topic (www.alzrisk.org), which was completed in parallel with this work, will be updated as new studies are published. PMID:21705906

  3. The association between blood pressure and incident Alzheimer disease: a systematic review and meta-analysis.

    PubMed

    Power, Melinda C; Weuve, Jennifer; Gagne, Joshua J; McQueen, Matthew B; Viswanathan, Anand; Blacker, Deborah

    2011-09-01

    Many epidemiologic studies have considered the association between blood pressure (BP) and Alzheimer disease, yet the relationship remains poorly understood. In parallel with work on the AlzRisk online database (www.alzrisk.org), we conducted a systematic review to identify all epidemiologic studies meeting prespecified criteria reporting on the association between hypertension, systolic BP, or diastolic BP and incident Alzheimer disease. When possible, we computed summary measures using random-effects models and explored potential heterogeneity related to age at BP assessment. Eighteen studies reporting on 19 populations met the eligibility criteria. We computed summary relative risks (RR(Σ)) for 3 measures of BP: hypertension (RR(Σ) = 0.97 [95% confidence interval = 0.80-1.16]); a 10-mm Hg increase in systolic BP (RR(Σ) = 0.95 [0.91-1.00]); and a 10-mm Hg increase in diastolic BP (RR(Σ) = 0.94 [0.85-1.04]). We were unable to compute summary estimates for the association between categories of systolic or diastolic BP and Alzheimer disease; however, there did not appear to be a consistent pattern across studies. After stratifying on age at BP assessment, we found a suggestion of an inverse association between late-life hypertension and Alzheimer disease and a suggestion of an adverse association between midlife diastolic hypertension and Alzheimer disease. Based on existing epidemiologic research, we cannot determine whether there is a causal association between BP and Alzheimer disease. Selection bias and reverse causation may account for the suggested inverse association between late-life hypertension and Alzheimer disease, but, given the expected direction of these biases, they are less likely to account for the suggestion that midlife hypertension increases risk. We advocate continuing systematic review; the AlzRisk database entry on this topic (www.alzrisk.org), which was completed in parallel with this work, will be updated as new studies are published.
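    The random-effects pooling of relative risks described in this record can be sketched with the standard DerSimonian-Laird estimator. The study inputs below are hypothetical, not the 18 studies from this review:

    ```python
    import math

    # Sketch of random-effects pooling of relative risks (DerSimonian-Laird).
    # Each tuple: (RR, lower 95% CI bound, upper 95% CI bound). Hypothetical.
    studies = [(0.90, 0.70, 1.16), (1.05, 0.85, 1.30), (0.95, 0.80, 1.13)]

    y = [math.log(rr) for rr, lo, hi in studies]
    # Standard error recovered from the CI width on the log scale.
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for rr, lo, hi in studies]
    w = [1 / s**2 for s in se]                      # fixed-effect weights

    yb = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - yb) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)         # between-study variance

    wr = [1 / (s**2 + tau2) for s in se]            # random-effects weights
    log_rr = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    se_rr = math.sqrt(1 / sum(wr))
    print("summary RR:", math.exp(log_rr))
    print("95% CI:", math.exp(log_rr - 1.96 * se_rr),
          math.exp(log_rr + 1.96 * se_rr))
    ```

    When the estimated between-study variance is zero, the pooled result coincides with the fixed-effect (inverse-variance) estimate.
    
    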

  4. Academic reading format preferences and behaviors among university students worldwide: A comparative survey analysis

    PubMed Central

    Kurbanoglu, Serap; Boustany, Joumana

    2018-01-01

    This study reports the descriptive and inferential statistical findings of a survey of academic reading format preferences and behaviors of 10,293 tertiary students worldwide. The study hypothesized that country-based differences in schooling systems, socioeconomic development, culture or other factors might have an influence on preferred formats, print or electronic, for academic reading, as well as the learning engagement behaviors of students. The main findings are that country of origin has little to no relationship with or effect on reading format preferences of university students, and that the broad majority of students worldwide prefer to read academic course materials in print. The majority of participants report better focus and retention of information presented in print formats, and more frequently prefer print for longer texts. Additional demographic and post-hoc analysis suggests that format preference has a small relationship with academic rank. The relationship between task demands, format preferences and reading comprehension are discussed. Additional outcomes and implications for the fields of education, psychology, computer science, information science and human-computer interaction are considered. PMID:29847560

  5. Analysis of coherent dynamical processes through computer vision

    NASA Astrophysics Data System (ADS)

    Hack, M. J. Philipp

    2016-11-01

    Visualizations of turbulent boundary layers show an abundance of characteristic arc-shaped structures whose apparent similarity suggests a common origin in a coherent dynamical process. While the structures have been likened to the hairpin vortices observed in the late stages of transitional flow, a consistent description of the underlying mechanism has remained elusive. Detailed studies are complicated by the chaotic nature of turbulence which modulates each manifestation of the process and which renders the isolation of individual structures a challenging task. The present study applies methods from the field of computer vision to capture the time evolution of turbulent flow features and explore the associated physical mechanisms. The algorithm uses morphological operations to condense the structure of the turbulent flow field into a graph described by nodes and links. The low-dimensional geometric information is stored in a database and allows the identification and analysis of equivalent dynamical processes across multiple scales. The framework is not limited to turbulent boundary layers and can also be applied to different types of flows as well as problems from other fields of science.

  6. Parameters of Models of Structural Transformations in Alloy Steel Under Welding Thermal Cycle

    NASA Astrophysics Data System (ADS)

    Kurkin, A. S.; Makarov, E. L.; Kurkin, A. B.; Rubtsov, D. E.; Rubtsov, M. E.

    2017-05-01

    A mathematical model of structural transformations in an alloy steel under the thermal cycle of multipass welding is suggested for computer implementation. The minimum necessary set of parameters for describing the transformations under heating and cooling is determined. Ferritic-pearlitic, bainitic, and martensitic transformations under cooling of a steel are considered. A method for deriving the necessary temperature and time parameters of the model from the chemical composition of the steel is described. Published data are used to derive regression models of the temperature ranges and parameters of transformation kinetics in alloy steels. It is shown that the shortcomings of visual methods for analyzing the final phase composition of steels are responsible for the inaccuracy and mutual inconsistency of published data. The hardness of a specimen, which correlates with other mechanical properties of the material, is chosen as the most objective and reproducible criterion of the final phase composition. The models developed are checked by a comparative analysis of computational results and experimental data on the hardness of 140 alloy steels after cooling at various rates.
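    The abstract derives regression models for transformation temperatures from chemical composition. As an illustration of that style of model, and not the regressions fitted in the paper, Andrews' widely cited (1965) linear formula for the martensite start temperature of low-alloy steels can be coded directly:

```python
def martensite_start(c, mn=0.0, ni=0.0, cr=0.0, mo=0.0):
    """Andrews' (1965) linear regression for the martensite start
    temperature Ms (deg C); element contents are in wt.%."""
    return 539.0 - 423.0 * c - 30.4 * mn - 17.7 * ni - 12.1 * cr - 7.5 * mo
```

    For example, a plain 0.4C-1.0Mn steel gives Ms = 539 - 169.2 - 30.4 = 339.4 deg C; higher alloying depresses Ms, consistent with the composition-dependence the abstract describes.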

  7. The Effects of Mobile-Computer-Supported Collaborative Learning: Meta-Analysis and Critical Synthesis.

    PubMed

    Sung, Yao-Ting; Yang, Je-Ming; Lee, Han-Yueh

    2017-08-01

    One of the trends in collaborative learning is using mobile devices for supporting the process and products of collaboration, which has been forming the field of mobile-computer-supported collaborative learning (mCSCL). Although mobile devices have become valuable collaborative learning tools, evaluative evidence for their substantial contributions to collaborative learning is still scarce. The present meta-analysis, which included 48 peer-reviewed journal articles and doctoral dissertations written over a 16-year period (2000-2015) involving 5,294 participants, revealed that mCSCL has produced meaningful improvements for collaborative learning, with an overall mean effect size of 0.516. Moderator variables, such as domain subject, group size, teaching method, intervention duration, and reward method were related to different effect sizes. The results provided implications for future research and practice, such as suggestions on how to appropriately use the functionalities of mobile devices, how to best leverage mCSCL through effective group learning mechanisms, and what outcome variables should be included in future studies to fully elucidate the process and products of mCSCL.
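    The pooled effect size of 0.516 is the kind of quantity produced by inverse-variance weighting of study-level effects. Below is a minimal fixed-effect sketch; the meta-analysis itself may use a random-effects model, and the helper name `fixed_effect_mean` is hypothetical.

```python
def fixed_effect_mean(effects, variances):
    """Inverse-variance-weighted (fixed-effect) pooled effect size
    and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5
    return pooled, se
```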

  8. The Effects of Mobile-Computer-Supported Collaborative Learning: Meta-Analysis and Critical Synthesis

    PubMed Central

    Sung, Yao-Ting; Yang, Je-Ming; Lee, Han-Yueh

    2017-01-01

    One of the trends in collaborative learning is using mobile devices for supporting the process and products of collaboration, which has been forming the field of mobile-computer-supported collaborative learning (mCSCL). Although mobile devices have become valuable collaborative learning tools, evaluative evidence for their substantial contributions to collaborative learning is still scarce. The present meta-analysis, which included 48 peer-reviewed journal articles and doctoral dissertations written over a 16-year period (2000–2015) involving 5,294 participants, revealed that mCSCL has produced meaningful improvements for collaborative learning, with an overall mean effect size of 0.516. Moderator variables, such as domain subject, group size, teaching method, intervention duration, and reward method were related to different effect sizes. The results provided implications for future research and practice, such as suggestions on how to appropriately use the functionalities of mobile devices, how to best leverage mCSCL through effective group learning mechanisms, and what outcome variables should be included in future studies to fully elucidate the process and products of mCSCL. PMID:28989193

  9. A pertinent approach to solve nonlinear fuzzy integro-differential equations.

    PubMed

    Narayanamoorthy, S; Sathiyapriya, S P

    2016-01-01

    Fuzzy integro-differential equations form an important part of fuzzy analysis theory, with both theoretical and practical value in analytical dynamics, so an appropriate computational algorithm for solving them is essential. In this article, we use parametric forms of fuzzy numbers and suggest an applicable approach for solving nonlinear fuzzy integro-differential equations using the homotopy perturbation method. A clear and detailed description of the proposed method is provided. Our main objective is to illustrate that constructing an appropriate convex homotopy in a proper way leads to highly accurate solutions with less computational work. The stability and convergence of the approximation technique are analyzed to guarantee its efficiency and performance. Numerical examples verify the convergence and validity of the presented technique. Numerical results are tabulated and examined by comparing the obtained approximate solutions with the known exact solutions. Graphical representations of the exact and acquired approximate fuzzy solutions clarify the accuracy of the approach.
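    The parametric form of fuzzy numbers mentioned above replaces a fuzzy number by a pair of r-level bounds. A minimal sketch for the common triangular case follows; the triangular shape is an assumption for illustration, and the paper's equations may involve more general fuzzy numbers.

```python
def triangular_parametric(a, b, c, r):
    """r-level parametric form [u_lower(r), u_upper(r)] of the
    triangular fuzzy number (a, b, c), for 0 <= r <= 1."""
    return a + (b - a) * r, c - (c - b) * r
```

    At r = 1 both bounds collapse to the core b; at r = 0 they span the full support [a, c]. Solving the fuzzy equation then reduces to solving a coupled crisp system for the lower and upper bound functions.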

  10. Thermotaxis is a Robust Mechanism for Thermoregulation in C. elegans Nematodes

    PubMed Central

    Ramot, Daniel; MacInnis, Bronwyn L.; Lee, Hau-Chen; Goodman, Miriam B.

    2013-01-01

    Many biochemical networks are robust to variations in network or stimulus parameters. Although robustness is considered an important design principle of such networks, it is not known whether this principle also applies to higher-level biological processes such as animal behavior. In thermal gradients, C. elegans uses thermotaxis to bias its movement along the direction of the gradient. Here we develop a detailed, quantitative map of C. elegans thermotaxis and use these data to derive a computational model of thermotaxis in the soil, a natural environment of C. elegans. This computational analysis indicates that thermotaxis enables animals to avoid temperatures at which they cannot reproduce, to limit excursions from their adapted temperature, and to remain relatively close to the surface of the soil, where oxygen is abundant. Furthermore, our analysis reveals that this mechanism is robust to large variations in the parameters governing both worm locomotion and temperature fluctuations in the soil. We suggest that, similar to biochemical networks, animals evolve behavioral strategies that are robust, rather than strategies that rely on fine-tuning of specific behavioral parameters. PMID:19020047
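    The biased-movement strategy underlying thermotaxis can be caricatured as a one-dimensional biased random walk toward the preferred temperature. This toy model is illustrative only and far simpler than the soil model in the paper; all parameter names and values are assumptions.

```python
import random

def thermotaxis_walk(t_start, t_pref, steps=2000, dt=0.1, bias=0.5, seed=1):
    """1-D biased random walk in temperature: drift points toward the
    preferred temperature, noise models undirected locomotion."""
    random.seed(seed)
    t = t_start
    for _ in range(steps):
        drift = -bias if t > t_pref else bias  # head toward t_pref
        t += dt * (drift + random.gauss(0.0, 1.0))
    return t
```

    Even this crude strategy keeps the walker near the preferred temperature over a wide range of bias and noise values, echoing the robustness claim of the abstract.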

  11. Estimation of the risk of failure for an endodontically treated maxillary premolar with MODP preparation and CAD/CAM ceramic restorations.

    PubMed

    Lin, Chun-Li; Chang, Yen-Hsiang; Pa, Che-An

    2009-10-01

    This study evaluated the risk of failure for an endodontically treated premolar with mesio-occlusodistal palatal (MODP) preparation and 3 different computer-aided design/computer-aided manufacturing (CAD/CAM) ceramic restoration configurations. Three 3-dimensional finite element (FE) models with CAD/CAM ceramic onlay, endocrown, and conventional crown restorations were constructed for the simulations. The Weibull function was incorporated with FE analysis to calculate the long-term failure probability under different load conditions. The results indicated that the stress values in the enamel, dentin, and luting cement were lowest for the endocrown restoration relative to the other 2 restorations. Weibull analysis revealed that the individual failure probabilities for the endocrown enamel, dentin, and luting cement were markedly lower than those for the onlay and conventional crown restorations. The overall failure probabilities were 27.5%, 1%, and 1% for the onlay, endocrown, and conventional crown restorations, respectively, under normal occlusal conditions. This numeric investigation suggests that endocrown and conventional crown restorations for endodontically treated premolars with MODP preparation present similar longevity.
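    The Weibull function incorporated with the FE analysis maps a computed stress to a long-term failure probability. A sketch of the standard two-parameter form follows; the scale and modulus values used in the paper are not given in the abstract, so the numbers below are placeholders.

```python
import math

def weibull_failure_probability(stress, scale, modulus):
    """Two-parameter Weibull cumulative failure probability:
    P_f = 1 - exp(-(sigma / sigma_0) ** m)."""
    return 1.0 - math.exp(-((stress / scale) ** modulus))
```

    At stress equal to the characteristic strength sigma_0, P_f = 1 - 1/e (about 63.2%); combining such per-element probabilities over the FE mesh yields the overall failure probabilities reported above.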

  12. A Multi-center Milestone Study of Clinical Vertebral CT Segmentation

    PubMed Central

    Yao, Jianhua; Burns, Joseph E.; Forsberg, Daniel; Seitel, Alexander; Rasoulian, Abtin; Abolmaesumi, Purang; Hammernik, Kerstin; Urschler, Martin; Ibragimov, Bulat; Korez, Robert; Vrtovec, Tomaž; Castro-Mateos, Isaac; Pozo, Jose M.; Frangi, Alejandro F.; Summers, Ronald M.; Li, Shuo

    2017-01-01

    A multi-center milestone study of clinical vertebra segmentation is presented in this paper. Vertebra segmentation is a fundamental step for spinal image analysis and intervention. The first half of the study was conducted as the spine segmentation challenge at the 2014 International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI) Workshop on Computational Spine Imaging (CSI 2014). The objective was to evaluate the performance of several state-of-the-art vertebra segmentation algorithms on computed tomography (CT) scans using ten training and five testing datasets, all healthy cases; the second half of the study was conducted after the challenge, where an additional 5 abnormal cases were used for testing to evaluate performance on abnormal cases. Dice coefficients and absolute surface distances were used as evaluation metrics. Segmentation of each vertebra as a single geometric unit, as well as separate segmentation of vertebra substructures, was evaluated. Five teams participated in the comparative study. The top performers achieved Dice coefficients of 0.93 in the upper thoracic, 0.95 in the lower thoracic, and 0.96 in the lumbar spine for healthy cases, and 0.88 in the upper thoracic, 0.89 in the lower thoracic, and 0.92 in the lumbar spine for osteoporotic and fractured cases. The strengths and weaknesses of each method, as well as suggestions for future improvement, are discussed. This is the first multi-center comparative study of vertebra segmentation methods, and it provides an up-to-date performance milestone for the fast-growing field of spinal image analysis and intervention. PMID:26878138
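    The Dice coefficient used as an evaluation metric above compares two binary segmentation masks; a minimal NumPy version:

```python
import numpy as np

def dice_coefficient(a, b):
    """Dice similarity of two binary masks: 2|A ∩ B| / (|A| + |B|).
    Returns 1.0 for two empty masks by convention."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0
```

    A value of 1.0 means perfect overlap with the reference mask; the challenge scores of 0.88-0.96 therefore indicate high but not perfect agreement with the manual ground truth.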

  13. DISTMIX: direct imputation of summary statistics for unmeasured SNPs from mixed ethnicity cohorts.

    PubMed

    Lee, Donghyung; Bigdeli, T Bernard; Williamson, Vernell S; Vladimirov, Vladimir I; Riley, Brien P; Fanous, Ayman H; Bacanu, Silviu-Alin

    2015-10-01

    To increase the signal resolution for large-scale meta-analyses of genome-wide association studies, genotypes at unmeasured single nucleotide polymorphisms (SNPs) are commonly imputed using large multi-ethnic reference panels. However, the ever-increasing size and ethnic diversity of both reference panels and cohorts makes genotype imputation computationally challenging for moderately sized computer clusters. Moreover, genotype imputation requires subject-level genetic data, which, unlike the summary statistics provided by virtually all studies, is not publicly available. While there are far less demanding methods that avoid the genotype imputation step by directly imputing SNP statistics, e.g. Directly Imputing summary STatistics (DIST) proposed by our group, their implicit assumptions make them applicable only to ethnically homogeneous cohorts. To decrease computational and access requirements for the analysis of cosmopolitan cohorts, we propose DISTMIX, which extends DIST capabilities to the analysis of mixed-ethnicity cohorts. The method uses a relevant reference panel to directly impute unmeasured SNP statistics based only on statistics at measured SNPs and estimated/user-specified ethnic proportions. Simulations show that the proposed method adequately controls the Type I error rates. The 1000 Genomes panel imputation of summary statistics from the ethnically diverse Psychiatric Genetic Consortium Schizophrenia Phase 2 suggests that, when compared to genotype imputation methods, DISTMIX offers comparable imputation accuracy for only a fraction of the computational resources. DISTMIX software, its reference population data, and usage examples are publicly available at http://code.google.com/p/distmix. dlee4@vcu.edu Supplementary Data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
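    DIST-style direct imputation rests on the conditional expectation of a multivariate normal: unmeasured z-scores are predicted from measured ones through the reference-panel LD (correlation) matrix. A bare-bones sketch of that core step is given below; DISTMIX additionally weights reference populations by estimated ethnic proportions, which this sketch omits, and the function name is hypothetical.

```python
import numpy as np

def impute_z(z_measured, R_mm, R_um):
    """Conditional-expectation imputation of unmeasured SNP z-scores:
    E[z_u | z_m] = R_um @ R_mm^{-1} @ z_m, where R_mm is the LD matrix
    among measured SNPs and R_um the LD between unmeasured and measured
    SNPs, both estimated from a reference panel."""
    return R_um @ np.linalg.solve(R_mm, z_measured)
```

    In practice R_mm is regularized before inversion; an unmeasured SNP in perfect LD with a measured one simply inherits its z-score.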

  14. A computer program for the design and analysis of low-speed airfoils, supplement

    NASA Technical Reports Server (NTRS)

    Eppler, R.; Somers, D. M.

    1980-01-01

    Three new options were incorporated into an existing computer program for the design and analysis of low speed airfoils. These options permit the analysis of airfoils having variable chord (variable geometry), a boundary layer displacement iteration, and the analysis of the effect of single roughness elements. All three options are described in detail and are included in the FORTRAN IV computer program.

  15. Flame trench analysis of NLS vehicles

    NASA Technical Reports Server (NTRS)

    Zeytinoglu, Nuri

    1993-01-01

    The present study takes the initial steps toward establishing better flame trench design criteria for future National Launch System vehicles. A three-dimensional finite element computer model for predicting the transient thermal and structural behavior of the flame trench walls was developed using both the I-DEAS and MSC/NASTRAN software packages. The results of JANNAF Standardized Plume flowfield calculations of the sea-level exhaust plumes of the Space Shuttle Main Engine (SSME), Space Transportation Main Engine (STME), and Advanced Solid Rocket Motors (ASRM) were analyzed for different axial distances. The results of sample calculations, using the developed finite element model, are included. Further suggestions for enhancing the overall analysis of the flame trench model are also reported.

  16. Birth order, family configuration, and verbal achievement.

    PubMed

    Breland, H M

    1974-12-01

    Two samples of National Merit Scholarship participants tested in 1962 and the entire population of almost 800,000 participants tested in 1965 were examined. Consistent effects were observed in all 3 groups with respect to both birth order and family size (1st-borns and those from smaller families scored higher). Controlling for both socioeconomic variables and maternal age, by analysis of variance as well as by analysis of covariance, failed to alter the relationships. Stepdown analyses suggested that the effects were due to a verbal component and that no differences were attributable to nonverbal factors. Mean test scores were computed for detailed sibship configurations based on birth order, family size, sibling spacing, and sibling sex.

  17. Theoretical and Experimental Studies on the Nonlinear Optical Chromophore para Bromoacetanilide

    NASA Astrophysics Data System (ADS)

    Jothy, V. Bena; Vijayakumar, T.; Jayakumar, V. S.; Udayalekshmi, K.; Ramamurthy, K.; Joe, I. Hubert

    2008-11-01

    Vibrational spectral analysis of the hydrogen-bonded nonlinear optical (NLO) material para bromoacetanilide (PBA) is carried out using NIR FT-Raman and FT-IR spectroscopy. Ab initio molecular orbital computations have been performed at the HF/6-31G(d) level to derive the equilibrium geometry, vibrational wavenumbers, intensities, and first hyperpolarizability. The lowering of the imino stretching wavenumbers suggests the existence of strong intermolecular N-H⋯O hydrogen bonding, substantiated by natural bond orbital (NBO) analysis. Blue-shifted CH stretching wavenumbers, simultaneous activation of the carbonyl stretching mode, and strong activation of low-wavenumber H-bond stretching vibrations show the presence of intramolecular charge transfer in the molecule.

  18. A partisan effect in the efficiency of the US stock market

    NASA Astrophysics Data System (ADS)

    Alvarez-Ramirez, J.; Rodriguez, E.; Espinosa-Paredes, G.

    2012-10-01

    This work examines the presence of a partisan effect in US markets over different presidential periods. The analysis is based on the computation of the fractal scaling dynamics of the Dow Jones Industrial Average by means of detrended fluctuation analysis. The results indicated the presence of several cycles with dominant periods ranging from 4 to 12 years per cycle. It is argued that these periods are within the range for business cycles reported in the recent literature. On the other hand, it is found that over Democratic terms the stock market tends to deviate from random-walk behavior, which suggests important differences in the economic policies implemented by each political party.
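    Detrended fluctuation analysis, the tool used above, measures how the root-mean-square of locally detrended profile fluctuations F(n) scales with window size n; the scaling exponent alpha (the slope of log F(n) versus log n) is about 0.5 for random-walk-like, uncorrelated-increment behavior. A compact NumPy sketch of the basic first-order method:

```python
import numpy as np

def dfa(series, window_sizes):
    """First-order detrended fluctuation analysis: returns F(n) for each
    window size n; alpha is the slope of log F(n) vs log n."""
    profile = np.cumsum(series - np.mean(series))
    fluctuations = []
    for n in window_sizes:
        n_windows = len(profile) // n
        segments = profile[: n_windows * n].reshape(n_windows, n)
        x = np.arange(n)
        mse = []
        for seg in segments:
            coeffs = np.polyfit(x, seg, 1)  # local linear trend
            mse.append(np.mean((seg - np.polyval(coeffs, x)) ** 2))
        fluctuations.append(np.sqrt(np.mean(mse)))
    return np.array(fluctuations)
```

    Applied to daily index returns, deviations of alpha from 0.5 over a presidential term would signal departures from efficient, random-walk pricing of the kind the abstract reports.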

  19. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. 
We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities, which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch. Dr Liliana Teodorescu, Brunel University, ACAT group. The PDF also contains details of the workshop's committees and sponsors.

  20. Lanczos eigensolution method for high-performance computers

    NASA Technical Reports Server (NTRS)

    Bostic, Susan W.

    1991-01-01

    The theory, computational analysis, and applications of a Lanczos algorithm on high-performance computers are presented. The computationally intensive steps of the algorithm are identified as the matrix factorization, the forward/backward equation solution, and the matrix-vector multiplications. These computational steps are optimized to exploit the vector and parallel capabilities of high-performance computers. The savings in computational time from applying optimization techniques such as variable-band and sparse data storage and access, loop unrolling, use of local memory, and compiler directives are presented. Two large-scale structural analysis applications are described: the buckling of a composite blade-stiffened panel with a cutout, and the vibration analysis of a high-speed civil transport. The sequential computational time of 181.6 seconds for the panel problem executed on a CONVEX computer was decreased to 14.1 seconds with the optimized vector algorithm. The best computational time of 23 seconds for the transport problem, with 17,000 degrees of freedom, was achieved on the Cray Y-MP using an average of 3.63 processors.
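    The Lanczos recurrence at the heart of the method reduces a symmetric matrix to tridiagonal form, with matrix-vector products as the dominant cost, which is exactly why the abstract singles them out for optimization. Below is a dense NumPy sketch without the sparse storage, shifting, or reorthogonalization a production structural-analysis solver needs:

```python
import numpy as np

def lanczos(A, v0, m):
    """m steps of the Lanczos recurrence on a symmetric matrix A.
    Returns the tridiagonal coefficients (alpha, beta); the eigenvalues
    of the tridiagonal matrix approximate the extreme eigenvalues of A."""
    n = len(v0)
    V = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(m - 1)
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = A @ V[:, j]               # the dominant matrix-vector product
        alpha[j] = V[:, j] @ w
        w -= alpha[j] * V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            V[:, j + 1] = w / beta[j]
    return alpha, beta
```

    In a buckling or vibration analysis the operator A would be applied through the factorized stiffness matrix (the forward/backward solution step), so all three computational kernels named above appear in each iteration.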
