Sample records for previous computer experience

  1. DIRAC in Large Particle Physics Experiments

    NASA Astrophysics Data System (ADS)

    Stagni, F.; Tsaregorodtsev, A.; Arrabito, L.; Sailer, A.; Hara, T.; Zhang, X.; Consortium, DIRAC

    2017-10-01

    The DIRAC project is developing interware to build and operate distributed computing systems. It provides a development framework and a rich set of services for both Workload and Data Management tasks of large scientific communities. A number of High Energy Physics and Astrophysics collaborations have adopted DIRAC as the base for their computing models. DIRAC was initially developed for the LHCb experiment at the LHC, CERN. Later, the Belle II, BES III and CTA experiments as well as the linear collider detector collaborations started using DIRAC for their computing systems. Some of the experiments built their DIRAC-based systems from scratch, others migrated from previous solutions, either ad hoc or based on different middleware. Adaptation of DIRAC for a particular experiment was enabled through the creation of extensions to meet its specific requirements. Each experiment has a heterogeneous set of computing and storage resources at its disposal that were aggregated through DIRAC into a coherent pool. Users from different experiments can interact with the system in different ways depending on their specific tasks, expertise level and previous experience, using command line tools, python APIs or Web Portals. In this contribution we will summarize the experience of using DIRAC in particle physics collaborations. The problems of migration to DIRAC from previous systems and their solutions will be presented. An overview of specific DIRAC extensions will be given. We hope that this review will be useful for experiments considering an update, or for those designing their computing models.
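
    As an illustration of the python-API interaction mode mentioned above, the following is a minimal sketch of submitting a job through the DIRAC client, assuming a configured installation and a valid proxy; the method names follow the commonly documented DIRAC Job/Dirac API, but the exact signatures and the site name used here are assumptions to be checked against the version shipped by a given experiment's extension.

      # Minimal job-submission sketch (assumes a configured DIRAC client and proxy).
      from DIRAC.Core.Base import Script
      Script.parseCommandLine(ignoreErrors=True)   # initialize the DIRAC runtime first

      from DIRAC.Interfaces.API.Dirac import Dirac
      from DIRAC.Interfaces.API.Job import Job

      job = Job()
      job.setName("hello_dirac")
      job.setExecutable("/bin/echo", arguments="Hello from DIRAC")
      job.setDestination("LCG.CERN.cern")          # hypothetical site name

      result = Dirac().submitJob(job)              # returns an S_OK/S_ERROR dictionary
      print(result)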

  2. Laboratory Experiments for Network Security Instruction

    ERIC Educational Resources Information Center

    Brustoloni, Jose Carlos

    2006-01-01

    We describe a sequence of five experiments on network security that cast students successively in the roles of computer user, programmer, and system administrator. Unlike experiments described in several previous papers, these experiments avoid placing students in the role of attacker. Each experiment starts with an in-class demonstration of an…

  3. Hegemony and Assessment: The Student Experience of Being in a Male Homogenous Higher Education Computing Course

    ERIC Educational Resources Information Center

    Sheedy, Caroline

    2018-01-01

    This work emanates from a previous study examining the experiences of male final year students in computing degree programmes that focused on their perceptions as students where they had few, if any, female classmates. This empirical work consisted of focus groups, with the findings outlined here drawn from two groups that were homogeneous with…

  4. The use of instant medical history in a rural clinic. Case study of the use of computers in an Arkansas physician's office.

    PubMed

    Pierce, B

    2000-05-01

    This study evaluated the acceptance of using computers to take a medical history by rural Arkansas patients. Sex, age, race, education, previous computer experience and owning a computer were used as variables. Patients were asked a series of questions to rate their comfort level with using a computer to take their medical history. Comfort ratings ranged from 30 to 45, with a mean of 36.8 (SEM = 0.67). Neither sex, race, age, education, owning a personal computer, nor prior computer experience had a significant effect on the comfort rating. This study helps alleviate one of the concerns--patient acceptance--about the increasing use of computers in practicing medicine.

  5. Computer program to minimize prediction error in models from experiments with 16 hypercube points and 0 to 6 center points

    NASA Technical Reports Server (NTRS)

    Holms, A. G.

    1982-01-01

    A previous report described a backward deletion procedure of model selection that was optimized for minimum prediction error and which used a multiparameter combination of the F-distribution and an order-statistics distribution due to Cochran. A computer program is described that applies the previously optimized procedure to real data. The use of the program is illustrated by examples.
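
    The optimized procedure itself is not reproduced here; as a rough illustration of generic backward deletion for a linear model (not the report's multiparameter F/order-statistics criterion), one might drop, at each step, the term whose partial-F statistic falls below a cutoff, as in this sketch on synthetic data with an invented cutoff.

      import numpy as np

      # Generic backward-deletion sketch on synthetic data; F_CUT and the data are
      # hypothetical, and this is not the optimized criterion of the report.
      rng = np.random.default_rng(0)
      n, p = 32, 6
      X = rng.normal(size=(n, p))
      y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=n)

      def rss(X, y):
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          r = y - X @ beta
          return float(r @ r)

      keep = list(range(p))
      F_CUT = 4.0
      while len(keep) > 1:
          full = rss(X[:, keep], y)
          dof = n - len(keep)
          # partial F (1 numerator df) for dropping each remaining column
          Fs = [(rss(X[:, [c for c in keep if c != j]], y) - full) / (full / dof)
                for j in keep]
          if min(Fs) >= F_CUT:
              break
          keep.pop(int(np.argmin(Fs)))

      print("retained columns:", keep)   # typically [0, 2] for this synthetic data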

  6. Factors Influencing Skilled Use of the Computer Mouse by School-Aged Children

    ERIC Educational Resources Information Center

    Lane, Alison E.; Ziviani, Jenny M.

    2010-01-01

    Effective use of computers in education for children requires consideration of individual and developmental characteristics of users. There is limited empirical evidence, however, to guide educational programming when it comes to children and their acquisition of computing skills. This paper reports on the influence of previous experience and…

  7. Outline of the Course in Automated Language Processing.

    ERIC Educational Resources Information Center

    Pacak, M.; Roberts, A. Hood

    The course in computational linguistics described in this paper was given at The American University during the spring semester of 1969. The purpose of the course was "to convey to students with no previous experience an appreciation of the growing art of computational linguistics which encompasses every use to which computers can be put in…

  8. Student Preferences toward Microcomputer User Interfaces.

    ERIC Educational Resources Information Center

    Hazari, Sunil I.; Reaves, Rita R.

    1994-01-01

    Describes a study of undergraduates that was conducted to determine students' preferences toward Graphical User Interface versus Command Line Interface during computer-assisted instruction. Previous experience, comfort level, performance scores, and student attitudes are examined and compared, and the computer use survey is appended. (Contains 13…

  9. Hispanic women overcoming deterrents to computer science: A phenomenological study

    NASA Astrophysics Data System (ADS)

    Herling, Lourdes

    The products of computer science are important to all aspects of society and are tools in the solution of the world's problems. It is, therefore, troubling that the United States faces a shortage in qualified graduates in computer science. The number of women and minorities in computer science is significantly lower than the percentage of the U.S. population which they represent. The overall enrollment in computer science programs has continued to decline, with the enrollment of women declining at a higher rate than that of men. This study addressed three aspects of underrepresentation about which there has been little previous research: treating the computing disciplines specifically rather than embedding them within the STEM disciplines, examining what attracts women and minorities to computer science, and considering race/ethnicity and gender in conjunction rather than in isolation. Since women of underrepresented ethnicities are more severely underrepresented than women in general, it is important to consider whether race and ethnicity play a role in addition to gender, as has been suggested by previous research. Therefore, this study examined what attracted Hispanic women to computer science specifically, and determined whether being subject to multiple marginalizations, as both female and Hispanic, played a role in the experiences of Hispanic women currently in computer science. The study found five emergent themes within the experiences of Hispanic women in computer science. Encouragement and role models strongly influenced not only the participants' choice to major in the field but also their persistence. Most of the participants experienced a negative atmosphere and feelings of not fitting in while in college and industry. The interdisciplinary nature of computer science was the most common aspect that attracted the participants to the field. The aptitudes participants most commonly believed are needed for success in computer science are the twenty-first-century skills of problem solving, creativity, and critical thinking. While not all the participants had experience with computers or programming prior to attending college, such experience played a role in the self-confidence of those who did.

  10. High-efficiency multiphoton boson sampling

    NASA Astrophysics Data System (ADS)

    Wang, Hui; He, Yu; Li, Yu-Huai; Su, Zu-En; Li, Bo; Huang, He-Liang; Ding, Xing; Chen, Ming-Cheng; Liu, Chang; Qin, Jian; Li, Jin-Peng; He, Yu-Ming; Schneider, Christian; Kamp, Martin; Peng, Cheng-Zhi; Höfling, Sven; Lu, Chao-Yang; Pan, Jian-Wei

    2017-06-01

    Boson sampling is considered as a strong candidate to demonstrate 'quantum computational supremacy' over classical computers. However, previous proof-of-principle experiments suffered from small photon number and low sampling rates owing to the inefficiencies of the single-photon sources and multiport optical interferometers. Here, we develop two central components for high-performance boson sampling: robust multiphoton interferometers with 99% transmission rate and actively demultiplexed single-photon sources based on a quantum dot-micropillar with simultaneously high efficiency, purity and indistinguishability. We implement and validate three-, four- and five-photon boson sampling, and achieve sampling rates of 4.96 kHz, 151 Hz and 4 Hz, respectively, which are over 24,000 times faster than previous experiments. Our architecture can be scaled up for a larger number of photons and with higher sampling rates to compete with classical computers, and might provide experimental evidence against the extended Church-Turing thesis.
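
    For context, the classical hardness of the task rests on the fact that, for n single photons injected into an m-mode interferometer described by a unitary U, the probability of a collision-free output configuration S is governed by a matrix permanent, which is #P-hard to compute (standard boson-sampling relation, stated here only for orientation):

      P(S) \;=\; \bigl|\mathrm{Per}(U_S)\bigr|^{2},
      \qquad U_S \in \mathbb{C}^{n \times n}\ \text{the submatrix of } U \text{ selected by the occupied input and output modes.}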

  11. Comfort and experience with online learning: trends over nine years and associations with knowledge.

    PubMed

    Cook, David A; Thompson, Warren G

    2014-07-01

    Some evidence suggests that attitude toward computer-based instruction is an important determinant of success in online learning. We sought to determine how comfort using computers and perceptions of prior online learning experiences have changed over the past decade, and how these associate with learning outcomes. Each year from 2003-2011 we conducted a prospective trial of online learning. As part of each year's study, we asked medicine residents about their comfort using computers and if their previous experiences with online learning were favorable. We assessed knowledge using a multiple-choice test. We used regression to analyze associations and changes over time. 371 internal medicine and family medicine residents participated. Neither comfort with computers nor perceptions of prior online learning experiences showed a significant change across years (p > 0.61), with mean comfort rating 3.96 (maximum 5 = very comfortable) and mean experience rating 4.42 (maximum 6 = strongly agree [favorable]). Comfort showed no significant association with knowledge scores (p = 0.39) but perceptions of prior experiences did, with a 1.56% rise in knowledge score for a 1-point rise in experience score (p = 0.02). Correlations among comfort, perceptions of prior experiences, and number of prior experiences were all small and not statistically significant. Comfort with computers and perceptions of prior experience with online learning remained stable over nine years. Prior good experiences (but not comfort with computers) demonstrated a modest association with knowledge outcomes, suggesting that prior course satisfaction may influence subsequent learning.
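
    Illustrative only: the association reported above (a 1.56% rise in knowledge score per 1-point rise in experience rating) corresponds to a simple linear regression, sketched here on synthetic data with an invented effect size and noise level.

      import numpy as np
      from scipy.stats import linregress

      # Synthetic data shaped like the study: 371 residents, experience rated 1-6,
      # knowledge as a percentage score. The planted slope is for illustration only.
      rng = np.random.default_rng(7)
      experience = rng.integers(1, 7, size=371).astype(float)
      knowledge = 70 + 1.56 * experience + rng.normal(0, 8, size=371)

      fit = linregress(experience, knowledge)
      print(f"slope = {fit.slope:.2f} percentage points per rating point, p = {fit.pvalue:.3f}")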

  12. Using Computer Simulations in Chemistry Problem Solving

    ERIC Educational Resources Information Center

    Avramiotis, Spyridon; Tsaparlis, Georgios

    2013-01-01

    This study is concerned with the effects of computer simulations of two novel chemistry problems on the problem solving ability of students. A control-experimental group research design, equalized by pair groups (n_Exp = n_Ctrl = 78), was used. The students had no previous experience of chemical practical work. Student…

  13. The impact of supercomputers on experimentation: A view from a national laboratory

    NASA Technical Reports Server (NTRS)

    Peterson, V. L.; Arnold, J. O.

    1985-01-01

    The relative roles of large scale scientific computers and physical experiments in several science and engineering disciplines are discussed. Increasing dependence on computers is shown to be motivated both by the rapid growth in computer speed and memory, which permits accurate numerical simulation of complex physical phenomena, and by the rapid reduction in the cost of performing a calculation, which makes computation an increasingly attractive complement to experimentation. Computer speed and memory requirements are presented for selected areas of such disciplines as fluid dynamics, aerodynamics, aerothermodynamics, chemistry, atmospheric sciences, astronomy, and astrophysics, together with some examples of the complementary nature of computation and experiment. Finally, the impact of the emerging role of computers in the technical disciplines is discussed in terms of both the requirements for experimentation and the attainment of previously inaccessible information on physical processes.

  14. Qualifying for the Green500: Experience with the newest generation of supercomputers at LANL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yilk, Todd

    The High Performance Computing Division of Los Alamos National Laboratory recently brought four new supercomputing platforms on line: Trinity, with separate partitions built around the Haswell and Knights Landing CPU architectures for capability computing, and Grizzly, Fire, and Ice for capacity computing applications. The power monitoring infrastructure of these machines is significantly enhanced over previous supercomputing generations at LANL, and all were qualified at the highest level of the Green500 benchmark. This paper discusses supercomputing at LANL and the Green500 benchmark, and offers notes on our experience meeting the Green500's reporting requirements.

  15. Qualifying for the Green500: Experience with the newest generation of supercomputers at LANL

    DOE PAGES

    Yilk, Todd

    2018-02-17

    The High Performance Computing Division of Los Alamos National Laboratory recently brought four new supercomputing platforms on line: Trinity, with separate partitions built around the Haswell and Knights Landing CPU architectures for capability computing, and Grizzly, Fire, and Ice for capacity computing applications. The power monitoring infrastructure of these machines is significantly enhanced over previous supercomputing generations at LANL, and all were qualified at the highest level of the Green500 benchmark. This paper discusses supercomputing at LANL and the Green500 benchmark, and offers notes on our experience meeting the Green500's reporting requirements.
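
    For readers unfamiliar with the list: the Green500 ranks systems by energy efficiency rather than raw speed, essentially (assuming the standard metric)

      \text{efficiency}\ [\mathrm{GFLOPS/W}] \;=\; \frac{R_{\max}\ \text{(sustained HPL performance)}}{P_{\mathrm{avg}}\ \text{(average system power during the run)}},

    so the reporting requirements centre on how performance and, especially, power are measured during the benchmark run.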

  16. Measurement of information and communication technology experience and attitudes to e-learning of students in the healthcare professions: integrative review.

    PubMed

    Wilkinson, Ann; While, Alison E; Roberts, Julia

    2009-04-01

    This paper is a report of a review to describe and discuss the psychometric properties of instruments used in healthcare education settings measuring experience and attitudes of healthcare students regarding their information and communication technology skills and their use of computers and the Internet for education. Healthcare professionals are expected to be computer and information literate at registration. A previous review of evaluative studies of computer-based learning suggests that methods of measuring learners' attitudes to computers and computer aided learning are problematic. A search of eight health and social science databases located 49 papers, the majority published between 1995 and January 2007, focusing on the experience and attitudes of students in the healthcare professions towards computers and e-learning. An integrative approach was adopted, with narrative description of findings. Criteria for inclusion were quantitative studies using survey tools with samples of healthcare students and concerning computer and information literacy skills, access to computers, experience with computers and use of computers and the Internet for education purposes. Since the 1980s a number of instruments have been developed, mostly in the United States of America, to measure attitudes to computers, anxiety about computer use, information and communication technology skills, satisfaction and more recently attitudes to the Internet and computers for education. The psychometric properties are poorly described. Advances in computers and technology mean that many earlier tools are no longer valid. Measures of the experience and attitudes of healthcare students to the increased use of e-learning require development in line with computer and technology advances.

  17. Feasibility of Virtual Machine and Cloud Computing Technologies for High Performance Computing

    DTIC Science & Technology

    2014-05-01

    Red Hat Enterprise Linux; SaaS: software as a service; VM: virtual machine; vNUMA: virtual non-uniform memory access; WRF: weather research and forecasting. ...previously mentioned in Chapter I Section B1 of this paper, which is used to run the weather research and forecasting (WRF) model in their experiments... against a VMware virtualization solution of WRF. The experiment consisted of running WRF in a standard configuration between the D-VTM and VMware while...

  18. Exploring the experience of clients with tetraplegia utilizing assistive technology for computer access.

    PubMed

    Folan, Alyce; Barclay, Linda; Cooper, Cathy; Robinson, Merren

    2015-01-01

    Assistive technology for computer access can be used to facilitate people with a spinal cord injury to utilize mainstream computer applications, thereby enabling participation in a variety of meaningful occupations. The aim of this study was to gain an understanding of the experiences of clients with tetraplegia trialing assistive technologies for computer access during different stages in a public rehabilitation service. In order to explore these experiences, a qualitative methodology was selected. Data were collected from seven participants using semi-structured interviews, which were audio-taped, transcribed and analyzed thematically. Three main themes were identified: getting back into life, assisting in adjusting to injury and learning new skills. The findings from this study demonstrated that people with tetraplegia can be assisted to return to previous life roles or engage in new roles, through developing skills in the use of assistive technology for computer access. Being able to use computers for meaningful activities contributed to the participants gaining an enhanced sense of self-efficacy, and thereby quality of life. Implications for Rehabilitation: Findings from this pilot study indicate that people with tetraplegia can be assisted to return to previous life roles, and develop new roles that have meaning to them, through the use of assistive technologies for computer use. Being able to use the internet to socialize and complete daily tasks contributed to the participants gaining a sense of control over their lives. Early introduction to assistive technology is important to ensure sufficient time for newly injured people to feel comfortable enough with the assistive technology to use the computers productively by the time of discharge. Further research into this important and expanding area is indicated.

  19. Experimental Realization of High-Efficiency Counterfactual Computation.

    PubMed

    Kong, Fei; Ju, Chenyong; Huang, Pu; Wang, Pengfei; Kong, Xi; Shi, Fazhan; Jiang, Liang; Du, Jiangfeng

    2015-08-21

    Counterfactual computation (CFC) exemplifies the fascinating quantum process by which the result of a computation may be learned without actually running the computer. In previous experimental studies, the counterfactual efficiency is limited to below 50%. Here we report an experimental realization of the generalized CFC protocol, in which the counterfactual efficiency can break the 50% limit and even approach unity in principle. The experiment is performed with the spins of a negatively charged nitrogen-vacancy color center in diamond. Taking advantage of the quantum Zeno effect, the computer can remain in the not-running subspace due to the frequent projection by the environment, while the computation result can be revealed by final detection. The counterfactual efficiency up to 85% has been demonstrated in our experiment, which opens the possibility of many exciting applications of CFC, such as high-efficiency quantum integration and imaging.
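
    As a rough illustration of the quantum Zeno mechanism invoked here (generic textbook form, not the specifics of the generalized CFC protocol): if the "running" amplitude is built up in N small steps of rotation angle pi/2N, each followed by a projection back onto the not-running subspace, the survival probability is

      P_{\text{survive}} \;=\; \left[\cos^{2}\!\left(\frac{\pi}{2N}\right)\right]^{N} \;\longrightarrow\; 1 \quad (N \to \infty),

    which is why sufficiently frequent projection by the environment can keep the computer in the not-running subspace while the computation result is still revealed at the final detection.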

  20. Experimental Realization of High-Efficiency Counterfactual Computation

    NASA Astrophysics Data System (ADS)

    Kong, Fei; Ju, Chenyong; Huang, Pu; Wang, Pengfei; Kong, Xi; Shi, Fazhan; Jiang, Liang; Du, Jiangfeng

    2015-08-01

    Counterfactual computation (CFC) exemplifies the fascinating quantum process by which the result of a computation may be learned without actually running the computer. In previous experimental studies, the counterfactual efficiency is limited to below 50%. Here we report an experimental realization of the generalized CFC protocol, in which the counterfactual efficiency can break the 50% limit and even approach unity in principle. The experiment is performed with the spins of a negatively charged nitrogen-vacancy color center in diamond. Taking advantage of the quantum Zeno effect, the computer can remain in the not-running subspace due to the frequent projection by the environment, while the computation result can be revealed by final detection. The counterfactual efficiency up to 85% has been demonstrated in our experiment, which opens the possibility of many exciting applications of CFC, such as high-efficiency quantum integration and imaging.

  1. Critical Emergency Medicine Procedural Skills: A Comparative Study of Methods for Teaching and Assessment.

    ERIC Educational Resources Information Center

    Chapman, Dane M.; And Others

    Three critical procedural skills in emergency medicine were evaluated using three assessment modalities--written, computer, and animal model. The effects of computer practice and previous procedure experience on skill competence were also examined in an experimental sequential assessment design. Subjects were six medical students, six residents,…

  2. The relationship between emotional intelligence, previous caring experience and mindfulness in student nurses and midwives: a cross sectional analysis.

    PubMed

    Snowden, Austyn; Stenhouse, Rosie; Young, Jenny; Carver, Hannah; Carver, Fiona; Brown, Norrie

    2015-01-01

    Emotional Intelligence (EI), previous caring experience and mindfulness training may have a positive impact on nurse education. More evidence is needed to support the use of these variables in nurse recruitment and retention. To explore the relationship between EI, gender, age, programme of study, previous caring experience and mindfulness training. Cross-sectional element of a longitudinal study. 938 first-year nursing, midwifery and computing students at two Scottish Higher Education Institutes (HEIs) who entered their programme in September 2013. Participants completed a measure of 'trait' EI: Trait Emotional Intelligence Questionnaire Short Form (TEIQue-SF); and 'ability' EI: Schutte et al.'s (1998) Emotional Intelligence Scale (SEIS). Demographics, previous caring experience and previous training in mindfulness were recorded. Relationships between variables were tested using non-parametric tests. Emotional intelligence increased with age on both measures of EI [TEIQ-SF H(5)=15.157, p=0.001; SEIS H(5)=11.388, p=0.044]. Females (n=786) scored higher than males (n=149) on both measures [TEIQ-SF, U=44,931, z=-4.509, p<.001; SEIS, U=44,744, z=-5.563, p<.001]. Nursing students scored higher than computing students [TEIQ-SF H(5)=46.496, p<.001; SEIS H(5)=33.309, p<0.001]. There were no statistically significant differences in TEIQ-SF scores between those who had previous mindfulness training (n=50) and those who had not (n=857) [U=22,980, z=0.864, p=0.388]. However, median SEIS was statistically significantly different according to mindfulness training [U=25,115.5, z=2.05, p=.039]. Neither measure demonstrated statistically significant differences between those with (n=492) and without (n=479) previous caring experience [TEIQ-SF, U=112,102, z=0.938, p=.348; SEIS, U=115,194.5, z=1.863, p=0.063]. Previous caring experience was not associated with higher emotional intelligence. Mindfulness training was associated with higher 'ability' emotional intelligence. Implications for recruitment, retention and further research are explored. Copyright © 2014. Published by Elsevier Ltd.
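
    Illustrative only: the non-parametric tests named above are standard library calls; a sketch on synthetic data (group sizes mirror the study, score distributions are invented) might look like this.

      import numpy as np
      from scipy.stats import mannwhitneyu, kruskal

      # Synthetic scores; means and spreads are invented for illustration.
      rng = np.random.default_rng(3)
      female = rng.normal(5.0, 0.8, 786)
      male = rng.normal(4.7, 0.8, 149)
      u, p = mannwhitneyu(female, male, alternative="two-sided")
      print(f"Mann-Whitney U = {u:.0f}, p = {p:.3g}")

      nursing = rng.normal(5.0, 0.8, 700)
      midwifery = rng.normal(5.0, 0.8, 150)
      computing = rng.normal(4.5, 0.8, 88)
      h, p = kruskal(nursing, midwifery, computing)
      print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3g}")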

  3. Comfort and experience with online learning: trends over nine years and associations with knowledge

    PubMed Central

    2014-01-01

    Background Some evidence suggests that attitude toward computer-based instruction is an important determinant of success in online learning. We sought to determine how comfort using computers and perceptions of prior online learning experiences have changed over the past decade, and how these associate with learning outcomes. Methods Each year from 2003–2011 we conducted a prospective trial of online learning. As part of each year’s study, we asked medicine residents about their comfort using computers and if their previous experiences with online learning were favorable. We assessed knowledge using a multiple-choice test. We used regression to analyze associations and changes over time. Results 371 internal medicine and family medicine residents participated. Neither comfort with computers nor perceptions of prior online learning experiences showed a significant change across years (p > 0.61), with mean comfort rating 3.96 (maximum 5 = very comfortable) and mean experience rating 4.42 (maximum 6 = strongly agree [favorable]). Comfort showed no significant association with knowledge scores (p = 0.39) but perceptions of prior experiences did, with a 1.56% rise in knowledge score for a 1-point rise in experience score (p = 0.02). Correlations among comfort, perceptions of prior experiences, and number of prior experiences were all small and not statistically significant. Conclusions Comfort with computers and perceptions of prior experience with online learning remained stable over nine years. Prior good experiences (but not comfort with computers) demonstrated a modest association with knowledge outcomes, suggesting that prior course satisfaction may influence subsequent learning. PMID:24985690

  4. Shock compression response of cold-rolled Ni/Al multilayer composites

    DOE PAGES

    Specht, Paul E.; Weihs, Timothy P.; Thadhani, Naresh N.

    2017-01-06

    Uniaxial strain, plate-on-plate impact experiments were performed on cold-rolled Ni/Al multilayer composites and the resulting Hugoniot was determined through time-resolved measurements combined with impedance matching. The experimental Hugoniot agreed with that previously predicted by two dimensional (2D) meso-scale calculations. Additional 2D meso-scale simulations were performed using the same computational method as the prior study to reproduce the experimentally measured free surface velocities and stress profiles. Finally, these simulations accurately replicated the experimental profiles, providing additional validation for the previous computational work.
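
    For orientation, the Hugoniot state behind a shock and the impedance-matching construction mentioned above follow from the standard Rankine-Hugoniot jump conditions for a material initially at rest (generic relations, not results of this study):

      \rho_0 U_s \;=\; \rho\,(U_s - u_p), \qquad \sigma - \sigma_0 \;=\; \rho_0 U_s u_p,

    together with the impact condition that stress and particle velocity are continuous at the flyer/target interface, \sigma_{\mathrm{flyer}}(v_{\mathrm{imp}} - u_p) = \sigma_{\mathrm{target}}(u_p), which fixes the Hugoniot point (\sigma, u_p) of the composite from the measured wave profiles.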

  5. Probability distributions of molecular observables computed from Markov models. II. Uncertainties in observables and their time-evolution

    NASA Astrophysics Data System (ADS)

    Chodera, John D.; Noé, Frank

    2010-09-01

    Discrete-state Markov (or master equation) models provide a useful simplified representation for characterizing the long-time statistical evolution of biomolecules in a manner that allows direct comparison with experiments as well as the elucidation of mechanistic pathways for an inherently stochastic process. A vital part of meaningful comparison with experiment is the characterization of the statistical uncertainty in the predicted experimental measurement, which may take the form of an equilibrium measurement of some spectroscopic signal, the time-evolution of this signal following a perturbation, or the observation of some statistic (such as the correlation function) of the equilibrium dynamics of a single molecule. Without meaningful error bars (which arise from both approximation and statistical error), there is no way to determine whether the deviations between model and experiment are statistically meaningful. Previous work has demonstrated that a Bayesian method that enforces microscopic reversibility can be used to characterize the statistical component of correlated uncertainties in state-to-state transition probabilities (and functions thereof) for a model inferred from molecular simulation data. Here, we extend this approach to include the uncertainty in observables that are functions of molecular conformation (such as surrogate spectroscopic signals) characterizing each state, permitting the full statistical uncertainty in computed spectroscopic experiments to be assessed. We test the approach in a simple model system to demonstrate that the computed uncertainties provide a useful indicator of statistical variation, and then apply it to the computation of the fluorescence autocorrelation function measured for a dye-labeled peptide previously studied by both experiment and simulation.
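
    A much-simplified sketch of the underlying idea, propagating transition-matrix uncertainty into an equilibrium observable, is given below; it draws rows independently from a Dirichlet posterior rather than using the authors' reversibility-enforcing Bayesian sampler, and the transition counts and per-state observable values are hypothetical.

      import numpy as np

      # counts[i, j]: observed i -> j transitions; a[i]: observable value of state i.
      rng = np.random.default_rng(1)
      counts = np.array([[90, 10,  0],
                         [ 8, 80, 12],
                         [ 0, 15, 85]], dtype=float)
      a = np.array([0.2, 1.0, 1.8])            # e.g., a surrogate spectroscopic signal

      def stationary(T):
          # left eigenvector of T for eigenvalue 1, normalized to a probability vector
          w, v = np.linalg.eig(T.T)
          pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
          pi = np.abs(pi)
          return pi / pi.sum()

      samples = []
      for _ in range(2000):
          T = np.vstack([rng.dirichlet(row + 1.0) for row in counts])   # posterior draw
          samples.append(stationary(T) @ a)     # equilibrium expectation of the observable

      samples = np.array(samples)
      print(f"<A> = {samples.mean():.3f} +/- {samples.std():.3f}")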

  6. Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction.

    PubMed

    Nass, C; Lee, K M

    2001-09-01

    Would people exhibit similarity-attraction and consistency-attraction toward unambiguously computer-generated speech even when personality is clearly not relevant? In Experiment 1, participants (extrovert or introvert) heard a synthesized voice (extrovert or introvert) on a book-buying Web site. Participants accurately recognized personality cues in text to speech and showed similarity-attraction in their evaluation of the computer voice, the book reviews, and the reviewer. Experiment 2, in a Web auction context, added personality of the text to the previous design. The results replicated Experiment 1 and demonstrated consistency (voice and text personality)-attraction. To maximize liking and trust, designers should set parameters, for example, words per minute or frequency range, that create a personality that is consistent with the user and the content being presented.

  7. Spider World: A Robot Language for Learning to Program. Assessing the Cognitive Consequences of Computer Environments for Learning (ACCCEL).

    ERIC Educational Resources Information Center

    Dalbey, John; Linn, Marcia

    Spider World is an interactive program designed to help individuals with no previous computer experience to learn the fundamentals of programming. The program emphasizes cognitive tasks which are central to programming and provides significant problem-solving opportunities. In Spider World, the user commands a hypothetical robot (called the…

  8. Mechanisms of Reference Frame Selection in Spatial Term Use: Computational and Empirical Studies

    ERIC Educational Resources Information Center

    Schultheis, Holger; Carlson, Laura A.

    2017-01-01

    Previous studies have shown that multiple reference frames are available and compete for selection during the use of spatial terms such as "above." However, the mechanisms that underlie the selection process are poorly understood. In the current paper we present two experiments and a comparison of three computational models of selection…

  9. An Analysis of Graduate Nursing Students' Innovation-Decision Process

    PubMed Central

    Kacynski, Kathryn A.; Roy, Katrina D.

    1984-01-01

    This study's purpose was to examine the innovation-decision process used by graduate nursing students when deciding to use computer applications. Graduate nursing students enrolled in a mandatory research class were surveyed before and after their use of a mainframe computer for beginning data analysis about their general attitudes towards computers, individual characteristics such as “cosmopoliteness”, and their desire to learn more about a computer application. It was expected that an experimental intervention, a videotaped demonstration of interactive video instruction of cardiopulmonary resuscitation (CPR); previous computer experience; and the subject's “cosmopoliteness” would influence attitudes towards computers and the desire to learn more about a computer application.

  10. Enhancements to the Image Analysis Tool for Core Punch Experiments and Simulations (vs. 2014)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogden, John Edward; Unal, Cetin

    A previous paper (Hogden & Unal, 2012, Image Analysis Tool for Core Punch Experiments and Simulations) described an image processing computer program developed at Los Alamos National Laboratory. This program has proven useful, so development has continued. In this paper we describe enhancements to the program as of 2014.

  11. Star Identification Without Attitude Knowledge: Testing with X-Ray Timing Experiment Data

    NASA Technical Reports Server (NTRS)

    Ketchum, Eleanor

    1997-01-01

    As the budget for the scientific exploration of space shrinks, the need for more autonomous spacecraft increases. For a spacecraft with a star tracker, the ability to determine attitude autonomously from a lost-in-space state requires the capability to identify the stars in the field of view of the tracker. Although there have been efforts to produce autonomous star trackers which perform this function internally, many programs cannot afford these sensors. The author previously presented a method for identifying stars without a priori attitude knowledge, specifically targeted at onboard computers, as it minimizes the necessary computer storage. The method has previously been tested with simulated data. This paper provides results of star identification without a priori attitude knowledge using flight data from two 8 by 8 degree charge coupled device star trackers onboard the X-Ray Timing Experiment.

  12. Slow Impacts on Strong Targets Bring on the Heat

    NASA Astrophysics Data System (ADS)

    Melosh, H. J.; Ivanov, B. A.

    2018-03-01

    An important new paper by Kurosawa and Genda (2017, https://doi.org/10.1002/2017GL076285) reports a previously overlooked source of heating in low velocity meteorite impacts. Plastic deformation of the pressure-strengthened rocks behind the shock front dissipates energy, which appears as heat in addition to that generated across the shock wave itself. This heat source has surprisingly escaped explicit attention for decades: First, because it is minimized in the geometry typically chosen for laboratory experiments; and second because it is most important in rocks, and less so for the metals usually used in experiments. Nevertheless, modern numerical computer codes that include strength do compute this heating correctly. This raises the philosophical question of whether we can claim to understand some process just because our computer codes compute the results correctly.

  13. First Experiences with CMS Data Storage on the GEMSS System at the INFN-CNAF Tier-1

    NASA Astrophysics Data System (ADS)

    Andreotti, D.; Bonacorsi, D.; Cavalli, A.; Dal Pra, S.; Dell'Agnello, L.; Forti, Alberto; Grandi, C.; Gregori, D.; Li Gioi, L.; Martelli, B.; Prosperini, A.; Ricci, P. P.; Ronchieri, Elisabetta; Sapunenko, V.; Sartirana, A.; Vagnoni, V.; Zappi, Riccardo

    A brand new Mass Storage System solution called "Grid-Enabled Mass Storage System" (GEMSS), based on the Storage Resource Manager (StoRM) developed by INFN, on the General Parallel File System by IBM and on the Tivoli Storage Manager by IBM, has been tested and deployed at the INFN-CNAF Tier-1 Computing Centre in Italy. After a successful stress test phase, the solution is now being used in production for the data custodiality of the CMS experiment at CNAF. All data previously recorded on the CASTOR system have been transferred to GEMSS. As a final validation of the GEMSS system, some of the computing tests done in the context of the WLCG "Scale Test for the Experiment Program" (STEP'09) challenge were repeated in September-October 2009 and compared with the results previously obtained with CASTOR in June 2009. In this paper, the GEMSS system basics, the stress test activity and the deployment phase, as well as the reliability and performance of the system, are overviewed. The experiences in the use of GEMSS at CNAF in preparing for the first months of data taking of the CMS experiment at the Large Hadron Collider are also presented.

  14. Shock compression response of cold-rolled Ni/Al multilayer composites

    NASA Astrophysics Data System (ADS)

    Specht, Paul E.; Weihs, Timothy P.; Thadhani, Naresh N.

    2017-01-01

    Uniaxial strain, plate-on-plate impact experiments were performed on cold-rolled Ni/Al multilayer composites and the resulting Hugoniot was determined through time-resolved measurements combined with impedance matching. The experimental Hugoniot agreed with that previously predicted by two dimensional (2D) meso-scale calculations [Specht et al., J. Appl. Phys. 111, 073527 (2012)]. Additional 2D meso-scale simulations were performed using the same computational method as the prior study to reproduce the experimentally measured free surface velocities and stress profiles. These simulations accurately replicated the experimental profiles, providing additional validation for the previous computational work.

  15. Effect of computer game playing on baseline laparoscopic simulator skills.

    PubMed

    Halvorsen, Fredrik H; Cvancarova, Milada; Fosse, Erik; Mjåland, Odd

    2013-08-01

    Studies examining the possible association between computer game playing and laparoscopic performance in general have yielded conflicting results, and a relationship between computer game playing and baseline performance on laparoscopic simulators has not been established. The aim of this study was to examine the possible association between previous and present computer game playing and baseline performance on a virtual reality laparoscopic simulator in a sample of potential future medical students. The participating students completed a questionnaire covering the weekly amount and type of computer game playing activity during the previous year and 3 years ago. They then performed 2 repetitions of 2 tasks ("gallbladder dissection" and "traverse tube") on a virtual reality laparoscopic simulator. Performance on the simulator was then analyzed for association with their computer game experience. Setting: a local high school, Norway. Forty-eight students from 2 high school classes volunteered to participate in the study. No association between prior and present computer game playing and baseline performance was found. The results were similar both for prior and present action game playing and prior and present computer game playing in general. Our results indicate that prior and present computer game playing may not affect baseline performance in a virtual reality simulator.

  16. CFD Modeling of Free-Piston Stirling Engines

    NASA Technical Reports Server (NTRS)

    Ibrahim, Mounir B.; Zhang, Zhi-Guo; Tew, Roy C., Jr.; Gedeon, David; Simon, Terrence W.

    2001-01-01

    NASA Glenn Research Center (GRC) is funding Cleveland State University (CSU) to develop a reliable Computational Fluid Dynamics (CFD) code that can predict engine performance with the goal of significant improvements in accuracy when compared to one-dimensional (1-D) design code predictions. The funding also includes conducting code validation experiments at both the University of Minnesota (UMN) and CSU. In this paper a brief description of the work-in-progress is provided in the two areas (CFD and Experiments). Also, previous test results are compared with computational data obtained using (1) a 2-D CFD code obtained from Dr. Georg Scheuerer and further developed at CSU and (2) a multidimensional commercial code CFD-ACE+. The test data and computational results are for (1) a gas spring and (2) a single piston/cylinder with attached annular heat exchanger. The comparisons among the codes are discussed. The paper also discusses plans for conducting code validation experiments at CSU and UMN.

  17. The synergy of modeling and novel experiments for melt crystal growth research

    NASA Astrophysics Data System (ADS)

    Derby, Jeffrey J.

    2018-05-01

    Computational modeling and novel experiments, when performed together, can enable the identification of new, fundamental mechanisms important for the growth of bulk crystals from the melt. In this paper, we present a compelling example of this synergy via the discovery of previously unascertained physical mechanisms that govern the engulfment of silicon carbide particles during the growth of crystalline silicon.

  18. Investigation of models for large-scale meteorological prediction experiments

    NASA Technical Reports Server (NTRS)

    Spar, J.

    1973-01-01

    Studies are reported of the long term responses of the model atmosphere to anomalies in snow cover and sea surface temperature. An abstract of a previously issued report on the computed response to surface anomalies in a global atmospheric model is presented, and the experiments on the effects of transient sea surface temperature on the Mintz-Arakawa atmospheric model are reported.

  19. Micro-Ramp Flow Control for Oblique Shock Interactions: Comparisons of Computational and Experimental Data

    NASA Technical Reports Server (NTRS)

    Hirt, Stefanie M.; Reich, David B.; O'Connor, Michael B.

    2010-01-01

    Computational fluid dynamics was used to study the effectiveness of micro-ramp vortex generators to control oblique shock boundary layer interactions. Simulations were based on experiments previously conducted in the 15 x 15 cm supersonic wind tunnel at NASA Glenn Research Center. Four micro-ramp geometries were tested at Mach 2.0 varying the height, chord length, and spanwise spacing between micro-ramps. The overall flow field was examined. Additionally, key parameters such as boundary-layer displacement thickness, momentum thickness and incompressible shape factor were also examined. The computational results predicted the effects of the micro-ramps well, including the trends for the impact that the devices had on the shock boundary layer interaction. However, computing the shock boundary layer interaction itself proved to be problematic since the calculations predicted more pronounced adverse effects on the boundary layer due to the shock than were seen in the experiment.
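
    For reference, the boundary-layer parameters named here are the standard incompressible integral quantities (definitions only, not results from the paper):

      \delta^{*} = \int_{0}^{\delta}\left(1 - \frac{u}{U_e}\right) dy, \qquad
      \theta = \int_{0}^{\delta}\frac{u}{U_e}\left(1 - \frac{u}{U_e}\right) dy, \qquad
      H = \frac{\delta^{*}}{\theta},

    where U_e is the boundary-layer-edge velocity; broadly, a lower shape factor H downstream of the micro-ramps indicates a fuller velocity profile entering the shock interaction.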

  20. Micro-Ramp Flow Control for Oblique Shock Interactions: Comparisons of Computational and Experimental Data

    NASA Technical Reports Server (NTRS)

    Hirt, Stephanie M.; Reich, David B.; O'Connor, Michael B.

    2012-01-01

    Computational fluid dynamics was used to study the effectiveness of micro-ramp vortex generators to control oblique shock boundary layer interactions. Simulations were based on experiments previously conducted in the 15- by 15-cm supersonic wind tunnel at the NASA Glenn Research Center. Four micro-ramp geometries were tested at Mach 2.0 varying the height, chord length, and spanwise spacing between micro-ramps. The overall flow field was examined. Additionally, key parameters such as boundary-layer displacement thickness, momentum thickness and incompressible shape factor were also examined. The computational results predicted the effects of the micro-ramps well, including the trends for the impact that the devices had on the shock boundary layer interaction. However, computing the shock boundary layer interaction itself proved to be problematic since the calculations predicted more pronounced adverse effects on the boundary layer due to the shock than were seen in the experiment.

  1. Software Assurance Curriculum Project Volume 4: Community College Education

    DTIC Science & Technology

    2011-09-01

    no previous programming or computer science experience expected) • Precalculus-ready (that is, proficiency sufficient to enter college-level... precalculus course) • English Composition I-ready (that is, proficiency sufficient to enter college-level English I course) • Co-Requisite: Discrete...

  2. Static Fatigue of a Siliconized Silicon Carbide

    DTIC Science & Technology

    1987-03-01

    flexural stress rupture and stepped temperature stress rupture (STSR) testing were performed to assess the static fatigue and creep resistances. Isothermal... stress rupture experiments were performed at 1200 °C in air for comparison to previous results [10]. STSR experiments [15] were under deadweight... temperature and stress levels that static fatigue and creep processes are active. The applied stresses were computed on the basis of the elastic...

  3. Computational modeling and experimental studies on NO{sub x} reduction under pulverized coal combustion conditions. Seventh quarterly technical progress report, July 1, 1996--September 30, 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumpaty, S.K.; Subramanian, K.; Nokku, V.P.

    1996-12-31

    During this quarter (July-August 1996), the experiments for nitric oxide reburning with a combination of methane and ammonia were conducted successfully. This marked the completion of gaseous phase experiments. Preparations are underway for the reburning studies with coal. A coal feeder was designed to suit our reactor facility which is being built by MK Fabrication. The coal feeder should be operational in the coming quarter. Presented here are the experimental results of NO reburning with methane/ammonia. The results are consistent with the computational work submitted in previous reports.

  4. Older Adults Perceptions of Technology and Barriers to Interacting with Tablet Computers: A Focus Group Study.

    PubMed

    Vaportzis, Eleftheria; Clausen, Maria Giatsi; Gow, Alan J

    2017-10-04

    New technologies provide opportunities for the delivery of broad, flexible interventions with older adults. Focus groups were conducted to: (1) understand older adults' familiarity with, and barriers to, interacting with new technologies and tablets; and (2) utilize user-engagement in refining an intervention protocol. Eighteen older adults (65-76 years old; 83.3% female) who were novice tablet users participated in discussions about their perceptions of and barriers to interacting with tablets. We conducted three separate focus groups and used a generic qualitative design applying thematic analysis to analyse the data. The focus groups explored attitudes toward tablets and technology in general. We also explored the perceived advantages and disadvantages of using tablets, familiarity with, and barriers to interacting with tablets. In two of the focus groups, participants had previous computing experience (e.g., desktop), while in the other, participants had no previous computing experience. None of the participants had any previous experience with tablet computers. The themes that emerged were related to barriers (i.e., lack of instructions and guidance, lack of knowledge and confidence, health-related barriers, cost); disadvantages and concerns (i.e., too much and too complex technology, feelings of inadequacy, and comparison with younger generations, lack of social interaction and communication, negative features of tablets); advantages (i.e., positive features of tablets, accessing information, willingness to adopt technology); and skepticism about using tablets and technology in general. After brief exposure to tablets, participants emphasized the likelihood of using a tablet in the future. Our findings suggest that most of our participants were eager to adopt new technology and willing to learn using a tablet. However, they voiced apprehension about lack of, or lack of clarity in, instructions and support. Understanding older adults' perceptions of technology is important to assist with introducing it to this population and maximize the potential of technology to facilitate independent living.

  5. Older Adults Perceptions of Technology and Barriers to Interacting with Tablet Computers: A Focus Group Study

    PubMed Central

    Vaportzis, Eleftheria; Giatsi Clausen, Maria; Gow, Alan J.

    2017-01-01

    Background: New technologies provide opportunities for the delivery of broad, flexible interventions with older adults. Focus groups were conducted to: (1) understand older adults' familiarity with, and barriers to, interacting with new technologies and tablets; and (2) utilize user-engagement in refining an intervention protocol. Methods: Eighteen older adults (65–76 years old; 83.3% female) who were novice tablet users participated in discussions about their perceptions of and barriers to interacting with tablets. We conducted three separate focus groups and used a generic qualitative design applying thematic analysis to analyse the data. The focus groups explored attitudes toward tablets and technology in general. We also explored the perceived advantages and disadvantages of using tablets, familiarity with, and barriers to interacting with tablets. In two of the focus groups, participants had previous computing experience (e.g., desktop), while in the other, participants had no previous computing experience. None of the participants had any previous experience with tablet computers. Results: The themes that emerged were related to barriers (i.e., lack of instructions and guidance, lack of knowledge and confidence, health-related barriers, cost); disadvantages and concerns (i.e., too much and too complex technology, feelings of inadequacy, and comparison with younger generations, lack of social interaction and communication, negative features of tablets); advantages (i.e., positive features of tablets, accessing information, willingness to adopt technology); and skepticism about using tablets and technology in general. After brief exposure to tablets, participants emphasized the likelihood of using a tablet in the future. Conclusions: Our findings suggest that most of our participants were eager to adopt new technology and willing to learn using a tablet. However, they voiced apprehension about lack of, or lack of clarity in, instructions and support. Understanding older adults' perceptions of technology is important to assist with introducing it to this population and maximize the potential of technology to facilitate independent living. PMID:29071004

  6. A Comparison of Computational Cognitive Models: Agent-Based Systems Versus Rule-Based Architectures

    DTIC Science & Technology

    2003-03-01

    Java™ How To Program, Prentice Hall, 1999. Friedman-Hill, E., Jess, The Expert System Shell for the Java Platform, Sandia National Laboratories, 2001... transition from the descriptive NDM theory to a computational model raises several questions: Who is an experienced decision maker? How do you model the... progression from being a novice to an experienced decision maker? How does the model account for previous experiences? Are there situations where...

  7. Impact of computer use on children's vision.

    PubMed

    Kozeis, N

    2009-10-01

    Today, millions of children use computers on a daily basis. Extensive viewing of the computer screen can lead to eye discomfort, fatigue, blurred vision and headaches, dry eyes and other symptoms of eyestrain. These symptoms may be caused by poor lighting, glare, an improper work station set-up, vision problems of which the person was not previously aware, or a combination of these factors. Children can experience many of the same symptoms related to computer use as adults. However, some unique aspects of how children use computers may make them more susceptible than adults to the development of these problems. In this study, the most common eye symptoms related to computer use in childhood, the possible causes and ways to avoid them are reviewed.

  8. Theory, Image Simulation, and Data Analysis of Chemical Release Experiments

    NASA Technical Reports Server (NTRS)

    Wescott, Eugene M.

    1994-01-01

    The final phase of Grant NAG6-1 involved analysis of physics of chemical releases in the upper atmosphere and analysis of data obtained on previous NASA sponsored chemical release rocket experiments. Several lines of investigation of past chemical release experiments and computer simulations have been proceeding in parallel. This report summarizes the work performed and the resulting publications. The following topics are addressed: analysis of the 1987 Greenland rocket experiments; calculation of emission rates for barium, strontium, and calcium; the CRIT 1 and 2 experiments (Collisional Ionization Cross Section experiments); image calibration using background stars; rapid ray motions in ionospheric plasma clouds; and the NOONCUSP rocket experiments.

  9. Retrieving relevant time-course experiments: a study on Arabidopsis microarrays.

    PubMed

    Şener, Duygu Dede; Oğul, Hasan

    2016-06-01

    Understanding time-course regulation of genes in response to a stimulus is a major concern in current systems biology. The problem is usually approached by computational methods to model the gene behaviour or its networked interactions with the others by a set of latent parameters. The model parameters can be estimated through a meta-analysis of available data obtained from other relevant experiments. The key question here is how to find the relevant experiments which are potentially useful in analysing current data. In this study, the authors address this problem in the context of time-course gene expression experiments from an information retrieval perspective. To this end, they introduce a computational framework that takes a time-course experiment as a query and reports a list of relevant experiments retrieved from a given repository. These retrieved experiments can then be used to associate the environmental factors of query experiment with the findings previously reported. The model is tested using a set of time-course Arabidopsis microarrays. The experimental results show that relevant experiments can be successfully retrieved based on content similarity.
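
    The paper's own retrieval model is not reproduced here; as a generic illustration of content-based retrieval over time-course experiments, one could rank repository entries by the average correlation of the time profiles of genes shared with the query, as in this toy sketch with hypothetical gene IDs and values.

      import numpy as np

      # Toy retrieval sketch: similarity = mean Pearson correlation of the time
      # profiles of genes shared between the query and a candidate experiment.
      def similarity(query, candidate):
          cors = []
          for gene in set(query) & set(candidate):
              q = np.asarray(query[gene], dtype=float)
              c = np.asarray(candidate[gene], dtype=float)
              n = min(len(q), len(c))
              if np.std(q[:n]) == 0 or np.std(c[:n]) == 0:
                  continue
              cors.append(np.corrcoef(q[:n], c[:n])[0, 1])
          return float(np.mean(cors)) if cors else 0.0

      def retrieve(query, repository, top_k=3):
          scored = [(name, similarity(query, expt)) for name, expt in repository.items()]
          return sorted(scored, key=lambda t: t[1], reverse=True)[:top_k]

      # Hypothetical expression values over four timepoints.
      query = {"AT1G01060": [1.0, 2.1, 3.9, 4.2], "AT5G61380": [0.5, 0.4, 0.2, 0.1]}
      repository = {
          "cold_stress": {"AT1G01060": [1.1, 2.0, 4.1, 4.0], "AT5G61380": [0.6, 0.3, 0.2, 0.1]},
          "heat_stress": {"AT1G01060": [4.0, 3.0, 1.0, 0.5], "AT5G61380": [0.1, 0.5, 0.9, 1.2]},
      }
      print(retrieve(query, repository))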

  10. Initial experience with custom-fit total knee replacement: intra-operative events and long-leg coronal alignment.

    PubMed

    Spencer, Brian A; Mont, Michael A; McGrath, Mike S; Boyd, Bradley; Mitrick, Michael F

    2009-12-01

    New technology using magnetic resonance imaging (MRI) allows the surgeon to place total knee replacement components into each patient's pre-arthritic natural alignment. This study evaluated the initial intra-operative experience using this technique. Twenty-one patients had a sagittal MRI of their arthritic knee to determine component placement for a total knee replacement. Cutting guides were machined to control all intra-operative cuts. Intra-operative events were recorded and these knees were compared to a matching cohort of the senior surgeon's previous 30 conventional total knee replacements. Post-operative scanograms were obtained from each patient and coronal alignment was compared to previous studies using conventional and computer-assisted techniques. There were no intra-operative or acute post-operative complications. There were no differences in blood loss and there was a mean decrease in operative time of 14% compared to a cohort of patients with conventional knee replacements. The average deviation from the mechanical axis was 1.2 degrees of varus, which was comparable to previously reported conventional and computer-assisted techniques. Custom-fit total knee replacement appeared to be a safe procedure for uncomplicated cases of osteoarthritis.

  11. Search for an Appropriate Behavior within the Emotional Regulation in Virtual Creatures Using a Learning Classifier System

    PubMed Central

    Rosales, Jonathan-Hernando; Cervantes, José-Antonio

    2017-01-01

    Emotion regulation is a process by which human beings control emotional behaviors. From neuroscientific evidence, this mechanism is the product of conscious or unconscious processes. In particular, the mechanism generated by a conscious process needs a priori components to be computed. The behaviors generated by previous experiences are among these components. These behaviors need to be adapted to fulfill the objectives in a specific situation. The problem we address is how to endow virtual creatures with emotion regulation in order to compute an appropriate behavior in a specific emotional situation. This problem is clearly important and we have not identified ways to solve this problem in the current literature. In our proposal, we show a way to generate the appropriate behavior in an emotional situation using a learning classifier system (LCS). We illustrate the function of our proposal in unknown and known situations by means of two case studies. Our results demonstrate that it is possible to converge to the appropriate behavior even in the first case; that is, when the system does not have previous experiences and in situations where some previous information is available our proposal proves to be a very powerful tool. PMID:29209362
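
    The authors' system is not reproduced here; the toy sketch below only shows the core learning-classifier-system idea the abstract leans on, namely rules with wildcard conditions, an action and a fitness, where the fittest matching rule supplies the behaviour (the rules and encoding are invented).

      # Toy LCS-style rule matching; conditions use '0', '1' and '#' (wildcard).
      rules = [
          {"condition": "1#0", "action": "approach", "fitness": 0.8},
          {"condition": "11#", "action": "withdraw", "fitness": 0.6},
          {"condition": "###", "action": "explore", "fitness": 0.2},   # default rule
      ]

      def matches(condition, situation):
          return all(c == "#" or c == s for c, s in zip(condition, situation))

      def select_action(situation):
          match_set = [r for r in rules if matches(r["condition"], situation)]
          return max(match_set, key=lambda r: r["fitness"])["action"]

      print(select_action("110"))   # -> 'approach' (fittest matching rule wins)

    In a full LCS the fitness values would be updated from reinforcement and new rules discovered by a genetic algorithm; this sketch covers only the matching and action-selection step.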

  12. STS-42 Commander Grabe works with MWPE at IML-1 Rack 8 aboard OV-103

    NASA Technical Reports Server (NTRS)

    1992-01-01

    STS-42 Commander Ronald J. Grabe works with the Mental Workload and Performance Evaluation Experiment (MWPE) (portable laptop computer, keyboard cursor keys, a two-axis joystick, and a track ball) at Rack 8 in the International Microgravity Laboratory 1 (IML-1) module. The test was designed as a result of difficulty experienced by crewmembers working at a computer station on a previous Space Shuttle mission. The problem was due to the workstation's design being based on Earth-bound conditions with the operator in a typical one-G standing position. For STS-42, the workstation was redesigned to evaluate the effects of microgravity on the ability of crewmembers to interact with a computer workstation. Information gained from this experiment will be used to design workstations for future Spacelab missions and Space Station Freedom (SSF).

  13. The CREAM-CE: First experiences, results and requirements of the four LHC experiments

    NASA Astrophysics Data System (ADS)

    Mendez Lorenzo, Patricia; Santinelli, Roberto; Sciaba, Andrea; Thackray, Nick; Shiers, Jamie; Renshall, Harry; Sgaravatto, Massimo; Padhi, Sanjay

    2010-04-01

    In terms of the gLite middleware, the current LCG-CE used by the four LHC experiments is about to be deprecated. The new CREAM-CE service (Computing Resource Execution And Management) has been approved to replace the previous service. CREAM-CE is a lightweight service created to handle job management operations at the CE level. It is able to accept requests both via the gLite WMS service and via direct submission for transmission to the local batch system. This flexible duality gives the experiments a large degree of freedom to adapt the service to their own computing models, but at the same time it requires a careful follow-up of the experiments' requirements and tests to ensure that their needs are fulfilled before real data taking. In this paper we present the current testing results of the four LHC experiments concerning this new service. The operations procedures, which have been elaborated together with the experiment support teams, will be discussed. Finally, the experiments' requirements and expectations for both the sites and the service itself are presented in detail.

  14. Reply to comment by Añel on "Most computational hydrology is not reproducible, so is it really science?"

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-03-01

    In this article, we reply to a comment made on our previous commentary regarding reproducibility in computational hydrology. Software licensing and version control of code are important technical aspects of making the code and workflows of scientific experiments open and reproducible. However, in our view, it is the cultural change that is the greatest challenge to overcome in achieving reproducible scientific research in computational hydrology. We believe that, as the culture and attitude among hydrological scientists change, the details will evolve to cover more (technical) aspects over time.

  15. Refining new-physics searches in B→Dτν with lattice QCD.

    PubMed

    Bailey, Jon A; Bazavov, A; Bernard, C; Bouchard, C M; Detar, C; Du, Daping; El-Khadra, A X; Foley, J; Freeland, E D; Gámiz, E; Gottlieb, Steven; Heller, U M; Kim, Jongjeong; Kronfeld, A S; Laiho, J; Levkova, L; Mackenzie, P B; Meurice, Y; Neil, E T; Oktay, M B; Qiu, Si-Wei; Simone, J N; Sugar, R; Toussaint, D; Van de Water, R S; Zhou, Ran

    2012-08-17

    The semileptonic decay channel B→Dτν is sensitive to the presence of a scalar current, such as that mediated by a charged-Higgs boson. Recently, the BABAR experiment reported the first observation of the exclusive semileptonic decay B→Dτ⁻ν, finding an approximately 2σ disagreement with the standard-model prediction for the ratio R(D)=BR(B→Dτν)/BR(B→Dℓν), where ℓ = e,μ. We compute this ratio of branching fractions using hadronic form factors computed in unquenched lattice QCD and obtain R(D)=0.316(12)(7), where the errors are statistical and total systematic, respectively. This result is the first standard-model calculation of R(D) from ab initio full QCD. Its error is smaller than that of previous estimates, primarily due to the reduced uncertainty in the scalar form factor f_0(q²). Our determination of R(D) is approximately 1σ higher than previous estimates and, thus, reduces the tension with experiment. We also compute R(D) in models with electrically charged scalar exchange, such as the type-II two-Higgs-doublet model. Once again, our result is consistent with, but approximately 1σ higher than, previous estimates for phenomenologically relevant values of the scalar coupling in the type-II model. As a by-product of our calculation, we also present the standard-model prediction for the longitudinal-polarization ratio P_L(D)=0.325(4)(3).

  16. Selection of Thermal Worst-Case Orbits via Modified Efficient Global Optimization

    NASA Technical Reports Server (NTRS)

    Moeller, Timothy M.; Wilhite, Alan W.; Liles, Kaitlin A.

    2014-01-01

    Efficient Global Optimization (EGO) was used to select orbits with worst-case hot and cold thermal environments for the Stratospheric Aerosol and Gas Experiment (SAGE) III. The SAGE III system thermal model had changed substantially since the previous selection of worst-case orbits (which did not use the EGO method), so the selections were revised to ensure the worst cases were still being captured. The EGO method consists of first conducting an initial set of parametric runs, generated with a space-filling Design of Experiments (DoE) method, then fitting a surrogate model to the data and searching for points of maximum Expected Improvement (EI) at which to conduct additional runs. The general EGO method was modified by using a multi-start optimizer to identify multiple new test points at each iteration. This modification facilitates parallel computing and decreases the burden of user interaction when the optimizer code is not integrated with the model. Thermal worst-case orbits for SAGE III were successfully identified and shown by direct comparison to be more severe than those identified in the previous selection. The EGO method is a useful tool for this application and can result in computational savings if the initial DoE is selected appropriately.
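
    For orientation, the sketch below shows the standard Expected Improvement criterion (for minimization) that drives EGO, together with a crude stand-in for the multi-start modification: several high-EI candidates are returned at once so the expensive thermal runs can proceed in parallel. The surrogate interface, candidate handling and return-the-top-n rule are assumptions for illustration, not the paper's exact procedure.

      import numpy as np
      from scipy.stats import norm

      def expected_improvement(mu, sigma, f_best):
          # Standard EI for minimization, given the surrogate mean mu and standard
          # deviation sigma at candidate points and the best objective seen so far.
          sigma = np.maximum(sigma, 1e-12)       # guard against zero predictive variance
          z = (f_best - mu) / sigma
          return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

      def pick_next_points(candidates, surrogate, f_best, n_points=4):
          # Return several distinct high-EI candidates per iteration (parallel runs).
          mu, sigma = surrogate(candidates)      # assumed surrogate interface
          ei = expected_improvement(mu, sigma, f_best)
          order = np.argsort(ei)[::-1]
          return candidates[order[:n_points]]    # candidates assumed to be a NumPy array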

  17. Teaching Non-Recursive Binary Searching: Establishing a Conceptual Framework.

    ERIC Educational Resources Information Center

    Magel, E. Terry

    1989-01-01

    Discusses problems associated with teaching non-recursive binary searching in computer language classes, and describes a teacher-directed dialog based on dictionary use that helps students use their previous searching experiences to conceptualize the binary search process. Algorithmic development is discussed and appropriate classroom discussion…

  18. 30 CFR 250.914 - How do I nominate a CVA?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the individual or the primary staff for the specific project; (3) Size and type of organization or corporation; (4) In-house availability of, or access to, appropriate technology. This should include computer... specific project considering current commitments; (6) Previous experience with MMS requirements and...

  19. A DNA sequence analysis package for the IBM personal computer.

    PubMed Central

    Lagrimini, L M; Brentano, S T; Donelson, J E

    1984-01-01

    We present here a collection of DNA sequence analysis programs, called "PC Sequence" (PCS), which are designed to run on the IBM Personal Computer (PC). These programs are written in IBM PC compiled BASIC and take full advantage of the IBM PC's speed, error handling, and graphics capabilities. For a modest initial expense in hardware any laboratory can use these programs to quickly perform computer analysis on DNA sequences. They are written with the novice user in mind and require very little training or previous experience with computers. Also provided are a text editing program for creating and modifying DNA sequence files and a communications program which enables the PC to communicate with and collect information from mainframe computers and DNA sequence databases. PMID:6546433

  20. Analyses for precision reduced optical observations from the international satellite geodesy experiment (ISAGEX)

    NASA Technical Reports Server (NTRS)

    Marsh, J. G.; Douglas, B. C.; Klosko, S. M.

    1973-01-01

    During the period from December 1970 to September 1971, the International Satellite Geodesy Experiment (ISAGEX) was conducted. Over fifty optical and laser tracking stations participated in the data gathering portion of this experiment. Data from some of the stations had not been previously available for dynamical orbit computations. With the recent availability of new data from the Astrosoviet, East European and other optical stations, orbital analyses were conducted to ensure compatibility with the previously available laser data. These data have also been analyzed using dynamical orbital techniques for the estimation of geocentric coordinates for six camera stations (four Astrosoviet, two East European). Thirteen arcs of GEOS-1 and 2 observations between two and four days in length were used. The uncertainty in these new station values is considered to be about 20 meters in each coordinate. Adjustments to the previously available values were generally a few hundred meters. With these geocentric coordinates, the data will now be used to supplement earth physics investigations during the ISAGEX period.

  1. Robot services for elderly with cognitive impairment: testing usability of graphical user interfaces.

    PubMed

    Granata, C; Pino, M; Legouverneur, G; Vidal, J-S; Bidaud, P; Rigaud, A-S

    2013-01-01

    Socially assistive robotics for elderly care is a growing field. However, although robotics has the potential to support the elderly in daily tasks by offering specific services, the development of usable interfaces is still a challenge. Since several factors, such as age or disease-related changes in perceptual or cognitive abilities and familiarity with computer technologies, influence technology use, they must be considered when designing interfaces for these users. This paper presents findings from usability testing of two different services provided by a socially assistive robot intended for elderly people with cognitive impairment: a grocery shopping list and an agenda application. The main goal of this study is to identify the usability problems of the robot interface for target end-users as well as to isolate the human factors that affect the use of the technology by the elderly. Socio-demographic characteristics and computer experience were examined as factors that could have an influence on task performance. A group of 11 elderly persons with Mild Cognitive Impairment and a group of 11 cognitively healthy elderly individuals took part in this study. Performance measures (task completion time and number of errors) were collected. Cognitive profile, age and computer experience were found to impact task performance. Participants with cognitive impairment completed the tasks with more errors than the cognitively healthy elderly. In addition, younger participants and those with previous computer experience were faster at completing the tasks, confirming previous findings in the literature. The overall results suggested that the interfaces and contents of the services assessed were usable by older adults with cognitive impairment. However, some usability problems were identified and should be addressed to better meet the needs and capacities of target end-users.

  2. The Ruggedized STD Bus Microcomputer - A low cost computer suitable for Space Shuttle experiments

    NASA Technical Reports Server (NTRS)

    Budney, T. J.; Stone, R. W.

    1982-01-01

    Previous space flight computers have been costly in terms of both hardware and software. The Ruggedized STD Bus Microcomputer is based on the commercial Mostek/Pro-Log STD Bus. Ruggedized PC cards can be based on commercial cards from more than 60 manufacturers, reducing hardware cost and design time. Software costs are minimized by using standard 8-bit microprocessors and by debugging code using commercial versions of the ruggedized flight boards while the flight hardware is being fabricated.

  3. Learning to Teach Mathematics with Technology: A Survey of Professional Development Needs, Experiences and Impacts

    ERIC Educational Resources Information Center

    Bennison, Anne; Goos, Merrilyn

    2010-01-01

    The potential for digital technologies to enhance students' mathematics learning is widely recognised, and use of computers and graphics calculators is now encouraged or required by secondary school mathematics curriculum documents throughout Australia. However, previous research indicates that effective integration of technology into classroom…

  4. Unstable solitary-wave solutions of the generalized Benjamin-Bona-Mahony equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKinney, W.R.; Restrepo, J.M.; Bona, J.L.

    1994-06-01

    The evolution of solitary waves of the gBBM equation is investigated computationally. The experiments confirm previously derived theoretical stability estimates and, more importantly, yield insights into their behavior. For example, highly energetic unstable solitary waves when perturbed are shown to evolve into several stable solitary waves.

  5. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    PubMed Central

    2010-01-01

    Background The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Results Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/. PMID:20482791

  6. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments.

    PubMed

    Ma, Jingming; Dykes, Carrie; Wu, Tao; Huang, Yangxin; Demeter, Lisa; Wu, Hulin

    2010-05-18

    The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
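
    The contrast drawn above between a two-point calculation and regression over all observed data can be illustrated with a minimal least-squares sketch (hypothetical data; the actual vFitness tool additionally handles measurement error models and the dilution factor): the slope of the log ratio of the two variants' proportions over time estimates the mutant's net growth-rate advantage.

      import numpy as np

      def relative_fitness_slope(times, mutant_frac, reference_frac):
          # Regress the log ratio of variant proportions on time; the slope is the
          # net growth-rate difference, estimated from every time point, not just two.
          t = np.asarray(times, float)
          y = np.log(np.asarray(mutant_frac, float) / np.asarray(reference_frac, float))
          slope, intercept = np.polyfit(t, y, 1)
          return slope

      # toy proportions from a growth competition assay, sampled over 8 days
      t = [0, 2, 4, 6, 8]
      mut = [0.50, 0.55, 0.61, 0.66, 0.71]
      ref = [0.50, 0.45, 0.39, 0.34, 0.29]
      print(relative_fitness_slope(t, mut, ref))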

  7. Simulating and assessing boson sampling experiments with phase-space representations

    NASA Astrophysics Data System (ADS)

    Opanchuk, Bogdan; Rosales-Zárate, Laura; Reid, Margaret D.; Drummond, Peter D.

    2018-04-01

    The search for new, application-specific quantum computers designed to outperform any classical computer is driven by the ending of Moore's law and the quantum advantages potentially obtainable. Photonic networks are promising examples, with experimental demonstrations and potential for obtaining a quantum computer to solve problems believed classically impossible. This introduces a challenge: how does one design or understand such photonic networks? One must be able to calculate observables using general methods capable of treating arbitrary inputs, dissipation, and noise. We develop complex phase-space software for simulating these photonic networks, and apply this to boson sampling experiments. Our techniques give sampling errors orders of magnitude lower than experimental correlation measurements for the same number of samples. We show that these techniques remove systematic errors in previous algorithms for estimating correlations, with large improvements in errors in some cases. In addition, we obtain a scalable channel-combination strategy for assessment of boson sampling devices.

  8. Introducing medical students to medical informatics.

    PubMed

    Sancho, J J; González, J C; Patak, A; Sanz, F; Sitges-Serra, A

    1993-11-01

    Medical informatics (MI) has been introduced to medical students in several countries. Before outlining a course plan, it was necessary to conduct a survey on students' computer literacy. A questionnaire was designed for students, focusing on knowledge and previous computer experience. The questions reproduced those of a similar questionnaire submitted to medical students at North Carolina University in Chapel Hill (NCU). From the results it is clear that although almost 80% of students used computers, fewer than 30% used general-purpose applications, and computer-aided searching of databases or use in the laboratory was exceptional. Men reported more computer experience than women in each area investigated by our questionnaire, but this did not appear to be related to academic performance, age or course. Our main objectives when planning an MI course were to give students a general overview of the medical applications of computers and instruct them in the use of computers in future medical practice. As our medical school uses both Apple Macintosh and IBM compatibles, we decided to provide students with basic knowledge of both. The programme was structured with a mix of theoretico-practical lectures and personalized practical sessions in the computer laboratory. As well as providing a basic overview of medical informatics, the course and computer laboratory were intended to encourage other areas of medicine to incorporate the computer into their teaching programmes.

   9. Characterization of Unsteady Flow Structures Near Leading-Edge Slat. Part 2; 2D Computations

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi; Choudhari, Meelan M.; Jenkins, Luther N.

    2004-01-01

    In our previous computational studies of a generic high-lift configuration, quasi-laminar (as opposed to fully turbulent) treatment of the slat cove region proved to be an effective approach for capturing the unsteady dynamics of the cove flow field. Combined with acoustic propagation via the Ffowcs Williams and Hawkings formulation, the quasi-laminar simulations captured some important features of the slat cove noise measured with microphone array techniques. However, a direct assessment of the computed cove flow field was not feasible due to the unavailability of off-surface flow measurements. To remedy this shortcoming, we have undertaken a combined experimental and computational study aimed at characterizing the flow structures and fluid mechanical processes within the slat cove region. Part I of this paper outlines the experimental aspects of this investigation focused on the 30P30N high-lift configuration; the present paper describes the accompanying computational results, including a comparison between computation and experiment at various angles of attack. Even though predictions of the time-averaged flow field agree well with the measured data, the study indicates the need for further refinement of the zonal turbulence approach in order to capture the full dynamics of the cove's fluctuating flow field.

  10. Trusted measurement model based on multitenant behaviors.

    PubMed

    Ning, Zhen-Hu; Shen, Chang-Xiang; Zhao, Yong; Liang, Peng

    2014-01-01

    With the fast growth of pervasive computing, and of cloud computing in particular, behaviour measurement is at the core and plays a vital role. A new behaviour measurement tailored to multitenants in cloud computing is urgently needed to fundamentally establish trust relationships. Based on our previous research, we propose an improved trust relationship scheme which captures the world of cloud computing, where multitenants share the same physical computing platform. Here, we first present the related work on multitenant behaviour; secondly, we give the scheme of behaviour measurement, in which the decoupling of multitenants is taken into account; thirdly, we explicitly explain our decoupling algorithm for multitenants; fourthly, we introduce a new way of calculating similarity for deviation control, which fits the coupled multitenants under study well; lastly, we design experiments to test our scheme.

  11. Trusted Measurement Model Based on Multitenant Behaviors

    PubMed Central

    Ning, Zhen-Hu; Shen, Chang-Xiang; Zhao, Yong; Liang, Peng

    2014-01-01

    With the fast growth of pervasive computing, and of cloud computing in particular, behaviour measurement is at the core and plays a vital role. A new behaviour measurement tailored to multitenants in cloud computing is urgently needed to fundamentally establish trust relationships. Based on our previous research, we propose an improved trust relationship scheme which captures the world of cloud computing, where multitenants share the same physical computing platform. Here, we first present the related work on multitenant behaviour; secondly, we give the scheme of behaviour measurement, in which the decoupling of multitenants is taken into account; thirdly, we explicitly explain our decoupling algorithm for multitenants; fourthly, we introduce a new way of calculating similarity for deviation control, which fits the coupled multitenants under study well; lastly, we design experiments to test our scheme. PMID:24987731

  12. An Algebra-Based Introductory Computational Neuroscience Course with Lab.

    PubMed

    Fink, Christian G

    2017-01-01

    A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.

  13. Cloud computing and validation of expandable in silico livers.

    PubMed

    Ropella, Glen E P; Hunt, C Anthony

    2010-12-03

    In Silico Livers (ISLs) are works in progress. They are used to challenge multilevel, multi-attribute, mechanistic hypotheses about the hepatic disposition of xenobiotics coupled with hepatic responses. To enhance ISL-to-liver mappings, we added discrete-time metabolism, biliary elimination, and bolus dosing features to a previously validated ISL and initiated re-validation experiments that required scaling the simulations to use more simulated lobules than previously, more than could be achieved using the local cluster technology. Rather than dramatically increasing the size of our local cluster, we undertook the re-validation experiments using the Amazon EC2 cloud platform. Doing so required demonstrating the efficacy of scaling a simulation to use more cluster nodes and assessing the scientific equivalence of local cluster validation experiments with those executed using the cloud platform. The local cluster technology was duplicated in the Amazon EC2 cloud platform. Synthetic modeling protocols were followed to identify a successful parameterization. Experiment sample sizes (number of simulated lobules) on both platforms were 49, 70, 84, and 152 (cloud only). Experimental indistinguishability was demonstrated for ISL outflow profiles of diltiazem using both platforms for experiments consisting of 84 or more samples. The process was analogous to demonstrating results equivalency from two different wet-labs. The results provide additional evidence that disposition simulations using ISLs can cover the behavior space of liver experiments in distinct experimental contexts (there is in silico-to-wet-lab phenotype similarity). The scientific value of experimenting with multiscale biomedical models has been limited to research groups with access to computer clusters. The availability of cloud technology coupled with the evidence of scientific equivalency has lowered the barrier and will greatly facilitate model sharing as well as provide straightforward tools for scaling simulations to encompass greater detail with no extra investment in hardware.

  14. A new graph-based method for pairwise global network alignment

    PubMed Central

    Klau, Gunnar W

    2009-01-01

    Background In addition to component-based comparative approaches, network alignments provide the means to study conserved network topology such as common pathways and more complex network motifs. Yet, unlike in classical sequence alignment, the comparison of networks becomes computationally more challenging, as most meaningful assumptions instantly lead to NP-hard problems. Most previous algorithmic work on network alignments is heuristic in nature. Results We introduce the graph-based maximum structural matching formulation for pairwise global network alignment. We relate the formulation to previous work and prove NP-hardness of the problem. Based on the new formulation we build upon recent results in computational structural biology and present a novel Lagrangian relaxation approach that, in combination with a branch-and-bound method, computes provably optimal network alignments. The Lagrangian algorithm alone is a powerful heuristic method, which produces solutions that are often near-optimal and – unlike those computed by pure heuristics – come with a quality guarantee. Conclusion Computational experiments on the alignment of protein-protein interaction networks and on the classification of metabolic subnetworks demonstrate that the new method is reasonably fast and has advantages over pure heuristics. Our software tool is freely available as part of the LISA library. PMID:19208162

  15. Gender Effects of Computer Use in a Conceptual Physics Lab Course

    NASA Astrophysics Data System (ADS)

    Van Domelen, Dave

    2010-11-01

    It's always hard to know what to expect when bringing computers into an educational setting, as things are always changing. Student skills with computers are different today than they were 10 years ago, and 20 years ago almost counts as an alien world. Still, one hopes that some of these changes result in positive trends, such as student attitudes toward the use of computers in the classroom. During the course of the Wandering Interactive Lecture Demonstration Project, we've seen a notable gender gap in some aspects of the previous experience of students, and worried that it might impact their learning. So we administered a number of surveys to see if we were right to be worried.

  16. The Effectiveness of Multimedia Programmes in Children's Vocabulary Learning

    ERIC Educational Resources Information Center

    Acha, Joana

    2009-01-01

    The present experiment investigated the effect of three different presentation modes in children's vocabulary learning with a self-guided multimedia programmes. Participants were 135 third and fourth grade children who read a short English language story presented by a computer programme. For 12 key (previously unknown) words in the story,…

  17. Program Evolves from Basic CAD to Total Manufacturing Experience

    ERIC Educational Resources Information Center

    Cassola, Joel

    2011-01-01

    Close to a decade ago, John Hersey High School (JHHS) in Arlington Heights, Illinois, made a transition from a traditional classroom-based pre-engineering program. The new program is geared towards helping students understand the entire manufacturing process. Previously, a JHHS student would design a project in computer-aided design (CAD) software…

  18. A Computer Model of Simple Forms of Learning.

    ERIC Educational Resources Information Center

    Jones, Thomas L.

    A basic unsolved problem in science is that of understanding learning, the process by which people and machines use their experience in a situation to guide future action in similar situations. The ideas of Piaget, Pavlov, Hull, and other learning theorists, as well as previous heuristic programing models of human intelligence, stimulated this…

  19. What Influences College Students to Continue Using Business Simulation Games? The Taiwan Experience

    ERIC Educational Resources Information Center

    Tao, Yu-Hui; Cheng, Chieh-Jen; Sun, Szu-Yuan

    2009-01-01

    Previous studies have pointed out that computer games could improve students' motivation to learn, but these studies have mostly targeted teachers or students in elementary and secondary education and are without user adoption models. Because business and management institutions in higher education have been increasingly using educational…

  20. Moving an In-Class Module Online: A Case Study for Chemistry

    ERIC Educational Resources Information Center

    Seery, Michael K.

    2012-01-01

    This article summarises the author's experiences in running a module "Computers for Chemistry" entirely online for the past four years. The module, previously taught in a face-to-face environment, was reconfigured for teaching in an online environment. The rationale for moving online along with the design, implementation and evaluation of the…

  1. Agreement processing and attraction errors in aging: evidence from subject-verb agreement in German.

    PubMed

    Reifegerste, Jana; Hauer, Franziska; Felser, Claudia

    2017-11-01

    Effects of aging on lexical processing are well attested, but the picture is less clear for grammatical processing. Where age differences emerge, these are usually ascribed to working-memory (WM) decline. Previous studies on the influence of WM on agreement computation have yielded inconclusive results, and work on aging and subject-verb agreement processing is lacking. In two experiments (Experiment 1: timed grammaticality judgment; Experiment 2: self-paced reading plus a WM test), we investigated older (OA) and younger (YA) adults' susceptibility to agreement attraction errors. We found longer reading latencies and judgment reaction times (RTs) for OAs. Further, OAs, particularly those with low WM scores, were more accepting of sentences with attraction errors than YAs were. OAs also showed longer reading latencies for ungrammatical sentences than YAs, again modulated by WM. Our results indicate that OAs have greater difficulty blocking intervening nouns from interfering with the computation of agreement dependencies. WM can modulate this effect.

  2. Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model.

    PubMed

    Liu, Fang; Velikina, Julia V; Block, Walter F; Kijowski, Richard; Samsonov, Alexey A

    2017-02-01

    We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphical processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing the generalized tissue model with multiple exchanging water and macromolecular proton pools rather than a system of independent proton isochromats typically used in previous simulators. The computational power needed for simulation of the biologically relevant tissue models in large 3D objects is gained using parallelized execution on GPU. Three simulated and one actual MRI experiments were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and demonstrate detrimental effects of simplified treatment of tissue micro-organization adapted in previous simulators. GPU execution allowed  ∼ 200× improvement in computational speed over standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to infer quantitatively tissue composition and microstructure.

  3. Fast Realistic MRI Simulations Based on Generalized Multi-Pool Exchange Tissue Model

    PubMed Central

    Velikina, Julia V.; Block, Walter F.; Kijowski, Richard; Samsonov, Alexey A.

    2017-01-01

    We present MRiLab, a new comprehensive simulator for large-scale realistic MRI simulations on a regular PC equipped with a modern graphical processing unit (GPU). MRiLab combines realistic tissue modeling with numerical virtualization of an MRI system and scanning experiment to enable assessment of a broad range of MRI approaches including advanced quantitative MRI methods inferring microstructure on a sub-voxel level. A flexible representation of tissue microstructure is achieved in MRiLab by employing the generalized tissue model with multiple exchanging water and macromolecular proton pools rather than a system of independent proton isochromats typically used in previous simulators. The computational power needed for simulation of the biologically relevant tissue models in large 3D objects is gained using parallelized execution on GPU. Three simulated and one actual MRI experiments were performed to demonstrate the ability of the new simulator to accommodate a wide variety of voxel composition scenarios and demonstrate detrimental effects of simplified treatment of tissue micro-organization adapted in previous simulators. GPU execution allowed ∼200× improvement in computational speed over standard CPU. As a cross-platform, open-source, extensible environment for customizing virtual MRI experiments, MRiLab streamlines the development of new MRI methods, especially those aiming to infer quantitatively tissue composition and microstructure. PMID:28113746
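
    As a toy illustration of the difference between independent isochromats and the exchanging-pool picture described above, the sketch below integrates longitudinal magnetization for a free-water pool coupled to a macromolecular pool by first-order exchange. Pool sizes, relaxation times and exchange rates are illustrative placeholders, not MRiLab parameters, and a real simulator would also track transverse components.

      def two_pool_longitudinal(M0=(1.0, 0.15), T1=(1.2, 0.3), k_ab=2.0, k_ba=13.0,
                                duration=3.0, dt=1e-3):
          # Forward-Euler integration of longitudinal relaxation with magnetization
          # exchange between a free-water pool (a) and a macromolecular pool (b),
          # starting from saturation (Ma = Mb = 0).
          Ma, Mb = 0.0, 0.0
          M0a, M0b = M0
          T1a, T1b = T1
          for _ in range(int(duration / dt)):
              dMa = (M0a - Ma) / T1a - k_ab * Ma + k_ba * Mb
              dMb = (M0b - Mb) / T1b + k_ab * Ma - k_ba * Mb
              Ma += dt * dMa
              Mb += dt * dMb
          return Ma, Mb

      print(two_pool_longitudinal())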

  4. Why we interact: on the functional role of the striatum in the subjective experience of social interaction.

    PubMed

    Pfeiffer, Ulrich J; Schilbach, Leonhard; Timmermans, Bert; Kuzmanovic, Bojana; Georgescu, Alexandra L; Bente, Gary; Vogeley, Kai

    2014-11-01

    There is ample evidence that human primates strive for social contact and experience interactions with conspecifics as intrinsically rewarding. Focusing on gaze behavior as a crucial means of human interaction, this study employed a unique combination of neuroimaging, eye-tracking, and computer-animated virtual agents to assess the neural mechanisms underlying this component of behavior. In the interaction task, participants believed that during each interaction the agent's gaze behavior could either be controlled by another participant or by a computer program. Their task was to indicate whether they experienced a given interaction as an interaction with another human participant or the computer program based on the agent's reaction. Unbeknownst to them, the agent was always controlled by a computer to enable a systematic manipulation of gaze reactions by varying the degree to which the agent engaged in joint attention. This allowed creating a tool to distinguish neural activity underlying the subjective experience of being engaged in social and non-social interaction. In contrast to previous research, this allows measuring neural activity while participants experience active engagement in real-time social interactions. Results demonstrate that gaze-based interactions with a perceived human partner are associated with activity in the ventral striatum, a core component of reward-related neurocircuitry. In contrast, interactions with a computer-driven agent activate attention networks. Comparisons of neural activity during interaction with behaviorally naïve and explicitly cooperative partners demonstrate different temporal dynamics of the reward system and indicate that the mere experience of engagement in social interaction is sufficient to recruit this system. Copyright © 2014 Elsevier Inc. All rights reserved.

   5. Estimations of global warming potentials from computational chemistry calculations for CH₂F₂ and other fluorinated methyl species verified by comparison to experiment.

    PubMed

    Blowers, Paul; Hollingshead, Kyle

    2009-05-21

    In this work, the global warming potential (GWP) of methylene fluoride (CH₂F₂), or HFC-32, is estimated through computational chemistry methods. We find our computational chemistry approach reproduces well all phenomena important for predicting global warming potentials. Geometries predicted using the B3LYP/6-311g** method were in good agreement with experiment, although some other computational methods performed slightly better. Frequencies needed for both partition function calculations in transition-state theory and infrared intensities needed for radiative forcing estimates agreed well with experiment compared to other computational methods. A modified CBS-RAD method used to obtain energies led to superior results to all other previous heat of reaction estimates and most barrier height calculations when the B3LYP/6-311g** optimized geometry was used as the base structure. Use of the small-curvature tunneling correction and a hindered rotor treatment where appropriate led to accurate reaction rate constants and radiative forcing estimates without requiring any experimental data. Atmospheric lifetimes from theory at 277 K were indistinguishable from experimental results, as were the final global warming potentials compared to experiment. This is the first time entirely computational methods have been applied to estimate a global warming potential for a chemical, and we have found the approach to be robust, inexpensive, and accurate compared to prior experimental results. This methodology was subsequently used to estimate GWPs for three additional species [methane (CH₄); fluoromethane (CH₃F), or HFC-41; and fluoroform (CHF₃), or HFC-23], where estimations also compare favorably to experimental values.

  6. Corvid caching: Insights from a cognitive model.

    PubMed

    van der Vaart, Elske; Verbrugge, Rineke; Hemelrijk, Charlotte K

    2011-07-01

    Caching and recovery of food by corvids is well-studied, but some ambiguous results remain. To help clarify these, we built a computational cognitive model. It is inspired by similar models built for humans, and it assumes that memory strength depends on frequency and recency of use. We compared our model's behavior to that of real birds in previously published experiments. Our model successfully replicated the outcomes of two experiments on recovery behavior and two experiments on cache site choice. Our "virtual birds" reproduced declines in recovery accuracy across sessions, revisits to previously emptied cache sites, a lack of correlation between caching and recovery order, and a preference for caching in safe locations. The model also produced two new explanations. First, that Clark's nutcrackers may become less accurate as recovery progresses not because of differential memory for different cache sites, as was once assumed, but because of chance effects. And second, that Western scrub jays may choose their cache sites not on the basis of negative recovery experiences only, as was previously thought, but on the basis of positive recovery experiences instead. Alternatively, both "punishment" and "reward" may be playing a role. We conclude with a set of new insights, a testable prediction, and directions for future work. PsycINFO Database Record (c) 2011 APA, all rights reserved
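
    The abstract only states that memory strength depends on frequency and recency of use. A rough illustration of that assumption (an ACT-R-style base-level activation, with the decay value and functional form assumed rather than taken from the published model) is sketched below: every past use of a cache site contributes a term that decays with the time elapsed since that use.

      import math

      def memory_strength(use_times, now, decay=0.5):
          # Each past use (caching or recovering at a site) contributes a term that
          # decays with elapsed time; frequent and recent use both raise the strength.
          return math.log(sum((now - t) ** (-decay) for t in use_times if now > t))

      # a cache site used at times 1 and 5, queried at time 10
      print(memory_strength([1, 5], now=10))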

  7. Modeling resident error-making patterns in detection of mammographic masses using computer-extracted image features: preliminary experiments

    NASA Astrophysics Data System (ADS)

    Mazurowski, Maciej A.; Zhang, Jing; Lo, Joseph Y.; Kuzmiak, Cherie M.; Ghate, Sujata V.; Yoon, Sora

    2014-03-01

    Providing high quality mammography education to radiology trainees is essential, as good interpretation skills potentially ensure the highest benefit of screening mammography for patients. We have previously proposed a computer-aided education system that utilizes trainee models, which relate human-assessed image characteristics to interpretation error. We proposed that these models be used to identify the most difficult and therefore the most educationally useful cases for each trainee. In this study, as a next step in our research, we propose to build trainee models that utilize features that are automatically extracted from images using computer vision algorithms. To predict error, we used a logistic regression which accepts imaging features as input and returns error as output. Reader data from 3 experts and 3 trainees were used. Receiver operating characteristic analysis was applied to evaluate the proposed trainee models. Our experiments showed that, for three trainees, our models were able to predict error better than chance. This is an important step in the development of adaptive computer-aided education systems since computer-extracted features will allow for faster and more extensive search of imaging databases in order to identify the most educationally beneficial cases.
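
    A minimal sketch of such a trainee model is shown below, with synthetic stand-ins for the computer-extracted image features and for the error labels (it is not the authors' pipeline): a logistic regression is fit to feature/error pairs and its held-out predictions are scored with ROC analysis.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      # rows are cases, columns are computer-extracted image features; the label
      # marks whether this trainee misinterpreted the case (1) or not (0)
      X = rng.normal(size=(200, 6))
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=200) > 0).astype(int)

      model = LogisticRegression(max_iter=1000).fit(X[:150], y[:150])
      probs = model.predict_proba(X[150:])[:, 1]       # predicted probability of error
      print("held-out AUC:", roc_auc_score(y[150:], probs))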

  8. A Computational Fluid Dynamics Study of Transitional Flows in Low-Pressure Turbines under a Wide Range of Operating Conditions

    NASA Technical Reports Server (NTRS)

    Suzen, Y. B.; Huang, P. G.; Ashpis, D. E.; Volino, R. J.; Corke, T. C.; Thomas, F. O.; Huang, J.; Lake, J. P.; King, P. I.

    2007-01-01

    A transport equation for the intermittency factor is employed to predict the transitional flows in low-pressure turbines. The intermittent behavior of the transitional flows is taken into account and incorporated into computations by modifying the eddy viscosity, μ_p, with the intermittency factor, γ. Turbulent quantities are predicted using Menter's two-equation turbulence model (SST). The intermittency factor is obtained from a transport equation model which can produce both the experimentally observed streamwise variation of intermittency and a realistic profile in the cross-stream direction. The model had been previously validated against low-pressure turbine experiments with success. In this paper, the model is applied to predictions of three sets of recent low-pressure turbine experiments on the Pack B blade to further validate its predicting capabilities under various flow conditions. Comparisons of computational results with experimental data are provided. Overall, good agreement between the experimental data and computational results is obtained. The new model has been shown to have the capability of accurately predicting transitional flows under a wide range of low-pressure turbine conditions.

  9. Predicting neutron damage using TEM with in situ ion irradiation and computer modeling

    NASA Astrophysics Data System (ADS)

    Kirk, Marquis A.; Li, Meimei; Xu, Donghua; Wirth, Brian D.

    2018-01-01

    We have constructed a computer model of irradiation defect production closely coordinated with TEM and in situ ion irradiation of Molybdenum at 80 °C over a range of dose, dose rate and foil thickness. We have reexamined our previous ion irradiation data to assign appropriate error and uncertainty based on more recent work. The spatially dependent cascade cluster dynamics model is updated with recent Molecular Dynamics results for cascades in Mo. After a careful assignment of both ion and neutron irradiation dose values in dpa, TEM data are compared for both ion and neutron irradiated Mo from the same source material. Using the computer model of defect formation and evolution based on the in situ ion irradiation of thin foils, the defect microstructure, consisting of densities and sizes of dislocation loops, is predicted for neutron irradiation of bulk material at 80 °C and compared with experiment. Reasonable agreement between model prediction and experimental data demonstrates a promising direction in understanding and predicting neutron damage using a closely coordinated program of in situ ion irradiation experiment and computer simulation.

  10. An evaluation method of computer usability based on human-to-computer information transmission model.

    PubMed

    Ogawa, K

    1992-01-01

    This paper proposes a new evaluation and prediction method for computer usability. The method is based on our two previously proposed information transmission measures created from a human-to-computer information transmission model. The model has three information transmission levels: the device, software, and task content levels. Two measures, called the device-independent information measure (DI) and the computer-independent information measure (CI), defined on the software and task content levels respectively, are given as the amount of information transmitted. Two information transmission rates are defined as DI/T and CI/T, where T is the task completion time: the device-independent information transmission rate (RDI) and the computer-independent information transmission rate (RCI). The method utilizes the RDI and RCI rates to comparatively evaluate the usability of software and device operations on different computer systems. Experiments with a graphical information input task on three different systems confirm that the method offers an efficient way of determining computer usability.
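
    The two rates are simply the corresponding information measures divided by task completion time; the sketch below computes them for a hypothetical comparison of one task on two systems (the DI and CI values themselves would have to come from the measures defined in the paper, so the numbers here are placeholders).

      def transmission_rates(di_bits, ci_bits, completion_time_s):
          # RDI = DI / T and RCI = CI / T, with T the task completion time.
          return di_bits / completion_time_s, ci_bits / completion_time_s

      for name, di, ci, t in [("system A", 120.0, 80.0, 35.0),
                              ("system B", 120.0, 80.0, 52.0)]:
          rdi, rci = transmission_rates(di, ci, t)
          print(f"{name}: RDI = {rdi:.2f} bit/s, RCI = {rci:.2f} bit/s")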

  11. Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis

    ERIC Educational Resources Information Center

    Rubin, Samuel J.; Abrams, Binyomin

    2015-01-01

    Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…

  12. Neural Correlates of Phrase Quadrature Perception in Harmonic Rhythm: An EEG Study Using a Brain-Computer Interface.

    PubMed

    Fernández-Soto, Alicia; Martínez-Rodrigo, Arturo; Moncho-Bogani, José; Latorre, José Miguel; Fernández-Caballero, Antonio

    2018-06-01

    To establish the neural correlates of phrase quadrature perception in harmonic rhythm, a musical experiment was designed to induce music-evoked stimuli related to one important aspect of harmonic rhythm, namely the phrase quadrature. Brain activity is translated to action through electroencephalography (EEG) by using a brain-computer interface. The power spectral density of each EEG channel is estimated to determine how the signal's variance is distributed as a function of frequency. The results of processing the acquired signals are in line with previous studies that use different musical parameters to induce emotions. Indeed, our experiment shows statistical differences in the theta and alpha bands between the fulfillment and the break of phrase quadrature, an important cue of harmonic rhythm, in two classical sonatas.
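
    The spectral step can be illustrated with a generic Welch estimate followed by averaging within the theta and alpha bands (the sampling rate, band limits and random stand-in signal are assumptions; this is not the authors' processing chain).

      import numpy as np
      from scipy.signal import welch

      def band_power(channel, fs, band):
          # Mean power spectral density of one EEG channel within a band (Hz).
          freqs, psd = welch(channel, fs=fs, nperseg=min(len(channel), 2 * fs))
          lo, hi = band
          mask = (freqs >= lo) & (freqs <= hi)
          return float(np.mean(psd[mask]))

      fs = 256                                                 # assumed sampling rate
      signal = np.random.default_rng(1).normal(size=10 * fs)   # stand-in for one channel
      print("theta (4-8 Hz):", band_power(signal, fs, (4, 8)))
      print("alpha (8-13 Hz):", band_power(signal, fs, (8, 13)))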

  13. The Effective Conductivity of Random Suspensions of Spherical Particles

    NASA Astrophysics Data System (ADS)

    Bonnecaze, R. T.; Brady, J. F.

    1991-03-01

    The effective conductivity of an infinite, random, mono-disperse, hard-sphere suspension is reported for particle-to-matrix conductivity ratios of ∞, 10 and 0.01 for sphere volume fractions, c, up to 0.6. The conductivities are computed with a method previously described by the authors, which includes both far- and near-field interactions, and the particle configurations are generated via a Monte Carlo method. The results are consistent with the previous theoretical work of D. J. Jeffrey to O(c²) and the bounds computed by S. Torquato and F. Lado. It is also found that the Clausius-Mossotti equation is reasonably accurate for conductivity ratios of 10 or less all the way up to 60% (by volume). The calculated conductivities compare very well with those of experiments. In addition, percolation-like numerical experiments are performed on periodically replicated cubic lattices of N nearly touching spheres with an infinite particle-to-matrix conductivity ratio, where the conductivity is computed as spheres are removed one by one from the lattice. Under suitable normalization of the conductivity and volume fraction, it is found that the initial volume fraction must be extremely close to maximum packing in order to observe a percolation transition, indicating that the near-field effects must be very large relative to far-field effects. These percolation transitions occur at the accepted values for simple (SC), body-centred (BCC) and face-centred (FCC) cubic lattices. Also, the vulnerabilities of the lattices computed here are exactly those of previous investigators. Due to limited data above the percolation threshold, we could not correlate the conductivity with a power law near the threshold; however, it can be correlated with a power law for large normalized volume fractions. In this case the exponents are found to be 1.70, 1.75 and 1.79 for SC, BCC and FCC lattices respectively.
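
    The power-law correlation mentioned at the end can be extracted with an ordinary log-log fit; the sketch below recovers the exponent from synthetic data (the threshold, prefactor and data points are made up for illustration, and the fit is generic rather than the authors' procedure).

      import numpy as np

      def power_law_exponent(volume_fractions, conductivities, threshold):
          # Fit sigma ~ (c - c_p)^t on a log-log scale and return the exponent t.
          x = np.log(np.asarray(volume_fractions, float) - threshold)
          y = np.log(np.asarray(conductivities, float))
          t, _ = np.polyfit(x, y, 1)
          return t

      c = np.linspace(0.3, 0.6, 8)
      sigma = 2.0 * (c - 0.25) ** 1.75       # synthetic data with exponent 1.75
      print(power_law_exponent(c, sigma, threshold=0.25))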

  14. An Improved Treatment of External Boundary for Three-Dimensional Flow Computations

    NASA Technical Reports Server (NTRS)

    Tsynkov, Semyon V.; Vatsa, Veer N.

    1997-01-01

    We present an innovative numerical approach for setting highly accurate nonlocal boundary conditions at the external computational boundaries when calculating three-dimensional compressible viscous flows over finite bodies. The approach is based on application of the difference potentials method by V. S. Ryaben'kii and extends our previous technique developed for the two-dimensional case. The new boundary conditions methodology has been successfully combined with the NASA-developed code TLNS3D and used for the analysis of wing-shaped configurations in subsonic and transonic flow regimes. As demonstrated by the computational experiments, the improved external boundary conditions allow one to greatly reduce the size of the computational domain while still maintaining high accuracy of the numerical solution. Moreover, they may provide for a noticeable speedup of convergence of the multigrid iterations.

  15. The Changing Face of Human-Computer Interaction in the Age of Ubiquitous Computing

    NASA Astrophysics Data System (ADS)

    Rogers, Yvonne

    HCI is reinventing itself. No longer only about being user-centered, it has set its sights on pastures new, embracing a much broader and far-reaching set of interests. From emotional, eco-friendly, embodied experiences to context, constructivism and culture, HCI research is changing apace: from what it looks at, the lenses it uses and what it has to offer. Part of this is as a reaction to what is happening in the world; ubiquitous technologies are proliferating and transforming how we live our lives. We are becoming more connected and more dependent on technology. The home, the crèche, outdoors, public places and even the human body are now being experimented with as potential places to embed computational devices, even to the extent of invading previously private and taboo aspects of our lives. In this paper, I examine the diversity of lifestyle and technological transformations in our midst and outline some 'difficult' questions these raise together with alternative directions for HCI research and practice.

  16. Using a combined computational-experimental approach to predict antibody-specific B cell epitopes.

    PubMed

    Sela-Culang, Inbal; Benhnia, Mohammed Rafii-El-Idrissi; Matho, Michael H; Kaever, Thomas; Maybeno, Matt; Schlossman, Andrew; Nimrod, Guy; Li, Sheng; Xiang, Yan; Zajonc, Dirk; Crotty, Shane; Ofran, Yanay; Peters, Bjoern

    2014-04-08

    Antibody epitope mapping is crucial for understanding B cell-mediated immunity and required for characterizing therapeutic antibodies. In contrast to T cell epitope mapping, no computational tools are in widespread use for prediction of B cell epitopes. Here, we show that, utilizing the sequence of an antibody, it is possible to identify discontinuous epitopes on its cognate antigen. The predictions are based on residue-pairing preferences and other interface characteristics. We combined these antibody-specific predictions with results of cross-blocking experiments that identify groups of antibodies with overlapping epitopes to improve the predictions. We validate the high performance of this approach by mapping the epitopes of a set of antibodies against the previously uncharacterized D8 antigen, using complementary techniques to reduce method-specific biases (X-ray crystallography, peptide ELISA, deuterium exchange, and site-directed mutagenesis). These results suggest that antibody-specific computational predictions and simple cross-blocking experiments allow for accurate prediction of residues in conformational B cell epitopes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Computer-aided detection of initial polyp candidates with level set-based adaptive convolution

    NASA Astrophysics Data System (ADS)

    Zhu, Hongbin; Duan, Chaijie; Liang, Zhengrong

    2009-02-01

    In order to eliminate or weaken the interference between different topological structures on the colon wall, adaptive and normalized convolution methods were used to compute the first and second order spatial derivatives of computed tomographic colonography images, which is the beginning of various geometric analyses. However, the performance of such methods greatly depends on the single-layer representation of the colon wall, which is called the starting layer (SL) in the following text. In this paper, we introduce a level set-based adaptive convolution (LSAC) method to compute the spatial derivatives, in which the level set method is employed to determine a more reasonable SL. The LSAC was applied to a computer-aided detection (CAD) scheme to detect the initial polyp candidates, and experiments showed that it benefits the CAD scheme in both the detection sensitivity and specificity as compared to our previous work.

  18. Remembrance of inferences past: Amortization in human hypothesis generation.

    PubMed

    Dasgupta, Ishita; Schulz, Eric; Goodman, Noah D; Gershman, Samuel J

    2018-05-21

    Bayesian models of cognition assume that people compute probability distributions over hypotheses. However, the required computations are frequently intractable or prohibitively expensive. Since people often encounter many closely related distributions, selective reuse of computations (amortized inference) is a computationally efficient use of the brain's limited resources. We present three experiments that provide evidence for amortization in human probabilistic reasoning. When sequentially answering two related queries about natural scenes, participants' responses to the second query systematically depend on the structure of the first query. This influence is sensitive to the content of the queries, only appearing when the queries are related. Using a cognitive load manipulation, we find evidence that people amortize summary statistics of previous inferences, rather than storing the entire distribution. These findings support the view that the brain trades off accuracy and computational cost, to make efficient use of its limited cognitive resources to approximate probabilistic inference. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Entanglement-Based Machine Learning on a Quantum Computer

    NASA Astrophysics Data System (ADS)

    Cai, X.-D.; Wu, D.; Su, Z.-E.; Chen, M.-C.; Wang, X.-L.; Li, Li; Liu, N.-L.; Lu, C.-Y.; Pan, J.-W.

    2015-03-01

    Machine learning, a branch of artificial intelligence, learns from previous experience to optimize performance, and is ubiquitous in fields such as computer science, financial analysis, robotics, and bioinformatics. A challenge is that machine learning with the rapidly growing "big data" could become intractable for classical computers. Recently, quantum machine learning algorithms [Lloyd, Mohseni, and Rebentrost, arXiv:1307.0411] were proposed which could offer an exponential speedup over classical algorithms. Here, we report the first experimental entanglement-based classification of two-, four-, and eight-dimensional vectors into different clusters using a small-scale photonic quantum computer, which is then used to implement supervised and unsupervised machine learning. The results demonstrate the working principle of using quantum computers to manipulate and classify high-dimensional vectors, the core mathematical routine in machine learning. The method can, in principle, be scaled to larger numbers of qubits, and may provide a new route to accelerate machine learning.

  20. Three-Dimensional Effects in Multi-Element High Lift Computations

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Lee-Rausch, Elizabeth M.; Watson, Ralph D.

    2003-01-01

    In an effort to discover the causes for disagreement between previous two-dimensional (2-D) computations and nominally 2-D experiment for flow over the three-element McDonnell Douglas 30P-30N airfoil configuration at high lift, a combined experimental/CFD investigation is described. The experiment explores several different side-wall boundary layer control venting patterns, documents venting mass flow rates, and looks at corner surface flow patterns. The experimental angle of attack at maximum lift is found to be sensitive to the side-wall venting pattern: a particular pattern increases the angle of attack at maximum lift by at least 2 deg. A significant amount of spanwise pressure variation is present at angles of attack near maximum lift. A CFD study using three-dimensional (3-D) structured-grid computations, which includes the modeling of side-wall venting, is employed to investigate 3-D effects on the flow. Side-wall suction strength is found to affect the angle at which maximum lift is predicted. Maximum lift in the CFD is shown to be limited by the growth of an off-body corner flow vortex and consequent increase in spanwise pressure variation and decrease in circulation. The 3-D computations with and without wall venting predict similar trends to experiment at low angles of attack, but either stall too early or else overpredict lift levels near maximum lift by as much as 5%. Unstructured-grid computations demonstrate that mounting brackets lower the lift levels near maximum lift conditions.

  1. Three-Dimensional Effects on Multi-Element High Lift Computations

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Lee-Rausch, Elizabeth M.; Watson, Ralph D.

    2002-01-01

    In an effort to discover the causes for disagreement between previous 2-D computations and nominally 2-D experiment for flow over the 3-element McDonnell Douglas 30P-30N airfoil configuration at high lift, a combined experimental/CFD investigation is described. The experiment explores several different side-wall boundary layer control venting patterns, documents venting mass flow rates, and looks at corner surface flow patterns. The experimental angle of attack at maximum lift is found to be sensitive to the side-wall venting pattern: a particular pattern increases the angle of attack at maximum lift by at least 2 deg. A significant amount of spanwise pressure variation is present at angles of attack near maximum lift. A CFD study using 3-D structured-grid computations, which includes the modeling of side-wall venting, is employed to investigate 3-D effects on the flow. Side-wall suction strength is found to affect the angle at which maximum lift is predicted. Maximum lift in the CFD is shown to be limited by the growth of an off-body corner flow vortex and consequent increase in spanwise pressure variation and decrease in circulation. The 3-D computations with and without wall venting predict similar trends to experiment at low angles of attack, but either stall too early or else overpredict lift levels near maximum lift by as much as 5%. Unstructured-grid computations demonstrate that mounting brackets lower the lift levels near maximum lift conditions.

  2. Demographic and psychological variables affecting test subject evaluations of ride quality

    NASA Technical Reports Server (NTRS)

    Duncan, N. C.; Conley, H. W.

    1975-01-01

    Ride-quality experiments similar in objectives, design, and procedure were conducted, one using the U.S. Air Force Total In-Flight Simulator and the other using the Langley Passenger Ride Quality Apparatus to provide the motion environments. Large samples (80 or more per experiment) of test subjects were recruited from the Tidewater Virginia area and asked to rate the comfort (on a 7-point scale) of random aircraft motion typical of that encountered during STOL flights. Test subject characteristics of age, sex, and previous flying history (number of previous airplane flights) were studied in a two by three by three factorial design. Correlations were computed between one dependent measure, the subject's mean comfort rating, and various demographic characteristics, attitudinal variables, and the scores on Spielberger's State-Trait Anxiety Inventory. An effect of sex was found in one of the studies. Males made higher (more uncomfortable) ratings of the ride than females. Age and number of previous flights were not significantly related to comfort ratings. No significant interactions between the variables of age, sex, or previous number of flights were observed.

  3. Convolutional coding results for the MVM '73 X-band telemetry experiment

    NASA Technical Reports Server (NTRS)

    Layland, J. W.

    1978-01-01

    Results of simulation of several short-constraint-length convolutional codes using a noisy symbol stream obtained via the turnaround ranging channels of the MVM'73 spacecraft are presented. First operational use of this coding technique is on the Voyager mission. The relative performance of these codes in this environment is as previously predicted from computer-based simulations.

  4. Collaboration, Reflection and Selective Neglect: Campus-Based Marketing Students' Experiences of Using a Virtual Learning Environment

    ERIC Educational Resources Information Center

    Molesworth, Mike

    2004-01-01

    Previous studies have suggested significant benefits to using computer-mediated communication in higher education and the development of the relevant skills may also be important for preparing students for their working careers. This study is a review of the introduction of a virtual learning environment to support a group of 60 campus-based,…

  5. 26 CFR 1.404(a)-14 - Special rules in connection with the Employee Retirement Income Security Act of 1974.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... method, and experience gains and losses of previous years. (3) Limit adjustment. The term “limit... (k) of the section, where applicable) with respect to a given plan year in computing deductible... case of a plan using a spread gain funding method which maintains an unfunded liability (e.g., the...

  6. Performance analysis of wireless sensor networks in geophysical sensing applications

    NASA Astrophysics Data System (ADS)

    Uligere Narasimhamurthy, Adithya

    Performance is an important criterion to consider before switching from a wired network to a wireless sensing network. Performance is especially important in geophysical sensing, where the quality of the sensing system is measured by the precision of the acquired signal. Can a wireless sensing network maintain the same reliability and quality metrics that a wired system provides? Our work focuses on evaluating the wireless GeoMote sensor motes that were developed by previous computer science graduate students at Mines. Specifically, we conducted a set of experiments, namely WalkAway and Linear Array experiments, to characterize the performance of the wireless motes. The motes were also equipped with the Sticking Heartbeat Aperture Resynchronization Protocol (SHARP), a time synchronization protocol developed by a previous computer science graduate student at Mines. This protocol should automatically synchronize the motes' internal clocks and reduce time synchronization errors. We also collected passive data to evaluate the response of GeoMotes to various frequency components associated with the seismic waves. With the data collected from these experiments, we evaluated the performance of the SHARP protocol and compared the performance of our GeoMote wireless system against the industry-standard wired seismograph system (Geometric-Geode). Using arrival time analysis and seismic velocity calculations, we set out to answer the following question: can our wireless sensing system (GeoMotes) perform similarly to a traditional wired system in a realistic scenario?

  7. Language and Cognition Interaction Neural Mechanisms

    PubMed Central

    Perlovsky, Leonid

    2011-01-01

    How do language and cognition interact in thinking? Is language just used for communication of completed thoughts, or is it fundamental for thinking? Existing approaches have not led to a computational theory. We develop a hypothesis that language and cognition are two separate but closely interacting mechanisms. Language accumulates cultural wisdom; cognition develops mental representations modeling the surrounding world and adapts cultural knowledge to concrete circumstances of life. Language is acquired from the surrounding language “ready-made” and therefore can be acquired early in life. This early acquisition of language in childhood encompasses the entire hierarchy from sounds to words, to phrases, and to the highest concepts existing in culture. Cognition is developed from experience. Yet cognition cannot be acquired from experience alone; language is a necessary intermediary, a “teacher.” A mathematical model is developed; it overcomes previous difficulties and leads to a computational theory. This model is consistent with Arbib's “language prewired brain” built on top of the mirror neuron system. It models recent neuroimaging data about cognition that have remained unaddressed by other theories. A number of properties of language and cognition are explained that previously seemed mysterious, including the influence of language grammar on cultural evolution, which may explain specifics of English and Arabic cultures. PMID:21876687

  8. National Dam Inspection Program. Jennings Pond Dam (NDI I.D. PA-0891 DER I.D. 066-012) Susquehanna River Basin, Little Mehoopany Creek, Wyoming County, Pennsylvania. Phase I Inspection Report,

    DTIC Science & Technology

    1981-03-19

    Area 7.9 square miles (1). b. Discharge at Dam Site (cfs): Maximum known flood at dam site: Unknown. Outlet conduit at maximum pool: Unknown. Gated spillway...700 cfs, based on the available 2.4-foot freeboard relative to the low spot on the left abutment. b. Experience Data. As previously stated, Jennings...in Appendix D. The inflow hydrograph for one-half PMF was found to have a peak flow of 6835 cfs. Computer input and summary of computer output are

  9. Better Decomposition Heuristics for the Maximum-Weight Connected Graph Problem Using Betweenness Centrality

    NASA Astrophysics Data System (ADS)

    Yamamoto, Takanori; Bannai, Hideo; Nagasaki, Masao; Miyano, Satoru

    We present new decomposition heuristics for finding the optimal solution to the maximum-weight connected graph problem, which is known to be NP-hard. Previous optimal algorithms for solving the problem decompose the input graph into subgraphs using heuristics based on node degree. We propose new heuristics based on betweenness centrality measures, and show through computational experiments that our new heuristics tend to reduce the number of subgraphs in the decomposition, and therefore could lead to a reduction in the computational time for finding the optimal solution. The method is further applied to the analysis of biological pathway data.
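
    A hedged sketch of the idea described above, assuming NetworkX: pick cut nodes by betweenness centrality (rather than degree) and remove them until every connected subgraph is small enough to hand to an exact solver. This illustrates the heuristic's flavor, not the authors' full decomposition algorithm; the size threshold and test graph are placeholders.

    ```python
    # Decompose a graph by repeatedly removing the highest-betweenness node.
    import networkx as nx

    def decompose_by_betweenness(graph, max_subgraph_size):
        g = graph.copy()
        while g.number_of_nodes() > 0 and \
                max(len(c) for c in nx.connected_components(g)) > max_subgraph_size:
            centrality = nx.betweenness_centrality(g)
            cut_node = max(centrality, key=centrality.get)  # node on the most shortest paths
            g.remove_node(cut_node)
        return [g.subgraph(c).copy() for c in nx.connected_components(g)]

    # Small random graph standing in for a weighted pathway network.
    G = nx.erdos_renyi_graph(40, 0.08, seed=1)
    parts = decompose_by_betweenness(G, max_subgraph_size=15)
    print([len(p) for p in parts])
    ```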

  10. Validation of light water reactor calculation methods and JEF-1-based data libraries by TRX and BAPL critical experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paratte, J.M.; Pelloni, S.; Grimm, P.

    1991-04-01

    This paper analyzes the capability of various code systems and JEF-1-based nuclear data libraries to compute light water reactor lattices by comparing calculations with results from thermal reactor benchmark experiments TRX and BAPL and with previously published values. With the JEF-1 evaluation, eigenvalues are generally well predicted within 8 mk (1 mk = 0.001) or less by all code systems, and all methods give reasonable results for the measured reaction rate ratios within, or not too far from, the experimental uncertainty.

  11. Computational simulations of supersonic magnetohydrodynamic flow control, power and propulsion systems

    NASA Astrophysics Data System (ADS)

    Wan, Tian

    This work is motivated by the lack of a fully coupled computational tool that successfully solves the turbulent, chemically reacting Navier-Stokes equations, the electron energy conservation equation, and the electric current Poisson equation. In the present work, the abovementioned equations are solved in a fully coupled manner using fully implicit parallel GMRES methods. The system of Navier-Stokes equations is solved using a GMRES method with combined Schwarz and ILU(0) preconditioners. The electron energy equation and the electric current Poisson equation are solved using a GMRES method with combined SOR and Jacobi preconditioners. The fully coupled method has also been implemented successfully in an unstructured solver, US3D, and convergence test results were presented. This new method is shown to be two to five times faster than the original DPLR method. The Poisson solver is validated with analytic test problems. Then, four problems are selected; two of them are computed to explore the possibility of onboard MHD control and power generation, and the other two are simulations of experiments. First, the possibility of onboard reentry shock control by a magnetic field is explored. As part of a previous project, MHD power generation onboard a re-entry vehicle is also simulated. Then, the MHD acceleration experiments conducted at NASA Ames Research Center are simulated. Lastly, the MHD power generation experiments known as the HVEPS project are simulated. For code validation, the scramjet experiments at the University of Queensland are simulated first, and the generator section of the HVEPS test facility is then computed. The main conclusion is that the computational tool is accurate for different types of problems and flow conditions, and its accuracy and efficiency are necessary when the flow complexity increases.
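
    As a small, hedged illustration of the preconditioned-GMRES building block mentioned above (not the coupled MHD solver itself), the following SciPy sketch solves a sparse nonsymmetric system with GMRES using an incomplete LU factorization as preconditioner; the test matrix is an arbitrary stand-in.

    ```python
    # GMRES with an incomplete-LU preconditioner on a stand-in sparse system.
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 500
    A = sp.diags([-1.0, 2.5, -1.2], [-1, 0, 1], shape=(n, n), format="csc")
    b = np.ones(n)

    ilu = spla.spilu(A)                                   # incomplete LU factorization
    M = spla.LinearOperator((n, n), matvec=ilu.solve)     # preconditioner M ~ A^{-1}

    x, info = spla.gmres(A, b, M=M)                       # info == 0 means converged
    print(info, np.linalg.norm(A @ x - b))
    ```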

  12. Cloud computing and validation of expandable in silico livers

    PubMed Central

    2010-01-01

    Background In Silico Livers (ISLs) are works in progress. They are used to challenge multilevel, multi-attribute, mechanistic hypotheses about the hepatic disposition of xenobiotics coupled with hepatic responses. To enhance ISL-to-liver mappings, we added discrete-time metabolism, biliary elimination, and bolus dosing features to a previously validated ISL and initiated re-validation experiments that required scaling the experiments to use more simulated lobules than previously, more than could be achieved using the local cluster technology. Rather than dramatically increasing the size of our local cluster, we undertook the re-validation experiments using the Amazon EC2 cloud platform. Doing so required demonstrating the efficacy of scaling a simulation to use more cluster nodes and assessing the scientific equivalence of local cluster validation experiments with those executed using the cloud platform. Results The local cluster technology was duplicated in the Amazon EC2 cloud platform. Synthetic modeling protocols were followed to identify a successful parameterization. Experiment sample sizes (number of simulated lobules) on both platforms were 49, 70, 84, and 152 (cloud only). Experimental indistinguishability was demonstrated for ISL outflow profiles of diltiazem using both platforms for experiments consisting of 84 or more samples. The process was analogous to the demonstration of results equivalency from two different wet-labs. Conclusions The results provide additional evidence that disposition simulations using ISLs can cover the behavior space of liver experiments in distinct experimental contexts (there is in silico-to-wet-lab phenotype similarity). The scientific value of experimenting with multiscale biomedical models has been limited to research groups with access to computer clusters. The availability of cloud technology coupled with the evidence of scientific equivalency has lowered the barrier and will greatly facilitate model sharing, as well as provide straightforward tools for scaling simulations to encompass greater detail with no extra investment in hardware. PMID:21129207

  13. Multiscale atomistic simulation of metal-oxygen surface interactions: Methodological development, theoretical investigation, and correlation with experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Judith C.

    The purpose of this grant is to develop the multi-scale theoretical methods to describe the nanoscale oxidation of metal thin films, as the PI (Yang) has extensive previous experience in the experimental elucidation of the initial stages of Cu oxidation, primarily by in situ transmission electron microscopy methods. Through the use and development of computational tools at varying length (and time) scales, from atomistic quantum mechanical calculations, to force field mesoscale simulations, to large-scale Kinetic Monte Carlo (KMC) modeling, the fundamental underpinnings of the initial stages of Cu oxidation have been elucidated. The development of computational modeling tools allows for accelerated materials discovery. The theoretical tools developed from this program impact a wide range of technologies that depend on surface reactions, including corrosion, catalysis, and nanomaterials fabrication.

  14. Differential equations as a tool for community identification.

    PubMed

    Krawczyk, Małgorzata J

    2008-06-01

    We consider the task of identifying a cluster structure in random networks. The results of two methods are presented: (i) the Newman algorithm [M. E. J. Newman and M. Girvan, Phys. Rev. E 69, 026113 (2004)]; and (ii) our method based on differential equations. A series of computer experiments is performed to check whether, in applying these methods, we are able to determine the structure of the network. The trial networks consist initially of well-defined clusters and are disturbed by introducing noise into their connectivity matrices. Further, we show that an improvement of the previous version of our method is possible by an appropriate choice of the threshold parameter beta. With this change, the results obtained by the two methods are similar, and our method works better, for all the computer experiments we have done.
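
    For orientation only, here is a hedged sketch of the kind of test described above: a network with three planted clusters is disturbed by random inter-cluster "noise" edges and then partitioned with the Newman-Girvan edge-betweenness algorithm as implemented in NetworkX. The differential-equation method of this record is not reproduced; the cluster sizes and edge probabilities are arbitrary assumptions.

    ```python
    # Recover planted clusters from a noisy network with Girvan-Newman.
    from itertools import islice
    import networkx as nx
    from networkx.algorithms.community import girvan_newman

    # Three well-defined 10-node clusters, plus random inter-cluster noise edges.
    G = nx.random_partition_graph([10, 10, 10], p_in=0.9, p_out=0.05, seed=1)

    # The second level of the Girvan-Newman hierarchy splits the graph into
    # three communities, which can be compared with the planted clusters.
    communities = next(islice(girvan_newman(G), 1, 2))
    print([sorted(c) for c in communities])
    ```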

  15. A Parametric Geometry Computational Fluid Dynamics (CFD) Study Utilizing Design of Experiments (DOE)

    NASA Technical Reports Server (NTRS)

    Rhew, Ray D.; Parker, Peter A.

    2007-01-01

    Design of Experiments (DOE) was applied to the LAS geometric parameter study to efficiently identify and rank primary contributors to integrated drag over the vehicle's ascent trajectory using an order of magnitude fewer CFD configurations, thereby reducing computational resources and solution time. SMEs were able to gain a better understanding of the underlying flow physics of different geometric parameter configurations through the identification of interaction effects. An interaction effect, which describes how the effect of one factor changes with respect to the levels of other factors, is often the key to product optimization. A DOE approach emphasizes a sequential approach to learning through successive experimentation to continuously build on previous knowledge. These studies represent a starting point for expanded experimental activities that will eventually cover the entire design space of the vehicle and flight trajectory.
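
    To make the DOE terminology concrete, here is a hedged toy example, not the study's actual factors or drag data: a two-level full factorial over three hypothetical geometric parameters, with main and interaction effects computed by contrasting the high and low settings.

    ```python
    # Two-level full factorial design with main-effect and interaction estimates.
    from itertools import product
    import numpy as np

    factors = ["nose_length", "tower_diameter", "flare_angle"]        # hypothetical
    design = np.array(list(product([-1, 1], repeat=len(factors))))    # 2^3 = 8 runs

    def surrogate_drag(x1, x2, x3):
        # Placeholder response standing in for an integrated-drag CFD result.
        return 10.0 + 2.0 * x1 - 0.5 * x2 + 1.2 * x1 * x3

    responses = np.array([surrogate_drag(*run) for run in design])

    for j, name in enumerate(factors):
        effect = responses[design[:, j] == 1].mean() - responses[design[:, j] == -1].mean()
        print(f"{name}: main effect = {effect:+.2f}")

    # Interaction effect between the first and third factors.
    sign = design[:, 0] * design[:, 2]
    print(f"interaction = {responses[sign == 1].mean() - responses[sign == -1].mean():+.2f}")
    ```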

  16. Orientation-modulated attention effect on visual evoked potential: Application for PIN system using brain-computer interface.

    PubMed

    Wilaiprasitporn, Theerawit; Yagi, Tohru

    2015-01-01

    This research demonstrates the orientation-modulated attention effect on visual evoked potential. We combined this finding with our previous findings about the motion-modulated attention effect and used the result to develop novel visual stimuli for a personal identification number (PIN) application based on a brain-computer interface (BCI) framework. An electroencephalography amplifier with a single electrode channel was sufficient for our application. A computationally inexpensive algorithm and small datasets were used in processing. Seven healthy volunteers participated in experiments to measure offline performance. Mean accuracy was 83.3% at 13.9 bits/min. Encouraged by these results, we plan to continue developing the BCI-based personal identification application toward real-time systems.

  17. Efficient tiled calculation of over-10-gigapixel holograms using ray-wavefront conversion.

    PubMed

    Igarashi, Shunsuke; Nakamura, Tomoya; Matsushima, Kyoji; Yamaguchi, Masahiro

    2018-04-16

    In the calculation of large-scale computer-generated holograms, an approach called "tiling," which divides the hologram plane into small rectangles, is often employed due to limitations on computational memory. However, the total computational cost increases severely with the number of divisions. In this paper, we propose an efficient method for calculating tiled large-scale holograms using ray-wavefront conversion. In experiments, the effectiveness of the proposed method was verified by comparing its calculation cost with that of the previous method. Additionally, a hologram of 128K × 128K pixels was calculated and fabricated by a laser-lithography system, and a high-quality 105 mm × 105 mm 3D image including complicated reflection and translucency was optically reconstructed.

  18. Comparisons of Physicians' and Nurses' Attitudes towards Computers.

    PubMed

    Brumini, Gordana; Ković, Ivor; Zombori, Dejvid; Lulić, Ileana; Bilic-Zulle, Lidija; Petrovecki, Mladen

    2005-01-01

    Before starting the implementation of integrated hospital information systems, the physicians' and nurses' attitudes towards computers were measured by means of a questionnaire. The study was conducted in Dubrava University Hospital, Zagreb, Croatia. Of 194 randomly selected respondents, 141 were nurses and 53 were physicians. They were surveyed by an anonymous questionnaire consisting of 8 closed questions about demographic data, computer science education and computer usage, and 30 statements on attitudes towards computers. The statements were adapted to a Likert-type scale. Differences in attitudes towards computers between groups were compared using the Kruskal-Wallis test, with the Mann-Whitney test for post-hoc analysis. The total score represented attitudes towards computers. The physicians' total score was 130 (97-144), while the nurses' total score was 123 (88-141). This indicates that the average answer to all statements was between "agree" and "strongly agree", and these high total scores reflect positive attitudes. Age, computer science education and computer usage were important factors which enhanced the total score. Younger physicians and nurses with computer science education and with previous computer experience had more positive attitudes towards computers than others. Our results are important for planning and implementing integrated hospital information systems in Croatia.

  19. Factors affecting and affected by user acceptance of computer-based nursing documentation: results of a two-year study.

    PubMed

    Ammenwerth, Elske; Mansmann, Ulrich; Iller, Carola; Eichstädter, Ronald

    2003-01-01

    The documentation of the nursing process is an important but often neglected part of clinical documentation. Paper-based systems have been introduced to support nursing process documentation. Frequently, however, problems such as low quality of documentation are reported. It is unclear whether computer-based documentation systems can reduce these problems and which factors influence their acceptance by users. We introduced a computer-based nursing documentation system on four wards of the University Hospitals of Heidelberg and systematically evaluated its preconditions and its effects in a pretest-posttest intervention study. For the analysis of user acceptance, we concentrated on subjective data drawn from questionnaires and interviews. A questionnaire was developed using items from published questionnaires and items that had to be developed for the special purpose of this study. The quantitative results point to two factors influencing the acceptance of a new computer-based documentation system: the previous acceptance of the nursing process and the previous amount of self-confidence when using computers. On one ward, the diverse acceptance scores heavily declined after the introduction of the nursing documentation system. Explorative qualitative analysis on this ward points to further success factors of computer-based nursing documentation systems. Our results can be used to assist the planning and introduction of computer-based nursing documentation systems. They demonstrate the importance of computer experience and acceptance of the nursing process on a ward but also point to other factors such as the fit between nursing workflow and the functionality of a nursing documentation system.

  20. Interaction of a Synthetic Jet Actuator with a Severely Separated Crossflow

    NASA Astrophysics Data System (ADS)

    Jansen, Kenneth; Farnsworth, John; Rasquin, Michel; Rathay, Nick; Monastero, Marianne; Amitay, Michael

    2017-11-01

    A coordinated experimental/computational study of synthetic jet-based flow control on a vertical tail/rudder assembly has been carried out on a 1/19th scale model operating at 30 degree rudder deflection, 0 degree side slip, and 20m/s free-stream flow. Under these conditions a very strong span-wise separated flow develops over the rudder surface for a majority of its span. Twelve synthetic jets were distributed across the span of the vertical tail just upstream of the rudder hinge-line to determine their ability to reduce flow separation and thereby increase the side force production; to extend the rudder effectiveness. Experiments were completed for the baseline case (i.e. no jets blowing) and for cases where 1, 6, and 12 jets were activated. RANS and DDES computations were completed to match these four experiments. While some experimental results for the same geometry have been previously reported, more detailed results concerning the experiments and their comparison to the DDES computations for the baseline and 1 jet active cases are reported here. Specifically, this effort focuses on the near-jet flow and the phase-averaged vortical structures produced by a single jet interacting with a severely separated, turbulent cross-flow. An award of computer time was provided by the INCITE program and the Theta and Aurora ESP through ALCF which is supported by the DOE under Contract DE-AC02-06CH11357.

  1. Comparison of image features calculated in different dimensions for computer-aided diagnosis of lung nodules

    NASA Astrophysics Data System (ADS)

    Xu, Ye; Lee, Michael C.; Boroczky, Lilla; Cann, Aaron D.; Borczuk, Alain C.; Kawut, Steven M.; Powell, Charles A.

    2009-02-01

    Features calculated from different dimensions of images capture quantitative information about lung nodules through one or multiple image slices. Previously published computer-aided diagnosis (CADx) systems have used either two-dimensional (2D) or three-dimensional (3D) features, though there has been little systematic analysis of the relevance of the different dimensions and of the impact of combining different dimensions. The aim of this study is to determine the importance of combining features calculated in different dimensions. We have performed CADx experiments on 125 pulmonary nodules imaged using multi-detector row CT (MDCT). The CADx system computed 192 2D, 2.5D, and 3D image features of the lesions. Leave-one-out experiments were performed using five different combinations of features from different dimensions: 2D, 3D, 2.5D, 2D+3D, and 2D+3D+2.5D. The experiments were performed ten times for each group. Accuracy, sensitivity, and specificity were used to evaluate the performance. Wilcoxon signed-rank tests were applied to compare the classification results from these five different combinations of features. Our results showed that 3D image features generate the best result compared with other combinations of features. This suggests one approach to potentially reducing the dimensionality of the CADx data space and the computational complexity of the system while maintaining diagnostic accuracy.
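
    A hedged sketch of the leave-one-out comparison described above, assuming scikit-learn and using synthetic stand-ins for the feature groups (the 192 CADx features, the nodule data, and the actual classifier are not reproduced here).

    ```python
    # Leave-one-out accuracy for different feature-group combinations.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(0)
    n = 125                                      # nodules
    y = rng.integers(0, 2, size=n)               # benign (0) vs malignant (1)
    feats_2d = rng.normal(size=(n, 20)) + 0.3 * y[:, None]   # weak synthetic signal
    feats_3d = rng.normal(size=(n, 30)) + 0.5 * y[:, None]   # stronger synthetic signal

    groups = {"2D": feats_2d, "3D": feats_3d,
              "2D+3D": np.hstack([feats_2d, feats_3d])}

    for name, X in groups.items():
        acc = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                              cv=LeaveOneOut(), scoring="accuracy").mean()
        print(f"{name}: leave-one-out accuracy = {acc:.3f}")
    ```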

  2. Multialternative drift-diffusion model predicts the relationship between visual fixations and choice in value-based decisions.

    PubMed

    Krajbich, Ian; Rangel, Antonio

    2011-08-16

    How do we make decisions when confronted with several alternatives (e.g., on a supermarket shelf)? Previous work has shown that accumulator models, such as the drift-diffusion model, can provide accurate descriptions of the psychometric data for binary value-based choices, and that the choice process is guided by visual attention. However, the computational processes used to make choices in more complicated situations involving three or more options are unknown. We propose a model of trinary value-based choice that generalizes what is known about binary choice, and test it using an eye-tracking experiment. We find that the model provides a quantitatively accurate description of the relationship between choice, reaction time, and visual fixation data using the same parameters that were estimated in previous work on binary choice. Our findings suggest that the brain uses similar computational processes to make binary and trinary choices.
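
    A minimal simulation sketch of an attention-weighted drift-diffusion race for a three-option choice, in the spirit of the model described above; the barrier, drift, noise, and fixation-duration parameters below are placeholders, not the values estimated in the paper.

    ```python
    # Attention-weighted accumulator race for a trinary value-based choice.
    import numpy as np

    def simulate_trial(values, theta=0.3, d=0.002, sigma=0.02, fixation_ms=400, seed=0):
        rng = np.random.default_rng(seed)
        rdv = np.zeros(len(values))                 # one accumulator per item
        attended = rng.integers(len(values))        # start by fixating a random item
        for t in range(1, 100_000):
            if t % fixation_ms == 0:                # switch gaze to a random item
                attended = rng.integers(len(values))
            weights = np.full(len(values), theta)   # unattended items are down-weighted
            weights[attended] = 1.0
            drift = d * (weights * values - np.mean(weights * values))
            rdv += drift + rng.normal(0.0, sigma, size=len(values))
            if rdv.max() >= 1.0:                    # first accumulator at the barrier wins
                return int(np.argmax(rdv)), t       # (chosen item, reaction time in ms)
        return int(np.argmax(rdv)), t

    choice, rt = simulate_trial(np.array([3.0, 2.0, 1.0]))
    print(choice, rt)
    ```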

  3. A Computational Study of the Flow Physics of Acoustic Liners

    NASA Technical Reports Server (NTRS)

    Tam, Christopher

    2006-01-01

    The present investigation is a continuation of a previous joint project between the Florida State University and the NASA Langley Research Center Liner Physics Team. In the previous project, a study of acoustic liners, in two dimensions, inside a normal incidence impedance tube was carried out. The study consisted of two parts. The NASA team was responsible for the experimental part of the project. This involved performing measurements in an impedance tube with a large aspect ratio slit resonator. The FSU team was responsible for the computation part of the project. This involved performing direct numerical simulation (DNS) of the NASA experiment in two dimensions using CAA methodology. It was agreed that upon completion of numerical simulation, the computed values of the liner impedance were to be sent to NASA for validation with experimental results. On following this procedure good agreements were found between numerical results and experimental measurements over a wide range of frequencies and sound-pressure-level. Broadband incident sound waves were also simulated numerically and measured experimentally. Overall, good agreements were also found.

  4. Memory states influence value-based decisions.

    PubMed

    Duncan, Katherine D; Shohamy, Daphna

    2016-11-01

    Using memory to guide decisions allows past experience to improve future outcomes. However, the circumstances that modulate how and when memory influences decisions are not well understood. Here, we report that the use of memories to guide decisions depends on the context in which these decisions are made. We show that decisions made in the context of familiar images are more likely to be influenced by past events than are decisions made in the context of novel images (Experiment 1), that this bias persists even when a temporal gap is introduced between the image presentation and the decision (Experiment 2), and that contextual novelty facilitates value learning whereas familiarity facilitates the retrieval and use of previously learned values (Experiment 3). These effects are consistent with neurobiological and computational models of memory, which propose that familiar images evoke a lingering "retrieval state" that facilitates the recollection of other episodic memories. Together, these experiments highlight the importance of episodic memory for decision-making and provide an example of how computational and neurobiological theories can lead to new insights into how and when different types of memories guide our choices. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  5. Virtual reality simulators: valuable surgical skills trainers or video games?

    PubMed

    Willis, Ross E; Gomez, Pedro Pablo; Ivatury, Srinivas J; Mitra, Hari S; Van Sickle, Kent R

    2014-01-01

    Virtual reality (VR) and physical model (PM) simulators differ in terms of whether the trainee is manipulating actual 3-dimensional objects (PM) or computer-generated 3-dimensional objects (VR). Much like video games (VG), VR simulators utilize computer-generated graphics. These differences may have profound effects on the utility of VR and PM training platforms. In this study, we aimed to determine whether a relationship exists between VR, PM, and VG platforms. VR and PM simulators for laparoscopic camera navigation ([LCN], experiment 1) and flexible endoscopy ([FE] experiment 2) were used in this study. In experiment 1, 20 laparoscopic novices played VG and performed 0° and 30° LCN exercises on VR and PM simulators. In experiment 2, 20 FE novices played VG and performed colonoscopy exercises on VR and PM simulators. In both experiments, VG performance was correlated with VR performance but not with PM performance. Performance on VR simulators did not correlate with performance on respective PM models. VR environments may be more like VG than previously thought. © 2013 Published by Association of Program Directors in Surgery on behalf of Association of Program Directors in Surgery.

  6. Efficient marginalization to compute protein posterior probabilities from shotgun mass spectrometry data

    PubMed Central

    Serang, Oliver; MacCoss, Michael J.; Noble, William Stafford

    2010-01-01

    The problem of identifying proteins from a shotgun proteomics experiment has not been definitively solved. Identifying the proteins in a sample requires ranking them, ideally with interpretable scores. In particular, “degenerate” peptides, which map to multiple proteins, have made such a ranking difficult to compute. The problem of computing posterior probabilities for the proteins, which can be interpreted as confidence in a protein’s presence, has been especially daunting. Previous approaches have either ignored the peptide degeneracy problem completely, addressed it by computing a heuristic set of proteins or heuristic posterior probabilities, or by estimating the posterior probabilities with sampling methods. We present a probabilistic model for protein identification in tandem mass spectrometry that recognizes peptide degeneracy. We then introduce graph-transforming algorithms that facilitate efficient computation of protein probabilities, even for large data sets. We evaluate our identification procedure on five different well-characterized data sets and demonstrate our ability to efficiently compute high-quality protein posteriors. PMID:20712337
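
    To show what marginalizing over protein configurations means, here is a hedged, brute-force toy version (not the authors' graph-transforming algorithm, which avoids exactly this enumeration): every protein presence/absence state is enumerated and each protein's posterior is accumulated. The prior, emission, and noise parameters and the tiny peptide map are invented for illustration.

    ```python
    # Brute-force protein posteriors for a toy protein-peptide bipartite graph.
    from itertools import product

    protein_peptides = {"P1": ["pepA", "pepB"],      # pepB is "degenerate": P1 or P2
                        "P2": ["pepB", "pepC"],
                        "P3": ["pepD"]}
    detected = {"pepA": True, "pepB": True, "pepC": False, "pepD": False}
    gamma, alpha, beta = 0.5, 0.9, 0.05              # prior, emission, noise probabilities

    def peptide_likelihood(pep, present):
        creators = sum(1 for p in present if pep in protein_peptides[p])
        p_detect = 1.0 - (1.0 - beta) * (1.0 - alpha) ** creators
        return p_detect if detected[pep] else 1.0 - p_detect

    posterior = {p: 0.0 for p in protein_peptides}
    total = 0.0
    for state in product([False, True], repeat=len(protein_peptides)):
        present = {p for p, s in zip(protein_peptides, state) if s}
        prior = gamma ** len(present) * (1 - gamma) ** (len(protein_peptides) - len(present))
        joint = prior
        for pep in detected:
            joint *= peptide_likelihood(pep, present)
        total += joint
        for p in present:
            posterior[p] += joint

    # P1 ends up near 1; P2 stays low because its unshared peptide pepC was not
    # detected (the shared pepB is explained by P1); P3 stays low as well.
    print({p: round(v / total, 3) for p, v in posterior.items()})
    ```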

  7. Combining wet and dry research: experience with model development for cardiac mechano-electric structure-function studies

    PubMed Central

    Quinn, T. Alexander; Kohl, Peter

    2013-01-01

    Since the development of the first mathematical cardiac cell model 50 years ago, computational modelling has become an increasingly powerful tool for the analysis of data and for the integration of information related to complex cardiac behaviour. Current models build on decades of iteration between experiment and theory, representing a collective understanding of cardiac function. All models, whether computational, experimental, or conceptual, are simplified representations of reality and, like tools in a toolbox, suitable for specific applications. Their range of applicability can be explored (and expanded) by iterative combination of ‘wet’ and ‘dry’ investigation, where experimental or clinical data are used to first build and then validate computational models (allowing integration of previous findings, quantitative assessment of conceptual models, and projection across relevant spatial and temporal scales), while computational simulations are utilized for plausibility assessment, hypotheses-generation, and prediction (thereby defining further experimental research targets). When implemented effectively, this combined wet/dry research approach can support the development of a more complete and cohesive understanding of integrated biological function. This review illustrates the utility of such an approach, based on recent examples of multi-scale studies of cardiac structure and mechano-electric function. PMID:23334215

  8. Use of redundant sets of landmark information by humans (Homo sapiens) in a goal-searching task in an open field and on a computer screen.

    PubMed

    Sekiguchi, Katsuo; Ushitani, Tomokazu; Sawa, Kosuke

    2018-05-01

    Landmark-based goal-searching tasks that were similar to those for pigeons (Ushitani & Jitsumori, 2011) were provided to human participants to investigate whether they could learn and use multiple sources of spatial information that redundantly indicate the position of a hidden target in both an open field (Experiment 1) and on a computer screen (Experiments 2 and 3). During the training in each experiment, participants learned to locate a target in 1 of 25 objects arranged in a 5 × 5 grid, using two differently colored, arrow-shaped (Experiments 1 and 2) or asymmetrically shaped (Experiment 3) landmarks placed adjacent to the goal and pointing to the goal location. The absolute location and directions of the landmarks varied across trials, but the constant configuration of the goal and the landmarks enabled participants to find the goal using both global configural information and local vector information (pointing to the goal by each individual landmark). On subsequent test trials, the direction was changed for one of the landmarks to conflict with the global configural information. Results of Experiment 1 indicated that participants used vector information from a single landmark but not configural information. Further examinations revealed that the use of global (metric) information was enhanced remarkably by goal searching with nonarrow-shaped landmarks on the computer monitor (Experiment 3) but much less so with arrow-shaped landmarks (Experiment 2). The General Discussion focuses on a comparison between humans in the current study and pigeons in the previous study. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  9. Evaluation of the concrete shield compositions from the 2010 criticality accident alarm system benchmark experiments at the CEA Valduc SILENE facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Thomas Martin; Celik, Cihangir; Dunn, Michael E

    In October 2010, a series of benchmark experiments were conducted at the French Commissariat a l'Energie Atomique et aux Energies Alternatives (CEA) Valduc SILENE facility. These experiments were a joint effort between the United States Department of Energy Nuclear Criticality Safety Program and the CEA. The purpose of these experiments was to create three benchmarks for the verification and validation of radiation transport codes and evaluated nuclear data used in the analysis of criticality accident alarm systems. This series of experiments consisted of three single-pulsed experiments with the SILENE reactor. For the first experiment, the reactor was bare (unshielded), whereas in the second and third experiments, it was shielded by lead and polyethylene, respectively. The polyethylene shield of the third experiment had a cadmium liner on its internal and external surfaces, which vertically was located near the fuel region of SILENE. During each experiment, several neutron activation foils and thermoluminescent dosimeters (TLDs) were placed around the reactor. Nearly half of the foils and TLDs had additional high-density magnetite concrete, high-density barite concrete, standard concrete, and/or BoroBond shields. CEA Saclay provided all the concrete, and the US Y-12 National Security Complex provided the BoroBond. Measurement data from the experiments were published at the 2011 International Conference on Nuclear Criticality (ICNC 2011) and the 2013 Nuclear Criticality Safety Division (NCSD 2013) topical meeting. Preliminary computational results for the first experiment were presented in the ICNC 2011 paper, which showed poor agreement between the computational results and the measured values of the foils shielded by concrete. Recently, the hydrogen content, boron content, and density of these concrete shields were further investigated within the constraints of the previously available data. New computational results for the first experiment are now available that show much better agreement with the measured values.

  10. Key steps in developing a cognitive vaccine against traumatic flashbacks: visuospatial Tetris versus verbal Pub Quiz.

    PubMed

    Holmes, Emily A; James, Ella L; Kilford, Emma J; Deeprose, Catherine

    2010-11-10

    Flashbacks (intrusive memories of a traumatic event) are the hallmark feature of Post Traumatic Stress Disorder; however, preventative interventions are lacking. Tetris may offer a 'cognitive vaccine' [1] against flashback development after trauma exposure. We previously reported that playing the computer game Tetris soon after viewing traumatic material reduced flashbacks compared to no-task [1]. However, two criticisms need to be addressed for clinical translation: (1) Would all games have this effect via distraction/enjoyment, or might some games even be harmful? (2) Would effects be found if administered several hours post-trauma? Accordingly, we tested Tetris versus an alternative computer game--Pub Quiz--which we hypothesized not to be helpful (Experiments 1 and 2), and extended the intervention interval to 4 hours (Experiment 2). The trauma film paradigm was used as an experimental analog for flashback development in healthy volunteers. In both experiments, participants viewed traumatic film footage of death and injury before completing one of the following: (1) no-task control condition, (2) Tetris, or (3) Pub Quiz. Flashbacks were monitored for 1 week. Experiment 1: 30 min after the traumatic film, playing Tetris led to a significant reduction in flashbacks compared to no-task control, whereas Pub Quiz led to a significant increase in flashbacks. Experiment 2: 4 hours post-film, playing Tetris led to a significant reduction in flashbacks compared to no-task control, whereas Pub Quiz did not. First, computer games can have differential effects post-trauma, as predicted by a cognitive science formulation of trauma memory. In both experiments, playing Tetris post-trauma film reduced flashbacks. Pub Quiz did not have this effect, even increasing flashbacks in Experiment 1. Thus not all computer games are beneficial or merely distracting post-trauma - some may be harmful. Second, the beneficial effects of Tetris are retained at 4 hours post-trauma. Clinically, this delivers a feasible time-window to administer a post-trauma "cognitive vaccine".

  11. A research program in empirical computer science

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1991-01-01

    During the grant reporting period our primary activities have been to begin preparation for the establishment of a research program in experimental computer science. The focus of research in this program will be safety-critical systems. Many questions that arise in the effort to improve software dependability can only be addressed empirically. For example, there is no way to predict the performance of the various proposed approaches to building fault-tolerant software. Performance models, though valuable, are parameterized and cannot be used to make quantitative predictions without experimental determination of underlying distributions. In the past, experimentation has been able to shed some light on the practical benefits and limitations of software fault tolerance. It is common, also, for experimentation to reveal new questions or new aspects of problems that were previously unknown. A good example is the Consistent Comparison Problem that was revealed by experimentation and subsequently studied in depth. The result was a clear understanding of a previously unknown problem with software fault tolerance. The purpose of a research program in empirical computer science is to perform controlled experiments in the area of real-time, embedded control systems. The goal of the various experiments will be to determine better approaches to the construction of the software for computing systems that have to be relied upon. As such it will validate research concepts from other sources, provide new research results, and facilitate the transition of research results from concepts to practical procedures that can be applied with low risk to NASA flight projects. The target of experimentation will be the production software development activities undertaken by any organization prepared to contribute to the research program. Experimental goals, procedures, data analysis and result reporting will be performed for the most part by the University of Virginia.

  12. Experience and Sentence Processing: Statistical Learning and Relative Clause Comprehension

    PubMed Central

    Wells, Justine B.; Christiansen, Morten H.; Race, David S.; Acheson, Daniel J.; MacDonald, Maryellen C.

    2009-01-01

    Many explanations of the difficulties associated with interpreting object relative clauses appeal to the demands that object relatives make on working memory. MacDonald and Christiansen (2002) pointed to variations in reading experience as a source of differences, arguing that the unique word order of object relatives makes their processing more difficult and more sensitive to the effects of previous experience than the processing of subject relatives. This hypothesis was tested in a large-scale study manipulating reading experiences of adults over several weeks. The group receiving relative clause experience increased reading speeds for object relatives more than for subject relatives, whereas a control experience group did not. The reading time data were compared to performance of a computational model given different amounts of experience. The results support claims for experience-based individual differences and an important role for statistical learning in sentence comprehension processes. PMID:18922516

  13. Interpretation of sucrose gradient sedimentation pattern of deoxyribonucleic acid fragments resulting from random breaks.

    PubMed

    Litwin, S; Shahn, E; Kozinski, A W

    1969-07-01

    Mass distribution in a sucrose gradient of deoxyribonucleic acid (DNA) fragments arising as a result of random breaks is predicted by analytical means from which computer evaluations are plotted. The analytical results are compared with the results of verifying experiments: (i) a Monte Carlo computer experiment in which simulated molecules of DNA were individuals of unit length subjected to random "breaks" applied by a random number generator, and (ii) an in vitro experiment in which molecules of T4 DNA, highly labeled with (32)P, were stored in liquid nitrogen for variable periods of time during which a precisely known number of (32)P atoms decayed, causing single-stranded breaks. The distribution of sizes of the resulting fragments was measured in an alkaline sucrose gradient. The profiles obtained in this fashion were compared with the mathematical predictions. Both experiments agree with the analytical approach and thus permit the use of the graphs obtained from the latter as a means of determining the average number of random breaks in DNA from distributions obtained experimentally in a sucrose gradient. An example of the application of this procedure to a previously unresolved problem is provided in the case of DNA from ultraviolet-irradiated phage which undergoes a dose-dependent intracellular breakdown. The relationship between the number of lethal hits and the number of single-stranded breaks was not previously established. A comparison of the calculated number of nicks per strand of DNA with the known dose in phage-lethal hits reveals a relationship closely approximating one lethal hit to one single-stranded break.
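
    The following hedged sketch mirrors the Monte Carlo verification experiment described above (with invented parameters): unit-length molecules receive a random number of breaks at uniformly random positions, and the resulting fragment lengths are tallied into a mass-weighted distribution of the kind read off a sedimentation profile.

    ```python
    # Monte Carlo fragment-length distribution for randomly broken unit-length molecules.
    import numpy as np

    rng = np.random.default_rng(42)

    def fragment_lengths(mean_breaks, n_molecules=10_000):
        fragments = []
        for _ in range(n_molecules):
            n_breaks = rng.poisson(mean_breaks)
            cuts = np.sort(rng.random(n_breaks))          # break positions in (0, 1)
            edges = np.concatenate(([0.0], cuts, [1.0]))
            fragments.extend(np.diff(edges))              # lengths between successive breaks
        return np.array(fragments)

    lengths = fragment_lengths(mean_breaks=3.0)
    # Mass distribution: each fragment contributes in proportion to its length.
    mass_hist, bin_edges = np.histogram(lengths, bins=50, range=(0.0, 1.0), weights=lengths)
    print(mass_hist[:5], bin_edges[:6])
    ```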

  14. USSR Report, Cybernetics, Computers and Automation Technology

    DTIC Science & Technology

    1987-04-02

    Communication Channel (NTR: PROBLEMY I RESHENIYA, No 14, 22 Jul-4 Aug 86) 52 EDUCATION Informatics and the National Information Resource (I. Chebotaru...the method of actions, which were successful in the past. The experience of previous developments is implemented in the prototype programs. Many data...of the converter lining, due to reduction of ferroalloy consumption, oxygen consumption and energy resource consumption and due to a decrease of

  15. Digital multishaker modal testing

    NASA Technical Reports Server (NTRS)

    Blair, M.; Craig, R. R., Jr.

    1983-01-01

    A review of several modal testing techniques is made, along with brief discussions of their advantages and limitations. A new technique is presented which overcomes many of the previous limitations. Several simulated experiments are included to verify the validity and accuracy of the new method. Conclusions are drawn from the simulation studies and recommendations for further work are presented. The complete computer code configured for the simulation study is presented.

  16. Studies on Vapor Adsorption Systems

    NASA Technical Reports Server (NTRS)

    Shamsundar, N.; Ramotowski, M.

    1998-01-01

    The project consisted of performing experiments on single and dual bed vapor adsorption systems, thermodynamic cycle optimization, and thermal modeling. The work was described in a technical paper that appeared in conference proceedings and a Master's thesis, which were previously submitted to NASA. The present report describes some additional thermal modeling work done subsequently, and includes listings of computer codes developed during the project. Recommendations for future work are provided.

  17. 1-1 in Education: Current Practice, International Comparative Research Evidence and Policy Implications. OECD Education Working Papers, No. 44,

    ERIC Educational Resources Information Center

    Valiente, Oscar

    2010-01-01

    Over the last decade, more and more public and private stakeholders, in developed and developing countries, have been supporting 1:1 initiatives in education (i.e. every child receives her/his own personal computing device). These 1:1 initiatives represent a qualitative move forward from previous educational experiences with ICT, inasmuch as every…

  18. Experiments in Computing: A Survey

    PubMed Central

    Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general. PMID:24688404

  19. Experiments in computing: a survey.

    PubMed

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  20. Do aggressive people play violent computer games in a more aggressive way? Individual difference and idiosyncratic game-playing experience.

    PubMed

    Peng, Wei; Liu, Ming; Mou, Yi

    2008-04-01

    This study investigates whether individual difference influences the idiosyncratic experience of game playing. In particular, we examine the relationship between the game player's physical-aggressive personality and the aggressiveness of the player's game playing in violence-oriented video games. Screen video streams of 40 individual participants' game playing were captured and content analyzed. Participants' physical aggression was measured before the game play. The results suggest that people with a more physical-aggressive personality engage in a more aggressive style of playing, after controlling for differences in gender and previous gaming experience. Implications of these findings and directions for future studies are discussed.

  1. A method of computer modelling the lithium-ion batteries aging process based on the experimental characteristics

    NASA Astrophysics Data System (ADS)

    Czerepicki, A.; Koniak, M.

    2017-06-01

    The paper presents a method of modelling the aging processes of lithium-ion batteries, its implementation as a computer application, and results for battery state estimation. The authors use a previously developed behavioural battery model, which was built using battery operating characteristics obtained from experiment. This model was implemented in the form of a computer program using a database to store battery characteristics. The battery aging process is a new, extended functionality of the model. The computer simulation algorithm uses real measurements of battery capacity as a function of the number of battery charge and discharge cycles. The simulation allows incomplete charge or discharge cycles, which are characteristic of transport powered by electricity, to be taken into account. The developed model was used to simulate battery state estimation for different load profiles, obtained by measuring the movement of selected means of transport.
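
    As a rough, hedged sketch of the capacity-lookup idea this record describes: measured capacity versus the number of full charge/discharge cycles is interpolated, and partial cycles (typical of electric-transport duty profiles) are counted fractionally before the lookup. The cycle counts and capacities below are invented, not the paper's data.

    ```python
    # Interpolate remaining capacity from measured capacity-vs-cycles data,
    # counting partial charge/discharge cycles as fractions of a full cycle.
    import numpy as np

    measured_cycles = np.array([0, 200, 400, 600, 800, 1000])
    measured_capacity_ah = np.array([2.60, 2.52, 2.45, 2.36, 2.25, 2.10])

    def capacity_after(cycle_depths):
        equivalent_full_cycles = float(np.sum(cycle_depths))   # e.g. 0.5 for a half cycle
        return float(np.interp(equivalent_full_cycles, measured_cycles, measured_capacity_ah))

    # Example: 350 full cycles plus 300 half-depth cycles of a traction battery.
    history = np.concatenate((np.ones(350), np.full(300, 0.5)))
    print(f"estimated capacity: {capacity_after(history):.2f} Ah")
    ```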

  2. Virtual viewpoint synthesis in multi-view video system

    NASA Astrophysics Data System (ADS)

    Li, Fang; Yang, Shiqiang

    2005-07-01

    In this paper, we present a virtual viewpoint video synthesis algorithm designed to meet three aims: low computational cost, real-time interpolation, and acceptable video quality. In contrast with previous techniques, the method recovers partial 3D structure from neighbouring video sources instead of reconstructing full 3D information from all sources, which greatly reduces the computation; this allows our interactive multi-view video synthesis algorithm to run on a personal computer. Furthermore, because correspondences between frames captured by neighbouring cameras are built from selected feature points, camera calibration is not required. Finally, the method works when the angle between neighbouring cameras is 25-30 degrees, much larger than in common computer vision experiments. As a result, it can be applied to many applications such as live sports broadcasting and video conferencing.

  3. Computation of turbulent flow in a thin liquid layer of fluid involving a hydraulic jump

    NASA Technical Reports Server (NTRS)

    Rahman, M. M.; Faghri, A.; Hankey, W. L.

    1991-01-01

    Numerically computed flow fields and free surface height distributions are presented for the flow of a thin layer of liquid adjacent to a solid horizontal surface that encounters a hydraulic jump. Two kinds of flow configurations are considered: two-dimensional plane flow and axisymmetric radial flow. The computations used a boundary-fitted moving grid method with a k-epsilon model for the closure of turbulence. The free surface height was determined by an optimization procedure which minimized the error in the pressure distribution on the free surface. It was also checked against an approximate procedure involving integration of the governing equations and use of the MacCormack predictor-corrector method. The computed film height also compared reasonably well with previous experiments. A region of recirculating flow was found to be present adjacent to the solid boundary near the location of the jump, which was caused by a rapid deceleration of the flow.

  4. An Improved Computational Technique for Calculating Electromagnetic Forces and Power Absorptions Generated in Spherical and Deformed Body in Levitation Melting Devices

    NASA Technical Reports Server (NTRS)

    Zong, Jin-Ho; Szekely, Julian; Schwartz, Elliot

    1992-01-01

    An improved computational technique for calculating the electromagnetic force field, the power absorption and the deformation of an electromagnetically levitated metal sample is described. The technique is based on the volume integral method, but represents a substantial refinement; the coordinate transformation employed allows the efficient treatment of a broad class of rotationally symmetrical bodies. Computed results are presented to represent the behavior of levitation melted metal samples in a multi-coil, multi-frequency levitation unit to be used in microgravity experiments. The theoretical predictions are compared with both analytical solutions and with the results of previous computational efforts for spherical samples, and the agreement is very good. The treatment of problems involving deformed surfaces, including the actual prediction of the deformed shape of the specimens, breaks new ground and should be the main contribution of the proposed method.

  5. Turbulence measurements in a swirling confined jet flowfield using a triple hot-wire probe

    NASA Technical Reports Server (NTRS)

    Janjua, S. I.; Mclaughlin, D. K.

    1982-01-01

    An axisymmetric swirling confined jet flowfield, similar to that encountered in gas turbine combustors was investigated using a triple hot-wire probe. The raw data from the three sensors were digitized using ADC's and stored on a Tektronix 4051 computer. The data were further reduced on the computer to obtain time-series for the three instantaneous velocity components in the flowfield. The time-mean velocities and the turbulence quantities were deduced. Qualification experiments were performed and where possible results compared with independent measurements. The major qualification experiments involved measurements performed in a non-swirling flow compared with conventional X-wire measurements. In the swirling flowfield, advantages of the triple wire technique over the previously used multi-position single hot-wire method are noted. The measurements obtained provide a data base with which the predictions of turbulence models in a recirculating swirling flowfield can be evaluated.

  6. Effect of turbulence on the disintegration rate of flushable consumer products.

    PubMed

    Karadagli, Fatih; Rittmann, Bruce E; McAvoy, Drew C; Richardson, John E

    2012-05-01

    A previously developed model for the physical disintegration of flushable consumer products is expanded by investigating the effects of turbulence on the rate of physical disintegration. Disintegration experiments were conducted with cardboard tampon applicators at 100, 150, and 200 rotations per minute, corresponding to Reynolds numbers of 25,900, 39,400, and 52,900, respectively, which were estimated by using computational fluid dynamics modeling. The experiments were simulated with the disintegration model to obtain best-fit values of the kinetic and distribution parameters. Computed rate coefficients (ki) for all solid sizes (i.e., greater than 8, 4 to 8, 2 to 4, and 1 to 2 mm) increased strongly with Reynolds number or rotational speed. Thus, turbulence strongly affected the disintegration rate of flushable products, and the relationship of the ki values to Reynolds number can be included in mathematical representations of physical disintegration.
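
    For orientation, the sketch below shows one common back-of-the-envelope way to relate rotational speed to a Reynolds number, Re = rho * N * D^2 / mu for a rotating vessel. The paper's values were estimated with computational fluid dynamics, so this simplified formula and the assumed characteristic diameter D are illustrative only, not the authors' method.

    ```python
    # Illustrative only: an impeller-style Reynolds number Re = rho * N * D**2 / mu.
    # D is a hypothetical characteristic diameter; the paper used CFD instead.
    rho = 998.0      # kg/m^3, water at ~20 C
    mu = 1.0e-3      # Pa.s
    D = 0.10         # m, assumed characteristic diameter

    for rpm in (100, 150, 200):
        N = rpm / 60.0                      # revolutions per second
        Re = rho * N * D**2 / mu
        print(f"{rpm} rpm -> Re ~ {Re:,.0f}")
    ```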

  7. Solution x-ray scattering and structure formation in protein dynamics

    NASA Astrophysics Data System (ADS)

    Nasedkin, Alexandr; Davidsson, Jan; Niemi, Antti J.; Peng, Xubiao

    2017-12-01

    We propose a computationally effective approach that builds on Landau mean-field theory in combination with modern nonequilibrium statistical mechanics to model and interpret protein dynamics and structure formation in small- to wide-angle x-ray scattering (S/WAXS) experiments. We develop the methodology by analyzing experimental data in the case of Engrailed homeodomain protein as an example. We demonstrate how to interpret S/WAXS data qualitatively with a good precision and over an extended temperature range. We explain experimental observations in terms of protein phase structure, and we make predictions for future experiments and for how to analyze data at different ambient temperature values. We conclude that the approach we propose has the potential to become a highly accurate, computationally effective, and predictive tool for analyzing S/WAXS data. For this, we compare our results with those obtained previously in an all-atom molecular dynamics simulation.

  8. Production Experiences with the Cray-Enabled TORQUE Resource Manager

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ezell, Matthew A; Maxwell, Don E; Beer, David

    High performance computing resources utilize batch systems to manage the user workload. Cray systems are uniquely different from typical clusters due to Cray's Application Level Placement Scheduler (ALPS). ALPS manages binary transfer, job launch and monitoring, and error handling. Batch systems require special support to integrate with ALPS using an XML protocol called BASIL. Previous versions of Adaptive Computing's TORQUE and Moab batch suite integrated with ALPS from within Moab, using PERL scripts to interface with BASIL. This would occasionally lead to problems when all the components would become unsynchronized. Version 4.1 of the TORQUE Resource Manager introduced new features that allow it to directly integrate with ALPS using BASIL. This paper describes production experiences at Oak Ridge National Laboratory using the new TORQUE software versions, as well as ongoing and future work to improve TORQUE.

  9. Experience with abstract notation one

    NASA Technical Reports Server (NTRS)

    Harvey, James D.; Weaver, Alfred C.

    1990-01-01

    The development of computer science has produced a vast number of machine architectures, programming languages, and compiler technologies. The cross product of these three characteristics defines the spectrum of previous and present data representation methodologies. With regard to computer networks, the uniqueness of these methodologies presents an obstacle when disparate host environments are to be interconnected. Interoperability within a heterogeneous network relies upon the establishment of data representation commonality. The International Standards Organization (ISO) is currently developing the abstract syntax notation one standard (ASN.1) and the basic encoding rules standard (BER) that collectively address this problem. When used within the presentation layer of the open systems interconnection reference model, these two standards provide the data representation commonality required to facilitate interoperability. The details of a compiler that was built to automate the use of ASN.1 and BER are described. From this experience, insights into both standards are given and potential problems relating to this development effort are discussed.

  10. Omissions and Byproducts across Moral Domains

    PubMed Central

    DeScioli, Peter; Asao, Kelly; Kurzban, Robert

    2012-01-01

    Research indicates that moral violations are judged less wrong when the violation results from omission as opposed to commission, and when the violation is a byproduct as opposed to a means to an end. Previous work examined these effects mainly for violent offenses such as killing. Here we investigate the generality of these effects across a range of moral violations including sexuality, food, property, and group loyalty. In Experiment 1, we observed omission effects in wrongness ratings for all of the twelve offenses investigated. In Experiments 2 and 3, we observed byproduct effects in wrongness ratings for seven and eight offenses (out of twelve), respectively, and we observed byproduct effects in forced-choice responses for all twelve offenses. Our results address an ongoing debate about whether different cognitive systems compute moral wrongness for different types of behaviors (surrounding violence, sexuality, food, etc.), or, alternatively, a common cognitive architecture computes wrongness for a variety of behaviors. PMID:23071678

  11. Frequency and associated risk factors for neck pain among software engineers in Karachi, Pakistan.

    PubMed

    Rasim Ul Hasanat, Mohammad; Ali, Syed Shahzad; Rasheed, Abdur; Khan, Muhammad

    2017-07-01

    To determine the frequency of neck pain and its association with risk factors among software engineers. This descriptive, cross-sectional study was conducted at the Dow University of Health Sciences, Karachi, from February to March 2016, and comprised software engineers from 19 different locations. A non-probability purposive sampling technique was used to select individuals spending at least 6 hours in front of computer screens every day and having a work experience of at least 6 months. Data were collected using a self-administered questionnaire. SPSS 21 was used for data analysis. Of the 185 participants, 49 (26.5%) had neck pain at the time of data-gathering, while 136 (73.5%) reported no pain. However, 119 (64.32%) participants had a previous history of neck pain. Other factors, such as smoking, physical inactivity, a history of muscular or neck pain, an uncomfortable workstation, work-related mental stress, and insufficient sleep at night, were found to be significantly associated with current neck pain (p<0.05 each). Intensive computer users are likely to experience at least one episode of computer-associated neck pain.

  12. Using distributed partial memories to improve self-organizing collective movements.

    PubMed

    Winder, Ransom; Reggia, James A

    2004-08-01

    Past self-organizing models of collectively moving "particles" (simulated bird flocks, fish schools, etc.) have typically been based on purely reflexive agents that have no significant memory of past movements. We hypothesized that giving such individual particles a limited distributed memory of past obstacles they encountered could lead to significantly faster travel between goal destinations. Systematic computational experiments using six terrains that had different arrangements of obstacles demonstrated that, at least in some domains, this conjecture is true. Furthermore, these experiments demonstrated that improved performance over time came not only from the avoidance of previously seen obstacles, but also (surprisingly) immediately after first encountering obstacles due to decreased delays in circumventing those obstacles. Simulations also showed that, of the four strategies we tested for removal of remembered obstacles when memory was full and a new obstacle was to be saved, none was better than random selection. These results may be useful in interpreting future experimental research on group movements in biological populations, and in improving existing methodologies for control of collective movements in computer graphics, robotic teams, particle swarm optimization, and computer games.
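
    The following is a hypothetical sketch (not the authors' simulation) of the core idea: each particle keeps a fixed-size memory of obstacle locations, with a pluggable eviction rule for when the memory is full, such as the random-removal strategy that performed as well as any other in the study.

    ```python
    # Hypothetical sketch of a fixed-size per-particle obstacle memory with a
    # pluggable eviction strategy; positions and distances are illustrative.
    import random

    class ObstacleMemory:
        def __init__(self, capacity: int, evict: str = "random"):
            self.capacity = capacity
            self.evict = evict
            self.obstacles = []            # remembered obstacle positions (x, y)

        def remember(self, obstacle):
            if obstacle in self.obstacles:
                return
            if len(self.obstacles) >= self.capacity:
                if self.evict == "random":
                    self.obstacles.pop(random.randrange(len(self.obstacles)))
                else:                      # e.g. "oldest": drop the earliest entry
                    self.obstacles.pop(0)
            self.obstacles.append(obstacle)

        def repels(self, position, radius=2.0):
            """True if a remembered obstacle is close enough to steer around."""
            return any(abs(position[0] - ox) + abs(position[1] - oy) <= radius
                       for ox, oy in self.obstacles)

    mem = ObstacleMemory(capacity=5)
    for obs in [(1, 1), (4, 2), (7, 7), (3, 9), (0, 5), (8, 8)]:
        mem.remember(obs)
    print(len(mem.obstacles), mem.repels((4, 3)))
    ```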

  13. SAVLOC, computer program for automatic control and analysis of X-ray fluorescence experiments

    NASA Technical Reports Server (NTRS)

    Leonard, R. F.

    1977-01-01

    A program for a PDP-15 computer is presented which provides for control and analysis of trace element determinations by using X-ray fluorescence. The program simultaneously handles data accumulation for one sample and analysis of data from previous samples. Data accumulation consists of sample changing, timing, and data storage. Analysis requires the locating of peaks in X-ray spectra, determination of peak intensities, identification of the origins of peaks, and determination of the areal density of the element responsible for each peak. The program may be run in either a manual (supervised) mode or an automatic (unsupervised) mode.
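
    As a rough illustration of the peak-location step (not the original PDP-15 code), a channel can be flagged as a peak when it is a local maximum and rises several counting-statistics standard deviations above the surrounding background:

    ```python
    # Illustrative peak search in a 1-D spectrum; window and threshold are assumptions.
    import math

    def find_peaks(counts, window=5, nsigma=3.0):
        peaks = []
        for i in range(window, len(counts) - window):
            local = counts[i - window:i + window + 1]
            background = (sum(local) - counts[i]) / (len(local) - 1)
            threshold = background + nsigma * math.sqrt(max(background, 1.0))
            if counts[i] == max(local) and counts[i] > threshold:
                peaks.append(i)
        return peaks

    spectrum = [10, 12, 9, 11, 10, 13, 60, 140, 65, 14, 11, 10, 12, 9, 11, 10, 12]
    print(find_peaks(spectrum))   # -> [7], the channel holding the peak maximum
    ```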

  14. Space station data management system - A common GSE test interface for systems testing and verification

    NASA Technical Reports Server (NTRS)

    Martinez, Pedro A.; Dunn, Kevin W.

    1987-01-01

    This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.

  15. A New Soft Computing Method for K-Harmonic Means Clustering.

    PubMed

    Yeh, Wei-Chang; Jiang, Yunzhi; Chen, Yee-Fen; Chen, Zhe

    2016-01-01

    The K-harmonic means clustering algorithm (KHM) is a new clustering method used to group data such that the sum of the harmonic averages of the distances between each entity and all cluster centroids is minimized. Because it is less sensitive to initialization than K-means (KM), many researchers have recently been attracted to studying KHM. In this study, the proposed iSSO-KHM is based on an improved simplified swarm optimization (iSSO) and integrates a variable neighborhood search (VNS) for KHM clustering. As evidence of the utility of the proposed iSSO-KHM, we present extensive computational results on eight benchmark problems. From the computational results, the comparison appears to support the superiority of the proposed iSSO-KHM over previously developed algorithms for all experiments in the literature.
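
    For readers unfamiliar with KHM, the sketch below spells out the objective described above: the sum, over data points, of K divided by the sum of reciprocal p-th-power distances to all centroids. The optimizer itself (the paper's iSSO-KHM) is not reproduced; this only evaluates the quantity being minimized, with illustrative data.

    ```python
    # Minimal sketch of the K-harmonic means objective (lower is better).
    import numpy as np

    def khm_objective(X: np.ndarray, centroids: np.ndarray, p: float = 2.0) -> float:
        # d[i, k] = distance from point i to centroid k
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)                 # avoid division by zero
        k = centroids.shape[0]
        return float(np.sum(k / np.sum(1.0 / d**p, axis=1)))

    X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.1], [4.9, 5.0]])
    good = np.array([[0.05, 0.1], [4.95, 5.05]])   # centroids near the two clusters
    bad = np.array([[2.0, 2.0], [3.0, 3.0]])       # centroids between the clusters
    print(f"good centroids: {khm_objective(X, good):.3f}  "
          f"bad centroids: {khm_objective(X, bad):.3f}")
    ```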

  16. Display gamma is an important factor in Web image viewing

    NASA Astrophysics Data System (ADS)

    Zhang, Xuemei; Lavin, Yingmei; Silverstein, D. Amnon

    2001-06-01

    We conducted a perceptual image preference experiment over the web to find out (1) whether typical computer users have significant variations in their display gamma settings, and (2) if so, whether those settings have a significant perceptual effect on the appearance of images in their web browsers. The digital image renderings used had been found to have preferred tone characteristics in a previous lab-controlled experiment; they were rendered with four different gamma settings. The subjects were asked to view the images over the web, with their own computer equipment and web browsers, and made pair-wise subjective preference judgements on which rendering they liked best for each image. Each subject's display gamma setting was estimated using a 'gamma estimator' tool, implemented as a Java applet. The results indicated that (1) the users' gamma settings, as estimated in the experiment, span a wide range from about 1.8 to about 3.0; and (2) subjects preferred images rendered with a 'correct' gamma value matching their display setting and disliked images rendered with a gamma value not matching their display's. This indicates that display gamma estimation is a perceptually significant factor in web image optimization.
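
    A small sketch of why the mismatch matters: an image value rendered for one gamma is reproduced darker or lighter on a display with a different gamma. The function below is illustrative; the paper's Java-applet gamma estimator is not reproduced here.

    ```python
    # Illustrative gamma mismatch: encode for one gamma, display with another.
    def displayed_luminance(value: float, encoding_gamma: float, display_gamma: float) -> float:
        """value in [0, 1]; returns the relative luminance actually produced on screen."""
        encoded = value ** (1.0 / encoding_gamma)     # rendering assumes this gamma
        return encoded ** display_gamma               # display applies its own gamma

    mid_grey = 0.5
    for display_gamma in (1.8, 2.2, 3.0):
        lum = displayed_luminance(mid_grey, encoding_gamma=2.2, display_gamma=display_gamma)
        print(f"display gamma {display_gamma}: mid-grey shown at {lum:.2f}")
    ```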

  17. The Modulus of Rupture from a Mathematical Point of View

    NASA Astrophysics Data System (ADS)

    Quintela, P.; Sánchez, M. T.

    2007-04-01

    The goal of this work is to present a complete mathematical study of three-point bending experiments and the modulus of rupture of brittle materials. We present the mathematical model associated with three-point bending experiments and use the asymptotic expansion method to obtain a new formula for calculating the modulus of rupture. We compare the modulus of rupture of porcelain obtained with this new formula against that obtained using the classic theoretical formula. Finally, we also present one- and three-dimensional numerical simulations to compute the modulus of rupture.
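
    For context, the classic theoretical formula referred to above is the elementary-beam-theory expression for a rectangular specimen in three-point bending (symbols: F the load at fracture, L the support span, b the specimen width, h its thickness); the paper's asymptotic-expansion formula is a correction to it and is not reproduced here.

    ```latex
    % Classical modulus of rupture for a rectangular specimen in three-point bending.
    \sigma_{\mathrm{MOR}} = \frac{3 F L}{2 b h^{2}}
    % F: load at fracture, L: support span, b: specimen width, h: specimen thickness
    ```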

  18. Installing computers in older adults' homes and teaching them to access a patient education web site: a systematic approach.

    PubMed

    Dauz, Emily; Moore, Jan; Smith, Carol E; Puno, Florence; Schaag, Helen

    2004-01-01

    This article describes the experiences of nurses who, as part of a large clinical trial, brought the Internet into older adults' homes by installing a computer, if needed, and connecting to a patient education Web site. Most of these patients had not previously used the Internet and were taught even basic computer skills when necessary. Because of increasing use of the Internet in patient education, assessment, and home monitoring, nurses in various roles currently connect with patients to monitor their progress, teach about medications, and answer questions about appointments and treatments. Thus, nurses find themselves playing the role of technology managers for patients with home-based Internet connections. This article provides step-by-step procedures for computer installation and training in the form of protocols, checklists, and patient user guides. By following these procedures, nurses can install computers, arrange Internet access, teach and connect to their patients, and prepare themselves to install future generations of technological devices.

  19. CLOUDCLOUD : general-purpose instrument monitoring and data managing software

    NASA Astrophysics Data System (ADS)

    Dias, António; Amorim, António; Tomé, António

    2016-04-01

    An effective experiment is dependent on the ability to store and deliver data and information to all participant parties regardless of their degree of involvement in the specific parts that make the experiment a whole. Having fast, efficient and ubiquitous access to data will increase visibility and discussion, such that the outcome will have already been reviewed several times, strengthening the conclusions. The CLOUD project aims at providing users with a general purpose data acquisition, management and instrument monitoring platform that is fast, easy to use, lightweight and accessible to all participants of an experiment. This work is now implemented in the CLOUD experiment at CERN and will be fully integrated with the experiment as of 2016. Despite being used in an experiment of the scale of CLOUD, this software can also be used in any size of experiment or monitoring station, from single computers to large networks of computers, to monitor any sort of instrument output without influencing the individual instrument's DAQ. Instrument data and metadata are stored and accessed via a specially designed database architecture, and any type of instrument output is accepted using our continuously growing parsing application. Multiple databases can be used to separate different data-taking periods, or a single database can be used if, for instance, an experiment is continuous. A simple web-based application gives the user total control over the monitored instruments and their data, allowing data visualization and download, upload of processed data and the ability to edit existing instruments or add new instruments to the experiment. When in a network, new computers are immediately recognized and added to the system and are able to monitor instruments connected to them. Automatic computer integration is achieved by a locally running python-based parsing agent that communicates with a main server application, guaranteeing that all instruments assigned to that computer are monitored with parsing intervals as fast as milliseconds. This software (server+agents+interface+database) comes in easy and ready-to-use packages that can be installed in any operating system, including Android and iOS systems. This software is ideal for use in modular experiments or monitoring stations with large variability in instruments and measuring methods, or in large collaborations, where data requires homogenization in order to be effectively transmitted to all involved parties. This work presents the software and provides a performance comparison with previously used monitoring systems in the CLOUD experiment at CERN.
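
    The agent-to-server pattern described above can be pictured with the following hypothetical sketch: a locally running agent tails an instrument's output file at a configurable interval, parses each record, and posts it to a central server. The endpoint URL, file format, and field names are placeholder assumptions, not the CLOUD project's actual interfaces.

    ```python
    # Hypothetical monitoring-agent loop; endpoint, file layout and parsing are illustrative.
    import json, time, urllib.request

    SERVER_URL = "http://example.invalid/api/measurements"   # placeholder endpoint
    POLL_SECONDS = 0.5

    def parse_line(line: str) -> dict:
        timestamp, value = line.strip().split(",")           # assumed "time,value" format
        return {"instrument": "demo_sensor", "t": timestamp, "value": float(value)}

    def push(record: dict) -> None:
        req = urllib.request.Request(SERVER_URL, data=json.dumps(record).encode(),
                                     headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req, timeout=5)

    def monitor(path: str) -> None:
        with open(path) as f:
            f.seek(0, 2)                       # start at end of file, like tail -f
            while True:
                line = f.readline()
                if line:
                    push(parse_line(line))
                else:
                    time.sleep(POLL_SECONDS)

    # monitor("/var/log/demo_sensor.csv")      # would run until interrupted
    ```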

  20. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  1. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  2. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  3. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  4. 48 CFR 252.227-7028 - Technical data or computer software previously delivered to the government.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software previously delivered to the government. 252.227-7028 Section 252.227-7028 Federal Acquisition... computer software previously delivered to the government. As prescribed in 227.7103-6(d), 227.7104(f)(2), or 227.7203-6(e), use the following provision: Technical Data or Computer Software Previously...

  5. Pseudoaneurysm with Arteriovenous Fistula after Arthroscopic Procedure: A Rare Complication of Arthroscopy

    PubMed Central

    Jin, Moran; Lee, Yang-Haeng; Yoon, Young Chul; Han, Il-Yong; Park, Kyung-Taek; Wi, Jin Hong

    2015-01-01

    Pseudoaneurysm with arteriovenous fistula is a rare complication of arthroscopy, and can be diagnosed by ultrasonography, computed tomography, magnetic resonance imaging, or angiography. This condition can be treated with open surgical repair or endovascular repair. We report our experience with the open surgical repair of a pseudoaneurysm with an arteriovenous fistula in a young male patient who underwent arthroscopy five months previously. PMID:26290846

  6. A Constraint Generation Approach to Learning Stable Linear Dynamical Systems

    DTIC Science & Technology

    2008-01-01

    task of learning dynamic textures from image sequences as well as to modeling biosurveillance drug-sales data. The constraint generation approach...previous methods in our experiments. One application of LDSs in computer vision is learning dynamic textures from video data [8]. An advantage of...over-the-counter (OTC) drug sales for biosurveillance , and sunspot numbers from the UCR archive [9]. Comparison to the best alternative methods [7, 10

  7. PowerPlay: Training an Increasingly General Problem Solver by Continually Searching for the Simplest Still Unsolvable Problem

    PubMed Central

    Schmidhuber, Jürgen

    2013-01-01

    Most of computer science focuses on automatically solving given computational problems. I focus on automatically inventing or discovering problems in a way inspired by the playful behavior of animals and humans, to train a more and more general problem solver from scratch in an unsupervised fashion. Consider the infinite set of all computable descriptions of tasks with possibly computable solutions. Given a general problem-solving architecture, at any given time, the novel algorithmic framework PowerPlay (Schmidhuber, 2011) searches the space of possible pairs of new tasks and modifications of the current problem solver, until it finds a more powerful problem solver that provably solves all previously learned tasks plus the new one, while the unmodified predecessor does not. Newly invented tasks may require to achieve a wow-effect by making previously learned skills more efficient such that they require less time and space. New skills may (partially) re-use previously learned skills. The greedy search of typical PowerPlay variants uses time-optimal program search to order candidate pairs of tasks and solver modifications by their conditional computational (time and space) complexity, given the stored experience so far. The new task and its corresponding task-solving skill are those first found and validated. This biases the search toward pairs that can be described compactly and validated quickly. The computational costs of validating new tasks need not grow with task repertoire size. Standard problem solver architectures of personal computers or neural networks tend to generalize by solving numerous tasks outside the self-invented training set; PowerPlay’s ongoing search for novelty keeps breaking the generalization abilities of its present solver. This is related to Gödel’s sequence of increasingly powerful formal theories based on adding formerly unprovable statements to the axioms without affecting previously provable theorems. The continually increasing repertoire of problem-solving procedures can be exploited by a parallel search for solutions to additional externally posed tasks. PowerPlay may be viewed as a greedy but practical implementation of basic principles of creativity (Schmidhuber, 2006a, 2010). A first experimental analysis can be found in separate papers (Srivastava et al., 2012a,b, 2013). PMID:23761771

  8. Dynamic stability analysis for capillary channel flow: One-dimensional and three-dimensional computations and the equivalent steady state technique

    NASA Astrophysics Data System (ADS)

    Grah, Aleksander; Dreyer, Michael E.

    2010-01-01

    Spacecraft technology provides a series of applications for capillary channel flow. It can serve as a reliable means for positioning and transport of liquids under low gravity conditions. Basically, capillary channels provide liquid paths with one or more free surfaces. A problem may be flow instabilities leading to a collapse of the liquid surfaces. A result is undesired gas ingestion and a two phase flow which can in consequence cause several technical problems. The presented capillary channel consists of parallel plates with two free liquid surfaces. The flow rate is established by a pump at the channel outlet, creating a lower pressure within the channel. Owing to the pressure difference between the liquid phase and the ambient gas phase the free surfaces bend inwards and remain stable as long as they are able to resist the steady and unsteady pressure effects. For the numerical prediction of the flow stability two very different models are used. The one-dimensional unsteady model is mainly based on the Bernoulli equation, the continuity equation, and the Gauss-Laplace equation. For three-dimensional evaluations an open source computational fluid dynamics (CFD) tool is applied. For verifications the numerical results are compared with quasisteady and unsteady data of a sounding rocket experiment. Contrary to previous experiments this one results in a significantly longer observation sequence. Furthermore, the critical point of the steady flow instability could be approached by a quasisteady technique. As in previous experiments the comparison to the numerical model evaluation shows a very good agreement for the movement of the liquid surfaces and for the predicted flow instability. The theoretical prediction of the flow instability is related to the speed index, based on characteristic velocities of the capillary channel flow. Stable flow regimes are defined by stability criteria for steady and unsteady flow. The one-dimensional computation of the speed index is based on the technique of the equivalent steady system, which is published for the first time in the present paper. This approach assumes that for every unsteady state an equivalent steady state with a special boundary condition can be formulated. The equivalent steady state technique enables a reformulation of the equation system and an efficient and reliable speed index computation. Furthermore, the existence of the numerical singularity at the critical point of the steady flow instability, postulated in previous publication, is demonstrated in detail. The numerical singularity is related to the stability criterion for steady flow and represents the numerical consequence of the liquid surface collapse. The evaluation and generation of the pressure diagram is demonstrated in detail with a series of numerical dynamic flow studies. The stability diagram, based on one-dimensional computation, gives a detailed overview of the stable and instable flow regimes. This prediction is in good agreement with the experimentally observed critical flow conditions and results of three-dimensional CFD computations.

  9. Experimental and computational investigations on severe slugging in a catenary riser

    NASA Astrophysics Data System (ADS)

    Duan, Jin-long; Chen, Ke; You, Yun-xiang; Gao, Song

    2017-12-01

    Severe slugging can occur in a pipeline-riser system at relatively low liquid and gas flow rates during gas-oil transportation, possibly causing unexpected damage to the production facilities. Experiments with air and water are conducted in a horizontal and downward inclined pipeline followed by a catenary riser in order to investigate the mechanism and characteristics of severe slugging. A theoretical model is introduced to compare with the experiments. The results show that the formation mechanism of severe slugging in a catenary riser is different from that in a vertical riser due to the riser geometry and five flow patterns are obtained and analyzed. A gas-liquid mixture slug stage is observed at the beginning of one cycle of severe slugging, which is seldom noticed in previous studies. Based on both experiments and computations, the time period and variation of pressure amplitude of severe slugging are found closely related to the superficial gas velocity, implying that the gas velocity significantly influences the flow patterns in our experiments. Moreover, good agreements between the experimental data and the numerical results are shown in the stability curve and flow regime map, which can be a possible reference for design in an offshore oil-production system.

  10. Use of speech generating devices can improve perception of qualifications for skilled, verbal, and interactive jobs.

    PubMed

    Stern, Steven E; Chobany, Chelsea M; Beam, Alexander A; Hoover, Brittany N; Hull, Thomas T; Linsenbigler, Melissa; Makdad-Light, Courtney; Rubright, Courtney N

    2017-01-01

    We have previously demonstrated that when speech generating devices (SGD) are used as assistive technologies, they are preferred over the users' natural voices. We sought to examine whether using SGDs would affect listener's perceptions of hirability of people with complex communication needs. In a series of three experiments, participants rated videotaped actors, one using SGD and the other using their natural, mildly dysarthric voice, on (a) a measurement of perceptions of speaker credibility, strength, and informedness and (b) measurements of hirability for jobs coded in terms of skill, verbal ability, and interactivity. Experiment 1 examined hirability for jobs varying in terms of skill and verbal ability. Experiment 2 was a replication that examined hirability for jobs varying in terms of interactivity. Experiment 3 examined jobs in terms of skill and specific mode of interaction (face-to-face, telephone, computer-mediated). Actors were rated more favorably when using SGD than their own voices. Actors using SGD were also rated more favorably for highly skilled and highly verbal jobs. This preference for SGDs over mildly dysarthric voice was also found for jobs entailing computer-mediated-communication, particularly skillful jobs.

  11. Lingering representations of stimuli influence recall organization

    PubMed Central

    Chan, Stephanie C.Y.; Applegate, Marissa C.; Morton, Neal W; Polyn, Sean M.; Norman, Kenneth A.

    2017-01-01

    Several prominent theories posit that information about recent experiences lingers in the brain and organizes memories for current experiences, by forming a temporal context that is linked to those memories at encoding. According to these theories, if the thoughts preceding an experience X resemble the thoughts preceding an experience Y, then X and Y should show an elevated probability of being recalled together. We tested this prediction by using multi-voxel pattern analysis (MVPA) of fMRI data to measure neural evidence for lingering processing of preceding stimuli. As predicted, memories encoded with similar lingering thoughts about the category of preceding stimuli were more likely to be recalled together. Our results demonstrate that the “fading embers” of previous stimuli help to organize recall, confirming a key prediction of computational models of episodic memory. PMID:28132858

  12. Outreach programmes to attract girls into computing: how the best laid plans can sometimes fail

    NASA Astrophysics Data System (ADS)

    Lang, Catherine; Fisher, Julie; Craig, Annemieke; Forgasz, Helen

    2015-07-01

    This article presents a reflective analysis of an outreach programme called the Digital Divas Club. This curriculum-based programme was delivered in Australian schools with the aim of stimulating junior and middle school girls' interest in computing courses and careers. We believed that we had developed a strong intervention programme based on previous literature and our collective knowledge and experiences. While it was coordinated by university academics, the programme content was jointly created and modified by practicing school teachers. After four years, when the final data were compiled, they showed that our programme produced significant changes in student confidence in computing, but the ability to influence a desire to pursue a career path in computing did not fully eventuate. To gain a deeper insight into why this may be the case, data collected from two of the schools are interrogated in more detail as described in this article. These schools were at the end of the expected programme outcomes. We found that despite designing a programme that delivered a multi-layered positive computing experience, factors beyond our control, such as school culture and teacher technical self-efficacy, help account for the unanticipated results. Despite our best-laid plans, the expectation that this semester-long programme would influence students' longer-term career outcomes may have been aspirational at best.

  13. On finding bicliques in bipartite graphs: a novel algorithm and its application to the integration of diverse biological data types

    PubMed Central

    2014-01-01

    Background Integrating and analyzing heterogeneous genome-scale data is a huge algorithmic challenge for modern systems biology. Bipartite graphs can be useful for representing relationships across pairs of disparate data types, with the interpretation of these relationships accomplished through an enumeration of maximal bicliques. Most previously-known techniques are generally ill-suited to this foundational task, because they are relatively inefficient and without effective scaling. In this paper, a powerful new algorithm is described that produces all maximal bicliques in a bipartite graph. Unlike most previous approaches, the new method neither places undue restrictions on its input nor inflates the problem size. Efficiency is achieved through an innovative exploitation of bipartite graph structure, and through computational reductions that rapidly eliminate non-maximal candidates from the search space. An iterative selection of vertices for consideration based on non-decreasing common neighborhood sizes boosts efficiency and leads to more balanced recursion trees. Results The new technique is implemented and compared to previously published approaches from graph theory and data mining. Formal time and space bounds are derived. Experiments are performed on both random graphs and graphs constructed from functional genomics data. It is shown that the new method substantially outperforms the best previous alternatives. Conclusions The new method is streamlined, efficient, and particularly well-suited to the study of huge and diverse biological data. A robust implementation has been incorporated into GeneWeaver, an online tool for integrating and analyzing functional genomics experiments, available at http://geneweaver.org. The enormous increase in scalability it provides empowers users to study complex and previously unassailable gene-set associations between genes and their biological functions in a hierarchical fashion and on a genome-wide scale. This practical computational resource is adaptable to almost any applications environment in which bipartite graphs can be used to model relationships between pairs of heterogeneous entities. PMID:24731198
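
    For intuition about the task (not the paper's algorithm, which scales far better), the brute-force sketch below enumerates all maximal bicliques of a small bipartite graph by taking the common neighbourhood of every subset of left vertices and closing it back on the left side; the example relation between hypothetical genes and annotation terms is illustrative.

    ```python
    # Brute-force maximal biclique enumeration for small bipartite graphs (intuition only).
    from itertools import combinations

    def maximal_bicliques(adj):
        """adj: dict mapping each left vertex to the set of right vertices it touches."""
        left = list(adj)
        found = set()
        for r in range(1, len(left) + 1):
            for subset in combinations(left, r):
                right = set.intersection(*(adj[u] for u in subset))
                if not right:
                    continue
                # Close the left side: every left vertex adjacent to all of `right`.
                closed_left = frozenset(u for u in left if right <= adj[u])
                found.add((closed_left, frozenset(right)))
        return found

    genes_to_terms = {
        "g1": {"t1", "t2"},
        "g2": {"t1", "t2", "t3"},
        "g3": {"t3"},
    }
    for left_side, right_side in maximal_bicliques(genes_to_terms):
        print(sorted(left_side), sorted(right_side))
    ```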

  14. PC_Eyewitness and the sequential superiority effect: computer-based lineup administration.

    PubMed

    MacLin, Otto H; Zimmerman, Laura A; Malpass, Roy S

    2005-06-01

    Computer technology has become an increasingly important tool for conducting eyewitness identifications. In the area of lineup identifications, computerized administration offers several advantages for researchers and law enforcement. PC_Eyewitness is designed specifically to administer lineups. To assess this new lineup technology, two studies were conducted in order to replicate the results of previous studies comparing simultaneous and sequential lineups. One hundred twenty university students participated in each experiment. Experiment 1 used traditional paper-and-pencil lineup administration methods to compare simultaneous to sequential lineups. Experiment 2 used PC_Eyewitness to administer simultaneous and sequential lineups. The results of these studies were compared to the meta-analytic results reported by N. Steblay, J. Dysart, S. Fulero, and R. C. L. Lindsay (2001). No differences were found between paper-and-pencil and PC_Eyewitness lineup administration methods. The core findings of the N. Steblay et al. (2001) meta-analysis were replicated by both administration procedures. These results show that computerized lineup administration using PC_Eyewitness is an effective means for gathering eyewitness identification data.

  15. A Supersonic Argon/Air Coaxial Jet Experiment for Computational Fluid Dynamics Code Validation

    NASA Technical Reports Server (NTRS)

    Clifton, Chandler W.; Cutler, Andrew D.

    2007-01-01

    A non-reacting experiment is described in which data has been acquired for the validation of CFD codes used to design high-speed air-breathing engines. A coaxial jet-nozzle has been designed to produce pressure-matched exit flows of Mach 1.8 at 1 atm in both a center jet of argon and a coflow jet of air, creating a supersonic, incompressible mixing layer. The flowfield was surveyed using total temperature, gas composition, and Pitot probes. The data set was compared to CFD code predictions made using Vulcan, a structured grid Navier-Stokes code, as well as to data from a previous experiment in which a He-O2 mixture was used instead of argon in the center jet of the same coaxial jet assembly. Comparison of experimental data from the argon flowfield and its computational prediction shows that the CFD produces an accurate solution for most of the measured flowfield. However, the CFD prediction deviates from the experimental data in the region downstream of x/D = 4, underpredicting the mixing-layer growth rate.

  16. NASCAP simulation of PIX 2 experiments

    NASA Technical Reports Server (NTRS)

    Roche, J. C.; Mandell, M. J.

    1985-01-01

    The latest version of the NASCAP/LEO digital computer code used to simulate the PIX 2 experiment is discussed. NASCAP is a finite-element code and previous versions were restricted to a single fixed mesh size. As a consequence the resolution was dictated by the largest physical dimension to be modeled. The latest version of NASCAP/LEO can subdivide selected regions. This permitted the modeling of the overall Delta launch vehicle in the primary computational grid at a coarse resolution, with subdivided regions at finer resolution being used to pick up the details of the experiment module configuration. Langmuir probe data from the flight were used to estimate the space plasma density and temperature and the Delta ground potential relative to the space plasma. This information is needed for input to NASCAP. Because of the uncertainty or variability in the values of these parameters, it was necessary to explore a range around the nominal value in order to determine the variation in current collection. The flight data from PIX 2 were also compared with the results of the NASCAP simulation.

  17. Predicting Transport of 3,5,6-Trichloro-2-Pyridinol Into Saliva Using a Combination Experimental and Computational Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Jordan Ned; Carver, Zana A.; Weber, Thomas J.

    A combination experimental and computational approach was developed to predict chemical transport into saliva. A serous-acinar chemical transport assay was established to measure chemical transport under non-physiological (standard cell culture medium) and physiological (surrogate plasma and saliva medium) conditions using 3,5,6-trichloro-2-pyridinol (TCPy), a metabolite of the pesticide chlorpyrifos. High levels of TCPy protein binding were observed in cell culture medium and rat plasma, resulting in different TCPy transport behaviors in the two experimental conditions. In the non-physiological transport experiment, TCPy reached equilibrium at equivalent concentrations in apical and basolateral chambers. At higher TCPy doses, increased unbound TCPy was observed, and TCPy concentrations in apical and basolateral chambers reached equilibrium faster than at lower doses, suggesting only unbound TCPy is able to cross the cellular monolayer. In the physiological experiment, TCPy transport was slower than under non-physiological conditions, and equilibrium was achieved at different concentrations in apical and basolateral chambers, at a ratio (0.034) comparable to what was previously measured in rats dosed with TCPy (saliva:blood ratio: 0.049). A cellular transport computational model was developed based on TCPy protein binding kinetics and accurately simulated all transport experiments using different permeability coefficients for the two experimental conditions (1.4 vs 0.4 cm/hr for non-physiological and physiological experiments, respectively). The computational model was integrated into a physiologically based pharmacokinetic (PBPK) model and accurately predicted TCPy concentrations in saliva of rats dosed with TCPy. Overall, this study demonstrates an approach to predicting chemical transport in saliva, potentially increasing the utility of salivary biomonitoring in the future.
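
    A minimal two-compartment sketch of the transport picture described above: only the unbound fraction of chemical crosses the cell monolayer, at a rate set by a permeability coefficient, so strong binding on one side shifts the equilibrium concentrations. All parameter values below are illustrative assumptions, not the paper's fitted numbers.

    ```python
    # Toy two-compartment permeability model; parameters are illustrative assumptions.
    import numpy as np

    P = 0.4          # cm/h, permeability coefficient (physiological-like case)
    A = 1.12         # cm^2, monolayer area (typical insert size, assumed)
    V_ap, V_bl = 0.5, 1.5        # mL, apical and basolateral volumes (assumed)
    fu_ap, fu_bl = 0.9, 0.05     # unbound fractions (strong binding on the "plasma" side)

    c_ap, c_bl = 0.0, 100.0      # dose placed on the basolateral ("blood") side
    dt, t_end = 0.01, 24.0       # hours
    for _ in np.arange(0.0, t_end, dt):
        flux = P * A * (fu_ap * c_ap - fu_bl * c_bl)    # only unbound chemical crosses
        c_ap -= flux * dt / V_ap
        c_bl += flux * dt / V_bl
    print(f"after {t_end} h: apical (saliva side) {c_ap:.1f}, "
          f"basolateral (blood side) {c_bl:.1f}, ratio {c_ap / c_bl:.3f}")
    ```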

  18. Do monkeys choose to choose?

    PubMed

    Perdue, Bonnie M; Evans, Theodore A; Washburn, David A; Rumbaugh, Duane M; Beran, Michael J

    2014-06-01

    Both empirical and anecdotal evidence supports the idea that choice is preferred by humans. Previous research has demonstrated that this preference extends to nonhuman animals, but it remains largely unknown whether animals will actively seek out or prefer opportunities to choose. Here we explored the issue of whether capuchin and rhesus monkeys choose to choose. We used a modified version of the SELECT task-a computer program in which monkeys can choose the order of completion of various psychomotor and cognitive tasks. In the present experiments, each trial began with a choice between two icons, one of which allowed the monkey to select the order of task completion, and the other of which led to the assignment of a task order by the computer. In either case, subjects still had to complete the same number of tasks and the same number of task trials. The tasks were relatively easy, and the monkeys responded correctly on most trials. Thus, global reinforcement rates were approximately equated across conditions. The only difference was whether the monkey chose the task order or it was assigned, thus isolating the act of choosing. Given sufficient experience with the task icons, all monkeys showed a significant preference for choice when the alternative was a randomly assigned order of tasks. To a lesser extent, some of the monkeys maintained a preference for choice over a preferred, but computer-assigned, task order that was yoked to their own previous choice selection. The results indicated that monkeys prefer to choose when all other aspects of the task are equated.

  19. Markstein Numbers of Negatively-Stretched Premixed Flames: Microgravity Measurements and Computations

    NASA Technical Reports Server (NTRS)

    Ibarreta, Alfonso F.; Driscoll, James F.; Feikema, Douglas A.; Salzman, Jack (Technical Monitor)

    2001-01-01

    The effect of flame stretch, composed of strain and curvature, plays a major role in the propagation of turbulent premixed flames. Although all forms of stretch (positive and negative) are present in turbulent conditions, little research has been focused on the stretch due to curvature. The present study quantifies the Markstein number (which characterizes the sensitivity of the flame propagation speed to the imposed stretch rate) for an inwardly-propagating flame (IPF). This flame is of interest because it is negatively stretched, and is subjected to curvature effects alone, without the competing effects of strain. In an extension of our previous work, microgravity experiments were run using a vortex-flame interaction to create a pocket of reactants surrounded by an IPF. Computations using the RUN-1DL code of Rogg were also performed in order to explain the measurements. It was found that the Markstein number of an inwardly-propagating flame, for both the microgravity experiment and the computations, is significantly larger than that of an outwardly-propagating flame. Further insight was gained by running the computations for the simplified (hypothetical) cases of one step chemistry, unity Lewis number, and negligible heat release. Results provide additional evidence that the Markstein numbers associated with strain and curvature have different values.
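
    For context, the Markstein number quantifies the sensitivity referred to above through the standard linear weak-stretch relation (quoted here as background, not as a result of the paper):

    ```latex
    % Linear weak-stretch relation defining the Markstein number Ma (standard form).
    % S_L: unstretched laminar flame speed, S_f: stretched flame speed,
    % kappa: stretch rate, delta_f: flame thickness, Ka: Karlovitz number.
    \frac{S_f}{S_L} = 1 - \mathrm{Ma}\,\mathrm{Ka}, \qquad \mathrm{Ka} = \frac{\kappa\,\delta_f}{S_L}
    ```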

  20. Brain-Inspired Photonic Signal Processor for Generating Periodic Patterns and Emulating Chaotic Systems

    NASA Astrophysics Data System (ADS)

    Antonik, Piotr; Haelterman, Marc; Massar, Serge

    2017-05-01

    Reservoir computing is a bioinspired computing paradigm for processing time-dependent signals. Its hardware implementations have received much attention because of their simplicity and remarkable performance on a series of benchmark tasks. In previous experiments, the output was uncoupled from the system and, in most cases, simply computed off-line on a postprocessing computer. However, numerical investigations have shown that feeding the output back into the reservoir opens the possibility of long-horizon time-series forecasting. Here, we present a photonic reservoir computer with output feedback, and we demonstrate its capacity to generate periodic time series and to emulate chaotic systems. We study in detail the effect of experimental noise on system performance. In the case of chaotic systems, we introduce several metrics, based on standard signal-processing techniques, to evaluate the quality of the emulation. Our work significantly enlarges the range of tasks that can be solved by hardware reservoir computers and, therefore, the range of applications they could potentially tackle. It also raises interesting questions in nonlinear dynamics and chaos theory.
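
    The feedback idea can be pictured with a toy, software-only echo-state sketch (a generic analogue, not the photonic implementation above): a reservoir is driven by a teacher signal, a ridge-regression readout is trained for one-step-ahead prediction, and the readout output is then fed back as the next input so the system generates the time series autonomously.

    ```python
    # Toy echo-state reservoir with output feedback; sizes and scalings are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 200                                      # reservoir size
    W_in = rng.uniform(-0.5, 0.5, N)             # input weights
    W = rng.normal(0.0, 1.0, (N, N))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep spectral radius below 1

    def step(x, u):
        """One reservoir update driven by scalar input u."""
        return np.tanh(W @ x + W_in * u)

    # Teacher signal: a sine wave the feedback loop should reproduce on its own.
    T = 1000
    u = np.sin(0.1 * np.arange(T))
    states = np.zeros((T, N))
    x = np.zeros(N)
    for t in range(T - 1):
        x = step(x, u[t])
        states[t + 1] = x

    # Ridge-regression readout trained for one-step-ahead prediction of the input.
    X, y = states[200:], u[200:]                 # discard a washout period
    W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y)

    # Autonomous generation: the readout output is fed back as the next input.
    out = x @ W_out
    generated = []
    for _ in range(100):
        x = step(x, out)
        out = x @ W_out
        generated.append(out)
    print(np.round(generated[:10], 3))
    ```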

  1. Detailed Multidimensional Simulations of the Structure and Dynamics of Flames

    NASA Technical Reports Server (NTRS)

    Patnaik, G.; Kailasanath, K.

    1999-01-01

    Numerical simulations in which the various physical and chemical processes can be independently controlled can significantly advance our understanding of the structure, stability, dynamics and extinction of flames. Therefore, our approach has been to use detailed time-dependent, multidimensional, multispecies numerical models to perform carefully designed computational experiments of flames on Earth and in microgravity environments. Some of these computational experiments are complementary to physical experiments performed under the Microgravity Program while others provide a fundamental understanding that cannot be obtained from physical experiments alone. In this report, we provide a brief summary of our recent research highlighting the contributions since the previous microgravity combustion workshop. There are a number of mechanisms that can cause flame instabilities and result in the formation of dynamic multidimensional structures. In the past, we have used numerical simulations to show that it is the thermo-diffusive instability rather than an instability due to preferential diffusion that is the dominant mechanism for the formation of cellular flames in lean hydrogen-air mixtures. Other studies have explored the role of gravity on flame dynamics and extinguishment, multi-step kinetics and radiative losses on flame instabilities in rich hydrogen-air flames, and heat losses on burner-stabilized flames in microgravity. The recent emphasis of our work has been on exploring flame-vortex interactions and further investigating the structure and dynamics of lean hydrogen-air flames in microgravity. These topics are briefly discussed after a brief discussion of our computational approach for solving these problems.

  2. 5 CFR 839.1002 - Will OPM compute the lost earnings if my qualifying retirement coverage error was previously...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Will OPM compute the lost earnings if my... compute the lost earnings if my qualifying retirement coverage error was previously corrected and I made... coverage error was previously corrected, OPM will compute the lost earnings on your make-up contributions...

  3. Design of a serotonin 4 receptor radiotracer with decreased lipophilicity for single photon emission computed tomography.

    PubMed

    Fresneau, Nathalie; Dumas, Noé; Tournier, Benjamin B; Fossey, Christine; Ballandonne, Céline; Lesnard, Aurélien; Millet, Philippe; Charnay, Yves; Cailly, Thomas; Bouillon, Jean-Philippe; Fabis, Frédéric

    2015-04-13

    With the aim of developing a suitable radiotracer for brain imaging of the serotonin 4 receptor subtype (5-HT4R) using single photon emission computed tomography (SPECT), we synthesized and evaluated a library of di- and triazaphenanthridines with lipophilicity values in the range expected to favour brain penetration and with demonstrated specific binding to the target of interest. By adding nitrogen atoms to previously described phenanthridine ligands that exhibited high unspecific binding, we were able to design a radioiodinated compound, [(125)I]14. This compound exhibited a binding affinity of 0.094 nM toward human 5-HT4R and high selectivity over other serotonin receptor subtypes (5-HTR). In vivo SPECT imaging studies and competition experiments demonstrated that the decreased lipophilicity (in comparison with our previously reported compounds 4 and 5) allowed more specific labelling of the 5-HT4R-containing brain regions.

  4. Tunable Low Energy, Compact and High Performance Neuromorphic Circuit for Spike-Based Synaptic Plasticity

    PubMed Central

    Rahimi Azghadi, Mostafa; Iannella, Nicolangelo; Al-Sarawi, Said; Abbott, Derek

    2014-01-01

    Cortical circuits in the brain have long been recognised for their information processing capabilities and have been studied both experimentally and theoretically via spiking neural networks. Neuromorphic engineers are primarily concerned with translating the computational capabilities of biological cortical circuits, using the Spiking Neural Network (SNN) paradigm, into in silico applications that can mimic the behaviour and capabilities of real biological circuits/systems. These capabilities include low power consumption, compactness, and relevant dynamics. In this paper, we propose a new accelerated-time circuit that has several advantages over its previous neuromorphic counterparts in terms of compactness, power consumption, and capability to mimic the outcomes of biological experiments. The presented circuit simulation results demonstrate that, in comparing the new circuit to previous published synaptic plasticity circuits, reduced silicon area and lower energy consumption for processing each spike is achieved. In addition, it can be tuned in order to closely mimic the outcomes of various spike timing- and rate-based synaptic plasticity experiments. The proposed circuit is also investigated and compared to other designs in terms of tolerance to mismatch and process variation. Monte Carlo simulation results show that the proposed design is much more stable than its previous counterparts in terms of vulnerability to transistor mismatch, which is a significant challenge in analog neuromorphic design. All these features make the proposed design an ideal circuit for use in large scale SNNs, which aim at implementing neuromorphic systems with an inherent capability that can adapt to a continuously changing environment, thus leading to systems with significant learning and computational abilities. PMID:24551089

  5. Tunable low energy, compact and high performance neuromorphic circuit for spike-based synaptic plasticity.

    PubMed

    Rahimi Azghadi, Mostafa; Iannella, Nicolangelo; Al-Sarawi, Said; Abbott, Derek

    2014-01-01

    Cortical circuits in the brain have long been recognised for their information processing capabilities and have been studied both experimentally and theoretically via spiking neural networks. Neuromorphic engineers are primarily concerned with translating the computational capabilities of biological cortical circuits, using the Spiking Neural Network (SNN) paradigm, into in silico applications that can mimic the behaviour and capabilities of real biological circuits/systems. These capabilities include low power consumption, compactness, and relevant dynamics. In this paper, we propose a new accelerated-time circuit that has several advantages over its previous neuromorphic counterparts in terms of compactness, power consumption, and capability to mimic the outcomes of biological experiments. The presented circuit simulation results demonstrate that, in comparing the new circuit to previously published synaptic plasticity circuits, reduced silicon area and lower energy consumption for processing each spike are achieved. In addition, it can be tuned in order to closely mimic the outcomes of various spike timing- and rate-based synaptic plasticity experiments. The proposed circuit is also investigated and compared to other designs in terms of tolerance to mismatch and process variation. Monte Carlo simulation results show that the proposed design is much more stable than its previous counterparts in terms of vulnerability to transistor mismatch, which is a significant challenge in analog neuromorphic design. All these features make the proposed design an ideal circuit for use in large scale SNNs, which aim at implementing neuromorphic systems with an inherent capability that can adapt to a continuously changing environment, thus leading to systems with significant learning and computational abilities.

  6. Objective techniques for psychological assessment, phase 2. [techniques for measuring human performance during space flight stress

    NASA Technical Reports Server (NTRS)

    Wortz, E. C.; Saur, A. J.; Nowlis, D. P.; Kendall, M. P.

    1974-01-01

    Results are presented of an initial experiment in a research program designed to develop objective techniques for psychological assessment of individuals and groups participating in long-duration space flights. Specifically examined is the rationale for utilizing measures of attention as an objective assessment technique. Subjects participating in the experiment performed various tasks (e.g., playing matrix games which appeared on a display screen along with auditory stimuli). The psychophysiological reactions of the subjects were measured and are given. Previous research on various performance and psychophysiological methods of measuring attention is also discussed. The experiment design (independent and dependent variables) and apparatus (computers and display devices) are described and shown. Conclusions and recommendations are presented.

  7. Hollow cathodes as electron emitting plasma contactors: Theory and computer modeling

    NASA Technical Reports Server (NTRS)

    Davis, V. A.; Katz, I.; Mandell, M. J.; Parks, D. E.

    1987-01-01

    Several researchers have suggested using hollow cathodes as plasma contactors for electrodynamic tethers, particularly to prevent the Shuttle Orbiter from charging to large negative potentials. Previous studies have shown that fluid models with anomalous scattering can describe the electron transport in hollow cathode generated plasmas. An improved theory of the hollow cathode plasmas is developed and computational results using the theory are compared with laboratory experiments. Numerical predictions for a hollow cathode plasma source of the type considered for use on the Shuttle are presented, as are three-dimensional NASCAP/LEO calculations of the emitted ion trajectories and the resulting potentials in the vicinity of the Orbiter. The computer calculations show that the hollow cathode plasma source makes vastly superior contact with the ionospheric plasma compared with either an electron gun or passive ion collection by the Orbiter.

  8. Dual-energy contrast-enhanced spectral mammography (CESM).

    PubMed

    Daniaux, Martin; De Zordo, Tobias; Santner, Wolfram; Amort, Birgit; Koppelstätter, Florian; Jaschke, Werner; Dromain, Clarisse; Oberaigner, Willi; Hubalek, Michael; Marth, Christian

    2015-10-01

    Dual-energy contrast-enhanced mammography is one of the latest developments in breast care. Imaging with contrast agents in breast cancer was already known from previous magnetic resonance imaging and computed tomography studies. However, high costs, limited availability, or high radiation dose led to the development of contrast-enhanced spectral mammography (CESM). We review the current literature, present our experience, discuss the advantages and drawbacks of CESM, and look at the future of this innovative technique.

  9. Eye/Brain/Task Testbed And Software

    NASA Technical Reports Server (NTRS)

    Janiszewski, Thomas; Mainland, Nora; Roden, Joseph C.; Rothenheber, Edward H.; Ryan, Arthur M.; Stokes, James M.

    1994-01-01

    Eye/brain/task (EBT) testbed records electroencephalograms, movements of eyes, and structures of tasks to provide comprehensive data on neurophysiological experiments. Intended to serve continuing effort to develop means for interactions between human brain waves and computers. Software library associated with testbed provides capabilities to recall collected data, to process data on movements of eyes, to correlate eye-movement data with electroencephalographic data, and to present data graphically. Cognitive processes investigated in ways not previously possible.

  10. Limited-memory trust-region methods for sparse relaxation

    NASA Astrophysics Data System (ADS)

    Adhikari, Lasith; DeGuchy, Omar; Erway, Jennifer B.; Lockhart, Shelby; Marcia, Roummel F.

    2017-08-01

    In this paper, we solve the l2-l1 sparse recovery problem by transforming the objective function of this problem into an unconstrained differentiable function and applying a limited-memory trust-region method. Unlike gradient projection-type methods, which use only the current gradient, our approach uses gradients from previous iterations to obtain a more accurate Hessian approximation. Numerical experiments show that our proposed approach eliminates spurious solutions more effectively while improving computational time.
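
    The following sketch illustrates the general idea (it is not the authors' trust-region solver): the l2-l1 objective is rewritten as a smooth, bound-constrained problem via the standard splitting x = u - v with u, v >= 0 and minimized with a limited-memory quasi-Newton method that builds its Hessian approximation from gradients of previous iterations. All data and problem sizes are synthetic.

    ```python
    # Minimal sketch: l2-l1 sparse recovery, min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    # rewritten as a smooth bound-constrained problem (x = u - v, u >= 0, v >= 0)
    # and solved with L-BFGS-B, which uses gradients from previous iterations.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    m, n, k, lam = 80, 200, 10, 0.1
    A = rng.standard_normal((m, n))
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    b = A @ x_true + 0.01 * rng.standard_normal(m)

    def f_and_grad(z):
        u, v = z[:n], z[n:]
        r = A @ (u - v) - b
        f = 0.5 * r @ r + lam * (u.sum() + v.sum())
        g = A.T @ r
        return f, np.concatenate([g + lam, -g + lam])

    res = minimize(f_and_grad, np.zeros(2 * n), jac=True, method="L-BFGS-B",
                   bounds=[(0, None)] * (2 * n))
    x_hat = res.x[:n] - res.x[n:]
    print("relative recovery error:",
          np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
    ```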

  11. Social percolation models

    NASA Astrophysics Data System (ADS)

    Solomon, Sorin; Weisbuch, Gerard; de Arcangelis, Lucilla; Jan, Naeem; Stauffer, Dietrich

    2000-03-01

    We here relate the occurrence of extreme market shares, close to either 0 or 100%, in the media industry to a percolation phenomenon across the social network of customers. We further discuss the possibility of observing self-organized criticality when customers and cinema producers adjust their preferences and the quality of the produced films according to previous experience. Comprehensive computer simulations on square lattices do indeed exhibit self-organized criticality towards the usual percolation threshold and related scaling behaviour.
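
    A minimal sketch of the basic social-percolation step described above (the full self-organized-criticality feedback between customers and producers is omitted): a film of quality q spreads across a square lattice of customers, each with a random preference, and the reached market share jumps near the ordinary site-percolation threshold.

    ```python
    # Toy social-percolation step on a square lattice: a customer "sees" the film
    # if its quality q exceeds the customer's preference and a neighbour has
    # already seen it (seeded from the first row).
    import numpy as np
    from collections import deque

    def fraction_reached(L=100, q=0.6, seed=0):
        rng = np.random.default_rng(seed)
        prefs = rng.random((L, L))
        seen = np.zeros((L, L), dtype=bool)
        queue = deque((0, j) for j in range(L) if prefs[0, j] <= q)
        for i, j in queue:
            seen[i, j] = True
        while queue:
            i, j = queue.popleft()
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < L and 0 <= nj < L and not seen[ni, nj] and prefs[ni, nj] <= q:
                    seen[ni, nj] = True
                    queue.append((ni, nj))
        return seen.mean()

    # Market share jumps near the site-percolation threshold (~0.593 on this lattice).
    for q in (0.55, 0.59, 0.63):
        print(q, fraction_reached(q=q))
    ```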

  12. A general panel sizing computer code and its application to composite structural panels

    NASA Technical Reports Server (NTRS)

    Anderson, M. S.; Stroud, W. J.

    1978-01-01

    A computer code for obtaining the dimensions of optimum (least mass) stiffened composite structural panels is described. The procedure, which is based on nonlinear mathematical programming and a rigorous buckling analysis, is applicable to general cross sections under general loading conditions causing buckling. A simplified method of accounting for bow-type imperfections is also included. Design studies in the form of structural efficiency charts for axial compression loading are made with the code for blade and hat stiffened panels. The effects on panel mass of imperfections, material strength limitations, and panel stiffness requirements are also examined. Comparisons with previously published experimental data show that accounting for imperfections improves correlation between theory and experiment.

  13. A theoretical approach for analyzing the restabilization of wakes

    NASA Astrophysics Data System (ADS)

    Hill, D. C.

    1992-04-01

    Recently reported experimental results demonstrate that restabilization of the low-Reynolds-number flow past a circular cylinder can be achieved by the placement of a smaller cylinder in the wake of the first at particular locations. Traditional numerical procedures for modeling such phenomena are computationally expensive. An approach is presented here in which the properties of the adjoint solutions to the linearized equations of motion are exploited to map quickly the best positions for the small cylinder's placement. Comparisons with experiment and previous computations are favorable. The approach is shown to be applicable to general flows, illustrating how strongly control mechanisms that involve sources of momentum couple to unstable (or stable) modes of the system.

  14. Inductive reasoning about causally transmitted properties.

    PubMed

    Shafto, Patrick; Kemp, Charles; Bonawitz, Elizabeth Baraff; Coley, John D; Tenenbaum, Joshua B

    2008-11-01

    Different intuitive theories constrain and guide inferences in different contexts. Formalizing simple intuitive theories as probabilistic processes operating over structured representations, we present a new computational model of category-based induction about causally transmitted properties. A first experiment demonstrates undergraduates' context-sensitive use of taxonomic and food web knowledge to guide reasoning about causal transmission and shows good qualitative agreement between model predictions and human inferences. A second experiment demonstrates strong quantitative and qualitative fits to inferences about a more complex artificial food web. A third experiment investigates human reasoning about complex novel food webs where species have known taxonomic relations. Results demonstrate a double-dissociation between the predictions of our causal model and a related taxonomic model [Kemp, C., & Tenenbaum, J. B. (2003). Learning domain structures. In Proceedings of the 25th annual conference of the cognitive science society]: the causal model predicts human inferences about diseases but not genes, while the taxonomic model predicts human inferences about genes but not diseases. We contrast our framework with previous models of category-based induction and previous formal instantiations of intuitive theories, and outline challenges in developing a complete model of context-sensitive reasoning.

  15. Key Steps in Developing a Cognitive Vaccine against Traumatic Flashbacks: Visuospatial Tetris versus Verbal Pub Quiz

    PubMed Central

    Holmes, Emily A.; James, Ella L.; Kilford, Emma J.; Deeprose, Catherine

    2010-01-01

    Background Flashbacks (intrusive memories of a traumatic event) are the hallmark feature of Post Traumatic Stress Disorder, however preventative interventions are lacking. Tetris may offer a ‘cognitive vaccine’ [1] against flashback development after trauma exposure. We previously reported that playing the computer game Tetris soon after viewing traumatic material reduced flashbacks compared to no-task [1]. However, two criticisms need to be addressed for clinical translation: (1) Would all games have this effect via distraction/enjoyment, or might some games even be harmful? (2) Would effects be found if administered several hours post-trauma? Accordingly, we tested Tetris versus an alternative computer game – Pub Quiz – which we hypothesized not to be helpful (Experiments 1 and 2), and extended the intervention interval to 4 hours (Experiment 2). Methodology/Principal Findings The trauma film paradigm was used as an experimental analog for flashback development in healthy volunteers. In both experiments, participants viewed traumatic film footage of death and injury before completing one of the following: (1) no-task control condition (2) Tetris or (3) Pub Quiz. Flashbacks were monitored for 1 week. Experiment 1: 30 min after the traumatic film, playing Tetris led to a significant reduction in flashbacks compared to no-task control, whereas Pub Quiz led to a significant increase in flashbacks. Experiment 2: 4 hours post-film, playing Tetris led to a significant reduction in flashbacks compared to no-task control, whereas Pub Quiz did not. Conclusions/Significance First, computer games can have differential effects post-trauma, as predicted by a cognitive science formulation of trauma memory. In both Experiments, playing Tetris post-trauma film reduced flashbacks. Pub Quiz did not have this effect, even increasing flashbacks in Experiment 1. Thus not all computer games are beneficial or merely distracting post-trauma - some may be harmful. Second, the beneficial effects of Tetris are retained at 4 hours post-trauma. Clinically, this delivers a feasible time-window to administer a post-trauma “cognitive vaccine”. PMID:21085661

  16. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has application in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  17. Effects on Training Using Illumination in Virtual Environments

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Novak, M. S. Jennifer; Mueller, Kristian

    1999-01-01

    Camera based tasks are commonly performed during orbital operations, and orbital lighting conditions, such as high contrast shadowing and glare, are a factor in performance. Computer based training using virtual environments is a common tool used to make and keep crew members proficient. If computer based training included some of these harsh lighting conditions, would the crew increase their proficiency? The project goal was to determine whether computer based training increases proficiency if one trains for a camera based task using computer generated virtual environments with enhanced lighting conditions such as shadows and glare rather than the color shaded computer images normally used in simulators. Previous experiments were conducted using a two degree of freedom docking system. Test subjects had to align a boresight camera using a hand controller with one axis of translation and one axis of rotation. Two sets of subjects were trained on two computer simulations using computer generated virtual environments, one with lighting and one without. Results revealed that when subjects were constrained by time and accuracy, those who trained with simulated lighting conditions performed significantly better than those who did not. To reinforce these results for speed and accuracy, the task complexity was increased.

  18. An adaptive model approach for quantitative wrist rigidity evaluation during deep brain stimulation surgery.

    PubMed

    Assis, Sofia; Costa, Pedro; Rosas, Maria Jose; Vaz, Rui; Silva Cunha, Joao Paulo

    2016-08-01

    Intraoperative evaluation of the efficacy of Deep Brain Stimulation includes evaluation of the effect on rigidity. A subjective semi-quantitative scale is used, dependent on the examiner's perception and experience. A system was proposed previously to tackle this subjectivity, using quantitative data and providing real-time feedback of the computed rigidity reduction, hence supporting the physician's decision. This system comprised a gyroscope-based motion sensor in a textile band, placed on the patient's hand, which communicated its measurements to a laptop. The laptop computed a signal descriptor from the angular velocity of the hand during wrist flexion in DBS surgery. The first approach relied on a general rigidity reduction model, regardless of the initial severity of the symptom. To enhance the performance of the previously presented system, we therefore aimed to develop separate models for high and low baseline rigidity, according to the examiner's assessment before any stimulation, allowing a more patient-oriented approach. Additionally, usability was improved by performing the processing in situ on a smartphone instead of a computer. The system has been shown to be reliable, with an accuracy of 82.0% and a mean error of 3.4%. Relative to previous results, the performance was similar, further supporting the importance of considering cogwheel rigidity to better infer the reduction in rigidity. Overall, we present a simple, wearable, mobile system, suitable for intraoperative conditions during DBS, that supports the physician's decision-making when setting stimulation parameters.
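
    A hedged sketch of the kind of gyroscope-derived descriptor such a system might compute (the paper's actual descriptor, models and thresholds are not reproduced here); the signals below are synthetic.

    ```python
    # Illustrative rigidity descriptor: mean absolute angular velocity of the wrist
    # during passive flexion, compared before and during stimulation.
    import numpy as np

    def rigidity_descriptor(angular_velocity_dps):
        """Mean absolute angular velocity (deg/s) over the examination window."""
        return np.mean(np.abs(np.asarray(angular_velocity_dps, dtype=float)))

    def descriptor_change_percent(baseline_signal, stim_signal):
        base = rigidity_descriptor(baseline_signal)
        stim = rigidity_descriptor(stim_signal)
        return 100.0 * (stim - base) / base  # larger values = freer, faster movement

    rng = np.random.default_rng(3)
    t = np.linspace(0, 6 * np.pi, 600)
    baseline = 40 * np.sin(t) + rng.normal(0, 2, t.size)     # synthetic pre-stimulation trace
    during_dbs = 70 * np.sin(t) + rng.normal(0, 2, t.size)   # synthetic trace during stimulation
    print(f"descriptor change: {descriptor_change_percent(baseline, during_dbs):.1f}%")
    ```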

  19. Lingering representations of stimuli influence recall organization.

    PubMed

    Chan, Stephanie C Y; Applegate, Marissa C; Morton, Neal W; Polyn, Sean M; Norman, Kenneth A

    2017-03-01

    Several prominent theories posit that information about recent experiences lingers in the brain and organizes memories for current experiences, by forming a temporal context that is linked to those memories at encoding. According to these theories, if the thoughts preceding an experience X resemble the thoughts preceding an experience Y, then X and Y should show an elevated probability of being recalled together. We tested this prediction by using multi-voxel pattern analysis (MVPA) of fMRI data to measure neural evidence for lingering processing of preceding stimuli. As predicted, memories encoded with similar lingering thoughts about the category of preceding stimuli were more likely to be recalled together. Our results demonstrate that the "fading embers" of previous stimuli help to organize recall, confirming a key prediction of computational models of episodic memory. Copyright © 2017 Elsevier Ltd. All rights reserved.
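
    The core prediction can be illustrated with a toy retrieval simulation (not the study's MVPA pipeline): each studied item is tagged with the category lingering from its preceding stimulus, and a context-sensitive transition rule makes items with matching tags more likely to be recalled adjacently than under a random baseline.

    ```python
    # Toy recall simulation: transition probabilities during recall are a softmax
    # over whether two items shared the same lingering "preceding-category" tag.
    import numpy as np

    rng = np.random.default_rng(0)
    n_items = 12
    context = rng.integers(0, 3, size=n_items)  # category lingering from the preceding stimulus

    def recall_order(beta):
        remaining = list(range(n_items))
        current = remaining.pop(rng.integers(len(remaining)))
        order = [current]
        while remaining:
            sim = np.array([context[j] == context[current] for j in remaining], float)
            p = np.exp(beta * sim)
            p /= p.sum()
            current = remaining.pop(rng.choice(len(remaining), p=p))
            order.append(current)
        return order

    def same_context_adjacency(order):
        return np.mean([context[a] == context[b] for a, b in zip(order, order[1:])])

    print("context-cued :", np.mean([same_context_adjacency(recall_order(2.0)) for _ in range(2000)]))
    print("random order :", np.mean([same_context_adjacency(recall_order(0.0)) for _ in range(2000)]))
    ```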

  20. Molecular mechanism of NDMA formation from N,N-dimethylsulfamide during ozonation: quantum chemical insights into a bromide-catalyzed pathway.

    PubMed

    Trogolo, Daniela; Mishra, Brijesh Kumar; Heeb, Michèle B; von Gunten, Urs; Arey, J Samuel

    2015-04-07

    During ozonation of drinking water, the fungicide metabolite N,N-dimethylsulfamide (DMS) can be transformed into a highly toxic product, N-nitrosodimethylamine (NDMA). We used quantum chemical computations and stopped-flow experiments to evaluate a chemical mechanism proposed previously to describe this transformation. Stopped-flow experiments indicate a pK(a) = 10.4 for DMS. Experiments show that hypobromous acid (HOBr), generated by ozone oxidation of naturally occurring bromide, brominates the deprotonated DMS(-) anion with a near-diffusion controlled rate constant (7.1 ± 0.6 × 10(8) M(-1) s(-1)), forming Br-DMS(-) anion. According to quantum chemical calculations, Br-DMS has a pK(a) ∼ 9.0 and thus remains partially deprotonated at neutral pH. The anionic Br-DMS(-) bromamine can react with ozone with a high rate constant (10(5 ± 2.5) M(-1) s(-1)), forming the reaction intermediate (BrNO)(SO2)N(CH3)2(-). This intermediate resembles a loosely bound complex between an electrophilic nitrosyl bromide (BrNO) molecule and an electron-rich dimethylaminosulfinate ((SO2)N(CH3)2(-)) fragment, based on inspection of computed natural charges and geometric parameters. This fragile complex undergoes immediate (10(10 ± 2.5) s(-1)) reaction by two branches: an exothermic channel that produces NDMA, and an entropy-driven channel giving non-NDMA products. Computational results bring new insights into the electronic nature, chemical equilibria, and kinetics of the elementary reactions of this pathway, enabled by computed energies of structures that are not possible to access experimentally.
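
    A back-of-the-envelope speciation check consistent with the reported pKa of 10.4 (standard Henderson-Hasselbalch algebra, not a calculation taken from the paper): the fraction of DMS present as the reactive DMS- anion at a given pH.

    ```python
    # Fraction of DMS deprotonated to the reactive anion at a given pH,
    # using the measured pKa = 10.4 and the Henderson-Hasselbalch relation.
    def anion_fraction(pH, pKa=10.4):
        return 1.0 / (1.0 + 10.0 ** (pKa - pH))

    for pH in (7.0, 8.0, 10.4):
        print(f"pH {pH}: {anion_fraction(pH):.2%} present as DMS-")
    ```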

  1. CAMAC throughput of a new RISC-based data acquisition computer at the DIII-D tokamak

    NASA Astrophysics Data System (ADS)

    Vanderlaan, J. F.; Cummings, J. W.

    1993-10-01

    The amount of experimental data acquired per plasma discharge at DIII-D has continued to grow. The largest shot size in May 1991 was 49 Mbyte; in May 1992, 66 Mbyte; and in April 1993, 80 Mbyte. The increasing load has prompted the installation of a new Motorola 88100-based MODCOMP computer to supplement the existing core of three older MODCOMP data acquisition CPU's. New Kinetic Systems CAMAC serial highway driver hardware runs on the 88100 VME bus. The new operating system is the MODCOMP REAL/IX version of AT&T System V UNIX with real-time extensions and networking capabilities; future plans call for installation of additional computers of this type for tokamak and neutral beam control functions. Experiences with the CAMAC hardware and software will be chronicled, including observation of data throughput. The Enhanced Serial Highway crate controller is advertised as twice as fast as the previous crate controller, and faster computer I/O speeds are also expected to increase data rates.

  2. Computational-Model-Based Analysis of Context Effects on Harmonic Expectancy.

    PubMed

    Morimoto, Satoshi; Remijn, Gerard B; Nakajima, Yoshitaka

    2016-01-01

    Expectancy for an upcoming musical chord, harmonic expectancy, is supposedly based on automatic activation of tonal knowledge. Since previous studies implicitly relied on interpretations based on Western music theory, the underlying computational processes involved in harmonic expectancy and how it relates to tonality need further clarification. In particular, short chord sequences which cannot lead to unique keys are difficult to interpret in music theory. In this study, we examined effects of preceding chords on harmonic expectancy from a computational perspective, using stochastic modeling. We conducted a behavioral experiment, in which participants listened to short chord sequences and evaluated the subjective relatedness of the last chord to the preceding ones. Based on these judgments, we built stochastic models of the computational process underlying harmonic expectancy. Following this, we compared the explanatory power of the models. Our results imply that, even when listening to short chord sequences, internally constructed and updated tonal assumptions determine the expectancy of the upcoming chord.
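
    One simple member of the family of stochastic models compared in such studies is a first-order Markov model over chord categories; the transition counts below are made-up placeholders rather than the paper's fitted parameters.

    ```python
    # First-order Markov model of harmonic expectancy: the probability of the next
    # chord depends only on the last chord heard. Counts are illustrative.
    import numpy as np

    chords = ["I", "IV", "V", "vi"]
    counts = np.array([[10, 30, 40, 20],   # transitions from I
                       [25, 10, 50, 15],   # from IV
                       [55, 15, 10, 20],   # from V
                       [20, 35, 30, 15]])  # from vi
    P = counts / counts.sum(axis=1, keepdims=True)

    def expectancy(sequence, candidate):
        """Probability of the candidate chord given only the last chord heard."""
        return P[chords.index(sequence[-1]), chords.index(candidate)]

    print(expectancy(["I", "IV", "V"], "I"))   # strong V -> I expectancy
    print(expectancy(["I", "IV", "V"], "vi"))  # weaker deceptive resolution
    ```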

  3. Computational-Model-Based Analysis of Context Effects on Harmonic Expectancy

    PubMed Central

    Morimoto, Satoshi; Remijn, Gerard B.; Nakajima, Yoshitaka

    2016-01-01

    Expectancy for an upcoming musical chord, harmonic expectancy, is supposedly based on automatic activation of tonal knowledge. Since previous studies implicitly relied on interpretations based on Western music theory, the underlying computational processes involved in harmonic expectancy and how it relates to tonality need further clarification. In particular, short chord sequences which cannot lead to unique keys are difficult to interpret in music theory. In this study, we examined effects of preceding chords on harmonic expectancy from a computational perspective, using stochastic modeling. We conducted a behavioral experiment, in which participants listened to short chord sequences and evaluated the subjective relatedness of the last chord to the preceding ones. Based on these judgments, we built stochastic models of the computational process underlying harmonic expectancy. Following this, we compared the explanatory power of the models. Our results imply that, even when listening to short chord sequences, internally constructed and updated tonal assumptions determine the expectancy of the upcoming chord. PMID:27003807

  4. Effects of Geometric Details on Slat Noise Generation and Propagation

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Lockard, David P.

    2009-01-01

    The relevance of geometric details to the generation and propagation of noise from leading-edge slats is considered. Typically, such details are omitted in computational simulations and model-scale experiments thereby creating ambiguities in comparisons with acoustic results from flight tests. The current study uses two-dimensional, computational simulations in conjunction with a Ffowcs Williams-Hawkings (FW-H) solver to investigate the effects of previously neglected slat "bulb" and "blade" seals on the local flow field and the associated acoustic radiation. The computations show that the presence of the "blade" seal at the cusp in the simulated geometry significantly changes the slat cove flow dynamics, reduces the amplitudes of the radiated sound, and to a lesser extent, alters the directivity beneath the airfoil. Furthermore, the computations suggest that a modest extension of the baseline "blade" seal further enhances the suppression of slat noise. As a side issue, the utility and equivalence of FW-H methodology for calculating far-field noise as opposed to a more direct approach is examined and demonstrated.

  5. Observation of temperature trace, induced by changing of temperature inside the human body, on the human body skin using commercially available IR camera

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2015-05-01

    As is well known, the passive THz camera is a very promising tool for security applications: it allows concealed objects to be seen without contact with a person and poses no danger to the person. In previous papers we demonstrated a new possibility of using the passive THz camera to observe a temperature difference on the human skin when this difference is caused by different temperatures inside the body. To support this claim we performed a similar physical experiment using an IR camera. We show that a temperature trace on the skin of the human body, caused by a change of temperature inside the body due to water drinking, can be observed. We used both the computer code available for processing images captured by a commercially available IR camera manufactured by Flir Corp. and our own computer code for processing these images. Using both codes we clearly demonstrate the change of human body skin temperature induced by water drinking. The phenomena shown are very important for detecting forbidden samples and substances concealed inside the human body by non-destructive inspection without the use of X-rays. Earlier we demonstrated this possibility using THz radiation. The experiments carried out can be applied to counter-terrorism problems. We developed original filters for computer processing of images captured by IR cameras; their application enhances the temperature resolution of the cameras.

  6. An Experimental Study in the Use of Computer-Based Instruction to Teach Automated Spreadsheet Functions

    DTIC Science & Technology

    1991-09-01

    review of past CBI studies was conducted to provide the researcher a theoretical knowledge base on the effectiveness and efficiency of CBI. A summary... Literature Review Findings on Ways to Measure CBI Effectiveness and Efficiency. The literature included previously conducted CBI experiments, studies, and... nine choices on each main and submenu (14:16). 3) Allow the student to make a menu selection with upper or lower case entries (28:291). 4) Prevent

  7. Replaying the game: hypnagogic images in normals and amnesics.

    PubMed

    Stickgold, R; Malia, A; Maguire, D; Roddenberry, D; O'Connor, M

    2000-10-13

    Participants playing the computer game Tetris reported intrusive, stereotypical, visual images of the game at sleep onset. Three amnesic patients with extensive bilateral medial temporal lobe damage produced similar hypnagogic reports despite being unable to recall playing the game, suggesting that such imagery may arise without important contribution from the declarative memory system. In addition, control participants reported images from previously played versions of the game, demonstrating that remote memories can influence the images from recent waking experience.

  8. Automatic maintenance payload on board of a Mexican LEO microsatellite

    NASA Astrophysics Data System (ADS)

    Vicente-Vivas, Esaú; García-Nocetti, Fabián; Mendieta-Jiménez, Francisco

    2006-02-01

    A few research institutions from Mexico are working together to finalize the integration of a technological demonstration microsatellite called Satex, aiming at the launch of the first fully domestically designed and manufactured space vehicle. The project is based on technical knowledge gained in previous space experiences, particularly in developing GASCAN automatic experiments for NASA's space shuttle, and on some support obtained from the local team which assembled the México-OSCAR-30 microsatellites. Satex includes three autonomous payloads and a power subsystem, each one with a local microcomputer to provide intelligent and dedicated control. It also contains a flight computer (FC) with a pair of full redundancies, which enables remote maintenance of processing boards from the ground station. A fourth, communications payload depends on the flight computer for control purposes. It was decided to develop a fifth payload for the satellite; it adds value to the available on-board computers and extends the opportunity for a developing country to learn and to generate domestic space technology. Its aim is to provide automatic maintenance capabilities for the most critical on-board computer in order to achieve continuous satellite operations. This paper presents the virtual computer architecture specially developed to provide maintenance capabilities to the flight computer. The architecture is periodically implemented in software with a small number of physical processors (FC processors) and virtual redundancies (payload processors) to emulate a hybrid-redundancy computer. Communications among processors are accomplished over a fault-tolerant LAN, which allows versatile operating behavior in terms of data communication as well as distributed fault tolerance. Obtained results, payload validation and reliability results are also presented.

  9. Communication: rate coefficients from quasiclassical trajectory calculations from the reverse reaction: The Mu + H2 reaction re-visited.

    PubMed

    Homayoon, Zahra; Jambrina, Pablo G; Aoiz, F Javier; Bowman, Joel M

    2012-07-14

    In a previous paper [P. G. Jambrina et al., J. Chem. Phys. 135, 034310 (2011)] various calculations of the rate coefficient for the Mu + H(2) → MuH + H reaction were presented and compared to experiment. The widely used standard quasiclassical trajectory (QCT) method was shown to overestimate the rate coefficients by several orders of magnitude over the temperature range 200-1000 K. This was attributed to a major failure of that method to describe the correct threshold for the reaction owing to the large difference in zero-point energies (ZPE) of the reactant H(2) and product MuH (∼0.32 eV). In this Communication we show that by performing standard QCT calculations for the reverse reaction and then applying detailed balance, the resulting rate coefficient is in very good agreement with the other computational results that respect the ZPE, (as well as with the experiment) but which are more demanding computationally.
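
    The detailed-balance step itself is standard kinetics and can be written out explicitly; the sketch below uses illustrative numbers only (the actual equilibrium constant would be computed from partition functions and the zero-point-corrected endoergicity on the same potential energy surface).

    ```python
    # Detailed balance: recover the forward rate coefficient for Mu + H2 -> MuH + H
    # from a QCT rate coefficient for the reverse reaction, k_f(T) = K_eq(T) * k_r(T).
    # dE0_eV and Q_ratio below are placeholders, not values from the paper.
    import math

    def K_eq(T, dE0_eV, Q_ratio):
        """Equilibrium constant: (product/reactant partition-function ratio) * exp(-dE0 / kB T)."""
        kB_eV = 8.617333262e-5  # Boltzmann constant in eV/K
        return Q_ratio * math.exp(-dE0_eV / (kB_eV * T))

    def k_forward(k_reverse, K_eq_value):
        return K_eq_value * k_reverse

    # Illustrative numbers at 500 K (k_reverse in cm^3 molecule^-1 s^-1):
    print(k_forward(k_reverse=1.0e-12, K_eq_value=K_eq(500.0, dE0_eV=0.32, Q_ratio=2.0)))
    ```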

  10. Communication: Rate coefficients from quasiclassical trajectory calculations from the reverse reaction: The Mu + H2 reaction re-visited

    NASA Astrophysics Data System (ADS)

    Homayoon, Zahra; Jambrina, Pablo G.; Aoiz, F. Javier; Bowman, Joel M.

    2012-07-01

    In a previous paper [P. G. Jambrina et al., J. Chem. Phys. 135, 034310 (2011), 10.1063/1.3611400] various calculations of the rate coefficient for the Mu + H2 → MuH + H reaction were presented and compared to experiment. The widely used standard quasiclassical trajectory (QCT) method was shown to overestimate the rate coefficients by several orders of magnitude over the temperature range 200-1000 K. This was attributed to a major failure of that method to describe the correct threshold for the reaction owing to the large difference in zero-point energies (ZPE) of the reactant H2 and product MuH (˜0.32 eV). In this Communication we show that by performing standard QCT calculations for the reverse reaction and then applying detailed balance, the resulting rate coefficient is in very good agreement with the other computational results that respect the ZPE, (as well as with the experiment) but which are more demanding computationally.

  11. Multiscale Computer Simulation of Failure in Aerogels

    NASA Technical Reports Server (NTRS)

    Good, Brian S.

    2008-01-01

    Aerogels have been of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While such gels are typically fragile, recent advances in the application of conformal polymer layers to these gels has made them potentially useful as lightweight structural materials as well. We have previously performed computer simulations of aerogel thermal conductivity and tensile and compressive failure, with results that are in qualitative, and sometimes quantitative, agreement with experiment. However, recent experiments in our laboratory suggest that gels having similar densities may exhibit substantially different properties. In this work, we extend our original diffusion limited cluster aggregation (DLCA) model for gel structure to incorporate additional variation in DLCA simulation parameters, with the aim of producing DLCA clusters of similar densities that nevertheless have different fractal dimension and secondary particle coordination. We perform particle statics simulations of gel strain on these clusters, and consider the effects of differing DLCA simulation conditions, and the resultant differences in fractal dimension and coordination, on gel strain properties.

  12. Improving Simulated Annealing by Replacing Its Variables with Game-Theoretic Utility Maximizers

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.; Bandari, Esfandiar; Tumer, Kagan

    2001-01-01

    The game-theory field of Collective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved as a side-effect. Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting significantly improves simulated annealing for a model of an economic process run over an underlying small-worlds topology. Furthermore, these experiments reveal novel small-worlds phenomena, and highlight the shortcomings of conventional mechanism design in bounded rationality domains.
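
    For concreteness, here is a minimal per-variable simulated-annealing baseline on a toy objective; the COIN modification described above (replacing each variable with a player maximizing a private utility) is not reproduced in this sketch.

    ```python
    # Plain simulated annealing over a vector of discrete variables ("players"),
    # each proposing a flip of its own value against a toy global objective.
    import math, random

    def anneal(n=30, steps=20000, T0=2.0, seed=1):
        rng = random.Random(seed)
        x = [rng.choice((-1, 1)) for _ in range(n)]
        J = [[rng.uniform(-1, 1) for _ in range(n)] for _ in range(n)]  # random couplings

        def energy(s):
            # frustrated pairwise couplings on a ring
            return sum(J[i][(i + 1) % n] * s[i] * s[(i + 1) % n] for i in range(n))

        E = energy(x)
        for t in range(steps):
            T = T0 * (1 - t / steps) + 1e-3      # linear cooling schedule
            i = rng.randrange(n)                  # pick one "player"
            x[i] = -x[i]                          # propose flipping its variable
            E_new = energy(x)
            if E_new <= E or rng.random() < math.exp(-(E_new - E) / T):
                E = E_new                         # accept the move
            else:
                x[i] = -x[i]                      # reject, undo the flip
        return E

    print("final energy:", anneal())
    ```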

  13. Computational and Spectroscopic Investigations of the Molecular Scale Structure and Dynamics of Geologically Important Fluids and Mineral-Fluid Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. James Kirkpatrick; Andrey G. Kalinichev

    2008-11-25

    Research supported by this grant focuses on molecular scale understanding of central issues related to the structure and dynamics of geochemically important fluids, fluid-mineral interfaces, and confined fluids using computational modeling and experimental methods. Molecular scale knowledge about fluid structure and dynamics, how these are affected by mineral surfaces and molecular-scale (nano-) confinement, and how water molecules and dissolved species interact with surfaces is essential to understanding the fundamental chemistry of a wide range of low-temperature geochemical processes, including sorption and geochemical transport. Our principal efforts are devoted to continued development of relevant computational approaches, application of these approaches to important geochemical questions, relevant NMR and other experimental studies, and application of computational modeling methods to understanding the experimental results. The combination of computational modeling and experimental approaches is proving highly effective in addressing otherwise intractable problems. In 2006-2007 we have significantly advanced in new, highly promising research directions along with completion of on-going projects and final publication of work completed in previous years. New computational directions are focusing on modeling proton exchange reactions in aqueous solutions using ab initio molecular dynamics (AIMD), metadynamics (MTD), and empirical valence bond (EVB) approaches. Proton exchange is critical to understanding the structure, dynamics, and reactivity at mineral-water interfaces and for oxy-ions in solution, but has traditionally been difficult to model with molecular dynamics (MD). Our ultimate objective is to develop this capability, because MD is much less computationally demanding than quantum-chemical approaches. We have also extended our previous MD simulations of metal binding to natural organic matter (NOM) to a much longer time scale (up to 10 ns) for significantly larger systems. These calculations have allowed us, for the first time, to study the effects of metal cations with different charges and charge density on the NOM aggregation in aqueous solutions. Other computational work has looked at the longer-time-scale dynamical behavior of aqueous species at mineral-water interfaces investigated simultaneously by NMR spectroscopy. Our experimental NMR studies have focused on understanding the structure and dynamics of water and dissolved species at mineral-water interfaces and in two-dimensional nano-confinement within clay interlayers. Combined NMR and MD study of H2O, Na+, and Cl- interactions with the surface of quartz has direct implications regarding interpretation of sum frequency vibrational spectroscopic experiments for this phase and will be an important reference for future studies. We also used NMR to examine the behavior of K+ and H2O in the interlayer and at the surfaces of the clay minerals hectorite and illite-rich illite-smectite. This is the first time K+ dynamics has been characterized spectroscopically in geochemical systems. Preliminary experiments were also performed to evaluate the potential of 75As NMR as a probe of arsenic geochemical behavior. The 75As NMR study used advanced signal enhancement methods, introduced a new data acquisition approach to minimize the time investment in ultra-wide-line NMR experiments, and provides the first evidence of a strong relationship between the chemical shift and structural parameters for this experimentally challenging nucleus. We have also initiated a series of inelastic and quasi-elastic neutron scattering measurements of water dynamics in the interlayers of clays and layered double hydroxides. The objective of these experiments is to probe the correlations of water molecular motions in confined spaces over the scale of times and distances most directly comparable to our MD simulations and on a time scale different than that probed by NMR. This work is being done in collaboration with Drs. C.-K. Loong, N. de Souza, and A.I. Kolesnikov at the Intense Pulsed Neutron Source facility of the Argonne National Lab, and Dr. A. Faraone at the NIST Center for Neutron Research. A manuscript reporting the first results of these experiments, which are highly complementary to our previous NMR, X-ray, and infra-red results for these phases, is currently in preparation. In total, in 2006-2007 our work has resulted in the publication of 14 peer-reviewed research papers. We also devoted considerable effort to making our work known to a wide range of researchers, as indicated by the 24 contributed abstracts and 14 invited presentations.

  14. Practical experimental certification of computational quantum gates using a twirling procedure.

    PubMed

    Moussa, Osama; da Silva, Marcus P; Ryan, Colm A; Laflamme, Raymond

    2012-08-17

    Because of the technical difficulty of building large quantum computers, it is important to be able to estimate how faithful a given implementation is to an ideal quantum computer. The common approach of completely characterizing the computation process via quantum process tomography requires an exponential amount of resources, and thus is not practical even for relatively small devices. We solve this problem by demonstrating that twirling experiments previously used to characterize the average fidelity of quantum memories efficiently can be easily adapted to estimate the average fidelity of the experimental implementation of important quantum computation processes, such as unitaries in the Clifford group, in a practical and efficient manner with applicability in current quantum devices. Using this procedure, we demonstrate state-of-the-art coherent control of an ensemble of magnetic moments of nuclear spins in a single crystal solid by implementing the encoding operation for a 3-qubit code with only a 1% degradation in average fidelity discounting preparation and measurement errors. We also highlight one of the advances that was instrumental in achieving such high fidelity control.

  15. Decisions reduce sensitivity to subsequent information.

    PubMed

    Bronfman, Zohar Z; Brezis, Noam; Moran, Rani; Tsetsos, Konstantinos; Donner, Tobias; Usher, Marius

    2015-07-07

    Behavioural studies over half a century indicate that making categorical choices alters beliefs about the state of the world. People seem biased to confirm previous choices, and to suppress contradicting information. These choice-dependent biases imply a fundamental bound of human rationality. However, it remains unclear whether these effects extend to lower level decisions, and only little is known about the computational mechanisms underlying them. Building on the framework of sequential-sampling models of decision-making, we developed novel psychophysical protocols that enable us to dissect quantitatively how choices affect the way decision-makers accumulate additional noisy evidence. We find robust choice-induced biases in the accumulation of abstract numerical (experiment 1) and low-level perceptual (experiment 2) evidence. These biases deteriorate estimations of the mean value of the numerical sequence (experiment 1) and reduce the likelihood to revise decisions (experiment 2). Computational modelling reveals that choices trigger a reduction of sensitivity to subsequent evidence via multiplicative gain modulation, rather than shifting the decision variable towards the chosen alternative in an additive fashion. Our results thus show that categorical choices alter the evidence accumulation mechanism itself, rather than just its outcome, rendering the decision-maker less sensitive to new information. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
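
    The multiplicative gain-modulation account can be illustrated with a toy sequential-sampling simulation (not the authors' fitted model): after an interim choice, each subsequent evidence sample is weighted by a gain g < 1, which lowers the probability of revising the initial decision.

    ```python
    # Toy sequential-sampling model: post-choice evidence is down-weighted by a
    # multiplicative gain rather than the decision variable being shifted additively.
    import numpy as np

    def revision_rate(n_trials=10000, n_pre=8, n_post=8, drift=0.1, noise=1.0,
                      gain_after_choice=0.6, seed=0):
        rng = np.random.default_rng(seed)
        revised = 0
        for _ in range(n_trials):
            pre = drift + noise * rng.standard_normal(n_pre)
            dv = pre.sum()
            choice = 1 if dv > 0 else -1                    # interim categorical choice
            post = drift + noise * rng.standard_normal(n_post)
            dv += gain_after_choice * post.sum()            # reduced sensitivity post-choice
            revised += (1 if dv > 0 else -1) != choice
        return revised / n_trials

    print("revision rate, gain 0.6:", revision_rate())
    print("revision rate, gain 1.0:", revision_rate(gain_after_choice=1.0))
    ```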

  16. Distractor Evoked Deviations of Saccade Trajectory Are Modulated by Fixation Activity in the Superior Colliculus: Computational and Behavioral Evidence

    PubMed Central

    Wang, Zhiguo; Theeuwes, Jan

    2014-01-01

    Previous studies have shown that saccades may deviate towards or away from task irrelevant visual distractors. This observation has been attributed to active suppression (inhibition) of the distractor location unfolding over time: early in time inhibition at the distractor location is incomplete causing deviation towards the distractor, while later in time when inhibition is complete the eyes deviate away from the distractor. In a recent computational study, Wang, Kruijne and Theeuwes proposed an alternative theory that the lateral interactions in the superior colliculus (SC), which are characterized by short-distance excitation and long-distance inhibition, are sufficient for generating both deviations towards and away from distractors. In the present study, we performed a meta-analysis of the literature, ran model simulations and conducted two behavioral experiments to further explore this unconventional theory. Confirming predictions generated by the model simulations, the behavioral experiments show that a) saccades deviate towards close distractors and away from remote distractors, and b) the amount of deviation depends on the strength of fixation activity in the SC, which can be manipulated by turning off the fixation stimulus before or after target onset (Experiment 1), or by varying the eccentricity of the target and distractor (Experiment 2). PMID:25551552
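
    The lateral-interaction profile invoked here (short-distance excitation, long-distance inhibition) is commonly modelled as a difference of Gaussians; below is a small sketch with illustrative widths and gains, not the parameters of the cited model.

    ```python
    # Difference-of-Gaussians lateral-interaction kernel (illustrative parameters):
    # nearby sites excite each other while remote sites inhibit, so close and
    # remote distractors influence the saccade map with opposite signs.
    import numpy as np

    def lateral_weight(distance_deg, a_exc=1.0, s_exc=1.5, a_inh=0.4, s_inh=6.0):
        return (a_exc * np.exp(-distance_deg**2 / (2 * s_exc**2))
                - a_inh * np.exp(-distance_deg**2 / (2 * s_inh**2)))

    for d in (0.5, 1.5, 6.0):
        print(f"distance {d} deg -> weight {lateral_weight(d):+.3f}")
    ```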

  17. The Relationship Between Computer Experience and Computerized Cognitive Test Performance Among Older Adults

    PubMed Central

    2013-01-01

    Objective. This study compared the relationship between computer experience and performance on computerized cognitive tests and a traditional paper-and-pencil cognitive test in a sample of older adults (N = 634). Method. Participants completed computer experience and computer attitudes questionnaires, three computerized cognitive tests (Useful Field of View (UFOV) Test, Road Sign Test, and Stroop task) and a paper-and-pencil cognitive measure (Trail Making Test). Multivariate analysis of covariance was used to examine differences in cognitive performance across the four measures between those with and without computer experience after adjusting for confounding variables. Results. Although computer experience had a significant main effect across all cognitive measures, the effect sizes were similar. After controlling for computer attitudes, the relationship between computer experience and UFOV was fully attenuated. Discussion. Findings suggest that computer experience is not uniquely related to performance on computerized cognitive measures compared with paper-and-pencil measures. Because the relationship between computer experience and UFOV was fully attenuated by computer attitudes, this may imply that motivational factors are more influential to UFOV performance than computer experience. Our findings support the hypothesis that computer use is related to cognitive performance, and this relationship is not stronger for computerized cognitive measures. Implications and directions for future research are provided. PMID:22929395

  18. Sources of computer self-efficacy: The relationship to outcome expectations, computer anxiety, and intention to use computers

    NASA Astrophysics Data System (ADS)

    Antoine, Marilyn V.

    2011-12-01

    The purpose of this research was to extend earlier research on sources of self-efficacy (Lent, Lopez, & Biechke, 1991; Usher & Pajares, 2009) to the information technology domain. The principal investigator examined how Bandura's (1977) sources of self-efficacy information---mastery experience, vicarious experience, verbal persuasion, and physiological states---shape computer self-efficacy beliefs and influence the decision to use or not use computers. The study took place at a mid-sized Historically Black College or University in the South. A convenience sample of 105 undergraduates was drawn from students enrolled in multiple sections of two introductory computer courses. There were 67 females and 38 males. This research was a correlational study of the following variables: sources of computer self-efficacy, general computer self-efficacy, outcome expectations, computer anxiety, and intention to use computers. The principal investigator administered a survey questionnaire containing 52 Likert items to measure the major study variables. Additionally, the survey instrument collected demographic variables such as gender, age, race, intended major, classification, technology use, technology adoption category, and whether the student owns a computer. The results reveal the following: (1) Mastery experience and verbal persuasion had statistically significant relationships to general computer self-efficacy, while vicarious experience and physiological states had non-significant relationships. Mastery experience had the strongest correlation to general computer self-efficacy. (2) All of the sources of computer self-efficacy had statistically significant relationships to personal outcome expectations. Vicarious experience had the strongest correlation to personal outcome expectations. (3) All of the sources of self-efficacy had statistically significant relationships to performance outcome expectations. Vicarious experience had the strongest correlation to performance outcome expectations. (4) Mastery experience and physiological states had statistically significant relationships to computer anxiety, while vicarious experience and verbal persuasion had non-significant relationships. Physiological states had the strongest correlation to computer anxiety. (5) Mastery experience, vicarious experience, and physiological states had statistically significant relationships to intention to use computers, while verbal persuasion had a non-significant relationship. Mastery experience had the strongest correlation to intention to use computers. Gender-related findings indicate that females reported higher average mastery experience, vicarious experience, physiological states, and intention to use computers than males. Females reported lower average general computer self-efficacy, computer anxiety, verbal persuasion, personal outcome expectations, and performance outcome expectations than males. The results of this study can be used to develop strategies for increasing general computer self-efficacy, outcome expectations, and intention to use computers. The results can also be used to develop strategies for reducing computer anxiety.

  19. An accurate global potential energy surface, dipole moment surface, and rovibrational frequencies for NH3

    NASA Astrophysics Data System (ADS)

    Huang, Xinchuan; Schwenke, David W.; Lee, Timothy J.

    2008-12-01

    A global potential energy surface (PES) that includes short and long range terms has been determined for the NH3 molecule. The singles and doubles coupled-cluster method that includes a perturbational estimate of connected triple excitations and the internally contracted averaged coupled-pair functional electronic structure methods have been used in conjunction with very large correlation-consistent basis sets, including diffuse functions. Extrapolation to the one-particle basis set limit was performed and core correlation and scalar relativistic contributions were included directly, while the diagonal Born-Oppenheimer correction was added. Our best purely ab initio PES, denoted "mixed," is constructed from two PESs which differ in whether the ic-ACPF higher-order correlation correction was added or not. Rovibrational transition energies computed from the mixed PES agree well with experiment and the best previous theoretical studies, but most importantly the quality does not deteriorate even up to 10300 cm-1 above the zero-point energy (ZPE). The mixed PES was improved further by empirical refinement using the most reliable J = 0-2 rovibrational transitions in the HITRAN 2004 database. Agreement between high-resolution experiment and rovibrational transition energies computed from our refined PES for J = 0-6 is excellent. Indeed, the root mean square (rms) error for 13 HITRAN 2004 bands for J = 0-2 is 0.023 cm-1 and that for each band is always ⩽0.06 cm-1. For J = 3-5 the rms error is always ⩽0.15 cm-1. This agreement means that transition energies computed with our refined PES should be useful in the assignment of new high-resolution NH3 spectra and in correcting mistakes in previous assignments. Ideas for further improvements to our refined PES and for extension to other isotopologues are discussed.
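
    The figure of merit quoted above is a plain root-mean-square deviation between computed and observed transition energies; a worked example with made-up numbers (not the actual HITRAN comparison):

    ```python
    # Root-mean-square error between computed and observed rovibrational
    # transition energies, in cm^-1; the values below are illustrative only.
    import numpy as np

    computed = np.array([932.43, 968.12, 1626.27, 3336.11])
    observed = np.array([932.46, 968.10, 1626.30, 3336.05])
    rms = np.sqrt(np.mean((computed - observed) ** 2))
    print(f"rms error = {rms:.3f} cm^-1")
    ```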

  20. Confirmation of a realistic reactor model for BNCT dosimetry at the TRIGA Mainz

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ziegner, Markus, E-mail: Markus.Ziegner.fl@ait.ac.at; Schmitz, Tobias; Hampel, Gabriele

    2014-11-01

    Purpose: In order to build up a reliable dose monitoring system for boron neutron capture therapy (BNCT) applications at the TRIGA reactor in Mainz, a computer model for the entire reactor was established, simulating the radiation field by means of the Monte Carlo method. The impact of different source definition techniques was compared and the model was validated by experimental fluence and dose determinations. Methods: The depletion calculation code ORIGEN2 was used to compute the burn-up and relevant material composition of each burned fuel element from the day of first reactor operation to its current core. The material composition of the current core was used in a MCNP5 model of the initial core developed earlier. To perform calculations for the region outside the reactor core, the model was expanded to include the thermal column and compared with the previously established ATTILA model. Subsequently, the computational model is simplified in order to reduce the calculation time. Both simulation models are validated by experiments with different setups using alanine dosimetry and gold activation measurements with two different types of phantoms. Results: The MCNP5 simulated neutron spectrum and source strength are found to be in good agreement with the previous ATTILA model whereas the photon production is much lower. Both MCNP5 simulation models predict all experimental dose values with an accuracy of about 5%. The simulations reveal that a Teflon environment favorably reduces the gamma dose component as compared to a polymethyl methacrylate phantom. Conclusions: A computer model for BNCT dosimetry was established, allowing the prediction of dosimetric quantities without further calibration and within a reasonable computation time for clinical applications. The good agreement between the MCNP5 simulations and experiments demonstrates that the ATTILA model overestimates the gamma dose contribution. The detailed model can be used for the planning of structural modifications in the thermal column irradiation channel or the use of different irradiation sites than the thermal column, e.g., the beam tubes.

  1. Test and evaluation of a multifunction keyboard and a dedicated keyboard for control of a flight management computer

    NASA Technical Reports Server (NTRS)

    Crane, J. M.; Boucek, G. P., Jr.; Smith, W. D.

    1986-01-01

    A flight management computer (FMC) control display unit (CDU) test was conducted to compare two types of input devices: a fixed legend (dedicated) keyboard and a programmable legend (multifunction) keyboard. The task used for comparison was operation of the flight management computer for the Boeing 737-300. The same tasks were performed by twelve pilots on the FMC control display unit configured with a programmable legend keyboard and with the currently used B737-300 dedicated keyboard. Flight simulator work activity levels and input task complexity were varied during each pilot session. Half of the pilots tested were previously familiar with the B737-300 dedicated keyboard CDU and half had no prior experience with it. The data collected included simulator flight parameters, keystroke time and sequences, and pilot questionnaire responses. A timeline analysis was also used for evaluation of the two keyboard concepts.

  2. A composite computational model of liver glucose homeostasis. I. Building the composite model.

    PubMed

    Hetherington, J; Sumner, T; Seymour, R M; Li, L; Rey, M Varela; Yamaji, S; Saffrey, P; Margoninski, O; Bogle, I D L; Finkelstein, A; Warner, A

    2012-04-07

    A computational model of the glucagon/insulin-driven liver glucohomeostasis function, focusing on the buffering of glucose into glycogen, has been developed. The model exemplifies an 'engineering' approach to modelling in systems biology, and was produced by linking together seven component models of separate aspects of the physiology. The component models use a variety of modelling paradigms and degrees of simplification. Model parameters were determined by an iterative hybrid of fitting to high-scale physiological data, and determination from small-scale in vitro experiments or molecular biological techniques. The component models were not originally designed for inclusion within such a composite model, but were integrated, with modification, using our published modelling software and computational frameworks. This approach facilitates the development of large and complex composite models, although, inevitably, some compromises must be made when composing the individual models. Composite models of this form have not previously been demonstrated.

  3. Computation of Three-Dimensional Compressible Flow From a Rectangular Nozzle with Delta Tabs

    NASA Technical Reports Server (NTRS)

    Reddy, D. R.; Steffen, C. J., Jr.; Zaman, K. B. M. Q.

    1999-01-01

    A three-dimensional viscous flow analysis is performed using a time-marching Reynolds-averaged Navier-Stokes code for a 3:1 rectangular nozzle with two delta tabs located at the nozzle exit plane to enhance mixing. Two flow configurations, a subsonic jet case and a supersonic jet case using the same tab configuration, which were previously studied experimentally, are computed and compared with the experimental data. The experimental data include streamwise velocity and vorticity distributions for the subsonic case, and Mach number distributions for the supersonic case, at various axial locations downstream of the nozzle exit. The computational results show very good agreement with the experimental data. In addition, the effect of compressibility on vorticity dynamics is examined by comparing the vorticity contours of the subsonic jet case with those of the supersonic jet case, which were not measured in the experiment.

  4. A reassessment of Galileo radiation exposures in the Jupiter magnetosphere.

    PubMed

    Atwell, William; Townsend, Lawrence; Miller, Thomas; Campbell, Christina

    2005-01-01

    Earlier particle experiments in the 1970s on Pioneer-10 and -11 and Voyager-1 and -2 provided Jupiter flyby particle data, which were used by Divine and Garrett to develop the first Jupiter trapped radiation environment model. This model was used to establish a baseline radiation effects design limit for the Galileo onboard electronics. Recently, Garrett et al. have developed an updated Galileo Interim Radiation Environment (GIRE) model based on Galileo electron data. In this paper, we have used the GIRE model to reassess the computed radiation exposures and dose effects for Galileo. The 34-orbit 'as flown' Galileo trajectory data and the updated GIRE model were used to compute the electron and proton spectra for each of the 34 orbits. The total ionisation doses of electrons and protons have been computed based on a parametric shielding configuration, and these results are compared with previously published results.

  5. Computational mechanisms underlying cortical responses to the affordance properties of visual scenes

    PubMed Central

    Epstein, Russell A.

    2018-01-01

    Biologically inspired deep convolutional neural networks (CNNs), trained for computer vision tasks, have been found to predict cortical responses with remarkable accuracy. However, the internal operations of these models remain poorly understood, and the factors that account for their success are unknown. Here we develop a set of techniques for using CNNs to gain insights into the computational mechanisms underlying cortical responses. We focused on responses in the occipital place area (OPA), a scene-selective region of dorsal occipitoparietal cortex. In a previous study, we showed that fMRI activation patterns in the OPA contain information about the navigational affordances of scenes; that is, information about where one can and cannot move within the immediate environment. We hypothesized that this affordance information could be extracted using a set of purely feedforward computations. To test this idea, we examined a deep CNN with a feedforward architecture that had been previously trained for scene classification. We found that responses in the CNN to scene images were highly predictive of fMRI responses in the OPA. Moreover, the CNN accounted for the portion of OPA variance relating to the navigational affordances of scenes. The CNN could thus serve as an image-computable candidate model of affordance-related responses in the OPA. We then ran a series of in silico experiments on this model to gain insights into its internal operations. These analyses showed that the computation of affordance-related features relied heavily on visual information at high spatial frequencies and cardinal orientations, both of which have previously been identified as low-level stimulus preferences of scene-selective visual cortex. These computations also exhibited a strong preference for information in the lower visual field, which is consistent with known retinotopic biases in the OPA. Visualizations of feature selectivity within the CNN suggested that affordance-based responses encoded features that define the layout of the spatial environment, such as boundary-defining junctions and large extended surfaces. Together, these results map the sensory functions of the OPA onto a fully quantitative model that provides insights into its visual computations. More broadly, they advance integrative techniques for understanding visual cortex across multiple levels of analysis: from the identification of cortical sensory functions to the modeling of their underlying algorithms. PMID:29684011
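
    The encoding-model logic described above (CNN layer activations regressed onto voxel responses, then evaluated on held-out images) can be sketched in a few lines. This is an illustrative outline only, not the authors' code: the CNN features, voxel responses, and regularisation strength below are synthetic placeholders.

        # Sketch: ridge-regression encoding model from CNN features to fMRI responses.
        import numpy as np
        from sklearn.linear_model import Ridge
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n_images, n_units, n_voxels = 200, 512, 100
        cnn_features = rng.normal(size=(n_images, n_units))      # stand-in CNN layer activations
        true_weights = rng.normal(size=(n_units, n_voxels))
        voxel_responses = cnn_features @ true_weights + rng.normal(scale=5.0, size=(n_images, n_voxels))

        X_train, X_test, y_train, y_test = train_test_split(cnn_features, voxel_responses, random_state=0)
        model = Ridge(alpha=10.0).fit(X_train, y_train)
        pred = model.predict(X_test)

        # Score each voxel by the correlation between predicted and measured responses.
        r = [np.corrcoef(pred[:, v], y_test[:, v])[0, 1] for v in range(n_voxels)]
        print(f"median prediction accuracy (r): {np.median(r):.2f}")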

  6. HGIMDA: Heterogeneous graph inference for miRNA-disease association prediction

    PubMed Central

    Zhang, Xu; You, Zhu-Hong; Huang, Yu-An; Yan, Gui-Ying

    2016-01-01

    Recently, microRNAs (miRNAs) have drawn more and more attention because accumulating experimental studies have indicated that miRNAs could play critical roles in multiple biological processes as well as the development and progression of human complex diseases. Using the huge number of known heterogeneous biological datasets to predict potential associations between miRNAs and diseases is an important topic in the field of biology, medicine, and bioinformatics. In this study, considering the limitations in the previous computational methods, we developed the computational model of Heterogeneous Graph Inference for MiRNA-Disease Association prediction (HGIMDA) to uncover potential miRNA-disease associations by integrating miRNA functional similarity, disease semantic similarity, Gaussian interaction profile kernel similarity, and experimentally verified miRNA-disease associations into a heterogeneous graph. HGIMDA obtained AUCs of 0.8781 and 0.8077 based on global and local leave-one-out cross validation, respectively. Furthermore, HGIMDA was applied to three important human cancers for performance evaluation. As a result, 90% (Colon Neoplasms), 88% (Esophageal Neoplasms) and 88% (Kidney Neoplasms) of the top 50 predicted miRNAs are confirmed by recent experimental reports. Furthermore, HGIMDA could be effectively applied to new diseases and new miRNAs without any known associations, which overcomes the important limitations of many previous computational models. PMID:27533456
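
    One ingredient named above, the Gaussian interaction profile (GIP) kernel similarity, can be illustrated with a small sketch. The toy association matrix and bandwidth parameter below are invented for illustration and are not the HGIMDA data or code.

        # Sketch: GIP kernel similarity between miRNAs from a binary association matrix.
        import numpy as np

        def gip_kernel(A, gamma_prime=1.0):
            """Rows of the binary association matrix A are interaction profiles."""
            norms = np.sum(A**2, axis=1)
            gamma = gamma_prime / norms.mean()                  # bandwidth normalised by mean profile norm
            sq_dist = norms[:, None] + norms[None, :] - 2 * A @ A.T
            return np.exp(-gamma * sq_dist)

        A = np.array([[1, 0, 1, 0],    # toy miRNA-disease associations (miRNAs x diseases)
                      [1, 1, 0, 0],
                      [0, 0, 1, 1]])
        print(np.round(gip_kernel(A), 3))   # miRNA-miRNA GIP similarity matrix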

  7. HGIMDA: Heterogeneous graph inference for miRNA-disease association prediction.

    PubMed

    Chen, Xing; Yan, Chenggang Clarence; Zhang, Xu; You, Zhu-Hong; Huang, Yu-An; Yan, Gui-Ying

    2016-10-04

    Recently, microRNAs (miRNAs) have drawn more and more attention because accumulating experimental studies have indicated that miRNAs could play critical roles in multiple biological processes as well as the development and progression of human complex diseases. Using the huge number of known heterogeneous biological datasets to predict potential associations between miRNAs and diseases is an important topic in the field of biology, medicine, and bioinformatics. In this study, considering the limitations in the previous computational methods, we developed the computational model of Heterogeneous Graph Inference for MiRNA-Disease Association prediction (HGIMDA) to uncover potential miRNA-disease associations by integrating miRNA functional similarity, disease semantic similarity, Gaussian interaction profile kernel similarity, and experimentally verified miRNA-disease associations into a heterogeneous graph. HGIMDA obtained AUCs of 0.8781 and 0.8077 based on global and local leave-one-out cross validation, respectively. Furthermore, HGIMDA was applied to three important human cancers for performance evaluation. As a result, 90% (Colon Neoplasms), 88% (Esophageal Neoplasms) and 88% (Kidney Neoplasms) of the top 50 predicted miRNAs are confirmed by recent experimental reports. Furthermore, HGIMDA could be effectively applied to new diseases and new miRNAs without any known associations, which overcomes the important limitations of many previous computational models.

  8. The influence of multiple trials and computer-mediated communication on collaborative and individual semantic recall.

    PubMed

    Hinds, Joanne M; Payne, Stephen J

    2018-04-01

    Collaborative inhibition is a phenomenon where collaborating groups experience a decrement in recall when interacting with others. Despite this, collaboration has been found to improve subsequent individual recall. We explore these effects in semantic recall, which is seldom studied in collaborative retrieval. We also examine "parallel CMC", a synchronous form of computer-mediated communication that has previously been found to improve collaborative recall [Hinds, J. M., & Payne, S. J. (2016). Collaborative inhibition and semantic recall: Improving collaboration through computer-mediated communication. Applied Cognitive Psychology, 30(4), 554-565]. Sixty-three triads completed a semantic recall task, which involved generating words beginning with "PO" or "HE" across three recall trials, in one of three retrieval conditions: Individual-Individual-Individual (III), Face-to-Face-Face-to-Face-Individual (FFI) and Parallel-Parallel-Individual (PPI). Collaborative inhibition was present across both collaborative conditions. Individual recall in Recall 3 was higher when participants had previously collaborated in comparison to recalling three times individually. There was no difference between face-to-face and parallel CMC recall; however, subsidiary analyses of instance repetitions and subjective organisation highlighted differences in group members' approaches to recall in terms of organisation and attention to others' contributions. We discuss the implications of these findings in relation to retrieval strategy disruption.

  9. Dopaminergic Balance between Reward Maximization and Policy Complexity

    PubMed Central

    Parush, Naama; Tishby, Naftali; Bergman, Hagai

    2011-01-01

    Previous reinforcement-learning models of the basal ganglia network have highlighted the role of dopamine in encoding the mismatch between prediction and reality. Far less attention has been paid to the computational goals and algorithms of the main-axis (actor). Here, we construct a top-down model of the basal ganglia with emphasis on the role of dopamine as both a reinforcement learning signal and a pseudo-temperature signal controlling the general level of basal ganglia excitability and motor vigilance of the acting agent. We argue that the basal ganglia endow the thalamic-cortical networks with the optimal dynamic tradeoff between two constraints: minimizing the policy complexity (cost) and maximizing the expected future reward (gain). We show that this multi-dimensional optimization process results in an experience-modulated version of the softmax behavioral policy. Thus, as in classical softmax behavioral policies, the probabilities of actions are determined by their estimated values and the pseudo-temperature, but in addition they also vary according to the frequency of previous choices of these actions. We conclude that the computational goal of the basal ganglia is not to maximize cumulative (positive and negative) reward. Rather, the basal ganglia aim at optimization of independent gain and cost functions. Unlike previously suggested single-variable maximization processes, this multi-dimensional optimization process leads naturally to a softmax-like behavioral policy. We suggest that beyond its role in the modulation of the efficacy of the cortico-striatal synapses, dopamine directly affects striatal excitability and thus provides a pseudo-temperature signal that modulates the tradeoff between gain and cost. The resulting experience- and dopamine-modulated softmax policy can then serve as a theoretical framework to account for the broad range of behaviors and clinical states governed by the basal ganglia and dopamine systems. PMID:21603228
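
    The experience-modulated policy described above builds on the standard softmax rule, which the following minimal sketch illustrates; the value estimates and temperatures are illustrative, and the choice-frequency term of the authors' model is omitted.

        # Sketch: softmax action selection with a pseudo-temperature parameter.
        import numpy as np

        def softmax_policy(q_values, temperature=1.0):
            """Action probabilities from estimated values and a pseudo-temperature."""
            z = np.asarray(q_values, dtype=float) / temperature
            z -= z.max()                      # subtract max for numerical stability
            p = np.exp(z)
            return p / p.sum()

        q = [1.0, 0.5, 0.2]
        for t in (0.1, 1.0, 10.0):            # low temperature -> greedy, high -> exploratory
            print(t, np.round(softmax_policy(q, t), 3))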

  10. Understanding titanium-catalysed radical-radical reactions: a DFT study unravels the complex kinetics of ketone-nitrile couplings.

    PubMed

    Streuff, Jan; Himmel, Daniel; Younas, Sara L

    2018-04-03

    The computational investigation of a titanium-catalysed reductive radical-radical coupling is reported. The results match the conclusions from an earlier experimental study and enable a further interpretation of the previously observed complex reaction kinetics. Furthermore, the interplay between neutral and cationic reaction pathways in titanium(iii)-catalysed reactions is investigated for the first time. The results show that hydrochloride additives and reaction byproducts play an important role in the respective equilibria. A full reaction profile is assembled and the computed activation barrier is found to be in reasonable agreement with the experiment. The conclusions are of fundamental importance to the field of low-valent titanium catalysis and the understanding of related catalytic radical-radical coupling reactions.

  11. Hybrid glowworm swarm optimization for task scheduling in the cloud environment

    NASA Astrophysics Data System (ADS)

    Zhou, Jing; Dong, Shoubin

    2018-06-01

    In recent years many heuristic algorithms have been proposed to solve task scheduling problems in the cloud environment owing to their optimization capability. This article proposes a hybrid glowworm swarm optimization (HGSO) based on glowworm swarm optimization (GSO), which uses a technique of evolutionary computation, a strategy of quantum behaviour based on the principle of neighbourhood, offspring production and random walk, to achieve more efficient scheduling with reasonable scheduling costs. The proposed HGSO reduces the redundant computation and the dependence on the initialization of GSO, accelerates the convergence and more easily escapes from local optima. The conducted experiments and statistical analysis showed that in most cases the proposed HGSO algorithm outperformed previous heuristic algorithms to deal with independent tasks.
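
    For orientation, a stripped-down iteration of plain GSO (luciferin update plus movement toward a brighter neighbour) might look like the sketch below; the quantum-behaviour strategy, offspring production, and random walk that distinguish HGSO are omitted, and all parameter values are illustrative.

        # Sketch: basic glowworm swarm optimization on a toy continuous objective.
        import numpy as np

        rng = np.random.default_rng(1)

        def objective(x):                      # toy fitness: maximum at the origin
            return -np.sum(x**2, axis=-1)

        n, dim, rho, gamma, step, radius = 30, 2, 0.4, 0.6, 0.03, 1.5
        pos = rng.uniform(-3, 3, size=(n, dim))
        luciferin = np.full(n, 5.0)

        for _ in range(100):
            luciferin = (1 - rho) * luciferin + gamma * objective(pos)   # luciferin update
            new_pos = pos.copy()
            for i in range(n):
                d = np.linalg.norm(pos - pos[i], axis=1)
                nbrs = np.where((d < radius) & (luciferin > luciferin[i]))[0]
                if nbrs.size:                                            # move toward a brighter neighbour
                    probs = luciferin[nbrs] - luciferin[i]
                    j = rng.choice(nbrs, p=probs / probs.sum())
                    new_pos[i] += step * (pos[j] - pos[i]) / d[j]
                # (adaptive neighbourhood range of full GSO omitted for brevity)
            pos = new_pos

        print("best solution found:", pos[np.argmax(objective(pos))])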

  12. Experimental Realization of a Quantum Support Vector Machine

    NASA Astrophysics Data System (ADS)

    Li, Zhaokai; Liu, Xiaomei; Xu, Nanyang; Du, Jiangfeng

    2015-04-01

    The fundamental principle of artificial intelligence is the ability of machines to learn from previous experience and do future work accordingly. In the age of big data, classical learning machines often require huge computational resources in many practical cases. Quantum machine learning algorithms, on the other hand, could be exponentially faster than their classical counterparts by utilizing quantum parallelism. Here, we demonstrate a quantum machine learning algorithm to implement handwriting recognition on a four-qubit NMR test bench. The quantum machine learns standard character fonts and then recognizes handwritten characters from a set with two candidates. Because of the widespread importance of artificial intelligence and its tremendous consumption of computational resources, quantum speedup would be extremely attractive against the challenges of big data.

  13. Test Facilities and Experience on Space Nuclear System Developments at the Kurchatov Institute

    NASA Astrophysics Data System (ADS)

    Ponomarev-Stepnoi, Nikolai N.; Garin, Vladimir P.; Glushkov, Evgeny S.; Kompaniets, George V.; Kukharkin, Nikolai E.; Madeev, Vicktor G.; Papin, Vladimir K.; Polyakov, Dmitry N.; Stepennov, Boris S.; Tchuniyaev, Yevgeny I.; Tikhonov, Lev Ya.; Uksusov, Yevgeny I.

    2004-02-01

    The complexity of space fission systems and the strict requirements on minimizing their weight and dimensions, along with the wish to decrease development expenditures, demand experimental work whose results shall be used in designing, safety substantiation, and licensing procedures. Experimental facilities are intended to solve the following tasks: obtainment of benchmark data for computer code validations, substantiation of design solutions when computational efforts are too expensive, quality control in a production process, and "iron" substantiation of criticality safety design solutions for licensing and public relations. The NARCISS and ISKRA critical facilities and the unique ORM facility on shielding investigations at the operating OR nuclear research reactor were created in the Kurchatov Institute to solve the mentioned tasks. The range of activities performed at these facilities within the implementation of the previous Russian nuclear power system programs is briefly described in the paper. This experience shall be analyzed in terms of the methodological approach to development of future space nuclear systems (this analysis is beyond this paper). Because of the availability of these facilities for experiments, a brief description of their critical assemblies and characteristics is given in this paper.

  14. An Integrative and Collaborative Approach to Creating a Diverse and Computationally Competent Geoscience Workforce

    NASA Astrophysics Data System (ADS)

    Moore, S. L.; Kar, A.; Gomez, R.

    2015-12-01

    A partnership between Fort Valley State University (FVSU), the Jackson School of Geosciences at The University of Texas (UT) at Austin, and the Texas Advanced Computing Center (TACC) is engaging computational geoscience faculty and researchers with academically talented underrepresented minority (URM) students, training them to solve grand challenges. These next generation computational geoscientists are being trained to solve some of the world's most challenging geoscience grand challenges requiring data intensive large scale modeling and simulation on high performance computers. UT Austin's geoscience outreach program GeoFORCE, recently awarded the Presidential Award in Excellence in Science, Mathematics and Engineering Mentoring, contributes to the collaborative best practices in engaging researchers with URM students. Collaborative efforts over the past decade are providing data demonstrating that integrative pipeline programs with mentoring and paid internship opportunities, multi-year scholarships, computational training, and communication skills development are having an impact on URMs developing middle skills for geoscience careers. Since 1997, the Cooperative Developmental Energy Program at FVSU and its collaborating universities have graduated 87 engineers, 33 geoscientists, and eight health physicists. Recruited as early as high school, students enroll for three years at FVSU majoring in mathematics, chemistry or biology, and then transfer to UT Austin or other partner institutions to complete a second STEM degree, including geosciences. A partnership with the Integrative Computational Education and Research Traineeship (ICERT), a National Science Foundation (NSF) Research Experience for Undergraduates (REU) Site at TACC provides students with a 10-week summer research experience at UT Austin. Mentored by TACC researchers, students with no previous background in computational science learn to use some of the world's most powerful high performance computing resources to address a grand geosciences problem. Students increase their ability to understand and explain the societal impact of their research and communicate the research to multidisciplinary and lay audiences via near-peer mentoring, poster presentations, and publication opportunities.

  15. Mesencephalic representations of recent experience influence decision making.

    PubMed

    Thompson, John A; Costabile, Jamie D; Felsen, Gidon

    2016-07-25

    Decisions are influenced by recent experience, but the neural basis for this phenomenon is not well understood. Here, we address this question in the context of action selection. We focused on activity in the pedunculopontine tegmental nucleus (PPTg), a mesencephalic region that provides input to several nuclei in the action selection network, in well-trained mice selecting actions based on sensory cues and recent trial history. We found that, at the time of action selection, the activity of many PPTg neurons reflected the action on the previous trial and its outcome, and the strength of this activity predicted the upcoming choice. Further, inactivating the PPTg predictably decreased the influence of recent experience on action selection. These findings suggest that PPTg input to downstream motor regions, where it can be integrated with other relevant information, provides a simple mechanism for incorporating recent experience into the computations underlying action selection.

  16. Generating Stimuli for Neuroscience Using PsychoPy.

    PubMed

    Peirce, Jonathan W

    2008-01-01

    PsychoPy is a software library written in Python, using OpenGL to generate very precise visual stimuli on standard personal computers. It is designed to allow the construction of as wide a variety of neuroscience experiments as possible, with the least effort. By writing scripts in standard Python syntax users can generate an enormous variety of visual and auditory stimuli and can interact with a wide range of external hardware (enabling its use in fMRI, EEG, MEG etc.). The structure of scripts is simple and intuitive. As a result, new experiments can be written very quickly, and trying to understand a previously written script is easy, even with minimal code comments. PsychoPy can also generate movies and image sequences to be used in demos or simulated neuroscience experiments. This paper describes the range of tools and stimuli that it provides and the environment in which experiments are conducted.
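
    A minimal PsychoPy script of the kind described above might look like the following; the window size, stimulus parameters, and presentation duration are arbitrary choices for illustration.

        # Sketch: present a drifting Gabor patch for 2 seconds with PsychoPy.
        from psychopy import visual, core

        win = visual.Window(size=(800, 600), color='grey', units='pix')
        gabor = visual.GratingStim(win, tex='sin', mask='gauss', size=256, sf=0.02)

        clock = core.Clock()
        while clock.getTime() < 2.0:
            gabor.phase += 0.01          # drift the grating a little each frame
            gabor.draw()
            win.flip()                   # render at the next screen refresh

        win.close()
        core.quit()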

  17. Confirmation bias in human reinforcement learning: Evidence from counterfactual feedback processing

    PubMed Central

    Lefebvre, Germain; Blakemore, Sarah-Jayne

    2017-01-01

    Previous studies suggest that factual learning, that is, learning from obtained outcomes, is biased, such that participants preferentially take into account positive, as compared to negative, prediction errors. However, whether or not the prediction error valence also affects counterfactual learning, that is, learning from forgone outcomes, is unknown. To address this question, we analysed the performance of two groups of participants on reinforcement learning tasks using a computational model that was adapted to test if prediction error valence influences learning. We carried out two experiments: in the factual learning experiment, participants learned from partial feedback (i.e., the outcome of the chosen option only); in the counterfactual learning experiment, participants learned from complete feedback information (i.e., the outcomes of both the chosen and unchosen option were displayed). In the factual learning experiment, we replicated previous findings of a valence-induced bias, whereby participants learned preferentially from positive, relative to negative, prediction errors. In contrast, for counterfactual learning, we found the opposite valence-induced bias: negative prediction errors were preferentially taken into account, relative to positive ones. When considering valence-induced bias in the context of both factual and counterfactual learning, it appears that people tend to preferentially take into account information that confirms their current choice. PMID:28800597
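
    The valence-dependent update that the adapted model tests can be sketched as a value-learning rule with separate learning rates for positive and negative prediction errors; the bandit task and parameter values below are illustrative, not the study's task or fitted parameters.

        # Sketch: asymmetric learning rates produce a valence-induced (optimistic) bias.
        import numpy as np

        def update(q, reward, alpha_pos=0.4, alpha_neg=0.1):
            delta = reward - q                       # prediction error
            alpha = alpha_pos if delta > 0 else alpha_neg
            return q + alpha * delta

        rng = np.random.default_rng(0)
        q = 0.0
        for _ in range(100):
            reward = rng.binomial(1, 0.6)            # option pays off 60% of the time
            q = update(q, reward)
        print(f"learned value with optimistic bias: {q:.2f}")   # tends to sit above 0.6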

  18. Confirmation bias in human reinforcement learning: Evidence from counterfactual feedback processing.

    PubMed

    Palminteri, Stefano; Lefebvre, Germain; Kilford, Emma J; Blakemore, Sarah-Jayne

    2017-08-01

    Previous studies suggest that factual learning, that is, learning from obtained outcomes, is biased, such that participants preferentially take into account positive, as compared to negative, prediction errors. However, whether or not the prediction error valence also affects counterfactual learning, that is, learning from forgone outcomes, is unknown. To address this question, we analysed the performance of two groups of participants on reinforcement learning tasks using a computational model that was adapted to test if prediction error valence influences learning. We carried out two experiments: in the factual learning experiment, participants learned from partial feedback (i.e., the outcome of the chosen option only); in the counterfactual learning experiment, participants learned from complete feedback information (i.e., the outcomes of both the chosen and unchosen option were displayed). In the factual learning experiment, we replicated previous findings of a valence-induced bias, whereby participants learned preferentially from positive, relative to negative, prediction errors. In contrast, for counterfactual learning, we found the opposite valence-induced bias: negative prediction errors were preferentially taken into account, relative to positive ones. When considering valence-induced bias in the context of both factual and counterfactual learning, it appears that people tend to preferentially take into account information that confirms their current choice.

  19. Extracting quasi-steady Lagrangian transport patterns from the ocean circulation: An application to the Gulf of Mexico.

    PubMed

    Duran, R; Beron-Vera, F J; Olascoaga, M J

    2018-03-26

    We construct a climatology of Lagrangian coherent structures (LCSs)-the concealed skeleton that shapes transport-with a twelve-year-long data-assimilative simulation of the sea-surface circulation in the Gulf of Mexico (GoM). Computed as time-mean Cauchy-Green strain tensorlines of the climatological velocity, the climatological LCSs (cLCSs) unveil recurrent Lagrangian circulation patterns. The cLCSs strongly constrain the ensemble-mean Lagrangian circulation of the instantaneous model velocity, showing that a climatological velocity can preserve meaningful transport information. The quasi-steady transport patterns revealed by the cLCSs agree well with aspects of the GoM circulation described in several previous observational and numerical studies. For example, the cLCSs identify regions of persistent isolation, and suggest that coastal regions previously identified as high-risk for pollution impact are regions of maximal attraction. We also show that cLCSs are remarkably accurate at identifying transport patterns observed during the Deepwater Horizon and Ixtoc oil spills, and during the Grand LAgrangian Deployment (GLAD) experiment. Thus it is shown that computing cLCSs is an efficient and meaningful way of synthesizing vast amounts of Lagrangian information. The cLCS method confirms previous GoM studies, and contributes to our understanding by revealing the persistent nature of the dynamics and kinematics treated therein.
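
    The Cauchy-Green computation underlying (c)LCS analysis can be sketched on a toy flow map: finite-difference the flow map over a grid, form C = F^T F, and keep its largest eigenvalue (the full method then extracts tensorlines of C). The analytic map below stands in for real trajectory data from an ocean model.

        # Sketch: Cauchy-Green tensor and FTLE field from a toy flow map on a grid.
        import numpy as np

        nx, ny, T = 200, 100, 1.0
        x = np.linspace(0, 2, nx)
        y = np.linspace(0, 1, ny)
        X, Y = np.meshgrid(x, y)

        # Toy flow map phi: initial positions (X, Y) -> final positions (PX, PY).
        PX = X + 0.3 * np.sin(np.pi * Y)
        PY = Y + 0.1 * np.sin(np.pi * X)

        dx, dy = x[1] - x[0], y[1] - y[0]
        F11 = np.gradient(PX, dx, axis=1); F12 = np.gradient(PX, dy, axis=0)
        F21 = np.gradient(PY, dx, axis=1); F22 = np.gradient(PY, dy, axis=0)

        # Largest eigenvalue of C = F^T F at every grid point (closed form for 2x2).
        C11 = F11**2 + F21**2
        C12 = F11 * F12 + F21 * F22
        C22 = F12**2 + F22**2
        tr, det = C11 + C22, C11 * C22 - C12**2
        lam_max = 0.5 * (tr + np.sqrt(np.maximum(tr**2 - 4 * det, 0)))

        ftle = np.log(np.sqrt(lam_max)) / T        # finite-time Lyapunov exponent field
        print(f"max FTLE on the toy map: {ftle.max():.3f}")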

  20. Experimental violation of multipartite Bell inequalities with trapped ions.

    PubMed

    Lanyon, B P; Zwerger, M; Jurcevic, P; Hempel, C; Dür, W; Briegel, H J; Blatt, R; Roos, C F

    2014-03-14

    We report on the experimental violation of multipartite Bell inequalities by entangled states of trapped ions. First, we consider resource states for measurement-based quantum computation of between 3 and 7 ions and show that all strongly violate a Bell-type inequality for graph states, where the criterion for violation is a sufficiently high fidelity. Second, we analyze Greenberger-Horne-Zeilinger states of up to 14 ions generated in a previous experiment using stronger Mermin-Klyshko inequalities, and show that in this case the violation of local realism increases exponentially with system size. These experiments represent a violation of multipartite Bell-type inequalities of deterministically prepared entangled states. In addition, the detection loophole is closed.

  1. A simple theory of back-surface-field /BSF/ solar cells

    NASA Technical Reports Server (NTRS)

    Von Roos, O.

    1979-01-01

    An earlier calculation of the I-V characteristics of solar cells contains a mistake. The current generated by light within the depletion layer is too large by a factor of 2. When this mistake is corrected, not only are all previous conclusions unchanged, but the agreement with experiment becomes better. Results are presented in graphical form of new computations which not only take account of the factor of 2, but also include more recent data on material parameters.

  2. Computation of transonic viscous-inviscid interacting flow

    NASA Technical Reports Server (NTRS)

    Whitfield, D. L.; Thomas, J. L.; Jameson, A.; Schmidt, W.

    1983-01-01

    Transonic viscous-inviscid interaction is considered using the Euler and inverse compressible turbulent boundary-layer equations. Certain improvements in the inverse boundary-layer method are mentioned, along with experiences in using various Runge-Kutta schemes to solve the Euler equations. Numerical conditions imposed on the Euler equations at a surface for viscous-inviscid interaction using the method of equivalent sources are developed, and numerical solutions are presented and compared with experimental data to illustrate essential points. Previously announced in STAR N83-17829

  3. Gastrointestinal bleeding detection in wireless capsule endoscopy images using handcrafted and CNN features.

    PubMed

    Xiao Jia; Meng, Max Q-H

    2017-07-01

    Gastrointestinal (GI) bleeding detection plays an essential role in wireless capsule endoscopy (WCE) examination. In this paper, we present a new approach for WCE bleeding detection that combines handcrafted (HC) features and convolutional neural network (CNN) features. Compared with our previous work, a smaller-scale CNN architecture is constructed to lower the computational cost. In experiments, we show that the proposed strategy is highly capable when training data is limited, and yields comparable or better results than the latest methods.
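
    The feature-fusion idea (handcrafted descriptors concatenated with CNN features feeding a classifier) can be sketched as follows; the toy colour histogram, the stand-in CNN embeddings, and the synthetic frames and labels are placeholders, not the paper's feature extractors or WCE data.

        # Sketch: concatenate handcrafted and CNN features, then train a simple classifier.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_frames = 300

        def handcrafted_features(frame):
            """Toy HC descriptor: a coarse histogram of the red/green ratio."""
            ratio = frame[..., 0] / (frame[..., 1] + 1e-6)
            hist, _ = np.histogram(ratio, bins=16, range=(0, 4), density=True)
            return hist

        frames = rng.uniform(size=(n_frames, 32, 32, 3))          # stand-in WCE frames
        labels = rng.integers(0, 2, size=n_frames)                # 1 = bleeding (synthetic)
        cnn_feats = rng.normal(size=(n_frames, 64))               # stand-in CNN embeddings

        hc_feats = np.array([handcrafted_features(f) for f in frames])
        fused = np.hstack([hc_feats, cnn_feats])                  # simple concatenation fusion

        clf = LogisticRegression(max_iter=1000)
        print("CV accuracy on synthetic data:", cross_val_score(clf, fused, labels, cv=5).mean())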

  4. On the three-dimensional instability of strained vortices

    NASA Technical Reports Server (NTRS)

    Waleffe, Fabian

    1990-01-01

    The three-dimensional (3-D) instability of a two-dimensional (2-D) flow with elliptical streamlines has been proposed as a generic mechanism for the breakdown of many 2-D flows. A physical interpretation for the mechanism is presented together with an analytical treatment of the problem. It is shown that the stability of an elliptical flow is governed by an Ince equation. An analytical representation for a localized solution is given and establishes a direct link with previous computations and experiments.

  5. Casimir force in the Gödel space-time and its possible induced cosmological inhomogeneity

    NASA Astrophysics Data System (ADS)

    Khodabakhshi, Sh.; Shojai, A.

    2017-07-01

    The Casimir force between two parallel plates in the Gödel universe is computed for a scalar field at finite temperature. It is observed that when the plates' separation is comparable with the scale given by the rotation of the space-time, the force becomes repulsive and then approaches zero. Since it has been shown previously that the universe may experience a Gödel phase for a small period of time, the induced inhomogeneities from the Casimir force are also studied.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cembranos, Jose A. R.; Diaz-Cruz, J. Lorenzo; Prado, Lilian

    Dark Matter direct detection experiments are able to exclude interesting parameter space regions of particle models which predict an important amount of thermal relics. We use recent data to constrain the branon model and to compute the region that is favored by CDMS measurements. Within this work, we also update present collider constraints with new studies coming from the LHC. Despite the present low luminosity, it is remarkable that for heavy branons, CMS and ATLAS measurements are already more constraining than previous analyses performed with TEVATRON and LEP data.

  7. Progress on the Fabric for Frontier Experiments Project at Fermilab

    NASA Astrophysics Data System (ADS)

    Box, Dennis; Boyd, Joseph; Dykstra, Dave; Garzoglio, Gabriele; Herner, Kenneth; Kirby, Michael; Kreymer, Arthur; Levshina, Tanya; Mhashilkar, Parag; Sharma, Neha

    2015-12-01

    The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access and collaboration within experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including new job submission services, software and reference data distribution through CVMFS repositories, flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  9. Numerical studies of the reversed-field pinch at high aspect ratio

    NASA Astrophysics Data System (ADS)

    Sätherblom, H.-E.; Drake, J. R.

    1998-10-01

    The reversed field pinch (RFP) configuration at an aspect ratio of 8.8 is studied numerically by means of the three-dimensional magnetohydrodynamic code DEBS [D. D. Schnack et al., J. Comput. Phys. 70, 330 (1987)]. This aspect ratio is equal to that of the Extrap T1 experiment [S. Mazur et al., Nucl. Fusion 34, 427 (1994)]. A numerical study of an RFP with this level of aspect ratio requires extensive computational resources and has hitherto not been performed. The results are compared with previous studies [Y. L. Ho et al., Phys. Plasmas 2, 3407 (1995)] of lower aspect ratio RFP configurations. In particular, an evaluation of the extrapolation to the aspect ratio of 8.8 made in this previous study shows that the extrapolation of the spectral spread, as well as most of the other findings, are confirmed. An important exception, however, is the magnetic diffusion coefficient, which is found to decrease with aspect ratio. Furthermore, an aspect ratio dependence of the magnetic energy and of the helicity of the RFP is found.

  10. Pseudoracemic amino acid complexes: blind predictions for flexible two-component crystals.

    PubMed

    Görbitz, Carl Henrik; Dalhus, Bjørn; Day, Graeme M

    2010-08-14

    Ab initio prediction of the crystal packing in complexes between two flexible molecules is a particularly challenging computational chemistry problem. In this work we present results of single crystal structure determinations as well as theoretical predictions for three 1:1 complexes between hydrophobic L- and D-amino acids (pseudoracemates), known from previous crystallographic work to form structures with one of two alternative hydrogen bonding arrangements. These are accurately reproduced in the theoretical predictions together with a series of patterns that have never been observed experimentally. In this bewildering forest of potential polymorphs, hydrogen bonding arrangements and molecular conformations, the theoretical predictions succeeded, for all three complexes, in finding the correct hydrogen bonding pattern. For two of the complexes, the calculations also reproduce the exact space group and side chain orientations in the best ranked predicted structure. This includes one complex for which the observed crystal packing clearly contradicted previous experience based on experimental data for a substantial number of related amino acid complexes. The results highlight the significant recent advances that have been made in computational methods for crystal structure prediction.

  11. Comparing DNS and Experiments of Subcritical Flow Past an Isolated Surface Roughness Element

    NASA Astrophysics Data System (ADS)

    Doolittle, Charles; Goldstein, David

    2009-11-01

    Results are presented from computational and experimental studies of subcritical roughness within a Blasius boundary layer. This work stems from discrepancies presented by Stephani and Goldstein (AIAA Paper 2009-585) where DNS results did not agree with hot-wire measurements. The near wake regions of cylindrical surface roughness elements corresponding to roughness-based Reynolds numbers Rek of about 202 are of specific concern. Laser-Doppler anemometry and flow visualization in water, as well as the same spectral DNS code used by Stephani and Goldstein are used to obtain both quantitative and qualitative comparisons with previous results. Conclusions regarding previous studies will be presented alongside discussion of current work including grid resolution studies and an examination of vorticity dynamics.

  12. Collecting and Analyzing Patient Experiences of Health Care From Social Media.

    PubMed

    Rastegar-Mojarad, Majid; Ye, Zhan; Wall, Daniel; Murali, Narayana; Lin, Simon

    2015-07-02

    Social Media, such as Yelp, provides rich information of consumer experience. Previous studies suggest that Yelp can serve as a new source to study patient experience. However, the lack of a corpus of patient reviews causes a major bottleneck for applying computational techniques. The objective of this study is to create a corpus of patient experience (COPE) and report descriptive statistics to characterize COPE. Yelp reviews about health care-related businesses were extracted from the Yelp Academic Dataset. Natural language processing (NLP) tools were used to split reviews into sentences, extract noun phrases and adjectives from each sentence, and generate parse trees and dependency trees for each sentence. Sentiment analysis techniques and Hadoop were used to calculate a sentiment score of each sentence and for parallel processing, respectively. COPE contains 79,173 sentences from 6914 patient reviews of 985 health care facilities near 30 universities in the United States. We found that patients wrote longer reviews when they rated the facility poorly (1 or 2 stars). We demonstrated that the computed sentiment scores correlated well with consumer-generated ratings. A consumer vocabulary to describe their health care experience was constructed by a statistical analysis of word counts and co-occurrences in COPE. A corpus called COPE was built as an initial step to utilize social media to understand patient experiences at health care facilities. The corpus is available to download and COPE can be used in future studies to extract knowledge of patients' experiences from their perspectives. Such information can subsequently inform and provide opportunity to improve the quality of health care.
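
    A minimal sketch of the kind of pipeline described (sentence splitting followed by per-sentence sentiment scoring) is shown below using NLTK's VADER analyser; the review text is invented, and the noun-phrase extraction, parsing, and Hadoop-based parallelisation used in the study are omitted.

        # Sketch: split a review into sentences and score each with VADER sentiment.
        import nltk
        from nltk.tokenize import sent_tokenize
        from nltk.sentiment.vader import SentimentIntensityAnalyzer

        nltk.download('punkt', quiet=True)          # tokenizer models
        nltk.download('vader_lexicon', quiet=True)  # sentiment lexicon

        review = ("The nurses were wonderful and attentive. "
                  "However, I waited three hours past my appointment time.")

        sia = SentimentIntensityAnalyzer()
        for sentence in sent_tokenize(review):
            score = sia.polarity_scores(sentence)['compound']   # -1 (negative) .. +1 (positive)
            print(f"{score:+.2f}  {sentence}")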

  13. Towards Reproducibility in Computational Hydrology

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-04-01

    Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, and evolve (or reject) hypotheses and models of how environmental systems function, and move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produce published results are not regularly made available, and even if they are made available, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al. (2016) [1], we argue that a cultural change is required in the computational hydrological community, in order to advance and make more robust the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology, which is relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as the example application area, we believe that our conclusions are of value to the wider environmental and geoscience community as far as the use of code and models for scientific advancement is concerned. References: [1] Hutton, C., T. Wagener, J. Freer, D. Han, C. Duffy, and B. Arheimer (2016), Most computational hydrology is not reproducible, so is it really science?, Water Resour. Res., 52, 7548-7555, doi:10.1002/2016WR019285. [2] Ceola, S., et al. (2015), Virtual laboratories: New opportunities for collaborative water science, Hydrol. Earth Syst. Sci. Discuss., 11(12), 13443-13478, doi:10.5194/hessd-11-13443-2014.

  14. Direct Numerical Simulation of an Airfoil with Sand Grain Roughness on the Leading Edge

    NASA Technical Reports Server (NTRS)

    Ribeiro, Andre F. P.; Casalino, Damiano; Fares, Ehab; Choudhari, Meelan

    2016-01-01

    As part of a computational study of acoustic radiation due to the passage of turbulent boundary layer eddies over the trailing edge of an airfoil, the Lattice-Boltzmann method is used to perform direct numerical simulations of compressible, low Mach number flow past an NACA 0012 airfoil at zero degrees angle of attack. The chord Reynolds number of approximately 0.657 million models one of the test conditions from a previous experiment by Brooks, Pope, and Marcolini at NASA Langley Research Center. A unique feature of these simulations involves direct modeling of the sand grain roughness on the leading edge, which was used in the abovementioned experiment to trip the boundary layer to fully turbulent flow. This report documents the findings of preliminary, proof-of-concept simulations based on a narrow spanwise domain and a limited time interval. The inclusion of fully-resolved leading edge roughness in this simulation leads to significantly earlier transition than that in the absence of any roughness. The simulation data is used in conjunction with both the Ffowcs Williams-Hawkings acoustic analogy and a semi-analytical model by Roger and Moreau to predict the farfield noise. The encouraging agreement between the computed noise spectrum and that measured in the experiment indicates the potential payoff from a full-fledged numerical investigation based on the current approach. Analysis of the computed data is used to identify the required improvements to the preliminary simulations described herein.

  15. Performance monitoring for brain-computer-interface actions.

    PubMed

    Schurger, Aaron; Gale, Steven; Gozel, Olivia; Blanke, Olaf

    2017-02-01

    When presented with a difficult perceptual decision, human observers are able to make metacognitive judgements of subjective certainty. Such judgements can be made independently of and prior to any overt response to a sensory stimulus, presumably via internal monitoring. Retrospective judgements about one's own task performance, on the other hand, require first that the subject perform a task and thus could potentially be made based on motor processes, proprioceptive, and other sensory feedback rather than internal monitoring. With this dichotomy in mind, we set out to study performance monitoring using a brain-computer interface (BCI), with which subjects could voluntarily perform an action - moving a cursor on a computer screen - without any movement of the body, and thus without somatosensory feedback. Real-time visual feedback was available to subjects during training, but not during the experiment where the true final position of the cursor was only revealed after the subject had estimated where s/he thought it had ended up after 6s of BCI-based cursor control. During the first half of the experiment subjects based their assessments primarily on the prior probability of the end position of the cursor on previous trials. However, during the second half of the experiment subjects' judgements moved significantly closer to the true end position of the cursor, and away from the prior. This suggests that subjects can monitor task performance when the task is performed without overt movement of the body. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Camouflage, detection and identification of moving targets

    PubMed Central

    Hall, Joanna R.; Cuthill, Innes C.; Baddeley, Roland; Shohet, Adam J.; Scott-Samuel, Nicholas E.

    2013-01-01

    Nearly all research on camouflage has investigated its effectiveness for concealing stationary objects. However, animals have to move, and patterns that only work when the subject is static will heavily constrain behaviour. We investigated the effects of different camouflages on the three stages of predation—detection, identification and capture—in a computer-based task with humans. An initial experiment tested seven camouflage strategies on static stimuli. In line with previous literature, background-matching and disruptive patterns were found to be most successful. Experiment 2 showed that if stimuli move, an isolated moving object on a stationary background cannot avoid detection or capture regardless of the type of camouflage. Experiment 3 used an identification task and showed that while camouflage is unable to slow detection or capture, camouflaged targets are harder to identify than uncamouflaged targets when similar background objects are present. The specific details of the camouflage patterns have little impact on this effect. If one has to move, camouflage cannot impede detection; but if one is surrounded by similar targets (e.g. other animals in a herd, or moving background distractors), then camouflage can slow identification. Despite previous assumptions, motion does not entirely ‘break’ camouflage. PMID:23486439

  17. Camouflage, detection and identification of moving targets.

    PubMed

    Hall, Joanna R; Cuthill, Innes C; Baddeley, Roland; Shohet, Adam J; Scott-Samuel, Nicholas E

    2013-05-07

    Nearly all research on camouflage has investigated its effectiveness for concealing stationary objects. However, animals have to move, and patterns that only work when the subject is static will heavily constrain behaviour. We investigated the effects of different camouflages on the three stages of predation-detection, identification and capture-in a computer-based task with humans. An initial experiment tested seven camouflage strategies on static stimuli. In line with previous literature, background-matching and disruptive patterns were found to be most successful. Experiment 2 showed that if stimuli move, an isolated moving object on a stationary background cannot avoid detection or capture regardless of the type of camouflage. Experiment 3 used an identification task and showed that while camouflage is unable to slow detection or capture, camouflaged targets are harder to identify than uncamouflaged targets when similar background objects are present. The specific details of the camouflage patterns have little impact on this effect. If one has to move, camouflage cannot impede detection; but if one is surrounded by similar targets (e.g. other animals in a herd, or moving background distractors), then camouflage can slow identification. Despite previous assumptions, motion does not entirely 'break' camouflage.

  18. Participation in Team Sports Can Eliminate the Effect of Social Loafing.

    PubMed

    Czyż, Stanisław H; Szmajke, Andrzej; Kruger, Ankebé; Kübler, Magdalena

    2016-12-01

    The effect known as the Ringelmann effect states that as group size increases, individual behavior may become less productive. If this decrease in productivity in groups is attributed to a decrement in individual motivation, it is called social loafing. We tested hypotheses that the collectivism associated with participation in team sports would reduce the level of social loafing compared to people who were not involved in team sports. In one experiment, participants (n = 72; M age = 21.7 years, SD = 2.0) had to pull a rope individually and collectively. Groups of two, three, four, and six persons were formed from among individuals with no previous sports experience, and of those who had engaged in individual and team sports. For each team, the sum of the individual achievements of its members was computed. This sum served as the anticipated result (expected value). The expected values were later compared to the actual achievements, i.e., the value achieved by the whole team. The results of the study suggested that previous experience in collective (team) sports eliminated the effect of social loafing. © The Author(s) 2016.
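
    The comparison described above reduces to simple arithmetic: the sum of individual pulls serves as the expected team value, and the shortfall of the actual team pull relative to it indicates loafing. The numbers in the sketch below are invented, not the study's measurements.

        # Sketch: expected versus actual team performance as a loafing index.
        individual_pulls = [52.0, 48.5, 55.0]          # kgf, measured one person at a time
        team_pull = 140.0                              # kgf, same three people pulling together

        expected = sum(individual_pulls)
        loafing = 1 - team_pull / expected             # fraction of expected output lost
        print(f"expected {expected:.1f}, actual {team_pull:.1f}, loss {loafing:.0%}")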

  19. Principal pitch of frequency-modulated tones with asymmetrical modulation waveform: a comparison of models.

    PubMed

    Etchemendy, Pablo E; Eguia, Manuel C; Mesz, Bruno

    2014-03-01

    In this work, the overall perceived pitch (principal pitch) of pure tones modulated in frequency with an asymmetric waveform is studied. The dependence of the principal pitch on the degree of asymmetric modulation was obtained from a psychophysical experiment. The modulation waveform consisted of a flat portion of constant frequency and two linear segments forming a peak. Consistent with previous results, significant pitch shifts with respect to the time-averaged geometric mean were observed. The direction of the shifts was always toward the flat portion of the modulation. The results from the psychophysical experiment, along with those obtained from previously reported studies, were compared with the predictions of six models of pitch perception proposed in the literature. Even though no single model was able to predict accurately the perceived pitch for all experiments, there were two models that give robust predictions that are within the range of acceptable tuning of modulated tones for almost all the cases. Both models point to the existence of an underlying "stability sensitive" mechanism for the computation of pitch that gives more weight to the portion of the stimuli where the frequency is changing more slowly.
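
    The reference value against which the pitch shifts are measured, the time-averaged geometric mean of the instantaneous frequency, can be computed directly; the asymmetric modulation waveform below is illustrative rather than the exact stimuli used.

        # Sketch: time-averaged geometric mean frequency of an asymmetric FM tone.
        import numpy as np

        fs, dur = 44100, 1.0
        t = np.arange(int(fs * dur)) / fs

        # Asymmetric modulation: flat at 440 Hz for 70% of each cycle, brief peak toward 520 Hz.
        cycle = t % 0.1
        f_inst = np.where(cycle < 0.07, 440.0,
                          440.0 + 80.0 * np.sin(np.pi * (cycle - 0.07) / 0.03))

        geo_mean = np.exp(np.mean(np.log(f_inst)))     # time-averaged geometric mean
        print(f"geometric-mean frequency: {geo_mean:.1f} Hz")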

  20. Controversial electronic structures and energies of Fe₂, Fe₂⁺, and Fe₂⁻ resolved by RASPT2 calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoyer, Chad E.; Manni, Giovanni Li; Truhlar, Donald G., E-mail: truhlar@umn.edu, E-mail: gagliard@umn.edu

    2014-11-28

    The diatomic molecule Fe₂ was investigated using restricted active space second-order perturbation theory (RASPT2). This molecule is very challenging to study computationally because predictions about the ground state and excited states depend sensitively on the choice of the quantum chemical method. For Fe₂ we show that one needs to go beyond a full-valence active space in order to achieve even qualitative agreement with experiment for the dissociation energy, and we also obtain a smooth ground-state potential curve. In addition we report the first multireference study of Fe₂⁺, for which we predict an ⁸Σᵤ⁻ ground state, which was not predicted by previous computational studies. By using an active space large enough to remove the most serious deficiencies of previous theoretical work and by explicitly investigating the interpretations of previous experimental results, this study elucidates previous difficulties and provides – for the first time – a qualitatively correct treatment of Fe₂, Fe₂⁺, and Fe₂⁻. Moreover, this study represents a record in terms of the number of active electrons and active orbitals in the active space, namely 16 electrons in 28 orbitals. Conventional CASPT2 calculations can be performed with at most 16 electrons in 16 orbitals. We were able to overcome this limit by using the RASPT2 formalism.

  1. Computer literacy and attitudes towards e-learning among first year medical students

    PubMed Central

    Link, Thomas Michael; Marz, Richard

    2006-01-01

    Background: At the Medical University of Vienna, most information for students is available only online. In 2005, an e-learning project was initiated and there are plans to introduce a learning management system. In this study, we estimate the level of students' computer skills, the number of students having difficulty with e-learning, and the number of students opposed to e-learning. Methods: The study was conducted in an introductory course on computer-based and web-based training (CBT/WBT). Students were asked to fill out a questionnaire online that covered a wide range of relevant attitudes and experiences. Results: While the great majority of students possess sufficient computer skills and acknowledge the advantages of interactive and multimedia-enhanced learning material, a small percentage lacks basic computer skills and/or is very skeptical about e-learning. There is also a consistently significant albeit weak gender difference in available computer infrastructure and Internet access. As for student attitudes toward e-learning, we found that age, computer use, and previous exposure to computers are more important than gender. A sizable number of students, 12% of the total, make little or no use of existing e-learning offerings. Conclusion: Many students would benefit from a basic introduction to computers and to the relevant computer-based resources of the university. Given the wide range of computer skills among students, a single computer course for all students would not be useful, nor would it be accepted. Special measures should be taken to prevent students who lack computer skills from being disadvantaged or from developing computer-hostile attitudes. PMID:16784524

  2. Computer literacy and attitudes towards e-learning among first year medical students.

    PubMed

    Link, Thomas Michael; Marz, Richard

    2006-06-19

    At the Medical University of Vienna, most information for students is available only online. In 2005, an e-learning project was initiated and there are plans to introduce a learning management system. In this study, we estimate the level of students' computer skills, the number of students having difficulty with e-learning, and the number of students opposed to e-learning. The study was conducted in an introductory course on computer-based and web-based training (CBT/WBT). Students were asked to fill out a questionnaire online that covered a wide range of relevant attitudes and experiences. While the great majority of students possess sufficient computer skills and acknowledge the advantages of interactive and multimedia-enhanced learning material, a small percentage lacks basic computer skills and/or is very skeptical about e-learning. There is also a consistently significant albeit weak gender difference in available computer infrastructure and Internet access. As for student attitudes toward e-learning, we found that age, computer use, and previous exposure to computers are more important than gender. A sizable number of students, 12% of the total, make little or no use of existing e-learning offerings. Many students would benefit from a basic introduction to computers and to the relevant computer-based resources of the university. Given the wide range of computer skills among students, a single computer course for all students would be neither useful nor accepted. Special measures should be taken to prevent students who lack computer skills from being disadvantaged or from developing computer-hostile attitudes.

  3. Progress on the FabrIc for Frontier Experiments project at Fermilab

    DOE PAGES

    Box, Dennis; Boyd, Joseph; Dykstra, Dave; ...

    2015-12-23

    The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services, including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  4. Large-Eddy/Lattice Boltzmann Simulations of Micro-blowing Strategies for Subsonic and Supersonic Drag Control

    NASA Technical Reports Server (NTRS)

    Menon, Suresh

    2003-01-01

    This report summarizes the progress made in the first 8 to 9 months of this research. The Lattice Boltzmann Equation (LBE) methodology for Large-eddy Simulations (LES) of microblowing has been validated using a jet-in-crossflow test configuration. In this study, the flow intake is also simulated to allow the interaction to occur naturally. The Lattice Boltzmann Equation Large-eddy Simulations (LBELES) approach not only captures flow features such as hairpin vortices and recirculation behind the jet, but also shows better agreement with experiments than previous RANS predictions. The LBELES is shown to be computationally very efficient and therefore a viable method for simulating the injection process. Two strategies have been developed to simulate the multi-hole injection process as in the experiment. In order to allow natural interaction between the injected fluid and the primary stream, the flow intakes for all the holes have to be simulated. The LBE method is computationally efficient but is still 3D in nature and therefore there may be some computational penalty. In order to study a large number of holes, a new 1D subgrid model has been developed that will simulate a reduced form of the Navier-Stokes equation in these holes.

  5. Global analysis of protein folding using massively parallel design, synthesis and testing

    PubMed Central

    Rocklin, Gabriel J.; Chidyausiku, Tamuka M.; Goreshnik, Inna; Ford, Alex; Houliston, Scott; Lemak, Alexander; Carter, Lauren; Ravichandran, Rashmi; Mulligan, Vikram K.; Chevalier, Aaron; Arrowsmith, Cheryl H.; Baker, David

    2017-01-01

    Proteins fold into unique native structures stabilized by thousands of weak interactions that collectively overcome the entropic cost of folding. Though these forces are “encoded” in the thousands of known protein structures, “decoding” them is challenging due to the complexity of natural proteins that have evolved for function, not stability. Here we combine computational protein design, next-generation gene synthesis, and a high-throughput protease susceptibility assay to measure folding and stability for over 15,000 de novo designed miniproteins, 1,000 natural proteins, 10,000 point-mutants, and 30,000 negative control sequences, identifying over 2,500 new stable designed proteins in four basic folds. This scale—three orders of magnitude greater than that of previous studies of design or folding—enabled us to systematically examine how sequence determines folding and stability in uncharted protein space. Iteration between design and experiment increased the design success rate from 6% to 47%, produced stable proteins unlike those found in nature for topologies where design was initially unsuccessful, and revealed subtle contributions to stability as designs became increasingly optimized. Our approach achieves the long-standing goal of a tight feedback cycle between computation and experiment, and promises to transform computational protein design into a data-driven science. PMID:28706065

  6. Students' views of integrating web-based learning technology into the nursing curriculum - A descriptive survey.

    PubMed

    Adams, Audrey; Timmins, Fiona

    2006-01-01

    This paper describes students' experiences of a Web-based innovation at one university. It reports on the first phase of this development, in which two Web-based modules were developed. Using a survey approach (n=44), students' access to and use of computer technology were explored. Findings revealed that students' prior use of computers and Internet technologies was higher than previously reported, although use of databases was low. Skills in this area increased during the programme, with a significant rise in database, email, search engine and word processing use. Many specific computer skills were learned during the programme, with high numbers reporting ability to deal adequately with files and folders. Overall, the experience was a positive one for students. While a sense of student isolation was not reported, as many students kept in touch by phone and class attendance continued, some individual students did appear to isolate themselves. This teaching methodology has much to offer in the provision of convenient, easy-to-access programmes that can be easily adapted to the individual lifestyle. However, student support mechanisms need careful consideration for students who are at risk of becoming isolated. Staff also need to be supported in the provision of this methodology, and face-to-face contact with teachers for some part of the programme is preferable.

  7. Implementation of a filmless mini picture archiving and communication system in ultrasonography: experience after one year of use.

    PubMed

    Henri, C J; Cox, R D; Bret, P M

    1997-08-01

    This article details our experience in developing and operating an ultrasound mini-picture archiving and communication system (PACS). Using software developed in-house, low-end Macintosh computers (Apple Computer Co. Cupertino, CA) equipped with framegrabbers coordinate the entry of patient demographic information, image acquisition, and viewing on each ultrasound scanner. After each exam, the data are transmitted to a central archive server where they can be accessed from anywhere on the network. The archive server also provides web-based access to the data and manages pre-fetch and other requests for data that may no longer be on-line. Archival is fully automatic and is performed on recordable compact disk (CD) without compression. The system has been filmless now for over 18 months. In the meantime, one film processor has been eliminated and the position of one film clerk has been reallocated. Previously, nine ultrasound machines produced approximately 150 sheets of laser film per day (at 14 images per sheet). The same quantity of data are now archived without compression onto a single CD. Start-up costs were recovered within six months, and the project has been extended to include computed tomography (CT) and magnetic resonance imaging (MRI).

  8. The joint Simon effect depends on perceived agency, but not intentionality, of the alternative action

    PubMed Central

    Stenzel, Anna; Dolk, Thomas; Colzato, Lorenza S.; Sellaro, Roberta; Hommel, Bernhard; Liepelt, Roman

    2014-01-01

    A co-actor's intentionality has been suggested to be a key modulating factor for joint action effects like the joint Simon effect (JSE). However, in previous studies intentionality has often been confounded with agency defined as perceiving the initiator of an action as being the causal source of the action. The aim of the present study was to disentangle the role of agency and intentionality as modulating factors of the JSE. In Experiment 1, participants performed a joint go/nogo Simon task next to a co-actor who either intentionally controlled a response button with own finger movements (agency+/intentionality+) or who passively placed the hand on a response button that moved up and down on its own as triggered by computer signals (agency−/intentionality−). In Experiment 2, we included a condition in which participants believed that the co-actor intentionally controlled the response button with a Brain-Computer Interface (BCI) while placing the response finger clearly besides the response button, so that the causal relationship between agent and action effect was perceptually disrupted (agency−/intentionality+). As a control condition, the response button was computer controlled while the co-actor placed the response finger besides the response button (agency−/intentionality−). Experiment 1 showed that the JSE is present with an intentional co-actor and causality between co-actor and action effect, but absent with an unintentional co-actor and a lack of causality between co-actor and action effect. Experiment 2 showed that the JSE is absent with an intentional co-actor, but no causality between co-actor and action effect. Our findings indicate an important role of the co-actor's agency for the JSE. They also suggest that the attribution of agency has a strong perceptual basis. PMID:25140144

  9. The joint Simon effect depends on perceived agency, but not intentionality, of the alternative action.

    PubMed

    Stenzel, Anna; Dolk, Thomas; Colzato, Lorenza S; Sellaro, Roberta; Hommel, Bernhard; Liepelt, Roman

    2014-01-01

    A co-actor's intentionality has been suggested to be a key modulating factor for joint action effects like the joint Simon effect (JSE). However, in previous studies intentionality has often been confounded with agency defined as perceiving the initiator of an action as being the causal source of the action. The aim of the present study was to disentangle the role of agency and intentionality as modulating factors of the JSE. In Experiment 1, participants performed a joint go/nogo Simon task next to a co-actor who either intentionally controlled a response button with own finger movements (agency+/intentionality+) or who passively placed the hand on a response button that moved up and down on its own as triggered by computer signals (agency-/intentionality-). In Experiment 2, we included a condition in which participants believed that the co-actor intentionally controlled the response button with a Brain-Computer Interface (BCI) while placing the response finger clearly besides the response button, so that the causal relationship between agent and action effect was perceptually disrupted (agency-/intentionality+). As a control condition, the response button was computer controlled while the co-actor placed the response finger besides the response button (agency-/intentionality-). Experiment 1 showed that the JSE is present with an intentional co-actor and causality between co-actor and action effect, but absent with an unintentional co-actor and a lack of causality between co-actor and action effect. Experiment 2 showed that the JSE is absent with an intentional co-actor, but no causality between co-actor and action effect. Our findings indicate an important role of the co-actor's agency for the JSE. They also suggest that the attribution of agency has a strong perceptual basis.

  10. A resilient and efficient CFD framework: Statistical learning tools for multi-fidelity and heterogeneous information fusion

    NASA Astrophysics Data System (ADS)

    Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em

    2017-09-01

    Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited-accuracy but relatively costless auxiliary simulator, we can effectively fill in the missing spatial data at the required times on the fly by a statistical learning technique, multi-level Gaussian process regression; this has been demonstrated in previous work [1]. Building on that work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, that detects computational redundancy in time and hence accelerates the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we set the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, multiple fidelity and resolution simulations (and even experiments) with a new adaptive timestep refinement technique. We present two benchmark problems (the heat equation and the Navier-Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. We consider, as such "auxiliary" models, a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier-Stokes equations. More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing in exascale simulations.
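    The gap-filling step described above can be illustrated with a short sketch: a coarse auxiliary field known everywhere is corrected by a Gaussian process fitted to the few surviving high-fidelity samples. This is not the authors' code; the field, kernel, and sample locations are invented for illustration, and scikit-learn's GaussianProcessRegressor stands in for the multi-level Gaussian process regression used in the paper.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    # Coarse, limited-accuracy auxiliary field available everywhere on the grid.
    x = np.linspace(0.0, 1.0, 101).reshape(-1, 1)
    coarse = np.sin(2.0 * np.pi * x).ravel()

    # High-fidelity values survive only at a few locations (processors that did not fail).
    idx = np.array([5, 20, 35, 60, 80, 95])
    fine_obs = coarse[idx] + 0.1 * np.cos(6.0 * np.pi * x[idx]).ravel()

    # Model the discrepancy (fine - coarse) with a GP and add its prediction back onto
    # the coarse field to reconstruct the missing high-fidelity data everywhere.
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-6)
    gp.fit(x[idx], fine_obs - coarse[idx])
    reconstructed = coarse + gp.predict(x)
    ```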

  11. Development and assessment of a chemistry-based computer video game as a learning tool

    NASA Astrophysics Data System (ADS)

    Martinez-Hernandez, Kermin Joel

    The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning experience through gameplay. The project consists of three areas: development, assessment, and implementation. However, the foci of this study were the development and assessment of the computer video game including possible learning outcomes and game design elements. A chemistry-based game using a mixed genre of a single player first-person game embedded with action-adventure and puzzle components was developed to determine if students' level of understanding of chemistry concepts change after gameplay intervention. Three phases have been completed to assess students' understanding of chemistry concepts prior and after gameplay intervention. Two main assessment instruments (pre/post open-ended content survey and individual semi-structured interviews) were used to assess student understanding of concepts. In addition, game design elements were evaluated for future development phases. Preliminary analyses of the interview data suggest that students were able to understand most of the chemistry challenges presented in the game and the game served as a review for previously learned concepts as well as a way to apply such previous knowledge. To guarantee a better understanding of the chemistry concepts, additions such as debriefing and feedback about the content presented in the game seem to be needed. The use of visuals in the game to represent chemical processes, game genre, and game idea appear to be the game design elements that students like the most about the current computer video game.

  12. Design, Modeling, Fabrication, and Evaluation of the Air Amplifier for Improved Detection of Biomolecules by Electrospray Ionization Mass Spectrometry

    PubMed Central

    Robichaud, Guillaume; Dixon, R. Brent; Potturi, Amarnatha S.; Cassidy, Dan; Edwards, Jack R.; Sohn, Alex; Dow, Thomas A.; Muddiman, David C.

    2010-01-01

    Through a multi-disciplinary approach, the air amplifier is being evolved as a highly engineered device to improve detection limits of biomolecules when using electrospray ionization. Several key aspects have driven the modifications to the device through experimentation and simulations. We have developed a computer simulation that accurately portrays actual conditions, and the results from these simulations are corroborated by the experimental data. These computer simulations can be used to predict outcomes from future designs, resulting in a design process that is efficient in terms of financial cost and time. We have fabricated a new device with annular gap control over a range of 50 to 70 μm using piezoelectric actuators. This has enabled us to obtain better aerodynamic performance than the previous design (2× more vacuum) and also more reproducible results. This allows us to study a broader experimental space than the previous design, which is critical in guiding future directions. This work also presents and explains the principles behind a fractional factorial design of experiments methodology for testing a large number of experimental parameters in an orderly and efficient manner, in order to understand and optimize the critical parameters that lead to improved detection limits while minimizing the number of experiments performed. Preliminary results showed that improvements of up to 34-fold could be obtained under certain operating conditions. PMID:21499524
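    As a rough illustration of the fractional factorial idea mentioned above, the sketch below builds a two-level 2^(4-1) design in which the fourth factor is aliased with the three-way interaction of the first three, halving the number of runs relative to a full factorial. The factor names and the choice of generator are illustrative, not the design actually used for the air amplifier.

    ```python
    import itertools

    # Half-fraction of a 2^4 design: vary A, B, C over all +/-1 combinations (8 runs
    # instead of 16) and derive the fourth factor from the generator D = A*B*C, so that
    # main effects remain estimable while the run count is halved.
    runs = [{"A": a, "B": b, "C": c, "D": a * b * c}
            for a, b, c in itertools.product((-1, +1), repeat=3)]

    for i, run in enumerate(runs, start=1):
        print(i, run)
    ```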

  13. Radiation Protection Studies of International Space Station Extravehicular Activity Space Suits

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A. (Editor); Shavers, Mark R. (Editor); Saganti, Premkumar B. (Editor); Miller, Jack (Editor)

    2003-01-01

    This publication describes recent investigations that evaluate radiation shielding characteristics of NASA's and the Russian Space Agency's space suits. The introduction describes the suits and presents goals of several experiments performed with them. The first chapter provides background information about the dynamic radiation environment experienced at ISS and summarizes radiation health and protection requirements for activities in low Earth orbit. Supporting studies report the development and application of a computer model of the EMU space suit and the difficulty of shielding EVA crewmembers from high-energy reentrant electrons, a previously unevaluated component of the space radiation environment. Chapters 2 through 6 describe experiments that evaluate the space suits' radiation shielding characteristics. Chapter 7 describes a study of the potential radiological health impact on EVA crewmembers of two virtually unexamined environmental sources of high-energy electrons: reentrant trapped electrons and atmospheric albedo or "splash" electrons. The radiological consequences of those sources had not been evaluated previously and warranted closer scrutiny. A detailed computational model of the shielding distribution provided by components of the NASA astronauts' EMU is being developed for exposure evaluation studies. The model is introduced in Chapters 8 and 9 and used in Chapter 10 to investigate how trapped particle anisotropy impacts female organ doses during EVA. Chapter 11 presents a review of issues related to estimating skin cancer risk from space radiation. The final chapter contains conclusions about the protective qualities of the suit brought to light by these studies, as well as recommendations for future operational radiation protection.

  14. Properties of model-averaged BMDLs: a study of model averaging in dichotomous response risk estimation.

    PubMed

    Wheeler, Matthew W; Bailer, A John

    2007-06-01

    Model averaging (MA) has been proposed as a method of accounting for model uncertainty in benchmark dose (BMD) estimation. The technique has been used to average BMD dose estimates derived from dichotomous dose-response experiments, microbial dose-response experiments, as well as observational epidemiological studies. While MA is a promising tool for the risk assessor, a previous study suggested that the simple strategy of averaging individual models' BMD lower limits did not yield interval estimators that met nominal coverage levels in certain situations, and this performance was very sensitive to the underlying model space chosen. We present a different, more computationally intensive, approach in which the BMD is estimated using the average dose-response model and the corresponding benchmark dose lower bound (BMDL) is computed by bootstrapping. This method is illustrated with TiO(2) dose-response rat lung cancer data, and then systematically studied through an extensive Monte Carlo simulation. The results of this study suggest that the MA-BMD, estimated using this technique, performs better, in terms of bias and coverage, than the previous MA methodology. Further, the MA-BMDL achieves nominal coverage in most cases, and is superior to picking the "best fitting model" when estimating the benchmark dose. Although these results show utility of MA for benchmark dose risk estimation, they continue to highlight the importance of choosing an adequate model space as well as proper model fit diagnostics.
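    A minimal sketch of the bootstrap strategy described above follows: average two simple dichotomous dose-response models, read the benchmark dose off the averaged curve, and bootstrap the data to obtain a lower bound. The dose-response data, the two model forms, and the equal weights are invented for illustration; a real analysis would use the full model space and information-criterion weights as in the paper.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    doses = np.array([0.0, 10.0, 50.0, 100.0])
    n = np.array([50, 50, 50, 50])
    cases = np.array([2, 6, 14, 30])

    def logistic(d, p):  # two illustrative dichotomous dose-response models
        return 1.0 / (1.0 + np.exp(-(p[0] + p[1] * d)))

    def qlinear(d, p):
        g = 1.0 / (1.0 + np.exp(-p[0]))
        return g + (1.0 - g) * (1.0 - np.exp(-abs(p[1]) * d))

    def fit(model, y):
        # Binomial maximum-likelihood fit of one model to (doses, y, n).
        nll = lambda p: -np.sum(y * np.log(model(doses, p) + 1e-12) +
                                (n - y) * np.log(1.0 - model(doses, p) + 1e-12))
        return minimize(nll, x0=[-2.0, 0.02], method="Nelder-Mead").x

    def averaged_bmd(y, weights=(0.5, 0.5), bmr=0.10):
        # Fit both models, form the weighted-average dose-response curve, and return
        # the dose where extra risk over background reaches the benchmark response.
        p_log, p_ql = fit(logistic, y), fit(qlinear, y)
        grid = np.linspace(0.0, 200.0, 2001)
        avg = lambda d: weights[0] * logistic(d, p_log) + weights[1] * qlinear(d, p_ql)
        extra = (avg(grid) - avg(0.0)) / (1.0 - avg(0.0))
        hit = np.nonzero(extra >= bmr)[0]
        return grid[hit[0]] if hit.size else grid[-1]

    # Parametric bootstrap: resample the counts, recompute the model-averaged BMD, and
    # take a lower percentile of the bootstrap distribution as the BMDL.
    boot = [averaged_bmd(rng.binomial(n, cases / n)) for _ in range(200)]
    print("BMD  =", averaged_bmd(cases))
    print("BMDL =", np.percentile(boot, 5))
    ```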

  15. Description of data base management systems activities

    NASA Technical Reports Server (NTRS)

    1983-01-01

    One of the major responsibilities of the JPL Computing and Information Services Office is to develop and maintain a JPL plan for providing computing services to the JPL management and administrative community that will lead to improved productivity. The CISO plan to accomplish this objective has been titled 'Management and Administrative Support Systems' (MASS). The MASS plan is based on the continued use of JPL's IBM 3032 Computer system for administrative computing and for the MASS functions. The current candidate administrative Data Base Management Systems required to support the MASS include ADABASE, Cullinane IDMS and TOTAL. Previous administrative data base systems have been applied to specific local functions rather than used in a centralized manner with elements common to the many user groups. Limited-capacity data base systems have been installed in microprocessor-based office automation systems in a few Project and Management Offices using Ashton-Tate dBASE II. These experiences, plus some other localized in-house DBMS uses, have provided an excellent background for developing user and system requirements for a single DBMS to support the MASS program.

  16. A Parallel Distributed-Memory Particle Method Enables Acquisition-Rate Segmentation of Large Fluorescence Microscopy Images

    PubMed Central

    Afshar, Yaser; Sbalzarini, Ivo F.

    2016-01-01

    Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers collectively solve the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10^10 pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments. PMID:27046144
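    The decomposition idea described above (split the image into overlapping sub-images, segment them in parallel, and stitch the results back together) can be sketched as follows. Only the domain-decomposition skeleton is shown, with a trivial threshold standing in for the Discrete Region Competition step, and multiprocessing on one machine standing in for communication across computers; tile count and halo width are arbitrary.

    ```python
    import numpy as np
    from multiprocessing import Pool

    def segment_tile(job):
        # Per-worker step: a simple threshold stands in for the actual
        # Discrete Region Competition segmentation of each sub-image.
        tile, meta = job
        return (tile > tile.mean()).astype(np.uint8), meta

    def distributed_segment(image, tiles=4, halo=8):
        h, w = image.shape
        ys = np.linspace(0, h, tiles + 1, dtype=int)
        xs = np.linspace(0, w, tiles + 1, dtype=int)
        jobs = []
        for i in range(tiles):
            for j in range(tiles):
                y0, y1 = max(ys[i] - halo, 0), min(ys[i + 1] + halo, h)  # overlap ("halo")
                x0, x1 = max(xs[j] - halo, 0), min(xs[j + 1] + halo, w)  # between sub-images
                jobs.append((image[y0:y1, x0:x1],
                             (ys[i], ys[i + 1], xs[j], xs[j + 1], y0, x0)))
        out = np.zeros(image.shape, dtype=np.uint8)
        with Pool() as pool:  # one worker per core here; multiple computers in the paper
            for mask, (r0, r1, c0, c1, y0, x0) in pool.map(segment_tile, jobs):
                out[r0:r1, c0:c1] = mask[r0 - y0:r1 - y0, c0 - x0:c1 - x0]
        return out

    if __name__ == "__main__":
        img = np.random.default_rng(1).random((512, 512))
        labels = distributed_segment(img)
    ```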

  17. Computer Model Predicts the Movement of Dust

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A new computer model of the atmosphere can now actually pinpoint where global dust events come from, and can project where they're going. The model may help scientists better evaluate the impact of dust on human health, climate, ocean carbon cycles, ecosystems, and atmospheric chemistry. Also, by seeing where dust originates and where it blows, people with respiratory problems can get advance warning of approaching dust clouds. 'The model is physically more realistic than previous ones,' said Mian Chin, a co-author of the study and an Earth and atmospheric scientist at Georgia Tech and the Goddard Space Flight Center (GSFC) in Greenbelt, Md. 'It is able to reproduce the short term day-to-day variations and long term inter-annual variations of dust concentrations and distributions that are measured from field experiments and observed from satellites.' The above images show both aerosols measured from space (left) and the movement of aerosols predicted by computer model for the same date (right). For more information, read New Computer Model Tracks and Predicts Paths Of Earth's Dust. Images courtesy Paul Giroux, Georgia Tech/NASA Goddard Space Flight Center

  18. Circular motion geometry using minimal data.

    PubMed

    Jiang, Guang; Quan, Long; Tsui, Hung-Tat

    2004-06-01

    Circular motion or single axis motion is widely used in computer vision and graphics for 3D model acquisition. This paper describes a new and simple method for recovering the geometry of uncalibrated circular motion from a minimal set of only two points in four images. This problem has been previously solved using nonminimal data, either by computing the fundamental matrix and trifocal tensor in three images or by fitting conics to tracked points in five or more images. It is first established that two sets of tracked points in different images under circular motion for two distinct space points are related by a homography. Then, we compute a plane homography from a minimal set of two points in four images. After that, we show that the unique pair of complex conjugate eigenvectors of this homography are the images of the circular points of the parallel planes of the circular motion. Subsequently, all other motion and structure parameters are computed from this homography in a straightforward manner. The experiments on real image sequences demonstrate the simplicity, accuracy, and robustness of the new method.
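    The two computational steps named above (a plane homography estimated from point correspondences by the direct linear transform, followed by extraction of its complex-conjugate pair of eigenvectors) are sketched below. The correspondences are synthesized from a known homography with a complex eigen-pair rather than tracked from real images, so the sketch illustrates only the linear algebra, not the tracking.

    ```python
    import numpy as np

    def homography_dlt(p, q):
        # Direct linear transform: solve q_i ~ H p_i for four homogeneous correspondences.
        rows = []
        for (x, y, w), (u, v, s) in zip(p, q):
            rows.append([0, 0, 0, -s * x, -s * y, -s * w, v * x, v * y, v * w])
            rows.append([s * x, s * y, s * w, 0, 0, 0, -u * x, -u * y, -u * w])
        return np.linalg.svd(np.asarray(rows, float))[2][-1].reshape(3, 3)

    # Synthetic correspondences from a conjugated rotation, which (like the circular-motion
    # homography) has one real eigenvector and one complex-conjugate pair.
    theta = 0.3
    R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta), np.cos(theta), 0.0],
                  [0.0, 0.0, 1.0]])
    A = np.array([[1.0, 0.1, 50.0], [0.0, 1.2, 30.0], [0.0, 0.0, 1.0]])
    H_true = A @ R @ np.linalg.inv(A)

    p = np.array([[100, 200, 1], [130, 190, 1], [160, 185, 1], [190, 230, 1]], float)
    q = (H_true @ p.T).T

    H = homography_dlt(p, q)
    vals, vecs = np.linalg.eig(H)
    # The complex-conjugate eigenvectors of the recovered homography play the role of the
    # images of the circular points; the remaining parameters follow from them in the paper.
    circular_points = vecs[:, np.iscomplex(vals)]
    print(vals)
    print(circular_points)
    ```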

  19. EMILiO: a fast algorithm for genome-scale strain design.

    PubMed

    Yang, Laurence; Cluett, William R; Mahadevan, Radhakrishnan

    2011-05-01

    Systems-level design of cell metabolism is becoming increasingly important for renewable production of fuels, chemicals, and drugs. Computational models are improving in the accuracy and scope of predictions, but are also growing in complexity. Consequently, efficient and scalable algorithms are increasingly important for strain design. Previous algorithms helped to consolidate the utility of computational modeling in this field. To meet intensifying demands for high-performance strains, both the number and variety of genetic manipulations involved in strain construction are increasing. Existing algorithms have experienced combinatorial increases in computational complexity when applied toward the design of such complex strains. Here, we present EMILiO, a new algorithm that increases the scope of strain design to include reactions with individually optimized fluxes. Unlike existing approaches that would experience an explosion in complexity to solve this problem, we efficiently generated numerous alternate strain designs producing succinate, l-glutamate and l-serine. This was enabled by successive linear programming, a technique new to the area of computational strain design. Copyright © 2011 Elsevier Inc. All rights reserved.
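    As background for the linear-programming machinery such strain-design methods build on, the sketch below solves a flux-balance linear program for a toy four-reaction network and re-solves it while tightening one flux bound, which conveys the flavour of treating selected reaction fluxes as individually constrained design variables. The stoichiometry, bounds, and objective are invented, and this is only the LP building block, not EMILiO's successive linear programming procedure.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy network: r1 takes up metabolite A, r2 converts A -> B, r3 secretes the product
    # from B, r4 drains B into biomass. Rows of S are A and B (steady state: S v = 0).
    S = np.array([[1.0, -1.0, 0.0, 0.0],
                  [0.0, 1.0, -1.0, -1.0]])

    def max_product(bounds):
        # Maximize the product flux v3 (linprog minimizes, hence the -1 coefficient).
        res = linprog(c=[0.0, 0.0, -1.0, 0.0], A_eq=S, b_eq=np.zeros(2),
                      bounds=bounds, method="highs")
        return res.x

    bounds = [(0, 10), (0, 10), (0, 10), (0.1, 10)]
    print("baseline fluxes:", max_product(bounds))

    # Re-solve while tightening the biomass bound, i.e. treating one reaction's flux as an
    # individually adjusted quantity rather than a simple on/off knockout.
    for min_biomass in (0.5, 1.0, 2.0):
        bounds[3] = (min_biomass, 10)
        v = max_product(bounds)
        print(f"biomass >= {min_biomass}: product flux = {v[2]:.2f}")
    ```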

  20. Further Investigation of the Support System Effects and Wing Twist on the NASA Common Research Model

    NASA Technical Reports Server (NTRS)

    Rivers, Melissa B.; Hunter, Craig A.; Campbell, Richard L.

    2012-01-01

    An experimental investigation of the NASA Common Research Model was conducted in the NASA Langley National Transonic Facility and NASA Ames 11-foot Transonic Wind Tunnel Facility for use in the Drag Prediction Workshop. As data from the experimental investigations was collected, a large difference in moment values was seen between the experiment and computational data from the 4th Drag Prediction Workshop. This difference led to a computational assessment to investigate model support system interference effects on the Common Research Model. The results from this investigation showed that the addition of the support system to the computational cases did increase the pitching moment so that it more closely matched the experimental results, but there was still a large discrepancy in pitching moment. This large discrepancy led to an investigation into the shape of the as-built model, which in turn led to a change in the computational grids and re-running of all the previous support system cases. The results of these cases are the focus of this paper.

  1. Navier-Stokes simulations of slender axisymmetric shapes in supersonic, turbulent flow

    NASA Astrophysics Data System (ADS)

    Moran, Kenneth J.; Beran, Philip S.

    1994-07-01

    Computational fluid dynamics is used to study flows about slender, axisymmetric bodies at very high speeds. Numerical experiments are conducted to simulate a broad range of flight conditions. Mach number is varied from 1.5 to 8 and Reynolds number from 1 × 10^6/m to 10^8/m. The primary objective is to develop and validate a computational methodology for the accurate simulation of a wide variety of flow structures. Accurate results are obtained for detached bow shocks, recompression shocks, corner-point expansions, base-flow recirculations, and turbulent boundary layers. Accuracy is assessed through comparison with theory and experimental data; computed surface pressure, shock structure, base-flow structure, and velocity profiles are within measurement accuracy throughout the range of conditions tested. The methodology is both practical and general: general in its applicability, and practical in its performance. To achieve high accuracy, modifications to previously reported techniques are implemented in the scheme. These modifications improve computed results in the vicinity of symmetry lines and in the base flow region, including the turbulent wake.

  2. A Parallel Distributed-Memory Particle Method Enables Acquisition-Rate Segmentation of Large Fluorescence Microscopy Images.

    PubMed

    Afshar, Yaser; Sbalzarini, Ivo F

    2016-01-01

    Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers collectively solve the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10^10 pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments.

  3. Computational techniques in gamma-ray skyshine analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George, D.L.

    1988-12-01

    Two computer codes were developed to analyze gamma-ray skyshine, the scattering of gamma photons by air molecules. A review of previous gamma-ray skyshine studies discusses several Monte Carlo codes, programs using a single-scatter model, and the MicroSkyshine program for microcomputers. A benchmark gamma-ray skyshine experiment performed at Kansas State University is also described. A single-scatter numerical model was presented which traces photons from the source to their first scatter, then applies a buildup factor along a direct path from the scattering point to a detector. The FORTRAN code SKY, developed with this model before the present study, was modified to use Gauss quadrature, recent photon attenuation data and a more accurate buildup approximation. The resulting code, SILOGP, computes response from a point photon source on the axis of a silo, with and without concrete shielding over the opening. Another program, WALLGP, was developed using the same model to compute response from a point gamma source behind a perfectly absorbing wall, with and without shielding overhead. 29 refs., 48 figs., 13 tabs.

  4. Flame-Vortex Studies to Quantify Markstein Numbers Needed to Model Flame Extinction Limits

    NASA Technical Reports Server (NTRS)

    Driscoll, James F.; Feikema, Douglas A.

    2003-01-01

    This work has quantified a database of Markstein numbers for unsteady flames; future work will quantify a database of flame extinction limits for unsteady conditions. Unsteady extinction limits have not been documented previously; both a stretch rate and a residence time must be measured, since extinction requires that the stretch rate be sufficiently large for a sufficiently long residence time. The Markstein number (Ma) was measured for an inwardly-propagating flame (IPF) that is negatively stretched under microgravity conditions. Computations also were performed using RUN-1DL to explain the measurements. The Markstein number of an inwardly-propagating flame, for both the microgravity experiment and the computations, is significantly larger than that of an outwardly-propagating flame (OPF). The computed profiles of the various species within the flame suggest reasons for this: computed hydrogen concentrations build up ahead of the IPF but not the OPF. Understanding was gained by running the computations for both simplified and full-chemistry conditions. To explain the experimental findings, numerical simulations of both inwardly and outwardly propagating spherical flames (with complex chemistry) were generated using the RUN-1DL code, which includes 16 species and 46 reactions.

  5. Ranking of stopping criteria for log domain diffeomorphic demons application in clinical radiation therapy.

    PubMed

    Peroni, M; Golland, P; Sharp, G C; Baroni, G

    2011-01-01

    Deformable Image Registration is a complex optimization algorithm with the goal of modeling a non-rigid transformation between two images. A crucial issue in this field is guaranteeing the user a robust but computationally reasonable algorithm. We rank the performances of four stopping criteria and six stopping value computation strategies for a log domain deformable registration. The stopping criteria we test are: (a) velocity field update magnitude, (b) vector field Jacobian, (c) mean squared error, and (d) harmonic energy. Experiments demonstrate that comparing the metric value over the last three iterations with the metric minimum of between four and six previous iterations is a robust and appropriate strategy. The harmonic energy and vector field update magnitude metrics give the best results in terms of robustness and speed of convergence.
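    The strategy ranked best above can be written down in a few lines; the window sizes and tolerance below are an illustrative reading of "compare the metric over the last three iterations with the minimum of the four to six preceding iterations", not the authors' implementation.

    ```python
    def should_stop(metric_history, tol=0.0):
        # Stop when none of the last three metric values (e.g. mean squared error or
        # harmonic energy) improves on the minimum reached over the three iterations
        # that came just before them (i.e. four to six iterations back).
        if len(metric_history) < 6:
            return False
        recent = metric_history[-3:]
        reference = min(metric_history[-6:-3])
        return min(recent) >= reference - tol

    # Example driver for a registration loop whose metric plateaus after 40 iterations.
    history = []
    for it in range(200):
        mse = 1.0 / (it + 1) if it < 40 else 1.0 / 41  # mock metric
        history.append(mse)
        if should_stop(history):
            print("stopped at iteration", it)
            break
    ```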

  6. WiLE: A Mathematica package for weak coupling expansion of Wilson loops in ABJ(M) theory

    NASA Astrophysics Data System (ADS)

    Preti, M.

    2018-06-01

    We present WiLE, a Mathematica® package designed to perform the weak coupling expansion of any Wilson loop in ABJ(M) theory at arbitrary perturbative order. For a given set of fields on the loop and internal vertices, the package displays all the possible Feynman diagrams and their integral representations. The user can also choose to exclude non-planar diagrams, tadpoles and self-energies. Through the use of interactive input windows, the package should be easily accessible to users with little or no previous experience. The package manual provides some pedagogical examples and the computation of all ladder diagrams at three loops relevant for the cusp anomalous dimension in ABJ(M). The latter application also gives support to some recent results computed in different contexts.

  7. Structural factoring approach for analyzing stochastic networks

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Shier, Douglas R.

    1991-01-01

    The problem of finding the distribution of the shortest path length through a stochastic network is investigated. A general algorithm for determining the exact distribution of the shortest path length is developed based on the concept of conditional factoring, in which a directed, stochastic network is decomposed into an equivalent set of smaller, generally less complex subnetworks. Several network constructs are identified and exploited to reduce significantly the computational effort required to solve a network problem relative to complete enumeration. This algorithm can be applied to two important classes of stochastic path problems: determining the critical path distribution for acyclic networks and the exact two-terminal reliability for probabilistic networks. Computational experience with the algorithm was encouraging and allowed the exact solution of networks that have been previously analyzed only by approximation techniques.
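    For context, the brute-force version of the problem (the exact distribution of the shortest path length obtained by complete enumeration of all arc-length realizations) looks like the sketch below on a toy two-path network; the conditional factoring algorithm in the paper obtains the same distribution while avoiding most of this enumeration. The network, lengths, and probabilities are invented.

    ```python
    from itertools import product
    from collections import defaultdict

    # Each arc takes one of a few discrete lengths with the given probabilities.
    arcs = {
        ("s", "a"): [(1, 0.5), (3, 0.5)],
        ("a", "t"): [(1, 0.7), (4, 0.3)],
        ("s", "b"): [(2, 1.0)],
        ("b", "t"): [(2, 0.6), (5, 0.4)],
    }
    paths = [[("s", "a"), ("a", "t")], [("s", "b"), ("b", "t")]]  # the two s-t paths

    distribution = defaultdict(float)
    arc_names = list(arcs)
    for combo in product(*(arcs[a] for a in arc_names)):  # every joint realization
        lengths = dict(zip(arc_names, (length for length, _ in combo)))
        prob = 1.0
        for _, p in combo:
            prob *= p
        shortest = min(sum(lengths[a] for a in path) for path in paths)
        distribution[shortest] += prob

    print(sorted(distribution.items()))  # exact shortest-path-length distribution
    ```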

  8. Study of USGS/NASA land use classification system. [compatibility of land use classification system with computer processing techniques employed for land use mapping from ERTS data

    NASA Technical Reports Server (NTRS)

    Spann, G. W.; Faust, N. L.

    1974-01-01

    It is known from several previous investigations that many categories of land-use can be mapped via computer processing of Earth Resources Technology Satellite data. The results are presented of one such experiment using the USGS/NASA land-use classification system. Douglas County, Georgia, was chosen as the test site for this project. It was chosen primarily because of its recent rapid growth and future growth potential. Results of the investigation indicate an overall land-use mapping accuracy of 67% with higher accuracies in rural areas and lower accuracies in urban areas. It is estimated, however, that 95% of the State of Georgia could be mapped by these techniques with an accuracy of 80% to 90%.

  9. Subsonic aerodynamic characteristics of interacting lifting surfaces with separated flow around sharp edges predicted by a vortex-lattice method

    NASA Technical Reports Server (NTRS)

    Lamar, J. E.; Gloss, B. B.

    1975-01-01

    Because the potential flow suction along the leading and side edges of a planform can be used to determine both leading- and side-edge vortex lift, the present investigation was undertaken to apply the vortex-lattice method to computing side-edge suction force for isolated or interacting planforms. Although there is a small effect of bound vortex sweep on the computation of the side-edge suction force, the results obtained for a number of different isolated planforms produced acceptable agreement with results obtained from a method employing continuous induced-velocity distributions. By using the method outlined, better agreement between theory and experiment was noted for a wing in the presence of a canard than was previously obtained.

  10. The FIFE Project at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Box, D.; Boyd, J.; Di Benedetto, V.

    2016-01-01

    The FabrIc for Frontier Experiments (FIFE) project is an initiative within the Fermilab Scientific Computing Division designed to steer the computing model for non-LHC Fermilab experiments across multiple physics areas. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying size, needs, and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of solutions for high throughput computing, data management, database access and collaboration management within an experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid compute sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services, including a common job submission service, software and reference data distribution through CVMFS repositories, flexible and robust data transfer clients, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken the leading role in defining the computing model for Fermilab experiments, aided in the design of experiments beyond those hosted at Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  11. A methodology for the design of experiments in computational intelligence with multiple regression models.

    PubMed

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided for different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is important to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.
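    RRegrs itself is an R package, but the step the methodology stresses (compare several regressors on identical cross-validation folds and test whether the difference in scores is statistically significant, rather than simply picking the larger mean) can be sketched in Python as below; the dataset, the two models, and the choice of the Wilcoxon signed-rank test are illustrative.

    ```python
    import numpy as np
    from scipy.stats import wilcoxon
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import KFold, cross_val_score

    # Synthetic regression task and a fixed fold assignment shared by all models.
    X, y = make_regression(n_samples=300, n_features=20, noise=10.0, random_state=0)
    cv = KFold(n_splits=10, shuffle=True, random_state=0)

    models = {"linear": LinearRegression(),
              "forest": RandomForestRegressor(n_estimators=200, random_state=0)}
    scores = {name: cross_val_score(m, X, y, cv=cv, scoring="r2")
              for name, m in models.items()}

    # Paired test on the per-fold scores: report significance, not just the larger mean.
    stat, p_value = wilcoxon(scores["linear"], scores["forest"])
    for name, s in scores.items():
        print(f"{name}: mean R^2 = {s.mean():.3f}")
    print(f"Wilcoxon signed-rank p-value = {p_value:.4f}")
    ```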

  12. A methodology for the design of experiments in computational intelligence with multiple regression models

    PubMed Central

    Gestal, Marcos; Munteanu, Cristian R.; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence and especially on a correct comparison between the different results provided for different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results are different for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is important to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than with the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable. PMID:27920952

  13. Development of a computational model on the neural activity patterns of a visual working memory in a hierarchical feedforward Network

    NASA Astrophysics Data System (ADS)

    An, Soyoung; Choi, Woochul; Paik, Se-Bum

    2015-11-01

    Understanding the mechanism of information processing in the human brain remains a unique challenge because the nonlinear interactions between the neurons in the network are extremely complex and because controlling every relevant parameter during an experiment is difficult. Therefore, a simulation using simplified computational models may be an effective approach. In the present study, we developed a general model of neural networks that can simulate nonlinear activity patterns in the hierarchical structure of a neural network system. To test our model, we first examined whether our simulation could match the previously-observed nonlinear features of neural activity patterns. Next, we performed a psychophysics experiment for a simple visual working memory task to evaluate whether the model could predict the performance of human subjects. Our studies show that the model is capable of reproducing the relationship between memory load and performance and may contribute, in part, to our understanding of how the structure of neural circuits can determine the nonlinear neural activity patterns in the human brain.

  14. Cosmic reionization on computers. Mean and fluctuating redshifted 21 CM signal

    DOE PAGES

    Kaurov, Alexander A.; Gnedin, Nickolay Y.

    2016-06-20

    We explore the mean and fluctuating redshifted 21 cm signal in numerical simulations from the Cosmic Reionization On Computers project. We find that the mean signal varies between about ±25 mK. Most significantly, we find that the negative pre-reionization dip at z ~ 10–15 only extends to ⟨ΔT_B⟩ ~ −25 mK, requiring substantially higher sensitivity from global signal experiments that operate in this redshift range (EDGES-II, LEDA, SCI-HI, and DARE) than has often been assumed previously. We also explore the role of dense substructure (filaments and embedded galaxies) in the formation of the 21 cm power spectrum. We find that by neglecting the semi-neutral substructure inside ionized bubbles, the power spectrum can be misestimated by 25%–50% at scales k ~ 0.1–1 h Mpc⁻¹. Furthermore, this scale range is of particular interest, because the upcoming 21 cm experiments (Murchison Widefield Array, Precision Array for Probing the Epoch of Reionization, Hydrogen Epoch of Reionization Array) are expected to be most sensitive within it.

  15. Unraveling Entropic Rate Acceleration Induced by Solvent Dynamics in Membrane Enzymes.

    PubMed

    Kürten, Charlotte; Syrén, Per-Olof

    2016-01-16

    Enzyme catalysis evolved in an aqueous environment. The influence of solvent dynamics on catalysis is, however, currently poorly understood and usually neglected. The study of water dynamics in enzymes and the associated thermodynamical consequences is highly complex and has involved computer simulations, nuclear magnetic resonance (NMR) experiments, and calorimetry. Water tunnels that connect the active site with the surrounding solvent are key to solvent displacement and dynamics. The protocol herein allows for the engineering of these motifs for water transport, which affects specificity, activity and thermodynamics. By providing a biophysical framework founded on theory and experiments, the method presented herein can be used by researchers without previous expertise in computer modeling or biophysical chemistry. The method will advance our understanding of enzyme catalysis on the molecular level by measuring the enthalpic and entropic changes associated with catalysis by enzyme variants with obstructed water tunnels. The protocol can be used for the study of membrane-bound enzymes and other complex systems. This will enhance our understanding of the importance of solvent reorganization in catalysis as well as provide new catalytic strategies in protein design and engineering.

  16. Cosmic Reionization On Computers. Mean and Fluctuating Redshifted 21 cm Signal

    NASA Astrophysics Data System (ADS)

    Kaurov, Alexander A.; Gnedin, Nickolay Y.

    2016-06-01

    We explore the mean and fluctuating redshifted 21 cm signal in numerical simulations from the Cosmic Reionization On Computers project. We find that the mean signal varies between about ±25 mK. Most significantly, we find that the negative pre-reionization dip at z ~ 10–15 only extends to ⟨ΔT_B⟩ ~ −25 mK, requiring substantially higher sensitivity from global signal experiments that operate in this redshift range (EDGES-II, LEDA, SCI-HI, and DARE) than has often been assumed previously. We also explore the role of dense substructure (filaments and embedded galaxies) in the formation of the 21 cm power spectrum. We find that by neglecting the semi-neutral substructure inside ionized bubbles, the power spectrum can be misestimated by 25%–50% at scales k ~ 0.1–1 h Mpc⁻¹. This scale range is of particular interest, because the upcoming 21 cm experiments (Murchison Widefield Array, Precision Array for Probing the Epoch of Reionization, Hydrogen Epoch of Reionization Array) are expected to be most sensitive within it.

  17. Multi-omics and metabolic modelling pipelines: challenges and tools for systems microbiology.

    PubMed

    Fondi, Marco; Liò, Pietro

    2015-02-01

    Integrated -omics approaches are quickly spreading across microbiology research labs, leading to (i) the possibility of detecting previously hidden features of microbial cells like multi-scale spatial organization and (ii) tracing molecular components across multiple cellular functional states. This promises to reduce the knowledge gap between genotype and phenotype and poses new challenges for computational microbiologists. We underline how the capability to unravel the complexity of microbial life will strongly depend on the integration of the huge and diverse amount of information that can be derived today from -omics experiments. In this work, we present opportunities and challenges of multi-omics data integration in current systems biology pipelines. We here discuss which layers of biological information are important for biotechnological and clinical purposes, with a special focus on bacterial metabolism and modelling procedures. A general review of the most recent computational tools for performing large-scale dataset integration is also presented, together with a possible framework to guide the design of systems biology experiments by microbiologists. Copyright © 2015. Published by Elsevier GmbH.

  18. Opposition-Based Memetic Algorithm and Hybrid Approach for Sorting Permutations by Reversals.

    PubMed

    Soncco-Álvarez, José Luis; Muñoz, Daniel M; Ayala-Rincón, Mauricio

    2018-02-21

    Sorting unsigned permutations by reversals is a difficult problem; indeed, it was proved to be NP-hard by Caprara (1997). Because of its high complexity, many approximation algorithms for computing the minimal reversal distance have been proposed, culminating in the currently best-known theoretical ratio of 1.375. In this article, two memetic algorithms to compute the reversal distance are proposed. The first one uses the technique of opposition-based learning, leading to an opposition-based memetic algorithm; the second one improves the previous algorithm by applying the heuristic of two breakpoint elimination, leading to a hybrid approach. Several experiments were performed with one hundred randomly generated permutations, single benchmark permutations, and biological permutations. Results of the experiments showed that the proposed OBMA and Hybrid-OBMA algorithms achieve the best results for practical cases, that is, for permutations of length up to 120. Also, Hybrid-OBMA was shown to improve on the results of OBMA for permutations of length 60 or greater. The applicability of our proposed algorithms was checked by processing permutations based on biological data, in which case OBMA gave the best average results for all instances.
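    To make the ingredients concrete, the sketch below shows the reversal move itself, a breakpoint count that is commonly used as a surrogate fitness in genetic and memetic algorithms for this problem, and one common way of forming the "opposite" of a permutation for opposition-based learning (element x mapped to n + 1 - x). These are textbook building blocks, not the authors' OBMA or Hybrid-OBMA implementations.

    ```python
    import numpy as np

    def reversal(pi, i, j):
        # Reverse the segment pi[i..j]; the minimum number of such moves that sorts
        # the permutation is the reversal distance being estimated.
        out = pi.copy()
        out[i:j + 1] = out[i:j + 1][::-1]
        return out

    def breakpoints(pi):
        # Adjacent pairs that are not consecutive once the permutation is framed by
        # 0 and n + 1; a standard surrogate objective, since one reversal removes at
        # most two breakpoints.
        framed = np.concatenate(([0], pi, [len(pi) + 1]))
        return int(np.sum(np.abs(np.diff(framed)) != 1))

    def opposite(pi):
        # Opposition-based learning for permutations (one common choice): map each
        # element x to n + 1 - x to obtain a candidate far from the current one.
        return len(pi) + 1 - pi

    pi = np.array([3, 1, 4, 2, 5])
    print(breakpoints(pi), breakpoints(opposite(pi)), breakpoints(reversal(pi, 1, 3)))
    ```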

  19. Registration of planar bioluminescence to magnetic resonance and x-ray computed tomography images as a platform for the development of bioluminescence tomography reconstruction algorithms.

    PubMed

    Beattie, Bradley J; Klose, Alexander D; Le, Carl H; Longo, Valerie A; Dobrenkov, Konstantine; Vider, Jelena; Koutcher, Jason A; Blasberg, Ronald G

    2009-01-01

    The procedures we propose make possible the mapping of two-dimensional (2-D) bioluminescence image (BLI) data onto a skin surface derived from a three-dimensional (3-D) anatomical modality [magnetic resonance (MR) or computed tomography (CT)] dataset. This mapping allows anatomical information to be incorporated into bioluminescence tomography (BLT) reconstruction procedures and, when applied using sources visible to both optical and anatomical modalities, can be used to evaluate the accuracy of those reconstructions. Our procedures, based on immobilization of the animal and a priori determined fixed projective transforms, should be more robust and accurate than previously described efforts, which rely on a poorly constrained retrospectively determined warping of the 3-D anatomical information. Experiments conducted to measure the accuracy of the proposed registration procedure found it to have a mean error of 0.36+/-0.23 mm. Additional experiments highlight some of the confounds that are often overlooked in the BLT reconstruction process, and for two of these confounds, simple corrections are proposed.

  20. Cosmic reionization on computers. Mean and fluctuating redshifted 21 CM signal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaurov, Alexander A.; Gnedin, Nickolay Y.

    We explore the mean and fluctuating redshifted 21 cm signal in numerical simulations from the Cosmic Reionization On Computers project. We find that the mean signal varies between about ±25 mK. Most significantly, we find that the negative pre-reionization dip at z ~ 10–15 only extends to ⟨ΔT_B⟩ ~ -25 mK, requiring substantially higher sensitivity from global signal experiments that operate in this redshift range (EDGES-II, LEDA, SCI-HI, and DARE) than has often been assumed previously. We also explore the role of dense substructure (filaments and embedded galaxies) in the formation of the 21 cm power spectrum. We find that by neglecting the semi-neutral substructure inside ionized bubbles, the power spectrum can be misestimated by 25%–50% at scales k ~ 0.1–1 h Mpc⁻¹. Furthermore, this scale range is of particular interest, because the upcoming 21 cm experiments (Murchison Widefield Array, Precision Array for Probing the Epoch of Reionization, Hydrogen Epoch of Reionization Array) are expected to be most sensitive within it.

  1. Numerical study to assess sulfur hexafluoride as a medium for testing multielement airfoils

    NASA Technical Reports Server (NTRS)

    Bonhaus, Daryl L.; Anderson, W. Kyle; Mavriplis, Dimitri J.

    1995-01-01

    A methodology is described for computing viscous flows of air and sulfur hexafluoride (SF6). The basis is an existing flow solver that calculates turbulent flows in two dimensions on unstructured triangular meshes. The solver has been modified to incorporate the thermodynamic model for SF6 and used to calculate the viscous flow over two multielement airfoils that have been tested in a wind tunnel with air as the test medium. Flows of both air and SF6 at a free-stream Mach number of 0.2 and a Reynolds number of 9 × 10⁶ are computed for a range of angles of attack corresponding to the wind-tunnel test. The computations are used to investigate the suitability of SF6 as a test medium in wind tunnels and are a follow-on to previous computations for single-element airfoils. Surface-pressure, lift, and drag coefficients are compared with experimental data. The effects of heavy gas on the details of the flow are investigated based on computed boundary-layer and skin-friction data. In general, the predictions in SF6 vary little from those in air. Within the limitations of the computational method, the results presented are sufficiently encouraging to warrant further experiments.

  2. The joint effect of mesoscale and microscale roughness on perceived gloss.

    PubMed

    Qi, Lin; Chantler, Mike J; Siebert, J Paul; Dong, Junyu

    2015-10-01

    Computer-simulated stimuli provide a flexible method for creating artificial scenes in the study of visual perception of material surface properties. Previous work based on this approach reported that the properties of surface roughness and glossiness are mutually interdependent and that, therefore, perception of one affects perception of the other; in that work, roughness was limited to a surface property termed bumpiness. This paper reports a study into how perceived gloss varies with two model parameters related to surface roughness in computer simulations: the mesoscale roughness parameter in a surface geometry model and the microscale roughness parameter in a surface reflectance model. We used a real-world environment map to provide complex illumination and a physically based path tracer to render the stimuli. Eight observers took part in a 2AFC experiment, and the results were tested against conjoint measurement models. We found that although both roughness parameters significantly affect perceived gloss, the additive model does not adequately describe their mutually interactive and nonlinear influence, which is at variance with previous findings. We investigated five image properties used to quantify specular highlights and found that perceived gloss is well predicted using a linear model. Our findings provide computational support for the 'statistical appearance models' recently proposed for material perception. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Facilitating higher-fidelity simulations of axial compressor instability and other turbomachinery flow conditions

    NASA Astrophysics Data System (ADS)

    Herrick, Gregory Paul

    The quest to accurately capture flow phenomena with length-scales both short and long and to accurately represent complex flow phenomena within disparately sized geometry inspires a need for an efficient, high-fidelity, multi-block structured computational fluid dynamics (CFD) parallel computational scheme. This research presents and demonstrates a more efficient computational method by which to perform multi-block structured CFD parallel computational simulations, thus facilitating higher-fidelity solutions of complicated geometries (due to the inclusion of grids for "small" flow areas which are often merely modeled) and their associated flows. This computational framework offers greater flexibility and user-control in allocating the resource balance between process count and wall-clock computation time. The principal modifications implemented in this revision consist of a "multiple grid block per processing core" software infrastructure and an analytic computation of viscous flux Jacobians. The development of this scheme is largely motivated by the desire to simulate axial compressor stall inception with more complete gridding of the flow passages (including rotor tip clearance regions) than has been previously done while maintaining high computational efficiency (i.e., minimal consumption of computational resources), and thus this paradigm shall be demonstrated with an examination of instability in a transonic axial compressor. However, the paradigm presented herein facilitates CFD simulation of myriad previously impractical geometries and flows and is not limited to detailed analyses of axial compressor flows. While the simulations presented herein were technically possible under the previous structure of the subject software, they were much less computationally efficient and thus not pragmatically feasible; the previous research using this software to perform three-dimensional, full-annulus, time-accurate, unsteady, full-stage (with sliding-interface) simulations of rotating stall inception in axial compressors utilized tip clearance periodic models, while the scheme here is demonstrated by a simulation of axial compressor stall inception utilizing gridded rotor tip clearance regions. As will be discussed, much previous research (experimental, theoretical, and computational) has suggested that understanding clearance flow behavior is critical to understanding stall inception, and previous computational research efforts which have used tip clearance models have begged the question, "What about the clearance flows?". This research begins to address that question.
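    As a rough illustration of the "multiple grid block per processing core" idea (a generic load-balancing sketch, not the dissertation's actual infrastructure; block sizes and rank count are hypothetical), grid blocks of very different sizes, such as large passage grids and many small tip-clearance grids, can be packed onto a fixed number of ranks so that each rank carries a comparable cell count:

```python
import heapq

def assign_blocks(block_cells, n_ranks):
    """Greedy static load balancing: assign each grid block (by cell count,
    largest first) to the rank with the smallest current load. Returns a
    list of block-index lists, one per rank."""
    heap = [(0, rank) for rank in range(n_ranks)]   # (cells assigned, rank id)
    heapq.heapify(heap)
    assignment = [[] for _ in range(n_ranks)]
    order = sorted(range(len(block_cells)), key=lambda b: -block_cells[b])
    for b in order:
        load, rank = heapq.heappop(heap)
        assignment[rank].append(b)
        heapq.heappush(heap, (load + block_cells[b], rank))
    return assignment

if __name__ == "__main__":
    # Hypothetical cell counts: a few large passage blocks plus many small clearance blocks.
    blocks = [200_000, 180_000, 150_000] + [12_000] * 10
    for rank, blks in enumerate(assign_blocks(blocks, n_ranks=4)):
        print(f"rank {rank}: blocks {blks}, cells {sum(blocks[b] for b in blks)}")
```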

  4. Cone beam x-ray luminescence computed tomography: a feasibility study.

    PubMed

    Chen, Dongmei; Zhu, Shouping; Yi, Huangjian; Zhang, Xianghan; Chen, Duofang; Liang, Jimin; Tian, Jie

    2013-03-01

    The appearance of x-ray luminescence computed tomography (XLCT) opens new possibilities for performing molecular imaging by x-ray. In the previous XLCT system, the sample was irradiated by a sequence of narrow x-ray beams and the x-ray luminescence was measured by a highly sensitive charge coupled device (CCD) camera. This resulted in a relatively long sampling time and relatively low utilization of the x-ray beam. In this paper, a novel cone beam x-ray luminescence computed tomography strategy is proposed, which can fully utilize the x-ray dose and shorten the scanning time. The imaging model and reconstruction method are described, and the validity of the imaging strategy is studied. In the cone beam XLCT system, a cone beam x-ray illuminates the sample and a highly sensitive CCD camera acquires the luminescent photons emitted from the sample. Photon scattering in biological tissues makes reconstructing the 3D distribution of the x-ray luminescent sample in cone beam XLCT an ill-posed problem. To overcome this issue, the authors used the diffusion approximation model to describe photon propagation in tissues and employed a sparse regularization method for reconstruction. An incomplete variables truncated conjugate gradient method and a permissible region strategy were used for reconstruction. Meanwhile, traditional x-ray CT imaging could also be performed in this system. The x-ray attenuation effect is considered in the imaging model, which helps improve reconstruction accuracy. First, simulation experiments with cylinder phantoms were carried out to illustrate the validity of the proposed compensated method. The experimental results showed that the location error of the compensated algorithm was smaller than that of the uncompensated method. The permissible region strategy was applied and reduced the reconstruction error to less than 2 mm. The robustness and stability were then evaluated for different view numbers, different regularization parameters, different measurement noise levels, and optical parameter mismatch. The reconstruction results showed that these settings had a small effect on the reconstruction. A nonhomogeneous phantom simulation was also carried out to simulate a more complex experimental situation and to evaluate the proposed method. Second, physical cylinder phantom experiments showed similar results in the authors' prototype XLCT system. Utilizing numerical simulation and physical experiments, the authors demonstrated the validity of the new cone beam XLCT method. Furthermore, compared with the previous narrow beam XLCT, the cone beam XLCT can more fully utilize the x-ray dose and greatly shorten the scanning time. Both the simulation experiments and the physical phantom experiments indicate that the proposed method is feasible for general cases and actual experiments.
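    The sparse-regularization idea can be illustrated with a minimal sketch (a generic iterative soft-thresholding scheme on a random stand-in system matrix, not the paper's incomplete variables truncated conjugate gradient method or its permissible region strategy): a compact luminescent source is recovered by penalizing the l1 norm of the unknown distribution.

```python
import numpy as np

def ista(A, b, lam=0.1, iters=500):
    """Minimal ISTA sketch for the sparsity-regularized least-squares problem
    min_x 0.5*||A x - b||^2 + lam*||x||_1, used only to illustrate sparse
    reconstruction of a compact source from diffuse measurements."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.normal(size=(80, 200))          # stand-in for the photon-propagation system matrix
    x_true = np.zeros(200)
    x_true[[30, 31, 32]] = 1.0              # compact luminescent source
    b = A @ x_true + 0.01 * rng.normal(size=80)
    x_rec = ista(A, b, lam=0.5)
    print("largest entries:", np.argsort(-x_rec)[:5])
```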

  5. Automated Search for new Quantum Experiments.

    PubMed

    Krenn, Mario; Malik, Mehul; Fickler, Robert; Lapkiewicz, Radek; Zeilinger, Anton

    2016-03-04

    Quantum mechanics predicts a number of, at first sight, counterintuitive phenomena. It therefore remains a question whether our intuition is the best way to find new experiments. Here, we report the development of the computer algorithm Melvin, which is able to find new experimental implementations for the creation and manipulation of complex quantum states. Indeed, the discovered experiments extensively use unfamiliar and asymmetric techniques which are challenging to understand intuitively. The results range from the first implementation of a high-dimensional Greenberger-Horne-Zeilinger state to a vast variety of experiments for asymmetrically entangled quantum states, a feature that can only exist when both the number of involved parties and the number of dimensions are larger than two. Additionally, new types of high-dimensional transformations are found that perform cyclic operations. Melvin autonomously learns from solutions for simpler systems, which significantly speeds up the discovery rate of more complex experiments. The ability to automate the design of a quantum experiment can be applied to many quantum systems and allows the physical realization of quantum states previously thought of only on paper.

  6. Evolution of design considerations in complex craniofacial reconstruction using patient-specific implants.

    PubMed

    Peel, Sean; Bhatia, Satyajeet; Eggbeer, Dominic; Morris, Daniel S; Hayhurst, Caroline

    2017-06-01

    Previously published evidence has established major clinical benefits from using computer-aided design, computer-aided manufacturing, and additive manufacturing to produce patient-specific devices. These include cutting guides, drilling guides, positioning guides, and implants. However, custom devices produced using these methods are still not in routine use, particularly by the UK National Health Service. Oft-cited reasons for this slow uptake include the following: a higher up-front cost than conventionally fabricated devices, material-choice uncertainty, and a lack of long-term follow-up due to their relatively recent introduction. This article identifies a further gap in current knowledge - that of design rules, or key specification considerations for complex computer-aided design/computer-aided manufacturing/additive manufacturing devices. This research begins to address the gap by combining a detailed review of the literature with first-hand experience of interdisciplinary collaboration on five craniofacial patient case studies. In each patient case, bony lesions in the orbito-temporal region were segmented, excised, and reconstructed in the virtual environment. Three cases translated these digital plans into theatre via polymer surgical guides. Four cases utilised additive manufacturing to fabricate titanium implants. One implant was machined from polyether ether ketone. From the literature, articles with relevant abstracts were analysed to extract design considerations. In all, 19 frequently recurring design considerations were extracted from previous publications. Nine new design considerations were extracted from the case studies - on the basis of subjective clinical evaluation. These were synthesised to produce a design considerations framework to assist clinicians with prescribing and design engineers with modelling. Promising avenues for further research are proposed.

  7. Identifying Differences between Depressed Adolescent Suicide Ideators and Attempters

    PubMed Central

    Auerbach, Randy P.; Millner, Alexander J.; Stewart, Jeremy G.; Esposito, Erika

    2015-01-01

    Background Adolescent depression and suicide are pressing public health concerns, and identifying key differences among suicide ideators and attempters is critical. The goal of the current study is to test whether depressed adolescent suicide attempters report greater anhedonia severity and exhibit aberrant effort-cost computations in the face of uncertainty. Methods Depressed adolescents (n = 101) ages 13–19 years were administered structured clinical interviews to assess current mental health disorders and a history of suicidality (suicide ideators = 55, suicide attempters = 46). Then, participants completed self-report instruments assessing symptoms of suicidal ideation, depression, anhedonia, and anxiety as well as a computerized effort-cost computation task. Results Compared with depressed adolescent suicide ideators, attempters report greater anhedonia severity, even after concurrently controlling for symptoms of suicidal ideation, depression, and anxiety. Additionally, when completing the effort-cost computation task, suicide attempters are less likely to pursue the difficult, high-value option when outcomes are uncertain. Follow-up, trial-level analyses of effort-cost computations suggest that receipt of reward does not influence future decision-making among suicide attempters, whereas suicide ideators exhibit a win-stay approach after receiving rewards on previous trials. Limitations Findings should be considered in light of limitations including a modest sample size, which limits generalizability, and the cross-sectional design. Conclusions Depressed adolescent suicide attempters are characterized by greater anhedonia severity, which may impair the ability to integrate previous rewarding experiences to inform future decisions. Taken together, this may generate a feeling of powerlessness that contributes to increased suicidality and a needless loss of life. PMID:26233323

  8. A Lagrangian subgrid-scale model with dynamic estimation of Lagrangian time scale for large eddy simulation of complex flows

    NASA Astrophysics Data System (ADS)

    Verma, Aman; Mahesh, Krishnan

    2012-08-01

    The dynamic Lagrangian averaging approach for the dynamic Smagorinsky model for large eddy simulation is extended to an unstructured grid framework and applied to complex flows. The Lagrangian time scale is dynamically computed from the solution and does not need any adjustable parameter. The time scale used in the standard Lagrangian model contains an adjustable parameter θ. The dynamic time scale is computed based on a "surrogate-correlation" of the Germano-identity error (GIE). Also, a simple material derivative relation is used to approximate GIE at different events along a pathline instead of Lagrangian tracking or multi-linear interpolation. Previously, the time scale for homogeneous flows was computed by averaging along directions of homogeneity. The present work proposes modifications for inhomogeneous flows. This development allows the Lagrangian averaged dynamic model to be applied to inhomogeneous flows without any adjustable parameter. The proposed model is applied to LES of turbulent channel flow on unstructured zonal grids at various Reynolds numbers. Improvement is observed when compared to other averaging procedures for the dynamic Smagorinsky model, especially at coarse resolutions. The model is also applied to flow over a cylinder at two Reynolds numbers and good agreement with previous computations and experiments is obtained. Noticeable improvement is obtained using the proposed model over the standard Lagrangian model. The improvement is attributed to a physically consistent Lagrangian time scale. The model also shows good performance when applied to flow past a marine propeller in an off-design condition; it regularizes the eddy viscosity and adjusts locally to the dominant flow features.
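    A minimal sketch of the Lagrangian-averaging step itself (assuming a fixed averaging time scale T and a 1-D field; the paper's contribution is precisely to compute T dynamically and to handle general unstructured, inhomogeneous cases, which this toy does not attempt) looks as follows:

```python
import numpy as np

def lagrangian_average(phi_new, phi_avg, u, dt, dx, T):
    """One update of a Lagrangian-averaged quantity on a 1-D grid: the running
    average is relaxed toward the instantaneous value phi_new over a time
    scale T, with the previous average evaluated at the upstream departure
    point x - u*dt via a first-order Taylor (material-derivative) estimate."""
    eps = (dt / T) / (1.0 + dt / T)                  # standard relaxation factor
    upstream = phi_avg - u * dt * np.gradient(phi_avg, dx)
    return eps * phi_new + (1.0 - eps) * upstream

if __name__ == "__main__":
    x = np.linspace(0.0, 1.0, 64)
    phi_avg = np.sin(2 * np.pi * x)                  # previous running average
    phi_new = np.sin(2 * np.pi * x) ** 2             # new instantaneous field
    print(lagrangian_average(phi_new, phi_avg, u=1.0, dt=1e-3, dx=x[1] - x[0], T=0.1)[:4])
```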

  9. Antitumor Activity of Lankacidin Group Antibiotics Is Due to Microtubule Stabilization via a Paclitaxel-like Mechanism.

    PubMed

    Ayoub, Ahmed Taha; Abou El-Magd, Rabab M; Xiao, Jack; Lewis, Cody Wayne; Tilli, Tatiana Martins; Arakawa, Kenji; Nindita, Yosi; Chan, Gordon; Sun, Luxin; Glover, Mark; Klobukowski, Mariusz; Tuszynski, Jack

    2016-10-27

    Lankacidin group antibiotics show strong antimicrobial activity against various Gram-positive bacteria. In addition, they were shown to have considerable antitumor activity against certain cell line models. For decades, the antitumor activity of lankacidin was associated with the mechanism of its antimicrobial action, which is interference with peptide bond formation during protein synthesis; this, however, was never confirmed experimentally. Because lankacidin showed significant similarity to paclitaxel-like hits in a previous computational virtual screening study, we suggested that its cytotoxic effect is due to a paclitaxel-like action. In this study, we tested this hypothesis computationally and experimentally and confirmed that lankacidin is a microtubule stabilizer that enhances tubulin assembly and displaces taxoids from their binding site. This study serves as a starting point for the optimization of lankacidin derivatives for better antitumor activity. It also highlights the power of computational predictions and their role in guiding experiments and formulating rigorous hypotheses.

  10. How Much Higher Can HTCondor Fly?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fajardo, E. M.; Dost, J. M.; Holzman, B.

    The HTCondor high throughput computing system is heavily used in the high energy physics (HEP) community as the batch system for several Worldwide LHC Computing Grid (WLCG) resources. Moreover, it is the backbone of GlideinWMS, the pilot system used by the computing organization of the Compact Muon Solenoid (CMS) experiment. To prepare for LHC Run 2, we probed the scalability limits of new versions and configurations of HTCondor with a goal of reaching 200,000 simultaneously running jobs in a single internationally distributed dynamic pool. In this paper, we first describe how we created an opportunistic distributed testbed capable of exercising runs with 200,000 simultaneous jobs without impacting production. This testbed methodology is appropriate not only for scale testing HTCondor, but potentially for many other services. In addition to the test conditions and the testbed topology, we include the suggested configuration options used to obtain the scaling results, and describe some of the changes to HTCondor inspired by our testing that enabled sustained operations at scales well beyond previous limits.

  11. Parallel Fortran-MPI software for numerical inversion of the Laplace transform and its application to oscillatory water levels in groundwater environments

    USGS Publications Warehouse

    Zhan, X.

    2005-01-01

    Parallel Fortran-MPI (Message Passing Interface) software for numerical inversion of the Laplace transform, based on a Fourier series method, is developed to meet the need of solving computationally intensive problems involving the oscillatory water-level response to hydraulic tests in a groundwater environment. The software is a parallel version of ACM (Association for Computing Machinery) Transactions on Mathematical Software (TOMS) Algorithm 796. Running 38 test examples indicated that the implementation of MPI techniques on a distributed-memory architecture speeds up the processing and improves efficiency. Applications to oscillatory water levels in a well during aquifer tests are presented to illustrate how this package can be applied to solve complicated environmental problems involving differential and integral equations. The package is free and easy to use for people with little or no previous experience with MPI who wish to get off to a quick start in parallel computing. © 2004 Elsevier Ltd. All rights reserved.
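    The core numerical idea, inverting a Laplace transform through a truncated Fourier series evaluated along a shifted contour, can be sketched as follows (the contour abscissa, period, and term count are simple illustrative heuristics, not the tuned choices of Algorithm 796, and no convergence acceleration is applied):

```python
import numpy as np

def invert_laplace(F, t, sigma=None, T=None, n_terms=200):
    """Minimal Fourier-series (Dubner-Abate style) numerical inversion of a
    Laplace transform F(s) at times t, valid for 0 < t < 2*T."""
    t = np.asarray(t, dtype=float)
    if T is None:
        T = 2.0 * t.max()
    if sigma is None:
        sigma = -np.log(1e-8) / (2.0 * T)   # keeps the aliasing error near 1e-8
    k = np.arange(1, n_terms + 1)
    s = sigma + 1j * k * np.pi / T
    Fk = np.array([F(sk) for sk in s])
    series = 0.5 * np.real(F(sigma)) + np.real(
        Fk[None, :] * np.exp(1j * np.outer(t, k) * np.pi / T)).sum(axis=1)
    return np.exp(sigma * t) / T * series

if __name__ == "__main__":
    # Known pair: F(s) = 1/(s + 1)  <->  f(t) = exp(-t)
    t = np.linspace(0.1, 5.0, 5)
    approx = invert_laplace(lambda s: 1.0 / (s + 1.0), t)
    print(np.c_[t, approx, np.exp(-t)])
```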

  12. Transport of Space Environment Electrons: A Simplified Rapid-Analysis Computational Procedure

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Anderson, Brooke M.; Cucinotta, Francis A.; Wilson, John W.; Katz, Robert; Chang, C. K.

    2002-01-01

    A computational procedure for describing transport of electrons in condensed media has been formulated for application to effects and exposures from spectral distributions typical of electrons trapped in planetary magnetic fields. The procedure is based on earlier parameterizations established from numerous electron beam experiments. New parameterizations have been derived that logically extend the domain of application to low molecular weight (high hydrogen content) materials and higher energies (approximately 50 MeV). The production and transport of high energy photons (bremsstrahlung) generated in the electron transport processes have also been modeled using tabulated values of photon production cross sections. A primary purpose for developing the procedure has been to provide a means for rapidly performing numerous repetitive calculations essential for electron radiation exposure assessments for complex space structures. Several favorable comparisons have been made with previous calculations for typical space environment spectra, which have indicated that accuracy has not been substantially compromised at the expense of computational speed.

  13. Effects of Geometric Details on Slat Noise Generation and Propagation

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi R.; Lockard, David P.

    2006-01-01

    The relevance of geometric details to the generation and propagation of noise from leading-edge slats is considered. Typically, such details are omitted in computational simulations and model-scale experiments thereby creating ambiguities in comparisons with acoustic results from flight tests. The current study uses two-dimensional, computational simulations in conjunction with a Ffowcs Williams-Hawkings (FW-H) solver to investigate the effects of previously neglected slat "bulb" and "blade" seals on the local flow field and the associated acoustic radiation. The computations clearly show that the presence of the "blade" seal at the cusp significantly changes the slat cove flow dynamics, reduces the amplitudes of the radiated sound, and to a lesser extent, alters the directivity beneath the airfoil. Furthermore, it is demonstrated that a modest extension of the baseline "blade" seal further enhances the suppression of slat noise. As a side issue, the utility and equivalence of FW-H methodology for calculating far-field noise as opposed to a more direct approach is examined and demonstrated.

  14. Investigation of in-flame soot optical properties in laminar coflow diffusion flames using thermophoretic particle sampling and spectral light extinction

    NASA Astrophysics Data System (ADS)

    Kempema, Nathan J.; Ma, Bin; Long, Marshall B.

    2016-09-01

    Soot optical properties are essential to the noninvasive study of the in-flame evolution of soot particles since they allow quantitative interpretation of optical diagnostics. Such experimental data are critical for comparison to results from computational models and soot sub-models. In this study, the thermophoretic sampling particle diagnostic (TSPD) technique is applied along with data from a previous spectrally resolved line-of-sight light attenuation experiment to determine the soot volume fraction and absorption function. The TSPD technique is applied in a flame stabilized on the Yale burner, and the soot scattering-to-absorption ratio is calculated using the Rayleigh-Debye-Gans theory for fractal aggregates and morphology information from a previous sampling experiment. The soot absorption function is determined as a function of wavelength and found to be in excellent agreement with previous in-flame measurements of the soot absorption function in coflow laminar diffusion flames. Two-dimensional maps of the soot dispersion exponent are calculated and show that the soot absorption function may have a positive or negative exponential wavelength dependence depending on the in-flame location. Finally, the wavelength dependence of the soot absorption function is related to the ratio of soot absorption functions, as would be found using two-excitation-wavelength laser-induced incandescence.

  15. The Search for Effective Algorithms for Recovery from Loss of Separation

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Hagen, George E.; Maddalon, Jeffrey M.; Munoz, Cesar A.; Narawicz, Anthony J.

    2012-01-01

    Our previous work presented an approach for developing high confidence algorithms for recovering aircraft from loss of separation situations. The correctness theorems for the algorithms relied on several key assumptions, namely that state data for all local aircraft is perfectly known, that resolution maneuvers can be achieved instantaneously, and that all aircraft compute resolutions using exactly the same data. Experiments showed that these assumptions were adequate in cases where the aircraft are far away from losing separation, but are insufficient when the aircraft have already lost separation. This paper describes the results of this experimentation and proposes a new criteria specification for loss of separation recovery that preserves the formal safety properties of the previous criteria while overcoming some key limitations. Candidate algorithms that satisfy the new criteria are presented.

  16. Natural History of Ground-Glass Lesions Among Patients With Previous Lung Cancer.

    PubMed

    Shewale, Jitesh B; Nelson, David B; Rice, David C; Sepesi, Boris; Hofstetter, Wayne L; Mehran, Reza J; Vaporciyan, Ara A; Walsh, Garrett L; Swisher, Stephen G; Roth, Jack A; Antonoff, Mara B

    2018-06-01

    Among patients with previous lung cancer, the malignant potential of subsequent ground-glass opacities (GGOs) on computed tomography remains unknown, with a lack of consensus regarding surveillance and intervention. This study sought to describe the natural history of GGO in patients with a history of lung cancer. A retrospective review was performed of 210 patients with a history of lung cancer and ensuing computed tomography evidence of pure or mixed GGOs between 2007 and 2013. Computed tomography reports were reviewed to determine the fate of the GGOs, by classifying all lesions as stable, resolved, or progressive over the course of the study. Multivariable analysis was performed to identify predictors of GGO progression and resolution. The mean follow-up time was 13 months. During this period, 55 (26%) patients' GGOs were stable, 131 (62%) resolved, and 24 (11%) progressed. Of the 24 GGOs that progressed, three were subsequently diagnosed as adenocarcinoma. Patients of black race (odds ratio [OR], 0.26) and other races besides white (OR, 0.89) had smaller odds of GGO resolution (p = 0.033), whereas patients with previous lung squamous cell carcinoma (OR, 5.16) or small cell carcinoma (OR, 5.36) were more likely to experience GGO resolution (p < 0.001). On multivariable analysis, only a history of adenocarcinoma was an independent predictor of GGO progression (OR, 6.9; p = 0.011). Among patients with a history of lung cancer, prior adenocarcinoma emerged as a predictor of GGO progression, whereas a history of squamous cell carcinoma or small cell carcinoma and white race were identified as predictors of GGO resolution. Copyright © 2018 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  17. Identification of genes related to proliferative diabetic retinopathy through RWR algorithm based on protein-protein interaction network.

    PubMed

    Zhang, Jian; Suo, Yan; Liu, Min; Xu, Xun

    2018-06-01

    Proliferative diabetic retinopathy (PDR) is one of the most common complications of diabetes and can lead to blindness. Proteomic studies have provided insight into the pathogenesis of PDR, and a series of PDR-related genes has been identified, but these genes are far from fully characterized because the experimental methods are expensive and time-consuming. In our previous study, we identified 35 candidate PDR-related genes through a shortest-path algorithm. In the current study, we developed a computational method using the random walk with restart (RWR) algorithm on the protein-protein interaction (PPI) network to identify potential PDR-related genes. After candidate genes were obtained by the RWR algorithm, a three-stage filtration strategy, comprising a permutation test, an interaction test and an enrichment test, was applied to exclude potential false positives caused by the structure of the PPI network, poor interaction strength, and limited similarity of gene ontology (GO) terms and biological pathways. As a result, 36 candidate genes, different from the 35 genes reported in our previous study, were discovered by this method. A literature review showed that 21 of these 36 genes are supported by previous experiments. These findings suggest the robustness and complementarity of our efforts using different computational methods, thus providing an alternative approach to studying PDR pathogenesis. Copyright © 2017 Elsevier B.V. All rights reserved.
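    A compact sketch of the RWR scoring step on a PPI network (the adjacency matrix, seed genes, and restart probability below are toy values; the paper's three-stage filtration is not included) is:

```python
import numpy as np

def random_walk_with_restart(W, seeds, restart=0.5, tol=1e-10, max_iter=1000):
    """Random walk with restart on a protein-protein interaction network.
    W is a symmetric adjacency matrix; seeds are indices of known disease
    genes. Returns a stationary probability over all nodes, used to rank
    candidate genes by proximity to the seed set."""
    W = np.asarray(W, dtype=float)
    col_sums = W.sum(axis=0)
    col_sums[col_sums == 0] = 1.0             # avoid division by zero for isolated nodes
    P = W / col_sums                          # column-normalized transition matrix
    p0 = np.zeros(W.shape[0])
    p0[list(seeds)] = 1.0 / len(seeds)        # restart distribution over seed genes
    p = p0.copy()
    for _ in range(max_iter):
        p_next = (1.0 - restart) * P @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:
            break
        p = p_next
    return p

if __name__ == "__main__":
    # Toy 5-node network: nodes 0 and 1 play the role of known disease genes.
    W = np.array([[0, 1, 1, 0, 0],
                  [1, 0, 1, 0, 0],
                  [1, 1, 0, 1, 0],
                  [0, 0, 1, 0, 1],
                  [0, 0, 0, 1, 0]], dtype=float)
    scores = random_walk_with_restart(W, seeds=[0, 1])
    print(np.argsort(-scores))    # candidate ranking, seeds first
```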

  18. Investigation of the computer experiences and attitudes of pre-service mathematics teachers: new evidence from Turkey.

    PubMed

    Birgin, Osman; Catlioğlu, Hakan; Gürbüz, Ramazan; Aydin, Serhat

    2010-10-01

    This study aimed to investigate the experiences of pre-service mathematics (PSM) teachers with computers and their attitudes toward them. The Computer Attitude Scale, Computer Competency Survey, and Computer Use Information Form were administered to 180 Turkish PSM teachers. Results revealed that most PSM teachers used computers at home and at Internet cafes, and that their competency was generally intermediate and upper level. The study concludes that PSM teachers' attitudes about computers differ according to their years of study, computer ownership, level of computer competency, frequency of computer use, computer experience, and whether they had attended a computer-aided instruction course. However, computer attitudes were not affected by gender.

  19. Mathematical analysis of deception.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogers, Deanna Tamae Koike; Durgin, Nancy Ann

    This report describes the results of a three year research project about the use of deception in information protection. The work involved a collaboration between Sandia employees and students in the Center for Cyber Defenders (CCD) and at the University of California at Davis. This report includes a review of the history of deception, a discussion of some cognitive issues, an overview of previous work in deception, the results of experiments on the effects of deception on an attacker, and a mathematical model of error types associated with deception in computer systems.

  20. Cosmological backgrounds of gravitational waves and eLISA/NGO: phase transitions, cosmic strings and other sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binétruy, Pierre; Dufaux, Jean-François; Bohé, Alejandro

    We review several cosmological backgrounds of gravitational waves accessible to direct-detection experiments, with a special emphasis on those backgrounds due to first-order phase transitions and networks of cosmic (super-)strings. For these two particular sources, we revisit in detail the computation of the gravitational wave background and improve the results of previous works in the literature. We apply our results to identify the scientific potential of the NGO/eLISA mission of ESA regarding the detectability of cosmological backgrounds.

  1. On the computation of steady Hopper flows. II: von Mises materials in various geometries

    NASA Astrophysics Data System (ADS)

    Gremaud, Pierre A.; Matthews, John V.; O'Malley, Meghan

    2004-11-01

    Similarity solutions are constructed for the flow of granular materials through hoppers. Unlike previous work, the present approach applies to nonaxisymmetric containers. The model involves ten unknowns (stresses, velocity, and plasticity function) determined by nine nonlinear first order partial differential equations together with a quadratic algebraic constraint (yield condition). A pseudospectral discretization is applied; the resulting problem is solved with a trust region method. The important role of the hopper geometry on the flow is illustrated by several numerical experiments of industrial relevance.

  2. Monte Carlo treatment planning for molecular targeted radiotherapy within the MINERVA system

    NASA Astrophysics Data System (ADS)

    Lehmann, Joerg; Hartmann Siantar, Christine; Wessol, Daniel E.; Wemple, Charles A.; Nigg, David; Cogliati, Josh; Daly, Tom; Descalle, Marie-Anne; Flickinger, Terry; Pletcher, David; DeNardo, Gerald

    2005-03-01

    The aim of this project is to extend accurate and patient-specific treatment planning to new treatment modalities, such as molecular targeted radiation therapy, incorporating previously crafted and proven Monte Carlo and deterministic computation methods. A flexible software environment is being created that allows planning radiation treatment for these new modalities and combining different forms of radiation treatment with consideration of biological effects. The system uses common input interfaces, medical image sets for definition of patient geometry and dose reporting protocols. Previously, the Idaho National Engineering and Environmental Laboratory (INEEL), Montana State University (MSU) and Lawrence Livermore National Laboratory (LLNL) had accrued experience in the development and application of Monte Carlo based, three-dimensional, computational dosimetry and treatment planning tools for radiotherapy in several specialized areas. In particular, INEEL and MSU have developed computational dosimetry systems for neutron radiotherapy and neutron capture therapy, while LLNL has developed the PEREGRINE computational system for external beam photon-electron therapy. Building on that experience, the INEEL and MSU are developing the MINERVA (modality inclusive environment for radiotherapeutic variable analysis) software system as a general framework for computational dosimetry and treatment planning for a variety of emerging forms of radiotherapy. In collaboration with this development, LLNL has extended its PEREGRINE code to accommodate internal sources for molecular targeted radiotherapy (MTR), and has interfaced it with the plugin architecture of MINERVA. Results from the extended PEREGRINE code have been compared to published data from other codes, and found to be in general agreement (EGS4—2%, MCNP—10%) (Descalle et al 2003 Cancer Biother. Radiopharm. 18 71-9). The code is currently being benchmarked against experimental data. The interpatient variability of the drug pharmacokinetics in MTR can only be properly accounted for by image-based, patient-specific treatment planning, as has been common in external beam radiation therapy for many years. MINERVA offers 3D Monte Carlo-based MTR treatment planning as its first integrated operational capability. The new MINERVA system will ultimately incorporate capabilities for a comprehensive list of radiation therapies. In progress are modules for external beam photon-electron therapy and boron neutron capture therapy (BNCT). Brachytherapy and proton therapy are planned. Through the open application programming interface (API), other groups can add their own modules and share them with the community.

  3. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads.

    PubMed

    Stone, John E; Hallock, Michael J; Phillips, James C; Peterson, Joseph R; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-05-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers.

  4. Task Context Influences Brain Activation during Music Listening

    PubMed Central

    Markovic, Andjela; Kühnis, Jürg; Jäncke, Lutz

    2017-01-01

    In this paper, we examined brain activation in subjects during two music listening conditions: listening while simultaneously rating the musical piece being played [Listening and Rating (LR)] and listening to the musical pieces unconstrained [Listening (L)]. Using these two conditions, we tested whether the sequence in which the two conditions were fulfilled influenced the brain activation observable during the L condition (LR → L or L → LR). We recorded high-density EEG during the playing of four well-known positively experienced soundtracks in two subject groups. One group started with the L condition and continued with the LR condition (L → LR); the second group performed this experiment in reversed order (LR → L). We computed from the recorded EEG the power for different frequency bands (theta, lower alpha, upper alpha, lower beta, and upper beta). Statistical analysis revealed that the power in all examined frequency bands increased during the L condition but only when the subjects had not had previous experience with the LR condition (i.e., L → LR). For the subjects who began with the LR condition, there were no power increases during the L condition. Thus, the previous experience with the LR condition prevented subjects from developing the particular mental state associated with the typical power increase in all frequency bands. The subjects without previous experience of the LR condition listened to the musical pieces in an unconstrained and undisturbed manner and showed a general power increase in all frequency bands. We interpret the fact that unconstrained music listening was associated with increased power in all examined frequency bands as a neural indicator of a mental state that can best be described as a mind-wandering state during which the subjects are “drawn into” the music. PMID:28706480

  5. Success of non-traditional students in an undergraduate occupational therapy programme.

    PubMed

    Wheeler, Neil

    2001-01-01

    An exit survey designed to examine the experiences of occupational therapy undergraduates was administered to 365 students in a four-year honours programme. The survey had a response rate of 51% (186). The survey was informed and supplemented by focus groups with international students and computer-mediated conferencing with community leaders from relevant ethnic minorities. Results showed that older students and those with non-traditional entry qualifications in this sample were as successful as school-leaver entrants (those with UK A Level qualifications). There were no significant differences between the support needs of the groups and previous experience did not have a beneficial or significant effect on support needs. Having to maintain part-time employment significantly increased the likelihood that students would consider withdrawing from the programme. For those who considered withdrawing but who went on to successful completion, the desire to practise occupational therapy following their successful experiences in the programme was a powerful motivator.

  6. Performance of a Liner-on-Target Injector for Staged Z-Pinch Experiments

    NASA Astrophysics Data System (ADS)

    Conti, F.; Valenzuela, J. C.; Narkis, J.; Krasheninnikov, I.; Beg, F.; Wessel, F. J.; Ruskov, E.; Rahman, H. U.; McGee, E.

    2016-10-01

    We present the design and characterization of a compact liner-on-target injector, used in the Staged Z-pinch experiments conducted on the UNR-NTF Zebra Facility. Previous experiments and analysis indicate that high-Z gas liners produce a uniform and efficient implosion on a low-Z target plasma. The liner gas shell is produced by an annular solenoid valve and a converging-diverging nozzle designed to achieve a collimated, supersonic, Mach-5 flow. The on-axis target is produced by a coaxial plasma gun, where a high voltage pulse is applied to ionize neutral gas and accelerate the plasma by the J × B force. Measurements of the liner and target dynamics, resolved by interferometry in space and time, fast imaging, and collection of the emitted light, are presented. The results are compared to the predictions from Computational Fluid Dynamics and MHD simulations that model the injector. Optimization of the design parameters, for upcoming Staged Z-pinch experiments, will be discussed. Advanced Research Projects Agency - Energy, DE-AR0000569.

  7. Weighted analysis of paired microarray experiments.

    PubMed

    Kristiansson, Erik; Sjögren, Anders; Rudemo, Mats; Nerman, Olle

    2005-01-01

    In microarray experiments, quality often varies, for example between samples and between arrays, so the need for quality control is strong. A statistical model and a corresponding analysis method are suggested for experiments with pairing, including designs with individuals observed before and after treatment and many experiments with two-colour spotted arrays. The model is of mixed type, with some parameters estimated by an empirical Bayes method. Differences in quality are modelled by individual variances and correlations between repetitions. The method is applied to three real and several simulated datasets. Two of the real datasets are of Affymetrix type with patients profiled before and after treatment, and the third dataset is of two-colour spotted cDNA type. In all cases, the patients or arrays had different estimated variances, leading to distinctly unequal weights in the analysis. We also suggest plots that illustrate the variances and correlations that affect the weights computed by our analysis method. For simulated data, the improvement relative to previously published methods without weighting is shown to be substantial.
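    The weighting idea can be illustrated with a toy inverse-variance scheme (a much simplified stand-in for the paper's mixed model with empirical-Bayes estimation; the data and the per-pair quality estimate below are synthetic):

```python
import numpy as np

def weighted_gene_scores(log_ratios):
    """Toy weighted analysis of paired (before/after) log-ratios. Rows are
    genes, columns are patient pairs. Noisier pairs (larger across-gene
    variance) receive smaller inverse-variance weights in the per-gene mean."""
    pair_var = log_ratios.var(axis=0, ddof=1)   # crude per-pair quality estimate
    w = 1.0 / pair_var
    w /= w.sum()
    return log_ratios @ w                       # weighted mean log-ratio per gene

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = rng.normal(size=(100, 6))
    data[:, 5] *= 4.0            # one low-quality pair, automatically down-weighted
    data[:5] += 2.0              # five genes changed by treatment
    scores = weighted_gene_scores(data)
    print(np.argsort(-np.abs(scores))[:5])      # top-ranked genes
```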

  8. Mesencephalic representations of recent experience influence decision making

    PubMed Central

    Thompson, John A; Costabile, Jamie D; Felsen, Gidon

    2016-01-01

    Decisions are influenced by recent experience, but the neural basis for this phenomenon is not well understood. Here, we address this question in the context of action selection. We focused on activity in the pedunculopontine tegmental nucleus (PPTg), a mesencephalic region that provides input to several nuclei in the action selection network, in well-trained mice selecting actions based on sensory cues and recent trial history. We found that, at the time of action selection, the activity of many PPTg neurons reflected the action on the previous trial and its outcome, and the strength of this activity predicted the upcoming choice. Further, inactivating the PPTg predictably decreased the influence of recent experience on action selection. These findings suggest that PPTg input to downstream motor regions, where it can be integrated with other relevant information, provides a simple mechanism for incorporating recent experience into the computations underlying action selection. DOI: http://dx.doi.org/10.7554/eLife.16572.001 PMID:27454033

  9. Virtual geotechnical laboratory experiments using a simulator

    NASA Astrophysics Data System (ADS)

    Penumadu, Dayakar; Zhao, Rongda; Frost, David

    2000-04-01

    The details of a test simulator that provides a realistic environment for performing virtual laboratory experiments in soil mechanics are presented. A computer program, Geo-Sim, that can be used to perform virtual experiments and allows real-time observation of material response is described. The results of experiments, for a given set of input parameters, are obtained with the test simulator using well-trained artificial-neural-network-based soil models for different soil types and stress paths. Multimedia capabilities are integrated in Geo-Sim, using software that links and controls a laser disc player with real-time parallel processing ability. During the simulation of a virtual experiment, relevant portions of the video image of a previously recorded test on an actual soil specimen are displayed along with the graphical presentation of the response predicted by the feedforward ANN models. The pilot simulator developed to date includes all aspects related to performing a triaxial test on cohesionless soil under undrained and drained conditions. The benefits of the test simulator are also presented.

  10. Solving a Hamiltonian Path Problem with a bacterial computer

    PubMed Central

    Baumgardner, Jordan; Acker, Karen; Adefuye, Oyinade; Crowley, Samuel Thomas; DeLoache, Will; Dickson, James O; Heard, Lane; Martens, Andrew T; Morton, Nickolaus; Ritter, Michelle; Shoecraft, Amber; Treece, Jessica; Unzicker, Matthew; Valencia, Amanda; Waters, Mike; Campbell, A Malcolm; Heyer, Laurie J; Poet, Jeffrey L; Eckdahl, Todd T

    2009-01-01

    Background The Hamiltonian Path Problem asks whether there is a route in a directed graph from a beginning node to an ending node, visiting each node exactly once. The Hamiltonian Path Problem is NP-complete, achieving surprising computational complexity with modest increases in size. This challenge has inspired researchers to broaden the definition of a computer. DNA computers have been developed that solve NP-complete problems. Bacterial computers can be programmed by constructing genetic circuits to execute an algorithm that is responsive to the environment and whose result can be observed. Each bacterium can examine a solution to a mathematical problem and billions of them can explore billions of possible solutions. Bacterial computers can be automated, made responsive to selection, and reproduce themselves so that more processing capacity is applied to problems over time. Results We programmed bacteria with a genetic circuit that enables them to evaluate all possible paths in a directed graph in order to find a Hamiltonian path. We encoded a three-node directed graph as DNA segments that were autonomously shuffled randomly inside bacteria by a Hin/hixC recombination system we previously adapted from Salmonella typhimurium for use in Escherichia coli. We represented nodes in the graph as linked halves of two different genes encoding red or green fluorescent proteins. Bacterial populations displayed phenotypes that reflected random ordering of edges in the graph. Individual bacterial clones that found a Hamiltonian path reported their success by fluorescing both red and green, resulting in yellow colonies. We used DNA sequencing to verify that the yellow phenotype resulted from genotypes that represented Hamiltonian path solutions, demonstrating that our bacterial computer functioned as expected. Conclusion We successfully designed, constructed, and tested a bacterial computer capable of finding a Hamiltonian path in a three-node directed graph. This proof-of-concept experiment demonstrates that bacterial computing is a new way to address NP-complete problems using the inherent advantages of genetic systems. The results of our experiments also validate synthetic biology as a valuable approach to biological engineering. We designed and constructed basic parts, devices, and systems using synthetic biology principles of standardization and abstraction. PMID:19630940
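    For scale, the search space the bacteria explore can be reproduced in a few lines of conventional code; the sketch below brute-forces Hamiltonian paths in a small directed graph (the edge list is illustrative, not the exact graph encoded in the genetic construct):

```python
from itertools import permutations

def hamiltonian_paths(nodes, edges):
    """Exhaustively enumerate Hamiltonian paths in a small directed graph,
    mirroring in silico what the bacterial populations explore by randomly
    shuffling DNA-encoded edges."""
    edge_set = set(edges)
    return [p for p in permutations(nodes)
            if all((a, b) in edge_set for a, b in zip(p, p[1:]))]

if __name__ == "__main__":
    # A three-node directed graph similar in size to the one used in the paper.
    nodes = [1, 2, 3]
    edges = [(1, 2), (2, 3), (1, 3)]
    print(hamiltonian_paths(nodes, edges))   # -> [(1, 2, 3)]
```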

  11. Innovative Science Experiments Using Phoenix

    ERIC Educational Resources Information Center

    Kumar, B. P. Ajith; Satyanarayana, V. V. V.; Singh, Kundan; Singh, Parmanand

    2009-01-01

    A simple, flexible and very low cost hardware plus software framework for developing computer-interfaced science experiments is presented. It can be used for developing computer-interfaced science experiments without getting into the details of electronics or computer programming. For developing experiments this is a middle path between…

  12. Space Station Furnace Facility. Volume 3: Program cost estimate

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The approach used to estimate costs for the Space Station Furnace Facility (SSFF) is based on a computer program developed internally at Teledyne Brown Engineering (TBE). The program produces time-phased estimates of cost elements for each hardware component, based on experience with similar components. Engineering estimates of the degree of similarity or difference between the current project and the historical data are then used to adjust the computer-produced cost estimate and to fit it to the current project Work Breakdown Structure (WBS). The SSFF Concept as presented at the Requirements Definition Review (RDR) was used as the base configuration for the cost estimate. This program incorporates data on costs of previous projects and the allocation of those costs to the components of one of three time-phased, generic WBSs. Input consists of a list of similar components for which cost data exist, the number of interfaces with their type and complexity, identification of the extent to which previous designs are applicable, and programmatic data concerning schedules and miscellaneous items (travel, off-site assignments). Output is program cost in labor hours and material dollars, for each component, broken down by generic WBS task and program schedule phase.

  13. Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks.

    PubMed

    Zazo, Ruben; Lozano-Diez, Alicia; Gonzalez-Dominguez, Javier; Toledano, Doroteo T; Gonzalez-Rodriguez, Joaquin

    2016-01-01

    Long Short Term Memory (LSTM) Recurrent Neural Networks (RNNs) have recently outperformed other state-of-the-art approaches, such as i-vector and Deep Neural Networks (DNNs), in automatic Language Identification (LID), particularly when dealing with very short utterances (∼3s). In this contribution we present an open-source, end-to-end, LSTM RNN system running on limited computational resources (a single GPU) that outperforms a reference i-vector system on a subset of the NIST Language Recognition Evaluation (8 target languages, 3s task) by up to 26%. This result is in line with previously published research using proprietary LSTM implementations and huge computational resources, which made these former results hardly reproducible. Further, we extend those previous experiments by modeling unseen languages (out of set, OOS, modeling), which is crucial in real applications. Results show that an LSTM RNN with OOS modeling is able to detect these languages and generalizes robustly to unseen OOS languages. Finally, we also analyze the effect of even more limited test data (from 2.25 s to 0.1 s), showing that with as little as 0.5 s an accuracy of over 50% can be achieved.
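    The end-to-end architecture can be pictured with a minimal sketch (assuming PyTorch is available; the feature dimension, hidden size, frame rate, and number of target languages are illustrative placeholders, not the system's actual configuration):

```python
import torch
import torch.nn as nn

class LIDNet(nn.Module):
    """Minimal end-to-end LSTM classifier for short utterances: a stack of
    acoustic feature frames goes in, a language posterior comes out."""
    def __init__(self, n_feats=39, hidden=256, n_langs=8):
        super().__init__()
        self.lstm = nn.LSTM(n_feats, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_langs)

    def forward(self, frames):                 # frames: (batch, time, n_feats)
        _, (h_last, _) = self.lstm(frames)     # h_last: (1, batch, hidden)
        return self.out(h_last.squeeze(0))     # (batch, n_langs) logits

if __name__ == "__main__":
    model = LIDNet()
    utterances = torch.randn(4, 300, 39)       # 4 utterances of ~3 s at 10 ms frames
    logits = model(utterances)
    print(logits.shape, logits.softmax(dim=-1).argmax(dim=-1))
```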

  14. Multi-Station Broad Regional Event Detection Using Waveform Correlation

    NASA Astrophysics Data System (ADS)

    Slinkard, M.; Stephen, H.; Young, C. J.; Eckert, R.; Schaff, D. P.; Richards, P. G.

    2013-12-01

    Previous waveform correlation studies have established the occurrence of repeating seismic events in various regions, and the utility of waveform-correlation event-detection on broad regional or even global scales to find events currently not included in traditionally-prepared bulletins. The computational burden, however, is high, limiting previous experiments to relatively modest template libraries and/or processing time periods. We have developed a distributed computing waveform correlation event detection utility that allows us to process years of continuous waveform data with template libraries numbering in the thousands. We have used this system to process several years of waveform data from IRIS stations in East Asia, using libraries of template events taken from global and regional bulletins. Detections at a given station are confirmed by 1) comparison with independent bulletins of seismicity, and 2) consistent detections at other stations. We find that many of the detected events are not in traditional catalogs, hence the multi-station comparison is essential. In addition to detecting the similar events, we also estimate magnitudes very precisely based on comparison with the template events (when magnitudes are available). We have investigated magnitude variation within detected families of similar events, false alarm rates, and the temporal and spatial reach of templates.
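    The detection primitive behind such studies is normalized cross-correlation of a template event against continuous data; a minimal single-station sketch (the synthetic trace, template, and threshold are illustrative) is:

```python
import numpy as np

def correlation_detections(trace, template, threshold=0.7):
    """Slide a waveform template along a continuous trace and return sample
    offsets where the normalized cross-correlation exceeds a threshold."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    cc = np.empty(len(trace) - n + 1)
    for i in range(len(cc)):
        win = trace[i:i + n]
        std = win.std()
        cc[i] = 0.0 if std == 0 else np.dot(t, (win - win.mean()) / std) / n
    return np.where(cc > threshold)[0], cc

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100)) * np.hanning(100)
    trace = 0.1 * rng.normal(size=2000)
    trace[500:600] += template              # a repeating event buried in noise
    trace[1500:1600] += 0.8 * template
    hits, cc = correlation_detections(trace, template)
    print("strongest detections near samples:", hits[cc[hits].argsort()[::-1][:2]])
```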

  16. Computational aeroelasticity using a pressure-based solver

    NASA Astrophysics Data System (ADS)

    Kamakoti, Ramji

    A computational methodology for performing fluid-structure interaction computations for three-dimensional elastic wing geometries is presented. The flow solver used is based on an unsteady Reynolds-Averaged Navier-Stokes (RANS) model. A well-validated k-ε turbulence model with wall function treatment for the near-wall region was used to perform turbulent flow calculations. Relative merits of alternative flow solvers were investigated. The predictor-corrector-based Pressure Implicit Splitting of Operators (PISO) algorithm was found to be computationally economic for unsteady flow computations. The wing structure was modeled using Bernoulli-Euler beam theory. A fully implicit time-marching scheme (using the Newmark integration method) was used to integrate the equations of motion for the structure. Bilinear interpolation and linear extrapolation techniques were used to transfer necessary information between the fluid and structure solvers. Geometry deformation was accounted for by using a moving boundary module. The moving grid capability was based on a master/slave concept and transfinite interpolation techniques. Since computations were performed on a moving mesh system, the geometric conservation law had to be preserved; this was achieved by appropriately evaluating the Jacobian values associated with each cell. Accurate computation of contravariant velocities for unsteady flows using the momentum interpolation method on collocated, curvilinear grids was also addressed. Flutter computations were performed for the AGARD 445.6 wing at subsonic, transonic and supersonic Mach numbers. Unsteady computations were performed at various dynamic pressures to predict the flutter boundary. Results showed favorable agreement with experiment and with previous numerical results. The computational methodology exhibited capabilities to predict both qualitative and quantitative features of aeroelasticity.

  17. Multiprocessor architectural study

    NASA Technical Reports Server (NTRS)

    Kosmala, A. L.; Stanten, S. F.; Vandever, W. H.

    1972-01-01

    An architectural design study was made of a multiprocessor computing system intended to meet functional and performance specifications appropriate to a manned space station application. Intermetrics' previous experience and accumulated knowledge of the multiprocessor field are used to generate a baseline philosophy for the design of a future SUMC* multiprocessor. Interrupts are defined and the crucial questions of interrupt structure, such as processor selection and response time, are discussed. Memory hierarchy and performance are discussed extensively, with particular attention to the design approach which utilizes a cache memory associated with each processor. The ability of an individual processor to approach its theoretical maximum performance is then analyzed in terms of a hit ratio. Memory management is envisioned as a virtual memory system implemented either through segmentation or paging. Addressing is discussed in terms of the various register designs adopted by current computers and those of advanced design.

  18. Gold rush - A swarm dynamics in games

    NASA Astrophysics Data System (ADS)

    Zelinka, Ivan; Bukacek, Michal

    2017-07-01

    This paper is focused on swarm intelligence techniques and their practical use in computer games. The aim is to show how swarm dynamics can be generated by a multiplayer game, then recorded, analyzed and eventually controlled. In this paper we also discuss the possibility of using swarm intelligence in place of game players. Based on our previous experiments, two games using swarm algorithms are mentioned briefly here: the strategy game StarCraft: Brood War, and TicTacToe, in which the SOMA algorithm has also taken the role of a player against a human player. The open research reported here has shown the potential benefit of swarm computation in the field of strategy games and of player strategies based on the recording and analysis of swarm behavior. We propose a new game called Gold Rush as an experimental environment for human or artificial swarm behavior and its subsequent analysis.

  19. Experimental and computational investigation of lateral gauge response in polycarbonate

    NASA Astrophysics Data System (ADS)

    Eliot, Jim; Harris, Ernst; Hazell, Paul; Appleby-Thomas, Gareth; Winter, Ronald; Wood, David; Owen, Gareth

    2011-06-01

    Polycarbonate's use in personal armour systems means its high strain-rate response has been extensively studied. Interestingly, embedded lateral manganin stress gauges in polycarbonate have shown gradients behind incident shocks, suggestive of increasing shear strength. However, such gauges need to be embedded in a central (typically) epoxy interlayer - an inherently invasive approach. Recently, research has suggested that in such metal systems the interlayer/target impedance may contribute to observed gradients in lateral stress. Here, experimental T-gauge (Vishay Micro-Measurements® type J2M-SS-580SF-025) traces from polycarbonate targets are compared to computational simulations. This work extends previous efforts to the case where similar impedance exists across the interlayer and matrix (target) interface. Further, experiments and simulations are presented investigating the effects of a "dry joint" in polycarbonate, in which no encapsulating medium is employed.

  20. Using transonic small disturbance theory for predicting the aeroelastic stability of a flexible wind-tunnel model

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Bennett, Robert M.

    1990-01-01

    The CAP-TSD (Computational Aeroelasticity Program - Transonic Small Disturbance) code, developed at the NASA Langley Research Center, is applied to the Active Flexible Wing (AFW) wind-tunnel model for prediction of the model's transonic aeroelastic behavior. Static aeroelastic solutions using CAP-TSD are computed. Dynamic (flutter) analyses are then performed as perturbations about the static aeroelastic deformations of the AFW. The accuracy of the static aeroelastic procedure is investigated by comparing analytical results to those from previous AFW wind-tunnel experiments. Dynamic results are presented in the form of root loci at different Mach numbers for a heavy gas and for air. The resultant flutter boundaries for both gases are also presented. The effects of viscous damping and angle of attack on the flutter boundary in air are presented as well.

  1. Fostering Multilinguality in the UMLS: A Computational Approach to Terminology Expansion for Multiple Languages

    PubMed Central

    Hellrich, Johannes; Hahn, Udo

    2014-01-01

    We here report on efforts to computationally support the maintenance and extension of multilingual biomedical terminology resources. Our main idea is to treat term acquisition as a classification problem guided by term alignment in parallel multilingual corpora, using termhood information coming from a named entity recognition system as a novel feature. We report on experiments for the Spanish, French, German and Dutch parts of a multilingual UMLS-derived biomedical terminology. These efforts yielded 19k, 18k, 23k and 12k new terms and synonyms, respectively, of which about half relate to concepts without a previously available term label for these non-English languages. Based on expert assessment of a novel German terminology sample, 80% of the newly acquired terms were judged as reasonable additions to the terminology. PMID:25954371

  2. Reduction of fine particle emissions from wood combustion with optimized condensing heat exchangers.

    PubMed

    Gröhn, Arto; Suonmaa, Valtteri; Auvinen, Ari; Lehtinen, Kari E J; Jokiniemi, Jorma

    2009-08-15

    In this study, we designed and built a condensing heat exchanger capable of simultaneous fine particle emission reduction and waste heat recovery. The deposition mechanisms inside the heat exchanger prototype were maximized using a computer model, which was later compared to actual measurements. The main deposition mechanisms were diffusio- and thermophoresis, which have previously been examined in similar conditions only separately. The removal efficiency obtained in the experiments, measured in terms of total number concentration, ranged between 26% and 40% for the given pellet stove and heat exchanger. Size distributions and number concentrations were measured with a TSI Fast Mobility Particle Sizer (FMPS). The computer model predicts that there exists a specific upper limit for thermo- and diffusiophoretic deposition for each temperature and water vapor concentration in the flue gas.

  3. Reply to comment by Melsen et al. on "Most computational hydrology is not reproducible, so is it really science?"

    NASA Astrophysics Data System (ADS)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-03-01

    In this article, we reply to a comment made by Melsen et al. [2017] on our previous commentary regarding reproducibility in computational hydrology. Re-executing someone else's code and workflow to derive a set of published results does not by itself constitute reproducibility. However, it forms a key part of the process: it demonstrates that all the degrees of freedom and choices made by the scientist in running the experiment are contained within that code and workflow. This not only allows us to build and extend directly on the original work, but, with full knowledge of the decisions made in the original experimental setup, it also lets us focus our attention on the degrees of freedom of interest: those that occur in hydrological systems, which are ultimately our subject of study.

  4. The computer integrated documentation project: A merge of hypermedia and AI techniques

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie; Boy, Guy

    1993-01-01

    To generate intelligent indexing that allows context-sensitive information retrieval, a system must be able to acquire knowledge directly through interaction with users. In this paper, we present the architecture for CID (Computer Integrated Documentation). CID is a system that enables integration of various technical documents in a hypertext framework and includes an intelligent browsing system that incorporates indexing in context. CID's knowledge-based indexing mechanism allows case based knowledge acquisition by experimentation. It utilizes on-line user information requirements and suggestions either to reinforce current indexing in case of success or to generate new knowledge in case of failure. This allows CID's intelligent interface system to provide helpful responses, based on previous experience (user feedback). We describe CID's current capabilities and provide an overview of our plans for extending the system.

  5. Boundary-Layer Transition on a Slender Cone in Hypervelocity Flow with Real Gas Effects

    NASA Astrophysics Data System (ADS)

    Jewell, Joseph Stephen

    The laminar to turbulent transition process in boundary layer flows in thermochemical nonequilibrium at high enthalpy is measured and characterized. Experiments are performed in the T5 Hypervelocity Reflected Shock Tunnel at Caltech, using a 1 m length 5-degree half angle axisymmetric cone instrumented with 80 fast-response annular thermocouples, complemented by boundary layer stability computations using the STABL software suite. A new mixing tank is added to the shock tube fill apparatus for premixed freestream gas experiments, and a new cleaning procedure results in more consistent transition measurements. Transition location is nondimensionalized using a scaling with the boundary layer thickness, which is correlated with the acoustic properties of the boundary layer, and compared with parabolized stability equation (PSE) analysis. In these nondimensionalized terms, transition delay with increasing CO2 concentration is observed: tests in 100% and 50% CO2, by mass, transition up to 25% and 15% later, respectively, than air experiments. These results are consistent with previous work indicating that CO2 molecules at elevated temperatures absorb acoustic instabilities in the MHz range, which is the expected frequency of the Mack second-mode instability at these conditions, and also consistent with predictions from PSE analysis. A strong unit Reynolds number effect is observed, which is believed to arise from tunnel noise. NTr for air from 5.4 to 13.2 is computed, substantially higher than previously reported for noisy facilities. Time- and spatially-resolved heat transfer traces are used to track the propagation of turbulent spots, and convection rates at 90%, 76%, and 63% of the boundary layer edge velocity, respectively, are observed for the leading edge, centroid, and trailing edge of the spots. A model constructed with these spot propagation parameters is used to infer spot generation rates from measured transition onset to completion distance. Finally, a novel method to control transition location with boundary layer gas injection is investigated. An appropriate porous-metal injector section for the cone is designed and fabricated, and the efficacy of injected CO2 for delaying transition is gauged at various mass flow rates, and compared with both no injection and chemically inert argon injection cases. While CO2 injection seems to delay transition, and argon injection seems to promote it, the experimental results are inconclusive and matching computations do not predict a reduction in N factor from any CO2 injection condition computed.

  6. Affective Interaction with a Virtual Character Through an fNIRS Brain-Computer Interface.

    PubMed

    Aranyi, Gabor; Pecune, Florian; Charles, Fred; Pelachaud, Catherine; Cavazza, Marc

    2016-01-01

    Affective brain-computer interfaces (BCI) harness Neuroscience knowledge to develop affective interaction from first principles. In this article, we explore affective engagement with a virtual agent through Neurofeedback (NF). We report an experiment where subjects engage with a virtual agent by expressing positive attitudes towards her under a NF paradigm. We use for affective input the asymmetric activity in the dorsolateral prefrontal cortex (DL-PFC), which has been previously found to be related to the high-level affective-motivational dimension of approach/avoidance. The magnitude of left-asymmetric DL-PFC activity, measured using functional near infrared spectroscopy (fNIRS) and treated as a proxy for approach, is mapped onto a control mechanism for the virtual agent's facial expressions, in which action units (AUs) are activated through a neural network. We carried out an experiment with 18 subjects, which demonstrated that subjects are able to successfully engage with the virtual agent by controlling their mental disposition through NF, and that they perceived the agent's responses as realistic and consistent with their projected mental disposition. This interaction paradigm is particularly relevant in the case of affective BCI as it facilitates the volitional activation of specific areas normally not under conscious control. Overall, our contribution reconciles a model of affect derived from brain metabolic data with an ecologically valid, yet computationally controllable, virtual affective communication environment.
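
    A small illustrative sketch of the signal-to-expression mapping described above follows: left and right DL-PFC activity is reduced to an asymmetry score treated as a proxy for approach, which then drives a few facial action-unit (AU) intensities. The logistic scaling and the particular AUs chosen are assumptions for illustration, not the authors' trained neural-network mapping.

    ```python
    # Illustrative sketch: DL-PFC asymmetry -> approach score -> AU intensities.
    # The scaling constants and AU choices are assumptions, not the study's model.
    import numpy as np

    def approach_score(left_dlpfc: np.ndarray, right_dlpfc: np.ndarray) -> float:
        """Left-asymmetric activity treated as a proxy for approach motivation."""
        asym = np.mean(left_dlpfc) - np.mean(right_dlpfc)
        return float(1.0 / (1.0 + np.exp(-5.0 * asym)))   # squash to (0, 1)

    def facial_action_units(score: float) -> dict:
        """Map the approach score onto a few smile-related AU intensities (0-1)."""
        return {
            "AU6_cheek_raiser": score,
            "AU12_lip_corner_puller": score,
            "AU1_inner_brow_raiser": 0.3 * score,
        }

    # Toy usage with simulated oxy-haemoglobin samples from one neurofeedback block.
    rng = np.random.default_rng(1)
    left = rng.normal(0.4, 0.1, 100)    # stronger left activity -> approach
    right = rng.normal(0.1, 0.1, 100)
    print(facial_action_units(approach_score(left, right)))
    ```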

  7. Application verification research of cloud computing technology in the field of real time aerospace experiment

    NASA Astrophysics Data System (ADS)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    According to the requirements of real-time performance, reliability and safety for aerospace experiments, a single-center cloud computing technology application verification platform is constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments is tested and verified. Based on the analysis of the test results, a preliminary conclusion is obtained: cloud computing platforms can be applied to compute-intensive aerospace experiment workloads, whereas for I/O-intensive workloads the traditional physical machine is recommended.

  8. Strategies for laser-induced fluorescence detection of nitric oxide in high-pressure flames. I. A-X (0,0) excitation

    NASA Astrophysics Data System (ADS)

    Bessler, Wolfgang G.; Schulz, Christof; Lee, Tonghun; Jeffries, Jay B.; Hanson, Ronald K.

    2002-06-01

    Three different high-pressure flame measurement strategies for NO laser-induced fluorescence (LIF) with A-X (0,0) excitation have been studied previously with computational simulations and experiments in flames up to 15 bars. Interference from O2 LIF is a significant problem in lean flames for NO LIF measurements, and pressure broadening and quenching lead to increased interference with increased pressure. We investigate the NO LIF signal strength, interference by hot molecular oxygen, and temperature dependence of the three previous schemes and of two newly chosen excitation schemes with wavelength-resolved LIF measurements in premixed methane and air flames at pressures between 1 and 60 bars and a range of fuel/air ratios. In slightly lean flames with an equivalence ratio of 0.83 at 60 bars, the contribution of O2 LIF to the NO LIF signal varies between 8% and 29% for the previous schemes. The O2 interference is best suppressed with excitation at 226.03 nm.

  9. Integrated Graphics Operations and Analysis Lab Development of Advanced Computer Graphics Algorithms

    NASA Technical Reports Server (NTRS)

    Wheaton, Ira M.

    2011-01-01

    The focus of this project is to aid the IGOAL in researching and implementing algorithms for advanced computer graphics. First, this project focused on porting the current International Space Station (ISS) Xbox experience to the web. Previously, the ISS interior fly-around education and outreach experience only ran on an Xbox 360. One of the desires was to take this experience and make it into something that can be put on NASA's educational site for anyone to access. The current code works in the Unity game engine, which does have cross-platform capability but is not 100% compatible. The tasks for an intern to complete this portion consisted of gaining familiarity with Unity and the current ISS Xbox code, porting the Xbox code to the web as is, and modifying the code to work well as a web application. In addition, a procedurally generated cloud algorithm will be developed. Currently, the clouds used in AGEA animations and the Xbox experiences are a texture map. The desire is to create a procedurally generated cloud algorithm to provide dynamically generated clouds for both AGEA animations and the Xbox experiences. This task consists of gaining familiarity with AGEA and the plug-in interface, developing the algorithm, creating an AGEA plug-in to implement the algorithm inside AGEA, and creating a Unity script to implement the algorithm for the Xbox. This portion of the project could not be completed in the time frame of the internship; however, the IGOAL will continue to work on it in the future.
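
    As a hedged illustration of one common way to generate procedural clouds, the sketch below sums several octaves of bilinearly interpolated value noise and thresholds the result into a cloud-density field. This is an illustrative stand-in for the planned algorithm, not the IGOAL or AGEA plug-in implementation.

    ```python
    # Illustrative fractal value-noise cloud layer; parameters are assumptions,
    # not the algorithm developed for AGEA or the Xbox experience.
    import numpy as np

    def value_noise(shape, cells, rng):
        """Bilinear interpolation of a coarse random lattice up to `shape`."""
        coarse = rng.random((cells + 1, cells + 1))
        ys = np.linspace(0, cells, shape[0])
        xs = np.linspace(0, cells, shape[1])
        y0 = np.minimum(np.floor(ys).astype(int), cells - 1)
        x0 = np.minimum(np.floor(xs).astype(int), cells - 1)
        fy, fx = (ys - y0)[:, None], (xs - x0)[None, :]
        c00 = coarse[np.ix_(y0, x0)]
        c01 = coarse[np.ix_(y0, x0 + 1)]
        c10 = coarse[np.ix_(y0 + 1, x0)]
        c11 = coarse[np.ix_(y0 + 1, x0 + 1)]
        top = c00 * (1 - fx) + c01 * fx
        bot = c10 * (1 - fx) + c11 * fx
        return top * (1 - fy) + bot * fy

    def cloud_layer(shape=(256, 256), octaves=5, seed=0):
        rng = np.random.default_rng(seed)
        field = np.zeros(shape)
        for o in range(octaves):                      # add finer, fainter octaves
            field += value_noise(shape, 4 * 2 ** o, rng) / 2 ** o
        field = (field - field.min()) / (field.max() - field.min())
        return np.clip((field - 0.5) * 2.0, 0.0, 1.0)  # carve out sky vs. cloud

    clouds = cloud_layer()
    print(clouds.shape, float(clouds.min()), float(clouds.max()))
    ```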

  10. A method to determine the mammographic regions that show early changes due to the development of breast cancer

    NASA Astrophysics Data System (ADS)

    Karemore, Gopal; Nielsen, Mads; Karssemeijer, Nico; Brandt, Sami S.

    2014-11-01

    It is well understood nowadays that changes in the mammographic parenchymal pattern are an indicator of breast cancer risk, and we have developed a statistical method that estimates the mammogram regions where the parenchymal changes due to breast cancer occur. This region of interest is computed from a score map by utilising the anatomical breast coordinate system developed in our previous work. The method also makes an automatic scale selection to avoid overfitting, while the region estimates are computed by a nested cross-validation scheme. In this way, it is possible to recover those mammogram regions that show a significant difference in classification scores between the cancer and the control group. Our experiments suggested that the most significant mammogram region is the region behind the nipple, which can be justified by previous findings from other research groups. This result was obtained on the basis of cross-validation experiments on independent training, validation and testing sets from a case-control study of 490 women, of whom 245 were diagnosed with breast cancer within a period of 2-4 years after the baseline mammograms. We additionally generalised the estimated region to another study, mini-MIAS, and showed that the transferred region estimate gives at least a similar classification result when compared to the case where the whole breast region is used. In all, by following our method, one most likely improves both preclinical and follow-up breast cancer screening, but a larger study population will be required to test this hypothesis.

  11. Compositional clustering in task structure learning

    PubMed Central

    Frank, Michael J.

    2018-01-01

    Humans are remarkably adept at generalizing knowledge between experiences in a way that can be difficult for computers. Often, this entails generalizing constituent pieces of experiences that do not fully overlap, but nonetheless share useful similarities with, previously acquired knowledge. However, it is often unclear how knowledge gained in one context should generalize to another. Previous computational models and data suggest that rather than learning about each individual context, humans build latent abstract structures and learn to link these structures to arbitrary contexts, facilitating generalization. In these models, task structures that are more popular across contexts are more likely to be revisited in new contexts. However, these models can only re-use policies as a whole and are unable to transfer knowledge about the transition structure of the environment even if only the goal has changed (or vice-versa). This contrasts with ecological settings, where some aspects of task structure, such as the transition function, will be shared between contexts separately from other aspects, such as the reward function. Here, we develop a novel non-parametric Bayesian agent that forms independent latent clusters for transition and reward functions, affording separable transfer of their constituent parts across contexts. We show that the relative performance of this agent compared to an agent that jointly clusters reward and transition functions depends on the environmental task statistics: the mutual information between transition and reward functions and the stochasticity of the observations. We formalize our analysis through an information-theoretic account of the priors, and propose a meta-learning agent that dynamically arbitrates between strategies across task domains to optimize a statistical tradeoff. PMID:29672581

  12. Stretching the Traditional Notion of Experiment in Computing: Explorative Experiments.

    PubMed

    Schiaffonati, Viola

    2016-06-01

    Experimentation represents today a 'hot' topic in computing. While experiments made with the support of computers, such as computer simulations, have received increasing attention from philosophers of science and technology, questions such as "what does it mean to do experiments in computer science and engineering and what are their benefits?" emerged only recently as central to the debate over the disciplinary status of computing. In this work we aim at showing, also by means of paradigmatic examples, how the traditional notion of controlled experiment should be revised to take into account a part of the experimental practice in computing, along the lines of experimentation as exploration. Taking inspiration from the discussion on exploratory experimentation in the philosophy of science (experimentation that is not theory-driven), we advance the idea of explorative experiments that, although not new, can contribute to enlarging the debate about the nature and role of experimental methods in computing. In order to further refine this concept we recast explorative experiments as socio-technical experiments that test new technologies in their socio-technical contexts. We suggest that, when experiments are explorative, control should be intended in an a posteriori form, in opposition to the a priori form that usually takes place in traditional experimental contexts.

  13. Computational Fluid Dynamics at ICMA (Institute for Computational Mathematics and Applications)

    DTIC Science & Technology

    1988-10-18

    Personal author(s): Charles A. Hall and Thomas A. Porsching. ...of ten ICMA (Institute for Computational Mathematics and Applications) personnel, relating to the general area of computational fluid mechanics...questions raised in the previous subsection. Our previous work in this area concentrated on a study of the differential geometric aspects of the problem.

  14. Sequential responding and planning in capuchin monkeys (Cebus apella).

    PubMed

    Beran, Michael J; Parrish, Audrey E

    2012-11-01

    Previous experiments have assessed planning during sequential responding to computer generated stimuli by Old World nonhuman primates including chimpanzees and rhesus macaques. However, no such assessment has been made with a New World primate species. Capuchin monkeys (Cebus apella) are an interesting test case for assessing the distribution of cognitive processes in the Order Primates because they sometimes show proficiency in tasks also mastered by apes and Old World monkeys, but in other cases fail to match the proficiency of those other species. In two experiments, eight capuchin monkeys selected five arbitrary stimuli in distinct locations on a computer monitor in a learned sequence. In Experiment 1, shift trials occurred in which the second and third stimuli were transposed when the first stimulus was selected by the animal. In Experiment 2, mask trials occurred in which all remaining stimuli were masked after the monkey selected the first stimulus. Monkeys made more mistakes on trials in which the locations of the second and third stimuli were interchanged than on trials in which locations were not interchanged, suggesting they had already planned to select a location that no longer contained the correct stimulus. When mask trials occurred, monkeys performed at levels significantly better than chance, but their performance exceeded chance levels only for the first and the second selections on a trial. These data indicate that capuchin monkeys performed very similarly to chimpanzees and rhesus monkeys and appeared to plan their selection sequences during the computerized task, but only to a limited degree.

  15. "Whom should I pass to?" the more options the more attentional guidance from working memory.

    PubMed

    Furley, Philip; Memmert, Daniel

    2013-01-01

    Three experiments investigated the predictions of the biased competition theory of selective attention in a computer-based sport task. According to this theory, objects held in the circuitry of working memory (WM) automatically bias attention to objects in a visual scene that match or are related to the WM representation. Specifically, we investigated whether certain players that are activated in the circuitry of WM automatically draw attention and receive a competitive advantage in a computer-based sport task. In all three experiments participants had to hold an image of a certain player in WM while engaged in a speeded sport task. In Experiment 1 participants had to identify as quickly as possible which player was in possession of the ball. In Experiments 2 and 3 participants had to decide which player they would pass to in a cartoon team handball situation and a photographic basketball situation. The results support the biased competition theory of selective attention and suggest that certain decision options receive a competitive advantage if they are associated with the activated contents in the circuitry of WM, and that this effect is more pronounced when more decision options compete for attention. A further extension compared to previous research was that the contents of working memory not only biased attention but also actual decisions that can lead to passing errors in sport. We critically discuss the applied implications of the findings.

  16. “Whom Should I Pass To?” The More Options the More Attentional Guidance from Working Memory

    PubMed Central

    Furley, Philip; Memmert, Daniel

    2013-01-01

    Three experiments investigated the predictions of the biased competition theory of selective attention in a computer-based sport task. According to this theory, objects held in the circuitry of working memory (WM) automatically bias attention to objects in a visual scene that match or are related to the WM representation. Specifically, we investigated whether certain players that are activated in the circuitry of WM automatically draw attention and receive a competitive advantage in a computer-based sport task. In all three experiments participants had to hold an image of a certain player in WM while engaged in a speeded sport task. In Experiment 1 participants had to identify as quickly as possible which player was in possession of the ball. In Experiments 2 and 3 participants had to decide which player they would pass to in a cartoon team handball situation and a photographic basketball situation. The results support the biased competition theory of selective attention and suggest that certain decision options receive a competitive advantage if they are associated with the activated contents in the circuitry of WM, and that this effect is more pronounced when more decision options compete for attention. A further extension compared to previous research was that the contents of working memory not only biased attention but also actual decisions that can lead to passing errors in sport. We critically discuss the applied implications of the findings. PMID:23658719

  17. Evaluation of the polyurethane foam (PUF) disk passive air sampler: Computational modeling and experimental measurements

    NASA Astrophysics Data System (ADS)

    May, Andrew A.; Ashman, Paul; Huang, Jiaoyan; Dhaniyala, Suresh; Holsen, Thomas M.

    2011-08-01

    Computational fluid dynamics (CFD) simulations coupled with wind-tunnel experiments were used to determine the sampling rate (SR) of the widely used polyurethane foam (PUF) disk passive sampler. In the wind-tunnel experiments, water evaporation rates from a water-saturated PUF disk installed in the sampler housing were determined by measuring weight loss over time. In addition, a modified passive sampler designed to collect elemental mercury (Hg0) with gold-coated filters was used. Experiments were carried out at different wind speeds and various sampler angles. The SRs obtained from wind-tunnel experiments were compared to those obtained from the field by scaling the values by the ratios of air diffusivities. Three-dimensional (3D) CFD simulations were also used to generate SRs for both polychlorinated biphenyls (PCBs) and Hg0. Overall, the modeled and measured SRs agree well and are consistent with the values obtained from field studies. As previously observed, the SRs increased linearly with increasing wind speed. In addition, it was determined that the SR was strongly dependent on the angle of the ambient wind. The SRs increased when the base was tilted up, pointing into the wind; when the base was tilted down (i.e., such that the top of the sampler was facing the wind), the SR decreased initially and then increased. The results suggest that there may be significant uncertainty in concentrations obtained from passive sampler measurements without knowledge of wind speed and wind angle relative to the sampler.

  18. Numerical Investigation of Plasma Detachment in Magnetic Nozzle Experiments

    NASA Technical Reports Server (NTRS)

    Sankaran, Kamesh; Polzin, Kurt A.

    2008-01-01

    At present there exists no generally accepted theoretical model that provides a consistent physical explanation of plasma detachment from an externally-imposed magnetic nozzle. To make progress towards that end, simulation of plasma flow in the magnetic nozzle of an arcjet experiment is performed using a multidimensional numerical simulation tool that includes theoretical models of the various dispersive and dissipative processes present in the plasma. This is an extension of the simulation tool employed in previous work by Sankaran et al. The aim is to compare the computational results with various proposed magnetic nozzle detachment theories to develop an understanding of the physical mechanisms that cause detachment. An applied magnetic field topology is obtained using a magnetostatic field solver (see Fig. I), and this field is superimposed on the time-dependent magnetic field induced in the plasma to provide a self-consistent field description. The applied magnetic field and model geometry match those found in experiments by Kuriki and Okada. This geometry is modeled because there is a substantial amount of experimental data that can be compared to the computational results, allowing for validation of the model. In addition, comparison of the simulation results with the experimentally obtained plasma parameters will provide insight into the mechanisms that lead to plasma detachment, revealing how they scale with different input parameters. Further studies will focus on modeling literature experiments both for the purpose of additional code validation and to extract physical insight regarding the mechanisms driving detachment.

  19. ZTF Undergraduate Astronomy Institute at Caltech and Pomona College

    NASA Astrophysics Data System (ADS)

    Penprase, Bryan Edward; Bellm, Eric Christopher

    2017-01-01

    From the new Zwicky Transient Facility (ZTF), an NSF-funded project based at Caltech, comes a new initiative for undergraduate research known as the Summer Undergraduate Astronomy Institute. The Institute brings together 15-20 students from across the world for an immersive experience in astronomy techniques before they begin their summer research projects. The students are primarily based at Caltech in its SURF program, but the Institute also includes a large cohort of students enrolled in research internships at Pomona College in nearby Claremont, CA. The program is intended to introduce students to research techniques in astronomy, laboratory and computational technologies, and observational astronomy. Since many of the students are computer science or physics majors with little previous astronomy experience, this immersive experience has been extremely helpful for enabling students to learn about the terminologies, techniques and technologies of astronomy. The field trips to the Mount Wilson and Palomar telescopes deepen their knowledge and excitement about astronomy. Lectures about astronomical research from Caltech staff scientists and graduate students also provide context for the student research. Perhaps more importantly, the creation of a cohort of like-minded students, and the chance to reflect about careers in astronomy and research, give these students opportunities to consider themselves as future research scientists and help them immensely as they move forward in their careers. We discuss some of the social and intercultural aspects of the experience as well, as our cohorts typically include international students from many countries and several students from under-represented groups in science.

  20. A Gaussian mixture model based adaptive classifier for fNIRS brain-computer interfaces and its testing via simulation

    NASA Astrophysics Data System (ADS)

    Li, Zheng; Jiang, Yi-han; Duan, Lian; Zhu, Chao-zhe

    2017-08-01

    Objective. Functional near infra-red spectroscopy (fNIRS) is a promising brain imaging technology for brain-computer interfaces (BCI). Future clinical uses of fNIRS will likely require operation over long time spans, during which neural activation patterns may change. However, current decoders for fNIRS signals are not designed to handle changing activation patterns. The objective of this study is to test via simulations a new adaptive decoder for fNIRS signals, the Gaussian mixture model adaptive classifier (GMMAC). Approach. GMMAC can simultaneously classify and track activation pattern changes without the need for ground-truth labels. This adaptive classifier uses computationally efficient variational Bayesian inference to label new data points and update mixture model parameters, using the previous model parameters as priors. We test GMMAC in simulations in which neural activation patterns change over time and compare to static decoders and unsupervised adaptive linear discriminant analysis classifiers. Main results. Our simulation experiments show GMMAC can accurately decode under time-varying activation patterns: shifts of activation region, expansions of activation region, and combined contractions and shifts of activation region. Furthermore, the experiments show the proposed method can track the changing shape of the activation region. Compared to prior work, GMMAC performed significantly better than the other unsupervised adaptive classifiers on a difficult activation pattern change simulation: 99% versus  <54% in two-choice classification accuracy. Significance. We believe GMMAC will be useful for clinical fNIRS-based brain-computer interfaces, including neurofeedback training systems, where operation over long time spans is required.
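
    The sketch below illustrates the general idea of an unsupervised, adaptive mixture-model classifier: mixture components stand in for classes, each new unlabeled batch is decoded by maximum responsibility and then used to update the component parameters, with the previous fit serving as initialization. It uses scikit-learn's EM-based GaussianMixture rather than the variational Bayesian updates of GMMAC, and it assumes components keep their class identity as they drift, so it is only an approximation of the approach described above.

    ```python
    # Simplified adaptive GMM classifier: EM with warm starts stands in for the
    # variational Bayesian updates of GMMAC; all parameters are illustrative.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def init_classifier(calib_x, calib_y, n_classes):
        # Anchor one component per class using a small labeled calibration set.
        means = np.stack([calib_x[calib_y == c].mean(axis=0) for c in range(n_classes)])
        gmm = GaussianMixture(n_components=n_classes, covariance_type="full",
                              means_init=means, warm_start=True, max_iter=50)
        gmm.fit(calib_x)
        return gmm

    def classify_and_adapt(gmm, batch_x):
        labels = gmm.predict_proba(batch_x).argmax(axis=1)  # decode current batch
        gmm.fit(batch_x)   # warm-started update: previous parameters as starting point
        return labels

    # Toy usage: two "activation patterns" that slowly drift between sessions.
    rng = np.random.default_rng(0)
    calib_x = np.vstack([rng.normal(0, 1, (50, 5)), rng.normal(3, 1, (50, 5))])
    calib_y = np.array([0] * 50 + [1] * 50)
    clf = init_classifier(calib_x, calib_y, n_classes=2)
    for shift in (0.5, 1.0, 1.5):        # activation region drifts over time
        batch = np.vstack([rng.normal(0 + shift, 1, (30, 5)),
                           rng.normal(3 + shift, 1, (30, 5))])
        print(classify_and_adapt(clf, batch)[:10])
    ```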

  1. Motion and positional error correction for cone beam 3D-reconstruction with mobile C-arms.

    PubMed

    Bodensteiner, C; Darolti, C; Schumacher, H; Matthäus, L; Schweikard, A

    2007-01-01

    CT-images acquired by mobile C-arm devices can contain artefacts caused by positioning errors. We propose a data driven method based on iterative 3D-reconstruction and 2D/3D-registration to correct projection data inconsistencies. With a 2D/3D-registration algorithm, transformations are computed to align the acquired projection images to a previously reconstructed volume. In an iterative procedure, the reconstruction algorithm uses the results of the registration step. This algorithm also reduces small motion artefacts within 3D-reconstructions. Experiments with simulated projections from real patient data show the feasibility of the proposed method. In addition, experiments with real projection data acquired with an experimental robotised C-arm device have been performed with promising results.

  2. A Numerical Simulation and Statistical Modeling of High Intensity Radiated Fields Experiment Data

    NASA Technical Reports Server (NTRS)

    Smith, Laura J.

    2004-01-01

    Tests are conducted on a quad-redundant fault-tolerant flight control computer to establish upset characteristics of an avionics system in an electromagnetic field. A numerical simulation and a statistical model are described in this work to analyze the open-loop experiment data collected in the reverberation chamber at NASA LaRC as part of an effort to examine the effects of electromagnetic interference on fly-by-wire aircraft control systems. By comparing thousands of simulation and model outputs, the models that best describe the data are first identified, and a systematic statistical analysis is then performed on the data. These combined efforts culminate in an extrapolation of values that is in turn used to support previous efforts in evaluating the data.

  3. The Fabric for Frontier Experiments Project at Fermilab

    NASA Astrophysics Data System (ADS)

    Kirby, Michael

    2014-06-01

    The FabrIc for Frontier Experiments (FIFE) project is a new, far-reaching initiative within the Fermilab Scientific Computing Division to drive the future of computing services for experiments at FNAL and elsewhere. It is a collaborative effort between computing professionals and experiment scientists to produce an end-to-end, fully integrated set of services for computing on the grid and clouds, managing data, accessing databases, and collaborating within experiments. FIFE includes 1) easy to use job submission services for processing physics tasks on the Open Science Grid and elsewhere; 2) an extensive data management system for managing local and remote caches, cataloging, querying, moving, and tracking the use of data; 3) custom and generic database applications for calibrations, beam information, and other purposes; 4) collaboration tools including an electronic log book, speakers bureau database, and experiment membership database. All of these aspects will be discussed in detail. FIFE sets the direction of computing at Fermilab experiments now and in the future, and therefore is a major driver in the design of computing services worldwide.

  4. Modern Approaches to the Computation of the Probability of Target Detection in Cluttered Environments

    NASA Astrophysics Data System (ADS)

    Meitzler, Thomas J.

    The field of computer vision interacts with fields such as psychology, vision research, machine vision, psychophysics, mathematics, physics, and computer science. The focus of this thesis is new algorithms and methods for the computation of the probability of detection (Pd) of a target in a cluttered scene. The scene can be either a natural visual scene such as one sees with the naked eye (visual), or a scene displayed on a monitor with the help of infrared sensors. The relative clutter and the temperature difference between the target and background (ΔT) are defined and then used to calculate a relative signal-to-clutter ratio (SCR) from which the Pd is calculated for a target in a cluttered scene. It is shown how this definition can include many previous definitions of clutter and ΔT. Next, fuzzy and neural-fuzzy techniques are used to calculate the Pd, and it is shown how these methods can give results that have a good correlation with experiment. The experimental design for actually measuring the Pd of a target by observers is described. Finally, wavelets are applied to the calculation of clutter, and it is shown how this new definition of clutter based on wavelets can be used to compute the Pd of a target.
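
    As a hedged illustration of the general form of such a calculation, the sketch below computes a cell-variance clutter measure, forms a signal-to-clutter ratio from a target-to-background temperature difference, and maps it to a probability of detection through an assumed logistic curve. The specific clutter definition, the cell size and the Pd mapping are illustrative assumptions, not the definitions or fuzzy/neural-fuzzy models developed in the thesis.

    ```python
    # Illustrative clutter / SCR / Pd chain; all definitions here are assumptions.
    import numpy as np

    def rms_cell_clutter(image: np.ndarray, cell: int) -> float:
        """Root-mean-square of per-cell variation over the scene."""
        h, w = image.shape
        variances = []
        for i in range(0, h - cell + 1, cell):
            for j in range(0, w - cell + 1, cell):
                variances.append(image[i:i + cell, j:j + cell].var())
        return float(np.sqrt(np.mean(variances)))

    def signal_to_clutter(delta_t: float, image: np.ndarray, cell: int = 16) -> float:
        return delta_t / rms_cell_clutter(image, cell)

    def probability_of_detection(scr: float, scr50: float = 2.0, slope: float = 1.5) -> float:
        """Assumed logistic relation: Pd = 0.5 at scr50, steeper with larger slope."""
        return 1.0 / (1.0 + np.exp(-slope * (scr - scr50)))

    # Toy usage: a synthetic background and a 4 K target-to-background contrast.
    rng = np.random.default_rng(2)
    background = rng.normal(300.0, 1.5, (128, 128))   # kelvin-like image values
    scr = signal_to_clutter(delta_t=4.0, image=background)
    print(round(scr, 2), round(probability_of_detection(scr), 2))
    ```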

  5. Quantum wavepacket ab initio molecular dynamics: an approach for computing dynamically averaged vibrational spectra including critical nuclear quantum effects.

    PubMed

    Sumner, Isaiah; Iyengar, Srinivasan S

    2007-10-18

    We have introduced a computational methodology to study vibrational spectroscopy in clusters inclusive of critical nuclear quantum effects. This approach is based on the recently developed quantum wavepacket ab initio molecular dynamics method that combines quantum wavepacket dynamics with ab initio molecular dynamics. The computational efficiency of the dynamical procedure is drastically improved (by several orders of magnitude) through the utilization of wavelet-based techniques combined with the previously introduced time-dependent deterministic sampling procedure to achieve stable, picosecond-length, quantum-classical dynamics of electrons and nuclei in clusters. The dynamical information is employed to construct a novel cumulative flux/velocity correlation function, where the wavepacket flux from the quantized particle is combined with classical nuclear velocities to obtain the vibrational density of states. The approach is demonstrated by computing the vibrational density of states of [Cl-H-Cl]-, inclusive of critical quantum nuclear effects, and our results are in good agreement with experiment. A general hierarchical procedure is also provided, based on electronic structure harmonic frequencies, classical ab initio molecular dynamics, computation of nuclear quantum-mechanical eigenstates, and employing quantum wavepacket ab initio dynamics to understand vibrational spectroscopy in hydrogen-bonded clusters that display large degrees of anharmonicity.

  6. A new computer-based counselling system for the promotion of physical activity in patients with chronic diseases--results from a pilot study.

    PubMed

    Becker, Annette; Herzberg, Dominikus; Marsden, Nicola; Thomanek, Sabine; Jung, Hartmut; Leonhardt, Corinna

    2011-05-01

    To develop a computer-based counselling system (CBCS) for the improvement of attitudes towards physical activity in chronically ill patients and to pilot its efficacy and acceptance in primary care. The system is tailored to patients' disease and motivational stage. During a pilot study in five German general practices, patients answered questions before, directly and 6 weeks after using the CBCS. Outcome criteria were attitudes and self-efficacy. Qualitative interviews were performed to identify acceptance indicators. Seventy-nine patients participated (mean age: 64.5 years, 53% males; 38% without previous computer experience). Patients' affective and cognitive attitudes changed significantly, self-efficacy showed only minor changes. Patients mentioned no difficulties in interacting with the CBCS. However, perception of the system's usefulness was inconsistent. Computer-based counselling for physical activity related attitudes in patients with chronic diseases is feasible, but the circumstances of use with respect to the target group and its integration into the management process have to be clarified in future studies. This study adds to the understanding of computer-based counselling in primary health care. Acceptance indicators identified in this study will be validated as part of a questionnaire on technology acceptability in a subsequent study. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  7. Spectroscopy of H3+ based on a new high-accuracy global potential energy surface.

    PubMed

    Polyansky, Oleg L; Alijah, Alexander; Zobov, Nikolai F; Mizus, Irina I; Ovsyannikov, Roman I; Tennyson, Jonathan; Lodi, Lorenzo; Szidarovszky, Tamás; Császár, Attila G

    2012-11-13

    The molecular ion H3+ is the simplest polyatomic and poly-electronic molecular system, and its spectrum constitutes an important benchmark for which precise answers can be obtained ab initio from the equations of quantum mechanics. Significant progress in the computation of the ro-vibrational spectrum of H3+ is discussed. A new, global potential energy surface (PES) based on ab initio points computed with an average accuracy of 0.01 cm-1 relative to the non-relativistic limit has recently been constructed. An analytical representation of these points is provided, exhibiting a standard deviation of 0.097 cm-1. Problems with earlier fits are discussed. The new PES is used for the computation of transition frequencies. Recently measured lines at visible wavelengths combined with previously determined infrared ro-vibrational data show that an accuracy of the order of 0.1 cm-1 is achieved by these computations. In order to achieve this degree of accuracy, relativistic, adiabatic and non-adiabatic effects must be properly accounted for. The accuracy of these calculations facilitates the reassignment of some measured lines, further reducing the standard deviation between experiment and theory.

  8. Validation of Magnetic Resonance Thermometry by Computational Fluid Dynamics

    NASA Astrophysics Data System (ADS)

    Rydquist, Grant; Owkes, Mark; Verhulst, Claire M.; Benson, Michael J.; Vanpoppel, Bret P.; Burton, Sascha; Eaton, John K.; Elkins, Christopher P.

    2016-11-01

    Magnetic Resonance Thermometry (MRT) is a new experimental technique that can create fully three-dimensional temperature fields in a noninvasive manner. However, validation is still required to determine the accuracy of measured results. One method of examination is to compare data gathered experimentally to data computed with computational fluid dynamics (CFD). In this study, large-eddy simulations have been performed with the NGA computational platform to generate data for comparison with previously run MRT experiments. The experimental setup consisted of a heated jet inclined at 30° injected into a larger channel. In the simulations, viscosity and density were scaled according to the local temperature to account for differences in buoyant and viscous forces. A mesh-independence study was performed with 5 million-, 15 million- and 45 million-cell meshes. The program Star-CCM+ was used to simulate the complete experimental geometry, and its results were compared to data generated from NGA. Overall, both programs show good agreement with the experimental data gathered with MRT. With these data, the validity of MRT as a diagnostic tool has been shown, and the tool can be used to further our understanding of a range of flows with non-trivial temperature distributions.
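
    As an illustration of the kind of temperature-dependent property scaling mentioned above, the sketch below uses Sutherland's law for the dynamic viscosity of air and ideal-gas scaling for density. The reference constants are standard air values; whether the simulations used exactly these relations is an assumption made here for illustration.

    ```python
    # Illustrative temperature-dependent property scaling (assumed, not the NGA setup).
    import numpy as np

    def sutherland_viscosity(T, mu_ref=1.716e-5, T_ref=273.15, S=110.4):
        """Dynamic viscosity of air [Pa*s] as a function of temperature [K]."""
        return mu_ref * (T / T_ref) ** 1.5 * (T_ref + S) / (T + S)

    def ideal_gas_density(T, p=101325.0, R=287.05):
        """Density of air [kg/m^3] at pressure p [Pa] and temperature T [K]."""
        return p / (R * T)

    T = np.array([293.0, 330.0, 360.0])   # channel vs. heated-jet temperatures
    print(sutherland_viscosity(T))
    print(ideal_gas_density(T))
    ```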

  9. A multi-time-step noise reduction method for measuring velocity statistics from particle tracking velocimetry

    NASA Astrophysics Data System (ADS)

    Machicoane, Nathanaël; López-Caballero, Miguel; Bourgoin, Mickael; Aliseda, Alberto; Volk, Romain

    2017-10-01

    We present a method to improve the accuracy of velocity measurements for fluid flow or particles immersed in it, based on a multi-time-step approach that allows for cancellation of noise in the velocity measurements. Improved velocity statistics, a critical element in turbulent flow measurements, can be computed from the combination of the velocity moments computed using standard particle tracking velocimetry (PTV) or particle image velocimetry (PIV) techniques for data sets that have been collected over different values of time intervals between images. This method produces Eulerian velocity fields and Lagrangian velocity statistics with much lower noise levels compared to standard PIV or PTV measurements, without the need of filtering and/or windowing. Particle displacement between two frames is computed for multiple different time-step values between frames in a canonical experiment of homogeneous isotropic turbulence. The second order velocity structure function of the flow is computed with the new method and compared to results from traditional measurement techniques in the literature. Increased accuracy is also demonstrated by comparing the dissipation rate of turbulent kinetic energy measured from this function against previously validated measurements.
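
    The idea can be illustrated compactly: uncorrelated position noise of variance sigma^2 inflates a finite-difference velocity variance by roughly 2*sigma^2/dt^2, so computing the velocity moment at several time separations and fitting it against 1/dt^2 lets the noise-free value be extrapolated. The synthetic straight-line tracks and the simple linear fit below are illustrative assumptions, not the authors' full PTV pipeline.

    ```python
    # Illustrative multi-time-step noise cancellation on synthetic particle tracks.
    import numpy as np

    rng = np.random.default_rng(3)
    true_var, sigma, fs = 0.04, 2e-4, 1000.0   # m^2/s^2, m, frames per second
    n_frames, n_tracks = 2000, 200

    # Synthetic straight-line tracks with Gaussian velocities plus position noise.
    v = rng.normal(0.0, np.sqrt(true_var), n_tracks)
    t = np.arange(n_frames) / fs
    x = v[:, None] * t[None, :] + rng.normal(0.0, sigma, (n_tracks, n_frames))

    steps = np.array([1, 2, 3, 4, 6, 8])       # multiple time separations (frames)
    dts = steps / fs
    meas_var = np.array([np.var((x[:, s:] - x[:, :-s]) / (s / fs)) for s in steps])

    # Linear fit: meas_var = true_var + 2*sigma^2 * (1/dt^2)
    A = np.column_stack([np.ones_like(dts), 1.0 / dts**2])
    coef, *_ = np.linalg.lstsq(A, meas_var, rcond=None)
    print("extrapolated variance:", coef[0], "vs. naive dt=1 estimate:", meas_var[0])
    print("recovered position-noise sigma:", np.sqrt(coef[1] / 2.0))
    ```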

  10. Computational analysis of Ebolavirus data: prospects, promises and challenges.

    PubMed

    Michaelis, Martin; Rossman, Jeremy S; Wass, Mark N

    2016-08-15

    The ongoing Ebola virus (also known as Zaire ebolavirus, a member of the Ebolavirus family) outbreak in West Africa has so far resulted in >28000 confirmed cases compared with previous Ebolavirus outbreaks that affected a maximum of a few hundred individuals. Hence, Ebolaviruses impose a much greater threat than we may have expected (or hoped). An improved understanding of the virus biology is essential to develop therapeutic and preventive measures and to be better prepared for future outbreaks by members of the Ebolavirus family. Computational investigations can complement wet laboratory research for biosafety level 4 pathogens such as Ebolaviruses for which the wet experimental capacities are limited due to a small number of appropriate containment laboratories. During the current West Africa outbreak, sequence data from many Ebola virus genomes became available providing a rich resource for computational analysis. Here, we consider the studies that have already reported on the computational analysis of these data. A range of properties have been investigated including Ebolavirus evolution and pathogenicity, prediction of micro RNAs and identification of Ebolavirus specific signatures. However, the accuracy of the results remains to be confirmed by wet laboratory experiments. Therefore, communication and exchange between computational and wet laboratory researchers is necessary to make maximum use of computational analyses and to iteratively improve these approaches. © 2016 The Author(s). published by Portland Press Limited on behalf of the Biochemical Society.

  11. Successful application of the DBLOC method to the hydroxylation of camphor by cytochrome p450

    PubMed Central

    Jerome, Steven V.; Hughes, Thomas F.

    2015-01-01

    The activation barrier for the hydroxylation of camphor by cytochrome P450 was computed using a mixed quantum mechanics/molecular mechanics (QM/MM) model of the full protein-ligand system and a fully QM calculation using a cluster model of the active site at the B3LYP/LACVP*/LACV3P** level of theory, which consisted of B3LYP/LACV3P** single-point energies computed at B3LYP/LACVP* optimized geometries. From the QM/MM calculation, a barrier height of 17.5 kcal/mol was obtained, while the experimental value was known to be less than or equal to 10 kcal/mol. This process was repeated using the D3 correction for hybrid DFT in order to investigate whether the inadequate treatment of dispersion interactions was responsible for the overestimation of the barrier. While the D3 correction does reduce the computed barrier to 13.3 kcal/mol, it was still in disagreement with experiment. After application of a series of transition-metal-optimized localized orbital corrections (DBLOC), and without any refitting of parameters, the barrier was further reduced to 10.0 kcal/mol, which was consistent with the experimental results. The DBLOC method was also applied to C-H bond activation in methane monooxygenase (MMO) as a second, independent test. The barrier in MMO was known, by experiment, to be 15.4 kcal/mol [1]. After application of the DBLOC corrections to the MMO barrier computed by B3LYP in a previous study, and accounting for dispersion with Grimme's D3 method, the unsigned deviation from experiment was improved from 3.2 to 2.3 kcal/mol. These results suggested that the combination of dispersion plus localized orbital corrections could yield significant quantitative improvements in modeling the catalytic chemistry of transition-metal-containing enzymes, within the limitations of the statistical errors of the model, which appear to be on the order of approximately 2 kcal/mol. PMID:26441133

  12. Making electronic mail accessible: perspectives of people with acquired cognitive impairments, caregivers and professionals.

    PubMed

    Todis, B; Sohlberg, M M; Hood, D; Fickas, S

    2005-06-01

    The primary objective of this study was to better understand the technology needs, barriers and strategies of individuals with acquired cognitive impairments (ACI) in order to design and modify technologies with potential for alleviating the diminished independence and social isolation common in this population. The authors hypothesized that (1) higher rates of computer use would be reported by younger, more highly educated individuals with ACI, those with less severe injuries and those with previous computer experience; (2) A low percentage of survey respondents would own their own computers; and (3) People with ACI would experience social isolation and report low frequency of connecting with important people who live far away. A total of 133 individuals with ACI, professionals and care providers completed the survey. To gain more specific information, seven focus groups were conducted with 66 individuals with ACI and 20 care providers. Finally, 10 current email users participated in structured conversations, detailing their strategies for using email. The survey revealed that 80% of subjects with ACI reported owning a computer. Age and education were not predictors of computer use, but individuals whose ACI was the result of more severe injuries were less likely to use computers. As expected, respondents reported that maintaining contact with distant loved ones is problematic. The focus groups and conversations provided more detail about the communication needs of the population and the relative advantages and disadvantages of email compared with telephone and mail. Participants also identified barriers to email use they had encountered or feared they would encounter when using email. A number of accommodations to overcome these barriers were suggested. The results of the survey, focus groups and conversations confirmed the utility of email and other technologies for people with ACI and the need to make these technologies more accessible. The results and suggestions provided by the focus groups and interviews are being used in the design of Think and Link, an email interface for use by individuals with ACI.

  13. Patients' acceptance of Internet-based home asthma telemonitoring.

    PubMed

    Finkelstein, J; Hripcsak, G; Cabrera, M R

    1998-01-01

    We studied asthma patients from a low-income inner-city community without previous computer experience. The patients were given portable spirometers to perform spirometry tests and palmtop computers to enter symptoms in a diary, to exchange messages with their physician and to review test results. The self-testing was performed at home on a daily basis. The results were transmitted to the hospital information system immediately after completion of each test. Physicians could review the results using an Internet Web browser from any location. A constantly active decision support server monitored all data traffic and dispatched alerts when certain clinical conditions were met. Seventeen patients, out of 19 invited, agreed to participate in the study and were monitored for three weeks. They were then surveyed using a standardized questionnaire. Most of the patients (82.4%) characterized the self-testing procedures as "not complicated at all." In 70.6% of cases self-testing did not interfere with usual activities, and 82.4% of patients felt the self-testing required a "very little" amount of their time. All patients stated that it was important for them to know that the results could be reviewed by professional staff in a timely manner. However, only 29.5% of patients reviewed their results at least once a week at home independently. The majority of the patients (94.1%) were strongly interested in using home asthma telemonitoring in the future. We concluded that Internet-based home asthma telemonitoring can be successfully implemented in a group of patients without previous computer background.

  14. Hierarchical combinatorial deep learning architecture for pancreas segmentation of medical computed tomography cancer images.

    PubMed

    Fu, Min; Wu, Wenming; Hong, Xiafei; Liu, Qiuhua; Jiang, Jialin; Ou, Yaobin; Zhao, Yupei; Gong, Xinqi

    2018-04-24

    Efficient computational recognition and segmentation of target organs from medical images are foundational in diagnosis and treatment, especially for pancreatic cancer. In practice, the diversity in appearance of the pancreas and other abdominal organs makes detailed texture information of objects important in a segmentation algorithm. According to our observations, however, the structures of previous networks, such as the Richer Feature Convolutional Network (RCF), are too coarse to segment the object (pancreas) accurately, especially at the edges. In this paper, we extend the RCF, originally proposed for edge detection, to the challenging task of pancreas segmentation and put forward a novel pancreas segmentation network. By employing a multi-layer up-sampling structure in place of the simple up-sampling operation in all stages, the proposed network fully exploits the multi-scale detailed texture information of the object (pancreas) to perform per-pixel segmentation. Additionally, we train our network on CT scans and thus obtain an effective pipeline. With the multi-layer up-sampling model, our pipeline achieves better performance than RCF in the task of single-object (pancreas) segmentation. Combined with multi-scale input, it achieves a DSC (Dice Similarity Coefficient) of 76.36% on the testing data. The results of our experiments show that our advanced model works better than previous networks on our dataset; in other words, it is better at capturing detailed texture information. Therefore, our new single-object segmentation model has practical value for computational automatic diagnosis.
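
    For reference, the Dice Similarity Coefficient quoted above measures overlap between the predicted segmentation mask P and the ground-truth mask G; its standard definition (not specific to this paper) is, in LaTeX:

        \mathrm{DSC}(P, G) = \frac{2\,\lvert P \cap G \rvert}{\lvert P \rvert + \lvert G \rvert}

    A DSC of 76.36% therefore means that the overlap region is about 76% of the average of the predicted and true mask sizes.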

  15. Gyrofluid Modeling of Turbulent, Kinetic Physics

    NASA Astrophysics Data System (ADS)

    Despain, Kate Marie

    2011-12-01

    Gyrofluid models to describe plasma turbulence combine the advantages of fluid models, such as lower dimensionality and well-developed intuition, with those of gyrokinetic models, such as finite Larmor radius (FLR) effects. This allows gyrofluid models to be more tractable computationally while still capturing much of the physics related to the FLR of the particles. We present a gyrofluid model derived to capture the behavior of slow solar wind turbulence and describe the computer code developed to implement the model. In addition, we describe the modifications we made to a gyrofluid model and code that simulate plasma turbulence in tokamak geometries. Specifically, we describe a nonlinear phase mixing phenomenon, part of the E × B term, that was previously missing from the model. An inherently FLR effect, it plays an important role in predicting turbulent heat flux and diffusivity levels for the plasma. We demonstrate this importance by comparing results from the updated code to studies done previously by gyrofluid and gyrokinetic codes. We further explain what would be necessary to couple the updated gyrofluid code, gryffin, to a turbulent transport code, thus allowing gryffin to play a role in predicting profiles for fusion devices such as ITER and to explore novel fusion configurations. Such a coupling would require the use of Graphics Processing Units (GPUs) to make the modeling process fast enough to be viable. Consequently, we also describe our experience with GPU computing and demonstrate that we are poised to complete a gryffin port to this innovative architecture.

  16. Coupling fast fluid dynamics and multizone airflow models in Modelica Buildings library to simulate the dynamics of HVAC systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Wei; Sevilla, Thomas Alonso; Zuo, Wangda

    Historically, multizone models are widely used in building airflow and energy performance simulations due to their fast computing speed. However, multizone models assume that the air in a room is well mixed, consequently limiting their application. In specific rooms where this assumption fails, the use of computational fluid dynamics (CFD) models may be an alternative option. Previous research has mainly focused on coupling CFD models and multizone models to study airflow in large spaces. While significant, most of these analyses did not consider the coupled simulation of the building airflow with the building's Heating, Ventilation, and Air-Conditioning (HVAC) systems. This paper tries to fill the gap by integrating the models for HVAC systems with coupled multizone and CFD simulations for airflows, using the Modelica simulation platform. To improve the computational efficiency, we incorporated a simplified CFD model named fast fluid dynamics (FFD). We first introduce the data synchronization strategy and implementation in Modelica. Then, we verify the implementation using two case studies involving an isothermal and a non-isothermal flow by comparing model simulations to experiment data. Afterward, we study another three cases that are deemed more realistic. This is done by attaching a variable air volume (VAV) terminal box and a VAV system to previous flows to assess the capability of the models in studying the dynamic control of HVAC systems. Finally, we discuss further research needs on the coupled simulation using the models.

  17. Criterion for Identifying Vortices in High-Pressure Flows

    NASA Technical Reports Server (NTRS)

    Bellan, Josette; Okong'o, Nora

    2007-01-01

    A study of four previously published computational criteria for identifying vortices in high-pressure flows has led to the selection of one of them as the best. This development can be expected to contribute to understanding of high-pressure flows, which occur in diverse settings, including diesel, gas turbine, and rocket engines and the atmospheres of Jupiter and other large gaseous planets. Information on the atmospheres of gaseous planets consists mainly of visual and thermal images of the flows over the planets. Also, validation of recently proposed computational models of high-pressure flows entails comparison with measurements, which are mainly of visual nature. Heretofore, the interpretation of images of high-pressure flows to identify vortices has been based on experience with low-pressure flows. However, high-pressure flows have features distinct from those of low-pressure flows, particularly in regions of high pressure gradient magnitude caused by dynamic turbulent effects and by thermodynamic mixing of chemical species. Therefore, interpretations based on low-pressure behavior may lead to misidentification of vortices and other flow structures in high-pressure flows. The study reported here was performed in recognition of the need for one or more quantitative criteria for identifying coherent flow structures - especially vortices - from previously generated flow-field data, to complement or supersede the determination of flow structures by visual inspection of instantaneous fields or flow animations. The focus in the study was on correlating visible images of flow features with various quantities computed from flow-field data.

  18. Evidence-based pathology in its second decade: toward probabilistic cognitive computing.

    PubMed

    Marchevsky, Alberto M; Walts, Ann E; Wick, Mark R

    2017-03-01

    Evidence-based pathology advocates using a combination of best available data ("evidence") from the literature and personal experience for the diagnosis, estimation of prognosis, and assessment of other variables that impact individual patient care. Evidence-based pathology relies on systematic reviews of the literature, evaluation of the quality of evidence as categorized by evidence levels and statistical tools such as meta-analyses, estimates of probabilities and odds, and others. However, it is well known that previously "statistically significant" information usually does not accurately forecast the future for individual patients. There is great interest in "cognitive computing" in which "data mining" is combined with "predictive analytics" designed to forecast future events and estimate the strength of those predictions. This study demonstrates the use of IBM Watson Analytics software to evaluate and predict the prognosis of 101 patients with typical and atypical pulmonary carcinoid tumors in which Ki-67 indices have been determined. The results obtained with this system are compared with those previously reported using "routine" statistical software and the help of a professional statistician. IBM Watson Analytics interactively provides statistical results that are comparable to those obtained with routine statistical tools but much more rapidly, with considerably less effort and with interactive graphics that are intuitively easy to apply. It also enables analysis of natural language variables and yields detailed survival predictions for patient subgroups selected by the user. Potential applications of this tool and basic concepts of cognitive computing are discussed. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Adaptive-weighted Total Variation Minimization for Sparse Data toward Low-dose X-ray Computed Tomography Image Reconstruction

    PubMed Central

    Liu, Yan; Ma, Jianhua; Fan, Yi; Liang, Zhengrong

    2012-01-01

    Previous studies have shown that by minimizing the total variation (TV) of the to-be-estimated image with some data and other constraints, a piecewise-smooth X-ray computed tomography (CT) can be reconstructed from sparse-view projection data without introducing noticeable artifacts. However, due to the piecewise constant assumption for the image, a conventional TV minimization algorithm often suffers from over-smoothness on the edges of the resulting image. To mitigate this drawback, we present an adaptive-weighted TV (AwTV) minimization algorithm in this paper. The presented AwTV model is derived by considering the anisotropic edge property among neighboring image voxels, where the associated weights are expressed as an exponential function and can be adaptively adjusted by the local image-intensity gradient for the purpose of preserving the edge details. Inspired by the previously-reported TV-POCS (projection onto convex sets) implementation, a similar AwTV-POCS implementation was developed to minimize the AwTV subject to data and other constraints for the purpose of sparse-view low-dose CT image reconstruction. To evaluate the presented AwTV-POCS algorithm, both qualitative and quantitative studies were performed by computer simulations and phantom experiments. The results show that the presented AwTV-POCS algorithm can yield images with several noticeable gains, in terms of noise-resolution tradeoff plots and full width at half maximum values, as compared to the corresponding conventional TV-POCS algorithm. PMID:23154621
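
    The anisotropic weighting described above is commonly written along the following lines (a sketch of the general AwTV form with an edge-scale parameter δ; the paper's exact notation may differ), in LaTeX:

        \mathrm{AwTV}(f) = \sum_{i,j} \sqrt{\, w^{x}_{i,j}\,(f_{i,j}-f_{i-1,j})^{2} + w^{y}_{i,j}\,(f_{i,j}-f_{i,j-1})^{2} \,},
        \qquad
        w^{x}_{i,j} = \exp\!\left[-\left(\tfrac{f_{i,j}-f_{i-1,j}}{\delta}\right)^{2}\right],
        \quad
        w^{y}_{i,j} = \exp\!\left[-\left(\tfrac{f_{i,j}-f_{i,j-1}}{\delta}\right)^{2}\right]

    Setting all weights to one recovers the conventional anisotropic TV; a smaller δ down-weights large intensity jumps more strongly, which is how the edge details are preserved.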

  20. Adaptive-weighted total variation minimization for sparse data toward low-dose x-ray computed tomography image reconstruction.

    PubMed

    Liu, Yan; Ma, Jianhua; Fan, Yi; Liang, Zhengrong

    2012-12-07

    Previous studies have shown that by minimizing the total variation (TV) of the to-be-estimated image with some data and other constraints, piecewise-smooth x-ray computed tomography (CT) can be reconstructed from sparse-view projection data without introducing notable artifacts. However, due to the piecewise constant assumption for the image, a conventional TV minimization algorithm often suffers from over-smoothness on the edges of the resulting image. To mitigate this drawback, we present an adaptive-weighted TV (AwTV) minimization algorithm in this paper. The presented AwTV model is derived by considering the anisotropic edge property among neighboring image voxels, where the associated weights are expressed as an exponential function and can be adaptively adjusted by the local image-intensity gradient for the purpose of preserving the edge details. Inspired by the previously reported TV-POCS (projection onto convex sets) implementation, a similar AwTV-POCS implementation was developed to minimize the AwTV subject to data and other constraints for the purpose of sparse-view low-dose CT image reconstruction. To evaluate the presented AwTV-POCS algorithm, both qualitative and quantitative studies were performed by computer simulations and phantom experiments. The results show that the presented AwTV-POCS algorithm can yield images with several notable gains, in terms of noise-resolution tradeoff plots and full-width at half-maximum values, as compared to the corresponding conventional TV-POCS algorithm.

  1. Computational Studies of Magnetic Nozzle Performance

    NASA Technical Reports Server (NTRS)

    Ebersohn, Frans H.; Longmier, Benjamin W.; Sheehan, John P.; Shebalin, John B.; Raja, Laxminarayan

    2013-01-01

    An extensive literature review of magnetic nozzle research has been performed, examining previous work as well as reviewing fundamental principles. This has allowed us to catalog the basic physical mechanisms which we believe underlie the thrust generation process. Energy conversion mechanisms include the approximate conservation of the magnetic moment adiabatic invariant, generalized Hall and thermoelectric acceleration, swirl acceleration, transformation of thermal energy into directed kinetic energy, and Joule heating. Momentum transfer results from the interaction of the applied magnetic field with currents induced in the plasma plume, while plasma detachment mechanisms include resistive diffusion, recombination and charge exchange collisions, magnetic reconnection, loss of adiabaticity, inertial forces, current closure, and self-field detachment. We have performed a preliminary study of Hall effects on magnetic nozzle jets with weak guiding magnetic fields and weak expansions (p_jet ≈ p_background). The conclusion from this study is that the Hall effect creates an azimuthal rotation of the plasma jet and, more generally, creates helical structures in the induced current, velocity field, and magnetic fields. We have studied plasma jet expansion to near vacuum without a guiding magnetic field, and are presently including a guiding magnetic field using a resistive MHD solver. This research is progressing toward the implementation of a full generalized Ohm's law solver. In our paper, we summarize the basic principles as well as the literature survey, and briefly review our previous results. Our most recent results at the time of submittal will also be included. Efforts are currently underway to construct an experiment at the University of Michigan Plasmadynamics and Electric Propulsion Laboratory (PEPL) to study magnetic nozzle physics for an RF thruster. Our computational study will work directly with this experiment to validate the numerical model, in order to study magnetic nozzle physics and optimize magnetic nozzle design. Preliminary results from the PEPL experiment will also be presented.

  2. Evolving technologies for Space Station Freedom computer-based workstations

    NASA Technical Reports Server (NTRS)

    Jensen, Dean G.; Rudisill, Marianne

    1990-01-01

    Viewgraphs on evolving technologies for Space Station Freedom computer-based workstations are presented. The human-computer software environment modules are described. The following topics are addressed: command and control workstation concept; cupola workstation concept; Japanese experiment module RMS workstation concept; remote devices controlled from workstations; orbital maneuvering vehicle free flyer; remote manipulator system; Japanese experiment module exposed facility; Japanese experiment module small fine arm; flight telerobotic servicer; human-computer interaction; and workstation/robotics related activities.

  3. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads

    PubMed Central

    Stone, John E.; Hallock, Michael J.; Phillips, James C.; Peterson, Joseph R.; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-01-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers. PMID:27516922

  4. Potential evapotranspiration and continental drying

    USGS Publications Warehouse

    Milly, Paul C.D.; Dunne, Krista A.

    2016-01-01

    By various measures (drought area and intensity, climatic aridity index, and climatic water deficits), some observational analyses have suggested that much of the Earth’s land has been drying during recent decades, but such drying seems inconsistent with observations of dryland greening and decreasing pan evaporation. ‘Offline’ analyses of climate-model outputs from anthropogenic climate change (ACC) experiments portend continuation of putative drying through the twenty-first century, despite an expected increase in global land precipitation. A ubiquitous increase in estimates of potential evapotranspiration (PET), driven by atmospheric warming, underlies the drying trends, but may be a methodological artefact. Here we show that the PET estimator commonly used (the Penman–Monteith PET for either an open-water surface or a reference crop) severely overpredicts the changes in non-water-stressed evapotranspiration computed in the climate models themselves in ACC experiments. This overprediction is partially due to neglect of stomatal conductance reductions commonly induced by increasing atmospheric CO2 concentrations in climate models. Our findings imply that historical and future tendencies towards continental drying, as characterized by offline-computed runoff, as well as other PET-dependent metrics, may be considerably weaker and less extensive than previously thought.
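
    For context, the reference-crop Penman-Monteith PET estimator referred to above is usually computed in its FAO-56 form (a standard reference formula, not reproduced from this paper), in LaTeX:

        \mathrm{ET}_0 = \frac{0.408\,\Delta\,(R_n - G) + \gamma\,\frac{900}{T + 273}\,u_2\,(e_s - e_a)}{\Delta + \gamma\,(1 + 0.34\,u_2)}

    Here Δ is the slope of the saturation vapour pressure curve, R_n net radiation, G soil heat flux, γ the psychrometric constant, T mean air temperature, u_2 wind speed at 2 m, and e_s - e_a the vapour pressure deficit. Warming enters directly through Δ, e_s and the temperature term, while the fixed surface resistance implicit in the formula cannot respond to CO2-induced stomatal closure, which is consistent with the overprediction mechanism identified in this abstract.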

  5. Prediction of common epitopes on hemagglutinin of the influenza A virus (H1 subtype).

    PubMed

    Guo, Chunyan; Xie, Xin; Li, Huijin; Zhao, Penghua; Zhao, Xiangrong; Sun, Jingying; Wang, Haifang; Liu, Yang; Li, Yan; Hu, Qiaoxia; Hu, Jun; Li, Yuan

    2015-02-01

    Influenza A virus infection is a persistent threat to public health worldwide due to hemagglutinin (HA) variation. Current vaccines against influenza A virus provide immunity to viral isolates similar to vaccine strains. Antibodies against common epitopes provide immunity to diverse influenza virus strains and protect against future pandemic influenza. Therefore, it is vital to analyze common HA antigenic epitopes of influenza virus. In this study, 14 strains of monoclonal antibodies with high sensitivity to common epitopes of influenza virus antigens identified in our previous study were selected as the tool to predict common HA epitopes. The common HA antigenic epitopes were divided into four categories by ELISA blocking experiments, and separately, into three categories according to the preliminary results of computer simulation. Comparison between the results of computer simulations and ELISA blocking experiments indicated that at least two classes of common epitopes are present in influenza virus HA. This study provides experimental data for improving the prediction of HA epitopes of influenza virus (H1 subtype) and the development of a potential universal vaccine as well as a novel approach for the prediction of epitopes on other pathogenic microorganisms. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. A tool for measuring the bending length in thin wires

    NASA Astrophysics Data System (ADS)

    Lorenzini, M.; Cagnoli, G.; Cesarini, E.; Losurdo, G.; Martelli, F.; Piergiovanni, F.; Vetrano, F.; Viceré, A.

    2013-03-01

    Great effort is currently being put into the development and construction of the second generation, advanced gravitational wave detectors, Advanced Virgo and Advanced LIGO. The development of new low thermal noise suspensions of mirrors, based on the experience gained in the previous experiments, is part of this task. Quasi-monolithic suspensions with fused silica wires avoid the problem of rubbing friction introduced by steel cradle arrangements by directly welding the wires to silica blocks bonded to the mirror. Moreover, the mechanical loss level introduced by silica (ϕ_fs ∼ 10^-7 in thin fused silica wires) is by far less than the one associated with steel. The low frequency dynamical behaviour of the suspension can be computed and optimized, provided that the wire bending shape under pendulum motion is known. Due to the production process, fused silica wires are thicker near the two ends (necks), so that analytical bending computations are very complicated. We developed a tool to directly measure the low frequency bending parameters of fused silica wires, and we tested it on the wires produced for the Virgo+ monolithic suspensions. The working principle and a set of test measurements are presented and explained.

  7. A tool for measuring the bending length in thin wires.

    PubMed

    Lorenzini, M; Cagnoli, G; Cesarini, E; Losurdo, G; Martelli, F; Piergiovanni, F; Vetrano, F; Viceré, A

    2013-03-01

    Great effort is currently being put into the development and construction of the second generation, advanced gravitational wave detectors, Advanced Virgo and Advanced LIGO. The development of new low thermal noise suspensions of mirrors, based on the experience gained in the previous experiments, is part of this task. Quasi-monolithic suspensions with fused silica wires avoid the problem of rubbing friction introduced by steel cradle arrangements by directly welding the wires to silica blocks bonded to the mirror. Moreover, the mechanical loss level introduced by silica (φ_fs ∼ 10^-7 in thin fused silica wires) is by far less than the one associated with steel. The low frequency dynamical behaviour of the suspension can be computed and optimized, provided that the wire bending shape under pendulum motion is known. Due to the production process, fused silica wires are thicker near the two ends (necks), so that analytical bending computations are very complicated. We developed a tool to directly measure the low frequency bending parameters of fused silica wires, and we tested it on the wires produced for the Virgo+ monolithic suspensions. The working principle and a set of test measurements are presented and explained.

  8. CFD-CAA Coupled Calculations of a Tandem Cylinder Configuration to Assess Facility Installation Effects

    NASA Technical Reports Server (NTRS)

    Redonnet, Stephane; Lockard, David P.; Khorrami, Mehdi R.; Choudhari, Meelan M.

    2011-01-01

    This paper presents a numerical assessment of acoustic installation effects in the tandem cylinder (TC) experiments conducted in the NASA Langley Quiet Flow Facility (QFF), an open-jet, anechoic wind tunnel. Calculations that couple the Computational Fluid Dynamics (CFD) and Computational Aeroacoustics (CAA) of the TC configuration within the QFF are conducted using the CFD simulation results previously obtained at NASA LaRC. The coupled simulations enable the assessment of installation effects associated with several specific features in the QFF facility that may have impacted the measured acoustic signature during the experiment. The CFD-CAA coupling is based on CFD data along a suitably chosen surface, and employs a technique that was recently improved to account for installed configurations involving acoustic backscatter into the CFD domain. First, a CFD-CAA calculation is conducted for an isolated TC configuration to assess the coupling approach, as well as to generate a reference solution for subsequent assessments of QFF installation effects. Direct comparisons between the CFD-CAA calculations associated with the various installed configurations allow the assessment of the effects of each component (nozzle, collector, etc.) or feature (confined vs. free jet flow, etc.) characterizing the NASA LaRC QFF facility.

  9. Bayesian Face Recognition and Perceptual Narrowing in Face-Space

    PubMed Central

    Balas, Benjamin

    2012-01-01

    During the first year of life, infants’ face recognition abilities are subject to “perceptual narrowing,” the end result of which is that observers lose the ability to distinguish previously discriminable faces (e.g. other-race faces) from one another. Perceptual narrowing has been reported for faces of different species and different races, in developing humans and primates. Though the phenomenon is highly robust and replicable, there have been few efforts to model the emergence of perceptual narrowing as a function of the accumulation of experience with faces during infancy. The goal of the current study is to examine how perceptual narrowing might manifest as statistical estimation in “face space,” a geometric framework for describing face recognition that has been successfully applied to adult face perception. Here, I use a computer vision algorithm for Bayesian face recognition to study how the acquisition of experience in face space and the presence of race categories affect performance for own and other-race faces. Perceptual narrowing follows from the establishment of distinct race categories, suggesting that the acquisition of category boundaries for race is a key computational mechanism in developing face expertise. PMID:22709406
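
    To make "statistical estimation in face space" concrete, the following minimal Python sketch (illustrative only, not the author's algorithm) treats faces as points in a low-dimensional face space, accumulates experience as per-category Gaussian estimates with unequal sample sizes, and assigns a novel face to the most likely category:

        import numpy as np

        rng = np.random.default_rng(0)

        def fit_gaussian(faces):
            # Estimate a mean and diagonal variance from the observed face-space points.
            return faces.mean(axis=0), faces.var(axis=0) + 1e-3

        def log_likelihood(x, mu, var):
            # Log density of a diagonal Gaussian at face-space point x.
            return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

        # Simulated experience: many own-race exemplars, few other-race exemplars,
        # each face a point in a 10-dimensional face space (all values illustrative).
        own_race = rng.normal(loc=0.0, scale=1.0, size=(500, 10))
        other_race = rng.normal(loc=3.0, scale=1.0, size=(20, 10))

        categories = {
            "own": fit_gaussian(own_race),
            "other": fit_gaussian(other_race),
        }

        def categorize(face):
            # Assign the face to the category with the highest likelihood (flat prior over categories).
            scores = {name: log_likelihood(face, mu, var) for name, (mu, var) in categories.items()}
            return max(scores, key=scores.get)

        probe = rng.normal(loc=3.0, scale=1.0, size=10)  # a novel other-race face
        print(categorize(probe))

    In this toy picture the sparsely sampled category yields noisier density estimates, so individuating faces within it is harder than within the richly sampled one, which is the flavour of the narrowing effect studied in the paper.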

  10. Physical modeling of Tibetan bowls

    NASA Astrophysics Data System (ADS)

    Antunes, Jose; Inacio, Octavio

    2004-05-01

    Tibetan bowls produce rich penetrating sounds, used in musical contexts and to induce a state of relaxation for meditation or therapy purposes. To understand the dynamics of these instruments under impact and rubbing excitation, we developed a simulation method based on the modal approach, following our previous papers on physical modeling of plucked/bowed strings and impacted/bowed bars. This technique is based on a compact representation of the system dynamics, in terms of the unconstrained bowl modes. Nonlinear contact/friction interaction forces, between the exciter (puja) and the bowl, are computed at each time step and projected on the bowl modal basis, followed by step integration of the modal equations. We explore the behavior of two different-sized bowls, for extensive ranges of excitation conditions (contact/friction parameters, normal force, and tangential puja velocity). Numerical results and experiments show that various self-excited motions may arise depending on the playing conditions and, mainly, on the contact/friction interaction parameters. Indeed, triggering of a given bowl modal frequency mainly depends on the puja material. Computed animations and experiments demonstrate that self-excited modes spin, following the puja motion. Accordingly, the sensed pressure field pulsates, with frequency controlled by the puja spinning velocity and the spatial pattern of the singing mode.
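
    A minimal Python sketch of the modal time-stepping loop described above (illustrative only: the modal data, the friction law and the integrator stand in for the paper's more detailed models):

        import numpy as np

        # Hypothetical modal data for a small bowl model (unit modal masses assumed).
        freqs = np.array([314.0, 836.0, 1519.0])   # modal frequencies in Hz
        zetas = np.array([1e-4, 2e-4, 3e-4])       # modal damping ratios
        phi_c = np.array([0.8, -0.5, 0.3])         # mode shapes at the puja contact point
        omega = 2 * np.pi * freqs

        dt, n_steps = 1e-5, 20000
        q = np.zeros(3)    # modal displacements
        qd = np.zeros(3)   # modal velocities

        mu_fric, f_normal, v_puja = 0.3, 2.0, 0.2  # illustrative rubbing parameters

        for n in range(n_steps):
            # Physical surface velocity at the contact point from the modal superposition.
            v_c = phi_c @ qd
            # Simple velocity-dependent friction force between puja and bowl (regularized sign law).
            f_t = -mu_fric * f_normal * np.tanh((v_c - v_puja) / 0.01)
            # Project the physical force onto the modal basis and step each modal oscillator
            # (semi-implicit Euler stands in for the paper's step integration of the modal equations).
            qdd = phi_c * f_t - 2 * zetas * omega * qd - omega**2 * q
            qd += dt * qdd
            q += dt * qd

    Whether a given mode self-excites in such a loop depends on the friction parameters and the puja velocity, mirroring the dependence on the contact/friction interaction parameters reported above.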

  11. Taming Many-Parameter BSM Models with Bayesian Neural Networks

    NASA Astrophysics Data System (ADS)

    Kuchera, M. P.; Karbo, A.; Prosper, H. B.; Sanchez, A.; Taylor, J. Z.

    2017-09-01

    The search for physics Beyond the Standard Model (BSM) is a major focus of large-scale high energy physics experiments. One method is to look for specific deviations from the Standard Model that are predicted by BSM models. In cases where the model has a large number of free parameters, standard search methods become intractable due to computation time. This talk presents results using Bayesian Neural Networks, a supervised machine learning method, to enable the study of higher-dimensional models. The popular phenomenological Minimal Supersymmetric Standard Model was studied as an example of the feasibility and usefulness of this method. Graphics Processing Units (GPUs) are used to expedite the calculations. Cross-section predictions for 13 TeV proton collisions will be presented. My participation in the Conference Experience for Undergraduates (CEU) in 2004-2006 exposed me to the national and global significance of cutting-edge research. At the 2005 CEU, I presented work from the previous summer's SULI internship at Lawrence Berkeley Laboratory, where I learned to program while working on the Majorana Project. That work inspired me to follow a similar research path, which led me to my current work on computational methods applied to BSM physics.

  12. Electron and positron states in HgBa2CuO4

    NASA Astrophysics Data System (ADS)

    Barbiellini, B.; Jarlborg, T.

    1994-08-01

    Local-density-calculations of the electronic structure of HgBa2CuO4 have been performed with the self-consistent linear muffin-tin orbital method. The positron-density distribution and its sensitivity due to different potentials are calculated. The annihilation rates are computed in order to study the chemical bonding and to predict the Fermi-surface signal. Comparisons are made with previous calculations on other high-Tc copper oxides concerning the Fermi-surface properties and electron-positron overlap. We discuss the possibility of observing the Fermi surface associated with the Cu-O planes in positron-annihilation experiments.

  13. Improved formula for continuous-wave measurements of ultrasonic phase velocity

    NASA Technical Reports Server (NTRS)

    Chern, E. J.; Cantrell, J. H., Jr.; Heyman, J. S.

    1981-01-01

    An improved formula for continuous-wave ultrasonic phase velocity measurements using contact transducers is derived from the transmission line theory. The effect of transducer-sample coupling bonds is considered for measurements of solid samples even though it is often neglected because of the difficulty of accurately determining the bond thickness. Computer models show that the present formula is more accurate than previous expressions. Laboratory measurements using contacting transducers with the present formula are compared to measurements using noncontacting (hence effectively correction-free) capacitive transducers. The results of the experiments verify the validity and accuracy of the new formula.

  14. Application of M-JPEG compression hardware to dynamic stimulus production.

    PubMed

    Mulligan, J B

    1997-01-01

    Inexpensive circuit boards have appeared on the market which transform a normal micro-computer's disk drive into a video disk capable of playing extended video sequences in real time. This technology enables the performance of experiments which were previously impossible, or at least prohibitively expensive. The new technology achieves this capability using special-purpose hardware to compress and decompress individual video frames, enabling a video stream to be transferred over relatively low-bandwidth disk interfaces. This paper will describe the use of such devices for visual psychophysics and present the technical issues that must be considered when evaluating individual products.

  15. Fully coupled six-dimensional calculations of the water dimer vibration-rotation-tunneling states with split Wigner pseudospectral approach. II. Improvements and tests of additional potentials

    NASA Astrophysics Data System (ADS)

    Fellers, R. S.; Braly, L. B.; Saykally, R. J.; Leforestier, C.

    1999-04-01

    The SWPS method is improved by the addition of H.E.G. contractions for generating a more compact basis. An error in the definition of the internal fragment axis system used in our previous calculation is described and corrected. Fully coupled 6D (rigid monomers) VRT states are computed for several new water dimer potential surfaces and compared with experiment and our earlier SWPS results. This work sets the stage for refinement of such potential surfaces via regression analysis of VRT spectroscopic data.

  16. MRPrimerW: a tool for rapid design of valid high-quality primers for multiple target qPCR experiments

    PubMed Central

    Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo

    2016-01-01

    Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider both homology tests on off-target sequences and the same stringent filtering constraints on the primers. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests, lack of support for ranking of primers, TaqMan probes and simultaneous design of primers against multiple targets. Due to the large-scale computational overhead, the few web servers supporting homology tests use heuristic approaches or perform homology tests within a limited scope. Here, we describe the MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes and ranks the resulting primers to return the top-1 best primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it to allow users to receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. PMID:27154272

  17. On the structure of crystalline and molten cryolite: Insights from the ab initio molecular dynamics in NpT ensemble

    NASA Astrophysics Data System (ADS)

    Bučko, Tomáš; Šimko, František

    2016-02-01

    Ab initio molecular dynamics simulations in the isobaric-isothermal ensemble have been performed to study the low- and the high-temperature crystalline and liquid phases of cryolite. The temperature-induced transitions from the low-temperature solid phase (α) to the high-temperature solid phase (β) and from the phase β to the liquid phase have been simulated using a series of MD runs performed at gradually increasing temperature. The structure of the crystalline and liquid phases is analysed in detail and our computational approach is shown to reliably reproduce the available experimental data for a wide range of temperatures. Relatively frequent reorientations of the AlF6 octahedra observed in our simulation of the phase β explain the thermal disorder in the positions of the F- ions observed in X-ray diffraction experiments. The isolated AlF6(3-), AlF5(2-) and AlF4(-) ions, as well as the bridged Al2Fm(6-m) ionic entities, have been identified as the main constituents of the cryolite melt. In accord with previous high-temperature NMR and Raman spectroscopic experiments, the compound AlF5(2-) has been shown to be the most abundant Al-containing species formed in the melt. The characteristic vibrational frequencies for the AlFn(3-n) species in a realistic environment have been determined and the computed values have been found to be in good agreement with experiment.

  18. Optimization of Multi-Fidelity Computer Experiments via the EQIE Criterion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Xu; Tuo, Rui; Jeff Wu, C. F.

    Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs with a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. From simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion, which works for single-accuracy experiments.

  19. Optimization of Multi-Fidelity Computer Experiments via the EQIE Criterion

    DOE PAGES

    He, Xu; Tuo, Rui; Jeff Wu, C. F.

    2017-01-31

    Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs with a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. From simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion, which works for single-accuracy experiments.

  20. On the spectroscopic constants, first electronic state, vibrational frequencies, and isomerization of hydroxymethylene (HCOH+)

    NASA Astrophysics Data System (ADS)

    Theis, Riley A.; Fortenberry, Ryan C.

    2017-09-01

    The hydroxymethylene cation (HCOH+) is believed to be chemically independent of the more stable formaldehyde cation isomer in interstellar chemistry and may likely be a precursor to methanol in chemical reaction networks. Previous work is corroborated here showing that the trans conformer of HCOH+ is 3.48 kcal/mol lower than the cis on the potential energy surface. The small energy difference between the conformers and the much larger dipole moment of cis-HCOH+ (2.73 D) make this conformer more likely to be observed than trans-HCOH+ via telescopic rotational spectroscopy. A strong adiabatic shift is also predicted in the first electronic excitation into the 1 ²A″/2 ²A state out of either conformer into a C1 structure, reducing the excitation wavelength from the near-ultraviolet all the way into the near-infrared. The full set of fundamental vibrational frequencies is also computed here at a high level of theory. The 3306.0 cm⁻¹ and 3225.3 cm⁻¹ hydroxide stretches, of bare trans- and cis-HCOH+ respectively, are in agreement with previous theory but are significantly higher than the frequencies determined from previous experiment utilizing argon tagging techniques. This shift is likely because the proton-bound complex created with the argon tag reduces the experimental frequencies. Lower-level computations including the argon tag bring the hydroxide stretches much closer to the experimental frequencies, indicating that the predicted frequencies for bare HCOH+ are likely well-described.

  1. Embodied conversational agents for multimodal automated social skills training in people with autism spectrum disorders.

    PubMed

    Tanaka, Hiroki; Negoro, Hideki; Iwasaka, Hidemi; Nakamura, Satoshi

    2017-01-01

    Social skills training, performed by human trainers, is a well-established method for obtaining appropriate skills in social interaction. Previous work automated the process of social skills training by developing a dialogue system that teaches social communication skills through interaction with a computer avatar. Even though previous work that simulated social skills training only considered acoustic and linguistic information, human social skills trainers take into account visual and other non-verbal features. In this paper, we create and evaluate a social skills training system that closes this gap by considering the audiovisual features of the smiling ratio and the head pose (yaw and pitch). In addition, the previous system was only tested with graduate students; in this paper, we applied our system to children or young adults with autism spectrum disorders. For our experimental evaluation, we recruited 18 members from the general population and 10 people with autism spectrum disorders and gave them our proposed multimodal system to use. An experienced human social skills trainer rated the social skills of the users. We evaluated the system's effectiveness by comparing pre- and post-training scores and identified significant improvement in their social skills using our proposed multimodal system. Computer-based social skills training is useful for people who experience social difficulties. Such a system can be used by teachers, therapists, and social skills trainers for rehabilitation and the supplemental use of human-based training anywhere and anytime.

  2. Embodied conversational agents for multimodal automated social skills training in people with autism spectrum disorders

    PubMed Central

    Negoro, Hideki; Iwasaka, Hidemi; Nakamura, Satoshi

    2017-01-01

    Social skills training, performed by human trainers, is a well-established method for obtaining appropriate skills in social interaction. Previous work automated the process of social skills training by developing a dialogue system that teaches social communication skills through interaction with a computer avatar. Even though previous work that simulated social skills training only considered acoustic and linguistic information, human social skills trainers take into account visual and other non-verbal features. In this paper, we create and evaluate a social skills training system that closes this gap by considering the audiovisual features of the smiling ratio and the head pose (yaw and pitch). In addition, the previous system was only tested with graduate students; in this paper, we applied our system to children or young adults with autism spectrum disorders. For our experimental evaluation, we recruited 18 members from the general population and 10 people with autism spectrum disorders and gave them our proposed multimodal system to use. An experienced human social skills trainer rated the social skills of the users. We evaluated the system’s effectiveness by comparing pre- and post-training scores and identified significant improvement in their social skills using our proposed multimodal system. Computer-based social skills training is useful for people who experience social difficulties. Such a system can be used by teachers, therapists, and social skills trainers for rehabilitation and the supplemental use of human-based training anywhere and anytime. PMID:28796781

  3. Analysis of the Source Physics Experiment SPE4 Prime Using State-of-the-Art Parallel Numerical Tools.

    NASA Astrophysics Data System (ADS)

    Vorobiev, O.; Ezzedine, S. M.; Antoun, T.; Glenn, L.

    2015-12-01

    This work describes a methodology used for large-scale modeling of wave propagation from underground chemical explosions conducted in fractured granitic rock at the Nevada National Security Site (NNSS). We show that the discrete nature of rock masses as well as the spatial variability in the fabric of rock properties are very important for understanding ground motions induced by underground explosions. In order to build a credible conceptual model of the subsurface, we integrated the geological, geomechanical and geophysical characterizations conducted during recent tests at the NNSS as well as historical data from the characterization performed during the underground nuclear tests conducted at the NNSS. Because detailed site characterization is limited, expensive and, in some instances, impossible, we have numerically investigated the effects of the characterization gaps on the overall response of the system. We performed several computational studies to identify the key geologic features specific to fractured media, mainly the joints characterized at the NNSS. We have also explored features common to both geological environments, such as saturation and topography, and assessed which characteristics most affect the ground motion in the near-field and in the far-field. Stochastic representations of these features based on the field characterizations have been implemented into LLNL's Geodyn-L hydrocode. Simulations were used to guide site characterization efforts in order to provide the essential data to the modeling community. We validate our computational results by comparing the measured and computed ground motion at various ranges for the recently executed SPE4 prime experiment. We have also conducted a comparative study between SPE4 prime and the previous experiments SPE1 and SPE3 to assess similarities and differences and to draw conclusions for designing SPE5.

  4. Metronome LKM: An open source virtual keyboard driver to measure experiment software latencies.

    PubMed

    Garaizar, Pablo; Vadillo, Miguel A

    2017-10-01

    Experiment software is often used to measure reaction times gathered with keyboards or other input devices. In previous studies, the accuracy and precision of time stamps has been assessed through several means: (a) generating accurate square wave signals from an external device connected to the parallel port of the computer running the experiment software, (b) triggering the typematic repeat feature of some keyboards to get an evenly separated series of keypress events, or (c) using a solenoid handled by a microcontroller to press the input device (keyboard, mouse button, touch screen) that will be used in the experimental setup. Despite the advantages of these approaches in some contexts, none of them can isolate the measurement error caused by the experiment software itself. Metronome LKM provides a virtual keyboard to assess an experiment's software. Using this open source driver, researchers can generate keypress events using high-resolution timers and compare the time stamps collected by the experiment software with those gathered by Metronome LKM (with nanosecond resolution). Our software is highly configurable (in terms of keys pressed, intervals, SysRq activation) and runs on 2.6-4.8 Linux kernels.
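
    The measurement itself reduces to comparing two timestamp streams for the same synthetic keypresses. A minimal Python sketch of that comparison (the file names and one-timestamp-per-line format are hypothetical, not Metronome LKM's documented output):

        import statistics

        def load_timestamps(path):
            # One timestamp per line, in nanoseconds (hypothetical log format).
            with open(path) as f:
                return [int(line) for line in f if line.strip()]

        driver_ts = load_timestamps("metronome_events.log")       # when the virtual keypress was generated
        experiment_ts = load_timestamps("experiment_events.log")  # when the experiment software stamped it

        # Pair the events one-to-one and express the software-induced latency in milliseconds.
        latencies_ms = [(e - d) / 1e6 for d, e in zip(driver_ts, experiment_ts)]

        print(f"n={len(latencies_ms)}  mean={statistics.mean(latencies_ms):.3f} ms  "
              f"sd={statistics.stdev(latencies_ms):.3f} ms  max={max(latencies_ms):.3f} ms")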

  5. Response Surface Model Building Using Orthogonal Arrays for Computer Experiments

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Braun, Robert D.; Moore, Arlene A.; Lepsch, Roger A.

    1997-01-01

    This study investigates response surface methods for computer experiments and discusses some of the approaches available. Orthogonal arrays constructed for computer experiments are studied and an example application to a technology selection and optimization study for a reusable launch vehicle is presented.

  6. Using the Computer as a Laboratory Instrument.

    ERIC Educational Resources Information Center

    Collings, Peter J.; Greenslade, Thomas B., Jr.

    1989-01-01

    Reports experiences during a two-year period in introducing the computer to the laboratory and students to the computer as a laboratory instrument. Describes a working philosophy, data acquisition system, and experiments. Summarizes the laboratory procedures of nine experiments, covering mechanics, heat, electromagnetism, and optics. (YP)

  7. The Fabric for Frontier Experiments Project at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirby, Michael

    2014-01-01

    The FabrIc for Frontier Experiments (FIFE) project is a new, far-reaching initiative within the Fermilab Scientific Computing Division to drive the future of computing services for experiments at FNAL and elsewhere. It is a collaborative effort between computing professionals and experiment scientists to produce an end-to-end, fully integrated set of services for computing on the grid and clouds, managing data, accessing databases, and collaborating within experiments. FIFE includes 1) easy to use job submission services for processing physics tasks on the Open Science Grid and elsewhere, 2) an extensive data management system for managing local and remote caches, cataloging, querying, moving, and tracking the use of data, 3) custom and generic database applications for calibrations, beam information, and other purposes, 4) collaboration tools including an electronic log book, speakers bureau database, and experiment membership database. All of these aspects will be discussed in detail. FIFE sets the direction of computing at Fermilab experiments now and in the future, and therefore is a major driver in the design of computing services worldwide.

  8. The Information Science Experiment System - The computer for science experiments in space

    NASA Technical Reports Server (NTRS)

    Foudriat, Edwin C.; Husson, Charles

    1989-01-01

    The concept of the Information Science Experiment System (ISES), potential experiments, and system requirements are reviewed. The ISES is conceived as a computer resource in space whose aim is to assist computer, earth, and space science experiments, to develop and demonstrate new information processing concepts, and to provide an experiment base for developing new information technology for use in space systems. The discussion covers system hardware and architecture, operating system software, the user interface, and the ground communication link.

  9. Nonlocal Intuition: Replication and Paired-subjects Enhancement Effects

    PubMed Central

    Mirzaei, Maryam; Zali, Mohammad Reza

    2014-01-01

    This article reports the results of a study of repeat entrepreneurs in Tehran, Iran, in which nonlocal intuition was investigated in a replication and extension of experiment using measures of heart rate variability (HRV). Nonlocal intuition is the perception of information about a distant or future event by the body's psychophysiological systems, which is not based on reason or memories of prior experience. This study follows up on the McCraty, Radin, and Bradley studies, which found evidence of nonlocal intuition. We used Radin's experimental protocol, with the addition of HRV measures as in the McCraty studies involving computer administration of a random sequence of calm and emotional pictures as the stimulus, and conducted two experiments on mutually exclusive samples—the first on a group of single participants (N=15) and the second on a group of co-participant pairs (N=30)—to investigate the question of the “amplification” of intuition effects by social connection. Each experiment was conducted over 45 trials while heart rate rhythm activity was recorded continuously. Results, using random permutation analysis, a statistically conservative procedure, show significant pre-stimulus results—that is, for the period before the computer had randomly selected the picture stimulus—for both experiments. Moreover, while significant separation between the emotional and calm HRV curves was observed in the single-participant experiment, an even larger separation was apparent for the experiment on co-participant pairs; the difference between the two groups was also significant. Overall, the results of the single-participant experiment confirm previous finding: that electrophysiological measures, especially changes in the heart rhythm, can detect intuitive foreknowledge. This result is notable because it constitutes cross-cultural corroboration in a non-Western context—namely, Iran. In addition, the results for co-participant pairs offer new evidence on the amplification of the nonlocal intuition signal. PMID:24808977

  10. Development of Computer-Based Experiment Set on Simple Harmonic Motion of Mass on Springs

    ERIC Educational Resources Information Center

    Musik, Panjit

    2017-01-01

    The development of a computer-based experiment set has become necessary for teaching physics in schools so that students can learn from real experiences. The purpose of this study is to create and develop a computer-based experiment set on the simple harmonic motion of a mass on springs for teaching and learning physics. The average period of…

  11. Computational wing optimization and comparisons with experiment for a semi-span wing model

    NASA Technical Reports Server (NTRS)

    Waggoner, E. G.; Haney, H. P.; Ballhaus, W. F.

    1978-01-01

    A computational wing optimization procedure was developed and verified by an experimental investigation of a semi-span variable camber wing model in the NASA Ames Research Center 14 foot transonic wind tunnel. The Bailey-Ballhaus transonic potential flow analysis and Woodward-Carmichael linear theory codes were linked to Vanderplaats constrained minimization routine to optimize model configurations at several subsonic and transonic design points. The 35 deg swept wing is characterized by multi-segmented leading and trailing edge flaps whose hinge lines are swept relative to the leading and trailing edges of the wing. By varying deflection angles of the flap segments, camber and twist distribution can be optimized for different design conditions. Results indicate that numerical optimization can be both an effective and efficient design tool. The optimized configurations had as good or better lift to drag ratios at the design points as the best designs previously tested during an extensive parametric study.

  12. On Favorable Thermal Fields for Detached Bridgman Growth

    NASA Technical Reports Server (NTRS)

    Stelian, Carmen; Volz, Martin P.; Derby, Jeffrey J.

    2009-01-01

    The thermal fields of two Bridgman-like configurations, representative of real systems used in prior experiments for the detached growth of CdTe and Ge crystals, are studied. These detailed heat transfer computations are performed using the CrysMAS code and expand upon our previous analyses [14] that posited a new mechanism involving the thermal field and meniscus position to explain stable conditions for dewetted Bridgman growth. Computational results indicate that heat transfer conditions that led to successful detached growth in both of these systems are in accordance with our prior assertion, namely that the prevention of crystal reattachment to the crucible wall requires the avoidance of any undercooling of the melt meniscus during the growth run. Significantly, relatively simple process modifications that promote favorable thermal conditions for detached growth may overcome detrimental factors associated with meniscus shape and crucible wetting. Thus, these ideas may be important to advance the practice of detached growth for many materials.

  13. A study of oceanic surface heat fluxes in the Greenland, Norwegian, and Barents Seas

    NASA Technical Reports Server (NTRS)

    Hakkinen, Sirpa; Cavalieri, Donald J.

    1989-01-01

    This study examines oceanic surface heat fluxes in the Norwegian, Greenland, and Barents seas using the gridded Navy Fleet Numerical Oceanography Central surface analysis and the First GARP Global Experiment (FGGE) IIc cloudiness data bases. Monthly and annual means of net and turbulent heat fluxes are computed for the FGGE year 1979. The FGGE IIb data base consisting of individual observations provides particularly good data coverage in this region for a comparison with the gridded Navy winds and air temperatures. The standard errors of estimate between the Navy and FGGE IIb winds and air temperatures are 3.6 m/s and 2.5 C, respectively. The computations for the latent and sensible heat fluxes are based on bulk formulas with the same constant heat exchange coefficient of 0.0015. The results show extremely strong wintertime heat fluxes in the northern Greenland Sea and especially in the Barents Sea in contrast to previous studies.
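    For illustration only, the sketch below applies bulk formulas of the kind described above, with a constant heat exchange coefficient of 0.0015 for both turbulent fluxes; the air density, specific heat, latent heat of vaporization, and the example wind, temperature, and humidity values are generic assumptions rather than values from the study.

```python
import numpy as np

# Minimal sketch of bulk-formula turbulent heat fluxes with a constant
# exchange coefficient, as described in the abstract. The physical constants
# and example inputs are illustrative standard values, not from the study.
RHO_AIR = 1.3        # kg m^-3
CP_AIR = 1004.0      # J kg^-1 K^-1
L_VAP = 2.5e6        # J kg^-1
C_E = C_H = 0.0015   # exchange coefficient used for both fluxes

def sensible_heat_flux(wind_speed, t_sea, t_air):
    """Sensible heat flux (W m^-2), positive from ocean to atmosphere."""
    return RHO_AIR * CP_AIR * C_H * wind_speed * (t_sea - t_air)

def latent_heat_flux(wind_speed, q_sea, q_air):
    """Latent heat flux (W m^-2) from specific humidities (kg/kg)."""
    return RHO_AIR * L_VAP * C_E * wind_speed * (q_sea - q_air)

# Example: strong wintertime conditions over open water
print(sensible_heat_flux(12.0, 275.0, 260.0))   # ~ 350 W m^-2
print(latent_heat_flux(12.0, 0.0037, 0.0015))   # ~ 130 W m^-2
```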

  14. High-resolution coded-aperture design for compressive X-ray tomography using low resolution detectors

    NASA Astrophysics Data System (ADS)

    Mojica, Edson; Pertuz, Said; Arguello, Henry

    2017-12-01

    One of the main challenges in Computed Tomography (CT) is obtaining accurate reconstructions of the imaged object while keeping a low radiation dose in the acquisition process. In order to solve this problem, several researchers have proposed the use of compressed sensing for reducing the amount of measurements required to perform CT. This paper tackles the problem of designing high-resolution coded apertures for compressed sensing computed tomography. In contrast to previous approaches, we aim at designing apertures to be used with low-resolution detectors in order to achieve super-resolution. The proposed method iteratively improves random coded apertures using a gradient descent algorithm subject to constraints in the coherence and homogeneity of the compressive sensing matrix induced by the coded aperture. Experiments with different test sets show consistent results for different transmittances, number of shots and super-resolution factors.

  15. Robust and Imperceptible Watermarking of Video Streams for Low Power Devices

    NASA Astrophysics Data System (ADS)

    Ishtiaq, Muhammad; Jaffar, M. Arfan; Khan, Muhammad A.; Jan, Zahoor; Mirza, Anwar M.

    With the advent of the internet, every aspect of life is going online. From online working to watching videos, everything is now available on the internet. With the greater business benefits, increased availability and other online business advantages, there is a major challenge of security and ownership of data. Videos downloaded from an online store can easily be shared among non-intended or unauthorized users. Invisible watermarking is used to hide copyright protection information in the videos. Existing watermarking methods are less robust and imperceptible, and their computational complexity does not suit low-power devices. In this paper, we propose a new method to address the problem of robustness and imperceptibility. Experiments have shown that our method has better robustness and imperceptibility and is more computationally efficient than previous approaches in practice. Hence, our method can easily be applied on low-power devices.

  16. Memoized Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Yang, Guowei; Pasareanu, Corina S.; Khurshid, Sarfraz

    2012-01-01

    This paper introduces memoized symbolic execution (Memoise), a novel approach for more efficient application of forward symbolic execution, which is a well-studied technique for systematic exploration of program behaviors based on bounded execution paths. Our key insight is that application of symbolic execution often requires several successive runs of the technique on largely similar underlying problems, e.g., running it once to check a program to find a bug, fixing the bug, and running it again to check the modified program. Memoise introduces a trie-based data structure that stores the key elements of a run of symbolic execution. Maintenance of the trie during successive runs allows re-use of previously computed results of symbolic execution without the need for re-computing them as is traditionally done. Experiments using our prototype embodiment of Memoise show the benefits it holds in various standard scenarios of using symbolic execution, e.g., with iterative deepening of exploration depth, to perform regression analysis, or to enhance coverage.
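    The sketch below is an illustrative (not the authors') rendering of the core idea: a trie keyed by branch decisions records which subtrees of the symbolic execution tree have already been fully explored, so a later run on a largely unchanged program can skip them. The class and method names are hypothetical.

```python
# Illustrative sketch (not the Memoise implementation) of a trie that
# records the branch decisions explored by one symbolic-execution run so a
# later run can skip subtrees whose constraints are unchanged.
class TrieNode:
    def __init__(self, constraint=None):
        self.constraint = constraint      # branch condition taken to reach node
        self.children = {}                # decision (True/False) -> TrieNode
        self.exhausted = False            # subtree fully explored in a prior run

class MemoTrie:
    def __init__(self):
        self.root = TrieNode()

    def record_path(self, decisions):
        """Store one execution path as a list of (constraint, taken) pairs."""
        node = self.root
        for constraint, taken in decisions:
            node = node.children.setdefault(taken, TrieNode(constraint))
        node.exhausted = True

    def can_skip(self, decisions):
        """Return True if this path prefix was fully explored previously."""
        node = self.root
        for _, taken in decisions:
            if taken not in node.children:
                return False
            node = node.children[taken]
        return node.exhausted

trie = MemoTrie()
trie.record_path([("x > 0", True), ("y < 5", False)])
print(trie.can_skip([("x > 0", True), ("y < 5", False)]))  # True
```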

  17. Masking of Figure-Ground Texture and Single Targets by Surround Inhibition: A Computational Spiking Model

    PubMed Central

    Supèr, Hans; Romeo, August

    2012-01-01

    A visual stimulus can be made invisible, i.e. masked, by the presentation of a second stimulus. In the sensory cortex, neural responses to a masked stimulus are suppressed, yet how this suppression comes about is still debated. Inhibitory models explain masking by asserting that the mask exerts an inhibitory influence on the responses of a neuron evoked by the target. However, other models argue that the masking interferes with recurrent or reentrant processing. Using computer modeling, we show that surround inhibition evoked by ON and OFF responses to the mask suppresses the responses to a briefly presented stimulus in forward and backward masking paradigms. Our model results resemble several previously described psychophysical and neurophysiological findings in perceptual masking experiments and are in line with earlier theoretical descriptions of masking. We suggest that precise spatiotemporal influence of surround inhibition is relevant for visual detection. PMID:22393370

  18. Active classifier selection for RGB-D object categorization using a Markov random field ensemble method

    NASA Astrophysics Data System (ADS)

    Durner, Maximilian; Márton, Zoltán.; Hillenbrand, Ulrich; Ali, Haider; Kleinsteuber, Martin

    2017-03-01

    In this work, a new ensemble method for the task of category recognition in different environments is presented. The focus is on service robotic perception in an open environment, where the robot's task is to recognize previously unseen objects of predefined categories, based on training on a public dataset. We propose an ensemble learning approach to be able to flexibly combine complementary sources of information (different state-of-the-art descriptors computed on color and depth images), based on a Markov Random Field (MRF). By exploiting its specific characteristics, the MRF ensemble method can also be executed as a Dynamic Classifier Selection (DCS) system. In the experiments, the committee- and topology-dependent performance boost of our ensemble is shown. Despite reduced computational costs and using less information, our strategy performs on the same level as common ensemble approaches. Finally, the impact of large differences between datasets is analyzed.

  19. Neural Basis of Reinforcement Learning and Decision Making

    PubMed Central

    Lee, Daeyeol; Seo, Hyojung; Jung, Min Whan

    2012-01-01

    Reinforcement learning is an adaptive process in which an animal utilizes its previous experience to improve the outcomes of future choices. Computational theories of reinforcement learning play a central role in the newly emerging areas of neuroeconomics and decision neuroscience. In this framework, actions are chosen according to their value functions, which describe how much future reward is expected from each action. Value functions can be adjusted not only through reward and penalty, but also by the animal’s knowledge of its current environment. Studies have revealed that a large proportion of the brain is involved in representing and updating value functions and using them to choose an action. However, how the nature of a behavioral task affects the neural mechanisms of reinforcement learning remains incompletely understood. Future studies should uncover the principles by which different computational elements of reinforcement learning are dynamically coordinated across the entire brain. PMID:22462543
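    As a concrete illustration of value functions being adjusted by reward feedback, the sketch below shows a minimal tabular Q-learning update; the states, actions, rewards, and learning parameters are purely illustrative and are not drawn from the review.

```python
import random

# Minimal tabular Q-learning sketch illustrating how action value functions
# are updated from reward feedback; states, actions, and rewards are toy values.
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1
q = {(s, a): 0.0 for s in range(3) for a in ("left", "right")}

def choose(state):
    if random.random() < EPSILON:                                # explore
        return random.choice(("left", "right"))
    return max(("left", "right"), key=lambda a: q[(state, a)])   # exploit

def update(state, action, reward, next_state):
    best_next = max(q[(next_state, a)] for a in ("left", "right"))
    # Move the value estimate toward the observed reward plus discounted future value.
    q[(state, action)] += ALPHA * (reward + GAMMA * best_next - q[(state, action)])

state = 0
action = choose(state)
update(state, action, 1.0, 1)
print(action, q[(state, action)])   # the chosen action's value moves toward the reward
```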

  20. Chemical and protein structural basis for biological crosstalk between PPAR α and COX enzymes

    NASA Astrophysics Data System (ADS)

    Cleves, Ann E.; Jain, Ajay N.

    2015-02-01

    We have previously validated a probabilistic framework that combined computational approaches for predicting the biological activities of small molecule drugs. Molecule comparison methods included molecular structural similarity metrics and similarity computed from lexical analysis of text in drug package inserts. Here we present an analysis of novel drug/target predictions, focusing on those that were not obvious based on known pharmacological crosstalk. Considering those cases where the predicted target was an enzyme with known 3D structure allowed incorporation of information from molecular docking and protein binding pocket similarity in addition to ligand-based comparisons. Taken together, the combination of orthogonal information sources led to investigation of a surprising predicted relationship between a transcription factor and an enzyme, specifically, PPAR α and the cyclooxygenase enzymes. These predictions were confirmed by direct biochemical experiments which validate the approach and show for the first time that PPAR α agonists are cyclooxygenase inhibitors.

  1. Optimized Quasi-Interpolators for Image Reconstruction.

    PubMed

    Sacht, Leonardo; Nehab, Diego

    2015-12-01

    We propose new quasi-interpolators for the continuous reconstruction of sampled images, combining a narrowly supported piecewise-polynomial kernel and an efficient digital filter. In other words, our quasi-interpolators fit within the generalized sampling framework and are straightforward to use. We go against standard practice and optimize for approximation quality over the entire Nyquist range, rather than focusing exclusively on the asymptotic behavior as the sample spacing goes to zero. In contrast to previous work, we jointly optimize with respect to all degrees of freedom available in both the kernel and the digital filter. We consider linear, quadratic, and cubic schemes, offering different tradeoffs between quality and computational cost. Experiments with compounded rotations and translations over a range of input images confirm that, due to the additional degrees of freedom and the more realistic objective function, our new quasi-interpolators perform better than the state of the art, at a similar computational cost.

  2. HIV-1 Strategies of Immune Evasion

    NASA Astrophysics Data System (ADS)

    Castiglione, F.; Bernaschi, M.

    We simulate the progression of the HIV-1 infection in untreated host organisms. The phenotype features of the virus are represented by the replication rate, the probability of activating the transcription, the mutation rate and the capacity to stimulate an immune response (the so-called immunogenicity). It is very difficult to study in-vivo or in-vitro how these characteristics of the virus influence the evolution of the disease. Therefore we resorted to simulations based on a computer model validated in previous studies. We observe, by means of computer experiments, that the virus continuously evolves under the selective pressure of an immune response whose effectiveness downgrades along with the disease progression. The results of the simulations show that immunogenicity is the most important factor in determining the rate of disease progression but, by itself, it is not sufficient to drive the disease to a conclusion in all cases.

  3. Democratizing Children's Computation: Learning Computational Science as Aesthetic Experience

    ERIC Educational Resources Information Center

    Farris, Amy Voss; Sengupta, Pratim

    2016-01-01

    In this essay, Amy Voss Farris and Pratim Sengupta argue that a democratic approach to children's computing education in a science class must focus on the "aesthetics" of children's experience. In "Democracy and Education," Dewey links "democracy" with a distinctive understanding of "experience." For Dewey,…

  4. Extent of hydrogen coverage of Si(001) under chemical vapor deposition conditions from ab initio approaches

    NASA Astrophysics Data System (ADS)

    Rosenow, Phil; Tonner, Ralf

    2016-05-01

    The extent of hydrogen coverage of the Si(001) c(4 × 2) surface in the presence of hydrogen gas has been studied with dispersion corrected density functional theory. Electronic energy contributions are well described using a hybrid functional. The temperature dependence of the coverage in thermodynamic equilibrium was studied computing the phonon spectrum in a supercell approach. As an approximation to these demanding computations, an interpolated phonon approach was found to give comparable accuracy. The simpler ab initio thermodynamic approach is not accurate enough for the system studied, even if corrections by the Einstein model for surface vibrations are considered. The on-set of H2 desorption from the fully hydrogenated surface is predicted to occur at temperatures around 750 K. Strong changes in hydrogen coverage are found between 1000 and 1200 K in good agreement with previous reflectance anisotropy spectroscopy experiments. These findings allow a rational choice for the surface state in the computational treatment of chemical reactions under typical metal organic vapor phase epitaxy conditions on Si(001).

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoban, Matty J.; Department of Computer Science, University of Oxford, Wolfson Building, Parks Road, Oxford OX1 3QD; Wallman, Joel J.

    We consider general settings of Bell inequality experiments with many parties, where each party chooses from a finite number of measurement settings each with a finite number of outcomes. We investigate the constraints that Bell inequalities place upon the correlations possible in local hidden variable theories using a geometrical picture of correlations. We show that local hidden variable theories can be characterized in terms of limited computational expressiveness, which allows us to characterize families of Bell inequalities. The limited computational expressiveness for many settings (each with many outcomes) generalizes previous results about the many-party situation where each party has a choice of two possible measurements (each with two outcomes). Using this computational picture we present generalizations of the Popescu-Rohrlich nonlocal box for many parties and nonbinary inputs and outputs at each site. Finally, we comment on the effect of preprocessing on measurement data in our generalized setting and show that it becomes problematic outside of the binary setting, in that it allows local hidden variable theories to simulate maximally nonlocal correlations such as those of these generalized Popescu-Rohrlich nonlocal boxes.
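    The binary Popescu-Rohrlich box referred to above can be written down concretely: its outputs satisfy a XOR b = x AND y with probability one, which pushes the CHSH expression to its algebraic maximum of 4, beyond both the local-hidden-variable bound of 2 and the quantum bound of about 2.83. The short check below is illustrative and does not reproduce the paper's many-party, many-outcome generalizations.

```python
import itertools

# Sketch of the binary Popescu-Rohrlich (PR) box: outputs a, b in {0, 1}
# satisfy a XOR b = x AND y with probability 1. Its CHSH value (4) exceeds
# both the local-hidden-variable bound (2) and the quantum bound (~2.828).
def pr_box_probability(a, b, x, y):
    return 0.5 if (a ^ b) == (x & y) else 0.0

def correlator(x, y, prob):
    """E(x, y) = sum_{a,b} (-1)^(a+b) P(a, b | x, y)."""
    return sum((-1) ** (a + b) * prob(a, b, x, y)
               for a, b in itertools.product((0, 1), repeat=2))

chsh = (correlator(0, 0, pr_box_probability) + correlator(0, 1, pr_box_probability)
        + correlator(1, 0, pr_box_probability) - correlator(1, 1, pr_box_probability))
print(chsh)  # 4.0, the algebraic maximum
```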

  6. Extent of hydrogen coverage of Si(001) under chemical vapor deposition conditions from ab initio approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenow, Phil; Tonner, Ralf, E-mail: tonner@chemie.uni-marburg.de

    2016-05-28

    The extent of hydrogen coverage of the Si(001) c(4 × 2) surface in the presence of hydrogen gas has been studied with dispersion corrected density functional theory. Electronic energy contributions are well described using a hybrid functional. The temperature dependence of the coverage in thermodynamic equilibrium was studied computing the phonon spectrum in a supercell approach. As an approximation to these demanding computations, an interpolated phonon approach was found to give comparable accuracy. The simpler ab initio thermodynamic approach is not accurate enough for the system studied, even if corrections by the Einstein model for surface vibrations are considered. The on-set of H2 desorption from the fully hydrogenated surface is predicted to occur at temperatures around 750 K. Strong changes in hydrogen coverage are found between 1000 and 1200 K in good agreement with previous reflectance anisotropy spectroscopy experiments. These findings allow a rational choice for the surface state in the computational treatment of chemical reactions under typical metal organic vapor phase epitaxy conditions on Si(001).

  7. The application of embodied conversational agents for mentoring African American STEM doctoral students

    NASA Astrophysics Data System (ADS)

    Gosha, Kinnis

    This dissertation presents the design, development and short-term evaluation of an embodied conversational agent designed to mentor human users. An embodied conversational agent (ECA) was created and programmed to mentor African American computer science majors on their decision to pursue graduate study in computing. Before constructing the ECA, previous research in the fields of embodied conversational agents, relational agents, mentorship, telementorship and successful mentoring programs and practices for African American graduate students was reviewed. A survey was used to find areas of interest of the sample population. Experts were then interviewed to collect information on those areas of interest and a dialogue for the ECA was constructed based on the interview transcripts. A between-group, mixed-method experiment was conducted with 37 African American male undergraduate computer science majors where one group used the ECA mentor while the other group pursued mentoring advice from a human mentor. Results showed no significant difference between the ECA and human mentor when dealing with career mentoring functions. However, the human mentor was significantly better than the ECA mentor when addressing psychosocial mentoring functions.

  8. An optimization approach for fitting canonical tensor decompositions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunlavy, Daniel M.; Acar, Evrim; Kolda, Tamara Gibson

    Tensor decompositions are higher-order analogues of matrix decompositions and have proven to be powerful tools for data analysis. In particular, we are interested in the canonical tensor decomposition, otherwise known as the CANDECOMP/PARAFAC decomposition (CPD), which expresses a tensor as the sum of component rank-one tensors and is used in a multitude of applications such as chemometrics, signal processing, neuroscience, and web analysis. The task of computing the CPD, however, can be difficult. The typical approach is based on alternating least squares (ALS) optimization, which can be remarkably fast but is not very accurate. Previously, nonlinear least squares (NLS) methods have also been recommended; existing NLS methods are accurate but slow. In this paper, we propose the use of gradient-based optimization methods. We discuss the mathematical calculation of the derivatives and further show that they can be computed efficiently, at the same cost as one iteration of ALS. Computational experiments demonstrate that the gradient-based optimization methods are much more accurate than ALS and orders of magnitude faster than NLS.
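    For a third-order tensor, the gradient of the CP fitting objective with respect to one factor matrix has the well-known closed form dF/dA = -X_(1)(C ⊙ B) + A((B^T B) * (C^T C)), whose dominant cost is the same matricized-tensor-times-Khatri-Rao product used in one ALS step. The sketch below evaluates this gradient with einsum; the tensor sizes, rank, and step size are illustrative and the snippet is not the authors' implementation.

```python
import numpy as np

# Sketch of the gradient of the CP fitting objective
# f(A, B, C) = 0.5 * ||X - [[A, B, C]]||_F^2  for a third-order tensor X,
# written with einsum instead of explicit unfoldings. Sizes are illustrative.
rng = np.random.default_rng(0)
I, J, K, R = 6, 5, 4, 3
A, B, C = rng.normal(size=(I, R)), rng.normal(size=(J, R)), rng.normal(size=(K, R))
X = np.einsum('ir,jr,kr->ijk', A, B, C) + 0.01 * rng.normal(size=(I, J, K))

def cp_gradient_A(X, A, B, C):
    """dF/dA = -X_(1)(C ⊙ B) + A((B^T B) * (C^T C)), via einsum."""
    mttkrp = np.einsum('ijk,jr,kr->ir', X, B, C)    # matricized-tensor times Khatri-Rao
    gram = (B.T @ B) * (C.T @ C)                    # Hadamard product of Gram matrices
    return -mttkrp + A @ gram

# One plain gradient step on the first factor (step size is illustrative).
A_new = A - 1e-3 * cp_gradient_A(X, A, B, C)
residual = X - np.einsum('ir,jr,kr->ijk', A_new, B, C)
print(0.5 * np.sum(residual ** 2))
```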

  9. On the isotropic Raman spectrum of Ar2 and how to benchmark ab initio calculations of small atomic clusters: Paradox lost.

    PubMed

    Chrysos, Michael; Dixneuf, Sophie; Rachet, Florent

    2015-07-14

    This is the long-overdue answer to the discrepancies observed between theory and experiment in Ar2 regarding both the isotropic Raman spectrum and the second refractivity virial coefficient, BR [Gaye et al., Phys. Rev. A 55, 3484 (1997)]. At the origin of this progress is the advent (posterior to 1997) of advanced computational methods for weakly interconnected neutral species at close separations. Here, we report agreement between the previously taken Raman measurements and quantum lineshapes now computed with the employ of large-scale CCSD or smartly constructed MP2 induced-polarizability data. By using these measurements as a benchmark tool, we assess the degree of performance of various other ab initio computed data for the mean polarizability α, and we show that an excellent agreement with the most recently measured value of BR is reached. We propose an even more refined model for α, which is solution of the inverse-scattering problem and whose lineshape matches exactly the measured spectrum over the entire frequency-shift range probed.

  10. Enhancing the Undergraduate Computing Experience in Chemical Engineering CACHE Corporation

    ERIC Educational Resources Information Center

    Edgar, Thomas F.

    2006-01-01

    This white paper focuses on the integration and enhancement of the computing experience for undergraduates throughout the chemical engineering curriculum. The computing experience for undergraduates in chemical engineering should have continuity and be coordinated from course to course, because a single software solution is difficult to achieve in…

  11. Influence of direct computer experience on older adults' attitudes toward computers.

    PubMed

    Jay, G M; Willis, S L

    1992-07-01

    This research examined whether older adults' attitudes toward computers became more positive as a function of computer experience. The sample comprised 101 community-dwelling older adults aged 57 to 87. The intervention involved a 2-week computer training program in which subjects learned to use a desktop publishing software program. A multidimensional computer attitude measure was used to assess differential attitude change and maintenance of change following training. The results indicated that older adults' computer attitudes are modifiable and that direct computer experience is an effective means of change. Attitude change as a function of training was found for the attitude dimensions targeted by the intervention program: computer comfort and efficacy. In addition, maintenance of attitude change was established for at least two weeks following training.

  12. Anisotropic hydrogen diffusion in α-Zr and Zircaloy predicted by accelerated kinetic Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Zhang, Yongfeng; Jiang, Chao; Bai, Xianming

    2017-01-01

    This report presents an accelerated kinetic Monte Carlo (KMC) method to compute the diffusivity of hydrogen in hcp metals and alloys, considering both thermally activated hopping and quantum tunneling. The acceleration is achieved by replacing regular KMC jumps in trapping energy basins formed by neighboring tetrahedral interstitial sites, with analytical solutions for basin exiting time and probability. Parameterized by density functional theory (DFT) calculations, the accelerated KMC method is shown to be capable of efficiently calculating hydrogen diffusivity in α-Zr and Zircaloy, without altering the kinetics of long-range diffusion. Above room temperature, hydrogen diffusion in α-Zr and Zircaloy is dominated by thermal hopping, with negligible contribution from quantum tunneling. The diffusivity predicted by this DFT + KMC approach agrees well with that from previous independent experiments and theories, without using any data fitting. The diffusivity along ⟨c⟩ is found to be slightly higher than that along ⟨a⟩, with the anisotropy saturated at about 1.20 at high temperatures, resolving contradictory results in previous experiments. Demonstrated using hydrogen diffusion in α-Zr, the same method can be extended for on-lattice diffusion in hcp metals, or systems with similar trapping basins.
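    One standard way to obtain analytical basin-exit times and probabilities of the kind used to accelerate the KMC, though not necessarily the authors' exact formulation, is to treat the basin sites as transient states of an absorbing Markov chain, where the fundamental matrix (I - Q)^-1 gives expected visit counts, exit probabilities, and mean residence times. The transition probabilities and per-visit residence times below are illustrative.

```python
import numpy as np

# One standard way (not necessarily the authors' exact formulation) to get
# analytical basin-exit probabilities and mean residence times: treat the
# basin sites as transient states of an absorbing Markov chain.
# Q[i, j]: probability of hopping from basin site i to basin site j.
# R[i, k]: probability of hopping from basin site i to exit pathway k.
# tau[i] : mean residence time per visit to basin site i (e.g. 1 / total rate).
Q = np.array([[0.0, 0.6],
              [0.5, 0.0]])
R = np.array([[0.4, 0.0],
              [0.0, 0.5]])
tau = np.array([1.0e-12, 2.0e-12])      # seconds, illustrative

N = np.linalg.inv(np.eye(len(Q)) - Q)   # expected visits to each basin site
exit_probs = N @ R                      # probability of leaving via each exit
mean_exit_time = N @ tau                # mean time to exit from each start site

print(exit_probs)       # rows sum to 1
print(mean_exit_time)
```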

  13. Anisotropic hydrogen diffusion in α-Zr and Zircaloy predicted by accelerated kinetic Monte Carlo simulations

    PubMed Central

    Zhang, Yongfeng; Jiang, Chao; Bai, Xianming

    2017-01-01

    This report presents an accelerated kinetic Monte Carlo (KMC) method to compute the diffusivity of hydrogen in hcp metals and alloys, considering both thermally activated hopping and quantum tunneling. The acceleration is achieved by replacing regular KMC jumps in trapping energy basins formed by neighboring tetrahedral interstitial sites, with analytical solutions for basin exiting time and probability. Parameterized by density functional theory (DFT) calculations, the accelerated KMC method is shown to be capable of efficiently calculating hydrogen diffusivity in α-Zr and Zircaloy, without altering the kinetics of long-range diffusion. Above room temperature, hydrogen diffusion in α-Zr and Zircaloy is dominated by thermal hopping, with negligible contribution from quantum tunneling. The diffusivity predicted by this DFT + KMC approach agrees well with that from previous independent experiments and theories, without using any data fitting. The diffusivity along ⟨c⟩ is found to be slightly higher than that along ⟨a⟩, with the anisotropy saturated at about 1.20 at high temperatures, resolving contradictory results in previous experiments. Demonstrated using hydrogen diffusion in α-Zr, the same method can be extended for on-lattice diffusion in hcp metals, or systems with similar trapping basins. PMID:28106154

  14. Activity Recognition on Streaming Sensor Data.

    PubMed

    Krishnan, Narayanan C; Cook, Diane J

    2014-02-01

    Many real-world applications that focus on addressing the needs of a human require information about the activities being performed by the human in real-time. While advances in pervasive computing have led to the development of wireless and non-intrusive sensors that can capture the necessary activity information, current activity recognition approaches have so far experimented on either a scripted or pre-segmented sequence of sensor events related to activities. In this paper we propose and evaluate a sliding window based approach to perform activity recognition in an online or streaming fashion, recognizing activities as and when new sensor events are recorded. To account for the fact that different activities can be best characterized by different window lengths of sensor events, we incorporate the time decay and mutual information based weighting of sensor events within a window. Additional contextual information in the form of the previous activity and the activity of the previous window is also appended to the feature describing a sensor window. The experiments conducted to evaluate these techniques on real-world smart home datasets suggest that combining mutual information based weighting of sensor events and adding past contextual information into the feature leads to the best performance for streaming activity recognition.
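    The sketch below illustrates the time-decay weighting of sensor events within a sliding window described above; the decay rate, window size, and sensor names are illustrative, and the mutual-information-based weighting and contextual features are omitted for brevity.

```python
import math
from collections import defaultdict

# Sketch of time-decay weighting of sensor events inside a sliding window.
# The decay rate and sensor names are illustrative; the mutual-information
# weighting described in the abstract is omitted for brevity.
DECAY_RATE = 0.05   # per second

def windowed_feature(events, window_size=10):
    """events: list of (timestamp_seconds, sensor_id), oldest first.
    Returns per-sensor counts weighted by recency within the last window."""
    window = events[-window_size:]
    t_latest = window[-1][0]
    feature = defaultdict(float)
    for t, sensor in window:
        # Older events contribute exponentially less than recent ones.
        feature[sensor] += math.exp(-DECAY_RATE * (t_latest - t))
    return dict(feature)

events = [(0, "kitchen_motion"), (12, "kitchen_motion"), (15, "fridge_door"),
          (16, "kitchen_motion"), (40, "bathroom_motion")]
print(windowed_feature(events, window_size=4))
```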

  15. Anisotropic hydrogen diffusion in α-Zr and Zircaloy predicted by accelerated kinetic Monte Carlo simulations

    DOE PAGES

    Zhang, Yongfeng; Jiang, Chao; Bai, Xianming

    2017-01-20

    Here, this report presents an accelerated kinetic Monte Carlo (KMC) method to compute the diffusivity of hydrogen in hcp metals and alloys, considering both thermally activated hopping and quantum tunneling. The acceleration is achieved by replacing regular KMC jumps in trapping energy basins formed by neighboring tetrahedral interstitial sites, with analytical solutions for basin exiting time and probability. Parameterized by density functional theory (DFT) calculations, the accelerated KMC method is shown to be capable of efficiently calculating hydrogen diffusivity in α-Zr and Zircaloy, without altering the kinetics of long-range diffusion. Above room temperature, hydrogen diffusion in α-Zr and Zircaloy is dominated by thermal hopping, with negligible contribution from quantum tunneling. The diffusivity predicted by this DFT + KMC approach agrees well with that from previous independent experiments and theories, without using any data fitting. The diffusivity along ⟨c⟩ is found to be slightly higher than that along ⟨a⟩, with the anisotropy saturated at about 1.20 at high temperatures, resolving contradictory results in previous experiments. Demonstrated using hydrogen diffusion in α-Zr, the same method can be extended for on-lattice diffusion in hcp metals, or systems with similar trapping basins.

  16. Development, Testing, and Validation of a Model-Based Tool to Predict Operator Responses in Unexpected Workload Transitions

    NASA Technical Reports Server (NTRS)

    Sebok, Angelia; Wickens, Christopher; Sargent, Robert

    2015-01-01

    One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic, and they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience to be able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.

  17. Model Development for VDE Computations in NIMROD

    NASA Astrophysics Data System (ADS)

    Bunkers, K. J.; Sovinec, C. R.

    2017-10-01

    Vertical displacement events (VDEs) and the disruptions associated with them have potential for causing considerable physical damage to ITER and other tokamak experiments. We report on simulations of generic axisymmetric VDEs and a vertically unstable case from Alcator C-MOD using the NIMROD code. Previous calculations have been done with closures for heat flux and viscous stress. Initial calculations show that halo current width is dependent on temperature boundary conditions, and so transport together with plasma-surface interaction may play a role in determining halo currents in experiments. The behavior of VDEs with Braginskii thermal conductivity and viscosity closures and Spitzer-like resistivity are investigated for both the generic axisymmetric VDE case and the C-MOD case. This effort is supported by the U.S. Dept. of Energy, Award Numbers DE-FG02-06ER54850 and DE-FC02-08ER54975.

  18. TopHat: discovering splice junctions with RNA-Seq

    PubMed Central

    Trapnell, Cole; Pachter, Lior; Salzberg, Steven L.

    2009-01-01

    Motivation: A new protocol for sequencing the messenger RNA in a cell, known as RNA-Seq, generates millions of short sequence fragments in a single run. These fragments, or ‘reads’, can be used to measure levels of gene expression and to identify novel splice variants of genes. However, current software for aligning RNA-Seq data to a genome relies on known splice junctions and cannot identify novel ones. TopHat is an efficient read-mapping algorithm designed to align reads from an RNA-Seq experiment to a reference genome without relying on known splice sites. Results: We mapped the RNA-Seq reads from a recent mammalian RNA-Seq experiment and recovered more than 72% of the splice junctions reported by the annotation-based software from that study, along with nearly 20 000 previously unreported junctions. The TopHat pipeline is much faster than previous systems, mapping nearly 2.2 million reads per CPU hour, which is sufficient to process an entire RNA-Seq experiment in less than a day on a standard desktop computer. We describe several challenges unique to ab initio splice site discovery from RNA-Seq reads that will require further algorithm development. Availability: TopHat is free, open-source software available from http://tophat.cbcb.umd.edu Contact: cole@cs.umd.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19289445

  19. Visual capture and the experience of having two bodies – Evidence from two different virtual reality techniques

    PubMed Central

    Heydrich, Lukas; Dodds, Trevor J.; Aspell, Jane E.; Herbelin, Bruno; Bülthoff, Heinrich H.; Mohler, Betty J.; Blanke, Olaf

    2013-01-01

    In neurology and psychiatry the detailed study of illusory own body perceptions has suggested close links between bodily processing and self-consciousness. One such illusory own body perception is heautoscopy, where patients have the sensation of being reduplicated and of existing at two or even more locations. In previous experiments, using a video head-mounted display, self-location and self-identification were manipulated by applying conflicting visuo-tactile information. Yet the experienced singularity of the self was not affected, i.e., participants did not experience having multiple bodies or selves. In two experiments presented in this paper, we investigated self-location and self-identification while participants saw two virtual bodies (video-generated in study 1 and 3D computer generated in study 2) that were stroked either synchronously or asynchronously with their own body. In both experiments, we report that self-identification with two virtual bodies was stronger during synchronous stroking. Furthermore, in the video-generated setup with synchronous stroking participants reported a greater feeling of having multiple bodies than in the control conditions. In study 1, but not in study 2, we report that self-location – measured by anterior-posterior drift – was significantly shifted towards the two bodies in the synchronous condition only. Self-identification with two bodies, the sensation of having multiple bodies, and the changes in self-location show that the experienced singularity of the self can be studied experimentally. We discuss our data with respect to ownership for supernumerary hands and heautoscopy. We finally compare the effects of the video and 3D computer generated head-mounted display technology and discuss the possible benefits of using either technology to induce changes in illusory self-identification with a virtual body. PMID:24385970

  20. Investigation of the Feasibility of Utilizing Gamma Emission Computed Tomography in Evaluating Fission Product Migration in Irradiated TRISO Fuel Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason M. Harp; Paul A. Demkowicz

    2014-10-01

    In the High Temperature Gas-Cooled Reactor (HTGR) the TRISO particle fuel serves as the primary fission product containment. However, the large number of TRISO particles present in proposed HTGRs dictates that there will be a small fraction (~10⁻⁴ to 10⁻⁵) of as-manufactured and in-pile particle failures that will lead to some fission product release. The matrix material surrounding the TRISO particles in fuel compacts and the structural graphite holding the TRISO particles in place can also serve as sinks for containing any released fission products. However, data on the migration of solid fission products through these materials is lacking. One of the primary goals of the AGR-3/4 experiment is to study fission product migration from failed TRISO particles in prototypic HTGR components such as structural graphite and compact matrix material. In this work, the potential for a Gamma Emission Computed Tomography (GECT) technique to non-destructively examine the fission product distribution in AGR-3/4 components and other irradiation experiments is explored. Specifically, the feasibility of using the Idaho National Laboratory (INL) Hot Fuels Examination Facility (HFEF) Precision Gamma Scanner (PGS) system for this GECT application is considered. To test the feasibility, the response of the PGS system to idealized fission product distributions has been simulated using Monte Carlo radiation transport simulations. Previous work that applied similar techniques during the AGR-1 experiment will also be discussed, as well as planned uses for the GECT technique during the post irradiation examination of the AGR-2 experiment. The GECT technique has also been applied to other irradiated nuclear fuel systems currently available in the HFEF hot cell, including oxide fuel pins, metallic fuel pins, and monolithic plate fuel.

  1. Gestures make memories, but what kind? Patients with impaired procedural memory display disruptions in gesture production and comprehension

    PubMed Central

    Klooster, Nathaniel B.; Cook, Susan W.; Uc, Ergun Y.; Duff, Melissa C.

    2015-01-01

    Hand gesture, a ubiquitous feature of human interaction, facilitates communication. Gesture also facilitates new learning, benefiting speakers and listeners alike. Thus, gestures must impact cognition beyond simply supporting the expression of already-formed ideas. However, the cognitive and neural mechanisms supporting the effects of gesture on learning and memory are largely unknown. We hypothesized that gesture's ability to drive new learning is supported by procedural memory and that procedural memory deficits will disrupt gesture production and comprehension. We tested this proposal in patients with intact declarative memory, but impaired procedural memory as a consequence of Parkinson's disease (PD), and healthy comparison participants with intact declarative and procedural memory. In separate experiments, we manipulated the gestures participants saw and produced in a Tower of Hanoi (TOH) paradigm. In the first experiment, participants solved the task either on a physical board, requiring high arching movements to manipulate the discs from peg to peg, or on a computer, requiring only flat, sideways movements of the mouse. When explaining the task, healthy participants with intact procedural memory displayed evidence of their previous experience in their gestures, producing higher, more arching hand gestures after solving on a physical board, and smaller, flatter gestures after solving on a computer. In the second experiment, healthy participants who saw high arching hand gestures in an explanation prior to solving the task subsequently moved the mouse with significantly higher curvature than those who saw smaller, flatter gestures prior to solving the task. These patterns were absent in both gesture production and comprehension experiments in patients with procedural memory impairment. These findings suggest that the procedural memory system supports the ability of gesture to drive new learning. PMID:25628556

  2. AnnotCompute: annotation-based exploration and meta-analysis of genomics experiments

    PubMed Central

    Zheng, Jie; Stoyanovich, Julia; Manduchi, Elisabetta; Liu, Junmin; Stoeckert, Christian J.

    2011-01-01

    The ever-increasing scale of biological data sets, particularly those arising in the context of high-throughput technologies, requires the development of rich data exploration tools. In this article, we present AnnotCompute, an information discovery platform for repositories of functional genomics experiments such as ArrayExpress. Our system leverages semantic annotations of functional genomics experiments with controlled vocabulary and ontology terms, such as those from the MGED Ontology, to compute conceptual dissimilarities between pairs of experiments. These dissimilarities are then used to support two types of exploratory analysis—clustering and query-by-example. We show that our proposed dissimilarity measures correspond to a user's intuition about conceptual dissimilarity, and can be used to support effective query-by-example. We also evaluate the quality of clustering based on these measures. While AnnotCompute can support a richer data exploration experience, its effectiveness is limited in some cases, due to the quality of available annotations. Nonetheless, tools such as AnnotCompute may provide an incentive for richer annotations of experiments. Code is available for download at http://www.cbil.upenn.edu/downloads/AnnotCompute. Database URL: http://www.cbil.upenn.edu/annotCompute/ PMID:22190598
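    As a minimal illustration of a conceptual dissimilarity between annotated experiments, the sketch below uses a plain Jaccard distance over sets of ontology terms; AnnotCompute's actual measures are richer and ontology-aware, so this is only a toy stand-in, and the term names are hypothetical.

```python
# Minimal illustration of a conceptual dissimilarity between two experiments
# described by sets of ontology terms (Jaccard distance here; the actual
# AnnotCompute measures are richer and ontology-aware).
def annotation_dissimilarity(terms_a, terms_b):
    a, b = set(terms_a), set(terms_b)
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

exp1 = {"organism:Homo sapiens", "assay:transcription profiling", "factor:time"}
exp2 = {"organism:Homo sapiens", "assay:transcription profiling", "factor:dose"}
print(annotation_dissimilarity(exp1, exp2))   # 0.5
```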

  3. Hot or cold: is communicating anger or threats more effective in negotiation?

    PubMed

    Sinaceur, Marwan; Van Kleef, Gerben A; Neale, Margaret A; Adam, Hajo; Haag, Christophe

    2011-09-01

    Is communicating anger or threats more effective in eliciting concessions in negotiation? Recent research has emphasized the effectiveness of anger communication, an emotional strategy. In this article, we argue that anger communication conveys an implied threat, and we document that issuing threats is a more effective negotiation strategy than communicating anger. In 3 computer-mediated negotiation experiments, participants received either angry or threatening messages from a simulated counterpart. Experiment 1 showed that perceptions of threat mediated the effect of anger (vs. a control) on concessions. Experiment 2 showed that (a) threat communication elicited greater concessions than anger communication and (b) poise (being confident and in control of one's own feelings and decisions) ascribed to the counterpart mediated the positive effect of threat compared to anger on concessions. Experiment 3 replicated this positive effect of threat over anger when recipients had an attractive alternative to a negotiated agreement. These findings qualify previous research on anger communication in negotiation. Implications for the understanding of emotion and negotiation are discussed. PsycINFO Database Record (c) 2011 APA, all rights reserved

  4. Results of a laboratory experiment that tests rotating unbalanced-mass devices for scanning gimbaled payloads and free-flying spacecraft

    NASA Technical Reports Server (NTRS)

    Alhorn, D. C.; Polites, M. E.

    1994-01-01

    Rotating unbalanced-mass (RUM) devices are a new way to scan space-based, balloon-borne, and ground-based gimbaled payloads, like x-ray and gamma-ray telescopes. They can also be used to scan free-flying spacecraft. Circular scans, linear scans, and raster scans can be generated. A pair of RUM devices generates the basic scan motion and an auxiliary control system using torque motors, control moment gyros, or reaction wheels keeps the scan centered on the target and produces some complementary motion for raster scanning. Previous analyses and simulation results show that this approach offers significant power savings compared to scanning only with the auxiliary control system, especially with large payloads and high scan frequencies. However, these claims have never been proven until now. This paper describes a laboratory experiment which tests the concept of scanning a gimbaled payload with RUM devices. A description of the experiment is given and test results that prove the concept are presented. The test results are compared with those from a computer simulation model of the experiment and the differences are discussed.

  5. Microgravity nucleation and particle coagulation experiments support

    NASA Technical Reports Server (NTRS)

    Lilleleht, L. U.; Lass, T. J.

    1987-01-01

    A hollow sphere model is developed to predict the range of supersaturation ratio values for refractory metal vapors in a proposed experimental nucleation apparatus. Since the experiments are to be carried out in a microgravity environment, the model neglects the effects of convection and assumes that the only transfer of vapors through an inert gas atmosphere is by conduction and molecular diffusion. A consistent set of physical properties data is assembled for the various candidate metals and inert ambient gases expected to be used in the nucleation experiments. Transient partial pressure profiles are computed for the diffusing refractory species for two possible temperature distributions. The supersaturation ratio values from both candidate temperature profiles are compared with previously obtained experimental data on a silver-hydrogen system. The model is used to simulate the diffusion of magnesium vapor through argon and other inert gas atmospheres over ranges of initial and boundary conditions. These results identify different combinations of design and operating parameters which are likely to produce supersaturation ratio values high enough to induce homogeneous nucleation in the apparatus being designed for the microgravity nucleation experiments.

  6. Computer games: a double-edged sword?

    PubMed

    Sun, De-Lin; Ma, Ning; Bao, Min; Chen, Xang-Chuan; Zhang, Da-Ren

    2008-10-01

    Excessive computer game playing (ECGP) has already become a serious social problem. However, limited data from experimental lab studies are available about the negative consequences of ECGP on players' cognitive characteristics. In the present study, we compared three groups of participants (current ECGP participants, previous ECGP participants, and control participants) on a Multiple Object Tracking (MOT) task. The previous ECGP participants performed significantly better than the control participants, which suggested a facilitation effect of computer games on visuospatial abilities. Moreover, the current ECGP participants performed significantly worse than the previous ECGP participants. This more important finding indicates that ECGP may be related to cognitive deficits. Implications of this study are discussed.

  7. Welding Experiments of Aluminum Alloy by Space GHTA Welding at ISS Orbital Pressure

    NASA Astrophysics Data System (ADS)

    Suita, Yoshikazu; Takai, Daisuke; Sugiyama, Satoshi; Terajima, Noboru; Tsukuda, Yoshiyuki; Fujisawa, Shoichiro; Imagawa, Kichiro

    As a feasible welding method in space, the authors previously proposed the space GHTA (Gas Hollow Tungsten Arc) welding process. However, space GHTA welding with a high-frequency device for arc start may cause electromagnetic noise problems for the computer equipment placed on the ISS (International Space Station). Therefore, in this report, welding experiments of space GHTA welding using aluminum alloy with a high-voltage DC device for arc start were carried out at the ISS orbital pressure, 10-5 Pa. It is clear from the experiments using a high-voltage DC device in a high-vacuum condition, that there is a shifting phenomenon in which the spark discharge shifts to either a glow discharge or an arc discharge when starting the arc. Welding projects in space need an arc discharge, so we investigated the effects of welding parameters on the arc formation ratio. As a result, space GHTA welding with a high-voltage DC device can be used for arc start when welding at the ISS orbital pressure.

  8. Rhesus monkeys (Macaca mulatta) remember agency information from past events and integrate this knowledge with spatial and temporal features in working memory.

    PubMed

    Hoffman, Megan L; Beran, Michael J; Washburn, David A

    2018-01-01

    The purpose of the present study was to examine whether rhesus monkeys remember information about their own agency-along with spatial, temporal and contextual properties-from a previously experienced event. In Experiment 1, rhesus monkeys (n = 4) used symbols to reliably indicate whether they had performed or observed an event on a computer screen. In Experiment 2, naïve and experienced monkeys (n = 8) reported agency information when stringent controls for perceptual and proprioceptive cues were included. In Experiment 3, five of the monkeys completed a task in which they reported agency information along with spatial and temporal features of events. Two monkeys performed this agency discrimination when they could not anticipate which memory test they would receive. There was also evidence that these features were integrated in memory. Implications of this research are discussed in relation to working memory, episodic memory and self-awareness in nonhuman animals.

  9. Radiotherapy and chemotherapy change vessel tree geometry and metastatic spread in a small cell lung cancer xenograft mouse tumor model

    PubMed Central

    Bethge, Anja; Schumacher, Udo

    2017-01-01

    Background Tumor vasculature is critical for tumor growth, formation of distant metastases and efficiency of radio- and chemotherapy treatments. However, how the vasculature itself is affected during cancer treatment with regard to metastatic behavior has not been thoroughly investigated. Therefore, the aim of this study was to analyze the influence of hypofractionated radiotherapy and cisplatin chemotherapy on vessel tree geometry and metastasis formation in a small cell lung cancer xenograft mouse tumor model to investigate the spread of malignant cells under different treatment modalities. Methods The biological data gained during these experiments were fed into our previously developed computer model “Cancer and Treatment Simulation Tool” (CaTSiT) to model the growth of the primary tumor, its metastatic deposits and the influence of different therapies. Furthermore, we performed quantitative histology analyses to verify our predictions in the xenograft mouse tumor model. Results According to the computer simulation the number of cells engrafting must vary considerably to explain the different weights of the primary tumor at the end of the experiment. Once a primary tumor is established, the fractal dimension of its vasculature correlates with the tumor size. Furthermore, the fractal dimension of the tumor vasculature changes during treatment, indicating that the therapy affects the blood vessels’ geometry. We corroborated these findings with a quantitative histological analysis showing that the blood vessel density is depleted during radiotherapy and cisplatin chemotherapy. The CaTSiT computer model reveals that chemotherapy influences the tumor’s therapeutic susceptibility and its metastatic spreading behavior. Conclusion Using a system biological approach in combination with xenograft models and computer simulations revealed that the usage of chemotherapy and radiation therapy determines the spreading behavior by changing the blood vessel geometry of the primary tumor. PMID:29107953
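    A common way to estimate the fractal dimension of a segmented vessel mask, though not necessarily the procedure used with CaTSiT, is box counting with a log-log fit; the sketch below uses an illustrative binary mask.

```python
import numpy as np

# A common way (not necessarily the one used with CaTSiT) to estimate the
# fractal dimension of a binary vessel mask: box counting with a log-log fit.
def box_count(mask, box_size):
    h, w = mask.shape
    count = 0
    for i in range(0, h, box_size):
        for j in range(0, w, box_size):
            if mask[i:i + box_size, j:j + box_size].any():
                count += 1
    return count

def fractal_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    counts = [box_count(mask, s) for s in box_sizes]
    # Slope of log N(s) against log(1/s) estimates the box-counting dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

mask = np.zeros((128, 128), dtype=bool)
mask[64, :] = True            # a straight "vessel" should give dimension ~1
print(fractal_dimension(mask))
```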

  10. Quantification of uncertainties in the tsunami hazard for Cascadia using statistical emulation

    NASA Astrophysics Data System (ADS)

    Guillas, S.; Day, S. J.; Joakim, B.

    2016-12-01

    We present new high resolution tsunami wave propagation and coastal inundation for the Cascadia region in the Pacific Northwest. The coseismic representation in this analysis is novel, and more realistic than in previous studies, as we jointly parametrize multiple aspects of the seabed deformation. Due to the large computational cost of such simulators, statistical emulation is required in order to carry out uncertainty quantification tasks, as emulators efficiently approximate simulators. The emulator replaces the tsunami model VOLNA by a fast surrogate, so we are able to efficiently propagate uncertainties from the source characteristics to wave heights, in order to probabilistically assess tsunami hazard for Cascadia. We employ a new method for the design of the computer experiments in order to reduce the number of runs while maintaining good approximations properties of the emulator. Out of the initial nine parameters, mostly describing the geometry and time variation of the seabed deformation, we drop two parameters since these turn out to not have an influence on the resulting tsunami waves at the coast. We model the impact of another parameter linearly as its influence on the wave heights is identified as linear. We combine this screening approach with the sequential design algorithm MICE (Mutual Information for Computer Experiments), that adaptively selects the input values at which to run the computer simulator, in order to maximize the expected information gain (mutual information) over the input space. As a result, the emulation is made possible and accurate. Starting from distributions of the source parameters that encapsulate geophysical knowledge of the possible source characteristics, we derive distributions of the tsunami wave heights along the coastline.
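    The sketch below illustrates the general emulation idea, fitting a Gaussian process to a handful of expensive simulator runs and then predicting cheaply elsewhere in the input space; the toy simulator, design size, and kernel settings are illustrative, and this is not the VOLNA/MICE pipeline itself.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Minimal emulation sketch (not the VOLNA/MICE pipeline): fit a Gaussian
# process to a handful of expensive simulator runs, then predict cheaply
# elsewhere in the input space. The toy "simulator" stands in for a tsunami
# code mapping source parameters to a wave height.
def expensive_simulator(x):
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(1)
X_train = rng.uniform(0, 1, size=(12, 2))      # design points (e.g. from a sequential design)
y_train = expensive_simulator(X_train)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
gp.fit(X_train, y_train)

X_new = rng.uniform(0, 1, size=(5, 2))
mean, std = gp.predict(X_new, return_std=True)  # prediction and its uncertainty
print(np.c_[mean, std])
```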

  11. Efficient globally optimal segmentation of cells in fluorescence microscopy images using level sets and convex energy functionals.

    PubMed

    Bergeest, Jan-Philip; Rohr, Karl

    2012-10-01

    In high-throughput applications, accurate and efficient segmentation of cells in fluorescence microscopy images is of central importance for the quantification of protein expression and the understanding of cell function. We propose an approach for segmenting cell nuclei which is based on active contours using level sets and convex energy functionals. Compared to previous work, our approach determines the global solution. Thus, the approach does not suffer from local minima and the segmentation result does not depend on the initialization. We consider three different well-known energy functionals for active contour-based segmentation and introduce convex formulations of these functionals. We also suggest a numeric approach for efficiently computing the solution. The performance of our approach has been evaluated using fluorescence microscopy images from different experiments comprising different cell types. We have also performed a quantitative comparison with previous segmentation approaches. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. One Head Start Classroom's Experience: Computers and Young Children's Development.

    ERIC Educational Resources Information Center

    Fischer, Melissa Anne; Gillespie, Catherine Wilson

    2003-01-01

    Contends that early childhood educators need to understand how exposure to computers and constructive computer programs affects the development of children. Specifically examines: (1) research on children's technology experiences; (2) determining best practices; and (3) addressing educators' concerns about computers replacing other developmentally…

  13. Improving Simulated Annealing by Recasting it as a Non-Cooperative Game

    NASA Technical Reports Server (NTRS)

    Wolpert, David; Bandari, Esfandiar; Tumer, Kagan

    2001-01-01

    The game-theoretic field of COllective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved "as a side-effect". Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed game-theory-motivated algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting improves simulated annealing by several orders of magnitude for spin glass relaxation and bin-packing.
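    For reference, the sketch below is the conventional simulated annealing baseline on a small random spin-glass energy, i.e., the algorithm that the COIN recasting modifies, not the game-theoretic variant itself; the problem size, coupling values, and cooling schedule are illustrative.

```python
import math
import random

# Baseline simulated annealing on a small random spin-glass energy; this is
# the conventional algorithm that the COIN recasting modifies, not the
# game-theoretic variant described in the paper.
random.seed(0)
N = 20
J = {(i, j): random.choice((-1.0, 1.0)) for i in range(N) for j in range(i + 1, N)}

def energy(spins):
    return -sum(J[(i, j)] * spins[i] * spins[j] for (i, j) in J)

spins = [random.choice((-1, 1)) for _ in range(N)]
current = energy(spins)
temperature = 5.0
for step in range(5000):
    i = random.randrange(N)
    spins[i] *= -1                       # propose flipping one spin
    new = energy(spins)
    if new > current and random.random() > math.exp((current - new) / temperature):
        spins[i] *= -1                   # reject the uphill move
    else:
        current = new
    temperature *= 0.999                 # geometric cooling schedule
print(current)
```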

  14. GPU-accelerated two dimensional synthetic aperture focusing for photoacoustic microscopy

    NASA Astrophysics Data System (ADS)

    Liu, Siyu; Feng, Xiaohua; Gao, Fei; Jin, Haoran; Zhang, Ruochong; Luo, Yunqi; Zheng, Yuanjin

    2018-02-01

    Acoustic resolution photoacoustic microscopy (AR-PAM) generally suffers from limited depth of focus, which has been extended by synthetic aperture focusing techniques (SAFTs). However, for three-dimensional AR-PAM, current one-dimensional (1D) SAFT and its improved versions, such as cross-shaped SAFT, do not provide isotropic resolution in the lateral direction. The full potential of the SAFT remains to be tapped. To this end, two-dimensional (2D) SAFT with a fast computing architecture is proposed in this work. Explained by geometric modeling and Fourier acoustics theories, 2D-SAFT provides the narrowest post-focusing capability and thus achieves the best lateral resolution. Compared with previous 1D-SAFT techniques, the proposed 2D-SAFT improved the lateral resolution by at least 1.7 times and the signal-to-noise ratio (SNR) by about 10 dB in both simulation and experiments. Moreover, the improved 2D-SAFT algorithm is accelerated by a graphical processing unit that reduces the long period of reconstruction to only a few seconds. The proposed 2D-SAFT is demonstrated to outperform previously reported 1D SAFT in the aspects of improving the depth of focus, imaging resolution, and SNR with fast computational efficiency. This work facilitates future studies on in vivo deeper and high-resolution photoacoustic microscopy beyond several centimeters.

  15. Richardson-Lucy/maximum likelihood image restoration algorithm for fluorescence microscopy: further testing.

    PubMed

    Holmes, T J; Liu, Y H

    1989-11-15

    A maximum likelihood based iterative algorithm adapted from nuclear medicine imaging for noncoherent optical imaging was presented in a previous publication with some initial computer-simulation testing. This algorithm is identical in form to that previously derived in a different way by W. H. Richardson "Bayesian-Based Iterative Method of Image Restoration," J. Opt. Soc. Am. 62, 55-59 (1972) and L. B. Lucy "An Iterative Technique for the Rectification of Observed Distributions," Astron. J. 79, 745-765 (1974). Foreseen applications include superresolution and 3-D fluorescence microscopy. This paper presents further simulation testing of this algorithm and a preliminary experiment with a defocused camera. The simulations show quantified resolution improvement as a function of iteration number, and they show qualitatively the trend in limitations on restored resolution when noise is present in the data. Also shown are results of a simulation in restoring missing-cone information for 3-D imaging. Conclusions are in support of the feasibility of using these methods with real systems, while computational cost and timing estimates indicate that it should be realistic to implement these methods. It is suggested in the Appendix that future extensions to the maximum likelihood based derivation of this algorithm will address some of the limitations that are experienced with the nonextended form of the algorithm presented here.
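
    For reference, the core Richardson-Lucy/maximum-likelihood update multiplies the current estimate by the back-projected ratio of the observed image to the reblurred estimate. A minimal sketch for 2-D data, assuming a non-negative image and a known point-spread function (a generic textbook form, not the authors' implementation):

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(observed, psf, n_iter=50, eps=1e-12):
        """Richardson-Lucy / maximum-likelihood deconvolution for 2-D images.
        observed: non-negative 2-D array; psf: 2-D point-spread function."""
        psf_mirror = psf[::-1, ::-1]                      # adjoint of the blur operator
        estimate = np.full_like(observed, observed.mean(), dtype=float)
        for _ in range(n_iter):
            blurred = fftconvolve(estimate, psf, mode="same")
            ratio = observed / (blurred + eps)            # eps guards against division by zero
            estimate *= fftconvolve(ratio, psf_mirror, mode="same")
        return estimate
    ```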

  16. Speeding Up the Bilateral Filter: A Joint Acceleration Way.

    PubMed

    Dai, Longquan; Yuan, Mengke; Zhang, Xiaopeng

    2016-06-01

    The computational complexity of the brute-force implementation of the bilateral filter (BF) depends on its filter kernel size. To achieve a constant-time BF whose complexity is independent of the kernel size, many techniques have been proposed, such as 2D box filtering, dimension promotion, and the shiftability property. Although each of these techniques suffers from accuracy or efficiency problems, previous algorithm designers typically adopted only one of them when assembling fast implementations, because of the difficulty of combining them. Hence, no joint exploitation of these techniques had been proposed to construct a new cutting-edge implementation that solves these problems. Jointly employing five techniques (kernel truncation and best N-term approximation, together with the previous 2D box filtering, dimension promotion, and shiftability property), we propose a unified framework to transform a BF with arbitrary spatial and range kernels into a set of 3D box filters that can be computed in linear time. To the best of our knowledge, our algorithm is the first method that can integrate all these acceleration techniques and can therefore draw upon their respective strong points to overcome their deficiencies. The strength of our method has been corroborated by several carefully designed experiments. In particular, the filtering accuracy is significantly improved without sacrificing run-time efficiency.
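
    For contrast with the constant-time framework described above, the brute-force bilateral filter weights every neighbour by the product of a spatial Gaussian and a range Gaussian, so its per-pixel cost grows with the kernel radius. A minimal grayscale sketch of that baseline (illustrative only; parameter names are placeholders):

    ```python
    import numpy as np

    def bilateral_filter(img, radius, sigma_s, sigma_r):
        """Brute-force bilateral filter whose cost depends on the kernel size,
        i.e. the baseline that constant-time methods are designed to avoid."""
        h, w = img.shape
        out = np.zeros((h, w), dtype=float)
        ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))   # spatial kernel
        padded = np.pad(img.astype(float), radius, mode="edge")
        for i in range(h):
            for j in range(w):
                patch = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
                rng = np.exp(-(patch - img[i, j]) ** 2 / (2 * sigma_r ** 2))  # range kernel
                weights = spatial * rng
                out[i, j] = np.sum(weights * patch) / np.sum(weights)
        return out
    ```

    The point of the paper is precisely to remove this dependence on the kernel radius by recasting the filter as a set of 3D box filters computable in linear time.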

  17. Effects of Buoyancy in Hydrogen Jet Diffusion Flames

    NASA Technical Reports Server (NTRS)

    Agrawal, A. K.; Al-Ammar, K.; Gollahalli, S. R.; Griffin, D. W.

    1999-01-01

    This project was carried out to understand the effects of heat release and buoyancy on the flame structure of diffusion flames. Experiments were conducted at atmospheric pressure under both normal gravity and microgravity conditions in the NASA LeRC 2.2 s drop tower. Experiments were also conducted in a variable pressure combustion facility in normal gravity to scale buoyancy and thus supplement the drop tower experiments. Pure H2 or H2 mixed with He was used as the jet fluid to avoid the complexities associated with soot formation. Fuel jets burning in quiescent air were visualized and quantified by Rainbow Schlieren Deflectometry (RSD) to obtain scalar profiles (temperature, oxygen concentration) within the flame. The burner tube diameter (d) was varied from 0.3 to 1.19 mm, producing jet exit Reynolds numbers ranging from 40 to 1900 and generating flames encompassing laminar and transitional (laminar to turbulent) flow structure. Some experiments were also complemented with CFD analysis. In a previous paper, we presented details of the RSD technique, comparisons of computed and measured scalar distributions, and the effects of buoyancy on laminar and transitional H2 gas-jet diffusion flames. Results obtained from the RSD technique, the variable pressure combustion chamber, and theoretical models have been published. Subsequently, we have developed a new drop rig with improved optics and image acquisition. In this setup, the schlieren images are acquired in real time and stored digitally in the RAM of an onboard computer. This paper deals with laminar diffusion flames of pure H2 in normal gravity and microgravity.

  18. Interdisciplinary Team-Teaching Experience for a Computer and Nuclear Energy Course for Electrical and Computer Engineering Students

    ERIC Educational Resources Information Center

    Kim, Charles; Jackson, Deborah; Keiller, Peter

    2016-01-01

    A new, interdisciplinary, team-taught course has been designed to educate students in Electrical and Computer Engineering (ECE) so that they can respond to global and urgent issues concerning computer control systems in nuclear power plants. This paper discusses our experience and assessment of the interdisciplinary computer and nuclear energy…

  19. Computation of Chemical Shifts for Paramagnetic Molecules: A Laboratory Experiment for the Undergraduate Curriculum

    ERIC Educational Resources Information Center

    Pritchard, Benjamin P.; Simpson, Scott; Zurek, Eva; Autschbach, Jochen

    2014-01-01

    A computational experiment investigating the [superscript 1]H and [superscript 13]C nuclear magnetic resonance (NMR) chemical shifts of molecules with unpaired electrons has been developed and implemented. This experiment is appropriate for an upper-level undergraduate laboratory course in computational, physical, or inorganic chemistry. The…

  20. A Comparison between Predicted and Observed Atmospheric States and their Effects on Infrasonic Source Time Function Inversion at Source Physics Experiment 6

    NASA Astrophysics Data System (ADS)

    Aur, K. A.; Poppeliers, C.; Preston, L. A.

    2017-12-01

    The Source Physics Experiment (SPE) consists of a series of underground chemical explosions at the Nevada National Security Site (NNSS) designed to gain an improved understanding of the generation and propagation of physical signals in the near and far field. Characterizing the acoustic and infrasound source mechanism from underground explosions is of great importance to underground explosion monitoring. To this end we perform full waveform source inversion of infrasound data collected from the SPE-6 experiment at distances from 300 m to 6 km and frequencies up to 20 Hz. Our method requires estimating the state of the atmosphere at the time of each experiment, computing Green's functions through these atmospheric models, and subsequently inverting the observed data in the frequency domain to obtain a source time function. To estimate the state of the atmosphere at the time of the experiment, we utilize the Weather Research and Forecasting - Data Assimilation (WRF-DA) modeling system to derive a unified atmospheric state model by combining Global Energy and Water Cycle Experiment (GEWEX) Continental-scale International Project (GCIP) data and locally obtained sonde and surface weather observations collected at the time of the experiment. We synthesize Green's functions through these atmospheric models using Sandia's moving media acoustic propagation simulation suite (TDAAPS). These models include 3-D variations in topography, temperature, pressure, and wind. We compare inversion results using the atmospheric models derived from the unified weather models versus previous modeling results and discuss how these differences affect computed source waveforms with respect to observed waveforms at various distances. Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc. for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA0003525.

  1. Ownership and Agency of an Independent Supernumerary Hand Induced by an Imitation Brain-Computer Interface.

    PubMed

    Bashford, Luke; Mehring, Carsten

    2016-01-01

    To study body ownership and control, illusions that elicit these feelings in non-body objects are widely used. Classically introduced with the Rubber Hand Illusion, these illusions have more recently been replicated in virtual reality and with brain-computer interfaces. Traditionally these illusions investigate the replacement of a body part by an artificial counterpart; however, as brain-computer interface research develops, it offers the possibility of exploring the case where non-body objects are controlled in addition to movements of our own limbs. We therefore propose a new illusion designed to test the feeling of ownership and control of an independent supernumerary hand. Subjects are under the impression that they control a virtual reality hand via a brain-computer interface, but in reality there is no causal connection between brain activity and virtual hand movement; instead, correct movements are observed with 80% probability. These imitation brain-computer interface trials are interspersed with movements of both of the subjects' real hands, which are in view throughout the experiment. We show that subjects develop strong feelings of ownership and control over the third hand, despite only receiving visual feedback with no causal link to the actual brain signals. Our illusion differs crucially from previously reported studies in that we demonstrate independent ownership and control of the third hand without loss of ownership in the real hands.
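
    The sham feedback rule is simple to state: on each imitation trial the virtual hand performs the cued movement with 80% probability, independently of the recorded brain activity. A hypothetical sketch of such a rule (function and argument names are illustrative, not taken from the study):

    ```python
    import random

    def imitation_bci_feedback(cued_movement, movements=("left", "right"), p_correct=0.8):
        """Return the movement shown to the subject: the cued one with probability
        p_correct, otherwise a different movement chosen at random."""
        if random.random() < p_correct:
            return cued_movement
        return random.choice([m for m in movements if m != cued_movement])
    ```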

  2. Advanced networks and computing in healthcare

    PubMed Central

    Ackerman, Michael

    2011-01-01

    As computing and network capabilities continue to rise, it becomes increasingly important to understand the varied applications for using them to provide healthcare. The objective of this review is to identify key characteristics and attributes of healthcare applications involving the use of advanced computing and communication technologies, drawing upon 45 research and development projects in telemedicine and other aspects of healthcare funded by the National Library of Medicine over the past 12 years. Only projects publishing in the professional literature were included in the review. Four projects did not publish beyond their final reports. In addition, the authors drew on their first-hand experience as project officers, reviewers and monitors of the work. Major themes in the corpus of work were identified, characterizing key attributes of advanced computing and network applications in healthcare. Advanced computing and network applications are relevant to a range of healthcare settings and specialties, but they are most appropriate for solving a narrower range of problems in each. Healthcare projects undertaken primarily to explore potential have also demonstrated effectiveness and depend on the quality of network service as much as bandwidth. Many applications are enabling, making it possible to provide service or conduct research that previously was not possible or to achieve outcomes in addition to those for which projects were undertaken. Most notable are advances in imaging and visualization, collaboration and sense of presence, and mobility in communication and information-resource use. PMID:21486877

  3. Technology at the zoo: the influence of a touchscreen computer on orangutans and zoo visitors.

    PubMed

    Perdue, Bonnie M; Clay, Andrea W; Gaalema, Diann E; Maple, Terry L; Stoinski, Tara S

    2012-01-01

    A computer-controlled touchscreen apparatus (hereafter referred to as "touchscreen") in the orangutan exhibit at Zoo Atlanta provides enrichment to the animals and allows cognitive research to take place on exhibit. This study investigated the impact of the touchscreen on orangutan behavior and visibility, as well as its impact on zoo visitors. Despite previous research suggesting that providing a single computer system may negatively affect orangutan behavior, there was not a significant increase in aggression, stereotypic, or distress-related behaviors following the activation of the on-exhibit touchscreen. We also investigated the possibility that zoo visitors may be negatively affected by technology because it deviates from naturalism. However, we did not find a change in stay time or overall experience rating when the computer was turned on. This research was the first to assess visitor attitudes toward technology at the zoo, and we found that visitors report highly positive attitudes about technology for both animals and visitors. If subjects visited the exhibit when the computer was turned on, they more strongly agreed that orangutans benefit from interacting with computerized enrichment. This study is the first investigation of an on-exhibit touchscreen in group-housed apes; our findings of no negative effects on the animals or zoo visitors and positive attitudes toward technology suggest a significant value of this practice. © 2011 Wiley Periodicals, Inc.

  4. Analytical solution for reactive solute transport considering incomplete mixing within a reference elementary volume

    NASA Astrophysics Data System (ADS)

    Chiogna, Gabriele; Bellin, Alberto

    2013-05-01

    The laboratory experiments of Gramling et al. (2002) showed that incomplete mixing at the pore scale exerts a significant impact on the transport of reactive solutes and that assuming complete mixing leads to overestimation of the product concentration in bimolecular reactions. Subsequently, several attempts have been made to model this experiment, either by considering spatial segregation of the reactants, by applying non-Fickian transport through a Continuous Time Random Walk (CTRW), or by using an effective upscaled time-dependent kinetic reaction term. Previous analyses of these experimental results showed that, at the Darcy scale, conservative solute transport is well described by a standard advection-dispersion equation, which assumes complete mixing at the pore scale. However, reactive transport is significantly affected by incomplete mixing at smaller scales, i.e., within a reference elementary volume (REV). We consider here the family of equilibrium reactions for which the concentrations of the reactants and the product can be expressed as a function of the mixing ratio, the concentration of a fictitious nonreactive solute. For this type of reaction we propose, in agreement with previous studies, to model the effect of incomplete mixing at scales smaller than the Darcy scale by assuming that the mixing ratio is distributed within an REV according to a Beta distribution. We compute the parameters of the Beta model by imposing that the mean concentration equals the value that the concentration assumes at the continuum Darcy scale, while the variance decays with time as a power law. We show that our model reproduces the concentration profiles of the reaction product measured in the Gramling et al. (2002) experiments, using the transport parameters obtained from the conservative experiments and instantaneous reaction kinetics. The results are obtained by applying analytical solutions for both conservative and reactive solute transport, thereby providing a method to handle the effect of incomplete mixing on multispecies reactive solute transport that is simpler than other previously developed methods.
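
    The closure sketched above needs only the first two moments of the mixing ratio: matching a Beta distribution to the Darcy-scale mean and to a variance that decays in time as a power law fixes both shape parameters by the method of moments. A small illustration under those assumptions (the numerical values are placeholders, not values from the paper):

    ```python
    import numpy as np
    from scipy.stats import beta

    def beta_params_from_moments(mean, var):
        """Method-of-moments Beta parameters for the sub-REV mixing-ratio distribution.
        Requires 0 < var < mean * (1 - mean)."""
        if not 0.0 < var < mean * (1.0 - mean):
            raise ValueError("variance must satisfy 0 < var < mean*(1-mean)")
        nu = mean * (1.0 - mean) / var - 1.0
        return mean * nu, (1.0 - mean) * nu          # (alpha, beta)

    # illustrative power-law decay of the sub-REV variance, var(t) = var0 * (t / t0) ** (-gamma)
    var0, t0, gamma, t = 0.02, 1.0, 0.5, 4.0
    a, b = beta_params_from_moments(mean=0.4, var=var0 * (t / t0) ** (-gamma))
    samples = beta.rvs(a, b, size=5)                 # sub-REV mixing-ratio values
    ```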

  5. Using Computer Games for Instruction: The Student Experience

    ERIC Educational Resources Information Center

    Grimley, Michael; Green, Richard; Nilsen, Trond; Thompson, David; Tomes, Russell

    2011-01-01

    Computer games are fun, exciting and motivational when used as leisure pursuits. But do they have similar attributes when utilized for educational purposes? This article investigates whether learning by computer game can improve student experiences compared with a more formal lecture approach and whether computer games have potential for improving…

  6. Experience with a UNIX based batch computing facility for H1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhards, R.; Kruener-Marquis, U.; Szkutnik, Z.

    1994-12-31

    A UNIX based batch computing facility for the H1 experiment at DESY is described. The ultimate goal is to replace the DESY IBM mainframe by a multiprocessor SGI Challenge series computer, using the UNIX operating system, for most of the computing tasks in H1.

  7. Predictors of Computer Anxiety and Performance in Information Systems.

    ERIC Educational Resources Information Center

    Anderson, Alastair A.

    1996-01-01

    Reports on the results of a study of business undergraduates in Australia that was conducted to determine whether or not perceived knowledge of software, microcomputer experience, overall knowledge of computers, programming experience, and gender were predictors of computer anxiety. Use of the Computer Anxiety Rating Scale is discussed.…

  8. Emergent Leadership and Team Effectiveness on a Team Resource Allocation Task

    DTIC Science & Technology

    1987-10-01

    equivalent training and experience on this task, but they had different levels of experience with computers and video games. This differential experience...typed: that is, it is sex-typed to the extent that males spend more time on related instruments like computers and video games. However, the sex...perform better or worse than less talkative teams? Did teams with much computer and/or video game experience perform better than inexperienced teams

  9. Experimental evaluation of the effect of a modified port-location mode on the performance of a three-zone simulated moving-bed process for the separation of valine and isoleucine.

    PubMed

    Park, Chanhun; Nam, Hee-Geun; Kim, Pung-Ho; Mun, Sungyong

    2014-06-01

    The removal of isoleucine from valine has been a key issue in the valine crystallization stage, which is the final step in the industrial valine production process. To address this issue, a three-zone simulated moving-bed (SMB) process for the separation of valine and isoleucine has been developed previously. However, the previous process, which was based on a classical port-location mode, had some limitations in throughput and valine product concentration. In this study, a three-zone SMB process based on a modified port-location mode was applied to the separation of valine and isoleucine in order to markedly improve throughput and valine product concentration. Computer simulations and a lab-scale process experiment showed that the modified three-zone SMB for valine separation led to >65% higher throughput and >160% higher valine concentration compared to the previous three-zone SMB for the same separation. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Computing nucleon EDM on a lattice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abramczyk, Michael; Izubuchi, Taku

    I will discuss briefly recent changes in the methodology of computing the baryon EDM on a lattice. The associated correction substantially reduces presently existing lattice values for the proton and neutron theta-induced EDMs, so that even the most precise previous lattice results become consistent with zero. On one hand, this change removes previous disagreements between these lattice results and the phenomenological estimates of the nucleon EDM. On the other hand, the nucleon EDM becomes much harder to compute on a lattice. In addition, I will review the progress in computing quark chromo-EDM-induced nucleon EDM using chiral quark action.

  11. Comparing Computer Game and Traditional Lecture Using Experience Ratings from High and Low Achieving Students

    ERIC Educational Resources Information Center

    Grimley, Michael; Green, Richard; Nilsen, Trond; Thompson, David

    2012-01-01

    Computer games are purported to be effective instructional tools that enhance motivation and improve engagement. The aim of this study was to investigate how tertiary student experiences change when instruction was computer game based compared to lecture based, and whether experiences differed between high and low achieving students. Participants…

  12. Attention in a Bayesian Framework

    PubMed Central

    Whiteley, Louise; Sahani, Maneesh

    2012-01-01

    The behavioral phenomena of sensory attention are thought to reflect the allocation of a limited processing resource, but there is little consensus on the nature of the resource or why it should be limited. Here we argue that a fundamental bottleneck emerges naturally within Bayesian models of perception, and use this observation to frame a new computational account of the need for, and action of, attention – unifying diverse attentional phenomena in a way that goes beyond previous inferential, probabilistic and Bayesian models. Attentional effects are most evident in cluttered environments, and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental settings, where cues shape expectations about a small number of upcoming stimuli and thus convey “prior” information about clearly defined objects. While operationally consistent with the experiments it seeks to describe, this view of attention as prior seems to miss many essential elements of both its selective and integrative roles, and thus cannot be easily extended to complex environments. We suggest that the resource bottleneck stems from the computational intractability of exact perceptual inference in complex settings, and that attention reflects an evolved mechanism for approximate inference which can be shaped to refine the local accuracy of perception. We show that this approach extends the simple picture of attention as prior, so as to provide a unified and computationally driven account of both selective and integrative attentional phenomena. PMID:22712010

  13. Quinoa - Adaptive Computational Fluid Dynamics, 0.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bakosi, Jozsef; Gonzalez, Francisco; Rogers, Brandon

    Quinoa is a set of computational tools that enables research and numerical analysis in fluid dynamics. At this time it remains a test-bed to experiment with various algorithms using fully asynchronous runtime systems. Currently, Quinoa consists of the following tools: (1) Walker, a numerical integrator for systems of stochastic differential equations in time. It is a mathematical tool to analyze and design the behavior of stochastic differential equations. It allows the estimation of arbitrary coupled statistics and probability density functions and is currently used for the design of statistical moment approximations for multiple mixing materials in variable-density turbulence. (2) Inciter, an overdecomposition-aware finite element field solver for partial differential equations using 3D unstructured grids. Inciter is used to research asynchronous mesh-based algorithms and to experiment with coupling asynchronous to bulk-synchronous parallel code. Two planned new features of Inciter, compared to the previous release (LA-CC-16-015), to be implemented in 2017, are (a) a simple Navier-Stokes solver for ideal single-material compressible gases, and (b) solution-adaptive mesh refinement (AMR), which enables dynamically concentrating compute resources to regions with interesting physics. Using the NS-AMR problem we plan to explore how to scale such high-load-imbalance simulations, representative of large production multiphysics codes, to very large problems on very large computers using an asynchronous runtime system. (3) RNGTest, a test harness to subject random number generators to stringent statistical tests enabling quantitative ranking with respect to their quality and computational cost. (4) UnitTest, a unit test harness, running hundreds of tests per second, capable of testing serial, synchronous, and asynchronous functions. (5) MeshConv, a mesh file converter that can be used to convert 3D tetrahedron meshes from and to either of the following formats: Gmsh (http://www.geuz.org/gmsh), Netgen (http://sourceforge.net/apps/mediawiki/netgen-mesher), ExodusII (http://sourceforge.net/projects/exodusii), HyperMesh (http://www.altairhyperworks.com/product/HyperMesh).
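
    As a point of reference for what an SDE time-integrator such as Walker does, the simplest scheme for dX = a(X, t) dt + b(X, t) dW is Euler-Maruyama. The sketch below is a generic Python illustration of that scheme, not Quinoa code:

    ```python
    import numpy as np

    def euler_maruyama(drift, diffusion, x0, dt, n_steps, rng=None):
        """Integrate dX = drift(X, t) dt + diffusion(X, t) dW with the Euler-Maruyama scheme."""
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x0, dtype=float)
        path = [x.copy()]
        for k in range(n_steps):
            t = k * dt
            dw = rng.normal(0.0, np.sqrt(dt), size=x.shape)   # Wiener increments
            x = x + drift(x, t) * dt + diffusion(x, t) * dw
            path.append(x.copy())
        return np.array(path)

    # example: Ornstein-Uhlenbeck process dX = -theta * X dt + sigma dW
    path = euler_maruyama(lambda x, t: -1.5 * x,
                          lambda x, t: 0.3 * np.ones_like(x),
                          x0=[1.0], dt=1e-3, n_steps=5000)
    ```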

  14. Audio-visual perception of 3D cinematography: an fMRI study using condition-based and computation-based analyses.

    PubMed

    Ogawa, Akitoshi; Bordier, Cecile; Macaluso, Emiliano

    2013-01-01

    The use of naturalistic stimuli to probe sensory functions in the human brain is gaining increasing interest. Previous imaging studies examined brain activity associated with the processing of cinematographic material using both standard "condition-based" designs, as well as "computational" methods based on the extraction of time-varying features of the stimuli (e.g. motion). Here, we exploited both approaches to investigate the neural correlates of complex visual and auditory spatial signals in cinematography. In the first experiment, the participants watched a piece of a commercial movie presented in four blocked conditions: 3D vision with surround sounds (3D-Surround), 3D with monaural sound (3D-Mono), 2D-Surround, and 2D-Mono. In the second experiment, they watched two different segments of the movie both presented continuously in 3D-Surround. The blocked presentation served for standard condition-based analyses, while all datasets were submitted to computation-based analyses. The latter assessed where activity co-varied with visual disparity signals and the complexity of auditory multi-sources signals. The blocked analyses associated 3D viewing with the activation of the dorsal and lateral occipital cortex and superior parietal lobule, while the surround sounds activated the superior and middle temporal gyri (S/MTG). The computation-based analyses revealed the effects of absolute disparity in dorsal occipital and posterior parietal cortices and of disparity gradients in the posterior middle temporal gyrus plus the inferior frontal gyrus. The complexity of the surround sounds was associated with activity in specific sub-regions of S/MTG, even after accounting for changes of sound intensity. These results demonstrate that the processing of naturalistic audio-visual signals entails an extensive set of visual and auditory areas, and that computation-based analyses can track the contribution of complex spatial aspects characterizing such life-like stimuli.

  15. High-Performance Java Codes for Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Riley, Christopher; Chatterjee, Siddhartha; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The computational science community is reluctant to write large-scale computationally intensive applications in Java due to concerns over Java's poor performance, despite the claimed software engineering advantages of its object-oriented features. Naive Java implementations of numerical algorithms can perform poorly compared to corresponding Fortran or C implementations. To achieve high performance, Java applications must be designed with good performance as a primary goal. This paper presents the object-oriented design and implementation of two real-world applications from the field of Computational Fluid Dynamics (CFD): a finite-volume fluid flow solver (LAURA, from NASA Langley Research Center), and an unstructured mesh adaptation algorithm (2D_TAG, from NASA Ames Research Center). This work builds on our previous experience with the design of high-performance numerical libraries in Java. We examine the performance of the applications using the currently available Java infrastructure and show that the Java version of the flow solver LAURA performs almost within a factor of 2 of the original procedural version. Our Java version of the mesh adaptation algorithm 2D_TAG performs within a factor of 1.5 of its original procedural version on certain platforms. Our results demonstrate that object-oriented software design principles are not necessarily inimical to high performance.

  16. Planning for distributed workflows: constraint-based coscheduling of computational jobs and data placement in distributed environments

    NASA Astrophysics Data System (ADS)

    Makatun, Dzmitry; Lauret, Jérôme; Rudová, Hana; Šumbera, Michal

    2015-05-01

    When running data-intensive applications on distributed computational resources, long I/O overheads may be observed as access to remotely stored data is performed. Latencies and bandwidth can become the major limiting factor for overall computation performance and can reduce the CPU/WallTime ratio through excessive I/O wait. Building on the knowledge from our previous research, we propose a constraint-programming-based planner that schedules computational jobs and data placements (transfers) in a distributed environment in order to optimize resource utilization and reduce the overall processing completion time. The optimization is achieved by ensuring that none of the resources (network links, data storages, and CPUs) is oversaturated at any moment of time and that either (a) the data is pre-placed at the site where the job runs or (b) the jobs are scheduled where the data is already present. Such an approach eliminates the idle CPU cycles that occur when a job is waiting for I/O from a remote site, and it would have wide application in the community. Our planner was evaluated and simulated using data extracted from the log files of the batch and data management systems of the STAR experiment. The results of the evaluation and estimates of the performance improvements are discussed in this paper.
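
    The planning idea can be illustrated with a toy constraint model: place each job at exactly one site, keep per-site CPU usage and inbound transfer volume within capacity, and minimize the volume of data moved over the network. The sketch below uses Google OR-Tools CP-SAT with hypothetical job and site data; it is a simplified stand-in for, not a reproduction of, the authors' constraint-programming planner:

    ```python
    from ortools.sat.python import cp_model

    # hypothetical jobs (CPU slots needed, site holding their input data, data size) and sites
    jobs = {"j1": {"cpu": 2, "data_site": "A", "size": 40},
            "j2": {"cpu": 1, "data_site": "B", "size": 10},
            "j3": {"cpu": 3, "data_site": "A", "size": 25}}
    sites = {"A": {"cpu": 4, "link": 30}, "B": {"cpu": 4, "link": 30}}

    model = cp_model.CpModel()
    x = {(j, s): model.NewBoolVar(f"x_{j}_{s}") for j in jobs for s in sites}
    for j in jobs:                                   # each job runs at exactly one site
        model.Add(sum(x[j, s] for s in sites) == 1)
    for s in sites:                                  # CPU capacity per site
        model.Add(sum(jobs[j]["cpu"] * x[j, s] for j in jobs) <= sites[s]["cpu"])
    for s in sites:                                  # inbound transfer volume per site link
        model.Add(sum(jobs[j]["size"] * x[j, s]
                      for j in jobs if jobs[j]["data_site"] != s) <= sites[s]["link"])
    # objective: minimize the total volume moved over the network
    model.Minimize(sum(jobs[j]["size"] * x[j, s]
                       for j in jobs for s in sites if jobs[j]["data_site"] != s))

    solver = cp_model.CpSolver()
    if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
        placement = {j: s for j in jobs for s in sites if solver.Value(x[j, s])}
    ```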

  17. Geant4 Computing Performance Benchmarking and Monitoring

    DOE PAGES

    Dotti, Andrea; Elvira, V. Daniel; Folger, Gunter; ...

    2015-12-23

    Performance evaluation and analysis of large scale computing applications is essential for optimal use of resources. As detector simulation is one of the most compute intensive tasks and Geant4 is the simulation toolkit most widely used in contemporary high energy physics (HEP) experiments, it is important to monitor Geant4 through its development cycle for changes in computing performance and to identify problems and opportunities for code improvements. All Geant4 development and public releases are being profiled with a set of applications that utilize different input event samples, physics parameters, and detector configurations. Results from multiple benchmarking runs are compared to previous public and development reference releases to monitor CPU and memory usage. Observed changes are evaluated and correlated with code modifications. Besides the full summary of call stack and memory footprint, a detailed call graph analysis is available to Geant4 developers for further analysis. The set of software tools used in the performance evaluation procedure, both in sequential and multi-threaded modes, include FAST, IgProf and Open|Speedshop. In conclusion, the scalability of the CPU time and memory performance in multi-threaded application is evaluated by measuring event throughput and memory gain as a function of the number of threads for selected event samples.

  18. Improved Classification of Lung Cancer Using Radial Basis Function Neural Network with Affine Transforms of Voss Representation.

    PubMed

    Adetiba, Emmanuel; Olugbara, Oludayo O

    2015-01-01

    Lung cancer is one of the diseases responsible for a large number of cancer related death cases worldwide. The recommended standard for screening and early detection of lung cancer is the low dose computed tomography. However, many patients diagnosed die within one year, which makes it essential to find alternative approaches for screening and early detection of lung cancer. We present computational methods that can be implemented in a functional multi-genomic system for classification, screening and early detection of lung cancer victims. Samples of top ten biomarker genes previously reported to have the highest frequency of lung cancer mutations and sequences of normal biomarker genes were respectively collected from the COSMIC and NCBI databases to validate the computational methods. Experiments were performed based on the combinations of Z-curve and tetrahedron affine transforms, Histogram of Oriented Gradient (HOG), Multilayer perceptron and Gaussian Radial Basis Function (RBF) neural networks to obtain an appropriate combination of computational methods to achieve improved classification of lung cancer biomarker genes. Results show that a combination of affine transforms of Voss representation, HOG genomic features and Gaussian RBF neural network perceptibly improves classification accuracy, specificity and sensitivity of lung cancer biomarker genes as well as achieving low mean square error.

  19. Assessing Gait Impairments Based on Auto-Encoded Patterns of Mahalanobis Distances from Consecutive Steps.

    PubMed

    Muñoz-Organero, Mario; Davies, Richard; Mawson, Sue

    2017-01-01

    Insole pressure sensors capture the force distribution patterns during the stance phase while walking. By comparing patterns obtained from healthy individuals with those of patients suffering from different medical conditions, based on a given similarity measure, automatic impairment indexes can be computed to help in applications such as rehabilitation. This paper uses the data sensed from insole pressure sensors for a group of healthy controls to train an autoencoder on patterns of stochastic distances over series of consecutive steps while walking at normal speeds. Two experiment groups are compared to the healthy control group: a group of patients suffering knee pain and a group of post-stroke survivors. The Mahalanobis distance is computed for every single step of each participant relative to the entire dataset sensed from healthy controls. The computed distances for consecutive steps are fed into the previously trained autoencoder, and the average reconstruction error is used to assess how close the walking segment is to the autogenerated model from healthy controls. The results show that automatic distortion indexes can be used to assess each participant relative to normal patterns computed from healthy controls. The stochastic distances observed for the group of stroke survivors are larger than those for the people with knee pain.
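
    The first stage of the pipeline described above is a per-step Mahalanobis distance to the healthy-control distribution; sequences of these distances are then fed to the autoencoder. A minimal sketch of the distance computation, assuming each step has already been reduced to a fixed-length feature vector (array names are placeholders):

    ```python
    import numpy as np

    def step_mahalanobis(steps, healthy_steps):
        """Mahalanobis distance of each step to the healthy-control distribution.
        steps, healthy_steps: arrays of shape (n_steps, n_features)."""
        mu = healthy_steps.mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(healthy_steps, rowvar=False))  # pseudo-inverse for robustness
        diff = steps - mu
        return np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))
    ```

    Windows of consecutive distances returned by such a function would then form the input sequences for the autoencoder, whose reconstruction error yields the impairment index.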

  20. A systems biology approach to predict and characterize human gut microbial metabolites in colorectal cancer.

    PubMed

    Wang, QuanQiu; Li, Li; Xu, Rong

    2018-04-18

    Colorectal cancer (CRC) is the second leading cause of cancer-related deaths. It is estimated that about half the cases of CRC occurring today are preventable. Recent studies showed that human gut microbiota and their collective metabolic outputs play important roles in CRC. However, the mechanisms by which human gut microbial metabolites interact with host genetics in contributing to CRC remain largely unknown. We hypothesize that computational approaches that integrate and analyze vast amounts of publicly available biomedical data have great potential for better understanding how human gut microbial metabolites are mechanistically involved in CRC. Leveraging a vast amount of publicly available data, we developed a computational algorithm to predict human gut microbial metabolites for CRC. We validated the prediction algorithm by showing that previously known CRC-associated gut microbial metabolites ranked highly (mean ranking: top 10.52%; median ranking: 6.29%; p-value: 3.85E-16). Moreover, we identified new gut microbial metabolites likely associated with CRC. Through computational analysis, we propose potential roles for tartaric acid, the top-ranked metabolite, in CRC etiology. In summary, our data-driven, computation-based study generated a large number of associations that could serve as a starting point for further experiments to refute or validate these microbial metabolite associations in CRC.
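
    The validation statistic quoted above (the mean and median percentile rank of previously known CRC-associated metabolites within the full ranked list) can be computed directly from the ranking. A small sketch with placeholder identifiers and scores, not the authors' pipeline:

    ```python
    import numpy as np

    def percentile_ranks(all_ids, scores, known_positive_ids):
        """Percentile rank (lower is better) of each known metabolite in the ranked list."""
        order = np.argsort(-np.asarray(scores))              # rank 1 = highest score
        rank_of = {all_ids[i]: r + 1 for r, i in enumerate(order)}
        ranks = [100.0 * rank_of[m] / len(all_ids) for m in known_positive_ids]
        return np.mean(ranks), np.median(ranks)

    # toy usage with made-up data
    mean_pct, median_pct = percentile_ranks(["m1", "m2", "m3", "m4"],
                                            [0.9, 0.1, 0.7, 0.4],
                                            known_positive_ids=["m1", "m3"])
    ```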

  1. Integrated design, execution, and analysis of arrayed and pooled CRISPR genome-editing experiments.

    PubMed

    Canver, Matthew C; Haeussler, Maximilian; Bauer, Daniel E; Orkin, Stuart H; Sanjana, Neville E; Shalem, Ophir; Yuan, Guo-Cheng; Zhang, Feng; Concordet, Jean-Paul; Pinello, Luca

    2018-05-01

    CRISPR (clustered regularly interspaced short palindromic repeats) genome-editing experiments offer enormous potential for the evaluation of genomic loci using arrayed single guide RNAs (sgRNAs) or pooled sgRNA libraries. Numerous computational tools are available to help design sgRNAs with optimal on-target efficiency and minimal off-target potential. In addition, computational tools have been developed to analyze deep-sequencing data resulting from genome-editing experiments. However, these tools are typically developed in isolation and oftentimes are not readily translatable into laboratory-based experiments. Here, we present a protocol that describes in detail both the computational and benchtop implementation of an arrayed and/or pooled CRISPR genome-editing experiment. This protocol provides instructions for sgRNA design with CRISPOR (computational tool for the design, evaluation, and cloning of sgRNA sequences), experimental implementation, and analysis of the resulting high-throughput sequencing data with CRISPResso (computational tool for analysis of genome-editing outcomes from deep-sequencing data). This protocol allows for design and execution of arrayed and pooled CRISPR experiments in 4-5 weeks by non-experts, as well as computational data analysis that can be performed in 1-2 d by both computational and noncomputational biologists alike using web-based and/or command-line versions.

  2. A meta-analysis of outcomes from the use of computer-simulated experiments in science education

    NASA Astrophysics Data System (ADS)

    Lejeune, John Van

    The purpose of this study was to synthesize the findings from existing research on the effects of computer-simulated experiments on students in science education. Results from 40 reports were integrated through meta-analysis to examine the effect of computer-simulated experiments and interactive videodisc simulations on student achievement and attitudes. Findings indicated significant positive differences in both low-level and high-level achievement for students who used computer-simulated experiments and interactive videodisc simulations compared with students who used more traditional learning activities. No significant differences were found in retention, in student attitudes toward the subject, or in attitudes toward the educational method. Based on the findings of this study, computer-simulated experiments and interactive videodisc simulations should be used to enhance students' learning in science, especially in cases where the use of traditional laboratory activities is expensive, dangerous, or impractical.
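
    Meta-analyses of this kind typically combine per-study standardized mean differences; a common choice is Hedges' g, the pooled-standard-deviation effect size with a small-sample bias correction. A generic sketch of that per-study computation (not the dissertation's own code):

    ```python
    import numpy as np

    def hedges_g(mean_sim, sd_sim, n_sim, mean_trad, sd_trad, n_trad):
        """Hedges' g comparing a simulation group with a traditional-instruction group."""
        pooled_sd = np.sqrt(((n_sim - 1) * sd_sim ** 2 + (n_trad - 1) * sd_trad ** 2)
                            / (n_sim + n_trad - 2))
        d = (mean_sim - mean_trad) / pooled_sd               # Cohen's d
        correction = 1 - 3 / (4 * (n_sim + n_trad) - 9)      # small-sample correction
        return d * correction

    # toy example with made-up group statistics
    g = hedges_g(mean_sim=78.0, sd_sim=10.0, n_sim=30, mean_trad=72.0, sd_trad=11.0, n_trad=28)
    ```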

  3. A descriptive qualitative study of adolescent girls' well-being in Northern Finland.

    PubMed

    Wiens, Varpu; Kyngäs, Helvi; Pölkki, Tarja

    2014-01-01

    Previous studies have shown that girls present welfare-related symptoms differently from boys and that the severity of their symptoms increases with age. Girls living in Northern Finland experience reduced well-being in some aspects of their lives. However, the opinions of girls on these matters have not previously been studied. The aim of this study was to describe girls' well-being in Northern Finland. This is a descriptive qualitative study. The participants were 117 girls aged between 13 and 16 who were living in the province of Lapland in Finland and attending primary school. Data were collected electronically; the girls were asked to respond to a set of open-ended questions using a computer during a school day. The responses were evaluated using inductive content analysis. Four main categories of girls' well-being were identified: health as a resource, a beneficial lifestyle, positive experience of life course, and favourable social relationships. Health as a resource was about feeling healthy and being able to enjoy life. A beneficial lifestyle was about healthy habits and meaningful hobbies. Positive experience of life course was related to high self-esteem and feeling good, safe, and optimistic. Favourable social relationships meant having good relationships with family and friends. To the participating girls, well-being was a positive experience and feeling that emerged from the interplay between their relationships, living conditions, lifestyle, and environment. Knowledge about girls' descriptions of their well-being can be used to understand how the girls themselves and their environment influence their well-being and what can be done to promote it.

  4. Analog and numerical experiments investigating force chain influences on bed conditions in granular flows

    NASA Astrophysics Data System (ADS)

    Estep, J.; Dufek, J.

    2013-12-01

    Granular flows are fundamental processes in several terrestrial and planetary natural events, including surficial flows on volcanic edifices, debris flows, landslides, dune formation, rock falls, sector collapses, and avalanches. Granular flows can often be two-phase, whereby interstitial fluids occupy the void space within the particulates. The mobility of granular flows has received significant attention; however, the physics that governs their internal behavior remains poorly understood. Here we extend previous research showing that force chains can transmit extreme localized forces to the substrates of free-surface granular flows, and we combine experimental and computational approaches to further investigate the forces at the bed of simplified granular flows. Analog experiments resolve discrete bed forces via a photoelastic technique, while numerical experiments validate the laboratory tests using discrete element model (DEM) simulations. The current work investigates (1) the role of distributed grain sizes in force transmission via force chains, and (2) how the inclusion of interstitial fluids affects force chain development. We also include 3D numerical simulations to place the observed 2D characteristics in a real-world perspective and to ascertain whether the added dimension alters force chain behavior. Previous research showed that bed forces generated by force chain structures can transiently exceed the bed forces predicted from continuum approaches by several hundred percent, and that natural materials are more prone to excessive bed forces than photoelastic materials due to their larger contact stiffnesses. This work suggests that force chain activity may play an important role in the bed physics of dense granular flows by influencing substrate entrainment. (Figure: photoelastic experiment image showing force chains in a gravity-driven granular flow.)

  5. Heat Transfer Experiments in the Internal Cooling Passages of a Cooled Radial Turbine Rotor

    NASA Technical Reports Server (NTRS)

    Johnson, B. V.; Wagner, J. H.

    1996-01-01

    An experimental study was conducted (1) to experimentally measure, assess and analyze the heat transfer within the internal cooling configuration of a radial turbine rotor blade and (2) to obtain heat transfer data to evaluate and improve computational fluid dynamics (CFD) procedures and turbulent transport models of internal coolant flows. A 1.15 times scale model of the coolant passages within the NASA LERC High Temperature Radial Turbine was designed, fabricated of Lucite and instrumented for transient heat transfer tests using thin film surface thermocouples and liquid crystals to indicate temperatures. Transient heat transfer tests were conducted for Reynolds numbers of one-fourth, one-half, and equal to the operating Reynolds number for the NASA Turbine. Tests were conducted for stationary and rotating conditions with rotation numbers in the range occurring in the NASA Turbine. Results from the experiments showed the heat transfer characteristics within the coolant passage were affected by rotation. In general, the heat transfer increased and decreased on the sides of the straight radial passages with rotation as previously reported from NASA-HOST-sponsored experiments. The heat transfer in the tri-passage axial flow region adjacent to the blade exit was relatively unaffected by rotation. However, the heat transfer on one surface, in the transitional region between the radial inflow passage and axial, constant radius passages, decreased to approximately 20 percent of the values without rotation. Comparisons with previous 3-D numerical studies indicated regions where the heat transfer characteristics agreed and disagreed with the present experiment.

  6. "Hot Spots" of Land Atmosphere Coupling

    NASA Technical Reports Server (NTRS)

    Koster, Randal D.; Dirmeyer, Paul A.; Guo, Zhi-Chang; Bonan, Gordan; Chan, Edmond; Cox, Peter; Gordon, T. C.; Kanae, Shinjiro; Kowalczyk, Eva; Lawrence, David

    2004-01-01

    Previous estimates of land-atmosphere interaction (the impact of soil moisture on precipitation) have been limited by a severe paucity of relevant observational data and by the model-dependence of the various computational estimates. To counter this limitation, a dozen climate modeling groups have recently performed the same highly-controlled numerical experiment as part of a coordinated intercomparison project. This allows, for the first time ever, a superior multi-model approach to the estimation of the regions on the globe where precipitation is affected by soil moisture anomalies during Northern Hemisphere summer. Such estimation has many potential benefits; it can contribute, for example, to seasonal rainfall prediction efforts.

  7. Predicting multi-wall structural response to hypervelocity impact using the hull code

    NASA Technical Reports Server (NTRS)

    Schonberg, William P.

    1993-01-01

    Previously, multi-wall structures have been analyzed extensively, primarily through experiment, as a means of increasing the meteoroid/space debris impact protection of spacecraft. As structural configurations become more varied, the number of tests required to characterize their response increases dramatically. As an alternative to experimental testing, numerical modeling of high-speed impact phenomena is often used to predict the response of a variety of structural systems under different impact loading conditions. The results of comparing experimental tests to Hull Hydrodynamic Computer Code predictions are reported. Also, the results of a numerical parametric study of multi-wall structural response to hypervelocity cylindrical projectile impact are presented.

  8. Prototyping the graphical user interface for the operator of the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Sadeh, I.; Oya, I.; Schwarz, J.; Pietriga, E.

    2016-07-01

    The Cherenkov Telescope Array (CTA) is a planned gamma-ray observatory. CTA will incorporate about 100 imaging atmospheric Cherenkov telescopes (IACTs) at a Southern site, and about 20 in the North. Previous IACT experiments have used up to five telescopes. Consequently, the design of a graphical user interface (GUI) for the operator of CTA involves new challenges. We present a GUI prototype, the concept for which is being developed in collaboration with experts from the field of Human-Computer Interaction (HCI). The prototype is based on Web technology; it incorporates a Python web server, Web Sockets and graphics generated with the d3.js Javascript library.

  9. Automated documentation generator for advanced protein crystal growth

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David; Ford, Ronald

    1993-01-01

    The System Management and Production Laboratory at the Research Institute, the University of Alabama in Huntsville (UAH), was tasked by the Microgravity Experiment Projects (MEP) Office of the Payload Projects Office (PPO) at Marshall Space Flight Center (MSFC) to conduct research in the current methods of written documentation control and retrieval. The goals of this research were to determine the logical interrelationships within selected NASA documentation, and to expand on a previously developed prototype system to deliver a distributable, electronic knowledge-based system. This computer application would then be used to provide a paperless interface between the appropriate parties for the required NASA document.

  10. Fully coupled six-dimensional calculations of the water dimer vibration-rotation-tunneling states with split Wigner pseudospectral approach. II. Improvements and tests of additional potentials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fellers, R.S.; Braly, L.B.; Saykally, R.J.

    The SWPS method is improved by the addition of H.E.G. contractions for generating a more compact basis. An error in the definition of the internal fragment axis system used in our previous calculation is described and corrected. Fully coupled 6D (rigid monomers) VRT states are computed for several new water dimer potential surfaces and compared with experiment and our earlier SWPS results. This work sets the stage for refinement of such potential surfaces via regression analysis of VRT spectroscopic data. © 1999 American Institute of Physics.

  11. DBSecSys 2.0: a database of Burkholderia mallei and Burkholderia pseudomallei secretion systems.

    PubMed

    Memišević, Vesna; Kumar, Kamal; Zavaljevski, Nela; DeShazer, David; Wallqvist, Anders; Reifman, Jaques

    2016-09-20

    Burkholderia mallei and B. pseudomallei are the causative agents of glanders and melioidosis, respectively, diseases with high morbidity and mortality rates. B. mallei and B. pseudomallei are closely related genetically; B. mallei evolved from an ancestral strain of B. pseudomallei by genome reduction and adaptation to an obligate intracellular lifestyle. Although these two bacteria cause different diseases, they share multiple virulence factors, including bacterial secretion systems, which represent key components of bacterial pathogenicity. Despite recent progress, the secretion system proteins for B. mallei and B. pseudomallei, their pathogenic mechanisms of action, and host factors are not well characterized. We previously developed a manually curated database, DBSecSys, of bacterial secretion system proteins for B. mallei. Here, we report an expansion of the database with corresponding information about B. pseudomallei. DBSecSys 2.0 contains comprehensive literature-based and computationally derived information about B. mallei ATCC 23344 and literature-based and computationally derived information about B. pseudomallei K96243. The database contains updated information for 163 B. mallei proteins from the previous database and 61 additional B. mallei proteins, and new information for 281 B. pseudomallei proteins associated with 5 secretion systems, their 1,633 human- and murine-interacting targets, and 2,400 host-B. mallei interactions and 2,286 host-B. pseudomallei interactions. The database also includes information about 13 pathogenic mechanisms of action for B. mallei and B. pseudomallei secretion system proteins inferred from the available literature or computationally. Additionally, DBSecSys 2.0 provides details about 82 virulence attenuation experiments for 52 B. mallei secretion system proteins and 98 virulence attenuation experiments for 61 B. pseudomallei secretion system proteins. We updated the Web interface and data access layer to speed up users' searches for detailed information on orthologous proteins related to secretion systems of the two pathogens. The updates of DBSecSys 2.0 provide unique capabilities to access comprehensive information about secretion systems of B. mallei and B. pseudomallei. They enable studies and comparisons of corresponding proteins of these two closely related pathogens and their host-interacting partners. The database is available at http://dbsecsys.bhsai.org.

  12. Educational Computer Use in Leisure Contexts: A Phenomenological Study of Adolescents' Experiences at Internet Cafes

    ERIC Educational Resources Information Center

    Cilesiz, Sebnem

    2009-01-01

    Computer use is a widespread leisure activity for adolescents. Leisure contexts, such as Internet cafes, constitute specific social environments for computer use and may hold significant educational potential. This article reports a phenomenological study of adolescents' experiences of educational computer use at Internet cafes in Turkey. The…

  13. Computers in Science Education: Can They Go Far Enough? Have We Gone Too Far?

    ERIC Educational Resources Information Center

    Schrock, John Richard

    1984-01-01

    Indicates that although computers may churn out creative research, science is still dependent on science education, and that science education consists of increasing human experience. Also considers uses and misuses of computers in the science classroom, examining Edgar Dale's "cone of experience" related to laboratory computer and "extended…

  14. AEC Experiment Establishes Computer Link Between California and Paris

    Science.gov Websites

    …demonstrated that a terminal in Paris could search a computer in California and display the resulting… The feasibility of a worldwide information retrieval system which would tie a computer base of information to terminals on the…

  15. Ontological and Epistemological Issues Regarding Climate Models and Computer Experiments

    NASA Astrophysics Data System (ADS)

    Vezer, M. A.

    2010-12-01

    Recent philosophical discussions (Parker 2009; Frigg and Reiss 2009; Winsberg, 2009; Morgon 2002, 2003, 2005; Gula 2002) about the ontology of computer simulation experiments and the epistemology of inferences drawn from them are of particular relevance to climate science as computer modeling and analysis are instrumental in understanding climatic systems. How do computer simulation experiments compare with traditional experiments? Is there an ontological difference between these two methods of inquiry? Are there epistemological considerations that result in one type of inference being more reliable than the other? What are the implications of these questions with respect to climate studies that rely on computer simulation analysis? In this paper, I examine these philosophical questions within the context of climate science, instantiating concerns in the philosophical literature with examples found in analysis of global climate change. I concentrate on Wendy Parker’s (2009) account of computer simulation studies, which offers a treatment of these and other questions relevant to investigations of climate change involving such modelling. Two theses at the center of Parker’s account will be the focus of this paper. The first is that computer simulation experiments ought to be regarded as straightforward material experiments; which is to say, there is no significant ontological difference between computer and traditional experimentation. Parker’s second thesis is that some of the emphasis on the epistemological importance of materiality has been misplaced. I examine both of these claims. First, I inquire as to whether viewing computer and traditional experiments as ontologically similar in the way she does implies that there is no proper distinction between abstract experiments (such as ‘thought experiments’ as well as computer experiments) and traditional ‘concrete’ ones. Second, I examine the notion of materiality (i.e., the material commonality between object and target systems) and some arguments for the claim that materiality entails some inferential advantage to traditional experimentation. I maintain that Parker’s account of the ontology of computer simulations has some interesting though potentially problematic implications regarding conventional distinctions between abstract and concrete methods of inquiry. With respect to her account of materiality, I outline and defend an alternative account, posited by Mary Morgan (2002, 2003, 2005), which holds that ontological similarity between target and object systems confers some epistemological advantage to traditional forms of experimental inquiry.

  16. Fermilab computing at the Intensity Frontier

    DOE PAGES

    Group, Craig; Fuess, S.; Gutsche, O.; ...

    2015-12-23

    The Intensity Frontier refers to a diverse set of particle physics experiments using high-intensity beams. In this paper I focus the discussion on the computing requirements and solutions of a set of neutrino and muon experiments in progress or planned at the Fermi National Accelerator Laboratory, located near Chicago, Illinois. The experiments face unique challenges, but also have overlapping computational needs. In principle, by exploiting this commonality and utilizing centralized computing tools and resources, requirements can be satisfied efficiently, and scientists of individual experiments can focus more on the science and less on the development of tools and infrastructure.

  17. Improving Protocols for Protein Mapping through Proper Comparison to Crystallography Data

    PubMed Central

    Lexa, Katrina W.; Carlson, Heather A.

    2013-01-01

    Computational approaches to fragment-based drug design (FBDD) can complement experiments and facilitate the identification of potential hot spots along the protein surface. However, the evaluation of computational methods for mapping binding sites frequently focuses on the ability to reproduce crystallographic coordinates to within a low RMSD threshold. This dependence on the deposited coordinate data overlooks the original electron density from the experiment; thus, techniques may be developed based upon subjective, or even erroneous, atomic coordinates. This can become a significant drawback in applications to systems where the location of hot spots is unknown. Based on comparison to crystallographic density, we previously showed that mixed-solvent molecular dynamics (MixMD) accurately identifies the active site of HEWL with acetonitrile as the organic solvent. Here, we concentrated on the influence of protic solvent on the simulations and refined the optimal MixMD approach for extrapolation of the method to systems without established sites. Our results establish an accurate approach for comparing simulations to experiment. We have outlined the most efficient strategy for MixMD, based on simulation length and number of runs. The development outlined here makes MixMD a robust method that should prove useful across a broad range of target structures. Lastly, our results with MixMD match experimental data so well that consistency between simulations and density may be a useful way to aid the identification of probes versus waters during the refinement of future MSCS crystallographic structures. PMID:23327200
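
    The RMSD criterion discussed in this abstract is easy to state concretely. The following is a minimal, illustrative sketch, not the authors' protocol: it scores a hypothetical predicted probe placement against deposited crystallographic coordinates, where the coordinate values and the 2.0 Å cutoff are assumptions for demonstration only.

```python
import numpy as np

def rmsd(pred: np.ndarray, ref: np.ndarray) -> float:
    """Root-mean-square deviation between two (N, 3) coordinate arrays.

    Assumes the structures are already superposed and that atoms appear
    in the same order in both arrays.
    """
    if pred.shape != ref.shape:
        raise ValueError("coordinate arrays must have matching shapes")
    diff = pred - ref
    return float(np.sqrt((diff * diff).sum() / len(pred)))

# Hypothetical probe coordinates (angstroms) vs. crystallographic positions.
predicted = np.array([[1.2, 3.4, -0.5], [2.0, 1.1, 0.7]])
crystal = np.array([[1.0, 3.5, -0.4], [2.2, 1.0, 0.9]])

threshold = 2.0  # assumed cutoff in angstroms; studies vary
value = rmsd(predicted, crystal)
print(f"RMSD = {value:.2f} A, within threshold: {value <= threshold}")
```

    The paper's point is that such coordinate-based checks inherit any subjectivity in the deposited model, which is why the authors compare simulations against the experimental electron density instead.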

  18. MRPrimerW: a tool for rapid design of valid high-quality primers for multiple target qPCR experiments.

    PubMed

    Kim, Hyerin; Kang, NaNa; An, KyuHyeon; Koo, JaeHyung; Kim, Min-Soo

    2016-07-08

    Design of high-quality primers for multiple target sequences is essential for qPCR experiments, but is challenging due to the need to consider homology tests on off-target sequences and stringent filtering constraints on the primers at the same time. Existing web servers for primer design have major drawbacks, including requiring the use of BLAST-like tools for homology tests and lacking support for primer ranking, TaqMan probes, and simultaneous design of primers against multiple targets. Due to the large computational overhead, the few web servers supporting homology tests use heuristic approaches or perform the tests within a limited scope. Here, we describe MRPrimerW, which performs complete homology testing, supports batch design of primers for multi-target qPCR experiments, supports design of TaqMan probes, and ranks the resulting primers to return the top-ranked primers to the user. To ensure high accuracy, we adopted the core algorithm of a previously reported MapReduce-based method, MRPrimer, but completely redesigned it so that users receive query results quickly in a web interface, without requiring a MapReduce cluster or a long computation. MRPrimerW provides primer design services and a complete set of 341 963 135 in silico validated primers covering 99% of human and mouse genes. Free access: http://MRPrimerW.com. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
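
    For readers unfamiliar with the filtering step mentioned in this abstract, the sketch below illustrates the flavor of single-primer constraints such pipelines apply before any homology testing. The thresholds, the Wallace-rule melting-temperature estimate, and the candidate sequences are illustrative assumptions, not MRPrimerW's actual criteria.

```python
def gc_content(seq: str) -> float:
    """Fraction of G/C bases in a primer sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def wallace_tm(seq: str) -> float:
    """Rough melting temperature via the Wallace rule (short oligos only)."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2.0 * at + 4.0 * gc

def passes_filters(seq: str,
                   length_range=(19, 23),
                   gc_range=(0.40, 0.60),
                   tm_range=(55.0, 65.0)) -> bool:
    """Apply simple single-primer constraints; homology testing against
    off-target sequences would be a separate, far more expensive step."""
    return (length_range[0] <= len(seq) <= length_range[1]
            and gc_range[0] <= gc_content(seq) <= gc_range[1]
            and tm_range[0] <= wallace_tm(seq) <= tm_range[1])

# Hypothetical candidate primers, not taken from the MRPrimerW database.
candidates = ["ATGCGTACGTTAGCCGTACGA", "GGGGGGGGGGGGGGGGGGGGG"]
for primer in candidates:
    print(primer, passes_filters(primer))
```

    Applying these cheap per-primer filters first, and only then running homology tests on the survivors, is what makes exhaustive multi-target designs computationally feasible.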

  19. Computational Analysis of the Interaction Energies between Amino Acid Residues of the Measles Virus Hemagglutinin and Its Receptors.

    PubMed

    Xu, Fengqi; Tanaka, Shigenori; Watanabe, Hirofumi; Shimane, Yasuhiro; Iwasawa, Misako; Ohishi, Kazue; Maruyama, Tadashi

    2018-05-03

    Measles virus (MV) causes an acute and highly devastating contagious disease in humans. Employing the crystal structures of three human receptors, signaling lymphocytic activation molecule (SLAM), CD46, and Nectin-4, in complex with the measles virus hemagglutinin (MVH), we computationally elucidated the details of the binding energies between the amino acid residues of MVH and those of the receptors using an ab initio fragment molecular orbital (FMO) method. The calculated inter-fragment interaction energies (IFIEs) revealed a number of significantly interacting amino acid residues of MVH that play essential roles in binding to the receptors. As predicted from previously reported experiments, some important amino acid residues of MVH were shown to be common to the three receptors, while others were specific to individual receptor interactions. In particular, some of the (non-polar) hydrophobic residues of MVH were found to interact attractively with multiple receptors, indicating the importance of the hydrophobic pocket for intermolecular interactions (especially in the case of Nectin-4). In contrast, electrostatic interactions tended to be used for specific molecular recognition. Furthermore, we carried out FMO calculations for in silico experiments on amino acid mutations, finding reasonable agreement with virological experiments concerning the effects of residue substitutions. Thus, the present study demonstrates that the electron-correlated FMO method is a powerful tool for searching exhaustively for amino acid residues that contribute to interactions with receptor molecules. It is also applicable to designing inhibitors of MVH and engineered MVs for cancer therapy.
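
    To make the IFIE-based analysis concrete, here is a minimal sketch of how per-residue interaction energies might be aggregated and ranked to flag "significantly interacting" residues. The residue labels and energy values are invented for demonstration and are not results from the paper.

```python
import numpy as np

# Hypothetical IFIE table (kcal/mol): rows are MVH residues, columns are
# receptor residues. Negative values denote attractive (stabilizing) pairs.
mvh_residues = ["ILE194", "ASP505", "TYR529", "ARG533"]
receptor_residues = ["PHE119", "GLU123", "HIS130"]
ifie = np.array([
    [-3.1, -0.4, -1.2],
    [-0.2, -6.5, -0.8],
    [-2.7, -0.1, -3.3],
    [ 0.5, -4.9, -1.1],
])

# Sum over receptor fragments to get each MVH residue's net interaction,
# then rank from most attractive (most negative) to least.
totals = ifie.sum(axis=1)
for res, energy in sorted(zip(mvh_residues, totals), key=lambda t: t[1]):
    print(f"{res}: {energy:+.1f} kcal/mol")
```

    In practice the same per-residue totals can be recomputed for mutated fragments, which is the spirit of the in silico substitution experiments described above.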

  20. Comparison of Mixing Characteristics for Several Fuel Injectors on an Open Plate and in a Ducted Flowpath Configuration at Hypervelocity Flow Conditions

    NASA Technical Reports Server (NTRS)

    Drozda, Tomasz G.; Shenoy, Rajiv R.; Passe, Bradley J.; Baurle, Robert A.; Drummond, J. Philip

    2017-01-01

    In order to reduce the cost and complexity associated with fuel injection and mixing experiments for high-speed flows, and to further enable optical access to the test section for nonintrusive diagnostics, the Enhanced Injection and Mixing Project (EIMP) utilizes an open flat-plate configuration to characterize the inert mixing properties of various fuel injectors for hypervelocity applications. The experiments also use reduced total-temperature conditions to alleviate the need for hardware cooling. The use of "cold" flows and non-reacting mixtures for mixing experiments is not new and has long served as a screening technique for scramjet fuel injectors. The impact of reduced facility-air total temperature, and of inert fuel simulants such as helium, on the mixing character of the flow was assessed in previous numerical studies by the authors. Mixing performance was characterized for three different injectors: a strut, a ramp, and a flushwall. The present study focuses on the impact of using an open plate to approximate mixing in the duct. Toward this end, Reynolds-averaged simulations (RAS) were performed for the three fuel injectors in an open-plate configuration and in a duct. The mixing parameters of interest, such as mixing efficiency and total pressure recovery, are then computed and compared for the two configurations. In addition, the combustion efficiency and thrust potential are computed for the reacting simulations.
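
    The mixing-efficiency metric referenced in this abstract can be illustrated with a short sketch. The definition below is one commonly used form in the scramjet-mixing literature (the fuel mass flux that could react at stoichiometric proportions divided by the total fuel mass flux), offered here as an assumed generic definition rather than the EIMP formulation; the array values are placeholders, not EIMP data.

```python
import numpy as np

def mixing_efficiency(alpha, rho, u, area, alpha_stoich):
    """Mixing efficiency over a crossflow plane of cell-centered data.

    alpha: fuel mass fraction per cell; rho, u: density and streamwise
    velocity; area: cell face area; alpha_stoich: stoichiometric fuel
    mass fraction of the fuel/air pair (roughly 0.0285 for H2/air).
    """
    alpha = np.asarray(alpha, dtype=float)
    flux = np.asarray(rho) * np.asarray(u) * np.asarray(area)
    # "Mixed" fuel: all of it where the mixture is fuel-lean, and only the
    # stoichiometrically supportable portion where it is fuel-rich.
    alpha_react = np.where(
        alpha <= alpha_stoich,
        alpha,
        alpha_stoich * (1.0 - alpha) / (1.0 - alpha_stoich),
    )
    return float((alpha_react * flux).sum() / (alpha * flux).sum())

# Tiny made-up plane of four cells with a light fuel simulant.
print(mixing_efficiency(alpha=[0.00, 0.01, 0.05, 0.20],
                        rho=[0.30, 0.28, 0.25, 0.22],
                        u=[2200.0, 2100.0, 1900.0, 1600.0],
                        area=[1e-4] * 4,
                        alpha_stoich=0.0285))
```

    Total pressure recovery would be evaluated on the same extraction planes, which is what allows a like-for-like comparison between the open-plate and ducted configurations.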
