The effect of introducing computers into an introductory physics problem-solving laboratory
NASA Astrophysics Data System (ADS)
McCullough, Laura Ellen
2000-10-01
Computers are appearing in every type of classroom across the country. Yet they often appear without the benefit of research into their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem-solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in the laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool designed for this study (VideoTool) to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid(TM) film). The curriculum was kept as similar as possible for the two groups. During the ten-week academic quarter, the groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. This perceptual difference interacted with gender, with the men in the control group more likely to discuss equipment difficulties than any other group. Overall, the differences between the control and quasi-experimental groups were minimal. It was concluded that carefully replacing traditional data collection and analysis tools with a computer tool had no negative effects on achievement, attitudes, or group behavior, and did not interact with gender.
Integrating Computational Science Tools into a Thermodynamics Course
NASA Astrophysics Data System (ADS)
Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew
2018-01-01
Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and model and simulate complex concepts. To prepare future engineers to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired with computer simulations to implement these modules has a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.
Effects of Attitudes and Behaviours on Learning Mathematics with Computer Tools
ERIC Educational Resources Information Center
Reed, Helen C.; Drijvers, Paul; Kirschner, Paul A.
2010-01-01
This mixed-methods study investigates the effects of student attitudes and behaviours on the outcomes of learning mathematics with computer tools. A computer tool was used to help students develop the mathematical concept of function. In the whole sample (N = 521), student attitudes could account for a 3.4 point difference in test scores between…
A Visual Tool for Computer Supported Learning: The Robot Motion Planning Example
ERIC Educational Resources Information Center
Elnagar, Ashraf; Lulu, Leena
2007-01-01
We introduce an effective computer-aided learning visual tool (CALVT) to teach graph-based applications. We present the robot motion planning problem as an example of such applications. The proposed tool can be used to simulate and/or implement practical systems in different areas of computer science such as graphics, computational…
Ma, Li; Runesha, H Birali; Dvorkin, Daniel; Garbe, John R; Da, Yang
2008-01-01
Background Genome-wide association studies (GWAS) using single nucleotide polymorphism (SNP) markers provide opportunities to detect epistatic SNPs associated with quantitative traits and to detect the exact mode of an epistasis effect. Computational difficulty is the main bottleneck for epistasis testing in large scale GWAS. Results The EPISNPmpi and EPISNP computer programs were developed for testing single-locus and epistatic SNP effects on quantitative traits in GWAS, including tests of three single-locus effects for each SNP (SNP genotypic effect, additive and dominance effects) and five epistasis effects for each pair of SNPs (two-locus interaction, additive × additive, additive × dominance, dominance × additive, and dominance × dominance) based on the extended Kempthorne model. EPISNPmpi is the parallel computing program for epistasis testing in large scale GWAS and achieved excellent scalability for large scale analysis and portability for various parallel computing platforms. EPISNP is the serial computing program based on the EPISNPmpi code for epistasis testing in small scale GWAS using commonly available operating systems and computer hardware. Three serial computing utility programs were developed for graphical viewing of test results and epistasis networks, and for estimating CPU time and disk space requirements. Conclusion The EPISNPmpi parallel computing program provides an effective computing tool for epistasis testing in large scale GWAS, and the epiSNP serial computing programs are convenient tools for epistasis analysis in small scale GWAS using commonly available computer hardware. PMID:18644146
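To make the kind of test described above concrete, the sketch below checks the joint additive/dominance epistasis effects of one SNP pair with a nested-model F-test. It is an illustrative simplification, not the extended Kempthorne model or the EPISNP code, and the simulated data and coefficients are invented.

```python
# Hypothetical sketch (not the EPISNP implementation): testing additive/dominance
# epistasis effects for one SNP pair with a nested-model F-test.
import numpy as np
from scipy import stats

def epistasis_ftest(g1, g2, y):
    """g1, g2: genotype counts (0/1/2) for two SNPs; y: quantitative trait."""
    a1, a2 = g1 - 1.0, g2 - 1.0                                  # additive codes in {-1, 0, +1}
    d1, d2 = (g1 == 1).astype(float), (g2 == 1).astype(float)    # dominance codes
    ones = np.ones_like(y)
    X_red = np.column_stack([ones, a1, d1, a2, d2])              # single-locus terms only
    X_full = np.column_stack([X_red, a1*a2, a1*d2, d1*a2, d1*d2])  # + 4 epistasis terms
    rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0]) ** 2)
    rss_red, rss_full = rss(X_red), rss(X_full)
    df1, df2 = 4, len(y) - X_full.shape[1]
    F = ((rss_red - rss_full) / df1) / (rss_full / df2)
    return F, stats.f.sf(F, df1, df2)

# Example with simulated data carrying an additive-by-additive effect
rng = np.random.default_rng(0)
g1, g2 = rng.integers(0, 3, 500), rng.integers(0, 3, 500)
y = 0.5 * (g1 - 1) * (g2 - 1) + rng.normal(size=500)
print(epistasis_ftest(g1, g2, y))
```

In a genome-wide scan this test would be repeated over all SNP pairs, which is exactly the computational bottleneck the parallel EPISNPmpi program addresses.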
ERIC Educational Resources Information Center
Shacham, Mordechai; Cutlip, Michael B.; Brauner, Neima
2009-01-01
A continuing challenge to the undergraduate chemical engineering curriculum is the time-effective incorporation and use of computer-based tools throughout the educational program. Computing skills in academia and industry require some proficiency in programming and effective use of software packages for solving 1) single-model, single-algorithm…
Tool Use of Experienced Learners in Computer-Based Learning Environments: Can Tools Be Beneficial?
ERIC Educational Resources Information Center
Juarez Collazo, Norma A.; Corradi, David; Elen, Jan; Clarebout, Geraldine
2014-01-01
Research has documented the use of tools in computer-based learning environments as problematic, that is, learners do not use the tools and when they do, they tend to do it suboptimally. This study attempts to disentangle cause and effect of this suboptimal tool use for experienced learners. More specifically, learner variables (metacognitive and…
Northwest Trajectory Analysis Capability: A Platform for Enhancing Computational Biophysics Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, Elena S.; Stephan, Eric G.; Corrigan, Abigail L.
2008-07-30
As computational resources continue to increase, the ability of computational simulations to effectively complement, and in some cases replace, experimentation in scientific exploration also increases. Today, large-scale simulations are recognized as an effective tool for scientific exploration in many disciplines, including chemistry and biology. A natural side effect of this trend has been the need for an increasingly complex analytical environment. In this paper, we describe the Northwest Trajectory Analysis Capability (NTRAC), an analytical software suite developed to enhance the efficiency of computational biophysics analyses. Our strategy is to layer higher-level services and introduce improved tools within the user's familiar environment without preventing researchers from using traditional tools and methods. Our desire is to share these experiences to serve as an example for effectively analyzing data-intensive, large-scale simulation data.
Development of computer-based analytical tool for assessing physical protection system
NASA Astrophysics Data System (ADS)
Mardhi, Alim; Pengvanich, Phongphaeth
2016-01-01
Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are available for immediate use; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.
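As a rough illustration of the network approach described above, the sketch below (not the authors' tool; the graph, node names, and detection probabilities are invented) finds the adversary path with the lowest cumulative detection probability by running a shortest-path search over edge weights of -log(1 - p). A full model would also account for delay times and response force arrival.

```python
# Illustrative sketch only: most-vulnerable adversary path by detection probability.
import math
import networkx as nx

G = nx.DiGraph()
# Hypothetical facility layout: each edge carries the detection probability of a barrier.
edges = [("offsite", "fence", 0.3), ("fence", "door", 0.6),
         ("fence", "roof", 0.2), ("door", "vault", 0.9), ("roof", "vault", 0.7)]
for u, v, p_detect in edges:
    # Minimizing sum(-log(1 - p)) minimizes 1 - prod(1 - p), the path detection probability.
    G.add_edge(u, v, weight=-math.log(1.0 - p_detect), p=p_detect)

path = nx.shortest_path(G, "offsite", "vault", weight="weight")
p_no_detect = math.prod(1.0 - G[u][v]["p"] for u, v in zip(path, path[1:]))
print(path, "P(detection) =", 1.0 - p_no_detect)
```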
ERIC Educational Resources Information Center
Chapman, Debra; Wang, Shuyan
2015-01-01
Multimedia instructional tools (MMIT) have been identified as a way to present instructional material effectively and economically. MMITs are commonly used in introductory computer applications courses because they should be effective in increasing student knowledge and positively impacting motivation and learning strategies, without increasing costs. This…
A meta-analysis of pedagogical tools used in introductory programming courses
NASA Astrophysics Data System (ADS)
Trees, Frances P.
Programming is recognized as being challenging for teachers to teach and difficult for students to learn. For decades, computer science educators have looked at innovative approaches by creating pedagogical software tools that attempt to facilitate both the teaching and the learning of programming. This dissertation investigates the motivations for the integration of pedagogical tools in introductory programming courses and the characteristics that are perceived to contribute to the effectiveness of these tools. The study employs three research stages that examine the tool characteristics and their use. The first stage surveys teachers who use pedagogical tools in an introductory programming course. The second interviews teachers to explore the survey results in more detail and to add greater depth to the choice and use of pedagogical tools in the introductory programming class. The third interviews tool developers to provide explanatory insight into each tool and the motivation for its creation. The results indicate that the pedagogical tools perceived to be effective share common characteristics: they provide an environment that is manageable, flexible, and visual; they provide for active engagement in learning activities and support programming in small pieces; they allow for an easy transition to subsequent courses and more robust environments; and they provide technical support and resource materials. The results of this study also indicate that recommendations from other computer science educators have a strong impact on a teacher's initial tool choice for an introductory programming course. This study informs present and future tool developers of the characteristics that teachers perceive to contribute to the effectiveness of a pedagogical tool, and of how to present their tools to encourage more efficient and effective widespread adoption into teachers' curricula. The teachers involved in this study are actively involved in the computer science education community. The results of this study, based on the perceptions of these computer science educators, provide guidance to educators choosing to introduce a new pedagogical tool into their programming course.
ERIC Educational Resources Information Center
Lamb, Richard L.
2016-01-01
Within the last 10 years, new tools for assisting in the teaching and learning of academic skills and content within the context of science have arisen. These new tools include multiple types of computer software and hardware to include (video) games. The purpose of this study was to examine and compare the effect of computer learning games in the…
Development and Evaluation of Computer-Based Laboratory Practical Learning Tool
ERIC Educational Resources Information Center
Gandole, Y. B.
2006-01-01
Effective evaluation of educational software is a key issue for the successful introduction of advanced tools into the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…
ERIC Educational Resources Information Center
O'Reilly, Daniel J.
2011-01-01
This study examined the capability of computer simulation as a tool for assessing the strategic competency of emergency department nurses as they responded to authentically computer simulated biohazard-exposed patient case studies. Thirty registered nurses from a large, urban hospital completed a series of computer-simulated case studies of…
Cost-effective cloud computing: a case study using the comparative genomics tool, Roundup.
Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P
2010-12-22
Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource-Roundup-using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
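The job-ordering idea described here can be illustrated with a small scheduling sketch. The runtime model, gene counts, and worker counts below are invented for illustration and are not taken from the Roundup study; the point is simply that estimating runtimes up front lets long comparisons be assigned first (a longest-processing-time heuristic), reducing idle, billed instance-hours.

```python
# Hypothetical sketch of the idea (not the Roundup pipeline): estimate per-comparison
# runtime from genome sizes, then pack the longest jobs onto workers first.
import heapq

def estimate_runtime_hours(n_genes_a, n_genes_b, k=1.5e-7):
    # Assumed toy model: runtime proportional to the product of gene counts.
    return k * n_genes_a * n_genes_b

def schedule_lpt(jobs, n_workers):
    """jobs: list of (job_id, est_hours). Returns per-worker assignments (LPT heuristic)."""
    workers = [(0.0, i, []) for i in range(n_workers)]   # (load, worker_id, assigned jobs)
    heapq.heapify(workers)
    for job_id, hours in sorted(jobs, key=lambda j: -j[1]):   # longest jobs first
        load, i, assigned = heapq.heappop(workers)            # least-loaded worker
        assigned.append(job_id)
        heapq.heappush(workers, (load + hours, i, assigned))
    return sorted(workers, key=lambda w: w[1])

jobs = [(f"cmp{i}", estimate_runtime_hours(5000 + 100 * i, 20000)) for i in range(10)]
for load, i, assigned in schedule_lpt(jobs, 3):
    print(f"worker {i}: {load:.2f} h -> {assigned}")
```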
ERIC Educational Resources Information Center
Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar
2014-01-01
The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…
ERIC Educational Resources Information Center
Li, Rui; Liu, Min
2007-01-01
The purpose of this study is to examine the potential of using computer databases as cognitive tools to share learners' cognitive load and facilitate learning in a multimedia problem-based learning (PBL) environment designed for sixth graders. Two research questions were: (a) can the computer database tool share sixth-graders' cognitive load? and…
ERIC Educational Resources Information Center
Sykes, Edward R.
2007-01-01
Student retention in Computer Science is becoming a serious concern among Educators in many colleges and universities. Most institutions currently face a significant drop in enrollment in Computer Science. A number of different tools and strategies have emerged to address this problem (e.g., BlueJ, Karel Robot, etc.). Although these tools help to…
Dynamic Analyses of Result Quality in Energy-Aware Approximate Programs
NASA Astrophysics Data System (ADS)
Ringenburg, Michael F.
Energy efficiency is a key concern in the design of modern computer systems. One promising approach to energy-efficient computation, approximate computing, trades off output precision for energy efficiency. However, this tradeoff can have unexpected effects on computation quality. This thesis presents dynamic analysis tools to study, debug, and monitor the quality and energy efficiency of approximate computations. We propose three styles of tools: prototyping tools that allow developers to experiment with approximation in their applications, online tools that instrument code to determine the key sources of error, and online tools that monitor the quality of deployed applications in real time. Our prototyping tool is based on an extension to the functional language OCaml. We add approximation constructs to the language, an approximation simulator to the runtime, and profiling and auto-tuning tools for studying and experimenting with energy-quality tradeoffs. We also present two online debugging tools and three online monitoring tools. The first online tool identifies correlations between output quality and the total number of executions of, and errors in, individual approximate operations. The second tracks the number of approximate operations that flow into a particular value. Our online tools comprise three low-cost approaches to dynamic quality monitoring. They are designed to monitor quality in deployed applications without spending more energy than is saved by approximation. Online monitors can be used to perform real time adjustments to energy usage in order to meet specific quality goals. We present prototype implementations of all of these tools and describe their usage with several applications. Our prototyping, profiling, and autotuning tools allow us to experiment with approximation strategies and identify new strategies, our online tools succeed in providing new insights into the effects of approximation on output quality, and our monitors succeed in controlling output quality while still maintaining significant energy efficiency gains.
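As a loose illustration of the online-monitoring idea (not code from the thesis; the kernel, sampling rate, and tolerance are invented), the sketch below occasionally re-runs an approximate computation precisely and compares the results, so output quality can be checked without paying the precise cost on every call.

```python
# Illustrative sketch: low-cost online quality monitoring for an approximate kernel
# by re-running a small random sample of inputs precisely and comparing.
import random

def precise_sum(xs):
    return sum(xs)

def approx_sum(xs, skip=4):
    # Toy "approximate" kernel: samples every skip-th element and rescales.
    return sum(xs[::skip]) * skip

def monitored_call(xs, sample_rate=0.05, tol=0.02):
    result = approx_sum(xs)
    if random.random() < sample_rate:             # occasionally pay for a precise check
        exact = precise_sum(xs)
        rel_err = abs(result - exact) / (abs(exact) + 1e-12)
        if rel_err > tol:
            return exact, rel_err                 # fall back / flag for re-tuning
        return result, rel_err
    return result, None

data = [random.random() for _ in range(10_000)]
print(monitored_call(data))
```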
Computer assisted surgery with 3D robot models and visualisation of the telesurgical action.
Rovetta, A
2000-01-01
This paper deals with the support that virtual reality and computing provide in surgical robotics procedures. Computer support gives a direct representation of the surgical theatre. Modelling the procedure as it unfolds provides psychological reassurance regarding safety and reliability. Robots similar to the ones used in the manufacturing industry can be used, with little modification, as very effective surgical tools. They have high precision and repeatability and are versatile in integrating with medical instrumentation. Integrated surgical rooms with computer- and robot-assisted intervention are now in operation. The computer serves as a decision-making aid, and the robot works as a very effective tool.
The Effect of a Computer-Based Cartooning Tool on Children's Cartoons and Written Stories
ERIC Educational Resources Information Center
Madden, M.; Chung, P. W. H.; Dawson, C. W.
2008-01-01
This paper reports a study assessing a new computer tool for cartoon storytelling, created by the authors for a target audience in the upper half of the English and Welsh Key Stage 2 (years 5 and 6, covering ages 9-11 years). The tool attempts to provide users with more opportunities for expressive visualisation than previous educational software;…
Computer tools for systems engineering at LaRC
NASA Technical Reports Server (NTRS)
Walters, J. Milam
1994-01-01
The Systems Engineering Office (SEO) has been established to provide life-cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools that could enhance the effectiveness and efficiency of activities directed towards this mission. A group of interrelated applications has been procured, or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper will review the current configuration of these tools and provide information on future milestones and directions.
NASA Astrophysics Data System (ADS)
Jain, A.
2017-08-01
Computer-based methods can help in the discovery of leads and can potentially eliminate the chemical synthesis and screening of many irrelevant compounds, thereby saving both time and cost. Molecular modeling systems are powerful tools for building, visualizing, analyzing, and storing models of complex molecular structures that can help interpret structure-activity relationships. The use of molecular mechanics and dynamics techniques and software in computer-aided drug design, along with statistical analysis, is a powerful tool for medicinal chemists to synthesize effective therapeutic drugs with minimal side effects.
Physics-based Entry, Descent and Landing Risk Model
NASA Technical Reports Server (NTRS)
Gee, Ken; Huynh, Loc C.; Manning, Ted
2014-01-01
A physics-based risk model was developed to assess the risk associated with thermal protection system failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool, and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris (MMOD) impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effect of material property uncertainty and MMOD damage on risk of failure was analyzed. A comparison of the direct computation and response surface approaches was undertaken.
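A minimal sketch of the Monte Carlo portion of such an analysis is shown below. It is not the NASA model: the thermal response is a toy surrogate rather than a one-dimensional thermal solver, and the distributions, temperature limit, and MMOD damage probability are invented; only the sample-and-count structure mirrors the approach described.

```python
# Hedged illustration: Monte Carlo estimate of TPS failure risk, where "failure" is
# the bondline temperature exceeding a limit.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000
T_LIMIT = 560.0                                    # assumed bondline temperature limit, K

heat_load = rng.normal(1.0, 0.08, N)               # aerothermal heating multiplier
conductivity = rng.normal(1.0, 0.05, N)            # TPS material property multiplier
mmod_damage = rng.random(N) < 0.01                 # assumed 1% chance of MMOD impact damage

# Toy surrogate for peak bondline temperature (placeholder for the 1-D thermal tool).
t_bondline = 500.0 * heat_load * conductivity + np.where(mmod_damage, 80.0, 0.0)

p_fail = np.mean(t_bondline > T_LIMIT)
print(f"Estimated P(bondline over-temperature) = {p_fail:.4f}")
```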
NASA Technical Reports Server (NTRS)
Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert
2005-01-01
Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.
Cost-Effective Cloud Computing: A Case Study Using the Comparative Genomics Tool, Roundup
Kudtarkar, Parul; DeLuca, Todd F.; Fusaro, Vincent A.; Tonellato, Peter J.; Wall, Dennis P.
2010-01-01
Background Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource—Roundup—using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Methods Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon’s Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. Results We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon’s computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure. PMID:21258651
Computation of Effect Size for Moderating Effects of Categorical Variables in Multiple Regression
ERIC Educational Resources Information Center
Aguinis, Herman; Pierce, Charles A.
2006-01-01
The computation and reporting of effect size estimates is becoming the norm in many journals in psychology and related disciplines. Despite the increased importance of effect sizes, researchers may not report them or may report inaccurate values because of a lack of appropriate computational tools. For instance, Pierce, Block, and Aguinis (2004)…
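To make the computation concrete, the sketch below shows one common way to obtain an effect size (Cohen's f-squared) for a moderating effect: fit moderated regression models with and without the interaction term and compare their R-squared values. The data, coding, and coefficients are simulated for illustration and are not drawn from the cited paper.

```python
# Assumed-standard formula, not the authors' program: Cohen's f^2 for the moderating
# effect of a categorical variable in moderated multiple regression.
import numpy as np

def r_squared(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)

def moderation_f2(x, group, y):
    """x: continuous predictor; group: 0/1 categorical moderator; y: outcome."""
    ones = np.ones_like(y)
    X_main = np.column_stack([ones, x, group])
    X_full = np.column_stack([ones, x, group, x * group])    # adds the interaction term
    r2_main, r2_full = r_squared(X_main, y), r_squared(X_full, y)
    return (r2_full - r2_main) / (1.0 - r2_full)             # Cohen's f^2 for the moderator

rng = np.random.default_rng(1)
x = rng.normal(size=300)
g = rng.integers(0, 2, 300).astype(float)
y = 0.5 * x + 0.3 * x * g + rng.normal(size=300)             # simulated moderation effect
print(moderation_f2(x, g, y))
```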
ERIC Educational Resources Information Center
Angeli, Charoula
2013-01-01
An investigation was carried out to examine the effects of cognitive style on learners' performance and interaction during complex problem solving with a computer modeling tool. One hundred and nineteen undergraduates volunteered to participate in the study. Participants were first administered a test, and based on their test scores they were…
ERIC Educational Resources Information Center
Chien, Tien-Chen
2008-01-01
Computer is not only a powerful technology for managing information and enhancing productivity, but also an efficient tool for education and training. Computer anxiety can be one of the major problems that affect the effectiveness of learning. Through analyzing related literature, this study describes the phenomenon of computer anxiety,…
Medical informatics--an Australian perspective.
Hannan, T
1991-06-01
Computers, like the X-ray and the stethoscope, can be seen as clinical tools that provide physicians with improved expertise in solving patient management problems. As tools they enable us to extend our clinical information base, and they also provide facilities that improve the delivery of the health care we provide. Automation (computerisation) in the health domain will cause the computer to become a more integral part of health care management and delivery before the start of the next century. To understand how the computer assists those who deliver and manage health care, it is important to be aware of its functional capabilities and how we can use them in medical practice. The rapid technological advances in computers over the last two decades have had both beneficial and counterproductive effects on the implementation of effective computer applications in the delivery of health care. For example, in the 1990s the computer hobbyist is able to make an investment of less than $10,000 on computer hardware that will match or exceed the technological capacities of machines of the 1960s. These rapid technological advances, which have produced a quantum leap in our ability to store and process information, have tended to make us overlook the need for effective computer programmes which will meet the needs of patient care. As the 1990s begin, those delivering health care (e.g., physicians, nurses, pharmacists, administrators ...) need to become more involved in directing the effective implementation of computer applications that will provide the tools for improved information management, knowledge processing, and ultimately better patient care.
Computer-Based Cognitive Tools in Teacher Training: The COG-TECH Projects
ERIC Educational Resources Information Center
Orhun, Emrah
2003-01-01
The COG-TECH (Cognitive Technologies for Problem Solving and Learning) Network conducted three international projects between 1994 and 2001 under the auspices of the European Commission. The main purpose of these projects was to train teacher educators in the Mediterranean countries to use computers as effective pedagogical tools. The summer…
Computer Technology Integration and Student Learning: Barriers and Promise
ERIC Educational Resources Information Center
Keengwe, Jared; Onchwari, Grace; Wachira, Patrick
2008-01-01
Political and institutional support has enabled many institutions of learning to spend millions of dollars to acquire educational computing tools (Ficklen and Muscara, "Am Educ" 25(3):22-29, 2001) that have not been effectively integrated into the curriculum. While access to educational technology tools has remarkably improved in most schools,…
Computational Analysis of Material Flow During Friction Stir Welding of AA5059 Aluminum Alloys
2011-01-01
The FSW tool material (AISI H13 tool steel) is modeled as an isotropic linear-elastic material, while the workpiece material (AA5059) is modeled with a strain-hardening term augmented to account for the effect of dynamic recrystallization. Within the analysis, the effects of key FSW process parameters are examined, including: (a) tool thread pitch (threads/m); (b) tool material = AISI H13 tool steel; (c) workpiece material = AA5059; (d) tool rotation speed = 500 rpm; (e) tool travel speed.
Automatic Generation of OpenMP Directives and Its Application to Computational Fluid Dynamics Codes
NASA Technical Reports Server (NTRS)
Yan, Jerry; Jin, Haoqiang; Frumkin, Michael; Yan, Jerry (Technical Monitor)
2000-01-01
The shared-memory programming model is a very effective way to achieve parallelism on shared-memory parallel computers. As great progress has been made in hardware and software technologies, the performance of parallel programs with compiler directives has demonstrated large improvements. The introduction of OpenMP directives, the industry standard for shared-memory programming, has minimized the issue of portability. In this study, we have extended CAPTools, a computer-aided parallelization toolkit, to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline techniques used in the implementation of the tool and discuss the application of this tool to the NAS Parallel Benchmarks and several computational fluid dynamics codes. This work demonstrates the great potential of using the tool to quickly port parallel programs and to achieve good performance that exceeds that of some commercial tools.
Development of computer-based analytical tool for assessing physical protection system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mardhi, Alim, E-mail: alim-m@batan.go.id; Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com (Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok, Thailand 10330)
Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are available for immediate use; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool can identify the most critical path and quantify the probability of effectiveness of the system as a performance measure.
Emerging Approach of Natural Language Processing in Opinion Mining: A Review
NASA Astrophysics Data System (ADS)
Kim, Tai-Hoon
Natural language processing (NLP) is a subfield of artificial intelligence and computational linguistics. It studies the problems of automated generation and understanding of natural human languages. This paper outlines a framework for using computer and natural language techniques to help learners at various levels learn foreign languages in a computer-based learning environment. We propose some ideas for using the computer as a practical tool for foreign language learning in which most of the courseware is generated automatically. We then describe how to build computer-based learning tools, discuss their effectiveness, and conclude with some possibilities for using online resources.
The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences.
Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker
2016-01-01
The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant's platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses.
NASA Astrophysics Data System (ADS)
Saighi, Ouafa; Salah Zerouala, Mohamed
2017-12-01
This paper deals with the way computer tools are used by students in their design studio projects. Four institutions of architecture education in Algeria are considered as a case study to evaluate the impact of such tools on the student design process. The aim is to examine this use in depth and to identify its advantages and shortcomings in order to suggest solutions. A field survey was undertaken on a sample of students and their teachers at the same institutions. The results mainly show that computer tools are used largely to improve the quality of drawing representation and images, seeking observers' satisfaction and hence influencing their decisions. Some teachers are not very keen to overuse the computer during the design phase; they prefer the "traditional" approach. This is the situation Algerian universities currently face, which leads to conflict and disagreement between students and teachers. Nonetheless, there is no doubt that computer tools have effectively contributed to improving the competitive level among students.
ERIC Educational Resources Information Center
Felix, Vanessa G.; Mena, Luis J.; Ostos, Rodolfo; Maestre, Gladys E.
2017-01-01
Despite the potential benefits that computer approaches could provide for children with cognitive disabilities, research and implementation of emerging approaches to learning supported by computing technology has not received adequate attention. We conducted a pilot study to assess the effectiveness of a computer-assisted learning tool, named…
New and revised fire effects tools for fire management
Robert E. Keane; Greg Dillon; Stacy Drury; Robin Innes; Penny Morgan; Duncan Lutes; Susan J. Prichard; Jane Smith; Eva Strand
2014-01-01
Announcing the release of new software packages for application in wildland fire science and management, two fields that are already fully saturated with computer technology, may seem a bit too much to many managers. However, there have been some recent releases of new computer programs and revisions of existing software and information tools that deserve mention...
ERIC Educational Resources Information Center
Okonta, Olomeruom
2010-01-01
Recent research studies in open and distance learning have focused on the differences between traditional learning versus online learning, the benefits of computer-mediated communication (CMC) tools in an e-learning environment, and the relationship between online discussion posts and students' achievement. In fact, there is an extant…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Calderer, Antoni; Yang, Xiaolei; Angelidis, Dionysios
2015-10-30
The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.
Computational methods in drug discovery
Leelananda, Sumudu P
2016-01-01
The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein–ligand docking, pharmacophore modeling and QSAR techniques are reviewed. PMID:28144341
Computational methods in drug discovery.
Leelananda, Sumudu P; Lindert, Steffen
2016-01-01
The process for drug discovery and development is challenging, time consuming and expensive. Computer-aided drug discovery (CADD) tools can act as a virtual shortcut, assisting in the expedition of this long process and potentially reducing the cost of research and development. Today CADD has become an effective and indispensable tool in therapeutic development. The human genome project has made available a substantial amount of sequence data that can be used in various drug discovery projects. Additionally, increasing knowledge of biological structures, as well as increasing computer power have made it possible to use computational methods effectively in various phases of the drug discovery and development pipeline. The importance of in silico tools is greater than ever before and has advanced pharmaceutical research. Here we present an overview of computational methods used in different facets of drug discovery and highlight some of the recent successes. In this review, both structure-based and ligand-based drug discovery methods are discussed. Advances in virtual high-throughput screening, protein structure prediction methods, protein-ligand docking, pharmacophore modeling and QSAR techniques are reviewed.
Development and Validation of a Standardized Tool for Prioritization of Information Sources.
Akwar, Holy; Kloeze, Harold; Mukhi, Shamir
2016-01-01
To validate the utility and effectiveness of a standardized tool for prioritization of information sources for early detection of diseases. The tool was developed with input from diverse public health experts garnered through a survey. Ten raters used the tool to evaluate ten information sources, and reliability among raters was computed. The SAS Proc Mixed procedure with a random-effect statement and SAS macros were used to compute multiple raters' Fleiss kappa agreement and Kendall's coefficient of concordance. The ten disparate information sources evaluated obtained the following composite scores: ProMed 91%; WAHID 90%; Eurosurv 87%; MediSys 85%; SciDaily 84%; EurekAl 83%; CSHB 78%; GermTrax 75%; Google 74%; and CBC 70%. A Fleiss kappa agreement of 50.7% was obtained for the ten information sources and 72.5% for a subset of five sources rated, which is substantial agreement, validating the utility and effectiveness of the tool. This study validated the utility and effectiveness of a standardized criteria tool developed to prioritize information sources. The new tool was used to identify five information sources suited for use by the KIWI system in the CEZD-IIR project to improve surveillance of infectious diseases. The tool can be generalized to situations where prioritization of numerous information sources is necessary.
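For readers unfamiliar with the agreement statistic used here, the sketch below computes Fleiss' kappa for multiple raters. It is a generic illustration (the rating matrix is invented), not the authors' SAS implementation.

```python
# Hedged sketch: Fleiss' kappa for agreement among multiple raters scoring
# information sources into categories.
import numpy as np

def fleiss_kappa(counts):
    """counts: (n_subjects, n_categories); counts[i, j] = raters giving subject i rating j."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum(axis=1)[0]                      # raters per subject (assumed constant)
    p_j = counts.sum(axis=0) / counts.sum()        # overall category proportions
    P_i = (np.sum(counts ** 2, axis=1) - n) / (n * (n - 1))   # per-subject agreement
    P_bar, P_e = P_i.mean(), np.sum(p_j ** 2)      # observed vs chance agreement
    return (P_bar - P_e) / (1.0 - P_e)

# Example: 5 information sources, 10 raters, 3 priority categories (low/medium/high).
ratings = np.array([[0, 2, 8],
                    [1, 3, 6],
                    [0, 1, 9],
                    [2, 5, 3],
                    [0, 3, 7]])
print(f"Fleiss' kappa = {fleiss_kappa(ratings):.3f}")
```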
The iPlant Collaborative: Cyberinfrastructure for Enabling Data to Discovery for the Life Sciences
Merchant, Nirav; Lyons, Eric; Goff, Stephen; Vaughn, Matthew; Ware, Doreen; Micklos, David; Antin, Parker
2016-01-01
The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning material, and best practice resources to help all researchers make the best use of their data, expand their computational skill set, and effectively manage their data and computation when working as distributed teams. iPlant’s platform permits researchers to easily deposit and share their data and deploy new computational tools and analysis workflows, allowing the broader community to easily use and reuse those data and computational analyses. PMID:26752627
Policy Analysis: A Tool for Setting District Computer Use Policy. Paper and Report Series No. 97.
ERIC Educational Resources Information Center
Gray, Peter J.
This report explores the use of policy analysis as a tool for setting computer use policy in a school district by discussing the steps in the policy formation and implementation processes and outlining how policy analysis methods can contribute to the creation of effective policy. Factors related to the adoption and implementation of innovations…
ERIC Educational Resources Information Center
Akpinar, Ercan
2014-01-01
This study investigates the effects of using interactive computer animations based on predict-observe-explain (POE) as a presentation tool on primary school students' understanding of the static electricity concepts. A quasi-experimental pre-test/post-test control group design was utilized in this study. The experiment group consisted of 30…
ERIC Educational Resources Information Center
Stanton, Michael; And Others
1985-01-01
Three reports on the effects of high technology on the nature of work include (1) Stanton on applications and implications of computer-aided design for engineers, drafters, and architects; (2) Nardone on the outlook and training of numerical-control machine tool operators; and (3) Austin and Drake on the future of clerical occupations in automated…
ERIC Educational Resources Information Center
Korfiatis, K.; Papatheodorou, E.; Paraskevopoulous, S.; Stamou, G. P.
1999-01-01
Describes a study of the effectiveness of computer-simulation programs in enhancing biology students' familiarity with ecological modeling and concepts. Finds that computer simulations improved student comprehension of ecological processes expressed in mathematical form, but did not allow a full understanding of ecological concepts. Contains 28…
Identifying the Factors that Influence Computer Use in the Early Childhood Classroom
ERIC Educational Resources Information Center
Edwards, Suzy
2005-01-01
Computers have become an increasingly accepted learning tool in the early childhood classroom. Despite initial concerns regarding the effect of computers on children's development, past research has indicated that computer use by young children can support their learning and developmental outcomes (Siraj-Blatchford & Whitebread, 2003; Yelland,…
Computing Linear Mathematical Models Of Aircraft
NASA Technical Reports Server (NTRS)
Duke, Eugene L.; Antoniewicz, Robert F.; Krambeer, Keith D.
1991-01-01
The Derivation and Definition of Linear Aircraft Model (LINEAR) computer program provides the user with a powerful, flexible, standard, documented, and verified software tool for linearization of mathematical models of aircraft aerodynamics. It is intended for use in software tools that drive linear stability analysis and control-law design for aircraft. LINEAR is capable of both extracting linearized engine effects, such as net thrust, torque, and gyroscopic effects, and including these effects in the linear model of the system. It is designed to provide easy selection of the state, control, and observation variables used in a particular model, and it also provides the flexibility of allowing alternate formulations of both state and observation equations. Written in FORTRAN.
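The core operation such a program performs, extracting a linear state-space model from a nonlinear model about a trim point, can be sketched with numerical differencing as below. This is an illustrative Python example, not the FORTRAN program itself, and the toy longitudinal model and its coefficients are invented.

```python
# Illustrative sketch: extract a linear model x' = A x + B u from a nonlinear model
# f(x, u) by central differences about a trim point (x0, u0).
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """f(x, u) -> xdot. Returns Jacobians A = df/dx and B = df/du at (x0, u0)."""
    n, m = len(x0), len(u0)
    A, B = np.zeros((n, n)), np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

# Toy longitudinal model: states [speed, pitch rate], control [elevator deflection].
def f(x, u):
    v, q = x
    return np.array([-0.02 * v - 9.81 * 0.05 * q + 0.1 * u[0],
                     -0.5 * q + 2.0 * u[0]])

A, B = linearize(f, np.array([70.0, 0.0]), np.array([0.0]))
print(A, B, sep="\n")
```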
Closing the Technological Gender Gap: Feminist Pedagogy in the Computer-Assisted Classroom.
ERIC Educational Resources Information Center
Hesse-Biber, Sharlene; Gilbert, Melissa Kesler
1994-01-01
Asserts that, although computers are playing an increasingly important role in the classroom, a technological gender gap serves as a barrier to the effective use of computers by women instructors in higher education. Encourages women to seize computer tools for their own educational purposes and argues for enhancing women's computer learning. (CFR)
The Effect of Computer Literacy Course on Students' Attitudes toward Computer Applications
ERIC Educational Resources Information Center
Erlich, Zippy; Gadot, Rivka; Shahak, Daphna
2009-01-01
Studies indicate that the use of technologies as teaching aids and tools for self-study is influenced by students' attitudes toward computers and their applications. The purpose of this study is to determine whether taking a Computer Literacy and Applications (CLA) course has an impact on students' attitudes toward computer applications, across…
Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas
2008-01-01
Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment within individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology like SOCR in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention.
Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas
2009-01-01
Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We have used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques. Our findings include marginal effects of the SOCR treatment within individual classes; however, pooling the results across all courses and sections, SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology like SOCR in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention. PMID:19750185
Towards a cyberinfrastructure for the biological sciences: progress, visions and challenges.
Stein, Lincoln D
2008-09-01
Biology is an information-driven science. Large-scale data sets from genomics, physiology, population genetics and imaging are driving research at a dizzying rate. Simultaneously, interdisciplinary collaborations among experimental biologists, theorists, statisticians and computer scientists have become the key to making effective use of these data sets. However, too many biologists have trouble accessing and using these electronic data sets and tools effectively. A 'cyberinfrastructure' is a combination of databases, network protocols and computational services that brings people, information and computational tools together to perform science in this information-driven world. This article reviews the components of a biological cyberinfrastructure, discusses current and pending implementations, and notes the many challenges that lie ahead.
Multidisciplinary Shape Optimization of a Composite Blended Wing Body Aircraft
NASA Astrophysics Data System (ADS)
Boozer, Charles Maxwell
A multidisciplinary shape optimization tool coupling aerodynamics, structure, and performance was developed for battery-powered aircraft. Utilizing high-fidelity computational fluid dynamics analysis tools and a structural wing weight tool, coupled using the multidisciplinary feasible optimization architecture, the aircraft geometry is modified to optimize the aircraft's range or endurance. The developed tool is applied to three geometries: a hybrid blended-wing-body delta-wing UAS, the ONERA M6 wing, and a modified ONERA M6 wing. First, the optimization problem is presented with the objective function, constraints, and design vector. Next, the tool's architecture and the analysis tools that are utilized are described. Finally, various optimizations are described and their results analyzed for all test subjects. Results show that less computationally expensive inviscid optimizations yield positive performance improvements using planform, airfoil, and three-dimensional degrees of freedom. From the results obtained through a series of optimizations, it is concluded that the newly developed tool is both effective at improving performance and serves as a platform ready to receive additional performance modules, further improving its computational design support potential.
Examining the Feasibility and Effect of Transitioning GED Tests to Computer
ERIC Educational Resources Information Center
Higgins, Jennifer; Patterson, Margaret Becker; Bozman, Martha; Katz, Michael
2010-01-01
This study examined the feasibility of administering GED Tests using a computer based testing system with embedded accessibility tools and the impact on test scores and test-taker experience when GED Tests are transitioned from paper to computer. Nineteen test centers across five states successfully installed the computer based testing program,…
Organizational/Memory Tools: A Technique for Improving Problem Solving Skills.
ERIC Educational Resources Information Center
Steinberg, Esther R.; And Others
1986-01-01
This study was conducted to determine whether students would use a computer-presented organizational/memory tool as an aid in problem solving, and whether and how locus of control would affect tool use and problem-solving performance. Learners did use the tools, which were most effective in the learner control with feedback condition. (MBR)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; Roser, Robert; LeCompte, Tom
2015-10-29
Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.
Assistive Software Tools for Secondary-Level Students with Literacy Difficulties
ERIC Educational Resources Information Center
Lange, Alissa A.; McPhillips, Martin; Mulhern, Gerry; Wylie, Judith
2006-01-01
The present study assessed the compensatory effectiveness of four assistive software tools (speech synthesis, spellchecker, homophone tool, and dictionary) on literacy. Secondary-level students (N = 93) with reading difficulties completed computer-based tests of literacy skills. Training on their respective software followed for those assigned to…
Chameleon in the Classroom: Developing Roles for Computers. Symposium. Technical Report No. 22.
ERIC Educational Resources Information Center
Sheingold, Karen; And Others
This symposium includes the following papers: "Software for the Classroom: Issues in the Design of Effective Software Tools" (D. Midian Kurland); "Computers for Composing" (Janet H. Kane); "LOGO Programming and Problem Solving" (Roy D. Pea); "The Computer as Sandcastle" (Jeanne Bamberger); "Learning…
Programming the Navier-Stokes computer: An abstract machine model and a visual editor
NASA Technical Reports Server (NTRS)
Middleton, David; Crockett, Tom; Tomboulian, Sherry
1988-01-01
The Navier-Stokes computer is a parallel computer designed to solve Computational Fluid Dynamics problems. Each processor contains several floating point units which can be configured under program control to implement a vector pipeline with several inputs and outputs. Since the development of an effective compiler for this computer appears to be very difficult, machine level programming seems necessary and support tools for this process have been studied. These support tools are organized into a graphical program editor. A programming process is described by which appropriate computations may be efficiently implemented on the Navier-Stokes computer. The graphical editor would support this programming process, verifying various programmer choices for correctness and deducing values such as pipeline delays and network configurations. Step by step details are provided and demonstrated with two example programs.
A computational continuum model of poroelastic beds
Zampogna, G. A.
2017-01-01
Despite the ubiquity of fluid flows interacting with porous and elastic materials, we lack a validated non-empirical macroscale method for characterizing the flow over and through a poroelastic medium. We propose a computational tool to describe such configurations by deriving and validating a continuum model for the poroelastic bed and its interface with the free fluid above. We show that, using a stress continuity condition and a slip velocity condition at the interface, the effective model captures the effects of small changes in the microstructure anisotropy correctly and predicts the overall behaviour in a physically consistent and controllable manner. Moreover, we show that the effective model is accurate by validating it against fully resolved microscopic simulations. The proposed computational tool can be used in investigations in a wide range of fields, including mechanical engineering, bio-engineering and geophysics. PMID:28413355
Dinov, Ivo D; Rubin, Daniel; Lorensen, William; Dugan, Jonathan; Ma, Jeff; Murphy, Shawn; Kirschner, Beth; Bug, William; Sherman, Michael; Floratos, Aris; Kennedy, David; Jagadish, H V; Schmidt, Jeanette; Athey, Brian; Califano, Andrea; Musen, Mark; Altman, Russ; Kikinis, Ron; Kohane, Isaac; Delp, Scott; Parker, D Stott; Toga, Arthur W
2008-05-28
The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long-term resource management. We demonstrate several applications of iTools as a framework for integrated bioinformatics. iTools and the complete details about its specifications, usage and interfaces are available at the iTools web page http://iTools.ccb.ucla.edu.
ERIC Educational Resources Information Center
Joyner, Amy
2003-01-01
Handheld computers provide students with tremendous computing and learning power at about a tenth of the cost of a regular computer. Describes the evolution of handhelds; provides some examples of their uses; and cites research indicating they are effective classroom tools that can improve efficiency and instruction. A sidebar lists handheld resources.…
ERIC Educational Resources Information Center
Dikli, Semire
2006-01-01
The impacts of computers on writing have been widely studied for three decades. Even basic computer functions, i.e. word processing, have been of great assistance to writers in modifying their essays. The research on Automated Essay Scoring (AES) has revealed that computers have the capacity to function as a more effective cognitive tool (Attali,…
GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data
NASA Astrophysics Data System (ADS)
Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.
2016-12-01
Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing on, and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that deal with other domains involving spatial properties. We tested the performance of the platform based on taxi trajectory analysis. Results suggest that GISpark achieves excellent run-time performance in spatiotemporal big data applications.
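As a minimal sketch of the Spark-style processing the abstract describes, the snippet below bins taxi GPS points into a coarse grid and counts points per cell using ordinary RDD operations. The input path, record layout (lon,lat per line), and grid size are assumptions; GISpark's actual APIs (SuperMap GIScript integration, etc.) are not shown here.

```python
# Minimal PySpark sketch: bin (lon, lat) points into ~0.01-degree grid cells
# and count points per cell. File path and CSV layout are hypothetical.
from pyspark import SparkContext

sc = SparkContext(appName="trajectory-grid-count")

def to_cell(line):
    lon, lat = map(float, line.split(",")[:2])
    return (round(lon, 2), round(lat, 2)), 1      # coarse grid cell key

counts = (sc.textFile("hdfs:///data/taxi_points.csv")   # hypothetical path
            .map(to_cell)
            .reduceByKey(lambda a, b: a + b))

# Print the ten busiest cells.
for cell, n in counts.takeOrdered(10, key=lambda kv: -kv[1]):
    print(cell, n)

sc.stop()
```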
WASTE REDUCTION USING COMPUTER-AIDED DESIGN TOOLS
Growing environmental concerns have spurred considerable interest in pollution prevention. In most instances, pollution prevention involves introducing radical changes to the design of processes so that waste generation is minimized.
Process simulators can be effective tools i...
ERIC Educational Resources Information Center
Rao, J. Durga Prasad; Singh, Raksha
2011-01-01
The study was conducted to determine the effectiveness of Information and Communication Technology tools, viz. DLP (Distance Learning Projector) and Computer/Laptop, in comparison with selected instructional media for teaching primary and secondary school pupils. It examined the effect of grade on the performance of the pupils taught with four…
Chapter 13: Tools for analysis
William Elliot; Kevin Hyde; Lee MacDonald; James McKean
2007-01-01
This chapter presents a synthesis of current computer modeling tools that are, or could be, adopted for use in evaluating the cumulative watershed effects of fuel management. The chapter focuses on runoff, soil erosion and slope stability predictive tools. Readers should refer to chapters on soil erosion and stability for more detailed information on the physical...
Visualization and Interaction in Research, Teaching, and Scientific Communication
NASA Astrophysics Data System (ADS)
Ammon, C. J.
2017-12-01
Modern computing provides many tools for exploring observations, numerical calculations, and theoretical relationships. The number of options is, in fact, almost overwhelming. But the choices provide those with modest programming skills opportunities to create unique views of scientific information and to develop deeper insights into their data, their computations, and the underlying theoretical data-model relationships. I present simple examples of using animation and human-computer interaction to explore scientific data and scientific-analysis approaches. I illustrate how even a little programming ability can free scientists from the constraints of existing tools and can facilitate the development of a deeper appreciation of data and models. I present examples from a suite of programming languages ranging from C to JavaScript, including the Wolfram Language. JavaScript is valuable for sharing tools and insight (hopefully) with others because it is integrated into one of the most powerful communication tools in human history, the web browser. Although too much of that power is often spent on distracting advertisements, the underlying computation and graphics engines are efficient, flexible, and almost universally available in desktop and mobile computing platforms. Many are working to fulfill the browser's potential to become the most effective tool for interactive study. Open-source frameworks for visualizing everything from algorithms to data are available, but they advance rapidly. One strategy for dealing with swiftly changing tools is to adopt common, open data formats that are easily adapted (often by framework or tool developers). I illustrate the use of animation and interaction in research and teaching with examples from earthquake seismology.
Effectiveness of a Case-Based Computer Program on Students' Ethical Decision Making.
Park, Eun-Jun; Park, Mihyun
2015-11-01
The aim of this study was to test the effectiveness of a case-based computer program, using an integrative ethical decision-making model, on the ethical decision-making competency of nursing students in South Korea. This study used a pre- and posttest comparison design. Students in the intervention group used a computer program for case analysis assignments, whereas students in the standard group used a traditional paper assignment for case analysis. The findings showed that using the case-based computer program as a complementary tool for the ethics courses offered at the university enhanced students' ethical preparedness and satisfaction with the course. On the basis of the findings, it is recommended that nurse educators use a case-based computer program as a complementary self-study tool in ethics courses to supplement student learning without an increase in course hours, particularly in terms of analyzing ethics cases with dilemma scenarios and exercising ethical decision making. Copyright 2015, SLACK Incorporated.
Investigating the Effectiveness of Computer Simulations for Chemistry Learning
ERIC Educational Resources Information Center
Plass, Jan L.; Milne, Catherine; Homer, Bruce D.; Schwartz, Ruth N.; Hayward, Elizabeth O.; Jordan, Trace; Verkuilen, Jay; Ng, Florrie; Wang, Yan; Barrientos, Juan
2012-01-01
Are well-designed computer simulations an effective tool to support student understanding of complex concepts in chemistry when integrated into high school science classrooms? We investigated scaling up the use of a sequence of simulations of kinetic molecular theory and associated topics of diffusion, gas laws, and phase change, which we designed…
NASA Technical Reports Server (NTRS)
Perkins, Hugh Douglas
2010-01-01
In order to improve the understanding of particle vitiation effects in hypersonic propulsion test facilities, a quasi-one dimensional numerical tool was developed to efficiently model reacting particle-gas flows over a wide range of conditions. Features of this code include gas-phase finite-rate kinetics, a global porous-particle combustion model, mass, momentum and energy interactions between phases, and subsonic and supersonic particle drag and heat transfer models. The basic capabilities of this tool were validated against available data or other validated codes. To demonstrate the capabilities of the code a series of computations were performed for a model hypersonic propulsion test facility and scramjet. Parameters studied were simulated flight Mach number, particle size, particle mass fraction and particle material.
A simple and inexpensive method of preoperative computer imaging for rhinoplasty.
Ewart, Christopher J; Leonard, Christopher J; Harper, J Garrett; Yu, Jack
2006-01-01
GOALS/PURPOSE: Despite concerns of legal liability, preoperative computer imaging has become a popular tool for the plastic surgeon. The ability to project possible surgical outcomes can facilitate communication between the patient and surgeon. It can be an effective tool in the education and training of residents. Unfortunately, these imaging programs are expensive and have a steep learning curve. The purpose of this paper is to present a relatively inexpensive method of preoperative computer imaging with a reasonable learning curve. The price of currently available imaging programs was acquired through an online search, and inquiries were made to the software distributors. Their prices were compared to Adobe PhotoShop, which has special filters called "liquify" and "photocopy." It was used in the preoperative computer planning of 2 patients who presented for rhinoplasty at our institution. Projected images were created based on harmonious discussions between the patient and physician. Importantly, these images were presented to the patient as potential results, with no guarantees as to actual outcomes. Adobe PhotoShop can be purchased for 900-5800 dollars less than the leading computer imaging software for cosmetic rhinoplasty. Effective projected images were created using the "liquify" and "photocopy" filters in PhotoShop. Both patients had surgical planning and operations based on these images. They were satisfied with the results. Preoperative computer imaging can be a very effective tool for the plastic surgeon by providing improved physician-patient communication, increased patient confidence, and enhanced surgical planning. Adobe PhotoShop is a relatively inexpensive program that can provide these benefits using only 1 or 2 features.
Advanced techniques in reliability model representation and solution
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.; Nicol, David M.
1992-01-01
The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.
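The Markov-reliability idea behind tools like SURE/ASSURE can be illustrated with a tiny continuous-time Markov model. The three-state structure, failure rate, and coverage value below are invented for illustration only; the actual flight-control-system models are far larger and semi-Markov.

```python
# Tiny continuous-time Markov reliability model: a duplex system with states
# {both units up, one unit up, system failed}. Rates are illustrative.
import numpy as np
from scipy.linalg import expm

lam = 1e-4      # per-hour failure rate of one unit (assumed)
c   = 0.999     # coverage: probability a failure is detected and handled (assumed)

# Generator matrix Q over states [both up, one up, system failed]; rows sum to 0.
Q = np.array([
    [-2 * lam,  2 * lam * c,  2 * lam * (1 - c)],
    [0.0,      -lam,          lam              ],
    [0.0,       0.0,          0.0              ],   # absorbing failure state
])

p0 = np.array([1.0, 0.0, 0.0])      # start with both units working
t = 10.0                            # mission time, hours
p_t = p0 @ expm(Q * t)              # transient state probabilities p(t) = p(0) e^{Qt}

print(f"unreliability at t = {t} h: {p_t[2]:.3e}")
```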
A Computer-Aided Writing Program for Learning Disabled Adolescents.
ERIC Educational Resources Information Center
Fais, Laurie; Wanderman, Richard
The paper describes the application of a computer-assisted writing program in a special high school for learning disabled and dyslexic students and reports on a study of the program's effectiveness. Particular advantages of the Macintosh Computer for such a program are identified including use of the mouse pointing tool, graphic icons to identify…
Chimera grids in the simulation of three-dimensional flowfields in turbine-blade-coolant passages
NASA Technical Reports Server (NTRS)
Stephens, M. A.; Rimlinger, M. J.; Shih, T. I.-P.; Civinskas, K. C.
1993-01-01
When computing flows inside geometrically complex turbine-blade coolant passages, the structure of the grid system used can significantly affect the overall time and cost required to obtain solutions. This paper addresses this issue while evaluating and developing computational tools for the design and analysis of coolant passages, and is divided into two parts. In the first part, the various types of structured and unstructured grids are compared in relation to their ability to provide solutions in a timely and cost-effective manner. This comparison shows that overlapping structured grids, known as Chimera grids, can rival and in some instances exceed the cost-effectiveness of unstructured grids in terms of both the man hours needed to generate grids and the amount of computer memory and CPU time needed to obtain solutions. In the second part, a computational tool utilizing Chimera grids was used to compute the flow and heat transfer in two different turbine-blade coolant passages that contain baffles and numerous pin fins. These computations showed the versatility and flexibility offered by Chimera grids.
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1998-05-01
Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental resolution methodologies in the form of computational modeling, noninvasive optical techniques, and fringe prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate the viability of the approach as an effective engineering tool for analysis and optimization.
Computer-Generated Movies for Mission Planning
NASA Technical Reports Server (NTRS)
Roberts, P. H., Jr.; vanDillen, S. L.
1973-01-01
Computer-generated movies help the viewer to understand mission dynamics and get quantitative details. Sample movie frames demonstrate the uses and effectiveness of movies in mission planning. Tools needed for movie-making include computer programs to generate images on film and film processing to give the desired result. Planning scenes to make an effective product requires some thought and experience. Viewpoints and timing are particularly important. Lessons learned so far and problems still encountered are discussed.
Memorization Effects of Pronunciation and Stroke Order Animation in Digital Flashcards
ERIC Educational Resources Information Center
Zhu, Yu; Fung, Andy S. L.; Wang, Hongyan
2012-01-01
Digital flashcards are one of the most popular self-study computer-assisted vocabulary learning tools for beginners of Chinese as a foreign language. However, studies on the effects of this widely used learning tool are scarce. Introducing a new concept--referential stimulus--into the Dual Coding Theory (DCT) framework, this study acknowledges the…
The Effects of the Coordination Support on Shared Mental Models and Coordinated Action
ERIC Educational Resources Information Center
Kim, Hyunsong; Kim, Dongsik
2008-01-01
The purpose of this study was to examine the effects of coordination support (tool support and tutor support) on the development of shared mental models (SMMs) and coordinated action in a computer-supported collaborative learning environment. Eighteen students were randomly assigned to one of three conditions, including the tool condition, the…
Using Tracker as a Pedagogical Tool for Understanding Projectile Motion
ERIC Educational Resources Information Center
Wee, Loo Kang; Chew, Charles; Goh, Giam Hwee; Tan, Samuel; Lee, Tat Leong
2012-01-01
This article reports on the use of Tracker as a pedagogical tool in the effective learning and teaching of projectile motion in physics. When a computer model building learning process is supported and driven by video analysis data, this free Open Source Physics tool can provide opportunities for students to engage in active enquiry-based…
ERIC Educational Resources Information Center
Shamsudin, Sarimah; Nesi, Hilary
2006-01-01
This paper will describe an ESP approach to the design and implementation of computer-mediated communication (CMC) tasks for computer science students at Universiti Teknologi Malaysia, and discuss the effectiveness of the chat feature of Windows NetMeeting as a tool for developing specified language skills. CMC tasks were set within a programme of…
The challenge of big data in public health: an opportunity for visual analytics.
Ola, Oluwakemi; Sedig, Kamran
2014-01-01
Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.
ERIC Educational Resources Information Center
Windschitl, Mark; Andre, Thomas
1998-01-01
Investigates the effects of a constructivist versus objectivist learning environment on college students' conceptual change using a computer simulation of the human cardiovascular system as an instructional tool. Contains 33 references. (DDR)
FuelCalc: A Method for Estimating Fuel Characteristics
Elizabeth Reinhardt; Duncan Lutes; Joe Scott
2006-01-01
This paper describes the FuelCalc computer program. FuelCalc is a tool to compute surface and canopy fuel loads and characteristics from inventory data, to support fuel treatment decisions by simulating effects of a wide range of silvicultural treatments on surface fuels and canopy fuels, and to provide linkages to stand visualization, fire behavior and fire effects...
ERIC Educational Resources Information Center
Sherblom, John C.
2010-01-01
There is a "prevalence of computer-mediated communication (CMC) in education," and a concern for its negative psychosocial consequences and lack of effectiveness as an instructional tool. This essay identifies five variables in the CMC research literature and shows their moderating effect on the psychosocial, instructional expevrience of the CMC…
ERIC Educational Resources Information Center
Yan, Yaw-liang
2010-01-01
Computer technology has been applied widely as an educational tool in second language learning for a long time. There have been many studies discussing the application of computer technology to different aspects in second language learning. However, the learning effect of both de-contextualized multimedia software and sound gloss on second…
The Effectiveness of Computer-Mediated Communication on SLA: A Meta-Analysis and Research Synthesis
ERIC Educational Resources Information Center
Lin, Huifen
2012-01-01
Over the past two decades, a large body of research has been conducted on the effectiveness of computer-mediated communication (CMC) employed as either standalone or instructional tools in SLA classrooms. Findings from this large body of work, however, are not conclusive, making it important to identify factors that would inform its successful…
AnnotCompute: annotation-based exploration and meta-analysis of genomics experiments
Zheng, Jie; Stoyanovich, Julia; Manduchi, Elisabetta; Liu, Junmin; Stoeckert, Christian J.
2011-01-01
The ever-increasing scale of biological data sets, particularly those arising in the context of high-throughput technologies, requires the development of rich data exploration tools. In this article, we present AnnotCompute, an information discovery platform for repositories of functional genomics experiments such as ArrayExpress. Our system leverages semantic annotations of functional genomics experiments with controlled vocabulary and ontology terms, such as those from the MGED Ontology, to compute conceptual dissimilarities between pairs of experiments. These dissimilarities are then used to support two types of exploratory analysis—clustering and query-by-example. We show that our proposed dissimilarity measures correspond to a user's intuition about conceptual dissimilarity, and can be used to support effective query-by-example. We also evaluate the quality of clustering based on these measures. While AnnotCompute can support a richer data exploration experience, its effectiveness is limited in some cases, due to the quality of available annotations. Nonetheless, tools such as AnnotCompute may provide an incentive for richer annotations of experiments. Code is available for download at http://www.cbil.upenn.edu/downloads/AnnotCompute. Database URL: http://www.cbil.upenn.edu/annotCompute/ PMID:22190598
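The abstract does not spell out its dissimilarity measure here, so the sketch below uses a plain Jaccard distance over sets of annotation terms merely to illustrate the idea of annotation-based dissimilarity and query-by-example ranking; the experiment names and ontology terms are invented.

```python
# Illustrative annotation-based dissimilarity (Jaccard distance) and a simple
# query-by-example ranking. Experiment names and terms are invented.
def jaccard_distance(a: set, b: set) -> float:
    """1 - |A ∩ B| / |A ∪ B|; 0 means identical annotation sets."""
    union = a | b
    return 1.0 - (len(a & b) / len(union)) if union else 0.0

experiments = {
    "expt_A": {"organism:human", "assay:microarray", "disease:leukemia"},
    "expt_B": {"organism:human", "assay:RNA-seq", "disease:leukemia"},
    "expt_C": {"organism:mouse", "assay:microarray", "tissue:liver"},
}

query = experiments["expt_A"]   # query-by-example: rank the others by dissimilarity
ranked = sorted(
    ((name, jaccard_distance(query, terms))
     for name, terms in experiments.items() if terms is not query),
    key=lambda pair: pair[1],
)
for name, d in ranked:
    print(f"{name}: dissimilarity = {d:.2f}")
```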
Decision tools in health care: focus on the problem, not the solution.
Liu, Joseph; Wyatt, Jeremy C; Altman, Douglas G
2006-01-20
Systematic reviews or randomised controlled trials usually help to establish the effectiveness of drugs and other health technologies, but are rarely sufficient by themselves to ensure actual clinical use of the technology. The process from innovation to routine clinical use is complex. Numerous computerised decision support systems (DSS) have been developed, but many fail to be taken up into actual use. Some developers construct technologically advanced systems with little relevance to the real world. Others did not determine whether a clinical need exists. With the NHS investing 5 billion pounds sterling in computer systems, and similar investments occurring in other countries, there is an urgent need to shift from a technology-driven approach to one that identifies and employs the most cost-effective method to manage knowledge, regardless of the technology. The generic term 'decision tool' (DT) is therefore suggested to demonstrate that these aids, which seem different technically, are conceptually the same from a clinical viewpoint. Many computerised DSSs failed for various reasons: for example, they were not based on the best available knowledge; there was insufficient emphasis on their need for high quality clinical data; their development was technology-led; or evaluation methods were misapplied. We argue that DSSs and other computer-based, paper-based and even mechanical decision aids are members of a wider family of decision tools. A DT is an active knowledge resource that uses patient data to generate case-specific advice, which supports decision making about individual patients by health professionals, the patients themselves or others concerned about them. The identification of DTs as a consistent and important category of health technology should encourage the sharing of lessons between DT developers and users and reduce the frequency of decision tool projects focusing only on technology. The focus of evaluation should become more clinical, with the impact of computer-based DTs being evaluated against other computer, paper or mechanical tools, to identify the most cost-effective tool for each clinical problem. We suggest the generic term 'decision tool' to demonstrate that decision-making aids, such as computerised DSSs, paper algorithms, and reminders, are conceptually the same, so the methods used to evaluate them should be the same.
Memory management in genome-wide association studies
2009-01-01
Genome-wide association is a powerful tool for the identification of genes that underlie common diseases. Genome-wide association studies generate billions of genotypes and pose significant computational challenges for most users including limited computer memory. We applied a recently developed memory management tool to two analyses of North American Rheumatoid Arthritis Consortium studies and measured the performance in terms of central processing unit and memory usage. We conclude that our memory management approach is simple, efficient, and effective for genome-wide association studies. PMID:20018047
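A minimal sketch of the memory-management idea (keeping the genotype matrix on disk and streaming it in chunks rather than loading it all into RAM) is shown below. The file names, matrix shape, and the use of a simple per-SNP correlation test are assumptions for illustration; the consortium analyses used dedicated association-testing tools.

```python
# Sketch: stream a large genotype matrix from disk with numpy.memmap and run a
# crude per-SNP association scan in chunks. Shapes and paths are hypothetical.
import numpy as np
from scipy import stats

n_samples, n_snps = 2000, 500_000
genotypes = np.memmap("genotypes.int8", dtype=np.int8, mode="r",
                      shape=(n_samples, n_snps))     # 0/1/2 allele counts
phenotype = np.load("phenotype.npy")                 # length n_samples

chunk = 10_000
results = []
for start in range(0, n_snps, chunk):
    block = np.asarray(genotypes[:, start:start + chunk], dtype=np.float64)
    for j in range(block.shape[1]):
        r, p = stats.pearsonr(block[:, j], phenotype)   # simple additive test
        results.append((start + j, p))

top = sorted(results, key=lambda item: item[1])[:10]
print("ten smallest p-values:", top)
```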
Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas
2017-01-01
Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948
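As background for the kind of Monte Carlo workflow the abstract refers to, the sketch below runs a plain (well-mixed, non-spatial) Gillespie stochastic simulation of a birth-death reaction. It deliberately does not use the PyURDME or MOLNs APIs; the rates and species are invented for illustration.

```python
# Plain Gillespie SSA for a birth-death process: 0 -> A (rate k1) and
# A -> 0 (rate k2*A). Well-mixed and non-spatial, unlike PyURDME models.
import numpy as np

def gillespie_birth_death(k1=10.0, k2=0.1, a0=0, t_end=100.0, seed=1):
    rng = np.random.default_rng(seed)
    t, a = 0.0, a0
    times, counts = [t], [a]
    while t < t_end:
        rates = np.array([k1, k2 * a])         # propensities of the two reactions
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1.0 / total)      # waiting time to the next event
        if rng.random() < rates[0] / total:    # choose which reaction fires
            a += 1
        else:
            a -= 1
        times.append(t)
        counts.append(a)
    return np.array(times), np.array(counts)

times, counts = gillespie_birth_death()
print(f"final copy number ~ {counts[-1]} (steady-state mean k1/k2 = 100)")
```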
ERIC Educational Resources Information Center
Grimley, Michael; Green, Richard; Nilsen, Trond; Thompson, David
2012-01-01
Computer games are purported to be effective instructional tools that enhance motivation and improve engagement. The aim of this study was to investigate how tertiary student experiences change when instruction was computer game based compared to lecture based, and whether experiences differed between high and low achieving students. Participants…
Computational Analysis of Material Flow During Friction Stir Welding of AA5059 Aluminum Alloys
NASA Astrophysics Data System (ADS)
Grujicic, M.; Arakere, G.; Pandurangan, B.; Ochterbeck, J. M.; Yen, C.-F.; Cheeseman, B. A.; Reynolds, A. P.; Sutton, M. A.
2012-09-01
Workpiece material flow and stirring/mixing during the friction stir welding (FSW) process are investigated computationally. Within the numerical model of the FSW process, the FSW tool is treated as a Lagrangian component while the workpiece material is treated as an Eulerian component. The employed coupled Eulerian/Lagrangian computational analysis of the welding process was of a two-way thermo-mechanical character (i.e., frictional-sliding/plastic-work dissipation is taken to act as a heat source in the thermal-energy balance equation), while temperature is allowed to affect mechanical aspects of the model through temperature-dependent material properties. The workpiece material (AA5059, a solid-solution strengthened and strain-hardened aluminum alloy) is represented using a modified version of the classical Johnson-Cook model (within which the strain-hardening term is augmented to take into account the effect of dynamic recrystallization), while the FSW tool material (AISI H13 tool steel) is modeled as an isotropic linear-elastic material. Within the analysis, the effects of some of the key FSW process parameters are investigated (e.g., weld pitch, tool tilt-angle, and tool pin-size). The results pertaining to the material flow during FSW are compared with their experimental counterparts. It is found that, for the most part, experimentally observed material-flow characteristics are reproduced within the current FSW-process model.
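For reference, the classical Johnson-Cook flow-stress relation that the abstract says was modified is commonly written as below, where A, B, n, C, and m are material constants and the reference strain rate and temperatures normalize the rate and thermal terms; the exact form of the dynamic-recrystallization modification used in the paper is not reproduced here.

```latex
% Classical Johnson-Cook flow stress (the paper augments the strain-hardening
% term to account for dynamic recrystallization; that modified form is not shown).
\sigma = \left( A + B\,\varepsilon_p^{\,n} \right)
         \left( 1 + C \ln \frac{\dot{\varepsilon}_p}{\dot{\varepsilon}_0} \right)
         \left( 1 - T^{*\,m} \right),
\qquad
T^{*} = \frac{T - T_{\mathrm{ref}}}{T_{\mathrm{melt}} - T_{\mathrm{ref}}}
```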
NASA Astrophysics Data System (ADS)
Madaras, Gary S.
2002-05-01
The use of computer modeling as a marketing, diagnosis, design, and research tool in the practice of acoustical consulting is discussed. From the time it is obtained, the software can be used as an effective marketing tool. It is not until the software basics are learned and some amount of testing and verification occurs that the software can be used as a tool for diagnosing the acoustics of existing rooms. A greater understanding of the output types and formats as well as experience in interpreting the results is required before the software can be used as an efficient design tool. Lastly, it is only after repetitive use as a design tool that the software can be used as a cost-effective means of conducting research in practice. The discussion is supplemented with specific examples of actual projects provided by various consultants within multiple firms. Focus is placed on the use of CATT-Acoustic software and predicting the room acoustics of large performing arts halls as well as other public assembly spaces.
FIA BioSum: a tool to evaluate financial costs, opportunities and effectiveness of fuel treatments.
Jeremy Fried; Glenn Christensen
2004-01-01
FIA BioSum, a tool developed by the USDA Forest Service's Forest Inventory and Analysis (FIA) Program, generates reliable cost estimates, identifies opportunities, and evaluates the effectiveness of fuel treatments in forested landscapes. BioSum is an analytic framework that integrates a suite of widely used computer models with a foundation of attribute-rich,...
Interactive Computer-Based Testing.
ERIC Educational Resources Information Center
Franklin, Stephen; Marasco, Joseph
1977-01-01
Discusses the use of Interactive Computer-Based Testing (ICBT) in university-level science courses as an effective and economical educational tool. The authors discuss: (1) major objections to ICBT; (2) advantages and pitfalls of the student use of ICBT; and (3) future prospects of ICBT. (HM)
Reading the Writing on the Graffiti Wall: The World Wide Web and Training.
ERIC Educational Resources Information Center
Jones, Charles M.
This paper examines the benefits to be derived from networked computer-based instruction (CBI) and discusses the potential of the World Wide Web (WWW) as an effective tool in employee training. Methods of utilizing the WWW as a training tool and communication tool are explored. The discussion is divided into the following sections: (1) "WWW and…
ERIC Educational Resources Information Center
Micro-Ideas, Glenview, IL.
Fifty-five papers focusing on the role of computer technology in education at all levels are included in the proceedings of this conference, which was designed to model effective and appropriate uses of the computer as an extension of the teacher-based instructional system. The use of the computer as a tool was emphasized, and the word processor…
Computers in medical education 1: evaluation of a problem-orientated learning package.
Devitt, P; Palmer, E
1998-04-01
A computer-based learning package has been developed, aimed at expanding students' knowledge base, as well as improving data-handling abilities and clinical problem-solving skills. The program was evaluated by monitoring its use by students, canvassing users' opinions and measuring its effectiveness as a learning tool compared to tutorials on the same material. Evaluation was undertaken using three methods: initially, by a questionnaire on computers as a learning tool and the applicability of the content; second, through monitoring by the computer of student use, decisions and performance; finally, through pre- and post-test assessment of fifth-year students who either used a computer package or attended a tutorial on equivalent material. Most students provided positive comments on the learning material and expressed a willingness to see computer-aided learning (CAL) introduced into the curriculum. Over a 3-month period, 26 modules in the program were used on 1246 occasions. Objective measurement showed a significant gain in knowledge, data handling and problem-solving skills. Computer-aided learning is a valuable learning resource that deserves better attention in medical education. When used appropriately, the computer can be an effective learning resource, not only for the delivery of knowledge, but also to help students develop their problem-solving skills.
2011-01-01
Background Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years in the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages on the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including: compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side. Conclusions The user-friendly graphical interface of the GLEaMviz tool, along with its high level of detail and the realism of its embedded modeling approach, opens up the platform to simulate realistic epidemic scenarios. These features make the GLEaMviz computational tool a convenient teaching/training tool as well as a first step toward the development of a computational tool aimed at facilitating the use and exploitation of computational models for the policy making and scenario analysis of infectious disease outbreaks. PMID:21288355
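The compartmental-model idea at the core of the simulation engine can be illustrated with a minimal deterministic SIR system. GLEaM itself is a stochastic metapopulation model driven by demographic and mobility data, so the sketch below is only the single-population skeleton, with invented parameters.

```python
# Minimal single-population SIR model; GLEaM couples many such compartmental
# models through mobility data and is stochastic. Parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma = 0.3, 0.1          # transmission and recovery rates (per day)
N = 1_000_000                   # population size

def sir(t, y):
    s, i, r = y
    new_infections = beta * s * i / N
    return [-new_infections, new_infections - gamma * i, gamma * i]

sol = solve_ivp(sir, (0, 180), [N - 10, 10, 0], t_eval=np.linspace(0, 180, 181))
peak_day = sol.t[np.argmax(sol.y[1])]
print(f"R0 = {beta/gamma:.1f}, epidemic peak around day {peak_day:.0f}")
```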
Thermomechanical conditions and stresses on the friction stir welding tool
NASA Astrophysics Data System (ADS)
Atthipalli, Gowtam
Friction stir welding has been commercially used as a joining process for aluminum and other soft materials. However, the use of this process in joining hard alloys is still developing, primarily because of the lack of cost-effective, long-lasting tools. Here I have developed numerical models to understand the thermomechanical conditions experienced by the FSW tool and to improve its reusability. A heat transfer and visco-plastic flow model is used to calculate the torque and traverse force on the tool during FSW. The computed values of torque and traverse force are validated using the experimental results for FSW of AA7075, AA2524, AA6061 and Ti-6Al-4V alloys. The computed torque components are used to determine the optimum tool shoulder diameter based on the maximum use of torque and maximum grip of the tool on the plasticized workpiece material. The estimates of the optimum tool shoulder diameter for FSW of AA6061 and AA7075 were verified with experimental results. The computed values of traverse force and torque are used to calculate the maximum shear stress on the tool pin and thereby determine the load-bearing ability of the tool pin. The load-bearing ability calculations are used to explain the failure of an H13 steel tool during welding of AA7075 and of a commercially pure tungsten tool during welding of L80 steel. Artificial neural network (ANN) models are developed to predict the important FSW output parameters as functions of selected input parameters. These ANNs take tool shoulder radius, pin radius, pin length, welding velocity, tool rotational speed and axial pressure as input parameters. The total torque, sliding torque, sticking torque, peak temperature, traverse force, maximum shear stress and bending stress are the outputs of the ANN models. These output parameters are selected since they define the thermomechanical conditions around the tool during FSW. The developed ANN models are used to understand the effect of various input parameters on the total torque and traverse force during FSW of AA7075 and 1018 mild steel. The ANN models are also used to determine the tool safety factor for a wide range of input parameters. A numerical model is developed to calculate the strain and strain rates along streamlines during FSW. The strain and strain rate values are calculated for FSW of AA2524. Three simplified models are also developed for quick estimation of output parameters such as the material velocity field, torque and peak temperature. The material velocity fields are computed by adopting an analytical method for calculating velocities in the flow of an incompressible fluid between two discs, where one is rotating and the other is stationary. The peak temperature is estimated based on a non-dimensional correlation with dimensionless heat input. The dimensionless heat input is computed using known welding parameters and material properties. The torque is computed using an analytical function based on the shear strength of the workpiece material. These simplified models are shown to predict these output parameters successfully.
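The ANN-surrogate idea (mapping process parameters to torque and other outputs) can be sketched with scikit-learn as below. The training data here are synthetic placeholders generated from a made-up formula, not the welding model's results, and the network size and parameter ranges are arbitrary assumptions.

```python
# Sketch of an ANN surrogate mapping FSW process parameters to total torque.
# The "data" are synthetic, produced by an invented formula only for illustration.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 500
# Inputs: shoulder radius [mm], pin radius [mm], pin length [mm],
#         welding speed [mm/s], rotational speed [rpm], axial pressure [MPa]
X = np.column_stack([
    rng.uniform(6, 13, n), rng.uniform(2, 5, n), rng.uniform(3, 7, n),
    rng.uniform(1, 6, n), rng.uniform(300, 1500, n), rng.uniform(10, 80, n),
])
# Invented torque response with a little noise (stands in for model outputs).
torque = (0.02 * X[:, 0]**2 + 0.5 * X[:, 1] + 0.1 * X[:, 2]
          - 0.002 * X[:, 4] + 0.05 * X[:, 5] + rng.normal(0, 0.2, n))

surrogate = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16, 16),
                                       max_iter=5000, random_state=0))
surrogate.fit(X, torque)
print("surrogate torque prediction for one trial design:", surrogate.predict(X[:1]))
```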
The importance of employing computational resources for the automation of drug discovery.
Rosales-Hernández, Martha Cecilia; Correa-Basurto, José
2015-03-01
The application of computational tools to drug discovery helps researchers to design and evaluate new drugs swiftly with reduced economic resources. To discover new potential drugs, computational chemistry incorporates automation for obtaining biological data such as absorption, distribution, metabolism, excretion and toxicity (ADMET), as well as drug mechanisms of action. This editorial looks at examples of these computational tools, including docking, molecular dynamics simulation, virtual screening, quantum chemistry, quantitative structure-activity relationships, principal component analysis and drug screening workflow systems. The authors then provide their perspectives on the importance of these techniques for drug discovery. Computational tools help researchers to design and discover new drugs for the treatment of several human diseases without side effects, thus allowing for the evaluation of millions of compounds with a reduced cost in both time and economic resources. The problem is that operating each program is difficult; one is required to use several programs and understand each of the properties being tested. In the future, it is possible that a single computer and software program will be capable of evaluating the complete properties (mechanisms of action and ADMET properties) of ligands. It is also possible that after submitting one target, this computer software will be capable of suggesting potential compounds along with ways to synthesize them, and presenting biological models for testing.
Creating a Minnesota Statewide SNAP-Ed Program Evaluation
ERIC Educational Resources Information Center
Gold, Abby; Barno, Trina Adler; Sherman, Shelley; Lovett, Kathleen; Hurtado, G. Ali
2013-01-01
Systematic evaluation is an essential tool for understanding program effectiveness. This article describes the pilot test of a statewide evaluation tool for the Supplemental Nutrition Assistance Program-Education (SNAP-Ed). A computer algorithm helped Community Nutrition Educators (CNEs) build surveys specific to their varied educational settings…
Microcomputer-Based Intelligent Tutoring Systems: An Assessment.
ERIC Educational Resources Information Center
Schaffer, John William
Computer-assisted instruction, while familiar to most teachers, has failed to become an effective self-motivating instructional tool. Developments in artificial intelligence, however, have provided new and better tools for exploring human knowledge acquisition and utilization. Expert system technology represents one of the most promising of these…
TOOLS FOR PRESENTING SPATIAL AND TEMPORAL PATTERNS OF ENVIRONMENTAL MONITORING DATA
The EPA Health Effects Research Laboratory has developed this data presentation tool for use with a variety of types of data which may contain spatial and temporal patterns of interest. The technology links mainframe computing power to the new generation of "desktop publishing" ha...
ERIC Educational Resources Information Center
Technology & Learning, 2005
2005-01-01
The employment of administrative tools for effective management is evaluated in this article. Using an administrative tool like a Palm[R] handheld computer and appropriate software can make the difference. For anyone running a single school or an entire district, managing data is the key to maintaining an efficient organization. Many…
High-Performance Data Analysis Tools for Sun-Earth Connection Missions
NASA Technical Reports Server (NTRS)
Messmer, Peter
2011-01-01
The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms requires access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters as well as graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists now to solve demanding data analysis problems in IDL that previously required specialized software, and it allows them to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution time for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters. The products developed in this project have the potential to interact, so one can build a cluster of PCs, each equipped with a GPU, and use mpiDL to communicate between the nodes and GPULib to accelerate the computations on each node.
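The tools themselves are IDL libraries; as a language-agnostic illustration of the task-farm pattern that TaskDL and mpiDL support, here is a minimal Python/mpi4py sketch. This is not the project's code, and the "frames" and the analysis step are placeholders.

```python
# Python/mpi4py analogue of the task-farm idea behind TaskDL/mpiDL (the actual tools
# are IDL-based; this only illustrates the pattern). Rank 0 scatters independent work
# items, every rank processes its share, and results are gathered back on rank 0.
# Run with, for example:  mpiexec -n 4 python task_farm.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

if rank == 0:
    frames = np.arange(40, dtype=float)        # 40 independent "frames" to process
    chunks = np.array_split(frames, size)      # one chunk per rank
else:
    chunks = None

my_frames = comm.scatter(chunks, root=0)       # distribute the work
my_results = np.sqrt(my_frames) * 2.0          # stand-in for the real analysis
all_results = comm.gather(my_results, root=0)  # collect results on rank 0

if rank == 0:
    print(np.concatenate(all_results))
```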
Controlled English for Effective Communication during Coalition Operations
2013-06-01
Linguistic variations and cultural differences often create unexpected challenges for effective communication and thus for Command and Control (C2...CE), and CE-based tools to improve cross-linguistic/cross-cultural communication. We will discuss various types of linguistic variations and cultural...human-computer interaction, reasoning, and explanation. CE and CE-based tools can play an important role in facilitating cross-linguistic and cross
System capacity and economic modeling computer tool for satellite mobile communications systems
NASA Technical Reports Server (NTRS)
Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.
1988-01-01
A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain if a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 123 are used for the model in order to provide as universal an application as possible such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.
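The original spreadsheet model is not reproduced here; as a rough sketch of the kind of link calculation the engineering portion performs, the following computes a single downlink carrier-to-noise-density ratio. All values and names are illustrative assumptions, not figures from the paper.

```python
# Generic satellite downlink budget of the kind performed by the engineering portion
# of such a model (the original was a Lotus 1-2-3 spreadsheet; all numbers here are
# placeholders, not values from the paper).
import math

def free_space_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss: 20*log10(4*pi*d*f/c)."""
    c = 3.0e8
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

eirp_dbw = 20.0           # satellite EIRP toward the mobile terminal
gt_dbk = -12.0            # terminal G/T in dB/K
freq_hz = 1.6e9           # L-band carrier
slant_range_m = 3.8e7     # GEO slant range in metres
misc_losses_db = 2.0      # pointing, polarization, atmosphere
boltzmann_dbw = -228.6    # 10*log10(Boltzmann constant)

cn0_dbhz = (eirp_dbw + gt_dbk - free_space_loss_db(slant_range_m, freq_hz)
            - misc_losses_db - boltzmann_dbw)
print(f"C/N0 = {cn0_dbhz:.1f} dB-Hz")
```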
Computational Fluid Dynamics at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Kutler, Paul
1994-01-01
Computational fluid dynamics (CFD) is beginning to play a major role in the aircraft industry of the United States because of the realization that CFD can be a new and effective design tool and thus could provide a company with a competitive advantage. It is also playing a significant role in research institutions, both governmental and academic, as a tool for researching new fluid physics, as well as supplementing and complementing experimental testing. In this presentation, some of the progress made to date in CFD at NASA Ames will be reviewed. The presentation addresses the status of CFD in terms of methods, examples of CFD solutions, and computer technology. In addition, the role CFD will play in supporting the revolutionary goals set forth by the Aeronautical Policy Review Committee established by the Office of Science and Technology Policy is noted. The need for validated CFD tools is also briefly discussed.
1986-10-31
(Reference card given to participants: Cognoter Reference: Select = LeftButton, Menu = MiddleButton; TitleBar menu for tool operations, Item menu for item...) ...collaborative tools and their uses, the Colab system and the Cognoter presentation tool were implemented and used for both real and posed idea organization...tasks. To test the system design and its effect on structured problem-solving, many early Colab/Cognoter meetings were monitored and a series of
Rich Language Analysis for Counterterrorism
NASA Astrophysics Data System (ADS)
Guidère, Mathieu; Howard, Newton; Argamon, Shlomo
Accurate and relevant intelligence is critical for effective counterterrorism. Too much irrelevant information is as bad or worse than not enough information. Modern computational tools promise to provide better search and summarization capabilities to help analysts filter and select relevant and key information. However, to do this task effectively, such tools must have access to levels of meaning beyond the literal. Terrorists operating in context-rich cultures like fundamentalist Islam use messages with multiple levels of interpretation, which are easily misunderstood by non-insiders. This chapter discusses several kinds of such “encryption” used by terrorists and insurgents in the Arabic language, and how knowledge of such methods can be used to enhance computational text analysis techniques for use in counterterrorism.
Ambient Assisted Living spaces validation by services and devices simulation.
Fernández-Llatas, Carlos; Mocholí, Juan Bautista; Sala, Pilar; Naranjo, Juan Carlos; Pileggi, Salvatore F; Guillén, Sergio; Traver, Vicente
2011-01-01
The design of Ambient Assisted Living (AAL) products is a very demanding challenge. AAL product creation is a complex iterative process which must satisfy exhaustive prerequisites regarding accessibility and usability. In this process the early detection of errors is crucial for creating cost-effective systems. Computer-assisted tools can provide vital help to usability designers in avoiding design errors. Specifically, computer simulation of products in AAL environments can be used in all the design phases to support validation. In this paper, a computer simulation tool for supporting usability designers in the creation of innovative AAL products is presented. This application will benefit their work by saving time and improving the functionality of the final system.
Factors influencing exemplary science teachers' levels of computer use
NASA Astrophysics Data System (ADS)
Hakverdi, Meral
This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a review of the relevant literature, certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer-related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy was related to the exemplary science teachers' level of computer use, suggesting that computer use depends on perceived ability to use computers. The teachers' use of computer-related applications/tools during class and their personal self-efficacy, age, and gender were highly related to their level of knowledge/skills in using specific computer applications for science instruction. The teachers' level of knowledge/skills in using specific computer applications for science instruction and their gender were related to their use of computer-related applications/tools during class and to their students' use of computer-related applications/tools in or for their science class. In conclusion, exemplary science teachers need assistance in learning and using computer-related applications/tools in their science classes.
Investigating the Effectiveness of Classroom Diagnostic Tools
ERIC Educational Resources Information Center
Schultz, Robert K.
2012-01-01
The primary purposes of the study are to investigate what teachers experience while using the Classroom Diagnostic Tools (CDT) and to relate those experiences to the rate of growth in students' mathematics achievement. The CDT contains three components: an online computer adaptive diagnostic test, interactive web-based student reports, and…
Kearney, N; Kidd, L; Miller, M; Sage, M; Khorrami, J; McGee, M; Cassidy, J; Niven, K; Gray, P
2006-07-01
Recent changes in cancer service provision mean that many patients spend a limited time in hospital and therefore experience and must cope with and manage treatment-related side effects at home. Information technology can provide innovative solutions in promoting patient care through information provision, enhancing communication, monitoring treatment-related side effects and promoting self-care. The aim of this feasibility study was to evaluate the acceptability of using handheld computers as a symptom assessment and management tool for patients receiving chemotherapy for cancer. A convenience sample of patients (n = 18) and health professionals (n = 9) at one Scottish cancer centre was recruited. Patients used the handheld computer to record and send daily symptom reports to the cancer centre and receive instant, tailored symptom management advice during two treatment cycles. Both patients' and health professionals' perceptions of the handheld computer system were evaluated at baseline and at the end of the project. Patients believed the handheld computer had improved their symptom management and felt comfortable in using it. The health professionals also found the handheld computer to be helpful in assessing and managing patients' symptoms. This project suggests that a handheld-computer-based symptom management tool is feasible and acceptable to both patients and health professionals in complementing the care of patients receiving chemotherapy.
Hasegawa, Tomoyuki; Kojima, Haruna; Masu, Chisato; Fukushima, Yasuhiro; Kojima, Hironori; Konokawa, Kiminori; Isobe, Tomonori; Sato, Eisuke; Murayama, Hideo; Maruyama, Koichi; Umeda, Tokuo
2010-01-01
Physics-related subjects are important in the educational fields of radiological physics and technology. However, conventional teaching tools, for example texts, equations, and two-dimensional figures, are not very effective in attracting the interest of students. Therefore, we have created several multimedia educational materials covering radiological physics and technology. Each educational presentation includes several segments of high-quality computer-graphic animations designed to attract students' interest. We used personal computers (PCs) and commercial software to create and compile these. Undergraduate and graduate students and teachers and related professionals contributed to the design and creation of the educational materials as part of student research. The educational materials can be displayed on a PC monitor and manipulated with popular free software. Opinion surveys conducted in undergraduate courses at Kitasato University support the effectiveness of our educational tools in helping students gain a better understanding of the subjects offered and in raising their interest.
Role of Statistical Random-Effects Linear Models in Personalized Medicine.
Diaz, Francisco J; Yeh, Hung-Wen; de Leon, Jose
2012-03-01
Some empirical studies and recent developments in pharmacokinetic theory suggest that statistical random-effects linear models are valuable tools that allow describing simultaneously patient populations as a whole and patients as individuals. This remarkable characteristic indicates that these models may be useful in the development of personalized medicine, which aims at finding treatment regimes that are appropriate for particular patients, not just appropriate for the average patient. In fact, published developments show that random-effects linear models may provide a solid theoretical framework for drug dosage individualization in chronic diseases. In particular, individualized dosages computed with these models by means of an empirical Bayesian approach may produce better results than dosages computed with some methods routinely used in therapeutic drug monitoring. This is further supported by published empirical and theoretical findings that show that random effects linear models may provide accurate representations of phase III and IV steady-state pharmacokinetic data, and may be useful for dosage computations. These models have applications in the design of clinical algorithms for drug dosage individualization in chronic diseases; in the computation of dose correction factors; computation of the minimum number of blood samples from a patient that are necessary for calculating an optimal individualized drug dosage in therapeutic drug monitoring; measure of the clinical importance of clinical, demographic, environmental or genetic covariates; study of drug-drug interactions in clinical settings; the implementation of computational tools for web-site-based evidence farming; design of pharmacogenomic studies; and in the development of a pharmacological theory of dosage individualization.
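As a minimal, hedged illustration of the class of model described above (simulated data, not a clinical algorithm from the paper), the following fits a random-intercept linear model of drug concentration on dose with statsmodels; the per-patient random-effect estimates it produces are the empirical Bayesian quantities that dosage-individualization schemes build on.

```python
# Minimal sketch of a random-effects linear model fitted with statsmodels.
# The repeated-measures data are simulated; a real application would use a
# patient's therapeutic drug monitoring measurements.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_patients, n_obs = 30, 5

patient = np.repeat(np.arange(n_patients), n_obs)
dose = rng.uniform(100, 400, size=n_patients * n_obs)         # mg/day
patient_effect = rng.normal(0, 5, size=n_patients)[patient]   # between-patient variation
conc = 2.0 + 0.05 * dose + patient_effect + rng.normal(0, 2, size=dose.size)

df = pd.DataFrame({"patient": patient, "dose": dose, "conc": conc})

# Random intercept per patient; the empirical Bayes estimates of the random effects
# are what dosage-individualization schemes build on.
model = smf.mixedlm("conc ~ dose", df, groups=df["patient"]).fit()
print(model.summary())
print(list(model.random_effects.values())[0])  # first patient's deviation from the population
```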
TethysCluster: A comprehensive approach for harnessing cloud resources for hydrologic modeling
NASA Astrophysics Data System (ADS)
Nelson, J.; Jones, N.; Ames, D. P.
2015-12-01
Advances in water resources modeling are improving the information that can be supplied to support decisions affecting the safety and sustainability of society. However, as water resources models become more sophisticated and data-intensive they require more computational power to run. Purchasing and maintaining the computing facilities needed to support certain modeling tasks has been cost-prohibitive for many organizations. With the advent of the cloud, the computing resources needed to address this challenge are now available and cost-effective, yet there still remains a significant technical barrier to leverage these resources. This barrier inhibits many decision makers and even trained engineers from taking advantage of the best science and tools available. Here we present the Python tools TethysCluster and CondorPy, that have been developed to lower the barrier to model computation in the cloud by providing (1) programmatic access to dynamically scalable computing resources, (2) a batch scheduling system to queue and dispatch the jobs to the computing resources, (3) data management for job inputs and outputs, and (4) the ability to dynamically create, submit, and monitor computing jobs. These Python tools leverage the open source, computing-resource management, and job management software, HTCondor, to offer a flexible and scalable distributed-computing environment. While TethysCluster and CondorPy can be used independently to provision computing resources and perform large modeling tasks, they have also been integrated into Tethys Platform, a development platform for water resources web apps, to enable computing support for modeling workflows and decision-support systems deployed as web apps.
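The snippet below is not the TethysCluster or CondorPy API; it is only a bare-bones sketch of the pattern those tools automate, handing a model run to HTCondor by writing a submit description and invoking condor_submit. The executable name and input file are placeholders.

```python
# Not the TethysCluster/CondorPy API: a bare-bones sketch of the underlying pattern
# those tools automate, i.e. describing a model run for HTCondor and queuing it.
# Requires a configured HTCondor pool; paths and names are placeholders.
import subprocess
from pathlib import Path

submit_description = """\
executable   = run_model.sh
arguments    = watershed_042.json
output       = job.out
error        = job.err
log          = job.log
request_cpus = 4
queue
"""

Path("model_job.sub").write_text(submit_description)

# Hand the job to the HTCondor scheduler.
subprocess.run(["condor_submit", "model_job.sub"], check=True)
```

Tools like CondorPy wrap this workflow programmatically, which is what makes it possible to create, submit, and monitor many such jobs from a web application.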
Corrias, A.; Jie, X.; Romero, L.; Bishop, M. J.; Bernabeu, M.; Pueyo, E.; Rodriguez, B.
2010-01-01
In this paper, we illustrate how advanced computational modelling and simulation can be used to investigate drug-induced effects on cardiac electrophysiology and on specific biomarkers of pro-arrhythmic risk. To do so, we first perform a thorough literature review of proposed arrhythmic risk biomarkers from the ionic to the electrocardiogram levels. The review highlights the variety of proposed biomarkers, the complexity of the mechanisms of drug-induced pro-arrhythmia and the existence of significant animal species differences in drug-induced effects on cardiac electrophysiology. Predicting drug-induced pro-arrhythmic risk solely using experiments is challenging both preclinically and clinically, as attested by the rise in the cost of releasing new compounds to the market. Computational modelling and simulation has significantly contributed to the understanding of cardiac electrophysiology and arrhythmias over the last 40 years. In the second part of this paper, we illustrate how state-of-the-art open source computational modelling and simulation tools can be used to simulate multi-scale effects of drug-induced ion channel block in ventricular electrophysiology at the cellular, tissue and whole ventricular levels for different animal species. We believe that the use of computational modelling and simulation in combination with experimental techniques could be a powerful tool for the assessment of drug safety pharmacology. PMID:20478918
ERIC Educational Resources Information Center
Olson, Gary A.
2007-01-01
Many professors, staff members, and even administrators see campus computers and e-mail accounts as their own private property--a type of employment benefit provided with no constraints on use. The fact is, universities "assign" computer equipment to personnel as tools to help them perform their jobs more effectively and efficiently, in the same…
NASA Tech Briefs, November/December 1986, Special Edition
NASA Technical Reports Server (NTRS)
1986-01-01
Topics: Computing: The View from NASA Headquarters; Earth Resources Laboratory Applications Software: Versatile Tool for Data Analysis; The Hypercube: Cost-Effective Supercomputing; Artificial Intelligence: Rendezvous with NASA; NASA's Ada Connection; COSMIC: NASA's Software Treasurehouse; Golden Oldies: Tried and True NASA Software; Computer Technical Briefs; NASA TU Services; Digital Fly-by-Wire.
Working Together: Computers and People with Mobility Impairments.
ERIC Educational Resources Information Center
Washington Univ., Seattle.
This brief paper describes several computing tools that have been effectively used by individuals with mobility impairments. Emphasis is on tasks to be completed and how the individual's abilities (not disabilities), with possible assistance from technology, can be used to accomplish them. Preliminary information addresses the importance of…
Study of high altitude plume impingement
NASA Technical Reports Server (NTRS)
Wojciechowski, C. J.; Penny, M. M.; Prozan, R. J.; Seymour, D.; Greenwood, T. F.
1972-01-01
Computer program has been developed as analytical tool to predict severity of effects of exhaust of rocket engines on adjacent spacecraft surfaces. Program computes forces, moments, pressures, and heating rates on surfaces immersed in or subjected to exhaust plume environments. Predictions will be useful in design of systems where such problems are anticipated.
Detached-Eddy Simulations of Separated Flow Around Wings With Ice Accretions: Year One Report
NASA Technical Reports Server (NTRS)
Choo, Yung K. (Technical Monitor); Thompson, David; Mogili, Prasad
2004-01-01
A computational investigation was performed to assess the effectiveness of Detached-Eddy Simulation (DES) as a tool for predicting icing effects. The AVUS code, a public domain flow solver, was employed to compute solutions for an iced wing configuration using DES and steady Reynolds Averaged Navier-Stokes (RANS) equation methodologies. The configuration was an extruded GLC305/944-ice shape section with a rectangular planform. The model was mounted between two walls so no tip effects were considered. The numerical results were validated by comparison with experimental data for the same configuration. The time-averaged DES computations showed some improvement in lift and drag results near stall when compared to steady RANS results. However, comparisons of the flow field details did not show the level of agreement suggested by the integrated quantities. Based on our results, we believe that DES may prove useful in a limited sense to provide analysis of iced wing configurations when there is significant flow separation, e.g., near stall, where steady RANS computations are demonstrably ineffective. However, more validation is needed to determine what role DES can play as part of an overall icing effects prediction strategy. We conclude the report with an assessment of existing computational tools for application to the iced wing problem and a discussion of issues that merit further study.
Computer programing for geosciences: Teach your students how to make tools
NASA Astrophysics Data System (ADS)
Grapenthin, Ronni
2011-12-01
When I announced my intention to pursue a Ph.D. in geophysics, some people gave me confused looks, because I was working on a master's degree in computer science at the time. My friends, like many incoming geoscience graduate students, have trouble linking these two fields. From my perspective, it is pretty straightforward: Much of geoscience evolves around novel analyses of large data sets that require custom tools—computer programs—to minimize the drudgery of manual data handling; other disciplines share this characteristic. While most faculty adapted to the need for tool development quite naturally, as they grew up around computer terminal interfaces, incoming graduate students lack intuitive understanding of programing concepts such as generalization and automation. I believe the major cause is the intuitive graphical user interfaces of modern operating systems and applications, which isolate the user from all technical details. Generally, current curricula do not recognize this gap between user and machine. For students to operate effectively, they require specialized courses teaching them the skills they need to make tools that operate on particular data sets and solve their specific problems. Courses in computer science departments are aimed at a different audience and are of limited help.
Use of cloud computing technology in natural hazard assessment and emergency management
NASA Astrophysics Data System (ADS)
Webley, P. W.; Dehn, J.
2015-12-01
During a natural hazard event, the most up-to-date data need to be in the hands of those on the front line. Decision support system tools can be developed to provide access to pre-made outputs to quickly assess the hazard and potential risk. However, with the ever-growing availability of new satellite data, as well as ground and airborne data generated in real time, there is a need to analyze the large volumes of data in an easy-to-access and effective environment. With the growth in the use of cloud computing, where the analysis and visualization system can grow with the needs of the user, these facilities can be used to provide this real-time analysis. Think of a central command center uploading the data to the cloud compute system and then those researchers in the field connecting to a web-based tool to view the newly acquired data. New data can be added by any user and then viewed instantly by anyone else in the organization through the cloud computing interface. This provides the ideal tool for collaborative data analysis, hazard assessment and decision making. We present the rationale for developing cloud computing systems and illustrate how this tool can be developed for use in real-time environments. Users would have access to an interactive online image analysis tool without the need for specific remote sensing software on their local system, thereby increasing their understanding of the ongoing hazard and helping to mitigate its impact on the surrounding region.
Serious Games as New Educational Tools: How Effective Are They? A Meta-Analysis of Recent Studies
ERIC Educational Resources Information Center
Girard, C.; Ecalle, J.; Magnan, A.
2013-01-01
Computer-assisted learning is known to be an effective tool for improving learning in both adults and children. Recent years have seen the emergence of the so-called "serious games (SGs)" that are flooding the educational games market. In this paper, the term "serious games" is used to refer to video games (VGs) intended to serve a useful purpose.…
Preparing and Analyzing Iced Airfoils
NASA Technical Reports Server (NTRS)
Vickerman, Mary B.; Baez, Marivell; Braun, Donald C.; Cotton, Barbara J.; Choo, Yung K.; Coroneos, Rula M.; Pennline, James A.; Hackenberg, Anthony W.; Schilling, Herbert W.; Slater, John W.;
2004-01-01
SmaggIce version 1.2 is a computer program for preparing and analyzing iced airfoils. It includes interactive tools for (1) measuring ice-shape characteristics, (2) controlled smoothing of ice shapes, (3) curve discretization, (4) generation of artificial ice shapes, and (5) detection and correction of input errors. Measurements of ice shapes are essential for establishing relationships between characteristics of ice and effects of ice on airfoil performance. The shape-smoothing tool helps prepare ice shapes for use with already available grid-generation and computational-fluid-dynamics software for studying the aerodynamic effects of smoothed ice on airfoils. The artificial ice-shape generation tool supports parametric studies since ice-shape parameters can easily be controlled with the artificial ice. In such studies, artificial shapes generated by this program can supplement simulated ice obtained from icing research tunnels and real ice obtained from flight test under icing weather condition. SmaggIce also automatically detects geometry errors such as tangles or duplicate points in the boundary which may be introduced by digitization and provides tools to correct these. By use of interactive tools included in SmaggIce version 1.2, one can easily characterize ice shapes and prepare iced airfoils for grid generation and flow simulations.
NASA Astrophysics Data System (ADS)
Akpınar, Ercan
2014-08-01
This study investigates the effects of using interactive computer animations based on predict-observe-explain (POE) as a presentation tool on primary school students' understanding of static electricity concepts. A quasi-experimental pre-test/post-test control group design was utilized in this study. The experiment group consisted of 30 students, and the control group of 27 students. The control group received normal instruction in which the teacher provided instruction by means of lecture, discussion and homework, whereas in the experiment group, dynamic and interactive animations based on POE were used as a presentation tool. Data collection tools used in the study were a static electricity concept test and open-ended questions. The static electricity concept test was used as a pre-test before the implementation, as a post-test at the end of the implementation and as a delayed test approximately 6 weeks after the implementation. Open-ended questions were used at the end of the implementation and approximately 6 weeks after the implementation. Results indicated that the interactive animations used as presentation tools were more effective than normal instruction in supporting the students' understanding of static electricity concepts.
Improvement of Computer Software Quality through Software Automated Tools.
1986-08-31
requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience...result was a vast array of methods, systems, languages and automated tools to assist in the process. Given that the primary role of quality assurance is...Unfortunately, there is no single method, tool or technique that can ensure accurate, reliable and cost-effective software. Therefore, government and industry
Sedig, Kamran; Parsons, Paul; Dittmer, Mark; Ola, Oluwakemi
2012-01-01
Public health professionals work with a variety of information sources to carry out their everyday activities. In recent years, interactive computational tools have become deeply embedded in such activities. Unlike the early days of computational tool use, the potential of tools nowadays is not limited to simply providing access to information; rather, they can act as powerful mediators of human-information discourse, enabling rich interaction with public health information. If public health informatics tools are designed and used properly, they can facilitate, enhance, and support the performance of complex cognitive activities that are essential to public health informatics, such as problem solving, forecasting, sense-making, and planning. However, the effective design and evaluation of public health informatics tools requires an understanding of the cognitive and perceptual issues pertaining to how humans work and think with information to perform such activities. This paper draws on research that has examined some of the relevant issues, including interaction design, complex cognition, and visual representations, to offer some human-centered design and evaluation considerations for public health informatics tools.
NASA Technical Reports Server (NTRS)
Marconi, F.; Salas, M.; Yaeger, L.
1976-01-01
A numerical procedure has been developed to compute the inviscid super/hypersonic flow field about complex vehicle geometries accurately and efficiently. A second-order-accurate finite difference scheme is used to integrate the three-dimensional Euler equations in regions of continuous flow, while all shock waves are computed as discontinuities via the Rankine-Hugoniot jump conditions. Conformal mappings are used to develop a computational grid. The effects of blunt nose entropy layers are computed in detail. Real gas effects for equilibrium air are included using curve fits of Mollier charts. Typical calculated results for shuttle orbiter, hypersonic transport, and supersonic aircraft configurations are included to demonstrate the usefulness of this tool.
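For readers unfamiliar with the jump conditions mentioned above, the following is a textbook sketch, not code from this work, of the perfect-gas Rankine-Hugoniot relations across a normal shock, the kind of relation such a shock-fitting scheme enforces at computed discontinuities.

```python
# Standard perfect-gas Rankine-Hugoniot relations for a normal shock, illustrating
# the jump conditions enforced across fitted shock waves (textbook sketch only).
def normal_shock(mach1: float, gamma: float = 1.4):
    """Return pressure, density and temperature ratios and the downstream Mach number."""
    m1sq = mach1 ** 2
    p_ratio = 1.0 + 2.0 * gamma / (gamma + 1.0) * (m1sq - 1.0)
    rho_ratio = (gamma + 1.0) * m1sq / ((gamma - 1.0) * m1sq + 2.0)
    t_ratio = p_ratio / rho_ratio
    m2 = (((gamma - 1.0) * m1sq + 2.0) / (2.0 * gamma * m1sq - (gamma - 1.0))) ** 0.5
    return p_ratio, rho_ratio, t_ratio, m2

p2p1, r2r1, t2t1, m2 = normal_shock(3.0)
print(f"p2/p1={p2p1:.2f}  rho2/rho1={r2r1:.2f}  T2/T1={t2t1:.2f}  M2={m2:.3f}")
```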
Lindblom, Katrina; Gregory, Tess; Flight, Ingrid H K; Zajac, Ian
2011-01-01
Objective: This study investigated the efficacy of an internet-based personalized decision support (PDS) tool designed to aid in the decision to screen for colorectal cancer (CRC) using a fecal occult blood test. We tested whether the efficacy of the tool in influencing attitudes to screening was mediated by perceived usability and acceptability, and considered the role of computer self-efficacy and computer anxiety in these relationships. Methods: Eighty-one participants aged 50–76 years worked through the on-line PDS tool and completed questionnaires on computer self-efficacy, computer anxiety, attitudes to and beliefs about CRC screening before and after exposure to the PDS, and perceived usability and acceptability of the tool. Results: Repeated measures ANOVA found that PDS exposure led to a significant increase in knowledge about CRC and screening, and more positive attitudes to CRC screening as measured by factors from the Preventive Health Model. Perceived usability and acceptability of the PDS mediated changes in attitudes toward CRC screening (but not CRC knowledge), and computer self-efficacy and computer anxiety were significant predictors of individuals' perceptions of the tool. Conclusion: Interventions designed to decrease computer anxiety, such as computer courses and internet training, may improve the acceptability of new health information technologies including internet-based decision support tools, increasing their impact on behavior change. PMID:21857024
Development of a fourth generation predictive capability maturity model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hills, Richard Guy; Witkowski, Walter R.; Urbina, Angel
2013-09-01
The Predictive Capability Maturity Model (PCMM) is an expert elicitation tool designed to characterize and communicate the completeness of the approaches used for computational model definition, verification, validation, and uncertainty quantification for an intended application. The primary application of this tool at Sandia National Laboratories (SNL) has been for physics-based computational simulations in support of nuclear weapons applications. The two main goals of a PCMM evaluation are 1) the communication of computational simulation capability, accurately and transparently, and 2) the development of input for effective planning. As a result of the increasing importance of computational simulation to SNL's mission, the PCMM has evolved through multiple generations with the goal of providing more clarity, rigor, and completeness in its application. This report describes the approach used to develop the fourth generation of the PCMM.
Analyzing the Cohesion of English Text and Discourse with Automated Computer Tools
ERIC Educational Resources Information Center
Jeon, Moongee
2014-01-01
This article investigates the lexical and discourse features of English text and discourse with automated computer technologies. Specifically, this article examines the cohesion of English text and discourse with automated computer tools, Coh-Metrix and TEES. Coh-Metrix is a text analysis computer tool that can analyze English text and discourse…
The Implications of Cognitive Psychology for Computer-Based Learning Tools.
ERIC Educational Resources Information Center
Kozma, Robert B.
1987-01-01
Defines cognitive computer tools as software programs that use the control capabilities of computers to amplify, extend, or enhance human cognition; suggests seven ways in which computers can aid learning; and describes the "Learning Tool," a software package for the Apple Macintosh microcomputer that is designed to aid learning of…
GenomicTools: a computational platform for developing high-throughput analytics in genomics.
Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo
2012-01-15
Recent advances in sequencing technology have resulted in a dramatic increase in sequencing data, which, in turn, requires efficient management of computational resources, such as computing time and memory, as well as prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions, thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
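The sketch below is not the GenomicTools command-line syntax or C++ API; it only illustrates, in plain Python, the kind of set operation between genomic regions that such a platform provides, here an intersection of two sorted BED-like region lists computed with a linear sweep.

```python
# Generic sketch (not GenomicTools itself) of one set operation between genomic
# regions: intersecting two lists of (chrom, start, end) regions, each sorted by
# (chrom, start) and assumed non-overlapping within a list, using a linear sweep.
def intersect(regions_a, regions_b):
    """Yield the overlap of each pair of overlapping regions from the two lists."""
    i = j = 0
    while i < len(regions_a) and j < len(regions_b):
        ca, sa, ea = regions_a[i]
        cb, sb, eb = regions_b[j]
        if ca == cb and sa < eb and sb < ea:          # overlap on the same chromosome
            yield (ca, max(sa, sb), min(ea, eb))
        # advance whichever current region ends first
        if (ca, ea) <= (cb, eb):
            i += 1
        else:
            j += 1

peaks = [("chr1", 100, 200), ("chr1", 500, 650), ("chr2", 10, 90)]
genes = [("chr1", 150, 600), ("chr2", 50, 300)]
print(list(intersect(peaks, genes)))   # [('chr1', 150, 200), ('chr1', 500, 600), ('chr2', 50, 90)]
```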
Improving Cognitive Abilities and e-Inclusion in Children with Cerebral Palsy
NASA Astrophysics Data System (ADS)
Martinengo, Chiara; Curatelli, Francesco
Besides overcoming the motor barriers to accessing computers and the Internet, ICT tools can provide very useful, and often necessary, support for the cognitive development of motor-impaired children with cerebral palsy. In fact, software tools for computation and communication allow teachers to put into effect, in a more complete and efficient way, the learning methods and the educational plans designed for the child. In the present article, after a brief analysis of the general objectives to be pursued in supporting learning for children with cerebral palsy, we consider some specific difficulties in the logical-linguistic and logical-mathematical fields, and we show how they can be overcome using general ICT tools and specifically implemented software programs.
Measuring Assurance of Learning Goals: Effectiveness of Computer Training and Assessment Tools
ERIC Educational Resources Information Center
Murphy, Marianne C.; Sharma, Aditya; Rosso, Mark
2012-01-01
Teaching office applications such as word processing, spreadsheet and presentation skills has been widely debated regarding its necessity, extent and delivery method. Training and Assessment applications such as MyITLab, SAM, etc. are popular tools for training students and are particularly useful in measuring Assurance of Learning (AOL)…
Iranian EFL Teachers' Perceptions of the Difficulties of Implementing CALL
ERIC Educational Resources Information Center
Hedayati, Hora; Marandi, S. Susan
2014-01-01
Despite the spread of reliable technological tools and the availability of computers in Iranian universities, as well as the mounting evidence of the effectiveness of blended learning, many Iranian language teachers are still reluctant to incorporate such tools in their English as a foreign language (EFL) classes. This study inspected the status…
ERIC Educational Resources Information Center
Beatty, Ian D.
There is a growing consensus among educational researchers that traditional problem-based assessments are not effective tools for diagnosing a student's knowledge state and for guiding pedagogical intervention, and that new tools grounded in the results of cognitive science research are needed. The ConMap ("Conceptual Mapping") project, described…
ERIC Educational Resources Information Center
Lee, Mark J. W.; Pradhan, Sunam; Dalgarno, Barney
2008-01-01
Modern information technology and computer science curricula employ a variety of graphical tools and development environments to facilitate student learning of introductory programming concepts and techniques. While the provision of interactive features and the use of visualization can enhance students' understanding and assist them in grasping…
Advanced Tools for Smartphone-Based Experiments: Phyphox
ERIC Educational Resources Information Center
Staacks, S.; Hütz, S.; Stampfer, C.; Heinke, H.
2018-01-01
The sensors in modern smartphones are a promising and cost-effective tool for experimentation in physics education, but many experiments face practical problems. Often the phone is inaccessible during the experiment and the data usually needs to be analyzed subsequently on a computer. We address both problems by introducing a new app, called…
Automatic Assessment of 3D Modeling Exams
ERIC Educational Resources Information Center
Sanna, A.; Lamberti, F.; Paravati, G.; Demartini, C.
2012-01-01
Computer-based assessment of exams provides teachers and students with two main benefits: fairness and effectiveness in the evaluation process. This paper proposes a fully automatic evaluation tool for the Graphic and Virtual Design (GVD) curriculum at the First School of Architecture of the Politecnico di Torino, Italy. In particular, the tool is…
A New Look at Security Education: YouTube as YouTool
ERIC Educational Resources Information Center
Werner, Laurie A.; Frank, Charles E.
2010-01-01
Teaching a computer security course which includes network administration and protection software is especially challenging because textbook tools are out of date by the time the text is published. In an effort to use lab activities that work effectively, we turned to the internet. This paper describes several resources for teaching computer…
ERIC Educational Resources Information Center
Yankelevich, Eleonora
2017-01-01
A variety of computing devices are available in today's classrooms, but they have not guaranteed the effective integration of technology. Nationally, teachers have ample devices, applications, productivity software, and digital audio and video tools. Despite all this, the literature suggests these tools are not employed to enhance student learning…
National Fusion Collaboratory: Grid Computing for Simulations and Experiments
NASA Astrophysics Data System (ADS)
Greenwald, Martin
2004-05-01
The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.
ERIC Educational Resources Information Center
Lee, Chun-Yi; Chen, Ming-Jang; Chang, Wen-Long
2014-01-01
The aim of this study is to investigate the effects of solution methods and question prompts on generalization and justification of non-routine problem solving for Grade 9 students. The learning activities are based on the context of the frog jumping game. In addition, related computer tools were used to support generalization and justification of…
Use of handheld computers in clinical practice: a systematic review.
Mickan, Sharon; Atherton, Helen; Roberts, Nia Wyn; Heneghan, Carl; Tilson, Julie K
2014-07-06
Many healthcare professionals use smartphones and tablets to inform patient care. Contemporary research suggests that handheld computers may support aspects of clinical diagnosis and management. This systematic review was designed to synthesise high quality evidence to answer the question: Does healthcare professionals' use of handheld computers improve their access to information and support clinical decision making at the point of care? A detailed search was conducted using Cochrane, MEDLINE, EMBASE, PsycINFO, Science and Social Science Citation Indices since 2001. Interventions promoting healthcare professionals seeking information or making clinical decisions using handheld computers were included. Classroom learning and the use of laptop computers were excluded. Two authors independently selected studies, assessed quality using the Cochrane Risk of Bias tool and extracted data. High levels of data heterogeneity negated statistical synthesis. Instead, evidence for effectiveness was summarised narratively, according to each study's aim for assessing the impact of handheld computer use. We included seven randomised trials investigating medical or nursing staffs' use of Personal Digital Assistants. Effectiveness was demonstrated across three distinct functions that emerged from the data: accessing information for clinical knowledge, adherence to guidelines and diagnostic decision making. When healthcare professionals used handheld computers to access clinical information, their knowledge improved significantly more than that of peers who used paper resources. When clinical guideline recommendations were presented on handheld computers, clinicians made significantly safer prescribing decisions and adhered more closely to recommendations than peers using paper resources. Finally, healthcare professionals made significantly more appropriate diagnostic decisions using clinical decision making tools on handheld computers compared to colleagues who did not have access to these tools. For these clinical decisions, the numbers needed to test/screen were all less than 11. Healthcare professionals' use of handheld computers may improve their information seeking, adherence to guidelines and clinical decision making. Handheld computers can provide real time access to and analysis of clinical information. The integration of clinical decision support systems within handheld computers offers clinicians the highest level of synthesised evidence at the point of care. Future research is needed to replicate these early results and to identify beneficial clinical outcomes.
GPU-accelerated computational tool for studying the effectiveness of asteroid disruption techniques
NASA Astrophysics Data System (ADS)
Zimmerman, Ben J.; Wie, Bong
2016-10-01
This paper presents the development of a new Graphics Processing Unit (GPU) accelerated computational tool for asteroid disruption techniques. Numerical simulations are completed using the high-order spectral difference (SD) method. Due to the compact nature of the SD method, it is well suited for implementation with the GPU architecture, hence solutions are generated at orders of magnitude faster than the Central Processing Unit (CPU) counterpart. A multiphase model integrated with the SD method is introduced, and several asteroid disruption simulations are conducted, including kinetic-energy impactors, multi-kinetic energy impactor systems, and nuclear options. Results illustrate the benefits of using multi-kinetic energy impactor systems when compared to a single impactor system. In addition, the effectiveness of nuclear options is observed.
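The paper's tool is a GPU implementation of a high-order spectral difference hydrocode; the sketch below is not that code and only shows the general offload pattern such tools rely on, using CuPy's NumPy-compatible arrays on a CUDA-capable GPU, with placeholder data.

```python
# General GPU-offload pattern (not the paper's SD solver): evaluate cell-wise
# quantities for a large 1-D slab of cells on the GPU with CuPy, then reduce.
# Requires a CUDA-capable GPU and the cupy package; all data are placeholders.
import cupy as cp

n_cells = 1_000_000
rho = cp.random.uniform(0.5, 2.0, n_cells)    # density
mom = cp.random.uniform(-1.0, 1.0, n_cells)   # momentum
ene = cp.random.uniform(2.0, 5.0, n_cells)    # total energy

gamma = 1.4
vel = mom / rho
pressure = (gamma - 1.0) * (ene - 0.5 * rho * vel**2)   # ideal-gas EOS on the GPU
sound_speed = cp.sqrt(gamma * pressure / rho)

# Global wave-speed estimate for a CFL time-step limit, copied back to the host.
max_wave_speed = float(cp.max(cp.abs(vel) + sound_speed))
print(max_wave_speed)
```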
Agur, Zvia; Elishmereni, Moran; Kheifetz, Yuri
2014-01-01
Despite its great promise, personalized oncology still faces many hurdles, and it is increasingly clear that targeted drugs and molecular biomarkers alone yield only modest clinical benefit. One reason is the complex relationships between biomarkers and the patient's response to drugs, obscuring the true weight of the biomarkers in the overall patient's response. This complexity can be disentangled by computational models that integrate the effects of personal biomarkers into a simulator of drug-patient dynamic interactions, for predicting the clinical outcomes. Several computational tools have been developed for personalized oncology, notably evidence-based tools for simulating pharmacokinetics, Bayesian-estimated tools for predicting survival, etc. We describe representative statistical and mathematical tools, and discuss their merits, shortcomings and preliminary clinical validation attesting to their potential. Yet, the individualization power of mathematical models alone, or statistical models alone, is limited. More accurate and versatile personalization tools can be constructed by a new application of the statistical/mathematical nonlinear mixed effects modeling (NLMEM) approach, which until recently has been used only in drug development. Using these advanced tools, clinical data from patient populations can be integrated with mechanistic models of disease and physiology, for generating personal mathematical models. Upon a more substantial validation in the clinic, this approach will hopefully be applied in personalized clinical trials, P-trials, hence aiding the establishment of personalized medicine within the main stream of clinical oncology. © 2014 Wiley Periodicals, Inc.
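To make the "simulator of drug-patient dynamic interactions" idea more tangible, here is a minimal sketch of a one-compartment pharmacokinetic model whose clearance varies log-normally between patients, the kind of structure that nonlinear mixed-effects modelling estimates; all parameter values are illustrative assumptions, not values from the cited tools.

```python
# Sketch of the core idea behind such simulators: a one-compartment PK model with
# log-normal between-patient variability in clearance (illustrative parameters only).
import numpy as np

rng = np.random.default_rng(7)

dose_mg = 200.0
volume_l = 30.0     # population volume of distribution
cl_pop = 5.0        # population clearance, L/h
omega = 0.3         # standard deviation of log-clearance between patients

t = np.linspace(0.0, 24.0, 25)   # hours after an IV bolus

# Individual clearances drawn from the population distribution.
cl_individual = cl_pop * np.exp(rng.normal(0.0, omega, size=5))

for i, cl in enumerate(cl_individual):
    ke = cl / volume_l                              # elimination rate constant, 1/h
    conc = dose_mg / volume_l * np.exp(-ke * t)     # concentration in mg/L over time
    print(f"patient {i}: CL={cl:.2f} L/h, concentration at 12 h = {conc[12]:.2f} mg/L")
```

Fitting such a model to sparse measurements from a new patient, and shrinking the individual estimates toward the population values, is what allows personalized dose predictions of the kind discussed above.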
ERIC Educational Resources Information Center
Ambrose, Regina Maria; Palpanathan, Shanthini
2017-01-01
Computer-assisted language learning (CALL) has evolved through various stages in both technology as well as the pedagogical use of technology (Warschauer & Healey, 1998). Studies show that the CALL trend has facilitated students in their English language writing with useful tools such as computer based activities and word processing. Students…
ERIC Educational Resources Information Center
Psycharis, Sarantos; Botsari, Evanthia; Chatzarakis, George
2014-01-01
Learning styles are increasingly being integrated into computational-enhanced earning environments and a great deal of recent research work is taking place in this area. The purpose of this study was to examine the impact of the computational experiment approach, learning styles, epistemic beliefs, and engagement with the inquiry process on the…
Nanostructure symmetry: Relevance for physics and computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dupertuis, Marc-André; Oberli, D. Y.; Karlsson, K. F.
2014-03-31
We review the research done in recent years in our group on the effects of nanostructure symmetry, and outline its relevance both for nanostructure physics and for computations of their electronic and optical properties. The examples of C3v and C2v quantum dots are used. A number of surprises and non-trivial aspects are outlined, and a few symmetry-based tools for computing and analysis are briefly presented.
Chesser, Amy K; Keene Woods, Nikki; Wipperman, Jennifer; Wilson, Rachel; Dong, Frank
2014-02-01
Low health literacy is associated with poor health outcomes. Research is needed to understand the mechanisms and pathways of its effects. Computer-based assessment tools may improve efficiency and cost-effectiveness of health literacy research. The objective of this preliminary study was to assess if administration of the Short Test of Functional Health Literacy in Adults (STOFHLA) through a computer-based medium was comparable to the paper-based test in terms of accuracy and time to completion. A randomized, crossover design was used to compare computer versus paper format of the STOFHLA at a Midwestern family medicine residency program. Eighty participants were initially randomized to either computer (n = 42) or paper (n = 38) format of the STOFHLA. After a 30-day washout period, participants returned to complete the other version of the STOFHLA. Data analysis revealed no significant difference between paper- and computer-based surveys (p = .9401; N = 57). The majority of participants showed "adequate" health literacy via paper- and computer-based surveys (100% and 97% of participants, respectively). Electronic administration of STOFHLA results were equivalent to the paper administration results for evaluation of adult health literacy. Future investigations should focus on expanded populations in multiple health care settings and validation of other health literacy screening tools in a clinical setting.
Overview of the Aeroelastic Prediction Workshop
NASA Technical Reports Server (NTRS)
Heeg, Jennifer; Chwalowski, Pawel; Schuster, David M.; Dalenbring, Mats
2013-01-01
The AIAA Aeroelastic Prediction Workshop (AePW) was held in April, 2012, bringing together communities of aeroelasticians and computational fluid dynamicists. The objective in conducting this workshop on aeroelastic prediction was to assess state-of-the-art computational aeroelasticity methods as practical tools for the prediction of static and dynamic aeroelastic phenomena. No comprehensive aeroelastic benchmarking validation standard currently exists, greatly hindering validation and state-of-the-art assessment objectives. The workshop was a step towards assessing the state of the art in computational aeroelasticity. This was an opportunity to discuss and evaluate the effectiveness of existing computer codes and modeling techniques for unsteady flow, and to identify computational and experimental areas needing additional research and development. Three configurations served as the basis for the workshop, providing different levels of geometric and flow field complexity. All cases considered involved supercritical airfoils at transonic conditions. The flow fields contained oscillating shocks and in some cases, regions of separation. The computational tools principally employed Reynolds-Averaged Navier Stokes solutions. The successes and failures of the computations and the experiments are examined in this paper.
Virtual Record Keeping: Should Teachers Keep Online Grade Books?
ERIC Educational Resources Information Center
Lacina, Jan
2006-01-01
Teaching and learning have changed radically with advances in technology. Research shows that the computer can be an effective tool in both teaching and learning, and for that reason, school districts throughout the United States support schools by purchasing computers and software for individual classrooms. As a result, many school districts are using…
ERIC Educational Resources Information Center
Johnson, Douglas A.; Rubin, Sophie
2011-01-01
Computer-based instruction (CBI) has been growing rapidly as a training tool in organizational settings, but close attention to behavioral factors has often been neglected. CBI represents a promising instructional advancement over current training methods. This review article summarizes 12 years of comparative research in interactive…
Computer-Based Learning of Neuroanatomy: A Longitudinal Study of Learning, Transfer, and Retention
ERIC Educational Resources Information Center
Chariker, Julia H.; Naaz, Farah; Pani, John R.
2011-01-01
A longitudinal experiment was conducted to evaluate the effectiveness of new methods for learning neuroanatomy with computer-based instruction. Using a three-dimensional graphical model of the human brain and sections derived from the model, tools for exploring neuroanatomy were developed to encourage "adaptive exploration". This is an…
Sign Language for K-8 Mathematics by 3D Interactive Animation
ERIC Educational Resources Information Center
Adamo-Villani, Nicoletta; Doublestein, John; Martin, Zachary
2005-01-01
We present a new highly interactive computer animation tool to increase the mathematical skills of deaf children. We aim at increasing the effectiveness of (hearing) parents in teaching arithmetic to their deaf children, and the opportunity of deaf children to learn arithmetic via interactive media. Using state-of-the-art computer animation…
Handheld, Wireless Computers: Can They Improve Learning and Instruction?
ERIC Educational Resources Information Center
Moallem, Mahnaz; Kermani, Hengameh; Chen, Sue-Jen
2006-01-01
Reports show that handheld, wireless computers, once used by business professionals to keep track of appointments, contacts, e-mail, and the Internet, have found their way into classrooms and schools across the United States. However, there has not been much systematic research to investigate the effects of these new technology tools on student…
Computational Exploration of a Protein Receptor Binding Space with Student Proposed Peptide Ligands
ERIC Educational Resources Information Center
King, Matthew D.; Phillips, Paul; Turner, Matthew W.; Katz, Michael; Lew, Sarah; Bradburn, Sarah; Andersen, Tim; McDougal, Owen M.
2016-01-01
Computational molecular docking is a fast and effective "in silico" method for the analysis of binding between a protein receptor model and a ligand. The visualization and manipulation of protein to ligand binding in three-dimensional space represents a powerful tool in the biochemistry curriculum to enhance student learning. The…
USDA-ARS?s Scientific Manuscript database
Background: Dietary assessment methods used in overweight/obese participants have been scrutinized for underreporting energy. Objective: Evaluate the effectiveness of a computer-administered, 24-hour recall method (ASA24) to measure energy and nutrient intake in overweight/obese women and to further...
Cloud Computing Technologies in Writing Class: Factors Influencing Students' Learning Experience
ERIC Educational Resources Information Center
Wang, Jenny
2017-01-01
The interactive online group within cloud computing technologies proposed as the main contribution of this paper provides easy and simple access to the cloud-based Software as a Service (SaaS) system and delivers effective educational tools for students and teachers in after-class group writing assignment activities. Therefore, this study…
Component-Based Approach for Educating Students in Bioinformatics
ERIC Educational Resources Information Center
Poe, D.; Venkatraman, N.; Hansen, C.; Singh, G.
2009-01-01
There is an increasing need for an effective method of teaching bioinformatics. Increased progress and availability of computer-based tools for educating students have led to the implementation of a computer-based system for teaching bioinformatics as described in this paper. Bioinformatics is a recent, hybrid field of study combining elements of…
ERIC Educational Resources Information Center
Sung, Yao-Ting; Yang, Je-Ming; Lee, Han-Yueh
2017-01-01
One of the trends in collaborative learning is using mobile devices for supporting the process and products of collaboration, which has been forming the field of mobile-computer-supported collaborative learning (mCSCL). Although mobile devices have become valuable collaborative learning tools, evaluative evidence for their substantial…
Better informed in clinical practice - a brief overview of dental informatics.
Reynolds, P A; Harper, J; Dunne, S
2008-03-22
Uptake of dental informatics has been hampered by technical and user issues. Innovative systems have been developed, but usability issues have affected many. Advances in technology and artificial intelligence are now producing clinically useful systems, although issues still remain with adapting computer interfaces to the dental practice working environment. A dental electronic health record has become a priority in many countries, including the UK. However, experience shows that any dental electronic health record (EHR) system cannot be subordinate to, or a subset of, a medical record. Such a future dental EHR is likely to incorporate integrated care pathways. Future best dental practice will increasingly depend on computer-based support tools, although disagreement remains about the effectiveness of current support tools. Over the longer term, future dental informatics tools will incorporate dynamic, online evidence-based medicine (EBM) tools, and promise more adaptive, patient-focused and efficient dental care with educational advantages in training.
3D data processing with advanced computer graphics tools
NASA Astrophysics Data System (ADS)
Zhang, Song; Ekstrand, Laura; Grieve, Taylor; Eisenmann, David J.; Chumbley, L. Scott
2012-09-01
Often, the 3-D raw data coming from an optical profilometer contain spiky noise and an irregular grid, which make them difficult to analyze and to store because of their enormously large size. This paper addresses these two issues by substantially reducing the spiky noise of the 3-D raw data and by rapidly re-sampling the raw data onto regular grids at any pixel size and any orientation with advanced computer graphics tools. Experimental results are presented to demonstrate the effectiveness of the proposed approach.
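The abstract describes two operations: suppressing spiky noise and re-sampling raw profilometer data onto a regular grid. The paper's graphics-hardware approach is not reproduced here; below is a minimal CPU-side sketch with SciPy, assuming the raw data arrive as scattered (x, y, z) points, that interpolates onto a regular grid of user-chosen pixel size and removes isolated spikes with a small median filter.

```python
import numpy as np
from scipy.interpolate import griddata
from scipy.ndimage import median_filter

def regrid_profile(x, y, z, pixel=0.1):
    """Interpolate scattered 3-D surface points onto a regular grid and
    suppress isolated spikes with a 3x3 median filter."""
    xi = np.arange(x.min(), x.max(), pixel)
    yi = np.arange(y.min(), y.max(), pixel)
    gx, gy = np.meshgrid(xi, yi)
    gz = griddata((x, y), z, (gx, gy), method="linear")   # regular-grid resampling
    gz = median_filter(gz, size=3)                         # spike removal
    return gx, gy, gz

# Synthetic example: a gentle surface plus a few artificial spikes.
rng = np.random.default_rng(2)
x, y = rng.uniform(0, 10, 5000), rng.uniform(0, 10, 5000)
z = np.sin(x) + 0.1 * y
z[rng.choice(z.size, 20, replace=False)] += 50.0
gx, gy, gz = regrid_profile(x, y, z)
print(gz.shape, np.nanmax(gz))
```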
Visualization Tools for Teaching Computer Security
ERIC Educational Resources Information Center
Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng
2010-01-01
Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…
Role of Statistical Random-Effects Linear Models in Personalized Medicine
Diaz, Francisco J; Yeh, Hung-Wen; de Leon, Jose
2012-01-01
Some empirical studies and recent developments in pharmacokinetic theory suggest that statistical random-effects linear models are valuable tools that allow describing simultaneously patient populations as a whole and patients as individuals. This remarkable characteristic indicates that these models may be useful in the development of personalized medicine, which aims at finding treatment regimes that are appropriate for particular patients, not just appropriate for the average patient. In fact, published developments show that random-effects linear models may provide a solid theoretical framework for drug dosage individualization in chronic diseases. In particular, individualized dosages computed with these models by means of an empirical Bayesian approach may produce better results than dosages computed with some methods routinely used in therapeutic drug monitoring. This is further supported by published empirical and theoretical findings that show that random effects linear models may provide accurate representations of phase III and IV steady-state pharmacokinetic data, and may be useful for dosage computations. These models have applications in the design of clinical algorithms for drug dosage individualization in chronic diseases; in the computation of dose correction factors; computation of the minimum number of blood samples from a patient that are necessary for calculating an optimal individualized drug dosage in therapeutic drug monitoring; measure of the clinical importance of clinical, demographic, environmental or genetic covariates; study of drug-drug interactions in clinical settings; the implementation of computational tools for web-site-based evidence farming; design of pharmacogenomic studies; and in the development of a pharmacological theory of dosage individualization. PMID:23467392
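The abstract notes that individualized dosages can be computed from random-effects linear models via an empirical Bayesian approach. A minimal sketch of the underlying shrinkage idea for a normal random-intercept model is shown below: the individual estimate is a precision-weighted average of the patient's own mean and the population mean. All numbers are hypothetical and this is not the paper's dosing algorithm.

```python
import numpy as np

def empirical_bayes_mean(y_patient, pop_mean, pop_var, resid_var):
    """Posterior (shrinkage) estimate of a patient's true level in a
    normal random-intercept model: precision-weighted average of the
    patient's sample mean and the population mean."""
    n = len(y_patient)
    w_patient = n / resid_var      # precision contributed by the patient's data
    w_pop = 1.0 / pop_var          # precision of the population prior
    return (w_patient * np.mean(y_patient) + w_pop * pop_mean) / (w_patient + w_pop)

# Hypothetical steady-state drug concentrations (mg/L) from one patient.
y = np.array([12.1, 14.3, 13.0])
theta_hat = empirical_bayes_mean(y, pop_mean=10.0, pop_var=4.0, resid_var=2.0)
print(f"shrinkage estimate: {theta_hat:.2f} mg/L")   # lies between 10.0 and the patient mean
```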
NASA Technical Reports Server (NTRS)
Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris
2011-01-01
A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated with a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the source of rotorcraft noise, and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of the blade-vortex interaction noise in low speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by utilizing a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated with a variation of experimental data sets, such as UH60-A data, DNW test data and HART II test data.
Modeling NIF experimental designs with adaptive mesh refinement and Lagrangian hydrodynamics
NASA Astrophysics Data System (ADS)
Koniges, A. E.; Anderson, R. W.; Wang, P.; Gunney, B. T. N.; Becker, R.; Eder, D. C.; MacGowan, B. J.; Schneider, M. B.
2006-06-01
Incorporation of adaptive mesh refinement (AMR) into Lagrangian hydrodynamics algorithms allows for the creation of a highly powerful simulation tool effective for complex target designs with three-dimensional structure. We are developing an advanced modeling tool that includes AMR and traditional arbitrary Lagrangian-Eulerian (ALE) techniques. Our goal is the accurate prediction of vaporization, disintegration and fragmentation in National Ignition Facility (NIF) experimental target elements. Although our focus is on minimizing the generation of shrapnel in target designs and protecting the optics, the general techniques are applicable to modern advanced targets that include three-dimensional effects such as those associated with capsule fill tubes. Several essential computations in ordinary radiation hydrodynamics need to be redesigned in order to allow for AMR to work well with ALE, including algorithms associated with radiation transport. Additionally, for our goal of predicting fragmentation, we include elastic/plastic flow into our computations. We discuss the integration of these effects into a new ALE-AMR simulation code. Applications of this newly developed modeling tool as well as traditional ALE simulations in two and three dimensions are applied to NIF early-light target designs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
A video on computer security is described. Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL) and Gale Warshawsky, the Coordinator for Computer Security Education and Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced which ranged from 1 to 3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices.
Larson, Natalie M.; Zok, Frank W.
2017-12-27
In-situ X-ray computed tomography during axial impregnation of unidirectional fiber beds is used to study coupled effects of fluid velocity, fiber movement and preferred flow channeling on permeability. Here, in order to interpret the experimental measurements, a new computational tool for predicting axial permeability of very large 2D arrays of non-uniformly packed fibers is developed. The results show that, when the impregnation velocity is high, full saturation is attained behind the flow front and the fibers rearrange into a less uniform configuration with higher permeability. In contrast, when the velocity is low, fluid flows preferentially in the narrowest channels between fibers, yielding unsaturated permeabilities that are lower than those in the saturated state. Lastly, these insights combined with a new computational tool will enable improved prediction of permeability, ultimately for use in optimization of composite manufacturing via liquid impregnation.
Evaluation of Computer Tools for Idea Generation and Team Formation in Project-Based Learning
ERIC Educational Resources Information Center
Ardaiz-Villanueva, Oscar; Nicuesa-Chacon, Xabier; Brene-Artazcoz, Oscar; Sanz de Acedo Lizarraga, Maria Luisa; Sanz de Acedo Baquedano, Maria Teresa
2011-01-01
The main objective of this research was to validate the effectiveness of Wikideas and Creativity Connector tools to stimulate the generation of ideas and originality by university students organized into groups according to their indexes of creativity and affinity. Another goal of the study was to evaluate the classroom climate created by these…
Networking as a Strategic Tool, 1991
NASA Technical Reports Server (NTRS)
1991-01-01
This conference focuses on the technological advances, pitfalls, requirements, and trends involved in planning and implementing an effective computer network system. The basic theme of the conference is networking as a strategic tool. Tutorials and conference presentations explore the technology and methods involved in this rapidly changing field. Future directions are explored from a global, as well as local, perspective.
Integrating Learning Services in the Cloud: An Approach That Benefits Both Systems and Learning
ERIC Educational Resources Information Center
Gutiérrez-Carreón, Gustavo; Daradoumis, Thanasis; Jorba, Josep
2015-01-01
Currently there is an increasing trend to implement functionalities that allow for the development of applications based on Cloud computing. In education there are high expectations for Learning Management Systems since they can be powerful tools to foster more effective collaboration within a virtual classroom. Tools can also be integrated with…
ERIC Educational Resources Information Center
Derry, Sharon; And Others
This study examined ways in which two independent variables, peer collaboration and the use of a specific tool (the TAPS interface), work together and individually to shape students' problem-solving processes. More specifically, the researchers were interested in determining how collaboration and TAPS use cause metacognitive processes to differ…
Software Engineering for Scientific Computer Simulations
NASA Astrophysics Data System (ADS)
Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.
2004-11-01
Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.
Methodologies and systems for heterogeneous concurrent computing
NASA Technical Reports Server (NTRS)
Sunderam, V. S.
1994-01-01
Heterogeneous concurrent computing is gaining increasing acceptance as an alternative or complementary paradigm to multiprocessor-based parallel processing as well as to conventional supercomputing. While algorithmic and programming aspects of heterogeneous concurrent computing are similar to their parallel processing counterparts, system issues, partitioning and scheduling, and performance aspects are significantly different. In this paper, we discuss critical design and implementation issues in heterogeneous concurrent computing, and describe techniques for enhancing its effectiveness. In particular, we highlight the system level infrastructures that are required, aspects of parallel algorithm development that most affect performance, system capabilities and limitations, and tools and methodologies for effective computing in heterogeneous networked environments. We also present recent developments and experiences in the context of the PVM system and comment on ongoing and future work.
NASA Technical Reports Server (NTRS)
Marconi, F.; Yaeger, L.
1976-01-01
A numerical procedure was developed to compute the inviscid super/hypersonic flow field about complex vehicle geometries accurately and efficiently. A second-order accurate finite difference scheme is used to integrate the three-dimensional Euler equations in regions of continuous flow, while all shock waves are computed as discontinuities via the Rankine-Hugoniot jump conditions. Conformal mappings are used to develop a computational grid. The effects of blunt nose entropy layers are computed in detail. Real gas effects for equilibrium air are included using curve fits of Mollier charts. Typical calculated results for shuttle orbiter, hypersonic transport, and supersonic aircraft configurations are included to demonstrate the usefulness of this tool.
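The abstract states that shock waves are computed as discontinuities via the Rankine-Hugoniot jump conditions. For reference, their standard one-dimensional form in the shock frame is given below (the paper applies them across three-dimensional shock surfaces, which is not reproduced here):

```latex
\begin{aligned}
\rho_1 u_1 &= \rho_2 u_2, \\
p_1 + \rho_1 u_1^2 &= p_2 + \rho_2 u_2^2, \\
h_1 + \tfrac{1}{2}u_1^2 &= h_2 + \tfrac{1}{2}u_2^2,
\end{aligned}
```

where subscripts 1 and 2 denote the upstream and downstream states and h is the specific enthalpy.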
Grid Computing and Collaboration Technology in Support of Fusion Energy Sciences
NASA Astrophysics Data System (ADS)
Schissel, D. P.
2004-11-01
The SciDAC Initiative is creating a computational grid designed to advance scientific understanding in fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling, and allowing more efficient use of experimental facilities. The philosophy is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as easy to use network available services. Access to services is stressed rather than portability. Services share the same basic security infrastructure so that stakeholders can control their own resources and helps ensure fair use of resources. The collaborative control room is being developed using the open-source Access Grid software that enables secure group-to-group collaboration with capabilities beyond teleconferencing including application sharing and control. The ability to effectively integrate off-site scientists into a dynamic control room will be critical to the success of future international projects like ITER. Grid computing, the secure integration of computer systems over high-speed networks to provide on-demand access to data analysis capabilities and related functions, is being deployed as an alternative to traditional resource sharing among institutions. The first grid computational service deployed was the transport code TRANSP and included tools for run preparation, submission, monitoring and management. This approach saves user sites from the laborious effort of maintaining a complex code while at the same time reducing the burden on developers by avoiding the support of a large number of heterogeneous installations. This tutorial will present the philosophy behind an advanced collaborative environment, give specific examples, and discuss its usage beyond FES.
A CFD/CSD Interaction Methodology for Aircraft Wings
NASA Technical Reports Server (NTRS)
Bhardwaj, Manoj K.
1997-01-01
With advanced subsonic transports and military aircraft operating in the transonic regime, it is becoming important to determine the effects of the coupling between aerodynamic loads and elastic forces. Since aeroelastic effects can contribute significantly to the design of these aircraft, there is a strong need in the aerospace industry to predict these aero-structure interactions computationally. To perform static aeroelastic analysis in the transonic regime, high fidelity computational fluid dynamics (CFD) analysis tools must be used in conjunction with high fidelity computational structural dynamics (CSD) analysis tools due to the nonlinear behavior of the aerodynamics in the transonic regime. There is also a need to be able to use a wide variety of CFD and CSD tools to predict these aeroelastic effects in the transonic regime. Because source codes are not always available, it is necessary to couple the CFD and CSD codes without alteration of the source codes. In this study, an aeroelastic coupling procedure is developed which will perform static aeroelastic analysis using any CFD and CSD code with little code integration. The aeroelastic coupling procedure is demonstrated on an F/A-18 Stabilator using NASTD (an in-house McDonnell Douglas CFD code) and NASTRAN. In addition, the Aeroelastic Research Wing (ARW-2) is used for demonstration of the aeroelastic coupling procedure by using ENSAERO (NASA Ames Research Center CFD code) and a finite element wing-box code (developed as part of this research).
Engineering computer graphics in gas turbine engine design, analysis and manufacture
NASA Technical Reports Server (NTRS)
Lopatka, R. S.
1975-01-01
A time-sharing and computer graphics facility designed to provide effective interactive tools to a large number of engineering users with varied requirements is described. The application of computer graphics displays at several levels of hardware complexity and capability is discussed, with examples of graphics systems tracing gas turbine product development, beginning with preliminary design through manufacture. Highlights of an operating system stylized for interactive engineering graphics are described.
Brian J. Williams; Bo Song; Chou Chiao-Ying; Thomas M. Williams; John Hom
2010-01-01
Three-dimensional (3D) visualization is a useful tool that depicts virtual forest landscapes on a computer. Previous studies in visualization have required high-end computer hardware and specialized technical skills. A virtual forest landscape can be used to show different effects of disturbances and management scenarios on a computer, which allows observation of forest...
Software Tools for Shipbuilding Productivity
1984-12-01
... shipbuilding, is that design, manufacturing and robotic technology applications to shipbuilding have been proven ... conveying technical information about the process of Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM) effectively has been a problem of serious ... Contents excerpt: Computer Aided Design (CAD); 3.4.1 CAD System Components; 3.4.2 CAD System Benefits; 3.4.3 New and Future CAD Technologies; Computer Aided Manufacturing (CAM); 3.5.1 CAM ...
Computational Tools for Allosteric Drug Discovery: Site Identification and Focus Library Design.
Huang, Wenkang; Nussinov, Ruth; Zhang, Jian
2017-01-01
Allostery is an intrinsic phenomenon of biological macromolecules involving regulation and/or signal transduction induced by a ligand binding to an allosteric site distinct from a molecule's active site. Allosteric drugs are currently receiving increased attention in drug discovery because drugs that target allosteric sites can provide important advantages over the corresponding orthosteric drugs including specific subtype selectivity within receptor families. Consequently, targeting allosteric sites, instead of orthosteric sites, can reduce drug-related side effects and toxicity. On the down side, allosteric drug discovery can be more challenging than traditional orthosteric drug discovery due to difficulties associated with determining the locations of allosteric sites and designing drugs based on these sites and the need for the allosteric effects to propagate through the structure, reach the ligand binding site and elicit a conformational change. In this study, we present computational tools ranging from the identification of potential allosteric sites to the design of "allosteric-like" modulator libraries. These tools may be particularly useful for allosteric drug discovery.
RF Models for Plasma-Surface Interactions in VSim
NASA Astrophysics Data System (ADS)
Jenkins, Thomas G.; Smithe, D. N.; Pankin, A. Y.; Roark, C. M.; Zhou, C. D.; Stoltz, P. H.; Kruger, S. E.
2014-10-01
An overview of ongoing enhancements to the Plasma Discharge (PD) module of Tech-X's VSim software tool is presented. A sub-grid kinetic sheath model, developed for the accurate computation of sheath potentials near metal and dielectric-coated walls, enables the physical effects of DC and RF sheath physics to be included in macroscopic-scale plasma simulations that need not explicitly resolve sheath scale lengths. Sheath potential evolution, together with particle behavior near the sheath, can thus be simulated in complex geometries. Generalizations of the model to include sputtering, secondary electron emission, and effects from multiple ion species and background magnetic fields are summarized; related numerical results are also presented. In addition, improved tools for plasma chemistry and IEDF/EEDF visualization and modeling are discussed, as well as our initial efforts toward the development of hybrid fluid/kinetic transition capabilities within VSim. Ultimately, we aim to establish VSimPD as a robust, efficient computational tool for modeling industrial plasma processes. Supported by US DoE SBIR-I/II Award DE-SC0009501.
Computational approaches to metabolic engineering utilizing systems biology and synthetic biology.
Fong, Stephen S
2014-08-01
Metabolic engineering modifies cellular function to address various biochemical applications. Underlying metabolic engineering efforts are a host of tools and knowledge that are integrated to enable successful outcomes. Concurrent development of computational and experimental tools has enabled different approaches to metabolic engineering. One approach is to leverage knowledge and computational tools to prospectively predict designs to achieve the desired outcome. An alternative approach is to utilize combinatorial experimental tools to empirically explore the range of cellular function and to screen for desired traits. This mini-review focuses on computational systems biology and synthetic biology tools that can be used in combination for prospective in silico strain design.
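The abstract contrasts prospective in silico strain design with combinatorial screening. A common building block for the prospective route is constraint-based flux balance analysis; the sketch below illustrates that idea on a tiny hypothetical three-reaction network with a generic linear-programming solver. It is not any specific tool from the review, and the network is invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Toy metabolic network with one internal metabolite A and three reactions:
#   R1: uptake  -> A        (bounded by substrate availability)
#   R2: A -> biomass        (objective flux)
#   R3: A -> byproduct
S = np.array([[1.0, -1.0, -1.0]])          # stoichiometric matrix (metabolites x reactions)
bounds = [(0, 10), (0, None), (0, None)]   # flux bounds for R1, R2, R3
c = np.array([0.0, -1.0, 0.0])             # linprog minimizes, so maximize v2 via -v2

# Steady-state mass balance S v = 0 plus bounds; maximize biomass flux.
res = linprog(c, A_eq=S, b_eq=np.zeros(1), bounds=bounds, method="highs")
print("optimal fluxes (R1, R2, R3):", np.round(res.x, 3))   # expect [10, 10, 0]
```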
TNEEL workshop. Interactive methods for teaching end-of-life care.
Wilkie, Diana J; Lin, Yu-Chuan; Judge, M Kay M; Shannon, Sarah E; Corless, Inge B; Farber, Stuart J; Brown, Marie-Annette
2004-01-01
Nurse educators have identified lack of end-of-life content as a serious deficit in undergraduate nursing education. TNEEL, a new software program with tools for teaching end-of-life topics, was created to help educators overcome this problem. The authors implemented an experiential workshop to help educators learn how to use TNEEL's wide variety of educational tools. Trainers provided information about TNEEL and coached participants (N = 94) as they practiced using laptop computers to increase their familiarity and comfort in using the toolkit. Workshop participants completed pre- and posttest evaluations addressing their opinions and beliefs about using this computer tool. Findings support the workshop as an effective way to facilitate adoption of this innovative educational resource and support the development of a nation-wide training plan for TNEEL with experiential workshops.
Spielberg, Freya; Kurth, Ann E; Severynen, Anneleen; Hsieh, Yu-Hsiang; Moring-Parris, Daniel; Mackenzie, Sara; Rothman, Richard
2011-06-01
Providers in emergency care settings (ECSs) often face barriers to expanded HIV testing. We undertook formative research to understand the potential utility of a computer tool, "CARE," to facilitate rapid HIV testing in ECSs. Computer tool usability and acceptability were assessed among 35 adult patients, and provider focus groups were held, in two ECSs in Washington State and Maryland. The computer tool was usable by patients of varying computer literacy. Patients appreciated the tool's privacy and lack of judgment and their ability to reflect on HIV risks and create risk reduction plans. Staff voiced concerns regarding ECS-based HIV testing generally, including resources for follow-up of newly diagnosed people. Computer-delivered HIV testing support was acceptable and usable among low-literacy populations in two ECSs. Such tools may help circumvent some practical barriers associated with routine HIV testing in busy settings though linkages to care will still be needed.
A Computer Model for Red Blood Cell Chemistry
1996-10-01
There is a growing need for interactive computational tools for medical education and research. The most exciting ... paradigm for interactive education is simulation. Fluid Mod is a simulation-based computational tool developed in the late sixties and early seventies at ... to a modern Windows, object-oriented interface. This development will provide students with a useful computational tool for learning. More important ...
Voruganti, Teja R; O'Brien, Mary Ann; Straus, Sharon E; McLaughlin, John R; Grunfeld, Eva
2015-09-24
Health risk assessment tools compute an individual's risk of developing a disease. Routine use of such tools by primary care physicians (PCPs) is potentially useful in chronic disease prevention. We sought physicians' awareness and perceptions of the usefulness, usability and feasibility of performing assessments with computer-based risk assessment tools in primary care settings. Focus groups and usability testing with a computer-based risk assessment tool were conducted with PCPs from both university-affiliated and community-based practices. Analysis was derived from grounded theory methodology. PCPs (n = 30) were aware of several risk assessment tools although only select tools were used routinely. The decision to use a tool depended on how use impacted practice workflow and whether the tool had credibility. Participants felt that embedding tools in the electronic medical records (EMRs) system might allow for health information from the medical record to auto-populate into the tool. User comprehension of risk could also be improved with computer-based interfaces that present risk in different formats. In this study, PCPs chose to use certain tools more regularly because of usability and credibility. Despite there being differences in the particular tools a clinical practice used, there was general appreciation for the usefulness of tools for different clinical situations. Participants characterised particular features of an ideal tool, feeling strongly that embedding risk assessment tools in the EMR would maximise accessibility and use of the tool for chronic disease management. However, appropriate practice workflow integration and features that facilitate patient understanding at point-of-care are also essential.
Race and Emotion in Computer-Based HIV Prevention Videos for Emergency Department Patients
ERIC Educational Resources Information Center
Aronson, Ian David; Bania, Theodore C.
2011-01-01
Computer-based video provides a valuable tool for HIV prevention in hospital emergency departments. However, the type of video content and protocol that will be most effective remain underexplored and the subject of debate. This study employs a new and highly replicable methodology that enables comparisons of multiple video segments, each based on…
Some Problems of Computer-Aided Testing and "Interview-Like Tests"
ERIC Educational Resources Information Center
Smoline, D.V.
2008-01-01
Computer-based testing is an effective teacher's tool, intended to optimize course goals and assessment techniques in a comparatively short time. However, this is accomplished only if we deal with high-quality tests. It is strange, but despite the 100-year history of Testing Theory (see, Anastasi, A., Urbina, S. (1997). Psychological testing.…
ERIC Educational Resources Information Center
Amaral, Luiz A.; Meurers, Detmar
2011-01-01
This paper explores the motivation and prerequisites for successful integration of Intelligent Computer-Assisted Language Learning (ICALL) tools into current foreign language teaching and learning (FLTL) practice. We focus on two aspects, which we argue to be important for effective ICALL system development and use: (i) the relationship between…
The Effect of Color Choice on Learner Interpretation of a Cosmology Visualization
ERIC Educational Resources Information Center
Buck, Zoe
2013-01-01
As we turn more and more to high-end computing to understand the Universe at cosmological scales, dynamic visualizations of simulations will take on a vital role as perceptual and cognitive tools. In collaboration with the Adler Planetarium and University of California High-Performance AstroComputing Center (UC-HiPACC), I am interested in better…
Integrating Computational Science Tools into a Thermodynamics Course
ERIC Educational Resources Information Center
Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew
2018-01-01
Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of…
On aerodynamic wake analysis and its relation to total aerodynamic drag in a wind tunnel environment
NASA Astrophysics Data System (ADS)
Guterres, Rui M.
The present work was developed with the goal of advancing the state of the art in the application of three-dimensional wake data analysis to the quantification of aerodynamic drag on a body in a low speed wind tunnel environment. Analysis of the existing tools, their strengths and limitations, is presented. Improvements to the existing analysis approaches were made. Software tools were developed to integrate the analysis into a practical tool. A comprehensive derivation of the equations needed for drag computations based on three dimensional separated wake data is developed. A set of complete steps ranging from the basic mathematical concept to the applicable engineering equations is presented. An extensive experimental study was conducted. Three representative body types were studied in varying ground effect conditions. A detailed qualitative wake analysis using wake imaging and two and three dimensional flow visualization was performed. Several significant features of the flow were identified and their relation to the total aerodynamic drag established. A comprehensive wake study of this type is shown to be in itself a powerful tool for the analysis of the wake aerodynamics and its relation to body drag. Quantitative wake analysis techniques were developed. Significant post processing and data conditioning tools and precision analysis were developed. The quality of the data is shown to be in direct correlation with the accuracy of the computed aerodynamic drag. Steps are taken to identify the sources of uncertainty. These are quantified when possible and the accuracy of the computed results is seen to significantly improve. When post processing alone does not resolve issues related to precision and accuracy, solutions are proposed. The improved quantitative wake analysis is applied to the wake data obtained. Guidelines are established that will lead to more successful implementation of these tools in future research programs. Close attention is paid to implementation issues that are of crucial importance for the accuracy of the results and that are not detailed in the literature. The impact of ground effect on the flows at hand is qualitatively and quantitatively studied. Its impact on the accuracy of the computations, as well as the incompatibility of the wall drag with the theoretical model followed, is discussed. The newly developed quantitative analysis provides significantly increased accuracy. The aerodynamic drag coefficient is computed within one percent of the balance-measured value for the best cases.
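The abstract refers to a comprehensive derivation of the equations for computing drag from three-dimensional wake data. That derivation is not reproduced here; for orientation, the classical control-volume (wake-survey) result for steady, low-speed flow, which such derivations extend, expresses drag as an integral over a downstream survey plane:

```latex
D \;=\; \iint_{\text{wake}} \rho\, u\,(U_\infty - u)\, \mathrm{d}y\,\mathrm{d}z
\;+\; \iint_{\text{wake}} (p_\infty - p)\, \mathrm{d}y\,\mathrm{d}z,
```

where u and p are the local streamwise velocity and static pressure in the survey plane, and U∞ and p∞ are the freestream values.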
CREME: The 2011 Revision of the Cosmic Ray Effects on Micro-Electronics Code
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.; Barghouty, Abdulnasser F.; Reed, Robert A.; Sierawski, Brian D.; Watts, John W., Jr.
2012-01-01
We describe a tool suite, CREME, which combines existing capabilities of CREME96 and CREME86 with new radiation environment models and new Monte Carlo computational capabilities for single event effects and total ionizing dose.
High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics
Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis
2014-07-28
The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. They present Synergia's design principles and its performance on HPC platforms.
Performance evaluation of the Engineering Analysis and Data Systems (EADS) 2
NASA Technical Reports Server (NTRS)
Debrunner, Linda S.
1994-01-01
The Engineering Analysis and Data System (EADS) II (1) was installed in March 1993 to provide high performance computing for science and engineering at Marshall Space Flight Center (MSFC). EADS II increased the computing capabilities over the existing EADS facility in the areas of throughput and mass storage. EADS II includes a Vector Processor Compute System (VPCS), a Virtual Memory Compute System (CFS), a Common Output System (COS), as well as an Image Processing Station, Mini Super Computers, and Intelligent Workstations. These facilities are interconnected by a sophisticated network system. This work considers only the performance of the VPCS and the CFS. The VPCS is a Cray YMP. The CFS is implemented on an RS 6000 using the UniTree Mass Storage System. To better meet the science and engineering computing requirements, EADS II must be monitored, its performance analyzed, and appropriate modifications for performance improvement made. Implementing this approach requires tool(s) to assist in performance monitoring and analysis. In Spring 1994, PerfStat 2.0 was purchased to meet these needs for the VPCS and the CFS. PerfStat (2) is a set of tools that can be used to analyze both historical and real-time performance data. Its flexible design allows significant user customization. The user identifies what data is collected, how it is classified, and how it is displayed for evaluation. Both graphical and tabular displays are supported. The capability of the PerfStat tool was evaluated, appropriate modifications to EADS II to optimize throughput and enhance productivity were suggested and implemented, and the effects of these modifications on the system's performance were observed. In this paper, the PerfStat tool is described, then its use with EADS II is outlined briefly. Next, the evaluation of the VPCS, as well as the modifications made to the system, are described. Finally, conclusions are drawn and recommendations for future work are outlined.
Visual analysis of fluid dynamics at NASA's numerical aerodynamic simulation facility
NASA Technical Reports Server (NTRS)
Watson, Velvin R.
1991-01-01
A study is presented that describes and illustrates visualization tools used in Computational Fluid Dynamics (CFD) and indicates how these tools are likely to change by showing a projected resolution of the human-computer interface. The following are outlined using a graphically based test format: the revolution of human-computer environments for CFD research; comparison of current environments with the ideal; predictions for future CFD environments; what can be done to accelerate the improvements. The following comments are given: when acquiring visualization tools, potential rapid changes must be considered; environmental changes over the next ten years due to the human-computer interface cannot be fathomed; data flow packages such as AVS, apE, Explorer and Data Explorer are easy to learn and use for small problems, excellent for prototyping, but not so efficient for large problems; the approximation techniques used in visualization software must be appropriate for the data; it has become more cost effective to move jobs that fit onto workstations and run only memory intensive jobs on the supercomputer; use of three dimensional skills will be maximized when the three dimensional environment is built in from the start.
Reifman, Jaques; Kumar, Kamal; Wesensten, Nancy J; Tountas, Nikolaos A; Balkin, Thomas J; Ramakrishnan, Sridhar
2016-12-01
Computational tools that predict the effects of daily sleep/wake amounts on neurobehavioral performance are critical components of fatigue management systems, allowing for the identification of periods during which individuals are at increased risk for performance errors. However, none of the existing computational tools is publicly available, and the commercially available tools do not account for the beneficial effects of caffeine on performance, limiting their practical utility. Here, we introduce 2B-Alert Web, an open-access tool for predicting neurobehavioral performance, which accounts for the effects of sleep/wake schedules, time of day, and caffeine consumption, while incorporating the latest scientific findings in sleep restriction, sleep extension, and recovery sleep. We combined our validated Unified Model of Performance and our validated caffeine model to form a single, integrated modeling framework instantiated as a Web-enabled tool. 2B-Alert Web allows users to input daily sleep/wake schedules and caffeine consumption (dosage and time) to obtain group-average predictions of neurobehavioral performance based on psychomotor vigilance tasks. 2B-Alert Web is accessible at: https://2b-alert-web.bhsai.org. The 2B-Alert Web tool allows users to obtain predictions for mean response time, mean reciprocal response time, and number of lapses. The graphing tool allows for simultaneous display of up to seven different sleep/wake and caffeine schedules. The schedules and corresponding predicted outputs can be saved as a Microsoft Excel file; the corresponding plots can be saved as an image file. The schedules and predictions are erased when the user logs off, thereby maintaining privacy and confidentiality. The publicly accessible 2B-Alert Web tool is available for operators, schedulers, and neurobehavioral scientists as well as the general public to determine the impact of any given sleep/wake schedule, caffeine consumption, and time of day on performance of a group of individuals. This evidence-based tool can be used as a decision aid to design effective work schedules, guide the design of future sleep restriction and caffeine studies, and increase public awareness of the effects of sleep amounts, time of day, and caffeine on alertness. © 2016 Associated Professional Sleep Societies, LLC.
Computer Aided Design of Computer Generated Holograms for electron beam fabrication
NASA Technical Reports Server (NTRS)
Urquhart, Kristopher S.; Lee, Sing H.; Guest, Clark C.; Feldman, Michael R.; Farhoosh, Hamid
1989-01-01
Computer Aided Design (CAD) systems that have been developed for electrical and mechanical design tasks are also effective tools for the process of designing Computer Generated Holograms (CGHs), particularly when these holograms are to be fabricated using electron beam lithography. CAD workstations provide efficient and convenient means of computing, storing, displaying, and preparing for fabrication many of the features that are common to CGH designs. Experience gained in the process of designing CGHs with various types of encoding methods is presented. Suggestions are made so that future workstations may further accommodate the CGH design process.
Building a generalized distributed system model
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi; Foudriat, E. C.
1991-01-01
A modeling tool for both analysis and design of distributed systems is discussed. Since many research institutions have access to networks of workstations, the researchers decided to build a tool running on top of the workstations to function as a prototype as well as a distributed simulator for a computing system. The effects of system modeling on performance prediction in distributed systems and the effect of static locking and deadlocks on the performance predictions of distributed transactions are also discussed. While the probability of deadlock is considerably small, its effects on performance could be significant.
User’s Guide for the SAS (Stand-Off Attack Simulation) Computer Model.
1982-01-15
... computer model. SAS is an effective survivability and security system design tool which allows an analyst to compare the relative effectiveness of selected ... mounted against other systems during uploading for dispersal or for non-emergency relocation. GLCM and LANCE must be mobilized and formed into convoys ...
An interactive tool for visualization of spike train synchronization.
Terry, Kevin
2010-08-15
A number of studies have examined the synchronization of central and peripheral spike trains by applying signal analysis techniques in the time and frequency domains. These analyses can reveal the presence of one or more common neural inputs that produce synchronization. However, synchronization measurements can fluctuate significantly due to the inherent variability of neural discharges and a finite data record length. Moreover, the effect of these natural variations is further compounded by the number of parameters available for calculating coherence in the frequency domain and the number of indices used to quantify short-term synchronization (STS) in the time domain. The computational tool presented here provides the user with an interactive environment that dynamically calculates and displays spike train properties along with STS and coherence indices to show how these factors interact. It is intended for a broad range of users, from those who are new to synchronization to experienced researchers who want to develop more meaningful and effective computational and experimental studies. To ensure this freely available tool meets the needs of all users, there are two versions. The first is a stand-alone version for educational use that can run on any computer. The second version can be modified and expanded by researchers who want to explore more in-depth questions about synchronization. Therefore, the distribution and use of this tool should both improve the understanding of fundamental spike train synchronization dynamics and produce more efficient and meaningful synchronization studies. (c) 2010 Elsevier B.V. All rights reserved.
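The abstract describes dynamic calculation of frequency-domain coherence and time-domain synchronization indices for spike trains. A minimal sketch of one conventional frequency-domain calculation (not the interactive tool itself) is shown below: two spike trains are binned at 1 ms and their coherence is estimated with Welch's method. The spike times are synthetic.

```python
import numpy as np
from scipy.signal import coherence

def bin_spikes(spike_times_s, duration_s, bin_s=0.001):
    """Convert spike times (seconds) to a binary/count signal on a fixed grid."""
    edges = np.arange(0.0, duration_s + bin_s, bin_s)
    counts, _ = np.histogram(spike_times_s, bins=edges)
    return counts.astype(float)

# Synthetic trains sharing a common 10 Hz periodic input plus jitter.
rng = np.random.default_rng(3)
base = np.arange(0.05, 60.0, 0.1)                 # common 10 Hz drive
train_a = base + rng.normal(0, 0.004, base.size)
train_b = base + rng.normal(0, 0.004, base.size)

fs = 1000.0                                       # 1 ms bins -> 1 kHz sampling
xa, xb = bin_spikes(train_a, 60.0), bin_spikes(train_b, 60.0)
f, coh = coherence(xa, xb, fs=fs, nperseg=2048)   # Welch coherence estimate
print("coherence near 10 Hz: %.2f" % coh[np.argmin(np.abs(f - 10.0))])
```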
The Impact on Student Achievement of When CAS Technology Is Introduced
ERIC Educational Resources Information Center
Driver, David
2012-01-01
When a Computer Algebra System (CAS) is used as a pedagogical and functional tool in class and as a functional tool in exams, its effect on student achievement can be quite profound. The timing of when students are first introduced to a CAS has an impact on gains in student achievement. In this action research project, the CAS calculator was…
Multivariate Density Estimation and Remote Sensing
NASA Technical Reports Server (NTRS)
Scott, D. W.
1983-01-01
Current efforts to develop methods and computer algorithms to effectively represent multivariate data commonly encountered in remote sensing applications are described. While this may involve scatter diagrams, multivariate representations of nonparametric probability density estimates are emphasized. The density function provides a useful graphical tool for looking at data and a useful theoretical tool for classification. This approach is called a thunderstorm data analysis.
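The abstract emphasizes nonparametric multivariate density estimates as graphical and classification tools. A minimal sketch of that idea with a Gaussian kernel density estimate on synthetic two-band "pixel" data is shown below, using the density ratio of two classes as a simple classifier; this is an illustration of the general approach, not the author's specific algorithms.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(4)

# Synthetic two-band reflectance samples for two hypothetical land-cover classes.
class_a = rng.multivariate_normal([0.2, 0.4], [[0.01, 0.0], [0.0, 0.01]], size=500).T
class_b = rng.multivariate_normal([0.5, 0.3], [[0.02, 0.005], [0.005, 0.01]], size=500).T

kde_a = gaussian_kde(class_a)      # nonparametric density estimate, class A
kde_b = gaussian_kde(class_b)      # nonparametric density estimate, class B

pixel = np.array([[0.35], [0.35]])                    # one unlabeled pixel (2 bands)
label = "A" if kde_a(pixel) > kde_b(pixel) else "B"   # classify by higher estimated density
print("classified as", label)
```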
Dynamic VM Provisioning for TORQUE in a Cloud Environment
NASA Astrophysics Data System (ADS)
Zhang, S.; Boland, L.; Coddington, P.; Sevior, M.
2014-06-01
Cloud computing, delivered as Infrastructure-as-a-Service (IaaS), is attracting more interest from the commercial and educational sectors as a way to provide cost-effective computational infrastructure. It is an ideal platform for researchers who must share common resources but need to be able to scale up to massive computational requirements for specific periods of time. This paper presents the tools and techniques developed to allow the open source TORQUE distributed resource manager and Maui cluster scheduler to dynamically integrate OpenStack cloud resources into existing high throughput computing clusters.
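The abstract describes dynamically integrating OpenStack instances into a TORQUE/Maui cluster; the paper's own tooling is not shown here. The sketch below outlines one plausible flow, assuming the openstacksdk Python client and TORQUE's qmgr are available on the head node. The cloud name, image, flavor and network IDs are placeholders.

```python
import subprocess
import openstack  # openstacksdk

def boot_worker(conn, name, image_id, flavor_id, network_id):
    """Launch a cloud VM intended to act as a TORQUE compute node."""
    server = conn.compute.create_server(
        name=name,
        image_id=image_id,
        flavor_id=flavor_id,
        networks=[{"uuid": network_id}],
    )
    return conn.compute.wait_for_server(server)   # block until the VM is ACTIVE

def register_with_torque(hostname, cores=4):
    """Add the new node to the TORQUE server (run on the pbs_server host)."""
    subprocess.run(["qmgr", "-c", f"create node {hostname} np={cores}"], check=True)

if __name__ == "__main__":
    conn = openstack.connect(cloud="mycloud")   # placeholder entry in clouds.yaml
    vm = boot_worker(conn, "torque-worker-01",
                     image_id="IMAGE_ID", flavor_id="FLAVOR_ID", network_id="NET_ID")
    register_with_torque(vm.name)
```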
Pohjonen, Hanna; Ross, Peeter; Blickman, Johan G; Kamman, Richard
2007-01-01
Emerging technologies are transforming the workflows in healthcare enterprises. Computing grids and handheld mobile/wireless devices are providing clinicians with enterprise-wide access to all patient data and analysis tools on a pervasive basis. In this paper, emerging technologies are presented that provide computing grids and streaming-based access to image and data management functions, and system architectures that enable pervasive computing on a cost-effective basis. Finally, the implications of such technologies are investigated regarding the positive impacts on clinical workflows.
Using block pulse functions for seismic vibration semi-active control of structures with MR dampers
NASA Astrophysics Data System (ADS)
Rahimi Gendeshmin, Saeed; Davarnia, Daniel
2018-03-01
This article applies the idea of block pulse (BP) functions to the semi-active control of structures. BP functions provide effective tools for approximating complex problems. The applied control algorithm has a major effect on the performance of the controlled system and on the requirements of the control devices. In control problems, it is important to devise an accurate analytical technique with low computational cost. BP functions have proved to be fundamental tools in approximation problems and have been applied in disparate areas in recent decades. This study focuses on employing BP functions in the control algorithm to reduce computational cost. Magneto-rheological (MR) dampers are well-known semi-active devices that can be used to control the response of civil structures during earthquakes. For validation purposes, numerical simulations of a 5-story shear building frame with MR dampers are presented. The results of the suggested method were compared with those obtained by controlling the frame with an optimal control method based on linear quadratic regulator theory. The simulation results show that the suggested method can be helpful in reducing seismic structural responses. Moreover, the method has acceptable accuracy and agrees with the optimal control method at lower computational cost.
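The abstract builds its control algorithm on block pulse (BP) function approximation. The sketch below shows only the underlying approximation idea: projecting a sampled signal onto m BP functions (piecewise-constant over equal sub-intervals) by interval averaging and reconstructing it. The control application itself is not reproduced, and the example signal is invented.

```python
import numpy as np

def block_pulse_coeffs(signal, m):
    """Project a sampled signal onto m block pulse functions by averaging
    the samples that fall inside each equal-length sub-interval."""
    chunks = np.array_split(signal, m)
    return np.array([c.mean() for c in chunks])

def block_pulse_reconstruct(coeffs, n_samples):
    """Rebuild the piecewise-constant approximation on the original grid."""
    index_chunks = np.array_split(np.arange(n_samples), len(coeffs))
    out = np.empty(n_samples)
    for coef, idx in zip(coeffs, index_chunks):
        out[idx] = coef
    return out

t = np.linspace(0, 10, 1000)
x = np.sin(2 * np.pi * 0.5 * t) * np.exp(-0.1 * t)     # example response signal
c = block_pulse_coeffs(x, m=32)
x_bp = block_pulse_reconstruct(c, t.size)
print("max approximation error: %.3f" % np.max(np.abs(x - x_bp)))
```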
Environmental corrections of a dual-induction logging while drilling tool in vertical wells
NASA Astrophysics Data System (ADS)
Kang, Zhengming; Ke, Shizhen; Jiang, Ming; Yin, Chengfang; Li, Anzong; Li, Junjian
2018-04-01
With the development of Logging While Drilling (LWD) technology, dual-induction LWD logging is not only widely applied in deviated and horizontal wells but is also commonly used in vertical wells. Accordingly, it is necessary to simulate the response of LWD tools in vertical wells for logging interpretation. In this paper, the investigation characteristics of a dual-induction LWD tool and the effects of the tool structure, skin effect and drilling environment are simulated by the three-dimensional (3D) finite element method (FEM). In order to closely simulate the actual situation, the real structure of the tool is taken into account. The results demonstrate that the influence of the tool-structure background value can be eliminated; after the background is deducted, the computed values agree quantitatively with the analytical solution in homogeneous formations. The effect of measurement frequency can be effectively eliminated with a skin-effect correction chart. In addition, the measurement environment, including borehole size, mud resistivity, shoulder beds, layer thickness and invasion, affects the measured resistivity. To eliminate these effects, borehole correction charts, shoulder-bed correction charts and tornado charts are computed based on the real tool structure. Using these correction charts, well logging data can be corrected automatically by a suitable interpolation method, which is convenient and fast. Verified with actual logging data in vertical wells, this method can obtain the true resistivity of the formation.
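The abstract states that correction charts are applied to log data automatically by interpolation. A minimal sketch of that application step is shown below, assuming a borehole-correction chart tabulated over hole diameter and mud resistivity; the tabulated values are placeholders, not the paper's charts, and bilinear interpolation is only one suitable choice.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical borehole-correction chart: multiplicative factor on apparent
# resistivity, tabulated over hole diameter (inches) and mud resistivity (ohm-m).
diameters = np.array([6.0, 8.0, 10.0, 12.0])
mud_res = np.array([0.05, 0.1, 0.5, 1.0])
factors = np.array([
    [1.02, 1.01, 1.00, 1.00],
    [1.05, 1.03, 1.01, 1.00],
    [1.09, 1.06, 1.02, 1.01],
    [1.14, 1.09, 1.04, 1.02],
])

chart = RegularGridInterpolator((diameters, mud_res), factors)

def correct_apparent_resistivity(r_app, hole_d_in, rm_ohmm):
    """Apply the borehole-correction factor by interpolating in the chart."""
    return r_app * float(chart((hole_d_in, rm_ohmm)))

print(correct_apparent_resistivity(r_app=20.0, hole_d_in=8.5, rm_ohmm=0.2))
```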
NASA Astrophysics Data System (ADS)
Fairley, J. P.; Hinds, J. J.
2003-12-01
The advent of the World Wide Web in the early 1990s not only revolutionized the exchange of ideas and information within the scientific community, but also provided educators with a new array of teaching, informational, and promotional tools. Use of computer graphics and animation to explain concepts and processes can stimulate classroom participation and student interest in the geosciences, which has historically attracted students with strong spatial and visualization skills. In today's job market, graduates are expected to have knowledge of computers and the ability to use them for acquiring, processing, and visually analyzing data. Furthermore, in addition to promoting visibility and communication within the scientific community, computer graphics and the Internet can be informative and educational for the general public. Although computer skills are crucial for earth science students and educators, many pitfalls exist in implementing computer technology and web-based resources into research and classroom activities. Learning to use these new tools effectively requires a significant time commitment and careful attention to the source and reliability of the data presented. Furthermore, educators have a responsibility to ensure that students and the public understand the assumptions and limitations of the materials presented, rather than allowing them to be overwhelmed by "gee-whiz" aspects of the technology. We present three examples of computer technology in the earth sciences classroom: 1) a computer animation of water table response to well pumping, 2) a 3-D fly-through animation of a fault controlled valley, and 3) a virtual field trip for an introductory geology class. These examples demonstrate some of the challenges and benefits of these new tools, and encourage educators to expand the responsible use of computer technology for teaching and communicating scientific results to the general public.
Making Your Tools Useful to a Broader Audience
NASA Astrophysics Data System (ADS)
Lyness, M. D.; Broten, M. J.
2006-12-01
With the growth of Web Services and SOAP, the ability to connect and reuse computational and visualization tools from all over the world through web interfaces that can be displayed in any current browser has provided the means to construct an ideal online research environment. The age-old question of usability remains a major factor in determining whether a particular tool will find success in its community. An interface that can be understood purely by a user's intuition is desirable and more closely obtainable than ever before. Through the use of increasingly sophisticated web technologies, including JavaScript, AJAX, and the DOM, web interfaces can combine the advantages of the Internet with the functional capabilities of native applications, such as menus, partial page updates, background processing, and visual effects. Also, with computers becoming a normal part of the educational process, companies such as Google and Microsoft give us a synthetic intuition to build on as a foundation for new designs. Understanding how earth science researchers already know how to use computers will allow the VLab portal (http://vlab.msi.umn.edu) and other projects to create interfaces that will actually get used. To provide detailed communication with the users of VLab's computational tools, projects like the Porky Portlet (http://www.gorerle.com/vlab-wiki/index.php?title=Porky_Portlet) were spawned to give users a fully detailed, interactive visual representation of workflows as they progress. With the well-thought-out design of such tools and interfaces, researchers around the world will become accustomed to new, highly engaging, visual web-based research environments.
A vectorized Lanczos eigensolver for high-performance computers
NASA Technical Reports Server (NTRS)
Bostic, Susan W.
1990-01-01
The computational strategies used to implement a Lanczos-based-method eigensolver on the latest generation of supercomputers are described. Several examples of structural vibration and buckling problems are presented that show the effects of using optimization techniques to increase the vectorization of the computational steps. The data storage and access schemes and the tools and strategies that best exploit the computer resources are presented. The method is implemented on the Convex C220, the Cray 2, and the Cray Y-MP computers. Results show that very good computation rates are achieved for the most computationally intensive steps of the Lanczos algorithm and that the Lanczos algorithm is many times faster than other methods extensively used in the past.
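A minimal, unoptimized sketch of the basic Lanczos recurrence for a standard symmetric eigenproblem (written here in Python for illustration; it is not the vectorized implementation described above, which also addresses generalized vibration and buckling problems) shows the kernel operations, matrix-vector products and vector updates, that dominate the cost and are the natural targets for vectorization:

    import numpy as np

    def lanczos(A, k, seed=0):
        # k steps of the Lanczos recurrence on a symmetric matrix A (no
        # reorthogonalization or breakdown handling; illustration only).
        rng = np.random.default_rng(seed)
        n = A.shape[0]
        q = rng.standard_normal(n)
        q /= np.linalg.norm(q)
        q_prev = np.zeros(n)
        alphas, betas = [], []
        beta = 0.0
        for _ in range(k):
            w = A @ q - beta * q_prev       # matrix-vector product: the dominant cost
            alpha = q @ w
            w -= alpha * q                  # orthogonalize against the current vector
            beta = np.linalg.norm(w)
            alphas.append(alpha)
            betas.append(beta)
            q_prev, q = q, w / beta
        # Tridiagonal projection whose eigenvalues (Ritz values) approximate A's extremes.
        return np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)

    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 200))
    A = (A + A.T) / 2.0
    ritz_values = np.linalg.eigvalsh(lanczos(A, 30))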
Cutting tool form compensation system and method
Barkman, W.E.; Babelay, E.F. Jr.; Klages, E.J.
1993-10-19
A compensation system for a computer-controlled machining apparatus having a controller and including a cutting tool and a workpiece holder which are movable relative to one another along a preprogrammed path during a machining operation utilizes a camera and a vision computer for gathering information at a preselected stage of a machining operation relating to the actual shape and size of the cutting edge of the cutting tool and for altering the preprogrammed path in accordance with detected variations between the actual size and shape of the cutting edge and an assumed size and shape of the cutting edge. The camera obtains an image of the cutting tool against a background so that the cutting tool and background possess contrasting light intensities, and the vision computer utilizes the contrasting light intensities of the image to locate points therein which correspond to points along the actual cutting edge. Following a series of computations involving the determining of a tool center from the points identified along the tool edge, the results of the computations are fed to the controller where the preprogrammed path is altered as aforedescribed. 9 figures.
Cutting tool form compensation system and method
Barkman, William E.; Babelay, Jr., Edwin F.; Klages, Edward J.
1993-01-01
A compensation system for a computer-controlled machining apparatus having a controller and including a cutting tool and a workpiece holder which are movable relative to one another along a preprogrammed path during a machining operation utilizes a camera and a vision computer for gathering information at a preselected stage of a machining operation relating to the actual shape and size of the cutting edge of the cutting tool and for altering the preprogrammed path in accordance with detected variations between the actual size and shape of the cutting edge and an assumed size and shape of the cutting edge. The camera obtains an image of the cutting tool against a background so that the cutting tool and background possess contrasting light intensities, and the vision computer utilizes the contrasting light intensities of the image to locate points therein which correspond to points along the actual cutting edge. Following a series of computations involving the determining of a tool center from the points identified along the tool edge, the results of the computations are fed to the controller where the preprogrammed path is altered as aforedescribed.
Manninen, Tiina; Aćimović, Jugoslava; Havela, Riikka; Teppola, Heidi; Linne, Marja-Leena
2018-01-01
The possibility to replicate and reproduce published research results is one of the biggest challenges in all areas of science. In computational neuroscience, thousands of models are available. However, it is rarely possible to reimplement the models based on the information in the original publication, let alone rerun them, simply because the model implementations have not been made publicly available. We evaluate and discuss the comparability of a diverse selection of simulation tools: tools for biochemical reactions and spiking neuronal networks, and relatively new tools for growth in cell cultures. Replicability and reproducibility issues are considered for computational models that are equally diverse, including models of intracellular signal transduction in neurons and glial cells, single glial cells, neuron-glia interactions, and selected examples of spiking neuronal networks. We also address the comparability of the simulation results with one another, to assess whether the studied models can be used to answer similar research questions. In addition to presenting the challenges in reproducibility and replicability of published results in computational neuroscience, we highlight the need to develop recommendations and good practices for publishing simulation tools and computational models. Model validation and flexible model description must be an integral part of the tools used to simulate and develop computational models. Constant improvement in experimental techniques and recording protocols leads to increasing knowledge about the biophysical mechanisms of neural systems. This poses new challenges for computational neuroscience: extended or completely new computational methods and models may be required. Careful evaluation and categorization of the existing models and tools provide a foundation for these future needs, whether for constructing multiscale models or for extending models to incorporate additional or more detailed biophysical mechanisms. Improving the quality of publications in computational neuroscience, and enabling the progressive building of advanced computational models and tools, can be achieved only by adopting publishing standards that emphasize replicability and reproducibility of research results. PMID:29765315
A Computer Assisted Application in Preschool Education: Seasons and Their Characteristics
ERIC Educational Resources Information Center
Akçay, Nilufer Okur
2016-01-01
In this study, the aim was to determine the effect of computer-assisted instruction on academic success when teaching the subject of seasons to preschool students. The sample of the study consists of 86 children from the nursery classes of private and official schools in Agri city center. As data collecting tools, a General Achievement Test was used as…
Computer-Aided Air-Traffic Control In The Terminal Area
NASA Technical Reports Server (NTRS)
Erzberger, Heinz
1995-01-01
A developmental computer-aided system for automated management and control of arrival traffic at a large airport includes three integrated subsystems: the Traffic Management Advisor, the Descent Advisor, and the Final Approach Spacing Tool. A database that includes current wind measurements and mathematical models of the performance of different aircraft types contributes to effective operation of the system.
Experiments Using Cell Phones in Physics Classroom Education: The Computer-Aided "g" Determination
ERIC Educational Resources Information Center
Vogt, Patrik; Kuhn, Jochen; Muller, Sebastian
2011-01-01
This paper continues the collection of experiments that describe the use of cell phones as experimental tools in physics classroom education. We describe a computer-aided determination of the free-fall acceleration "g" using the acoustical Doppler effect. The Doppler shift is a function of the speed of the source. Since a free-falling objects…
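The underlying relation is standard acoustics rather than anything specific to the paper: for a phone falling away from a stationary microphone while emitting a tone of frequency f0, the received frequency after fall time t is

    f(t) = f0 * c / (c + g*t),

with c the speed of sound, so g can be estimated from a single reading as g = c*(f0 - f) / (f*t), or more robustly as the slope of c*(f0/f - 1) plotted against t.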
ERIC Educational Resources Information Center
Fridge, Evorell; Bagui, Sikha
2016-01-01
The goal of this research was to investigate the effects of automated testing software on levels of student reflection and student performance. This was a self-selecting, between subjects design that examined the performance of students in introductory computer programming classes. Participants were given the option of using the Web-CAT…
The Effect of Length of Exposure to CALL Technology on Young Iranian EFL Learners' Grammar Gain
ERIC Educational Resources Information Center
Sadeghi, Karim; Dousti, Masoumeh
2013-01-01
In the twenty-first century, integration of technology into education is a force worthy of contemplation. Among all the possible technological tools that can be integrated into EFL classes, computers seem to have achieved a more dominant position. One of the outstanding features of computers is their potential to present educational games and to…
Task scheduling in dataflow computer architectures
NASA Technical Reports Server (NTRS)
Katsinis, Constantine
1994-01-01
Dataflow computers provide a platform for the solution of a large class of computational problems, which includes digital signal processing and image processing. Many typical applications are represented by a set of tasks which can be repetitively executed in parallel as specified by an associated dataflow graph. Research in this area aims to model these architectures, develop scheduling procedures, and predict the transient and steady state performance. Researchers at NASA have created a model and developed associated software tools which are capable of analyzing a dataflow graph and predicting its runtime performance under various resource and timing constraints. These models and tools were extended and used in this work. Experiments using these tools revealed certain properties of such graphs that require further study. Specifically, the transient behavior at the beginning of the execution of a graph can have a significant effect on the steady state performance. Transformation and retiming of the application algorithm and its initial conditions can produce a different transient behavior and consequently different steady state performance. The effect of such transformations on the resource requirements or under resource constraints requires extensive study. Task scheduling to obtain maximum performance (based on user-defined criteria), or to satisfy a set of resource constraints, can also be significantly affected by a transformation of the application algorithm. Since task scheduling is performed by heuristic algorithms, further research is needed to determine if new scheduling heuristics can be developed that can exploit such transformations. This work has provided the initial development for further long-term research efforts. A simulation tool was completed to provide insight into the transient and steady state execution of a dataflow graph. A set of scheduling algorithms was completed which can operate in conjunction with the modeling and performance tools previously developed. Initial studies on the performance of these algorithms were done to examine the effects of application algorithm transformations as measured by such quantities as number of processors, time between outputs, time between input and output, communication time, and memory size.
Tools for Administration of a UNIX-Based Network
NASA Technical Reports Server (NTRS)
LeClaire, Stephen; Farrar, Edward
2004-01-01
Several computer programs have been developed to enable efficient administration of a large, heterogeneous, UNIX-based computing and communication network that includes a variety of computers connected to a variety of subnetworks. One program provides secure software tools for administrators to create, modify, lock, and delete accounts of specific users. This program also provides tools for users to change their UNIX passwords and log-in shells; these tools check for errors. Another program comprises a client and a server component that, together, provide a secure mechanism to create, modify, and query quota levels on a network file system (NFS) mounted by use of the VERITAS File System software. The client software resides on an internal secure computer with a secure Web interface; one can gain access to the client software from any authorized computer capable of running web-browser software. The server software resides on a UNIX computer configured with the VERITAS software system. Directories where VERITAS quotas are applied are NFS-mounted. Another program is a Web-based, client/server Internet Protocol (IP) address tool that facilitates maintenance and lookup of information about IP addresses for a network of computers.
NASA Astrophysics Data System (ADS)
Alameda, J. C.
2011-12-01
Development and optimization of computational science models, particularly on high performance computers (and, with the advent of ubiquitous multicore processors, practically on every system), has long been carried out with basic software tools: typically command-line compilers, debuggers, and performance tools that have not changed substantially since the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid programming models (such as OpenMP plus MPI) to take full advantage of high performance computers with an increasing core count per shared-memory node, has made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform (PTP), an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view: we use a set of scientific applications, each with its own challenges, both to drive improvements to the applications themselves and to identify shortcomings in Eclipse PTP from an application developer's perspective, which in turn determine the improvements we seek to make. We are also partnering with performance tool providers to drive higher quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials for computational science and engineering codes. Finally, we are partnering with the lead PTP developers at IBM to ensure we are as effective as possible within the Eclipse development community. We are also conducting training and outreach to our user community, including conference BOF sessions, monthly user calls, and an annual user meeting, so that we can best inform the improvements we make to Eclipse PTP. With these activities we endeavor to encourage the use of modern software engineering practices, as enabled through the Eclipse IDE, in computational science and engineering applications. These practices include proper use of source code repositories, tracking and rectifying issues, measuring and monitoring code performance against both optimizations and ever-changing software stacks and configurations on HPC systems, and ultimately developing and maintaining test suites -- practices that have become commonplace in many software endeavors but have lagged in the development of science applications. We believe that the increased complexity of both HPC systems and science applications demands better software engineering methods, preferably enabled by modern tools such as Eclipse PTP, to help the computational science community thrive as the HPC landscape evolves.
EEGLAB, SIFT, NFT, BCILAB, and ERICA: new tools for advanced EEG processing.
Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott
2011-01-01
We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments.
Use of Continuous Integration Tools for Application Performance Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vergara Larrea, Veronica G; Joubert, Wayne; Fuson, Christopher B
High performance computing systems are becoming increasingly complex, both in node architecture and in the multiple layers of software stack required to compile and run applications. As a consequence, the likelihood is increasing for application performance regressions to occur as a result of routine upgrades of system software components which interact in complex ways. The purpose of this study is to evaluate the effectiveness of continuous integration tools for application performance monitoring on HPC systems. In addition, this paper also describes a prototype system for application performance monitoring based on Jenkins, a Java-based continuous integration tool. The monitoring system described leverages several features in Jenkins to track application performance results over time. Preliminary results and lessons learned from monitoring applications on Cray systems at the Oak Ridge Leadership Computing Facility are presented.
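As an illustration of the kind of check such a monitoring job performs (a hedged sketch under our own assumptions, not the prototype described above; the file name, threshold, and launch command are hypothetical), a continuous-integration build might time a benchmark run, compare it against the stored history, and fail the build when a regression is detected:

    import json, statistics, subprocess, sys, time
    from pathlib import Path

    HISTORY = Path("perf_history.json")   # hypothetical per-benchmark history file
    SLOWDOWN_TOLERANCE = 1.10             # flag runs more than 10% slower than the baseline

    def run_benchmark(cmd):
        # Time one run of the benchmark command passed on the CI job's command line.
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        return time.perf_counter() - start

    def regressed(runtime, history):
        # Compare against the median of previous runs; too little history means no verdict.
        if len(history) < 3:
            return False
        return runtime > SLOWDOWN_TOLERANCE * statistics.median(history)

    if __name__ == "__main__":
        history = json.loads(HISTORY.read_text()) if HISTORY.exists() else []
        runtime = run_benchmark(sys.argv[1:])   # e.g. ["srun", "-n", "64", "./app"]
        failed = regressed(runtime, history)
        HISTORY.write_text(json.dumps(history + [runtime]))
        sys.exit(1 if failed else 0)            # a non-zero exit marks the CI build as failed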
Development of Computational Simulation Tools to Model Weapon Propulsors
2004-01-01
NASA Astrophysics Data System (ADS)
Lamb, Richard L.
2016-02-01
Within the last 10 years, new tools for assisting in the teaching and learning of academic skills and content within the context of science have arisen. These new tools include multiple types of computer software and hardware to include (video) games. The purpose of this study was to examine and compare the effect of computer learning games in the form of three-dimensional serious educational games, two-dimensional online laboratories, and traditional lecture-based instruction in the context of student content learning in science. In particular, this study examines the impact of dimensionality, or the ability to move along the X-, Y-, and Z-axis in the games. Study subjects (N = 551) were randomly selected using a stratified sampling technique. Independent strata subsamples were developed based upon the conditions of serious educational games, online laboratories, and lecture. The study also computationally models a potential mechanism of action and compares two- and three-dimensional learning environments. F-test results suggest a significant difference for the main effect of condition across the factor of content gain score with large effect. Overall, comparisons using computational models suggest that three-dimensional serious educational games increase the level of success in learning as measured with content examinations through greater recruitment and attributional retraining of cognitive systems. The study supports assertions in the literature that the use of games in higher dimensions (i.e., three-dimensional versus two-dimensional) helps to increase student understanding of science concepts.
Computational Tools and Facilities for the Next-Generation Analysis and Design Environment
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)
1997-01-01
This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including, real-time simulations, immersive systems, collaborative engineering environment, Web-based tools and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.
1992-09-01
to acquire or develop effective simulation tools to observe the behavior of a RISC implementation as it executes different types of programs. We choose ... Computer performance is measured by the amount of time required to execute a program. Performance encompasses two types of time, elapsed time ... and CPU time. Elapsed time is the time required to execute a program from start to finish. It includes the latency of input/output activities such as
Advancing crime scene computer forensics techniques
NASA Astrophysics Data System (ADS)
Hosmer, Chet; Feldman, John; Giordano, Joe
1999-02-01
Computers and network technology have become inexpensive and powerful tools that can be applied to a wide range of criminal activity. Computers have changed the world's view of evidence because computers are used more and more as tools in committing 'traditional crimes' such as embezzlement, theft, extortion and murder. This paper will focus on reviewing the current state-of-the-art of the data recovery and evidence construction tools used in both the field and laboratory for prosecution purposes.
A novel modification of the Turing test for artificial intelligence and robotics in healthcare.
Ashrafian, Hutan; Darzi, Ara; Athanasiou, Thanos
2015-03-01
The increasing demands of delivering higher quality global healthcare have resulted in a corresponding expansion in the development of computer-based and robotic healthcare tools that rely on artificially intelligent technologies. The Turing test was designed to assess artificial intelligence (AI) in computer technology. It remains an important qualitative tool for testing the next generation of medical diagnostics and medical robotics. We develop quantifiable, diagnostic-accuracy, meta-analytical evaluative techniques for the Turing test paradigm, and modify the Turing test to offer quantifiable diagnostic precision and statistical effect-size robustness in the assessment of AI for computer-based and robotic healthcare technologies. Such a modification of the Turing test, offering robust diagnostic scores for AI, can contribute to enhancing and refining the next generation of digital diagnostic technologies and healthcare robotics. Copyright © 2014 John Wiley & Sons, Ltd.
Predictive Model and Methodology for Heat Treatment Distortion Final Report CRADA No. TC-298-92
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikkel, D. J.; McCabe, J.
This project was a multi-lab, multi-partner CRADA involving LLNL, Los Alamos National Laboratory, Sandia National Laboratories, Oak Ridge National Laboratory, Martin Marietta Energy Systems, and the industrial partner, the National Center for Manufacturing Sciences (NCMS). A number of member companies of NCMS participated, including General Motors Corporation, Ford Motor Company, The Torrington Company, Gear Research, the Illinois Institute of Technology Research Institute, and Deformation Control Technology. LLNL was the lead laboratory for the metrology technology used for validation of the computational tool/methodology. LLNL was also the lead laboratory for the development of the software user interface for the computational tool. This report focuses on the participation of LLNL and NCMS. The purpose of the project was to develop a computational tool/methodology that engineers would use to predict the effects of heat treatment on the size and shape of industrial parts made of quench-hardenable alloys. Initially, the target application of the tool was gears for automotive power trains.
Simulation techniques in hyperthermia treatment planning
Paulides, MM; Stauffer, PR; Neufeld, E; Maccarini, P; Kyriakou, A; Canters, RAM; Diederich, C; Bakker, JF; Van Rhoon, GC
2013-01-01
Clinical trials have shown that hyperthermia (HT), i.e. an increase of tissue temperature to 39-44°C, significantly enhances the effectiveness of radiotherapy and chemotherapy (1). Driven by developments in computational techniques and computing power, personalized hyperthermia treatment planning (HTP) has matured and become a powerful tool for optimizing treatment quality. Electromagnetic, ultrasound, and thermal simulations using realistic clinical setups are now being performed to achieve patient-specific treatment optimization. In addition, extensive studies aimed at properly implementing novel HT tools and techniques, and at assessing the quality of HT, are becoming more common. In this paper, we review the simulation tools and techniques developed for clinical hyperthermia and evaluate their current status on the path from “model” to “clinic”. In addition, we illustrate the major techniques employed for validation and optimization. HTP has become an essential tool for the improvement, control, and assessment of HT treatment quality. As such, it plays a pivotal role in the quest to establish HT as an efficacious addition to multi-modality treatment of cancer. PMID:23672453
Computational nanomedicine: modeling of nanoparticle-mediated hyperthermal cancer therapy
Kaddi, Chanchala D; Phan, John H; Wang, May D
2016-01-01
Nanoparticle-mediated hyperthermia for cancer therapy is a growing area of cancer nanomedicine because of the potential for localized and targeted destruction of cancer cells. Localized hyperthermal effects are dependent on many factors, including nanoparticle size and shape, excitation wavelength and power, and tissue properties. Computational modeling is an important tool for investigating and optimizing these parameters. In this review, we focus on computational modeling of magnetic and gold nanoparticle-mediated hyperthermia, followed by a discussion of new opportunities and challenges. PMID:23914967
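Most tissue-scale models of this kind are built around some form of the Pennes bioheat equation (a standard formulation stated here for orientation, not a result taken from the review), with the nanoparticle heating entering as a volumetric source term:

    rho * c * dT/dt = div(k * grad T) + rho_b * c_b * w_b * (T_a - T) + Q_met + Q_np,

where rho, c, and k are the tissue density, specific heat, and thermal conductivity, w_b the blood perfusion rate, T_a the arterial temperature, Q_met the metabolic heat generation, and Q_np the power density deposited by the excited nanoparticles (for example from magnetic hysteresis losses or plasmonic absorption), which in turn depends on the particle size, shape, concentration, and excitation parameters discussed above.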
Practical quality control tools for curves and surfaces
NASA Technical Reports Server (NTRS)
Small, Scott G.
1992-01-01
Curves (geometry) and surfaces created by Computer Aided Geometric Design systems in the engineering environment must satisfy two basic quality criteria: the geometric shape must have the desired engineering properties; and the objects must be parameterized in a way which does not cause computational difficulty for geometric processing and engineering analysis. Interactive techniques are described which are in use at Boeing to evaluate the quality of aircraft geometry prior to Computational Fluid Dynamic analysis, including newly developed methods for examining surface parameterization and its effects.
Current And Future Directions Of Lens Design Software
NASA Astrophysics Data System (ADS)
Gustafson, Darryl E.
1983-10-01
The most effective environment for doing lens design continues to evolve as new computer hardware and software tools become available. Important recent hardware developments include (1) low-cost but powerful interactive multi-user 32-bit computers with virtual memory that are totally software-compatible with prior larger and more expensive members of the family, and (2) a rapidly growing variety of graphics devices for both hard-copy and screen graphics, including many with color capability. In addition, with optical design software readily accessible in many forms, optical design has become a part-time activity for a large number of engineers instead of being restricted to a small number of full-time specialists. A designer interface that is friendly for the part-time user while remaining efficient for the full-time designer is thus becoming more important as well as more practical. Along with these developments, software tools in other scientific and engineering disciplines are proliferating. Thus, the optical designer is less and less unique in his use of computer-aided techniques and faces the challenge and opportunity of efficiently communicating his designs to other computer-aided-design (CAD), computer-aided-manufacturing (CAM), structural, thermal, and mechanical software tools. This paper will address the impact of these developments on the current and future directions of the CODE V optical design software package, its implementation, and the resulting lens design environment.
Developing eThread pipeline using SAGA-pilot abstraction for large-scale structural bioinformatics.
Ragothaman, Anjani; Boddu, Sairam Chowdary; Kim, Nayong; Feinstein, Wei; Brylinski, Michal; Jha, Shantenu; Kim, Joohyun
2014-01-01
While most computational annotation approaches are sequence-based, threading methods are becoming increasingly attractive because the predicted structural information can uncover the underlying function. However, threading tools are generally compute-intensive, and the number of protein sequences from even small genomes such as prokaryotes is large, typically many thousands, prohibiting their application as a genome-wide structural systems biology tool. To leverage its utility, we have developed a pipeline for eThread, a meta-threading protein structure modeling tool, that can use computational resources efficiently and effectively. We employ a pilot-based approach that supports seamless data- and task-level parallelism and manages large variation in workload and computational requirements. Our scalable pipeline is deployed on Amazon EC2 and can efficiently select resources based upon task requirements. We present a runtime analysis to characterize the computational complexity of eThread and the EC2 infrastructure. Based on the results, we suggest a pathway to an optimized solution with respect to metrics such as time-to-solution or cost-to-solution. Our eThread pipeline can scale to support a large number of sequences and is expected to be a viable solution for genome-scale structural bioinformatics and structure-based annotation, particularly amenable to small genomes such as prokaryotes. The developed pipeline is easily extensible to other types of distributed cyberinfrastructure.
Wan, Songlin; Zhang, Xiangchao; He, Xiaoying; Xu, Min
2016-12-20
Computer controlled optical surfacing requires an accurate tool influence function (TIF) for reliable path planning and deterministic fabrication. Near the edge of the workpiece, the TIF has a nonlinear removal behavior, which causes a severe edge-roll phenomenon. In the present paper, a new edge pressure model is developed based on finite element analysis results. The model is represented as the product of a basic pressure function and a correcting function. The basic pressure distribution is calculated according to the surface shape of the polishing pad, and the correcting function is used to compensate for the errors caused by the edge effect. Practical experimental results demonstrate that the new model can accurately predict the edge TIFs for different overhang ratios. The relative error of the new edge model can be reduced to 15%.
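To make the structure of such a model concrete, the following sketch multiplies a nominal pressure profile by an edge-correcting factor that depends on the overhang ratio; the functional forms, parameters, and names are our own assumptions for illustration, not the calibrated finite-element-based model of the paper:

    import numpy as np

    def basic_pressure(r, pad_radius, p0=1.0):
        # Nominal pressure under the pad, taken here as uniform for simplicity.
        return np.where(np.abs(r) <= pad_radius, p0, 0.0)

    def edge_correction(r, pad_radius, overhang_ratio, boost=1.8, width=0.15):
        # Hypothetical correcting function: pressure concentrates near the workpiece
        # edge as the pad overhangs it, which is what drives edge roll.
        edge_pos = pad_radius * (1.0 - overhang_ratio)
        bump = boost * np.exp(-((r - edge_pos) / (width * pad_radius)) ** 2)
        return 1.0 + overhang_ratio * bump

    def edge_pressure(r, pad_radius, overhang_ratio):
        # Edge pressure model: basic pressure profile times the correcting function.
        return basic_pressure(r, pad_radius) * edge_correction(r, pad_radius, overhang_ratio)

    r = np.linspace(-1.0, 1.0, 401)                # radial coordinate, pad radius = 1
    p = edge_pressure(r, 1.0, overhang_ratio=0.3)  # pressure profile for a 30% overhang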
Optimization of vascular-targeting drugs in a computational model of tumor growth
NASA Astrophysics Data System (ADS)
Gevertz, Jana
2012-04-01
A biophysical tool is introduced that seeks to provide a theoretical basis for helping drug design teams assess the most promising drug targets and design optimal treatment strategies. The tool is grounded in a previously validated computational model of the feedback that occurs between a growing tumor and the evolving vasculature. In this paper, the model is used in particular to explore the therapeutic effectiveness of two drugs that target the tumor vasculature: angiogenesis inhibitors (AIs) and vascular disrupting agents (VDAs). Using sensitivity analyses, the impact of VDA dosing parameters is explored, as are the effects of administering a VDA with an AI. Further, a stochastic optimization scheme is utilized to identify an optimal dosing schedule for treatment with an AI and a chemotherapeutic. The treatment regimen identified can successfully halt simulated tumor growth, even after the cessation of therapy.
NASA Technical Reports Server (NTRS)
Eckhardt, Dave E., Jr.; Jipping, Michael J.; Wild, Chris J.; Zeil, Steven J.; Roberts, Cathy C.
1993-01-01
A study of computer engineering tool integration using the Portable Common Tool Environment (PCTE) Public Interface Standard is presented. Over a 10-week time frame, three existing software products were encapsulated to work in the Emeraude environment, an implementation of the PCTE version 1.5 standard. The software products used were a computer-aided software engineering (CASE) design tool, a software reuse tool, and a computer architecture design and analysis tool. The tool set was then demonstrated to work in a coordinated design process in the Emeraude environment. The project and the features of PCTE used are described, experience with the use of Emeraude environment over the project time frame is summarized, and several related areas for future research are summarized.
ERIC Educational Resources Information Center
Heuer, Herbert; Hegele, Mathias
2010-01-01
Mechanical tools are transparent in the sense that their input-output relations can be derived from their perceptible characteristics. Modern technology creates more and more tools that lack mechanical transparency, such as in the control of the position of a cursor by means of a computer mouse or some other input device. We inquired whether an…
Students' Use of Electronic Support Tools in Mathematics
ERIC Educational Resources Information Center
Crawford, Lindy; Higgins, Kristina N.; Huscroft-D'Angelo, Jacqueline N.; Hall, Lindsay
2016-01-01
This study investigated students' use of electronic support tools within a computer-based mathematics program. Electronic support tools are tools, such as hyperlinks or calculators, available within many computer-based instructional programs. A convenience sample of 73 students in grades 4-6 was selected to participate in the study. Students…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suresh, Niraj; Stephens, Sean A.; Adams, Lexor
Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as in processes with important implications for climate change and forest management. Quantitative size information on roots in their native environment is invaluable for studying root growth and environmental processes involving the plant. X-ray computed tomography (XCT) has been demonstrated to be an effective tool for in situ root scanning and analysis. Our group at the Environmental Molecular Sciences Laboratory (EMSL) has developed an XCT-based tool to image and quantitatively analyze plant root structures in their native soil environment. XCT data collected on a Prairie dropseed (Sporobolus heterolepis) specimen were used to visualize its root structure. A combination of the open-source software RooTrak and DDV was employed to segment the root from the soil and to calculate its isosurface, respectively. Our own computer script, named 3DRoot-SV, was developed and used to calculate root volume and surface area from a triangular mesh. The process, utilizing a unique combination of tools from imaging to quantitative root analysis, including the 3DRoot-SV computer script, is described.
ERIC Educational Resources Information Center
Carey, Cayelan C.; Gougis, Rebekka Darner
2017-01-01
Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…
ERIC Educational Resources Information Center
Rose, Simon P.; Habgood, M. P. Jacob; Jay, Tim
2017-01-01
Programming tools are being used in education to teach computer science to children as young as 5 years old. This research aims to explore young children's approaches to programming in two tools with contrasting programming interfaces, ScratchJr and Lightbot, and considers the impact of programming approaches on developing computational thinking.…
Murray-Davis, Beth; McDonald, Helen; Cross-Sudworth, Fiona; Ahmed, Rashid; Simioni, Julia; Dore, Sharon; Marrin, Michael; DeSantis, Judy; Leyland, Nicholas; Gardosi, Jason; Hutton, Eileen; McDonald, Sarah
2015-08-01
Adverse events occur in up to 10% of obstetric cases, and up to one half of these could be prevented. Case reviews and root cause analysis using a structured tool may help health care providers to learn from adverse events and to identify trends and recurring systems issues. We sought to establish the reliability of a root cause analysis computer application called Standardized Clinical Outcome Review (SCOR). We designed a mixed methods study to evaluate the effectiveness of the tool. We conducted qualitative content analysis of five charts reviewed by both the traditional obstetric quality assurance methods and the SCOR tool. We also determined inter-rater reliability by having four health care providers review the same five cases using the SCOR tool. The comparative qualitative review revealed that the traditional quality assurance case review process used inconsistent language and made serious, personalized recommendations for those involved in the case. In contrast, the SCOR review provided a consistent format for recommendations, a list of action points, and highlighted systems issues. The mean percentage agreement between the four reviewers for the five cases was 75%. The different health care providers completed data entry and assessment of the case in a similar way. Missing data from the chart and poor wording of questions were identified as issues affecting percentage agreement. The SCOR tool provides a standardized, objective, obstetric-specific tool for root cause analysis that may improve identification of risk factors and dissemination of action plans to prevent future events.
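For reference, the inter-rater figure quoted above is a plain percentage agreement; one common way to compute it (illustrative data and variable names, not the SCOR implementation) is the mean pairwise proportion of identical answers across reviewers:

    from itertools import combinations

    def percentage_agreement(ratings):
        # ratings: one answer list per reviewer, aligned question by question.
        # Returns the mean pairwise proportion of identical answers.
        pair_scores = []
        for a, b in combinations(ratings, 2):
            matches = sum(x == y for x, y in zip(a, b))
            pair_scores.append(matches / len(a))
        return sum(pair_scores) / len(pair_scores)

    # Four reviewers answering the same structured questions for one case (toy data).
    reviewers = [
        ["yes", "no", "systems", "yes"],
        ["yes", "no", "systems", "no"],
        ["yes", "yes", "systems", "yes"],
        ["yes", "no", "systems", "yes"],
    ]
    print(round(percentage_agreement(reviewers) * 100))  # 75 for this toy example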
The expanded role of computers in Space Station Freedom real-time operations
NASA Technical Reports Server (NTRS)
Crawford, R. Paul; Cannon, Kathleen V.
1990-01-01
The challenges that NASA and its international partners face in the real-time operation of Space Station Freedom necessitate an increased role for computers. In building the operational concepts concerning the role of the computer, the Space Station program is drawing on lessons learned from past programs, knowledge of the needs of future space programs, and technical advances in the computer industry. The computer is expected to contribute most significantly to real-time operations by forming a versatile operating architecture, a responsive operations tool set, and an environment that promotes effective and efficient utilization of Space Station Freedom resources.
Veksler, Vladislav D; Buchler, Norbou; Hoffman, Blaine E; Cassenti, Daniel N; Sample, Char; Sugrim, Shridat
2018-01-01
Computational models of cognitive processes may be employed in cyber-security tools, experiments, and simulations to address human agency and effective decision-making in keeping computational networks secure. Cognitive modeling can address multi-disciplinary cyber-security challenges requiring cross-cutting approaches across the human and computational sciences, such as the following: (a) adversarial reasoning and behavioral game theory to predict attacker subjective utilities and decision likelihood distributions, (b) human factors of cyber tools to address human system integration challenges, estimation of defender cognitive states, and opportunities for automation, (c) dynamic simulations involving attacker, defender, and user models to enhance studies of cyber epidemiology and cyber hygiene, and (d) training effectiveness research and training scenarios to address human cyber-security performance, maturation of cyber-security skill sets, and effective decision-making. Models may be initially constructed at the group level based on mean tendencies of each subject's subgroup, using known statistics such as specific skill proficiencies, demographic characteristics, and cultural factors. For more precise and accurate predictions, cognitive models may be fine-tuned to each individual attacker, defender, or user profile, and updated over time (based on recorded behavior) via techniques such as model tracing and dynamic parameter fitting.
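As a generic illustration of the dynamic parameter fitting mentioned above (a toy sketch; the model form, parameter, and names are assumptions rather than any particular cognitive architecture), a single skill parameter in an individual profile can be re-estimated from that user's accumulating behavior log:

    import numpy as np
    from scipy.optimize import minimize_scalar

    def predict_response_time(difficulty, skill):
        # Toy cognitive model: response time grows with task difficulty, shrinks with skill.
        return 0.5 + difficulty / skill

    def fit_skill(observed_times, difficulties):
        # Re-estimate the skill parameter from the behavior recorded so far (least squares).
        observed = np.asarray(observed_times)
        diffs = np.asarray(difficulties)
        loss = lambda skill: np.mean((observed - predict_response_time(diffs, skill)) ** 2)
        return minimize_scalar(loss, bounds=(0.1, 10.0), method="bounded").x

    # Update an individual profile as new interactions are logged.
    log_difficulty = [1.0, 2.0, 3.0, 2.5]
    log_response_time = [1.4, 2.3, 3.2, 2.7]
    skill_estimate = fit_skill(log_response_time, log_difficulty)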
Ertel, Rebekka Lund; Braae, Uffe Christian; Ngowi, Helena Aminiel; Johansen, Maria Vang
2017-01-01
Health education has been recognised as a specific intervention tool for the control of Taenia solium taeniosis/cysticercosis, but the efficacy of the tool remains to be evaluated. The aim of our study was to assess the effect of a computer-based T. solium health education tool, 'The Vicious Worm', on knowledge uptake among professionals, and to investigate attitudes towards the program. The study was carried out between March and May 2014 in Mbeya Region, Tanzania, where T. solium is endemic. The study was a pre- and post-assessment of a health education tool based on questionnaire surveys and focus group discussions to investigate knowledge and attitudes. A total of 79 study subjects participated in the study, including subjects from both the health and agriculture sectors. The health education consisted of 1½ h of individual practice with the computer program. The baseline questionnaire showed an overall knowledge of aspects of acquisition and transmission of T. solium infections (78%), porcine cysticercosis treatment (77%), the human tapeworm in general (72%), neurocysticercosis in general (49%), and porcine cysticercosis diagnosis (48%). However, there was a lack of knowledge on acquisition of neurocysticercosis (15%), prevention of T. solium taeniosis/cysticercosis (28%), and the relation between porcine cysticercosis, human cysticercosis, and taeniosis (32%). Overall, the study subjects' knowledge was significantly improved both immediately after (p=0.001) and two weeks after (p<0.001) the health education, and knowledge of most specific aspects was significantly improved immediately after and two weeks after the health education. The focus group discussions showed positive attitudes towards the program, and the study subjects found 'The Vicious Worm' efficient, simple, and appealing. The study revealed a good effect of 'The Vicious Worm', suggesting that it could be a useful health education tool, which should be further assessed and thereafter integrated in T. solium taeniosis/cysticercosis control. Copyright © 2016. Published by Elsevier B.V.
The educational effectiveness of computer-based instruction
NASA Astrophysics Data System (ADS)
Renshaw, Carl E.; Taylor, Holly A.
2000-07-01
Although numerous studies have shown that computer-based education is effective for enhancing rote memorization, the impact of these tools on higher-order cognitive skills, such as critical thinking, is less clear. Existing methods for evaluating educational effectiveness, such as surveys, quizzes and pre- or post-interviews, may not be effective for evaluating impact on critical thinking skills because students are not always aware of the effects the software has on their thought processes. We review an alternative evaluation strategy whereby the student's mastery of a specific cognitive skill is directly assessed both before and after participating in a computer-based exercise. Methodologies for assessing cognitive skill are based on recent advances in the fields of cognitive science. Results from two studies show that computer-based exercises can positively impact the higher-order cognitive skills of some students. However, a given exercise will not impact all students equally. This suggests that further work is needed to understand how and why CAI software is more or less effective within a given population.
Clinical nursing informatics. Developing tools for knowledge workers.
Ozbolt, J G; Graves, J R
1993-06-01
Current research in clinical nursing informatics is proceeding along three important dimensions: (1) identifying and defining nursing's language and structuring its data; (2) understanding clinical judgment and how computer-based systems can facilitate and not replace it; and (3) discovering how well-designed systems can transform nursing practice. A number of efforts are underway to find and use language that accurately represents nursing and that can be incorporated into computer-based information systems. These efforts add to understanding nursing problems, interventions, and outcomes, and provide the elements for databases from which nursing's costs and effectiveness can be studied. Research on clinical judgment focuses on how nurses (perhaps with different levels of expertise) assess patient needs, set goals, and plan and deliver care, as well as how computer-based systems can be developed to aid these cognitive processes. Finally, investigators are studying not only how computers can help nurses with the mechanics and logistics of processing information but also and more importantly how access to informatics tools changes nursing care.
Graphics Flutter Analysis Methods, an interactive computing system at Lockheed-California Company
NASA Technical Reports Server (NTRS)
Radovcich, N. A.
1975-01-01
An interactive computer graphics system, Graphics Flutter Analysis Methods (GFAM), was developed to complement FAMAS, a matrix-oriented batch computing system, and other computer programs in performing complex numerical calculations using a fully integrated data management system. GFAM has many of the matrix operation capabilities found in FAMAS, but on a smaller scale, and is utilized when the analysis requires a high degree of interaction between the engineer and computer, and schedule constraints exclude the use of batch entry programs. Applications of GFAM to a variety of preliminary design, development design, and project modification programs suggest that interactive flutter analysis using matrix representations is a feasible and cost effective computing tool.
A Grid Infrastructure for Supporting Space-based Science Operations
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Redman, Sandra H.; McNair, Ann R. (Technical Monitor)
2002-01-01
Emerging technologies for computational grid infrastructures have the potential for revolutionizing the way computers are used in all aspects of our lives. Computational grids are currently being implemented to provide large-scale, dynamic, and secure research and engineering environments based on standards and next-generation reusable software, enabling greater science and engineering productivity through shared resources and distributed computing at less cost than traditional architectures. Combined with the emerging technologies of high-performance networks, grids provide researchers, scientists, and engineers with the first real opportunity for an effective distributed collaborative environment with access to resources such as computational and storage systems, instruments, and software tools and services for the most computationally challenging applications.
Methodologies and Tools for Tuning Parallel Programs: 80% Art, 20% Science, and 10% Luck
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Bailey, David (Technical Monitor)
1996-01-01
The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessors. However, without effective means to monitor (and analyze) program execution, tuning the performance of parallel programs becomes exponentially difficult as program complexity and machine size increase. In the past few years, the ubiquitous introduction of performance tuning tools from various supercomputer vendors (Intel's ParAide, TMC's PRISM, CRI's Apprentice, and Convex's CXtrace) seems to indicate the maturity of performance instrumentation/monitor/tuning technologies and vendors'/customers' recognition of their importance. However, a few important questions remain: What kind of performance bottlenecks can these tools detect (or correct)? How time consuming is the performance tuning process? What are some important technical issues that remain to be tackled in this area? This workshop reviews the fundamental concepts involved in analyzing and improving the performance of parallel and heterogeneous message-passing programs. Several alternative strategies will be contrasted, and for each we will describe how currently available tuning tools (e.g. AIMS, ParAide, PRISM, Apprentice, CXtrace, ATExpert, Pablo, IPS-2) can be used to facilitate the process. We will characterize the effectiveness of the tools and methodologies based on actual user experiences at NASA Ames Research Center. Finally, we will discuss their limitations and outline recent approaches taken by vendors and the research community to address them.
Project-Based Teaching-Learning Computer-Aided Engineering Tools
ERIC Educational Resources Information Center
Simoes, J. A.; Relvas, C.; Moreira, R.
2004-01-01
Computer-aided design, computer-aided manufacturing, computer-aided analysis, reverse engineering, and rapid prototyping are tools that play a key role in product design. These are areas of technical knowledge that must be part of the curricula of engineering and industrial design courses. This paper describes our teaching experience of…
Research on OpenStack of open source cloud computing in colleges and universities’ computer room
NASA Astrophysics Data System (ADS)
Wang, Lei; Zhang, Dandan
2017-06-01
In recent years, cloud computing technology has developed rapidly, especially open-source cloud computing. Open-source cloud computing has attracted a large number of users through the advantages of open source code and low cost, and it is now being promoted and applied on a large scale. In this paper, we first briefly introduce the main functions and architecture of the open-source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this analysis, we describe the specific application and deployment of OpenStack in university computer rooms. The experimental results show that OpenStack can be used to deploy a university computer-room cloud efficiently and conveniently, with stable performance and good functionality.
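As a rough illustration of what one deployment step might look like from the administrator's side, the sketch below uses the openstacksdk Python client to boot identical lab VMs. The cloud profile, image, flavor, and network names are hypothetical placeholders; this is not code from the paper.

```python
# A minimal sketch, assuming openstacksdk and a cloud profile named "lab" in
# clouds.yaml; image/flavor/network names are hypothetical placeholders.
import openstack

conn = openstack.connect(cloud="lab")

image = conn.compute.find_image("ubuntu-22.04-lab")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("lab-net")

# Boot one identical desktop VM per student seat in the computer room.
for seat in range(1, 31):
    server = conn.compute.create_server(
        name=f"lab-seat-{seat:02d}",
        image_id=image.id,
        flavor_id=flavor.id,
        networks=[{"uuid": network.id}],
    )
    conn.compute.wait_for_server(server)
    print(server.name, "is ACTIVE")
```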
Instrumentation, performance visualization, and debugging tools for multiprocessors
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Fineman, Charles E.; Hontalas, Philip J.
1991-01-01
The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessor architectures. However, without effective means to monitor (and visualize) program execution, debugging and tuning parallel programs becomes intractably difficult as program complexity increases with the number of processors. Research on performance evaluation tools for multiprocessors is being carried out at ARC. Besides investigating new techniques for instrumenting, monitoring, and presenting the state of parallel program execution in a coherent and user-friendly manner, prototypes of software tools are being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Our current tool set, the Ames Instrumentation Systems (AIMS), incorporates features from various software systems developed in academia and industry. The execution of FORTRAN programs on the Intel iPSC/860 can be automatically instrumented and monitored. Performance data collected in this manner can be displayed graphically on workstations supporting X-Windows. We have successfully compared various parallel algorithms for computational fluid dynamics (CFD) applications in collaboration with scientists from the Numerical Aerodynamic Simulation Systems Division. By performing these comparisons, we show that performance monitors and debuggers such as AIMS are practical and can illuminate the complex dynamics that occur within parallel programs.
Computer Forensics Education - the Open Source Approach
NASA Astrophysics Data System (ADS)
Huebner, Ewa; Bem, Derek; Cheung, Hon
In this chapter we discuss the application of the open source software tools in computer forensics education at tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain in-depth understanding and appreciation of the computer forensic process as opposed to familiarity with one software product, however complex and multi-functional. With the access to all source programs the students become more than just the consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain necessary background to become the future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that without exception more than one suitable tool can be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic field work, as they gain confidence to use a variety of tools, not just a single product they are familiar with.
NASA Astrophysics Data System (ADS)
Lei, Xiaohui; Wang, Yuhui; Liao, Weihong; Jiang, Yunzhong; Tian, Yu; Wang, Hao
2011-09-01
Many regions in China are still threatened by frequent floods and water shortages. Consequently, the task of reproducing and predicting the hydrological processes in watersheds is demanding and unavoidable for reducing the risk of damage and loss. It is therefore necessary to develop an efficient and cost-effective hydrological tool for China, as many areas need to be modeled. Established hydrological tools such as Mike SHE and ArcSWAT (the soil and water assessment tool based on ArcGIS) show significant power in improving the precision of hydrological modeling in China by considering spatial variability in both land cover and soil type. However, adopting such commercial tools in a large developing country comes at a high cost. Commercial modeling tools usually contain large numbers of formulas, complicated data formats, and many preprocessing or postprocessing steps that may make it difficult for the user to carry out a simulation, lowering the efficiency of the modeling process. In addition, commercial hydrological models usually cannot be modified or improved to suit some of the special hydrological conditions in China. Some other hydrological models are open source but are integrated into commercial GIS systems. Therefore, by integrating the hydrological simulation code EasyDHM, a hydrological simulation tool named MWEasyDHM was developed on the basis of the open-source MapWindow GIS. Its purpose is to establish the first open-source, GIS-based distributed hydrological model tool in China by integrating modules for preprocessing, model computation, parameter estimation, result display, and analysis. MWEasyDHM provides users with a friendly MapWindow GIS interface, selectable multifunctional hydrological processing modules, and, more importantly, an efficient and cost-effective hydrological simulation tool. The general construction of MWEasyDHM consists of four major parts: (1) a general GIS module for hydrological analysis, (2) a preprocessing module for modeling inputs, (3) a model calibration module, and (4) a postprocessing module. The general GIS module for hydrological analysis is developed on the basis of the fully open-source GIS software MapWindow, which contains basic GIS functions. The preprocessing module is made up of three submodules: a DEM-based submodule for hydrological analysis, a submodule for default parameter calculation, and a submodule for the spatial interpolation of meteorological data. The calibration module supports parallel computation, real-time computation, and visualization. The postprocessing module includes model calibration and spatial visualization of model results in tabular form and on spatial grids. MWEasyDHM makes efficient modeling and calibration of EasyDHM possible and promises further development of cost-effective applications in various watersheds.
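One of the preprocessing submodules mentioned above is the spatial interpolation of meteorological data. The sketch below shows inverse-distance weighting of gauge rainfall onto model cells as a minimal illustration of such a step; it is our assumption about how a submodule of this kind might work, not MWEasyDHM code.

```python
# A minimal sketch of a spatial-interpolation step: inverse-distance weighting
# (IDW) of station rainfall onto model cells. Illustrative only.
import numpy as np

def idw(station_xy, station_values, grid_xy, power=2.0):
    """Interpolate point observations onto grid points by inverse-distance weighting."""
    station_xy = np.asarray(station_xy, dtype=float)       # (n_stations, 2)
    station_values = np.asarray(station_values, dtype=float)
    grid_xy = np.asarray(grid_xy, dtype=float)              # (n_cells, 2)

    d = np.linalg.norm(grid_xy[:, None, :] - station_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-6)                 # avoid division by zero at station cells
    w = 1.0 / d ** power
    return (w @ station_values) / w.sum(axis=1)

# Three rain gauges interpolated onto two model cells (coordinates in km).
print(idw([(0, 0), (10, 0), (0, 10)], [5.0, 12.0, 8.0], [(2, 3), (8, 1)]))
```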
ERIC Educational Resources Information Center
Khan, Zeenath Reza
2014-01-01
A year after the primary study that tested the impact of introducing blended learning and guided discovery to help teach computer applications to business students, this paper looks into the continued success of using guided discovery and blended learning with a learning management system in and out of classrooms to enhance student learning.…
ERIC Educational Resources Information Center
Sng, Dennis Cheng-Hong
The University of Illinois at Urbana-Champaign (UIUC) has a large campus computer network serving a community of about 20,000 users. With such a large network, it is inevitable that there are a wide variety of technologies co-existing in a multi-vendor environment. Effective network monitoring tools can help monitor traffic and link usage, as well…
Evaluation of verification and testing tools for FORTRAN programs
NASA Technical Reports Server (NTRS)
Smith, K. A.
1980-01-01
Two automated software verification and testing systems were developed for use in the analysis of computer programs. An evaluation of the static analyzer DAVE and the dynamic analyzer PET, which are used in the analysis of FORTRAN programs on Control Data (CDC) computers, is described. Both systems were found to be effective and complementary, and they are recommended for use in testing FORTRAN programs.
ERIC Educational Resources Information Center
Powell, Loreen M.; Wimmer, Hayden
2015-01-01
Computer programming is challenging to teach and difficult for students to learn. Instructors have searched for ways to improve student learning in programming courses. In an attempt to foster hands-on learning and to increase student learning outcomes in a programming course, the authors conducted an exploratory study to examine student created…
Parallel software tools at Langley Research Center
NASA Technical Reports Server (NTRS)
Moitra, Stuti; Tennille, Geoffrey M.; Lakeotes, Christopher D.; Randall, Donald P.; Arthur, Jarvis J.; Hammond, Dana P.; Mall, Gerald H.
1993-01-01
This document gives a brief overview of parallel software tools available on the Intel iPSC/860 parallel computer at Langley Research Center. It is intended to provide a source of information that is somewhat more concise than vendor-supplied material on the purpose and use of various tools. Each of the chapters on tools is organized in a similar manner covering an overview of the functionality, access information, how to effectively use the tool, observations about the tool and how it compares to similar software, known problems or shortfalls with the software, and reference documentation. It is primarily intended for users of the iPSC/860 at Langley Research Center and is appropriate for both the experienced and novice user.
Computational thinking in life science education.
Rubinstein, Amir; Chor, Benny
2014-11-01
We join the increasing call to take computational education of life science students a step further, beyond teaching mere programming and employing existing software tools. We describe a new course, focusing on enriching the curriculum of life science students with abstract, algorithmic, and logical thinking, and exposing them to the computational "culture." The design, structure, and content of our course are influenced by recent efforts in this area, collaborations with life scientists, and our own instructional experience. Specifically, we suggest that an effective course of this nature should: (1) devote time to explicitly reflect upon computational thinking processes, resisting the temptation to drift to purely practical instruction, (2) focus on discrete notions, rather than on continuous ones, and (3) have basic programming as a prerequisite, so students need not be preoccupied with elementary programming issues. We strongly recommend that the mere use of existing bioinformatics tools and packages should not replace hands-on programming. Yet, we suggest that programming will mostly serve as a means to practice computational thinking processes. This paper deals with the challenges and considerations of such computational education for life science students. It also describes a concrete implementation of the course and encourages its use by others.
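As an illustration of the kind of discrete, hands-on exercise such a course might use (this example is ours, not the authors'), the sketch below counts overlapping k-mers in a DNA sequence directly with a dictionary rather than reaching for an existing bioinformatics package.

```python
# A small, discrete algorithmic exercise for life-science students: count all
# overlapping k-mers in a DNA sequence. Illustrative example, not course material.
from collections import Counter

def kmer_counts(seq, k):
    """Count all overlapping k-mers in a DNA sequence."""
    seq = seq.upper()
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

counts = kmer_counts("ACGTACGTGACG", k=3)
print(counts.most_common(3))   # e.g. [('ACG', 3), ('CGT', 2), ...]
```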
The magic words: Using computers to uncover mental associations for use in magic trick design.
Williams, Howard; McOwan, Peter W
2017-01-01
The use of computational systems to aid in the design of magic tricks has been previously explored. Here further steps are taken in this direction, introducing the use of computer technology as a natural language data sourcing and processing tool for magic trick design purposes. Crowd sourcing of psychological concepts is investigated; further, the role of human associative memory and its exploitation in magical effects is explored. A new trick is developed and evaluated: a physical card trick partially designed by a computational system configured to search for and explore conceptual spaces readily understood by spectators.
NASA Technical Reports Server (NTRS)
Babrauckas, Theresa
2000-01-01
The Affordable High Performance Computing (AHPC) project demonstrated that high-performance computing based on a distributed network of computer workstations is a cost-effective alternative to vector supercomputers for running CPU- and memory-intensive design and analysis tools. The AHPC project created an integrated system called a Network Supercomputer. By connecting computer workstations through a network and utilizing the workstations when they are idle, the resulting distributed-workstation environment has the same performance and reliability levels as the Cray C90 vector supercomputer at less than 25 percent of the C90 cost. In fact, a cost comparison between a Cray C90 supercomputer and Sun workstations showed that the number of networked workstations equivalent to a C90 costs approximately 8 percent as much as the C90.
Cost-Effective CNC Part Program Verification Development for Laboratory Instruction.
ERIC Educational Resources Information Center
Chen, Joseph C.; Chang, Ted C.
2000-01-01
Describes a computer numerical control program verification system that checks a part program before its execution. The system includes character recognition, word recognition, a fuzzy-nets system, and a tool path viewer. (SK)
Data-Informed Large-Eddy Simulation of Coastal Land-Air-Sea Interactions
NASA Astrophysics Data System (ADS)
Calderer, A.; Hao, X.; Fernando, H. J.; Sotiropoulos, F.; Shen, L.
2016-12-01
Atmospheric flows in coastal areas have not been fully studied because of the complex processes emerging from land-air-sea interactions, e.g., abrupt changes in land topography, strong current shear, wave shoaling, and depth-limited wave breaking. The computational tools that have been applied to such littoral regions are mostly based on open-ocean assumptions, which often do not yield reliable solutions. The goal of the present study is to better understand some of these near-shore processes using the advanced computational tools developed in our research group. Our computational framework combines a large-eddy simulation (LES) flow solver for atmospheric flows, a sharp-interface immersed boundary method that can handle real, complex topographies (Calderer et al., J. Comp. Physics 2014), and a phase-resolved, depth-dependent wave model (Yang and Shen, J. Comp. Physics 2011). Using measured data taken at the FRF station in Duck, North Carolina, we validate and demonstrate the predictive capabilities of the present computational framework, whose results are in overall good agreement with the measurements under different wind-wave scenarios. We also analyse the effects of some of the complex processes captured by our simulation tools.
NASA Astrophysics Data System (ADS)
Ryu, Hoon; Jeong, Yosang; Kang, Ji-Hoon; Cho, Kyu Nam
2016-12-01
Modelling of multi-million-atom semiconductor structures is important because it not only predicts the properties of physically realizable novel materials but can also accelerate advanced device designs. This work describes a new Technology Computer-Aided Design (TCAD) tool for nanoelectronics modelling, which uses an sp3d5s* tight-binding approach to describe multi-million-atom structures and simulates electronic structures with high-performance computing (HPC), including atomic effects such as alloy and dopant disorder. Named the Quantum simulation tool for Advanced Nanoscale Devices (Q-AND), the tool shows good scalability on traditional multi-core HPC clusters, implying a strong capability for large-scale electronic structure simulations, with particularly remarkable performance enhancement on recent clusters of Intel Xeon Phi coprocessors. A review of a recent modelling study conducted to understand an experimental work on highly phosphorus-doped silicon nanowires is presented to demonstrate the utility of Q-AND. Having been developed through an Intel Parallel Computing Center project, Q-AND will be opened to the public to establish a sound framework for nanoelectronics modelling on advanced many-core HPC clusters. With details of the development methodology and an exemplary study of dopant electronics, this work presents a practical guideline for TCAD development for researchers in the field of computational nanoelectronics.
Dataflow Design Tool: User's Manual
NASA Technical Reports Server (NTRS)
Jones, Robert L., III
1996-01-01
The Dataflow Design Tool is a software tool for selecting a multiprocessor scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. The software tool implements graph-search algorithms and analysis techniques based on the dataflow paradigm. Dataflow analyses provided by the software are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool provides performance optimization through the inclusion of artificial precedence constraints among the schedulable tasks. The user interface and tool capabilities are described. Examples are provided to demonstrate the analysis, scheduling, and optimization functions facilitated by the tool.
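The sketch below illustrates, under our own simplified assumptions, the flavor of dataflow analysis described: computing the critical-path bound on latency and a work-per-processor bound for a small task graph. It is not the Dataflow Design Tool's algorithm.

```python
# A minimal sketch of dataflow-graph performance bounds: longest (critical) path
# through the precedence graph and a work/processor lower bound. Illustrative only.
import math
from collections import defaultdict

def dataflow_bounds(times, edges, num_procs):
    """times: {task: exec_time}; edges: list of (pred, succ) precedence pairs."""
    succs = defaultdict(list)
    indeg = defaultdict(int)
    for u, v in edges:
        succs[u].append(v)
        indeg[v] += 1

    # Longest-path (critical-path) computation in topological order.
    finish = {t: times[t] for t in times}
    ready = [t for t in times if indeg[t] == 0]
    while ready:
        u = ready.pop()
        for v in succs[u]:
            finish[v] = max(finish[v], finish[u] + times[v])
            indeg[v] -= 1
            if indeg[v] == 0:
                ready.append(v)

    critical_path = max(finish.values())                       # latency lower bound
    work_bound = math.ceil(sum(times.values()) / num_procs)    # resource lower bound
    return critical_path, max(critical_path, work_bound)

# Four tasks of a small control-law graph scheduled on 2 identical processors.
print(dataflow_bounds({"A": 2, "B": 3, "C": 4, "D": 1},
                      [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")], num_procs=2))
```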
Deshmukh, Rupesh K; Sonah, Humira; Bélanger, Richard R
2016-01-01
Aquaporins (AQPs) are channel-forming integral membrane proteins that facilitate the movement of water and many other small molecules. Compared to animals, plants contain a much higher number of AQPs in their genome. Homology-based identification of AQPs in sequenced species is feasible because of the high level of conservation of protein sequences across plant species. Genome-wide characterization of AQPs has highlighted several important aspects such as distribution, genetic organization, evolution and conserved features governing solute specificity. From a functional point of view, the understanding of AQP transport system has expanded rapidly with the help of transcriptomics and proteomics data. The efficient analysis of enormous amounts of data generated through omic scale studies has been facilitated through computational advancements. Prediction of protein tertiary structures, pore architecture, cavities, phosphorylation sites, heterodimerization, and co-expression networks has become more sophisticated and accurate with increasing computational tools and pipelines. However, the effectiveness of computational approaches is based on the understanding of physiological and biochemical properties, transport kinetics, solute specificity, molecular interactions, sequence variations, phylogeny and evolution of aquaporins. For this purpose, tools like Xenopus oocyte assays, yeast expression systems, artificial proteoliposomes, and lipid membranes have been efficiently exploited to study the many facets that influence solute transport by AQPs. In the present review, we discuss genome-wide identification of AQPs in plants in relation with recent advancements in analytical tools, and their availability and technological challenges as they apply to AQPs. An exhaustive review of omics resources available for AQP research is also provided in order to optimize their efficient utilization. Finally, a detailed catalog of computational tools and analytical pipelines is offered as a resource for AQP research.
Planetary-Scale Geospatial Data Analysis Techniques in Google's Earth Engine Platform (Invited)
NASA Astrophysics Data System (ADS)
Hancher, M.
2013-12-01
Geoscientists have increasing access to new tools for large-scale computing. With any tool, some tasks are easy and others hard. It is natural to look to new computing platforms to increase the scale and efficiency of existing techniques, but there is a more exciting opportunity to discover and develop a new vocabulary of fundamental analysis idioms that are made easy and effective by these new tools. Google's Earth Engine platform is a cloud computing environment for earth data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog includes a nearly complete archive of scenes from Landsat 4, 5, 7, and 8 that have been processed by the USGS, as well as a wide variety of other remotely sensed and ancillary data products. Earth Engine supports a just-in-time computation model that enables real-time preview during algorithm development and debugging as well as during experimental data analysis and open-ended data exploration. Data processing operations are performed in parallel across many computers in Google's datacenters. The platform automatically handles many traditionally onerous data management tasks, such as data format conversion, reprojection, resampling, and associating image metadata with pixel data. Early applications of Earth Engine have included the development of Google's global cloud-free fifteen-meter base map and global multi-decadal time-lapse animations, as well as numerous large and small experimental analyses by scientists from a range of academic, government, and non-governmental institutions, working in a wide variety of application areas including forestry, agriculture, urban mapping, and species habitat modeling. Patterns in the successes and failures of these early efforts have begun to emerge, sketching the outlines of a new set of simple and effective approaches to geospatial data analysis.
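A minimal sketch of the just-in-time, server-side analysis style described above, using the Earth Engine Python API; the collection ID, band names, and location are assumptions chosen for illustration and are not taken from the abstract.

```python
# A minimal sketch, assuming the earthengine-api package and an authenticated
# account; the Landsat 8 TOA collection ID and band names are assumptions.
import ee

ee.Initialize()

region = ee.Geometry.Point([-122.27, 37.87]).buffer(20_000)  # ~20 km study area

# Build a one-year, cloud-reduced composite server-side; nothing is downloaded.
composite = (ee.ImageCollection("LANDSAT/LC08/C02/T1_TOA")
             .filterBounds(region)
             .filterDate("2021-01-01", "2022-01-01")
             .median())

# NDVI from the NIR (B5) and red (B4) bands, then a regional mean at 30 m scale.
ndvi = composite.normalizedDifference(["B5", "B4"]).rename("NDVI")
mean_ndvi = ndvi.reduceRegion(reducer=ee.Reducer.mean(), geometry=region, scale=30)
print(mean_ndvi.getInfo())
```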
Computational tool for the early screening of monoclonal antibodies for their viscosities
Agrawal, Neeraj J; Helk, Bernhard; Kumar, Sandeep; Mody, Neil; Sathish, Hasige A.; Samra, Hardeep S.; Buck, Patrick M; Li, Li; Trout, Bernhardt L
2016-01-01
Highly concentrated antibody solutions often exhibit high viscosities, which present a number of challenges for antibody-drug development, manufacturing and administration. The antibody sequence is a key determinant for high viscosity of highly concentrated solutions; therefore, a sequence- or structure-based tool that can identify highly viscous antibodies from their sequence would be effective in ensuring that only antibodies with low viscosity progress to the development phase. Here, we present a spatial charge map (SCM) tool that can accurately identify highly viscous antibodies from their sequence alone (using homology modeling to determine the 3-dimensional structures). The SCM tool has been extensively validated at 3 different organizations, and has proved successful in correctly identifying highly viscous antibodies. As a quantitative tool, SCM is amenable to high-throughput automated analysis, and can be effectively implemented during the antibody screening or engineering phase for the selection of low-viscosity antibodies. PMID:26399600
Computer assisted learning (CAL) of oral manifestations of HIV disease.
Porter, S R; Telford, A; Chandler, K; Furber, S; Williams, J; Price, S; Scully, C; Triantos, D; Bain, L
1996-09-07
General dental practitioners (GDPs) in the UK may want additional education on relevant aspects of human immunodeficiency virus (HIV) disease. The aim of the present study was to develop and assess a computer assisted learning package on the oral manifestations of HIV disease of relevance to GDPs. A package was developed using a commercially available software development tool and assessed by a group of 75 GDPs interested in education and computers. Fifty-four (72%) of the GDPs completed a self-administered questionnaire on their opinions of the package. The majority reported that the package was easy to load and run, that it provided clear instructions and displays, and that it was a more effective educational tool than videotapes, audiotapes, professional journals and textbooks, and of similar benefit to post-graduate courses. The GDPs often commented favourably on the effectiveness of the clinical images and the use of questions and answers, although some had criticisms of these and other aspects of the package. As a consequence of this investigation, the package has been modified and distributed to GDPs in England and Wales.
Kellogg, Glen E; Fornabaio, Micaela; Chen, Deliang L; Abraham, Donald J; Spyrakis, Francesca; Cozzini, Pietro; Mozzarelli, Andrea
2006-05-01
Computational tools utilizing a unique empirical modeling system based on the hydrophobic effect and the measurement of logP(o/w) (the partition coefficient for solvent transfer between 1-octanol and water) are described. The associated force field, Hydropathic INTeractions (HINT), contains much rich information about non-covalent interactions in the biological environment because of its basis in an experiment that measures interactions in solution. HINT is shown to be the core of an evolving virtual screening system that is capable of taking into account a number of factors often ignored such as entropy, effects of solvent molecules at the active site, and the ionization states of acidic and basic residues and ligand functional groups. The outline of a comprehensive modeling system for virtual screening that incorporates these features is described. In addition, a detailed description of the Computational Titration algorithm is provided. As an example, three complexes of dihydrofolate reductase (DHFR) are analyzed with our system and these results are compared with the experimental free energies of binding.
Oulas, Anastasis; Karathanasis, Nestoras; Louloupi, Annita; Pavlopoulos, Georgios A; Poirazi, Panayiota; Kalantidis, Kriton; Iliopoulos, Ioannis
2015-01-01
Computational methods for miRNA target prediction are currently undergoing extensive review and evaluation. There is still a great need for improvement of these tools and bioinformatics approaches are looking towards high-throughput experiments in order to validate predictions. The combination of large-scale techniques with computational tools will not only provide greater credence to computational predictions but also lead to the better understanding of specific biological questions. Current miRNA target prediction tools utilize probabilistic learning algorithms, machine learning methods and even empirical biologically defined rules in order to build models based on experimentally verified miRNA targets. Large-scale protein downregulation assays and next-generation sequencing (NGS) are now being used to validate methodologies and compare the performance of existing tools. Tools that exhibit greater correlation between computational predictions and protein downregulation or RNA downregulation are considered the state of the art. Moreover, efficiency in prediction of miRNA targets that are concurrently verified experimentally provides additional validity to computational predictions and further highlights the competitive advantage of specific tools and their efficacy in extracting biologically significant results. In this review paper, we discuss the computational methods for miRNA target prediction and provide a detailed comparison of methodologies and features utilized by each specific tool. Moreover, we provide an overview of current state-of-the-art high-throughput methods used in miRNA target prediction.
The Effect of Interactive CD-ROM/Digitized Audio Courseware on Reading among Low-Literate Adults.
ERIC Educational Resources Information Center
Gretes, John A.; Green, Michael
1994-01-01
Compares a multimedia adult literacy instructional course, Reading to Educate and Develop Yourself (READY), to traditional classroom instruction by studying effects of replacing conventional learning tools with computer-assisted instruction (CD-ROMs and audio software). Results reveal that READY surpassed traditional instruction for virtually…
ERIC Educational Resources Information Center
Chen, Yu-Lung; Pan, Pei-Rong; Sung, Yao-Ting; Chang, Kuo-En
2013-01-01
Computer simulation has significant potential as a supplementary tool for effective conceptual-change learning based on the integration of technology and appropriate instructional strategies. This study elucidates misconceptions in learning on diodes and constructs a conceptual-change learning system that incorporates…
NASA Technical Reports Server (NTRS)
Kovarik, Madeline
1993-01-01
Intelligent computer aided training systems hold great promise for the application of this technology to mainstream education and training. Yet, this technology, which holds such a vast potential impact for the future of education and training, has had little impact beyond the enclaves of government research labs. This is largely due to the inaccessibility of the technology to those individuals in whose hands it can have the greatest impact, teachers and educators. Simply throwing technology at an educator and expecting them to use it as an effective tool is not the answer. This paper provides a background into the use of technology as a training tool. MindLink, developed by HyperTech Systems, provides trainers with a powerful rule-based tool that can be integrated directly into a Windows application. By embedding expert systems technology it becomes more accessible and easier to master.
Physically Based Virtual Surgery Planning and Simulation Tools for Personal Health Care Systems
NASA Astrophysics Data System (ADS)
Dogan, Firat; Atilgan, Yasemin
Virtual surgery planning and simulation tools have gained a great deal of importance in the last decade as a consequence of increasing information technology capacity. Modern hardware architectures, large-scale database systems, grid-based computer networks, agile development processes, better 3D visualization, and the other strengths of information technology bring the necessary instruments to almost every desk. What a decade ago required special software and sophisticated supercomputer environments now serves individual needs inside "tiny smart boxes" at reasonable prices. However, resistance to learning new computerized environments, insufficient training, and other old habits prevent effective use of IT resources by specialists in the health sector. In this paper, past and current developments in surgery planning and simulation tools are presented, and future directions and expectations for better electronic health care systems are investigated.
EEGLAB, SIFT, NFT, BCILAB, and ERICA: New Tools for Advanced EEG Processing
Delorme, Arnaud; Mullen, Tim; Kothe, Christian; Akalin Acar, Zeynep; Bigdely-Shamlo, Nima; Vankov, Andrey; Makeig, Scott
2011-01-01
We describe a set of complementary EEG data collection and processing tools recently developed at the Swartz Center for Computational Neuroscience (SCCN) that connect to and extend the EEGLAB software environment, a freely available and readily extensible processing environment running under Matlab. The new tools include (1) a new and flexible EEGLAB STUDY design facility for framing and performing statistical analyses on data from multiple subjects; (2) a neuroelectromagnetic forward head modeling toolbox (NFT) for building realistic electrical head models from available data; (3) a source information flow toolbox (SIFT) for modeling ongoing or event-related effective connectivity between cortical areas; (4) a BCILAB toolbox for building online brain-computer interface (BCI) models from available data, and (5) an experimental real-time interactive control and analysis (ERICA) environment for real-time production and coordination of interactive, multimodal experiments. PMID:21687590
Increasing Access and Usability of Remote Sensing Data: The NASA Protected Area Archive
NASA Technical Reports Server (NTRS)
Geller, Gary N.
2004-01-01
Although remote sensing data are now widely available, much of it at low or no cost, many managers of protected conservation areas do not have the expertise or tools to view or analyze it. Thus, access to it by the protected area management community is effectively blocked. The Protected Area Archive will increase access to remote sensing data by creating collections of satellite images of protected areas and packaging them with simple-to-use visualization and analytical tools. The user can easily locate the area and image of interest on a map, then display, roam, and zoom the image. A set of simple tools will be provided so the user can explore the data and employ it to assist in the management and monitoring of their area. The 'Phase 1' version requires only a Windows-based computer and basic computer skills, and may be of particular help to protected area managers in developing countries.
Evaluation of Visual Computer Simulator for Computer Architecture Education
ERIC Educational Resources Information Center
Imai, Yoshiro; Imai, Masatoshi; Moritoh, Yoshio
2013-01-01
This paper presents trial evaluation of a visual computer simulator in 2009-2011, which has been developed to play some roles of both instruction facility and learning tool simultaneously. And it illustrates an example of Computer Architecture education for University students and usage of e-Learning tool for Assembly Programming in order to…
EFL Learners' Attitudes towards Using Computers as a Learning Tool in Language Learning
ERIC Educational Resources Information Center
Kitchakarn, Orachorn
2015-01-01
The study was conducted to investigate attitudes toward using computers as a learning tool among undergraduate students in a private university. In this regard, some variables that might be potential antecedents of attitudes toward computers, including gender, experience of using computers, and perceived ability in using programs, were examined.…
DR2DI: a powerful computational tool for predicting novel drug-disease associations
NASA Astrophysics Data System (ADS)
Lu, Lu; Yu, Hua
2018-05-01
Finding new candidate diseases related to known drugs provides an effective route to fast and low-risk drug development. However, experimental identification of drug-disease associations is expensive and time-consuming. This motivates the development of in silico computational methods that can infer true drug-disease pairs with high confidence. In this study, we present a novel and powerful computational tool, DR2DI, for accurately uncovering potential associations between drugs and diseases using high-dimensional and heterogeneous omics data as information sources. Based on a unified and extended similarity kernel framework, DR2DI infers the unknown relationships between drugs and diseases using a regularized kernel classifier. Importantly, DR2DI employs a semi-supervised and global learning algorithm which can be applied to uncover the diseases (drugs) associated with known and novel drugs (diseases). In silico global validation experiments showed that DR2DI significantly outperforms two recent approaches for predicting drug-disease associations. Detailed case studies further demonstrated that the therapeutic indications and side effects of drugs predicted by DR2DI could be validated by existing database records and literature, suggesting that DR2DI can serve as a useful bioinformatic tool for identifying potential drug-disease associations and guiding drug repositioning. Our software and comparison codes are freely available at https://github.com/huayu1111/DR2DI.
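To make the kernel-based formulation concrete, the sketch below implements a generic two-sided regularized kernel smoother over drug and disease similarity kernels. It is in the spirit of a regularized kernel classifier but is our simplified illustration, not DR2DI's actual algorithm (see the authors' repository for that).

```python
# A minimal sketch of two-sided kernel regularized least squares for
# drug-disease association scoring. Illustrative only.
import numpy as np

def predict_associations(K_drug, K_disease, Y, lam=1.0):
    """K_drug: (n_d, n_d) similarity kernel; K_disease: (n_s, n_s); Y: (n_d, n_s) known 0/1 links."""
    n_d, n_s = Y.shape
    # Two-sided ridge smoothing: A = (K_d + lam*I)^-1 Y (K_s + lam*I)^-1,
    # then scores = K_d @ A @ K_s spread known links toward similar drugs/diseases.
    A = np.linalg.solve(K_drug + lam * np.eye(n_d), Y)
    A = np.linalg.solve(K_disease + lam * np.eye(n_s), A.T).T
    return K_drug @ A @ K_disease   # higher score = stronger predicted association

# Toy example: 3 drugs x 2 diseases with one known association.
K_d = np.array([[1.0, 0.8, 0.1], [0.8, 1.0, 0.2], [0.1, 0.2, 1.0]])
K_s = np.array([[1.0, 0.5], [0.5, 1.0]])
Y = np.array([[1, 0], [0, 0], [0, 0]], dtype=float)
print(predict_associations(K_d, K_s, Y, lam=0.5))
```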
Automatic Generation of Directive-Based Parallel Programs for Shared Memory Parallel Systems
NASA Technical Reports Server (NTRS)
Jin, Hao-Qiang; Yan, Jerry; Frumkin, Michael
2000-01-01
The shared-memory programming model is a very effective way to achieve parallelism on shared-memory parallel computers. As great progress has been made in hardware and software technologies, the performance of parallel programs using compiler directives has improved substantially. The introduction of OpenMP directives, the industry standard for shared-memory programming, has minimized the issue of portability. Owing to its ease of programming and its good performance, the technique has become very popular. In this study, we have extended CAPTools, a computer-aided parallelization toolkit, to automatically generate directive-based OpenMP parallel programs. We outline techniques used in the implementation of the tool and present test results on the NAS parallel benchmarks and ARC3D, a CFD application. This work demonstrates the great potential of using computer-aided tools to quickly port programs to parallel systems while achieving good performance.
Rigidity controllable polishing tool based on magnetorheological effect
NASA Astrophysics Data System (ADS)
Wang, Jia; Wan, Yongjian; Shi, Chunyan
2012-10-01
A stable and predictable material removal function (MRF) plays a crucial role in computer-controlled optical surfacing (CCOS). In the physical-contact polishing case, the stability of the MRF depends on intimate contact between the polishing interface and the workpiece. Rigid laps maintain this contact when polishing spherical surfaces, whose curvature does not vary with position on the surface. Such rigid laps provide a smoothing effect for mid-spatial-frequency errors, but they cannot be used on aspherical surfaces because they would destroy the surface figure. Flexible tools such as magnetorheological fluid or air bonnets conform to the surface [1]; they lack rigidity and provide little natural smoothing effect. We present a rigidity-controllable polishing tool that uses a magnetorheological elastomer (MRE) medium [2]. It provides the ability both to conform to an aspheric surface and to maintain a natural smoothing effect. Moreover, its rigidity can be controlled by the magnetic field. This paper presents the design, analysis, and stiffness-variation mechanism model of such a polishing tool [3].
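For readers unfamiliar with CCOS, the sketch below illustrates the background computation that makes a stable removal function so valuable: predicted removal is the dwell-time map convolved with the tool influence function. The Gaussian influence function and dwell map are hypothetical and are not from this paper.

```python
# Background illustration of the CCOS removal prediction (not from this paper):
# removal map = dwell-time map convolved with the tool influence function.
import numpy as np
from scipy.signal import fftconvolve

# Hypothetical Gaussian tool influence function (mm of removal per second of dwell).
x = np.linspace(-3, 3, 31)
X, Y = np.meshgrid(x, x)
tif = 1e-3 * np.exp(-(X**2 + Y**2))

dwell = np.zeros((101, 101))              # seconds of dwell per raster point
dwell[40:60, 40:60] = 2.0                 # dwell longer where more removal is needed

removal = fftconvolve(dwell, tif, mode="same")   # predicted removal map (mm)
print(f"peak predicted removal: {removal.max():.3f} mm")
```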
NASA Astrophysics Data System (ADS)
Pierce, S. A.
2017-12-01
Decision making for groundwater systems is becoming increasingly important as shifting water demands increasingly impact aquifers. As buffer systems, aquifers provide room for resilient responses and extend the timeframe for hydrological response. Yet the pace of impacts, climate shifts, and degradation of water resources is accelerating. To meet these new drivers, groundwater science is transitioning toward the emerging field of Integrated Water Resources Management, or IWRM. IWRM incorporates a broad array of dimensions, methods, and tools to address problems that tend to be complex. Computational tools and accessible cyberinfrastructure (CI) are needed to cross the chasm between science and society. Fortunately, cloud computing environments, such as the new Jetstream system, are evolving rapidly. While still targeting scientific user groups, systems such as Jetstream offer configurable cyberinfrastructure that enables interactive computing and data-analysis resources on demand. The web-based interfaces allow researchers to rapidly customize virtual machines, modify the computing architecture, and increase the usability of and access to advanced compute environments for broader audiences. The result enables dexterous configurations and opens up opportunities for IWRM modelers to expand the reach of analyses, the number of case studies, and the quality of engagement with stakeholders and decision makers. The acute need to identify improved IWRM solutions, paired with advanced computational resources, refocuses the attention of IWRM researchers on applications, workflows, and intelligent systems that are capable of accelerating progress. IWRM must address key drivers of community concern, implement transdisciplinary methodologies, and adapt and apply decision support tools in order to effectively support decisions about groundwater resource management. This presentation provides an overview of advanced computing services in the cloud, using integrated groundwater management case studies to highlight how cloud CI streamlines the process of setting up an interactive decision support system. Moreover, advances in artificial intelligence offer new techniques for old problems, from integrating data to adaptive sensing and from interactive dashboards to optimizing multi-attribute problems. The combination of scientific expertise, flexible cloud computing solutions, and intelligent systems opens new research horizons.
Software Tools on the Peregrine System | High-Performance Computing | NREL
NREL offers a variety of software tools on the Peregrine system, including debuggers and performance-analysis tools for understanding the behavior of MPI applications, Intel VTune, an environment for statistical computing and graphics, and VirtualGL/TurboVNC for remote visualization and analytics.
Serino, Andrea; Canzoneri, Elisa; Marzolla, Marilena; di Pellegrino, Giuseppe; Magosso, Elisa
2015-01-01
Stimuli from different sensory modalities occurring on or close to the body are integrated in a multisensory representation of the space surrounding the body, i.e., peripersonal space (PPS). PPS dynamically modifies depending on experience, e.g., it extends after using a tool to reach far objects. However, the neural mechanism underlying PPS plasticity after tool use is largely unknown. Here we use a combined computational-behavioral approach to propose and test a possible mechanism accounting for PPS extension. We first present a neural network model simulating audio-tactile representation in the PPS around one hand. Simulation experiments showed that our model reproduced the main property of PPS neurons, i.e., selective multisensory response for stimuli occurring close to the hand. We used the neural network model to simulate the effects of a tool-use training. In terms of sensory inputs, tool use was conceptualized as a concurrent tactile stimulation from the hand, due to holding the tool, and an auditory stimulation from the far space, due to tool-mediated action. Results showed that after exposure to those inputs, PPS neurons responded also to multisensory stimuli far from the hand. The model thus suggests that synchronous pairing of tactile hand stimulation and auditory stimulation from the far space is sufficient to extend PPS, such as after tool use. This prediction was confirmed by a behavioral experiment, where we used an audio-tactile interaction paradigm to measure the boundaries of PPS representation. We found that PPS extended after synchronous tactile-hand stimulation and auditory-far stimulation in a group of healthy volunteers. Control experiments in both simulation and behavioral settings showed that the same amount of tactile and auditory inputs administered out of synchrony did not change PPS representation. We conclude by proposing a simple, biologically plausible model to explain plasticity in PPS representation after tool use, which is supported by computational and behavioral data. PMID:25698947
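A heavily simplified sketch of the pairing mechanism the model and the experiment point to is given below: a soft-bounded Hebbian rule that strengthens far-space auditory inputs onto a hand-centred unit only when they co-occur with touch. It is our toy illustration, not the authors' network.

```python
# A toy illustration of touch-gated Hebbian strengthening of far-space auditory
# inputs onto a hand-centred multisensory unit. Not the authors' model.
import numpy as np

n_positions = 10                      # auditory positions: 0 = at the hand, 9 = far space
w = np.exp(-np.arange(n_positions))   # initial weights: only near sounds drive the hand unit

def train(w, far_position, tactile_on, trials=200, lr=0.05, w_max=1.0):
    w = w.copy()
    for _ in range(trials):
        audio = np.zeros(n_positions)
        audio[far_position] = 1.0             # sound from far space (e.g., the tool tip)
        post = 1.0 if tactile_on else 0.0     # hand unit driven by touch (holding the tool)
        w += lr * post * audio * (w_max - w)  # soft-bounded Hebbian update, gated by co-occurrence
    return w

synchronous = train(w, far_position=8, tactile_on=True)
control = train(w, far_position=8, tactile_on=False)
print("far-space weight after paired touch+sound:", round(synchronous[8], 2))   # approaches 1
print("far-space weight without touch (control): ", round(control[8], 5))       # unchanged
```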
2016-11-01
Evaluation of Visualization Tools for Computer Network Defense Analysts: Display Design, Methods, and Results for a User Study, by Christopher J Garneau and Robert F Erbacher, US Army Research Laboratory, November 2016 (reporting on work performed January 2013-September 2015). Approved for public release.
NASA Astrophysics Data System (ADS)
van Griensven, Ann; Haest, Pieter Jan; Broekx, Steven; Seuntjens, Piet; Campling, Paul; Ducos, Geraldine; Blaha, Ludek; Slobodnik, Jaroslav
2010-05-01
The European Union (EU) adopted the Water Framework Directive (WFD) in 2000, requiring that all aquatic ecosystems meet 'good status' by 2015. However, it is a major challenge for river basin managers to meet this requirement in river basins with a high population density as well as intensive agricultural and industrial activities. The EU-financed AQUAREHAB project (FP7) specifically examines the ecological and economic impact of innovative rehabilitation technologies for multi-pressured, degraded water bodies. For this purpose, a generic collaborative management tool, 'REACH-ER', is being developed that can be used by stakeholders, citizens, and water managers to evaluate the ecological and economic effects of different remedial actions on water bodies. The tool is built using databases from large-scale models simulating the hydrological dynamics of the river basin and sub-basins, the costs of the measures, and the effectiveness of the measures in terms of ecological impact. Knowledge rules are used to describe the relationships between these data in order to compute the flux concentrations or to compute the effectiveness of measures. The management tool specifically addresses nitrate pollution and pollution by organic micropollutants. Detailed models are also used to predict the effectiveness of site remedial technologies using readily available global data. Rules describing ecological impacts are derived from ecotoxicological data for (mixtures of) specific contaminants (msPAF) and from ecological indices relating effects to the presence of certain contaminants. Rules describing the cost-effectiveness of measures are derived from linear programming models identifying the least-cost combination of abatement measures to satisfy multi-pollutant reduction targets and from multi-criteria analysis.
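The least-cost measure-selection rule mentioned above can be illustrated with a small linear program. The sketch below uses scipy.optimize.linprog with hypothetical measure costs, removal efficiencies, and reduction targets; none of these values are drawn from the AQUAREHAB knowledge rules.

```python
# A minimal sketch of least-cost abatement-measure selection as a linear program.
# Costs, removal efficiencies, and targets are hypothetical placeholders.
from scipy.optimize import linprog

# Three candidate abatement measures, two pollutants (nitrate, micropollutant X).
costs = [120.0, 300.0, 80.0]            # cost per unit of each measure
removal = [[4.0, 1.0, 0.5],             # nitrate load removed per unit of each measure
           [0.2, 2.0, 0.1]]             # micropollutant load removed per unit of each measure
targets = [10.0, 3.0]                   # required load reductions per pollutant

# linprog minimizes c @ x subject to A_ub @ x <= b_ub, so flip signs to express ">= target".
res = linprog(c=costs,
              A_ub=[[-r for r in row] for row in removal],
              b_ub=[-t for t in targets],
              bounds=[(0, None)] * 3,
              method="highs")
print("least-cost mix of measures:", res.x, "total cost:", res.fun)
```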
Evaluating Internal Communication: The ICA Communication Audit.
ERIC Educational Resources Information Center
Goldhaber, Gerald M.
1978-01-01
The ICA Communication Audit is described in detail as an effective measurement procedure that can help an academic institution to evaluate its internal communication system. Tools, computer programs, analysis, and feedback procedures are described and illustrated. (JMF)
Preliminary design methods for fiber reinforced composite structures employing a personal computer
NASA Technical Reports Server (NTRS)
Eastlake, C. N.
1986-01-01
The objective of this project was to develop a user-friendly interactive computer program to be used as an analytical tool by structural designers. Its intent was to do preliminary, approximate stress analysis to help select or verify sizing choices for composite structural members. The approach to the project was to provide a subroutine which uses classical lamination theory to predict an effective elastic modulus for a laminate of arbitrary material and ply orientation. This effective elastic modulus can then be used in a family of other subroutines which employ the familiar basic structural analysis methods for isotropic materials. This method is simple and convenient to use but only approximate, as is appropriate for a preliminary design tool which will be subsequently verified by more sophisticated analysis. Additional subroutines have been provided to calculate laminate coefficient of thermal expansion and to calculate ply-by-ply strains within a laminate.
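A minimal sketch of the classical-lamination-theory step described above: build the reduced stiffness matrix for one ply, rotate it to each ply angle, assemble the in-plane A matrix, and invert it to obtain effective engineering constants for a symmetric laminate. The material properties and layup in the example are illustrative assumptions, not values from the report.

```python
# A minimal sketch of classical lamination theory for effective in-plane moduli.
# Material properties and layup below are illustrative assumptions.
import numpy as np

def q_matrix(E1, E2, G12, nu12):
    """Reduced stiffness matrix Q for a single orthotropic ply (plane stress)."""
    nu21 = nu12 * E2 / E1
    denom = 1.0 - nu12 * nu21
    return np.array([[E1 / denom, nu12 * E2 / denom, 0.0],
                     [nu12 * E2 / denom, E2 / denom, 0.0],
                     [0.0, 0.0, G12]])

def qbar(Q, theta_deg):
    """Transform Q to laminate axes for a ply at angle theta (degrees): Qbar = T^-1 Q R T R^-1."""
    th = np.radians(theta_deg)
    c, s = np.cos(th), np.sin(th)
    T = np.array([[c * c, s * s, 2 * c * s],
                  [s * s, c * c, -2 * c * s],
                  [-c * s, c * s, c * c - s * s]])
    R = np.diag([1.0, 1.0, 2.0])   # Reuter matrix (engineering shear strain)
    return np.linalg.inv(T) @ Q @ R @ T @ np.linalg.inv(R)

def effective_moduli(plies, t_ply, E1, E2, G12, nu12):
    """In-plane engineering constants of a symmetric laminate from its A matrix."""
    Q = q_matrix(E1, E2, G12, nu12)
    h = t_ply * len(plies)
    A = sum(qbar(Q, th) for th in plies) * t_ply   # in-plane stiffness matrix
    a = np.linalg.inv(A)                           # in-plane compliance
    return 1.0 / (h * a[0, 0]), 1.0 / (h * a[1, 1]), 1.0 / (h * a[2, 2])  # Ex, Ey, Gxy

# Example: a quasi-isotropic carbon/epoxy layup with illustrative property values (Pa, m).
print(effective_moduli([0, 45, -45, 90, 90, -45, 45, 0],
                       t_ply=0.125e-3, E1=140e9, E2=10e9, G12=5e9, nu12=0.3))
```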
On the usage of ultrasound computational models for decision making under ambiguity
NASA Astrophysics Data System (ADS)
Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron
2018-04-01
Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantify the differences using maximum amplitude and power spectrum density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.
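As an illustration of the two comparison metrics named above, the sketch below compares a synthetic "simulated" ultrasonic A-scan against a synthetic "measured" one by maximum amplitude and by Welch power spectral density; the signals are placeholders, not PNNL model or field data.

```python
# A minimal sketch of maximum-amplitude and PSD comparison metrics for
# simulated vs. measured ultrasonic signals. Synthetic placeholder data only.
import numpy as np
from scipy.signal import welch

fs = 100e6                                        # 100 MHz sampling rate
t = np.arange(0, 10e-6, 1 / fs)

def toneburst(t, f0, t0, amp):
    """Gaussian-windowed toneburst standing in for an ultrasonic echo."""
    return amp * np.exp(-((t - t0) / 0.5e-6) ** 2) * np.sin(2 * np.pi * f0 * (t - t0))

measured = toneburst(t, 5e6, 4.0e-6, 1.00) + 0.02 * np.random.randn(t.size)
simulated = toneburst(t, 5e6, 4.1e-6, 0.92)       # model: slightly shifted, lower amplitude

amp_ratio = simulated.max() / measured.max()
f, psd_meas = welch(measured, fs=fs, nperseg=256)
_, psd_sim = welch(simulated, fs=fs, nperseg=256)
psd_error = np.abs(psd_sim - psd_meas).sum() / psd_meas.sum()

print(f"max-amplitude ratio: {amp_ratio:.2f}, normalized PSD difference: {psd_error:.2f}")
```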
NASA Astrophysics Data System (ADS)
Ellins, K. K.; Eriksson, S. C.; Samsel, F.; Lavier, L.
2017-12-01
A new undergraduate, upper-level geoscience course was developed and taught by faculty and staff of the UT Austin Jackson School of Geosciences, the Center for Agile Technology, and the Texas Advanced Computing Center. The course examined the role of the visual arts in placing the scientific process and knowledge in a broader context and introduced students to innovations in the visual arts that promote scientific investigation through collaboration between geoscientists and artists. The course addressed (1) the role of the visual arts in teaching geoscience concepts and promoting geoscience learning; (2) the application of innovative visualization and artistic techniques to large volumes of geoscience data to enhance scientific understanding and to move scientific investigation forward; and (3) the illustrative power of art to communicate geoscience to the public. In-class activities and discussions, computer-lab instruction on the use of ParaView software, reading assignments, lectures, and group projects with presentations made up the two-credit, semester-long "special topics" course, which was taken by geoscience, computer science, and engineering students. Assessment of student learning was carried out by the instructors, and course evaluation was done by an external evaluator using rubrics, Likert-scale surveys, and focus groups. The course achieved its goals: students learned the concepts and techniques of the visual arts, as demonstrated by the final projects, which also communicated geologic concepts using what students had learned in the course. The basic skill of sketching for learning and best practices in visual communication were used extensively and, in most cases, very effectively. The use of an advanced visualization tool, ParaView, received mixed reviews because of the lack of time to really learn the tool and the fact that it is not a tool used routinely in geoscience. Senior students with advanced computer skills saw the importance of this tool. Students worked in teams, more or less effectively, and made suggestions for improving future offerings of the course.
Computer Network Attack: An Operational Tool?
2003-01-17
Spectrum of Conflict, Cyber Warfare, Preemptive Strike, Effects Based Targeting. Abstract: Computer Network Attack (CNA) is defined as...great deal of attention as the world's capabilities in cyber-warfare grow. Although addressing the wide-ranging legal aspects of CNA is beyond the...the notion of cyber-warfare has not yet developed to the point that international norms have been established. These norms will be developed in
Computer Aided Self-Forging Fragment Design,
1978-06-01
This value is reached so quickly that HEMP solutions using work hardening and those using only elastic-perfectly plastic formulations are quite...Elastic-Plastic Flow, UCRL-7322, Lawrence Radiation Laboratory, Livermore, California (1969). 4. Giroux, E. D., HEMP Users Manual, UCRL-51079...Laboratory, the HEMP computer code has been developed to serve as an effective design tool to simplify this task considerably. Using this code, warheads
NASA Astrophysics Data System (ADS)
Smetana, Lara Kathleen; Bell, Randy L.
2012-06-01
Researchers have explored the effectiveness of computer simulations for supporting science teaching and learning during the past four decades. The purpose of this paper is to provide a comprehensive, critical review of the literature on the impact of computer simulations on science teaching and learning, with the goal of summarizing what is currently known and providing guidance for future research. We report on the outcomes of 61 empirical studies dealing with the efficacy of, and implications for, computer simulations in science instruction. The overall findings suggest that simulations can be as effective, and in many ways more effective, than traditional (i.e. lecture-based, textbook-based and/or physical hands-on) instructional practices in promoting science content knowledge, developing process skills, and facilitating conceptual change. As with any other educational tool, the effectiveness of computer simulations is dependent upon the ways in which they are used. Thus, we outline specific research-based guidelines for best practice. Computer simulations are most effective when they (a) are used as supplements; (b) incorporate high-quality support structures; (c) encourage student reflection; and (d) promote cognitive dissonance. Used appropriately, computer simulations involve students in inquiry-based, authentic science explorations. Additionally, as educational technologies continue to evolve, advantages such as flexibility, safety, and efficiency deserve attention.
Hypercard Another Computer Tool.
ERIC Educational Resources Information Center
Geske, Joel
1991-01-01
Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…
CAROLINA CENTER FOR COMPUTATIONAL TOXICOLOGY
The Center will advance the field of computational toxicology through the development of new methods and tools, as well as through collaborative efforts. In each Project, new computer-based models will be developed and published that represent the state-of-the-art. The tools p...
Learning Disabled Students and Computers: A Teacher's Guide Book.
ERIC Educational Resources Information Center
Metzger, Merrianne; And Others
This booklet is provided as a guide to teachers working with learning disabled (LD) students who are interested in using computers as a teaching tool. The computer is presented as a powerful option to enhance educational opportunities for LD children. The author outlines the three main modes in educational computer use (tutor, tool, and tutee) and…
Reading and Computers: Issues for Theory and Practice. Computers and Education Series.
ERIC Educational Resources Information Center
Reinking, David, Ed.
Embodying two themes--that the computer can become an even more exciting instructional tool than it is today, and that the research necessary for developing the potential of this tool is already underway, this book explores the theoretical, research, and instructional issues concerning computers and reading. The titles of the essays and their…
ERIC Educational Resources Information Center
Schwarz, Christina V.; Meyer, Jason; Sharma, Ajay
2007-01-01
This study infused computer modeling and simulation tools in a 1-semester undergraduate elementary science methods course to advance preservice teachers' understandings of computer software use in science teaching and to help them learn important aspects of pedagogy and epistemology. Preservice teachers used computer modeling and simulation tools…
Computers: Tools of Oppression, Tools of Liberation.
ERIC Educational Resources Information Center
Taylor, Jefferey H.
This paper contends that students who are learning to use computers can benefit from having an overview of the history and social context of computers. The paper highlights some milestones in the history of computers, from ancient times to ENIAC to Altair to Bill Gates to the Internet. It also suggests some things for students to think about and…
MetaboTools: A comprehensive toolbox for analysis of genome-scale metabolic models
Aurich, Maike K.; Fleming, Ronan M. T.; Thiele, Ines
2016-08-03
Metabolomic data sets provide a direct read-out of cellular phenotypes and are increasingly generated to study biological questions. Previous work, by us and others, revealed the potential of analyzing extracellular metabolomic data in the context of the metabolic model using constraint-based modeling. With the MetaboTools, we make our methods available to the broader scientific community. The MetaboTools consist of a protocol, a toolbox, and tutorials of two use cases. The protocol describes, in a step-wise manner, the workflow of data integration, and computational analysis. The MetaboTools comprise the Matlab code required to complete the workflow described in the protocol. Tutorials explain the computational steps for integration of two different data sets and demonstrate a comprehensive set of methods for the computational analysis of metabolic models and stratification thereof into different phenotypes. The presented workflow supports integrative analysis of multiple omics data sets. Importantly, all analysis tools can be applied to metabolic models without performing the entire workflow. Taken together, the MetaboTools constitute a comprehensive guide to the intra-model analysis of extracellular metabolomic data from microbial, plant, or human cells. In conclusion, this computational modeling resource offers a broad set of computational analysis tools for a wide biomedical and non-biomedical research community.
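MetaboTools itself is MATLAB/COBRA code; the sketch below only illustrates, in Python, the kind of constraint-based optimization (flux balance analysis) that such a workflow performs, on a toy three-reaction network that is entirely made up.

```python
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix S (metabolites x reactions): uptake -> conversion -> export
S = np.array([[1, -1,  0],    # metabolite A
              [0,  1, -1]])   # metabolite B
lb = [0, 0, 0]                 # irreversible reactions
ub = [10, 1000, 1000]          # uptake limited to 10 flux units
c = [0, 0, -1]                 # maximize export flux v3 (linprog minimizes)

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=list(zip(lb, ub)))
print("optimal export flux:", -res.fun, "flux vector:", res.x)
```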
Discovering Synergistic Drug Combination from a Computational Perspective.
Ding, Pingjian; Luo, Jiawei; Liang, Cheng; Xiao, Qiu; Cao, Buwen; Li, Guanghui
2018-03-30
Synergistic drug combinations play an important role in the treatment of complex diseases. The identification of effective drug combinations is vital to further reduce side effects and improve therapeutic efficiency. In previous years, the in vitro method has been the main route to discover synergistic drug combinations, but it suffers from many limitations in time and resource consumption. Therefore, with the rapid development of computational models and the explosive growth of large-scale phenotypic data, computational methods for discovering synergistic drug combinations are an efficient and promising tool and contribute to precision medicine. How the computational model is constructed is the key question for computational methods, and different computational strategies yield different performance. In this review, recent advancements in computational methods for predicting effective drug combinations are surveyed from multiple aspects. First, the various datasets utilized to discover synergistic drug combinations are summarized. Second, we discuss feature-based approaches and partition these methods into two classes: feature-based methods in terms of similarity measures, and feature-based methods in terms of machine learning. Third, we discuss network-based approaches for uncovering synergistic drug combinations. Finally, we analyze and offer perspectives on computational methods for predicting effective drug combinations. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
The use of computer graphic simulation in the development of on-orbit tele-robotic systems
NASA Technical Reports Server (NTRS)
Fernandez, Ken; Hinman, Elaine
1987-01-01
This paper describes the use of computer graphic simulation techniques to resolve critical design and operational issues for robotic systems used for on-orbit operations. These issues are robot motion control, robot path-planning/verification, and robot dynamics. The major design issues in developing effective telerobotic systems are discussed, and the use of ROBOSIM, a NASA-developed computer graphic simulation tool, to address these issues is presented. Simulation plans for the Space Station and the Orbital Maneuvering Vehicle are presented and discussed.
NASA Technical Reports Server (NTRS)
Chan, J. S.; Freeman, J. A.
1984-01-01
The viscous, axisymmetric flow in the thrust chamber of the space shuttle main engine (SSME) was computed on the CRAY 205 computer using the general interpolants method (GIM) code. Results show that the Navier-Stokes codes can be used for these flows to study trends and viscous effects as well as determine flow patterns; but further research and development is needed before they can be used as production tools for nozzle performance calculations. The GIM formulation, numerical scheme, and computer code are described. The actual SSME nozzle computation showing grid points, flow contours, and flow parameter plots is discussed. The computer system and run times/costs are detailed.
Towards early software reliability prediction for computer forensic tools (case study).
Abu Talib, Manar
2016-01-01
Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
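As a rough illustration of the architecture-based idea described above, the sketch below estimates system reliability from a component transition matrix and per-component reliabilities, in the spirit of Cheung's Markov model. The paper's COSMIC-FFP weighting is not reproduced, and the component names and numbers are hypothetical rather than taken from the Forensic Toolkit Imager case study.

```python
import numpy as np

P = np.array([[0.0, 0.7, 0.3],   # acquisition component
              [0.0, 0.0, 1.0],   # parsing component
              [0.0, 0.0, 0.0]])  # reporting component (exit)
R = np.array([0.99, 0.97, 0.995])  # per-component reliabilities (assumed)

Q = R[:, None] * P                       # transitions that occur without failure
S = np.linalg.inv(np.eye(len(R)) - Q)    # expected successful visit counts
system_reliability = S[0, -1] * R[-1]    # reach last component, then it succeeds
print(f"estimated system reliability: {system_reliability:.4f}")
```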
NASA Astrophysics Data System (ADS)
Rachmat, Haris; Ibrahim, M. Rasidi; Hasan, Sulaiman bin
2017-04-01
Ultrasonic vibration assisted turning is one of the advanced technologies in machining. The design of the tool holder is a crucial step to make sure it is strong enough to handle all the forces in the turning process. Because the direct experimental approach is expensive, this paper used computational finite element simulation to predict the feasibility of the tool holder in terms of displacement and effective stress. SS201 and AISI 1045 materials were studied with sharp-corner and ramp-corner flexure hinges in the design. The results show that AISI 1045 with a ramp-corner flexure hinge was the best choice for production. The displacement is around 11.3 microns, the effective stress is 1.71e+008 N/m2, and the factor of safety is 3.10.
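A quick consistency check of the quoted numbers (a sketch, not the authors' workflow): factor of safety is yield strength divided by effective stress. The AISI 1045 yield strength used below is a typical handbook value and is assumed, not taken from the paper.

```python
yield_strength = 5.3e8        # Pa, assumed typical value for AISI 1045
effective_stress = 1.71e8     # Pa, from the simulation result quoted above
factor_of_safety = yield_strength / effective_stress
print(round(factor_of_safety, 2))   # ~3.1, consistent with the reported value
```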
Integrated design, execution, and analysis of arrayed and pooled CRISPR genome-editing experiments.
Canver, Matthew C; Haeussler, Maximilian; Bauer, Daniel E; Orkin, Stuart H; Sanjana, Neville E; Shalem, Ophir; Yuan, Guo-Cheng; Zhang, Feng; Concordet, Jean-Paul; Pinello, Luca
2018-05-01
CRISPR (clustered regularly interspaced short palindromic repeats) genome-editing experiments offer enormous potential for the evaluation of genomic loci using arrayed single guide RNAs (sgRNAs) or pooled sgRNA libraries. Numerous computational tools are available to help design sgRNAs with optimal on-target efficiency and minimal off-target potential. In addition, computational tools have been developed to analyze deep-sequencing data resulting from genome-editing experiments. However, these tools are typically developed in isolation and oftentimes are not readily translatable into laboratory-based experiments. Here, we present a protocol that describes in detail both the computational and benchtop implementation of an arrayed and/or pooled CRISPR genome-editing experiment. This protocol provides instructions for sgRNA design with CRISPOR (computational tool for the design, evaluation, and cloning of sgRNA sequences), experimental implementation, and analysis of the resulting high-throughput sequencing data with CRISPResso (computational tool for analysis of genome-editing outcomes from deep-sequencing data). This protocol allows for design and execution of arrayed and pooled CRISPR experiments in 4-5 weeks by non-experts, as well as computational data analysis that can be performed in 1-2 d by both computational and noncomputational biologists alike using web-based and/or command-line versions.
VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox
NASA Astrophysics Data System (ADS)
Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.
2016-12-01
VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also enabled with two novel features; the first one being a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
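The following is a minimal sketch of only one of the three metric families mentioned above: Morris-style elementary effects for a toy model. The one-at-a-time sampling and the model are placeholders, not the PLHS algorithm or any real Earth systems model.

```python
import numpy as np

def model(x):                       # toy model with unequal sensitivities
    return x[0] + 2.0 * x[1] ** 2 + 0.1 * x[2]

rng = np.random.default_rng(0)
n_params, n_repeats, delta = 3, 50, 0.1
effects = np.zeros((n_repeats, n_params))

for r in range(n_repeats):
    x = rng.uniform(0, 1 - delta, n_params)   # random base point in the unit cube
    base = model(x)
    for i in range(n_params):
        x_pert = x.copy()
        x_pert[i] += delta                    # perturb one parameter at a time
        effects[r, i] = (model(x_pert) - base) / delta

mu_star = np.abs(effects).mean(axis=0)        # mean absolute elementary effect
print("mu* per parameter:", np.round(mu_star, 2))
```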
Tools for Embedded Computing Systems Software
NASA Technical Reports Server (NTRS)
1978-01-01
A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talks and the key figures from each workshop presentation, together with the chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.
TokenPasser: A petri net specification tool. Thesis
NASA Technical Reports Server (NTRS)
Mittmann, Michael
1991-01-01
In computer program design it is essential to know the effectiveness of different design options in improving performance and dependability. This paper provides a description of a CAD tool for distributed hierarchical Petri nets. After a brief review of Petri nets, Petri net languages, and Petri net transducers, and descriptions of several current Petri net tools, the specifications and design of the TokenPasser tool are presented. TokenPasser is a tool that allows the design of distributed hierarchical systems based on Petri nets. A case study for an intelligent robotic system is conducted: a coordination structure with one dispatcher controlling three coordinators is built to model a proposed robotic assembly system. The system is implemented using TokenPasser, and the results are analyzed to allow judgment of the tool.
Development of Anthropometric Analogous Headforms. Phase 1.
1994-10-31
shown in figure 5. This surface mesh can then be transformed into polygon faces that are able to be rendered by the AutoCAD rendering tools. Rendering of...computer-generated surfaces. The material removal techniques require the programming of the tool path of the cutter and in some cases require specialized... tooling. Tool path programs are available to transfer the computer-generated surface into actual paths of the cutting tool. In cases where the
A Debugger for Computational Grid Applications
NASA Technical Reports Server (NTRS)
Hood, Robert; Jost, Gabriele; Biegel, Bryan (Technical Monitor)
2001-01-01
This viewgraph presentation gives an overview of a debugger for computational grid applications. Details are given on NAS parallel tools groups (including parallelization support tools, evaluation of various parallelization strategies, and distributed and aggregated computing), debugger dependencies, scalability, initial implementation, the process grid, and information on Globus.
NASA Technical Reports Server (NTRS)
Walker, Carrie K.
1991-01-01
A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.
The Nanoelectric Modeling Tool (NEMO) and Its Expansion to High Performance Parallel Computing
NASA Technical Reports Server (NTRS)
Klimeck, G.; Bowen, C.; Boykin, T.; Oyafuso, F.; Salazar-Lazaro, C.; Stoica, A.; Cwik, T.
1998-01-01
Material variations on an atomic scale enable the quantum mechanical functionality of devices such as resonant tunneling diodes (RTDs), quantum well infrared photodetectors (QWIPs), quantum well lasers, and heterostructure field effect transistors (HFETs).
LIVING SHORES GALLERY MX964015
An interactive computer kiosk will allow the Texas State Aquarium to deliver a considerable amount of information in an efficient and highly effective manner. Touch screen interactives have proven to be excellent teaching tools in the Aquarium's Jellies: Floating Phantoms galler...
ERIC Educational Resources Information Center
Ocal, Mehmet Fatih
2017-01-01
Integrating the properties of computer algebra systems and dynamic geometry environments, Geogebra became an effective and powerful tool for teaching and learning mathematics. One of the reasons that teachers use Geogebra in mathematics classrooms is to make students learn mathematics meaningfully and conceptually. From this perspective, the…
Effects of Thinking Style on Design Strategies: Using Bridge Construction Simulation Programs
ERIC Educational Resources Information Center
Sun, Chuen-Tsai; Wang, Dai-Yi; Chang, Yu-Yeh
2013-01-01
Computer simulation users can freely control operational factors and simulation results, repeat processes, make changes, and learn from simulation environment feedback. The focus of this paper is on simulation-based design tools and their effects on student learning processes in a group of 101 Taiwanese senior high school students. Participants…
Text Simplification and Comprehensible Input: A Case for an Intuitive Approach
ERIC Educational Resources Information Center
Crossley, Scott A.; Allen, David; McNamara, Danielle S.
2012-01-01
Texts are routinely simplified to make them more comprehensible for second language learners. However, the effects of simplification upon the linguistic features of texts remain largely unexplored. Here we examine the effects of one type of text simplification: intuitive text simplification. We use the computational tool, Coh-Metrix, to examine…
Effects of Computer-Based Visual Representation on Mathematics Learning and Cognitive Load
ERIC Educational Resources Information Center
Yung, Hsin I.; Paas, Fred
2015-01-01
Visual representation has been recognized as a powerful learning tool in many learning domains. Based on the assumption that visual representations can support deeper understanding, we examined the effects of visual representations on learning performance and cognitive load in the domain of mathematics. An experimental condition with visual…
DEVELOPMENT AND USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOLS FOR POLLUTION PREVENTION
The use of Computer-Aided Process Engineering (CAPE) and process simulation tools has become established industry practice to predict simulation software, new opportunities are available for the creation of a wide range of ancillary tools that can be used from within multiple sim...
ERIC Educational Resources Information Center
Goldfine, Alan H., Ed.
This workshop investigated how managers can evaluate, select, and effectively use information resource management (IRM) tools, especially data dictionary systems (DDS). An executive summary, which provides a definition of IRM as developed by workshop participants, precedes the keynote address, "Data: The Raw Material of a Paper Factory,"…
SOAP. A tool for the fast computation of photometry and radial velocity induced by stellar spots
NASA Astrophysics Data System (ADS)
Boisse, I.; Bonfils, X.; Santos, N. C.
2012-09-01
We define and put at the disposal of the community SOAP, Spot Oscillation And Planet, a software tool that simulates the effect of stellar spots and plages on radial velocimetry and photometry. This paper describes the tool release and provides instructions for its use. We present detailed tests with previous computations and real data to assess the code's performance and to validate its suitability. We characterize the variations of the radial velocity, line bisector, and photometric amplitude as a function of the main variables: projected stellar rotational velocity, filling factor of the spot, resolution of the spectrograph, linear limb-darkening coefficient, latitude of the spot, and inclination of the star. Finally, we model the spot distributions on the active stars HD 166435, TW Hya and HD 189733, which reproduce the observations. We show that the software is remarkably fast, allowing several evolutions in its capabilities that could be performed to study the next challenges in the exoplanetary field connected with the stellar variability. The tool is available at http://www.astro.up.pt/soap
High accurate interpolation of NURBS tool path for CNC machine tools
NASA Astrophysics Data System (ADS)
Liu, Qiang; Liu, Huan; Yuan, Songmei
2016-09-01
Feedrate fluctuation caused by approximation errors of interpolation methods has great effects on machining quality in NURBS interpolation, but few methods can efficiently eliminate or reduce it to a satisfying level without sacrificing the computing efficiency at present. In order to solve this problem, a high accurate interpolation method for NURBS tool path is proposed. The proposed method can efficiently reduce the feedrate fluctuation by forming a quartic equation with respect to the curve parameter increment, which can be efficiently solved by analytic methods in real-time. Theoretically, the proposed method can totally eliminate the feedrate fluctuation for any 2nd degree NURBS curves and can interpolate 3rd degree NURBS curves with minimal feedrate fluctuation. Moreover, a smooth feedrate planning algorithm is also proposed to generate smooth tool motion with considering multiple constraints and scheduling errors by an efficient planning strategy. Experiments are conducted to verify the feasibility and applicability of the proposed method. This research presents a novel NURBS interpolation method with not only high accuracy but also satisfying computing efficiency.
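A sketch of the general idea of forming a low-order polynomial in the parameter increment so that the chord length traversed per interpolation period matches the commanded feed (F times T). The quartic below comes from a second-order Taylor expansion of the curve and is meant to illustrate the approach; it may differ from the paper's exact formulation.

```python
import numpy as np

def next_parameter_increment(c1, c2, delta_L):
    """c1 = C'(u), c2 = C''(u) as numpy vectors; delta_L = desired chord length."""
    # |C' du + 0.5 C'' du^2|^2 = delta_L^2 expands to a quartic in du
    coeffs = [0.25 * np.dot(c2, c2),      # du^4
              np.dot(c1, c2),             # du^3
              np.dot(c1, c1),             # du^2
              0.0,                        # du^1
              -delta_L ** 2]              # constant term
    roots = np.roots(coeffs)
    real_pos = [r.real for r in roots if abs(r.imag) < 1e-12 and r.real > 0]
    return min(real_pos)                  # smallest positive real root

du = next_parameter_increment(np.array([3.0, 1.0]), np.array([0.5, -0.2]), delta_L=0.05)
print(f"parameter increment: {du:.6f}")
```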
Computational biology for ageing
Wieser, Daniela; Papatheodorou, Irene; Ziehm, Matthias; Thornton, Janet M.
2011-01-01
High-throughput genomic and proteomic technologies have generated a wealth of publicly available data on ageing. Easy access to these data, and their computational analysis, is of great importance in order to pinpoint the causes and effects of ageing. Here, we provide a description of the existing databases and computational tools on ageing that are available for researchers. We also describe the computational approaches to data interpretation in the field of ageing including gene expression, comparative and pathway analyses, and highlight the challenges for future developments. We review recent biological insights gained from applying bioinformatics methods to analyse and interpret ageing data in different organisms, tissues and conditions. PMID:21115530
The Computer as a Tool for Learning
Starkweather, John A.
1986-01-01
Experimenters from the beginning recognized the advantages computers might offer in medical education. Several medical schools have gained experience in such programs in automated instruction. Television images and graphic display combined with computer control and user interaction are effective for teaching problem solving. The National Board of Medical Examiners has developed patient-case simulation for examining clinical skills, and the National Library of Medicine has experimented with combining media. Advances from the field of artificial intelligence and the availability of increasingly powerful microcomputers at lower cost will aid further development. Computers will likely affect existing educational methods, adding new capabilities to laboratory exercises, to self-assessment and to continuing education. PMID:3544511
The magic words: Using computers to uncover mental associations for use in magic trick design
2017-01-01
The use of computational systems to aid in the design of magic tricks has been previously explored. Here further steps are taken in this direction, introducing the use of computer technology as a natural language data sourcing and processing tool for magic trick design purposes. Crowd sourcing of psychological concepts is investigated; further, the role of human associative memory and its exploitation in magical effects is explored. A new trick is developed and evaluated: a physical card trick partially designed by a computational system configured to search for and explore conceptual spaces readily understood by spectators. PMID:28792941
Ten quick tips for machine learning in computational biology.
Chicco, Davide
2017-01-01
Machine learning has become a pivotal tool for many projects in computational biology, bioinformatics, and health informatics. Nevertheless, beginners and biomedical researchers often do not have enough experience to run a data mining project effectively, and therefore can follow incorrect practices, that may lead to common mistakes or over-optimistic results. With this review, we present ten quick tips to take advantage of machine learning in any computational biology context, by avoiding some common errors that we observed hundreds of times in multiple bioinformatics projects. We believe our ten suggestions can strongly help any machine learning practitioner to carry on a successful project in computational biology and related sciences.
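One recurring tip in reviews like this is to avoid over-optimistic results by never evaluating a model on its own training data. Below is a minimal k-fold cross-validation sketch with scikit-learn on synthetic data; the data, model, and fold count are illustrative only.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 10))                          # 200 samples, 10 features
y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)

# Each fold is scored on data the model never saw during fitting
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(f"held-out accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```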
ERIC Educational Resources Information Center
Rolandsson, Lennart; Skogh, Inga-Britt; Männikkö Barbutiu, Sirkku
2017-01-01
Computing and computers are introduced in school as important examples of technology, sometimes as a subject matter of their own, and sometimes they are used as tools for other subjects. All in all, one might even say that "learning about" computing and computers is part of "learning about" technology. Lately, many countries…
A computer simulation of an adaptive noise canceler with a single input
NASA Astrophysics Data System (ADS)
Albert, Stuart D.
1991-06-01
A description of an adaptive noise canceler using Widrow's LMS algorithm is presented. A computer simulation of canceler performance (adaptive convergence time and frequency transfer function) was written for use as a design tool. The simulations, assumptions, and input parameters are described in detail. The simulation is used in a design example to predict the performance of an adaptive noise canceler in the simultaneous presence of both strong and weak narrow-band signals (a cosited frequency-hopping radio scenario). On the basis of the simulation results, it is concluded that the simulation is suitable for use as an adaptive noise canceler design tool; i.e., it can be used to evaluate the effect of design parameter changes on canceler performance.
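For readers unfamiliar with the algorithm, here is a minimal LMS sketch. Because the report's exact configuration is not given, the single-input case is interpreted here as a delayed-reference (adaptive line enhancer) arrangement; the filter length, step size, and test signal are all illustrative.

```python
import numpy as np

def lms_line_enhancer(x, n_taps=32, mu=0.005, delay=1):
    w = np.zeros(n_taps)
    y = np.zeros_like(x)
    for n in range(n_taps + delay, len(x)):
        ref = x[n - delay - n_taps:n - delay][::-1]   # delayed reference vector
        y[n] = w @ ref                                # filter output (narrow-band estimate)
        e = x[n] - y[n]                               # error = broadband residual
        w += 2 * mu * e * ref                         # LMS weight update
    return y, x - y

fs = 8000
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * rng.normal(size=t.size)
narrowband, residual = lms_line_enhancer(signal)
```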
IUWare and Computing Tools: Indiana University's Approach to Low-Cost Software.
ERIC Educational Resources Information Center
Sheehan, Mark C.; Williams, James G.
1987-01-01
Describes strategies for providing low-cost microcomputer-based software for classroom use on college campuses. Highlights include descriptions of the software (IUWare and Computing Tools); computing center support; license policies; documentation; promotion; distribution; staff, faculty, and user training; problems; and future plans. (LRW)
Development of Advanced Computational Aeroelasticity Tools at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Bartels, R. E.
2008-01-01
NASA Langley Research Center has continued to develop its long standing computational tools to address new challenges in aircraft and launch vehicle design. This paper discusses the application and development of those computational aeroelastic tools. Four topic areas will be discussed: 1) Modeling structural and flow field nonlinearities; 2) Integrated and modular approaches to nonlinear multidisciplinary analysis; 3) Simulating flight dynamics of flexible vehicles; and 4) Applications that support both aeronautics and space exploration.
Using Microsoft PowerPoint as an Astronomical Image Analysis Tool
NASA Astrophysics Data System (ADS)
Beck-Winchatz, Bernhard
2006-12-01
Engaging students in the analysis of authentic scientific data is an effective way to teach them about the scientific process and to develop their problem solving, teamwork and communication skills. In astronomy several image processing and analysis software tools have been developed for use in school environments. However, the practical implementation in the classroom is often difficult because the teachers may not have the comfort level with computers necessary to install and use these tools, they may not have adequate computer privileges and/or support, and they may not have the time to learn how to use specialized astronomy software. To address this problem, we have developed a set of activities in which students analyze astronomical images using basic tools provided in PowerPoint. These include measuring sizes, distances, and angles, and blinking images. In contrast to specialized software, PowerPoint is broadly available on school computers. Many teachers are already familiar with PowerPoint, and the skills developed while learning how to analyze astronomical images are highly transferable. We will discuss several practical examples of measurements, including the following:
- Variations in the distances to the sun and moon from their angular sizes
- Magnetic declination from images of shadows
- Diameter of the moon from lunar eclipse images
- Sizes of lunar craters
- Orbital radii of the Jovian moons and mass of Jupiter
- Supernova and comet searches
- Expansion rate of the universe from images of distant galaxies
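As a worked example of the first measurement in the list above: for an object of known physical diameter, distance follows from the small-angle relation distance = diameter / angular size (angle in radians). The angular sizes below are illustrative values a student might measure from images, not data from the paper.

```python
import math

moon_diameter_km = 3474.8
for label, ang_deg in [("perigee image", 0.558), ("apogee image", 0.490)]:
    distance_km = moon_diameter_km / math.radians(ang_deg)
    print(f"{label}: angular size {ang_deg} deg -> distance ~{distance_km:,.0f} km")
```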
gene2drug: a computational tool for pathway-based rational drug repositioning.
Napolitano, Francesco; Carrella, Diego; Mandriani, Barbara; Pisonero-Vaquero, Sandra; Sirci, Francesco; Medina, Diego L; Brunetti-Pierri, Nicola; di Bernardo, Diego
2018-05-01
Drug repositioning has been proposed as an effective shortcut to drug discovery. The availability of large collections of transcriptional responses to drugs enables computational approaches to drug repositioning directly based on measured molecular effects. We introduce a novel computational methodology for rational drug repositioning, which exploits the transcriptional responses following treatment with small molecule. Specifically, given a therapeutic target gene, a prioritization of potential effective drugs is obtained by assessing their impact on the transcription of genes in the pathway(s) including the target. We performed in silico validation and comparison with a state-of-art technique based on similar principles. We next performed experimental validation in two different real-case drug repositioning scenarios: (i) upregulation of the glutamate-pyruvate transaminase (GPT), which has been shown to induce reduction of oxalate levels in a mouse model of primary hyperoxaluria, and (ii) activation of the transcription factor TFEB, a master regulator of lysosomal biogenesis and autophagy, whose modulation may be beneficial in neurodegenerative disorders. A web tool for Gene2drug is freely available at http://gene2drug.tigem.it. An R package is under development and can be obtained from https://github.com/franapoli/gep2pep. dibernardo@tigem.it. Supplementary data are available at Bioinformatics online.
Veksler, Vladislav D.; Buchler, Norbou; Hoffman, Blaine E.; Cassenti, Daniel N.; Sample, Char; Sugrim, Shridat
2018-01-01
Computational models of cognitive processes may be employed in cyber-security tools, experiments, and simulations to address human agency and effective decision-making in keeping computational networks secure. Cognitive modeling can address multi-disciplinary cyber-security challenges requiring cross-cutting approaches over the human and computational sciences, such as the following: (a) adversarial reasoning and behavioral game theory to predict attacker subjective utilities and decision likelihood distributions, (b) human factors of cyber tools to address human-system integration challenges, estimation of defender cognitive states, and opportunities for automation, (c) dynamic simulations involving attacker, defender, and user models to enhance studies of cyber epidemiology and cyber hygiene, and (d) training effectiveness research and training scenarios to address human cyber-security performance, maturation of cyber-security skill sets, and effective decision-making. Models may be initially constructed at the group level based on mean tendencies of each subject's subgroup, using known statistics such as specific skill proficiencies, demographic characteristics, and cultural factors. For more precise and accurate predictions, cognitive models may be fine-tuned to each individual attacker, defender, or user profile, and updated over time (based on recorded behavior) via techniques such as model tracing and dynamic parameter fitting. PMID:29867661
Computer Assisted Chronic Disease Management: Does It Work? A Pilot Study Using Mixed Methods
Jones, Kay M.; Biezen, Ruby; Piterman, Leon
2013-01-01
Background. Key factors for effective chronic disease management (CDM) include the availability of practical and effective computer tools and continuing professional development/education. This study tested the effectiveness of a computer-assisted chronic disease management tool, a broadband-based service known as cdmNet, in increasing the development of care plans for patients with chronic disease in general practice. Methodology. Mixed methods were used: the breakthrough series methodology (workshops and plan-do-study-act cycles) and semistructured interviews. Results. Throughout the intervention period a pattern emerged suggesting that GPs' use of cdmNet initially increased and then plateaued, while practice nurses' and practice managers' roles expanded as they became more involved in using cdmNet. Seven main messages emerged from the GP interviews. Discussion. The overall use of cdmNet by participating GPs varied from "no change" to "significant change and developing many of the GPMPs (general practice management plans) using cdmNet." The variation may be due to several factors, not least allowing GPs adequate time to familiarise themselves with the software and to recognise the benefit of the team approach. Conclusion. The breakthrough series methodology facilitated upskilling GPs in the management of patients diagnosed with a chronic disease and in learning how to use the broadband-based service cdmNet. PMID:24959576
Automation of a DXA-based finite element tool for clinical assessment of hip fracture risk.
Luo, Yunhua; Ahmed, Sharif; Leslie, William D
2018-03-01
Finite element analysis of medical images is a promising tool for assessing hip fracture risk. Although a number of finite element models have been developed for this purpose, none of them have been routinely used in clinic. The main reason is that the computer programs that implement the finite element models have not been completely automated, and heavy training is required before clinicians can effectively use them. By using information embedded in clinical dual energy X-ray absorptiometry (DXA), we completely automated a DXA-based finite element (FE) model that we previously developed for predicting hip fracture risk. The automated FE tool can be run as a standalone computer program with the subject's raw hip DXA image as input. The automated FE tool had greatly improved short-term precision compared with the semi-automated version. To validate the automated FE tool, a clinical cohort consisting of 100 prior hip fracture cases and 300 matched controls was obtained from a local community clinical center. Both the automated FE tool and femoral bone mineral density (BMD) were applied to discriminate the fracture cases from the controls. Femoral BMD is the gold standard reference recommended by the World Health Organization for screening osteoporosis and for assessing hip fracture risk. The accuracy was measured by the area under ROC curve (AUC) and odds ratio (OR). Compared with femoral BMD (AUC = 0.71, OR = 2.07), the automated FE tool had a considerably improved accuracy (AUC = 0.78, OR = 2.61 at the trochanter). This work made a large step toward applying our DXA-based FE model as a routine clinical tool for the assessment of hip fracture risk. Furthermore, the automated computer program can be embedded into a web-site as an internet application. Copyright © 2017 Elsevier B.V. All rights reserved.
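The two accuracy measures quoted above can be sketched as follows on hypothetical data that mimics the 100-case/300-control design: AUC from predicted risk scores, and a simple 2x2 odds ratio at a chosen threshold (the paper's ORs are per standard deviation, which would instead require a logistic fit). This is not the authors' pipeline; the scores are synthetic.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
fracture = np.r_[np.ones(100), np.zeros(300)]                        # 100 cases, 300 controls
risk = np.r_[rng.normal(1.0, 1.0, 100), rng.normal(0.0, 1.0, 300)]   # FE-style risk score (synthetic)

auc = roc_auc_score(fracture, risk)

threshold = np.median(risk)
a = np.sum((risk >= threshold) & (fracture == 1))   # high-risk cases
b = np.sum((risk >= threshold) & (fracture == 0))   # high-risk controls
c = np.sum((risk < threshold) & (fracture == 1))    # low-risk cases
d = np.sum((risk < threshold) & (fracture == 0))    # low-risk controls
odds_ratio = (a * d) / (b * c)
print(f"AUC = {auc:.2f}, odds ratio at median threshold = {odds_ratio:.2f}")
```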
NASA Technical Reports Server (NTRS)
Arias, Adriel (Inventor)
2016-01-01
The main objective of the Holodeck Testbed is to create a cost-effective, realistic, and highly immersive environment that can be used to train astronauts, carry out engineering analysis, develop procedures, and support various operations tasks. Currently, the Holodeck Testbed allows users to step into a simulated ISS (International Space Station) and interact with objects, as well as perform Extra Vehicular Activities (EVA) on the surface of the Moon or Mars. The Holodeck Testbed uses the products being developed in the Hybrid Reality Lab (HRL). The HRL is combining technologies related to merging physical models with photo-realistic visuals to create a realistic and highly immersive environment. The lab also investigates technologies and concepts that are needed to allow it to be integrated with other testbeds, such as the gravity offload capability provided by the Active Response Gravity Offload System (ARGOS). My two main duties were to develop and animate models for use in the HRL environments and to work on a new way to interface with computers using Brain Computer Interface (BCI) technology. For my first task, I created precise virtual tool models (accurate to within thousandths or hundredths of an inch). To make these tools even more realistic, I produced animations so they would have the same mechanical features as the tools in real life. The computer models were also used to create 3D printed replicas that will be outfitted with tracking sensors. The sensors will allow the 3D printed models to align precisely with the computer models in the physical world and provide people with haptic/tactile feedback while wearing a VR (Virtual Reality) headset and interacting with the tools. Near the end of my internship, the lab bought a professional-grade 3D scanner. With this, I was able to replicate more intricate tools much more quickly. The second task was to investigate the use of BCI to control objects inside the hybrid reality ISS environment. This task looked at using an Electroencephalogram (EEG) headset to collect brain state data that could be mapped to commands that a computer could execute. On this task, I had a setback with the hardware, which stopped working and was returned to the vendor for repair. However, I was still able to collect some data, process it, and start creating correlation algorithms between electrical patterns in the brain and the commands we wanted the computer to carry out. I also carried out a test to investigate the comfort of the headset when worn for a long time. The knowledge gained will benefit me in my future career. I learned how to use various modeling and programming tools, including Blender, Maya, Substance Painter, Artec Studio, Github, and Unreal Engine 4. I learned how to use a professional-grade 3D scanner and 3D printer. On the BCI project I learned about data mining and how to create correlation algorithms. I also supported various demos, including a live demo of the Hybrid Reality Lab capabilities at ComicPalooza. This internship has given me a good look into engineering at NASA. I developed a more thorough understanding of engineering, and my overall confidence has grown. I have also realized that any problem can be fixed if you try hard enough, and that as an engineer it is your job not only to fix problems but to embrace coming up with solutions to them.
Vishvakarma, Vijay K; Kumari, Kamlesh; Patel, Rajan; Dixit, V S; Singh, Prashant; Mehrotra, Gopal K; Chandra, Ramesh; Chakrawarty, Anand Kumar
2015-05-15
Surfactants are used to prevent the irreversible aggregation of partially refolded proteins and they also assist in protein refolding. We have reported the design and screening of gemini surfactants to stabilize bovine serum albumin (BSA) with the help of a computational tool (iGEMDOCK). A series of gemini surfactants has been designed based on the bis-N-alkyl nicotinate dianion by varying the alkyl group and anion. On changing the alkyl group and anion of the surfactant, the value of Log P changes, which means the polarity of the surfactant can be tuned. Further, virtual screening of the gemini surfactants has been carried out based on a generic evolutionary method. Herein, thermodynamic data were studied to determine the potential of gemini surfactants as BSA stabilizers. Computational tools help identify the most efficient gemini surfactant to stabilize BSA, rather than using surfactants randomly and without direction for stabilization; this can then be confirmed through experimental techniques. Previously, researchers synthesized one of the designed gemini surfactants and used it to stabilize BSA, with the interactions confirmed through various techniques and computational docking. Here, however, the authors identify the most competent gemini surfactant for stabilizing BSA using computational tools on the basis of the energy score. Different from single-chain surfactants, gemini surfactants exhibit much stronger electrostatic and hydrophobic interactions with the protein and are thus effective at much lower concentrations. Based on the present study, it is expected that gemini surfactants may prove useful in protein stabilization operations and may thus be effectively employed to circumvent the problem of misfolding and aggregation. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Ferraro, R.; Some, R.
2002-01-01
The growth in data rates of instruments on future NASA spacecraft continues to outstrip the improvement in communications bandwidth and processing capabilities of radiation-hardened computers. Sophisticated autonomous operations strategies will further increase the processing workload. Given the reductions in spacecraft size and available power, standard radiation hardened computing systems alone will not be able to address the requirements of future missions. The REE project was intended to overcome this obstacle by developing a COTS- based supercomputer suitable for use as a science and autonomy data processor in most space environments. This development required a detailed knowledge of system behavior in the presence of Single Event Effect (SEE) induced faults so that mitigation strategies could be designed to recover system level reliability while maintaining the COTS throughput advantage. The REE project has developed a suite of tools and a methodology for predicting SEU induced transient fault rates in a range of natural space environments from ground-based radiation testing of component parts. In this paper we provide an overview of this methodology and tool set with a concentration on the radiation fault model and its use in the REE system development methodology. Using test data reported elsewhere in this and other conferences, we predict upset rates for a particular COTS single board computer configuration in several space environments.
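A back-of-the-envelope sketch of how a ground-test cross-section feeds an on-orbit upset-rate estimate follows; all numbers are hypothetical, and a real prediction of the kind described above would integrate the cross-section curve over the environment's LET spectrum rather than use a single saturated value.

```python
bits = 256 * 1024 * 1024 * 8          # memory size of the board under test, bits (assumed)
sigma_per_bit = 1e-14                 # saturated cross-section, cm^2/bit (assumed from ground test)
flux = 1e5                            # particle flux above threshold, particles/cm^2/day (assumed)

upsets_per_day = bits * sigma_per_bit * flux
print(f"~{upsets_per_day:.2f} upsets/day for the whole memory")
```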
Fazelniya, Zahra; Najafi, Mostafa; Moafi, Alireza; Talakoub, Sedigheh
2017-01-01
Quality of life (QOL) of children with cancer declines from the time of diagnosis and the start of treatment. Computer games are used in medicine to interact with patients and to improve their health-related behaviors. This study aimed to investigate the effect of an interactive computer game on the QOL of children undergoing chemotherapy. In this clinical trial, 64 children with cancer aged between 8 and 12 years were selected through convenience sampling and randomly assigned to an experimental or control group. The experimental group played a computer game for 3 hours a week for 4 consecutive weeks and the control group received only routine care. The data collection tool was the Pediatric Quality of Life Inventory (PedsQL) 3.0 Cancer Module child self-report designed for children aged 8 to 12 years. Data were analyzed using descriptive and inferential statistics in SPSS software. Before the intervention, there was no significant difference between the two groups in mean total QOL score (p = 0.87). However, immediately after the intervention (p = 0.02) and 1 month after the intervention (p < 0.001), the overall mean QOL score was significantly higher in the intervention group than in the control group. Based on the findings, computer games appear to be effective as a tool for influencing health-related behavior and improving the QOL of children undergoing chemotherapy. Therefore, according to the findings of this study, computer games can be used to improve the QOL of children undergoing chemotherapy.
The Use of Computer Tools to Support Meaningful Learning
ERIC Educational Resources Information Center
Keengwe, Jared; Onchwari, Grace; Wachira, Patrick
2008-01-01
This article attempts to provide a review of literature pertaining to computer technology use in education. The authors discuss the benefits of learning with technology tools when integrated into teaching. The argument that introducing computer technology into schools will neither improve nor change the quality of classroom instruction unless…
Advanced Computing Tools and Models for Accelerator Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryne, Robert; Ryne, Robert D.
2008-06-11
This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.
Science-Driven Computing: NERSC's Plan for 2006-2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, Horst D.; Kramer, William T.C.; Bailey, David H.
NERSC has developed a five-year strategic plan focusing on three components: Science-Driven Systems, Science-Driven Services, and Science-Driven Analytics. (1) Science-Driven Systems: Balanced introduction of the best new technologies for complete computational systems--computing, storage, networking, visualization and analysis--coupled with the activities necessary to engage vendors in addressing the DOE computational science requirements in their future roadmaps. (2) Science-Driven Services: The entire range of support activities, from high-quality operations and user services to direct scientific support, that enable a broad range of scientists to effectively use NERSC systems in their research. NERSC will concentrate on resources needed to realize the promise of the new highly scalable architectures for scientific discovery in multidisciplinary computational science projects. (3) Science-Driven Analytics: The architectural and systems enhancements and services required to integrate NERSC's powerful computational and storage resources to provide scientists with new tools to effectively manipulate, visualize, and analyze the huge data sets derived from simulations and experiments.
Thermal Analysis of Magnetically-Coupled Pump for Cryogenic Applications
NASA Technical Reports Server (NTRS)
Senocak, Inanc; Udaykumar, H. S.; Ndri, Narcisse; Francois, Marianne; Shyy, Wei
1999-01-01
A magnetically-coupled pump is under evaluation at Kennedy Space Center for possible cryogenic applications. A major concern is the impact of low-temperature fluid flows on pump performance. As a first step toward addressing this and related issues, a computational fluid dynamics and heat transfer tool has been applied to a pump geometry. The computational tool includes (i) a commercial grid generator to handle multiple grid blocks and complicated geometric definitions, and (ii) in-house computational fluid dynamics and heat transfer software developed in the Principal Investigator's group at the University of Florida. Both pure-conduction and combined convection-conduction computations have been conducted. A pure-conduction analysis gives insufficient information about the overall thermal distribution, whereas the combined convection-conduction analysis indicates the significant influence of the coolant over the entire flow path. Since 2-D simulation is of limited help, future work on full 3-D, multi-material modeling of the pump is needed. A comprehensive and accurate model can then be developed to take into account the effects of multi-phase flow in the cooling flow loop and of the magnetic interactions.
Kurth, Ann E.; Severynen, Anneleen; Spielberg, Freya
2014-01-01
HIV testing in emergency departments (EDs) remains underutilized. We evaluated a computer tool to facilitate rapid HIV testing in an urban ED. Randomly assigned non-acute adult ED patients to computer tool (‘CARE’) and rapid HIV testing before standard visit (n=258) or to standard visit (n=259) with chart access. Assessed intervention acceptability and compared noted HIV risks. Participants were 56% non-white, 58% male; median age 37 years. In the CARE arm nearly all (251/258) completed the session and received HIV results; 4 declined test consent. HIV risks were reported by 54% of users and there was one confirmed HIV-positive and 2 false-positives (seroprevalence 0.4%, 95% CI 0.01–2.2%). Half (55%) preferred computerized, over face-to-face, counseling for future HIV testing. In standard arm, one HIV test and 2 referrals for testing occurred. Computer-facilitated HIV testing appears acceptable to ED patients. Future research should assess cost-effectiveness compared with staff-delivered approaches. PMID:23837807
Students' explanations in complex learning of disciplinary programming
NASA Astrophysics Data System (ADS)
Vieira, Camilo
Computational Science and Engineering (CSE) has been called the third pillar of science and a set of important skills for solving the problems of a global society. Along with the theoretical and experimental approaches, computation offers a third way to solve complex problems that require processing large amounts of data or representing complex phenomena that are not easy to experiment with. Despite the relevance of CSE, current professionals and scientists are not well prepared to take advantage of this set of tools and methods. Computation is usually taught in isolation from the engineering disciplines, and therefore engineers do not know how to exploit CSE affordances. This dissertation introduces computational tools and methods contextualized within the Materials Science and Engineering curriculum. Considering that learning how to program is a complex task, the dissertation explores effective pedagogical practices that can support student disciplinary and computational learning. Two case studies are evaluated to identify the characteristics of effective worked examples in the context of CSE. Specifically, this dissertation explores students' explanations of these worked examples in two engineering courses with different levels of transparency: a programming course in materials science and engineering (glass box) and a thermodynamics course involving computational representations (black box). Results from this study suggest that students benefit in different ways from writing in-code comments. These benefits include, but are not limited to, connecting individual lines of code to the overall problem, getting familiar with the syntax, learning effective algorithm design strategies, and connecting computation with their discipline. Students in the glass box context generated higher quality explanations than students in the black box context. These explanations are related to students' prior experiences; specifically, students with low programming ability engaged in a more thorough explanation process than students with high ability. The dissertation concludes by proposing an adaptation of the instructional principles of worked examples for the context of CSE education.
NASA Astrophysics Data System (ADS)
Chang, Chih-Yuan; Owen, Gerry; Pease, Roger Fabian W.; Kailath, Thomas
1992-07-01
Dose correction is commonly used to compensate for the proximity effect in electron lithography. The computation of the required dose modulation is usually carried out using 'self-consistent' algorithms that work by solving a large number of simultaneous linear equations. However, there are two major drawbacks: the resulting correction is not exact, and the computation time is excessively long. A computational scheme, as shown in Figure 1, has been devised to eliminate these problems by deconvolution of the point spread function in the pattern domain. The method is iterative, based on a steepest descent algorithm. The scheme has been successfully tested on a simple pattern with a minimum feature size of 0.5 micrometers, exposed on a MEBES tool at 10 keV in 0.2 micrometers of PMMA resist on a silicon substrate.
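As an illustration of the pattern-domain approach sketched in this abstract, the following is a minimal, hypothetical example of steepest-descent dose correction against a convolution forward model. The double-Gaussian point spread function, grid spacing, step size, and iteration count are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.signal import fftconvolve

def proximity_dose_correction(pattern, psf, n_iter=50, step=0.5):
    """Iteratively find a dose map d such that (psf * d) approximates the
    target pattern, using steepest descent on the squared exposure error."""
    dose = pattern.astype(float).copy()                  # initial guess: uniform dose
    for _ in range(n_iter):
        exposure = fftconvolve(dose, psf, mode="same")   # forward model: psf * d
        residual = exposure - pattern
        # gradient of 0.5*||psf*d - pattern||^2 is psf^T applied to the residual;
        # for a symmetric PSF that is simply another convolution with the PSF
        gradient = fftconvolve(residual, psf, mode="same")
        dose -= step * gradient                          # steepest-descent update
        np.clip(dose, 0.0, None, out=dose)               # doses must stay non-negative
    return dose

if __name__ == "__main__":
    # toy target: a 0.5-micrometer line on a 25 nm grid (20 pixels wide)
    target = np.zeros((128, 128))
    target[:, 54:74] = 1.0
    x = np.arange(-32, 33)
    xx, yy = np.meshgrid(x, x)
    # toy PSF with a narrow forward-scattering lobe and a wide backscattering lobe
    psf = 0.7 * np.exp(-(xx**2 + yy**2) / (2 * 2.0**2)) \
        + 0.3 * np.exp(-(xx**2 + yy**2) / (2 * 12.0**2))
    psf /= psf.sum()
    corrected_dose = proximity_dose_correction(target, psf)
```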
NASA Technical Reports Server (NTRS)
Carrio, Miguel A., Jr.
1988-01-01
Rapidly emerging technology and methodologies have outpaced the ability of the systems development process to use them effectively, if at all. At the same time, the tools used to build systems are becoming obsolescent themselves as a consequence of the same technology lag that plagues systems development. The net result is that systems development activities have not been able to take advantage of available technology and have become equally dependent on aging and ineffective computer-aided engineering tools. New approaches to methods and tools are essential if the demands of non-stop and Mission and Safety Critical (MASC) components are to be met.
NASA Astrophysics Data System (ADS)
Buhari, Abudhahir; Zukarnain, Zuriati Ahmad; Khalid, Roszelinda; Zakir Dato', Wira Jaafar Ahmad
2016-11-01
The applications of quantum information science are moving toward bigger and better heights for the next-generation technology. In the fields of quantum cryptography and quantum computation in particular, the world has already witnessed various ground-breaking tangible products and promising results. Quantum cryptography is one of the more mature fields of quantum mechanics, and products are already available on the market. The current state of quantum cryptography is still under active research in order to reach the maturity of digital cryptography. The complexity of quantum cryptography is high due to the combination of hardware and software, and the lack of an effective simulation tool for designing and analyzing quantum cryptography experiments delays progress toward that goal. In this paper, we propose a framework for an effective simulation tool for non-entanglement-based quantum cryptography. We apply a hybrid simulation technique combining discrete-event, continuous-event, and system-dynamics approaches. We also highlight the limitations of experiments based on a commercial photonic simulation tool. Finally, we discuss ideas for achieving a one-stop simulation package for quantum-based secure key distribution experiments. All the modules of the simulation framework are viewed from the computer science perspective.
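To make the simulation target concrete, here is a minimal sketch of the kind of non-entanglement protocol such a tool has to model: a toy prepare-and-measure BB84 run with random bases, a simple bit-flip channel error, and key sifting. The pulse count and error rate are arbitrary illustrative values, not figures from the paper.

```python
import random

def simulate_bb84(n_pulses=10000, channel_error=0.02, seed=1):
    """Toy BB84 run: Alice sends random bits in random bases, Bob measures in
    random bases, and both keep only positions where the bases match (sifting).
    Returns the sifted key length and the quantum bit error rate (QBER)."""
    rng = random.Random(seed)
    sifted, errors = 0, 0
    for _ in range(n_pulses):
        bit = rng.randint(0, 1)
        basis_alice = rng.randint(0, 1)      # 0: rectilinear, 1: diagonal
        basis_bob = rng.randint(0, 1)
        if basis_alice != basis_bob:
            continue                         # discarded during sifting
        received = bit
        if rng.random() < channel_error:     # crude channel imperfection: bit flip
            received ^= 1
        sifted += 1
        errors += received != bit
    return sifted, errors / sifted if sifted else 0.0

print(simulate_bb84())                       # roughly half the pulses survive sifting
```

A fuller simulator would replace the single error probability with hardware-level models of the source, channel, and detectors, which is where a hybrid discrete-event and continuous approach becomes useful.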
Aeroheating Design Issues for Reusable Launch Vehicles: A Perspective
NASA Technical Reports Server (NTRS)
Zoby, E. Vincent; Thompson, Richard A.; Wurster, Kathryn E.
2004-01-01
An overview of basic aeroheating design issues for Reusable Launch Vehicles (RLV), which addresses the application of hypersonic ground-based testing, and computational fluid dynamic (CFD) and engineering codes, is presented. Challenges inherent to the prediction of aeroheating environments required for the successful design of the RLV Thermal Protection System (TPS) are discussed in conjunction with the importance of employing appropriate experimental/computational tools. The impact of the information garnered by using these tools in the resulting analyses, ultimately enhancing the RLV TPS design, is illustrated. A wide range of topics is presented in this overview; e.g. the impact of flow physics issues such as boundary-layer transition, including effects of distributed and discrete roughness, shock-shock interactions, and flow separation/reattachment. Also, the benefit of integrating experimental and computational studies to gain an improved understanding of flow phenomena is illustrated. From computational studies, the effect of low-density conditions and of uncertainties in material surface properties on the computed heating rates are highlighted as well as the significant role of CFD in improving the Outer Mold Line (OML) definition to reduce aeroheating while maintaining aerodynamic performance. Appropriate selection of the TPS design trajectories and trajectory shaping to mitigate aeroheating levels and loads are discussed. Lastly, an illustration of an aeroheating design process is presented whereby data from hypersonic wind-tunnel tests are integrated with predictions from CFD codes and engineering methods to provide heating environments along an entry trajectory as required for TPS design.
A supportive architecture for CFD-based design optimisation
NASA Astrophysics Data System (ADS)
Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong
2014-03-01
Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO requiring the analysis of fluid dynamics raises a special challenge due to its extremely intensive computation. The rapid development of computational fluid dynamics (CFD) techniques has led to a rise in their application in various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, alongside analytical approaches and wind tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under the given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward, and a CFD-based design problem, which usually has high dimensionality and multiple objectives and constraints, is challenging to solve. It is therefore desirable to have an integrated architecture for CFD-based design optimisation, yet our review of existing work has found that very few researchers have studied assistive tools to facilitate it. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing techniques, and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or at the data level to fully utilise the capabilities of the different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation. To illustrate the effectiveness of the proposed architecture and algorithms, case studies on the aerodynamic shape design of a hypersonic cruising vehicle are provided; the results show that the proposed architecture and the developed algorithms performed successfully and efficiently in a design optimisation involving over 200 design variables.
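As a rough illustration of the data-level integration idea, the sketch below evaluates a population of candidate designs in parallel, with a cheap analytic surrogate standing in for an external CFD run. The surrogate function, number of design variables, and process count are assumptions for illustration only, not elements of the proposed architecture.

```python
from multiprocessing import Pool
import random

def evaluate_design(design):
    """Placeholder for one CFD evaluation: a real integration would write the
    case files, launch the solver, and parse an objective (e.g., drag) from
    its output. A cheap quadratic surrogate stands in for that cost here."""
    return sum((x - 0.3) ** 2 for x in design)

def random_design(n_vars, rng):
    return [rng.uniform(0.0, 1.0) for _ in range(n_vars)]

if __name__ == "__main__":
    rng = random.Random(0)
    population = [random_design(200, rng) for _ in range(32)]  # ~200 design variables
    with Pool(processes=8) as pool:                            # data-level parallelism
        objectives = pool.map(evaluate_design, population)
    print("best objective in this generation:", min(objectives))
```

An optimiser, such as one of the two intelligent algorithms mentioned in the abstract, would sit around this loop, generating the next population from the returned objectives.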
Applications of Parallel Process HiMAP for Large Scale Multidisciplinary Problems
NASA Technical Reports Server (NTRS)
Guruswamy, Guru P.; Potsdam, Mark; Rodriguez, David; Kwak, Dochay (Technical Monitor)
2000-01-01
HiMAP is a three-level parallel middleware that can be interfaced to a large-scale global design environment for code-independent, multidisciplinary analysis using high-fidelity equations. Aerospace technology needs are rapidly changing, and computational tools compatible with the requirements of national programs such as space transportation are needed. Conventional computational tools are inadequate for modern aerospace design needs; advanced, modular computational tools are needed, such as those that incorporate the technology of massively parallel processors (MPP).
Computational fluid dynamics: An engineering tool?
NASA Astrophysics Data System (ADS)
Anderson, J. D., Jr.
1982-06-01
Computational fluid dynamics in general, and time dependent finite difference techniques in particular, are examined from the point of view of direct engineering applications. Examples are given of the supersonic blunt body problem and gasdynamic laser calculations, where such techniques are clearly engineering tools. In addition, Navier-Stokes calculations of chemical laser flows are discussed as an example of a near engineering tool. Finally, calculations of the flowfield in a reciprocating internal combustion engine are offered as a promising future engineering application of computational fluid dynamics.
Condor-COPASI: high-throughput computing for biochemical networks
2012-01-01
Background: Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results: We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions: Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage. PMID:22834945
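As a sketch of the task-splitting idea (not the actual Condor-COPASI internals), the snippet below divides a parameter scan into chunks and writes a minimal HTCondor submit description that queues one job per chunk. The runner script name and directory layout are hypothetical.

```python
def write_submit_file(n_chunks, runner="run_chunk.sh", out_dir="results"):
    """Write a minimal HTCondor submit description queuing one job per chunk;
    the $(Process) macro (0..n_chunks-1) tells each job which chunk it owns."""
    submit = f"""\
executable = {runner}
arguments  = $(Process) {n_chunks}
output     = {out_dir}/chunk_$(Process).out
error      = {out_dir}/chunk_$(Process).err
log        = {out_dir}/scan.log
queue {n_chunks}
"""
    with open("scan.submit", "w") as fh:
        fh.write(submit)

def chunk_bounds(chunk_index, n_chunks, n_samples):
    """Map a chunk index to the half-open range of scan samples it owns."""
    per_chunk = -(-n_samples // n_chunks)        # ceiling division
    start = chunk_index * per_chunk
    return start, min(start + per_chunk, n_samples)

write_submit_file(n_chunks=20)
print(chunk_bounds(3, 20, 1000))                 # chunk 3 handles samples 150..199
```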
Harnessing the power of emerging petascale platforms
NASA Astrophysics Data System (ADS)
Mellor-Crummey, John
2007-07-01
As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratory. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50³ domain.
Regev, Sivan; Hadas-Lidor, Noami; Rosenberg, Limor
2016-08-01
In this study, the assessment tool "Internet and Computer User Profile" (ICUP) questionnaire is presented and validated. It was developed in order to gather information for setting intervention goals that meet current demands. Sixty-eight subjects aged 23-68 participated in the study. The study group (n = 28) was sampled from two vocational centers. The control group consisted of 40 participants from the general population, sampled by convenience based on the demographics of the study group. Subjects from both groups answered the ICUP questionnaire. Subjects in the study group also answered the General Self-Efficacy (GSE) questionnaire and performed the Assessment of Computer Task Performance (ACTP) test in order to examine the convergent validity of the ICUP. Twenty subjects from both groups retook the ICUP questionnaire in order to obtain test-retest results. Differences between groups were tested using multivariate analysis of variance (MANOVA) tests. Pearson and Spearman's tests were used for calculating correlations. Cronbach's alpha coefficient and the kappa equivalent were used to assess internal consistency. The results indicate that the questionnaire is valid and reliable, and emphasize that the layout of the ICUP items facilitates a comprehensive examination of the client's perception regarding his or her participation in computer and internet activities. Implications for Rehabilitation: The "Internet and Computer User Profile" (ICUP) questionnaire is a novel assessment tool that evaluates operative use and individual perception of computer activities. The questionnaire is valid and reliable for use with participants of vocational centers dealing with mental illness. It is essential to facilitate access to computers for people with mental illnesses, seeing that they express similar interest in computers and the internet as people of the same age from the general population. Early intervention will be particularly effective for young adults dealing with mental illness, since the digital gap between them and young people in general is relatively small.
Design of testbed and emulation tools
NASA Technical Reports Server (NTRS)
Lundstrom, S. F.; Flynn, M. J.
1986-01-01
The research summarized here was concerned with the design of testbed and emulation tools suitable to assist in projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, either as users or as designers of such systems. While a range of alternatives was considered, a software-based set of hierarchical tools was chosen to provide maximum flexibility, to ease moving to new computers as technology improves, and to take advantage of the inherent reliability and availability of commercially available computing systems.
Automatic Differentiation as a tool in engineering design
NASA Technical Reports Server (NTRS)
Barthelemy, Jean-Francois M.; Hall, Laura E.
1992-01-01
Automatic Differentiation (AD) is a tool that systematically implements the chain rule of differentiation to obtain the derivatives of functions calculated by computer programs. In this paper, it is assessed as a tool for engineering design. The paper discusses the forward and reverse modes of AD, their computing requirements, and approaches to implementing AD. It continues with the application of two different tools to two medium-size structural analysis problems to generate the sensitivity information typically needed in an optimization or design situation. The paper concludes with the observation that AD is to be preferred over finite differencing in most cases, as long as sufficient computer storage is available.
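To illustrate the forward mode mentioned in the abstract (a generic sketch, not one of the AD tools assessed in the paper), dual numbers carry a value together with a derivative and push both through every operation via the chain rule:

```python
import math

class Dual:
    """Minimal dual number: a value and its derivative, propagated together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)  # product rule
    __rmul__ = __mul__

def sin(x):
    """sin extended to dual numbers: d/dx sin(x) = cos(x)."""
    if isinstance(x, Dual):
        return Dual(math.sin(x.val), math.cos(x.val) * x.dot)
    return math.sin(x)

# derivative of f(x) = x*sin(x) + 3x at x = 2, seeded with dx/dx = 1
x = Dual(2.0, 1.0)
f = x * sin(x) + 3 * x
print(f.val, f.dot)   # f(2) and f'(2) = sin(2) + 2*cos(2) + 3
```

The reverse mode discussed in the paper instead records the computation and sweeps backwards, which is cheaper when there are many inputs and few outputs, at the price of extra storage.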
Techniques and Tools for Estimating Ionospheric Effects in Interferometric and Polarimetric SAR Data
NASA Technical Reports Server (NTRS)
Rosen, P.; Lavalle, M.; Pi, X.; Buckley, S.; Szeliga, W.; Zebker, H.; Gurrola, E.
2011-01-01
The InSAR Scientific Computing Environment (ISCE) is a flexible, extensible software tool designed for the end-to-end processing and analysis of synthetic aperture radar data. ISCE inherits the core of the ROI_PAC interferometric tool, but contains improvements at all levels of the radar processing chain, including a modular and extensible architecture, a new focusing approach, better geocoding of the data, handling of multi-polarization data, radiometric calibration, and estimation and correction of ionospheric effects. In this paper we describe the characteristics of ISCE with emphasis on the ionospheric modules. To detect ionospheric anomalies, ISCE implements the Faraday rotation method using quad-polarimetric images, and the split-spectrum technique using interferometric single-, dual- and quad-polarimetric images. The ability to generate co-registered time series of quad-polarimetric images also makes ISCE an ideal tool for polarimetric-interferometric radar applications.
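The split-spectrum idea can be sketched as follows (a generic illustration, not the ISCE module itself). Assuming unwrapped interferometric phases from a low and a high sub-band and the usual model in which the non-dispersive phase scales with frequency while the ionospheric phase scales with its inverse, the two components follow from a 2x2 linear solve. The sub-band frequencies and phase values below are hypothetical.

```python
def split_spectrum(phi_low, phi_high, f_low, f_high, f0):
    """Separate an unwrapped interferometric phase into non-dispersive and
    dispersive (ionospheric) parts, assuming phi(f) = a*f + b/f.
    Returns (phi_nondispersive, phi_ionospheric) referenced to the carrier f0."""
    # phi_low = a*f_low + b/f_low and phi_high = a*f_high + b/f_high:
    # multiply each equation by its frequency and subtract to isolate a, then recover b.
    a = (phi_high * f_high - phi_low * f_low) / (f_high**2 - f_low**2)
    b = phi_low * f_low - a * f_low**2
    return a * f0, b / f0

# hypothetical L-band sub-bands (Hz) and unwrapped phases (rad)
phi_nd, phi_iono = split_spectrum(phi_low=12.40, phi_high=12.95,
                                  f_low=1.2275e9, f_high=1.2825e9, f0=1.2550e9)
print(phi_nd, phi_iono)
```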
Teachers' Use of Computational Tools to Construct and Explore Dynamic Mathematical Models
ERIC Educational Resources Information Center
Santos-Trigo, Manuel; Reyes-Rodriguez, Aaron
2011-01-01
To what extent does the use of computational tools offer teachers the possibility of constructing dynamic models to identify and explore diverse mathematical relations? What ways of reasoning or thinking about the problems emerge during the model construction process that involves the use of the tools? These research questions guided the…
An Evaluation of Teaching Introductory Geomorphology Using Computer-based Tools.
ERIC Educational Resources Information Center
Wentz, Elizabeth A.; Vender, Joann C.; Brewer, Cynthia A.
1999-01-01
Compares student reactions to traditional teaching methods and an approach where computer-based tools (GEODe CD-ROM and GIS-based exercises) were either integrated with or replaced the traditional methods. Reveals that the students found both of these tools valuable forms of instruction when used in combination with the traditional methods. (CMK)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Tengfang; Flapper, Joris; Ke, Jing
The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.
A Framework for the Evaluation of CASE Tool Learnability in Educational Environments
ERIC Educational Resources Information Center
Senapathi, Mali
2005-01-01
The aim of the research is to derive a framework for the evaluation of Computer Aided Software Engineering (CASE) tool learnability in educational environments. Drawing from the literature of Human Computer Interaction and educational research, a framework for evaluating CASE tool learnability in educational environments is derived. The two main…
Toward A Simulation-Based Tool for the Treatment of Vocal Fold Paralysis
Mittal, Rajat; Zheng, Xudong; Bhardwaj, Rajneesh; Seo, Jung Hee; Xue, Qian; Bielamowicz, Steven
2011-01-01
Advances in high-performance computing are enabling a new generation of software tools that employ computational modeling for surgical planning. Surgical management of laryngeal paralysis is one area where such computational tools could have a significant impact. The current paper describes a comprehensive effort to develop a software tool for planning medialization laryngoplasty where a prosthetic implant is inserted into the larynx in order to medialize the paralyzed vocal fold (VF). While this is one of the most common procedures used to restore voice in patients with VF paralysis, it has a relatively high revision rate, and the tool being developed is expected to improve surgical outcomes. This software tool models the biomechanics of airflow-induced vibration in the human larynx and incorporates sophisticated approaches for modeling the turbulent laryngeal flow, the complex dynamics of the VFs, as well as the production of voiced sound. The current paper describes the key elements of the modeling approach, presents computational results that demonstrate the utility of the approach and also describes some of the limitations and challenges. PMID:21556320
Optimization of Microelectronic Devices for Sensor Applications
NASA Technical Reports Server (NTRS)
Cwik, Tom; Klimeck, Gerhard
2000-01-01
The NASA/JPL goal of reducing payload in future space missions while increasing mission capability demands miniaturization of active and passive sensors, analytical instruments, and communication systems, among others. Currently, typical system requirements include the detection of particular spectral lines, associated data processing, and communication of the acquired data to other systems. Advances in lithography and deposition methods result in more advanced devices for space application, while the sub-micron resolution currently available opens a vast design space. Though an experimental exploration of this widening design space (searching for optimized performance by repeated fabrication efforts) is infeasible, it does motivate the development of reliable software design tools. These tools require models based on the fundamental physics and mathematics of the device to accurately capture effects such as diffraction and scattering in opto-electronic devices, or bandstructure and scattering in heterostructure devices. The software tools must have convenient turn-around times and interfaces that allow effective usage. The first issue is addressed by the application of high-performance computers and the second by the development of graphical user interfaces driven by properly designed data structures. These tools can then be integrated into an optimization environment, and with the available memory capacity and computational speed of high-performance parallel platforms, simulation of optimized components can proceed. In this paper, specific applications to the electromagnetic modeling of infrared filtering, as well as heterostructure device design, are presented using genetic-algorithm global optimization methods.
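For readers unfamiliar with genetic-algorithm global optimization, the following is a toy sketch of the basic loop (truncation selection, one-point crossover, Gaussian mutation) applied to a placeholder figure of merit. It is not the JPL optimization environment, and every parameter shown is an illustrative assumption.

```python
import random

def merit(params):
    """Placeholder figure of merit; a real run would call an electromagnetic
    or bandstructure solver on the candidate device design."""
    return sum((p - 0.6) ** 2 for p in params)

def genetic_minimize(n_params=8, pop_size=40, generations=100, mut_rate=0.1, seed=0):
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=merit)                        # rank candidates (lower is better)
        parents = pop[: pop_size // 2]             # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_params)       # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n_params):              # Gaussian mutation, clamped to [0, 1]
                if rng.random() < mut_rate:
                    child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
            children.append(child)
        pop = parents + children
    return min(pop, key=merit)

best = genetic_minimize()
print(merit(best))
```

Because each merit evaluation is an independent solver run, the population can be evaluated in parallel, which is how such loops exploit high-performance platforms.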
Computational Ion Optics Design Evaluations
NASA Technical Reports Server (NTRS)
Malone, Shane P.; Soulas, George C.
2004-01-01
Ion optics computational models are invaluable tools in the design of ion optics systems. In this study a new computational model developed by an outside vendor for use at the NASA Glenn Research Center (GRC) is presented. This computational model is a gun code that has been modified to model the plasma sheaths both upstream and downstream of the ion optics. The model handles multiple species (e.g. singly and doubly-charged ions) and includes a charge-exchange model to support erosion estimations. The model uses commercially developed solid design and meshing software to allow high flexibility in ion optics geometric configurations. The results from this computational model are applied to the NEXT project to investigate the effects of crossover impingement erosion seen during the 2000-hour wear test.
Digital processing of mesoscale analysis and space sensor data
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
The mesoscale analysis and space sensor (MASS) data management and analysis system is presented. The system was implemented on a research computer system that provides a wide range of capabilities for processing and displaying large volumes of conventional and satellite-derived meteorological data. The research computer system consists of three primary computers (HP-1000F, Harris/6, and Perkin-Elmer 3250), each of which performs a specific function according to its unique capabilities. The software, database management, and display capabilities of the research computer system, which together provide a very effective interactive research tool for the digital processing of mesoscale analysis and space sensor data, are described.
Health Literacy Assessment of the STOFHLA: Paper versus Electronic Administration Continuation Study
ERIC Educational Resources Information Center
Chesser, Amy K.; Keene Woods, Nikki; Wipperman, Jennifer; Wilson, Rachel; Dong, Frank
2014-01-01
Low health literacy is associated with poor health outcomes. Research is needed to understand the mechanisms and pathways of its effects. Computer-based assessment tools may improve efficiency and cost-effectiveness of health literacy research. The objective of this preliminary study was to assess if administration of the Short Test of Functional…
ERIC Educational Resources Information Center
Ok, Min Wook; Kim, Min Kyung; Kang, Eun Young; Bryant, Brian R.
2016-01-01
Computers can be an effective teaching method for students with learning disabilities (LD). The use of mobile devices as education tools for students with disabilities has received considerable attention in special education recently. Parents, teachers, and professionals look for effective applications (i.e., apps) that meet the needs of their…
ERIC Educational Resources Information Center
Campbell, Donald P.
2013-01-01
This study investigated the effect of student prior knowledge and feedback type on student achievement and satisfaction in an introductory managerial accounting course using computer-based formative assessment tools. The study involved a redesign of the existing Job Order Costing unit using the ADDIE model of instructional design. The…
Sub-Second Parallel State Estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Rice, Mark J.; Glaesemann, Kurt R.
This report describes the performance of the Pacific Northwest National Laboratory (PNNL) sub-second parallel state estimation (PSE) tool using utility data from the Bonneville Power Administration (BPA) and discusses the benefits of fast computational speed for power system applications. The test data were provided by BPA: two days' worth of hourly snapshots that include power system data and measurement sets in a commercial tool format. These data are extracted from the commercial tool and fed into the PSE tool. With the help of advanced solvers, the PSE tool is able to solve each BPA hourly state estimation problem within one second, which is more than 10 times faster than today's commercial tools. This improved computational performance can increase the reliability value of state estimation in several ways: (1) the shorter the time required for execution of state estimation, the more time remains for operators to take appropriate actions and/or to apply automatic or manual corrective control actions, which increases the chances of arresting or mitigating the impact of cascading failures; (2) the SE can be executed multiple times within the time allowance, so its robustness can be enhanced by repeating the execution with adaptive adjustments, including removing bad data and/or adjusting different initial conditions, to compute a better estimate within the same time as a traditional state estimator's single estimate. There are other benefits of sub-second SE: the PSE results can potentially be used in local and/or wide-area automatic corrective control actions that are currently dependent on raw measurements, minimizing the impact of bad measurements and providing opportunities to enhance power grid reliability and efficiency. PSE can also enable other advanced tools that rely on SE outputs and could be used to further improve operators' actions and automated controls to mitigate the effects of severe events on the grid. The power grid continues to grow, and the number of measurements is increasing at an accelerated rate due to the variety of smart grid devices being introduced. A parallel state estimation implementation will perform better than traditional, sequential state estimation by utilizing the power of high-performance computing (HPC). This increased performance positions parallel state estimators as valuable tools for operating the increasingly complex power grid.
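The kernel that any such estimator solves repeatedly is a weighted least-squares fit of the system state to the measurements. The sketch below shows a single linear(ized) WLS solve with a toy, hypothetical measurement Jacobian; it is not PNNL's PSE implementation and says nothing about its parallel solvers.

```python
import numpy as np

def wls_state_estimate(H, z, weights):
    """One weighted least-squares solve of a linearized measurement model
    z = H x + e:  x_hat = (H^T W H)^{-1} H^T W z,  the step a state estimator
    repeats (with an updated Jacobian) until convergence."""
    W = np.diag(weights)              # weights are typically 1/sigma^2 per measurement
    gain = H.T @ W @ H                # gain matrix G = H^T W H
    rhs = H.T @ W @ z
    return np.linalg.solve(gain, rhs)

# toy example: two unknown state variables, three measurements,
# with a hypothetical linearized Jacobian and made-up readings
H = np.array([[10.0,   0.0],
              [ 0.0,  10.0],
              [10.0, -10.0]])
z = np.array([0.52, 0.31, 0.20])
w = np.array([1e4, 1e4, 1e2])
print(wls_state_estimate(H, z, w))
```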
Dynamic Load-Balancing for Distributed Heterogeneous Computing of Parallel CFD Problems
NASA Technical Reports Server (NTRS)
Ecer, A.; Chien, Y. P.; Boenisch, T.; Akay, H. U.
2000-01-01
The developed methodology is aimed at improving the efficiency of executing block-structured algorithms on parallel, distributed, heterogeneous computers. The basic approach of these algorithms is to divide the flow domain into many sub-domains called blocks, and to solve the governing equations over these blocks. The dynamic load balancing problem is defined as the efficient distribution of the blocks among the available processors over a period of several hours of computation. In environments with computers of different architectures, operating systems, CPU speeds, memory sizes, loads, and network speeds, balancing the loads and managing the communication between processors becomes crucial. Load balancing software tools for mutually dependent parallel processes have been created to efficiently utilize an advanced computation environment and algorithms. These tools are dynamic in nature because of the changes in the computer environment during execution time. More recently, these tools were extended to a second operating system, NT, and the problems associated with this application are discussed in this paper. The developed algorithms were also combined with the load sharing capability of LSF to efficiently utilize workstation clusters for parallel computing. Finally, results are presented from running a NASA-based code, ADPAC, to demonstrate the developed tools for dynamic load balancing.
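As a simplified illustration of the block-distribution problem, the sketch below is a static greedy heuristic (not the dynamic balancer described in the paper): it assigns the most expensive blocks first, each to the processor that would finish soonest given its relative speed. Block costs and processor speeds are made-up example values; a dynamic balancer would repeat this kind of decision as measured costs and machine loads change.

```python
import heapq

def balance_blocks(block_costs, proc_speeds):
    """Greedy static load balancing for heterogeneous processors.
    block_costs: work units per block; proc_speeds: relative speed per processor.
    Returns the list of block indices assigned to each processor."""
    heap = [(0.0, p) for p in range(len(proc_speeds))]   # (projected finish time, proc id)
    heapq.heapify(heap)
    assignment = [[] for _ in proc_speeds]
    # place the largest blocks first (longest-processing-time heuristic)
    for b in sorted(range(len(block_costs)), key=lambda i: -block_costs[i]):
        finish, p = heapq.heappop(heap)
        assignment[p].append(b)
        heapq.heappush(heap, (finish + block_costs[b] / proc_speeds[p], p))
    return assignment

# 10 blocks of varying cost on 3 machines, the third twice as fast
print(balance_blocks([4, 9, 3, 7, 5, 6, 2, 8, 1, 5], [1.0, 1.0, 2.0]))
```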
The Data Collector: A Qualitative Research Tool.
ERIC Educational Resources Information Center
Handler, Marianne G.; Turner, Sandra V.
Computer software that is intended to assist the qualitative researcher in the analysis of textual data is relatively new. One such program, the Data Collector, is a HyperCard computer program designed for use on the Macintosh computer. A tool for organizing and analyzing textual data obtained from observations, interviews, surveys, and other…
Integrating Data Base into the Elementary School Science Program.
ERIC Educational Resources Information Center
Schlenker, Richard M.
This document describes seven science activities that combine scientific principles and computers. The objectives for the activities are to show students how the computer can be used as a tool to store and arrange scientific data, provide students with experience using the computer as a tool to manage scientific data, and provide students with…
Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds
NASA Astrophysics Data System (ADS)
Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni
2012-09-01
Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains to provide wireless communications services on demand. Each new user session request requires, in particular, the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources in SDR cloud data centers and the numerous session requests at certain hours of the day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of tools for analyzing computing resource management strategies and algorithms for SDR clouds, and we use these tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupation and that a tradeoff exists between cluster size and algorithm complexity.
NASA Astrophysics Data System (ADS)
Herrera, I.; Herrera, G. S.
2015-12-01
Most geophysical systems are macroscopic physical systems. The prediction of the behavior of such systems is carried out by means of computational models whose basic building blocks are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized versions of such PDEs, it is necessary to apply highly parallel supercomputers. For these, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (around 90% [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key words: parallel software for geophysics, high performance computing, HPC, parallel computing, domain decomposition methods (DDM). References: [1] Herrera, I. and G. F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243 p., 2012. [2] Herrera, I., de la Cruz, L. M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", Numerical Methods for Partial Differential Equations, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I. and Contreras, I., "An Innovative Tool for Effectively Applying Highly Parallelized Software to Problems of Elasticity", Geofísica Internacional, 2015 (in press).
An Accurate and Computationally Efficient Model for Membrane-Type Circular-Symmetric Micro-Hotplates
Khan, Usman; Falconi, Christian
2014-01-01
Ideally, the design of high-performance micro-hotplates would require a large number of simulations because of the existence of many important design parameters as well as the possibly crucial effects of both spread and drift. However, the computational cost of FEM simulations, which are the only available tool for accurately predicting the temperature in micro-hotplates, is very high. As a result, micro-hotplate designers generally have no effective simulation tools for optimization. In order to circumvent these issues, here we propose a model for practical circular-symmetric micro-hotplates which takes advantage of modified Bessel functions, a computationally efficient matrix approach for considering the relevant boundary conditions, Taylor linearization for modeling the Joule heating and radiation losses, and an external-region-segmentation strategy in order to accurately take into account radiation losses in the entire micro-hotplate. The proposed model is almost as accurate as FEM simulations and two to three orders of magnitude more computationally efficient (e.g., 45 s versus more than 8 h). The residual errors, which are mainly associated with the undesired heating in the electrical contacts, are small (e.g., a few degrees Celsius for an 800 °C operating temperature) and, for important analyses, almost constant. Therefore, we also introduce a computationally easy single-FEM-compensation strategy in order to reduce the residual errors to about 1 °C. As illustrative examples of the power of our approach, we report the systematic investigation of a spread in the membrane thermal conductivity and of combined variations of both ambient and bulk temperatures. Our model enables a much faster characterization of micro-hotplates and, thus, a much more effective optimization prior to fabrication. PMID:24763214
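To illustrate why modified Bessel functions arise, here is a much-simplified sketch: for a thin annular membrane losing heat uniformly to the ambient, the steady radial heat equation has solutions built from I0 and K0, and two boundary conditions fix the two coefficients. All material and geometry values below are illustrative placeholders, and the model in the paper additionally handles Joule heating, radiation losses, and external-region segmentation.

```python
import numpy as np
from scipy.special import i0, k0

def membrane_temperature(r, r_heater, r_edge, T_heater, T_amb,
                         k_th=5.0, thickness=1e-6, h_loss=125.0):
    """Radial temperature in the annular membrane of a circular-symmetric
    micro-hotplate under the thin-membrane assumption
        k*t*(1/r)*d/dr(r*dT'/dr) = h*T',  with T' = T - T_amb,
    whose general solution is T'(r) = A*I0(m*r) + B*K0(m*r), m = sqrt(h/(k*t)).
    Boundary conditions: T = T_heater at the heater rim and T = T_amb at the
    membrane edge (the bulk acts as a heat sink)."""
    m = np.sqrt(h_loss / (k_th * thickness))
    # solve [[I0(m r_h), K0(m r_h)], [I0(m r_e), K0(m r_e)]] @ [A, B] = [dT, 0]
    M = np.array([[i0(m * r_heater), k0(m * r_heater)],
                  [i0(m * r_edge),   k0(m * r_edge)]])
    A, B = np.linalg.solve(M, np.array([T_heater - T_amb, 0.0]))
    return T_amb + A * i0(m * r) + B * k0(m * r)

radii = np.linspace(100e-6, 500e-6, 5)   # radii between heater rim and membrane edge
print(membrane_temperature(radii, 100e-6, 500e-6, 800.0, 25.0))
```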
New computer simulation tools are necessary to model structures and soils together and to investigate challenging problems in soil-structure-foundation interaction, including the effects of soil liquefaction and permanent ground deformation on structural response.
New Trends in Computer Assisted Language Learning and Teaching.
ERIC Educational Resources Information Center
Perez-Paredes, Pascual, Ed.; Cantos-Gomez, Pascual, Ed.
2002-01-01
Articles in this special issue include the following: "ICT and Modern Foreign Languages: Learning Opportunities and Training Needs" (Graham Davies); "Authoring, Pedagogy and the Web: Expectations Versus Reality" (Paul Bangs); "Web-based Instructional Environments: Tools and Techniques for Effective Second Language…
Space and Cyber: Shared Challenges, Shared Opportunities
2011-11-15
adversaries to have effective capabilities against networks and computer systems, unlike those anywhere else—here, cyber criminals, proxies for hire, and...or unintentional, conditions can impact our ability to use space and cyber capabilities. As the tools and techniques developed by cyber criminals continue
Computational System For Rapid CFD Analysis In Engineering
NASA Technical Reports Server (NTRS)
Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.
1995-01-01
Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in an engineering environment. Addresses integration of all aspects of the CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.
Software and resources for computational medicinal chemistry
Liao, Chenzhong; Sitzmann, Markus; Pugliese, Angelo; Nicklaus, Marc C
2011-01-01
Computer-aided drug design plays a vital role in drug discovery and development and has become an indispensable tool in the pharmaceutical industry. Computational medicinal chemists can take advantage of all kinds of software and resources in the computer-aided drug design field for the purposes of discovering and optimizing biologically active compounds. This article reviews software and other resources related to computer-aided drug design approaches, putting particular emphasis on structure-based drug design, ligand-based drug design, chemical databases and chemoinformatics tools. PMID:21707404
Zhang, Wenchao; Dai, Xinbin; Wang, Qishan; Xu, Shizhong; Zhao, Patrick X
2016-05-01
The term epistasis refers to interactions between multiple genetic loci. Genetic epistasis is important in regulating biological function and is considered to explain part of the 'missing heritability,' which involves marginal genetic effects that cannot be accounted for in genome-wide association studies. Thus, the study of epistasis is of great interest to geneticists. However, estimating epistatic effects for quantitative traits is challenging due to the large number of interaction effects that must be estimated, which significantly increases computing demands. Here, we present a new web server-based tool, the Pipeline for estimating EPIStatic genetic effects (PEPIS), for analyzing polygenic epistatic effects. The PEPIS software package is based on a new linear mixed model that has been used to predict the performance of hybrid rice. The PEPIS includes two main sub-pipelines: the first for kinship matrix calculation, and the second for polygenic component analyses and genome scanning for main and epistatic effects. To accommodate the demand for high-performance computation, the PEPIS utilizes C/C++ for mathematical matrix computing. In addition, the modules for kinship matrix calculation and for main and epistatic-effect genome scanning employ parallel computing technology that effectively utilizes multiple computer nodes across our networked cluster, significantly improving the computational speed. For example, when analyzing the same immortalized F2 rice population genotypic data examined in a previous study, the PEPIS returned results identical to the original prototype R code at each analysis step, but the computational time was reduced from more than one month to about five minutes. These advances will help overcome the bottleneck frequently encountered in genome-wide epistatic genetic effect analysis and enable accommodation of the high computational demand. The PEPIS is publicly available at http://bioinfo.noble.org/PolyGenic_QTL/.
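As background for the kinship step (a generic marker-based sketch, not the exact additive, dominance, and epistatic kinship definitions used by PEPIS), a genomic relationship matrix can be formed by centering the genotype codes by allele frequency and scaling the cross-product:

```python
import numpy as np

def kinship_matrix(genotypes):
    """VanRaden-style additive kinship from a genotype matrix coded 0/1/2
    (minor-allele counts), shape n_individuals x n_markers."""
    G = np.asarray(genotypes, dtype=float)
    p = G.mean(axis=0) / 2.0                 # allele frequency per marker
    Z = G - 2.0 * p                          # center each marker column
    denom = 2.0 * np.sum(p * (1.0 - p))      # expected-variance scaling
    return Z @ Z.T / denom

# toy example: 4 individuals, 6 markers
geno = np.array([[0, 1, 2, 0, 1, 2],
                 [1, 1, 2, 0, 0, 2],
                 [2, 0, 0, 1, 1, 0],
                 [2, 0, 1, 1, 2, 0]])
print(kinship_matrix(geno))
```

The epistatic components involve many more such matrix terms, which is part of why PEPIS parallelizes its kinship-calculation module across cluster nodes.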
wft4galaxy: a workflow testing tool for galaxy.
Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi
2017-12-01
Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources; the Galaxy workflow manager is a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool to bring automated testing to Galaxy workflows, making it feasible to bring continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container; the latter reduces installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. Contact: marcoenrico.piras@crs4.it.
Analysis of Cysteine Redox Post-Translational Modifications in Cell Biology and Drug Pharmacology.
Wani, Revati; Murray, Brion W
2017-01-01
Reversible cysteine oxidation is an emerging class of protein post-translational modification (PTM) that regulates catalytic activity, modulates conformation, impacts protein-protein interactions, and affects subcellular trafficking of numerous proteins. Redox PTMs encompass a broad array of cysteine oxidation reactions with different half-lives, topographies, and reactivities such as S-glutathionylation and sulfoxidation. Recent studies from our group underscore the lesser known effect of redox protein modifications on drug binding. To date, biological studies to understand mechanistic and functional aspects of redox regulation are technically challenging. A prominent issue is the lack of tools for labeling proteins oxidized to select chemotype/oxidant species in cells. Predictive computational tools and curated databases of oxidized proteins are facilitating structural and functional insights into regulation of the network of oxidized proteins or redox proteome. In this chapter, we discuss analytical platforms for studying protein oxidation, suggest computational tools currently available in the field to determine redox sensitive proteins, and begin to illuminate roles of cysteine redox PTMs in drug pharmacology.
Fan Noise Prediction with Applications to Aircraft System Noise Assessment
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Envia, Edmane; Burley, Casey L.
2009-01-01
This paper describes an assessment of current fan noise prediction tools by comparing measured and predicted sideline acoustic levels from a benchmark fan noise wind tunnel test. Specifically, an empirical method and newly developed coupled computational approach are utilized to predict aft fan noise for a benchmark test configuration. Comparisons with sideline noise measurements are performed to assess the relative merits of the two approaches. The study identifies issues entailed in coupling the source and propagation codes, as well as provides insight into the capabilities of the tools in predicting the fan noise source and subsequent propagation and radiation. In contrast to the empirical method, the new coupled computational approach provides the ability to investigate acoustic near-field effects. The potential benefits/costs of these new methods are also compared with the existing capabilities in a current aircraft noise system prediction tool. The knowledge gained in this work provides a basis for improved fan source specification in overall aircraft system noise studies.
NASA Astrophysics Data System (ADS)
Filippov, A. V.; Tarasov, S. Yu; Podgornyh, O. A.; Shamarin, N. N.; Filippova, E. O.
2017-01-01
Automation of engineering processes requires developing relevant mathematical support and computer software. Analysis of metal cutting kinematics and tool geometry is a key task at the preproduction stage. This paper focuses on developing a procedure for determining the tool geometry in oblique lathe machining with a peakless round-nose tool, using vector/matrix transformations. Such an approach allows integration into modern mathematical software packages, in contrast to the traditional analytic description, and is therefore very promising for developing automated control of the preproduction process. A kinematic criterion for applicable tool geometry has been developed from the results of this study, and the effect of tool blade inclination and curvature on the geometry-dependent process parameters was evaluated.
An interactive computer code for calculation of gas-phase chemical equilibrium (EQLBRM)
NASA Technical Reports Server (NTRS)
Pratt, B. S.; Pratt, D. T.
1984-01-01
A user-friendly, menu-driven, interactive computer program known as EQLBRM, which calculates the adiabatic equilibrium temperature and product composition resulting from the combustion of hydrocarbon fuels with air at specified constant pressure and enthalpy, is discussed. The program is developed primarily as an instructional tool to be run on small computers, allowing the user to economically and efficiently explore the effects of varying fuel type, air/fuel ratio, inlet air and/or fuel temperature, and operating pressure on the performance of continuous combustion devices such as gas turbine combustors, Stirling engine burners, and power generation furnaces.
NASA Technical Reports Server (NTRS)
Holland, Scott Douglas
1991-01-01
A combined computational and experimental parametric study of the internal aerodynamics of a generic three dimensional sidewall compression scramjet inlet configuration was performed. The study was designed to demonstrate the utility of computational fluid dynamics as a design tool in hypersonic inlet flow fields, to provide a detailed account of the nature and structure of the internal flow interactions, and to provide a comprehensive surface property and flow field database to determine the effects of contraction ratio, cowl position, and Reynolds number on the performance of a hypersonic scramjet inlet configuration.
NASA Astrophysics Data System (ADS)
Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.
2014-12-01
The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
Williams, Kent E; Voigt, Jeffrey R
2004-01-01
The research reported herein presents the results of an empirical evaluation that focused on the accuracy and reliability of cognitive models created using a computerized tool: the cognitive analysis tool for human-computer interaction (CAT-HCI). A sample of participants, expert in interacting with a newly developed tactical display for the U.S. Army's Bradley Fighting Vehicle, individually modeled their knowledge of 4 specific tasks employing the CAT-HCI tool. Measures of the accuracy and consistency of task models created by these task domain experts using the tool were compared with task models created by a double expert. The findings indicated a high degree of consistency and accuracy between the different "single experts" in the task domain in terms of the resultant models generated using the tool. Actual or potential applications of this research include assessing human-computer interaction complexity, determining the productivity of human-computer interfaces, and analyzing an interface design to determine whether methods can be automated.
Distributing digital video to multiple computers
Murray, James A.
2004-01-01
Video is an effective teaching tool, and live video microscopy is especially helpful in teaching dissection techniques and the anatomy of small neural structures. Digital video equipment is more affordable now and allows easy conversion from older analog video devices. I here describe a simple technique for bringing digital video from one camera to all of the computers in a single room. This technique allows students to view and record the video from a single camera on a microscope. PMID:23493464
Modeling of Tool-Tissue Interactions for Computer-Based Surgical Simulation: A Literature Review
Misra, Sarthak; Ramesh, K. T.; Okamura, Allison M.
2009-01-01
Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in robot-assisted surgery for pre- and intra-operative planning. Accurate modeling of the interaction between surgical instruments and organs has been recognized as a key requirement in the development of high-fidelity surgical simulators. Researchers have attempted to model tool-tissue interactions in a wide variety of ways, which can be broadly classified as (1) linear elasticity-based methods, (2) nonlinear (hyperelastic) elasticity-based finite element (FE) methods, and (3) other techniques that are not based on FE methods or continuum mechanics. Realistic modeling of organ deformation requires populating the model with real tissue data (which are difficult to acquire in vivo) and simulating organ response in real time (which is computationally expensive). Further, it is challenging to account for connective tissue supporting the organ, friction, and topological changes resulting from tool-tissue interactions during invasive surgical procedures. Overcoming such obstacles will not only help us to model tool-tissue interactions in real time, but also enable realistic force feedback to the user during surgical simulation. This review paper classifies the existing research on tool-tissue interactions for surgical simulators specifically based on the modeling techniques employed and the kind of surgical operation being simulated, in order to inform and motivate future research on improved tool-tissue interaction models. PMID:20119508
Bringing the CMS distributed computing system into scalable operations
NASA Astrophysics Data System (ADS)
Belforte, S.; Fanfani, A.; Fisk, I.; Flix, J.; Hernández, J. M.; Kress, T.; Letts, J.; Magini, N.; Miccio, V.; Sciabà, A.
2010-04-01
Establishing efficient and scalable operations of the CMS distributed computing system critically relies on the proper integration, commissioning and scale testing of the data and workload management tools, the various computing workflows and the underlying computing infrastructure, located at more than 50 computing centres worldwide and interconnected by the Worldwide LHC Computing Grid. Computing challenges periodically undertaken by CMS in the past years with increasing scale and complexity have revealed the need for a sustained effort on computing integration and commissioning activities. The Processing and Data Access (PADA) Task Force was established at the beginning of 2008 within the CMS Computing Program with the mandate of validating the infrastructure for organized processing and user analysis including the sites and the workload and data management tools, validating the distributed production system by performing functionality, reliability and scale tests, helping sites to commission, configure and optimize the networking and storage through scale testing data transfers and data processing, and improving the efficiency of accessing data across the CMS computing system from global transfers to local access. This contribution reports on the tools and procedures developed by CMS for computing commissioning and scale testing as well as the improvements accomplished towards efficient, reliable and scalable computing operations. The activities include the development and operation of load generators for job submission and data transfers with the aim of stressing the experiment and Grid data management and workload management systems, site commissioning procedures and tools to monitor and improve site availability and reliability, as well as activities targeted to the commissioning of the distributed production, user analysis and monitoring systems.
Computer aided design of architecture of degradable tissue engineering scaffolds.
Heljak, M K; Kurzydlowski, K J; Swieszkowski, W
2017-11-01
One important factor affecting the process of tissue regeneration is scaffold stiffness loss, which should be properly balanced with the rate of tissue regeneration. The aim of the research reported here was to develop a computer tool for designing the architecture of biodegradable scaffolds fabricated by melt-dissolution deposition systems (e.g. Fused Deposition Modeling) to provide the required scaffold stiffness at each stage of degradation/regeneration. The original idea presented in the paper is that the stiffness of a tissue engineering scaffold can be controlled during degradation by means of a proper selection of the diameter of the constituent fibers and the distances between them. This idea is based on the size-effect on degradation of aliphatic polyesters. The presented computer tool combines a genetic algorithm and a diffusion-reaction model of polymer hydrolytic degradation. In particular, we show how to design the architecture of scaffolds made of poly(DL-lactide-co-glycolide) with the required Young's modulus change during hydrolytic degradation.
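The coupling of a genetic algorithm with a degradation model, as described above, can be sketched in Python under strongly simplified assumptions. The fitness function below is a toy stand-in: the real tool evaluates a diffusion-reaction model of hydrolytic degradation, whereas here a made-up exponential stiffness curve, invented parameter bounds, and generic GA settings are used purely to show the optimization skeleton.

```python
import numpy as np

rng = np.random.default_rng(4)

def stiffness_error(fiber_diameter, spacing):
    """Placeholder fitness: squared deviation of a toy degradation-dependent
    stiffness curve from a target profile. Not the diffusion-reaction model
    used by the actual design tool."""
    t = np.linspace(0.0, 1.0, 20)                      # normalized degradation time
    stiffness = np.exp(-t * spacing / fiber_diameter)  # toy size-effect model
    target = np.linspace(1.0, 0.4, 20)                 # required stiffness history
    return np.sum((stiffness - target) ** 2)

# Minimal genetic algorithm over (fiber diameter, fiber spacing).
low, high = np.array([0.1, 0.2]), np.array([0.5, 2.0])   # invented bounds (mm)
pop = rng.uniform(low, high, size=(30, 2))
for generation in range(50):
    fitness = np.array([stiffness_error(d, s) for d, s in pop])
    parents = pop[np.argsort(fitness)[:10]]                   # truncation selection
    children = parents[rng.integers(0, 10, 30)] + rng.normal(0, 0.02, (30, 2))
    pop = np.clip(children, low, high)                        # keep within bounds

best = pop[np.argmin([stiffness_error(d, s) for d, s in pop])]
print(best)   # candidate (diameter, spacing) pair
```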
Issues and approach to develop validated analysis tools for hypersonic flows: One perspective
NASA Technical Reports Server (NTRS)
Deiwert, George S.
1993-01-01
Critical issues concerning the modeling of low density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools, and the activity in the NASA Ames Research Center's Aerothermodynamics Branch is described. Inherent in the process is a strong synergism between ground test and real gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flowfield simulation codes are discussed. These models were partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions is sparse and reliance must be made on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high enthalpy flow facilities, such as shock tubes and ballistic ranges.
Spielberg, Freya; Kurth, Ann; Reidy, William; McKnight, Teka; Dikobe, Wame; Wilson, Charles
2011-06-01
This article highlights findings from an evaluation that explored the impact of mobile versus clinic-based testing, rapid versus central-lab based testing, incentives for testing, and the use of a computer counseling program to guide counseling and automate evaluation in a mobile program reaching people of color at risk for HIV. The program's results show that an increased focus on mobile outreach using rapid testing, incentives and health information technology tools may improve program acceptability, quality, productivity and timeliness of reports. This article describes program design decisions based on continuous quality assessment efforts. It also examines the impact of the Computer Assessment and Risk Reduction Education computer tool on HIV testing rates, staff perception of counseling quality, program productivity, and on the timeliness of evaluation reports. The article concludes with a discussion of implications for programmatic responses to the Centers for Disease Control and Prevention's HIV testing recommendations.
Issues and approach to develop validated analysis tools for hypersonic flows: One perspective
NASA Technical Reports Server (NTRS)
Deiwert, George S.
1992-01-01
Critical issues concerning the modeling of low-density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools. A description of the activity in the Ames Research Center's Aerothermodynamics Branch is also given. Inherent in the process is a strong synergism between ground test and real-gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flow-field simulation codes are discussed. These models have been partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions are sparse; reliance must be made on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground-based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high-enthalpy flow facilities, such as shock tubes and ballistic ranges.
Using NetMeeting for remote configuration of the Otto Bock C-Leg: technical considerations.
Lemaire, E D; Fawcett, J A
2002-08-01
Telehealth has the potential to be a valuable tool for technical and clinical support of computer-controlled prosthetic devices. This pilot study examined the use of Internet-based, desktop video conferencing for remote configuration of the Otto Bock C-Leg. Laboratory tests involved connecting two computers running Microsoft NetMeeting over a local area network (IP protocol). Over 56 kbit/s, DSL/cable, and 10 Mbit/s LAN connections, a prosthetist remotely configured a user's C-Leg by using Application Sharing, Live Video, and Live Audio. A similar test between sites in Ottawa and Toronto, Canada was limited by the notebook computer's 28 kbit/s modem. At the 28 kbit/s Internet-connection speed, NetMeeting's application sharing feature was not able to update the remote Sliders window fast enough to display peak toe loads and peak knee angles. These results support the use of NetMeeting as an accessible and cost-effective tool for remote C-Leg configuration, provided that sufficient Internet data transfer speed is available.
Shuttle Debris Impact Tool Assessment Using the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
DeLoach, R.; Rayos, E. M.; Campbell, C. H.; Rickman, S. L.
2006-01-01
Computational tools have been developed to estimate thermal and mechanical reentry loads experienced by the Space Shuttle Orbiter as the result of cavities in the Thermal Protection System (TPS). Such cavities can be caused by impact from ice or insulating foam debris shed from the External Tank (ET) on liftoff. The reentry loads depend on cavity geometry and certain Shuttle state variables, among other factors. Certain simplifying assumptions have been made in the tool development about the cavity geometry variables. For example, the cavities are all modeled as "shoeboxes", with rectangular cross-sections and planar walls. So an actual cavity is typically approximated with an idealized cavity described in terms of its length, width, and depth, as well as its entry angle, exit angle, and side angles (assumed to be the same for both sides). As part of a comprehensive assessment of the uncertainty in reentry loads estimated by the debris impact assessment tools, an effort has been initiated to quantify the component of the uncertainty that is due to imperfect geometry specifications for the debris impact cavities. The approach is to compute predicted loads for a set of geometry factor combinations sufficient to develop polynomial approximations to the complex, nonparametric underlying computational models. Such polynomial models are continuous and feature estimable, continuous derivatives, conditions that facilitate the propagation of independent variable errors. As an additional benefit, once the polynomial models have been developed, they require fewer computational resources to execute than the underlying finite element and computational fluid dynamics codes, and can generate reentry loads estimates in significantly less time. This provides a practical screening capability, in which a large number of debris impact cavities can be quickly classified either as harmless, or subject to additional analysis with the more comprehensive underlying computational tools. The polynomial models also provide useful insights into the sensitivity of reentry loads to various cavity geometry variables, and reveal complex interactions among those variables that indicate how the sensitivity of one variable depends on the level of one or more other variables. For example, the effect of cavity length on certain reentry loads depends on the depth of the cavity. Such interactions are clearly displayed in the polynomial response models.
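A minimal sketch of the surrogate-modeling step described above: fit a quadratic polynomial with two-factor interaction terms to a set of (cavity geometry, load) samples by least squares, then evaluate the cheap polynomial for screening. The geometry factors, sample sizes, and the synthetic load function are placeholders, not the Shuttle debris-impact tools or their data.

```python
import numpy as np

# Illustrative stand-in for the expensive simulation: a predicted reentry load
# as a function of cavity length, depth, and entry angle (not the real codes).
def expensive_load_model(length, depth, angle):
    return 2.0 * length + 0.5 * depth**2 + 0.1 * length * depth + 0.05 * angle

rng = np.random.default_rng(0)
n = 50
length = rng.uniform(1.0, 10.0, n)
depth = rng.uniform(0.1, 2.0, n)
angle = rng.uniform(5.0, 45.0, n)
load = expensive_load_model(length, depth, angle)

# Quadratic response surface with two-factor interactions, fit by least squares.
X = np.column_stack([
    np.ones(n), length, depth, angle,
    length**2, depth**2, angle**2,
    length * depth, length * angle, depth * angle,
])
coeffs, *_ = np.linalg.lstsq(X, load, rcond=None)

# The fitted polynomial is cheap to evaluate and differentiable, so large numbers
# of candidate cavities can be screened without rerunning the underlying codes.
x_new = np.array([1.0, 5.0, 1.0, 20.0, 25.0, 1.0, 400.0, 5.0, 100.0, 20.0])
print(x_new @ coeffs)   # surrogate load for length=5, depth=1, angle=20
```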
Computer-Aided Design Package for Designers of Digital Optical Computers
1991-02-01
circuit depth and in circuit breadth. It appears, from initial studies by PhD students Gupta and Majidi using the newly modified tools, that a few irregular... Gupta, which is based on an earlier tool developed by Majidi. The tool allows logic gates to have fan-ins and fan-outs that vary, and allows circuits
Caesy: A software tool for computer-aided engineering
NASA Technical Reports Server (NTRS)
Wette, Matt
1993-01-01
A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.
Scratch as a Computational Modelling Tool for Teaching Physics
ERIC Educational Resources Information Center
Lopez, Victor; Hernandez, Maria Isabel
2015-01-01
The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…
ERIC Educational Resources Information Center
Xu, Q.; Lai, L. L.; Tse, N. C. F.; Ichiyanagi, K.
2011-01-01
An interactive computer-based learning tool with multiple sessions is proposed in this paper, which teaches students to think and helps them recognize the merits and limitations of simulation tools so as to improve their practical abilities in electrical circuit simulation based on the case of a power converter with progressive problems. The…
The efficiency of geophysical adjoint codes generated by automatic differentiation tools
NASA Astrophysics Data System (ADS)
Vlasenko, A. V.; Köhl, A.; Stammer, D.
2016-02-01
The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which are new elements that either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing handling even large geophysical data assimilation problems. However, they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the continuous use of AD tools for solving geophysical problems on modern computer architectures.
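For readers unfamiliar with the operator-overloading approach contrasted with source transformation above, the toy Python class below records local partial derivatives as expressions are evaluated and then propagates adjoints backwards (reverse-mode AD). It is a conceptual illustration only; the tools compared in the study (Open_AD, Tapenade, NAGWare, TAF) operate on Fortran source and are far more sophisticated.

```python
class Var:
    """Minimal reverse-mode (adjoint) AD by operator overloading: each
    arithmetic operation records its inputs and local partial derivatives,
    and backward() propagates adjoints through the recorded graph."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # tuples of (parent Var, local derivative)
        self.adjoint = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def backward(self, seed=1.0):
        self.adjoint += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

# Example: f(x, y) = x*y + x, so df/dx = y + 1 and df/dy = x.
x, y = Var(3.0), Var(4.0)
f = x * y + x
f.backward()
print(f.value, x.adjoint, y.adjoint)   # 15.0 5.0 3.0
```

Source-transformation tools instead generate a separate derivative code at compile time, which is why they can reach much higher computational efficiency on large Fortran models, as the study reports.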
Computer implemented method, and apparatus for controlling a hand-held tool
NASA Technical Reports Server (NTRS)
Wagner, Kenneth William (Inventor); Taylor, James Clayton (Inventor)
1999-01-01
The invention described herein is a computer-implemented method and apparatus for controlling a hand-held tool. In particular, the control of a hand-held tool is for the purpose of controlling the speed of a fastener interface mechanism and the torque applied to fasteners by the fastener interface mechanism of the hand-held tool, and monitoring the operating parameters of the tool. The control is embodied in in-tool software embedded on a processor within the tool, which also communicates with remote software. An operator can run the tool or, through the interaction of the two software components, operate the tool from a remote location, analyze data from a performance history recorded by the tool, and select various torque and speed parameters for each fastener.
The Hematopoietic Expression Viewer: expanding mobile apps as a scientific tool.
James, Regis A; Rao, Mitchell M; Chen, Edward S; Goodell, Margaret A; Shaw, Chad A
2012-07-15
Many important data in current biological science comprise hundreds, thousands or more individual results. These massive data require computational tools to navigate results and effectively interact with the content. Mobile device apps are an increasingly important tool in the everyday lives of scientists and non-scientists alike. Such software presents individuals with compact and efficient tools for interacting with complex data at meetings or other locations remote from their main computing environment. We believe that apps will be important tools for biologists, geneticists and physicians to review content while participating in biomedical research or practicing medicine. We have developed a prototype app for displaying gene expression data using the iOS platform. To present the software engineering requirements, we review the model-view-controller schema for Apple's iOS. We apply this schema to a simple app for querying locally developed microarray gene expression data. The challenge of this application is to balance storing content locally within the app against obtaining it dynamically via a network connection. The Hematopoietic Expression Viewer is available at http://www.shawlab.org/he_viewer. The source code for this project and any future information on how to obtain the app can be accessed at http://www.shawlab.org/he_viewer.
A Rat Body Phantom for Radiation Analysis
NASA Technical Reports Server (NTRS)
Qualls, Garry D.; Clowdsley, Martha S.; Slaba, Tony C.; Walker, Steven A.
2010-01-01
To reduce the uncertainties associated with estimating the biological effects of ionizing radiation in tissue, researchers rely on laboratory experiments in which mono-energetic, single-species beams are applied to cell cultures, insects, and small animals. To estimate the radiation effects on astronauts in deep space or low Earth orbit, who are exposed to mixed-field, broad-spectrum radiation, these experimental results are extrapolated and combined with other data to produce radiation quality factors, radiation weighting factors, and other risk-related quantities for humans. One way to reduce the uncertainty associated with such extrapolations is to utilize analysis tools that are applicable to both laboratory and space environments. The use of physical and computational body phantoms to predict radiation exposure and its effects is well established, and a wide range of human and non-human phantoms are in use today. In this paper, a computational rat phantom is presented, as well as a description of the process through which that phantom has been coupled to existing radiation analysis tools. Sample results are presented for two space radiation environments.
cryoem-cloud-tools: A software platform to deploy and manage cryo-EM jobs in the cloud.
Cianfrocco, Michael A; Lahiri, Indrajit; DiMaio, Frank; Leschziner, Andres E
2018-06-01
Access to streamlined computational resources remains a significant bottleneck for new users of cryo-electron microscopy (cryo-EM). To address this, we have developed tools that will submit cryo-EM analysis routines and atomic model building jobs directly to Amazon Web Services (AWS) from a local computer or laptop. These new software tools ("cryoem-cloud-tools") have incorporated optimal data movement, security, and cost-saving strategies, giving novice users access to complex cryo-EM data processing pipelines. Integrating these tools into the RELION processing pipeline and graphical user interface, we determined a 2.2 Å structure of β-galactosidase in ∼55 hours on AWS. We implemented a similar strategy to submit Rosetta atomic model building and refinement to AWS. These software tools dramatically reduce the barrier for entry of new users to cloud computing for cryo-EM and are freely available at cryoem-tools.cloud.
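The cryoem-cloud-tools interface itself is not reproduced here, but the general pattern of submitting a processing job to AWS from a local machine can be sketched with boto3: launch an EC2 instance whose user-data script stages data from S3, runs the job, and copies the results back. The AMI ID, instance type, bucket paths, and the run_my_refinement command are placeholders invented for illustration.

```python
import boto3

# Hypothetical user-data script: pull inputs from S3, run the job, push results.
user_data = """#!/bin/bash
aws s3 cp s3://my-bucket/particles.star /data/particles.star
run_my_refinement --input /data/particles.star --output /data/run1
aws s3 cp --recursive /data/run1 s3://my-bucket/run1
"""

ec2 = boto3.client("ec2", region_name="us-east-1")
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI with the software preinstalled
    InstanceType="p2.xlarge",          # GPU instance type, chosen for illustration
    MinCount=1,
    MaxCount=1,
    UserData=user_data,
)
print(response["Instances"][0]["InstanceId"])
```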
Virtual Screening with AutoDock: Theory and Practice
Cosconati, Sandro; Forli, Stefano; Perryman, Alex L.; Harris, Rodney; Goodsell, David S.; Olson, Arthur J.
2011-01-01
Importance to the field: Virtual screening is a computer-based technique for identifying promising compounds to bind to a target molecule of known structure. Given the rapidly increasing number of protein and nucleic acid structures, virtual screening continues to grow as an effective method for the discovery of new inhibitors and drug molecules. Areas covered in this review: We describe virtual screening methods that are available in the AutoDock suite of programs, and several of our successes in using AutoDock virtual screening in pharmaceutical lead discovery. What the reader will gain: A general overview of the challenges of virtual screening is presented, along with the tools available in the AutoDock suite of programs for addressing these challenges. Take home message: Virtual screening is an effective tool for the discovery of compounds for use as leads in drug discovery, and the free, open source program AutoDock is an effective tool for virtual screening. PMID:21532931
Flyby Geometry Optimization Tool
NASA Technical Reports Server (NTRS)
Karlgaard, Christopher D.
2007-01-01
The Flyby Geometry Optimization Tool is a computer program for computing trajectories and trajectory-altering impulsive maneuvers for spacecraft used in radio relay of scientific data to Earth from an exploratory airplane flying in the atmosphere of Mars.
PredictSNP: Robust and Accurate Consensus Classifier for Prediction of Disease-Related Mutations
Bendl, Jaroslav; Stourac, Jan; Salanda, Ondrej; Pavelka, Antonin; Wieben, Eric D.; Zendulka, Jaroslav; Brezovsky, Jan; Damborsky, Jiri
2014-01-01
Single nucleotide variants represent a prevalent form of genetic variation. Mutations in the coding regions are frequently associated with the development of various genetic diseases. Computational tools for the prediction of the effects of mutations on protein function are very important for analysis of single nucleotide variants and their prioritization for experimental characterization. Many computational tools are already widely employed for this purpose. Unfortunately, their comparison and further improvement is hindered by large overlaps between the training datasets and benchmark datasets, which lead to biased and overly optimistic reported performances. In this study, we have constructed three independent datasets by removing all duplicities, inconsistencies and mutations previously used in the training of evaluated tools. The benchmark dataset containing over 43,000 mutations was employed for the unbiased evaluation of eight established prediction tools: MAPP, nsSNPAnalyzer, PANTHER, PhD-SNP, PolyPhen-1, PolyPhen-2, SIFT and SNAP. The six best performing tools were combined into a consensus classifier, PredictSNP, resulting in significantly improved prediction performance while at the same time returning results for all mutations, confirming that consensus prediction represents an accurate and robust alternative to the predictions delivered by individual tools. A user-friendly web interface enables easy access to all eight prediction tools, the consensus classifier PredictSNP and annotations from the Protein Mutant Database and the UniProt database. The web server and the datasets are freely available to the academic community at http://loschmidt.chemi.muni.cz/predictsnp. PMID:24453961
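The core idea of a consensus classifier can be illustrated in a few lines of Python: collect the calls of the individual predictors for a variant and return the majority class. The published PredictSNP scheme is more elaborate than an unweighted vote, so the sketch below shows only the general principle; the tool names and calls are hypothetical.

```python
from collections import Counter

def consensus_call(per_tool_predictions):
    """Unweighted majority vote over individual tool calls
    ('deleterious' / 'neutral'); missing predictions are skipped and ties
    fall back to 'unknown'. Illustrative only, not the PredictSNP scoring."""
    votes = Counter(p for p in per_tool_predictions.values() if p is not None)
    if not votes:
        return "unknown"
    (top, n_top), *rest = votes.most_common()
    if rest and rest[0][1] == n_top:
        return "unknown"   # tie between classes
    return top

# Hypothetical calls from individual predictors for one variant.
calls = {"MAPP": "deleterious", "PhD-SNP": "deleterious", "SIFT": "neutral",
         "PolyPhen-2": "deleterious", "SNAP": None, "PANTHER": "neutral"}
print(consensus_call(calls))   # 'deleterious'
```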
Microstructure-Property-Design Relationships in the Simulation Era: An Introduction (PREPRINT)
2010-01-01
Dimiduk DM (1998) Systems engineering of gamma titanium aluminides: impact of fundamentals on development strategy... microstructure-sensitive design tools for single-crystal turbine blades provides an accessible glimpse into future computational tools and their data requirements. Subject terms: single-crystal turbine blades, computational methods, integrated computational materials.
Programming Support Library (PSL). Users Manual.
1978-05-01
which provides the tools to organize, implement, and control computer program development. This involves the support of the actual programming process... provides the tools to organize, implement, and control computer program development. The system is designed specifically to support top-down development... Structured Programming are finding increasing application in the computing community. Structured programs are, however, difficult to write in
Development and Assessment of a Chemistry-Based Computer Video Game as a Learning Tool
ERIC Educational Resources Information Center
Martinez-Hernandez, Kermin Joel
2010-01-01
The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning…
The Computer as a Tool for Learning through Reflection. Technical Report No. 376.
ERIC Educational Resources Information Center
Collins, Allan; Brown, John Seely
Because of its ability to record and represent process, the computer can provide a powerful, motivating, and as yet untapped tool for focusing the students' attention directly on their own thought processes and learning through reflection. Properly abstracted and structured, the computational medium can capture the processes by which a novice or…
Choi, Jeeyae; Choi, Jeungok E
2014-01-01
To provide best recommendations at the point of care, guidelines have been implemented in computer systems. As a prerequisite, guidelines are translated into a computer-interpretable guideline format. Since there are no specific tools to translate nursing guidelines, only a few nursing guidelines are translated and implemented in computer systems. The Unified Modeling Language (UML) is a software modeling language known to represent end-users' perspectives well and accurately, due to its expressive characteristics. In order to facilitate the development of computer systems for nurses' use, the UML was used to translate a paper-based nursing guideline, and its ease of use and usefulness were tested through a case study of a genetic counseling guideline. The UML was found to be a useful tool for nurse informaticians and a sufficient tool to model a guideline in a computer program.
Data Intensive Computing on Amazon Web Services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Magana-Zook, S. A.
The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as “big data” tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, Solr, and many more by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster (“Cluster A”) was set up as a collaboration between GMP and Livermore Computing (LC).
We anticipate that the software tool developed, and targeted data acquired, will be useful in the interpretation of biomarkers indicative of exposure to OP insecticide mixtures, including the effects of population and dose variability and uncertainty. Therefore, we expect tha...
Sperry Univac speech communications technology
NASA Technical Reports Server (NTRS)
Medress, Mark F.
1977-01-01
Technology and systems for effective verbal communication with computers were developed. A continuous speech recognition system for verbal input, a word spotting system to locate key words in conversational speech, prosodic tools to aid speech analysis, and a prerecorded voice response system for speech output are described.
A novel mechatronic tool for computer-assisted arthroscopy.
Dario, P; Carrozza, M C; Marcacci, M; D'Attanasio, S; Magnami, B; Tonet, O; Megali, G
2000-03-01
This paper describes a novel mechatronic tool for arthroscopy, which is at the same time a smart tool for traditional arthroscopy and the main component of a system for computer-assisted arthroscopy. The mechatronic arthroscope has a cable-actuated servomotor-driven multi-joint mechanical structure, is equipped with a position sensor measuring the orientation of the tip and with a force sensor detecting possible contact with delicate tissues in the knee, and incorporates an embedded microcontroller for sensor signal processing, motor driving and interfacing with the surgeon and/or the system control unit. When used manually, the mechatronic arthroscope enhances the surgeon's capabilities by enabling him/her to easily control tip motion and to prevent undesired contacts. When the tool is integrated in a complete system for computer-assisted arthroscopy, the trajectory of the arthroscope is reconstructed in real time by an optical tracking system using infrared emitters located in the handle, providing advantages in terms of improved intervention accuracy. The computer-assisted arthroscopy system comprises an image processing module for segmentation and three-dimensional reconstruction of preoperative computer tomography or magnetic resonance images, a registration module for measuring the position of the knee joint, tracking the trajectory of the operating tools, and matching preoperative and intra-operative images, and a human-machine interface that displays the enhanced reality scenario and data from the mechatronic arthroscope in a friendly and intuitive manner. By integrating preoperative and intra-operative images and information provided by the mechatronic arthroscope, the system allows virtual navigation in the knee joint during the planning phase and computer guidance by augmented reality during the intervention. This paper describes in detail the characteristics of the mechatronic arthroscope and of the system for computer-assisted arthroscopy and discusses experimental results obtained with a preliminary version of the tool and of the system.
One approach for evaluating the Distributed Computing Design System (DCDS)
NASA Technical Reports Server (NTRS)
Ellis, J. T.
1985-01-01
The Distributed Computer Design System (DCDS) provides an integrated environment to support the life cycle of developing real-time distributed computing systems. The primary focus of DCDS is to significantly increase system reliability and software development productivity, and to minimize schedule and cost risk. DCDS consists of integrated methodologies, languages, and tools to support the life cycle of developing distributed software and systems. Smooth and well-defined transitions from phase to phase, language to language, and tool to tool provide a unique and unified environment. An approach to evaluating DCDS highlights its benefits.
Computer-aided programming for message-passing system; Problems and a solution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, M.Y.; Gajski, D.D.
1989-12-01
As the number of processors and the complexity of problems to be solved increase, programming multiprocessing systems becomes more difficult and error-prone. Program development tools are necessary since programmers are not able to develop complex parallel programs efficiently. Parallel models of computation, parallelization problems, and tools for computer-aided programming (CAP) are discussed. As an example, a CAP tool that performs scheduling and inserts communication primitives automatically is described. It also generates the performance estimates and other program quality measures to help programmers in improving their algorithms and programs.
Common Accounting System for Monitoring the ATLAS Distributed Computing Resources
NASA Astrophysics Data System (ADS)
Karavakis, E.; Andreeva, J.; Campana, S.; Gayazov, S.; Jezequel, S.; Saiz, P.; Sargsyan, L.; Schovancova, J.; Ueda, I.; Atlas Collaboration
2014-06-01
This paper covers in detail a variety of accounting tools used to monitor the utilisation of the available computational and storage resources within the ATLAS Distributed Computing during the first three years of Large Hadron Collider data taking. The Experiment Dashboard provides a set of common accounting tools that combine monitoring information originating from many different information sources; either generic or ATLAS specific. This set of tools provides quality and scalable solutions that are flexible enough to support the constantly evolving requirements of the ATLAS user community.
2015-10-01
higher effect sizes than others when comparing any intervention (e.g., computer trainers, human tutors, group learning) to a control. It is difficult... While human tutoring and mentoring are common teaching tools, current US Army standards for training and education are group instruction and
Advanced tools for smartphone-based experiments: phyphox
NASA Astrophysics Data System (ADS)
Staacks, S.; Hütz, S.; Heinke, H.; Stampfer, C.
2018-07-01
The sensors in modern smartphones are a promising and cost-effective tool for experimentation in physics education, but many experiments face practical problems. Often the phone is inaccessible during the experiment and the data usually needs to be analyzed subsequently on a computer. We address both problems by introducing a new app, called ‘phyphox’, which is specifically designed for utilizing experiments in physics teaching. The app is free and designed to offer the same set of features on Android and iOS.
A data base processor semantics specification package
NASA Technical Reports Server (NTRS)
Fishwick, P. A.
1983-01-01
A Semantics Specification Package (DBPSSP) for the Intel Data Base Processor (DBP) is defined. DBPSSP serves as a collection of cross assembly tools that allow the analyst to assemble request blocks on the host computer for passage to the DBP. The assembly tools discussed in this report may be effectively used in conjunction with a DBP compatible data communications protocol to form a query processor, precompiler, or file management system for the database processor. The source modules representing the components of DBPSSP are fully commented and included.
NASA Astrophysics Data System (ADS)
Xu, M.; van Overloop, P. J.; van de Giesen, N. C.
2011-02-01
Model predictive control (MPC) of open channel flow is becoming an important tool in water management. The complexity of the prediction model has a large influence on the MPC application in terms of control effectiveness and computational efficiency. The Saint-Venant equations, called SV model in this paper, and the Integrator Delay (ID) model are either accurate but computationally costly, or simple but restricted to allowed flow changes. In this paper, a reduced Saint-Venant (RSV) model is developed through a model reduction technique, Proper Orthogonal Decomposition (POD), on the SV equations. The RSV model keeps the main flow dynamics and functions over a large flow range but is easier to implement in MPC. In the test case of a modeled canal reach, the number of states and disturbances in the RSV model is about 45 and 16 times less than the SV model, respectively. The computational time of MPC with the RSV model is significantly reduced, while the controller remains effective. Thus, the RSV model is a promising means to balance the control effectiveness and computational efficiency.
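The reduction step behind such an RSV model can be sketched in Python/NumPy under simplified assumptions: collect snapshots of the full state, take their SVD, keep the leading modes, and project the dynamics onto that basis. The snapshot matrix and the linear state-space operators below are synthetic placeholders, not a calibrated canal model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic snapshot matrix: each column is the full state (e.g. water levels and
# flows at all grid points) at one time instant. Placeholder data only.
n_states, n_snapshots = 200, 60
snapshots = rng.standard_normal((n_states, 5)) @ rng.standard_normal((5, n_snapshots))

# POD: SVD of the snapshot matrix; the leading left singular vectors are the modes.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99)) + 1   # keep modes capturing 99% of the energy
Phi = U[:, :r]                               # reduced basis (n_states x r)

# Galerkin projection of a placeholder linear state-space model x' = A x + B u.
A = rng.standard_normal((n_states, n_states)) * 0.01
B = rng.standard_normal((n_states, 2))
A_r = Phi.T @ A @ Phi   # reduced dynamics, the size the MPC controller works with
B_r = Phi.T @ B
print(Phi.shape, A_r.shape)
```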
Ising Processing Units: Potential and Challenges for Discrete Optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coffrin, Carleton James; Nagarajan, Harsha; Bent, Russell Whitford
The recent emergence of novel computational devices, such as adiabatic quantum computers, CMOS annealers, and optical parametric oscillators, presents new opportunities for hybrid-optimization algorithms that leverage these kinds of specialized hardware. In this work, we propose the idea of an Ising processing unit as a computational abstraction for these emerging tools. Challenges involved in using and benchmarking these devices are presented, and open-source software tools are proposed to address some of these challenges. The proposed benchmarking tools and methodology are demonstrated by conducting a baseline study of established solution methods to a D-Wave 2X adiabatic quantum computer, one example of a commercially available Ising processing unit.
AstrodyToolsWeb an e-Science project in Astrodynamics and Celestial Mechanics fields
NASA Astrophysics Data System (ADS)
López, R.; San-Juan, J. F.
2013-05-01
Astrodynamics Web Tools, AstrodyToolsWeb (http://tastrody.unirioja.es), is an ongoing collaborative Web Tools computing infrastructure project which has been specially designed to support scientific computation. AstrodyToolsWeb provides project collaborators with all the technical and human facilities in order to wrap, manage, and use specialized noncommercial software tools in Astrodynamics and Celestial Mechanics fields, with the aim of optimizing the use of resources, both human and material. However, this project is open to collaboration from the whole scientific community in order to create a library of useful tools and their corresponding theoretical backgrounds. AstrodyToolsWeb offers a user-friendly web interface in order to choose applications, introduce data, and select appropriate constraints in an intuitive and easy way for the user. After that, the application is executed in real time, whenever possible; then the critical information about program behavior (errors and logs) and output, including the postprocessing and interpretation of its results (graphical representation of data, statistical analysis or whatever manipulation therein), are shown via the same web interface or can be downloaded to the user's computer.
Using Performance Tools to Support Experiments in HPC Resilience
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naughton, III, Thomas J; Boehm, Swen; Engelmann, Christian
2014-01-01
The high performance computing (HPC) community is working to address fault tolerance and resilience concerns for current and future large scale computing platforms. This is driving enhancements in the programming environments, specifically research on enhancing message passing libraries to support fault tolerant computing capabilities. The community has also recognized that tools for resilience experimentation are greatly lacking. However, we argue that there are several parallels between performance tools and resilience tools. As such, we believe the rich set of HPC performance-focused tools can be extended (repurposed) to benefit the resilience community. In this paper, we describe the initial motivation to leverage standard HPC performance analysis techniques to aid in developing diagnostic tools to assist fault tolerance experiments for HPC applications. These diagnosis procedures help to provide context for the system when the errors (failures) occurred. We describe our initial work in leveraging an MPI performance trace tool to assist in providing global context during fault injection experiments. Such tools will assist the HPC resilience community as they extend existing and new application codes to support fault tolerance.
Informed public choices for low-carbon electricity portfolios using a computer decision tool.
Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger
2014-04-01
Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/ .
Methods and Research for Multi-Component Cutting Force Sensing Devices and Approaches in Machining
Liang, Qiaokang; Zhang, Dan; Wu, Wanneng; Zou, Kunlin
2016-01-01
Multi-component cutting force sensing systems in manufacturing processes applied to cutting tools are gradually becoming the most significant monitoring indicator. Their signals have been extensively applied to evaluate the machinability of workpiece materials, predict cutter breakage, estimate cutting tool wear, control machine tool chatter, determine stable machining parameters, and improve surface finish. Robust and effective sensing systems with capability of monitoring the cutting force in machine operations in real time are crucial for realizing the full potential of cutting capabilities of computer numerically controlled (CNC) tools. The main objective of this paper is to present a brief review of the existing achievements in the field of multi-component cutting force sensing systems in modern manufacturing. PMID:27854322
Effects of an Approach Spacing Flight Deck Tool on Pilot Eyescan
NASA Technical Reports Server (NTRS)
Oseguera-Lohr, Rosa M.; Nadler, Eric D.
2004-01-01
An airborne tool has been developed based on the concept of an aircraft maintaining a time-based spacing interval from the preceding aircraft. The Advanced Terminal Area Approach Spacing (ATAAS) tool uses Automatic Dependent Surveillance-Broadcast (ADS-B) aircraft state data to compute a speed command for the ATAAS-equipped aircraft to obtain a required time interval behind another aircraft. The tool and candidate operational procedures were tested in a high-fidelity, full mission simulator with active airline subject pilots flying an arrival scenario using three different modes for speed control. Eyetracker data showed only slight changes in instrument scan patterns, and no significant change in the amount of time spent looking out the window with ATAAS, versus standard ILS procedures.
EEGNET: An Open Source Tool for Analyzing and Visualizing M/EEG Connectome.
Hassan, Mahmoud; Shamas, Mohamad; Khalil, Mohamad; El Falou, Wassim; Wendling, Fabrice
2015-01-01
The brain is a large-scale complex network often referred to as the "connectome". Exploring the dynamic behavior of the connectome is a challenging issue as both excellent time and space resolution is required. In this context Magneto/Electroencephalography (M/EEG) are effective neuroimaging techniques allowing for analysis of the dynamics of functional brain networks at scalp level and/or at reconstructed sources. However, a tool that can cover all the processing steps of identifying brain networks from M/EEG data is still missing. In this paper, we report a novel software package, called EEGNET, running under MATLAB (Math works, inc), and allowing for analysis and visualization of functional brain networks from M/EEG recordings. EEGNET is developed to analyze networks either at the level of scalp electrodes or at the level of reconstructed cortical sources. It includes i) Basic steps in preprocessing M/EEG signals, ii) the solution of the inverse problem to localize / reconstruct the cortical sources, iii) the computation of functional connectivity among signals collected at surface electrodes or/and time courses of reconstructed sources and iv) the computation of the network measures based on graph theory analysis. EEGNET is the unique tool that combines the M/EEG functional connectivity analysis and the computation of network measures derived from the graph theory. The first version of EEGNET is easy to use, flexible and user friendly. EEGNET is an open source tool and can be freely downloaded from this webpage: https://sites.google.com/site/eegnetworks/.
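EEGNET itself runs under MATLAB; the NumPy sketch below illustrates, under simplified assumptions, the last two stages of such a pipeline: estimating functional connectivity as channel-by-channel correlation and deriving a basic graph measure (node degree) after thresholding. The random data, the choice of correlation as the connectivity metric, and the 10% edge-density threshold are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(2)
n_channels, n_samples = 32, 1000
signals = rng.standard_normal((n_channels, n_samples))   # placeholder M/EEG data

# Functional connectivity: absolute Pearson correlation between channels.
connectivity = np.abs(np.corrcoef(signals))
np.fill_diagonal(connectivity, 0.0)

# Binarize with a proportional threshold (keep the strongest 10% of edges).
upper = connectivity[np.triu_indices(n_channels, k=1)]
threshold = np.quantile(upper, 0.90)
adjacency = (connectivity >= threshold).astype(int)

# A basic graph-theory measure: node degree (retained edges per channel).
degree = adjacency.sum(axis=1)
print(degree)
```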
Molnos, Sophie; Baumbach, Clemens; Wahl, Simone; Müller-Nurasyid, Martina; Strauch, Konstantin; Wang-Sattler, Rui; Waldenberger, Melanie; Meitinger, Thomas; Adamski, Jerzy; Kastenmüller, Gabi; Suhre, Karsten; Peters, Annette; Grallert, Harald; Theis, Fabian J; Gieger, Christian
2017-09-29
Genome-wide association studies allow us to understand the genetics of complex diseases. Human metabolism provides information about the disease-causing mechanisms, so it is usual to investigate the associations between genetic variants and metabolite levels. However, only considering genetic variants and their effects on one trait ignores the possible interplay between different "omics" layers. Existing tools only consider single-nucleotide polymorphism (SNP)-SNP interactions, and no practical tool is available for large-scale investigations of the interactions between pairs of arbitrary quantitative variables. We developed an R package called pulver to compute p-values for the interaction term in a very large number of linear regression models. Comparisons based on simulated data showed that pulver is much faster than the existing tools. This is achieved by using the correlation coefficient to test the null-hypothesis, which avoids the costly computation of inversions. Additional tricks are a rearrangement of the order, when iterating through the different "omics" layers, and implementing this algorithm in the fast programming language C++. Furthermore, we applied our algorithm to data from the German KORA study to investigate a real-world problem involving the interplay among DNA methylation, genetic variants, and metabolite levels. The pulver package is a convenient and rapid tool for screening huge numbers of linear regression models for significant interaction terms in arbitrary pairs of quantitative variables. pulver is written in R and C++, and can be downloaded freely from CRAN at https://cran.r-project.org/web/packages/pulver/ .
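The correlation-based screening idea can be sketched in Python as follows: standardize the two predictor blocks, correlate the outcome with each pairwise product, and convert r to a p-value. This mirrors the spirit of the approach but is not the exact pulver statistic (the package is written in R and C++ and handles the regression setting more carefully); all data here are simulated.

```python
import numpy as np
from scipy import stats

def interaction_screen(y, X, Z):
    """Screen all (x, z) pairs for an interaction with outcome y by correlating
    y with the elementwise product of the standardized predictors. Returns an
    (n_x, n_z) matrix of p-values. Illustrative only."""
    n = len(y)
    yc = (y - y.mean()) / y.std()
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    Zs = (Z - Z.mean(axis=0)) / Z.std(axis=0)
    pvals = np.empty((X.shape[1], Z.shape[1]))
    for i in range(X.shape[1]):
        prod = Xs[:, [i]] * Zs                        # products with every z at once
        prod = (prod - prod.mean(axis=0)) / prod.std(axis=0)
        r = prod.T @ yc / n                           # correlations with the outcome
        t = r * np.sqrt((n - 2) / (1 - r**2))         # r-to-t conversion
        pvals[i] = 2 * stats.t.sf(np.abs(t), df=n - 2)
    return pvals

rng = np.random.default_rng(3)
n = 500
X = rng.standard_normal((n, 20))   # e.g. methylation levels (simulated)
Z = rng.standard_normal((n, 30))   # e.g. SNP dosages (simulated)
y = 0.5 * X[:, 0] * Z[:, 0] + rng.standard_normal(n)   # one true interaction
p = interaction_screen(y, X, Z)
print(p[0, 0], p.min())   # the (0, 0) pair should be among the smallest p-values
```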
DOE Office of Scientific and Technical Information (OSTI.GOV)
Almgren, Ann; DeMar, Phil; Vetter, Jeffrey
The widespread use of computing in the American economy would not be possible without a thoughtful, exploratory research and development (R&D) community pushing the performance edge of operating systems, computer languages, and software libraries. These are the tools and building blocks — the hammers, chisels, bricks, and mortar — of the smartphone, the cloud, and the computing services on which we rely. Engineers and scientists need ever-more specialized computing tools to discover new material properties for manufacturing, make energy generation safer and more efficient, and provide insight into the fundamentals of the universe, for example. The research division of the U.S. Department of Energy's (DOE's) Office of Advanced Scientific Computing and Research (ASCR Research) ensures that these tools and building blocks are being developed and honed to meet the extreme needs of modern science. See also http://exascaleage.org/ascr/ for additional information.
Computer-aided design/computer-aided manufacturing skull base drill.
Couldwell, William T; MacDonald, Joel D; Thomas, Charles L; Hansen, Bradley C; Lapalikar, Aniruddha; Thakkar, Bharat; Balaji, Alagar K
2017-05-01
The authors have developed a simple device for computer-aided design/computer-aided manufacturing (CAD-CAM) that uses an image-guided system to define a cutting tool path that is shared with a surgical machining system for drilling bone. Information from 2D images (obtained via CT and MRI) is transmitted to a processor that produces a 3D image. The processor generates code defining an optimized cutting tool path, which is sent to a surgical machining system that can drill the desired portion of bone. This tool has applications for bone removal in both cranial and spine neurosurgical approaches. Such applications have the potential to reduce surgical time and associated complications such as infection or blood loss. The device enables rapid removal of bone within 1 mm of vital structures. The validity of such a machining tool is exemplified in the rapid (< 3 minutes machining time) and accurate removal of bone for transtemporal (for example, translabyrinthine) approaches.
Optimal low thrust geocentric transfer. [mission analysis computer program
NASA Technical Reports Server (NTRS)
Edelbaum, T. N.; Sackett, L. L.; Malchow, H. L.
1973-01-01
A computer code which will rapidly calculate time-optimal low thrust transfers is being developed as a mission analysis tool. The final program will apply to NEP or SEP missions and will include a variety of environmental effects. The current program assumes constant acceleration. The oblateness effect and shadowing may be included. Detailed state and costate equations are given for the thrust effect, oblateness effect, and shadowing. A simple but adequate model yields analytical formulas for power degradation due to the Van Allen radiation belts for SEP missions. The program avoids the classical singularities by the use of equinoctial orbital elements. Kryloff-Bogoliuboff averaging is used to facilitate rapid calculation. Results for selected cases using the current program are given.
Examples of Effective Data Sharing in Scientific Publishing
Kitchin, John R.
2015-05-11
Here, we present a perspective on an approach to data sharing in scientific publications we have been developing in our group. The essence of the approach is that data can be embedded in a human-readable and machine-addressable way within the traditional publishing environment. We show this by example for both computational and experimental data. We articulate a need for new authoring tools to facilitate data sharing, and we discuss the tools we have been developing for this purpose. With these tools, data generation, analysis, and manuscript preparation can be deeply integrated, resulting in easier and better data sharing in scientific publications.
NASA Technical Reports Server (NTRS)
Manford, J. S.; Bennett, G. R.
1985-01-01
The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include an approach for rigorous analysis of operations functions, use of the resources of a large computer network, and provisions for efficient research and access to information.
3D Graphics Through the Internet: A "Shoot-Out"
NASA Technical Reports Server (NTRS)
Watson, Val; Lasinski, T. A. (Technical Monitor)
1995-01-01
3D graphics through the Internet needs to move beyond the current lowest common denominator of pre-computed movies, which consume bandwidth and are non-interactive. Panelists will demonstrate and compare 3D graphical tools for accessing, analyzing, and collaborating on information through the Internet and World-wide web. The "shoot-out" will illustrate which tools are likely to be the best for the various types of information, including dynamic scientific data, 3-D objects, and virtual environments. The goal of the panel is to encourage more effective use of the Internet by encouraging suppliers and users of information to adopt the next generation of graphical tools.
Design tool for multiprocessor scheduling and evaluation of iterative dataflow algorithms
NASA Technical Reports Server (NTRS)
Jones, Robert L., III
1995-01-01
A graph-theoretic design process and software tool is defined for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. Graph-search algorithms and analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool applies the design process to a given problem and includes performance optimization through the inclusion of additional precedence constraints among the schedulable tasks.
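One standard example of the kind of performance bound such graph analysis can produce is the critical-path (longest-path) latency of a dataflow DAG. The sketch below is a generic textbook computation rather than the cited tool's algorithm, and the task names and times are hypothetical.

```python
# Illustrative sketch: longest-path (critical-path) length of a dataflow DAG,
# a common lower bound on single-iteration schedule length on any number of processors.
from collections import defaultdict
from graphlib import TopologicalSorter

def critical_path_length(tasks, edges):
    """tasks: {name: execution_time}; edges: iterable of (producer, consumer).
    Returns the longest path length through the graph."""
    preds = defaultdict(list)
    graph = {t: set() for t in tasks}     # node -> set of predecessors
    for u, v in edges:
        graph[v].add(u)
        preds[v].append(u)
    finish = {}
    for t in TopologicalSorter(graph).static_order():
        start = max((finish[p] for p in preds[t]), default=0.0)
        finish[t] = start + tasks[t]
    return max(finish.values())

# Hypothetical example: A -> B -> C gives a bound of 2 + 3 + 1 = 6.
print(critical_path_length({"A": 2, "B": 3, "C": 1}, [("A", "B"), ("B", "C")]))
```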
Collaboration technology and space science
NASA Technical Reports Server (NTRS)
Leiner, Barry M.; Brown, R. L.; Haines, R. F.
1990-01-01
A summary of available collaboration technologies and their applications to space science is presented as well as investigations into remote coaching paradigms and the role of a specific collaboration tool for distributed task coordination in supporting such teleoperations. The applicability and effectiveness of different communication media and tools in supporting remote coaching are investigated. One investigation concerns a distributed check-list, a computer-based tool that allows a group of people, e.g., onboard crew, ground based investigator, and mission control, to synchronize their actions while providing full flexibility for the flight crew to set the pace and remain on their operational schedule. This autonomy is shown to contribute to morale and productivity.
He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z
2013-12-04
Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.
Modeling RF-induced Plasma-Surface Interactions with VSim
NASA Astrophysics Data System (ADS)
Jenkins, Thomas G.; Smithe, David N.; Pankin, Alexei Y.; Roark, Christine M.; Stoltz, Peter H.; Zhou, Sean C.-D.; Kruger, Scott E.
2014-10-01
An overview of ongoing enhancements to the Plasma Discharge (PD) module of Tech-X's VSim software tool is presented. A sub-grid kinetic sheath model, developed for the accurate computation of sheath potentials near metal and dielectric-coated walls, enables the physical effects of DC and RF sheath dynamics to be included in macroscopic-scale plasma simulations that need not explicitly resolve sheath scale lengths. Sheath potential evolution, together with particle behavior near the sheath (e.g. sputtering), can thus be simulated in complex, experimentally relevant geometries. Simulations of RF sheath-enhanced impurity production near surfaces of the C-Mod field-aligned ICRF antenna are presented to illustrate the model; impurity mitigation techniques are also explored. Model extensions to capture the physics of secondary electron emission and of multispecies plasmas are summarized, together with a discussion of improved tools for plasma chemistry and IEDF/EEDF visualization and modeling. The latter tools are also highly relevant for commercial plasma processing applications. Ultimately, we aim to establish VSimPD as a robust, efficient computational tool for modeling fusion and industrial plasma processes. Supported by U.S. DoE SBIR Phase I/II Award DE-SC0009501.
NASA Astrophysics Data System (ADS)
Korchuganova, M.; Syrbakov, A.; Chernysheva, T.; Ivanov, G.; Gnedasch, E.
2016-08-01
Among common chip-curling methods, the most widespread is a special tool-face form, which is produced either by grinding or by profile pressing during the manufacture of RMSP. Currently, over 15 large tool manufacturers produce tools using more than 500 grades of tool material. Added to this is a large variety of tool-face geometries whose purpose includes controlling the form and dimensions of the chip. Taking into account the many workpiece materials, the specific tasks of the process planner, and the quality requirements for manufactured products, choosing the tool that can perform the machining most effectively becomes significantly harder. In recent years, the range of RMSP for lathe tools with mechanical mounting has been considerably broadened through diversification of their face geometries.
Design Tools for Reconfigurable Hardware in Orbit (RHinO)
NASA Technical Reports Server (NTRS)
French, Mathew; Graham, Paul; Wirthlin, Michael; Larchev, Gregory; Bellows, Peter; Schott, Brian
2004-01-01
The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable-gate-array (FPGA) technology. These tools leverage an established FPGA design environment and focus primarily on space-effects mitigation and power optimization. The project is creating software to automatically test and evaluate the single-event-upset (SEU) sensitivities of an FPGA design and insert mitigation techniques. Extensions to the tool suite will also allow evolvable algorithm techniques to reconfigure around single-event-latchup (SEL) events. In the power domain, tools are being created for dynamic power visualization and optimization. Thus, this technology seeks to enable the use of Reconfigurable Hardware in Orbit via an integrated design tool suite aiming to reduce the risk, cost, and design time of multimission reconfigurable space processors using SRAM-based FPGAs.
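The abstract does not name the specific mitigation techniques such tools insert, but triple modular redundancy (TMR) is one widely used SEU countermeasure in SRAM-based FPGAs. The toy Python model below illustrates only the majority-voting idea behind TMR; real tools insert redundancy and voters at the netlist level, not in software.

```python
# Toy model of triple modular redundancy (TMR): three redundant copies of a signal
# are combined by a bitwise majority vote, so a single upset copy cannot corrupt
# the output. Real SEU-mitigation tools implement this in FPGA logic, not Python.
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority vote of three redundant copies of a signal."""
    return (a & b) | (a & c) | (b & c)

# If one copy is corrupted by an upset, the voter still returns the correct value:
assert tmr_vote(0b1010, 0b1010, 0b0010) == 0b1010
```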
Web 2.0 Technologies for Effective Knowledge Management in Organizations: A Qualitative Analysis
ERIC Educational Resources Information Center
Nath, Anupam Kumar
2012-01-01
A new generation of Internet-based collaborative tools, commonly known as Web 2.0, has increased in popularity, availability, and power in the last few years (Kane and Fichman, 2009). Web 2.0 is a set of Internet-based applications that harness network effects by facilitating collaborative and participative computing (O'Reilly, 2006).…
ERIC Educational Resources Information Center
Wu, Tung-Ju; Tai, Yu-Nan
2016-01-01
With the waves of the Internet and the trends of the era, information technology is a door connecting to the world and generating a multiplier effect on learning. Students' learning should not be regarded merely as a tool for coping with school examinations. Frequent contact with computers, networks, and relevant information allows students to enjoy the…
ERIC Educational Resources Information Center
Chukharev-Hudilainen, Evgeny; Klepikova, Tatiana A.
2016-01-01
The purpose of the present paper is twofold; first, we present an empirical study evaluating the effectiveness of a novel CALL tool for foreign language vocabulary instruction based on spaced repetition of target vocabulary items. The study demonstrates that by spending an average of three minutes each day on automatically generated vocabulary…
ERIC Educational Resources Information Center
Zubas, Patrice; Heiss, Cindy; Pedersen, Mary
2006-01-01
The purpose of this study was to ascertain if an online computer tutorial on diabetes mellitus, supplemented to traditional classroom lecture, is an effective tool in the education of nutrition students. Students completing a web-based tutorial as a supplement to classroom lecture displayed greater improvement in pre- vs. post-test scores compared…
Systems Biology for Organotypic Cell Cultures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grego, Sonia; Dougherty, Edward R.; Alexander, Francis J.
Translating in vitro biological data into actionable information related to human health holds the potential to improve disease treatment and risk assessment of chemical exposures. While genomics has identified regulatory pathways at the cellular level, translation to the organism level requires a multiscale approach accounting for intra-cellular regulation, inter-cellular interaction, and tissue/organ-level effects. Tissue-level effects can now be probed in vitro thanks to recently developed systems of three-dimensional (3D), multicellular, “organotypic” cell cultures, which mimic functional responses of living tissue. However, there remains a knowledge gap regarding interactions across different biological scales, complicating accurate prediction of health outcomes from molecular/genomic data and tissue responses. Systems biology aims at mathematical modeling of complex, non-linear biological systems. We propose to apply a systems biology approach to achieve a computational representation of tissue-level physiological responses by integrating empirical data derived from organotypic culture systems with computational models of intracellular pathways to better predict human responses. Successful implementation of this integrated approach will provide a powerful tool for faster, more accurate and cost-effective screening of potential toxicants and therapeutics. On September 11, 2015, an interdisciplinary group of scientists, engineers, and clinicians gathered for a workshop in Research Triangle Park, North Carolina, to discuss this ambitious goal. Participants represented laboratory-based and computational modeling approaches to pharmacology and toxicology, as well as the pharmaceutical industry, government, non-profits, and academia. Discussions focused on identifying critical system perturbations to model, the computational tools required, and the experimental approaches best suited to generating key data. This consensus report summarizes the discussions held.
Systems biology for organotypic cell cultures.
Grego, Sonia; Dougherty, Edward R; Alexander, Francis J; Auerbach, Scott S; Berridge, Brian R; Bittner, Michael L; Casey, Warren; Cooley, Philip C; Dash, Ajit; Ferguson, Stephen S; Fennell, Timothy R; Hawkins, Brian T; Hickey, Anthony J; Kleensang, Andre; Liebman, Michael N J; Martin, Florian; Maull, Elizabeth A; Paragas, Jason; Qiao, Guilin Gary; Ramaiahgari, Sreenivasa; Sumner, Susan J; Yoon, Miyoung
2017-01-01
Translating in vitro biological data into actionable information related to human health holds the potential to improve disease treatment and risk assessment of chemical exposures. While genomics has identified regulatory pathways at the cellular level, translation to the organism level requires a multiscale approach accounting for intra-cellular regulation, inter-cellular interaction, and tissue/organ-level effects. Tissue-level effects can now be probed in vitro thanks to recently developed systems of three-dimensional (3D), multicellular, "organotypic" cell cultures, which mimic functional responses of living tissue. However, there remains a knowledge gap regarding interactions across different biological scales, complicating accurate prediction of health outcomes from molecular/genomic data and tissue responses. Systems biology aims at mathematical modeling of complex, non-linear biological systems. We propose to apply a systems biology approach to achieve a computational representation of tissue-level physiological responses by integrating empirical data derived from organotypic culture systems with computational models of intracellular pathways to better predict human responses. Successful implementation of this integrated approach will provide a powerful tool for faster, more accurate and cost-effective screening of potential toxicants and therapeutics. On September 11, 2015, an interdisciplinary group of scientists, engineers, and clinicians gathered for a workshop in Research Triangle Park, North Carolina, to discuss this ambitious goal. Participants represented laboratory-based and computational modeling approaches to pharmacology and toxicology, as well as the pharmaceutical industry, government, non-profits, and academia. Discussions focused on identifying critical system perturbations to model, the computational tools required, and the experimental approaches best suited to generating key data.
Computer Assisted Learning for Biomedical Engineering Education: Tools
2001-10-25
Ayhan İSTANBULLU, İnan GÜLER; Department of Electronic... of Technical Education, Gazi University, 06500 Ankara, Türkiye. Abstract: An interactive multimedia learning environment is being proposed... Computer Assisted Learning (CAL) are given and some tools used in this area are explained. Together with the developments in the area of distance education
ERIC Educational Resources Information Center
Rolka, Christine; Remshagen, Anja
2015-01-01
Contextualized learning is considered beneficial for student success. In this article, we assess the impact of context-based learning tools on student grade performance in an introductory computer science course. In particular, we investigate two central questions: (1) does the use of context-based learning tools, robots and animations, affect…
Teaching and Learning Physics in a 1:1 Laptop School
NASA Astrophysics Data System (ADS)
Zucker, Andrew A.; Hug, Sarah T.
2008-12-01
1:1 laptop programs, in which every student is provided with a personal computer to use during the school year, permit increased and routine use of powerful, user-friendly computer-based tools. Growing numbers of 1:1 programs are reshaping the roles of teachers and learners in science classrooms. At the Denver School of Science and Technology, a public charter high school where a large percentage of students come from low-income families, 1:1 laptops are used often by teachers and students. This article describes the school's use of laptops, the Internet, and related digital tools, especially for teaching and learning physics. The data are from teacher and student surveys, interviews, classroom observations, and document analyses. Physics students and teachers use an interactive digital textbook; Internet-based simulations (some developed by a Nobel Prize winner); word processors; digital drop boxes; email; formative electronic assessments; computer-based and stand-alone graphing calculators; probes and associated software; and digital video cameras to explore hypotheses, collaborate, engage in scientific inquiry, and to identify strengths and weaknesses of students' understanding of physics. Technology provides students at DSST with high-quality tools to explore scientific concepts and the experiences of teachers and students illustrate effective uses of digital technology for high school physics.
Defining a Computational Framework for the Assessment of ...
The Adverse Outcome Pathway (AOP) framework describes the effects of environmental stressors across multiple scales of biological organization and function. This includes an evaluation of the potential for each key event to occur across a broad range of species in order to determine the taxonomic applicability of each AOP. Computational tools are needed to facilitate this process. Recently, we developed a tool that uses sequence homology to evaluate the applicability of molecular initiating events across species (Lalone et al., Toxicol. Sci., 2016). To extend our ability to make computational predictions at higher levels of biological organization, we have created the AOPdb. This database links molecular targets associated with key events in the AOP-Wiki to publicly available data (e.g. gene-protein, pathway, species orthology, ontology, chemical, disease), including ToxCast assay information. The AOPdb combines different data types in order to characterize the impacts of chemicals on human health and the environment and serves as a decision support tool for case study development in the area of taxonomic applicability. As a proof of concept, the AOPdb allows identification of relevant molecular targets, biological pathways, and chemical and disease associations across species for four AOPs from the AOP-Wiki (https://aopwiki.org): Estrogen receptor antagonism leading to reproductive dysfunction (Aop:30); Aromatase inhibition leading to reproductive d
Woodhouse, Steven; Piterman, Nir; Wintersteiger, Christoph M; Göttgens, Berthold; Fisher, Jasmin
2018-05-25
Reconstruction of executable mechanistic models from single-cell gene expression data represents a powerful approach to understanding developmental and disease processes. New ambitious efforts like the Human Cell Atlas will soon lead to an explosion of data with potential for uncovering and understanding the regulatory networks which underlie the behaviour of all human cells. In order to take advantage of this data, however, there is a need for general-purpose, user-friendly and efficient computational tools that can be readily used by biologists who do not have specialist computer science knowledge. The Single Cell Network Synthesis toolkit (SCNS) is a general-purpose computational tool for the reconstruction and analysis of executable models from single-cell gene expression data. Through a graphical user interface, SCNS takes single-cell qPCR or RNA-sequencing data taken across a time course, and searches for logical rules that drive transitions from early cell states towards late cell states. Because the resulting reconstructed models are executable, they can be used to make predictions about the effect of specific gene perturbations on the generation of specific lineages. SCNS should be of broad interest to the growing number of researchers working in single-cell genomics and will help further facilitate the generation of valuable mechanistic insights into developmental, homeostatic and disease processes.
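As a toy illustration of the kind of search described above, the sketch below brute-forces Boolean update rules consistent with a handful of observed state transitions; SCNS itself uses a far more scalable synthesis engine, and the gene names and observations here are invented.

```python
# Toy sketch (not the SCNS engine): exhaustively search for a Boolean update rule
# for one gene that reproduces observed transitions between discretized cell states.
from itertools import product

def consistent_rules(transitions, target, regulators):
    """transitions: list of (state_before, state_after) dicts of gene -> 0/1.
    Yields truth tables (dicts over regulator-value tuples) that reproduce the
    observed next value of `target` in every transition."""
    inputs = list(product((0, 1), repeat=len(regulators)))
    for outputs in product((0, 1), repeat=len(inputs)):
        table = dict(zip(inputs, outputs))
        if all(table[tuple(before[r] for r in regulators)] == after[target]
               for before, after in transitions):
            yield table

# Hypothetical observations: gene C turns on only when A and B are both on.
obs = [({"A": 1, "B": 1, "C": 0}, {"C": 1}),
       ({"A": 1, "B": 0, "C": 0}, {"C": 0}),
       ({"A": 0, "B": 1, "C": 0}, {"C": 0})]
print(next(consistent_rules(obs, "C", ["A", "B"])))
```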
Enhancements in medicine by integrating content based image retrieval in computer-aided diagnosis
NASA Astrophysics Data System (ADS)
Aggarwal, Preeti; Sardana, H. K.
2010-02-01
Computer-aided diagnosis (CAD) has become one of the major research subjects in medical imaging and diagnostic radiology. With CAD, radiologists use the computer output as a "second opinion" and make the final decisions. Image retrieval is a useful tool to help radiologists review medical images and diagnoses. The impact of content-based access to medical images is frequently reported, but existing systems are designed for only a particular context of diagnosis. The challenge in medical informatics is to develop tools for analyzing the content of medical images and to represent them in a way that can be efficiently searched and compared by physicians. CAD is a concept established by taking into account equally the roles of physicians and computers. To build a successful computer-aided diagnostic system, all the relevant technologies, especially retrieval, need to be integrated in such a manner that effective and efficient pre-diagnosed cases with proven pathology are provided for the current case at the right time. In this paper, it is suggested that integrating content-based image retrieval (CBIR) into CAD can bring substantial benefits in medicine, especially in diagnosis. This approach is also compared with other approaches by highlighting its advantages over them.
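A minimal sketch of the retrieval step being advocated, assuming a simple global intensity-histogram descriptor and Euclidean nearest-neighbor search; a real CBIR-in-CAD system would use far richer features, and the function names and data layout here are illustrative only.

```python
# Illustrative sketch: retrieve the most similar previously diagnosed cases for a
# query image using an intensity-histogram feature. Assumes intensities are scaled
# to [0, 1]; not a clinical system.
import numpy as np

def histogram_feature(image, bins=64):
    """Normalized intensity histogram as a simple global image descriptor."""
    hist, _ = np.histogram(image, bins=bins, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

def retrieve_similar(query_image, case_library, k=5):
    """case_library: list of (case_id, image, diagnosis). Returns the k cases
    whose histogram features are closest (Euclidean distance) to the query."""
    q = histogram_feature(query_image)
    scored = [(np.linalg.norm(q - histogram_feature(img)), cid, dx)
              for cid, img, dx in case_library]
    return sorted(scored)[:k]
```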
Spinozzi, Giulio; Calabria, Andrea; Brasca, Stefano; Beretta, Stefano; Merelli, Ivan; Milanesi, Luciano; Montini, Eugenio
2017-11-25
Bioinformatics tools designed to identify lentiviral or retroviral vector insertion sites in the genome of host cells are used to address the safety and long-term efficacy of hematopoietic stem cell gene therapy applications and to study the clonal dynamics of hematopoietic reconstitution. The increasing number of gene therapy clinical trials, combined with the increasing amount of Next Generation Sequencing data aimed at identifying integration sites, requires computational software that is both highly accurate and efficient enough to process "big data" in a reasonable computational time. Here we present VISPA2 (Vector Integration Site Parallel Analysis, version 2), the latest optimized computational pipeline for integration site identification and analysis, with the following features: (1) the sequence analysis for integration site processing is fully compliant with paired-end reads and includes a sequence quality filter before and after alignment on the target genome; (2) a heuristic algorithm that reduces false-positive integration sites at the nucleotide level, limiting the impact of polymerase chain reaction (PCR) or trimming/alignment artifacts; (3) a classification and annotation module for integration sites; (4) a user-friendly web interface as a researcher front-end to perform integration site analyses without computational skills; (5) speedup of all steps through parallelization (Hadoop-free). We tested VISPA2 performance using simulated and real datasets of lentiviral vector integration sites, previously obtained from patients enrolled in a hematopoietic stem cell gene therapy clinical trial, and compared the results with other preexisting tools for integration site analysis. On the computational side, VISPA2 showed a > 6-fold speedup and improved precision and recall metrics (1 and 0.97, respectively) compared to previously developed computational pipelines. These performances indicate that VISPA2 is a fast, reliable and user-friendly tool for integration site analysis, which allows gene therapy integration data to be handled in a cost- and time-effective fashion. Moreover, the web access of VISPA2 ( http://openserver.itb.cnr.it/vispa/ ) ensures accessibility and ease of use of a complex analytical tool for researchers. We released the source code of VISPA2 in a public repository ( https://bitbucket.org/andreacalabria/vispa2 ).
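Feature (2) above collapses candidate integration sites that differ by only a few nucleotides. The generic sketch below (not VISPA2's code) shows one simple way such nearby positions could be merged, attributing reads to the best-supported position; the window size and data layout are assumptions for illustration.

```python
# Generic sketch of collapsing nearby integration positions (PCR/trimming artifacts
# often shift a site by a few bases). Positions on the same chromosome and strand
# within `window` bases of an already-kept anchor are merged into that anchor.
from collections import Counter

def collapse_sites(sites, window=3):
    """sites: list of (chrom, strand, position) tuples, one per supporting read.
    Returns {(chrom, strand, position): read_count} with nearby positions merged."""
    counts = Counter(sites)
    merged = {}
    # Process positions from most to least supported so anchors are well supported.
    for (chrom, strand, pos), n in sorted(counts.items(), key=lambda kv: -kv[1]):
        for anchor in merged:
            if anchor[0] == chrom and anchor[1] == strand and abs(anchor[2] - pos) <= window:
                merged[anchor] += n
                break
        else:
            merged[(chrom, strand, pos)] = n
    return merged
```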
Extensible Computational Chemistry Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-08-09
ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory (EMSL) construction to solve the problem of enabling researchers to effectively utilize complex computational chemistry codes and massively parallel high-performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world-class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-its-kind end-to-end problem-solving environment for all phases of computational chemistry research: setting up calculations with a sophisticated GUI and direct-manipulation visualization tools, submitting and monitoring calculations on remote high-performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis, including creating publication-quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.