ERIC Educational Resources Information Center
van Han, Nguyen; van Rensburg, Henriette
2014-01-01
Many companies and organizations have been using the Test of English for International Communication (TOEIC) for business and commercial communication purposes in Vietnam and around the world. The present study investigated the effect of Computer Assisted Language Learning (CALL) on performance in the Test of English for International Communication…
Exascale computing and big data
Reed, Daniel A.; Dongarra, Jack
2015-06-25
Scientific discovery and engineering innovation require unifying traditionally separated high-performance computing and big data analytics. The tools and cultures of high-performance computing and big data analytics have diverged, to the detriment of both; unification is essential to address a spectrum of major research domains. The challenges of scale tax our ability to transmit data, compute complicated functions on that data, or store a substantial part of it; new approaches are required to meet these challenges. Finally, the international nature of science demands further development of advanced computer architectures and global standards for processing data, even as international competition complicates the openness of the scientific process.
NASA Technical Reports Server (NTRS)
Jackson, R. J.; Wang, T. T.
1974-01-01
A computer program was developed to describe the performance of ramjet and scramjet cycles. The program performs one-dimensional calculations of the equilibrium, real-gas internal flow properties of the engine. The program can be used for: (1) preliminary design calculations and (2) design analysis of internal flow properties corresponding to stipulated flow areas. Only the combustion of hydrogen in air is considered in this case.
The Role of the Goal in Solving Hard Computational Problems: Do People Really Optimize?
ERIC Educational Resources Information Center
Carruthers, Sarah; Stege, Ulrike; Masson, Michael E. J.
2018-01-01
The role that the mental, or internal, representation plays when people are solving hard computational problems has largely been overlooked to date, despite the reality that this internal representation drives problem solving. In this work we investigate how performance on versions of two hard computational problems differs based on what internal…
Computed Tomography Measuring Inside Machines
NASA Technical Reports Server (NTRS)
Wozniak, James F.; Scudder, Henry J.; Anders, Jeffrey E.
1995-01-01
Computed tomography applied to obtain approximate measurements of radial distances from centerline of turbopump to leading edges of diffuser vanes in turbopump. Use of computed tomography has significance beyond turbopump application: example of general concept of measuring internal dimensions of assembly of parts without having to perform time-consuming task of taking assembly apart and measuring internal parts on coordinate-measuring machine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deng, Z.T.
2001-11-15
The objective of this project was to conduct high-performance computing research and teaching at AAMU, and to train African-American and other minority students and scientists in the computational science field for eventual employment with DOE. During the project period, eight tasks were accomplished. Student research assistantships, work-study positions, summer internships, and scholarships proved to be among the best ways to attract top-quality minority students. With the support of DOE, through research, summer internships, collaborations, and scholarship programs, AAMU successfully provided research and educational opportunities to minority students in fields related to computational science.
Internal aerodynamics of a generic three-dimensional scramjet inlet at Mach 10
NASA Technical Reports Server (NTRS)
Holland, Scott D.
1995-01-01
A combined computational and experimental parametric study of the internal aerodynamics of a generic three-dimensional sidewall compression scramjet inlet configuration at Mach 10 has been performed. The study was designed to demonstrate the utility of computational fluid dynamics as a design tool in hypersonic inlet flow fields, to provide a detailed account of the nature and structure of the internal flow interactions, and to provide a comprehensive surface property and flow field database to determine the effects of contraction ratio, cowl position, and Reynolds number on the performance of a hypersonic scramjet inlet configuration. The work proceeded in several phases: the initial inviscid assessment of the internal shock structure, the preliminary computational parametric study, the coupling of the optimized configuration with the physical limitations of the facility, the wind tunnel blockage assessment, and the computational and experimental parametric study of the final configuration. Good agreement between computation and experimentation was observed in the magnitude and location of the interactions, particularly for weakly interacting flow fields. Large-scale forward separations resulted when the interaction strength was increased by increasing the contraction ratio or decreasing the Reynolds number.
NASA Technical Reports Server (NTRS)
Holland, Scott Douglas
1991-01-01
A combined computational and experimental parametric study of the internal aerodynamics of a generic three dimensional sidewall compression scramjet inlet configuration was performed. The study was designed to demonstrate the utility of computational fluid dynamics as a design tool in hypersonic inlet flow fields, to provide a detailed account of the nature and structure of the internal flow interactions, and to provide a comprehensive surface property and flow field database to determine the effects of contraction ratio, cowl position, and Reynolds number on the performance of a hypersonic scramjet inlet configuration.
Gentili, Rodolphe J.; Papaxanthis, Charalambos; Ebadzadeh, Mehdi; Eskiizmirliler, Selim; Ouanezar, Sofiane; Darlot, Christian
2009-01-01
Background Several authors suggested that gravitational forces are centrally represented in the brain for planning, control and sensorimotor predictions of movements. Furthermore, some studies proposed that the cerebellum computes the inverse dynamics (internal inverse model) whereas others suggested that it computes sensorimotor predictions (internal forward model). Methodology/Principal Findings This study proposes a model of cerebellar pathways deduced from both biological and physical constraints. The model learns the inverse dynamics computation of the effect of gravitational torques from its sensorimotor predictions without calculating an explicit inverse computation. By using supervised learning, this model learns to control an anthropomorphic robot arm actuated by two antagonistic McKibben artificial muscles. This was achieved by using internal parallel feedback loops containing neural networks that anticipate the sensorimotor consequences of the neural commands. The artificial neural network architecture was similar to the large-scale connectivity of the cerebellar cortex. Movements in the sagittal plane were performed during three sessions combining different initial positions, amplitudes, and directions of movements to vary the effects of the gravitational torques applied to the robotic arm. The results show that this model acquired an internal representation of the gravitational effects during vertical arm pointing movements. Conclusions/Significance This is consistent with the proposal that the cerebellar cortex contains an internal representation of gravitational torques which is encoded through a learning process. Furthermore, this model suggests that the cerebellum performs the inverse dynamics computation based on sensorimotor predictions. This highlights the importance of sensorimotor predictions of gravitational torques acting on upper limb movements performed in the gravitational field. PMID:19384420
Modeling a Wireless Network for International Space Station
NASA Technical Reports Server (NTRS)
Alena, Richard; Yaprak, Ece; Lamouri, Saad
2000-01-01
This paper describes the application of wireless local area network (LAN) simulation modeling methods to the hybrid LAN architecture designed for supporting crew-computing tools aboard the International Space Station (ISS). These crew-computing tools, such as wearable computers and portable advisory systems, will provide crew members with real-time vehicle and payload status information and access to digital technical and scientific libraries, significantly enhancing human capabilities in space. A wireless network, therefore, will provide wearable computers and remote instruments with the high-performance computational power needed by next-generation 'intelligent' software applications. Wireless network performance in such simulated environments is characterized by the sustainable throughput of data under different traffic conditions. This data will be used to help plan the addition of more access points supporting new modules and more nodes for increased network capacity as the ISS grows.
The visual system’s internal model of the world
Lee, Tai Sing
2015-01-01
The Bayesian paradigm has provided a useful conceptual theory for understanding perceptual computation in the brain. While the detailed neural mechanisms of Bayesian inference are not fully understood, recent computational and neurophysiological work has illuminated the underlying computational principles and representational architecture. The fundamental insights are that the visual system is organized as a modular hierarchy that encodes an internal model of the world, and that perception is realized by statistical inference based on this internal model. In this paper, I will discuss and analyze the varieties of representational schemes of these internal models and how they might be used to perform learning and inference. I will argue for a unified theoretical framework relating the internal models to the observed neural phenomena and mechanisms in the visual cortex. PMID:26566294
High performance computing for advanced modeling and simulation of materials
NASA Astrophysics Data System (ADS)
Wang, Jue; Gao, Fei; Vazquez-Poletti, Jose Luis; Li, Jianjiang
2017-02-01
The First International Workshop on High Performance Computing for Advanced Modeling and Simulation of Materials (HPCMS 2015) was held in Austin, Texas, USA, Nov. 18, 2015. HPCMS 2015 was organized by the Computer Network Information Center (Chinese Academy of Sciences), University of Michigan, Universidad Complutense de Madrid, University of Science and Technology Beijing, Pittsburgh Supercomputing Center, China Institute of Atomic Energy, and Ames Laboratory.
What a Decade of Experiments Reveals about Factors that Influence the Sense of Presence
2006-03-01
Function HRV heart rate variability IBM International Business Machines Corporation ICAT International Conference on Artificial Intelligence and...Questionnaire. Person-related measures: Social anxiety, age, gender, computer use. Task-related measures: Social anxiety assessment of partner. Performance...co-presence. (4) Computer use had a significant positive correlation with co-presence. (5) Participant's social anxiety had a significant
NASA Technical Reports Server (NTRS)
Mysko, Stephen J.; Chyu, Wei J.; Stortz, Michael W.; Chow, Chuen-Yen
1993-01-01
In this work, the computation of combined external/internal transonic flow on the complex forebody/inlet configuration of the AV-8B Harrier II is performed. The actual aircraft has been measured, and its surface and surrounding domain, in which the fuselage and inlet have a common wall, have been described using structured grids. The 'thin-layer' Navier-Stokes equations were used to model the flow along with the Chimera embedded multi-block technique. A fully conservative, alternating direction implicit (ADI), approximately factored, partially flux-split algorithm was employed to perform the computation. Comparisons with experimental wind tunnel data yielded good agreement for flow at zero incidence and angle of attack. The aim of this paper is to provide a methodology and computational tool for the numerical solution of complex external/internal flows.
Tandem internal models execute motor learning in the cerebellum.
Honda, Takeru; Nagao, Soichi; Hashimoto, Yuji; Ishikawa, Kinya; Yokota, Takanori; Mizusawa, Hidehiro; Ito, Masao
2018-06-25
In performing skillful movement, humans use predictions from internal models formed by repetition learning. However, the computational organization of internal models in the brain remains unknown. Here, we demonstrate that a computational architecture employing a tandem configuration of forward and inverse internal models enables efficient motor learning in the cerebellum. The model predicted learning adaptations observed in hand-reaching experiments in humans wearing a prism lens and explained the kinetic components of these behavioral adaptations. The tandem system also predicted a form of subliminal motor learning that was experimentally validated after training intentional misses of hand targets. Patients with cerebellar degeneration disease showed behavioral impairments consistent with tandemly arranged internal models. These findings validate computational tandemization of internal models in motor control and its potential uses in more complex forms of learning and cognition. Copyright © 2018 the Author(s). Published by PNAS.
Towards fully analog hardware reservoir computing for speech recognition
NASA Astrophysics Data System (ADS)
Smerieri, Anteo; Duport, François; Paquot, Yvan; Haelterman, Marc; Schrauwen, Benjamin; Massar, Serge
2012-09-01
Reservoir computing is a very recent, neural network inspired unconventional computation technique, where a recurrent nonlinear system is used in conjunction with a linear readout to perform complex calculations, leveraging its inherent internal dynamics. In this paper we show the operation of an optoelectronic reservoir computer in which both the nonlinear recurrent part and the readout layer are implemented in hardware for a speech recognition application. The performance obtained is close to that of state-of-the-art digital reservoirs, while the analog architecture opens the way to ultrafast computation.
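The computational principle behind such a reservoir can be sketched in software as a minimal echo state network: a fixed random recurrent layer whose internal dynamics are tapped by a trained linear readout. This is an illustrative sketch of the generic technique, not the authors' optoelectronic implementation; the task (reproducing a one-step-delayed input) and all parameters are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: N internal units, scaled for stable dynamics.
N, steps = 100, 500
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius below 1
w_in = rng.normal(size=N)

# Input: a scalar signal; target: the same signal delayed by one step.
u = rng.normal(size=steps)
x = np.zeros(N)
states = np.empty((steps, N))
for t in range(steps):
    x = np.tanh(W @ x + w_in * u[t])  # nonlinear recurrent update
    states[t] = x

# Linear readout trained by ridge regression on the reservoir states.
y_target = np.roll(u, 1)
washout = 50  # discard the initial transient
X, Y = states[washout:], y_target[washout:]
w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ Y)

y_pred = states @ w_out
err = np.mean((y_pred[washout:] - y_target[washout:]) ** 2)
print(f"mean squared error on the delay task: {err:.4f}")
```

Only the readout weights are trained; the recurrent part stays fixed, which is what makes a physical (optoelectronic) realization of the reservoir attractive.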
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qian, Xiaoqing; Deng, Z. T.
2009-11-10
This is the final report for the Department of Energy (DOE) project DE-FG02-06ER25746, entitled "Continuing High Performance Computing Research and Education at AAMU". This three-year project started on August 15, 2006, and ended on August 14, 2009. The objective of this project was to enhance high performance computing research and education capabilities at Alabama A&M University (AAMU), and to train African-American and other minority students and scientists in the computational science field for eventual employment with DOE. AAMU successfully completed all the proposed research and educational tasks. Through the support of DOE, AAMU was able to provide opportunities to minority students through summer internships and the DOE computational science scholarship program. In the past three years, AAMU (1) supported three graduate research assistants in image processing for a hypersonic shockwave control experiment and in computational-science-related areas; (2) recruited and provided full financial support for six AAMU undergraduate summer research interns to participate in the Research Alliance in Math and Science (RAMS) program at Oak Ridge National Laboratory (ORNL); (3) awarded 30 highly competitive DOE High Performance Computing Scholarships ($1500 each) to qualified top AAMU undergraduate students in science and engineering majors; (4) improved the high performance computing laboratory at AAMU with the addition of three high-performance Linux workstations; and (5) conducted image analysis for the electromagnetic shockwave control experiment and computation of shockwave interactions to verify the design and operation of the AAMU supersonic wind tunnel. The high performance computing research and education activities at AAMU had a great impact on minority students.
As praised by the Accreditation Board for Engineering and Technology (ABET) in 2009: "The work on high performance computing that is funded by the Department of Energy provides scholarships to undergraduate students as computational science scholars. This is a wonderful opportunity to recruit under-represented students." Three ASEE papers were published in the 2007, 2008, and 2009 proceedings of the ASEE Annual Conference, and presentations of these papers were made at the conferences. It is critical to continue these research and education activities.
Generic Hypersonic Inlet Module Analysis
NASA Technical Reports Server (NTRS)
Cockrell, Charles E., Jr.; Huebner, Lawrence D.
2004-01-01
A computational study associated with an internal inlet drag analysis was performed for a generic hypersonic inlet module. The purpose of this study was to determine the feasibility of computing the internal drag force for a generic scramjet engine module using computational methods. The computational study consisted of obtaining two-dimensional (2D) and three-dimensional (3D) computational fluid dynamics (CFD) solutions using the Euler and parabolized Navier-Stokes (PNS) equations. The solution accuracy was assessed by comparisons with experimental pitot pressure data. The CFD analysis indicates that the 3D PNS solutions show the best agreement with experimental pitot pressure data. The internal inlet drag analysis consisted of obtaining drag force predictions based on experimental data and 3D CFD solutions. A comparative assessment of each of the drag prediction methods is made and the sensitivity of CFD drag values to computational procedures is documented. The analysis indicates that the CFD drag predictions are highly sensitive to the computational procedure used.
NASA Technical Reports Server (NTRS)
Deere, Karen A.; Flamm, Jeffrey D.; Berrier, Bobby L.; Johnson, Stuart K.
2007-01-01
A computational investigation of an axisymmetric Dual Throat Nozzle concept has been conducted. This fluidic thrust-vectoring nozzle was designed with a recessed cavity to enhance the throat shifting technique for improved thrust vectoring. The structured-grid, unsteady Reynolds-Averaged Navier-Stokes flow solver PAB3D was used to guide the nozzle design and analyze performance. Nozzle design variables included extent of circumferential injection, cavity divergence angle, cavity length, and cavity convergence angle. Internal nozzle performance (wind-off conditions) and thrust vector angles were computed for several configurations over a range of nozzle pressure ratios from 1.89 to 10, with the fluidic injection flow rate equal to zero and up to 4 percent of the primary flow rate. The effect of a variable expansion ratio on nozzle performance over a range of freestream Mach numbers up to 2 was investigated. Results indicated that a 60° circumferential injection was a good compromise between large thrust vector angles and efficient internal nozzle performance. A cavity divergence angle greater than 10° was detrimental to thrust vector angle. Shortening the cavity length improved internal nozzle performance with a small penalty to thrust vector angle. Contrary to expectations, a variable expansion ratio did not improve thrust efficiency at the flight conditions investigated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2013-07-01
The Mathematics and Computation Division of the American Nuclear Society (ANS) and the Idaho Section of the ANS hosted the 2013 International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering (M and C 2013). These proceedings contain over 250 full papers, with topics including reactor physics; radiation transport; materials science; nuclear fuels; core performance and optimization; reactor systems and safety; fluid dynamics; medical applications; analytical and numerical methods; algorithms for advanced architectures; and validation, verification, and uncertainty quantification.
Dai, Huanping; Micheyl, Christophe
2015-05-01
Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
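The idea of a maximum attainable Pc can be illustrated with a Monte Carlo sketch (an illustration of the general principle, not the authors' MATLAB code): for a two-interval forced-choice task with equal-variance Gaussian internal noise, the optimal observer simply chooses the interval with the larger observation, and the resulting maximum Pc has the closed form Φ(d′/√2), against which the simulation can be checked.

```python
import numpy as np
from math import erf

rng = np.random.default_rng(1)
d_prime = 1.0
n = 200_000

# 2IFC with equal-variance Gaussian internal noise: the optimal observer
# chooses the interval with the larger observation.
signal = rng.normal(d_prime, 1.0, n)  # interval containing the signal
noise = rng.normal(0.0, 1.0, n)       # blank interval
pc_mc = np.mean(signal > noise)

# Closed-form maximum Pc for this paradigm: Phi(d'/sqrt(2)) = 0.5*(1 + erf(d'/2)).
pc_theory = 0.5 * (1 + erf(d_prime / 2))

print(f"Monte Carlo Pc = {pc_mc:.3f}, theory = {pc_theory:.3f}")
```

Replacing the two `rng.normal` draws with samples from any other distribution (uniform, stimulus-induced, etc.) extends the same Monte Carlo estimate to the non-Gaussian cases the article addresses.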
NASA Technical Reports Server (NTRS)
Waithe, Kenrick A.; Deere, Karen A.
2003-01-01
A computational and experimental study was conducted to investigate the effects of multiple injection ports in a two-dimensional, convergent-divergent nozzle for fluidic thrust vectoring. The concept of multiple injection ports was conceived to enhance the thrust vectoring capability of a convergent-divergent nozzle over that of a single injection port without increasing the secondary mass flow rate requirements. The experimental study was conducted at static conditions in the Jet Exit Test Facility of the 16-Foot Transonic Tunnel Complex at NASA Langley Research Center. Internal nozzle performance was obtained at nozzle pressure ratios up to 10 with secondary nozzle pressure ratios up to 1 for five configurations. The computational study was conducted using the Reynolds-Averaged Navier-Stokes computational fluid dynamics code PAB3D with two-equation turbulence closure and linear Reynolds stress modeling. Internal nozzle performance was predicted for nozzle pressure ratios up to 10 with a secondary nozzle pressure ratio of 0.7 for two configurations. Results from the experimental study indicate a benefit to multiple injection ports in a convergent-divergent nozzle. In general, increasing the number of injection ports from one to two increased the pitch thrust vectoring capability without any thrust performance penalties at nozzle pressure ratios less than 4 with high secondary pressure ratios. Results from the computational study are in excellent agreement with experimental results and validate PAB3D as a tool for predicting internal nozzle performance of a two-dimensional, convergent-divergent nozzle with multiple injection ports.
Noise facilitation in associative memories of exponential capacity.
Karbasi, Amin; Salavati, Amir Hesam; Shokrollahi, Amin; Varshney, Lav R
2014-11-01
Recent advances in associative memory design through structured pattern sets and graph-based inference algorithms have allowed reliable learning and recall of an exponential number of patterns that satisfy certain subspace constraints. Although these designs correct external errors in recall, they assume neurons that compute noiselessly, in contrast to the highly variable neurons in brain regions thought to operate associatively, such as hippocampus and olfactory cortex. Here we consider associative memories with boundedly noisy internal computations and analytically characterize performance. As long as the internal noise level is below a specified threshold, the error probability in the recall phase can be made exceedingly small. More surprisingly, we show that internal noise improves the performance of the recall phase while the pattern retrieval capacity remains intact: the number of stored patterns does not reduce with noise (up to a threshold). Computational experiments lend additional support to our theoretical analysis. This work suggests a functional benefit to noisy neurons in biological neuronal networks.
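The qualitative setup, recall of a stored pattern from a corrupted cue with noise injected into each neuron's internal computation, can be illustrated with a classical Hopfield network. This is a toy stand-in, not the authors' graph-based exponential-capacity design; the network size, noise level, and corruption rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 200, 5  # neurons, stored patterns (load well below capacity)

patterns = rng.choice([-1, 1], size=(P, N))
# Hebbian outer-product weights with zero diagonal.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def recall(cue, internal_noise=0.0, sweeps=20):
    """Asynchronous recall with additive noise on each neuron's input."""
    s = cue.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = W[i] @ s + rng.normal(0, internal_noise)  # noisy internal sum
            s[i] = 1 if h >= 0 else -1
    return s

# Corrupt a stored pattern (external errors), then recall with mild
# internal noise; overlap measures similarity with the original pattern.
target = patterns[0]
cue = target * np.where(rng.random(N) < 0.15, -1, 1)  # flip ~15% of bits
out = recall(cue, internal_noise=0.05)
overlap = np.mean(out == target)
print(f"overlap after recall: {overlap:.2f}")
```

At this low load, mild internal noise does not prevent the dynamics from settling on the stored pattern, consistent with the paper's below-threshold regime (the noise-as-benefit result itself requires the authors' structured designs).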
Self-evaluation of decision-making: A general Bayesian framework for metacognitive computation.
Fleming, Stephen M; Daw, Nathaniel D
2017-01-01
People are often aware of their mistakes, and report levels of confidence in their choices that correlate with objective performance. These metacognitive assessments of decision quality are important for the guidance of behavior, particularly when external feedback is absent or sporadic. However, a computational framework that accounts for both confidence and error detection is lacking. In addition, accounts of dissociations between performance and metacognition have often relied on ad hoc assumptions, precluding a unified account of intact and impaired self-evaluation. Here we present a general Bayesian framework in which self-evaluation is cast as a "second-order" inference on a coupled but distinct decision system, computationally equivalent to inferring the performance of another actor. Second-order computation may ensue whenever there is a separation between internal states supporting decisions and confidence estimates over space and/or time. We contrast second-order computation against simpler first-order models in which the same internal state supports both decisions and confidence estimates. Through simulations we show that second-order computation provides a unified account of different types of self-evaluation often considered in separate literatures, such as confidence and error detection, and generates novel predictions about the contribution of one's own actions to metacognitive judgments. In addition, the model provides insight into why subjects' metacognition may sometimes be better or worse than task performance. We suggest that second-order computation may underpin self-evaluative judgments across a range of domains. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
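The first-order/second-order contrast can be illustrated with a small simulation (a hedged toy model, not the authors' full Bayesian framework): decisions come from one noisy internal sample, while second-order confidence reads out a coupled but distinct sample. Unlike a first-order readout of the decision evidence itself, the second-order readout can contradict the choice and thereby signal likely errors.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
stim = rng.choice([-1, 1], n)        # true stimulus category
x_dec = stim + rng.normal(0, 1, n)   # evidence driving the decision
choice = np.sign(x_dec)
correct = choice == stim

# First-order confidence: a readout of the very sample used for the
# decision; by construction it always favors the chosen option.
conf_1st = np.abs(x_dec)

# Second-order confidence: a coupled but distinct internal sample,
# correlated with the decision evidence but carrying its own noise.
x_conf = stim + 0.5 * (x_dec - stim) + rng.normal(0, 0.5, n)
conf_2nd = x_conf * choice  # signed evidence for the chosen option

# Error detection: negative second-order confidence means the confidence
# sample contradicts the choice, something a first-order readout cannot do.
detect_rate = np.mean(conf_2nd[~correct] < 0)
print(f"accuracy = {correct.mean():.3f}")
print(f"mean 2nd-order confidence (correct vs error): "
      f"{conf_2nd[correct].mean():.2f} vs {conf_2nd[~correct].mean():.2f}")
print(f"errors flagged by the second-order readout: {detect_rate:.2f}")
```

The coupling coefficient (0.5) and the confidence noise level are hypothetical; varying them reproduces, in miniature, dissociations between performance and metacognitive sensitivity of the kind the framework addresses.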
Computational Methods for Configurational Entropy Using Internal and Cartesian Coordinates.
Hikiri, Simon; Yoshidome, Takashi; Ikeguchi, Mitsunori
2016-12-13
The configurational entropy of solute molecules is a crucially important quantity for studying various biophysical processes. Consequently, it is necessary to establish an efficient quantitative computational method to calculate configurational entropy as accurately as possible. In the present paper, we investigate the quantitative performance of the quasi-harmonic and related computational methods, including widely used methods implemented in popular molecular dynamics (MD) software packages, compared with the Clausius method, which is capable of accurately computing the change of the configurational entropy upon temperature change. Notably, we focused on the choice of the coordinate systems (i.e., internal or Cartesian coordinates). The Boltzmann-quasi-harmonic (BQH) method using internal coordinates outperformed all six methods examined here. The introduction of improper torsions in the BQH method improves its performance, and anharmonicity of proper torsions in proteins is identified to be the origin of the superior performance of the BQH method. In contrast, widely used methods implemented in MD packages show rather poor performance. In addition, the enhanced sampling of replica-exchange MD simulations was found to be efficient for the convergent behavior of entropy calculations. The BQH method was also reasonably accurate for the folding/unfolding transitions of a small protein, Chignolin. However, the independent term without the correlation term in the BQH method was most accurate for the folding entropy among the methods considered in this study, because the QH approximation of the correlation term in the BQH method was no longer valid for the divergent unfolded structures.
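The common core of quasi-harmonic estimators is to fit a multivariate Gaussian to sampled coordinate fluctuations and take its entropy; comparing the full estimate with an independent-term estimate (which drops the correlation term) shows why the latter overestimates entropy for correlated coordinates. The sketch below uses synthetic data and the Gaussian differential entropy in units of k_B; it illustrates the quasi-harmonic idea generically, not the BQH method itself.

```python
import numpy as np

rng = np.random.default_rng(4)

def quasi_harmonic_entropy(samples):
    """Entropy (in units of k_B) of a Gaussian fitted to the samples.

    samples: (n_frames, n_coords) array of coordinates, e.g. torsion
    angles or Cartesian fluctuations from an MD trajectory.
    """
    n_coords = samples.shape[1]
    cov = np.atleast_2d(np.cov(samples, rowvar=False))
    sign, logdet = np.linalg.slogdet(cov)
    # Differential entropy of a multivariate Gaussian:
    # S = 0.5 * ln((2*pi*e)^d * det(C))
    return 0.5 * (n_coords * np.log(2 * np.pi * np.e) + logdet)

# Synthetic "trajectory": two correlated harmonic coordinates.
n = 50_000
a = rng.normal(0, 0.3, n)
b = 0.6 * a + rng.normal(0, 0.2, n)
traj = np.column_stack([a, b])

s_full = quasi_harmonic_entropy(traj)
# The independent-term estimate ignores correlations, so it overestimates.
s_indep = sum(quasi_harmonic_entropy(traj[:, [i]]) for i in range(2))
print(f"full QH entropy: {s_full:.3f} k_B, independent-term: {s_indep:.3f} k_B")
```

The gap between the two estimates equals the mutual information between the coordinates, which is exactly the correlation term that the abstract notes becomes unreliable for divergent unfolded structures.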
A mixed analog/digital chaotic neuro-computer system for quadratic assignment problems.
Horio, Yoshihiko; Ikeguchi, Tohru; Aihara, Kazuyuki
2005-01-01
We construct a mixed analog/digital chaotic neuro-computer prototype system for quadratic assignment problems (QAPs). The QAP is one of the difficult NP-hard problems, and includes several real-world applications. Chaotic neural networks have been used to solve combinatorial optimization problems through chaotic search dynamics, which efficiently searches optimal or near optimal solutions. However, preliminary experiments have shown that, although it obtained good feasible solutions, the Hopfield-type chaotic neuro-computer hardware system could not obtain the optimal solution of the QAP. Therefore, in the present study, we improve the system performance by adopting a solution construction method, which constructs a feasible solution using the analog internal state values of the chaotic neurons at each iteration. In order to include the construction method into our hardware, we install a multi-channel analog-to-digital conversion system to observe the internal states of the chaotic neurons. We show experimentally that a great improvement in the system performance over the original Hopfield-type chaotic neuro-computer is obtained. That is, we obtain the optimal solution for the size-10 QAP in less than 1000 iterations. In addition, we propose a guideline for parameter tuning of the chaotic neuro-computer system according to the observation of the internal states of several chaotic neurons in the network.
ERIC Educational Resources Information Center
Unlu, Ali; Schurig, Michael
2015-01-01
Recently, performance profiles in reading, mathematics and science were created using the data collectively available in the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS) 2011. In addition, a classification of children to the end of their primary school years was…
Bibbo, Giovanni; Brown, Scott; Linke, Rebecca
2016-08-01
Diagnostic Reference Levels (DRLs) for procedures involving ionizing radiation are important tools for optimizing the radiation doses delivered to patients and for identifying cases where dose levels are unusually high. This is particularly important for paediatric patients undergoing computed tomography (CT) examinations, as these examinations are associated with relatively high doses. Paediatric CT studies performed at our institution from January 2010 to March 2014 were retrospectively analysed to determine the 75th and 95th percentiles of both the volume computed tomography dose index (CTDIvol) and dose-length product (DLP) for the most commonly performed studies, in order to: establish local diagnostic reference levels for paediatric CT examinations performed at our institution, benchmark our DRLs against national and international published paediatric values, and determine the compliance of CT radiographers with established protocols. The derived local 75th percentile DRLs were found to be acceptable when compared with those published by the Australian National Radiation Dose Register and two national children's hospitals, and at the international level with the National Reference Doses for the UK. The 95th percentiles of CTDIvol for the various CT examinations were found to be acceptable values for the CT scanner Dose-Check Notification. Benchmarking CT radiographers shows that they follow the set protocols for the various examinations without significant variation in the machine setting factors. The derivation of DRLs has given us a tool to evaluate and improve the performance of our CT service through improved compliance and a reduction in radiation dose to our paediatric patients. We have also been able to benchmark our performance against similar national and international institutions. © 2016 The Royal Australian and New Zealand College of Radiologists.
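The DRL derivation described above reduces, per study type, to taking the 75th and 95th percentiles of the recorded CTDIvol (or DLP) values. A minimal sketch using the Python standard library (the function name and interface are illustrative):

```python
import statistics

def drl_percentiles(ctdi_vol_values):
    """Return the (75th, 95th) percentiles of CTDIvol readings for
    one study type, as used for the local DRL and the Dose-Check
    Notification level respectively."""
    cuts = statistics.quantiles(ctdi_vol_values, n=100)  # 99 cut points
    return cuts[74], cuts[94]
```

With a large sample the choice of interpolation method barely matters; with small paediatric cohorts it can shift the reported DRL noticeably.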
Power-on performance predictions for a complete generic hypersonic vehicle configuration
NASA Technical Reports Server (NTRS)
Bennett, Bradford C.
1991-01-01
The Compressible Navier-Stokes (CNS) code was developed to compute external hypersonic flow fields and has been applied to various external hypersonic flow applications. Here, the CNS code was modified to compute hypersonic internal flow fields. Calculations were performed on a Mach 18 sidewall compression inlet and on the Lewis Mach 5 inlet. The use of the ARC3D diagonal algorithm for internal flows was evaluated on the Mach 5 inlet flow. The initial modifications to the CNS code involved generalization of the boundary conditions, the addition of viscous terms in the second crossflow direction, and modifications to the Baldwin-Lomax turbulence model for corner flows.
Casey, M
1996-08-15
Recurrent neural networks (RNNs) can learn to perform finite state computations. It is shown that an RNN performing a finite state computation must organize its state space to mimic the states in the minimal deterministic finite state machine that can perform that computation, and a precise description of the attractor structure of such systems is given. This knowledge effectively predicts activation space dynamics, which allows one to understand RNN computation dynamics in spite of complexity in activation dynamics. This theory provides a theoretical framework for understanding finite state machine (FSM) extraction techniques and can be used to improve training methods for RNNs performing FSM computations. This provides an example of a successful approach to understanding a general class of complex systems that has not been explicitly designed, e.g., systems that have evolved or learned their internal structure.
Users manual and modeling improvements for axial turbine design and performance computer code TD2-2
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1992-01-01
Computer code TD2 computes design point velocity diagrams and performance for multistage, multishaft, cooled or uncooled, axial flow turbines. This streamline analysis code was recently modified to upgrade modeling related to turbine cooling and to the internal loss correlation. These modifications are presented in this report along with descriptions of the code's expanded input and output. This report serves as the users manual for the upgraded code, which is named TD2-2.
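The design-point velocity diagrams computed by TD2-2 determine the stage specific work through the Euler turbine equation. A minimal illustrative sketch (not taken from the TD2-2 source; names and sign convention are assumptions):

```python
def euler_stage_work(u_mean, c_theta_in, c_theta_out):
    """Stage specific work (J/kg) from the Euler turbine equation:
        w = U * (c_theta_in - c_theta_out)
    where U is the mean blade speed and c_theta_* are the tangential
    components of the absolute velocity at rotor inlet and exit."""
    return u_mean * (c_theta_in - c_theta_out)
```

A stage that turns the flow from co-rotating swirl at inlet to counter-swirl at exit extracts work from both terms, which is why velocity-diagram choices dominate design-point performance.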
Neurosemantics, neurons and system theory.
Breidbach, Olaf
2007-08-01
Following the concept of internal representations, signal processing in a neuronal system has to be evaluated exclusively based on internal system characteristics. Thus, this approach omits the external observer as a control function for sensory integration. Instead, the configuration of the system and its computational performance are the effects of endogenous factors. Such self-referential operation is due to a strictly local computation in a network and, thereby, computations follow a set of rules that constitute the emergent behaviour of the system. These rules can be shown to correspond to a "logic" that is intrinsic to the system, an idea which provides the basis for neurosemantics.
ERIC Educational Resources Information Center
Punter, R. Annemiek; Meelissen, Martina R. M.; Glas, Cees A. W.
2017-01-01
IEA's International Computer and Information Literacy Study (ICILS) 2013 showed that in the majority of the participating countries, 14-year-old girls outperformed boys in computer and information literacy (CIL): results that seem to contrast with the common view of boys having better computer skills. This study used the ICILS data to explore…
Japanese supercomputer technology.
Buzbee, B L; Ewald, R H; Worlton, W J
1982-12-17
Under the auspices of the Ministry for International Trade and Industry the Japanese have launched a National Superspeed Computer Project intended to produce high-performance computers for scientific computation and a Fifth-Generation Computer Project intended to incorporate and exploit concepts of artificial intelligence. If these projects are successful, which appears likely, advanced economic and military research in the United States may become dependent on access to supercomputers of foreign manufacture.
Aero-Thermo-Structural Design Optimization of Internally Cooled Turbine Blades
NASA Technical Reports Server (NTRS)
Dulikravich, G. S.; Martin, T. J.; Dennis, B. H.; Lee, E.; Han, Z.-X.
1999-01-01
A set of robust and computationally affordable inverse shape design and automatic constrained optimization tools have been developed for the improved performance of internally cooled gas turbine blades. The design methods are applicable to the aerodynamics, heat transfer, and thermoelasticity aspects of the turbine blade. Maximum use of the existing proven disciplinary analysis codes is possible with this design approach. Preliminary computational results demonstrate possibilities to design blades with minimized total pressure loss and maximized aerodynamic loading. At the same time, these blades are capable of sustaining significantly higher inlet hot gas temperatures while requiring remarkably lower coolant mass flow rates. These results suggest that it is possible to design internally cooled turbine blades that will cost less to manufacture, will have a longer life span, and will perform as well as, if not better than, film-cooled turbine blades.
Lee, Cheens; Robinson, Kerin M; Wendt, Kate; Williamson, Dianne
The unimpeded functioning of hospital Health Information Services (HIS) is essential for patient care, clinical governance, organisational performance measurement, funding and research. In an investigation of hospital Health Information Services' preparedness for internal disasters, all hospitals in the state of Victoria with the following characteristics were surveyed: they have a Health Information Service/ Department; there is a Manager of the Health Information Service/Department; and their inpatient capacity is greater than 80 beds. Fifty percent of the respondents have experienced an internal disaster within the past decade, the majority affecting the Health Information Service. The most commonly occurring internal disasters were computer system failure and floods. Two-thirds of the hospitals have internal disaster plans; the most frequently occurring scenarios provided for are computer system failure, power failure and fire. More large hospitals have established back-up systems than medium- and small-size hospitals. Fifty-three percent of hospitals have a recovery plan for internal disasters. Hospitals typically self-rate as having a 'medium' level of internal disaster preparedness. Overall, large hospitals are better prepared for internal disasters than medium and small hospitals, and preparation for disruption of computer systems and medical record services is relatively high on their agendas.
Varol, Altan; Basa, Selçuk
2009-06-01
Maxillary distraction osteogenesis is a challenging procedure when performed with internal submerged distractors, owing to the need to set accurate distraction vectors. Five patients with severe maxillary retrognathia were planned with Mimics 10.01 CMF and Simplant 10.01 software. Distraction vectors and distractor rods were arranged in a 3D environment and on STL models. All patients were operated on under general anaesthesia, and complete Le Fort I downfracture was performed. All distractions were performed according to the orientated vectors. All patients achieved stable occlusion and a satisfactory aesthetic outcome at the end of the treatment period. Preoperative bending of the internal maxillary distractors prevents significant loss of operating time. 3D computer-aided surgical simulation and model surgery provide accurate orientation of distraction vectors for premaxillary and internal trans-sinusoidal maxillary distraction. The combination of virtual surgical simulation and stereolithographic model surgery can be validated as an effective method of preoperative planning for complicated maxillofacial surgery cases.
Importance of balanced architectures in the design of high-performance imaging systems
NASA Astrophysics Data System (ADS)
Sgro, Joseph A.; Stanton, Paul C.
1999-03-01
Imaging systems employed in demanding military and industrial applications, such as automatic target recognition and computer vision, typically require real-time high-performance computing resources. While high-performance computing systems have traditionally relied on proprietary architectures and custom components, recent advances in general-purpose microprocessor technology have produced an abundance of low-cost components suitable for use in high-performance computing systems. A common pitfall in the design of high-performance imaging systems, particularly systems employing scalable multiprocessor architectures, is the failure to balance computational and memory bandwidth. The performance of standard cluster designs, for example, in which several processors share a common memory bus, is typically constrained by memory bandwidth. The characteristic symptom of this problem is the failure of system performance to scale as more processors are added. The problem is exacerbated if I/O and memory functions share the same bus. The recent introduction of microprocessors with large internal caches and high-performance external memory interfaces makes it practical to design high-performance imaging systems with balanced computational and memory bandwidth. Real-world examples of such designs are presented, along with a discussion of adapting algorithm design to best utilize available memory bandwidth.
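The balance argument above is essentially what later became known as the roofline model: a kernel's attainable throughput is capped either by the compute peak or by memory bandwidth times arithmetic intensity, whichever is lower. A hedged sketch of that bound (the abstract predates this terminology; names and units here are illustrative):

```python
def attainable_gflops(peak_gflops, mem_bandwidth_gbs, arithmetic_intensity):
    """Roofline-style upper bound on kernel throughput.

    arithmetic_intensity is flops performed per byte moved over the
    memory bus; a kernel is memory-bound when
    mem_bandwidth * intensity < peak, and compute-bound otherwise."""
    return min(peak_gflops, mem_bandwidth_gbs * arithmetic_intensity)
```

Adding processors to a shared bus raises peak_gflops but not mem_bandwidth_gbs, so low-intensity imaging kernels stop scaling, exactly the symptom the abstract describes.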
Internal-illumination photoacoustic computed tomography
NASA Astrophysics Data System (ADS)
Li, Mucong; Lan, Bangxin; Liu, Wei; Xia, Jun; Yao, Junjie
2018-03-01
We report a photoacoustic computed tomography (PACT) system using a customized optical fiber with a cylindrical diffuser to internally illuminate deep targets. The traditional external light illumination in PACT usually limits the penetration depth to a few centimeters from the tissue surface, mainly due to strong optical attenuation along the light propagation path from the outside in. By contrast, internal light illumination, with external ultrasound detection, can potentially detect much deeper targets. Different from previous internal illumination PACT implementations using forward-looking optical fibers, our internal-illumination PACT system uses a customized optical fiber with a 3-cm-long conoid needle diffuser attached to the fiber tip, which can homogeneously illuminate the surrounding space and substantially enlarge the field of view. We characterized the internal illumination distribution and PACT system performance. We performed tissue phantom and in vivo animal studies to further demonstrate the superior imaging depth using internal illumination over external illumination. We imaged a 7.5-cm-deep leaf target embedded in optically scattering medium and the beating heart of a mouse overlaid with 3.7-cm-thick chicken tissue. Our results have collectively demonstrated that the internal light illumination combined with external ultrasound detection might be a useful strategy to improve the penetration depth of PACT in imaging deep organs of large animals and humans.
NASA Technical Reports Server (NTRS)
Kowalski, E. J.
1979-01-01
A computerized method that utilizes engine performance data to estimate the installed performance of aircraft gas turbine engines is described. The installation effects include: engine weight and dimensions, inlet and nozzle internal performance and drag, inlet and nacelle weight, and nacelle drag.
Software For Computer-Aided Design Of Control Systems
NASA Technical Reports Server (NTRS)
Wette, Matthew
1994-01-01
Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions, procedures and provides for definition of functions and procedures by user. Written in C language.
Personal Computer and Workstation Operating Systems Tutorial
1994-03-01
to a RAM area where it is executed by the CPU. The program consists of instructions that perform operations on data. The CPU will perform two basic...memory to improve system performance. More often the user will buy a new fixed disk so the computer will hold more programs internally. The trend today...MHZ. Another way to view how fast the information is going into the register is in a time domain rather than a frequency domain knowing that time and
Computer predictions of ground storage effects on performance of Galileo and ISPM generators
NASA Technical Reports Server (NTRS)
Chmielewski, A.
1983-01-01
Radioisotope Thermoelectric Generators (RTG) that will supply electrical power to the Galileo and International Solar Polar Mission (ISPM) spacecraft are exposed to several degradation mechanisms during the prolonged ground storage before launch. To assess the effect of storage on the RTG flight performance, a computer code has been developed which simulates all known degradation mechanisms that occur in an RTG during storage and flight. The modeling of these mechanisms and their impact on the RTG performance are discussed.
NASA Technical Reports Server (NTRS)
Trinh, H. P.; Gross, K. W.
1989-01-01
Computational studies have been conducted to examine the capability of a CFD code by simulating the steady-state thrust chamber internal flow. The SSME served as the sample case, and significant parameter profiles are presented and discussed. Performance predictions from TDK, the recommended JANNAF reference computer program, are compared with those from PHOENICS to establish the credibility of its results. The investigation of an overexpanded nozzle flow is particularly addressed, since it plays an important role in the area ratio selection of future rocket engines. Experience gained during this as-yet-incomplete flow separation study and future steps are outlined.
Ferraro, M; Foster, D H
1991-01-01
Under certain experimental conditions, visual discrimination performance in multielement images is closely related to visual identification performance: elements of the image are distinguished only insofar as they appear to have distinct, discrete, internal characterizations. This report is concerned with the detailed relationship between such internal characterizations and observable discrimination performance. Two types of general processes that might underlie discrimination are considered. The first is based on computing all possible internal image characterizations that could allow a correct decision, each characterization weighted by the probability of its occurrence and of a correct decision being made. The second process is based on computing the difference between the probabilities associated with the internal characterizations of the individual image elements, the difference quantified naturally with an l(p) norm. The relationship between the two processes was investigated analytically and by Monte Carlo simulations over a plausible range of numbers n of the internal characterizations of each of the m elements in the image. The predictions of the two processes were found to be closely similar. The relationship was precisely one-to-one, however, only for n = 2, m = 3, 4, 6, and for n greater than 2, m = 3, 4, p = 2. For all other cases tested, a one-to-one relationship was shown to be impossible.
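The second process above quantifies the difference between the internal characterizations of two image elements with an l(p) norm over their probability vectors. A minimal sketch of that distance (the interface is illustrative, not the authors' notation):

```python
def lp_difference(p, q, order=2):
    """l_p distance between two probability vectors p and q over the
    n internal characterizations of two image elements (order >= 1)."""
    return sum(abs(a - b) ** order for a, b in zip(p, q)) ** (1.0 / order)
```

The paper's special role for p = 2 corresponds to order=2 here, the only exponent for which the two candidate processes stay one-to-one when n exceeds 2.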
Synthesis of Efficient Structures for Concurrent Computation.
1983-10-01
A formal presentation of these techniques, called virtualization and aggregation, can be found in [King-83]. Census functions: trees perform broadcast... [Table-of-contents and figure-list residue; recoverable headings: Census Functions; User-Assisted Aggregation; Figure 6, Simple Parallel Structure for Broadcasting; Figure 7, Internal Structure of a Prefix Computation Network.]
NASA Technical Reports Server (NTRS)
Egolf, T. Alan; Anderson, Olof L.; Edwards, David E.; Landgrebe, Anton J.
1988-01-01
A user's manual is presented for the computer program developed for the prediction of propeller-nacelle aerodynamic performance reported in "An Analysis for High Speed Propeller-Nacelle Aerodynamic Performance Prediction: Volume 1 -- Theory and Application." The manual describes the computer program's mode of operation, requirements, input structure, input data requirements, and the program output. In addition, it provides the user with documentation of the internal program structure and the software used in the computer program as it relates to the theory presented in Volume 1. Sample input data setups are provided along with selected printout of the program output for one of the sample setups.
Chen, Feng; Wang, Shuang; Jiang, Xiaoqian; Ding, Sijie; Lu, Yao; Kim, Jihoon; Sahinalp, S. Cenk; Shimizu, Chisato; Burns, Jane C.; Wright, Victoria J.; Png, Eileen; Hibberd, Martin L.; Lloyd, David D.; Yang, Hai; Telenti, Amalio; Bloss, Cinnamon S.; Fox, Dov; Lauter, Kristin; Ohno-Machado, Lucila
2017-01-01
Motivation: We introduce PRINCESS, a privacy-preserving international collaboration framework for analyzing rare disease genetic data that are distributed across different continents. PRINCESS leverages Software Guard Extensions (SGX) and hardware for trustworthy computation. Unlike a traditional international collaboration model, where individual-level patient DNA are physically centralized at a single site, PRINCESS performs a secure and distributed computation over encrypted data, fulfilling institutional policies and regulations for protected health information. Results: To demonstrate PRINCESS’ performance and feasibility, we conducted a family-based allelic association study for Kawasaki Disease, with data hosted in three different continents. The experimental results show that PRINCESS provides secure and accurate analyses much faster than alternative solutions, such as homomorphic encryption and garbled circuits (over 40 000× faster). Availability and Implementation: https://github.com/achenfengb/PRINCESS_opensource Contact: shw070@ucsd.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28065902
Performability evaluation of the SIFT computer
NASA Technical Reports Server (NTRS)
Meyer, J. F.; Furchtgott, D. G.; Wu, L. T.
1979-01-01
Performability modeling and evaluation techniques are applied to the SIFT computer as it might operate in the computational environment of an air transport mission. User-visible performance of the total system (SIFT plus its environment) is modeled as a random variable taking values in a set of levels of accomplishment. These levels are defined in terms of four attributes of total system behavior: safety, no change in mission profile, no operational penalties, and no economic penalties. The base model is a stochastic process whose states describe the internal structure of SIFT as well as relevant conditions of the environment. Base model state trajectories are related to accomplishment levels via a capability function which is formulated in terms of a 3-level model hierarchy. Performability evaluation algorithms are then applied to determine the performability of the total system for various choices of computer and environment parameter values. Numerical results of those evaluations are presented and, in conclusion, some implications of this effort are discussed.
Foundations for computer simulation of a low pressure oil flooded single screw air compressor
NASA Astrophysics Data System (ADS)
Bein, T. W.
1981-12-01
The necessary logic to construct a computer model to predict the performance of an oil flooded, single screw air compressor is developed. The geometric variables and relationships used to describe the general single screw mechanism are developed. The governing equations to describe the processes are developed from their primary relationships. The assumptions used in the development are also defined and justified. The computer model predicts the internal pressure, temperature, and flowrates through the leakage paths throughout the compression cycle of the single screw compressor. The model uses empirical external values as the basis for the internal predictions. The computer values are compared to the empirical values, and conclusions are drawn based on the results. Recommendations are made for future efforts to improve the computer model and to verify some of the conclusions that are drawn.
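The governing equations for the compression process in such a model are typically built on polytropic relations between pressure, temperature, and specific work. The sketch below shows the assumed form only (the report's actual equations, oil-flooding effects, and leakage-path models are more detailed); the polytropic exponent and gas constant are illustrative values for air:

```python
def polytropic_compression(p1, t1, p2, n=1.3, r_gas=287.0):
    """Discharge temperature (K) and specific work (J/kg) for a
    polytropic compression from state (p1 Pa, t1 K) to pressure p2 Pa:
        T2 = T1 * (p2/p1)**((n-1)/n)
        w  = n/(n-1) * R * T1 * ((p2/p1)**((n-1)/n) - 1)
    Oil flooding lowers the effective exponent n toward 1 (isothermal).
    """
    ratio = (p2 / p1) ** ((n - 1.0) / n)
    t2 = t1 * ratio
    work = n / (n - 1.0) * r_gas * t1 * (ratio - 1.0)
    return t2, work
```

In a full simulation these relations would be applied incrementally over the compression cycle, with the leakage-path flow rates correcting the trapped mass at each step.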
High Productivity Computing Systems and Competitiveness Initiative
2007-07-01
planning committee for the annual, international Supercomputing Conference in 2004 and 2005. This is the leading HPC industry conference in the world. It...sector partnerships. Partnerships will form a key part of discussions at the 2nd High Performance Computing Users Conference, planned for July 13, 2005...other things an interagency roadmap for high-end computing core technologies and an accessibility improvement plan . Improving HPC Education and
The composition of intern work while on call.
Fletcher, Kathlyn E; Visotcky, Alexis M; Slagle, Jason M; Tarima, Sergey; Weinger, Matthew B; Schapira, Marilyn M
2012-11-01
BACKGROUND The work of house staff is being increasingly scrutinized as duty hours continue to be restricted. OBJECTIVE To describe the distribution of work performed by internal medicine interns while on call. DESIGN Prospective time motion study on general internal medicine wards at a VA hospital affiliated with a tertiary care medical center and internal medicine residency program. PARTICIPANTS Internal medicine interns. MAIN MEASURES Trained observers followed interns during a "call" day. The observers continuously recorded the tasks performed by interns, using customized task analysis software. We measured the amount of time spent on each task. We calculated means and standard deviations for the amount of time spent on six categories of tasks: clinical computer work (e.g., writing orders and notes), non-patient communication, direct patient care (work done at the bedside), downtime, transit, and teaching/learning. We also calculated means and standard deviations for time spent on specific tasks within each category. We compared the amount of time spent on the top three categories using analysis of variance. KEY RESULTS The largest proportion of intern time was spent in clinical computer work (40%). Thirty percent of time was spent on non-patient communication. Only 12% of intern time was spent at the bedside. Downtime activities, transit, and teaching/learning accounted for 11%, 5%, and 2% of intern time, respectively. CONCLUSIONS Our results suggest that during on-call periods, relatively small amounts of time are spent on direct patient care and teaching/learning activities. As intern duty hours continue to decrease, attention should be directed towards preserving time with patients and increasing time in education.
Internal computational fluid mechanics on supercomputers for aerospace propulsion systems
NASA Technical Reports Server (NTRS)
Andersen, Bernhard H.; Benson, Thomas J.
1987-01-01
The accurate calculation of three-dimensional internal flowfields for application towards aerospace propulsion systems requires computational resources available only on supercomputers. A survey is presented of three-dimensional calculations of hypersonic, transonic, and subsonic internal flowfields conducted at the Lewis Research Center. A steady state Parabolized Navier-Stokes (PNS) solution of flow in a Mach 5.0, mixed compression inlet, a Navier-Stokes solution of flow in the vicinity of a terminal shock, and a PNS solution of flow in a diffusing S-bend with vortex generators are presented and discussed. All of these calculations were performed on either the NAS Cray-2 or the Lewis Research Center Cray XMP.
Study on turbulent flow and heat transfer performance of tubes with internal fins in EGR cooler
NASA Astrophysics Data System (ADS)
Liu, Lin; Ling, Xiang; Peng, Hao
2015-07-01
In this paper, the flow and heat transfer performance of tubes with internal longitudinal fins in an Exhaust Gas Recirculation (EGR) cooler was investigated by three-dimensional computation and experiment. Each test tube was a single-pipe structure without an inner tube. Three-dimensional computation was performed to determine the difference in thermal characteristics between two kinds of tubes: the tube with an inner solid shaft as a blocked structure, and the tube without the blocked structure. The effects of fin width and fin height on heat transfer and flow were examined. To prove the validity of the numerical method, the calculated results were compared with the corresponding experimental data. The tube-side friction factor and heat transfer coefficient were examined. The maximum deviations between the numerical results and the experimental data are approximately 5.4% for the friction factor and 8.6% for the heat transfer coefficient, respectively. It is found that both types of internally finned tubes significantly enhance heat transfer. The heat transfer of the tube with the blocked structure is better, while the pressure drop of the tube without the blocked structure is lower. The overall performance of the unblocked tube makes it better suited for application in EGR coolers.
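The tube-side quantities examined above are conventionally reduced from measurements as a Darcy friction factor and a convective heat transfer coefficient. A minimal sketch of those standard reductions (the paper's exact data-reduction procedure may differ; names are illustrative):

```python
def darcy_friction_factor(dp, length, diameter, density, velocity):
    """Darcy friction factor from a measured pressure drop:
        f = dp * (D/L) / (rho * v**2 / 2)
    with dp in Pa, geometry in m, density in kg/m^3, velocity in m/s."""
    return dp * (diameter / length) / (0.5 * density * velocity ** 2)

def heat_transfer_coefficient(q_flux, t_wall, t_bulk):
    """Convective coefficient h = q'' / (T_wall - T_bulk), W/(m^2*K),
    from the wall heat flux and the wall-to-bulk temperature difference."""
    return q_flux / (t_wall - t_bulk)
```

Comparing computed and measured values of these two quantities is exactly how the 5.4% and 8.6% deviations quoted in the abstract would be obtained.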
Lee, Wan-Sun; Kim, Woong-Chul
2015-01-01
PURPOSE To assess the marginal and internal gaps of the copings fabricated by computer-aided milling and direct metal laser sintering (DMLS) systems in comparison to casting method. MATERIALS AND METHODS Ten metal copings were fabricated by casting, computer-aided milling, and DMLS. Seven mesiodistal and labiolingual positions were then measured, and each of these were divided into the categories; marginal gap (MG), cervical gap (CG), axial wall at internal gap (AG), and incisal edge at internal gap (IG). Evaluation was performed by a silicone replica technique. A digital microscope was used for measurement of silicone layer. Statistical analyses included one-way and repeated measure ANOVA to test the difference between the fabrication methods and categories of measured points (α=.05), respectively. RESULTS The mean gap differed significantly with fabrication methods (P<.001). Casting produced the narrowest gap in each of the four measured positions, whereas CG, AG, and IG proved narrower in computer-aided milling than in DMLS. Thus, with the exception of MG, all positions exhibited a significant difference between computer-aided milling and DMLS (P<.05). CONCLUSION Although the gap was found to vary with fabrication methods, the marginal and internal gaps of the copings fabricated by computer-aided milling and DMLS fell within the range of clinical acceptance (<120 µm). However, the statistically significant difference to conventional casting indicates that the gaps in computer-aided milling and DMLS fabricated restorations still need to be further reduced. PMID:25932310
The Influence of Test Mode and Visuospatial Ability on Mathematics Assessment Performance
ERIC Educational Resources Information Center
Logan, Tracy
2015-01-01
Mathematics assessment and testing are increasingly situated within digital environments with international tests moving to computer-based testing in the near future. This paper reports on a secondary data analysis which explored the influence the mode of assessment--computer-based (CBT) and pencil-and-paper based (PPT)--and visuospatial ability…
Computed radiography imaging plates and associated methods of manufacture
Henry, Nathaniel F.; Moses, Alex K.
2015-08-18
Computed radiography imaging plates incorporating an intensifying material that is coupled to or intermixed with the phosphor layer, allowing electrons and/or low energy x-rays to impart their energy on the phosphor layer, while decreasing internal scattering and increasing resolution. The radiation needed to perform radiography can also be reduced as a result.
Optimizing Engineering Tools Using Modern Ground Architectures
2017-12-01
Considerations,” International Journal of Computer Science & Engineering Survey, vol. 5, no. 4, 2014. [10] R. Bell. (n.d.). A beginner’s guide to Big O notation. ... scientific community. Traditional computing architectures were not capable of processing the data efficiently, or in some cases, could not process the ... This thesis investigates how these modern computing architectures could be leveraged by industry and academia to improve the performance and capabilities of ...
Kish, Gary; Cook, Samuel A; Kis, Gréta
2013-01-01
The University of Debrecen's Faculty of Medicine has an international, multilingual student population, with anatomy courses taught in English to all but Hungarian students. An elective computer-assisted gross anatomy course, Computer Human Anatomy (CHA), has been taught in English at the Anatomy Department since 2008. This course focuses on an introduction to anatomical digital images along with clinical cases. This low-budget course has a large visual component, using images from magnetic resonance imaging and computed axial tomography (CAT) scans, ultrasound clinical studies, and readily available anatomy software that presents topics running in parallel with the university's core anatomy curriculum. From the combined computer images and CHA lecture information, students are asked to solve computer-based clinical anatomy problems in the CHA computer laboratory. A statistical comparison was undertaken of the core anatomy oral examination performances of English-program first-year medical students who took the elective CHA course and those who did not, over the three academic years 2007-2008, 2008-2009, and 2009-2010. The results of this study indicate that the CHA-enrolled students improved their performance on the required anatomy core curriculum oral examinations (P < 0.001), suggesting that computer-assisted learning may play an active role in anatomy curriculum improvement. These preliminary results have prompted ongoing evaluation of which specific aspects of CHA are valuable and which students benefit from computer-assisted learning in a multilingual and diverse cultural environment. Copyright © 2012 American Association of Anatomists.
Effect of varying internal geometry on the static performance of rectangular thrust-reverser ports
NASA Technical Reports Server (NTRS)
Re, Richard J.; Mason, Mary L.
1987-01-01
An investigation has been conducted to evaluate the effects of several geometric parameters on the internal performance of rectangular thrust-reverser ports for nonaxisymmetric nozzles. Internal geometry was varied with a test apparatus which simulated a forward-flight nozzle with a single, fully deployed reverser port. The test apparatus was designed to simulate thrust reversal (conceptually) either in the convergent section of the nozzle or in the constant-area duct just upstream of the nozzle. The main geometric parameters investigated were port angle, port corner radius, port location, and internal flow blocker angle. For all reverser port geometries, the port opening had an aspect ratio (throat width to throat height) of 6.1 and had a constant passage area from the geometric port throat to the exit. Reverser-port internal performance and thrust-vector angles computed from force-balance measurements are presented.
Computational Aerodynamic Simulations of a Spacecraft Cabin Ventilation Fan Design
NASA Technical Reports Server (NTRS)
Tweedt, Daniel L.
2010-01-01
Quieter working environments for astronauts are needed if future long-duration space exploration missions are to be safe and productive. Ventilation and payload cooling fans are known to be dominant sources of noise, with the International Space Station being a good case in point. To address this issue cost effectively, early attention to fan design, selection, and installation has been recommended, leading to an effort by NASA to examine the potential for small-fan noise reduction by improving fan aerodynamic design. As a preliminary part of that effort, the aerodynamics of a cabin ventilation fan designed by Hamilton Sundstrand has been simulated using computational fluid dynamics codes, and the computed solutions analyzed to quantify various aspects of the fan aerodynamics and performance. Four simulations were performed at the design rotational speed: two at the design flow rate and two at off-design flow rates. Following a brief discussion of the computational codes, various aerodynamic- and performance-related quantities derived from the computed flow fields are presented along with relevant flow field details. The results show that the computed fan performance is in generally good agreement with stated design goals.
NASA Technical Reports Server (NTRS)
1995-01-01
A computational fluid dynamics (CFD) analysis has been performed on the aft slot region of the Titan 4 Solid Rocket Motor Upgrade (SRMU). The analysis was performed in conjunction with MSFC structural modeling of the propellant grain to determine whether flow-field-induced stresses would adversely alter the propellant geometry to the extent of causing motor failure. The results of the coupled CFD/stress analysis show a continual increase of flow-field resistance at the aft slot as the aft-segment propellant grain is progressively moved radially toward the centerline of the motor port. This 'bootstrapping' effect between grain radial movement and internal flow resistance is conducive to rapid motor failure.
Chen, Feng; Wang, Shuang; Jiang, Xiaoqian; Ding, Sijie; Lu, Yao; Kim, Jihoon; Sahinalp, S Cenk; Shimizu, Chisato; Burns, Jane C; Wright, Victoria J; Png, Eileen; Hibberd, Martin L; Lloyd, David D; Yang, Hai; Telenti, Amalio; Bloss, Cinnamon S; Fox, Dov; Lauter, Kristin; Ohno-Machado, Lucila
2017-03-15
We introduce PRINCESS, a privacy-preserving international collaboration framework for analyzing rare-disease genetic data distributed across different continents. PRINCESS leverages Intel Software Guard Extensions (SGX) hardware for trustworthy computation. Unlike a traditional international collaboration model, in which individual-level patient DNA data are physically centralized at a single site, PRINCESS performs a secure and distributed computation over encrypted data, fulfilling institutional policies and regulations for protected health information. To demonstrate PRINCESS's performance and feasibility, we conducted a family-based allelic association study for Kawasaki Disease, with data hosted on three different continents. The experimental results show that PRINCESS provides secure and accurate analyses much faster than alternative solutions such as homomorphic encryption and garbled circuits (over 40 000× faster). https://github.com/achenfengb/PRINCESS_opensource. shw070@ucsd.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
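Family-based allelic association studies of the kind mentioned above are commonly analyzed with a transmission disequilibrium test (TDT). The sketch below is a generic TDT on made-up transmission counts; it is not PRINCESS's actual protocol or the Kawasaki Disease data.

```python
# Transmission disequilibrium test (TDT): among heterozygous parents,
# compare counts of transmitted (b) vs untransmitted (c) risk alleles.
# The statistic (b - c)^2 / (b + c) follows a chi-square with 1 df.
def tdt_chi2(transmitted: int, untransmitted: int) -> float:
    return (transmitted - untransmitted) ** 2 / (transmitted + untransmitted)

# Hypothetical counts, not from the study above
chi2 = tdt_chi2(30, 10)
print(f"TDT chi-square = {chi2:.2f}")  # 10.00, above the 3.84 threshold (p < .05, 1 df)
```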
Rad-hard computer elements for space applications
NASA Technical Reports Server (NTRS)
Krishnan, G. S.; Longerot, Carl D.; Treece, R. Keith
1993-01-01
Space Hardened CMOS computer elements emulating a commercial microcontroller and microprocessor family have been designed, fabricated, qualified, and delivered for a variety of space programs including NASA's multiple launch International Solar-Terrestrial Physics (ISTP) program, Mars Observer, and government and commercial communication satellites. Design techniques and radiation performance of the 1.25 micron feature size products are described.
Measuring Weld Profiles By Computer Tomography
NASA Technical Reports Server (NTRS)
Pascua, Antonio G.; Roy, Jagatjit
1990-01-01
Noncontacting, nondestructive computer tomography system determines internal and external contours of welded objects. System makes it unnecessary to take metallurgical sections (destructive technique) or to take silicone impressions of hidden surfaces (technique that contaminates) to inspect them. Measurements of contours via tomography performed 10 times as fast as measurements via impression molds, and tomography does not contaminate inspected parts.
Combining high performance simulation, data acquisition, and graphics display computers
NASA Technical Reports Server (NTRS)
Hickman, Robert J.
1989-01-01
Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems non-destructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or to test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The complex is hosted by a Digital Equipment Corp. MicroVAX computer, with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high-speed simulation computing, and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high-performance FORTRAN program processor supporting the complex by generating, from programs coded in FORTRAN, the numerous large files required for real-time processing. Four programming languages are involved: FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system; the data files on the SCS40 are also generated with FORTRAN programs. ADSIM and ADRIO are used to program the processing elements of the AD100 and its IOCP processor, and STAPLE is used to program the Aptec DIP and DIA processors.
Modelling non-hydrostatic processes in sill regions
NASA Astrophysics Data System (ADS)
Souza, A.; Xing, J.; Davies, A.; Berntsen, J.
2007-12-01
We use a non-hydrostatic model to compute tidally induced flow and mixing over bottom topography representing the sill at the entrance to Loch Etive (Scotland), a site chosen because detailed measurements were recently made there. With non-hydrostatic dynamics included, the model reproduced the observed flow characteristics, e.g., hydraulic transition, flow separation, and internal waves. However, when the calculations were performed with the model in hydrostatic form, significant artificial convective mixing occurred, which influenced the computed temperature and flow fields. We discuss in detail the effects of non-hydrostatic dynamics on flow over the sill, and in particular investigate the non-linear and non-hydrostatic contributions to the modelled internal waves and internal wave energy fluxes.
Solid-propellant rocket motor internal ballistics performance variation analysis, phase 5
NASA Technical Reports Server (NTRS)
Sforzini, R. H.; Murph, J. E.
1980-01-01
The results of research aimed at improving the predictability of the internal ballistics performance of solid-propellant rocket motors (SRM's), including thrust imbalance between two SRM's firing in parallel, are presented. Static test data from the first six Space Shuttle SRM's are analyzed using a computer program previously developed for this purpose. The program permits intentional minor design biases affecting the imbalance between any two SRM's to be removed. Results for the last four of the six SRM's, with only the propellant bulk temperature as a non-random variable, are generally within limits predicted by theory. Extended studies of the internal ballistic performance of single SRM's are presented, based on an earlier mathematical model which includes an assessment of grain deformation. The erosive burning rate law used in the model is upgraded and made more general. Excellent results are obtained in predicting the performance of five different SRM's of quite different sizes and configurations. These SRM's all employ PBAN-type propellants with ammonium perchlorate oxidizer and 16 to 20% aluminum, except one, which uses a carboxyl-terminated butadiene binder. The only non-calculated parameters in the burning rate equations that are changed between the different SRM's are the zero-crossflow-velocity burning rate coefficients and exponents. The results, in general, confirm the importance of grain deformation. The improved internal ballistic model makes practical the development of an effective computer program for applying an optimization technique to SRM design, which is also demonstrated. The program uses a pattern search technique to minimize the difference between a desired thrust-time trace and one calculated from the internal ballistic model.
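Burning-rate laws of the kind referred to above classically take the form r = a·p^n, optionally augmented by an erosive term driven by crossflow velocity. The sketch below uses a simple additive erosive augmentation with made-up coefficients; it is not the report's upgraded law.

```python
def burn_rate(p, a=0.03, n=0.35, k_e=0.0, v_cross=0.0):
    """Propellant burning rate r = a * p**n plus a simple erosive term
    proportional to crossflow velocity (all coefficients illustrative)."""
    return a * p**n + k_e * v_cross

p = 600.0  # chamber pressure, psia (hypothetical)
r0 = burn_rate(p)                            # base rate, zero crossflow
r1 = burn_rate(p, k_e=1e-5, v_cross=2000.0)  # with erosive augmentation
print(f"base rate {r0:.4f} in/s, erosive rate {r1:.4f} in/s")
```

The "zero crossflow velocity burning rate coefficients" mentioned in the abstract correspond to `a` and `n` here: they set the rate when `v_cross` is zero.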
International Ultraviolet Explorer (IUE) satellite mission analysis
NASA Technical Reports Server (NTRS)
Cook, R. A.; Griffin, J. H.
1975-01-01
The results are presented of the mission analysis performed by Computer Sciences Corporation (CSC) in support of the International Ultraviolet Explorer (IUE) satellite. The launch window is open for three separate periods (for a total time of 7 months) during the year extending from July 20, 1977, to July 20, 1978. The synchronous orbit shadow constraint limits the launch window to approximately 88 minutes per day. Apogee boost motor fuel was computed to be 455 pounds (206 kilograms) and on-station weight was 931 pounds (422 kilograms). The target orbit is elliptical synchronous, with eccentricity 0.272 and 24 hour period.
Euler Flow Computations on Non-Matching Unstructured Meshes
NASA Technical Reports Server (NTRS)
Gumaste, Udayan
1999-01-01
Advanced fluid solvers for predicting aerodynamic performance through coupled treatment of multiple fields are described. The interaction between the fluid and structural components in the bladed regions of the engine is investigated with respect to known blade failures caused by either flutter or forced vibrations. Methods are developed to describe aeroelastic phenomena for internal flows in turbomachinery, accounting for increased geometric complexity, mutual interaction between adjacent structural components, and the presence of thermal and geometric loading. The computer code developed solves the full three-dimensional aeroelastic problem of a stage. The results obtained show that flow computations can be performed on non-matching finite-volume unstructured meshes with second-order spatial accuracy.
Performance monitoring for brain-computer-interface actions.
Schurger, Aaron; Gale, Steven; Gozel, Olivia; Blanke, Olaf
2017-02-01
When presented with a difficult perceptual decision, human observers are able to make metacognitive judgements of subjective certainty. Such judgements can be made independently of and prior to any overt response to a sensory stimulus, presumably via internal monitoring. Retrospective judgements about one's own task performance, on the other hand, require first that the subject perform a task and thus could potentially be made based on motor processes, proprioceptive, and other sensory feedback rather than internal monitoring. With this dichotomy in mind, we set out to study performance monitoring using a brain-computer interface (BCI), with which subjects could voluntarily perform an action - moving a cursor on a computer screen - without any movement of the body, and thus without somatosensory feedback. Real-time visual feedback was available to subjects during training, but not during the experiment, where the true final position of the cursor was only revealed after the subject had estimated where s/he thought it had ended up after 6 s of BCI-based cursor control. During the first half of the experiment subjects based their assessments primarily on the prior probability of the end position of the cursor on previous trials. However, during the second half of the experiment subjects' judgements moved significantly closer to the true end position of the cursor, and away from the prior. This suggests that subjects can monitor task performance when the task is performed without overt movement of the body. Copyright © 2016 Elsevier Inc. All rights reserved.
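The shift described above, from prior-based to evidence-based judgements, can be caricatured as a weighted combination of the prior mean and the trial's true end position, with the evidence weight growing over the session. This is an illustrative toy model, not the paper's analysis; the weights and positions are invented.

```python
def estimate(prior_mean, true_pos, w_evidence):
    """Subject's reported end position as a weighted mix of the prior
    over past trials and the current trial's true end position."""
    return (1 - w_evidence) * prior_mean + w_evidence * true_pos

prior, true_pos = 0.2, 0.8  # normalized screen positions (hypothetical)
early = estimate(prior, true_pos, w_evidence=0.2)  # first half: near the prior
late = estimate(prior, true_pos, w_evidence=0.7)   # second half: closer to truth
print(early, late)
```

Under this caricature, the "late" estimate lands closer to the true end position than the "early" one, matching the reported pattern.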
NASA Astrophysics Data System (ADS)
Lund, Matthew Lawrence
The space radiation environment is a significant challenge for future manned and unmanned space travel. Future missions will rely increasingly on accurate simulations of radiation transport in space and through spacecraft to predict astronaut dose and energy deposition within spacecraft electronics. The International Space Station provides long-term measurements of the radiation environment in Low Earth Orbit (LEO); however, only the Apollo missions provided dosimetry data beyond LEO. Dosimetry analysis for deep space missions is therefore poorly supported by currently available data, and there is a need to develop dosimetry-predicting models for extended deep space missions. GEANT4, a Monte Carlo toolkit written in C++, provides a powerful framework for simulating radiation transport in arbitrary media, including spacecraft. The newest version of GEANT4 supports multithreading and MPI, enabling faster distributed processing of simulations on high-performance computing clusters. This thesis introduces a new application based on GEANT4 that greatly reduces computational time, using the Kingspeak and Ember clusters at the Center for High Performance Computing (CHPC) to simulate radiation transport through full spacecraft geometry, reducing simulation time to hours instead of weeks without post-simulation processing. Additionally, this thesis introduces a new set of detectors besides the historically used International Commission on Radiation Units and Measurements (ICRU) spheres for calculating dose distribution, including a thermoluminescent detector (TLD), a tissue-equivalent proportional counter (TEPC), and a human phantom, combined with a series of new primitive scorers in GEANT4 to calculate dose equivalent based on International Commission on Radiological Protection (ICRP) standards. The models developed in this thesis predict dose depositions in the International Space Station and during the Apollo missions, showing good agreement with experimental measurements.
According to these models, the greatest contributor to radiation dose for the Apollo missions was Galactic Cosmic Rays, owing to the short time spent within the radiation belts. The Apollo 14 dose measurements were an order of magnitude higher than those of the other Apollo missions. The GEANT4 model of the Apollo Command Module shows consistent doses due to Galactic Cosmic Rays and the radiation belts for all missions, with a small variation in dose distribution across the capsule. The model also predicts well the dose depositions and equivalent dose values in various human organs for the International Space Station and the Apollo Command Module.
Evaluation of internal noise methods for Hotelling observers
NASA Astrophysics Data System (ADS)
Zhang, Yani; Pham, Binh T.; Eckstein, Miguel P.
2005-04-01
Including internal noise in computer model observers to degrade model performance to human levels is a common way of enabling quantitative comparisons of human and model performance. In this paper, we studied two different types of methods for injecting internal noise into Hotelling model observers. The first method adds internal noise to the outputs of the individual channels: a) independent non-uniform channel noise, b) independent uniform channel noise. The second method adds internal noise to the decision variable arising from the combination of channel responses: a) internal noise standard deviation proportional to the decision variable's standard deviation due to the external noise, b) internal noise standard deviation proportional to the decision variable's variance caused by the external noise. We tested the square-window Hotelling observer (HO), the channelized Hotelling observer (CHO), and the Laguerre-Gauss Hotelling observer (LGHO). The task studied was detection of a filling defect of varying size/shape in one of four simulated arterial segment locations with real x-ray angiography backgrounds. Results show that the internal noise method that best predicts human performance differs across the model observers studied. The CHO model best predicts human observer performance with channel internal noise, whereas the HO and LGHO best predict human observer performance with decision variable internal noise. These results might help explain why previous studies have reached different conclusions about the ability of each Hotelling model to predict human performance. Finally, the present results may guide researchers in choosing a method for including internal noise in their Hotelling models.
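One of the decision-variable injection schemes above (internal noise standard deviation proportional to the decision variable's external-noise standard deviation) can be sketched numerically. The signal template, noise levels, and proportionality constant below are made up for illustration; with white external noise the Hotelling observer reduces to a matched filter, which is what the sketch uses.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_pix = 20000, 64
signal = np.zeros(n_pix)
signal[28:36] = 1.0  # toy signal template (8 "bright" pixels)

# External (image) noise; reuse the same noise for the signal trials
# so the comparison is paired.
noise_only = rng.normal(0.0, 2.0, (n_trials, n_pix))
signal_plus = noise_only + signal

lam_n = noise_only @ signal   # decision variable, noise-only trials
lam_s = signal_plus @ signal  # decision variable, signal-present trials

def dprime(ls, ln):
    """Detectability index from the two decision-variable samples."""
    return (ls.mean() - ln.mean()) / np.sqrt(0.5 * (ls.var() + ln.var()))

d_ext = dprime(lam_s, lam_n)

# Inject internal noise with std proportional to the decision variable's
# external-noise std (proportionality constant 1.0, chosen arbitrarily).
sigma_int = 1.0 * lam_n.std()
d_int = dprime(lam_s + rng.normal(0, sigma_int, n_trials),
               lam_n + rng.normal(0, sigma_int, n_trials))
print(f"d' without internal noise {d_ext:.2f}, with internal noise {d_int:.2f}")
```

With the proportionality constant set to 1, the decision-variable variance doubles, so d' drops by a factor of roughly sqrt(2); the constant is what would be tuned to match human performance.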
High-Performance Algorithms and Complex Fluids | Computational Science
... only possible by combining experimental data with simulation. Capabilities include ...-laden, non-Newtonian, as well as traditional internal and external flows. Contact: Ray Grout.
Computational Model of Heat Transfer on the ISS
NASA Technical Reports Server (NTRS)
Torian, John G.; Rischar, Michael L.
2008-01-01
SCRAM Lite (SCRAM signifies Station Compact Radiator Analysis Model) is a computer program for analyzing convective and radiative heat-transfer and heat-rejection performance of coolant loops and radiators, respectively, in the active thermal-control systems of the International Space Station (ISS). SCRAM Lite is a derivative of prior versions of SCRAM but is more robust. SCRAM Lite computes thermal operating characteristics of active heat-transport and heat-rejection subsystems for the major ISS configurations from Flight 5A through completion of assembly. The program performs integrated analysis of both internal and external coolant loops of the various ISS modules and of an external active thermal control system, which includes radiators and the coolant loops that transfer heat to the radiators. The SCRAM Lite run time is of the order of one minute per day of mission time. The overall objective of the SCRAM Lite simulation is to process input profiles of equipment-rack, crew-metabolic, and other heat loads to determine flow rates, coolant supply temperatures, and available radiator heat-rejection capabilities. Analyses are performed for timelines of activities, orbital parameters, and attitudes for mission times ranging from a few hours to several months.
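The radiator heat-rejection capability that SCRAM Lite computes is ultimately bounded by radiative exchange. Below is a minimal sketch using the Stefan-Boltzmann law; the emissivity, panel area, and panel/sink temperatures are assumed values for illustration, not ISS design figures.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_rejection(area_m2, eps, t_panel_k, t_sink_k):
    """Net radiative heat rejection of a panel to an effective sink."""
    return eps * SIGMA * area_m2 * (t_panel_k**4 - t_sink_k**4)

# Hypothetical panel: 40 m^2, emissivity 0.85, 280 K panel, 200 K effective sink
q_watts = radiator_rejection(40.0, 0.85, 280.0, 200.0)
print(f"rejected heat ~ {q_watts / 1000:.1f} kW")
```

A full tool like SCRAM Lite couples such a rejection model to the coolant-loop flow rates and supply temperatures over a mission timeline; this sketch shows only the single-panel radiative term.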
Trujillo, Macarena; Bon, Jose; Berjano, Enrique
2017-09-01
(1) To analyse rehydration, thermal convection, and increased electrical conductivity as the three phenomena which distinguish the performance of internally cooled (IC) electrodes and internally cooled wet (ICW) electrodes during radiofrequency ablation (RFA); (2) to implement an RFA computer model with an ICW electrode which includes these phenomena; and (3) to assess their relative influence on the thermal and electrical tissue response and on the coagulation zone size. A 12-min RFA in liver was modelled using an ICW electrode (17 G, 3 cm tip) with an impedance-controlled pulsing protocol at a constant current of 1.5 A. A model of an IC electrode was used for comparison with the ICW electrode's performance, and the computational results were compared with experimental results. Rehydration and increased electrical conductivity were responsible for an increase in coagulation zone size and a delay in (or absence of) the abrupt increases in electrical impedance known as roll-off. While the increased electrical conductivity had a remarkable effect on enlarging the coagulation zone (an increase of 0.74 cm for differences in electrical conductivity of 0.31 S/m), rehydration considerably affected the delay in roll-off, which was in fact absent at a sufficiently high rehydration level. In contrast, thermal convection had an insignificant effect at the flow rates considered (0.05 and 1 mL/min). The computational results suggest that rehydration and increased electrical conductivity were mainly responsible for the absence of roll-off and the increased size of the coagulation zone, respectively, and in combination allow the thermal and electrical performance of ICW electrodes to be modelled during RFA.
NASA Technical Reports Server (NTRS)
Kowalski, E. J.
1979-01-01
A computerized method that uses engine performance data to estimate the installed performance of aircraft gas turbine engines is presented. The installation effects covered include engine weight and dimensions, inlet and nozzle internal performance and drag, inlet and nacelle weight, and nacelle drag. A user-oriented description of the program input requirements, program output, deck setup, and operating instructions is presented.
Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology
NASA Astrophysics Data System (ADS)
Goodwin, Bruce
2015-03-01
This presentation will discuss the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examines their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the engineering design and prototype iterative cycle, thereby dramatically reducing cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the ``cloud,'' these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.
Cook, Nicola A; Kim, Jin Un; Pasha, Yasmin; Crossey, Mary Me; Schembri, Adrian J; Harel, Brian T; Kimhofer, Torben; Taylor-Robinson, Simon D
2017-01-01
Psychometric testing is used to identify patients with cirrhosis who have developed hepatic encephalopathy (HE). Most batteries consist of a series of paper-and-pencil tests, which are cumbersome for most clinicians. A modern, easy-to-use, computer-based battery would be a helpful clinical tool, given that even in its minimal form, HE has an impact on both patients' quality of life and their ability to drive and operate machinery (with societal consequences). We compared the Cogstate™ computer battery with the Psychometric Hepatic Encephalopathy Score (PHES) tests, with a view to simplifying the diagnosis. This was a prospective study of 27 patients with histologically proven cirrhosis. An analysis of psychometric testing was performed using accuracy of task performance and speed of completion as primary variables to create a correlation matrix. A stepwise linear regression analysis was performed with backward elimination, using analysis of variance. Strong correlations were found between the Cogstate international shopping list and international shopping list delayed recall tasks and the PHES digit symbol test. The shopping list tasks were the only tasks that consistently had P values of <0.05 in the linear regression analysis. Subtests of the Cogstate battery correlated very strongly with the digit symbol component of PHES in discriminating the severity of HE. These findings indicate that combining components of the current PHES battery with the international shopping list tasks of Cogstate would be discriminant and have the potential for easy use in clinical practice.
1989-03-22
Models used in the computer program EPIC2 to describe the structural response in the cylinder impact test are compared and the differences are ... This paper describes the development and application of a computer program ... performed using a dynamic viscoplastic finite element computer program. The resolution of the procedure has been investigated by obtaining replicate ...
NASA Technical Reports Server (NTRS)
Goldstein, Arthur W
1947-01-01
The performance of the turbine component of an NACA research jet engine was investigated with cold air. The interaction and the matching of the turbine with the NACA eight-stage compressor were computed with the combination considered as a jet engine. The over-all performance of the engine was then determined. The internal aerodynamics were studied to the extent of investigating the performance of the first stator ring and its influence on the turbine performance. For this ring, the stream-filament method for computing velocity distribution permitted efficient sections to be designed, but the design condition of free-vortex flow with uniform axial velocities was not obtained.
DOE Office of Scientific and Technical Information (OSTI.GOV)
James, Conrad D.; Schiess, Adrian B.; Howell, Jamie
2013-10-01
The human brain (volume = 1200 cm^3) consumes 20 W and is capable of performing > 10^16 operations/s. Current supercomputer technology has reached 10^15 operations/s, yet it requires 1500 m^3 and 3 MW, giving the brain a 10^12 advantage in operations/s/W/cm^3. Thus, to reach exascale computation, two achievements are required: 1) improved understanding of computation in biological tissue, and 2) a paradigm shift towards neuromorphic computing, where hardware circuits mimic properties of neural tissue. To address 1), we will interrogate corticostriatal networks in mouse brain tissue slices, specifically with regard to their frequency filtering capabilities as a function of input stimulus. To address 2), we will instantiate biological computing characteristics such as multi-bit storage into hardware devices with future computational and memory applications. Resistive memory devices will be modeled, designed, and fabricated in the MESA facility in consultation with our internal and external collaborators.
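The 10^12 figure quoted above follows directly from the stated numbers; a quick check:

```python
import math

# Figures quoted in the abstract: brain ~1e16 ops/s, 20 W, 1200 cm^3;
# supercomputer ~1e15 ops/s, 3 MW, 1500 m^3 (= 1.5e9 cm^3).
brain_metric = 1e16 / (20 * 1200)              # ops/s per W per cm^3
machine_metric = 1e15 / (3e6 * 1500 * 1e6)
advantage = brain_metric / machine_metric
print(f"brain advantage ~ 10^{round(math.log10(advantage))}")
```

The ratio comes out near 2 × 10^12, consistent with the order-of-magnitude claim in the abstract.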
Machine Learning: Proceedings of the Fifteenth International Conference
1998-07-01
Machine Learning: Proceedings of the Fifteenth International Conference (ICML), edited by Jude W. Shavlik, Department of Computer Sciences, University of Wisconsin - Madison, 1210 West Dayton Street, Madison, WI 53706. Madison, Wisconsin, July 24-27, 1998.
Kriegeskorte, Nikolaus
2015-11-24
Recent advances in neural network modeling have enabled major strides in computer vision and other artificial intelligence applications. Human-level visual recognition abilities are coming within reach of artificial systems. Artificial neural networks are inspired by the brain, and their computations could be implemented in biological neurons. Convolutional feedforward networks, which now dominate computer vision, take further inspiration from the architecture of the primate visual hierarchy. However, the current models are designed with engineering goals, not to model brain computations. Nevertheless, initial studies comparing internal representations between these models and primate brains find surprisingly similar representational spaces. With human-level performance no longer out of reach, we are entering an exciting new era, in which we will be able to build biologically faithful feedforward and recurrent computational models of how biological brains perform high-level feats of intelligence, including vision.
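A convolutional feedforward layer of the kind the passage describes can be written in a few lines of NumPy. This toy sketch (random filters, valid convolution, ReLU) is only meant to show the computation; it is not any particular model of the visual hierarchy.

```python
import numpy as np

def conv2d_relu(image, filters):
    """Valid 2-D convolution of a single-channel image with a bank of
    filters, followed by a ReLU nonlinearity (a toy feedforward layer)."""
    n_f, kh, kw = filters.shape
    h, w = image.shape
    out = np.zeros((n_f, h - kh + 1, w - kw + 1))
    for f in range(n_f):
        for i in range(h - kh + 1):
            for j in range(w - kw + 1):
                # Each output unit sees only a local patch of the image
                out[f, i, j] = np.sum(image[i:i + kh, j:j + kw] * filters[f])
    return np.maximum(out, 0.0)  # ReLU

rng = np.random.default_rng(1)
feature_maps = conv2d_relu(rng.normal(size=(8, 8)), rng.normal(size=(4, 3, 3)))
print(feature_maps.shape)  # (4, 6, 6)
```

Stacking such layers, with learned rather than random filters, yields the convolutional feedforward networks that now dominate computer vision.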
ARL Collaborative Research Alliance Materials in Extreme Dynamic Environments (MEDE)
2010-11-19
Program elements internal to the CRA: staff rotation; lectures, workshops, and research reviews; education opportunities for government personnel; student engagement with the ARL research environment; industry partnership and collaboration; other collaboration opportunities; DoD High Performance Computing.
Computer simulation of space charge
NASA Astrophysics Data System (ADS)
Yu, K. W.; Chung, W. K.; Mak, S. S.
1991-05-01
Using the particle-mesh (PM) method, a one-dimensional simulation of the well-known Langmuir-Child law is performed on an INTEL 80386-based personal computer system. The program is coded in Turbo Basic (trademark of Borland International, Inc.). The numerical results obtained are in excellent agreement with theoretical predictions, and the computational time required is quite modest. This simulation exercise demonstrates that simple computer simulations using particles may be implemented successfully on the PCs available today, and will hopefully provide the necessary incentive for newcomers to the field who wish to acquire a flavor of the elementary aspects of the practice.
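In modern terms, the core of such a PM code is the charge-deposition step that maps particles onto the mesh. Below is a minimal Python sketch of cloud-in-cell deposition, a stand-in for the original Turbo Basic program; the function, grid, and numbers are illustrative, not taken from the paper.

```python
import numpy as np

def deposit_charge(positions, q, n_cells, length):
    """Cloud-in-cell (CIC) deposition: each particle's charge is shared
    linearly between the two nearest grid nodes, giving a charge
    density on a uniform 1D mesh with n_cells cells."""
    dx = length / n_cells
    rho = np.zeros(n_cells + 1)          # node-centred density
    for x in positions:
        j = int(x / dx)                  # index of the node to the left
        w = x / dx - j                   # fractional distance to it
        rho[j] += q * (1.0 - w) / dx
        rho[j + 1] += q * w / dx
    return rho

# Four unit charges on a 10-cell unit-length grid (interior positions
# only; boundary handling is omitted for brevity)
rho = deposit_charge([0.12, 0.35, 0.51, 0.77], q=1.0, n_cells=10, length=1.0)
# Deposition conserves charge: sum(rho) * dx equals the total charge
```

In a full PM cycle this density would feed a Poisson solve for the potential, followed by field interpolation back onto the particles.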
Performance Assessment Institute-NV
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lombardo, Joseph
2012-12-31
The National Supercomputing Center for Energy and the Environment’s intention is to purchase a multi-purpose computer cluster in support of the Performance Assessment Institute (PA Institute). The PA Institute will serve as a research consortium located in Las Vegas, Nevada, with membership that includes national laboratories, universities, industry partners, and domestic and international governments. This center will provide a one-of-a-kind centralized facility for the accumulation of information for use by institutions of higher learning, the U.S. Government, regulatory agencies, and approved users. This initiative will enhance and extend High Performance Computing (HPC) resources in Nevada to support critical national and international needs in "scientific confirmation". The PA Institute will be promoted as the leading modeling, learning, and research center worldwide. The program proposes to utilize the existing supercomputing capabilities and alliances of the University of Nevada Las Vegas as a base, and to extend these resources and capabilities through a collaborative relationship with its membership. The PA Institute will provide an academic setting for interactive sharing, learning, mentoring and monitoring of multi-disciplinary performance assessment and performance confirmation information. The role of the PA Institute is to facilitate research, knowledge-increase, and knowledge-sharing among users.
Capturing remote mixing due to internal tides using multi-scale modeling tool: SOMAR-LES
NASA Astrophysics Data System (ADS)
Santilli, Edward; Chalamalla, Vamsi; Scotti, Alberto; Sarkar, Sutanu
2016-11-01
Internal tides that are generated during the interaction of an oscillating barotropic tide with the bottom bathymetry dissipate only a fraction of their energy near the generation region. The rest is radiated away in the form of low- and high-mode internal tides. These internal tides dissipate energy at remote locations when they interact with the upper ocean pycnocline, continental slope, and large scale eddies. Capturing the wide range of length and time scales involved during the life-cycle of internal tides is computationally very expensive. A recently developed multi-scale modeling tool called SOMAR-LES combines the adaptive grid refinement features of SOMAR with the turbulence modeling features of a Large Eddy Simulation (LES) to capture multi-scale processes at a reduced computational cost. Numerical simulations of internal tide generation at idealized bottom bathymetries are performed to demonstrate this multi-scale modeling technique. Although each of these remote mixing phenomena has been considered independently in previous studies, this work aims to capture remote mixing processes during the life cycle of an internal tide in more realistic settings, by allowing multi-level (coarse and fine) grids to co-exist and exchange information during the time stepping process.
Cook, Nicola A; Kim, Jin Un; Pasha, Yasmin; Crossey, Mary ME; Schembri, Adrian J; Harel, Brian T; Kimhofer, Torben; Taylor-Robinson, Simon D
2017-01-01
Background Psychometric testing is used to identify patients with cirrhosis who have developed hepatic encephalopathy (HE). Most batteries consist of a series of paper-and-pencil tests, which are cumbersome for most clinicians. A modern, easy-to-use, computer-based battery would be a helpful clinical tool, given that even in its minimal form, HE has an impact on both patients’ quality of life and the ability to drive and operate machinery (with societal consequences). Aim We compared the Cogstate™ computer battery testing with the Psychometric Hepatic Encephalopathy Score (PHES) tests, with a view to simplifying the diagnosis. Methods This was a prospective study of 27 patients with histologically proven cirrhosis. An analysis of psychometric testing was performed using accuracy of task performance and speed of completion as primary variables to create a correlation matrix. A stepwise linear regression analysis was performed with backward elimination, using analysis of variance. Results Strong correlations were found between the international shopping list and international shopping list delayed recall tasks of Cogstate and the PHES digit symbol test. The shopping list tasks were the only tasks that consistently had P values of <0.05 in the linear regression analysis. Conclusion Subtests of the Cogstate battery correlated very strongly with the digit symbol component of PHES in discriminating the severity of HE. These findings indicate that combining components of the current PHES battery with the international shopping list tasks of Cogstate would be discriminant and has the potential for easy use in clinical practice. PMID:28919805
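The correlation-matrix step described in the Methods can be sketched with NumPy. The scores below are invented purely for illustration; they are not the study's data, and the variable names are hypothetical.

```python
import numpy as np

# Invented scores for five patients on three test variables, e.g.
# shopping-list accuracy, digit-symbol completion time (s), recall score
scores = np.array([
    [28, 41.2, 33],
    [22, 55.0, 25],
    [30, 38.7, 35],
    [18, 61.3, 21],
    [25, 47.9, 29],
], dtype=float)

# Pearson correlation matrix between the three variables (columns)
corr = np.corrcoef(scores, rowvar=False)
# corr[i, j] is the correlation of variable i with variable j;
# the diagonal is 1 by construction
```

Backward elimination would then repeatedly drop the variable whose regression coefficient is least significant until only predictors with P < 0.05 remain.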
NASA Technical Reports Server (NTRS)
1991-01-01
Analytical Design Service Corporation, Ann Arbor, MI, used NASTRAN (a NASA Structural Analysis program that analyzes a design and predicts how parts will perform) in tests of transmissions, engine cooling systems, internal engine parts, and body components. They also use it to design future automobiles. Analytical software can save millions by allowing computer simulated analysis of performance even before prototypes are built.
Program For Optimization Of Nuclear Rocket Engines
NASA Technical Reports Server (NTRS)
Plebuch, R. K.; Mcdougall, J. K.; Ridolphi, F.; Walton, James T.
1994-01-01
NOP is a versatile digital-computer program developed for parametric analysis of beryllium-reflected, graphite-moderated nuclear rocket engines. Facilitates analysis of engine performance with respect to such considerations as specific impulse, engine power, type of engine cycle, and engine-design constraints arising from complications of fuel loading and internal temperature gradients. Predicts minimum weight for specified performance.
Gunst, S; Del Chicca, F; Fürst, A E; Kuemmerle, J M
2016-09-01
There are no reports on the configuration of equine central tarsal bone fractures based on cross-sectional imaging and on the clinical and radiographic long-term outcome after internal fixation. To report clinical, radiographic and computed tomographic findings of equine central tarsal bone fractures and to evaluate the long-term outcome of internal fixation. Retrospective case series. All horses diagnosed with a central tarsal bone fracture at our institution in 2009-2013 were included. Computed tomography and internal fixation using a lag screw technique were performed in all patients. Medical records and diagnostic images were reviewed retrospectively. A clinical and radiographic follow-up examination was performed at least 1 year post operatively. A central tarsal bone fracture was diagnosed in 6 horses. Five were Warmbloods used for showjumping and one was a Quarter Horse used for reining. All horses had sagittal slab fractures that began dorsally, ran in a plantar or plantaromedial direction and exited the plantar cortex at the plantar or plantaromedial indentation of the central tarsal bone. Marked sclerosis of the central tarsal bone was diagnosed in all patients. At long-term follow-up, 5/6 horses were sound and used as intended, although mild osteophyte formation at the distal intertarsal joint was commonly observed. Central tarsal bone fractures in nonracehorses have a distinct configuration, but radiographically subtle additional fracture lines can occur. A chronic stress-related aetiology seems likely. Internal fixation of these fractures, based on an accurate diagnosis of the individual fracture configuration, resulted in a very good prognosis. © 2015 EVJ Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyonnais, Marc; Smith, Matt; Mace, Kate P.
SCinet is the purpose-built network that operates during the International Conference for High Performance Computing, Networking, Storage and Analysis (Supercomputing, or SC). Created each year for the conference, SCinet brings to life a high-capacity network that supports the applications and experiments that are a hallmark of the SC conference. The network links the convention center to research and commercial networks around the world. This resource serves as a platform for exhibitors to demonstrate the advanced computing resources of their home institutions and elsewhere by supporting a wide variety of applications. Volunteers from academia, government and industry work together to design and deliver the SCinet infrastructure. Industry vendors and carriers donate millions of dollars in equipment and services needed to build and support the local and wide area networks. Planning begins more than a year in advance of each SC conference and culminates in a high intensity installation in the days leading up to the conference. The SCinet architecture for SC16 illustrates a dramatic increase in participation from the vendor community, particularly those that focus on network equipment. Software-Defined Networking (SDN) and Data Center Networking (DCN) are present in nearly all aspects of the design.
Classification of Features of Pavement Profiles Using Empirical Mode Decomposition
DOT National Transportation Integrated Search
2014-12-01
The Long-Term Pavement Performance (LTPP) database contains surface profile data for numerous pavements that are used mainly for computing International Roughness Index (IRI).(2) In order to obtain more information from these surface profiles, a Hilb...
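The IRI mentioned here is defined by driving a standard "golden car" quarter-car model over the measured profile and accumulating suspension travel per distance. A rough Python sketch of that idea follows; it uses forward-Euler integration for brevity, whereas the reference algorithm uses an exact state-transition matrix and specific initialization, so treat the numbers as purely illustrative.

```python
import numpy as np

# Golden-car parameters normalized by sprung mass (Sayers' model):
# tire stiffness, suspension stiffness, damping, unsprung mass ratio
K1, K2, C, MU = 653.0, 63.3, 6.0, 0.15
SPEED = 80.0 / 3.6          # standard simulation speed, 80 km/h in m/s

def iri_quarter_car(profile, dx):
    """Crude IRI estimate: drive the golden quarter-car over an
    elevation profile (m) sampled every dx metres and accumulate
    |suspension rate| per metre travelled. Forward-Euler stepping
    needs a fine dx to stay stable on rough profiles."""
    dt = dx / SPEED
    zs = zsd = zu = zud = 0.0        # sprung/unsprung position, velocity
    travel = 0.0
    for h in profile:
        zsdd = -K2 * (zs - zu) - C * (zsd - zud)
        zudd = (K2 * (zs - zu) + C * (zsd - zud) - K1 * (zu - h)) / MU
        zs += zsd * dt
        zu += zud * dt
        zsd += zsdd * dt
        zud += zudd * dt
        travel += abs(zsd - zud) * dt
    return travel / (len(profile) * dx)   # slope in m/m (often m/km)

iri_flat = iri_quarter_car(np.zeros(1000), dx=0.25)  # smooth road -> 0.0
```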
Performance assessment of KORAT-3D on the ANL IBM-SP computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexeyev, A.V.; Zvenigorodskaya, O.A.; Shagaliev, R.M.
1999-09-01
The TENAR code is currently being developed at the Russian Federal Nuclear Center (VNIIEF) as a coupled dynamics code for the simulation of transients in VVER and RBMK systems and other nuclear systems. The neutronic module in this code system is KORAT-3D. This module is also one of the most computationally intensive components of the code system. A parallel version of KORAT-3D has been implemented to achieve the goal of obtaining transient solutions in reasonable computational time, particularly for RBMK calculations that involve the application of >100,000 nodes. An evaluation of the KORAT-3D code performance was recently undertaken on the Argonne National Laboratory (ANL) IBM ScalablePower (SP) parallel computer located in the Mathematics and Computer Science Division of ANL. At the time of the study, the ANL IBM-SP computer had 80 processors. This study was conducted under the auspices of a technical staff exchange program sponsored by the International Nuclear Safety Center (INSC).
NASA Astrophysics Data System (ADS)
Greitzer, E. M.; Tan, C. S.; Graf, M. B.
2004-06-01
Focusing on phenomena important in improving the performance of a broad range of fluid devices, this work describes the behavior of internal flows encountered in propulsion systems, fluid machinery (compressors, turbines, and pumps) and ducts (diffusers, nozzles, and combustion chambers). The book equips students and practicing engineers with a range of new analytical tools. These tools offer enhanced interpretation and application of both experimental measurements and the computational procedures that characterize modern fluids engineering.
Modeling of Bulk Evaporation and Condensation
NASA Technical Reports Server (NTRS)
Anghaie, S.; Ding, Z.
1996-01-01
This report describes the modeling and mathematical formulation of the bulk evaporation and condensation involved in liquid-vapor phase change processes. An internal energy formulation, for these phase change processes that occur under the constraint of constant volume, was studied. Compared to the enthalpy formulation, the internal energy formulation has a more concise and compact form. The velocity and time scales of the interface movement were obtained through scaling analysis and verified by performing detailed numerical experiments. The convection effect induced by the density change was analyzed and found to be negligible compared to the conduction effect. Two iterative methods for updating the value of the vapor phase fraction, the energy based (E-based) and temperature based (T-based) methods, were investigated. Numerical experiments revealed that for the evaporation and condensation problems the E-based method is superior to the T-based method in terms of computational efficiency. The internal energy formulation and the E-based method were used to compute the bulk evaporation and condensation processes under different conditions. The evolution of the phase change processes was investigated. This work provided a basis for the modeling of thermal performance of multi-phase nuclear fuel elements under variable gravity conditions, in which the buoyancy convection due to gravity effects and internal heating are involved.
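The E-based update the report favors can be sketched in one step: within a two-phase cell the temperature is pinned at saturation, so the vapor fraction follows directly from the mixture internal energy with no inner temperature iteration. The sketch below uses illustrative water-like saturation energies, not the report's actual property model.

```python
def update_vapor_fraction(e, e_l, e_v):
    """Energy-based (E-based) update for constant-volume phase change:
    inside the two-phase region the temperature sits at saturation, so
    the vapor mass fraction follows directly from the cell's specific
    internal energy e and the saturated liquid/vapor energies e_l, e_v."""
    x = (e - e_l) / (e_v - e_l)
    return min(1.0, max(0.0, x))   # clipped: subcooled -> 0, superheated -> 1

# Illustrative water-like values (kJ/kg): e_l = 419, e_v = 2506
x_mid = update_vapor_fraction(1462.5, 419.0, 2506.0)   # halfway -> 0.5
```

A T-based method would instead iterate on temperature until the energy balance closes, which is why the E-based form tends to be cheaper per step.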
Aomatsu, Naoki; Nakamura, Masanori; Hasegawa, Tsuyoshi; Nakao, Shigetomi; Uchima, Yasutake; Aomatsu, Keiho
2014-11-01
We report a case of laparoscopic surgery for a rectal carcinoid after aluminum potassium and tannic acid (ALTA) therapy for an internal hemorrhoid. A 66-year-old man was admitted to our hospital because of bleeding during defecation. He was diagnosed via anoscopy with Goligher grade II internal hemorrhoids. Examination via colonoscopy revealed 2 yellowish submucosal tumors in the lower rectum that were 5mm and 10mm in diameter. A rectal carcinoid tumor was diagnosed based on histopathology. Abdominal computed tomography demonstrated no metastases to the liver or lymph nodes. First, we performed ALTA therapy for the internal hemorrhoids. Two weeks later, we performed laparoscopic-assisted low anterior resection (D2) for the rectal carcinoid. The patient was discharged without complications and has not experienced recurrence during the 2 years of follow-up care.
Comparison of scientific computing platforms for MCNP4A Monte Carlo calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendricks, J.S.; Brockhoff, R.C.
1994-04-01
The performance of seven computer platforms is evaluated with the widely used and internationally available MCNP4A Monte Carlo radiation transport code. All results are reproducible and are presented in such a way as to enable comparison with computer platforms not in the study. The authors observed that the HP/9000-735 workstation runs MCNP 50% faster than the Cray YMP 8/64. Compared with the Cray YMP 8/64, the IBM RS/6000-560 is 68% as fast, the Sun Sparc10 is 66% as fast, the Silicon Graphics ONYX is 90% as fast, the Gateway 2000 model 4DX2-66V personal computer is 27% as fast, and the Sun Sparc2 is 24% as fast. In addition to comparing the timing performance of the seven platforms, the authors observe that changes in compilers and software over the past 2 yr have resulted in only modest performance improvements, that hardware improvements have enhanced performance by less than a factor of approximately 3, that timing studies are very problem dependent, and that MCNP4A runs about as fast as MCNP4.
Simulating the Gradually Deteriorating Performance of an RTG
NASA Technical Reports Server (NTRS)
Wood, Eric G.; Ewell, Richard C.; Patel, Jagdish; Hanks, David R.; Lozano, Juan A.; Snyder, G. Jeffrey; Noon, Larry
2008-01-01
Degra (now in version 3) is a computer program that simulates the performance of a radioisotope thermoelectric generator (RTG) over its lifetime. Degra is provided with a graphical user interface that is used to edit input parameters that describe the initial state of the RTG and the time-varying loads and environment to which it will be exposed. Performance is computed by modeling the flows of heat from the radioactive source and through the thermocouples, also allowing for losses, to determine the temperature drop across the thermocouples. This temperature drop is used to determine the open-circuit voltage, electrical resistance, and thermal conductance of the thermocouples. Output power can then be computed by relating the open-circuit voltage and the electrical resistance of the thermocouples to a specified time-varying load voltage. Degra accounts for the gradual deterioration of performance attributable primarily to decay of the radioactive source and secondarily to gradual deterioration of the thermoelectric material. To provide guidance to an RTG designer, given a minimum of input, Degra computes the dimensions, masses, and thermal conductances of important internal structures as well as the overall external dimensions and total mass.
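Degra's actual equations are not reproduced in the abstract; the sketch below is a generic RTG output-power model in the same spirit, with assumed numbers: Pu-238 thermal decay (87.7-year half-life) shrinks the open-circuit voltage, a hypothetical fixed per-year factor grows the internal resistance, and the specified load voltage fixes the current.

```python
import math

def rtg_power(t_years, v_oc0, r_int0, v_load,
              half_life=87.7, degradation=0.008):
    """Toy RTG output-power model (illustrative, not Degra's equations):
    open-circuit voltage scales with radioactive decay, internal
    resistance grows with thermoelectric degradation, and delivered
    power follows from the load voltage."""
    decay = 0.5 ** (t_years / half_life)        # Pu-238 thermal decay
    v_oc = v_oc0 * decay
    r_int = r_int0 * (1.0 + degradation) ** t_years
    i = max(0.0, (v_oc - v_load) / r_int)       # load current, A
    return v_load * i                           # delivered power, W

p0 = rtg_power(0.0, v_oc0=60.0, r_int0=4.0, v_load=30.0)
# At t=0: I = (60 - 30)/4 = 7.5 A, so P = 30 * 7.5 = 225 W
```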
NASA Astrophysics Data System (ADS)
Tang, Z. B.; Deng, Y. D.; Su, C. Q.; Yuan, X. H.
2015-06-01
In this study, a numerical model has been employed to analyze the internal flow field distribution in a heat exchanger applied for an automotive thermoelectric generator based on computational fluid dynamics. The model simulates the influence of factors relevant to the heat exchanger, including the automotive waste heat mass flow velocity, temperature, internal fins, and back pressure. The result is in good agreement with experimental test data. Sensitivity analysis of the inlet parameters shows that increase of the exhaust velocity, compared with the inlet temperature, makes little contribution (0.1 versus 0.19) to the heat transfer but results in a detrimental back pressure increase (0.69 versus 0.21). A configuration equipped with internal fins is proved to offer better thermal performance compared with that without fins. Finally, based on an attempt to improve the internal flow field, a more rational structure is obtained, offering a more homogeneous temperature distribution, higher average heat transfer coefficient, and lower back pressure.
The Continual Intercomparison of Radiation Codes: Results from Phase I
NASA Technical Reports Server (NTRS)
Oreopoulos, Lazaros; Mlawer, Eli; Delamere, Jennifer; Shippert, Timothy; Cole, Jason; Iacono, Michael; Jin, Zhonghai; Li, Jiangnan; Manners, James; Raisanen, Petri;
2011-01-01
The computer codes that calculate the energy budget of solar and thermal radiation in Global Climate Models (GCMs), our most advanced tools for predicting climate change, have to be computationally efficient in order not to impose undue computational burden on climate simulations. By using approximations to gain execution speed, these codes sacrifice accuracy compared to more accurate, but also much slower, alternatives. International efforts to evaluate the approximate schemes have taken place in the past, but they have suffered from the drawback that the accurate standards were not themselves validated for performance. This manuscript summarizes the main results of the first phase of an effort called the "Continual Intercomparison of Radiation Codes" (CIRC), where the cases chosen to evaluate the approximate models are based on observations and where we have ensured that the accurate models perform well when compared to solar and thermal radiation measurements. The effort is endorsed by international organizations such as the GEWEX Radiation Panel and the International Radiation Commission and has a dedicated website (http://circ.gsfc.nasa.gov) where interested scientists can freely download data and obtain more information about the effort's modus operandi and objectives. In a paper published in the March 2010 issue of the Bulletin of the American Meteorological Society, only a brief overview of CIRC was provided with some sample results. In this paper the analysis of submissions of 11 solar and 13 thermal infrared codes relative to accurate reference calculations obtained by so-called "line-by-line" radiation codes is much more detailed. We demonstrate that, while performance of the approximate codes continues to improve, significant issues still remain to be addressed for satisfactory performance within GCMs.
We hope that by identifying and quantifying shortcomings, the paper will help establish performance standards to objectively assess radiation code quality, and will guide the development of future phases of CIRC.
Hydration level is an internal variable for computing motivation to obtain water rewards in monkeys.
Minamimoto, Takafumi; Yamada, Hiroshi; Hori, Yukiko; Suhara, Tetsuya
2012-05-01
In the process of motivation to engage in a behavior, valuation of the expected outcome is comprised of not only external variables (i.e., incentives) but also internal variables (i.e., drive). However, the exact neural mechanism that integrates these variables for the computation of motivational value remains unclear. Besides, the signal of physiological needs, which serves as the primary internal variable for this computation, remains to be identified. Concerning fluid rewards, the osmolality level, one of the physiological indices for the level of thirst, may be an internal variable for valuation, since an increase in the osmolality level induces drinking behavior. Here, to examine the relationship between osmolality and the motivational value of a water reward, we repeatedly measured the blood osmolality level, while 2 monkeys continuously performed an instrumental task until they spontaneously stopped. We found that, as the total amount of water earned increased, the osmolality level progressively decreased (i.e., the hydration level increased) in an individual-dependent manner. There was a significant negative correlation between the error rate of the task (the proportion of trials with low motivation) and the osmolality level. We also found that the increase in the error rate with reward accumulation can be well explained by a formula describing the changes in the osmolality level. These results provide a biologically supported computational formula for the motivational value of a water reward that depends on the hydration level, enabling us to identify the neural mechanism that integrates internal and external variables.
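The paper's formula linking osmolality to error rate is not given in the abstract; a toy model with assumed constants captures the qualitative relationship it describes (a linear drop in osmolality with accumulated water, and a sigmoid rise in error rate as hydration increases). Both functions and all constants below are hypothetical.

```python
import math

def osmolality_after(water_ml, osm0=308.0, k=0.05):
    """Toy linear model: blood osmolality (mOsm/kg) falls as the
    animal accumulates water reward. osm0 and k are assumed constants."""
    return osm0 - k * water_ml

def error_rate(osmolality, theta=300.0, scale=3.0):
    """Hypothetical sigmoid link from osmolality to the proportion of
    low-motivation error trials: once osmolality drops below the
    threshold theta (i.e. the animal is hydrated), errors rise."""
    return 1.0 / (1.0 + math.exp((osmolality - theta) / scale))

e_early = error_rate(osmolality_after(0.0))    # thirsty: few errors
e_late = error_rate(osmolality_after(400.0))   # hydrated: many errors
```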
Nestola, M G C; Faggiano, E; Vergara, C; Lancellotti, R M; Ippolito, S; Antona, C; Filippi, S; Quarteroni, A; Scrofani, R
2017-02-01
We provide a computational comparison of the performance of stentless and stented aortic prostheses, in terms of aortic root displacements and internal stresses. To this aim, we consider three real patients; for each of them, we construct the two prosthesis configurations, which are characterized by different mechanical properties, and we also consider the native configuration. For each of these scenarios, we solve the fluid-structure interaction problem arising between blood and the aortic root using finite elements. In particular, the Arbitrary Lagrangian-Eulerian formulation is used for the numerical solution of the fluid-dynamic equations, and a hyperelastic material model is adopted to predict the mechanical response of the aortic wall and the two prostheses. The computational results are analyzed in terms of aortic flow, internal wall stresses and aortic wall/prosthesis displacements; a quantitative comparison of the mechanical behavior of the three scenarios is reported. The numerical results highlight a good agreement between stentless and native displacements and internal wall stresses, whereas higher, non-physiological stresses are found for the stented case.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arkin, Adam; Bader, David C.; Coffey, Richard
Understanding the fundamentals of genomic systems or the processes governing impactful weather patterns are examples of the types of simulation and modeling performed on the most advanced computing resources in America. High-performance computing and computational science together provide a necessary platform for the mission science conducted by the Biological and Environmental Research (BER) office at the U.S. Department of Energy (DOE). This report reviews BER’s computing needs and their importance for solving some of the toughest problems in BER’s portfolio. BER’s impact on science has been transformative. Mapping the human genome, including the U.S.-supported international Human Genome Project that DOE began in 1987, initiated the era of modern biotechnology and genomics-based systems biology. And since the 1950s, BER has been a core contributor to atmospheric, environmental, and climate science research, beginning with atmospheric circulation studies that were the forerunners of modern Earth system models (ESMs) and by pioneering the implementation of climate codes onto high-performance computers. See http://exascaleage.org/ber/ for more information.
A study of workstation computational performance for real-time flight simulation
NASA Technical Reports Server (NTRS)
Maddalon, Jeffrey M.; Cleveland, Jeff I., II
1995-01-01
With recent advances in microprocessor technology, some have suggested that modern workstations provide enough computational power to properly operate a real-time simulation. This paper presents the results of a computational benchmark, based on actual real-time flight simulation code used at Langley Research Center, which was executed on various workstation-class machines. The benchmark was executed on different machines from several companies including: CONVEX Computer Corporation, Cray Research, Digital Equipment Corporation, Hewlett-Packard, Intel, International Business Machines, Silicon Graphics, and Sun Microsystems. The machines are compared by their execution speed, computational accuracy, and porting effort. The results of this study show that the raw computational power needed for real-time simulation is now offered by workstations.
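A benchmark comparison like this boils down to timing one fixed kernel on every machine and comparing wall-clock times and numerical results. Below is a minimal, hypothetical harness; the workload is a stand-in, not the Langley flight-simulation code.

```python
import time

def benchmark(fn, *args, repeats=5):
    """Best-of-N wall-clock timing, so transient OS activity on a
    machine does not distort its score."""
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        result = fn(*args)
        best = min(best, time.perf_counter() - t0)
    return best, result

def workload(n):
    """Stand-in floating-point kernel (partial sum of 1/i**2)."""
    s = 0.0
    for i in range(1, n + 1):
        s += 1.0 / (i * i)
    return s

elapsed, value = benchmark(workload, 100_000)
# Relative speed of machine A vs. machine B = elapsed_B / elapsed_A;
# comparing `value` across machines also checks computational accuracy
```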
Passot, Jean-Baptiste; Luque, Niceto R.; Arleo, Angelo
2013-01-01
The cerebellum is thought to mediate sensorimotor adaptation through the acquisition of internal models of the body-environment interaction. These representations can be of two types, identified as forward and inverse models. The first predicts the sensory consequences of actions, while the second provides the correct commands to achieve desired state transitions. In this paper, we propose a composite architecture consisting of multiple cerebellar internal models to account for the adaptation performance of humans during sensorimotor learning. The proposed model takes inspiration from the cerebellar microcomplex circuit, and employs spiking neurons to process information. We investigate the intrinsic properties of the cerebellar circuitry subserving efficient adaptation properties, and we assess the complementary contributions of internal representations by simulating our model in a procedural adaptation task. Our simulation results suggest that the coupling of internal models enhances learning performance significantly (compared with independent forward and inverse models), and it allows for the reproduction of human adaptation capabilities. Furthermore, we provide a computational explanation for the performance improvement observed after one night of sleep in a wide range of sensorimotor tasks. We predict that internal model coupling is a necessary condition for the offline consolidation of procedural memories. PMID:23874289
Dose estimates for the solid waste performance assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rittman, P.D.
1994-08-30
The Solid Waste Performance Assessment calculations by PNL in 1990 were redone to incorporate changes in methods and parameters since then. The ten scenarios found in their report were reduced to three: the Post-Drilling Resident, the Post-Excavation Resident, and an All Pathways Irrigator. In addition, estimates of population dose to people along the Columbia River are also included. The attached report describes the methods and parameters used in the calculations, and derives dose factors for each scenario. In addition, the waste concentrations, ground water concentrations, and river water concentrations needed to reach the performance objectives of 100 mrem/yr and 500 person-rem/yr are computed. Internal dose factors from DOE-0071 were applied when computing internal dose. External dose rate factors came from the GENII Version 1.485 software package. Dose calculations were carried out on a spreadsheet. The calculations are described in detail in the report for 63 nuclides, including 5 not presently in the GENII libraries. The spreadsheet calculations were checked by comparison with GENII, as described in Appendix D.
Reference Solutions for Benchmark Turbulent Flows in Three Dimensions
NASA Technical Reports Server (NTRS)
Diskin, Boris; Thomas, James L.; Pandya, Mohagna J.; Rumsey, Christopher L.
2016-01-01
A grid convergence study is performed to establish benchmark solutions for turbulent flows in three dimensions (3D) in support of turbulence-model verification campaign at the Turbulence Modeling Resource (TMR) website. The three benchmark cases are subsonic flows around a 3D bump and a hemisphere-cylinder configuration and a supersonic internal flow through a square duct. Reference solutions are computed for Reynolds Averaged Navier Stokes equations with the Spalart-Allmaras turbulence model using a linear eddy-viscosity model for the external flows and a nonlinear eddy-viscosity model based on a quadratic constitutive relation for the internal flow. The study involves three widely-used practical computational fluid dynamics codes developed and supported at NASA Langley Research Center: FUN3D, USM3D, and CFL3D. Reference steady-state solutions computed with these three codes on families of consistently refined grids are presented. Grid-to-grid and code-to-code variations are described in detail.
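Grid convergence studies like this one conventionally report an observed order of accuracy and a Richardson-extrapolated limit from three consistently refined grids. The two standard formulas, applied here to synthetic second-order data rather than the paper's solutions:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order of accuracy p from solutions on three grids
    refined by a constant ratio r: successive solution changes shrink
    by a factor r**p for a p-th order discretization."""
    return math.log(abs(f_coarse - f_medium) /
                    abs(f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, p, r=2.0):
    """Richardson estimate of the zero-grid-spacing value."""
    return f_fine + (f_fine - f_medium) / (r ** p - 1.0)

# Synthetic 2nd-order data: f(h) = 1 + 0.5*h**2 at h = 0.4, 0.2, 0.1
fc, fm, ff = 1.0 + 0.5 * 0.16, 1.0 + 0.5 * 0.04, 1.0 + 0.5 * 0.01
p = observed_order(fc, fm, ff)                # -> approximately 2
f_exact = richardson_extrapolate(fm, ff, p)   # -> approximately 1
```

Code-to-code variation is then judged against the extrapolated value rather than against any single grid's solution.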
Advances in computational design and analysis of airbreathing propulsion systems
NASA Technical Reports Server (NTRS)
Klineberg, John M.
1989-01-01
The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.
Space shuttle main engine computed tomography applications
NASA Technical Reports Server (NTRS)
Sporny, Richard F.
1990-01-01
For the past two years the potential applications of computed tomography to the fabrication and overhaul of the Space Shuttle Main Engine were evaluated. Application tests were performed at various government and manufacturer facilities with equipment produced by four different manufacturers. The hardware scanned varied in size and complexity from a small temperature sensor and turbine blades to an assembled heat exchanger and main injector oxidizer inlet manifold. The evaluation of capabilities included the ability to identify and locate internal flaws, measure the depth of surface cracks, measure wall thickness, compare manifold design contours to actual part contours, perform automatic dimensional inspections, generate 3D computer models of actual parts, and image the relationship of the details in a complex assembly. The capabilities evaluated, with the exception of measuring the depth of surface flaws, demonstrated the existing and potential ability to perform many beneficial Space Shuttle Main Engine applications.
Kar, Julia; Quesada, Peter M
2012-08-01
Anterior cruciate ligament (ACL) injuries are commonly incurred by recreational and professional women athletes during non-contact jumping maneuvers in sports such as basketball and volleyball, where ACL injuries occur more frequently in females than in males. In vivo calculation of ACL strain and internal force remains a numerical challenge. This study investigated the effects of increasing stop-jump height on the neuromuscular and biomechanical properties of the knee and ACL when the jumps were performed by young female recreational athletes. The underlying hypothesis was that increasing stop-jump (platform) height increases knee valgus angles and external moments, which in turn increases ACL strain and internal force. Using numerical analysis tools comprising Inverse Kinematics, Computed Muscle Control, and Forward Dynamics, a novel approach is presented for computing ACL strain and internal force based on (1) knee joint kinematics and (2) optimization of muscle activation, with the ACL inserted into the musculoskeletal model. Results showed increases in knee valgus external moments and angles with increasing stop-jump height. An increase in stop-jump height from 30 to 50 cm led to an increase in average peak valgus external moment from 40.5 ± 3.2 to 43.2 ± 3.7 Nm, which coincided with increases in average peak ACL strain, from 9.3 ± 3.1 to 13.7 ± 1.1%, and average peak ACL internal force, from 1056.1 ± 71.4 to 1165.4 ± 123.8 N, for the right side, with comparable increases on the left. In effect, this study demonstrates a technique for estimating dynamic changes in knee and ACL variables by conducting musculoskeletal simulation on motion analysis data collected from actual stop-jump tasks performed by young recreational women athletes.
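The strain and force quantities in the abstract above can be illustrated with a toy calculation. This is a hedged sketch under a simple linear force-strain assumption, not the study's Inverse Kinematics/Computed Muscle Control pipeline; the rest length and stiffness values are invented for illustration.

```python
# Hedged sketch: ligament engineering strain and internal force under a
# simple linear force-strain law. The study itself used musculoskeletal
# simulation (Inverse Kinematics, Computed Muscle Control), not this model;
# the rest length and stiffness below are illustrative assumptions.

def ligament_strain(length_mm, rest_length_mm):
    """Engineering strain: relative elongation beyond the rest length."""
    return (length_mm - rest_length_mm) / rest_length_mm

def ligament_force(strain, stiffness_n=10_000.0):
    """Tensile force; a slack ligament (negative strain) carries no load."""
    return max(0.0, strain) * stiffness_n

# Example: a hypothetical ACL with 32 mm rest length stretched to 35 mm.
eps = ligament_strain(35.0, 32.0)
print(round(eps, 4))        # 0.0938, i.e. about 9.4% strain
print(ligament_force(eps))  # 937.5 (N)
```

With these invented numbers the strain and force land in the same range as the averages the study reports, which is why this linear picture is a useful mental model even though the real calculation is far richer.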
Development and numerical analysis of low specific speed mixed-flow pump
NASA Astrophysics Data System (ADS)
Li, H. F.; Huo, Y. W.; Pan, Z. B.; Zhou, W. C.; He, M. H.
2012-11-01
With urban development, the market for mixed-flow pumps with large flow rate and high head is promising. KSB Shanghai Pump Co., Ltd. decided to develop a low specific speed mixed-flow pump to meet these market requirements. Based on centrifugal pump and axial-flow pump models, and aiming at the characteristics of large flow rate and high head, a new type of guide-vane mixed-flow pump was designed. The computational fluid dynamics method was adopted to analyze the internal flow of the new model and predict its performance. The time-averaged Navier-Stokes equations were closed with the SST k-ω turbulence model to accommodate the internal flow of guide vanes with large curvature. The multiple reference frame (MRF) method was used to handle the coupling of the rotating impeller and the stationary guide vane, and the SIMPLEC method was adopted for the coupled solution of velocity and pressure. The computational results show substantial flow impact on the leading edges of the vanes and substantial flow separation at the trailing edges of the guide vanes across the working conditions, both of which degrade pump performance. Based on these results, the design was optimized to reduce the impact on the vane leading edges and the separation at the guide-vane trailing edges. The optimized model was simulated and its performance predicted; the computational results show that the impact on the vane leading edges and the separation at the guide-vane trailing edges disappeared. The high-efficiency range of the optimized pump is wide, meeting the original design target. The newly designed mixed-flow pump is now being built as a model, and experimental performance data will be available soon.
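The performance prediction described above ultimately reduces to head and efficiency figures extracted from the CFD fields. As a minimal illustration (not the paper's solver or data), head and hydraulic efficiency can be post-processed from a total-pressure rise, flow rate, and shaft torque; every operating-point number below is invented.

```python
import math

# Illustrative post-processing of pump performance (not the paper's CFD
# solver): head from the total-pressure rise, hydraulic efficiency as
# fluid power over shaft power. All operating-point values are invented.

RHO = 998.0   # water density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def pump_head(p_out_pa, p_in_pa):
    """Head H = delta(p_total) / (rho * g), in metres."""
    return (p_out_pa - p_in_pa) / (RHO * G)

def hydraulic_efficiency(q_m3s, head_m, torque_nm, omega_rads):
    """eta = rho * g * Q * H / (T * omega)."""
    return RHO * G * q_m3s * head_m / (torque_nm * omega_rads)

H = pump_head(450_000.0, 101_325.0)                 # ~35.6 m
eta = hydraulic_efficiency(0.8, H, 3200.0,
                           2 * math.pi * 980 / 60)  # 980 rpm shaft speed
print(round(H, 1), round(eta, 2))
```

In a real CFD workflow the inlet and outlet total pressures would be mass-flow-averaged over the boundary patches rather than taken as single scalars.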
Comparison of digital intraoral scanners by single-image capture system and full-color movie system.
Yamamoto, Meguru; Kataoka, Yu; Manabe, Atsufumi
2017-01-01
The use of dental computer-aided design/computer-aided manufacturing (CAD/CAM) restorations is rapidly increasing. This study was performed to evaluate the marginal and internal cement thickness and the adhesive gap of internal cavities comprising CAD/CAM materials, using two digital impression acquisition methods and micro-computed tomography. Images obtained by a single-image acquisition system (Bluecam Ver. 4.0) and a full-color video acquisition system (Omnicam Ver. 4.2) were assigned to the BL and OM groups, respectively. Silicone impressions were prepared from an ISO-standard metal mold, and CEREC Stone BC and New Fuji Rock IMP were used to create working models (n=20) in the BL and OM groups (n=10 per group), respectively. Individual inlays were designed in a conventional manner using the designated software, and all restorations were prepared using CEREC inLab MC XL. These were assembled with the corresponding working models used for measurement, and the level of fit was examined by three-dimensional analysis based on micro-computed tomography. Significant differences in the marginal and internal cement thickness and adhesive gap spacing were found between the OM and BL groups. The full-color movie capture system appears to be better suited for restorations than the single-image capture system.
2014-08-04
ISS040-E-088730 (4 Aug. 2014) --- In the International Space Station's Harmony node, NASA astronauts Steve Swanson (foreground), Expedition 40 commander; and Reid Wiseman, flight engineer, perform a portable onboard computer Dynamic Onboard Ubiquitous Graphics (DOUG) software review in preparation for two upcoming U.S. spacewalks.
77 FR 14525 - Statement of Organization, Functions, and Delegations of Authority
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-12
... maintains the CDC Computer Security Incident Response Team; (4) performs cyber security incident reporting... systems planning and support; internal security and emergency preparedness; and management analysis and... security; education, training, and workforce development in information and IT disciplines; development and...
Mir Cooperative Solar Array Flight Performance Data and Computational Analysis
NASA Technical Reports Server (NTRS)
Kerslake, Thomas W.; Hoffman, David J.
1997-01-01
The Mir Cooperative Solar Array (MCSA) was developed jointly by the United States (US) and Russia to provide approximately 6 kW of photovoltaic power to the Russian space station Mir. The MCSA was launched to Mir in November 1995 and installed on the Kvant-1 module in May 1996. Since the MCSA photovoltaic panel modules (PPMs) are nearly identical to those of the International Space Station (ISS) photovoltaic arrays, MCSA operation offered an opportunity to gather multi-year performance data on this technology prior to its implementation on ISS. Two specially designed test sequences were executed in June and December 1996 to measure MCSA performance. Each test period encompassed 3 orbital revolutions whereby the current produced by the MCSA channels was measured. The temperature of MCSA PPMs was also measured. To better interpret the MCSA flight data, a dedicated FORTRAN computer code was developed to predict the detailed thermal-electrical performance of the MCSA. Flight data compared very favorably with computational performance predictions. This indicated that the MCSA electrical performance was fully meeting pre-flight expectations. There were no measurable indications of unexpected or precipitous MCSA performance degradation due to contamination or other causes after 7 months of operation on orbit. Power delivered to the Mir bus was lower than desired as a consequence of the retrofitted power distribution cabling. The strong correlation of experimental and computational results further bolsters the confidence level of performance codes used in critical ISS electric power forecasting. In this paper, MCSA flight performance tests are described as well as the computational modeling behind the performance predictions.
RISC Processors and High Performance Computing
NASA Technical Reports Server (NTRS)
Bailey, David H.; Saini, Subhash; Craw, James M. (Technical Monitor)
1995-01-01
This tutorial will discuss the top five RISC microprocessors and the parallel systems in which they are used. It will provide a unique cross-machine comparison not available elsewhere. The effective performance of these processors will be compared by citing standard benchmarks in the context of real applications. The latest NAS Parallel Benchmarks, both absolute performance and performance per dollar, will be listed. The next generation of the NPB will be described. The tutorial will conclude with a discussion of future directions in the field. Technology Transfer Considerations: All of these computer systems are commercially available internationally. Information about these processors is available in the public domain, mostly from the vendors themselves. The NAS Parallel Benchmarks and their results have been previously approved numerous times for public release, beginning back in 1991.
NASA Technical Reports Server (NTRS)
Kowalski, E. J.
1979-01-01
A computerized method that uses engine performance data to estimate the installed performance of aircraft gas turbine engines is presented. The installation effects accounted for include: engine weight and dimensions, inlet and nozzle internal performance and drag, inlet and nacelle weight, and nacelle drag. The use of two data base files to represent the engine and the inlet/nozzle/aftbody performance characteristics is discussed. The existing library of performance characteristics for inlets and nozzle/aftbodies and an example of the 1000 series of engine data tables are presented.
Zimmermann, Moritz; Valcanaia, Andre; Neiva, Gisele; Mehl, Albert; Fasbinder, Dennis
2017-11-30
Several methods for the evaluation of fit of computer-aided design/computer-assisted manufacture (CAD/CAM)-fabricated restorations have been described. In this study, digital models were recorded with an intraoral scanning device and were measured using a new three-dimensional (3D) computer technique to evaluate restoration internal fit. The aim of the study was to evaluate the internal adaptation and fit of chairside CAD/CAM-fabricated zirconia-reinforced lithium silicate ceramic crowns fabricated with different post-milling protocols. The null hypothesis was that different post-milling protocols did not influence the fitting accuracy of zirconia-reinforced lithium silicate restorations. A master all-ceramic crown preparation was completed on a maxillary right first molar on a typodont. Twenty zirconia-reinforced lithium silicate ceramic crowns (Celtra Duo, Dentsply Sirona) were designed and milled using a chairside CAD/CAM system (CEREC Omnicam, Dentsply Sirona). The 20 crowns were randomly divided into two groups based on post-milling protocols: no manipulation after milling (Group MI) and oven fired-glazing after milling (Group FG). A 3D computer method was used to evaluate the internal adaptation of the crowns. This was based on a subtractive analysis of a digital scan of the crown preparation and a digital scan of the thickness of the cement space over the crown preparation as recorded by a polyvinylsiloxane (PVS) impression material. The preparation scan and PVS scan were matched in 3D and a 3D difference analysis was performed with a software program (OraCheck, Cyfex). Three areas of internal adaptation and fit were selected for analysis: margin (MA), axial wall (AX), and occlusal surface (OC). Statistical analysis was performed using the 80th percentile and one-way ANOVA with a post-hoc Scheffé test (P = .05). The closest internal adaptation of the crowns was measured at the axial wall, with 102.0 ± 11.7 µm for group MI-AX and 106.3 ± 29.3 µm for group FG-AX.
The largest internal adaptation of the crowns was measured at the occlusal surface, with 258.9 ± 39.2 µm for group MI-OC and 260.6 ± 55.0 µm for group FG-OC. No statistically significant differences were found between the post-milling protocols (P > .05). The 3D difference pattern was visually analyzed for each area with a color-coded scheme. Post-milling processing did not affect the internal adaptation of zirconia-reinforced lithium silicate crowns fabricated with a chairside CAD/CAM technique. The new 3D computer technique for the evaluation of fit of restorations appears highly promising and could be applied in clinical studies.
NASA Technical Reports Server (NTRS)
Jules, Kenol; Lin, Paul P.; Weiss, Daniel S.
2002-01-01
This paper presents the preliminary performance results of the artificial intelligence monitoring system in full operational mode, using near real time acceleration data downlinked from the International Space Station. A preliminary microgravity environment characterization analysis result for the International Space Station (Increment-2), obtained using the monitoring system, is presented. Also presented is a comparison between the system's predicted performance, based on ground test data for the US laboratory "Destiny" module, and actual on-orbit performance, using measured acceleration data from the US laboratory module of the International Space Station. Finally, preliminary on-orbit disturbance magnitude levels are presented for the Experiment of Physics of Colloids in Space and compared with ground test data. The ground test data for the Experiment of Physics of Colloids in Space were acquired from the Microgravity Emission Laboratory, located at the NASA Glenn Research Center, Cleveland, Ohio. The artificial intelligence monitoring system was developed by the NASA Glenn Principal Investigator Microgravity Services Project to help principal investigator teams identify the primary vibratory disturbance sources that are active, at any moment, on board the International Space Station and that might affect the microgravity environment to which their experiments are exposed. From the Principal Investigator Microgravity Services web site, principal investigator teams can monitor, via a dynamic graphical display implemented in Java, in near real time, which events are on, such as crew activities, pumps, fans, centrifuges, compressors, crew exercise, and structural modes, and decide whether or not to run their experiments, whenever that is an option, based on the acceleration magnitude and frequency sensitivity associated with each experiment. This monitoring system detects primarily vibratory disturbance sources.
The system has built-in capability to detect both known and unknown vibratory disturbance sources. Several soft computing techniques such as Kohonen's Self-Organizing Feature Map, Learning Vector Quantization, Back-Propagation Neural Networks, and Fuzzy Logic were used to design the system.
Analog simulation of a hybrid gasoline-electric vehicle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilmore, D.B.
1982-03-01
Hybrid vehicles using both internal combustion engines and electric motors represent one way to reduce fuel consumption. Our demonstration project envisioned more than halving the fuel consumption of a passenger vehicle by greatly reducing the capacity of its engine and adding regenerative braking and an all-electric range. We also envisaged maintaining the same performance as current passenger vehicles. A 0-6 000 rpm gasoline-driven internal combustion engine, two 0-7 800 rpm electric motors, a 0-7 800 rpm flywheel, and lead-acid batteries are the major components, assembled using a mechanical epicyclic gear box. An EAI 681 analog computer allowed us to examine quickly the effects of engine capacity, flywheel size, battery voltage, gear ratios, and mode of operation. An external potentiometer control on the computer allowed the operator to drive the vehicle through any acceleration cycle on level ground. We have shown that a 1.3 litre gasoline engine, two 13 kW separately excited direct current electric motors, a 38 kg flywheel, and a 48-volt battery pack will provide the same maximum performance as a conventional 4.1 litre internal combustion engine with automatic transmission at vehicle speeds below 60 km/h, and lower but satisfactory highway performance up to a top speed of 130 km/h. The transmission has undergone laboratory tests; it is to be road-tested in the first half of 1982.
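The epicyclic gear box that couples the engine, motors, and flywheel obeys the standard planetary-gear kinematic constraint (the Willis equation). The sketch below illustrates that constraint only; the tooth counts and speeds are invented, not taken from the demonstration vehicle.

```python
# Hedged sketch of the kinematic constraint of a simple epicyclic
# (planetary) gear set; tooth counts and speeds are illustrative only,
# not the demonstration vehicle's actual ratios.

def carrier_speed(omega_sun_rpm, omega_ring_rpm, z_sun, z_ring):
    """Willis equation: omega_c * (Z_s + Z_r) = omega_s * Z_s + omega_r * Z_r."""
    return (omega_sun_rpm * z_sun + omega_ring_rpm * z_ring) / (z_sun + z_ring)

# Engine on the sun gear at 3000 rpm, an electric motor driving the ring
# at 1500 rpm, output taken from the planet carrier:
print(carrier_speed(3000.0, 1500.0, z_sun=30, z_ring=90))  # 1875.0 rpm
```

This single linear constraint is what lets a power-split hybrid blend engine and motor speeds continuously, and it is exactly the kind of relation an analog computer of that era could model with a handful of summing amplifiers.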
The internal gravity wave spectrum in two high-resolution global ocean models
NASA Astrophysics Data System (ADS)
Arbic, B. K.; Ansong, J. K.; Buijsman, M. C.; Kunze, E. L.; Menemenlis, D.; Müller, M.; Richman, J. G.; Savage, A.; Shriver, J. F.; Wallcraft, A. J.; Zamudio, L.
2016-02-01
We examine the internal gravity wave (IGW) spectrum in two sets of high-resolution global ocean simulations that are forced concurrently by atmospheric fields and the astronomical tidal potential. We analyze global 1/12th and 1/25th degree HYCOM simulations, and global 1/12th, 1/24th, and 1/48th degree simulations of the MITgcm. We are motivated by the central role that IGWs play in ocean mixing, by operational considerations of the US Navy, which runs HYCOM as an ocean forecast model, and by the impact of the IGW continuum on the sea surface height (SSH) measurements that will be taken by the planned NASA/CNES SWOT wide-swath altimeter mission. We (1) compute the IGW horizontal wavenumber-frequency spectrum of kinetic energy, and interpret the results with linear dispersion relations computed from the IGW Sturm-Liouville problem, (2) compute and similarly interpret nonlinear spectral kinetic energy transfers in the IGW band, (3) compute and similarly interpret IGW contributions to SSH variance, (4) perform comparisons of modeled IGW kinetic energy frequency spectra with moored current meter observations, and (5) perform comparisons of modeled IGW kinetic energy vertical wavenumber-frequency spectra with moored observations. This presentation builds upon our work in Muller et al. (2015, GRL), who performed tasks (1), (2), and (4) in 1/12th and 1/25th degree HYCOM simulations, for one region of the North Pacific. New for this presentation are tasks (3) and (5), the inclusion of MITgcm solutions, and the analysis of additional ocean regions.
Fast Boundary Element Method for acoustics with the Sparse Cardinal Sine Decomposition
NASA Astrophysics Data System (ADS)
Alouges, François; Aussal, Matthieu; Parolin, Emile
2017-07-01
This paper presents the newly proposed Sparse Cardinal Sine Decomposition method, which allows fast convolution on unstructured grids. We focus on its use when coupled with finite element techniques to solve acoustic problems with the (compressed) Boundary Element Method. In addition, we compare the computational performance of two equivalent Matlab® and Python implementations of the method. We show validation test cases in order to assess the precision of the approach. Finally, the performance of the method is illustrated by the computation of the acoustic target strength of a realistic submarine from the Benchmark Target Strength Simulation international workshop.
1997-10-01
Research results include: (1) Developed empirical performance criteria for characterizing stabilities and robustness of the maglev control... Maglev Experience’ at HS: Fifth International Hybrid Systems Workshop, Notre Dame, IN, Sept. 11-13,1997
SCF-MO computations have been performed on tetra- to octa-chlorinated dibenzo-p-dioxin congeners (PCDD) using an MNDO-PM3 Hamiltonian. Qualitative relationships were developed between empirical, international-toxic equivalence factors for PCDD congeners and their relati...
Evaluating open-source cloud computing solutions for geosciences
NASA Astrophysics Data System (ADS)
Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong
2013-09-01
Many organizations are starting to adopt cloud computing to better utilize computing resources, taking advantage of its scalability, cost reduction, and ease of access. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting geosciences. This paper provides a comprehensive study of three open-source cloud solutions: OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances, including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating cloud resources, as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory, or I/O between virtual machines created and managed by the different solutions, (2) OpenNebula has the fastest internal network, while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies, (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula, and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing intensive applications, and small-scale model simulations without intensive data communication.
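Operation timings of the kind compared in finding (3) can be gathered with a simple repeat-and-average harness. The sketch below shows the general pattern with a stand-in workload, since the real measurements depend on each platform's management API; nothing here is the study's actual benchmark code.

```python
import statistics
import time

# Sketch of a repeat-and-average timing harness of the kind used to
# compare cloud operations (VM launch, snapshot, volume attach). The
# workload below is a stand-in; a real run would call each platform's API.

def benchmark(operation, repeats=5):
    """Return (mean, stdev) wall-clock seconds over several runs."""
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        operation()
        samples.append(time.perf_counter() - start)
    return statistics.mean(samples), statistics.stdev(samples)

def stand_in_vm_launch():
    # Placeholder for e.g. an OpenNebula/Eucalyptus/CloudStack call.
    sum(i * i for i in range(100_000))

mean_s, stdev_s = benchmark(stand_in_vm_launch)
print(mean_s > 0.0 and stdev_s >= 0.0)  # True
```

Reporting the standard deviation alongside the mean matters for cloud operations, whose latencies are notoriously variable across repeated runs.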
NASA Astrophysics Data System (ADS)
Bessonov, O.; Silvestrov, P.
2017-02-01
This paper describes the general idea and the first implementation of the Interactive information and simulation system: an integrated environment that combines computational modules for modeling the aerodynamics and aerothermodynamics of re-entry space vehicles with a large collection of information materials on this topic. The internal organization and composition of the system are described and illustrated. Examples of the computational and information output are presented. The system has a unified implementation for Windows and Linux operating systems and can be deployed on any modern high-performance personal computer.
Program Helps To Determine Chemical-Reaction Mechanisms
NASA Technical Reports Server (NTRS)
Bittker, D. A.; Radhakrishnan, K.
1995-01-01
General Chemical Kinetics and Sensitivity Analysis (LSENS) is a computer code developed for use in solving complex, homogeneous, gas-phase, chemical-kinetics problems. It provides efficient and accurate chemical-kinetics computations and sensitivity analysis for a variety of problems, including problems involving nonisothermal conditions. It incorporates mathematical models for a static system, steady one-dimensional inviscid flow, reaction behind an incident shock wave (with boundary-layer correction), and a perfectly stirred reactor. Computations of equilibrium properties are performed for the following assigned states: enthalpy and pressure, temperature and pressure, internal energy and volume, and temperature and volume. Written in FORTRAN 77, with the exception of NAMELIST extensions used for input.
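LSENS itself is a FORTRAN 77 code, but the core idea of marching a chemical-kinetics rate equation in time can be sketched in a few lines. The example below integrates a single first-order reaction with explicit Euler and checks it against the exact exponential solution; it illustrates the problem class only, not LSENS's far more sophisticated stiff integrators, and all rate constants are invented.

```python
import math

# Illustration of the chemical-kinetics problem class (not LSENS itself):
# a single first-order reaction A -> B at constant temperature, integrated
# with explicit Euler and checked against the exact exponential solution.

def euler_first_order(a0, k, t_end, steps):
    """March d[A]/dt = -k*[A] forward in time with explicit Euler."""
    dt = t_end / steps
    a = a0
    for _ in range(steps):
        a += dt * (-k * a)
    return a

a0, k, t_end = 1.0, 2.0, 1.0
approx = euler_first_order(a0, k, t_end, steps=10_000)
exact = a0 * math.exp(-k * t_end)
print(abs(approx - exact) < 1e-4)  # True: fine steps track the exact decay
```

Real combustion mechanisms couple hundreds of such equations with wildly different rate constants, which is why production codes rely on implicit stiff solvers rather than explicit Euler.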
Analysis of shadowing effects on spacecraft power systems
NASA Technical Reports Server (NTRS)
Fincannon, H. J.
1995-01-01
This paper describes the Orbiting Spacecraft Shadowing Analysis (OSSA) computer program that was developed at NASA Lewis Research Center in order to assess the shadowing effects on various power systems. The algorithms, inputs and outputs are discussed. Examples of typical shadowing analyses that have been performed for the International Space Station Freedom, International Space Station Alpha and the joint United States/Russian Mir Solar Dynamic Flight Experiment Project are covered. Effects of shadowing on power systems are demonstrated.
Computed Tomographic Morphometry of the Internal Anatomy of Mandibular Second Primary Molars.
Kurthukoti, Ameet J; Sharma, Pranjal; Swamy, Dinesh Francis; Shashidara, R; Swamy, Elaine Barretto
2015-01-01
Need for the study: The most important procedure for a successful endodontic treatment is the cleaning and shaping of the canal system. Understanding the internal anatomy of teeth provides valuable information that helps the clinician achieve higher clinical success during endodontic therapy. The aim was to evaluate, by computed tomography, the internal anatomy of mandibular second primary molars with respect to the number of canals, cross-sectional shape of canals, cross-sectional area of canals, and the root dentin thickness. A total of 31 mandibular second primary molars were subjected to computed-tomographic evaluation in the transverse plane after mounting them in a prefabricated template. The images thus obtained were analyzed using De-winter Bio-wizard® software. All the samples demonstrated two canals in the mesial root, while the majority of the samples (65.48%) demonstrated two canals in the distal root. The cross-sectional images of the mesial canals demonstrated a round shape, while the distal canals demonstrated an irregular shape. The root dentin thickness was markedly reduced on the distal aspect of the mesial canals and the mesial aspect of the distal canals. The mandibular second primary molars demonstrated wide variation and complexity in their internal anatomy. A thorough understanding of the complexity of the root canal system is essential for understanding the principles and problems of shaping and cleaning, determining the apical limits and dimensions of canal preparations, and performing successful endodontic procedures. How to cite this article: Kurthukoti AJ, Sharma P, Swamy DF, Shashidara R, Swamy EB. Computed Tomographic Morphometry of the Internal Anatomy of Mandibular Second Primary Molars. Int J Clin Pediatr Dent 2015;8(3):202-207.
Computed Tomographic Morphometry of the Internal Anatomy of Mandibular Second Primary Molars
Sharma, Pranjal; Swamy, Dinesh Francis; Shashidara, R; Swamy, Elaine Barretto
2015-01-01
ABSTRACT Need for the study: The most important procedure for a successful endodontic treatment is the cleaning and shaping of the canal system. Understanding the internal anatomy of teeth provides valuable information that helps the clinician achieve higher clinical success during endodontic therapy. Aims: To evaluate, by computed tomography, the internal anatomy of mandibular second primary molars with respect to the number of canals, cross-sectional shape of canals, cross-sectional area of canals and the root dentin thickness. Materials and methods: A total of 31 mandibular second primary molars were subjected to computed-tomographic evaluation in the transverse plane, after mounting them in a prefabricated template. The images thus obtained were analyzed using De-winter Bio-wizard® software. Results: All the samples demonstrated two canals in the mesial root, while the majority of the samples (65.48%) demonstrated two canals in the distal root. The cross-sectional images of the mesial canals demonstrated a round shape, while the distal canals demonstrated an irregular shape. The root dentin thickness was markedly reduced on the distal aspect of the mesial canals and the mesial aspect of the distal canals. Conclusion: The mandibular second primary molars demonstrated wide variation and complexity in their internal anatomy. A thorough understanding of the complexity of the root canal system is essential for understanding the principles and problems of shaping and cleaning, determining the apical limits and dimensions of canal preparations, and performing successful endodontic procedures. How to cite this article: Kurthukoti AJ, Sharma P, Swamy DF, Shashidara R, Swamy EB. Computed Tomographic Morphometry of the Internal Anatomy of Mandibular Second Primary Molars. Int J Clin Pediatr Dent 2015;8(3):202-207. PMID:26628855
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-30
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-841] Certain Computer and Computer... Bonding AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby given that the U.S. International Trade Commission has determined to review in the entirety the final initial...
Realization of ETRF2000 as a New Terrestrial Reference Frame in Republic of Serbia
NASA Astrophysics Data System (ADS)
Blagojevic, D.; Vasilic, V.
2012-12-01
The International Earth Rotation and Reference Systems Service (IERS) is a joint service of the International Association of Geodesy (IAG) and the International Astronomical Union (IAU), which provides the scientific community with the means for computing the transformation from the International Celestial Reference System (ICRS) to the International Terrestrial Reference System (ITRS). It further maintains the realizations of these systems through appropriate coordinate sets called "frames". A densification of the terrestrial frame usually serves as the official frame for positioning and navigation tasks within the territory of a particular country. One such densification was recently performed in order to establish a new reference frame for the Republic of Serbia. This paper describes the related activities, resulting in ETRF2000 as the new Serbian terrestrial reference frame.
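Frame realizations such as ETRF2000 are related to ITRS realizations by small-angle seven-parameter (Helmert) transformations. The sketch below shows the linearized form of such a transformation; the station coordinates and parameter values are placeholders, not the official ITRF-to-ETRF2000 parameters.

```python
# Hedged sketch of the small-angle seven-parameter (Helmert) transformation
# between terrestrial-frame realizations. All parameter values below are
# placeholders, NOT the official ITRF-to-ETRF2000 transformation parameters.

def helmert(xyz, t, d, r):
    """x' = x + t + d*x + R x, with the small-rotation matrix
    R = [[0, -rz, ry], [rz, 0, -rx], [-ry, rx, 0]] (rotations in radians)."""
    x, y, z = xyz
    tx, ty, tz = t
    rx, ry, rz = r
    return (x + tx + d * x - rz * y + ry * z,
            y + ty + d * y + rz * x - rx * z,
            z + tz + d * z - ry * x + rx * y)

station = (4027893.0, 306045.0, 4919475.0)       # metres, invented point
shifted = helmert(station,
                  t=(0.054, 0.051, -0.048),      # metres (placeholder)
                  d=1.34e-9,                     # scale factor (placeholder)
                  r=(8.9e-9, 2.5e-9, -1.1e-8))   # radians (placeholder)
print(all(abs(a - b) < 1.0 for a, b in zip(shifted, station)))  # sub-metre shift
```

In practice the official transformations are also time-dependent (parameters plus rates referred to a reference epoch), which this static sketch omits.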
International Guidelines on Computer-Based and Internet-Delivered Testing
ERIC Educational Resources Information Center
International Journal of Testing, 2006
2006-01-01
Developed by the International Test Commission, the International Guidelines on Computer-Based and Internet-Delivered Testing are a set of guidelines specifically developed to highlight good practice issues in relation to computer/Internet tests and testing. These guidelines have been developed from an international perspective and are directed at…
Marginal Accuracy and Internal Fit of 3-D Printing Laser-Sintered Co-Cr Alloy Copings.
Kim, Myung-Joo; Choi, Yun-Jung; Kim, Seong-Kyun; Heo, Seong-Joo; Koak, Jai-Young
2017-01-23
Laser sintered technology has been introduced for clinical use and can be utilized more widely, accompanied by the digitalization of dentistry and the development of direct oral scanning devices. This study was performed with the aim of comparing the marginal accuracy and internal fit of Co-Cr alloy copings fabricated by casting, CAD/CAM (computer-aided design/computer-assisted manufacture) milling, and 3-D laser sintering techniques. A total of 36 Co-Cr alloy crown copings were fabricated from an implant abutment. The marginal and internal fit were evaluated by measuring the weight of the silicone material, the vertical marginal discrepancy using a microscope, and the internal gap in the sectioned specimens. The data were statistically analyzed by one-way ANOVA (analysis of variance), a Scheffé test, and Pearson's correlation at the significance level of p = 0.05, using statistics software. The silicone weight was significantly lower in the casting group. The 3-D laser sintered group showed the highest vertical discrepancy and the highest marginal, occlusal, and average internal gaps (p < 0.05). The CAD/CAM milled group revealed a significantly high axial internal gap. There are moderate correlations between the vertical marginal discrepancy and the internal gap variables (r = 0.654), except for the silicone weight. In this study, the 3-D laser sintered group achieved clinically acceptable marginal accuracy and internal fit.
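The group comparison above rests on the one-way ANOVA F statistic, the ratio of between-group to within-group mean squares. A minimal sketch follows; the gap measurements are invented, not the study's data, which were analyzed with dedicated statistics software and a post-hoc Scheffé test.

```python
import statistics

# Sketch of the one-way ANOVA F statistic (between-group mean square over
# within-group mean square). The gap values below are invented, not the
# study's measurements.

def one_way_anova_f(groups):
    """F = (SS_between / (k - 1)) / (SS_within / (n - k))."""
    values = [v for g in groups for v in g]
    grand = statistics.mean(values)
    k, n = len(groups), len(values)
    ss_between = sum(len(g) * (statistics.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - statistics.mean(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical internal-gap measurements (micrometres) for three groups:
f_stat = one_way_anova_f([[102, 98, 105], [120, 118, 125], [99, 101, 97]])
print(f_stat > 1.0)  # True: the group means clearly differ
```

A large F is then compared against the F distribution with (k-1, n-k) degrees of freedom to obtain the p-value the abstract reports.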
Marginal Accuracy and Internal Fit of 3-D Printing Laser-Sintered Co-Cr Alloy Copings
Kim, Myung-Joo; Choi, Yun-Jung; Kim, Seong-Kyun; Heo, Seong-Joo; Koak, Jai-Young
2017-01-01
Laser sintered technology has been introduced for clinical use and can be utilized more widely, accompanied by the digitalization of dentistry and the development of direct oral scanning devices. This study was performed with the aim of comparing the marginal accuracy and internal fit of Co-Cr alloy copings fabricated by casting, CAD/CAM (Computer-aided design/Computer-assisted manufacture) milled, and 3-D laser sintered techniques. A total of 36 Co-Cr alloy crown-copings were fabricated from an implant abutment. The marginal and internal fit were evaluated by measuring the weight of the silicone material, the vertical marginal discrepancy using a microscope, and the internal gap in the sectioned specimens. The data were statistically analyzed by One-way ANOVA (analysis of variance), a Scheffé test, and Pearson’s correlation at the significance level of p = 0.05, using statistics software. The silicone weight was significantly lower in the casting group. The 3-D laser sintered group showed the highest vertical discrepancy and the highest marginal, occlusal, and average internal gaps (p < 0.05). The CAD/CAM milled group revealed a significantly high axial internal gap. There are moderate correlations between the vertical marginal discrepancy and the internal gap variables (r = 0.654), except for the silicone weight. In this study, the 3-D laser sintered group achieved clinically acceptable marginal accuracy and internal fit. PMID:28772451
Code of Federal Regulations, 2011 CFR
2011-10-01
... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...
Code of Federal Regulations, 2013 CFR
2013-10-01
... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...
Code of Federal Regulations, 2012 CFR
2012-10-01
... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...
Code of Federal Regulations, 2014 CFR
2014-10-01
... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...
Code of Federal Regulations, 2010 CFR
2010-10-01
... software or computer software documentation to foreign governments, foreign contractors, or international... Rights in Computer Software and Computer Software Documentation 227.7203-16 Providing computer software or computer software documentation to foreign governments, foreign contractors, or international...
Computation of ancestry scores with mixed families and unrelated individuals.
Zhou, Yi-Hui; Marron, James S; Wright, Fred A
2018-03-01
The issue of robustness to family relationships in computing genotype ancestry scores such as eigenvector projections has received increased attention in genetic association, and is particularly challenging when sets of both unrelated individuals and closely related family members are included. The current standard is to compute loadings (left singular vectors) using unrelated individuals and to compute projected scores for remaining family members. However, projected ancestry scores from this approach suffer from shrinkage toward zero. We consider two main novel strategies: (i) matrix substitution based on decomposition of a target family-orthogonalized covariance matrix, and (ii) using family-averaged data to obtain loadings. We illustrate the performance via simulations, including resampling from 1000 Genomes Project data, and analysis of a cystic fibrosis dataset. The matrix substitution approach has similar performance to the current standard, but is simple and uses only a genotype covariance matrix, while the family-average method shows superior performance. Our approaches are accompanied by novel ancillary approaches that provide considerable insight, including individual-specific eigenvalue scree plots. © 2017 The Authors. Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
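The "current standard" the abstract describes can be sketched with synthetic data (this is an illustration of the projection idea, not the authors' implementation): compute singular vectors from unrelated individuals only, then project family members onto those loadings.

```python
import numpy as np

# Illustrative sketch (synthetic data, not the authors' code) of the
# "current standard": compute loadings from unrelated individuals via SVD,
# then project remaining family members onto those loadings. Out-of-sample
# projections of this kind are what the paper reports as shrunk toward zero.
rng = np.random.default_rng(0)

n_unrelated, n_family, n_snps = 100, 10, 500
G_unrelated = rng.standard_normal((n_unrelated, n_snps))  # centered genotypes
G_family = rng.standard_normal((n_family, n_snps))        # relatives to project

# SVD of the unrelated set; rows of Vt are SNP-loading vectors.
U, S, Vt = np.linalg.svd(G_unrelated, full_matrices=False)

k = 2  # retained ancestry dimensions
scores_unrelated = G_unrelated @ Vt[:k].T   # in-sample ancestry scores
scores_family = G_family @ Vt[:k].T         # projected scores for relatives
```

The paper's matrix-substitution and family-average strategies replace this naive projection step; the sketch only shows the baseline being improved upon.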
Computational biomedicine: a challenge for the twenty-first century.
Coveney, Peter V; Shublaq, Nour W
2012-01-01
With the relentless increase of computer power and the widespread availability of digital patient-specific medical data, we are now entering an era when it is becoming possible to develop predictive models of human disease and pathology, which can be used to support and enhance clinical decision-making. The approach amounts to a grand challenge to computational science insofar as we need to provide seamless yet secure access to large-scale heterogeneous personal healthcare data, typically integrated into complex workflows (some parts of which may need to be run on high-performance computers) and into clinical decision support software. In this paper, we review the state of the art in terms of case studies drawn from neurovascular pathologies and HIV/AIDS. These studies are representative of a large number of projects currently being performed within the Virtual Physiological Human initiative. They make demands of information technology at many scales, from the desktop to national and international infrastructures for data storage and processing, linked by high-performance networks.
Dahl, Bjørn Einar; Rønold, Hans Jacob; Dahl, Jon E
2017-03-01
Whether single crowns produced by computer-aided design and computer-aided manufacturing (CAD-CAM) have an internal fit comparable to crowns made by lost-wax metal casting technique is unknown. The purpose of this in vitro study was to compare the internal fit of single crowns produced with the lost-wax and metal casting technique with that of single crowns produced with the CAD-CAM technique. The internal fit of 5 groups of single crowns produced with the CAD-CAM technique was compared with that of single crowns produced in cobalt-chromium with the conventional lost-wax and metal casting technique. Comparison was performed using the triple-scan protocol; scans of the master model, the crown on the master model, and the intaglio of the crown were superimposed and analyzed with computer software. The 5 groups were milled presintered zirconia, milled hot isostatic pressed zirconia, milled lithium disilicate, milled cobalt-chromium, and laser-sintered cobalt-chromium. The cement space in both the mesiodistal and buccopalatal directions was statistically smaller (P<.05) for crowns made by the conventional lost-wax and metal casting technique compared with that of crowns produced by the CAD-CAM technique. Single crowns made using the conventional lost-wax and metal casting technique have better internal fit than crowns produced using the CAD-CAM technique. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Economizing Education: Assessment Algorithms and Calculative Agencies
ERIC Educational Resources Information Center
O'Keeffe, Cormac
2017-01-01
International Large Scale Assessments have been producing data about educational attainment for over 60 years. More recently however, these assessments as tests have become digitally and computationally complex and increasingly rely on the calculative work performed by algorithms. In this article I first consider the coordination of relations…
Computer-Based Adaptation Tool for Advanced Diesel Engines Used in Military Applications
2008-09-04
Scholarships. 4. Rupinder Kumar Sharma, MS in Mechanical Engineering, "Performance of EGR Cooling Device", May 2006. 5. Rajesh Patel, MS in... secondary motions and hydrodynamic lubrication regime in a single-cylinder internal combustion engine". 9. Vijay K. Venugopal, MS in Mechanical
Integrating Asynchronous Digital Design Into the Computer Engineering Curriculum
ERIC Educational Resources Information Center
Smith, S. C.; Al-Assadi, W. K.; Di, J.
2010-01-01
As demand increases for circuits with higher performance, higher complexity, and decreased feature size, asynchronous (clockless) paradigms will become more widely used in the semiconductor industry, as evidenced by the International Technology Roadmap for Semiconductors' (ITRS) prediction of a likely shift from synchronous to asynchronous design…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Maxine D.; Leigh, Jason
2014-02-17
The Blaze high-performance visual computing system serves the high-performance computing research and education needs of the University of Illinois at Chicago (UIC). Blaze consists of a state-of-the-art, networked computer cluster and an ultra-high-resolution visualization system called CAVE2(TM) that is currently not available anywhere else in Illinois. This system is connected via a high-speed 100-Gigabit network to the State of Illinois' I-WIRE optical network, as well as to national and international high-speed networks such as Internet2 and the Global Lambda Integrated Facility. This enables Blaze to serve as an on-ramp to national cyberinfrastructure, such as the National Science Foundation's Blue Waters petascale computer at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and the Department of Energy's Argonne Leadership Computing Facility (ALCF) at Argonne National Laboratory. DOE award #DE-SC005067, leveraged with NSF award #CNS-0959053 for "Development of the Next-Generation CAVE Virtual Environment (NG-CAVE)," enabled us to create a first-of-its-kind high-performance visual computing system. The UIC Electronic Visualization Laboratory (EVL) worked with two U.S. companies to advance their commercial products and maintain U.S. leadership in the global information technology economy. New applications enabled by the CAVE2/Blaze visual computing system are advancing scientific research and education in the U.S. and globally and helping to train the next-generation workforce.
Computer applications in the search for unrelated stem cell donors.
Müller, Carlheinz R
2002-08-01
The majority of patients who are eligible for a blood stem cell transplantation from an allogeneic donor do not have a suitable related donor, so an efficient unrelated donor search is a prerequisite for this treatment. Currently, there are over 7 million volunteer donors in the files of 50 registries worldwide, and in most countries the majority of transplants are performed from a foreign donor. Evidently, computer and communication technology must play a crucial role in the complex donor search process at the national and international level. This article describes the structural elements of the donor search process and discusses major systematic and technical issues to be addressed in the development and evolution of the supporting telematic systems. The theoretical considerations are complemented by a concise overview of the current state of the art, given by describing the scope, relevance, interconnection, and technical background of three major national and international computer applications: the German Marrow Donor Information System (GERMIS) and the European Marrow Donor Information System (EMDIS), which are interoperable business-to-business e-commerce systems, and Bone Marrow Donors Worldwide (BMDW), the basic international donor information desk on the web.
Airborne Cloud Computing Environment (ACCE)
NASA Technical Reports Server (NTRS)
Hardman, Sean; Freeborn, Dana; Crichton, Dan; Law, Emily; Kay-Im, Liz
2011-01-01
Airborne Cloud Computing Environment (ACCE) is JPL's internal investment to improve the return on airborne missions by improving the development performance of the data system and the return on the captured science data. The investment is to develop a common science data system capability for airborne instruments that encompasses the end-to-end lifecycle, covering planning, provisioning of data system capabilities, and support for scientific analysis, in order to improve the quality, cost effectiveness, and capabilities that enable new scientific discovery and research in Earth observation.
Ayres-de-Campos, Diogo; Rei, Mariana; Nunes, Inês; Sousa, Paulo; Bernardes, João
2017-01-01
SisPorto 4.0 is the most recent version of a program for the computer analysis of cardiotocographic (CTG) signals and ST events, which has been adapted to the 2015 International Federation of Gynaecology and Obstetrics (FIGO) guidelines for intrapartum foetal monitoring. This paper provides a detailed description of the analysis performed by the system, including the signal-processing algorithms involved in identification of basic CTG features and the resulting real-time alerts.
Household Possessions Indices as Wealth Measures: A Validity Evaluation
ERIC Educational Resources Information Center
Traynor, Anne; Raykov, Tenko
2013-01-01
In international achievement studies, questionnaires typically ask about the presence of particular household assets in students' homes. Responses to the assets questions are used to compute a total score, which is intended to represent household wealth in models of test performance. This study uses item analysis and confirmatory factor analysis…
15 CFR Appendix B to Part 30 - AES Filing Codes
Code of Federal Regulations, 2011 CFR
2011-01-01
... exemptions: Currency Airline tickets Bank notes Internal revenue stamps State liquor stamps Advertising...—Trans-Alaska Pipeline Authorization Act C50ENC—Encryption Commodities and Software C51AGR—License Exception Agricultural Commodities C53APP—Adjusted Peak Performance (Computers) C54SS-WRC—Western Red Cedar...
15 CFR Appendix B to Part 30 - AES Filing Codes
Code of Federal Regulations, 2010 CFR
2010-01-01
... exemptions: Currency Airline tickets Bank notes Internal revenue stamps State liquor stamps Advertising...—Trans-Alaska Pipeline Authorization Act C50ENC—Encryption Commodities and Software C51AGR—License Exception Agricultural Commodities C53APP—Adjusted Peak Performance (Computers) C54SS-WRC—Western Red Cedar...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boegh, P.; Hopkirk, R.; Junod, A.
From the International Nuclear Industries Fair, Basel, Switzerland (16 Oct 1972). The extensive environmental studies performed in Switzerland for the cooling towers of the Kaiseraugst and Leibstadt Nuclear Power Plants are presented. The computer program SAUNA for the calculation of cooling tower plume behavior is briefly described. The main results of the environmental studies are summarized. (8 references) (auth)
Park, Sung Hwan; Lee, Ji Min; Kim, Jong Shik
2013-01-01
An irregular performance of a mechanical-type constant power regulator is considered. In order to find the cause of an irregular discharge flow at the cut-off pressure area, modeling and numerical simulations are performed to observe the dynamic behavior of the internal parts of the constant power regulator system for a swashplate-type axial piston pump. The commercial numerical simulation software AMESim is applied to model the mechanical-type regulator with the hydraulic pump and to simulate its performance. The validity of the simulation model of the constant power regulator system is verified by comparing simulation results with experiments. In order to find the cause of the irregular performance of the mechanical-type constant power regulator system, the behavior of main components such as the spool, sleeve, and counterbalance piston is investigated using computer simulation. A shape modification of the counterbalance piston is proposed to improve the undesirable performance of the mechanical-type constant power regulator. The performance improvement is verified by computer simulation using the AMESim software.
Computational simulation of the creep-rupture process in filamentary composite materials
NASA Technical Reports Server (NTRS)
Slattery, Kerry T.; Hackett, Robert M.
1991-01-01
A computational simulation of the internal damage accumulation which causes the creep-rupture phenomenon in filamentary composite materials is developed. The creep-rupture process involves complex interactions between several damage mechanisms. A statistically-based computational simulation using a time-differencing approach is employed to model these progressive interactions. The finite element method is used to calculate the internal stresses. The fibers are modeled as a series of bar elements which are connected transversely by matrix elements. Flaws are distributed randomly throughout the elements in the model. Load is applied, and the properties of the individual elements are updated at the end of each time step as a function of the stress history. The simulation is continued until failure occurs. Several cases, with different initial flaw dispersions, are run to establish a statistical distribution of the time-to-failure. The calculations are performed on a supercomputer. The simulation results compare favorably with the results of creep-rupture experiments conducted at the Lawrence Livermore National Laboratory.
Model-based analyses: Promises, pitfalls, and example applications to the study of cognitive control
Mars, Rogier B.; Shea, Nicholas J.; Kolling, Nils; Rushworth, Matthew F. S.
2011-01-01
We discuss a recent approach to investigating cognitive control, which has the potential to deal with some of the challenges inherent in this endeavour. In a model-based approach, the researcher defines a formal, computational model that performs the task at hand and whose performance matches that of a research participant. The internal variables in such a model might then be taken as proxies for latent variables computed in the brain. We discuss the potential advantages of such an approach for the study of the neural underpinnings of cognitive control and its pitfalls, and we make explicit the assumptions underlying the interpretation of data obtained using this approach. PMID:20437297
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paris, Mark W.
The current one-year project allocation (w17 burst) supports the continuation of research performed in the two-year Institutional Computing allocation (w14 bigbangnucleosynthesis). The project has supported development and production runs resulting in several publications [1, 2, 3, 4] in peer-reviewed journals and talks. Most significantly, we have recently achieved a significant improvement in code performance. This improvement was essential to the prospect of making further progress on this heretofore unsolved multiphysics problem that lies at the intersection of nuclear and particle theory and the kinetic theory of energy transport in a system with internal (quantum) degrees of freedom.
Computational simulation and aerodynamic sensitivity analysis of film-cooled turbines
NASA Astrophysics Data System (ADS)
Massa, Luca
A computational tool is developed for the time-accurate sensitivity analysis of the stage performance of hot-gas, unsteady turbine components. An existing turbomachinery internal flow solver is adapted to the high-temperature environment typical of the hot section of jet engines. A real gas model and film cooling capabilities are successfully incorporated in the software. The modifications to the existing algorithm are described; both the theoretical model and the numerical implementation are validated. The accuracy of the code in evaluating turbine stage performance is tested using a turbine geometry typical of the last stage of aeronautical jet engines. The results of the performance analysis show that the predictions differ from the experimental data by less than 3%. A reliable grid generator, applicable to the domain discretization of the internal flow field of axial-flow turbines, is developed. A sensitivity analysis capability is added to the flow solver, rendering it able to accurately evaluate the derivatives of the time-varying output functions. The complex Taylor's series expansion (CTSE) technique is reviewed. Two of its formulations are used to demonstrate the accuracy and time dependence of the differentiation process. The results are compared with finite difference (FD) approximations. The CTSE is more accurate than the FD, but less efficient. A "black box" differentiation of the source code, resulting from the automated application of the CTSE, generates high-fidelity sensitivity algorithms, but with low computational efficiency and high memory requirements. New formulations of the CTSE are proposed and applied. Selective differentiation of the method for solving the non-linear implicit residual equation leads to sensitivity algorithms with the same accuracy but improved run time. The time-dependent sensitivity derivatives are computed in run times comparable to the ones required by the FD approach.
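The complex-step (CTSE) derivative the abstract reviews can be sketched in a few lines. The test function below is a hypothetical stand-in, but the formula f'(x) ≈ Im(f(x + ih))/h is the standard complex-step form:

```python
import cmath

def complex_step_derivative(f, x, h=1e-20):
    """Complex Taylor's series expansion (CTSE) derivative:
    f'(x) ≈ Im(f(x + i*h)) / h. Unlike finite differences, there is
    no subtractive cancellation, so h can be taken extremely small."""
    return f(x + 1j * h).imag / h

# Hypothetical test function: d/dx [exp(x) * sin(x)] at x = 1
f = lambda x: cmath.exp(x) * cmath.sin(x)
exact = (cmath.exp(1) * (cmath.sin(1) + cmath.cos(1))).real
approx = complex_step_derivative(f, 1.0)
```

This also illustrates why the CTSE beats FD in accuracy but costs more: each evaluation carries complex arithmetic (roughly twice the work and storage of a real-valued evaluation).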
Internal audit in a microbiology laboratory.
Mifsud, A J; Shafi, M S
1995-01-01
AIM: To set up a programme of internal laboratory audit in a medical microbiology laboratory. METHODS: A model of laboratory based process audit is described. Laboratory activities were examined in turn by specimen type. Standards were set using laboratory standard operating procedures; practice was observed using a purpose designed questionnaire and the data were analysed by computer; performance was assessed at laboratory audit meetings; and the audit circle was closed by re-auditing topics after an interval. RESULTS: Improvements in performance scores (objective measures) and in staff morale (subjective impression) were observed. CONCLUSIONS: This model of process audit could be applied, with amendments to take local practice into account, in any microbiology laboratory. PMID:7665701
Adaptive computations of multispecies mixing between scramjet nozzle flows and hypersonic freestream
NASA Technical Reports Server (NTRS)
Baysa, Oktay; Engelund, Walter C.; Eleshaky, Mohamed E.; Pittman, James L.
1989-01-01
The objective of this paper is to compute the expansion of a supersonic flow through an internal-external nozzle and its viscous mixing with the hypersonic flow of air. The supersonic jet may be that of a multispecies gas other than air. Calculations are performed for one case where both flows are those of air, and another case where a mixture of Freon-12 and argon is discharged supersonically to mix with the hypersonic airflow. Comparisons are made between these two cases with respect to gas compositions, and fixed versus flow-adaptive grids. All the computational results are compared successfully with the wind-tunnel test results.
Multitasking a three-dimensional Navier-Stokes algorithm on the Cray-2
NASA Technical Reports Server (NTRS)
Swisshelm, Julie M.
1989-01-01
A three-dimensional computational aerodynamics algorithm has been multitasked for efficient parallel execution on the Cray-2. It provides a means for examining the multitasking performance of a complete CFD application code. An embedded zonal multigrid scheme is used to solve the Reynolds-averaged Navier-Stokes equations for an internal flow model problem. The explicit nature of each component of the method allows a spatial partitioning of the computational domain to achieve a well-balanced task load for MIMD computers with vector-processing capability. Experiments have been conducted with both two- and three-dimensional multitasked cases. The best speedup attained by an individual task group was 3.54 on four processors of the Cray-2, while the entire solver yielded a speedup of 2.67 on four processors for the three-dimensional case. The multiprocessing efficiency of various types of computational tasks is examined, performance on two Cray-2s with different memory access speeds is compared, and extrapolation to larger problems is discussed.
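The reported speedups can be read through Amdahl's law. The following back-of-the-envelope sketch (my framing, not the paper's analysis) estimates the parallelizable fraction implied by the overall 2.67x speedup on four Cray-2 processors:

```python
# Amdahl's-law sketch: speedup S on N processors satisfies
#   S = 1 / ((1 - p) + p / N)
# Solving for the parallelizable fraction p:
#   p = (1 - 1/S) / (1 - 1/N)
S, N = 2.67, 4
p = (1.0 - 1.0 / S) / (1.0 - 1.0 / N)
# p comes out near 0.83, i.e. roughly 83% of the solver's work
# parallelizes under this simple model.
```

This is consistent with the abstract's observation that individual task groups (speedup 3.54) parallelize better than the solver as a whole.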
Scattering Properties of Heterogeneous Mineral Particles with Absorbing Inclusions
NASA Technical Reports Server (NTRS)
Dlugach, Janna M.; Mishchenko, Michael I.
2015-01-01
We analyze the results of numerically exact computer modeling of scattering and absorption properties of randomly oriented poly-disperse heterogeneous particles obtained by placing microscopic absorbing grains randomly on the surfaces of much larger spherical mineral hosts or by imbedding them randomly inside the hosts. These computations are paralleled by those for heterogeneous particles obtained by fully encapsulating fractal-like absorbing clusters in the mineral hosts. All computations are performed using the superposition T-matrix method. In the case of randomly distributed inclusions, the results are compared with the outcome of Lorenz-Mie computations for an external mixture of the mineral hosts and absorbing grains. We conclude that internal aggregation can affect strongly both the integral radiometric and differential scattering characteristics of the heterogeneous particle mixtures.
Computed Tomography For Internal Inspection Of Castings
NASA Technical Reports Server (NTRS)
Hanna, Timothy L.
1995-01-01
Computed tomography is used to detect internal flaws in metal castings before machining and otherwise processing them into finished parts. This saves time and money otherwise wasted on machining and other processing of castings eventually rejected because of internal defects. Knowledge of internal defects gained by use of computed tomography also provides guidance for changes in foundry techniques, procedures, and equipment to minimize defects and reduce costs.
Marchetti, Michael A; Codella, Noel C F; Dusza, Stephen W; Gutman, David A; Helba, Brian; Kalloo, Aadi; Mishra, Nabin; Carrera, Cristina; Celebi, M Emre; DeFazio, Jennifer L; Jaimes, Natalia; Marghoob, Ashfaq A; Quigley, Elizabeth; Scope, Alon; Yélamos, Oriol; Halpern, Allan C
2018-02-01
Computer vision may aid in melanoma detection. We sought to compare melanoma diagnostic accuracy of computer algorithms to dermatologists using dermoscopic images. We conducted a cross-sectional study using 100 randomly selected dermoscopic images (50 melanomas, 44 nevi, and 6 lentigines) from an international computer vision melanoma challenge dataset (n = 379), along with individual algorithm results from 25 teams. We used 5 methods (nonlearned and machine learning) to combine individual automated predictions into "fusion" algorithms. In a companion study, 8 dermatologists classified the lesions in the 100 images as either benign or malignant. The average sensitivity and specificity of dermatologists in classification was 82% and 59%. At 82% sensitivity, dermatologist specificity was similar to the top challenge algorithm (59% vs. 62%, P = .68) but lower than the best-performing fusion algorithm (59% vs. 76%, P = .02). Receiver operating characteristic area of the top fusion algorithm was greater than the mean receiver operating characteristic area of dermatologists (0.86 vs. 0.71, P = .001). The dataset lacked the full spectrum of skin lesions encountered in clinical practice, particularly banal lesions. Readers and algorithms were not provided clinical data (eg, age or lesion history/symptoms). Results obtained using our study design cannot be extrapolated to clinical practice. Deep learning computer vision systems classified melanoma dermoscopy images with accuracy that exceeded some but not all dermatologists. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
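One of the non-learned fusion methods the study mentions, averaging individual predictions, can be sketched as follows (the probabilities below are invented for illustration, not taken from the challenge dataset):

```python
import numpy as np

# Hypothetical malignancy probabilities for 4 lesions from 3 automated
# classifiers (rows: algorithms, columns: lesions). Values are invented.
preds = np.array([
    [0.9, 0.2, 0.7, 0.1],
    [0.8, 0.3, 0.6, 0.2],
    [0.7, 0.1, 0.9, 0.4],
])

# Simple non-learned "fusion": average the per-algorithm predictions,
# then threshold at 0.5 for a benign/malignant call.
fusion = preds.mean(axis=0)
calls = fusion >= 0.5
```

Averaging tends to cancel uncorrelated errors across algorithms, which is one plausible reason the fusion algorithms outperformed the single best challenge entry.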
The International Symposium on Grids and Clouds
NASA Astrophysics Data System (ADS)
The International Symposium on Grids and Clouds (ISGC) 2012 will be held at Academia Sinica in Taipei from 26 February to 2 March 2012, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). 2012 marks the tenth anniversary of the ISGC, which over the last decade has tracked the convergence, collaboration, and innovation of individual researchers across the Asia-Pacific region into a coherent community. With the continuous support and dedication of the delegates, ISGC has provided the primary international distributed computing platform where distinguished researchers and collaboration partners from around the world share their knowledge and experiences. The last decade has seen the wide-scale emergence of e-Infrastructure as a critical asset for the modern e-Scientist. The emergence of large-scale research infrastructures and instruments has produced a torrent of electronic data, forcing a generational change in the scientific process and the mechanisms used to analyse the resulting data deluge. No longer can the processing of these vast amounts of data and the production of relevant scientific results be undertaken by a single scientist. Virtual Research Communities that span organisations around the world, through an integrated digital infrastructure that connects the trust and administrative domains of multiple resource providers, have become critical in supporting these analyses. Topics covered in ISGC 2012 include: High Energy Physics; Biomedicine & Life Sciences; Earth Science; Environmental Changes and Natural Disaster Mitigation; Humanities & Social Sciences; Operations & Management; Middleware & Interoperability; Security and Networking; Infrastructure Clouds & Virtualisation; Business Models & Sustainability; Data Management; Distributed Volunteer & Desktop Grid Computing; High Throughput Computing; and High Performance, Manycore & GPU Computing.
On-Orbit Performance Degradation of the International Space Station P6 Photovoltaic Arrays
NASA Technical Reports Server (NTRS)
Kerslake, Thomas W.; Gustafson, Eric D.
2003-01-01
This paper discusses the on-orbit performance and performance degradation of the International Space Station P6 solar array wings (SAWs) from the period of December 2000 through February 2003. Data selection considerations and data reduction methods are reviewed along with the approach for calculating array performance degradation based on measured string shunt current levels. Measured degradation rates are compared with those predicted by the computational tool SPACE and prior degradation rates measured with the same SAW technology on the Mir space station. Initial results show that the measured SAW short-circuit current is degrading 0.2 to 0.5 percent per year. This degradation rate is below the predicted rate of 0.8 percent per year and is well within the 3 percent estimated uncertainty in measured SAW current levels. General contributors to SAW degradation are briefly discussed.
Performance management of multiple access communication networks
NASA Astrophysics Data System (ADS)
Lee, Suk; Ray, Asok
1993-12-01
This paper focuses on conceptual design, development, and implementation of a performance management tool for computer communication networks to serve large-scale integrated systems. The objective is to improve the network performance in handling various types of messages by on-line adjustment of protocol parameters. The techniques of perturbation analysis of Discrete Event Dynamic Systems (DEDS), stochastic approximation (SA), and learning automata have been used in formulating the algorithm of performance management. The efficacy of the performance management tool has been demonstrated on a network testbed. The conceptual design presented in this paper offers a step forward to bridging the gap between management standards and users' demands for efficient network operations since most standards such as ISO (International Standards Organization) and IEEE address only the architecture, services, and interfaces for network management. The proposed concept of performance management can also be used as a general framework to assist design, operation, and management of various DEDS such as computer integrated manufacturing and battlefield C³ (Command, Control, and Communications).
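The stochastic approximation ingredient can be illustrated with a Robbins-Monro recursion. Everything below (the noisy measurement model and the numbers) is a hypothetical stand-in for the paper's on-line protocol-parameter tuning, not its actual algorithm:

```python
import random

# Robbins-Monro stochastic approximation sketch: adjust a protocol
# parameter theta on-line so that a noisy performance measurement
# g(theta) approaches a target. The measurement model is invented.
random.seed(1)

def measure(theta):
    # noisy observation of a performance metric, increasing in theta
    return 2.0 * theta + random.gauss(0.0, 0.1)

target = 6.0
theta = 0.0
for k in range(1, 5001):
    a_k = 1.0 / k                            # diminishing step size
    theta -= a_k * (measure(theta) - target)
# theta converges toward the root of g(theta) = target, here theta = 3
```

The diminishing step size a_k = 1/k is what lets the recursion average out measurement noise while still converging, the same trade-off an on-line network tuner faces.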
Development of probabilistic internal dosimetry computer code
NASA Astrophysics Data System (ADS)
Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki
2017-02-01
Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, we constructed a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods. Based on the developed system, we developed a probabilistic internal-dose-assessment code using MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose, in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations.
In cases of severe internal exposure, the causation probability of a deterministic health effect can be derived from the dose distribution, and a high statistical value (e.g., the 95th percentile of the distribution) can be used to determine the appropriate intervention. The distribution-based sensitivity analysis can also be used to quantify the contribution of each factor to the dose uncertainty, which is essential information for reducing and optimizing the uncertainty in the internal dose assessment. Therefore, the present study can contribute to retrospective dose assessment for accidental internal exposure scenarios, as well as to internal dose monitoring optimization and uncertainty reduction.
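The measurement-to-dose propagation described above can be illustrated with a minimal Monte Carlo sketch. All distributions and numerical values below are hypothetical placeholders, not data from the study (whose actual code is a MATLAB system combining Bayesian and Monte Carlo methods): a bioassay result, an intake retention fraction, and a dose coefficient are each sampled as lognormal variables and combined into a dose distribution, from which the percentiles named in the abstract are read off.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo trials

# Hypothetical inputs (illustrative only, not real dosimetric data):
measured_bq = 500.0              # urine bioassay result, Bq
gsd_meas = 1.2                   # geometric SD of measurement error (lognormal)
irf_mean, irf_gsd = 1e-3, 1.5    # intake retention fraction on the sampling day
dose_coeff, dc_gsd = 2e-8, 2.0   # committed effective dose coefficient, Sv/Bq

# Sample each uncertain component as a lognormal distribution
meas = measured_bq * rng.lognormal(0.0, np.log(gsd_meas), N)
irf = irf_mean * rng.lognormal(0.0, np.log(irf_gsd), N)
dc = dose_coeff * rng.lognormal(0.0, np.log(dc_gsd), N)

intake = meas / irf              # intake (Bq) propagated through the biokinetic factor
dose_sv = intake * dc            # committed effective dose, Sv

# Percentiles analogous to those reported in the abstract
pcts = np.percentile(dose_sv, [2.5, 5, 50, 95, 97.5])
```

A distribution-based sensitivity analysis of the kind the abstract describes would then vary one component's spread at a time and compare the resulting dose percentiles.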
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dennig, Yasmin
Sandia National Laboratories has a long history of significant contributions to the high performance computing community and industry. Our innovative computer architectures allowed the United States to become the first to break the teraFLOP barrier, propelling us into the international spotlight. Our advanced simulation and modeling capabilities have been integral to high-consequence US operations such as Operation Burnt Frost. Strong partnerships with industry leaders, such as Cray, Inc. and Goodyear, have enabled them to leverage our high performance computing (HPC) capabilities to gain a tremendous competitive edge in the marketplace. As part of our continuing commitment to providing modern computing infrastructure and systems in support of Sandia missions, we made a major investment in expanding Building 725 to serve as the new home of HPC systems at Sandia. Work is expected to be completed in 2018 and will result in a modern facility of approximately 15,000 square feet of computer center space. The facility will be ready to house the newest National Nuclear Security Administration/Advanced Simulation and Computing (NNSA/ASC) Prototype platform being acquired by Sandia, with delivery in late 2019 or early 2020. This new system will enable continuing advances by Sandia science and engineering staff in the areas of operating system R&D, operational cost effectiveness (power and innovative cooling technologies), user environment, and application code performance.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-14
... Computer Solutions and Software International, Inc., Dell Service Sales, Emdeon Business Services, KFORCE... workers from Computer Solutions and Software International, Inc., Dell Service Sales, Emdeon Business... from Computer Solutions and Software International, Inc., Dell Service Sales, Emdeon Business Services...
A network of spiking neurons for computing sparse representations in an energy efficient way
Hu, Tao; Genkin, Alexander; Chklovskii, Dmitri B.
2013-01-01
Computing sparse redundant representations is an important problem both in applied mathematics and neuroscience. In many applications, this problem must be solved in an energy efficient way. Here, we propose a hybrid distributed algorithm (HDA), which solves this problem on a network of simple nodes communicating via low-bandwidth channels. HDA nodes perform both gradient-descent-like steps on analog internal variables and coordinate-descent-like steps via quantized external variables communicated to each other. Interestingly, such an operation is equivalent to a network of integrate-and-fire neurons, suggesting that HDA may serve as a model of neural computation. We compare the numerical performance of HDA with existing algorithms and show that in the asymptotic regime the representation error of HDA decays with time, t, as 1/t. We show that HDA is stable against time-varying noise; specifically, the representation error decays as 1/√t for Gaussian white noise. PMID:22920853
Cone Beam Computed Tomography (CBCT) in the Field of Interventional Oncology of the Liver
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bapst, Blanche, E-mail: blanchebapst@hotmail.com; Lagadec, Matthieu, E-mail: matthieu.lagadec@bjn.aphp.fr; Breguet, Romain, E-mail: romain.breguet@hcuge.ch
Cone beam computed tomography (CBCT) is an imaging modality that provides computed tomographic images using a rotational C-arm equipped with a flat panel detector as part of the Angiography suite. The aim of this technique is to provide additional information to conventional 2D imaging to improve the performance of interventional liver oncology procedures (intraarterial treatments such as chemoembolization or selective internal radiation therapy, and percutaneous tumor ablation). CBCT provides accurate tumor detection and targeting, periprocedural guidance, and post-procedural evaluation of treatment success. This technique can be performed during intraarterial or intravenous contrast agent administration with various acquisition protocols to highlight liver tumors, liver vessels, or the liver parenchyma. The purpose of this review is to present an extensive overview of published data on CBCT in interventional oncology of the liver, for both percutaneous ablation and intraarterial procedures.
Development of a distributed-parameter mathematical model for simulation of cryogenic wind tunnels
NASA Technical Reports Server (NTRS)
Tripp, J. S.
1983-01-01
A one-dimensional distributed-parameter dynamic model of a cryogenic wind tunnel was developed which accounts for internal and external heat transfer, viscous momentum losses, and slotted-test-section dynamics. Boundary conditions imposed by liquid-nitrogen injection, gas venting, and the tunnel fan were included. A time-dependent numerical solution to the resultant set of partial differential equations was obtained on a CDC CYBER 203 vector-processing digital computer at a usable computational rate. Preliminary computational studies were performed by using parameters of the Langley 0.3-Meter Transonic Cryogenic Tunnel. Studies were performed by using parameters from the National Transonic Facility (NTF). The NTF wind-tunnel model was used in the design of control loops for Mach number, total temperature, and total pressure and for determining interactions between the control loops. It was employed in the application of optimal linear-regulator theory and eigenvalue-placement techniques to develop Mach number control laws.
A network of spiking neurons for computing sparse representations in an energy-efficient way.
Hu, Tao; Genkin, Alexander; Chklovskii, Dmitri B
2012-11-01
Computing sparse redundant representations is an important problem in both applied mathematics and neuroscience. In many applications, this problem must be solved in an energy-efficient way. Here, we propose a hybrid distributed algorithm (HDA), which solves this problem on a network of simple nodes communicating by low-bandwidth channels. HDA nodes perform both gradient-descent-like steps on analog internal variables and coordinate-descent-like steps via quantized external variables communicated to each other. Interestingly, the operation is equivalent to a network of integrate-and-fire neurons, suggesting that HDA may serve as a model of neural computation. We show that the numerical performance of HDA is on par with existing algorithms. In the asymptotic regime, the representation error of HDA decays with time, t, as 1/t. HDA is stable against time-varying noise; specifically, the representation error decays as 1/√t for Gaussian white noise.
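Dynamics of this kind are closely related to the locally competitive algorithm (LCA) for sparse coding: analog internal variables integrate a feedforward drive minus lateral inhibition, and thresholded outputs play the role of neural activations. The numpy sketch below illustrates that general idea only; it is not the authors' HDA, which additionally quantizes the variables communicated between nodes.

```python
import numpy as np

def lca_sparse_code(D, x, lam=0.1, dt=0.05, steps=500):
    """Minimal locally-competitive-style sparse coding sketch.

    D: (m, n) dictionary with unit-norm columns; x: (m,) signal.
    Internal variables u follow leaky gradient-like dynamics; outputs a
    are soft-thresholded versions of u (thresholded activations).
    """
    n = D.shape[1]
    G = D.T @ D - np.eye(n)      # lateral inhibition (Gram matrix minus self-terms)
    b = D.T @ x                  # feedforward drive
    u = np.zeros(n)
    for _ in range(steps):
        a = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)  # soft threshold
        u += dt * (b - u - G @ a)                          # leaky integration
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

# Toy check: recover a sparse code from an overcomplete random dictionary
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)
a_true = np.zeros(50)
a_true[[3, 17]] = [1.0, -0.8]
x = D @ a_true
a_hat = lca_sparse_code(D, x)
err = np.linalg.norm(D @ a_hat - x)   # small residual, slight soft-threshold bias
```

At the fixed point these dynamics solve a LASSO-type problem, so the reconstruction carries a small shrinkage bias proportional to the threshold `lam`.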
NASA Astrophysics Data System (ADS)
Papers are presented on ISDN, mobile radio systems and techniques for digital connectivity, centralized and distributed algorithms in computer networks, communications networks, quality assurance and impact on cost, adaptive filters in communications, the spread spectrum, signal processing, video communication techniques, and digital satellite services. Topics discussed include performance evaluation issues for integrated protocols, packet network operations, the computer network theory and multiple-access, microwave single sideband systems, switching architectures, fiber optic systems, wireless local communications, modulation, coding, and synchronization, remote switching, software quality, transmission, and expert systems in network operations. Consideration is given to wide area networks, image and speech processing, office communications application protocols, multimedia systems, customer-controlled network operations, digital radio systems, channel modeling and signal processing in digital communications, earth station/on-board modems, computer communications system performance evaluation, source encoding, compression, and quantization, and adaptive communications systems.
NASA Astrophysics Data System (ADS)
Tsai, Wen-Hsien; Chen, Hui-Chiao; Chang, Jui-Chu; Leu, Jun-Der; Chao Chen, Der; Purbokusumo, Yuyun
2015-10-01
In this study, the performance of the internal audit department (IAD) and its contribution to a company under enterprise resource planning (ERP) systems were examined. It is anticipated that this will provide insight into the factors perceived to be crucial to a company's effectiveness. A theoretical framework was developed and tested using a sample of Taiwanese companies. Using mail survey procedures, we elicited perceptions from key internal auditors about the ERP system and auditing software, as well as their opinions concerning the IAD's effectiveness and its contribution within a company. Data were analysed using partial least squares (PLS) regression to test the hypotheses. The study suggests that a firm can improve the performance of the IAD through an enterprise-wide, integrated, effective ERP system and appropriate auditing software. At the same time, the performance of the IAD can also contribute significantly to the company. The results also show that investments in computer-assisted auditing techniques (CAATs) are crucial due to their tremendous effectiveness in regard to the performance of the IAD and for the contributions CAATs can make to a company.
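As a rough illustration of the PLS step, the following is a bare-bones NIPALS-style PLS1 fit on synthetic survey-like data. The data, loadings, and component count are invented for the sketch; the study's own analysis used PLS on questionnaire constructs, and a real analysis would use an established implementation with cross-validated component selection.

```python
import numpy as np

def pls1(X, y, n_components=2):
    """Bare-bones PLS1 regression (NIPALS-style) for a single response.

    Returns (B, b0) such that y ≈ X @ B + b0. Illustrative sketch only.
    """
    xm, ym = X.mean(axis=0), y.mean()
    Xd, yd = X - xm, y - ym          # centered working copies, deflated below
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xd.T @ yd
        w /= np.linalg.norm(w)       # weight vector
        t = Xd @ w                   # score
        p = Xd.T @ t / (t @ t)       # X loading
        qq = (yd @ t) / (t @ t)      # y loading
        Xd = Xd - np.outer(t, p)     # deflate X
        yd = yd - qq * t             # deflate y
        W.append(w); P.append(p); q.append(qq)
    W, P, q = np.column_stack(W), np.column_stack(P), np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)   # regression coefficients
    return B, ym - xm @ B

# Synthetic "survey" data: 6 collinear items driven by 2 latent constructs
rng = np.random.default_rng(1)
t = rng.standard_normal((100, 2))                 # latent constructs
L = rng.standard_normal((2, 6))                   # item loadings
X = t @ L + 0.05 * rng.standard_normal((100, 6))  # observed items
y = 2.0 * t[:, 0] - 1.0 * t[:, 1] + 0.1 * rng.standard_normal(100)

B, b0 = pls1(X, y, n_components=2)
yhat = X @ B + b0
r2 = 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
```

Because the six items are nearly collinear (rank-2 plus noise), PLS with two components captures the predictive subspace where ordinary least squares on the raw items would be ill-conditioned.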
26 CFR 301.6231(a)(6)-1 - Computational adjustments.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 26 Internal Revenue 18 2013-04-01 2013-04-01 false Computational adjustments. 301.6231(a)(6)-1... Computational adjustments. (a) Changes in a partner's tax liability—(1) In general. A change in the tax... 63 of the Internal Revenue Code is made through a computational adjustment. A computational...
26 CFR 301.6231(a)(6)-1 - Computational adjustments.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 26 Internal Revenue 18 2012-04-01 2012-04-01 false Computational adjustments. 301.6231(a)(6)-1... Computational adjustments. (a) Changes in a partner's tax liability—(1) In general. A change in the tax... 63 of the Internal Revenue Code is made through a computational adjustment. A computational...
26 CFR 301.6231(a)(6)-1 - Computational adjustments.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 26 Internal Revenue 18 2011-04-01 2011-04-01 false Computational adjustments. 301.6231(a)(6)-1... Computational adjustments. (a) Changes in a partner's tax liability—(1) In general. A change in the tax... 63 of the Internal Revenue Code is made through a computational adjustment. A computational...
26 CFR 301.6231(a)(6)-1 - Computational adjustments.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 18 2010-04-01 2010-04-01 false Computational adjustments. 301.6231(a)(6)-1... Computational adjustments. (a) Changes in a partner's tax liability—(1) In general. A change in the tax... 63 of the Internal Revenue Code is made through a computational adjustment. A computational...
26 CFR 301.6231(a)(6)-1 - Computational adjustments.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 26 Internal Revenue 18 2014-04-01 2014-04-01 false Computational adjustments. 301.6231(a)(6)-1... Computational adjustments. (a) Changes in a partner's tax liability—(1) In general. A change in the tax... 63 of the Internal Revenue Code is made through a computational adjustment. A computational...
ICCE/ICCAI 2000 Full & Short Papers (Computer-Assisted Language Learning).
ERIC Educational Resources Information Center
2000
This document contains the following full and short papers on computer-assisted language learning (CALL) from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Computer-Assisted English Abstract Words Learning Environment on the Web" (Wenli Tsou and…
Detailed Multi-dimensional Modeling of Direct Internal Reforming Solid Oxide Fuel Cells.
Tseronis, K; Fragkopoulos, I S; Bonis, I; Theodoropoulos, C
2016-06-01
Fuel flexibility is a significant advantage of solid oxide fuel cells (SOFCs) and can be attributed to their high operating temperature. Here we consider a direct internal reforming SOFC setup in which a separate fuel reformer is not required. We construct a multidimensional, detailed model of a planar SOFC, where mass transport in the fuel channel is modeled using the Stefan-Maxwell model, whereas mass transport within the porous electrodes is simulated using the Dusty-Gas model. The resulting highly nonlinear model is built in COMSOL Multiphysics, a commercial computational fluid dynamics package, and is validated against experimental data from the literature. A number of parametric studies are performed to obtain insights into the behavior and efficiency of the direct internal reforming SOFC system and to aid the design procedure. It is shown that internal reforming results in a temperature drop close to the inlet and that the direct internal reforming SOFC performance can be enhanced by increasing the operating temperature. It is also observed that decreases in the inlet temperature result in smoother temperature profiles and reduced thermal gradients. Furthermore, the direct internal reforming SOFC performance was found to be affected by the thickness of the electrochemically active anode catalyst layer, although not always substantially, owing to the counterbalancing behavior of the activation and ohmic overpotentials.
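The Dusty-Gas treatment of porous-electrode transport mentioned above combines molecular and Knudsen diffusion in series and scales the result by porosity and tortuosity. A minimal sketch of that combination, with illustrative parameter values not taken from the paper:

```python
import math

def knudsen_diffusivity(pore_d_m, T, M_kg_mol):
    """Knudsen diffusivity (m^2/s): D_K = (d_p / 3) * sqrt(8RT / (pi * M))."""
    R = 8.314  # J/(mol K)
    return (pore_d_m / 3.0) * math.sqrt(8.0 * R * T / (math.pi * M_kg_mol))

def effective_diffusivity(D_bin, D_kn, porosity, tortuosity):
    """Dusty-Gas-style series (harmonic) combination of molecular and
    Knudsen diffusion, scaled by the porosity/tortuosity ratio."""
    D_pore = 1.0 / (1.0 / D_bin + 1.0 / D_kn)
    return (porosity / tortuosity) * D_pore

# Illustrative anode values: H2 in H2O at 1073 K, 1-micron pores
D_bin = 8e-4                                          # bulk binary diffusivity, m^2/s
D_kn = knudsen_diffusivity(1e-6, 1073.0, 2.016e-3)    # H2 molar mass 2.016 g/mol
D_eff = effective_diffusivity(D_bin, D_kn, porosity=0.35, tortuosity=3.5)
```

The harmonic combination means the slower mechanism dominates: in fine pores Knudsen diffusion limits transport, which is one reason the electrochemically active layer thickness matters for concentration losses.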
Human computer interface guide, revision A
NASA Technical Reports Server (NTRS)
1993-01-01
The Human Computer Interface Guide, SSP 30540, is a reference document for the information systems within the Space Station Freedom Program (SSFP). The Human Computer Interface Guide (HCIG) provides guidelines for the design of computer software that affects human performance, specifically, the human-computer interface. This document contains an introduction and subparagraphs on SSFP computer systems, users, and tasks; guidelines for interactions between users and the SSFP computer systems; human factors evaluation and testing of the user interface system; and example specifications. The contents of this document are intended to be consistent with the tasks and products to be prepared by NASA Work Package Centers and SSFP participants as defined in SSP 30000, Space Station Program Definition and Requirements Document. The Human Computer Interface Guide shall be implemented on all new SSFP contractual and internal activities and shall be included in any existing contracts through contract changes. This document is under the control of the Space Station Control Board, and any changes or revisions will be approved by the deputy director.
Schönberger, Joana; Erdelt, Kurt-Jürgen; Bäumer, Daniel; Beuer, Florian
2017-11-01
The purpose of this in vitro study was to compare the precision of fit of frameworks milled from semi-sintered regular zirconia and high-translucent (HT) zirconia blanks, fabricated with two different CAD/CAM systems. Three-unit posterior fixed dental prosthesis (FDP) frameworks were fabricated for standardized dies (n = 11) with two different laboratory computer-aided design (CAD)/computer-aided manufacturing (CAM) systems (Cercon/Ceramill). The replica technique was used to evaluate the marginal and internal fit under an optical microscope. Evaluation of the data was performed according to prior studies at a significance level of 5%. The systems showed a statistically significant influence on the internal fit of the frameworks (p ≤ 0.001) and on the marginal fit (p < 0.001). The type of material showed no influence on the marginal fit for the Cercon system (p = 0.636), nor on the marginal fit (p = 0.064) or the internal fit (p = 0.316) for the Ceramill system, while regular zirconia from Cercon showed higher internal values than HT zirconia (p = 0.016). Both investigated systems showed clinically acceptable values within the limitations of this in vitro study. However, one showed less internal accuracy when regular zirconia was used.
NASA Astrophysics Data System (ADS)
Wang, Tianmin; Gao, Fei; Hu, Wangyu; Lai, Wensheng; Lu, Guang-Hong; Zu, Xiaotao
2009-09-01
The Ninth International Conference on Computer Simulation of Radiation Effects in Solids (COSIRES 2008) was hosted by Beihang University in Beijing, China from 12 to 17 October 2008. Started in 1992 in Berlin, Germany, this conference series has been held biennially in Santa Barbara, CA, USA (1994); Guildford, UK (1996); Okayama, Japan (1998); State College, PA, USA (2000); Dresden, Germany (2002); Helsinki, Finland (2004); and Richland, WA, USA (2006). The COSIRES conferences are the foremost international forum on the theory, development, and application of advanced computer simulation methods and algorithms to achieve fundamental understanding and predictive modeling of the interaction of energetic particles and clusters with solids. As the proceedings of the COSIRES conferences show, these computer simulation methods and algorithms have proven very useful for the study of fundamental radiation effect processes that are not easily accessible by experimental methods, owing to their small time and length scales. Moreover, with advances in computing power, the methods have developed remarkably across scales ranging from mesoscale to atomistic, and even down to the electronic level, as well as in the coupling of the different scales. They are now becoming increasingly applicable to materials processing and performance prediction in advanced engineering and energy-production technologies.
The Fifth Generation. An annotated bibliography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bramer, M.; Bramer, D.
The Japanese Fifth Generation Computer System project constitutes a radical reappraisal of the functions which an advanced computer system should be able to perform, the programming languages needed to implement such functions, and the machine architectures suitable for supporting the chosen languages. The book guides the reader through the ever-growing literature on the project, and the international responses, including the United Kingdom Government's Alvey Program and the MCC Program in the United States. Evaluative abstracts are given, including books, journal articles, unpublished reports and material at both overview and technical levels.
1990-12-01
small powerful computers to businesses and homes on an international scale (29:74). Relatively low cost, high computing power, and ease of operation were...is performed. In large part, today's AF IM professional has been inundated with powerful new technologies which were rapidly introduced and inserted...state that, "In a survey of five years of MIS research, we found the average levels of statistical power to be relatively low" (5:104). In their own
NASA Astrophysics Data System (ADS)
Sawicki, J.; Siedlaczek, P.; Staszczyk, A.
2018-03-01
A numerical three-dimensional model for computing the residual stresses generated in the cross section of 42CrMo4 steel after nitriding is presented. The diffusion process is analyzed by the finite-element method. The internal stresses are computed using the obtained profile of the distribution of the nitrogen concentration. The special features of the intricate geometry of the treated articles, including edges and angles, are considered. A comparative analysis of the simulation results and the experimental residual-stress measurements, obtained by the Waisman-Philips method, is performed.
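The diffusion step of such a simulation can be sketched with a simple explicit finite-difference solve of Fick's second law in one dimension. All values here are illustrative and uncalibrated; the study's actual model is a three-dimensional finite-element analysis, and the subsequent residual-stress computation from the nitrogen profile is not shown.

```python
import numpy as np

def diffuse_1d(c_surface, D, depth_m, t_s, nx=200):
    """Explicit finite-difference solve of Fick's second law, c_t = D c_xx,
    with a fixed surface concentration and zero initial bulk concentration.
    Returns the depth grid and the final concentration profile."""
    dx = depth_m / (nx - 1)
    dt = 0.4 * dx * dx / D          # FTCS stability requires D*dt/dx^2 <= 0.5
    steps = int(t_s / dt)
    c = np.zeros(nx)
    c[0] = c_surface                 # nitriding atmosphere at the surface
    for _ in range(steps):
        c[1:-1] += (D * dt / dx**2) * (c[2:] - 2 * c[1:-1] + c[:-2])
        c[0] = c_surface             # Dirichlet boundary at the surface
        # c[-1] stays 0 (far-field bulk)
    x = np.linspace(0.0, depth_m, nx)
    return x, c

# Illustrative numbers (not calibrated to nitriding of 42CrMo4):
# D = 1e-12 m^2/s, 0.5 mm domain, 10 h treatment
x, c = diffuse_1d(c_surface=1.0, D=1e-12, depth_m=5e-4, t_s=10 * 3600.0)
```

In the full model, the resulting concentration profile drives a lattice-expansion misfit from which the finite-element solver computes the residual stress field, with stress concentrations at edges and corners.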
NASA Technical Reports Server (NTRS)
Goglia, G. L.; Spiegler, E.
1977-01-01
The research activity focused on two main tasks: (1) the further development of the SCRAM program and, in particular, the addition of a procedure for modeling the mechanism of the internal adjustment process of the flow, in response to the imposed thermal load across the combustor and (2) the development of a numerical code for the computation of the variation of concentrations throughout a turbulent field, where finite-rate reactions occur. The code also includes an estimation of the effect of the phenomenon called 'unmixedness'.
ERIC Educational Resources Information Center
Dasuki, Salihu Ibrahim; Ogedebe, Peter; Kanya, Rislana Abdulazeez; Ndume, Hauwa; Makinde, Julius
2015-01-01
Efforts are being made by universities in developing countries to ensure that their graduates are not left behind in the competitive global information society; thus, they have adopted international computing curricula for their computing degree programs. However, adopting these international curricula seems to be very challenging for developing countries…
26 CFR 302.1-4 - Computation of taxes.
Code of Federal Regulations, 2010 CFR
2010-04-01
... ADMINISTRATION TAXES UNDER THE INTERNATIONAL CLAIMS SETTLEMENT ACT, AS AMENDED AUGUST 9, 1955 § 302.1-4 Computation of taxes. (a) Detail of employees of the Internal Revenue Service. The Commissioner will detail... 26 Internal Revenue 18 2010-04-01 2010-04-01 false Computation of taxes. 302.1-4 Section 302.1-4...
26 CFR 1.584-3 - Computation of common trust fund income.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 7 2010-04-01 2010-04-01 true Computation of common trust fund income. 1.584-3 Section 1.584-3 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Banking Institutions § 1.584-3 Computation of common trust...
P43-S Computational Biology Applications Suite for High-Performance Computing (BioHPC.net)
Pillardy, J.
2007-01-01
One of the challenges of high-performance computing (HPC) is user accessibility. At the Cornell University Computational Biology Service Unit, which is also a Microsoft HPC institute, we have developed a computational biology application suite that allows researchers from biological laboratories to submit their jobs to the parallel cluster through an easy-to-use Web interface. Through this system, we are providing users with popular bioinformatics tools including BLAST, HMMER, InterproScan, and MrBayes. The system is flexible and can be easily customized to include other software. It is also scalable; the installation on our servers currently processes approximately 8500 job submissions per year, many of them requiring massively parallel computations. It also has a built-in user management system, which can limit software and/or database access to specified users. TAIR, the major database of the plant model organism Arabidopsis, and SGN, the international tomato genome database, are both using our system for storage and data analysis. The system consists of a Web server running the interface (ASP.NET C#), Microsoft SQL server (ADO.NET), compute cluster running Microsoft Windows, ftp server, and file server. Users can interact with their jobs and data via a Web browser, ftp, or e-mail. The interface is accessible at http://cbsuapps.tc.cornell.edu/.
NASA Technical Reports Server (NTRS)
Chow, Edward T.; Schatzel, Donald V.; Whitaker, William D.; Sterling, Thomas
2008-01-01
A Spaceborne Processor Array in Multifunctional Structure (SPAMS) can lower the total mass of the electronic and structural overhead of spacecraft, resulting in reduced launch costs, while increasing the science return through dynamic onboard computing. SPAMS integrates the multifunctional structure (MFS) and the Gilgamesh Memory, Intelligence, and Network Device (MIND) multi-core in-memory computer architecture into a single-system super-architecture. This transforms every inch of a spacecraft into a sharable, interconnected, smart computing element to increase computing performance while simultaneously reducing mass. The MIND in-memory architecture provides a foundation for high-performance, low-power, and fault-tolerant computing. The MIND chip has an internal structure that includes memory, processing, and communication functionality. The Gilgamesh is a scalable system comprising multiple MIND chips interconnected to operate as a single, tightly coupled, parallel computer. The array of MIND components shares a global, virtual name space for program variables and tasks that are allocated at run time to the distributed physical memory and processing resources. Individual processor- memory nodes can be activated or powered down at run time to provide active power management and to configure around faults. A SPAMS system is comprised of a distributed Gilgamesh array built into MFS, interfaces into instrument and communication subsystems, a mass storage interface, and a radiation-hardened flight computer.
True Aneurysm of the Left Internal Thoracic Artery.
Ouldsalek, El Hadj; El Fatemi, Bouthianah; Bakkali, Tarek; El Idrissi, Radouane; El Khaloufi, Samir; Lekehal, Brahim; Sefiani, Yasser; El Mesnaoui, Abbas; Bensaid, Younès
2016-02-01
Aneurysms of the internal thoracic artery (ITA) are rare and can have many etiologies. Hyperflow is an exceptional etiology. We report the case of a 56-year-old woman who presented with a stress-induced ischemia of the left arm. Computed tomography angiography showed occlusion of the subclavian artery and a true aneurysm of the ITA. The ITA aneurysm was resected without restoration of the ITA continuity and a carotid-subclavian bypass was performed. Pathological examination of the aneurysm sac was not specific. Copyright © 2016 Elsevier Inc. All rights reserved.
[Elective reconstruction of thoracoabdominal aortic aneurysm type IV by transabdominal approach].
Marjanović, Ivan; Jevtić, Miodrag; Misović, Sidor; Sarac, Momir
2012-01-01
Thoracoabdominal aortic aneurysm (TAAA) type IV represents an aortic dilatation from the level of the diaphragmatic hiatus to the iliac artery branches, including the visceral branches of the aorta. In the traditional procedure of TAAA type IV repair, the body is opened using thoracotomy and laparotomy in order to provide adequate exposure of the descending thoracic and abdominal aorta for safe aortic reconstruction. We report a 71-year-old man with elective reconstruction of a TAAA type IV performed by the transabdominal approach. Computed tomography angiography revealed a TAAA type IV with a diameter of 62 mm in the region of the celiac trunk and superior mesenteric artery branching, and a largest diameter of 75 mm at the infrarenal aortic level. The patient's comorbidities included chronic obstructive pulmonary disease and hypertension, for which he had been treated for a prolonged period. In preparation for the planned aortic reconstruction, asymptomatic carotid disease (occlusion of the left internal carotid artery and subtotal stenosis of the right internal carotid artery) was diagnosed. Within the same intervention, percutaneous transluminal angioplasty with stent placement in the right internal carotid artery was performed. Under general endotracheal anesthesia and epidural analgesia, aortic reconstruction with a 24 mm tubular Dacron graft was performed via the transabdominal approach, and the visceral aortic branches were reimplanted into the graft. The postoperative course was uneventful, and the patient was discharged on postoperative day 17. Control computed tomography angiography performed three months after the operation showed the patient's vascular state to be in order. The complete transabdominal approach to TAAA type IV represents an appropriate substitute for the thoracoabdominal approach, without compromising patient safety. This approach is less traumatic, especially in patients with impaired pulmonary function, because it avoids thoracotomy and the complications that could follow it.
Experimental performance of three design factors for ventral nozzles for SSTOVL aircraft
NASA Technical Reports Server (NTRS)
Esker, Barbara S.; Perusek, Gail P.
1992-01-01
An experimental study of three variations of a ventral nozzle system for supersonic short-takeoff and vertical-landing (SSTOVL) aircraft was performed at the NASA LeRC Powered Lift Facility. The test results include the effects of an annular duct flow into the ventral duct, a blocked tailpipe, and a short ventral duct length. An analytical study was also performed on the short ventral duct configuration using the PARC3D computational fluid dynamics code. Data presented include pressure losses, thrust and flow performance, internal flow visualization, and pressure distributions at the exit plane of the ventral nozzle.
SRM Internal Flow Test and Computational Fluid Dynamic Analysis. Volume 1; Major Task Summaries
NASA Technical Reports Server (NTRS)
Whitesides, R. Harold; Dill, Richard A.; Purinton, David C.
1995-01-01
During the four-year period of performance for NASA contract NAS8-39095, ERC performed a wide variety of tasks to support the design and continued development of new and existing solid rocket motors and the resolution of operational problems associated with existing solid rocket motors at NASA MSFC. This report summarizes the support provided to NASA MSFC during the contractual period of performance. The report is divided into three main sections. The first section presents summaries of the major tasks performed. These tasks are grouped into three major categories: full scale motor analysis, subscale motor analysis, and cold flow analysis. The second section includes summaries describing the computational fluid dynamics (CFD) tasks performed. The third section, the appendices of the report, presents detailed descriptions of the analysis efforts as well as published papers, memoranda, and final reports associated with specific tasks. These appendices are referenced in the summaries. The subsection numbers in the three sections correspond to the same topics for direct cross-referencing.
CIFAC '92: First International Symposium on Computers in Furniture and Cabinet
Janice K. Wiedenbeck
1992-01-01
(Book Review) The First International Symposium on Computers in Furniture and Cabinet Manufacturing was sponsored by the Wood Machining Institute in cooperation with Furniture Design and Manufacturing Magazine. The symposium was designed to "provide an international forum for the exchange of the latest information on the use of computers in furniture and cabinet...
26 CFR 301.6014-1 - Income tax return-tax not computed by taxpayer.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 18 2010-04-01 2010-04-01 false Income tax return-tax not computed by taxpayer. 301.6014-1 Section 301.6014-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY... Records § 301.6014-1 Income tax return—tax not computed by taxpayer. For provisions relating to the...
Mangrulkar, Rajesh S.; Watt, John M.; Chapman, Chris M.; Judge, Richard D.; Stern, David T.
2001-01-01
In order to test the hypothesis that self-study with a CD-ROM-based cardiac auscultation tool would enhance knowledge and skills, we conducted a controlled trial with internal medicine residents and evaluated their performance on a test before and after exposure to the tool. Both the intervention and control groups improved their auscultation knowledge and skills scores. However, subjects in the CD-ROM group had significantly greater improvements in skills, knowledge, and total scores than those not exposed to the intervention (all p < 0.001). Therefore, protected time for internal medicine residents to use this multimedia computer program enhanced both facets of cardiac auscultation.
Computation of multi-dimensional viscous supersonic jet flow
NASA Technical Reports Server (NTRS)
Kim, Y. N.; Buggeln, R. C.; Mcdonald, H.
1986-01-01
A new method has been developed for two- and three-dimensional computations of viscous supersonic flows with embedded subsonic regions adjacent to solid boundaries. The approach employs a reduced form of the Navier-Stokes equations which allows solution as an initial-boundary value problem in space, using an efficient noniterative forward marching algorithm. Numerical instability associated with forward marching algorithms for flows with embedded subsonic regions is avoided by approximation of the reduced form of the Navier-Stokes equations in the subsonic regions of the boundary layers. Supersonic and subsonic portions of the flow field are simultaneously calculated by a consistently split linearized block implicit computational algorithm. The results of computations for a series of test cases relevant to internal supersonic flow are presented and compared with data. Comparisons between data and computation are in general excellent, indicating that the computational technique has great promise as a tool for calculating supersonic flow with embedded subsonic regions. Finally, a User's Manual is presented for the computer code used to perform the calculations.
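The forward space-marching idea described above can be illustrated with a minimal sketch: treat the streamwise coordinate as a time-like direction and march an initial-boundary value problem downstream. The model equation, grid, and viscosity below are invented for illustration and are far simpler than the reduced Navier-Stokes equations the abstract refers to.

```python
import numpy as np

# Model equation u_x = nu * u_yy: the streamwise derivative plays the role of
# a time derivative, so the profile is marched downstream station by station.
nu = 0.01                      # illustrative "viscosity"
ny, dy = 51, 0.02              # wall-normal grid
dx = 0.4 * dy**2 / nu          # explicit-marching stability limit
u = np.ones(ny)
u[0] = 0.0                     # no-slip wall at y = 0; freestream u = 1 above

for _ in range(200):           # march 200 stations downstream
    u_new = u.copy()
    u_new[1:-1] = u[1:-1] + nu * dx / dy**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    u_new[0], u_new[-1] = 0.0, 1.0   # wall and freestream boundary conditions
    u = u_new
```

Each pass needs only the previous station's profile, which is what makes noniterative forward marching so much cheaper than a global iterative solve.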
A Living Library: New Model for Global Electronic Interactivity and Networking in the Garden.
ERIC Educational Resources Information Center
Sherk, Bonnie
1995-01-01
Describes the Living Library, an idea to create a network of international cultural parks in different cities of the world using new communications technologies on-line in a garden setting, bringing the humanities, sciences, and social sciences to life through plants, visual and performed artworks, lectures, and computer and on-line satellite…
26 CFR 7.999-1 - Computation of the international boycott factor.
Code of Federal Regulations, 2011 CFR
2011-04-01
... country, (iii) Securities by a dealer to a beneficial owner that is a resident of that country (but only... country, (iv) Intangible property (other than securities) in that country, (v) Securities by a dealer to a... numerator by reason of this subparagraph. (8) Payroll paid or accrued for services performed in a country...
ERIC Educational Resources Information Center
Troadec, Bertrand; Zarhbouch, Benaissa; Frede, Valerie
2009-01-01
The non-computational brand of cognitivism is based on the premise that performances, including those of children, are generated by mental models or representations, i.e., "internal" resources. The sociocultural approach, on the other hand, regards context, i.e., an "external" resource, as the chief means of elaborating…
Rule-driven defect detection in CT images of hardwood logs
Erol Sarigul; A. Lynn Abbott; Daniel L. Schmoldt
2000-01-01
This paper deals with automated detection and identification of internal defects in hardwood logs using computed tomography (CT) images. We have developed a system that employs artificial neural networks to perform tentative classification of logs on a pixel-by-pixel basis. This approach achieves a high level of classification accuracy for several hardwood species (...
ERIC Educational Resources Information Center
Blinn Coll., Brenham, TX.
Blinn College final course grade distributions are summarized for spring 1990 to 1994 in this four-part report. Section I presents tables of final grade distributions by campus and course in accounting; agriculture; anthropology; biology; business; chemistry; child development; communications; computer science; criminal justice; drama; emergency…
Radio Synthesis Imaging - A High Performance Computing and Communications Project
NASA Astrophysics Data System (ADS)
Crutcher, Richard M.
The National Science Foundation has funded a five-year High Performance Computing and Communications project at the National Center for Supercomputing Applications (NCSA) for the direct implementation of several of the computing recommendations of the Astronomy and Astrophysics Survey Committee (the "Bahcall report"). This paper is a summary of the project goals and a progress report. The project will implement a prototype of the next generation of astronomical telescope systems - remotely located telescopes connected by high-speed networks to very high performance, scalable architecture computers and on-line data archives, which are accessed by astronomers over Gbit/sec networks. Specifically, a data link has been installed between the BIMA millimeter-wave synthesis array at Hat Creek, California and NCSA at Urbana, Illinois for real-time transmission of data to NCSA. Data are automatically archived, and may be browsed and retrieved by astronomers using the NCSA Mosaic software. In addition, an on-line digital library of processed images will be established. BIMA data will be processed on a very high performance distributed computing system, with I/O, user interface, and most of the software system running on the NCSA Convex C3880 supercomputer or Silicon Graphics Onyx workstations connected by HiPPI to the high performance, massively parallel Thinking Machines Corporation CM-5. The very computationally intensive algorithms for calibration and imaging of radio synthesis array observations will be optimized for the CM-5 and new algorithms which utilize the massively parallel architecture will be developed. Code running simultaneously on the distributed computers will communicate using the Data Transport Mechanism developed by NCSA. 
The project will also use the BLANCA Gbit/s testbed network between Urbana and Madison, Wisconsin to connect an Onyx workstation in the University of Wisconsin Astronomy Department to the NCSA CM-5, for development of long-distance distributed computing. Finally, the project is developing 2D and 3D visualization software as part of the international AIPS++ project. This research and development project is being carried out by a team of experts in radio astronomy, algorithm development for massively parallel architectures, high-speed networking, database management, and Thinking Machines Corporation personnel. The development of this complete software, distributed computing, and data archive and library solution to the radio astronomy computing problem will advance our expertise in high performance computing and communications technology and the application of these techniques to astronomical data processing.
Park, Sung Hwan; Lee, Ji Min; Kim, Jong Shik
2013-01-01
An irregular performance of a mechanical-type constant power regulator is considered. In order to find the cause of an irregular discharge flow at the cut-off pressure area, modeling and numerical simulations are performed to observe the dynamic behavior of internal parts of the constant power regulator system for a swashplate-type axial piston pump. The commercial numerical simulation software AMESim is applied to model the mechanical-type regulator with hydraulic pump and to simulate its performance. The validity of the simulation model of the constant power regulator system is verified by comparing simulation results with experiments. In order to find the cause of the irregular performance of the mechanical-type constant power regulator system, the behavior of main components such as the spool, sleeve, and counterbalance piston is investigated using computer simulation. The shape modification of the counterbalance piston is proposed to improve the undesirable performance of the mechanical-type constant power regulator. The performance improvement is verified by computer simulation using AMESim software. PMID:24282389
NASA Technical Reports Server (NTRS)
Barnett, Alan R.; Widrick, Timothy W.; Ludwiczak, Damian R.
1996-01-01
Solving for dynamic responses of free-free launch vehicle/spacecraft systems acted upon by buffeting winds is commonly performed throughout the aerospace industry. Due to the unpredictable nature of this wind loading event, these problems are typically solved using frequency response random analysis techniques. To generate dynamic responses for spacecraft with statically-indeterminate interfaces, spacecraft contractors prefer to develop models which have response transformation matrices developed for mode acceleration data recovery. This method transforms spacecraft boundary accelerations and displacements into internal responses. Unfortunately, standard MSC/NASTRAN modal frequency response solution sequences cannot be used to combine acceleration- and displacement-dependent responses required for spacecraft mode acceleration data recovery. External user-written computer codes can be used with MSC/NASTRAN output to perform such combinations, but these methods can be labor- and computer-resource intensive. Taking advantage of the analytical and computer resource efficiencies inherent within MSC/NASTRAN, a DMAP Alter has been developed to combine acceleration- and displacement-dependent modal frequency responses for performing spacecraft mode acceleration data recovery. The Alter has been used successfully to efficiently solve a common aerospace buffeting wind analysis.
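The mode acceleration combination described above amounts to multiplying boundary accelerations and displacements by two transformation matrices and summing the results at each frequency. A small numerical sketch follows; the matrix sizes, values, and names (ATM, DTM) are invented for illustration, not taken from any spacecraft model.

```python
import numpy as np

# Hypothetical transformation matrices: 3 internal responses, 2 boundary DOFs.
rng = np.random.default_rng(0)
ATM = rng.standard_normal((3, 2))   # acceleration-dependent transformation
DTM = rng.standard_normal((3, 2))   # displacement-dependent transformation

freqs = np.linspace(1.0, 50.0, 5)   # excitation frequencies, Hz
# Complex boundary displacement responses at each frequency (illustrative).
d_bnd = rng.standard_normal((5, 2)) + 1j * rng.standard_normal((5, 2))

internal = np.empty((5, 3), dtype=complex)
for i, f in enumerate(freqs):
    omega = 2.0 * np.pi * f
    a_bnd = -(omega ** 2) * d_bnd[i]            # harmonic motion: a = -omega^2 d
    internal[i] = ATM @ a_bnd + DTM @ d_bnd[i]  # mode acceleration combination
```

The combination itself is a pair of matrix products per frequency line; the difficulty the abstract addresses is performing it inside the MSC/NASTRAN solution sequence rather than in external code.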
Cuiling, Liu; Liyuan, Yang; Xu, Gao; Hong, Shang
2016-06-01
This study aimed to investigate the influence of coping material and porcelain firing on the marginal and internal fit of computer-aided design/computer-aided manufacturing (CAD/CAM) of zirconia ceramic implant- and titanium ceramic implant-supported crowns. Zirconia ceramic implant (group A, n = 8) and titanium metal ceramic implant-supported crowns (group B, n = 8) were produced from copings using the CAD/CAM system. The marginal and internal gaps of the copings and crowns were measured by using a light-body silicone replica technique combined with micro-computed tomography scanning to obtain a three-dimensional image. Marginal gap (MG), horizontal marginal discrepancy (HMD), and axial wall (AW) were measured. Statistical analyses were performed using SPSS 17.0. Prior to porcelain firing, the measurements for MG, HMD, and AW of copings in group A were significantly larger than those in group B (P < 0.05). After porcelain firing, the measurements for MG of crowns in group A were smaller than those in group B (P < 0.05), whereas HMD and AW showed no significant difference between the two groups (P > 0.05). Porcelain firing significantly reduced MG (P < 0.05) in group A but significantly increased MG, HMD, and AW in group B (P < 0.05). HMD and AW were not influenced by porcelain firing in group A (P > 0.05). The marginal fits of CAD/CAM zirconia ceramic implant-supported crowns were superior to those of CAD/CAM titanium ceramic-supported crowns. The fits of both the CAD/CAM zirconia ceramic implant- and titanium ceramic implant-supported crowns were obviously influenced by porcelain firing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eck, Brendan L.; Fahmi, Rachid; Miao, Jun
2015-10-15
Purpose: Aims in this study are to (1) develop a computational model observer which reliably tracks the detectability of human observers in low dose computed tomography (CT) images reconstructed with knowledge-based iterative reconstruction (IMR™, Philips Healthcare) and filtered back projection (FBP) across a range of independent variables, (2) use the model to evaluate detectability trends across reconstructions and make predictions of human observer detectability, and (3) perform human observer studies based on model predictions to demonstrate applications of the model in CT imaging. Methods: Detectability (d′) was evaluated in phantom studies across a range of conditions. Images were generated using a numerical CT simulator. Trained observers performed 4-alternative forced choice (4-AFC) experiments across dose (1.3, 2.7, 4.0 mGy), pin size (4, 6, 8 mm), contrast (0.3%, 0.5%, 1.0%), and reconstruction (FBP, IMR), at fixed display window. A five-channel Laguerre–Gauss channelized Hotelling observer (CHO) was developed with internal noise added to the decision variable and/or to channel outputs, creating six different internal noise models. Semianalytic internal noise computation was tested against Monte Carlo and used to accelerate internal noise parameter optimization. Model parameters were estimated from all experiments at once using maximum likelihood on the probability correct, P_C. Akaike information criterion (AIC) was used to compare models of different orders. The best model was selected according to AIC and used to predict detectability in blended FBP-IMR images, analyze trends in IMR detectability improvements, and predict dose savings with IMR. Predicted dose savings were compared against 4-AFC study results using physical CT phantom images. Results: Detection in IMR was greater than FBP in all tested conditions.
The CHO with internal noise proportional to channel output standard deviations, Model-k4, showed the best trade-off between fit and model complexity according to AICc. With parameters fixed, the model reasonably predicted detectability of human observers in blended FBP-IMR images. Semianalytic internal noise computation gave results equivalent to Monte Carlo, greatly speeding parameter estimation. Using Model-k4, the authors found an average detectability improvement of 2.7 ± 0.4 times that of FBP. IMR showed greater improvements in detectability with larger signals and relatively consistent improvements across signal contrast and x-ray dose. In the phantom tested, Model-k4 predicted an 82% dose reduction compared to FBP, verified with physical CT scans at 80% reduced dose. Conclusions: IMR improves detectability over FBP and may enable significant dose reductions. A channelized Hotelling observer with internal noise proportional to channel output standard deviation agreed well with human observers across a wide range of variables, even across reconstructions with drastically different image characteristics. Utility of the model observer was demonstrated by predicting the effect of image processing (blending), analyzing detectability improvements with IMR across dose, size, and contrast, and in guiding real CT scan dose reduction experiments. Such a model observer can be applied in optimizing parameters in advanced iterative reconstruction algorithms as well as guiding dose reduction protocols in physical CT experiments.
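The core of a channelized Hotelling observer with channel-output internal noise can be sketched in a few lines. The channel outputs are simulated directly here (in the study they come from applying the five Laguerre-Gauss channels to signal-present/absent CT images), and the mean shift and internal-noise level are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_ch, n_img = 5, 400

mean_sig = np.array([1.0, 0.5, 0.2, 0.1, 0.05])       # assumed channel-mean shift
v_sig = rng.standard_normal((n_img, n_ch)) + mean_sig  # signal-present outputs
v_bkg = rng.standard_normal((n_img, n_ch))             # signal-absent outputs

dmu = v_sig.mean(0) - v_bkg.mean(0)
S = 0.5 * (np.cov(v_sig.T) + np.cov(v_bkg.T))          # pooled channel covariance

# Hotelling detectability d' = sqrt(dmu^T S^-1 dmu), without and with internal
# noise of variance sigma_int^2 added independently to each channel output.
d_ideal = float(np.sqrt(dmu @ np.linalg.solve(S, dmu)))
sigma_int = 1.0
S_int = S + sigma_int**2 * np.eye(n_ch)
d_internal = float(np.sqrt(dmu @ np.linalg.solve(S_int, dmu)))
```

Adding internal noise lowers the model's detectability toward human levels, which is how such noise parameters are tuned against observer-study data.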
45 CFR 265.7 - How will we determine if the State is meeting the quarterly reporting requirements?
Code of Federal Regulations, 2012 CFR
2012-10-01
... computational errors and are internally consistent (e.g., items that should add to totals do so); (3) The State... from computational errors and are internally consistent (e.g., items that should add to totals do so... from computational errors and are internally consistent (e.g., items that should add to totals do so...
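The internal-consistency requirement quoted above (items that should add to totals do so) is straightforward to automate. The field names below are invented for illustration, not taken from the actual TANF reporting forms.

```python
# Check that component items add to their reported total (hypothetical fields).
report = {
    "families_receiving_assistance": 120,
    "two_parent_families": 30,
    "one_parent_families": 85,
    "no_parent_families": 5,
}

def is_internally_consistent(r):
    parts = ("two_parent_families", "one_parent_families", "no_parent_families")
    return sum(r[k] for k in parts) == r["families_receiving_assistance"]
```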
Multi-Functional UV-Visible-IR Nanosensors Devices and Structures
2015-04-29
Dual-Gate MOSFET System, Proceedings of the International Workshop on Computational Electronics, Nara, Japan, Society of Micro- and Nanoelectronics ...International Workshop on Computational Electronics, Nara, Japan, Society of Micro- and Nanoelectronics , 216-217 (2013); ISBN 978-3-901578-26-7 M. S...Raman Spectroscopy, Proceedings of the International Workshop on Computational Electronics, Nara, Japan, Society of Micro- and Nanoelectronics , 198
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-19
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-841] Certain Computers and Computer Peripheral Devices and Components Thereof and Products Containing the Same Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby...
NASA Technical Reports Server (NTRS)
Heeg, Jennifer; Chwalowski, Pawel; Wieseman, Carol D.; Florance, Jennifer P.; Schuster, David M.
2013-01-01
The Aeroelastic Prediction Workshop brought together an international community of computational fluid dynamicists as a step in defining the state of the art in computational aeroelasticity. The Rectangular Supercritical Wing (RSW) was chosen as the first configuration to study due to its geometric simplicity, perceived simple flow field at transonic conditions and availability of an experimental data set containing forced oscillation response data. Six teams performed analyses of the RSW; they used Reynolds-Averaged Navier-Stokes flow solvers exercised assuming that the wing had a rigid structure. Both steady-state and forced oscillation computations were performed by each team. The results of these calculations were compared with each other and with the experimental data. The steady-state results from the computations capture many of the flow features of a classical supercritical airfoil pressure distribution. The most dominant feature of the oscillatory results is the upper surface shock dynamics. Substantial variations were observed among the computational solutions as well as differences relative to the experimental data. Contributing issues to these differences include substantial wind tunnel wall effects and diverse choices in the analysis parameters.
Analysis of impact of general-purpose graphics processor units in supersonic flow modeling
NASA Astrophysics Data System (ADS)
Emelyanov, V. N.; Karpenko, A. G.; Kozelkov, A. S.; Teterina, I. V.; Volkov, K. N.; Yalozo, A. V.
2017-06-01
Computational methods are widely used in prediction of complex flowfields associated with off-normal situations in aerospace engineering. Modern graphics processing units (GPU) provide architectures and new programming models that make it possible to harness their large processing power and to design computational fluid dynamics (CFD) simulations at both high performance and low cost. Possibilities of the use of GPUs for the simulation of external and internal flows on unstructured meshes are discussed. The finite volume method is applied to solve three-dimensional unsteady compressible Euler and Navier-Stokes equations on unstructured meshes with high resolution numerical schemes. CUDA technology is used for programming implementation of parallel computational algorithms. Solutions of some benchmark test cases on GPUs are reported, and the computed results are compared with experimental and computational data. Approaches to optimization of the CFD code related to the use of different types of memory are considered. The speedup of the solution on GPUs with respect to the solution on a central processing unit (CPU) is compared. Performance measurements show that the numerical schemes developed achieve a 20-50× speedup on GPU hardware compared to the CPU reference implementation. The results obtained provide a promising perspective for designing a GPU-based software framework for applications in CFD.
Mexican Space Weather Service (SCIESMEX)
NASA Astrophysics Data System (ADS)
Gonzalez-Esparza, A.; De la Luz, V.; Mejia-Ambriz, J. C.; Aguilar-Rodriguez, E.; Corona-Romero, P.; Gonzalez, L. X.
2015-12-01
Recent modifications of the Civil Protection Law in Mexico now include specific mentions of space hazards and space weather phenomena. During the last few years, the UN has promoted international cooperation on space weather awareness, studies, and monitoring. Internal and external conditions motivated the creation of a Space Weather Service in Mexico (SCIESMEX). The SCIESMEX (www.sciesmex.unam.mx) is operated by the Geophysics Institute at the National Autonomous University of Mexico (UNAM). The UNAM has the experience of operating several critical national services, including the National Seismological Service (SSN), and has a well-established scientific group with expertise in space physics and solar-terrestrial phenomena. The SCIESMEX is also related to the recent creation of the Mexican Space Agency (AEM). The project combines a network of different ground instruments covering solar, interplanetary, geomagnetic, and ionospheric observations. The SCIESMEX already operates computing infrastructure running the web application, a virtual observatory, and a high-performance computing server to run numerical models. SCIESMEX participates in the International Space Environment Services (ISES) and in the Inter-programme Coordination Team on Space Weather (ICTSW) of the World Meteorological Organization (WMO).
Methods and computer readable medium for improved radiotherapy dosimetry planning
Wessol, Daniel E.; Frandsen, Michael W.; Wheeler, Floyd J.; Nigg, David W.
2005-11-15
Methods and computer readable media are disclosed for ultimately developing a dosimetry plan for a treatment volume irradiated during radiation therapy with a radiation source concentrated internally within a patient or incident from an external beam. The dosimetry plan is available in near "real-time" because of the novel geometric model construction of the treatment volume which in turn allows for rapid calculations to be performed for simulated movements of particles along particle tracks therethrough. The particles are exemplary representations of alpha, beta or gamma emissions emanating from an internal radiation source during various radiotherapies, such as brachytherapy or targeted radionuclide therapy, or they are exemplary representations of high-energy photons, electrons, protons or other ionizing particles incident on the treatment volume from an external source. In a preferred embodiment, a medical image of a treatment volume irradiated during radiotherapy having a plurality of pixels of information is obtained.
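The particle-track idea behind such dosimetry planning can be caricatured in a few lines of Monte Carlo: each simulated emission travels a random, exponentially distributed distance in a random direction from an internal source and deposits its energy where it stops. The geometry, mean free path, and energy values below are invented; a real planning system tracks the physics far more carefully.

```python
import numpy as np

rng = np.random.default_rng(42)
grid = np.zeros((21, 21, 21))          # dose tally; voxel edge = 1 unit
source = np.array([10.0, 10.0, 10.0])  # internal point source near the center
mean_free_path = 3.0                   # in voxel edges (assumed)
energy = 1.0                           # energy deposited per particle (arbitrary)

for _ in range(20000):
    direction = rng.standard_normal(3)
    direction /= np.linalg.norm(direction)        # isotropic emission
    stop = source + rng.exponential(mean_free_path) * direction
    idx = np.floor(stop).astype(int)
    if np.all(idx >= 0) and np.all(idx < grid.shape):
        grid[tuple(idx)] += energy                # deposit at end of track
```

The resulting tally falls off with distance from the source; near-real-time planning, as described above, hinges on making this kind of transport calculation fast over the patient-specific geometric model.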
Internal stresses at the crystalline scale in textured ZrO2 films before lateral cracking
NASA Astrophysics Data System (ADS)
Berdin, Clotilde; Pascal, Serge; Tang, Yan
2015-05-01
Zirconium oxide layers are subjected to internal stresses that play a role in damage of the layer. Lateral cracking is often observed during oxidation of Zr alloys. In this paper, we investigated the influence of the microstresses at the crystalline scale on lateral cracking within a growing oxide on a plane substrate. A parametric study was carried out taking into account the crystallographic texture of the oxide and the presence of tetragonal zirconia at the metal-oxide interface. Macroscopic computations and polycrystalline aggregate computations were performed. The result indicating the (1 0 6̄) fiber texture as the most favorable was recovered. It was found that under macroscopic compressive stresses parallel to the plane metal-oxide interface, positive microstresses perpendicular to the interface develop. They can trigger lateral cracking, and the phenomenon is promoted by the presence of tetragonal zirconia at the metal-oxide interface.
Skylab extravehicular mobility unit thermal simulator
NASA Technical Reports Server (NTRS)
Hixon, C. W.; Phillips, M. A.
1974-01-01
The analytical methods, thermal model, and user's instructions for the Skylab Extravehicular Mobility Unit (SEMU) routine are presented. This digital computer program was developed for detailed thermal performance predictions of the SEMU on the NASA-JSC Univac 1108 computer system. It accounts for conductive, convective, and radiant heat transfer as well as fluid flow and special component characterization. The program provides thermal performance predictions for a 967 node thermal model in one thirty-sixth (1/36) of mission time when operated at a calculating interval of three minutes (mission time). The program has the operational flexibility to: (1) accept card or magnetic tape data input for the thermal model describing the SEMU structure, fluid systems, crewman and component performance, (2) accept card and/or magnetic tape input of internally generated heat and heat influx from the space environment, and (3) output tabular or plotted histories of temperature, flow rates, and other parameters describing system operating modes.
Giancarlo, Raffaele; Scaturro, Davide; Utro, Filippo
2008-10-29
Inferring cluster structure in microarray datasets is a fundamental task for the so-called -omic sciences. It is also a fundamental question in Statistics, Data Analysis and Classification, in particular with regard to the prediction of the number of clusters in a dataset, usually established via internal validation measures. Despite the wealth of internal measures available in the literature, new ones have been recently proposed, some of them specifically for microarray data. We consider five such measures: Clest, Consensus (Consensus Clustering), FOM (Figure of Merit), Gap (Gap Statistics) and ME (Model Explorer), in addition to the classic WCSS (Within Cluster Sum-of-Squares) and KL (Krzanowski and Lai index). We perform extensive experiments on six benchmark microarray datasets, using both Hierarchical and K-means clustering algorithms, and we provide an analysis assessing both the intrinsic ability of a measure to predict the correct number of clusters in a dataset and its merit relative to the other measures. We pay particular attention both to precision and speed. Moreover, we also provide various fast approximation algorithms for the computation of Gap, FOM and WCSS. The main result is a hierarchy of those measures in terms of precision and speed, highlighting some of their merits and limitations not reported before in the literature. Based on our analysis, we draw several conclusions for the use of those internal measures on microarray data. We report the main ones. Consensus is by far the best performer in terms of predictive power and remarkably algorithm-independent. Unfortunately, on large datasets, it may be of no use because of its non-trivial computer time demand (weeks on a state of the art PC). FOM is the second best performer although, quite surprisingly, it may not be competitive in this scenario: it has essentially the same predictive power of WCSS but it is from 6 to 100 times slower in time, depending on the dataset. 
The approximation algorithms for the computation of FOM, Gap and WCSS perform very well, i.e., they are faster while still granting a very close approximation of FOM and WCSS. The approximation algorithm for the computation of Gap deserves to be singled out, since it has a predictive power far better than Gap, it is competitive with the other measures, and it is at least two orders of magnitude faster than Gap. Another important novel conclusion that can be drawn from our analysis is that all the measures we have considered show severe limitations on large datasets, either due to computational demand (Consensus, as already mentioned, Clest and Gap) or to lack of precision (all of the other measures, including their approximations). The software and datasets are available under the GNU GPL on the supplementary material web page.
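As a concrete anchor for the measures discussed, WCSS is simple to compute once a clustering is fixed. The toy data and the bare-bones k-means below are illustrative only, not the benchmark datasets or clustering algorithms of the study.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    # Bare-bones Lloyd's algorithm, for illustration only.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

def wcss(X, labels, centers):
    # Within Cluster Sum-of-Squares: squared distances to assigned centers.
    return float(((X - centers[labels]) ** 2).sum())

# Two well-separated blobs: WCSS drops sharply when k reaches the true value.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
scores = {k: wcss(X, *kmeans(X, k)) for k in (1, 2, 3)}
```

The sharp drop from k=1 to k=2 followed by a flat tail is the "elbow" that WCSS-based prediction of the number of clusters looks for; the internal measures compared in the study refine this basic idea in various ways.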
ERIC Educational Resources Information Center
Coleman, LaToya O.; Gibson, Philip; Cotten, Shelia R.; Howell-Moroney, Michael; Stringer, Kristi
2016-01-01
This study examines the relationship between internal barriers, professional development, and computer integration outcomes among a sample of fourth- and fifth-grade teachers in an urban, low-income school district in the Southeastern United States. Specifically, we examine the impact of teachers' computer attitudes, computer anxiety, and computer…
Complications in transorbital penetrating injury by bamboo branch: A case report.
Feng, Lei; He, Xiaojun; Chen, Jie; Ni, Shuang; Jiang, Biao; Hua, Jian-Ming
2018-05-01
Wooden transorbital penetrating injury is an uncommon and serious trauma that may cause multiple complications. Here we describe a 62-year-old Chinese woman with a transorbital penetrating injury caused by a long bamboo branch. Computed tomography scan and magnetic resonance imaging showed the presence of a wooden foreign body. Cerebrovascular digital subtraction angiography and temporary balloon occlusion were performed under general anesthesia. Anti-inflammatory therapy was subsequently administered. Retention of the wooden foreign body, orbital cellulitis, and a traumatic aneurysm of the right internal carotid artery were diagnosed 1 month later. Coil embolization of the right internal carotid artery aneurysm and endoscopic sinus surgery were then performed, and the postoperative condition was monitored and recorded. Complications of penetrating transorbital injury may occur because of retained wooden foreign bodies near the intracranial arteries. Appropriate surgical intervention and special attention are warranted in this kind of trauma.
Image Processor Electronics (IPE): The High-Performance Computing System for NASA SWIFT Mission
NASA Technical Reports Server (NTRS)
Nguyen, Quang H.; Settles, Beverly A.
2003-01-01
Gamma Ray Bursts (GRBs) are believed to be the most powerful explosions that have occurred in the Universe since the Big Bang and are a mystery to the scientific community. Swift, a NASA mission that includes international participation, was designed and built in preparation for a 2003 launch to help determine the origin of Gamma Ray Bursts. Locating the position in the sky where a burst originates requires intensive computing, because the duration of a GRB can range between a few milliseconds up to approximately a minute. The instrument data system must constantly accept multiple images representing large regions of the sky that are generated by sixteen gamma ray detectors operating in parallel. It then must process the received images very quickly in order to determine the existence of possible gamma ray bursts and their locations. The high-performance instrument data computing system that accomplishes this is called the Image Processor Electronics (IPE). The IPE was designed, built and tested by NASA Goddard Space Flight Center (GSFC) in order to meet these challenging requirements. The IPE is a small-size, low-power, high-performance computing system for space applications. This paper addresses the system implementation and the system hardware architecture of the IPE. The paper concludes with the IPE system performance that was measured during end-to-end system testing.
Detailed Multi‐dimensional Modeling of Direct Internal Reforming Solid Oxide Fuel Cells
Tseronis, K.; Fragkopoulos, I.S.; Bonis, I.
2016-01-01
Abstract Fuel flexibility is a significant advantage of solid oxide fuel cells (SOFCs) and can be attributed to their high operating temperature. Here we consider a direct internal reforming solid oxide fuel cell setup in which a separate fuel reformer is not required. We construct a multidimensional, detailed model of a planar solid oxide fuel cell, where mass transport in the fuel channel is modeled using the Stefan-Maxwell model, whereas the mass transport within the porous electrodes is simulated using the Dusty-Gas model. The resulting highly nonlinear model is built into COMSOL Multiphysics, a commercial computational fluid dynamics software, and is validated against experimental data from the literature. A number of parametric studies are performed to obtain insights into the behavior and efficiency of the direct internal reforming solid oxide fuel cell system, to aid the design procedure. It is shown that internal reforming results in a temperature drop close to the inlet and that the direct internal reforming solid oxide fuel cell performance can be enhanced by increasing the operating temperature. It is also observed that decreases in the inlet temperature result in smoother temperature profiles and in the formation of reduced thermal gradients. Furthermore, the direct internal reforming solid oxide fuel cell performance was found to be affected by the thickness of the electrochemically-active anode catalyst layer, although not always substantially, due to the counter-balancing behavior of the activation and ohmic overpotentials. PMID:27570502
Performance analysis of SA-3 missile second stage
NASA Technical Reports Server (NTRS)
Helmy, A. M.
1981-01-01
One SA-3 missile was disassembled. The constituents of the second stage were thoroughly investigated for geometrical details. The second stage slotted composite propellant grain was subjected to mechanical properties testing, physiochemical analyses, and burning rate measurements at different conditions. To determine the propellant performance parameters, the slotted composite propellant grain was machined into a set of small-size tubular grains. These grains were fired in a small size rocket motor with a set of interchangeable nozzles with different throat diameters. The firings were carried out at three different conditions. The data from test motor firings, physiochemical properties of the propellant, burning rate measurement results and geometrical details of the second stage motor, were used as input data in a computer program to compute the internal ballistic characteristics of the second stage.
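The equilibrium internal-ballistics calculation referred to above balances the mass generated by the burning surface against the mass expelled through the nozzle throat; with Saint-Robert's burning-rate law r = a·pⁿ this gives a closed-form chamber pressure. All numbers below are illustrative, not SA-3 data.

```python
# Saint-Robert's law r = a * p**n; steady operation requires
#   rho_p * a * p**n * A_b  (mass generated)  =  p * A_t / c_star  (mass out),
# which solves to p_c = (rho_p * a * c_star * A_b / A_t) ** (1 / (1 - n)).
a = 5.0e-5        # burn-rate coefficient, m/s per Pa**n (assumed)
n = 0.35          # pressure exponent, dimensionless (assumed)
rho_p = 1700.0    # propellant density, kg/m^3
c_star = 1500.0   # characteristic velocity, m/s
A_b = 0.8         # burning surface area, m^2
A_t = 0.002       # nozzle throat area, m^2

p_c = (rho_p * a * c_star * A_b / A_t) ** (1.0 / (1.0 - n))  # chamber pressure, Pa
```

With these made-up inputs the chamber pressure comes out in the tens of MPa, the expected order of magnitude for a solid motor; the small-motor firings described above are exactly how coefficients like a and n are pinned down experimentally.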
Robot, computer problem solving system
NASA Technical Reports Server (NTRS)
Becker, J. D.
1972-01-01
The development of a computer problem-solving system is reported that considers physical problems faced by an artificial robot moving around in a complex environment. Fundamental interaction constraints with a real environment are simulated for the robot by visual scan and the creation of an internal environmental model. The programming system used in constructing the problem-solving system for the simulated robot and its simulated world environment is outlined, together with the task that the system is capable of performing. A very general framework for understanding the relationship between an observed behavior and an adequate description of that behavior is included.
NASA Technical Reports Server (NTRS)
1990-01-01
Structural Reliability Consultants' computer program creates graphic plots showing the statistical parameters of glue laminated timbers, or 'glulam.' The company president, Dr. Joseph Murphy, read in NASA Tech Briefs about work related to analysis of Space Shuttle surface tile strength performed for Johnson Space Center by Rockwell International Corporation. Analysis led to a theory of 'consistent tolerance bounds' for statistical distributions, applicable in industrial testing where statistical analysis can influence product development and use. Dr. Murphy then obtained the Tech Support Package that covers the subject in greater detail. The TSP became the basis for Dr. Murphy's computer program PC-DATA, which he is marketing commercially.
Erba, Alessandro; Caglioti, Dominique; Zicovich-Wilson, Claudio Marcelo; Dovesi, Roberto
2017-02-15
Two alternative approaches for the quantum-mechanical calculation of the nuclear-relaxation term of the elastic and piezoelectric tensors of crystalline materials are illustrated and their computational aspects discussed: (i) a numerical approach based on the geometry optimization of atomic positions at strained lattice configurations, and (ii) a quasi-analytical approach based on the evaluation of the force- and displacement-response internal-strain tensors combined with the interatomic force-constant matrix. The two schemes are compared with regard to both computational accuracy and performance. The latter approach, not being affected by the many numerical parameters and procedures of a typical quasi-Newton geometry optimizer, constitutes a more reliable and robust means of evaluating such properties, at a reduced computational cost for most crystalline systems. © 2016 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Dunne, Matthew J.
2011-01-01
The development of computer software as a tool to generate visual displays has led to an overall expansion of automated computer generated images in the aerospace industry. These visual overlays are generated by combining raw data with pre-existing data on the object or objects being analyzed on the screen. The National Aeronautics and Space Administration (NASA) uses this computer software to generate on-screen overlays when a Visiting Vehicle (VV) is berthing with the International Space Station (ISS). In order for Mission Control Center personnel to be a contributing factor in the VV berthing process, computer software similar to that on the ISS must be readily available on the ground to be used for analysis. In addition, this software must perform engineering calculations and save data for further analysis.
Global information infrastructure.
Lindberg, D A
1994-01-01
The High Performance Computing and Communications Program (HPCC) is a multiagency federal initiative under the leadership of the White House Office of Science and Technology Policy, established by the High Performance Computing Act of 1991. It has been assigned a critical role in supporting the international collaboration essential to science and to health care. Goals of the HPCC are to extend USA leadership in high performance computing and networking technologies; to improve technology transfer for economic competitiveness, education, and national security; and to provide a key part of the foundation for the National Information Infrastructure. The first component of the National Institutes of Health to participate in the HPCC, the National Library of Medicine (NLM), recently issued a solicitation for proposals to address a range of issues, from privacy to 'testbed' networks, 'virtual reality,' and more. These efforts will build upon the NLM's extensive outreach program and other initiatives, including the Unified Medical Language System (UMLS), MEDLARS, and Grateful Med. New Internet search tools are emerging, such as Gopher and 'Knowbots'. Medicine will succeed in developing future intelligent agents to assist in utilizing computer networks. Our ability to serve patients is so often restricted by lack of information and knowledge at the time and place of medical decision-making. The new technologies, properly employed, will also greatly enhance our ability to serve the patient.
On the sensitivity of complex, internally coupled systems
NASA Technical Reports Server (NTRS)
Sobieszczanski-Sobieski, Jaroslaw
1988-01-01
A method is presented for computing sensitivity derivatives with respect to independent (input) variables for complex, internally coupled systems, while avoiding the cost and inaccuracy of finite differencing performed on the entire system analysis. The method entails two alternative algorithms: the first is based on the classical implicit function theorem formulated on residuals of governing equations, and the second develops the system sensitivity equations in a new form using the partial (local) sensitivity derivatives of the output with respect to the input of each part of the system. A few application examples are presented to illustrate the discussion.
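A minimal numerical sketch of the second algorithm (assembling system sensitivities from the local partial derivatives of each subsystem) on a made-up linear two-subsystem example; the functions and coefficients are illustrative, not from the paper:

```python
import numpy as np

# Toy coupled system (linear, so the coupled solution is exact):
#   y1 = f1(x, y2) = x + 0.5*y2
#   y2 = f2(x, y1) = 2*x - 0.3*y1
# System sensitivity equations: (I - J) * dy/dx = df/dx,
# where J collects the cross-coupling partials dfi/dyj and
# df/dx the local partials of each subsystem output w.r.t. x.
J = np.array([[0.0, 0.5],       # df1/dy1, df1/dy2
              [-0.3, 0.0]])     # df2/dy1, df2/dy2
dfdx = np.array([1.0, 2.0])     # df1/dx, df2/dx
dydx = np.linalg.solve(np.eye(2) - J, dfdx)

# Cross-check: the converged coupled solution is y1 = 2x/1.15,
# y2 = 1.7x/1.15, so the exact sensitivities are (2, 1.7)/1.15.
```

Finite differencing the full system would require re-converging y1 and y2 at each perturbed x; the sensitivity equations need only the local partials, which is the cost and accuracy advantage the abstract describes.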
ERIC Educational Resources Information Center
Chen, Gwo-Dong; Chen, Chun-Hsiang; Wang, Chin-Yeh; Li, Liang-Yi
2012-01-01
The article compares two international conferences, "The International Educational Technology Conference" (IETC, 2011) and "The International Conference on Computers in Education" (ICCE, 2010), along various dimensions. The comparison is expected to suggest a better approach for future IETC and ICCE meetings. (Contains 4…
NASA Technical Reports Server (NTRS)
1983-01-01
Missions to be performed, station operations and functions to be carried out, and technologies anticipated during the time frame of the space station were examined in order to determine the scope of the overall information management system for the space station. This system comprises: (1) the data management system which includes onboard computer related hardware and software required to assume and exercise control of all activities performed on the station; (2) the communication system for both internal and external communications; and (3) the ground segment. Techniques used to examine the information system from a functional and performance point of view are described as well as the analyses performed to derive the architecture of both the onboard data management system and the system for internal and external communications. These architectures are then used to generate a conceptual design of the onboard elements in order to determine the physical parameters (size/weight/power) of the hardware and software. The ground segment elements are summarized.
NASA Astrophysics Data System (ADS)
1983-04-01
Missions to be performed, station operations and functions to be carried out, and technologies anticipated during the time frame of the space station were examined in order to determine the scope of the overall information management system for the space station. This system comprises: (1) the data management system which includes onboard computer related hardware and software required to assume and exercise control of all activities performed on the station; (2) the communication system for both internal and external communications; and (3) the ground segment. Techniques used to examine the information system from a functional and performance point of view are described as well as the analyses performed to derive the architecture of both the onboard data management system and the system for internal and external communications. These architectures are then used to generate a conceptual design of the onboard elements in order to determine the physical parameters (size/weight/power) of the hardware and software. The ground segment elements are summarized.
Web-Based Integrated Research Environment for Aerodynamic Analyses and Design
NASA Astrophysics Data System (ADS)
Ahn, Jae Wan; Kim, Jin-Ho; Kim, Chongam; Cho, Jung-Hyun; Hur, Cinyoung; Kim, Yoonhee; Kang, Sang-Hyun; Kim, Byungsoo; Moon, Jong Bae; Cho, Kum Won
e-AIRS[1,2], an abbreviation of ‘e-Science Aerospace Integrated Research System,' is a virtual organization designed to support aerodynamic flow analyses in aerospace engineering using the e-Science environment. As the first step toward a virtual aerospace engineering organization, e-AIRS intends to give a full support of aerodynamic research process. Currently, e-AIRS can handle both the computational and experimental aerodynamic research on the e-Science infrastructure. In detail, users can conduct a full CFD (Computational Fluid Dynamics) research process, request wind tunnel experiment, perform comparative analysis between computational prediction and experimental measurement, and finally, collaborate with other researchers using the web portal. The present paper describes those services and the internal architecture of the e-AIRS system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duran, Felicia Angelica; Waymire, Russell L.
2013-10-01
Sandia National Laboratories (SNL) is providing training and consultation activities on security planning and design for the Korea Hydro and Nuclear Power Central Research Institute (KHNP-CRI). As part of this effort, SNL performed a literature review on computer security requirements, guidance and best practices that are applicable to an advanced nuclear power plant. This report documents the review of reports generated by SNL and other organizations [U.S. Nuclear Regulatory Commission, Nuclear Energy Institute, and International Atomic Energy Agency] related to protection of information technology resources, primarily digital controls and computer resources and their data networks. Copies of the key documents have also been provided to KHNP-CRI.
DOE Office of Scientific and Technical Information (OSTI.GOV)
FINNEY, Charles E A; Edwards, Kevin Dean; Stoyanov, Miroslav K
2015-01-01
Combustion instabilities in dilute internal combustion engines are manifest in cyclic variability (CV) in engine performance measures such as integrated heat release or shaft work. Understanding the factors leading to CV is important in model-based control, especially with high dilution, where experimental studies have demonstrated that deterministic effects can become more prominent. Observation of enough consecutive engine cycles for significant statistical analysis is standard in experimental studies but is largely wanting in numerical simulations because of the computational time required to compute hundreds or thousands of consecutive cycles. We have proposed and begun implementation of an alternative approach to allow rapid simulation of long series of engine dynamics, based on a low-dimensional mapping of ensembles of single-cycle simulations which maps input parameters to output engine performance. This paper details the use of Titan at the Oak Ridge Leadership Computing Facility to investigate CV in a gasoline direct-injected spark-ignited engine with a moderately high rate of dilution achieved through external exhaust gas recirculation. The CONVERGE CFD software was used to perform single-cycle simulations with imposed variations of operating parameters and boundary conditions selected according to a sparse-grid sampling of the parameter space. Using an uncertainty quantification technique, the sampling scheme is chosen similarly to a design-of-experiments grid but uses functions designed to minimize the number of samples required to achieve a desired degree of accuracy. The simulations map input parameters to output metrics of engine performance for a single cycle, and by mapping over a large parameter space, results can be interpolated from within that space. This interpolation scheme forms the basis for a low-dimensional metamodel which can be used to mimic the dynamical behavior of corresponding high-dimensional simulations.
Simulations of high-EGR spark-ignition combustion cycles within a parametric sampling grid were performed and analyzed statistically, and sensitivities of the physical factors leading to high CV are presented. With these results, the prospect of producing low-dimensional metamodels to describe engine dynamics at any point in the parameter space will be discussed. Additionally, modifications to the methodology to account for nondeterministic effects in the numerical solution environment are proposed.
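The metamodel idea above can be sketched in a few lines: treat a made-up smooth function of two operating parameters as the expensive single-cycle simulation, sample it on a coarse grid, fit a quadratic response surface, and interpolate engine output at unsampled points. The study itself used a sparse-grid sampler from uncertainty quantification, not this full factorial grid, and the stand-in function is entirely hypothetical:

```python
import numpy as np

def cycle_sim(egr, spark):
    # Hypothetical stand-in for a CONVERGE single-cycle run:
    # "integrated heat release" vs. EGR fraction and a normalized
    # spark-timing parameter.
    return 1.0 - 1.5 * (egr - 0.2) ** 2 + 0.3 * spark - 0.4 * spark ** 2

# Sample the "simulation" on a coarse 5x5 parameter grid.
E, S = np.meshgrid(np.linspace(0.0, 0.4, 5), np.linspace(0.0, 1.0, 5))
e, s, y = E.ravel(), S.ravel(), cycle_sim(E, S).ravel()

# Least-squares fit of a full quadratic response surface in (egr, spark).
X = np.column_stack([np.ones_like(e), e, s, e * e, s * s, e * s])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def metamodel(egr, spark):
    # Cheap surrogate: evaluate the fitted surface anywhere in the space.
    return coef @ np.array([1.0, egr, spark, egr**2, spark**2, egr * spark])
```

Because the stand-in is itself quadratic, the surrogate reproduces it exactly; for real cycle simulations the surrogate is only as good as the sampling density and the smoothness of the response.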
Computational hybrid anthropometric paediatric phantom library for internal radiation dosimetry
NASA Astrophysics Data System (ADS)
Xie, Tianwu; Kuster, Niels; Zaidi, Habib
2017-04-01
Hybrid computational phantoms combine voxel-based and simplified equation-based modelling approaches to provide unique advantages and more realism for the construction of anthropomorphic models. In this work, a methodology and C++ code are developed to generate hybrid computational phantoms covering statistical distributions of body morphometry in the paediatric population. The paediatric phantoms of the Virtual Population Series (IT’IS Foundation, Switzerland) were modified to match target anthropometric parameters, including body mass, body length, standing height and sitting height/stature ratio, determined from reference databases of the National Centre for Health Statistics and the National Health and Nutrition Examination Survey. The phantoms were selected as representative anchor phantoms for the newborn, 1, 2, 5, 10 and 15 years-old children, and were subsequently remodelled to create 1100 female and male phantoms with 10th, 25th, 50th, 75th and 90th body morphometries. Evaluation was performed qualitatively using 3D visualization and quantitatively by analysing internal organ masses. Overall, the newly generated phantoms appear very reasonable and representative of the main characteristics of the paediatric population at various ages and for different genders, body sizes and sitting stature ratios. The mass of internal organs increases with height and body mass. The comparison of organ masses of the heart, kidney, liver, lung and spleen with published autopsy and ICRP reference data for children demonstrated that they follow the same trend when correlated with age. The constructed hybrid computational phantom library opens up the prospect of comprehensive radiation dosimetry calculations and risk assessment for the paediatric population of different age groups and diverse anthropometric parameters.
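One simple way to picture the remodelling step is deriving scaling factors that morph an anchor phantom toward a target standing height and body mass: scale axially with the height ratio and transversely with the remaining mass ratio, assuming constant tissue density. This is a deliberately crude sketch; the actual Virtual Population remodelling is anatomy-aware rather than a uniform scale, and the reference numbers below are invented:

```python
import math

def scale_factors(h_ref, m_ref, h_target, m_target):
    """Axial and transverse scale factors for a uniform-density morph.

    h_*: standing heights (m), m_*: body masses (kg).
    At constant density, mass scales with volume: m_ref * s_xy**2 * s_z.
    """
    s_z = h_target / h_ref                       # axial (height) scale
    s_xy = math.sqrt(m_target / (m_ref * s_z))   # transverse scale
    return s_xy, s_z

# Illustrative: morph a hypothetical 1.10 m / 18 kg anchor child phantom
# to a 1.21 m / 22 kg target morphometry.
s_xy, s_z = scale_factors(h_ref=1.10, m_ref=18.0, h_target=1.21, m_target=22.0)
```

By construction the morphed phantom hits the target mass exactly (18 · s_xy² · s_z = 22), which mirrors the paper's requirement that generated phantoms match target body mass and height simultaneously.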
Performance Analysis of the Mobile IP Protocol (RFC 3344 and Related RFCS)
2006-12-01
...Encapsulation; HMAC, Keyed-Hash Message Authentication Code; ICMP, Internet Control Message Protocol; IEEE, Institute of Electrical and Electronics Engineers; IETF, Internet Engineering Task Force; IOS, Internetwork Operating System; IP, Internet Protocol; ITU, International Telecommunication Union; LAN, Local Area... network computing. Most organizations today have sophisticated networks that are connected to the Internet. The major benefit reaped from such a...
Effects of Whole-Body Motion Simulation on Flight Skill Development.
1981-10-01
computation requirements, compared to the implementation allowing for a deviate internal model, provided further motivation for assuming a correct...We are left with two more likely explanations for the apparent trends: (1) subjects were motivated differently by the different task configurations...because of modeling constraints. The notion of task-related motivational differences are explored in Appendix E. Sensitivity analysis performed with
Using Artificial Physics to Control Agents
1999-11-01
unlimited. Supplementary notes: IEEE International Conference on Information, Intelligence, and Systems, Oct 31-Nov 3, 1999, Bethesda, MD. From the abstract: ...distributed control can also perform distributed computation. From the references: H. Pattee, "Artificial life needs a real epistemology," in Moran, Moreno, Merelo, and Chacon, editors, Advances in Artificial Life, 1995.
Merged Vision and GPS Control of a Semi-Autonomous, Small Helicopter
NASA Technical Reports Server (NTRS)
Rock, Stephen M.
1999-01-01
This final report documents the activities performed during the research period from April 1, 1996 to September 30, 1997. It contains three papers: Carrier Phase GPS and Computer Vision for Control of an Autonomous Helicopter; A Contestant in the 1997 International Aerospace Robotics Laboratory Stanford University; and Combined CDGPS and Vision-Based Control of a Small Autonomous Helicopter.
Robonaut 2 in the U.S. Laboratory
2012-02-15
ISS030-E-074059 (15 Feb. 2012) --- Robonaut 2, nicknamed R2, is pictured in the Destiny laboratory of the International Space Station while NASA astronaut Dan Burbank (mostly out of frame at left) uses a computer during R2's initial checkouts. R2 later went on to make history with the first human/robotic handshake to be performed in space.
Self-force as probe of internal structure
NASA Astrophysics Data System (ADS)
Isoyama, Soichiro; Poisson, Eric
2012-08-01
The self-force acting on a (scalar or electric) charge held in place outside a massive body contains information about the body’s composition, and can therefore be used as a probe of internal structure. We explore this theme by computing the (scalar or electromagnetic) self-force when the body is a spherical ball of perfect fluid in hydrostatic equilibrium, under the assumption that its rest-mass density and pressure are related by a polytropic equation of state. The body is strongly self-gravitating, and all computations are performed in exact general relativity. The dependence on internal structure is best revealed by expanding the self-force in powers of 1/r0, with r0 denoting the radial position of the charge outside the body. At leading order, the self-force scales as r0^(-3) and depends only on the square of the charge and the body’s mass; the leading self-force is universal. The dependence on internal structure is seen at the next order, r0^(-5), through a structure factor that depends on the equation of state. We compute this structure factor for relativistic polytropes, and show that for a fixed mass, it increases linearly with the body’s radius in the case of the scalar self-force, and quadratically with the body’s radius in the case of the electromagnetic self-force. In both cases we find that for a fixed mass and radius, the self-force is smaller if the body is more centrally dense, and larger if the mass density is more uniformly distributed.
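In geometrized units (G = c = 1), the expansion described has the schematic form below; the symbols are generic, not the authors' exact notation:

```latex
F_{\mathrm{self}} \;=\; \frac{q^{2} M}{r_0^{3}} \;+\; \frac{q^{2}\,\chi}{r_0^{5}} \;+\; O\!\left(r_0^{-6}\right),
```

where q is the (scalar or electric) charge, M the body's mass, and r0 the charge's radial position. The leading term is universal (independent of internal structure), while the structure factor χ carries the equation-of-state dependence: at fixed M it grows linearly with the body's radius R for a scalar charge and quadratically with R for an electric charge, as stated in the abstract.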
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoffel, T.; Reda, I.
2013-05-01
The NREL Pyrheliometer Comparisons for 2012 (NPC-2012) were held at the Solar Radiation Research Laboratory in Golden, Colorado, from September 24 through October 5 for the purpose of transferring the World Radiometric Reference (WRR) to participating instruments. Twenty scientists and engineers operated 32 absolute cavity radiometers and 18 conventional thermopile-based pyrheliometers to simultaneously measure clear-sky direct normal irradiance during the comparisons. The transfer standard group of reference radiometers for NPC-2012 consisted of four NREL radiometers with direct traceability to the WRR, having participated in the Eleventh International Pyrheliometer Comparisons (IPC-XI) hosted by the World Radiation Center in the fall of 2010. As the result of NPC-2012, each participating absolute cavity radiometer was assigned a new WRR transfer factor, computed as the reference irradiance computed by the transfer standard group divided by the observed irradiance from the participating radiometer. The performance of the transfer standard group during NPC-2012 was consistent with previous comparisons, including IPC-XI. The measurement performance of the transfer standard group allowed the transfer of the WRR to each participating radiometer with an estimated uncertainty of +/- 0.33% with respect to the International System of Units.
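The transfer-factor computation described is a simple ratio averaged over simultaneous clear-sky observations; the irradiance values below are illustrative, not NPC-2012 data:

```python
# WRR transfer factor: reference irradiance from the transfer-standard
# group divided by the participating radiometer's simultaneous reading,
# averaged over the accepted clear-sky series.
def wrr_transfer_factor(reference_wm2, observed_wm2):
    ratios = [ref / obs for ref, obs in zip(reference_wm2, observed_wm2)]
    return sum(ratios) / len(ratios)

# Illustrative simultaneous direct-normal irradiance readings (W/m^2).
factor = wrr_transfer_factor([901.2, 905.7, 898.4], [900.1, 904.9, 897.2])
# Multiplying the instrument's future readings by `factor` places them
# on the World Radiometric Reference scale.
```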
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heard, F.J.; Harris, R.A.; Padilla, A.
The SASSYS/SAS4A systems analysis code was used to simulate a series of unprotected loss of flow (ULOF) tests planned at the Fast Flux Test Facility (FFTF). The subject tests were designed to investigate the transient performance of the FFTF during various ULOF scenarios for two different loading patterns designed to produce extremes in the assembly load pad clearance and the direction of the initial assembly bows. The tests are part of an international program designed to extend the existing data base on the performance of liquid metal reactors (LMRs). The analyses demonstrate that a wide range of power-to-flow ratios can be reached during the transients and, therefore, will yield valuable data on the dynamic character of the structural feedbacks in LMRs. These analyses will be repeated once the actual FFTF core loadings for the tests are available. These predictions, similar ones obtained by other international participants in the FFTF program, and post-test analyses will be used to upgrade and further verify the computer codes used to predict the behavior of LMRs.
Geomechanical/Geochemical Modeling Studies Conducted within theInternational DECOVALEX Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Birkholzer, J.T.; Rutqvist, J.; Sonnenthal, E.L.
2005-10-19
The DECOVALEX project is an international cooperative project initiated by SKI, the Swedish Nuclear Power Inspectorate, with participation of about 10 international organizations. The general goal of this project is to encourage multidisciplinary interactive and cooperative research on modeling coupled thermo-hydro-mechanical-chemical (THMC) processes in geologic formations in support of the performance assessment for underground storage of radioactive waste. One of the research tasks, initiated in 2004 by the U.S. Department of Energy (DOE), addresses the long-term impact of geomechanical and geochemical processes on the flow conditions near waste emplacement tunnels. Within this task, four international research teams conduct predictive analysis of the coupled processes in two generic repositories, using multiple approaches and different computer codes. Below, we give an overview of the research task and report its current status.
Geomechanical/Geochemical Modeling Studies Conducted Within the International DECOVALEX Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
J.T. Birkholzer; J. Rutqvist; E.L. Sonnenthal
2006-02-01
The DECOVALEX project is an international cooperative project initiated by SKI, the Swedish Nuclear Power Inspectorate, with participation of about 10 international organizations. The general goal of this project is to encourage multidisciplinary interactive and cooperative research on modeling coupled thermo-hydro-mechanical-chemical (THMC) processes in geologic formations in support of the performance assessment for underground storage of radioactive waste. One of the research tasks, initiated in 2004 by the U.S. Department of Energy (DOE), addresses the long-term impact of geomechanical and geochemical processes on the flow conditions near waste emplacement tunnels. Within this task, four international research teams conduct predictive analysis of the coupled processes in two generic repositories, using multiple approaches and different computer codes. Below, we give an overview of the research task and report its current status.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Birkholzer, J.T.; Barr, D.; Rutqvist, J.
2005-11-15
The DECOVALEX project is an international cooperative project initiated by SKI, the Swedish Nuclear Power Inspectorate, with participation of about 10 international organizations. The general goal of this project is to encourage multidisciplinary interactive and cooperative research on modelling coupled thermo-hydro-mechanical-chemical (THMC) processes in geologic formations in support of the performance assessment for underground storage of radioactive waste. One of the research tasks, initiated in 2004 by the U.S. Department of Energy (DOE), addresses the long-term impact of geomechanical and geochemical processes on the flow conditions near waste emplacement tunnels. Within this task, four international research teams conduct predictive analysis of the coupled processes in two generic repositories, using multiple approaches and different computer codes. Below, we give an overview of the research task and report its current status.
PREFACE: ELC International Meeting on Inference, Computation, and Spin Glasses (ICSG2013)
NASA Astrophysics Data System (ADS)
Kabashima, Yoshiyuki; Hukushima, Koji; Inoue, Jun-ichi; Tanaka, Toshiyuki; Watanabe, Osamu
2013-12-01
The close relationship between probability-based inference and the statistical mechanics of disordered systems has been noted for some time. This relationship has provided researchers with a theoretical foundation in various fields of information processing for analytical performance evaluation and the construction of efficient algorithms based on message-passing or Monte Carlo sampling schemes. The ELC International Meeting on 'Inference, Computation, and Spin Glasses (ICSG2013)' was held in Sapporo, 28-30 July 2013. The meeting was organized as a satellite meeting of STATPHYS25 in order to offer a forum where concerned researchers could assemble and exchange information on the latest results and newly established methodologies, and discuss future directions of the interdisciplinary studies between statistical mechanics and information sciences. Financial support from the Grant-in-Aid for Scientific Research on Innovative Areas, MEXT, Japan, 'Exploring the Limits of Computation (ELC)' is gratefully acknowledged. We are pleased to publish 23 papers contributed by invited speakers of ICSG2013 in this volume of Journal of Physics: Conference Series. We hope that this volume will promote further development of this highly vigorous interdisciplinary field between statistical mechanics and information/computer science. Editors and ICSG2013 Organizing Committee: Koji Hukushima, Jun-ichi Inoue (Local Chair of ICSG2013), Yoshiyuki Kabashima (Editor-in-Chief), Toshiyuki Tanaka, Osamu Watanabe (General Chair of ICSG2013)
ERIC Educational Resources Information Center
Gerick, Julia; Eickelmann, Birgit; Bos, Wilfried
2017-01-01
The "International Computer and Information Literacy Study" (ICILS 2013) provides, for the first time, information about students' computer and information literacy (CIL), as well as its acquisition, based on a computer-based test for students and background questionnaires. Among the 21 education systems that participated in ICILS 2013,…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-28
...; Computer Matching Program (SSA Internal Match)--Match Number 1014 AGENCY: Social Security Administration... regarding protections for such persons. The Privacy Act, as amended, regulates the use of computer matching....C. 552a, as amended, and the provisions of the Computer Matching and Privacy Protection Act of 1988...
Pace, Jonathan; Nelson, Jeffrey; Ray, Abhishek; Hu, Yin
2017-12-01
A middle-aged patient presented for elective embolization of an incidentally found right internal carotid aneurysm. An angiogram was performed, during which the left internal carotid artery was visualized to evaluate a second, small aneurysm. During the embolization of the right internal carotid artery aneurysm, a catheter-induced vasospasm was identified that prompted treatment with intra-arterial verapamil. The procedure was uncomplicated; a postoperative rotational flat-panel computed tomography scan performed on the angiography table demonstrated right-hemisphere contrast staining. The patient developed a right middle cerebral artery (MCA) syndrome after extubation, with repeat cerebral angiography negative for occlusion and magnetic resonance imaging negative for stroke. The patient was observed for 48 hours, during which time the patient slowly improved. At a six-week follow-up visit, the patient had fully recovered. We present an interesting case of a verapamil-induced breakdown of the blood-brain barrier and a self-limited right MCA syndrome.
Kubota, So; Inaba, Yutaka; Kobayashi, Naomi; Choe, Hyonmin; Tezuka, Taro; Saito, Tomoyuki
2017-10-16
While cam resection is essential to achieve a good clinical result with respect to femoroacetabular impingement (FAI), it is unclear whether it should also be performed in cases of borderline developmental dysplasia of the hip (DDH) with a cam deformity. The aim of this study was to evaluate improvements in range of motion (ROM) in cases of cam-type FAI and borderline DDH after virtual osteochondroplasty using a computer impingement simulation. Thirty-eight symptomatic hips in 31 patients (11 male and 20 female) diagnosed with cam-type FAI or borderline DDH were analyzed. These were divided into a cam-type FAI group (cam-FAI group: 15 hips), a borderline DDH without cam group (DDH W/O cam group: 12 hips), and a borderline DDH with cam group (DDH W/ cam group: 11 hips). The bony impingement point on the femoral head-neck junction at 90° flexion and maximum internal rotation of the hip joint was identified using ZedHip® software. Virtual osteochondroplasty of the impingement point was then performed in all cases. The maximum flexion angle and maximum internal rotation angle at 90° flexion were measured before and after virtual osteochondroplasty at two resection ranges (i.e., slight and sufficient). The mean improvement in the internal rotation angle in the DDH W/ cam group after slight resection was significantly greater than that in the DDH W/O cam group (P = 0.046). Furthermore, the mean improvement in the internal rotation angle in the DDH W/ cam and cam-FAI groups after sufficient resection was significantly greater than that in the DDH W/O cam group (DDH W/ cam vs DDH W/O cam: P = 0.002; cam-FAI vs DDH W/O cam: P = 0.043). Virtual osteochondroplasty resulted in a significant improvement in internal rotation angle in the DDH W/ cam group but not in the DDH W/O cam group. Thus, it may be better to consider performing osteochondroplasty in borderline DDH cases with cam deformity.
NASA Astrophysics Data System (ADS)
Peter, Daniel; Videau, Brice; Pouget, Kevin; Komatitsch, Dimitri
2015-04-01
Improving the resolution of tomographic images is crucial to answering important questions about the nature of Earth's subsurface structure and internal processes. Seismic tomography is the most prominent approach, in which seismic signals from ground-motion records are used to infer physical properties of internal structures such as compressional- and shear-wave speeds, anisotropy and attenuation. Recent advances in regional- and global-scale seismic inversions move towards full-waveform inversions, which require accurate simulations of seismic wave propagation in complex 3D media, providing access to the full 3D seismic wavefields. However, these numerical simulations are computationally very expensive and need high-performance computing (HPC) facilities for further improving the current state of knowledge. During recent years, many-core architectures such as graphics processing units (GPUs) have been added to available large HPC systems. Such GPU-accelerated computing, together with advances in multi-core central processing units (CPUs), can greatly accelerate scientific applications. There are two main choices of language support for GPU cards: the CUDA programming environment and the OpenCL language standard. CUDA software development targets NVIDIA graphics cards, while OpenCL has been adopted mainly for AMD graphics cards. In order to employ such hardware accelerators for seismic wave propagation simulations, we incorporated the code generation tool BOAST into the existing spectral-element code package SPECFEM3D_GLOBE. This allows us to use meta-programming of computational kernels and generate optimized source code for both CUDA and OpenCL, running simulations on either CUDA or OpenCL hardware accelerators. We show here applications of forward and adjoint seismic wave propagation on CUDA/OpenCL GPUs, validating results and comparing performance for different simulations and hardware usages.
The influence of test mode and visuospatial ability on mathematics assessment performance
NASA Astrophysics Data System (ADS)
Logan, Tracy
2015-12-01
Mathematics assessment and testing are increasingly situated within digital environments, with international tests moving to computer-based testing in the near future. This paper reports on a secondary data analysis which explored the influence of assessment mode (computer-based testing, CBT, versus pencil-and-paper testing, PPT) and visuospatial ability on students' mathematics test performance. Data from 804 grade 6 Singaporean students were analysed using the knowledge discovery in data design. The results revealed statistically significant differences between performance on CBT and PPT test modes across the content areas of whole number, algebraic patterns, and data and chance. However, there were no performance differences for the content areas of spatial arrangements, geometric measurement, or other number. There were also statistically significant differences in performance between those students who possess higher levels of visuospatial ability and those with lower levels across all six content areas. Implications include careful consideration of the comparability of CBT and PPT testing and the need for increased attention to the role of visuospatial reasoning in students' mathematical reasoning.
NCC Simulation Model: Simulating the operations of the network control center, phase 2
NASA Technical Reports Server (NTRS)
Benjamin, Norman M.; Paul, Arthur S.; Gill, Tepper L.
1992-01-01
The simulation of the network control center (NCC) is in its second phase of development, which further develops the work performed in phase one. Phase one concentrated on the computer systems and the interconnecting network. The focus of phase two is the implementation of the network message dialogues and the resources controlled by the NCC. These resources are requested, initiated, monitored and analyzed via network messages. In the NCC, network messages take the form of packets that are routed across the network. These packets are generated, encoded, decoded and processed by the network host processors, which generate and service the message traffic on the network that connects them. As a result, the message traffic is used to characterize the work done by the NCC and the connected network. Phase one of the model development represented the NCC as a network of bi-directional single-server queues and message-generating sources. The generators represented the external segment processors; the single-server queues represented the host processors. The NCC model consists of the internal and external processors which generate message traffic on the network that links these hosts. To fully realize the objective of phase two, it is necessary to identify and model the processes in each internal processor. These processes live in the operating system of the internal host computers and handle tasks such as high-speed message exchange, ISN and NFE interfacing, event monitoring, network monitoring, and message logging. Interprocess communication is achieved through operating system facilities. The overall performance of the host is determined by its ability to service messages generated by both internal and external processors.
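The phase-one abstraction described above, a host processor modeled as a single-server message queue fed by a message generator, can be sketched as a minimal event-driven simulation. The exponential arrival and service rates below are illustrative, not parameters from the study.

```python
# Minimal single-server queue sketch: exponential interarrival and service
# times, one server, FIFO order. Returns the mean sojourn time (waiting
# plus service) per message.
import random

def simulate_queue(arrival_rate, service_rate, n_messages, seed=1):
    """Mean time a message spends in the system (wait + service)."""
    rng = random.Random(seed)
    t_arrival = 0.0        # arrival time of the current message
    server_free_at = 0.0   # time at which the server finishes its backlog
    total_sojourn = 0.0
    for _ in range(n_messages):
        t_arrival += rng.expovariate(arrival_rate)
        start = max(t_arrival, server_free_at)   # wait if the server is busy
        service = rng.expovariate(service_rate)
        server_free_at = start + service
        total_sojourn += server_free_at - t_arrival
    return total_sojourn / n_messages

# For utilization rho = 0.5, M/M/1 theory predicts a mean sojourn time
# of 1 / (mu - lambda) = 2.0, which the simulation should approach.
print(simulate_queue(arrival_rate=0.5, service_rate=1.0, n_messages=50_000))
```

Chaining several such queues with routing between them gives the phase-one network-of-queues picture of the NCC.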
ICCE/ICCAI 2000 Full & Short Papers (Artificial Intelligence in Education).
ERIC Educational Resources Information Center
2000
This document contains the full and short papers on artificial intelligence in education from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: a computational model for learners' motivation states in individualized tutoring system; a…
ICCE/ICCAI 2000 Full & Short Papers (Student Modeling).
ERIC Educational Resources Information Center
2000
This document contains the following full and short papers on student modeling from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Computational Model for Learner's Motivation States in Individualized Tutoring System" (Behrouz H. Far and Anete H.…
Teaching Techniques and Course Content for International Finance.
ERIC Educational Resources Information Center
Esemuede, Samuel I.
Noting the rapid and large changes in international finance over the past 2 decades, this paper offers suggestions for teaching business education courses on international finance. The paper recommends a combination of computer-assisted instruction and electronic classroom, discussion group, independent study, and lecture. Computer-assisted…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-29
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-856] Certain Wireless Communication Devices, Portable Music and Data Processing Devices, Computers, and Components Thereof AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby given that the U.S. International...
Computation of high Reynolds number internal/external flows
NASA Technical Reports Server (NTRS)
Cline, M. C.; Wilmoth, R. G.
1981-01-01
A general, user-oriented computer program, called VNAP2, has been developed to calculate high Reynolds number, internal/external flows. VNAP2 solves the two-dimensional, time-dependent Navier-Stokes equations. The turbulence is modeled with either a mixing-length, a one transport equation, or a two transport equation model. Interior grid points are computed using the explicit MacCormack scheme with special procedures to speed up the calculation in the fine grid. All boundary conditions are calculated using a reference plane characteristic scheme with the viscous terms treated as source terms. Several internal, external, and internal/external flow calculations are presented.
Computation of high Reynolds number internal/external flows
NASA Technical Reports Server (NTRS)
Cline, M. C.; Wilmoth, R. G.
1981-01-01
A general, user-oriented computer program, called VNAP2, was developed to calculate high Reynolds number, internal/external flows. The VNAP2 program solves the two-dimensional, time-dependent Navier-Stokes equations. The turbulence is modeled with either a mixing-length, a one transport equation, or a two transport equation model. Interior grid points are computed using the explicit MacCormack scheme with special procedures to speed up the calculation in the fine grid. All boundary conditions are calculated using a reference plane characteristic scheme with the viscous terms treated as source terms. Several internal, external, and internal/external flow calculations are presented.
Computation of high Reynolds number internal/external flows
NASA Technical Reports Server (NTRS)
Cline, M. C.; Wilmoth, R. G.
1981-01-01
A general, user-oriented computer program, called VNAP2, developed to calculate high Reynolds number internal/external flows is described. The program solves the two-dimensional, time-dependent Navier-Stokes equations. Turbulence is modeled with either a mixing-length, a one transport equation, or a two transport equation model. Interior grid points are computed using the explicit MacCormack scheme with special procedures to speed up the calculation in the fine grid. All boundary conditions are calculated using a reference plane characteristic scheme with the viscous terms treated as source terms. Several internal, external, and internal/external flow calculations are presented.
Experimental and Computational Investigation of a Translating-Throat Single-Expansion-Ramp Nozzle
NASA Technical Reports Server (NTRS)
Deere, Karen A.; Asbury, Scott C.
1999-01-01
An experimental and computational study was conducted on a high-speed, single-expansion-ramp nozzle (SERN) concept designed for efficient off-design performance. The translating-throat SERN concept adjusts the axial location of the throat to provide a variable expansion ratio and allow a more optimum jet exhaust expansion at various flight conditions in an effort to maximize nozzle performance. Three design points (throat locations) were investigated to simulate the operation of this concept at subsonic-transonic, low supersonic, and high supersonic flight conditions. The experimental study was conducted in the Jet Exit Test Facility at the Langley Research Center. Internal nozzle performance was obtained at nozzle pressure ratios (NPRs) up to 13 for six nozzles with design nozzle pressure ratios near 9, 42, and 102. Two expansion-ramp surfaces, one concave and one convex, were tested for each design point. Paint-oil flow and focusing schlieren flow visualization techniques were utilized to acquire additional flow data at selected NPRs. The Navier-Stokes code PAB3D was used with a two-equation k-ε turbulence model for the computational study. Nozzle performance characteristics were predicted at nozzle pressure ratios of 5, 9, and 13 for the concave-ramp, low Mach number nozzle and at 10, 13, and 102 for the concave-ramp, high Mach number nozzle.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duke, Daniel J.; Finney, Charles E. A.; Kastengren, Alan
Given the importance of the fuel-injection process on the combustion and emissions performance of gasoline direct injected engines, there has been significant recent interest in understanding the fluid dynamics within the injector, particularly around the needle and through the nozzles. Furthermore, the pressure losses and transients that occur in the flow passages above the needle are also of interest. Simulations of these injectors typically use the nominal design geometry, which does not always match the production geometry. Computed tomography (CT) using x-ray and neutron sources can be used to obtain the real geometry from production injectors, but there are trade-offs in using these techniques. X-ray CT provides high resolution, but cannot penetrate through the thicker parts of the injector. Neutron CT has excellent penetrating power but lower resolution. Here, we present results from a joint effort to characterize a gasoline direct injector representative of the Spray G injector as defined by the Engine Combustion Network. High-resolution (1.2 to 3 µm) x-ray CT measurements from the Advanced Photon Source at Argonne National Laboratory were combined with moderate-resolution (40 µm) neutron CT measurements from the High Flux Isotope Reactor at Oak Ridge National Laboratory to generate a complete internal geometry for the injector. This effort combined the strengths of both facilities' capabilities, with extremely fine spatially resolved features in the nozzles and injector tips and fine resolution of internal features of the needle along the length of the injector. Analysis of the resulting surface model of the internal fluid flow volumes of the injector reveals how the internal cross-sectional area and nozzle hole geometry differ slightly from the design dimensions. A simplified numerical simulation of the internal flow shows how deviations from the design geometry can alter the flow inside the sac and holes.
The results of this study will provide computational modelers with very accurate solid and surface models for use in computational fluid dynamics studies, and experimentalists with increased insight into the operating characteristics of their injectors.
Current structure of strongly nonlinear interfacial solitary waves
NASA Astrophysics Data System (ADS)
Semin, Sergey; Kurkina, Oxana; Kurkin, Andrey; Talipova, Tatiana; Pelinovsky, Efim; Churaev, Egor
2015-04-01
The characteristics of highly nonlinear internal solitary waves (solitons) in two-layer flow are computed within the fully nonlinear Navier-Stokes equations using the numerical model of the Massachusetts Institute of Technology (MITgcm). The verification and adaptation of the model are based on data from laboratory experiments [Carr & Davies, 2006]. The present paper also compares the results of our calculations with computations performed in the framework of the fully nonlinear Bergen Ocean Model [Thiem et al., 2011], and with the predictions of the weakly nonlinear theory based on the Gardner equation. The occurrence of reverse flow in the bottom layer directly behind the soliton is confirmed in the numerical simulations. The trajectories of Lagrangian particles in the internal soliton are computed at the surface, at the interface, and near the bottom; the results demonstrate completely different trajectories at different depths of the model domain. The largest displacement of Lagrangian particles is observed in the surface layer, where it can exceed two and a half times the characteristic width of the soliton. Fluid particles located initially along the middle of the pycnocline move along an elongated vertical loop over a distance of no more than one third of the width of the solitary wave. In the bottom layer, the fluid first moves opposite to the direction of propagation of the internal wave; then, once the bulk of the soliton's velocity field ceases to influence the trajectory, the reverse flow carries it in the opposite direction. The displacement of fluid particles in the bottom layer does not exceed the half-width of the solitary wave.
1. Carr, M., and Davies, P.A. The motion of an internal solitary wave of depression over a fixed bottom boundary in a shallow, two-layer fluid. Phys. Fluids, 2006, vol. 18, No. 1, 1-10.
2. Thiem, O., Carr, M., Berntsen, J., and Davies, P.A. Numerical simulation of internal solitary wave-induced reverse flow and associated vortices in a shallow, two-layer fluid benthic boundary layer. Ocean Dynamics, 2011, vol. 61, No. 6, 857-872.
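A minimal sketch of the Lagrangian-particle computation described above: a weakly nonlinear sech² solitary-wave velocity field stands in for the MITgcm wavefields, and a particle's net displacement is found by integrating dx/dt = u(x, t) as the wave passes. The amplitude U0, speed c, and width L are illustrative, not values from the study.

```python
# Advect one Lagrangian particle through a passing sech^2 solitary wave
# (a weakly nonlinear, surface-layer stand-in; the wave travels in +x
# and overtakes the particle, dragging it forward).
import math

def particle_displacement(U0=0.5, c=1.0, L=1.0, t_end=40.0, dt=1e-3):
    """Net particle displacement after the wave has passed (midpoint RK2)."""
    def u(x, t):
        # Wave crest starts 20 L behind the particle and moves at speed c.
        return U0 / math.cosh((x - (c * t - 20.0 * L)) / L) ** 2
    x, t = 0.0, 0.0
    while t < t_end:
        k1 = u(x, t)
        x += dt * u(x + 0.5 * dt * k1, t + 0.5 * dt)  # midpoint step
        t += dt
    return x

print(particle_displacement())
```

For a stationary particle the displacement would be the time integral U0·2L/c; because the particle drifts with the wave it is somewhat larger, of the order of the wave width here.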
Duke, Daniel J.; Finney, Charles E. A.; Kastengren, Alan; ...
2017-03-14
Given the importance of the fuel-injection process on the combustion and emissions performance of gasoline direct injected engines, there has been significant recent interest in understanding the fluid dynamics within the injector, particularly around the needle and through the nozzles. Furthermore, the pressure losses and transients that occur in the flow passages above the needle are also of interest. Simulations of these injectors typically use the nominal design geometry, which does not always match the production geometry. Computed tomography (CT) using x-ray and neutron sources can be used to obtain the real geometry from production injectors, but there are trade-offs in using these techniques. X-ray CT provides high resolution, but cannot penetrate through the thicker parts of the injector. Neutron CT has excellent penetrating power but lower resolution. Here, we present results from a joint effort to characterize a gasoline direct injector representative of the Spray G injector as defined by the Engine Combustion Network. High-resolution (1.2 to 3 µm) x-ray CT measurements from the Advanced Photon Source at Argonne National Laboratory were combined with moderate-resolution (40 µm) neutron CT measurements from the High Flux Isotope Reactor at Oak Ridge National Laboratory to generate a complete internal geometry for the injector. This effort combined the strengths of both facilities' capabilities, with extremely fine spatially resolved features in the nozzles and injector tips and fine resolution of internal features of the needle along the length of the injector. Analysis of the resulting surface model of the internal fluid flow volumes of the injector reveals how the internal cross-sectional area and nozzle hole geometry differ slightly from the design dimensions. A simplified numerical simulation of the internal flow shows how deviations from the design geometry can alter the flow inside the sac and holes.
The results of this study will provide computational modelers with very accurate solid and surface models for use in computational fluid dynamics studies, and experimentalists with increased insight into the operating characteristics of their injectors.
Performance of MCNP4A on seven computing platforms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendricks, J.S.; Brockhoff, R.C.
1994-12-31
The performance of seven computer platforms has been evaluated with the MCNP4A Monte Carlo radiation transport code. For the first time we report timing results using MCNP4A and its new test set and libraries. Comparisons are made on platforms not available to us in previous MCNP timing studies. By using MCNP4A and its 25-problem test set, a widely used and readily available physics production code is used; the timing comparison is not limited to a single 'typical' problem, demonstrating the problem dependence of timing results; the results are reproducible at the more than 100 installations around the world using MCNP; comparison of the performance of other computer platforms to the ones tested in this study is possible because we present raw data rather than normalized results; and a measure of the increase in performance of computer hardware and software over the past two years is possible. The computer platforms reported are the Cray-YMP 8/64, IBM RS/6000-560, Sun Sparc10, Sun Sparc2, HP/9000-735, 4-processor 100 MHz Silicon Graphics ONYX, and Gateway 2000 model 4DX2-66V PC. In 1991 a timing study of MCNP4, the predecessor of MCNP4A, was conducted using ENDF/B-V cross-section libraries, which are export protected. The new study is based upon the new MCNP 25-problem test set, which utilizes internationally available data. MCNP4A, its test problems and the test data library are available from the Radiation Shielding and Information Center (RSIC) in Oak Ridge, Tennessee, or from the NEA Data Bank in Saclay, France. Anyone with the same workstation and compiler can get the same test problem sets, the same library files, and the same MCNP4A code from RSIC or NEA and replicate our results. And, because we report raw data, comparisons of the performance of other computer platforms and compilers can be made.
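The point about raw versus normalized timing data can be made concrete: given raw runtimes for a fixed test set (the numbers below are illustrative, not the paper's measurements), any reader can form relative-performance ratios against a baseline of their own choosing.

```python
# Illustrative raw timing data: total minutes to run a fixed test-problem
# set on each platform. These values are made up for the example.
raw_minutes = {
    "Cray-YMP 8/64": 30.0,
    "HP/9000-735": 45.0,
    "Gateway 4DX2-66V": 180.0,
}

def relative_performance(raw, baseline):
    """Speed of each platform relative to `baseline` (higher = faster)."""
    return {name: raw[baseline] / t for name, t in raw.items()}

# Normalizing against any chosen baseline is a one-liner once raw data
# is published; publishing only pre-normalized ratios would forbid this.
print(relative_performance(raw_minutes, "HP/9000-735"))
```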
Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility
NASA Astrophysics Data System (ADS)
Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.
2014-12-01
The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA; in fact, international research projects account for 12% of the INCITE awards in 2014, and the INCITE scientific review panel includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009) and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU-funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current-generation Petascale-capable simulation codes towards the performance levels required for running on future Exascale systems.
One of the techniques pursued by ECMWF is to use Fortran 2008 coarrays to overlap computations and communications and to reduce the total volume of data communicated. Use of Titan has enabled ECMWF to plan future scalability developments and resource requirements. We will also discuss the best practices developed over the years in navigating the logistical, legal and regulatory hurdles involved in supporting the facility's diverse user community.
10th Annual Systems Engineering Conference: Volume 2 Wednesday
2007-10-25
Topics include autonomic computing properties: Self-Healing (detect hardware/software failures and reconfigure to permit continued operations); Self-Protecting (detect internal/external attacks and protect resources from exploitation); Self-Optimizing (detect sub-optimal behaviors and intelligently optimize resource performance); and weapon/platform acoustics (self-noise, radiated noise, beam forming, and pulse types for submarines, surface ships, and platform sensors).
Evaluating the Limits of Network Topology Inference Via Virtualized Network Emulation
2015-06-01
Figure 5.33 shows a hop-plot of the five best reduction methods; KDD most closely matches the Internet plot. Monitors located around the world provide locations from which to perform network measurement experiments, primarily using ping.
A Testbed Processor for Embedded Multicomputing
1990-04-01
These two problems of parallel expression and performance impact the real-time response of a vehicle system and, consequently, what models can be supported. The discussion of these problems is primarily from Gajski and Peir [Gajski 85]; multi-computers are Multiple Instruction, Multiple Data machines.
Viscous computations of cold air/air flow around scramjet nozzle afterbody
NASA Technical Reports Server (NTRS)
Baysal, Oktay; Engelund, Walter C.
1991-01-01
The flow field in and around the nozzle afterbody section of a hypersonic vehicle was computationally simulated. The compressible, Reynolds-averaged Navier-Stokes equations were solved by an implicit, finite-volume, characteristic-based method. The computational grids were adapted to the flow as the solutions were developing in order to improve the accuracy. The exhaust gases were assumed to be cold. The computational results were obtained for the two-dimensional longitudinal plane located at the half span of the internal portion of the nozzle for overexpanded and underexpanded conditions. Another set of results was obtained from three-dimensional simulations performed for a half-span nozzle. The surface pressures were successfully compared with the data obtained from the wind tunnel tests. The results help in understanding this complex flow field and, in turn, should help the design of the nozzle afterbody section.
THE INTERNAL ORGANIZATION OF COMPUTER MODELS OF COGNITIVE BEHAVIOR.
ERIC Educational Resources Information Center
BAKER, FRANK B.
If computer programs are to serve as useful models of cognitive behavior, their creators must face the need to establish an internal organization for their model which implements the higher level cognitive behaviors associated with the human capacity for self-direction, autocriticism, and adaptation. Present computer models of cognitive behavior…
ICCE/ICCAI 2000 Full & Short Papers (Web-Based Learning).
ERIC Educational Resources Information Center
2000
This document contains full and short papers on World Wide Web-based learning from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction). Topics covered include: design and development of CAL (Computer Assisted Learning) systems; design and development of WBI (Web-Based…
How Do Students Evaluate Computer Use for Learning?
ERIC Educational Resources Information Center
Lu, Jiamei; Li, Daqi; Stevens, Carla; Ye, Renmin
2016-01-01
Using Program for International Student Assessment (PISA) 2012, an international education database, this study analyzed the evaluations of computer use for academic learning by 15-year-old students from seven Edu-systems (unit in PISA) in Eastern Asia. Six variables were identified in association with students' evaluations of computer use…
Space Shuttle Communications Coverage Analysis for Thermal Tile Inspection
NASA Technical Reports Server (NTRS)
Kroll, Quin D.; Hwu, Shian U.; Upanavage, Matthew; Boster, John P.; Chavez, Mark A.
2009-01-01
The space shuttle ultra-high frequency Space-to-Space Communication System has to provide adequate communication coverage for astronauts who are performing thermal tile inspection and repair on the underside of the space shuttle orbiter (SSO). Careful planning and quantitative assessment are necessary to ensure successful system operations and mission safety in this work environment. This study assesses communication systems performance for astronauts who are working in the underside, non-line-of-sight shadow region on the space shuttle. All of the space shuttle and International Space Station (ISS) transmitting antennas are blocked by the SSO structure. To ensure communication coverage at planned inspection worksites, the signal strength and link margin between the SSO/ISS antennas and the extravehicular activity astronauts, whose line-of-sight is blocked by vehicle structure, was analyzed. Investigations were performed using rigorous computational electromagnetic modeling techniques. Signal strength was obtained by computing the reflected and diffracted fields along the signal propagation paths between transmitting and receiving antennas. Radio frequency (RF) coverage was determined for thermal tile inspection and repair missions using the results of this computation. Analysis results from this paper are important in formulating the limits on reliable communication range and RF coverage at planned underside inspection and repair worksites.
Influences of geological parameters to probabilistic assessment of slope stability of embankment
NASA Astrophysics Data System (ADS)
Nguyen, Qui T.; Le, Tuan D.; Konečný, Petr
2018-04-01
This article examines the influence of geological parameters on the slope stability of an embankment in probabilistic analysis using the SLOPE/W computational system. Stability of a simple slope is evaluated with and without pore-water pressure on the basis of variation of the soil properties. Normal distributions of unit weight, cohesion and internal friction angle are assumed. The Monte Carlo simulation technique is employed to perform analysis of the critical slip surface. Sensitivity analysis is performed to observe the variation of the geological parameters and their effect on the safety factor of the slope.
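The probabilistic workflow in the abstract, sampling normally distributed soil parameters and counting factor-of-safety failures, can be sketched as below. The infinite-slope formula is a textbook stand-in for SLOPE/W's critical-slip-surface search, and all parameter values (means, standard deviations, slope angle, depth) are illustrative assumptions.

```python
# Monte Carlo estimate of slope failure probability: sample unit weight,
# cohesion and friction angle from normal distributions, evaluate a
# factor of safety, and count realizations with FoS < 1.
import math
import random

def factor_of_safety(c, phi_deg, gamma, beta_deg=30.0, depth=5.0):
    """Infinite-slope factor of safety, dry case (no pore-water pressure)."""
    beta = math.radians(beta_deg)
    sigma = gamma * depth * math.cos(beta) ** 2            # normal stress
    tau = gamma * depth * math.sin(beta) * math.cos(beta)  # driving shear
    return (c + sigma * math.tan(math.radians(phi_deg))) / tau

def failure_probability(n=20_000, seed=0):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        c = rng.gauss(10.0, 2.0)      # cohesion, kPa
        phi = rng.gauss(28.0, 3.0)    # internal friction angle, deg
        gamma = rng.gauss(18.0, 1.0)  # unit weight, kN/m^3
        if factor_of_safety(c, phi, gamma) < 1.0:
            failures += 1
    return failures / n

print(failure_probability())
```

A sensitivity analysis in this setting amounts to repeating the sampling with one parameter's standard deviation varied at a time and observing the change in the failure probability.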
A Computational Study of a New Dual Throat Fluidic Thrust Vectoring Nozzle Concept
NASA Technical Reports Server (NTRS)
Deere, Karen A.; Berrier, Bobby L.; Flamm, Jeffrey D.; Johnson, Stuart K.
2005-01-01
A computational investigation of a two-dimensional nozzle was completed to assess the use of fluidic injection to manipulate flow separation and cause thrust vectoring of the primary jet thrust. The nozzle was designed with a recessed cavity to enhance the throat-shifting method of fluidic thrust vectoring. Several design cycles with the structured-grid computational fluid dynamics code PAB3D and with experiments in the NASA Langley Research Center Jet Exit Test Facility have been completed to guide the nozzle design and analyze performance. This paper presents computational results on potential design improvements for the best experimental configuration tested to date. Nozzle design variables included cavity divergence angle, cavity convergence angle and upstream throat height. Pulsed fluidic injection was also investigated for its ability to decrease mass flow requirements. Internal nozzle performance (wind-off conditions) and thrust vector angles were computed for several configurations over a range of nozzle pressure ratios from 2 to 7, with the fluidic injection flow rate equal to 3 percent of the primary flow rate. Computational results indicate that increasing the cavity divergence angle beyond 10° is detrimental to thrust vectoring efficiency, while increasing the cavity convergence angle from 20° to 30° improves thrust vectoring efficiency at nozzle pressure ratios greater than 2, albeit at the expense of discharge coefficient. Pulsed injection was no more efficient than steady injection for the Dual Throat Nozzle concept.
King, Mark A; Glynn, Jonathan A; Mitchell, Sean R
2011-11-01
A subject-specific angle-driven computer model of a tennis player, combined with a forward dynamics, equipment-specific computer model of tennis ball-racket impacts, was developed to determine the effect of ball-racket impacts on loading at the elbow for one-handed backhand groundstrokes. Matching subject-specific computer simulations of a typical topspin/slice one-handed backhand groundstroke performed by an elite tennis player were done with root mean square differences between performance and matching simulations of < 0.5 degrees over a 50 ms period starting from ball impact. Simulation results suggest that for similar ball-racket impact conditions, the difference in elbow loading for a topspin and slice one-handed backhand groundstroke is relatively small. In this study, the relatively small differences in elbow loading may be due to comparable angle-time histories at the wrist and elbow joints with the major kinematic differences occurring at the shoulder. Using a subject-specific angle-driven computer model combined with a forward dynamics, equipment-specific computer model of tennis ball-racket impacts allows peak internal loading, net impulse, and shock due to ball-racket impact to be calculated which would not otherwise be possible without impractical invasive techniques. This study provides a basis for further investigation of the factors that may increase elbow loading during tennis strokes.
Fission product release and survivability of UN-kernel LWR TRISO fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
T. M. Besmann; M. K. Ferber; H.-T. Lin
2014-05-01
A thermomechanical assessment of the LWR application of TRISO fuel with UN kernels was performed. Fission product release under operational and transient temperature conditions was determined by extrapolation from fission product recoil calculations and limited data from irradiated UN pellets. Both fission recoil and diffusive release were considered, and internal particle pressures were computed for both 650 and 800 µm diameter kernels as a function of buffer layer thickness. These pressures were used in conjunction with a finite element program to compute the radial and tangential stresses generated within a TRISO particle undergoing burnup. Creep and swelling of the inner and outer pyrolytic carbon layers were included in the analyses. A measure of reliability of the TRISO particle was obtained by computing the probability of survival of the SiC barrier layer and the maximum tensile stress generated in the pyrolytic carbon layers from internal pressure and the thermomechanics of the layers. These reliability estimates were obtained as functions of the kernel diameter, buffer layer thickness, and pyrolytic carbon layer thickness. The probability of survival at the end of irradiation was inversely proportional to the maximum pressure.
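The internal-pressure step of such an assessment can be sketched with the ideal gas law: fission gas released into the buffer free volume, with pressure falling as buffer thickness (and hence free volume) grows. The release fraction, buffer porosity, temperature, and gas inventory below are illustrative assumptions, not values from the study.

```python
# Ideal-gas estimate of TRISO internal pressure as a function of kernel
# diameter and buffer layer thickness. All physical inputs are
# illustrative placeholders.
import math

R = 8.314  # gas constant, J/(mol K)

def internal_pressure_mpa(kernel_diam_um, buffer_thick_um,
                          gas_mol_per_m3_kernel=400.0, release_frac=0.1,
                          buffer_porosity=0.5, temp_k=1073.0):
    """Pressure (MPa) of released fission gas in the buffer free volume."""
    r_k = kernel_diam_um * 1e-6 / 2.0
    r_b = r_k + buffer_thick_um * 1e-6
    v_kernel = 4.0 / 3.0 * math.pi * r_k ** 3
    v_buffer = 4.0 / 3.0 * math.pi * (r_b ** 3 - r_k ** 3)
    n_gas = gas_mol_per_m3_kernel * v_kernel * release_frac  # moles released
    return n_gas * R * temp_k / (v_buffer * buffer_porosity) / 1e6

# Thicker buffers give more free volume and hence lower internal pressure,
# which is why pressure is tabulated against buffer thickness.
print(internal_pressure_mpa(650.0, 50.0), internal_pressure_mpa(650.0, 100.0))
```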
Solid propellant rocket motor internal ballistics performance variation analysis, phase 3
NASA Technical Reports Server (NTRS)
Sforzini, R. H.; Foster, W. A., Jr.; Murph, J. E.; Adams, G. W., Jr.
1977-01-01
Results of research aimed at improving the predictability of off-nominal internal ballistics performance of solid propellant rocket motors (SRMs), including thrust imbalance between two SRMs firing in parallel, are reported. The potential effects of nozzle throat erosion on internal ballistic performance were studied and a propellant burning rate law was postulated. The propellant burning rate model, when coupled with the grain deformation model, permits an excellent match between theoretical results and test data for the Titan IIIC, TU455.02, and the first Space Shuttle SRM (DM-1). Analysis of star grain deformation using an experimental model and a finite element model shows the star grain deformation effects for the Space Shuttle to be small in comparison to those of the circular perforated grain. An alternative technique was developed for predicting thrust imbalance without recourse to the Monte Carlo computer program. A scaling relationship used to relate theoretical results to test results may be applied to the alternative technique of predicting thrust imbalance or to the Monte Carlo evaluation. Extended investigation into the effect of strain rate on propellant burning rate leads to the conclusion that the thermoelastic effect is generally negligible for both steadily increasing pressure loads and oscillatory loads.
Computational design and refinement of self-heating lithium ion batteries
NASA Astrophysics Data System (ADS)
Yang, Xiao-Guang; Zhang, Guangsheng; Wang, Chao-Yang
2016-10-01
The recently discovered self-heating lithium ion battery has shown rapid self-heating from subzero temperatures and superior power thereafter, delivering a practical solution to poor battery performance at low temperatures. Here, we describe and validate an electrochemical-thermal coupled model developed specifically for computational design and improvement of the self-heating Li-ion battery (SHLB) where nickel foils are embedded in its structure. Predicting internal cell characteristics, such as current, temperature and Li-concentration distributions, the model is used to discover key design factors affecting the time and energy needed for self-heating and to explore advanced cell designs with the highest self-heating efficiency. It is found that ohmic heat generated in the nickel foil accounts for the majority of internal heat generation, resulting in a large internal temperature gradient from the nickel foil toward the outer cell surface. The large through-plane temperature gradient leads to highly non-uniform current distribution, and more importantly, is found to be the decisive factor affecting the heating time and energy consumption. A multi-sheet cell design is thus proposed and demonstrated to substantially minimize the temperature gradient, achieving 30% more rapid self-heating with 27% less energy consumption than those reported in the literature.
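The finding that ohmic heat in the nickel foil dominates internal heat generation invites a back-of-the-envelope lumped estimate of the heating time. A hedged sketch that ignores heat losses, electrochemical heat, and the internal gradients the coupled model resolves; all numbers are illustrative, not the paper's:

```python
def heating_time(mass_kg, cp_J_per_kgK, delta_T_K, current_A, foil_ohm):
    """Lumped estimate: time (s) to raise a cell mass by delta_T
    using I^2 * R ohmic heating in the embedded nickel foil."""
    power_W = current_A**2 * foil_ohm      # ohmic heat generated in the foil
    return mass_kg * cp_J_per_kgK * delta_T_K / power_W

# e.g. a 0.5 kg cell, cp ~ 1000 J/(kg K), warming from -20 C to 0 C
t = heating_time(0.5, 1000.0, 20.0, 50.0, 0.1)   # illustrative values -> 40 s
```

The full model is needed precisely because this lumped picture misses the through-plane temperature gradient that the paper identifies as the decisive design factor.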
Mexican Space Weather Service (SCiESMEX)
NASA Astrophysics Data System (ADS)
Gonzalez-Esparza, J. A.; De la Luz, V.; Corona-Romero, P.; Mejia-Ambriz, J. C.; Gonzalez, L. X.; Sergeeva, M. A.; Romero-Hernandez, E.; Aguilar-Rodriguez, E.
2017-01-01
Legislative modifications of the General Civil Protection Law in Mexico in 2014 included specific references to space hazards and space weather phenomena. The legislation is consistent with United Nations promotion of international engagement and cooperation on space weather awareness, studies, and monitoring. These internal and external conditions motivated the creation of a space weather service in Mexico. The Mexican Space Weather Service (SCiESMEX in Spanish) (www.sciesmex.unam.mx) was initiated in October 2014 and is operated by the Institute of Geophysics at the Universidad Nacional Autonoma de Mexico (UNAM). SCiESMEX became a Regional Warning Center of the International Space Environment Services (ISES) in June 2015. We present the characteristics of the service, some products, and the initial actions for developing a space weather strategy in Mexico. The service operates a computing infrastructure including a web application, data repository, and a high-performance computing server to run numerical models. SCiESMEX uses data of the ground-based instrumental network of the National Space Weather Laboratory (LANCE), covering solar radio burst emissions, solar wind and interplanetary disturbances (by interplanetary scintillation observations), geomagnetic measurements, and analysis of the total electron content (TEC) of the ionosphere (by employing data from local networks of GPS receiver stations).
Analysis of the Harrier forebody/inlet design using computational techniques
NASA Technical Reports Server (NTRS)
Chow, Chuen-Yen
1993-01-01
Under the support of this Cooperative Agreement, computations of transonic flow past the complex forebody/inlet configuration of the AV-8B Harrier II have been performed. The actual aircraft configuration was measured and its surface and surrounding domain were defined using computational structured grids. The thin-layer Navier-Stokes equations were used to model the flow along with the Chimera embedded multi-grid technique. A fully conservative, alternating direction implicit (ADI), approximately-factored, partially flux-split algorithm was employed to perform the computation. An existing code was altered to conform with the needs of the study, and some special engine face boundary conditions were developed. The algorithm incorporated the Chimera technique and an algebraic turbulence model in order to deal with the embedded multi-grids and viscous governing equations. Comparison with experimental data has yielded good agreement for the simplifications incorporated into the analysis. The aim of the present research was to provide a methodology for the numerical solution of complex, combined external/internal flows. This is the first time-dependent Navier-Stokes solution for a geometry in which the fuselage and inlet share a wall. The results indicate the methodology used here is a viable tool for transonic aircraft modeling.
Shumaker, L; Fetterolf, D E; Suhrie, J
1998-01-01
The recent availability of inexpensive document scanners and optical character recognition technology has created the ability to process surveys in large numbers with a minimum of operator time. Programs that allow computer entry of such scanned questionnaire results directly into PC-based relational databases have further made it possible to quickly collect and analyze significant amounts of information. We have created an internal capability to easily generate survey data and conduct surveillance across a number of medical practice sites within a managed care/practice management organization. Patient satisfaction surveys, referring physician surveys, and a variety of other evidence-gathering tools have been deployed.
Satellite triangulation in Europe from WEST and ISAGEX data. [computer programs
NASA Technical Reports Server (NTRS)
Leick, A.; Arur, M.
1975-01-01
Observational data acquired during the West European Satellite Triangulation (WEST) program and the International Satellite Geodesy Experiment (ISAGEX) campaign were used to perform a geometric solution to improve the present values of the coordinates of the European stations in the OSU WN14 solution, to add some new stations, and to assess the quality of the WN14 solution with the help of the additional data available. The status of the data as received, the preprocessing required, and the preliminary tests carried out for the initial screening of the data are described. The adjustment computations carried out and the results of the adjustments are discussed.
A New Evaluation Method of Stored Heat Effect of Reinforced Concrete Wall of Cold Storage
NASA Astrophysics Data System (ADS)
Nomura, Tomohiro; Murakami, Yuji; Uchikawa, Motoyuki
Today it has become imperative to save energy by intermittently operating the refrigerator of a cold storage built with externally insulated reinforced concrete walls. The theme of the paper is to obtain an evaluation method capable of numerically calculating the interval time for which the refrigerator can be stopped, using the reinforced concrete wall as a heat storage medium. Experiments with concrete models were performed in order to examine the time variation of internal temperature after the refrigerator stopped. In addition, a simulation method using three-dimensional unsteady FEM on a personal computer was introduced for easily analyzing the internal temperature variation. Using this method, it is possible to obtain the time variation of internal temperature and to calculate the interval time for stopping the refrigerator.
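The transient warming of the wall after the refrigerator stops can be caricatured in one dimension. A minimal explicit finite-difference sketch of conduction through a wall slab; the grid spacing, diffusivity, time step, and boundary temperatures are illustrative assumptions, not values from the paper (which uses a 3D FEM):

```python
def heat_step(T, alpha, dx, dt, t_left, t_right):
    """One explicit finite-difference step of 1D transient heat conduction."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "explicit scheme stability limit violated"
    new = T[:]
    for i in range(1, len(T) - 1):
        new[i] = T[i] + r * (T[i - 1] - 2.0 * T[i] + T[i + 1])
    new[0], new[-1] = t_left, t_right       # fixed boundary temperatures
    return new

# wall initially at the cold-room temperature, outer face at ambient
T = [-20.0] * 11
for _ in range(2000):
    T = heat_step(T, alpha=1e-6, dx=0.02, dt=100.0, t_left=-20.0, t_right=30.0)
```

Stepping until the interior node nearest the cold room exceeds a threshold temperature would give the kind of stop-interval estimate the paper computes with its 3D model.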
An efficient and accurate 3D displacements tracking strategy for digital volume correlation
NASA Astrophysics Data System (ADS)
Pan, Bing; Wang, Bo; Wu, Dafang; Lubineau, Gilles
2014-07-01
Owing to its inherent computational complexity, practical implementation of digital volume correlation (DVC) for internal displacement and strain mapping faces important challenges in improving its computational efficiency. In this work, an efficient and accurate 3D displacement tracking strategy is proposed for fast DVC calculation. The efficiency advantage is achieved through three improvements. First, to eliminate the need to update the Hessian matrix in each iteration, an efficient 3D inverse compositional Gauss-Newton (3D IC-GN) algorithm is introduced to replace existing forward additive algorithms for accurate sub-voxel displacement registration. Second, to ensure that the 3D IC-GN algorithm converges accurately and rapidly and to avoid time-consuming integer-voxel displacement searching, a generalized reliability-guided displacement tracking strategy is designed to transfer an accurate and complete initial guess of deformation to each calculation point from its computed neighbors. Third, to avoid repeated computation of sub-voxel intensity interpolation coefficients, an interpolation coefficient lookup table is established for tricubic interpolation. The computational complexity of the proposed fast DVC and of existing typical DVC algorithms is first analyzed quantitatively in terms of the necessary arithmetic operations. Then, numerical tests are performed to verify the performance of the fast DVC algorithm in terms of measurement accuracy and computational efficiency. The experimental results indicate that, compared with the existing DVC algorithm, the presented fast DVC algorithm produces similar precision and slightly higher accuracy at a substantially reduced computational cost.
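The third improvement, precomputing interpolation coefficients once and reusing them, can be illustrated in one dimension. A simplified sketch using a Catmull-Rom cubic kernel (chosen here purely for illustration; the paper's scheme is tricubic interpolation over volume images):

```python
def cubic_coeffs(f0, f1, f2, f3):
    """Catmull-Rom cubic coefficients for the interval between f1 and f2."""
    a = -0.5 * f0 + 1.5 * f1 - 1.5 * f2 + 0.5 * f3
    b = f0 - 2.5 * f1 + 2.0 * f2 - 0.5 * f3
    c = -0.5 * f0 + 0.5 * f2
    d = f1
    return a, b, c, d

def build_lut(samples):
    """Precompute per-interval coefficients once -- the lookup-table idea."""
    return [cubic_coeffs(samples[i - 1], samples[i], samples[i + 1], samples[i + 2])
            for i in range(1, len(samples) - 2)]

def interp(lut, x):
    """Evaluate at x (measured from samples[1]) by reusing cached coefficients."""
    i = int(x)
    t = x - i
    a, b, c, d = lut[i]
    return ((a * t + b) * t + c) * t + d

samples = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
lut = build_lut(samples)
```

Each iterative sub-voxel query then costs only a polynomial evaluation instead of re-deriving the coefficients, which is where the speedup comes from.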
Finite-element reentry heat-transfer analysis of space shuttle Orbiter
NASA Technical Reports Server (NTRS)
Ko, William L.; Quinn, Robert D.; Gong, Leslie
1986-01-01
A structural performance and resizing (SPAR) finite-element thermal analysis computer program was used in the heat-transfer analysis of the space shuttle orbiter subjected to reentry aerodynamic heating. Three wing cross sections and one midfuselage cross section were selected for the thermal analysis. The predicted thermal protection system temperatures were found to agree well with flight-measured temperatures. The calculated aluminum structural temperatures also agreed reasonably well with the flight data from reentry to touchdown. The effects of internal radiation and of internal convection were found to be significant. The SPAR finite-element solutions agreed reasonably well with those obtained from the conventional finite-difference method.
NASA Technical Reports Server (NTRS)
Krishen, Kumar (Editor); Burnham, Calvin (Editor)
1995-01-01
The papers presented at the 4th International Conference Exhibition: World Congress on Superconductivity, held at the Marriott Orlando World Center, Orlando, Florida, are contained in this document and encompass the research, technology, applications, funding, political, and social aspects of superconductivity. Specifically, the areas covered included: high-temperature materials; thin films; C-60 based superconductors; persistent magnetic fields and shielding; fabrication methodology; space applications; physical applications; performance characterization; device applications; weak link effects and flux motion; accelerator technology; superconducting energy storage; future research and development directions; medical applications; granular superconductors; wire fabrication technology; computer applications; technical and commercial challenges; and power and energy applications.
NASA Technical Reports Server (NTRS)
Watts, Michael E.
1991-01-01
The Acoustic Laboratory Data Acquisition System (ALDAS) is an inexpensive, transportable means to digitize and analyze data. The system is based on the Macintosh II family of computers, with internal analog-to-digital boards providing four channels of simultaneous data acquisition at rates up to 50,000 samples/sec. The ALDAS software package, written for use with rotorcraft acoustics, performs automatic acoustic calibration of channels, data display, two types of cycle averaging, and spectral amplitude analysis. The program can use data obtained from internal analog-to-digital conversion, or discrete external data imported in ASCII format. All aspects of ALDAS can be improved as new hardware becomes available and new features are introduced into the code.
NASA Technical Reports Server (NTRS)
Krishen, Kumar (Editor); Burnham, Calvin (Editor)
1995-01-01
This document contains papers presented at the 4th International Conference Exhibition: World Congress on Superconductivity, held June 27-July 1, 1994 in Orlando, Florida. These documents encompass research, technology, applications, funding, political, and social aspects of superconductivity. The areas covered included: high-temperature materials; thin films; C-60 based superconductors; persistent magnetic fields and shielding; fabrication methodology; space applications; physical applications; performance characterization; device applications; weak link effects and flux motion; accelerator technology; superconducting energy storage; future research and development directions; medical applications; granular superconductors; wire fabrication technology; computer applications; technical and commercial challenges; and power and energy applications.
Effect of plasma spraying modes on material properties of internal combustion engine cylinder liners
NASA Astrophysics Data System (ADS)
Timokhova, O. M.; Burmistrova, O. N.; Sirina, E. A.; Timokhov, R. S.
2018-03-01
The paper analyses different methods of remanufacturing worn-out machine parts in order to obtain the best performance characteristics. One of the most promising of them is the plasma spraying method. The mathematical models presented in the paper are intended to predict the results of plasma spraying and its effect on the properties of the material of internal combustion engine cylinder liners under repair. The experimental data and research results have been computer-processed with the Statistica 10.0 software package. The pair correlation coefficient values (R) and the F-statistic criterion are given to confirm the statistical properties and adequacy of the obtained regression equations.
Internal fluid mechanics research on supercomputers for aerospace propulsion systems
NASA Technical Reports Server (NTRS)
Miller, Brent A.; Anderson, Bernhard H.; Szuch, John R.
1988-01-01
The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid mechanics (ICFM) to a state of practical application for aerospace propulsion systems. The strategies used to achieve this goal are to: (1) pursue an understanding of flow physics, surface heat transfer, and combustion via analysis and fundamental experiments, (2) incorporate improved understanding of these phenomena into verified 3-D CFD codes, and (3) utilize state-of-the-art computational technology to enhance experimental and CFD research. Presented is an overview of the ICFM program in high-speed propulsion, including work in inlets, turbomachinery, and chemical reacting flows. Ongoing efforts to integrate new computer technologies, such as parallel computing and artificial intelligence, into high-speed aeropropulsion research are described.
SOMAR-LES: A framework for multi-scale modeling of turbulent stratified oceanic flows
NASA Astrophysics Data System (ADS)
Chalamalla, Vamsi K.; Santilli, Edward; Scotti, Alberto; Jalali, Masoud; Sarkar, Sutanu
2017-12-01
A new multi-scale modeling technique, SOMAR-LES, is presented in this paper. Localized grid refinement gives SOMAR (the Stratified Ocean Model with Adaptive Resolution) access to small scales of the flow which are normally inaccessible to general circulation models (GCMs). SOMAR-LES drives a LES (Large Eddy Simulation) on SOMAR's finest grids, forced with large scale forcing from the coarser grids. Three-dimensional simulations of internal tide generation, propagation and scattering are performed to demonstrate this multi-scale modeling technique. In the case of internal tide generation at a two-dimensional bathymetry, SOMAR-LES is able to balance the baroclinic energy budget and accurately model turbulence losses at only 10% of the computational cost required by a non-adaptive solver running at SOMAR-LES's fine grid resolution. This relative cost is significantly reduced in situations with intermittent turbulence or where the location of the turbulence is not known a priori, because SOMAR-LES does not require persistent, global, high resolution. To illustrate this point, we consider a three-dimensional bathymetry with grids adaptively refined along the tidally generated internal waves to capture remote mixing in regions of wave focusing. The computational cost in this case is found to be nearly 25 times smaller than that of a non-adaptive solver at comparable resolution. In the final test case, we consider the scattering of a mode-1 internal wave at an isolated two-dimensional and three-dimensional topography, and we compare the results with the numerical experiments of Legg (2014). We find good agreement with theoretical estimates. SOMAR-LES is less dissipative than the closure scheme employed by Legg (2014) near the bathymetry. Depending on the flow configuration and resolution employed, a reduction of more than an order of magnitude in computational cost is expected relative to traditional solvers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2008-07-15
The meeting papers discuss research and test reactor fuel performance, manufacturing and testing. Some of the main topics are: conversion from HEU to LEU in different reactors and corresponding problems and activities; flux performance and core lifetime analysis with HEU and LEU fuels; physics and safety characteristics; measurement of gamma field parameters in a core with LEU fuel; nondestructive analysis of RERTR fuel; thermal hydraulic analysis; fuel interactions; transient analyses and thermal hydraulics for HEU and LEU cores; microstructure of research reactor fuels; post-irradiation analysis and performance; computer codes; and other related problems.
26 CFR 1.1348-2 - Computation of the fifty-percent maximum tax on earned income.
Code of Federal Regulations, 2011 CFR
2011-04-01
... earned income. 1.1348-2 Section 1.1348-2 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Other Limitations § 1.1348-2 Computation of the fifty-percent maximum tax on earned income. (a) Computation of tax for taxable years...
International Computer and Information Literacy Study: Assessment Framework
ERIC Educational Resources Information Center
Fraillon, Julian; Schulz, Wolfram; Ainley, John
2013-01-01
The purpose of the International Computer and Information Literacy Study 2013 (ICILS 2013) is to investigate, in a range of countries, the ways in which young people are developing "computer and information literacy" (CIL) to support their capacity to participate in the digital age. To achieve this aim, the study will assess student…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-09
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2009-0066] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Internal Revenue Service (IRS))--Match 1305 AGENCY: Social Security... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-12
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2010-0015] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Internal Revenue Service (IRS))--Match Number 1016 AGENCY: Social Security... regarding protections for such persons. The Privacy Act, as amended, regulates the use of computer matching...
Mekhora, Keerin; Jalayondeja, Wattana; Jalayondeja, Chutima; Bhuanantanondh, Petcharatana; Dusadiisariyavong, Asadang; Upiriyasakul, Rujiret; Anuraktam, Khajornyod
2014-07-01
To develop an online, self-report questionnaire on computer work-related exposure (OSCWE) and to determine the internal consistency and the face and content validity of the questionnaire. The online, self-report questionnaire was developed to determine the risk factors related to musculoskeletal disorders in computer users. It comprised five domains: personal, work-related, work environment, physical health and psychosocial factors. The questionnaire's content was validated by an occupational medical doctor and three physical therapy lecturers involved in ergonomics teaching. Twenty-five lay people examined the feasibility of computer administration and the user-friendliness of the language. The item correlation in each domain was analyzed by internal consistency (Cronbach's alpha). The content of the questionnaire was considered congruent with the testing purposes. Eight hundred and thirty-five computer users at the PTT Exploration and Production Public Company Limited completed the online self-report questionnaire. The internal consistency of the five domains was: personal (alpha = 0.58), work-related (alpha = 0.348), work environment (alpha = 0.72), physical health (alpha = 0.68) and psychosocial factors (alpha = 0.93). The findings suggested that the OSCWE had acceptable internal consistency for the work environment and psychosocial domains. The OSCWE is available for use in population-based survey research among computer office workers.
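Cronbach's alpha, the internal-consistency statistic reported above, is straightforward to compute from a respondents-by-items score matrix. A minimal sketch with made-up scores (the study's data are not reproduced here):

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a matrix of rows = respondents, cols = items."""
    k = len(scores[0])                       # number of items
    def var(xs):                             # sample variance, n-1 denominator
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1.0 - sum(item_vars) / total_var)

# four hypothetical respondents answering a two-item domain
alpha = cronbach_alpha([[1, 2], [2, 1], [3, 3], [4, 4]])
```

Values near 1 indicate that items in a domain vary together, which is why the psychosocial domain (alpha = 0.93) is judged consistent and the work-related domain (alpha = 0.348) is not.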
ERIC Educational Resources Information Center
Sahin, Alpaslan; Adiguzel, Tufan
2014-01-01
The purpose of this study is to investigate how international teachers, who were from overseas but taught in the United States, rate effective teacher qualities in three domains; personal, professional, and classroom management skills. The study includes 130 international mathematics, science, and computer teachers who taught in a multi-school…
26 CFR 1.527-4 - Special rules for computation of political organization taxable income.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 7 2010-04-01 2010-04-01 true Special rules for computation of political organization taxable income. 1.527-4 Section 1.527-4 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Farmers' Cooperatives § 1.527...
International Symposium on Grids and Clouds (ISGC) 2014
NASA Astrophysics Data System (ADS)
The International Symposium on Grids and Clouds (ISGC) 2014 will be held at Academia Sinica in Taipei, Taiwan from 23-28 March 2014, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). “Bringing the data scientist to global e-Infrastructures” is the theme of ISGC 2014. The last decade has seen phenomenal growth in the production of data in all forms by all research communities, producing a deluge of data from which information and knowledge need to be extracted. Key to this success will be the data scientist - educated to use advanced algorithms, applications and infrastructures - collaborating internationally to tackle society’s challenges. ISGC 2014 will bring together researchers working in all aspects of data science from different disciplines around the world to collaborate and educate themselves in the latest achievements and techniques being used to tackle the data deluge. In addition to the regular workshops, technical presentations and plenary keynotes, ISGC this year will focus on how to grow the data science community by considering the educational foundation needed for tomorrow’s data scientist. Topics of discussion include Physics (including HEP) and Engineering Applications, Biomedicine & Life Sciences Applications, Earth & Environmental Sciences & Biodiversity Applications, Humanities & Social Sciences Applications, Virtual Research Environment (including Middleware, tools, services, workflow, ... etc.), Data Management, Big Data, Infrastructure & Operations Management, Infrastructure Clouds and Virtualisation, Interoperability, Business Models & Sustainability, Highly Distributed Computing Systems, and High Performance & Technical Computing (HPTC).
The new agreement of the international RIGA consensus conference on nasal airway function tests.
Vogt, K; Bachmann-Harildstad, G; Lintermann, A; Nechyporenko, A; Peters, F; Wernecke, K D
2018-01-21
The report reflects an agreement based on the consensus conference of the International Standardization Committee on the Objective Assessment of the Nasal Airway in Riga, 2nd Nov. 2016. The aim of the conference was to address the existing nasal airway function tests and to take into account physical, mathematical and technical correctness as a base of international standardization, as well as the requirements of the Council Directive 93/42/EEC of 14 June 1993 concerning medical devices. Rhinomanometry, acoustic rhinometry, peak nasal inspiratory flow, Odiosoft-Rhino, optical rhinometry, 24-h measurements, computational fluid dynamics, nasometry and the mirror test were evaluated against important diagnostic criteria: the precision of the equipment, including calibration and the software applied; validity, with sensitivity, specificity, and positive and negative predictive values; reliability, with intra-individual and inter-individual reproducibility; and responsiveness in clinical studies. For rhinomanometry, the logarithmic effective resistance was set as the parameter of high diagnostic relevance. In acoustic rhinometry, the area of interest for the minimal cross-sectional area will need further standardization. Peak nasal inspiratory flow is a reproducible and fast test, which showed a wide range of mean values in different studies. The state of the art in computational fluid dynamics for simulation of the airway still depends on high-performance computing hardware and will, after standardization of the simulation software and of the imaging protocols, certainly deliver a better understanding of nasal airflow.
NCI's Transdisciplinary High Performance Scientific Data Platform
NASA Astrophysics Data System (ADS)
Evans, Ben; Antony, Joseph; Bastrakova, Irina; Car, Nicholas; Cox, Simon; Druken, Kelsey; Evans, Bradley; Fraser, Ryan; Ip, Alex; Kemp, Carina; King, Edward; Minchin, Stuart; Larraondo, Pablo; Pugh, Tim; Richards, Clare; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley
2016-04-01
The Australian National Computational Infrastructure (NCI) manages Earth Systems data collections sourced from several domains and organisations onto a single High Performance Data (HPD) Node to further Australia's national priority research and innovation agenda. The NCI HPD Node has rapidly established its value, currently managing over 10 PBytes of datasets from collections that span a wide range of disciplines including climate, weather, environment, geoscience, geophysics, water resources and social sciences. Importantly, in order to facilitate broad user uptake, maximise reuse and enable transdisciplinary access through software and standardised interfaces, the datasets, associated information systems and processes have been incorporated into the design and operation of a unified platform that NCI has called the National Environmental Research Data Interoperability Platform (NERDIP). The key goal of the NERDIP is to regularise data access so that it is easily discoverable, interoperable for different domains and enabled for high performance methods. It adopts and implements international standards and data conventions, and promotes scientific integrity within a high performance computing and data analysis environment. NCI has established a rich and flexible computing environment to access this data, through the NCI supercomputer; a private cloud that supports both domain-focused virtual laboratories and in-common interactive analysis interfaces; as well as remotely through scalable data services. Data collections of this importance must be managed with careful consideration of both their current use and the needs of the end-communities, as well as their future potential use, such as transitioning to more advanced software and improved methods.
It is therefore critical that the data platform is both well-managed and trusted for stable production use (including transparency and reproducibility), agile enough to incorporate new technological advances and additional community practices, and a foundation for new exploratory developments. To that end, NCI is already participating in numerous current and emerging international collaborations, including the Earth System Grid Federation (ESGF); climate and weather data from international agencies such as NASA, NOAA, and the UK Met Office; remotely sensed satellite Earth imaging through collaborations with GEOS and CEOS; the EU-led Ocean Data Interoperability Platform (ODIP) and the Horizon2020 Earth Server2 project; as well as broader data infrastructure community activities such as the Research Data Alliance (RDA). Each research community is heavily engaged in international standards such as ISO, OGC and W3C, adopting community-led conventions for data, supporting improved data organisation such as controlled vocabularies, and creating workflows that use mature APIs and data services. NCI is engaging with these communities on NERDIP to ensure that such standards are applied uniformly and tested in practice by working with the variety of data and technologies. This includes benchmarking exemplar cases from individual communities, documenting their use of standards, and evaluating their practical use of the different technologies. Such a process fully establishes the functionality and performance, and is needed to transition safely when improvements or rationalisation are required. Work is now underway to extend the NERDIP platform for better utilisation in the subsurface geophysical community, including maximising national uptake, as well as better integration with international science platforms.
Advanced Modeling in Excel: from Water Jets to Big Bang
NASA Astrophysics Data System (ADS)
Ignatova, Olga; Chyzhyk, D.; Willis, C.; Kazachkov, A.
2006-12-01
An international student project is presented, focused on the application of OpenOffice and Excel spreadsheets to the modeling of projectile-motion-type dynamical systems. Variation of the parameters of plotted and animated families of jets flowing at different angles out of holes in the wall of a water-filled reservoir [1,2] revealed unexpected peculiarities of the envelopes, vertices, intersections and landing points of the virtual trajectories. Comparison with real-life systems and rigorous calculations were performed to prove the predictions of the computer experiments. By the same technique, the kinematics of fireworks was analyzed. On this basis a two-dimensional ‘firework’ computer model of the Big Bang was designed and studied, and its relevance and limitations checked. 1. R. Ehrlich, Turning the World Inside Out (Princeton University Press, Princeton, NJ, 1990), pp. 98-100. 2. A. Kazachkov, Yu. Bogdan, N. Makarovsky, N. Nedbailo, A Bucketful of Physics, in R. Pinto, S. Surinach (eds.), International Conference Physics Teacher Education Beyond 2000: Selected Contributions (Elsevier Editions, Paris, 2001), pp. 563-564. Sponsored by Courtney Willis.
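The envelope of such a family of trajectories has a classical closed form, the "parabola of safety" y = v²/(2g) − gx²/(2v²). A minimal sketch that checks it numerically against a scan over launch angles; the speed, evaluation point, and angle grid are arbitrary illustrative choices:

```python
import math

G = 9.81  # m/s^2

def height_on_trajectory(v, theta, x):
    """Height y(x) of a projectile launched at speed v and angle theta (rad)."""
    return x * math.tan(theta) - G * x * x / (2.0 * v * v * math.cos(theta) ** 2)

def envelope(v, x):
    """Parabola of safety: the upper envelope over all launch angles."""
    return v * v / (2.0 * G) - G * x * x / (2.0 * v * v)

v, x = 10.0, 4.0                               # illustrative speed and range point
best = max(height_on_trajectory(v, math.radians(a / 10.0), x)
           for a in range(1, 900))             # scan angles 0.1 .. 89.9 degrees
# best approaches envelope(v, x) from below as the angle grid is refined
```

This is the same check the spreadsheet models perform graphically: no plotted trajectory ever crosses the envelope, and the best one touches it tangentially.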
ISS Radiation Shielding and Acoustic Simulation Using an Immersive Environment
NASA Technical Reports Server (NTRS)
Verhage, Joshua E.; Sandridge, Chris A.; Qualls, Garry D.; Rizzi, Stephen A.
2002-01-01
The International Space Station Environment Simulator (ISSES) is a virtual reality application that uses high-performance computing, graphics, and audio rendering to simulate the radiation and acoustic environments of the International Space Station (ISS). This CAVE application allows the user to maneuver to different locations inside or outside of the ISS and interactively compute and display the radiation dose at a point. The directional dose data is displayed as a color-mapped sphere that indicates the relative levels of radiation from all directions about the center of the sphere. The noise environment is rendered in real time over headphones or speakers and includes non-spatial background noise, such as air-handling equipment, and spatial sounds associated with specific equipment racks, such as compressors or fans. Changes can be made to equipment rack locations that produce changes in both the radiation shielding and system noise. The ISSES application allows for interactive investigation and collaborative trade studies between radiation shielding and noise for crew safety and comfort.
Probabilistic analysis of a materially nonlinear structure
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.
1990-01-01
A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure, with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
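The probabilistic setup can be illustrated with a plain Monte Carlo sketch. The code below is not NESSUS or the AMV procedure: it uses the closed-form elastic Lamé radial stress (ignoring the plasticity discussed above) with a normally distributed internal pressure, and builds the empirical CDF of the stress at a fixed radius. The geometry and distribution parameters are made up for illustration.

```python
import bisect
import random

def radial_stress(p, a, b, r):
    """Elastic Lame radial stress at radius r in a thick-walled cylinder
    (inner radius a, outer radius b) under internal pressure p."""
    return p * a**2 / (b**2 - a**2) * (1.0 - b**2 / r**2)

random.seed(0)
a, b, r = 1.0, 2.0, 1.5                   # made-up geometry
samples = sorted(radial_stress(random.gauss(100.0, 10.0), a, b, r)
                 for _ in range(20000))   # pressure ~ Normal(100, 10)

def cdf(x):
    """Empirical CDF of the radial stress from the sorted samples."""
    return bisect.bisect_right(samples, x) / len(samples)

print(cdf(radial_stress(100.0, a, b, r)))  # ~0.5 at the mean pressure
```

A production analysis replaces this brute-force sampling with the AMV procedure precisely because each nonlinear finite element evaluation is expensive; the empirical CDF is the quantity both approaches estimate.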
Computational Science in Armenia (Invited Talk)
NASA Astrophysics Data System (ADS)
Marandjian, H.; Shoukourian, Yu.
This survey is devoted to the development of informatics and computer science in Armenia. The results in theoretical computer science (algebraic models, solutions to systems of general-form recursive equations, methods of coding theory, pattern recognition and image processing) constitute the theoretical basis for developing problem-solving-oriented environments. Examples include a synthesizer of optimized distributed recursive programs, software tools for cluster-oriented implementations of two-dimensional cellular automata, and a grid-aware web interface with advanced service trading for linear algebra calculations. In the direction of solving scientific problems that require high-performance computing resources, completed projects include examples in physics (parallel computing of complex quantum systems), astrophysics (the Armenian virtual laboratory), biology (a molecular dynamics study of the human red blood cell membrane), and meteorology (implementing and evaluating the Weather Research and Forecasting model for the territory of Armenia). The overview also notes that the Institute for Informatics and Automation Problems of the National Academy of Sciences of Armenia has established a scientific and educational infrastructure uniting the computing clusters of scientific and educational institutions of the country, and provides the scientific community with access to local and international computational resources, which strongly supports computational science in Armenia.
SSME Investment in Turbomachinery Inducer Impeller Design Tools and Methodology
NASA Technical Reports Server (NTRS)
Zoladz, Thomas; Mitchell, William; Lunde, Kevin
2010-01-01
Within the rocket engine industry, SSME turbomachines are the de facto standards of success with regard to meeting aggressive performance requirements under challenging operational environments. Over the Shuttle era, SSME has invested heavily in the national inducer-impeller design infrastructure. While both low- and high-pressure turbopump failure/anomaly resolution efforts spurred some of these investments, the SSME program was a major beneficiary of key areas of turbomachinery inducer-impeller research outside of flight manifest pressures. Over the past several decades, key turbopump internal environments have been interrogated via highly instrumented hot-fire and cold-flow testing. Likewise, SSME has sponsored the advancement of time-accurate and cavitating inducer-impeller computational fluid dynamics (CFD) tools. Together, these investments have led to a better understanding of the complex internal flow fields within aggressive, high-performing inducers and impellers. New design tools and methodologies have evolved that are intended to provide confident blade designs striking an appropriate balance between performance and self-induced load management.
Users manual for updated computer code for axial-flow compressor conceptual design
NASA Technical Reports Server (NTRS)
Glassman, Arthur J.
1992-01-01
An existing computer code that determines the flow path for an axial-flow compressor either for a given number of stages or for a given overall pressure ratio was modified for use in air-breathing engine conceptual design studies. This code uses a rapid approximate design methodology that is based on isentropic simple radial equilibrium. Calculations are performed at constant-span-fraction locations from tip to hub. Energy addition per stage is controlled by specifying the maximum allowable values for several aerodynamic design parameters. New modeling was introduced to the code to overcome perceived limitations. Specific changes included variable rather than constant tip radius, flow path inclination added to the continuity equation, input of mass flow rate directly rather than indirectly as inlet axial velocity, solution for the exact value of overall pressure ratio rather than for any value that met or exceeded it, and internal computation of efficiency rather than the use of input values. The modified code was shown to be capable of computing efficiencies that are compatible with those of five multistage compressors and one fan that were tested experimentally. This report serves as a users manual for the revised code, Compressor Spanline Analysis (CSPAN). The modeling modifications, including two internal loss correlations, are presented. Program input and output are described. A sample case for a multistage compressor is included.
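The kind of stage-by-stage calculation such a spanline code performs can be hinted at with the standard isentropic relation between stage temperature rise, efficiency, and pressure ratio. This is a generic textbook sketch, not CSPAN's methodology; the numbers (ISA inlet temperature, 40 K rise per stage, 88% stage efficiency) are assumptions for illustration.

```python
GAMMA = 1.4  # ratio of specific heats for air

def stage_pressure_ratio(t_in, dt, eta):
    """Stage pressure ratio from inlet temperature t_in [K], stage
    temperature rise dt [K], and isentropic stage efficiency eta."""
    return (1.0 + eta * dt / t_in) ** (GAMMA / (GAMMA - 1.0))

def overall_pressure_ratio(t_in, dt, eta, stages):
    """Chain identical stages, updating the inlet temperature each stage."""
    pr, t = 1.0, t_in
    for _ in range(stages):
        pr *= stage_pressure_ratio(t, dt, eta)
        t += dt
    return pr

print(round(overall_pressure_ratio(288.15, 40.0, 0.88, 5), 2))
```

Because the inlet temperature rises through the machine, each successive stage delivers a smaller pressure ratio for the same work input, which is why solving for an exact overall pressure ratio (as the revised code does) requires iterating on the energy addition per stage.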
A computational neural model of goal-directed utterance selection.
Klein, Michael; Kamp, Hans; Palm, Guenther; Doya, Kenji
2010-06-01
It is generally agreed that much of human communication is motivated by extra-linguistic goals: we often make utterances in order to get others to do something, or to make them support our cause, or adopt our point of view, etc. However, thus far a computational foundation for this view on language use has been lacking. In this paper we propose such a foundation using Markov Decision Processes. We borrow computational components from the field of action selection and motor control, where a neurobiological basis of these components has been established. In particular, we make use of internal models (i.e., next-state transition functions defined on current state action pairs). The internal model is coupled with reinforcement learning of a value function that is used to assess the desirability of any state that utterances (as well as certain non-verbal actions) can bring about. This cognitive architecture is tested in a number of multi-agent game simulations. In these computational experiments an agent learns to predict the context-dependent effects of utterances by interacting with other agents that are already competent speakers. We show that the cognitive architecture can account for acquiring the capability of deciding when to speak in order to achieve a certain goal (instead of performing a non-verbal action or simply doing nothing), whom to address and what to say. Copyright 2010 Elsevier Ltd. All rights reserved.
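The core idea, that utterances are just actions evaluated through an internal transition model plus a value function, can be sketched with a toy Markov Decision Process. Everything below (states, actions, transition probabilities, rewards) is invented for illustration and solved by plain value iteration rather than the paper's reinforcement-learning architecture.

```python
# Toy MDP sketch (hypothetical states/actions, not the paper's model):
# an agent chooses between a verbal request, a non-verbal action, or
# doing nothing, and learns state values by value iteration.
STATES = ["far_from_goal", "helper_alerted", "goal_reached"]
ACTIONS = ["ask_for_help", "act_alone", "do_nothing"]

# TRANS[s][a] = list of (next_state, probability): the internal model
TRANS = {
    "far_from_goal": {
        "ask_for_help": [("helper_alerted", 0.9), ("far_from_goal", 0.1)],
        "act_alone":    [("goal_reached", 0.3), ("far_from_goal", 0.7)],
        "do_nothing":   [("far_from_goal", 1.0)],
    },
    "helper_alerted": {
        "ask_for_help": [("helper_alerted", 1.0)],
        "act_alone":    [("goal_reached", 0.8), ("helper_alerted", 0.2)],
        "do_nothing":   [("helper_alerted", 1.0)],
    },
    "goal_reached": {a: [("goal_reached", 1.0)] for a in ACTIONS},
}
REWARD = {"goal_reached": 1.0}  # reward on entering the goal state
GAMMA = 0.9

V = {s: 0.0 for s in STATES}
for _ in range(100):  # value iteration to convergence
    V = {s: max(sum(p * (REWARD.get(s2, 0.0) + GAMMA * V[s2])
                    for s2, p in TRANS[s][a]) for a in ACTIONS)
         for s in STATES}

policy = {s: max(ACTIONS, key=lambda a: sum(
    p * (REWARD.get(s2, 0.0) + GAMMA * V[s2]) for s2, p in TRANS[s][a]))
    for s in STATES}
print(policy["far_from_goal"])
```

With these numbers the learned policy prefers the verbal action in the initial state: alerting a helper first makes the later non-verbal action much more likely to succeed, which is the "when to speak" decision the architecture is meant to capture.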
Monitoring system and methods for a distributed and recoverable digital control system
NASA Technical Reports Server (NTRS)
Stange, Kent (Inventor); Hess, Richard (Inventor); Kelley, Gerald B (Inventor); Rogers, Randy (Inventor)
2010-01-01
A monitoring system and methods are provided for a distributed and recoverable digital control system. The monitoring system generally comprises two independent monitoring planes within the control system. The first monitoring plane is internal to the computing units in the control system, and the second monitoring plane is external to the computing units. The internal first monitoring plane includes two in-line monitors. The first internal monitor is a self-checking, lock-step-processing monitor with integrated rapid recovery capability. The second internal monitor includes one or more reasonableness monitors, which compare actual effector position with commanded effector position. The external second monitoring plane includes two monitors: the first is a pre-recovery computing monitor, and the second is a post-recovery computing monitor. Various methods for implementing the monitoring functions are also disclosed.
Lessons Learned Using COTS Electronics for the International Space Station Radiation Environment
NASA Technical Reports Server (NTRS)
Blumer, John H.; Roth, A. (Technical Monitor)
2001-01-01
The mantra of 'Faster, Better, Cheaper' has to a large degree been interpreted as using Commercial Off-the-Shelf (COTS) components and/or circuit boards. One of the first space applications to actually use COTS in space along with radiation performance requirements was the Expedite the Processing of Experiments to Space Station (EXPRESS) Rack program for the International Space Station (ISS). In order to meet the performance, cost and schedule targets, military-grade Versa Module Eurocard (VME) was selected as the baseline design for the main computer, the Rack Interface Controller (RIC). VME was chosen as the computer backplane because of the large variety of military-grade boards available, which were designed to meet the military environmental specifications (thermal, shock, vibration, etc.). These boards also have a paper pedigree with regard to components. Since these boards exceeded most ISS environmental requirements, it was reasoned that using COTS mil-grade VME boards, as opposed to designing custom boards, could save significant time and money. It was recognized up front that the radiation environment of the ISS, while benign compared to many space flight applications, would be the main challenge to using COTS. Thus, in addition to selecting vendors on how well their boards met the usual performance and environmental specifications, the boards' parts lists were reviewed for how well they would perform in the ISS radiation environment. However, issues with verifying that the available radiation test data were applicable to the actual parts used, vendor part design changes, and the fact that most parts did not have valid test data soon complicated board and part selection with regard to radiation.
Low-complexity nonlinear adaptive filter based on a pipelined bilinear recurrent neural network.
Zhao, Haiquan; Zeng, Xiangping; He, Zhengyou
2011-09-01
To reduce the computational complexity of the bilinear recurrent neural network (BLRNN), a novel low-complexity nonlinear adaptive filter with a pipelined bilinear recurrent neural network (PBLRNN) is presented in this paper. The PBLRNN, inheriting the modular architecture of the pipelined RNN proposed by Haykin and Li, comprises a number of BLRNN modules that are cascaded in a chained form. Each module is implemented by a small-scale BLRNN with internal dynamics. Since the modules of the PBLRNN can be executed simultaneously in pipelined parallelism, a significant improvement in computational efficiency results. Moreover, due to the nesting of modules, the performance of the PBLRNN can be further improved. To suit the modular architecture, a modified adaptive-amplitude real-time recurrent learning algorithm is derived based on the gradient descent approach. Extensive simulations are carried out to evaluate the performance of the PBLRNN on nonlinear system identification, nonlinear channel equalization, and chaotic time series prediction. Experimental results show that the PBLRNN provides considerably better performance than the single BLRNN and RNN models.
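A bilinear recurrent unit differs from a purely linear one by a multiplicative feedback term. The minimal single-input, single-neuron recurrence below is a simplification for illustration (a real BLRNN module carries vectors of such terms, and the coefficients here are arbitrary); it shows the form y[n] = a·y[n−1] + b·x[n] + c·x[n]·y[n−1].

```python
def bilinear_recurrent(x, a=0.5, b=1.0, c=0.2):
    """Single-neuron bilinear recurrence: the previous output feeds back
    both linearly (a*y) and multiplicatively with the input (c*x*y)."""
    y, out = 0.0, []
    for xn in x:
        y = a * y + b * xn + c * xn * y
        out.append(y)
    return out

print(bilinear_recurrent([1.0, 1.0, 0.0]))  # [1.0, 1.7, 0.85]
```

The pipelined variant cascades many such small modules and updates them concurrently, which is where the computational saving over one large BLRNN comes from.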
NASA Astrophysics Data System (ADS)
Gutzwiller, David; Gontier, Mathieu; Demeulenaere, Alain
2014-11-01
Multi-block structured solvers hold many advantages over their unstructured counterparts, such as a smaller memory footprint and efficient serial performance. Historically, multi-block structured solvers have not been easily adapted for use in a High Performance Computing (HPC) environment, and the recent trend towards hybrid GPU/CPU architectures has further complicated the situation. This paper will elaborate on developments and innovations applied to the NUMECA FINE/Turbo solver that have allowed near-linear scalability with real-world problems on over 250 hybrid GPU/CPU cluster nodes. Discussion will focus on the implementation of virtual partitioning and load balancing algorithms using a novel meta-block concept. This implementation is transparent to the user, allowing all pre- and post-processing steps to be performed using a simple, unpartitioned grid topology. Additional discussion will elaborate on developments that have improved parallel performance, including fully parallel I/O with the ADIOS API and the GPU porting of the computationally heavy CPUBooster convergence acceleration module. Head of HPC and Release Management, Numeca International.
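Although the meta-block algorithm itself is not spelled out here, the load-balancing problem it addresses, distributing blocks of unequal cell counts over many nodes, can be sketched with the classic greedy longest-processing-time heuristic. This is a generic illustration with made-up block sizes, not the FINE/Turbo implementation.

```python
import heapq

def balance_blocks(block_sizes, n_nodes):
    """Greedy longest-processing-time assignment of (meta-)blocks to
    nodes: sort blocks by cell count, then always hand the next block
    to the currently least-loaded node."""
    heap = [(0, node, []) for node in range(n_nodes)]
    heapq.heapify(heap)
    for size in sorted(block_sizes, reverse=True):
        load, node, blocks = heapq.heappop(heap)   # least-loaded node
        heapq.heappush(heap, (load + size, node, blocks + [size]))
    return sorted(heap)   # (load, node, assigned blocks) per node

for load, node, blocks in balance_blocks([60, 40, 30, 30, 20, 20], 2):
    print(node, load, blocks)
```

Virtual partitioning helps precisely when the raw block sizes are too coarse for such a heuristic to balance well: splitting large blocks into meta-blocks gives the scheduler finer pieces to distribute.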
NASA Technical Reports Server (NTRS)
Mcardle, J. G.; Homyak, L.; Moore, A. S.
1979-01-01
The performance of a YF-102 turbofan engine was measured in an outdoor test stand with a bellmouth inlet and seven exhaust-system configurations. The configurations consisted of three separate-flow systems of various fan and core nozzle sizes and four confluent-flow systems of various nozzle sizes and shapes. A computer program provided good estimates of the engine performance and of thrust at maximum rating for each exhaust configuration. The internal performance of two different-shaped core nozzles for confluent-flow configurations was determined to be satisfactory. Pressure and temperature surveys were made with a traversing probe in the exhaust-nozzle flow for some confluent-flow configurations. The survey data at the mixing plane, plus the measured flow rates, were used to calculate the static-pressure variation along the exhaust nozzle length. The computed pressures compared well with experimental wall static-pressure data. External-flow surveys were made, for some confluent-flow configurations, with a large fixed rake at various locations in the exhaust plume.
2009-01-01
University of California, Berkeley. In this session, Dennis Gannon of Indiana University described the use of high-performance computing for storm prediction. Software Development (Session Introduction), Dennis Gannon, Indiana University. Software for Mesoscale Storm Prediction: Using Supercomputers for On... Ho, D. Ierardi, I. Kolossvary, J. Klepeis, T. Layman, C. McLeavey, M. Moraes, R. Mueller, E. Priest, Y. Shan, J. Spengler, M. Theobald, B. Towles
TOP500 Supercomputers for June 2004
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack
2004-06-23
23rd Edition of TOP500 List of World's Fastest Supercomputers Released: Japan's Earth Simulator Enters Third Year in Top Position. MANNHEIM, Germany; KNOXVILLE, Tenn.; and BERKELEY, Calif. In what has become a closely watched event in the world of high-performance computing, the 23rd edition of the TOP500 list of the world's fastest supercomputers was released today (June 23, 2004) at the International Supercomputer Conference in Heidelberg, Germany.
Dynamic and thermal analysis of high speed tapered roller bearings under combined loading
NASA Technical Reports Server (NTRS)
Crecelius, W. J.; Milke, D. R.
1973-01-01
The development of a computer program capable of predicting the thermal and kinetic performance of high-speed tapered roller bearings operating with fluid lubrication under applied axial, radial and moment loading (five degrees of freedom) is detailed. Various methods of applying lubrication can be considered as well as changes in bearing internal geometry which occur as the bearing is brought to operating speeds, loads and temperatures.
A note on AB INITIO semiconductor band structures
NASA Astrophysics Data System (ADS)
Fiorentini, Vincenzo
1992-09-01
We point out that only the internal features of the DFT ab initio theoretical picture of a crystal should be used in a consistent ab initio calculation of the band structure. As a consequence, we show that ground-state band structure calculations should be performed for the system in equilibrium at zero pressure, i.e. at the computed equilibrium cell volume ω_th. Examples of the consequences of this approach are considered.
Acquisition of a High Performance Computing Instrument for Big Data Research and Education
2015-12-03
Security and Privacy, University of Texas at Dallas, TX, September 16-17, 2014. • Chopade, P., Zhan, J., Community Detection in Large Scale Big Data... Security and Privacy in Communication Networks, Beijing, China, September 24-26, 2014. • Pravin Chopade, Kenneth Flurchick, Justin Zhan and Marwan... Balkirat Kaur, Malcolm Blow, and Justin Zhan, Digital Image Authentication in Social Media, The Sixth ASE International Conference on Privacy
What’s Wrong With Automatic Speech Recognition (ASR) and How Can We Fix It?
2013-03-01
Jordan Cohen, International Computer Science Institute, 1947 Center Street, Suite 600, Berkeley, CA 94704. MARCH 2013, Final Report. ...This report was cleared for public release by the 88th Air Base Wing Public Affairs Office and is available to the general public, including foreign... 711th Human Performance Wing, Air Force Research Laboratory. This report is published in the interest of scientific and technical
Righi, E; Carta, M; Bruzzone, A A; Lonardo, P M; Marinaro, E; Pastorino, A
1996-02-01
The authors report the results of an experimental analysis performed on titanium miniplates and screws in order to gain a better understanding of dynamic forces in internal rigid fixation. Ten segments of bovine scapula were prepared. Osteotomies were carried out along the minor axis, following which five were fixed with four-hole straight miniplates and the other five with six-hole double-Y miniplates. Each sample was fastened in a special clamp adapted to a tension test machine, and a shearing force was applied. Force versus time was recorded, and the 50 bone fragments were examined by a pathologist. On the basis of the test results, two simple computer models were developed. No significant difference was evident between the mechanical and computed tests. The most critical sections were located near the hole proximal to the osteotomy, and the microscopic findings confirmed this. On the basis of the experimental results, the authors propose a new plate design in which the area subject to most stress, proximal to the bone section, would be of miniplate thickness, the distal aspect being thinner, as in a microplate. It is suggested that this design would provide sufficient stability and a high degree of anatomical adjustment of the system.
Liu, Yang; Chiaromonte, Francesca; Li, Bing
2017-06-01
In many scientific and engineering fields, advanced experimental and computing technologies are producing data that are not just high dimensional, but also internally structured. For instance, statistical units may have heterogeneous origins from distinct studies or subpopulations, and features may be naturally partitioned based on experimental platforms generating them, or on information available about their roles in a given phenomenon. In a regression analysis, exploiting this known structure in the predictor dimension reduction stage that precedes modeling can be an effective way to integrate diverse data. To pursue this, we propose a novel Sufficient Dimension Reduction (SDR) approach that we call structured Ordinary Least Squares (sOLS). This combines ideas from existing SDR literature to merge reductions performed within groups of samples and/or predictors. In particular, it leads to a version of OLS for grouped predictors that requires far less computation than recently proposed groupwise SDR procedures, and provides an informal yet effective variable selection tool in these settings. We demonstrate the performance of sOLS by simulation and present a first application to genomic data. The R package "sSDR," publicly available on CRAN, includes all procedures necessary to implement the sOLS approach. © 2016, The International Biometric Society.
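The "merge reductions performed within groups of predictors" idea can be hinted at numerically. The sketch below is not the sOLS estimator from the sSDR package; it simply computes one normalized OLS direction per predictor group on simulated data, showing that each group's reduction picks up the predictor that actually drives the response.

```python
import numpy as np

def groupwise_ols_directions(X, y, groups):
    """One normalized OLS direction per predictor group: regress y on
    each group's (centered) columns separately, then collect the
    coefficient vectors."""
    dirs = []
    for cols in groups:
        Xg = X[:, cols] - X[:, cols].mean(axis=0)
        beta, *_ = np.linalg.lstsq(Xg, y - y.mean(), rcond=None)
        dirs.append(beta / np.linalg.norm(beta))
    return dirs

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = 2.0 * X[:, 0] + 0.5 * X[:, 3] + 0.1 * rng.normal(size=500)
d1, d2 = groupwise_ols_directions(X, y, [[0, 1], [2, 3]])
print(d1.round(2), d2.round(2))  # each group's direction finds its active predictor
```

Each group's regression is a small least-squares problem, which is why this style of groupwise reduction costs far less than procedures that must optimize over all predictors jointly.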
Miller, J.J.
1982-01-01
The spectral analysis and filter program package is written in the BASIC language for the HP-9845T desktop computer. The program's main purpose is to perform spectral analyses on digitized time-domain data. In addition, band-pass filtering of the data can be performed in the time domain. Various other processes, such as autocorrelation, can be applied to the time-domain data in order to precondition them for spectral analysis. The frequency-domain data can also be transformed back into the time domain if desired. Any data can be displayed on the CRT in graphic form using a variety of plot routines, and a hard copy can be obtained immediately using the internal thermal printer. Data can also be displayed in tabular form on the CRT or internal thermal printer, or stored permanently on a mass storage device such as a tape or disk. A list of the processes performed, in the order in which they occurred, can be displayed at any time.
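The band-pass step can be sketched in a few lines. The original package filters in the time domain on the HP-9845T in BASIC; the frequency-domain equivalent below (zero the FFT bins outside the pass band and invert; all signal parameters are made up for illustration) conveys the same operation.

```python
import numpy as np

def bandpass(signal, fs, f_lo, f_hi):
    """Zero every FFT bin outside [f_lo, f_hi] Hz and transform back:
    a brute-force frequency-domain band-pass filter."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(spec, n=len(signal))

fs = 1024.0                        # sample rate chosen so tones fall on exact bins
t = np.arange(1024) / fs
x = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 300 * t)
y = bandpass(x, fs, 40.0, 60.0)    # keep only the 50 Hz component
```

With the tones on exact FFT bins there is no spectral leakage, so the filtered output is the 50 Hz sine to floating-point precision.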
Advances in Numerical Boundary Conditions for Computational Aeroacoustics
NASA Technical Reports Server (NTRS)
Tam, Christopher K. W.
1997-01-01
Advances in Computational Aeroacoustics (CAA) depend critically on the availability of accurate, nondispersive, minimally dissipative computation algorithms, as well as high-quality numerical boundary treatments. This paper focuses on recent developments in numerical boundary conditions. In a typical CAA problem, one often encounters two types of boundaries. Because a finite computation domain is used, there are external boundaries, on which boundary conditions simulating the solution outside the computation domain are to be imposed. Inside the computation domain, there may be internal boundaries, on which boundary conditions simulating the presence of an object or surface with specific acoustic characteristics are to be applied. Numerical boundary conditions, both external and internal, developed for simple model problems are reviewed and examined. Numerical boundary conditions for real aeroacoustic problems are also discussed through specific examples. The paper concludes with a description of some much-needed research in numerical boundary conditions for CAA.
Evaluating brain-computer interface performance using color in the P300 checkerboard speller.
Ryan, D B; Townsend, G; Gates, N A; Colwell, K; Sellers, E W
2017-10-01
Current Brain-Computer Interface (BCI) systems typically flash an array of items from grey to white (GW). The objective of this study was to evaluate BCI performance using uniquely colored stimuli. In addition to the GW stimuli, the current study tested two types of color stimuli (grey to color [GC] and color intensification [CI]). The main hypotheses were that, in a checkerboard paradigm, unique color stimuli will: (1) increase BCI performance over the standard GW paradigm; (2) elicit larger event-related potentials (ERPs); and (3) improve offline performance with an electrode selection algorithm (i.e., Jumpwise). Online results (n=36) showed that GC provides higher accuracy and information transfer rate than the CI and GW conditions. Waveform analysis showed that GC produced higher-amplitude ERPs than CI and GW. Information transfer rate was improved by the Jumpwise-selected channel locations in all conditions. Unique color stimuli (GC) improved BCI performance and enhanced ERPs. Jumpwise-selected electrode locations improved offline performance. These results show that in a checkerboard paradigm, unique color stimuli increase BCI performance, are preferred by participants, and are important to the design of end-user applications; thus, they could lead to an increase in end-user performance and acceptance of BCI technology. Copyright © 2017 International Federation of Clinical Neurophysiology. All rights reserved.
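The information transfer rate reported in speller studies like this is conventionally the Wolpaw bits-per-selection formula scaled by selection speed. The sketch below uses that generic formula, not the authors' exact computation; the 36-item grid, 90% accuracy, and 5 selections per minute are assumed numbers for illustration.

```python
import math

def bits_per_selection(n_items, accuracy):
    """Wolpaw information content of one selection from an n-item
    speller at the given selection accuracy p."""
    p = accuracy
    if p <= 0.0 or p >= 1.0:
        return math.log2(n_items) if p == 1.0 else 0.0
    return (math.log2(n_items) + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n_items - 1)))

def itr_bits_per_min(n_items, accuracy, selections_per_min):
    """Information transfer rate in bits/minute."""
    return bits_per_selection(n_items, accuracy) * selections_per_min

print(round(itr_bits_per_min(36, 0.90, 5.0), 1))  # ~20.9 bits/min
```

The formula makes clear why both accuracy and stimulus-presentation speed matter: a stimulus type that raises accuracy without slowing selections (as GC did here) raises the ITR directly.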
Preventing Internal Computer Abuse
1986-12-01
abusers of computer systems are individuals who are "internal" to and working for the victim organization (these include full-time employees, part-... Moonlighting * Organizational Property * Nonuse/nondisclosure * Substance Abuse * Gambling * Employee Assistance Program * "Whistle Blower" Policy * EDP Auditor... sensitive computer systems. Of all the controls discussed so far, the Employee Assistance Program (EAP
The 6th International Conference on Computer Science and Computational Mathematics (ICCSCM 2017)
NASA Astrophysics Data System (ADS)
2017-09-01
The ICCSCM 2017 (The 6th International Conference on Computer Science and Computational Mathematics) aimed to provide a platform to discuss computer science and mathematics related issues, including Algebraic Geometry, Algebraic Topology, Approximation Theory, Calculus of Variations, Category Theory; Homological Algebra, Coding Theory, Combinatorics, Control Theory, Cryptology, Geometry, Difference and Functional Equations, Discrete Mathematics, Dynamical Systems and Ergodic Theory, Field Theory and Polynomials, Fluid Mechanics and Solid Mechanics, Fourier Analysis, Functional Analysis, Functions of a Complex Variable, Fuzzy Mathematics, Game Theory, General Algebraic Systems, Graph Theory, Group Theory and Generalizations, Image Processing, Signal Processing and Tomography, Information Fusion, Integral Equations, Lattices, Algebraic Structures, Linear and Multilinear Algebra; Matrix Theory, Mathematical Biology and Other Natural Sciences, Mathematical Economics and Financial Mathematics, Mathematical Physics, Measure Theory and Integration, Neutrosophic Mathematics, Number Theory, Numerical Analysis, Operations Research, Optimization, Operator Theory, Ordinary and Partial Differential Equations, Potential Theory, Real Functions, Rings and Algebras, Statistical Mechanics, Structure of Matter, Topological Groups, Wavelets and Wavelet Transforms, 3G/4G Network Evolutions, Ad-Hoc, Mobile, Wireless Networks and Mobile Computing, Agent Computing & Multi-Agent Systems, all topics related to Image/Signal Processing, any topics related to Computer Networks, any topics related to ISO SC-27 and SC-17 standards, any topics related to PKI (Public Key Infrastructures), Artificial Intelligence (A.I.) & Pattern/Image Recognition, Authentication/Authorization Issues, Biometric Authentication and Algorithms, CDMA/GSM Communication Protocols, Combinatorics, Graph Theory, and Analysis of Algorithms, Cryptography and Foundations of Computer Security, Database (D.B.)
Management & Information Retrieval, Data Mining, Web Image Mining, & Applications, Defining Spectrum Rights and Open Spectrum Solutions, E-Commerce, Ubiquitous, RFID, Applications, Fingerprint/Hand/Biometrics Recognition and Technologies, Foundations of High-Performance Computing, IC-card Security, OTP, and Key Management Issues, IDS/Firewall, Anti-Spam Mail, Anti-Virus Issues, Mobile Computing for E-Commerce, Network Security Applications, Neural Networks and Biomedical Simulations, Quality of Service and Communication Protocols, Quantum Computing, Coding, and Error Controls, Satellite and Optical Communication Systems, Theory of Parallel Processing and Distributed Computing, Virtual Visions, 3-D Object Retrieval, & Virtual Simulations, Wireless Access Security, etc. The success of ICCSCM 2017 is reflected in the papers received from authors in several countries around the world, which allowed a highly multinational and multicultural exchange of ideas and experience. The accepted papers of ICCSCM 2017 are published in this book. Please check http://www.iccscm.com for further news. A conference such as ICCSCM 2017 can only become successful through a team effort, so herewith we want to thank the International Technical Committee and the Reviewers for their efforts in the review process as well as their valuable advice. We are thankful to all those who contributed to the success of ICCSCM 2017. The Secretary
26 CFR 1.611-4 - Depletion as a factor in computing earnings and profits for dividend purposes.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 26 Internal Revenue 7 2011-04-01 2009-04-01 true Depletion as a factor in computing earnings and profits for dividend purposes. 1.611-4 Section 1.611-4 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Natural Resources § 1...
26 CFR 1.611-4 - Depletion as a factor in computing earnings and profits for dividend purposes.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 7 2010-04-01 2010-04-01 true Depletion as a factor in computing earnings and profits for dividend purposes. 1.611-4 Section 1.611-4 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Natural Resources § 1...
Code of Federal Regulations, 2010 CFR
2010-04-01
... 26 Internal Revenue 4 2010-04-01 2010-04-01 false Limitations on certain capital losses and excess credits in computing alternative minimum tax. [Reserved] 1.383-2 Section 1.383-2 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES Insolvency...
ERIC Educational Resources Information Center
European Commission, 2014
2014-01-01
The 2013 European Commission Communication on Opening up Education underlined the importance of solid evidence to assess developments and take full advantage of the impact of technology on education, and called for sustained effort and international cooperation to improve our knowledge-base in this area. The International Computer and Information…
26 CFR 1.5000A-4 - Computation of shared responsibility payment.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 26 Internal Revenue 13 2014-04-01 2014-04-01 false Computation of shared responsibility payment. 1.5000A-4 Section 1.5000A-4 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY... individual was born. For example, an individual born on March 1, 1999, attains the age of 18 on March 1, 2017...
Indiveri, Giacomo
2008-01-01
Biological organisms perform complex selective attention operations continuously and effortlessly. These operations allow them to quickly determine the motor actions to take in response to combinations of external stimuli and internal states, and to pay attention to subsets of sensory inputs while suppressing non-salient ones. Selective attention strategies are extremely effective in both natural and artificial systems that have to cope with large amounts of input data and have limited computational resources. One of the main computational primitives used to perform these selection operations is the Winner-Take-All (WTA) network. These networks are formed by arrays of coupled computational nodes that selectively amplify the strongest input signals and suppress the weaker ones. Neuromorphic circuits are an optimal medium for constructing WTA networks and for implementing efficient hardware models of selective attention systems. In this paper we present an overview of selective attention systems based on neuromorphic WTA circuits, ranging from single-chip vision sensors for selecting and tracking the position of salient features to multi-chip systems implementing saliency-map-based models of selective attention. PMID:27873818
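The WTA primitive described above can be sketched as a minimal rate-based model, assuming simple self-excitation and uniform lateral inhibition; the constants are illustrative and not taken from any neuromorphic circuit:

```python
import numpy as np

def wta(inputs, inhibition=0.2, self_excitation=1.1, steps=50):
    """Toy winner-take-all: each node excites itself and inhibits all
    others; activity is clipped to [0, 1]. After a few iterations only
    the node with the strongest input remains active."""
    x = np.array(inputs, dtype=float)
    for _ in range(steps):
        total = x.sum()
        x = self_excitation * x - inhibition * (total - x)
        x = np.clip(x, 0.0, 1.0)
    return x

out = wta([0.3, 0.9, 0.5])
print(out.argmax())  # → 1, the index of the strongest input
```

Hardware WTA circuits implement the same amplify-the-strongest, suppress-the-rest dynamics in analog silicon rather than in discrete iterations.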
Functional imaging and the cerebellum: recent developments and challenges. Editorial.
Habas, Christophe
2012-06-01
Recent neuroimaging developments allow a better in vivo characterization of the structural and functional connectivity of the human cerebellum. Ultrahigh fields, which considerably increase spatial resolution, make it possible to visualize the deep cerebellar nuclei and cerebello-cortical sublayers. Tractography reconstructs the afferent and efferent pathways of the cerebellum. Resting-state functional connectivity individualizes the prewired, parallel closed-loop sensorimotor, cognitive, and affective networks passing through the cerebellum. These results are in agreement with activation maps obtained during stimulation functional neuroimaging or inferred from neurological deficits due to cerebellar lesions. Therefore, neuroimaging supports the hypothesis that the cerebellum constitutes a general modulator involved in optimizing mental performance and computing internal models. However, great challenges remain: (1) unravelling the functional role of the red and bulbar olivary nuclei, (2) understanding information processing in the cerebellar microcircuitry, and (3) characterizing the abstract computation performed by the cerebellum and shared by the sensorimotor, cognitive, and affective domains.
Accuracy in planar cutting of bones: an ISO-based evaluation.
Cartiaux, Olivier; Paul, Laurent; Docquier, Pierre-Louis; Francq, Bernard G; Raucent, Benoît; Dombre, Etienne; Banse, Xavier
2009-03-01
Computer- and robot-assisted technologies are capable of improving the accuracy of planar cutting in orthopaedic surgery. This study is a first step toward formulating and validating a new evaluation methodology for planar bone cutting, based on standards from the International Organization for Standardization. Our experimental test bed consisted of a purely geometrical model of the cutting process around a simulated bone. Cuts were performed at three levels of surgical assistance: unassisted, computer-assisted and robot-assisted. We measured three parameters of the standard ISO 1101:2004: flatness, parallelism and location of the cut plane. The location was the most relevant parameter for assessing cutting errors, and the three levels of assistance were easily distinguished using it. Our ISO methodology employs the location parameter to obtain all information about translational and rotational cutting errors. Location may be used on any osseous structure to compare the performance of existing assistance technologies.
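The three geometric parameters can be illustrated with a least-squares plane fit to points measured on the cut surface. This is a hedged sketch of the geometry only (ISO 1101 defines the tolerance zones these quantities feed into), and the function name is hypothetical:

```python
import numpy as np

def plane_metrics(points, target_normal, target_point):
    """Fit a least-squares plane to measured cut-surface points and
    return (flatness, parallelism in degrees, location): peak-to-valley
    residual, angle between fitted and planned normals, and distance
    from the planned point to the fitted plane."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The fitted normal is the right singular vector associated with the
    # smallest singular value of the centred point cloud.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    residuals = (pts - centroid) @ normal
    flatness = residuals.max() - residuals.min()
    tn = np.asarray(target_normal, dtype=float)
    tn = tn / np.linalg.norm(tn)
    parallelism = np.degrees(np.arccos(np.clip(abs(normal @ tn), -1.0, 1.0)))
    location = abs((np.asarray(target_point, dtype=float) - centroid) @ normal)
    return flatness, parallelism, location
```

For a cut lying exactly on the plane z = 1 with a planned plane through the origin, flatness and parallelism are zero and the location error is 1.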
Lustre Distributed Name Space (DNE) Evaluation at the Oak Ridge Leadership Computing Facility (OLCF)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simmons, James S.; Leverman, Dustin B.; Hanley, Jesse A.
This document describes the Lustre Distributed Name Space (DNE) evaluation carried out at the Oak Ridge Leadership Computing Facility (OLCF) between 2014 and 2015. DNE is a development project funded by OpenSFS to improve Lustre metadata performance and scalability. The development effort was split into two parts: the first phase (DNE P1) provided support for remote directories over remote Lustre Metadata Server (MDS) nodes and Metadata Target (MDT) devices, while the second phase (DNE P2) addressed directories split over multiple remote MDS nodes and MDT devices. The OLCF has been actively evaluating the performance, reliability, and functionality of both DNE phases. For these tests, internal OLCF testbeds were used. Results are promising, and OLCF is planning a full DNE deployment on production systems in the mid-2016 timeframe.
Evaluation of Hands-Free Devices for Space Habitat Maintenance Procedures
NASA Technical Reports Server (NTRS)
Hoffman, R. B.; Twyford, E.; Conlee, C. S.; Litaker, H. L.; Solemn, J. A.; Holden
2007-01-01
Currently, International Space Station (ISS) crews use a laptop computer to display procedures for performing onboard maintenance tasks. This approach has been determined to be suboptimal. A heuristic evaluation and two studies have been completed to test commercial off-the-shelf (COTS) "near-eye" head-up displays (HUDs) for support of these types of maintenance tasks. In both studies, subjects worked through electronic procedures to perform simple maintenance tasks. As a result of the Phase I study, three HUDs were down-selected to one. In the Phase II study, the HUD was compared against two other electronic display devices - a laptop computer and an e-book reader. Results suggested that adjustability and stability of the HUD display were the most significant acceptability factors to consider for near-eye displays. The Phase II study uncovered a number of advantages and disadvantages of the HUD relative to the laptop and e-book reader for interacting with electronic procedures.
Computational biology approach to uncover hepatitis C virus helicase operation.
Flechsig, Holger
2014-04-07
Hepatitis C virus (HCV) helicase is a molecular motor that splits nucleic acid duplex structures during viral replication, therefore representing a promising target for antiviral treatment. Hence, a detailed understanding of the mechanism by which it operates would facilitate the development of efficient drug-assisted therapies aiming to inhibit helicase activity. Despite extensive investigations performed in the past, a thorough understanding of the activity of this important protein was lacking since the underlying internal conformational motions could not be resolved. Here we review investigations that have been previously performed by us for HCV helicase. Using methods of structure-based computational modelling it became possible to follow entire operation cycles of this motor protein in structurally resolved simulations and uncover the mechanism by which it moves along the nucleic acid and accomplishes strand separation. We also discuss observations from that study in the light of recent experimental studies that confirm our findings.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chai, X; Liu, L; Xing, L
Purpose: Visualization and processing of medical images and radiation treatment plan evaluation have traditionally been constrained to local workstations with limited computation power and limited ability for data sharing and software updates. We present a web-based image processing and planning evaluation platform (WIPPEP) for radiotherapy applications with high efficiency, ubiquitous web access, and real-time data sharing. Methods: This software platform consists of three parts: web server, image server and computation server. Each independent server communicates with the others through HTTP requests. The web server is the key component: it provides visualizations and the user interface through front-end web browsers and relays information to the backend to process user requests. The image server serves as a PACS system. The computation server performs the actual image processing and dose calculation. The web server backend is developed using Java Servlets and the frontend is developed using HTML5, JavaScript, and jQuery. The image server is based on the open source dcm4chee PACS system. The computation server can be written in any programming language as long as it can send/receive HTTP requests. Our computation server was implemented in Delphi, Python and PHP, which can process data directly or via a C++ program DLL. Results: This software platform is running on a 32-core CPU server virtually hosting the web server, image server, and computation servers separately. Users can visit our internal website with the Chrome browser, select a specific patient, visualize images and RT structures belonging to this patient, and perform image segmentation running the Delphi computation server and Monte Carlo dose calculation on the Python or PHP computation server. Conclusion: We have developed a web-based image processing and plan evaluation platform prototype for radiotherapy. This system has clearly demonstrated the feasibility of performing image processing and plan evaluation through a web browser and exhibited potential for future cloud-based radiotherapy.
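Because the three servers communicate purely over HTTP, a computation server can be sketched in a few lines. The following is a generic Python stand-in (the platform's actual computation servers were written in Delphi, Python and PHP; the payload format here is invented for illustration):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ComputeHandler(BaseHTTPRequestHandler):
    """Toy computation server mirroring the server-to-server HTTP design:
    accept a JSON task over POST and return a JSON result."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        task = json.loads(self.rfile.read(length))
        # A real server would dispatch to segmentation or dose calculation.
        result = {"status": "done", "task": task.get("name", "unknown")}
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

# To run standalone:
# HTTPServer(("127.0.0.1", 8080), ComputeHandler).serve_forever()
```

The web server backend would POST a task description to this endpoint and relay the JSON response to the browser.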
Experimental, Theoretical, and Computational Investigation of Separated Nozzle Flows
NASA Technical Reports Server (NTRS)
Hunter, Craig A.
2004-01-01
A detailed experimental, theoretical, and computational study of separated nozzle flows has been conducted. Experimental testing was performed at the NASA Langley 16-Foot Transonic Tunnel Complex. As part of a comprehensive static performance investigation, force, moment, and pressure measurements were made and schlieren flow visualization was obtained for a sub-scale, non-axisymmetric, two-dimensional, convergent-divergent nozzle. In addition, two-dimensional numerical simulations were run using the computational fluid dynamics code PAB3D with two-equation turbulence closure and algebraic Reynolds stress modeling. For reference, experimental and computational results were compared with theoretical predictions based on one-dimensional gas dynamics and an approximate integral momentum boundary layer method. Experimental results from this study indicate that off-design overexpanded nozzle flow was dominated by shock-induced boundary layer separation, which was divided into two distinct flow regimes: three-dimensional separation with partial reattachment, and fully detached two-dimensional separation. The test nozzle was observed to go through a marked transition in passing from one regime to the other. In all cases, separation provided a significant increase in static thrust efficiency compared to the ideal prediction. Results indicate that with controlled separation, the entire overexpanded range of nozzle performance would be within 10% of the peak thrust efficiency. By offering savings in weight and complexity over a conventional mechanical exhaust system, this may allow a fixed geometry nozzle to cover an entire flight envelope. The computational simulation was in excellent agreement with experimental data over most of the test range, and did a good job of modeling internal flow and thrust performance.
An exception occurred at low nozzle pressure ratios, where the two-dimensional computational model was inconsistent with the three-dimensional separation observed in the experiment. In general, the computation captured the physics of the shock boundary layer interaction and shock induced boundary layer separation in the nozzle, though there were some differences in shock structure compared to experiment. Though minor, these differences could be important for studies involving flow control or thrust vectoring of separated nozzles. Combined with other observations, this indicates that more detailed, three-dimensional computational modeling needs to be conducted to more realistically simulate shock-separated nozzle flows.
Recurrence plot for parameters analysing of internal combustion engine
NASA Astrophysics Data System (ADS)
Alexa, O.; Ilie, C. O.; Marinescu, M.; Vilau, R.; Grosu, D.
2015-11-01
In many technical disciplines, modern data analysis techniques have been successfully applied to understand the complexity of a system. The growing volume of theoretical knowledge about system dynamics has offered researchers the opportunity to look for non-linear dynamics in data whose evolution linear models cannot explain in a satisfactory manner. One approach in this respect is Recurrence Analysis (RA), a graphical method designed to locate hidden recurring patterns, non-stationarity and structural changes. RA arose in natural sciences such as physics and biology but was quickly adopted in economics and engineering. Meanwhile, the fast development of computer resources has provided powerful tools to apply this new and complex method. One free software package used to perform our analysis is Visual Recurrence Analysis (VRA), developed by Eugene Kononov. As presented in this paper, recurrence plot investigation of an internal combustion engine shows some of the RA capabilities in this domain. We chose two specific engine parameters, measured in two different tests, to perform the analysis: injection impulse width and engine angular speed, from the tests I11n and I51n. Graphs were computed for each parameter, then analyzed and compared to draw a conclusion. This work is incipient research, one of the first attempts to use recurrence plots for analyzing automotive dynamics. It opens a wide field of action for future research programs.
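The recurrence plot itself is a simple construction: a matrix marking which pairs of states of a time series fall within a threshold distance of each other. A minimal numpy sketch (VRA adds time-delay embedding and visualization on top of this):

```python
import numpy as np

def recurrence_plot(series, eps):
    """Binary recurrence matrix R[i, j] = 1 when states i and j of the
    time series lie within distance eps of each other; plotting R
    reveals recurring patterns and non-stationarity."""
    x = np.asarray(series, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])  # pairwise distances
    return (dist < eps).astype(int)

# A periodic signal produces the characteristic diagonal recurrence lines.
t = np.linspace(0, 4 * np.pi, 100)
R = recurrence_plot(np.sin(t), eps=0.1)
```

For engine parameters such as injection impulse width, the same matrix is built from the measured samples, typically after embedding each sample in a delay-coordinate vector.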
Cost-effective use of liquid nitrogen in cryogenic wind tunnels, phase 2
NASA Technical Reports Server (NTRS)
Mcintosh, Glen E.; Lombard, David S.; Leonard, Kenneth R.; Morhorst, Gerald D.
1990-01-01
Cryogenic seal tests were performed and Rulon A was selected for the subject nutating positive displacement expander. A four-chamber expander was designed and fabricated. A nitrogen reliquefier flow system was also designed and constructed for testing the cold expander. Initial tests were unsatisfactory because of high internal friction attributed to nutating Rulon inlet and outlet valve plates. Replacement of the nutating valves with cam-actuated poppet valves improved performance. However, no net nitrogen reliquefaction was achieved due to high internal friction. Computer software was developed for accurate calculation of nitrogen reliquefaction from a system such as that proposed. These calculations indicated that practical reliquefaction rates of 15 to 19 percent could be obtained. Due to mechanical problems, the nutating expander did not demonstrate its feasibility nor that of the system. It was concluded that redesign and testing of a smaller nutating expander was required to prove concept feasibility.
Dorsal slab fracture of the fourth carpal bone in a racing greyhound.
Rutherford, Scott; Ness, Malcolm G
2012-11-01
To report the diagnosis and surgical management of a dorsal slab fracture of the fourth carpal bone in a racing greyhound. Clinical report. Three-year-old, male racing Greyhound. The fracture was not visible on orthogonal radiographs and the diagnosis was made by computed tomography. Open reduction and internal fixation with 2 countersunk 2.0-mm screws inserted in lag fashion was performed via a dorsal approach. Outcome was analyzed objectively by comparing preinjury and postsurgery racing performances. Internal fixation resulted in fracture healing, and the dog returned to racing, recording times similar to those before injury. Fractures of the fourth carpal bone may not be visible on standard orthogonal radiographic views, and cross-sectional imaging may be required for more accurate identification. Surgical management was successful, with the dog returning to preinjury levels of competition. © Copyright 2012 by The American College of Veterinary Surgeons.
Ören, Ünal; Hiller, Mauritius; Andersson, M
2017-04-28
A Monte Carlo-based stand-alone program, IDACstar (Internal Dose Assessment by Computer), was developed to perform radiation dose calculations using complex voxel simulations. To test the program, two irradiation situations were simulated: a hypothetical contamination case with 600 MBq of 99mTc and an extravasation case involving 370 MBq of 18F-FDG. The effective dose was estimated to be 0.042 mSv for the contamination case and 4.5 mSv for the extravasation case. IDACstar has demonstrated that dosimetry results for contamination or extravasation cases can be acquired with great ease. IDACstar provides an effective tool for radiation protection applications, allowing physicists at nuclear medicine departments to easily quantify the radiation risk of stochastic effects when a radiation accident has occurred. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
STS-42 Commander Grabe works with MWPE at IML-1 Rack 8 aboard OV-103
NASA Technical Reports Server (NTRS)
1992-01-01
STS-42 Commander Ronald J. Grabe works with the Mental Workload and Performance Evaluation Experiment (MWPE) (portable laptop computer, keyboard cursor keys, a two-axis joystick, and a track ball) at Rack 8 in the International Microgravity Laboratory 1 (IML-1) module. The test was designed as a result of difficulty experienced by crewmembers working at a computer station on a previous Space Shuttle mission. The problem was due to the workstation's design being based on Earth-bound conditions with the operator in a typical one-G standing position. For STS-42, the workstation was redesigned to evaluate the effects of microgravity on the ability of crewmembers to interact with a computer workstation. Information gained from this experiment will be used to design workstations for future Spacelab missions and Space Station Freedom (SSF).
PRACE - The European HPC Infrastructure
NASA Astrophysics Data System (ADS)
Stadelmeyer, Peter
2014-05-01
The mission of PRACE (Partnership for Advanced Computing in Europe) is to enable high-impact scientific discovery and engineering research and development across all disciplines to enhance European competitiveness for the benefit of society. PRACE seeks to realize this mission by offering world-class computing and data management resources and services through a peer review process. This talk gives a general overview of PRACE and the PRACE research infrastructure (RI). PRACE is established as an international not-for-profit association, and the PRACE RI is a pan-European supercomputing infrastructure which offers access to computing and data management resources at partner sites distributed throughout Europe. Besides a short summary of the organization, history, and activities of PRACE, it is explained how scientists and researchers from academia and industry from around the world can access PRACE systems and which education and training activities are offered by PRACE. The overview also contains a selection of PRACE contributions to societal challenges and ongoing activities. Examples of the latter include, among others, petascaling, an application benchmark suite, best practice guides for efficient use of key architectures, application enabling and scaling, new programming models, and industrial applications. The Partnership for Advanced Computing in Europe (PRACE) is an international non-profit association with its seat in Brussels. The PRACE Research Infrastructure provides a persistent world-class high performance computing service for scientists and researchers from academia and industry in Europe. The computer systems and their operations accessible through PRACE are provided by 4 PRACE members (BSC representing Spain, CINECA representing Italy, GCS representing Germany and GENCI representing France).
The Implementation Phase of PRACE receives funding from the EU's Seventh Framework Programme (FP7/2007-2013) under grant agreements RI-261557, RI-283493 and RI-312763. For more information, see www.prace-ri.eu
NASA Astrophysics Data System (ADS)
Takagi, Y.; Okubo, S.
2016-12-01
Internal co- and post-seismic deformation fields such as strain and stress changes have been modelled in order to study their effects on subsequent earthquake and/or volcanic activity around the epicentre. When modelling strain or stress changes caused by great earthquakes (M>9.0), we should use a realistic earth model including the earth's curvature and stratification; according to Toda et al.'s (2011) result, the stress changes caused by the 2011 Tohoku-oki earthquake (Mw=9.0) exceed 0.1 bar (0.01 MPa) even at epicentral distances over 400 km. Although many works have been carried out to compute co- and post-seismic surface deformation fields using a spherically stratified viscoelastic earth (e.g. Piersanti et al. 1995; Pollitz 1996, 1997; Tanaka et al. 2006), less attention has been paid to 'internal' deformation fields. Tanaka et al. (2006) succeeded in computing post-seismic surface displacements in a continuously stratified compressible viscoelastic earth by evaluating the inverse Laplace integration numerically. To our regret, however, their method cannot calculate internal deformation because they use Okubo's (1993) reciprocity theorem. We found that Okubo's (1993) reciprocity theorem can be extended to the computation of internal deformation fields. In this presentation, we show a method of computing internal co- and post-seismic deformation fields and discuss the effects of the earth's curvature and stratification on them.
A Multi-center Milestone Study of Clinical Vertebral CT Segmentation
Yao, Jianhua; Burns, Joseph E.; Forsberg, Daniel; Seitel, Alexander; Rasoulian, Abtin; Abolmaesumi, Purang; Hammernik, Kerstin; Urschler, Martin; Ibragimov, Bulat; Korez, Robert; Vrtovec, Tomaž; Castro-Mateos, Isaac; Pozo, Jose M.; Frangi, Alejandro F.; Summers, Ronald M.; Li, Shuo
2017-01-01
A multi-center milestone study of clinical vertebra segmentation is presented in this paper. Vertebra segmentation is a fundamental step for spinal image analysis and intervention. The first half of the study was conducted in the spine segmentation challenge at the 2014 International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI) Workshop on Computational Spine Imaging (CSI 2014). The objective was to evaluate the performance of several state-of-the-art vertebra segmentation algorithms on computed tomography (CT) scans using ten training and five testing datasets, all healthy cases. The second half of the study was conducted after the challenge, where an additional five abnormal cases were used for testing to evaluate performance on abnormal cases. Dice coefficients and absolute surface distances were used as evaluation metrics. Segmentation of each vertebra as a single geometric unit, as well as separate segmentation of vertebra substructures, was evaluated. Five teams participated in the comparative study. The top performers achieved Dice coefficients of 0.93 in the upper thoracic, 0.95 in the lower thoracic and 0.96 in the lumbar spine for healthy cases, and 0.88 in the upper thoracic, 0.89 in the lower thoracic and 0.92 in the lumbar spine for osteoporotic and fractured cases. The strengths and weaknesses of each method, as well as suggestions for future improvement, are discussed. This is the first multi-center comparative study of vertebra segmentation methods, and it provides an up-to-date performance milestone for the fast-growing field of spinal image analysis and intervention. PMID:26878138
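The Dice coefficient used as the evaluation metric above is the standard overlap measure between a computed mask and the reference mask, 2|A ∩ B| / (|A| + |B|). A minimal sketch:

```python
import numpy as np

def dice(seg, ref):
    """Dice coefficient between two binary segmentation masks:
    1.0 for perfect overlap, 0.0 for disjoint masks."""
    seg, ref = np.asarray(seg, bool), np.asarray(ref, bool)
    inter = np.logical_and(seg, ref).sum()
    return 2.0 * inter / (seg.sum() + ref.sum())

a = np.zeros((4, 4), int); a[1:3, 1:3] = 1   # 4 voxels
b = np.zeros((4, 4), int); b[1:3, 1:4] = 1   # 6 voxels, 4 shared
print(dice(a, b))  # → 0.8
```

For 3D CT segmentations the same formula is applied voxel-wise, per vertebra or per substructure.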
The Centre of High-Performance Scientific Computing, Geoverbund, ABC/J - Geosciences enabled by HPSC
NASA Astrophysics Data System (ADS)
Kollet, Stefan; Görgen, Klaus; Vereecken, Harry; Gasper, Fabian; Hendricks-Franssen, Harrie-Jan; Keune, Jessica; Kulkarni, Ketan; Kurtz, Wolfgang; Sharples, Wendy; Shrestha, Prabhakar; Simmer, Clemens; Sulis, Mauro; Vanderborght, Jan
2016-04-01
The Centre of High-Performance Scientific Computing (HPSC TerrSys) was founded in 2011 to establish a centre of competence in high-performance scientific computing in terrestrial systems and the geosciences, enabling fundamental and applied geoscientific research in the Geoverbund ABC/J (the geoscientific research alliance of the Universities of Aachen, Cologne and Bonn and the Research Centre Jülich, Germany). The specific goals of HPSC TerrSys are to achieve relevance at the national and international level in (i) the development and application of HPSC technologies in the geoscientific community; (ii) student education; (iii) HPSC services and support, also to the wider geoscientific community; and (iv) the industry and public sectors via, e.g., useful applications and data products. A key feature of HPSC TerrSys is the Simulation Laboratory Terrestrial Systems, which is located at the Jülich Supercomputing Centre (JSC) and provides extensive capabilities with respect to porting, profiling, tuning and performance monitoring of geoscientific software in JSC's supercomputing environment. We will present a summary of success stories of HPSC applications, including integrated terrestrial model development; parallel profiling and its application from watersheds to the continent; massively parallel data assimilation using physics-based models and ensemble methods; quasi-operational terrestrial water and energy monitoring; and convection-permitting climate simulations over Europe. The success stories stress the need for formalized education of students in the application of HPSC technologies in the future.
Truong, Quynh A; Knaapen, Paul; Pontone, Gianluca; Andreini, Daniele; Leipsic, Jonathon; Carrascosa, Patricia; Lu, Bin; Branch, Kelley; Raman, Subha; Bloom, Stephen; Min, James K
2015-10-01
Dual-energy CT (DECT) has the potential to improve myocardial perfusion assessment for physiologic evaluation of coronary artery disease (CAD). The diagnostic performance of rest-stress DECT perfusion (DECTP) is unknown. DECIDE-Gold is a prospective multicenter study to evaluate the accuracy of DECT in detecting hemodynamically (HD) significant CAD, with fractional flow reserve (FFR) as the reference standard. Eligible participants are subjects with symptoms of CAD referred for invasive coronary angiography (ICA). Participants will undergo DECTP, performed with pharmacological stress, and will subsequently proceed to ICA and FFR. HD-significant CAD will be defined as FFR ≤ 0.80. In those undergoing myocardial perfusion imaging (MPI) by positron emission tomography (PET), single photon emission computed tomography (SPECT) or cardiac magnetic resonance (CMR) imaging, ischemia will be graded by % ischemic myocardium. Blinded core laboratory interpretation will be performed for CCTA, DECTP, MPI, ICA, and FFR. The primary endpoint is the accuracy of DECTP in detecting ≥1 HD-significant stenosis at the subject level when compared to FFR. Secondary and tertiary endpoints are the accuracies of combinations of DECTP at the subject and vessel levels compared to FFR and MPI. DECIDE-Gold will determine the performance of DECTP for diagnosing ischemia.
Automated method for determining Instron Residual Seal Force of glass vial/rubber closure systems.
Ludwig, J D; Nolan, P D; Davis, C W
1993-01-01
Instron Residual Seal Force (IRSF) of glass vial/rubber closure systems was determined using an Instron 4501 Materials Testing System. Computer programs were written to process raw data and calculate IRSF values. Preliminary experiments indicated both the appearance of the stress-deformation curves and precision of the derived IRSF values were dependent on the internal dimensions and top surface geometry of the cap anvil. Therefore, a series of five cap anvils varying in shape and dimensions were machined to optimize performance and precision. Vials capped with West 4416/50 PURCOAT button closures or Helvoet compound 6207 lyophilization closures were tested with each cap anvil. Cap anvils with spherical top surfaces and narrow internal dimensions produced more precise results and more uniform stress-deformation curves than cap anvils with flat top surfaces and wider internal dimensions.
International Space Station (ISS)
2001-02-01
The Payload Operations Center (POC) is the science command post for the International Space Station (ISS). Located at NASA's Marshall Space Flight Center in Huntsville, Alabama, it is the focal point for American and international science activities aboard the ISS. The POC's unique capabilities allow science experts and researchers around the world to perform cutting-edge science in the unique microgravity environment of space. The POC is staffed around the clock by shifts of payload flight controllers. At any given time, 8 to 10 flight controllers are on consoles operating, planning for, and controlling various systems and payloads. This photograph shows a Payload Rack Officer (PRO) at a work station. The PRO is linked by a computer to all payload racks aboard the ISS. The PRO monitors and configures the resources and environment for science experiments including EXPRESS Racks, multiple-payload racks designed for commercial payloads.
ERIC Educational Resources Information Center
Association for the Advancement of Computing in Education. Asia-Pacific Chapter.
This conference addressed pedagogical, social, and technological issues related to computers in education. The conference theme, "Learning Societies in the New Millennium: Creativity, Caring & Commitments," focused on creative learning, caring for diverse cultures and global issues, and committing oneself to a new way of…
ERIC Educational Resources Information Center
Kish, Gary; Cook, Samuel A.; Kis, Greta
2013-01-01
The University of Debrecen's Faculty of Medicine has an international, multilingual student population with anatomy courses taught in English to all but Hungarian students. An elective computer-assisted gross anatomy course, the Computer Human Anatomy (CHA), has been taught in English at the Anatomy Department since 2008. This course focuses on an…
Computer-intensive simulation of solid-state NMR experiments using SIMPSON.
Tošner, Zdeněk; Andersen, Rasmus; Stevensson, Baltzar; Edén, Mattias; Nielsen, Niels Chr; Vosegaard, Thomas
2014-09-01
Conducting large-scale solid-state NMR simulations requires fast computer software, potentially in combination with efficient computational resources, to complete within a reasonable time frame. Such simulations may involve large spin systems, multiple-parameter fitting of experimental spectra, or multiple-pulse experiment design using parameter scans, non-linear optimization, or optimal control procedures. To efficiently accommodate such simulations, we here present an improved version of the widely distributed open-source SIMPSON NMR simulation software package adapted to contemporary high-performance hardware setups. The software is optimized for fast performance on standard stand-alone computers, multi-core processors, and large clusters of identical nodes. We describe the novel features for fast computation, including internal matrix manipulations, propagator setups and acquisition strategies. For efficient calculation of powder averages, we implemented the interpolation method of Alderman, Solum, and Grant, as well as the recently introduced fast Wigner transform interpolation technique. The potential of the optimal control toolbox is greatly enhanced by higher-precision gradients in combination with the efficient optimization algorithm known as limited-memory Broyden-Fletcher-Goldfarb-Shanno. In addition, advanced parallelization can be used in all types of calculations, providing significant time reductions. SIMPSON thus reflects current knowledge in the field of numerical simulations of solid-state NMR experiments. The efficiency and novel features are demonstrated on representative simulations. Copyright © 2014 Elsevier Inc. All rights reserved.
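Of the fitting strategies mentioned, the parameter scan is the simplest to illustrate. The sketch below fits a toy Lorentzian lineshape by scanning a grid of parameters and keeping the best least-squares match; it mimics only the scan workflow, not SIMPSON's actual spin-dynamics simulation:

```python
import numpy as np

def lorentzian(freq, center, width):
    """Simple Lorentzian lineshape used as a stand-in for a simulated
    spectrum (a real SIMPSON run would propagate the spin system)."""
    return width**2 / ((freq - center)**2 + width**2)

def fit_by_scan(freq, experimental, centers, widths):
    """Parameter-scan fitting: evaluate the simulated spectrum on a grid
    of (center, width) pairs and keep the pair minimizing the
    sum-of-squares deviation from the experimental spectrum."""
    best, best_err = None, np.inf
    for c in centers:
        for w in widths:
            err = np.sum((lorentzian(freq, c, w) - experimental) ** 2)
            if err < best_err:
                best, best_err = (float(c), float(w)), err
    return best

freq = np.linspace(-100.0, 100.0, 401)
target = lorentzian(freq, 25.0, 10.0)            # synthetic "experiment"
print(fit_by_scan(freq, target,
                  centers=np.arange(-50, 51, 5.0),
                  widths=np.arange(5, 21, 5.0)))  # → (25.0, 10.0)
```

Gradient-based methods such as L-BFGS replace the exhaustive grid with far fewer, derivative-guided evaluations, which is why SIMPSON's higher-precision gradients matter for optimal control.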
NASA Astrophysics Data System (ADS)
Wang, Jia Jie; Wriedt, Thomas; Han, Yi Ping; Mädler, Lutz; Jiao, Yong Chang
2018-05-01
Light scattering of a radially inhomogeneous droplet, modeled by a multilayered sphere, is investigated within the framework of Generalized Lorenz-Mie Theory (GLMT), with particular attention devoted to the analysis of the internal field distribution under shaped-beam illumination. To circumvent numerical difficulties in the computation of the internal field for an absorbing or non-absorbing droplet with a very large size parameter, a recursive algorithm is proposed by reformulating the equations for the expansion coefficients. Two approaches are proposed for the prediction of the internal field distribution, namely a rigorous method and an approximation method. The developed computer code is shown to be stable over a wide range of size parameters. Numerical computations are performed to simulate the internal field distributions of a radially inhomogeneous droplet illuminated by a focused Gaussian beam.
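The stability problem the abstract alludes to is classic in Mie-type computations: upward recurrences for the expansion coefficients overflow for large or absorbing arguments. The textbook remedy (a standard device, not the paper's specific multilayer reformulation) is to compute the logarithmic derivative of the Riccati-Bessel function by downward recurrence:

```python
import numpy as np

def log_derivative(rho, n_max, n_extra=15):
    """Logarithmic derivative D_n(rho) = psi_n'(rho)/psi_n(rho) of the
    Riccati-Bessel function psi_n, computed by downward recurrence:
    D_{n-1} = n/rho - 1/(D_n + n/rho).
    Downward recurrence stays numerically stable even for complex
    (absorbing) arguments, where upward recurrence blows up."""
    n_start = n_max + n_extra          # start above n_max; errors decay downward
    D = np.zeros(n_start + 1, dtype=complex)
    for n in range(n_start, 0, -1):
        D[n - 1] = n / rho - 1.0 / (D[n] + n / rho)
    return D[:n_max + 1]

# Complex argument mimics an absorbing droplet layer.
D = log_derivative(10.0 + 0.5j, 20)
```

Since psi_0(rho) = sin(rho), the n = 0 value can be checked against cos(rho)/sin(rho).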
NASA Technical Reports Server (NTRS)
Dobrinskaya, Tatiana
2015-01-01
This paper suggests a new method for optimizing yaw maneuvers on the International Space Station (ISS). Yaw rotations are the most common large maneuvers on the ISS, often used for docking and undocking operations as well as for other activities. With maneuver optimization, large maneuvers that previously had to be performed on thrusters can be performed either using control moment gyroscopes (CMGs) or with significantly reduced thruster firings. Maneuver optimization helps to save expensive propellant and reduce structural loads, an important factor for the ISS service life. In addition, optimized maneuvers reduce contamination of critical elements of the vehicle structure, such as the solar arrays. This paper presents an analytical solution for optimizing yaw attitude maneuvers. Equations describing the pitch and roll motion needed to counteract the major torques during a yaw maneuver are obtained, and a yaw rate profile is proposed. The paper also describes the physical basis of the suggested optimization approach. In the optimized case obtained, the torques are significantly reduced. This torque reduction was compared to that of the existing optimization method, which relies on a computational solution. The attitude profiles and torque reductions show a good match between the two optimization methods, and simulations using the ISS flight software showed similar propellant consumption for both. The analytical solution proposed in this paper has major benefits with respect to the computational approach. In contrast to the current computational solution, which can only be calculated on the ground, the analytical solution does not require extensive computational resources and can be implemented in the onboard software, thus making maneuver execution automatic. An automatic maneuver significantly simplifies operations and, if necessary, makes it possible to perform a maneuver without communication with the ground.
It also reduces the probability of command errors. The suggested analytical solution provides a new method of maneuver optimization that is less complicated, automatic, and more universal. The maneuver optimization approach presented in this paper can be used not only for the ISS but also for other orbiting space vehicles.
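To give a concrete feel for what a smooth yaw-rate profile means (this raised-cosine shape is purely illustrative, not the profile derived in the paper), a rate that starts and ends at zero rate and zero angular acceleration, while integrating to the commanded angle, can be written as:

```python
import numpy as np

def yaw_rate_profile(theta_total, t_total, n=2001):
    """Raised-cosine (versine) yaw-rate profile: rate and angular
    acceleration vanish at both ends, and the rate integrates exactly
    to the commanded angle theta_total. Illustrative shape only."""
    t = np.linspace(0.0, t_total, n)
    amplitude = theta_total / t_total          # mean rate
    rate = amplitude * (1.0 - np.cos(2.0 * np.pi * t / t_total))
    return t, rate

# A 180-degree yaw over one hour:
t, rate = yaw_rate_profile(np.radians(180.0), 3600.0)
# trapezoidal check that the profile integrates to the commanded angle
angle = float(np.sum(0.5 * (rate[1:] + rate[:-1]) * np.diff(t)))
```

Smooth end conditions of this kind are what keep the required torques, and hence structural loads, low at maneuver start and stop.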
Deep Supervised, but Not Unsupervised, Models May Explain IT Cortical Representation
Khaligh-Razavi, Seyed-Mahdi; Kriegeskorte, Nikolaus
2014-01-01
Inferior temporal (IT) cortex in human and nonhuman primates serves visual object recognition. Computational object-vision models, although continually improving, do not yet reach human performance. It is unclear to what extent the internal representations of computational models can explain the IT representation. Here we investigate a wide range of computational model representations (37 in total), testing their categorization performance and their ability to account for the IT representational geometry. The models include well-known neuroscientific object-recognition models (e.g. HMAX, VisNet) along with several models from computer vision (e.g. SIFT, GIST, self-similarity features, and a deep convolutional neural network). We compared the representational dissimilarity matrices (RDMs) of the model representations with the RDMs obtained from human IT (measured with fMRI) and monkey IT (measured with cell recording) for the same set of stimuli (not used in training the models). Better performing models were more similar to IT in that they showed greater clustering of representational patterns by category. In addition, better performing models also more strongly resembled IT in terms of their within-category representational dissimilarities. Representational geometries were significantly correlated between IT and many of the models. However, the categorical clustering observed in IT was largely unexplained by the unsupervised models. The deep convolutional network, which was trained by supervision with over a million category-labeled images, reached the highest categorization performance and also best explained IT, although it did not fully explain the IT data. Combining the features of this model with appropriate weights and adding linear combinations that maximize the margin between animate and inanimate objects and between faces and other objects yielded a representation that fully explained our IT data. 
Overall, our results suggest that explaining IT requires computational features trained through supervised learning to emphasize the behaviorally important categorical divisions prominently reflected in IT. PMID:25375136
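The core quantity in the study above, the representational dissimilarity matrix, has a compact definition: one minus the Pearson correlation between the response patterns evoked by each pair of stimuli. A minimal sketch on synthetic patterns:

```python
import numpy as np

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between the response patterns of each pair of stimuli.
    patterns: (n_stimuli, n_features) array of model or brain responses."""
    return 1.0 - np.corrcoef(patterns)

rng = np.random.default_rng(0)
responses = rng.normal(size=(6, 50))   # 6 stimuli x 50 model features
D = rdm(responses)                     # 6 x 6, zero diagonal, symmetric
```

Model-to-brain comparison then reduces to correlating the off-diagonal entries of a model RDM with those of the fMRI- or cell-recording-derived RDM.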
SU-E-P-10: Establishment of Local Diagnostic Reference Levels of Routine Exam in Computed Tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yeh, M; Wang, Y; Weng, H
Introduction: National diagnostic reference levels (NDRLs) provide reference doses for radiological examinations and a basis for optimizing patient dose. Local diagnostic reference levels (LDRLs), maintained by periodically reviewing and checking doses, offer a more efficient way to improve examination practice. The important first step is therefore to establish a diagnostic reference level. Taiwan has already established radiation dose limit values for computed tomography, and many studies report that CT scans contribute the largest share of the medical radiation dose. This study therefore aims to clarify the international status of DRLs and to establish diagnostic reference levels for computed tomography in our hospital. Methods and Materials: Two clinical CT scanners (a Toshiba Aquilion and a Siemens Sensation) were included in this study. For CT examinations the basic recommended dosimetric quantity is the Computed Tomography Dose Index (CTDI). For each examination of each body part, we collected at least 10 patients. Routine examinations were carried out, all exposure parameters were collected, and the corresponding CTDIvol and DLP values were determined. Results: The majority of patients (75%) were between 60-70 kg of body weight. There are 25 examinations in this study. Table 1 shows the LDRL of each CT routine examination. Conclusions: This study clarifies the international status of DRLs and establishes local computed tomography reference levels for our hospital, providing a radiation reference as a basis for optimizing patient dose.
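The quantities involved are simple to relate: the dose-length product (DLP) is CTDIvol times scan length, and diagnostic reference levels are conventionally set at the 75th percentile of the observed dose distribution. A sketch with hypothetical dose values (not data from this study):

```python
import numpy as np

# Hypothetical per-patient CTDIvol values (mGy) for one routine protocol.
ctdi_vol = np.array([10.2, 12.5, 9.8, 11.0, 14.1, 10.9, 13.3, 9.5, 12.0, 11.6])
scan_length_cm = 15.0

dlp = ctdi_vol * scan_length_cm        # dose-length product, mGy*cm
ldrl = np.percentile(ctdi_vol, 75)     # DRLs conventionally use the 75th percentile
```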
NASA Technical Reports Server (NTRS)
Delleur, Ann M.; Kerslake, Thomas W.
2002-01-01
With the first United States (U.S.) photovoltaic array (PVA) activated on the International Space Station (ISS) in December 2000, on-orbit data can now be compared to analytical predictions. Due to ISS operational constraints, it is not always possible to point the front side of the arrays at the Sun. Thus, in many cases, sunlight directly illuminates the backside of the PVA, with albedo illumination on either the front or the back. During this time, appreciable power is produced since the solar cells are mounted on a thin, solar-transparent substrate. It is important to present accurate predictions for both front- and back-side power generation for mission planning, certification of flight readiness for a given mission, and on-orbit mission support. To provide a more detailed assessment of the ISS power production capability, the authors developed a PVA electrical performance model applicable to generalized bifacial illumination conditions. On-orbit PVA performance data were also collected and analyzed. This paper describes the ISS PVA performance model and the methods used to reduce orbital performance data. Analyses were performed using SPACE, a NASA GRC-developed computer code for the ISS program office. Results showed excellent agreement between on-orbit performance data and analytical results.
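As a first-order sketch of the bifacial idea only (the SPACE code models the cell I-V behavior and orbital geometry in far more detail, and every number below is hypothetical), front- and back-side irradiance both contribute to power, with the back side derated by a bifaciality factor:

```python
def bifacial_power(g_front, g_back, area_m2, efficiency, bifaciality):
    """First-order bifacial array power (W): both faces convert irradiance
    (W/m^2) to power, the back side derated by a bifaciality factor.
    Rough sketch under assumed parameters, not the SPACE model."""
    return area_m2 * efficiency * (g_front + bifaciality * g_back)

# Backside-dominated illumination with modest albedo on the front:
p = bifacial_power(g_front=120.0, g_back=1000.0, area_m2=30.0,
                   efficiency=0.14, bifaciality=0.8)
```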
ICCE/ICCAI 2000 Keynote Papers.
ERIC Educational Resources Information Center
2000
This document contains the four keynote papers from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction). "Using Technologies To Model Student Problem Spaces" (David Jonassen) contrasts examples of semantic network, expert system, and systems modeling…
NASA Technical Reports Server (NTRS)
Sharp, John R.; Kittredge, Ken; Schunk, Richard G.
2003-01-01
As part of the aero-thermodynamics team supporting the Columbia Accident Investigation Board (CAIB), the Marshall Space Flight Center was asked to perform engineering analyses of internal flows in the port wing. The aero-thermodynamics team was split into internal flow and external flow teams, with the support divided between shorter-timeframe engineering methods and more complex computational fluid dynamics. In order to gain a rough order-of-magnitude knowledge of the internal flow in the port wing for various breach locations and sizes (as theorized by the CAIB to have caused the Columbia re-entry failure), a bulk venting model was required to input boundary flow rates and pressures to the computational fluid dynamics (CFD) analyses. This paper summarizes the modeling that was done by MSFC in Thermal Desktop. A venting model of the entire Orbiter was constructed in FloCAD based on Rockwell International's flight substantiation analyses and the STS-107 reentry trajectory. Chemical equilibrium air thermodynamic properties were generated for SINDA/FLUINT's fluid property routines from a code provided by Langley Research Center. In parallel, a simplified thermal mathematical model of the port wing, including the Thermal Protection System (TPS), was based on more detailed Shuttle re-entry modeling previously done by the Dryden Flight Research Center. Once the venting model was coupled with the thermal model of the wing structure with chemical equilibrium air properties, various breach scenarios were assessed in support of the aero-thermodynamics team. The construction of the coupled model and results are presented herein.
Establishment of metrological traceability in porosity measurements by x-ray computed tomography
NASA Astrophysics Data System (ADS)
Hermanek, Petr; Carmignato, Simone
2017-09-01
Internal porosity is an inherent phenomenon in many manufacturing processes, such as casting, additive manufacturing, and others. Since these defects cannot be completely avoided by improving production processes, it is important to have a reliable method to detect and evaluate them accurately. The accurate evaluation becomes even more important in view of current industrial trends to minimize the size and weight of products on the one hand, and to enhance their complexity and performance on the other. X-ray computed tomography (CT) has emerged as a promising instrument for holistic porosity measurements, offering several advantages over equivalent methods already established in the detection of internal defects. The main shortcomings of the conventional techniques are that they yield only general information about total porosity content (e.g. the Archimedes method) or are destructive (e.g. microscopy of cross-sections). On the contrary, CT is a nondestructive technique providing complete information about the size, shape, and distribution of internal porosity. However, due to the lack of international standards and the fact that it is a relatively new measurement technique, CT as a measurement technology has not yet reached maturity. This study proposes a procedure for the establishment of measurement traceability in porosity measurements by CT, including the necessary evaluation of measurement uncertainty. The traceability transfer is carried out through a novel reference standard calibrated by optical and tactile coordinate measuring systems. The measurement uncertainty is calculated following international standards and guidelines. In addition, the accuracy of porosity measurements by CT with the associated measurement uncertainty is evaluated using the reference standard.
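The uncertainty evaluation follows the usual GUM pattern: independent standard-uncertainty components combine in root-sum-of-squares to a combined standard uncertainty, which is then expanded by a coverage factor (k = 2 for roughly 95% coverage). A minimal sketch with hypothetical component values:

```python
import math

def expanded_uncertainty(components, k=2.0):
    """Combine independent standard-uncertainty components by
    root-sum-of-squares (GUM) and expand with coverage factor k."""
    u_combined = math.sqrt(sum(u * u for u in components))
    return k * u_combined

# Hypothetical budget for a CT porosity measurement (units: % porosity):
# reference-standard calibration, repeatability, surface determination.
U = expanded_uncertainty([0.010, 0.020, 0.015])
```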
Launch Vehicle Systems Analysis
NASA Technical Reports Server (NTRS)
Olds, John R.
1999-01-01
This report summarizes the key accomplishments of Georgia Tech's Space Systems Design Laboratory (SSDL) under NASA Grant NAG8-1302 from NASA Marshall Space Flight Center. The report consists of this summary white paper, copies of technical papers written under this grant, and several viewgraph-style presentations. During the course of this grant, four main tasks were completed: (1) the Simulated Combined-Cycle Rocket Engine Analysis Module (SCCREAM), a computer analysis tool for predicting the performance of various RBCC engine configurations; (2) Hyperion, a single-stage-to-orbit vehicle capable of delivering 25,000-pound payloads to the International Space Station orbit; (3) Bantam-X support, a small payload mission; (4) international trajectory support for interplanetary human Mars missions.
Contribution to the optimal shape design of two-dimensional internal flows with embedded shocks
NASA Technical Reports Server (NTRS)
Iollo, Angelo; Salas, Manuel D.
1995-01-01
We explore the practicability of optimal shape design for flows modeled by the Euler equations. We define a functional whose minimum represents the optimality condition. The gradient of the functional with respect to the geometry is calculated with the Lagrange multipliers, which are determined by solving a co-state equation. The optimization problem is then examined by comparing the performance of several gradient-based optimization algorithms. In this formulation, the flow field can be computed to an arbitrary order of accuracy. Finally, some results for internal flows with embedded shocks are presented, including a case for which the solution to the inverse problem does not belong to the design space.
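The role of the gradient in such a design loop can be sketched with a toy functional: steepest descent drives the design variables toward a target distribution. Here the gradient comes from finite differences; the paper's co-state (adjoint, Lagrange-multiplier) formulation obtains the same vector far more cheaply when each evaluation is an expensive flow solve. Everything below is illustrative, not the Euler-equation problem of the paper:

```python
import numpy as np

def functional(shape, target):
    """Toy design functional: mismatch between a 'computed' distribution
    (a cumulative sum standing in for a flow solve) and a target."""
    pressure = np.cumsum(shape)
    return 0.5 * np.sum((pressure - target) ** 2)

def fd_gradient(f, x, eps=1e-6):
    """Central finite-difference gradient: 2*len(x) functional evaluations.
    An adjoint solve delivers the same vector at roughly one extra solve."""
    g = np.zeros_like(x)
    for i in range(x.size):
        xp, xm = x.copy(), x.copy()
        xp[i] += eps
        xm[i] -= eps
        g[i] = (f(xp) - f(xm)) / (2.0 * eps)
    return g

target = np.array([1.0, 2.0, 2.5, 2.0])    # desired distribution (hypothetical)
x = np.zeros(4)                             # initial design variables
for _ in range(500):                        # plain steepest descent
    x -= 0.1 * fd_gradient(lambda s: functional(s, target), x)
```

Comparing optimizers, as the abstract describes, then amounts to swapping the steepest-descent update for conjugate-gradient or quasi-Newton steps.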
Szécsi, László; Kacsó, Ágota; Zeck, Günther; Hantz, Péter
2017-01-01
Light stimulation with precise and complex spatial and temporal modulation is required by a range of research fields, including visual neuroscience, optogenetics, ophthalmology, and visual psychophysics. We developed a user-friendly and flexible stimulus-generating framework (GEARS: GPU-based Eye And Retina Stimulation Software), which offers access to GPU computing power and allows interactive modification of stimulus parameters during experiments. Furthermore, it has built-in support for driving external equipment, as well as for synchronization tasks, via USB ports. The use of GEARS does not require elaborate programming skills. The necessary scripting is visually aided by an intuitive interface, while the details of the underlying software and hardware components remain hidden. Internally, the software is a C++/Python hybrid using OpenGL graphics. Computations are performed on the GPU and are defined in the GLSL shading language. However, all GPU settings, including the GPU shader programs, are automatically generated by GEARS. This is configured through a method encountered in game programming, which allows high flexibility: stimuli are straightforwardly composed using a broad library of basic components. Stimulus rendering is implemented solely in C++, so intermediary libraries for interfacing could be omitted. This enables the program to perform computationally demanding tasks like en masse random number generation or real-time image processing by local and global operations.
Rudyanto, Rina D.; Kerkstra, Sjoerd; van Rikxoort, Eva M.; Fetita, Catalin; Brillet, Pierre-Yves; Lefevre, Christophe; Xue, Wenzhe; Zhu, Xiangjun; Liang, Jianming; Öksüz, İlkay; Ünay, Devrim; Kadipaşaoğlu, Kamuran; Estépar, Raúl San José; Ross, James C.; Washko, George R.; Prieto, Juan-Carlos; Hoyos, Marcela Hernández; Orkisz, Maciej; Meine, Hans; Hüllebrand, Markus; Stöcker, Christina; Mir, Fernando Lopez; Naranjo, Valery; Villanueva, Eliseo; Staring, Marius; Xiao, Changyan; Stoel, Berend C.; Fabijanska, Anna; Smistad, Erik; Elster, Anne C.; Lindseth, Frank; Foruzan, Amir Hossein; Kiros, Ryan; Popuri, Karteek; Cobzas, Dana; Jimenez-Carretero, Daniel; Santos, Andres; Ledesma-Carbayo, Maria J.; Helmberger, Michael; Urschler, Martin; Pienn, Michael; Bosboom, Dennis G.H.; Campo, Arantza; Prokop, Mathias; de Jong, Pim A.; Ortiz-de-Solorzano, Carlos; Muñoz-Barrutia, Arrate; van Ginneken, Bram
2016-01-01
The VESSEL12 (VESsel SEgmentation in the Lung) challenge objectively compares the performance of different algorithms to identify vessels in thoracic computed tomography (CT) scans. Vessel segmentation is fundamental in computer aided processing of data generated by 3D imaging modalities. As manual vessel segmentation is prohibitively time consuming, any real world application requires some form of automation. Several approaches exist for automated vessel segmentation, but judging their relative merits is difficult due to a lack of standardized evaluation. We present an annotated reference dataset containing 20 CT scans and propose nine categories to perform a comprehensive evaluation of vessel segmentation algorithms from both academia and industry. Twenty algorithms participated in the VESSEL12 challenge, held at International Symposium on Biomedical Imaging (ISBI) 2012. All results have been published at the VESSEL12 website http://vessel12.grand-challenge.org. The challenge remains ongoing and open to new participants. Our three contributions are: (1) an annotated reference dataset available online for evaluation of new algorithms; (2) a quantitative scoring system for objective comparison of algorithms; and (3) performance analysis of the strengths and weaknesses of the various vessel segmentation methods in the presence of various lung diseases. PMID:25113321
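A common building block of quantitative segmentation scoring is the Dice overlap between a submitted and a reference segmentation; the sketch below is generic and is not the VESSEL12 scoring system itself, which layers category-wise analysis on top of voxel-level agreement:

```python
import numpy as np

def dice(seg_a, seg_b):
    """Dice overlap between two binary segmentations (1 = vessel voxel):
    2*|A intersect B| / (|A| + |B|), ranging from 0 (disjoint) to 1 (identical)."""
    a, b = seg_a.astype(bool), seg_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

auto = np.array([0, 1, 1, 1, 0, 0, 1, 0])   # algorithm output (toy)
ref  = np.array([0, 1, 1, 0, 0, 1, 1, 0])   # reference annotation (toy)
score = dice(auto, ref)                      # 2*3 / (4+4) = 0.75
```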
Implementing the Victory Access Control Framework in a Military Ground Vehicle
2015-08-01
Protocol (SOAP) message body, but lacked the ability to encrypt individual XML elements within the SOAP body. Several attempts were made to augment the C... Service and both use SOAP and provide freely available WSDLs with similarly defined operations. TS3 even leverages the same XACML engine that is... "Characterizing the Performance of SOAP Toolkits", Fifth IEEE/ACM International Workshop on Grid Computing, pages 365-372, November 2004. [8] J.A
2010-01-01
This meeting report gives an overview of the keynote lectures and a selection of the student oral and poster presentations at the 6th International Society for Computational Biology Student Council Symposium that was held as a precursor event to the annual international conference on Intelligent Systems for Molecular Biology (ISMB). The symposium was held in Boston, MA, USA on July 9th, 2010.
Requirements for a network storage service
NASA Technical Reports Server (NTRS)
Kelly, Suzanne M.; Haynes, Rena A.
1991-01-01
Sandia National Laboratories provides a high-performance classified computer network as a core capability in support of its mission of nuclear weapons design and engineering, physical sciences research, and energy research and development. The network, locally known as the Internal Secure Network (ISN), comprises multiple distributed local area networks (LANs) residing in New Mexico and California. The TCP/IP protocol suite is used for inter-node communications. Scientific workstations and mid-range computers, running UNIX-based operating systems, make up most LANs. One LAN, operated by the Sandia Corporate Computing Directorate, is a general-purpose resource providing a supercomputer and a file server to the entire ISN. The current file server on the supercomputer LAN is an implementation of the Common File Server (CFS). Subsequent to the design of the ISN, Sandia reviewed its mass storage requirements and chose to enter into a competitive procurement to replace the existing file server with one more adaptable to a UNIX/TCP/IP environment. The requirements study for the network was the starting point for the requirements study for the new file server. The file server is called the Network Storage Service (NSS), and its requirements are described. An application or functional description of the NSS is given. The final section adds performance, capacity, and access constraints to the requirements.
Hunt, Geraldine B; Culp, William T N; Mayhew, Kelli N; Mayhew, Philipp; Steffey, Michele A; Zwingenberger, Allison
2014-10-01
To evaluate the in vivo pattern of ameroid constrictor closure of congenital extrahepatic portosystemic shunts in dogs. Prospective study. Dogs (n = 22) with congenital extrahepatic portosystemic shunts. Contrast-enhanced computed tomography was performed immediately before, and at least 8 weeks after placement of ameroid ring constrictors. Plastic-encased ameroid constrictors were used in 17 dogs and metal constrictors in 5 dogs. Presence of residual flow through the portosystemic shunt, additional anomalous vessels, acquired shunts and soft tissue associated with the ameroid constrictor was recorded. Postoperative internal diameter was recorded for the 17 plastic constrictors. Correlations between internal diameter and pre- and postoperative serum protein concentration were analyzed. No ameroid constrictor closed completely: shunt occlusion was always dependent on soft tissue within the ameroid ring. Residual flow through the shunt was present in 4 dogs (18%), although this caused persistent elevation of shunt fraction in only 1 dog (dog 8). The change in ameroid constrictor internal diameter was not significantly correlated with serum protein concentration. Complete shunt occlusion after AC placement is usually dependent on soft tissue reaction. Ameroid constrictors ≥5 mm diameter may not promote complete shunt occlusion. © Copyright 2014 by The American College of Veterinary Surgeons.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amber Shrivastava; Brian Williams; Ali S. Siahpush
2014-06-01
There have been significant efforts by the heat transfer community to investigate the melting phenomenon of materials. These efforts have included the analytical development of equations to represent melting, numerical development of computer codes to assist in modeling the phenomena, and collection of experimental data. The understanding of the melting phenomenon has application in several areas of interest, for example, the melting of a Phase Change Material (PCM) used as a thermal storage medium as well as the melting of the fuel bundle in a nuclear power plant during an accident scenario. The objective of this research is two-fold. First, a numerical investigation, using computational fluid dynamics (CFD), of melting with internal heat generation for a vertical cylindrical geometry is presented. Second, to the best of the authors' knowledge, very few engineering experimental results are available for the case of melting with Internal Heat Generation (IHG). An experiment was performed to produce such data using resistive, or Joule, heating as the IHG mechanism. The numerical results are compared against the experimental results and show favorable agreement. Uncertainties in the numerical and experimental analysis are discussed. Based on the numerical and experimental analysis, recommendations are made for future work.
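The governing idea, conduction plus a volumetric heat source with latent heat absorbed over a mushy zone, can be sketched with a 1-D explicit enthalpy method. All material properties and the geometry below are assumed values for illustration; this is not the CFD model of the report:

```python
import numpy as np

# 1-D explicit enthalpy-method sketch of melting with internal heat
# generation (IHG): rho*dH/dt = k*d2T/dx2 + q''' , T recovered from H.
n, L = 51, 0.1                        # grid points, slab thickness (m)
dx = L / (n - 1)
k, rho, cp = 0.5, 900.0, 2000.0       # W/m-K, kg/m^3, J/kg-K (assumed)
Lf, Tm = 2.0e5, 0.0                   # latent heat (J/kg), melt temperature (C)
q = 5.0e5                             # volumetric heat generation (W/m^3)
dt = 0.2 * rho * cp * dx * dx / k     # stable explicit time step (Fo = 0.2)

H = np.full(n, -10.0 * cp)            # specific enthalpy: start 10 C below Tm

def temperature(h):
    """Recover temperature from enthalpy: solid / mushy (held at Tm) / liquid."""
    t = np.where(h < 0.0, Tm + h / cp, Tm)
    return np.where(h > Lf, Tm + (h - Lf) / cp, t)

for _ in range(2000):
    T = temperature(H)
    lap = np.zeros(n)
    lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / (dx * dx)
    H += dt * (k * lap + q) / rho     # conduction plus volumetric source
    H[0] = H[-1] = -10.0 * cp         # boundaries held cold (solid)

# local melt fraction: 0 below the mushy zone, 1 above it
melt_fraction = float(np.mean(np.clip(H / Lf, 0.0, 1.0)))
```

With the source strong enough to overcome boundary cooling, the interior melts while thin solid layers persist at the cold walls.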
Gong, Yuanzheng; Seibel, Eric J.
2017-01-01
Rapid development in the performance of sophisticated optical components, digital image sensors, and computing capabilities, along with decreasing costs, has enabled three-dimensional (3-D) optical measurement to replace more traditional methods in manufacturing and quality control. The advantages of 3-D optical measurement, such as noncontact operation, high accuracy, rapid operation, and the ability for automation, are extremely valuable for inline manufacturing. However, most current optical approaches are suitable for exterior rather than internal surfaces of machined parts. A 3-D optical measurement approach is proposed based on machine vision for the 3-D profile measurement of tiny complex internal surfaces, such as internally threaded holes. To capture the full topographic extent (peak to valley) of threads, a side-view commercial rigid scope is used to collect images at known camera positions and orientations. A 3-D point cloud is generated with multiview stereo vision using linear motion of the test piece, which is repeated after a rotation to form additional point clouds. Registration of these point clouds into a complete reconstruction uses a proposed automated feature-based 3-D registration algorithm. The resulting 3-D reconstruction is compared with x-ray computed tomography to validate the feasibility of our proposed method for future robotically driven industrial 3-D inspection. PMID:28286351
NASA Astrophysics Data System (ADS)
Gong, Yuanzheng; Seibel, Eric J.
2017-01-01
Rapid development in the performance of sophisticated optical components, digital image sensors, and computing capabilities, along with decreasing costs, has enabled three-dimensional (3-D) optical measurement to replace more traditional methods in manufacturing and quality control. The advantages of 3-D optical measurement, such as noncontact operation, high accuracy, rapid operation, and the ability for automation, are extremely valuable for inline manufacturing. However, most current optical approaches are suitable for exterior rather than internal surfaces of machined parts. A 3-D optical measurement approach is proposed based on machine vision for the 3-D profile measurement of tiny complex internal surfaces, such as internally threaded holes. To capture the full topographic extent (peak to valley) of threads, a side-view commercial rigid scope is used to collect images at known camera positions and orientations. A 3-D point cloud is generated with multiview stereo vision using linear motion of the test piece, which is repeated after a rotation to form additional point clouds. Registration of these point clouds into a complete reconstruction uses a proposed automated feature-based 3-D registration algorithm. The resulting 3-D reconstruction is compared with x-ray computed tomography to validate the feasibility of our proposed method for future robotically driven industrial 3-D inspection.
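A standard ingredient of such point-cloud registration is the Kabsch (SVD-based) solution for the best-fit rigid transform between matched point sets; the paper's feature-based algorithm adds automated correspondence finding on top. A minimal sketch on synthetic matched points:

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q,
    assuming row-wise correspondence, via the Kabsch/SVD construction."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cq - R @ cp

rng = np.random.default_rng(1)
P = rng.normal(size=(40, 3))                  # first point cloud (toy)
angle = np.radians(30.0)                      # known rotation about z
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
Q = P @ R_true.T + np.array([0.5, -1.0, 2.0]) # rotated + translated copy
R, t = kabsch(P, Q)                           # recovers R_true and the shift
```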
ERIC Educational Resources Information Center
Cheema, Jehanzeb R.; Zhang, Bo
2013-01-01
This study looked at the effect of both quantity and quality of computer use on achievement. The Program for International Student Assessment (PISA) 2003 student survey, comprising 4,356 students (boys, n = 2,129; girls, n = 2,227), was used to predict academic achievement from quantity and quality of computer use while controlling for…
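The analysis pattern, regressing achievement on quantity and quality of computer use while controlling for a background variable, can be sketched with ordinary least squares on synthetic data. The coefficients, the control variable, and the noise level below are invented for illustration, not PISA values:

```python
import numpy as np

# Synthetic stand-in for the study design: n matches the reported sample size.
rng = np.random.default_rng(42)
n = 4356
quantity = rng.normal(size=n)       # standardized quantity of computer use
quality = rng.normal(size=n)        # standardized quality of computer use
background = rng.normal(size=n)     # hypothetical control (e.g. SES index)
achievement = (500 + 5 * quantity + 12 * quality + 20 * background
               + rng.normal(scale=30, size=n))

# OLS with an intercept column: beta ~= [500, 5, 12, 20] up to sampling noise
X = np.column_stack([np.ones(n), quantity, quality, background])
beta, *_ = np.linalg.lstsq(X, achievement, rcond=None)
```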
Code of Federal Regulations, 2011 CFR
2011-01-01
... “technology” controlled by 9A011, 9E003.a.1, or 9E003.a.3.a; and (H) Controlled by 9D002, specially designed... computationally access computers that have been enhanced by “electronic assemblies”, which have been exported or reexported under License Exception GOV and have been used to enhance such computers by aggregation of...
Code of Federal Regulations, 2013 CFR
2013-01-01
... “technology” controlled by 9A011, 9E003.a.1, or 9E003.a.3.a; and (H) Controlled by 9D002, specially designed... computationally access computers that have been enhanced by “electronic assemblies”, which have been exported or reexported under License Exception GOV and have been used to enhance such computers by aggregation of...
Code of Federal Regulations, 2012 CFR
2012-01-01
... “technology” controlled by 9A011, 9E003.a.1, or 9E003.a.3.a; and (H) Controlled by 9D002, specially designed... computationally access computers that have been enhanced by “electronic assemblies”, which have been exported or reexported under License Exception GOV and have been used to enhance such computers by aggregation of...
Francescatto, Margherita; Hermans, Susanne M A; Babaei, Sepideh; Vicedo, Esmeralda; Borrel, Alexandre; Meysman, Pieter
2015-01-01
In this meeting report, we give an overview of the talks, presentations and posters presented at the third European Symposium of the International Society for Computational Biology (ISCB) Student Council. The event was organized as a satellite meeting of the 13th European Conference on Computational Biology (ECCB) and took place in Strasbourg, France on September 6th, 2014.
Computational toxicity in 21st century safety sciences (China ...
presentation at the Joint Meeting of the Analytical Toxicology and Computational Toxicology Committee (Chinese Society of Toxicology) International Workshop on Advanced Chemical Safety Assessment Technologies on 11 May 2016, Fuzhou University, Fuzhou, China
ERIC Educational Resources Information Center
Baptista Nunes, Miguel, Ed.; McPherson, Maggie, Ed.
2014-01-01
These proceedings contain the papers of the International Conference e-Learning 2014, which was organised by the International Association for Development of the Information Society and is part of the Multi Conference on Computer Science and Information Systems (Lisbon, Portugal July 15-19, 2014). The e-Learning 2014 conference aims to address the…
Purchasing a Computer System for the Small Construction Company,
1983-06-08
October 1982. … IBM - International Business Machines Corporation, "Small Systems Solutions: An Introduction to Business Computing," Pamphlet SC21-5205-0, Atlanta, Georgia, January 1979. International Business Machines Corporation, "IBM System/34 Introduction," Pamphlet GC21-5153-5, File No. S34-00, Atlanta, Georgia, January 1979. International Business Machines
Transient Three-Dimensional Side Load Analysis of Out-of-Round Film Cooled Nozzles
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Lin, Jeff; Ruf, Joe; Guidos, Mike
2010-01-01
The objective of this study is to investigate the effect of nozzle out-of-roundness on the transient startup side loads. The out-of-roundness could be the result of asymmetric loads induced by hardware attached to the nozzle, asymmetric internal stresses induced by previous tests and/or deformation, such as creep, from previous tests. The rocket engine studied encompasses a regeneratively cooled thrust chamber and a film cooled nozzle extension with film coolant distributed from a turbine exhaust manifold. The computational methodology is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, and a transient inlet history based on an engine system simulation. Transient startup computations were performed with the out-of-roundness achieved by four degrees of ovalization of the nozzle: one perfectly round, one slightly out-of-round, one more out-of-round, and one significantly out-of-round. The computed side load physics caused by the nozzle out-of-roundness and its effect on nozzle side load are reported and discussed.
Computational Modelling of Patella Femoral Kinematics During Gait Cycle and Experimental Validation
NASA Astrophysics Data System (ADS)
Maiti, Raman
2016-06-01
The effect of loading and boundary conditions on patellar mechanics is significant due to the complications arising in patella femoral joints during total knee replacements. To understand the patellar mechanics with respect to loading and motion, a computational model representing the patella femoral joint was developed and validated against experimental results. The computational model was created in IDEAS NX and simulated in MSC ADAMS/VIEW software. The results obtained in the form of internal external rotations and anterior posterior displacements for a new and experimentally simulated specimen for patella femoral joint under standard gait condition were compared with experimental measurements performed on the Leeds ProSim knee simulator. A good overall agreement between the computational prediction and the experimental data was obtained for patella femoral kinematics. Good agreement between the model and the past studies was observed when the ligament load was removed and the medial lateral displacement was constrained. The model is sensitive to ±5 % change in kinematics, frictional, force and stiffness coefficients and insensitive to time step.
Wang, Shanshan; Pavlicek, William; Roberts, Catherine C; Langer, Steve G; Zhang, Muhong; Hu, Mengqi; Morin, Richard L; Schueler, Beth A; Wellnitz, Clinton V; Wu, Teresa
2011-04-01
The U.S. National Press has brought to full public discussion concerns regarding the use of medical radiation, specifically x-ray computed tomography (CT), in diagnosis. A need exists for developing methods whereby assurance is given that all diagnostic medical radiation use is properly prescribed, and all patients' radiation exposure is monitored. The "DICOM Index Tracker©" (DIT) transparently captures desired digital imaging and communications in medicine (DICOM) tags from CT, nuclear imaging equipment, and other DICOM devices across an enterprise. Its initial use is recording, monitoring, and providing automatic alerts to medical professionals of excursions beyond internally determined trigger action levels of radiation. A flexible knowledge base, aware of equipment in use, enables automatic alerts to system administrators of newly identified equipment models or software versions so that DIT can be adapted to the new equipment or software. A dosimetry module accepts mammography breast organ dose, skin air kerma values from XA modalities, exposure indices from computed radiography, etc. upon receipt. The American Association of Physicists in Medicine recommended a methodology for effective dose calculations which are performed with CT units having DICOM structured dose reports. Web interface reporting is provided for accessing the database in real-time. DIT is DICOM-compliant and, thus, is standardized for international comparisons. Automatic alerts currently in use include: email, cell phone text message, and internal pager text messaging. This system extends the utility of DICOM for standardizing the capturing and computing of radiation dose as well as other quality measures.
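The effective dose methodology mentioned above amounts to applying a body-region conversion coefficient to the dose-length product (DLP) reported in the CT structured dose report, E = k × DLP. A minimal sketch follows; the coefficients are illustrative adult values of the kind tabulated in the AAPM/ICRP literature, not values taken from this paper, and a production system such as DIT would read them from its knowledge base:

```python
# Sketch of a DLP-based effective dose estimate (E = k * DLP).
# The conversion coefficients below are illustrative adult values in
# mSv/(mGy*cm); real systems key them by body region and patient model.
K_FACTORS = {"head": 0.0021, "chest": 0.014, "abdomen_pelvis": 0.015}

def effective_dose_msv(dlp_mgy_cm: float, region: str) -> float:
    """Estimate effective dose from a CT dose-length product."""
    return K_FACTORS[region] * dlp_mgy_cm

# e.g. a chest CT with DLP = 400 mGy*cm
dose = effective_dose_msv(400.0, "chest")
```

A dose-tracking system would compare such estimates against its trigger action levels before issuing alerts.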
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-02
... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-841] Certain Computers and Computer Peripheral... after importation of certain computers and computer peripheral devices and components thereof and... computers and computer peripheral devices and components thereof and products containing the same that...
Micro-Biomechanics of the Kebara 2 Hyoid and Its Implications for Speech in Neanderthals
D’Anastasio, Ruggero; Wroe, Stephen; Tuniz, Claudio; Mancini, Lucia; Cesana, Deneb T.; Dreossi, Diego; Ravichandiran, Mayoorendra; Attard, Marie; Parr, William C. H.; Agur, Anne; Capasso, Luigi
2013-01-01
The description of a Neanderthal hyoid from Kebara Cave (Israel) in 1989 fuelled scientific debate on the evolution of speech and complex language. Gross anatomy of the Kebara 2 hyoid differs little from that of modern humans. However, whether Homo neanderthalensis could use speech or complex language remains controversial. Similarity in overall shape does not necessarily demonstrate that the Kebara 2 hyoid was used in the same way as that of Homo sapiens. The mechanical performance of whole bones is partly controlled by internal trabecular geometries, regulated by bone-remodelling in response to the forces applied. Here we show that the Neanderthal and modern human hyoids also present very similar internal architectures and micro-biomechanical behaviours. Our study incorporates detailed analysis of histology, meticulous reconstruction of musculature, and computational biomechanical analysis with models incorporating internal micro-geometry. Because internal architecture reflects the loadings to which a bone is routinely subjected, our findings are consistent with a capacity for speech in the Neanderthals. PMID:24367509
Bruno Garza, J L; Eijckelhof, B H W; Johnson, P W; Raina, S M; Rynell, P W; Huysmans, M A; van Dieën, J H; van der Beek, A J; Blatter, B M; Dennerlein, J T
2012-01-01
This study, a part of the PRedicting Occupational biomechanics in OFfice workers (PROOF) study, investigated whether there are differences in field-measured forces, muscle efforts, postures, velocities and accelerations across computer activities. These parameters were measured continuously for 120 office workers performing their own work for two hours each. There were differences in nearly all forces, muscle efforts, postures, velocities and accelerations across keyboard, mouse and idle activities. Keyboard activities showed a 50% increase in the median right trapezius muscle effort when compared to mouse activities. Median shoulder rotation changed from 25 degrees internal rotation during keyboard use to 15 degrees external rotation during mouse use. Only keyboard use was associated with median ulnar deviations greater than 5 degrees. Idle activities led to the greatest variability observed in all muscle efforts and postures measured. In future studies, measurements of computer activities could be used to provide information on the physical exposures experienced during computer use. Practitioner Summary: Computer users may develop musculoskeletal disorders due to their force, muscle effort, posture and wrist velocity and acceleration exposures during computer use. We report that many physical exposures are different across computer activities. This information may be used to estimate physical exposures based on patterns of computer activities over time.
PREFACE: International Conference on Computing in High Energy and Nuclear Physics (CHEP'07)
NASA Astrophysics Data System (ADS)
Sobie, Randall; Tafirout, Reda; Thomson, Jana
2007-07-01
The 2007 International Conference on Computing in High Energy and Nuclear Physics (CHEP) was held on 2-7 September 2007 in Victoria, British Columbia, Canada. CHEP is a major series of international conferences for physicists and computing professionals from the High Energy and Nuclear Physics community, Computer Science and Information Technology. The CHEP conference provides an international forum to exchange information on computing experience and needs for the community, and to review recent, ongoing, and future activities. The CHEP'07 conference had close to 500 attendees with a program that included plenary sessions of invited oral presentations, a number of parallel sessions comprising oral and poster presentations, and an industrial exhibition. Conference tracks covered topics in Online Computing, Event Processing, Software Components, Tools and Databases, Software Tools and Information Systems, Computing Facilities, Production Grids and Networking, Grid Middleware and Tools, Distributed Data Analysis and Information Management and Collaborative Tools. The conference included a successful whale-watching excursion involving over 200 participants and a banquet at the Royal British Columbia Museum. The next CHEP conference will be held in Prague in March 2009. We would like to thank the sponsors of the conference and the staff at the TRIUMF Laboratory and the University of Victoria who made CHEP'07 a success. Randall Sobie and Reda Tafirout CHEP'07 Conference Chairs
Efficient Strategies for Predictive Cell-Level Control of Lithium-Ion Batteries
NASA Astrophysics Data System (ADS)
Xavier, Marcelo A.
This dissertation introduces a set of state-space model predictive control (MPC) algorithms tailored to models with a non-zero feedthrough term, which accounts for the ohmic resistance inherent in battery dynamics. MPC is applied here to the problem of regulating cell-level measures of performance for lithium-ion batteries; the control methodologies are used first to compute a fast-charging profile that respects input, output, and state constraints, i.e., input current, terminal voltage, and state of charge, for an equivalent circuit model of the battery cell, and are extended later to a linearized physics-based reduced-order model. The novelty of this work can be summarized as follows: (1) the MPC variants are applied to a physics-based reduced-order model in order to make use of the available set of internal electrochemical variables and mitigate internal mechanisms of cell degradation (e.g., lithium plating); (2) we developed a dual-mode MPC closed-loop paradigm suited to the battery control problem, with the objective of reducing computational effort by solving simpler optimization routines while guaranteeing stability; and (3) we developed a new use of predictive control in which MPC is employed as a "smart sensor" for power estimation. Results are presented comparing the performance of the MPC algorithms for both the equivalent circuit model and the physics-based reduced-order model. These results highlight that dual-mode MPC can deliver optimal input-current profiles using a shorter horizon while still guaranteeing stability. Rigorous mathematical developments are also presented for the MPC algorithms. The use of MPC as a "smart sensor" is an appealing method for power estimation, since MPC permits a fully dynamic input profile that achieves performance right at the proper constraint boundaries. MPC is therefore expected to produce accurate power limits at each computed sample time when compared to the bisection method [1], which assumes constant input values over the prediction interval.
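The constrained fast-charging idea can be sketched as a one-step receding-horizon loop over a trivial equivalent-circuit model, v = OCV(z) + R·i, with the R·i term playing the role of the ohmic feedthrough. This is a drastic simplification of a multi-step MPC with a terminal controller; all parameter values are illustrative assumptions, not taken from the dissertation:

```python
# One-step receding-horizon fast charge: at each step, apply the largest
# current that satisfies both the current limit and the terminal-voltage
# limit, then propagate the state of charge. Toy parameters throughout.
def fast_charge(z0=0.2, q_ah=2.5, r=0.05, dt_s=1.0,
                i_max=10.0, v_max=4.2, z_target=0.8):
    ocv = lambda z: 3.0 + 1.2 * z          # toy linear open-circuit voltage
    z, profile = z0, []
    while z < z_target:
        # largest current meeting both constraints: i <= i_max and
        # ocv(z) + r*i <= v_max (the feedthrough/voltage constraint)
        i = min(i_max, (v_max - ocv(z)) / r)
        if i <= 0:
            break                          # voltage limit reached
        z = min(z_target, z + i * dt_s / (q_ah * 3600.0))
        profile.append(i)
    return z, profile
```

A genuine MPC would optimize over a multi-step horizon (and, in dual-mode form, switch to a stabilizing terminal controller), but the constraint-riding behavior is the same.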
Benkert, Thomas; Tian, Ye; Huang, Chenchan; DiBella, Edward V R; Chandarana, Hersh; Feng, Li
2018-07-01
Golden-angle radial sparse parallel (GRASP) MRI reconstruction requires gridding and regridding to transform data between radial and Cartesian k-space. These operations are repeatedly performed in each iteration, which makes the reconstruction computationally demanding. This work aimed to accelerate GRASP reconstruction using self-calibrating GRAPPA operator gridding (GROG) and to validate its performance in clinical imaging. GROG is an alternative gridding approach based on parallel imaging, in which k-space data acquired on a non-Cartesian grid are shifted onto a Cartesian k-space grid using information from multicoil arrays. For iterative non-Cartesian image reconstruction, GROG is performed only once as a preprocessing step. Therefore, the subsequent iterative reconstruction can be performed directly in Cartesian space, which significantly reduces computational burden. Here, a framework combining GROG with GRASP (GROG-GRASP) is first optimized and then compared with standard GRASP reconstruction in 22 prostate patients. GROG-GRASP achieved an approximately 4.2-fold reduction in reconstruction time compared with GRASP (∼78 min versus ∼333 min) while maintaining image quality (structural similarity index ≈ 0.97 and root mean square error ≈ 0.007). Visual image quality assessment by two experienced radiologists did not show significant differences between the two reconstruction schemes. With a graphics processing unit implementation, image reconstruction time can be further reduced to approximately 14 min. The GRASP reconstruction can be substantially accelerated using GROG. This framework is promising toward broader clinical application of GRASP and other iterative non-Cartesian reconstruction methods. Magn Reson Med 80:286-293, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
CSDMS2.0: Computational Infrastructure for Community Surface Dynamics Modeling
NASA Astrophysics Data System (ADS)
Syvitski, J. P.; Hutton, E.; Peckham, S. D.; Overeem, I.; Kettner, A.
2012-12-01
The Community Surface Dynamic Modeling System (CSDMS) is an NSF-supported, international and community-driven program that seeks to transform the science and practice of earth-surface dynamics modeling. CSDMS integrates a diverse community of more than 850 geoscientists representing 360 international institutions (academic, government, industry) from 60 countries and is supported by a CSDMS Interagency Committee (22 Federal agencies) and a CSDMS Industrial Consortium (18 companies). CSDMS presently distributes more than 200 Open Source models and modeling tools, access to high performance computing clusters in support of developing and running models, and a suite of products for education and knowledge transfer. CSDMS software architecture employs frameworks and services that convert stand-alone models into flexible "plug-and-play" components to be assembled into larger applications. CSDMS2.0 will support model applications within a web browser, on a wider variety of computational platforms, and on other high performance computing clusters to ensure robustness and sustainability of the framework. Conversion of stand-alone models into "plug-and-play" components will employ automated wrapping tools. Methods for quantifying model uncertainty are being adapted as part of the modeling framework. Benchmarking data are being incorporated into the CSDMS modeling framework to support model inter-comparison. Finally, a robust mechanism for ingesting and utilizing semantic mediation databases is being developed within the Modeling Framework.
Six new community initiatives are being pursued: 1) an earth-ecosystem modeling initiative to capture ecosystem dynamics and ensuing interactions with landscapes, 2) a geodynamics initiative to investigate the interplay among climate, geomorphology, and tectonic processes, 3) an Anthropocene modeling initiative, to incorporate mechanistic models of human influences, 4) a coastal vulnerability modeling initiative, with emphasis on deltas and their multiple threats and stressors, 5) a continental margin modeling initiative, to capture extreme oceanic and atmospheric events generating turbidity currents in the Gulf of Mexico, and 6) a CZO Focus Research Group, to develop compatibility between CSDMS architecture and protocols and Critical Zone Observatory-developed models and data.
Hjermstad, Marianne Jensen; Lie, Hanne C; Caraceni, Augusto; Currow, David C; Fainsinger, Robin L; Gundersen, Odd Erik; Haugen, Dagny Faksvaag; Heitzer, Ellen; Radbruch, Lukas; Stone, Patrick C; Strasser, Florian; Kaasa, Stein; Loge, Jon Håvard
2012-11-01
Symptom assessment by computers is only effective if it provides valid results and is perceived as useful for clinical use by the end users: patients and health care providers. To identify factors associated with discontinuation, time expenditure, and patient preferences of the computerized symptom assessment used in an international multicenter data collection project: the European Palliative Care Research Collaborative-Computerized Symptom Assessment. Cancer patients with incurable metastatic or locally advanced disease were recruited from 17 centers in eight countries, providing 1017 records for analyses. Observer-based registrations and patient-reported measures on pain, depression, and physical function were entered on touch screen laptop computers. The entire assessment was completed by 94.9% (n = 965), with median age 63 years (range 18-91 years) and median Karnofsky Performance Status (KPS) score of 70 (range 20-100). Predictive factors for noncompletion were higher age, lower KPS, and more pain (P ≤ 0.012). Time expenditure among completers increased with higher age, male gender, Norwegian nationality, number of comorbidities, and lower physical functioning (P ≤ 0.007) but was inversely related to pain levels and tiredness (P ≤ 0.03). Need for assistance was predicted by higher age, nationality other than Norwegian, lower KPS, and lower educational level (P < 0.001). More than 50% of patients preferred computerized assessment to a paper and pencil version. The high completion rate shows that symptom assessment by computers is feasible in patients with advanced cancer. However, reduced performance status reduces compliance and increases the need for assistance. Future work should aim at identifying the minimum set of valid screening questions and refine the software to optimize symptom assessment and reduce respondent burden in frail patients. Copyright © 2012 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.
A critical evaluation of various turbulence models as applied to internal fluid flows
NASA Technical Reports Server (NTRS)
Nallasamy, M.
1985-01-01
Models employed in the computation of turbulent flows are described and their application to internal flows is evaluated by examining the predictions of various turbulence models in selected flow configurations. The main conclusions are: (1) the k-epsilon model is used in a majority of all the two-dimensional flow calculations reported in the literature; (2) modified forms of the k-epsilon model improve the performance for flows with streamline curvature and heat transfer; (3) for flows with swirl, the k-epsilon model performs rather poorly; the algebraic stress model performs better in this case; and (4) for flows with regions of secondary flow (noncircular duct flows), the algebraic stress model performs fairly well for fully developed flow, for developing flow, the algebraic stress model performance is not good; a Reynolds stress model should be used. False diffusion and inlet boundary conditions are discussed. Countergradient transport and its implications in turbulence modeling is mentioned. Two examples of recirculating flow predictions obtained using PHOENICS code are discussed. The vortex method, large eddy simulation (modeling of subgrid scale Reynolds stresses), and direct simulation, are considered. Some recommendations for improving the model performance are made. The need for detailed experimental data in flows with strong curvature is emphasized.
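For reference, the k-epsilon model discussed in conclusions (1)-(3) is, in its standard high-Reynolds-number form (textbook formulation with the usual model constants, not reproduced from the report):

```latex
\frac{\partial k}{\partial t} + U_j \frac{\partial k}{\partial x_j}
  = \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_k}\right)\frac{\partial k}{\partial x_j}\right]
  + P_k - \varepsilon
```

```latex
\frac{\partial \varepsilon}{\partial t} + U_j \frac{\partial \varepsilon}{\partial x_j}
  = \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_\varepsilon}\right)\frac{\partial \varepsilon}{\partial x_j}\right]
  + C_{\varepsilon 1}\,\frac{\varepsilon}{k}\,P_k
  - C_{\varepsilon 2}\,\frac{\varepsilon^2}{k}
```

with eddy viscosity ν_t = C_μ k²/ε, production P_k, and constants C_μ = 0.09, C_ε1 = 1.44, C_ε2 = 1.92, σ_k = 1.0, σ_ε = 1.3. The isotropic eddy-viscosity assumption embedded in ν_t is what the algebraic stress and Reynolds stress models relax, explaining their advantage for swirling and secondary flows noted above.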
ISMB 2016 offers outstanding science, networking, and celebration
Fogg, Christiana
2016-01-01
The annual international conference on Intelligent Systems for Molecular Biology (ISMB) is the major meeting of the International Society for Computational Biology (ISCB). Over the past 23 years the ISMB conference has grown to become the world's largest bioinformatics/computational biology conference. ISMB 2016 will be the year's most important computational biology event globally. The conferences provide a multidisciplinary forum for disseminating the latest developments in bioinformatics/computational biology. ISMB brings together scientists from computer science, molecular biology, mathematics, statistics and related fields. Its principal focus is on the development and application of advanced computational methods for biological problems. ISMB 2016 offers the strongest scientific program and the broadest scope of any international bioinformatics/computational biology conference. Building on past successes, the conference is designed to cater to a variety of disciplines within the bioinformatics/computational biology community. ISMB 2016 takes place July 8-12 at the Swan and Dolphin Hotel in Orlando, Florida, United States. For two days preceding the conference, additional opportunities including Satellite Meetings, Student Council Symposium, and a selection of Special Interest Group Meetings and Applied Knowledge Exchange Sessions (AKES) are all offered to enable registered participants to learn more on the latest methods and tools within specialty research areas. PMID:27347392
DESPIC: Detecting Early Signatures of Persuasion in Information Cascades
2015-08-27
…over NoSQL Databases. Proceedings of the 14th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (CCGrid 2014), Chicago, IL, USA, 26 May 2014. Using distributed NoSQL databases including HBase and Riak, we finalized the requirements of the optimal computational architecture to support our framework
2015-01-01
In this meeting report, we give an overview of the talks, presentations and posters presented at the third European Symposium of the International Society for Computational Biology (ISCB) Student Council. The event was organized as a satellite meeting of the 13th European Conference for Computational Biology (ECCB) and took place in Strasbourg, France on September 6th, 2014. PMID:25708611
Integrated Computational Materials Engineering for Magnesium in Automotive Body Applications
NASA Astrophysics Data System (ADS)
Allison, John E.; Liu, Baicheng; Boyle, Kevin P.; Hector, Lou; McCune, Robert
This paper provides an overview and progress report for an international collaborative project which aims to develop an ICME infrastructure for magnesium for use in automotive body applications. Quantitative processing-microstructure-property relationships are being developed for extruded Mg alloys, sheet-formed Mg alloys, and high-pressure die-cast Mg alloys. These relationships are captured in computational models which are then linked with manufacturing process simulation and used to provide constitutive models for component performance analysis. The long term goal is to capture this information in efficient computational models and in a web-centered knowledge base. The work is being conducted at leading universities, national labs and industrial research facilities in the US, China and Canada. This project is sponsored by the U.S. Department of Energy, the U.S. Automotive Materials Partnership (USAMP), the Chinese Ministry of Science and Technology (MOST) and Natural Resources Canada (NRCan).
System and Method for Providing a Climate Data Persistence Service
NASA Technical Reports Server (NTRS)
Schnase, John L. (Inventor); Ripley, III, William David (Inventor); Duffy, Daniel Q. (Inventor); Thompson, John H. (Inventor); Strong, Savannah L. (Inventor); McInerney, Mark (Inventor); Sinno, Scott (Inventor); Tamkin, Glenn S. (Inventor); Nadeau, Denis (Inventor)
2018-01-01
A system, method and computer-readable storage devices for providing a climate data persistence service. A system configured to provide the service can include a climate data server that performs data and metadata storage and management functions for climate data objects, a compute-storage platform that provides the resources needed to support a climate data server, provisioning software that allows climate data server instances to be deployed as virtual climate data servers in a cloud computing environment, and a service interface, wherein persistence service capabilities are invoked by software applications running on a client device. The climate data objects can be in various formats, such as International Organization for Standards (ISO) Open Archival Information System (OAIS) Reference Model Submission Information Packages, Archive Information Packages, and Dissemination Information Packages. The climate data server can enable scalable, federated storage, management, discovery, and access, and can be tailored for particular use cases.
NASA Technical Reports Server (NTRS)
Garg, Vijay K.
2001-01-01
The turbine gas path is a very complex flow field. This is due to a variety of flow and heat transfer phenomena encountered in turbine passages. This manuscript provides an overview of the current work in this field at the NASA Glenn Research Center. Also, based on the author's preference, more emphasis is on the computational work. There is much more experimental work in progress at GRC than that reported here. While much has been achieved, more needs to be done in terms of validating the predictions against experimental data. More experimental data, especially on film cooled and rough turbine blades, are required for code validation. Also, the combined film cooling and internal cooling flow computation for a real blade is yet to be performed. While most computational work to date has assumed steady state conditions, the flow is clearly unsteady due to the presence of wakes. All this points to a long road ahead. However, we are well on course.
Space station Simulation Computer System (SCS) study for NASA/MSFC. Volume 2: Concept document
NASA Technical Reports Server (NTRS)
1989-01-01
The Simulation Computer System (SCS) concept document describes and establishes requirements for the functional performance of the SCS system, including interface, logistic, and qualification requirements. The SCS is the computational communications and display segment of the Marshall Space Flight Center (MSFC) Payload Training Complex (PTC). The PTC is the MSFC facility that will train onboard and ground operations personnel to operate the payloads and experiments on board the international Space Station Freedom. The requirements to be satisfied by the system implementation are identified here. The SCS concept document defines the requirements to be satisfied through the implementation of the system capability. The information provides the operational basis for defining the requirements to be allocated to the system components and enables the system organization to assess whether or not the completed system complies with the requirements of the system.
A Computer Model for Analyzing Volatile Removal Assembly
NASA Technical Reports Server (NTRS)
Guo, Boyun
2010-01-01
A computer model simulates reactional gas/liquid two-phase flow processes in porous media. A typical process is the oxygen/wastewater flow in the Volatile Removal Assembly (VRA) in the Closed Environment Life Support System (CELSS) installed in the International Space Station (ISS). The volatile organics in the wastewater are combusted by oxygen gas to form clean water and carbon dioxide, which is dissolved in the water phase. The model predicts the oxygen gas concentration profile in the reactor, which is an indicator of reactor performance. In this innovation, a mathematical model is included in the computer model for calculating the mass transfer from the gas phase to the liquid phase. The amount of mass transfer depends on several factors, including gas-phase concentration, distribution, and reaction rate. For a given reactor dimension, these factors depend on pressure and temperature in the reactor and composition and flow rate of the influent.
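A gas-to-liquid mass-transfer calculation of the kind described can be sketched with the classical two-film model: the oxygen flux into the water is kLa·(C* − C), with the interfacial saturation concentration C* set by Henry's law, balanced against first-order consumption by the oxidation reaction. The kLa, Henry constant, and reaction rate below are illustrative assumptions, not VRA design values:

```python
# Two-film gas/liquid mass-transfer sketch: Euler-integrate dissolved O2
# concentration, with transfer in (kLa * (C* - C)) and reaction out.
def o2_profile(n_steps=100, dt_s=1.0, kla_per_s=0.02,
               p_o2_pa=2.0e5, henry_pa_m3_per_mol=7.9e4,
               c0_mol_m3=0.0, reaction_rate_per_s=0.005):
    c_star = p_o2_pa / henry_pa_m3_per_mol   # Henry's-law saturation, mol/m^3
    c, profile = c0_mol_m3, []
    for _ in range(n_steps):
        transfer = kla_per_s * (c_star - c)   # gas -> liquid flux term
        consumed = reaction_rate_per_s * c    # oxidation of volatiles
        c += dt_s * (transfer - consumed)
        profile.append(c)
    return profile
```

The profile rises monotonically toward the steady state where transfer balances consumption, the qualitative behavior the full porous-media model resolves spatially along the reactor.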
Research on three-dimensional reconstruction method based on binocular vision
NASA Astrophysics Data System (ADS)
Li, Jinlin; Wang, Zhihui; Wang, Minjun
2018-03-01
A hot and difficult issue in computer vision, binocular stereo vision is an important form of computer vision with broad application prospects in many fields, such as aerial mapping, vision navigation, motion analysis, and industrial inspection. In this paper, research is done into binocular stereo camera calibration, image feature extraction, and stereo matching. In the camera calibration module, the internal parameters of a single camera are obtained using Zhang Zhengyou's checkerboard calibration method. For image feature extraction and stereo matching, the SURF operator (a local feature operator) and the SGBM algorithm (a global matching algorithm) are adopted, respectively, and their performance is compared. After the feature points are matched, the correspondence between matching points and 3D object points can be built using the calibrated camera parameters, which yields the 3D information.
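The final triangulation step in such a pipeline reduces, for a rectified stereo pair, to recovering depth from disparity via Z = f·B/d. A minimal sketch follows; the focal length and baseline are assumed values, not taken from the paper:

```python
# Depth from disparity for a rectified stereo pair: Z = f * B / d,
# with f in pixels, baseline B in metres, disparity d in pixels.
def depth_from_disparity(d_px: float, f_px: float = 700.0,
                         baseline_m: float = 0.12) -> float:
    if d_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return f_px * baseline_m / d_px
```

Applied per pixel to an SGBM disparity map, this gives the dense 3D point cloud; note that depth falls off inversely with disparity, so distant points are resolved more coarsely.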
Non-linear wave phenomena in Josephson elements for superconducting electronics
NASA Astrophysics Data System (ADS)
Christiansen, P. L.; Parmentier, R. D.; Skovgaard, O.
1985-07-01
The long and intermediate-length Josephson tunnel junction oscillator with overlap geometry, in linear and circular configurations, is investigated by computational solution of the perturbed sine-Gordon equation model and by experimental measurements. The model predicts the experimental results very well. Line oscillators as well as ring oscillators are treated. For long junctions, soliton perturbation methods are developed and turn out to be efficient prediction tools, also providing physical understanding of the dynamics of the oscillator. For intermediate-length junctions, expansions in terms of linear cavity modes reduce computational costs. The narrow linewidth of the electromagnetic radiation (typically 1 kHz for a line at 10 GHz) is demonstrated experimentally. Corresponding computer simulations, requiring a relative accuracy better than 10⁻⁷, are performed on the CRAY-1-S supercomputer. The broadening of linewidth due to external microradiation and internal thermal noise is determined.
10th International Conference of Computational Methods in Sciences and Engineering
2014-12-22
We organized the symposium "Electronic Transport Properties in the Presence of Density Modulation" at the 10th International Conference of Computational Methods in Sciences and Engineering (ICCMSE 2014), April 4-7, 2014, Athens. …Superlattices by Coplanar Waveguide: Dr. Endo reported his recent experimental work on thermoelectric power of two-dimensional electron gases in the quantum…
ICCE/ICCAI 2000 Full & Short Papers (Collaborative Learning).
ERIC Educational Resources Information Center
2000
This document contains the full and short papers on collaborative learning from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: comparison of applying Internet to cooperative and traditional learning; a distributed backbone system for…
ICCE/ICCAI 2000 Full & Short Papers (Creative Learning).
ERIC Educational Resources Information Center
2000
This document contains the following full and short papers on creative learning from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Collaborative Learning Support System Based on Virtual Environment Server for Multiple Agents" (Takashi Ohno, Kenji…
ICCE/ICCAI 2000 Full & Short Papers (Others).
ERIC Educational Resources Information Center
2000
This document contains the following full and short papers from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Code Restructuring Tool To Help Scaffold Novice Programmers" (Stuart Garner); (2) "An Assessment Framework for Information Technology Integrated…
BJUT at TREC 2015 Microblog Track: Real Time Filtering Using Knowledge Base
2015-11-20
…learning to rank of tweets. In Proceedings of the 23rd International Conference on Computational Linguistics, pages 295-303. Association for Computational Linguistics, 2010. Thorsten Joachims. Optimizing search engines using clickthrough data. In Proceedings of the eighth ACM SIGKDD international…
The Fundamental Reasons Why Laptop Computers should not be Used on Your Lap.
Mortazavi, S A R; Taeb, S; Mortazavi, S M J; Zarei, S; Haghani, M; Habibzadeh, P; Shojaei-Fard, M B
2016-12-01
With the growing tendency to adopt new technologies, gadgets such as laptop computers are becoming more popular among students, teachers, businessmen and office workers. Today laptops are a great tool for education and learning, work and personal multimedia. Millions of men, especially those of reproductive age, frequently use their laptop computers on the lap (thigh). Over the past several years, our lab has focused on the health effects of exposure to different sources of electromagnetic fields such as cellular phones, mobile base stations, mobile phone jammers, laptop computers, radars, dentistry cavitrons and Magnetic Resonance Imaging (MRI). Our own studies, as well as studies performed by other researchers, indicate that using laptop computers on the lap adversely affects male reproductive health. When a laptop is placed on the lap, not only can its heat warm the scrotum, but the electromagnetic fields generated by its internal electronic circuits, as well as the Wi-Fi radiofrequency radiation (in a Wi-Fi-connected laptop), may decrease sperm quality. Furthermore, due to poor working posture, laptops should not be used on the lap for long hours.
The Fundamental Reasons Why Laptop Computers should not be Used on Your Lap
Mortazavi, S.A.R.; Taeb, S.; Mortazavi, S.M.J.; Zarei, S.; Haghani, M.; Habibzadeh, P.; Shojaei-fard, M.B.
2016-01-01
With the growing tendency to adopt new technologies, gadgets such as laptop computers are becoming more popular among students, teachers, businessmen and office workers. Today laptops are a great tool for education and learning, work and personal multimedia. Millions of men, especially those of reproductive age, frequently use their laptop computers on the lap (thigh). Over the past several years, our lab has focused on the health effects of exposure to different sources of electromagnetic fields such as cellular phones, mobile base stations, mobile phone jammers, laptop computers, radars, dentistry cavitrons and Magnetic Resonance Imaging (MRI). Our own studies, as well as studies performed by other researchers, indicate that using laptop computers on the lap adversely affects male reproductive health. When a laptop is placed on the lap, not only can its heat warm the scrotum, but the electromagnetic fields generated by its internal electronic circuits, as well as the Wi-Fi radiofrequency radiation (in a Wi-Fi-connected laptop), may decrease sperm quality. Furthermore, due to poor working posture, laptops should not be used on the lap for long hours. PMID:28144597
A workload model and measures for computer performance evaluation
NASA Technical Reports Server (NTRS)
Kerner, H.; Kuemmerle, K.
1972-01-01
A generalized workload definition is presented which constructs measurable workloads of unit size from workload elements, called elementary processes. An elementary process makes almost exclusive use of one of the processors (CPU, I/O processor, etc.) and is measured by the cost of its execution. Various kinds of user programs can be simulated by quantitative composition of elementary processes into a type. The character of a type is defined by the weights of its elementary processes, and its structure by the number and sequence of transitions between them. A set of types is batched into a mix, and mixes of identical cost are considered equivalent amounts of workload. These formalized descriptions of workloads allow investigators to compare the results of different studies quantitatively. Since workloads of different composition are assigned a unit of cost, these descriptions also enable determination of the cost effectiveness of different workloads on a machine. Subsequently, performance parameters such as throughput rate, gain factor, and internal and external delay factors are defined and used to demonstrate the effects of various workload attributes on the performance of a selected large-scale computer system.
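The composition rule described above (elementary processes, weighted into types, batched into mixes, compared by total cost) can be sketched directly. The processor names, costs and weights below are illustrative, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Elementary:
    processor: str  # the processor this elementary process (almost) exclusively uses
    cost: float     # cost of one execution

def type_cost(weighted_eps):
    """Cost of a user-program 'type': weighted sum over its elementary processes."""
    return sum(ep.cost * w for ep, w in weighted_eps)

# Illustrative elementary processes and weights.
cpu = Elementary("CPU", 2.0)
io = Elementary("I/O", 5.0)

compute_bound = [(cpu, 8), (io, 1)]  # CPU-heavy type
io_bound = [(cpu, 1), (io, 4)]       # I/O-heavy type

# A mix batches several types; mixes of identical total cost are treated
# as equivalent amounts of workload.
mix = [compute_bound, io_bound]
total_cost = sum(type_cost(t) for t in mix)
print(total_cost)  # -> 43.0 (21.0 + 22.0)
```

Because every mix carries a well-defined cost, two benchmark studies using different mixes of the same cost can be compared on equal footing, which is the point of the formalism.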
Internally insulated thermal storage system development program
NASA Technical Reports Server (NTRS)
Scott, O. L.
1980-01-01
A cost-effective thermal storage system for a solar central receiver power system, using molten salt stored in internally insulated carbon steel tanks, is described. Factors discussed include: testing of internal insulation materials in molten salt; preliminary design of storage tanks, including insulation and liner installation; optimization of the storage configuration; and definition of a subsystem research experiment to demonstrate the system. A thermal analytical model of a thermocline tank was developed and exercised. Data from an existing thermocline test tank were compared with model results to gain confidence in the analytical approach. A computer analysis of the various storage system parameters (insulation thickness, number of tanks, tank geometry, etc.) showed that (1) the most cost-effective configuration is a small number of large cylindrical tanks, and (2) the optimum is set by the mechanical constraints of the system, such as soil bearing strength and tank hoop stress, rather than by economics.
Internally insulated thermal storage system development program
NASA Astrophysics Data System (ADS)
Scott, O. L.
1980-03-01
A cost-effective thermal storage system for a solar central receiver power system, using molten salt stored in internally insulated carbon steel tanks, is described. Factors discussed include: testing of internal insulation materials in molten salt; preliminary design of storage tanks, including insulation and liner installation; optimization of the storage configuration; and definition of a subsystem research experiment to demonstrate the system. A thermal analytical model of a thermocline tank was developed and exercised. Data from an existing thermocline test tank were compared with model results to gain confidence in the analytical approach. A computer analysis of the various storage system parameters (insulation thickness, number of tanks, tank geometry, etc.) showed that (1) the most cost-effective configuration is a small number of large cylindrical tanks, and (2) the optimum is set by the mechanical constraints of the system, such as soil bearing strength and tank hoop stress, rather than by economics.
Precision lattice test of the gauge/gravity duality at large N
Berkowitz, Evan; Rinaldi, Enrico; Hanada, Masanori; ...
2016-11-03
We perform a systematic, large-scale lattice simulation of D0-brane quantum mechanics. The large-N and continuum limits of the gauge theory are taken for the first time at various temperatures 0.4 ≤ T ≤ 1.0. As a test of the gauge/gravity duality conjecture, we compute the internal energy of the black hole as a function of temperature directly from the gauge theory. We obtain a leading behavior that is compatible with the supergravity result E/N² = 7.41 T^(14/5): the coefficient is estimated to be 7.4 ± 0.5 when the exponent is fixed and stringy corrections are included. This is the first confirmation of the supergravity prediction for the internal energy of a black hole at finite temperature coming directly from the dual gauge theory. As a result, we also constrain stringy corrections to the internal energy.
2012-01-01
The present study is a retrospective case series of stress fracture of the olecranon. Six patients presented with posterior elbow pain when throwing in baseball and softball, but no fracture was diagnosed on radiographs. We detected stress fractures of the olecranon using computed tomographic (CT) scans and treated the patients with internal fixation using a headless cannulated double-threaded screw through a small skin incision. All patients returned to competitive level without elbow complaints after the operation. When throwing athletes present with unusual posterior elbow pain and no significant findings on radiographs, a CT scan should be performed. We recommend surgical treatment by internal fixation with a screw through a small skin incision as a good option for stress fracture of the olecranon, allowing early return to sports activity in competitive athletes. PMID:23241173
The Structure of Medical Informatics Journal Literature
Morris, Theodore A.; McCain, Katherine W.
1998-01-01
Abstract Objective: Medical informatics is an emergent interdisciplinary field described as drawing upon and contributing to both the health sciences and information sciences. The authors elucidate the disciplinary nature and internal structure of the field. Design: To better understand the field's disciplinary nature, the authors examined the intercitation relationships of its journal literature. To determine its internal structure, they examined its journal cocitation patterns. Measurements: The authors used data from the Science Citation Index (SCI) and Social Science Citation Index (SSCI) to perform intercitation studies among productive journal titles, and software routines from SPSS to perform multivariate data analyses on cocitation data for proposed core journals. Results: Intercitation network analysis suggests that a core literature exists, one mark of a separate discipline. Multivariate analyses of cocitation data suggest that major focus areas within the field include biomedical engineering, biomedical computing, decision support, and education. The interpretable dimensions of multidimensional scaling maps differed for the SCI and SSCI data sets. Strong links to information science literature were not found. Conclusion: The authors saw indications of a core literature and of several major research fronts. The field appears to be viewed differently by authors writing in journals indexed by SCI than by those writing in journals indexed by SSCI, with more emphasis placed on computers and engineering versus decision making by the former, and more emphasis on theory versus application (clinical practice) by the latter. PMID:9760393
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, R.L.; Gross, D.; Pearson, D.C.
In an attempt to better understand the impact that large mining shots will have on verifying compliance with the international Comprehensive Test Ban Treaty (CTBT, no nuclear explosion tests), a series of seismic and videographic experiments has been conducted during the past two years at the Black Thunder Coal Mine. Personnel from the mine and Los Alamos National Laboratory have cooperated closely to design and perform experiments that produce results of mutual benefit to both organizations. This paper summarizes the activities, highlighting the unique results of each. Topics covered in these experiments include: (1) synthesis of seismic, videographic, acoustic, and computer modeling data to improve understanding of shot performance and phenomenology; (2) development of computer-generated visualizations of observed blasting techniques; (3) documentation of azimuthal variations in the radiation of seismic energy from overburden casting shots; (4) identification of, as yet unexplained, out-of-sequence, simultaneous detonations in some shots using seismic and videographic techniques; (5) comparison of local (0.1 to 15 kilometer range) and regional (100 to 2,000 kilometer range) seismic measurements, leading to determination of the relationship between local and regional seismic amplitude and explosive yield for overburden cast, coal bulking and single-fired explosions; and (6) determination of the types of mining shots triggering the prototype International Monitoring System for the CTBT.
Multi-school collaboration to develop and test nutrition computer modules for pediatric residents.
Roche, Patricia L; Ciccarelli, Mary R; Gupta, Sandeep K; Hayes, Barbara M; Molleston, Jean P
2007-09-01
The provision of essential nutrition-related content in US medical education has been deficient, despite efforts of the federal government and multiple professional organizations. Novel and efficient approaches are needed. A multi-department project was developed to create and pilot a computer-based compact-disc instructional program covering the nutrition topics of oral rehydration therapy, calcium, and vitamins. The project was funded by an internal medical school grant; the content of the modules was written by Department of Pediatrics faculty. The modules were built by School of Informatics faculty and students, and were tested on a convenience sample of 38 pediatric residents in a randomized controlled trial performed by a registered dietitian/School of Health and Rehabilitation Sciences Master's degree candidate. The modules were reviewed for content by the pediatric faculty principal investigator and the registered dietitian/School of Health and Rehabilitation Sciences graduate student. Residents completed a pretest of nutrition knowledge and attitudes toward nutrition and Web-based instruction. Half the group was given three programs (oral rehydration therapy, calcium, and vitamins) on compact disc for study over 6 weeks. Both study and control groups completed a posttest. Pre- and postintervention objective test results in study vs control groups, and attitudinal survey results before and after intervention in the study group, were compared. The experimental group demonstrated significantly better posttrial objective test performance compared to the control group (P=0.0005). The study group tended toward improvement, whereas the control group's performance declined substantially between pre- and posttests. Study group residents' attitudes toward computer-based instruction improved. Use of these computer modules prompted almost half of the residents in the study group to independently pursue relevant nutrition-related information. 
This inexpensive, collaborative, multi-department effort to design a computer-based nutrition curriculum positively impacted both resident knowledge and attitudes.
Implementing a Bayes Filter in a Neural Circuit: The Case of Unknown Stimulus Dynamics.
Sokoloski, Sacha
2017-09-01
In order to interact intelligently with objects in the world, animals must first transform neural population responses into estimates of the dynamic, unknown stimuli that caused them. The Bayesian solution to this problem is known as a Bayes filter, which applies Bayes' rule to combine population responses with the predictions of an internal model. The internal model of the Bayes filter is based on the true stimulus dynamics, and in this note, we present a method for training a theoretical neural circuit to approximately implement a Bayes filter when the stimulus dynamics are unknown. To do this we use the inferential properties of linear probabilistic population codes to compute Bayes' rule and train a neural network to compute approximate predictions by the method of maximum likelihood. In particular, we perform stochastic gradient descent on the negative log-likelihood of the neural network parameters with a novel approximation of the gradient. We demonstrate our methods on a finite-state, a linear, and a nonlinear filtering problem and show how the hidden layer of the neural network develops tuning curves consistent with findings in experimental neuroscience.
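The classical construction the note builds on, a Bayes filter alternating a prediction step through an internal dynamics model with a Bayes-rule update on each new observation, can be sketched for a discrete state space. The two-state dynamics and noisy-observation model below are illustrative, not from the paper, which learns the dynamics model with a neural network.

```python
import numpy as np

# T[i, j] = P(next state j | state i): the internal model of stimulus dynamics.
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])
# O[i, k] = P(observation k | state i): the observation likelihood.
O = np.array([[0.75, 0.25],
              [0.30, 0.70]])

def bayes_filter(observations, prior):
    belief = prior.astype(float)
    for k in observations:
        belief = belief @ T             # predict through the dynamics model
        belief = belief * O[:, k]       # Bayes' rule: reweight by likelihood
        belief = belief / belief.sum()  # renormalize
    return belief

posterior = bayes_filter(np.array([0, 0, 0]), np.array([0.5, 0.5]))
print(posterior[0] > posterior[1])  # -> True: repeated observation 0 favors state 0
```

When the transition matrix T is unknown, as in the note, it must be replaced by a learned predictor; the update step via Bayes' rule stays the same.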
Lacerda, Luis M; Sperl, Jonathan I; Menzel, Marion I; Sprenger, Tim; Barker, Gareth J; Dell'Acqua, Flavio
2016-12-01
Diffusion spectrum imaging (DSI) is an imaging technique that has been successfully applied to resolve white matter crossings in the human brain. However, its accuracy in complex microstructure environments has not been well characterized. Here we have simulated different tissue configurations, sampling schemes, and processing steps to evaluate DSI performance under realistic biophysical conditions. A novel approach to computing the orientation distribution function (ODF) has also been developed to include biophysical constraints, namely integration ranges compatible with axial fiber diffusivities. The simulations identified several DSI configurations that consistently show aliasing artifacts caused by fast diffusion components, for both isotropic diffusion and fiber configurations. The proposed method for ODF computation showed some improvement in reducing such artifacts and in the ability to resolve crossings, while keeping the quantitative nature of the ODF. In this study, we identified an important limitation of current DSI implementations, specifically the presence of aliasing due to fast diffusion components such as those from pathological tissues, which are not well characterized and can lead to artifactual fiber reconstructions. To minimize this issue, a new way of computing the ODF was introduced, which removes most of these artifacts and offers improved angular resolution. Magn Reson Med 76:1837-1847, 2016. © 2015 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of the International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
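The aliasing mechanism described above can be shown in a minimal 1D sketch of the DSI reconstruction step: the propagator is the Fourier transform of the q-space signal, and a fast diffusion component produces a propagator wider than the FFT field of view, so its mass wraps to the edges. The grid size and diffusivities here are illustrative, not the study's sampling schemes.

```python
import numpy as np

N, qmax = 17, 1.0
q = np.linspace(-qmax, qmax, N)  # 1D q-space samples

def propagator(D):
    """FFT reconstruction of the diffusion propagator from a Gaussian signal."""
    signal = np.exp(-(2 * np.pi * q) ** 2 * D)
    p = np.abs(np.fft.fftshift(np.fft.fft(np.fft.ifftshift(signal))))
    return p / p.sum()

slow, fast = propagator(0.05), propagator(5.0)
# The fast component's propagator is wider than the field of view, so its
# mass wraps (aliases) to the edge bins; the slow one stays concentrated
# at the center.
print(slow[0] / slow[N // 2] < 0.5, fast[0] / fast[N // 2] > 0.5)  # -> True True
```

Restricting the radial integration range used to form the ODF, as the proposed method does, discounts exactly this wrapped edge mass.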
Motivation modulates the P300 amplitude during brain-computer interface use.
Kleih, S C; Nijboer, F; Halder, S; Kübler, A
2010-07-01
This study examined the effect of motivation as a possible psychological variable influencing P300 amplitude and performance in a brain-computer interface (BCI) controlled by event-related potentials (ERP). Participants were instructed to copy-spell a sentence by attending to cells of a randomly flashing 7 × 7 matrix. Motivation was manipulated by monetary reward. In two experimental groups participants received 25 (N=11) or 50 (N=11) Euro cents for each correctly selected character; the control group (N=11) was not rewarded. BCI performance was defined as the overall percentage of correctly selected characters (correct response rate, CRR). Participants performed at an average of 99%. At electrode location Cz, the P300 amplitude was positively correlated with self-rated motivation. The P300 amplitude of the most motivated participants was significantly higher than that of the least motivated participants. Highly motivated participants were able to communicate correctly faster with the ERP-BCI than less motivated participants. Motivation modulates the P300 amplitude in an ERP-BCI. Motivation may contribute to variance in BCI performance and should be monitored in BCI settings. Copyright 2010 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
Influence of the position of crew members on aerodynamic performance of two-man bobsleigh.
Dabnichki, Peter; Avital, Eldad
2006-01-01
Bobsleigh aerodynamics has long been recognised as one of the crucial performance factors. Although the published research in the area is very limited, it is well known that the leading nations in the sport devote significant resources to research and development of sled aerodynamics. However, the rules and regulations pose strict design constraints on shape modifications aimed at aerodynamic improvement. The reason for this is two-fold: (i) safety of the athletes and (ii) reduction of equipment impact on competition outcome. One particular area that has not been examined, and that falls outside the current rules and regulations, is the influence of crew positioning and internal modifications on aerodynamic performance. The current study presents results of numerical simulation of the flow in the cavity, underpinned by experimental measurements including flow visualisation of the air circulation around the bobsleigh. A simplified computational model was developed to assess the trends, and its results were validated by wind-tunnel tests. The results show that crew members influence the drag level significantly and suggest that purely internal modifications can be introduced to reduce the overall resistance drag.
NASA Astrophysics Data System (ADS)
Ghazai, A. J.; Thahab, S. M.; Hassan, H. Abu; Hassan, Z.
2010-07-01
The development of efficient MQW active regions of quaternary InAlGaN in the ultraviolet (UV) region is an engaging challenge in itself. Demonstrating lasers at such short wavelengths will require resolving a number of materials, growth and device design issues. However, quaternary AlInGaN is a more versatile material since the bandgap and lattice constant can be varied independently. We report a study of quaternary AlInGaN double-quantum-well (DQW) UV laser diodes (LDs) using the Integrated System Engineering-Technical Computer Aided Design (ISE TCAD) simulation program. Advanced physical models of semiconductor properties were used. In this paper, enhancement in the performance of the AlInGaN laser diode is achieved by optimizing the laser structure geometry. The AlInGaN laser diode operating parameters, such as internal quantum efficiency ηi, internal loss αi and transparency threshold current density, show effective improvements that contribute to better performance.
Shimansky, Yury P; Kang, Tao; He, Jiping
2004-02-01
A computational model of a learning system (LS) is described that acquires the knowledge and skill necessary for optimal control of multisegmental limb dynamics (the controlled object, or CO), starting from "knowing" only the dimensionality of the object's state space. It is based on an optimal control problem setup different from that of reinforcement learning. The LS solves the optimal control problem online while practicing the manipulation of the CO. The system's functional architecture comprises several adaptive components, each of which incorporates a number of mapping functions approximated with artificial neural nets. Besides the internal model of the CO's dynamics and the adaptive controller that computes the control law, the LS includes a new type of internal model: the minimal cost (IM(mc)) of moving the controlled object between a pair of states. That internal model appears critical for the LS's capacity to develop an optimal movement trajectory. The IM(mc) interacts with the adaptive controller in a cooperative manner. The controller provides an initial approximation of an optimal control action, which is further optimized in real time based on the IM(mc). The IM(mc) in turn provides information for updating the controller. The LS's performance was tested on the task of center-out reaching to eight randomly selected targets with a 2DOF limb model. The LS reached an optimal level of performance in a few tens of trials. It also quickly adapted to movement perturbations produced by two different types of external force field. The results suggest that the proposed design of a self-optimized control system can serve as a basis for modeling motor learning that includes the formation and adaptive modification of the plan of a goal-directed movement.
ICCE/ICCAI 2000 Full & Short Papers (Virtual Lab/Classroom/School).
ERIC Educational Resources Information Center
2000
This document contains the following full and short papers on virtual laboratories, classrooms, and schools from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Collaborative Learning Support System Based on Virtual Environment Server for Multiple…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-24
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-847] Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof; Notice of Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is...
Simulating History to Understand International Politics
ERIC Educational Resources Information Center
Weir, Kimberly; Baranowski, Michael
2011-01-01
To understand world politics, one must appreciate the context in which international systems develop and operate. Pedagogy studies demonstrate that the more active students are in their learning, the more they learn. As such, using computer simulations can complement and enhance classroom instruction. CIVILIZATION is a computer simulation game…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-15
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-745] Certain Wireless Communication Devices, Portable Music and Data Processing Devices, Computers and Components Thereof; Notice of Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Notice...
Hybrid computational phantoms of the male and female newborn patient: NURBS-based whole-body models
NASA Astrophysics Data System (ADS)
Lee, Choonsik; Lodwick, Daniel; Hasenauer, Deanna; Williams, Jonathan L.; Lee, Choonik; Bolch, Wesley E.
2007-07-01
Anthropomorphic computational phantoms are computer models of the human body for use in the evaluation of dose distributions resulting from either internal or external radiation sources. Currently, two classes of computational phantoms have been developed and widely utilized for organ dose assessment: (1) stylized phantoms and (2) voxel phantoms which describe the human anatomy via mathematical surface equations or 3D voxel matrices, respectively. Although stylized phantoms based on mathematical equations can be very flexible in regard to making changes in organ position and geometrical shape, they are limited in their ability to fully capture the anatomic complexities of human internal anatomy. In turn, voxel phantoms have been developed through image-based segmentation and correspondingly provide much better anatomical realism in comparison to simpler stylized phantoms. However, they themselves are limited in defining organs presented in low contrast within either magnetic resonance or computed tomography images—the two major sources in voxel phantom construction. By definition, voxel phantoms are typically constructed via segmentation of transaxial images, and thus while fine anatomic features are seen in this viewing plane, slice-to-slice discontinuities become apparent in viewing the anatomy of voxel phantoms in the sagittal or coronal planes. This study introduces the concept of a hybrid computational newborn phantom that takes full advantage of the best features of both its stylized and voxel counterparts: flexibility in phantom alterations and anatomic realism. Non-uniform rational B-spline (NURBS) surfaces, a mathematical modeling tool traditionally applied to graphical animation studies, was adopted to replace the limited mathematical surface equations of stylized phantoms. A previously developed whole-body voxel phantom of the newborn female was utilized as a realistic anatomical framework for hybrid phantom construction. 
The construction of a hybrid phantom is performed in three steps: polygonization of the voxel phantom, organ modeling via NURBS surfaces and phantom voxelization. Two 3D graphic tools, 3D-DOCTOR™ and Rhinoceros™, were utilized to polygonize the newborn voxel phantom and generate NURBS surfaces, while an in-house MATLAB™ code was used to voxelize the resulting NURBS model into a final computational phantom ready for use in Monte Carlo radiation transport calculations. A total of 126 anatomical organ and tissue models, including 38 skeletal sites and 31 cartilage sites, were described within the hybrid phantom using either NURBS or polygon surfaces. A male hybrid newborn phantom was constructed following the development of the female phantom through the replacement of female-specific organs with male-specific organs. The outer body contour and internal anatomy of the NURBS-based phantoms were adjusted to match anthropometric and reference newborn data reported by the International Commission on Radiological Protection in their Publication 89. The voxelization process was designed to accurately convert NURBS models to a voxel phantom with minimum volumetric change. A sensitivity study was additionally performed to better understand how the meshing tolerance and voxel resolution would affect volumetric changes between the hybrid-NURBS and hybrid-voxel phantoms. The male and female hybrid-NURBS phantoms were constructed in a manner so that all internal organs approached their ICRP reference masses to within 1%, with the exception of the skin (-6.5% relative error) and brain (-15.4% relative error). Both hybrid-voxel phantoms were constructed with an isotropic voxel resolution of 0.663 mm—equivalent to the ICRP 89 reference thickness of the newborn skin (dermis and epidermis). 
Hybrid-NURBS phantoms used to create their voxel counterpart retain the non-uniform scalability of stylized phantoms, while maintaining the anatomic realism of segmented voxel phantoms with respect to organ shape, depth and inter-organ positioning. This work was supported by the National Cancer Institute.
NASA Technical Reports Server (NTRS)
Bibb, Karen L.; Prabhu, Ramadas K.
2004-01-01
In support of the Columbia Accident Investigation, inviscid computations of the aerodynamic characteristics for various Shuttle Orbiter damage scenarios were performed using the FELISA unstructured CFD solver. Computed delta aerodynamics were compared with the reconstructed delta aerodynamics in order to postulate a progression of damage through the flight trajectory. By performing computations at hypervelocity flight and CF4 tunnel conditions, a bridge was provided between wind tunnel testing in Langley's 20-Inch CF4 facility and the flight environment experienced by Columbia during re-entry. The rapid modeling capability of the unstructured methodology allowed the computational effort to keep pace with the wind tunnel and, at times, guide the wind tunnel efforts. These computations provided a detailed view of the flowfield characteristics and the contribution of orbiter components (such as the vertical tail and wing) to aerodynamic forces and moments that were unavailable from wind tunnel testing. The damage scenarios are grouped into three categories. Initially, single and multiple missing full RCC panels were analyzed to determine the effect of damage location and magnitude on the aerodynamics. Next is a series of cases with progressive damage, increasing in severity, in the region of RCC panel 9. The final group is a set of wing leading edge and windward surface deformations that model possible structural deformation of the wing skin due to internal heating of the wing structure. By matching the aerodynamics from selected damage scenarios to the reconstructed flight aerodynamics, a progression of damage that is consistent with the flight data, debris forensics, and wind tunnel data is postulated.
Laparoscopic Skills Are Improved With LapMentor™ Training
Andreatta, Pamela B.; Woodrum, Derek T.; Birkmeyer, John D.; Yellamanchilli, Rajani K.; Doherty, Gerard M.; Gauger, Paul G.; Minter, Rebecca M.
2006-01-01
Objective: To determine if prior training on the LapMentor™ laparoscopic simulator leads to improved performance of basic laparoscopic skills in the animate operating room environment. Summary Background Data: Numerous influences have led to the development of computer-aided laparoscopic simulators: a need for greater efficiency in training, the unique and complex nature of laparoscopic surgery, and the increasing demand that surgeons demonstrate competence before proceeding to the operating room. The LapMentor™ simulator is expensive, however, and its use must be validated and justified prior to implementation into surgical training programs. Methods: Nineteen surgical interns were randomized to training on the LapMentor™ laparoscopic simulator (n = 10) or to a control group (no simulator training, n = 9). Subjects randomized to the LapMentor™ trained to expert criterion levels 2 consecutive times on 6 designated basic skills modules. All subjects then completed a series of laparoscopic exercises in a live porcine model, and performance was assessed independently by 2 blinded reviewers. Time, accuracy rates, and global assessments of performance were recorded with an interrater reliability between reviewers of 0.99. Results: LapMentor™ trained interns completed the 30° camera navigation exercise in significantly less time than control interns (166 ± 52 vs. 220 ± 39 seconds, P < 0.05); they also achieved higher accuracy rates in identifying the required objects with the laparoscope (96% ± 8% vs. 82% ± 15%, P < 0.05). Similarly, on the two-handed object transfer exercise, task completion time for LapMentor™ trained versus control interns was 130 ± 23 versus 184 ± 43 seconds (P < 0.01) with an accuracy rate of 98% ± 5% versus 80% ± 13% (P < 0.001). Additionally, LapMentor™ trained interns outperformed control subjects with regard to camera navigation skills, efficiency of motion, optimal instrument handling, perceptual ability, and performance of safe electrocautery. 
Conclusions: This study demonstrates that prior training on the LapMentor™ laparoscopic simulator leads to improved resident performance of basic skills in the animate operating room environment. This work marks the first prospective, randomized evaluation of the LapMentor™ simulator, and provides evidence that LapMentor™ training may lead to improved operating room performance. PMID:16772789
Ince, Ilker; Arı, Muhammet Ali; Sulak, Muhammet Mustafa; Aksoy, Mehmet
There are different ultrasound probe positions used for internal jugular venous catheter placement, and either an in-plane or an out-of-plane needle approach may be used for catheterization. The transverse short-axis classic approach is the most commonly performed approach in the literature. "Syringe-Free" is a newly described technique performed with an oblique long-axis approach. We aimed to compare the performance of these two approaches. This study was conducted as a prospective, randomized study. 80 patients were included in the study and divided by computer-generated randomization into two groups, named Group C (transverse short-axis classic approach) and Group SF (oblique long-axis syringe-free approach). The primary outcome was the mean time until the guidewire was seen in the internal jugular vein (performing time). The secondary outcomes were to compare the number of needle passes, number of skin punctures and complications between the two groups. Demographic and hemodynamic data were not significantly different. The mean performing time was 54.9±19.1s in Group C and 43.9±15.8s in Group SF; the difference between the groups was significant (p=0.006). The mean number of needle passes was 3.2(±2.1) in Group C and 2.1(±1.6) in Group SF, a statistically significant difference (p=0.002). The number of skin punctures was 1.6(±0.8) and 1.2(±0.5) in Groups C and SF, respectively (p=0.027). The "Syringe-Free" technique has a lower performing time, number of needle passes and number of skin punctures. It also allows the progress of the guidewire to be followed under continuous ultrasound visualization, and the procedure does not require assistance during catheter insertion. In short, "Syringe-Free" is an effective, safe and fast technique that may be used to place an internal jugular venous catheter. Copyright © 2017 Sociedade Brasileira de Anestesiologia. Published by Elsevier Editora Ltda. All rights reserved.
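The performing-time comparison above can be re-derived from the summary statistics alone. The sketch below computes a pooled two-sample t statistic, assuming n = 40 per group (implied by the 80-patient, two-group design); the helper name is ours, and the result is consistent with the reported p = 0.006 at 78 degrees of freedom.

```python
import math

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Pooled-variance two-sample t statistic from summary statistics."""
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

# Performing time (seconds): Group C (54.9 +/- 19.1) vs Group SF (43.9 +/- 15.8)
t = pooled_t(54.9, 19.1, 40, 43.9, 15.8, 40)
print(f"t = {t:.2f} with df = 78")  # t = 2.81
```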
NASA Technical Reports Server (NTRS)
Ahmad, Rashid A.; McCool, Alex (Technical Monitor)
2001-01-01
An enhanced performance solid rocket booster concept for the space shuttle system has been proposed. The concept booster will have strong commonality with the existing, proven, reliable four-segment Space Shuttle Reusable Solid Rocket Motors (RSRM) with individual component design (nozzle, insulator, etc.) optimized for a five-segment configuration. Increased performance is desirable to further enhance safety/reliability and/or increase payload capability. Performance increase will be achieved by adding a fifth propellant segment to the current four-segment booster and opening the throat to accommodate the increased mass flow while maintaining current pressure levels. One development concept under consideration is the static test of a "standard" RSRM with a fifth propellant segment inserted and appropriate minimum motor modifications. Feasibility studies are being conducted to assess the potential for any significant departure in component performance/loading from the well-characterized RSRM. An area of concern is the aft motor (submerged nozzle inlet, aft dome, etc.) where the altered internal flow resulting from the performance enhancing features (25% increase in mass flow rate, higher Mach numbers, modified subsonic nozzle contour) may result in increased component erosion and char. To assess this issue and to define the minimum design changes required to successfully static test a fifth segment RSRM engineering test motor, internal flow studies have been initiated. Internal aero-thermal environments were quantified in terms of conventional convective heating and discrete phase alumina particle impact/concentration and accretion calculations via Computational Fluid Dynamics (CFD) simulation. Two sets of comparative CFD simulations of the RSRM and the five-segment motor (FSM) concept were conducted with the commercial CFD code FLUENT. The first simulation involved a two-dimensional axisymmetric model of the full motor, initial grain RSRM.
The second set of analyses included three-dimensional models of the RSRM and FSM aft motors with four-degree vectored nozzles.
The Computerized Anatomical Man (CAM) model
NASA Technical Reports Server (NTRS)
Billings, M. P.; Yucker, W. R.
1973-01-01
A computerized anatomical man (CAM) model, representing the most detailed and anatomically correct geometrical model of the human body yet prepared, has been developed for use in analyzing radiation dose distribution in man. This model of a 50th-percentile standing USAF man comprises some 1100 unique geometric surfaces and some 2450 solid regions. Internal body geometry, such as organs, voids, bones, and bone marrow, is explicitly modeled. A computer program called CAMERA has also been developed for performing analyses with the model. Such analyses include tracing rays through the CAM geometry, placing results on magnetic tape in various forms, collapsing areal density data from ray-tracing information to areal density distributions, preparing cross-section views, etc. Numerous computer-drawn cross sections through the CAM model are presented.
Adaptive quantum computation in changing environments using projective simulation
NASA Astrophysics Data System (ADS)
Tiersch, M.; Ganahl, E. J.; Briegel, H. J.
2015-08-01
Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks.
Global behavior analysis for stochastic system of 1,3-PD continuous fermentation
NASA Astrophysics Data System (ADS)
Zhu, Xi; Kliemann, Wolfgang; Li, Chunfa; Feng, Enmin; Xiu, Zhilong
2017-12-01
Global behavior of a stochastic system for continuous fermentation of glycerol bio-dissimilation to 1,3-propanediol by Klebsiella pneumoniae is analyzed in this paper. This bioprocess cannot avoid stochastic perturbation caused by internal and external disturbances, which are reflected in the growth rate. These negative factors can limit and degrade the achievable performance of controlled systems. Based on multiplicity phenomena, the equilibria and bifurcations of the deterministic system are analyzed. Then a stochastic model is presented as a bounded Markov diffusion process. In order to analyze the global behavior, we compute the control sets for the associated control system. The probability distributions of the relative supports are also computed. The simulation results indicate how the disturbed biosystem tends toward stationary behavior globally.
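To illustrate the kind of growth-rate disturbance described above, here is a toy Euler-Maruyama simulation of a logistic drift with multiplicative noise. This is a generic sketch, not the paper's 1,3-PD model: the drift form, coefficients, and function names are all invented for illustration.

```python
import random

def euler_maruyama(x0, mu, k, sigma, dt, steps, seed=1):
    """Integrate dx = mu*x*(1 - x/k) dt + sigma*x dW with Euler-Maruyama."""
    random.seed(seed)
    x = x0
    for _ in range(steps):
        drift = mu * x * (1 - x / k)            # deterministic logistic growth
        noise = sigma * x * random.gauss(0, 1)  # disturbance acting on the rate
        x += drift * dt + noise * (dt ** 0.5)
    return x

# Biomass-like state settling toward its stationary level despite noise
x_end = euler_maruyama(x0=0.1, mu=0.5, k=5.0, sigma=0.05, dt=0.01, steps=2000)
print(round(x_end, 2))  # near the carrying capacity k = 5
```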
NASA Astrophysics Data System (ADS)
de Oliveira, José Martins, Jr.; Mangini, F. Salvador; Carvalho Vila, Marta Maria Duarte; Vinícius Chaud, Marco
2013-05-01
This work presents an alternative, non-conventional technique for evaluating physicochemical properties of pharmaceutical dosage forms: we used the computed tomography (CT) technique as a nondestructive method to visualize the internal structures of pharmaceutical dosage forms and to conduct static and dynamic studies. The studies were conducted for both static and dynamic situations through the use of tomographic images generated by the scanner at the University of Sorocaba (Uniso). We have shown that through the use of tomographic images it is possible to conduct studies of porosity and density, analyze morphological parameters, and perform dissolution studies. Our results are in agreement with the literature, showing that CT is a powerful tool for use in the pharmaceutical sciences.
Adaptive quantum computation in changing environments using projective simulation
Tiersch, M.; Ganahl, E. J.; Briegel, H. J.
2015-01-01
Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks. PMID:26260263
Advanced large scale GaAs monolithic IF switch matrix subsystem
NASA Technical Reports Server (NTRS)
Ch'en, D. R.; Petersen, W. C.; Kiba, W. M.
1992-01-01
Attention is given to a novel chip design and packaging technique to overcome the limitations due to the high signal isolation requirements of advanced communications systems. A hermetically sealed 6 x 6 monolithic GaAs switch matrix subsystem with integral control electronics based on this technique is presented. A 0-dB insertion loss and 60-dB crosspoint isolation over a 3.5-to-6-GHz band were achieved. The internal controller portion of the switching subsystem provides crosspoint control via a standard RS-232 computer interface and can be synchronized with an external systems control computer. The measured performance of this advanced switching subsystem is fully compatible with relatively static 'switchboard' as well as dynamic TDMA modes of operation.
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph is a computer printout which presents findings from an analysis of data on international cooperation over a three-year period. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph presents the computer printout of data on the application of…
ERIC Educational Resources Information Center
Vincent, Jack E.
This monograph is a computer printout which presents findings from an analysis of data on international conflict over a three-year period. Part of a large scale research project to test various theories with regard to their ability to analyze international relations, this monograph presents the computer printout of data on the application of…
97. View of International Business Machine (IBM) digital computer model ...
97. View of International Business Machine (IBM) digital computer model 7090 magnetic core installation, international telephone and telegraph (ITT) Artic Services Inc., Official photograph BMEWS site II, Clear, AK, by unknown photographer, 17 September 1965, BMEWS, clear as negative no. A-6604. - Clear Air Force Station, Ballistic Missile Early Warning System Site II, One mile west of mile marker 293.5 on Parks Highway, 5 miles southwest of Anderson, Anderson, Denali Borough, AK
NASA Technical Reports Server (NTRS)
Foley, Michael J.
1989-01-01
The primary nozzle diffuser routes fuel from the main fuel valve on the Space Shuttle Main Engine (SSME) to the nozzle coolant inlet manifold, main combustion chamber coolant inlet manifold, chamber coolant valve, and the augmented spark igniters. The diffuser also includes the fuel system purge check valve connection. A static stress analysis was performed on the diffuser because no detailed analysis had been done on this part in the past. Structural concerns were in the area of the welds because approximately 10 percent are in areas inaccessible to X-ray testing devices. Flow dynamics and thermodynamics were not included in the analysis load case; constant internal pressure at maximum SSME power was used instead. A three-dimensional finite element model was generated using ANSYS version 4.3A on the Lockheed VAX 11/785 computer to perform the stress computations. IDEAS Supertab on a Sun 3/60 computer was used to create the finite element model. Rocketdyne drawing number RS009156 was used for the model interpretation. The flight diffuser is denoted as -101. A description of the model, boundary conditions/load case, material properties, structural analysis/results, and a summary are included for documentation.
A framework for analyzing the cognitive complexity of computer-assisted clinical ordering.
Horsky, Jan; Kaufman, David R; Oppenheim, Michael I; Patel, Vimla L
2003-01-01
Computer-assisted provider order entry is a technology that is designed to expedite medical ordering and to reduce the frequency of preventable errors. This paper presents a multifaceted cognitive methodology for the characterization of cognitive demands of a medical information system. Our investigation was informed by the distributed resources (DR) model, a novel approach designed to describe the dimensions of user interfaces that introduce unnecessary cognitive complexity. This method evaluates the relative distribution of external (system) and internal (user) representations embodied in system interaction. We conducted an expert walkthrough evaluation of a commercial order entry system, followed by a simulated clinical ordering task performed by seven clinicians. The DR model was employed to explain variation in user performance and to characterize the relationship of resource distribution and ordering errors. The analysis revealed that the configuration of resources in this ordering application placed unnecessarily heavy cognitive demands on the user, especially on those who lacked a robust conceptual model of the system. The resources model also provided some insight into clinicians' interactive strategies and patterns of associated errors. Implications for user training and interface design based on the principles of human-computer interaction in the medical domain are discussed.
Modeling of anomalous electron mobility in Hall thrusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koo, Justin W.; Boyd, Iain D.
Accurate modeling of the anomalous electron mobility is absolutely critical for successful simulation of Hall thrusters. In this work, existing computational models for the anomalous electron mobility are used to simulate the UM/AFRL P5 Hall thruster (a 5 kW laboratory model) in a two-dimensional axisymmetric hybrid particle-in-cell Monte Carlo collision code. Comparison to experimental results indicates that, while these computational models can be tuned to reproduce the correct thrust or discharge current, it is very difficult to match all integrated performance parameters (thrust, power, discharge current, etc.) simultaneously. Furthermore, multiple configurations of these computational models can produce reasonable integrated performance parameters. A semiempirical electron mobility profile is constructed from a combination of internal experimental data and modeling assumptions. This semiempirical electron mobility profile is used in the code and results in more accurate simulation of both the integrated performance parameters and the mean potential profile of the thruster. Results indicate that the anomalous electron mobility, while absolutely necessary in the near-field region, provides a substantially smaller contribution to the total electron mobility in the high Hall current region near the thruster exit plane.
Unsteady 3D flow simulations in cranial arterial tree
NASA Astrophysics Data System (ADS)
Grinberg, Leopold; Anor, Tomer; Madsen, Joseph; Karniadakis, George
2008-11-01
High resolution unsteady 3D flow simulations in major cranial arteries have been performed. Two cases were considered: 1) a healthy volunteer with a complete Circle of Willis (CoW); and 2) a patient with hydrocephalus and an incomplete CoW. Computation was performed on 3344 processors of the new half-petaflop supercomputer at TACC. Two new numerical approaches were developed and implemented: 1) a new two-level domain decomposition method, which couples continuous and discontinuous Galerkin discretizations of the computational domain; and 2) a new type of outflow boundary condition, which imposes, in an accurate and computationally efficient manner, clinically measured flow rates. In the first simulation, a geometric model of 65 cranial arteries was reconstructed. Our simulation reveals a high degree of asymmetry in the flow at the left and right parts of the CoW and the presence of swirling flow in most of the CoW arteries. In the second simulation, one of the main findings was a high pressure drop at the right posterior communicating artery (PCA). Due to the incompleteness of the CoW and the pressure drop at the PCA, the right internal carotid artery supplies blood to most regions of the brain.
Meyer, Nanna L; Sundgot-Borgen, Jorunn; Lohman, Timothy G; Ackland, Timothy R; Stewart, Arthur D; Maughan, Ronald J; Smith, Suzanne; Müller, Wolfram
2013-11-01
Successful performers in weight-sensitive sports are characterised by low body mass (BM) and fat content. This often requires chronic energy restriction and acute weight loss practices. To evaluate current use of body composition (BC) assessment methods and identify problems and solutions with current BC approaches. A 40-item survey was developed, including demographic and content questions related to BC assessment. The survey was electronically distributed among international sporting organisations. Frequencies and χ² analyses were computed. 216 responses were received, from 33 countries, representing various institutions, sports and competitive levels. Of the sample, 86% of respondents currently assess BC, most frequently using skinfolds (International Society for the Advancement of Kinanthropometry (ISAK): 50%; non-ISAK, conventional: 40%; both: 28%), dual energy X-ray absorptiometry (38%), bioelectrical impedance (29%), air displacement plethysmography (17%) and hydrostatic weighing (10%). Of those using skinfolds, more at the international level used ISAK, whereas conventional approaches were more reported at regional/national level (p=0.006). The sport dietitian/nutritionist (57%) and physiologist/sports scientist (54%) were most frequently the professionals assessing BC, followed by MDs and athletic trainers, with some reporting coaches (5%). 36% of 116 respondents assessed hydration status and more (64%) did so at international than regional/national level (36%, p=0.028). Of 125 participants answering the question of whether they thought that BC assessment raised problems, 69% said 'yes', with most providing ideas for solutions. Results show high use of BC assessment but also a lack of standardisation and widespread perception of problems related to BM and BC in sport. Future work should emphasise standardisation with appropriate training opportunities and more research on BC and performance.
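The χ² comparisons reported above (e.g. ISAK use by competitive level) amount to Pearson tests on 2x2 contingency tables. A minimal stdlib sketch of that computation follows; the counts are invented for illustration, not the survey's data, and the df = 1 p-value uses the identity chi2_sf(x, 1) = erfc(sqrt(x/2)).

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic (df = 1) and p-value for a 2x2 table."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(stat / 2))  # chi-squared survival function, df = 1
    return stat, p

# Hypothetical counts: rows = competitive level, cols = ISAK yes/no
stat, p = chi2_2x2(30, 10, 15, 25)
print(f"chi2 = {stat:.2f}, p = {p:.4f}")
```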
Li, Qiao; Gao, Xinyi; Yao, Zhenwei; Feng, Xiaoyuan; He, Huijin; Xue, Jing; Gao, Peiyi; Yang, Lumeng; Cheng, Xin; Chen, Weijian; Yang, Yunjun
2017-09-01
Permeability surface (PS) on computed tomographic perfusion reflects blood-brain barrier permeability and is related to hemorrhagic transformation (HT). HT of deep middle cerebral artery (MCA) territory can occur after recanalization of proximal large-vessel occlusion. We aimed to determine the relationship between HT and PS of deep MCA territory. We retrospectively reviewed 70 consecutive acute ischemic stroke patients presenting with occlusion of the distal internal carotid artery or M1 segment of the MCA. All patients underwent computed tomographic perfusion within 6 hours after symptom onset. Computed tomographic perfusion data were postprocessed to generate maps of different perfusion parameters. Risk factors were identified for increased deep MCA territory PS. Receiver operating characteristic curve analysis was performed to calculate the optimal PS threshold to predict HT of deep MCA territory. Increased PS was associated with HT of deep MCA territory. After adjustments for age, sex, onset time to computed tomographic perfusion, and baseline National Institutes of Health Stroke Scale, poor collateral status (odds ratio, 7.8; 95% confidence interval, 1.67-37.14; P =0.009) and proximal MCA-M1 occlusion (odds ratio, 4.12; 95% confidence interval, 1.03-16.52; P =0.045) were independently associated with increased deep MCA territory PS. Relative PS most accurately predicted HT of deep MCA territory (area under curve, 0.94; optimal threshold, 2.89). Increased PS can predict HT of deep MCA territory after recanalization therapy for cerebral proximal large-vessel occlusion. Proximal MCA-M1 complete occlusion and distal internal carotid artery occlusion in conjunction with poor collaterals elevate deep MCA territory PS. © 2017 American Heart Association, Inc.
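The receiver operating characteristic step above, choosing the relative-PS cutoff that best predicts HT, is commonly done by maximizing Youden's J (sensitivity + specificity - 1) over candidate thresholds. The sketch below shows that selection on synthetic data; the cohort, values, and the reported threshold of 2.89 are not reproduced here.

```python
def best_threshold(values, labels):
    """Return the cutoff maximizing Youden's J over candidate thresholds."""
    best_j, best_t = -1.0, None
    for t in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= t and y == 1)
        fn = sum(1 for v, y in zip(values, labels) if v < t and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < t and y == 0)
        fp = sum(1 for v, y in zip(values, labels) if v >= t and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

# Synthetic relative-PS values: HT cases (label 1) tend to run higher
ps = [1.1, 1.5, 2.0, 2.4, 3.0, 3.4, 3.9, 4.2]
labels = [0, 0, 0, 0, 1, 1, 1, 1]
t, j = best_threshold(ps, labels)
print(t, j)  # 3.0 separates the synthetic groups perfectly (J = 1)
```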
Improving Conceptual Design for Launch Vehicles
NASA Technical Reports Server (NTRS)
Olds, John R.
1998-01-01
This report summarizes activities performed during the second year of a three year cooperative agreement between NASA - Langley Research Center and Georgia Tech. Year 1 of the project resulted in the creation of a new Cost and Business Assessment Model (CABAM) for estimating the economic performance of advanced reusable launch vehicles including non-recurring costs, recurring costs, and revenue. The current year (second year) activities were focused on the evaluation of automated, collaborative design frameworks (computation architectures or computational frameworks) for automating the design process in advanced space vehicle design. Consistent with NASA's new thrust area in developing and understanding Intelligent Synthesis Environments (ISE), the goals of this year's research efforts were to develop and apply computer integration techniques and near-term computational frameworks for conducting advanced space vehicle design. NASA - Langley (VAB) has taken a lead role in developing a web-based computing architecture within which the designer can interact with disciplinary analysis tools through a flexible web interface. The advantages of this approach are, 1) flexible access to the designer interface through a simple web browser (e.g. Netscape Navigator), 2) ability to include existing 'legacy' codes, and 3) ability to include distributed analysis tools running on remote computers. To date, VAB's internal emphasis has been on developing this test system for the planetary entry mission under the joint Integrated Design System (IDS) program with NASA - Ames and JPL. Georgia Tech's complementary goals this year were to: 1) Examine an alternate 'custom' computational architecture for the three-discipline IDS planetary entry problem to assess the advantages and disadvantages relative to the web-based approach, and 2) Develop and examine a web-based interface and framework for a typical launch vehicle design problem.
Space Shuttle propulsion performance reconstruction from flight data
NASA Technical Reports Server (NTRS)
Rogers, Robert M.
1989-01-01
The application of extended Kalman filtering to estimating Space Shuttle Solid Rocket Booster (SRB) performance (specific impulse) from flight data in a post-flight processing computer program is described. The flight data used include inertial platform acceleration, SRB head pressure, and ground-based radar tracking data. The key feature of this application is the model used for the SRBs, which represents a reference quasi-static internal ballistics model normalized to the propellant burn depth. Dynamic states of mass overboard and propellant burn depth are included in the filter model to account for real-time deviations from the reference model. Aerodynamic, plume, wind and main engine uncertainties are included.
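As a greatly simplified sketch of the estimation idea above, here is a scalar Kalman filter recovering a constant parameter (standing in for specific impulse) from noisy measurements. The real filter is an extended Kalman filter over a quasi-static internal-ballistics model with dynamic burn-depth states; none of that is reproduced here, and the numeric values are invented.

```python
import random

def scalar_kalman(measurements, r, x0=0.0, p0=1e6):
    """Scalar Kalman filter for a constant state observed with noise variance r."""
    x, p = x0, p0
    for z in measurements:
        k = p / (p + r)       # Kalman gain (constant-state process model)
        x = x + k * (z - x)   # measurement update
        p = (1 - k) * p       # covariance update
    return x

random.seed(0)
true_isp = 268.0  # hypothetical value, for illustration only
zs = [true_isp + random.gauss(0, 2.0) for _ in range(200)]
est = scalar_kalman(zs, r=4.0)
print(round(est, 1))  # converges near 268
```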
Maximum-performance fiber-optic irradiation with nonimaging designs.
Fang, Y; Feuermann, D; Gordon, J M
1997-10-01
A range of practical nonimaging designs for optical fiber applications is presented. Rays emerging from a fiber over a restricted angular range (small numerical aperture) are needed to illuminate a small near-field detector at maximum radiative efficiency. These designs range from pure reflector (all-mirror), through pure dielectric (refractive and based on total internal reflection), to lens-mirror combinations. Sample designs are shown for a specific infrared fiber-optic irradiation problem of practical interest. Optical performance is checked with computer three-dimensional ray tracing. Compared with conventional imaging solutions, nonimaging units offer considerable practical advantages in compactness and ease of alignment as well as noticeably superior radiative efficiency.
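The total internal reflection that the dielectric designs above rely on occurs for rays hitting an interface beyond the critical angle given by Snell's law. A one-line check, using a generic glass index rather than any value from the paper:

```python
import math

def critical_angle_deg(n_core, n_outside=1.0):
    """Critical angle for total internal reflection at a core/outside interface."""
    return math.degrees(math.asin(n_outside / n_core))

# Generic glass (n = 1.5) in air: rays beyond this angle reflect losslessly
theta_c = critical_angle_deg(n_core=1.5)
print(round(theta_c, 1))  # about 41.8 degrees
```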
Experimental and analytical study of close-coupled ventral nozzles for ASTOVL aircraft
NASA Technical Reports Server (NTRS)
Mcardle, Jack G.; Smith, C. Frederic
1990-01-01
Flow in a generic ventral nozzle system was studied experimentally and analytically with a block version of the PARC3D computational fluid dynamics program (a full Navier-Stokes equation solver) in order to evaluate the program's ability to predict system performance and internal flow patterns. For the experimental work a one-third-size model tailpipe with a single large rectangular ventral nozzle mounted normal to the tailpipe axis was tested with unheated air at steady-state pressure ratios up to 4.0. The end of the tailpipe was closed to simulate a blocked exhaust nozzle. Measurements showed about 5 1/2 percent flow-turning loss, reasonable nozzle performance coefficients, and a significant aftward axial component of thrust due to the flow turning more than 90 deg. Flow behavior into and through the ventral duct is discussed and illustrated with paint-streak flow visualization photographs. For the analytical work the same ventral system configuration was modeled with two computational grids to evaluate the effect of grid density. Both grids gave good results. The finer-grid solution produced more detailed flow patterns and predicted performance parameters, such as thrust and discharge coefficient, within 1 percent of the measured values. PARC3D flow visualization images are shown for comparison with the paint-streak photographs. Modeling and computational issues encountered in the analytical work are discussed.
Experiments with microcomputer-based artificial intelligence environments
Summers, E.G.; MacDonald, R.A.
1988-01-01
The U.S. Geological Survey (USGS) has been experimenting with the use of relatively inexpensive microcomputers as artificial intelligence (AI) development environments. Several AI languages are available that perform fairly well on desk-top personal computers, as are low-to-medium cost expert system packages. Although performance of these systems is respectable, their speed and capacity limitations are questionable for serious earth science applications foreseen by the USGS. The most capable artificial intelligence applications currently are concentrated on what is known as the "artificial intelligence computer," and include Xerox D-series, Tektronix 4400 series, Symbolics 3600, VAX, LMI, and Texas Instruments Explorer. The artificial intelligence computer runs expert system shells and Lisp, Prolog, and Smalltalk programming languages. However, these AI environments are expensive. Recently, inexpensive 32-bit hardware has become available for the IBM/AT microcomputer. USGS has acquired and recently completed Beta-testing of the Gold Hill Systems 80386 Hummingboard, which runs Common Lisp on an IBM/AT microcomputer. Hummingboard appears to have the potential to overcome many of the speed/capacity limitations observed with AI-applications on standard personal computers. USGS is a Beta-test site for the Gold Hill Systems GoldWorks expert system. GoldWorks combines some high-end expert system shell capabilities in a medium-cost package. This shell is developed in Common Lisp, runs on the 80386 Hummingboard, and provides some expert system features formerly available only on AI-computers including frame and rule-based reasoning, on-line tutorial, multiple inheritance, and object-programming. © 1988 International Association for Mathematical Geology.
NASA Technical Reports Server (NTRS)
Ray, Charles D.; Carrasquillo, Robyn L.; Minton-Summers, Silvia
1997-01-01
This paper provides a summary of current work accomplished under technical task agreement (TTA) by the Marshall Space Flight Center (MSFC) regarding the Environmental Control and Life Support System (ECLSS) as well as future planning activities in support of the International Space Station (ISS). Current activities include ECLSS computer model development, component design and development, subsystem integrated system testing, life testing, and government furnished equipment delivered to the ISS program. A long range plan for the MSFC ECLSS test facility is described whereby the current facility would be upgraded to support integrated station ECLSS operations. ECLSS technology development efforts proposed to be performed under the Advanced Engineering Technology Development (AETD) program are also discussed.
Journal of Undergraduate Research, Volume IX, 2009
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stiner, K. S.; Graham, S.; Khan, M.
Each year more than 600 undergraduate students are awarded paid internships at the Department of Energy's (DOE) National Laboratories. These interns are paired with research scientists who serve as mentors in authentic research projects. All participants write a research abstract and present at a poster session and/or complete a full-length research paper. Abstracts and selected papers from our 2007–2008 interns that represent the breadth and depth of undergraduate research performed each year at our National Laboratories are published here in the Journal of Undergraduate Research. The fields in which these students worked included: Biology; Chemistry; Computer Science; Engineering; Environmental Science; General Science; Materials Science; Medical and Health Sciences; Nuclear Science; Physics; Science Policy; and Waste Management.
An integrative theory of prefrontal cortex function.
Miller, E K; Cohen, J D
2001-01-01
The prefrontal cortex has long been suspected to play an important role in cognitive control, in the ability to orchestrate thought and action in accordance with internal goals. Its neural basis, however, has remained a mystery. Here, we propose that cognitive control stems from the active maintenance of patterns of activity in the prefrontal cortex that represent goals and the means to achieve them. They provide bias signals to other brain structures whose net effect is to guide the flow of activity along neural pathways that establish the proper mappings between inputs, internal states, and outputs needed to perform a given task. We review neurophysiological, neurobiological, neuroimaging, and computational studies that support this theory and discuss its implications as well as further issues to be addressed.
NECAP 4.1: NASA's Energy-Cost Analysis Program input manual
NASA Technical Reports Server (NTRS)
Jensen, R. N.
1982-01-01
The computer program NECAP (NASA's Energy Cost Analysis Program) is described. The program is a versatile building design and energy analysis tool that embodies state-of-the-art techniques for performing thermal load calculations and energy use predictions. With the program, comparisons of building designs and operational alternatives for new or existing buildings can be made. The major feature of the program is the response factor technique for calculating heat transfer through the building surfaces, which accounts for the building's mass. The program expands the response factor technique into a space response factor to account for internal building temperature swings; this is extremely important in determining true building loads and energy consumption when internal temperatures are allowed to swing.
NASA Technical Reports Server (NTRS)
Kreskovsky, J. P.; Briley, W. R.; Mcdonald, H.
1982-01-01
A finite difference method is developed for making detailed predictions of three dimensional subsonic turbulent flow in turbofan lobe mixers. The governing equations are solved by a forward-marching solution procedure which corrects an inviscid potential flow solution for viscous and thermal effects, secondary flows, total pressure distortion and losses, internal flow blockage and pressure drop. Test calculations for a turbulent coaxial jet flow verify that the turbulence model performs satisfactorily for this relatively simple flow. Lobe mixer flows are presented for two geometries typical of current mixer design. These calculations included both hot and cold flow conditions, and both matched and mismatched Mach number and total pressure in the fan and turbine streams.
Sozzi, Fabiola B; Maiello, Maria; Pelliccia, Francesco; Parato, Vito Maurizio; Canetta, Ciro; Savino, Ketty; Lombardi, Federico; Palmiero, Pasquale
2016-09-01
Coronary computed tomography angiography is a noninvasive heart imaging test currently undergoing rapid development and advancement. High-resolution three-dimensional images of the moving heart and great vessels are acquired during coronary computed tomography to identify coronary artery disease and classify patient risk for atherosclerotic cardiovascular disease. The technique provides useful information about the coronary tree and atherosclerotic plaques beyond simple luminal narrowing and plaque type defined by calcium content. This application will improve image-guided prevention, medical therapy, and coronary interventions. The ability to interpret coronary computed tomography images is of utmost importance as we develop personalized medical care to enable therapeutic interventions stratified on the basis of plaque characteristics. This overview provides available data and experts' recommendations on the utilization of coronary computed tomography findings. We focus on the use of coronary computed tomography to detect coronary artery disease and stratify patients at risk, illustrating the implications of this test for patient management. We describe its diagnostic power in identifying patients at higher risk of developing acute coronary syndrome and its prognostic significance. Finally, we highlight the features of the vulnerable plaques imaged by coronary computed tomography angiography. © 2016, Wiley Periodicals, Inc.
ICCE/ICCAI 2000 Full & Short Papers (Knowledge Construction and Navigation).
ERIC Educational Resources Information Center
2000
This document contains the following full and short papers on knowledge construction and navigation from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "An XML-Based Tool for Building and Using Conceptual Maps in Education and Training Environments"…
ICCE/ICCAI 2000 Full & Short Papers (Lifelong Learning).
ERIC Educational Resources Information Center
2000
This document contains the following full and short papers on lifelong learning from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Study on the School Information Technology Pilot Scheme: Possibilities of Creative and Lifelong Learning" (Siu-Cheung Kong,…
ICCE/ICCAI 2000 Full & Short Papers (Educational Agent).
ERIC Educational Resources Information Center
2000
This document contains the full text of the following papers on educational agent from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "An Agent-Based Intelligent Tutoring System" (C.M. Bruff and M.A. Williams); (2) "Design of Systematic Concept…
ICCE/ICCAI 2000 Full & Short Papers (Intelligent Tutoring Systems).
ERIC Educational Resources Information Center
2000
This document contains the full and short papers on intelligent tutoring systems (ITS) from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: a framework for Internet-based distributed learning; a fuzzy-based assessment for the Perl tutoring…
ICCE/ICCAI 2000 Full & Short Papers (Interactive Learning Environments).
ERIC Educational Resources Information Center
2000
This document contains the full and short papers on interactive learning environments from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: a CAL system for appreciation of 3D shapes by surface development; a constructivist virtual physics…
ICCE/ICCAI 2000 Full & Short Papers (System Design and Development).
ERIC Educational Resources Information Center
2000
This document contains the full and short papers on system design and development from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: a code restructuring tool to help scaffold novice programmers; a framework for Internet-based…
ICCE/ICCAI 2000 Full & Short Papers (Multimedia and Hypermedia in Education).
ERIC Educational Resources Information Center
2000
This document contains the full and short papers on multimedia and hypermedia in education from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: learner-centered navigation path planning in world Wide Web-based learning; the relation…
ICCE/ICCAI 2000 Full & Short Papers (Methodologies).
ERIC Educational Resources Information Center
2000
This document contains the full text of the following full and short papers on methodologies from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction): (1) "A Methodology for Learning Pattern Analysis from Web Logs by Interpreting Web Page Contents" (Chih-Kai Chang and…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-19
...., including on-site leased workers from Computer Solutions and Software International, Inc., Dell Service... Insphere Insurance Solutions, Inc., Including On-Site Leased Workers From Computer Solutions and Software International, Inc., Dell Service Sales, Emdeon Business Services, KFORCE, Microsoft, Pariveda Solutions, Inc...
ICCE/ICCAI 2000 Full & Short Papers (Teaching and Learning Processes).
ERIC Educational Resources Information Center
2000
This document contains the full and short papers on teaching and learning processes from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: a code restructuring tool to help scaffold novice programmers; efficient study of Kanji using…
2012-01-01
The report summarizes the scientific content of the annual symposium organized by the Student Council of the International Society for Computational Biology (ISCB) held in conjunction with the Intelligent Systems for Molecular Biology (ISMB) conference in Long Beach, California on July 13, 2012.
Performance analysis of wireless sensor networks in geophysical sensing applications
NASA Astrophysics Data System (ADS)
Uligere Narasimhamurthy, Adithya
Performance is an important criterion to consider before switching from a wired network to a wireless sensing network. Performance is especially important in geophysical sensing, where the quality of the sensing system is measured by the precision of the acquired signal. Can a wireless sensing network maintain the same reliability and quality metrics that a wired system provides? Our work focuses on evaluating the wireless GeoMote sensor motes that were developed by previous computer science graduate students at Mines. Specifically, we conducted a set of experiments, namely WalkAway and Linear Array experiments, to characterize the performance of the wireless motes. The motes were also equipped with the Sticking Heartbeat Aperture Resynchronization Protocol (SHARP), a time synchronization protocol developed by a previous computer science graduate student at Mines. This protocol should automatically synchronize the motes' internal clocks and reduce time synchronization errors. We also collected passive data to evaluate the response of GeoMotes to various frequency components associated with the seismic waves. With the data collected from these experiments, we evaluated the performance of the SHARP protocol and compared the performance of our GeoMote wireless system against the industry-standard wired seismograph system (Geometric-Geode). Using arrival time analysis and seismic velocity calculations, we set out to answer the following question: Can our wireless sensing system (GeoMotes) perform similarly to a traditional wired system in a realistic scenario?
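The arrival-time analysis mentioned above reduces to fitting first-arrival picks against geophone offset; a minimal sketch with invented numbers (not data from the study):

```python
import numpy as np

# Hypothetical linear-array data: geophone offsets (m) and first-arrival picks (s).
offsets = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
arrivals = np.array([0.025, 0.050, 0.075, 0.100, 0.125])

# Least-squares fit t = s * x + t0; the slope s is the slowness,
# and its reciprocal is the apparent seismic velocity.
slope, intercept = np.polyfit(offsets, arrivals, 1)
velocity = 1.0 / slope
print(f"apparent velocity: {velocity:.0f} m/s")
```

For a homogeneous near-surface, the same fit can be run on the wired and wireless recordings and the velocities compared; residual scatter about the fitted line reflects picking and time-synchronization error.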
Effects of training and motivation on auditory P300 brain-computer interface performance.
Baykara, E; Ruf, C A; Fioravanti, C; Käthner, I; Simon, N; Kleih, S C; Kübler, A; Halder, S
2016-01-01
Brain-computer interface (BCI) technology aims at helping end-users with severe motor paralysis to communicate with their environment without using the natural output pathways of the brain. For end-users in complete paralysis, loss of gaze control may necessitate non-visual BCI systems. The present study investigated the effect of training on performance with an auditory P300 multi-class speller paradigm. For half of the participants, spatial cues were added to the auditory stimuli to see whether performance could be further optimized. The influence of motivation, mood and workload on performance and the P300 component was also examined. In five sessions, 16 healthy participants were instructed to spell several words by attending to animal sounds representing the rows and columns of a 5 × 5 letter matrix. 81% of the participants achieved an average online accuracy of ⩾ 70%. From the first to the fifth session, information transfer rates increased from 3.72 bits/min to 5.63 bits/min. Motivation significantly influenced P300 amplitude and online ITR. No significant facilitative effect of spatial cues on performance was observed. Training improves performance in an auditory BCI paradigm. Motivation influences performance and P300 amplitude. The described auditory BCI system may help end-users to communicate independently of gaze control with their environment. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
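Information transfer rates of the kind quoted above are conventionally computed with the Wolpaw formula; a minimal sketch for the 25-class (5 × 5) matrix, assuming a hypothetical 30 s selection time (the abstract does not state one):

```python
import math

def wolpaw_bits_per_selection(n_classes: int, accuracy: float) -> float:
    """Bits conveyed per selection under the standard Wolpaw ITR model."""
    if accuracy >= 1.0:
        return math.log2(n_classes)
    return (math.log2(n_classes)
            + accuracy * math.log2(accuracy)
            + (1 - accuracy) * math.log2((1 - accuracy) / (n_classes - 1)))

# 25 symbols (5 x 5 matrix) at 70% online accuracy, with an ASSUMED
# selection duration of 30 s (illustrative; not given in the abstract).
bits = wolpaw_bits_per_selection(25, 0.70)
itr = bits * (60.0 / 30.0)  # convert to bits per minute
print(f"{itr:.2f} bits/min")
```

The formula makes clear why both accuracy and selection speed drive the session-to-session ITR gains reported in the study.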
Trial-by-Trial Motor Cortical Correlates of a Rapidly Adapting Visuomotor Internal Model
Ryu, Stephen I.
2017-01-01
Accurate motor control is mediated by internal models of how neural activity generates movement. We examined neural correlates of an adapting internal model of visuomotor gain in motor cortex while two macaques performed a reaching task in which the gain scaling between the hand and a presented cursor was varied. Previous studies of cortical changes during visuomotor adaptation focused on preparatory and perimovement epochs and analyzed trial-averaged neural data. Here, we recorded simultaneous neural population activity using multielectrode arrays and focused our analysis on neural differences in the period before the target appeared. We found that we could estimate the monkey's internal model of the gain using the neural population state during this pretarget epoch. This neural correlate depended on the gain experienced during recent trials and it predicted the speed of the subsequent reach. To explore the utility of this internal model estimate for brain–machine interfaces, we performed an offline analysis showing that it can be used to compensate for upcoming reach extent errors. Together, these results demonstrate that pretarget neural activity in motor cortex reflects the monkey's internal model of visuomotor gain on single trials and can potentially be used to improve neural prostheses. SIGNIFICANCE STATEMENT When generating movement commands, the brain is believed to use internal models of the relationship between neural activity and the body's movement. Visuomotor adaptation tasks have revealed neural correlates of these computations in multiple brain areas during movement preparation and execution. Here, we describe motor cortical changes in a visuomotor gain change task even before a specific movement is cued. We were able to estimate the gain internal model from these pretarget neural correlates and relate it to single-trial behavior. 
This is an important step toward understanding the sensorimotor system's algorithms for updating its internal models after specific movements and errors. Furthermore, the ability to estimate the internal model before movement could improve motor neural prostheses being developed for people with paralysis. PMID:28087767
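Estimating an internal model from pretarget population activity amounts, at its simplest, to a regression from neural state to the experienced gain; a toy sketch on synthetic data (all numbers invented; this is not the study's decoder):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: 200 trials of a 40-unit pretarget population state
# whose first dimension covaries with the experienced visuomotor gain.
n_trials, n_units = 200, 40
gain = rng.choice([0.5, 1.0, 2.0], size=n_trials)   # gain on each trial
states = rng.normal(size=(n_trials, n_units))
states[:, 0] += gain                                # embed the gain signal

# Least-squares linear readout of gain from the population state.
X = np.column_stack([states, np.ones(n_trials)])    # append a bias column
weights, *_ = np.linalg.lstsq(X, gain, rcond=None)
predicted = X @ weights
r = np.corrcoef(predicted, gain)[0, 1]
print(f"decoded-vs-true gain correlation: {r:.2f}")
```

A readout of this form, applied before the target appears, is the kind of signal the authors propose could pre-compensate reach-extent errors in a neural prosthesis.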
NASA Astrophysics Data System (ADS)
Sharif, Hafiz Zafar; Leman, A. M.; Muthuraman, S.; Salleh, Mohd Najib Mohd; Zakaria, Supaat
2017-09-01
Combined heating, cooling, and power is also known as tri-generation. A tri-generation system can provide power, hot water, space heating, and air-conditioning from a single source of energy. The objective of this study is to propose a method to evaluate the characteristics and performance of a single-stage lithium bromide-water (LiBr-H2O) absorption machine operated with the waste thermal energy of an internal combustion engine, which is an integral part of a tri-generation system. Correlations for computer sensitivity analysis are developed in data-fit software for the (P-T-X) and (H-T-X) relations, saturated liquid (water), saturated vapor, saturation pressure, and the crystallization temperature curve of the LiBr-H2O solution. A number of equations were developed with the data-fit software and exported into an Excel worksheet to evaluate parameters concerning the performance of the vapor absorption machine, such as the coefficient of performance, solution concentration, mass flow rate, and the size of the heat exchangers of the unit in relation to the generator, condenser, absorber, and evaporator temperatures. The size of a vapor absorption machine operating within its crystallization limits, for cooling and heating by waste energy recovered from the exhaust gas and jacket water of an internal combustion engine, is also presented in this study to save time and cost for facilities managers who are interested in utilizing the waste thermal energy of their buildings or premises for heating and air-conditioning applications.
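The central performance figure, the coefficient of performance, follows from a single-effect energy balance; a simplified sketch with illustrative enthalpy values (assumed for demonstration, not the paper's fitted correlations):

```python
# Simplified single-effect LiBr-H2O cycle energy balance.
# All numbers below are illustrative assumptions, not the study's data.

m_ref = 0.01          # refrigerant (water vapor) mass flow, kg/s (assumed)
h_evap_in = 355.0     # enthalpy entering the evaporator, kJ/kg (assumed)
h_evap_out = 2510.0   # saturated vapor leaving the evaporator, kJ/kg (assumed)

Q_evap = m_ref * (h_evap_out - h_evap_in)   # cooling effect, kW

Q_gen = 30.0          # generator heat input from engine waste heat, kW (assumed)

cop_cooling = Q_evap / Q_gen                # cooling COP = Q_evap / Q_gen
print(f"Q_evap = {Q_evap:.2f} kW, COP = {cop_cooling:.2f}")
```

The resulting COP of roughly 0.7 is typical of single-effect LiBr-H2O machines; the study's spreadsheet evaluates the same balance with property correlations fitted to the (P-T-X) and (H-T-X) data.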
Zetterberg, Camilla; Forsman, Mikael; Richter, Hans O
2017-01-01
Visually demanding near work can cause eye discomfort, and eye and neck/shoulder discomfort during, e.g., computer work are associated. To investigate direct effects of experimental near work on eye and neck/shoulder discomfort, 33 individuals with chronic neck pain and 33 healthy control subjects performed a visual task four times using four different trial lenses (referred to as four different viewing conditions), and they rated eye and neck/shoulder discomfort at baseline and after each task. Since symptoms of eye discomfort may differ depending on the underlying cause, two categories were used; internal eye discomfort, such as ache and strain, that may be caused by accommodative or vergence stress; and external eye discomfort, such as burning and smarting, that may be caused by dry-eye disorders. The cumulative performance time (reflected in the temporal order of the tasks), astigmatism, accommodation response and concurrent symptoms of internal eye discomfort all aggravated neck/shoulder discomfort, but there was no significant effect of external eye discomfort. There was also an interaction effect between the temporal order and internal eye discomfort: participants with a greater mean increase in internal eye discomfort also developed more neck/shoulder discomfort with time. Since moderate musculoskeletal symptoms are a risk factor for more severe symptoms, it is important to ensure a good visual environment in occupations involving visually demanding near work.
COBRA ATD minefield detection model initial performance analysis
NASA Astrophysics Data System (ADS)
Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.
2000-08-01
A statistical performance analysis of the USMC Coastal Battlefield Reconnaissance and Analysis (COBRA) Minefield Detection (MFD) Model has been performed in support of the COBRA ATD Program under execution by the Naval Surface Warfare Center/Dahlgren Division/Coastal Systems Station. This analysis uses the Veridian ERIM International MFD model from the COBRA Sensor Performance Evaluation and Computational Tools for Research Analysis modeling toolbox and a collection of multispectral mine detection algorithm response distributions for mines and minelike clutter objects. These mine detection response distributions were generated from actual COBRA ATD test missions over littoral zone minefields. This analysis serves to validate both the utility and effectiveness of the COBRA MFD Model as a predictive MFD performance tool. COBRA ATD minefield detection model algorithm performance results based on a simulated baseline minefield detection scenario are presented, as well as results of an MFD model algorithm parametric sensitivity study.
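A statistical analysis of this kind typically thresholds the algorithm-response distributions for mines versus clutter; a generic sketch assuming Gaussian responses (illustrative parameters, not the COBRA data):

```python
import math

def gauss_tail(x: float, mean: float, std: float) -> float:
    """P(response > x) for a Gaussian response distribution."""
    return 0.5 * math.erfc((x - mean) / (std * math.sqrt(2.0)))

# Illustrative response statistics (invented; not from the COBRA data set).
mine_mean, mine_std = 5.0, 1.0
clutter_mean, clutter_std = 2.0, 1.5

threshold = 3.5
p_detect = gauss_tail(threshold, mine_mean, mine_std)          # detection prob.
p_false_alarm = gauss_tail(threshold, clutter_mean, clutter_std)  # false-alarm prob.
print(f"Pd = {p_detect:.3f}, Pfa = {p_false_alarm:.3f}")
```

Sweeping the threshold traces out an ROC curve, which is the usual basis for the kind of parametric sensitivity study the abstract describes.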
Szécsi, László; Kacsó, Ágota; Zeck, Günther; Hantz, Péter
2017-01-01
Light stimulation with precise and complex spatial and temporal modulation is demanded by a series of research fields like visual neuroscience, optogenetics, ophthalmology, and visual psychophysics. We developed a user-friendly and flexible stimulus generating framework (GEARS GPU-based Eye And Retina Stimulation Software), which offers access to GPU computing power, and allows interactive modification of stimulus parameters during experiments. Furthermore, it has built-in support for driving external equipment, as well as for synchronization tasks, via USB ports. The use of GEARS does not require elaborate programming skills. The necessary scripting is visually aided by an intuitive interface, while the details of the underlying software and hardware components remain hidden. Internally, the software is a C++/Python hybrid using OpenGL graphics. Computations are performed on the GPU, and are defined in the GLSL shading language. However, all GPU settings, including the GPU shader programs, are automatically generated by GEARS. This is configured through a method encountered in game programming, which allows high flexibility: stimuli are straightforwardly composed using a broad library of basic components. Stimulus rendering is implemented solely in C++, therefore intermediary libraries for interfacing could be omitted. This enables the program to perform computationally demanding tasks like en-masse random number generation or real-time image processing by local and global operations. PMID:29326579
A fast, time-accurate unsteady full potential scheme
NASA Technical Reports Server (NTRS)
Shankar, V.; Ide, H.; Gorski, J.; Osher, S.
1985-01-01
The unsteady form of the full potential equation is solved in conservation form by an implicit method based on approximate factorization. At each time level, internal Newton iterations are performed to achieve time accuracy and computational efficiency. A local time linearization procedure is introduced to provide a good initial guess for the Newton iteration. A novel flux-biasing technique is applied to generate proper forms of the artificial viscosity to treat hyperbolic regions with shocks and sonic lines present. The wake is properly modeled by accounting not only for jumps in phi, but also for jumps in higher derivatives of phi, obtained by requiring the density to be continuous across the wake. The far field is modeled using the Riemann invariants to simulate nonreflecting boundary conditions. The resulting unsteady method performs well, requiring fewer than 100 time steps per cycle at transonic Mach numbers even at low reduced frequencies of 0.1 or less. The code is fully vectorized for the CRAY-XMP and the VPS-32 computers.
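The per-time-level Newton strategy can be illustrated on a scalar implicit time step; this is a sketch of the idea on a model equation, not the full potential solver:

```python
# Illustrative only: Newton inner iterations for one backward-Euler step of
# the model ODE du/dt = -u**3, mirroring the per-time-level Newton strategy
# (previous time level as the initial guess, iterate the residual to zero).
def implicit_step(u_old: float, dt: float, tol: float = 1e-12) -> float:
    u = u_old  # initial guess from the previous level (local linearization)
    for _ in range(50):
        residual = u - u_old + dt * u**3      # backward-Euler residual
        jacobian = 1.0 + 3.0 * dt * u**2      # d(residual)/du
        delta = residual / jacobian
        u -= delta                            # Newton update
        if abs(delta) < tol:
            break
    return u

u = 1.0
for _ in range(10):            # march forward in time, dt = 0.1
    u = implicit_step(u, 0.1)
print(f"u after 10 steps: {u:.4f}")
```

Because each Newton solve converges quadratically from a good initial guess, only a few inner iterations per time level are needed, which is what permits the large time steps reported above.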
Pretest aerosol code comparisons for LWR aerosol containment tests LA1 and LA2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, A.L.; Wilson, J.H.; Arwood, P.C.
The Light-Water-Reactor (LWR) Aerosol Containment Experiments (LACE) are being performed in Richland, Washington, at the Hanford Engineering Development Laboratory (HEDL) under the leadership of an international project board and the Electric Power Research Institute. These tests have two objectives: (1) to investigate, at large scale, the inherent aerosol retention behavior in LWR containments under simulated severe accident conditions, and (2) to provide an experimental data base for validating aerosol behavior and thermal-hydraulic computer codes. Aerosol computer-code comparison activities are being coordinated at the Oak Ridge National Laboratory. For each of the six LACE tests, "pretest" calculations (for code-to-code comparisons) and "posttest" calculations (for code-to-test data comparisons) are being performed. The overall goals of the comparison effort are (1) to provide code users with experience in applying their codes to LWR accident-sequence conditions and (2) to evaluate and improve the code models.