Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-02
... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-841] Certain Computers and Computer Peripheral... after importation of certain computers and computer peripheral devices and components thereof and... computers and computer peripheral devices and components thereof and products containing the same that...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... Communications and Computer Devices and Components Thereof; Notice of Investigation AGENCY: U.S. International... States after importation of certain mobile communications and computer devices and components thereof by... importation of certain mobile communications or computer devices or components thereof that infringe one or...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-08
... Phones and Tablet Computers, and Components Thereof Institution of Investigation AGENCY: U.S... computers, and components thereof by reason of infringement of certain claims of U.S. Patent No. 5,570,369... mobile phones and tablet computers, and components thereof that infringe one or more of claims 1-3 and 5...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-14
... Communications and Computer Devices and Components Thereof; Notice of Commission Determination Not To Review an... in its entirety Inv. No. 337-TA-704, Certain Mobile Communications and Computer Devices and... importation of certain mobile communications and computer devices and components thereof by reason of...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-705] In the Matter of Certain Notebook Computer... United States after importation of certain notebook computer products and components thereof by reason of... after importation of certain notebook computer products or components thereof that infringe one or more...
Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Heidelberger, Philip [Cortlandt Manor, NY; Kumar, Sameer [White Plains, NY; Parker, Jeffrey J [Rochester, MN; Ratterman, Joseph D [Rochester, MN
2011-06-07
Methods, compute nodes, and computer program products are provided for heuristic status polling of a component in a computing system. Embodiments include receiving, by a polling module from a requesting application, a status request requesting status of a component; determining, by the polling module, whether an activity history for the component satisfies heuristic polling criteria; polling, by the polling module, the component for status if the activity history for the component satisfies the heuristic polling criteria; and not polling, by the polling module, the component for status if the activity history for the component does not satisfy the heuristic polling criteria.
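The polling flow described in the abstract can be sketched in a few lines of Python. This is a minimal illustration, not the patented implementation; the class and method names, and the particular heuristic (a minimum number of recent activity events within a time window), are assumptions chosen for clarity.

```python
import time
from collections import deque

class HeuristicPoller:
    """Sketch of heuristic status polling: poll a component only when its
    recent activity history suggests its status may have changed."""

    def __init__(self, min_events=3, window_s=1.0):
        self.history = {}            # component name -> deque of event timestamps
        self.min_events = min_events
        self.window_s = window_s

    def record_activity(self, component):
        # Called whenever the component does observable work.
        self.history.setdefault(component, deque(maxlen=64)).append(time.monotonic())

    def satisfies_criteria(self, component):
        # Heuristic: the component saw at least min_events activity
        # events within the last window_s seconds.
        now = time.monotonic()
        events = self.history.get(component, ())
        recent = [t for t in events if now - t <= self.window_s]
        return len(recent) >= self.min_events

    def request_status(self, component, poll):
        # poll() is the expensive status query the heuristic tries to avoid;
        # returning None signals the caller to fall back to a cached status.
        if self.satisfies_criteria(component):
            return poll(component)
        return None
```

In use, an application asks the poller for status; the actual poll only happens for components with sufficient recent activity.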
A grid-embedding transonic flow analysis computer program for wing/nacelle configurations
NASA Technical Reports Server (NTRS)
Atta, E. H.; Vadyak, J.
1983-01-01
An efficient grid-interfacing zonal algorithm was developed for computing the three-dimensional transonic flow field about wing/nacelle configurations. The algorithm uses the full-potential formulation and the AF2 approximate factorization scheme. The flow field solution is computed using a component-adaptive grid approach in which separate grids are employed for the individual components in the multi-component configuration, with each component grid optimized for a particular geometry such as the wing or nacelle. The wing and nacelle component grids are allowed to overlap, and flow field information is transmitted from one grid to another through the overlap region using trivariate interpolation. This report presents the computational methods used to generate both the wing and nacelle component grids, the technique used to interface the component grids, and the method used to obtain the inviscid flow solution. Computed results and correlations with experiment are presented. Also presented are discussions of the organization of the wing grid generation (GRGEN3) and nacelle grid generation (NGRIDA) computer programs, the grid interface (LK) computer program, and the wing/nacelle flow solution (TWN) computer program. Descriptions of the respective subroutines, definitions of the required input parameters, a discussion of how to interpret the output, and sample cases illustrating application of the analysis are provided for each of the four computer programs.
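The data transfer through the overlap region relies on trivariate interpolation between donor and receiver grids. As a simple, hedged illustration (not the report's actual scheme), trilinear interpolation on a structured donor grid looks like this; the grid is represented here as a plain dictionary mapping integer corner indices to values.

```python
import math

def trilinear(f, x, y, z):
    """Trilinearly interpolate a value at fractional point (x, y, z).

    f: dict mapping integer corner indices (i, j, k) on the donor grid
    to the field value stored at that grid point."""
    i, j, k = int(math.floor(x)), int(math.floor(y)), int(math.floor(z))
    dx, dy, dz = x - i, y - j, z - k
    value = 0.0
    # Weighted sum over the 8 corners of the enclosing cell.
    for di in (0, 1):
        for dj in (0, 1):
            for dk in (0, 1):
                w = ((dx if di else 1.0 - dx) *
                     (dy if dj else 1.0 - dy) *
                     (dz if dk else 1.0 - dz))
                value += w * f[(i + di, j + dj, k + dk)]
    return value
```

For a field that is already linear in x, y, z, the interpolant reproduces it exactly, which is a convenient sanity check for overlap-region transfer code.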
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-04
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-769] Certain Handheld Electronic Computing Devices, Related Software, and Components Thereof; Termination of the Investigation Based on... electronic computing devices, related software, and components thereof by reason of infringement of certain...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-08
... Phones and Tablet Computers, and Components Thereof; Notice of Receipt of Complaint; Solicitation of... entitled Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof... the United States after importation of certain electronic devices, including mobile phones and tablet...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-26
... States after importation of certain notebook computer products and components thereof by reason of... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-705] In the Matter of: Certain Notebook Computer Products and Components Thereof; Notice of Commission Determination Not To Review an Initial...
1990-08-01
Distributed Training for the Reserve Component: Course Conversion and Implementation Guidelines for Computer Conferencing. August 1990. Field Element at Boise, Idaho; Field Unit at Fort Knox, Kentucky. Subject terms: asynchronous computer conferencing; training technology; Reserve Component; distributed training.
Calculating a checksum with inactive networking components in a computing system
Aho, Michael E; Chen, Dong; Eisley, Noel A; Gooding, Thomas M; Heidelberger, Philip; Tauferner, Andrew T
2014-12-16
Calculating a checksum utilizing inactive networking components in a computing system, including: identifying, by a checksum distribution manager, an inactive networking component, wherein the inactive networking component includes a checksum calculation engine for computing a checksum; sending, to the inactive networking component by the checksum distribution manager, metadata describing a block of data to be transmitted by an active networking component; calculating, by the inactive networking component, a checksum for the block of data; transmitting, to the checksum distribution manager from the inactive networking component, the checksum for the block of data; and sending, by the active networking component, a data communications message that includes the block of data and the checksum for the block of data.
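The offload scheme in this abstract can be sketched as follows. This is a hypothetical software illustration only: the class names are invented, and `zlib.crc32` stands in for the hardware checksum calculation engine of an idle NIC.

```python
import zlib

class ChecksumDistributionManager:
    """Sketch of delegating checksum computation to an inactive
    networking component's checksum engine."""

    def __init__(self, components):
        # components: name -> {"active": bool}
        self.components = components

    def find_inactive(self):
        for name, state in self.components.items():
            if not state["active"]:
                return name
        return None

    def send(self, data: bytes):
        engine = self.find_inactive()
        # Metadata describing the block is sent to the idle component,
        # which computes the checksum (simulated here with CRC-32).
        metadata = {"length": len(data), "block": data}
        checksum = zlib.crc32(metadata["block"])
        # The active component then transmits the block plus the
        # checksum returned by the idle engine in a single message.
        return {"payload": data, "checksum": checksum, "engine": engine}
```

The point of the design is that the active component's transmit path never pays for the checksum computation; an otherwise idle engine does.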
Calculating a checksum with inactive networking components in a computing system
Aho, Michael E; Chen, Dong; Eisley, Noel A; Gooding, Thomas M; Heidelberger, Philip; Tauferner, Andrew T
2015-01-27
Calculating a checksum utilizing inactive networking components in a computing system, including: identifying, by a checksum distribution manager, an inactive networking component, wherein the inactive networking component includes a checksum calculation engine for computing a checksum; sending, to the inactive networking component by the checksum distribution manager, metadata describing a block of data to be transmitted by an active networking component; calculating, by the inactive networking component, a checksum for the block of data; transmitting, to the checksum distribution manager from the inactive networking component, the checksum for the block of data; and sending, by the active networking component, a data communications message that includes the block of data and the checksum for the block of data.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-08
... Wireless Communication Devices, Tablet Computers, Media Players, and Televisions, and Components Thereof... devices, including wireless communication devices, tablet computers, media players, and televisions, and... wireless communication devices, tablet computers, media players, and televisions, and components thereof...
Reflections on Component Computing from the Boxer Project's Perspective
ERIC Educational Resources Information Center
diSessa, Andrea A.
2004-01-01
The Boxer Project conducted the research that led to the synthetic review "Issues in Component Computing." This brief essay provides a platform from which to develop our general perspective on educational computing and how it relates to components. The two most important lines of our thinking are (1) the goal to open technology's creative…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-06
... importation of certain notebook computer products and components thereof by reason of infringement of the '693... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-705] In the Matter of Certain Notebook Computer Products and Components Thereof; Notice of Commission Decision Not To Review an Initial...
Design of microstrip components by computer
NASA Technical Reports Server (NTRS)
Cisco, T. C.
1972-01-01
Development of computer programs for component analysis and design aids used in production of microstrip components is discussed. System includes designs for couplers, filters, circulators, transformers, power splitters, diode switches, and attenuators.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armstrong, Robert C.; Ray, Jaideep; Malony, A.
2003-11-01
We present a case study of performance measurement and modeling of a CCA (Common Component Architecture) component-based application in a high-performance computing environment. We explore issues peculiar to component-based HPC applications and propose a performance measurement infrastructure for HPC based loosely on recent work done for Grid environments. A prototypical implementation of the infrastructure is used to collect data for three components in a scientific application and to construct performance models for two of them. Both computational and message-passing performance are addressed.
Evaluation of Rankine cycle air conditioning system hardware by computer simulation
NASA Technical Reports Server (NTRS)
Healey, H. M.; Clark, D.
1978-01-01
A computer program for simulating the performance of a variety of solar-powered Rankine cycle air conditioning system (RCACS) components has been developed. The computer program models actual equipment by developing performance maps from manufacturers' data and is capable of simulating off-design operation of the RCACS components. The program, designed to be a subroutine of the Marshall Space Flight Center (MSFC) Solar Energy System Analysis Computer Program 'SOLRAD', is a complete package suitable for use by an occasional computer user in developing performance maps of heating, ventilation, and air conditioning components.
ResidPlots-2: Computer Software for IRT Graphical Residual Analyses
ERIC Educational Resources Information Center
Liang, Tie; Han, Kyung T.; Hambleton, Ronald K.
2009-01-01
This article discusses the ResidPlots-2, a computer software that provides a powerful tool for IRT graphical residual analyses. ResidPlots-2 consists of two components: a component for computing residual statistics and another component for communicating with users and for plotting the residual graphs. The features of the ResidPlots-2 software are…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-21
... Devices, Portable Music and Data Processing Devices, Computers, and Components Thereof; Institution of... communication devices, portable music and data processing devices, computers, and components thereof by reason... certain wireless communication devices, portable music and data processing devices, computers, and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-24
... Music and Data Processing Devices, Computers, and Components Thereof; Notice of Receipt of Complaint... complaint entitled Wireless Communication Devices, Portable Music and Data Processing Devices, Computers..., portable music and data processing devices, computers, and components thereof. The complaint names as...
Efficient Computational Prototyping of Mixed Technology Microfluidic Components and Systems
2002-08-01
AFRL-IF-RS-TR-2002-190, Final Technical Report, August 2002. Efficient Computational Prototyping of Mixed Technology Microfluidic Components and Systems. Authors: Narayan R. Aluru, Jacob White. Computer-Aided Design (CAD) tools for microfluidic components and systems were developed in this effort. Innovative numerical methods and algorithms for mixed...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-19
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-841] Certain Computers and Computer Peripheral Devices and Components Thereof and Products Containing the Same Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby...
Semantic Annotation of Computational Components
NASA Technical Reports Server (NTRS)
Vanderbilt, Peter; Mehrotra, Piyush
2004-01-01
This paper describes a methodology to specify machine-processable semantic descriptions of computational components to enable them to be shared and reused. A particular focus of this scheme is to enable automatic composition of such components into simple workflows.
NASA Technical Reports Server (NTRS)
1991-01-01
The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
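Combining a response distribution with a resistance distribution to compute component reliability, as described above, is commonly done by Monte Carlo sampling. The sketch below is an illustrative stand-in for the NESSUS capability, not its actual method; the distributions and function names are assumptions.

```python
import random

def component_reliability(response, resistance, n=100_000, seed=1):
    """Estimate component reliability as the probability that structural
    resistance exceeds structural response.

    response, resistance: callables taking a random.Random and returning
    one sample each (stand-ins for user-selected NESSUS-style models)."""
    rng = random.Random(seed)
    survive = sum(resistance(rng) > response(rng) for _ in range(n))
    return survive / n
```

With, say, a Gaussian response (mean 100, sd 10) and a Gaussian resistance (mean 150, sd 15), the failure margin has mean 50 and standard deviation sqrt(10^2 + 15^2) ≈ 18, so the estimated reliability should land near 0.997.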
Diamond High Assurance Security Program: Trusted Computing Exemplar
2002-09-01
computing component, the Embedded MicroKernel Prototype. A third-party evaluation of the component will be initiated during development (e.g., once...target technologies and larger projects is a topic for future research. Trusted Computing Reference Component – The Embedded MicroKernel Prototype We...Kernel The primary security function of the Embedded MicroKernel will be to enforce process and data-domain separation, while providing primitive
Transformation of OODT CAS to Perform Larger Tasks
NASA Technical Reports Server (NTRS)
Mattmann, Chris; Freeborn, Dana; Crichton, Daniel; Hughes, John; Ramirez, Paul; Hardman, Sean; Woollard, David; Kelly, Sean
2008-01-01
A computer program denoted OODT CAS has been transformed to enable performance of larger tasks that involve greatly increased data volumes and increasingly intensive processing of data on heterogeneous, geographically dispersed computers. Prior to the transformation, OODT CAS (also alternatively denoted, simply, 'CAS') [wherein 'OODT' signifies 'Object-Oriented Data Technology' and 'CAS' signifies 'Catalog and Archive Service'] was a proven software component used to manage scientific data from spaceflight missions. In the transformation, CAS was split into two separate components representing its canonical capabilities: file management and workflow management. In addition, CAS was augmented by addition of a resource-management component. This third component enables CAS to manage heterogeneous computing by use of diverse resources, including high-performance clusters of computers, commodity computing hardware, and grid computing infrastructures. CAS is now more easily maintainable, evolvable, and reusable. These components can be used separately or, taking advantage of synergies, can be used together. Other elements of the transformation included addition of a separate Web presentation layer that supports distribution of data products via Really Simple Syndication (RSS) feeds, and provision for full Resource Description Framework (RDF) exports of metadata.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wetter, Michael; Fuchs, Marcus; Nouidui, Thierry
This paper discusses design decisions for exporting Modelica thermofluid flow components as Functional Mockup Units. The purpose is to provide guidelines that will allow building energy simulation programs and HVAC equipment manufacturers to effectively use FMUs for modeling of HVAC components and systems. We provide an analysis of direct input-output dependencies of such components and discuss how these dependencies can lead to algebraic loops that are formed when connecting thermofluid flow components. Based on this analysis, we provide recommendations that increase the computing efficiency of such components and of systems that are formed by connecting multiple components. We explain what code optimizations are lost when providing thermofluid flow components as FMUs rather than Modelica code. We present an implementation of a package for FMU export of such components, explain the rationale for selecting the connector variables of the FMUs, and finally provide computing benchmarks for different design choices. It turns out that selecting temperature rather than specific enthalpy as input and output signals does not lead to a measurable increase in computing time, but selecting nine small FMUs rather than one large FMU increases computing time by 70%.
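The algebraic loops mentioned in the abstract arise when direct input-output dependencies (direct feedthrough) of connected components form a cycle. A minimal, hypothetical way to detect this is depth-first cycle detection over the port dependency graph; the port names in the test below are illustrative only.

```python
def has_algebraic_loop(deps):
    """deps: dict mapping a port to the set of ports it directly depends
    on (direct feedthrough). A cycle in this graph means the connected
    components form an algebraic loop."""
    visiting, done = set(), set()

    def dfs(node):
        if node in done:
            return False
        if node in visiting:
            return True           # back edge: cycle found
        visiting.add(node)
        for dep in deps.get(node, ()):
            if dfs(dep):
                return True
        visiting.remove(node)
        done.add(node)
        return False

    return any(dfs(n) for n in deps)
```

A master algorithm connecting FMUs can run a check like this before simulation to decide whether an iterative solver is required at the coupling boundary.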
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-25
... Computing Devices, Related Software, and Components Thereof; Notice of Investigation AGENCY: U.S... devices, related software, and components thereof by reason of infringement of certain claims of U.S... devices, related software, and components thereof that infringe one or more of claims 1 and 5 of the '372...
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1990-01-01
Viewgraphs on Rubber Airplane: Constraint-based Component-Modeling for Knowledge Representation in Computer Aided Conceptual Design are presented. Topics covered include: computer aided design; object oriented programming; airfoil design; surveillance aircraft; commercial aircraft; aircraft design; and launch vehicles.
Brain-controlled body movement assistance devices and methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leuthardt, Eric C.; Love, Lonnie J.; Coker, Rob
Methods, devices, systems, and apparatus, including computer programs encoded on a computer storage medium, for brain-controlled body movement assistance devices. In one aspect, a device includes a brain-controlled body movement assistance device with a brain-computer interface (BCI) component adapted to be mounted to a user, a body movement assistance component operably connected to the BCI component and adapted to be worn by the user, and a feedback mechanism provided in connection with at least one of the BCI component and the body movement assistance component, the feedback mechanism being configured to output information relating to a usage session of the brain-controlled body movement assistance device.
A CLIPS based personal computer hardware diagnostic system
NASA Technical Reports Server (NTRS)
Whitson, George M.
1991-01-01
Often the person designated to repair personal computers has little or no knowledge of how to repair a computer. Described here is a simple expert system to aid these inexperienced repair people. The first component of the system leads the repair person through a number of simple system checks such as making sure that all cables are tight and that the dip switches are set correctly. The second component of the system assists the repair person in evaluating error codes generated by the computer. The final component of the system applies a large knowledge base to attempt to identify the component of the personal computer that is malfunctioning. We have implemented and tested our design with a full system to diagnose problems for an IBM compatible system based on the 8088 chip. In our tests, the inexperienced repair people found the system very useful in diagnosing hardware problems.
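The three-stage flow described here (physical checks, error-code interpretation, knowledge-base lookup) can be sketched compactly. The original system was written in CLIPS; this Python version is a hypothetical illustration, and the symptom names and error-code table are invented for the example.

```python
def diagnose(facts):
    """facts: set of observed symptom strings; returns ordered advice."""
    advice = []
    # Stage 1: simple physical checks.
    if "no_power" in facts:
        advice.append("Check that all cables are tight")
    if "boot_config_error" in facts:
        advice.append("Verify DIP switch settings")
    # Stage 2: interpret error codes generated by the computer
    # (the code table here is illustrative, not the real POST table).
    error_codes = {"161": "Replace CMOS battery", "301": "Check keyboard"}
    for fact in facts:
        if fact.startswith("code:"):
            code = fact.split(":", 1)[1]
            advice.append(error_codes.get(code, f"Unknown error code {code}"))
    # Stage 3: knowledge base maps remaining symptoms to a component.
    if "no_video" in facts and "beeps" in facts:
        advice.append("Suspect the video adapter")
    return advice
```

A real rule-based shell would fire rules by pattern matching rather than by explicit `if` chains, but the staged structure is the same.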
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-30
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-745] Certain Wireless Communication Devices, Portable Music and Data Processing Devices, Computers and Components Thereof; Notice of... communication devices, portable music and data processing devices, computers and components thereof by reason of...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-29
..., Including Mobile Phones, Mobile Tablets, Portable Music Players, and Computers, and Components Thereof... certain electronic devices, including mobile phones, mobile tablets, portable music players, and computers... mobile phones, mobile tablets, portable music players, and computers, and components thereof that...
ERIC Educational Resources Information Center
Howard, Bruce C.; McGee, Steven; Shia, Regina; Hong, Namsoo Shin
This study sought to examine the effects of metacognitive self-regulation on problem solving across three conditions: (1) an interactive, computer-based treatment condition; (2) a noninteractive, computer-based alternative treatment condition; and (3) a control condition. Also investigated was which of five components of metacognitive…
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1998-05-01
Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives, as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental resolution methodologies in the form of computational tools, noninvasive optical techniques, and fringe prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate the viability of the approach as an effective engineering tool for analysis and optimization.
Development of an Aeroelastic Modeling Capability for Transient Nozzle Side Load Analysis
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen
2013-01-01
Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development. Currently there is no fully coupled computational tool to analyze this fluid/structure interaction process. The objective of this study was to develop a fully coupled aeroelastic modeling capability to describe the fluid/structure interaction process during transient nozzle operations. The aeroelastic model comprises three components: the computational fluid dynamics component, based on an unstructured-grid, pressure-based computational fluid dynamics formulation; the computational structural dynamics component, developed in the framework of modal analysis; and the fluid-structural interface component. The developed aeroelastic model was applied to the transient nozzle startup process of the Space Shuttle Main Engine at sea level. The computed nozzle side loads and axial nozzle wall pressure profiles from the aeroelastic nozzle are compared with the published rigid-nozzle results, and the impact of the fluid/structure interaction on nozzle side loads is interrogated and presented.
Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN
2012-01-10
Methods, apparatus, and products are disclosed for reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application that include: beginning, by each compute node, performance of a blocking operation specified by the parallel application, each compute node beginning the blocking operation asynchronously with respect to the other compute nodes; reducing, for each compute node, power to one or more hardware components of that compute node in response to that compute node beginning the performance of the blocking operation; and restoring, for each compute node, the power to the hardware components having power reduced in response to all of the compute nodes beginning the performance of the blocking operation.
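The synchronization pattern in this abstract maps naturally onto a barrier: each node reduces power on entering the blocking operation and restores it once every node has arrived. The sketch below simulates this with Python threads and a dictionary standing in for hardware power control; it is an illustration of the idea, not the patented mechanism.

```python
import threading

class PowerAwareBarrier:
    """Sketch: each compute node cuts power to idle hardware components on
    entering a blocking operation, then restores power once all nodes
    have begun the operation."""

    def __init__(self, n_nodes):
        self.barrier = threading.Barrier(n_nodes)
        self.power = {}            # node id -> "full" | "reduced"
        self.lock = threading.Lock()

    def blocking_operation(self, node):
        with self.lock:
            self.power[node] = "reduced"   # reduce power on entry
        self.barrier.wait()                # block until all nodes arrive
        with self.lock:
            self.power[node] = "full"      # restore once everyone arrived
```

Because nodes enter the blocking operation asynchronously, early arrivals spend their wait time in the reduced-power state, which is where the savings come from.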
Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Peters, Amanda E [Cambridge, MA; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN
2012-04-17
Methods, apparatus, and products are disclosed for reducing power consumption while synchronizing a plurality of compute nodes during execution of a parallel application that include: beginning, by each compute node, performance of a blocking operation specified by the parallel application, each compute node beginning the blocking operation asynchronously with respect to the other compute nodes; reducing, for each compute node, power to one or more hardware components of that compute node in response to that compute node beginning the performance of the blocking operation; and restoring, for each compute node, the power to the hardware components having power reduced in response to all of the compute nodes beginning the performance of the blocking operation.
NASA Technical Reports Server (NTRS)
Huang, Norden E. (Inventor)
2001-01-01
A computer implemented method of processing two-dimensional physical signals includes five basic components and the associated presentation techniques of the results. The first component decomposes the two-dimensional signal into one-dimensional profiles. The second component is a computer implemented Empirical Mode Decomposition that extracts a collection of Intrinsic Mode Functions (IMF's) from each profile based on local extrema and/or curvature extrema. The decomposition is based on the direct extraction of the energy associated with various intrinsic time scales in the profiles. In the third component, the IMF's of each profile are then subjected to a Hilbert Transform. The fourth component collates the Hilbert transformed IMF's of the profiles to form a two-dimensional Hilbert Spectrum. A fifth component manipulates the IMF's by, for example, filtering the two-dimensional signal by reconstructing the two-dimensional signal from selected IMF(s).
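Structurally, the method is a decompose-transform-collate pipeline. The sketch below shows only that pipeline shape, with the actual Empirical Mode Decomposition and Hilbert transform abstracted into caller-supplied functions; it is a hedged structural illustration, not the patented signal processing.

```python
def process_2d(signal, decompose_1d, transform):
    """Pipeline sketch for the five-component method above.

    signal: 2D signal as a list of rows.
    decompose_1d: stands in for EMD, mapping a 1D profile to a list of
    Intrinsic Mode Functions (IMFs).
    transform: stands in for the Hilbert transform applied to each IMF."""
    spectrum = []
    for profile in signal:                 # component 1: 1D profiles
        modes = decompose_1d(profile)      # component 2: EMD -> IMFs
        transformed = [transform(m) for m in modes]   # component 3
        spectrum.append(transformed)       # component 4: collate to 2D
    return spectrum
```

Filtering (the fifth component) would then amount to reconstructing each profile from a selected subset of its IMFs before collating.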
Interface Generation and Compositional Verification in JavaPathfinder
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Pasareanu, Corina
2009-01-01
We present a novel algorithm for interface generation of software components. Given a component, our algorithm uses learning techniques to compute a permissive interface representing legal usage of the component. Unlike our previous work, this algorithm does not require knowledge about the component's environment. Furthermore, in contrast to other related approaches, our algorithm computes permissive interfaces even in the presence of non-determinism in the component. Our algorithm is implemented in the JavaPathfinder model checking framework for UML statechart components. We have also added support for automated assume-guarantee style compositional verification in JavaPathfinder, using component interfaces. We report on the application of the presented approach to the generation of interfaces for flight software components.
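A generated interface of the kind described here is, in its simplest form, an automaton accepting the legal call sequences of a component. The sketch below checks a call trace against such an interface; the transition table and method names in the test are purely illustrative, not an interface produced by the paper's algorithm.

```python
def accepts(interface, trace, start="init"):
    """Check whether a call trace is legal usage of a component.

    interface: dict mapping (state, method) -> next state, a tiny
    deterministic stand-in for a learned component interface."""
    state = start
    for method in trace:
        key = (state, method)
        if key not in interface:
            return False          # illegal call in this state
        state = interface[key]
    return True
```

In assume-guarantee verification, an interface like this serves as the assumption: an environment is checked against the interface instead of against the full component.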
NASA Technical Reports Server (NTRS)
Mckay, Charles W.; Feagin, Terry; Bishop, Peter C.; Hallum, Cecil R.; Freedman, Glenn B.
1987-01-01
The principal focus of one of the RICIS (Research Institute for Computing and Information Systems) components is computer systems and software engineering in-the-large of the lifecycle of large, complex, distributed systems which: (1) evolve incrementally over a long time; (2) contain non-stop components; and (3) must simultaneously satisfy a prioritized balance of mission- and safety-critical requirements at run time. This focus is extremely important because of the contribution of the scaling-direction problem to the current software crisis. The Computer Systems and Software Engineering (CSSE) component addresses the lifecycle issues of three environments: host, integration, and target.
NASA Tech Briefs, July 1995. Volume 19, No. 7
NASA Technical Reports Server (NTRS)
1995-01-01
Topics include: mechanical components, electronic components and circuits, electronic systems, physical sciences, materials, computer programs, mechanics, machinery, manufacturing/fabrication, mathematics and information sciences, books and reports, and a special section of Federal laboratory computing Tech Briefs.
Five Year Computer Technology Forecast
DOT National Transportation Integrated Search
1972-12-01
The report delineates the various computer system components and extrapolates past trends in light of industry goals and physical limitations to predict what individual components and entire systems will look like in the second half of this decade. T...
Confabulation Based Sentence Completion for Machine Reading
2010-11-01
making sentence completion an indispensable component of machine reading. Cogent confabulation is a bio-inspired computational model that mimics the... University Press, 1992. [2] H. Motoda and K. Yoshida, “Machine learning techniques to make computers easier to use,” Proceedings of the Fifteenth
Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kostadin, Damevski
A resounding success of the Scientific Discovery through Advanced Computing (SciDAC) program is that high-performance computational science is now universally recognized as a critical aspect of scientific discovery [71], complementing both theoretical and experimental research. As scientific communities prepare to exploit unprecedented computing capabilities of emerging leadership-class machines for multi-model simulations at the extreme scale [72], it is more important than ever to address the technical and social challenges of geographically distributed teams that combine expertise in domain science, applied mathematics, and computer science to build robust and flexible codes that can incorporate changes over time. The Center for Technology for Advanced Scientific Component Software (TASCS) tackles these issues by exploiting component-based software development to facilitate collaborative high-performance scientific computing.
Architectures for single-chip image computing
NASA Astrophysics Data System (ADS)
Gove, Robert J.
1992-04-01
This paper will focus on the architectures of VLSI programmable processing components for image computing applications. TI, the maker of industry-leading RISC, DSP, and graphics components, has developed an architecture for a new-generation of image processors capable of implementing a plurality of image, graphics, video, and audio computing functions. We will show that the use of a single-chip heterogeneous MIMD parallel architecture best suits this class of processors--those which will dominate the desktop multimedia, document imaging, computer graphics, and visualization systems of this decade.
NASA Technical Reports Server (NTRS)
Kleb, William L.; Wood, William A.
2004-01-01
The computational simulation community is not routinely publishing independently verifiable tests to accompany new models or algorithms. A survey reveals that only 22% of new models published are accompanied by tests suitable for independently verifying the new model. As the community develops larger codes with increased functionality, and hence increased complexity in terms of the number of building block components and their interactions, it becomes prohibitively expensive for each development group to derive the appropriate tests for each component. Therefore, the computational simulation community is building its collective castle on a very shaky foundation of components with unpublished and unrepeatable verification tests. The computational simulation community needs to begin publishing component level verification tests before the tide of complexity undermines its foundation.
NASA Technical Reports Server (NTRS)
Palusinski, O. A.; Allgyer, T. T.; Mosher, R. A.; Bier, M.; Saville, D. A.
1981-01-01
A mathematical model of isoelectric focusing at the steady state has been developed for an M-component system of electrochemically defined ampholytes. The model is formulated from fundamental principles describing the components' chemical equilibria, mass transfer resulting from diffusion and electromigration, and electroneutrality. The model consists of ordinary differential equations coupled with a system of algebraic equations. The model is implemented on a digital computer using FORTRAN-based simulation software. Computer simulation data are presented for several two-component systems showing the effects of varying the isoelectric points and dissociation constants of the constituents.
Stability and error estimation for Component Adaptive Grid methods
NASA Technical Reports Server (NTRS)
Oliger, Joseph; Zhu, Xiaolei
1994-01-01
Component adaptive grid (CAG) methods for solving hyperbolic partial differential equations (PDE's) are discussed in this paper. Applying recent stability results for a class of numerical methods on uniform grids, the convergence of these methods for linear problems on component adaptive grids is established here. Furthermore, the computational error can be estimated on CAG's using the stability results. Using these estimates, the error can be controlled on CAG's. Thus, the solution can be computed efficiently on CAG's within a given error tolerance. Computational results for time-dependent linear problems in one and two space dimensions are presented.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. 228.10... component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. The...
Code of Federal Regulations, 2011 CFR
2011-04-01
... 20 Employees' Benefits 1 2011-04-01 2011-04-01 false Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. 228.10... component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. The...
Code of Federal Regulations, 2012 CFR
2012-04-01
... 20 Employees' Benefits 1 2012-04-01 2012-04-01 false Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. 228.10... component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. The...
Code of Federal Regulations, 2013 CFR
2013-04-01
... 20 Employees' Benefits 1 2013-04-01 2012-04-01 true Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. 228.10... component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. The...
Code of Federal Regulations, 2014 CFR
2014-04-01
... 20 Employees' Benefits 1 2014-04-01 2012-04-01 true Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. 228.10... component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. The...
NASA Technical Reports Server (NTRS)
Rajagopal, K. R.
1992-01-01
The technical effort and computer code development are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described, with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis. Volume 2 is a summary of critical SSME components.
Aeroelastic Modeling of a Nozzle Startup Transient
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen
2014-01-01
Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development during test. While three-dimensional, transient, turbulent, chemically reacting computational fluid dynamics methodology has been demonstrated to capture major side load physics with rigid nozzles, hot-fire tests often show nozzle structure deformation during major side load events, leading to structural damage if structural strengthening measures were not taken. The modeling picture is incomplete without the capability to address the two-way responses between the structure and fluid. The objective of this study is to develop a tightly coupled aeroelastic modeling algorithm by implementing the necessary structural dynamics component into an anchored computational fluid dynamics methodology. The computational fluid dynamics component is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, while the computational structural dynamics component is developed under the framework of modal analysis. Transient aeroelastic nozzle startup analyses at sea level were performed, and the computed transient nozzle fluid-structure interaction physics is presented.
Development of an Aeroelastic Modeling Capability for Transient Nozzle Side Load Analysis
NASA Technical Reports Server (NTRS)
Wang, Ten-See; Zhao, Xiang; Zhang, Sijun; Chen, Yen-Sen
2013-01-01
Lateral nozzle forces are known to cause severe structural damage to any new rocket engine in development during test. While three-dimensional, transient, turbulent, chemically reacting computational fluid dynamics methodology has been demonstrated to capture major side load physics with rigid nozzles, hot-fire tests often show nozzle structure deformation during major side load events, leading to structural damage if structural strengthening measures were not taken. The modeling picture is incomplete without the capability to address the two-way responses between the structure and fluid. The objective of this study is to develop a coupled aeroelastic modeling capability by implementing the necessary structural dynamics component into an anchored computational fluid dynamics methodology. The computational fluid dynamics component is based on an unstructured-grid, pressure-based computational fluid dynamics formulation, while the computational structural dynamics component is developed in the framework of modal analysis. Transient aeroelastic nozzle startup analyses of the Block I Space Shuttle Main Engine at sea level were performed. The computed results from the aeroelastic nozzle modeling are presented.
Computed tomography (CT) as a nondestructive test method used for composite helicopter components
NASA Astrophysics Data System (ADS)
Oster, Reinhold
1991-09-01
The first components of primary helicopter structures to be made of glass fiber reinforced plastics were the main and tail rotor blades of the Bo105 and BK 117 helicopters. These blades are now successfully produced in series. New developments in rotor components, e.g., the rotor blade technology of the Bo108 and PAH2 programs, make use of very complex fiber reinforced structures to achieve simplicity and strength. Computer tomography was found to be an outstanding nondestructive test method for examining the internal structure of components. A CT scanner generates x-ray attenuation measurements which are used to produce computer reconstructed images of any desired part of an object. The system images a range of flaws in composites in a number of views and planes. Several CT investigations and their results are reported taking composite helicopter components as an example.
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1990-01-01
Originally, computer programs for engineering design focused on detailed geometric design. Later, computer programs for algorithmically performing the preliminary design of specific well-defined classes of objects became commonplace. However, due to the need for extreme flexibility, it appears unlikely that conventional programming techniques will prove fruitful in developing computer aids for engineering conceptual design. The use of symbolic processing techniques, such as object-oriented programming and constraint propagation, facilitates such flexibility. Object-oriented programming allows programs to be organized around the objects and behavior to be simulated, rather than around fixed sequences of function- and subroutine-calls. Constraint propagation allows declarative statements to be understood as designating multi-directional mathematical relationships among all the variables of an equation, rather than as unidirectional assignments to the variable on the left-hand side of the equation, as in conventional computer programs. The research has concentrated on applying these two techniques to the development of a general-purpose computer aid for engineering conceptual design. Object-oriented programming techniques are utilized to implement a user-extensible database of design components. The mathematical relationships which model both the geometry and physics of these components are managed via constraint propagation. In addition to this component-based hierarchy, special-purpose data structures are provided for describing component interactions and supporting state-dependent parameters. In order to investigate the utility of this approach, a number of sample design problems from the field of aerospace engineering were implemented using the prototype design tool, Rubber Airplane.
The additional level of organizational structure obtained by representing design knowledge in terms of components is observed to provide greater convenience to the program user, and to result in a database of engineering information which is easier both to maintain and to extend.
Borresen, Jon; Lynch, Stephen
2012-01-01
In the 1940s, the first generation of modern computers used vacuum tube oscillators as their principal components; however, with the development of the transistor, such oscillator-based computers quickly became obsolete. As the demand for faster and lower-power computers continues, transistors are themselves approaching their theoretical limit, and emerging technologies must eventually supersede them. With the development of optical oscillators and Josephson junction technology, we are again presented with the possibility of using oscillators as the basic components of computers, and it is possible that the next generation of computers will be composed almost entirely of oscillatory devices. Here, we demonstrate how coupled threshold oscillators may be used to perform binary logic in a manner entirely consistent with modern computer architectures. We describe a variety of computational circuitry and demonstrate working oscillator models of both computation and memory.
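The mapping from threshold elements to binary logic can be caricatured in a few lines. This is only an illustration of the logic mapping under assumed static thresholds, not the authors' oscillator dynamics; all function names and threshold values here are invented:

```python
def threshold_oscillator(inputs, threshold):
    # Fires (returns 1) when the summed amplitude of the coupled input
    # oscillators reaches the threshold -- a static caricature of a
    # threshold oscillator acting as a logic element.
    return 1 if sum(inputs) >= threshold else 0

def AND(a, b):  # both inputs must be active to reach the threshold
    return threshold_oscillator([a, b], threshold=2)

def OR(a, b):   # either active input suffices
    return threshold_oscillator([a, b], threshold=1)

def NOT(a):     # a bias input plus an inhibitory (negative) coupling
    return threshold_oscillator([1, -a], threshold=1)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b), NOT(a))
```

Composing these gates gives the usual universal logic, which is the sense in which such elements are "entirely consistent with modern computer architectures."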
A Hybrid Method for Accelerated Simulation of Coulomb Collisions in a Plasma
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caflisch, R; Wang, C; Dimarco, G
2007-10-09
If the collisional time scale for Coulomb collisions is comparable to the characteristic time scales for a plasma, then simulation of Coulomb collisions may be important for computation of kinetic plasma dynamics. This can be a computational bottleneck because of the large number of simulated particles and collisions (or phase-space resolution requirements in continuum algorithms), as well as the wide range of collision rates over the velocity distribution function. This paper considers Monte Carlo simulation of Coulomb collisions using the binary collision models of Takizuka & Abe and Nanbu. It presents a hybrid method for accelerating the computation of Coulomb collisions. The hybrid method represents the velocity distribution function as a combination of a thermal component (a Maxwellian distribution) and a kinetic component (a set of discrete particles). Collisions between particles from the thermal component preserve the Maxwellian; collisions between particles from the kinetic component are performed using the method of Takizuka & Abe or Nanbu. Collisions between the kinetic and thermal components are performed by sampling a particle from the thermal component and selecting a particle from the kinetic component. Particles are also transferred between the two components according to thermalization and dethermalization probabilities, which are functions of phase space.
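The thermal/kinetic bookkeeping described above might be sketched as follows. The transfer probabilities and the 0.1 dethermalization rate below are invented placeholders (the actual method derives them from phase space), and the collision step itself is omitted:

```python
import math
import random

def thermalization_prob(v, vth):
    # Placeholder rule: particles slow relative to the thermal speed
    # are likely to be absorbed into the Maxwellian component.
    return math.exp(-(abs(v) / vth) ** 2)

def hybrid_step(kinetic, thermal_weight, vth, rng):
    # Transfer particles between the kinetic component (discrete list)
    # and the thermal component (tracked only by its total weight).
    still_kinetic = []
    for v in kinetic:
        if rng.random() < thermalization_prob(v, vth):
            thermal_weight += 1.0           # absorbed into the Maxwellian
        else:
            still_kinetic.append(v)
    # Dethermalization: occasionally emit a particle sampled from the
    # Maxwellian back into the kinetic component (rate is a placeholder).
    if thermal_weight >= 1.0 and rng.random() < 0.1:
        still_kinetic.append(rng.gauss(0.0, vth))
        thermal_weight -= 1.0
    return still_kinetic, thermal_weight

rng = random.Random(0)
kinetic = [rng.gauss(0.0, 2.0) for _ in range(1000)]
k, w = hybrid_step(kinetic, 0.0, vth=1.0, rng=rng)
print(len(k), w)
```

The point of the sketch is the invariant: total particle weight is conserved across transfers, so the Maxwellian absorbs exactly what the kinetic list gives up.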
Open Component Portability Infrastructure (OPENCPI)
2009-11-01
Disk Drive 7 1 www.antec.com P182 $120. ATX Mid Tower Computer Case 8 1 www.xilinx.com HW-V5-ML555-G $2200. Xilinx ML555 V5 Dev Kit Notes: Cost...s/ GEORGE RAMSEYER EDWARD J. JONES, Deputy Chief Work Unit Manager Advanced Computing ...uniquely positioned to meet the goals of the Software Systems Stockroom (S3) since in some sense component-based systems are computer science’s
Computational composite mechanics for aerospace propulsion structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
Specialty methods are presented for the computational simulation of specific composite behavior. These methods encompass all aspects of composite mechanics, impact, progressive fracture and component specific simulation. Some of these methods are structured to computationally simulate, in parallel, the composite behavior and history from the initial fabrication through several missions and even to fracture. Select methods and typical results obtained from such simulations are described in detail in order to demonstrate the effectiveness of computationally simulating (1) complex composite structural behavior in general and (2) specific aerospace propulsion structural components in particular.
Computational composite mechanics for aerospace propulsion structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1987-01-01
Specialty methods are presented for the computational simulation of specific composite behavior. These methods encompass all aspects of composite mechanics, impact, progressive fracture and component specific simulation. Some of these methods are structured to computationally simulate, in parallel, the composite behavior and history from the initial fabrication through several missions and even to fracture. Select methods and typical results obtained from such simulations are described in detail in order to demonstrate the effectiveness of computationally simulating: (1) complex composite structural behavior in general, and (2) specific aerospace propulsion structural components in particular.
Lithner, Delilah; Halling, Maja; Dave, Göran
2012-05-01
Electronic waste has become one of the fastest growing waste problems in the world. It contains both toxic metals and toxic organics. The aim of this study was to (1) investigate to what extent toxicants can leach from different electronic products, components, and materials into water and (2) identify which group of toxicants (metals or hydrophobic organics) is causing toxicity. Components from five discarded electronic products (cell phone, computer, phone modem, keyboard, and computer mouse) were leached in deionised water for 3 days at 23°C in concentrations of 25 g/l for metal components, 50 g/l for mixed-material components, and 100 g/l for plastic components. The water phase was tested for acute toxicity to Daphnia magna. Eighteen of 68 leachates showed toxicity (with immobility of D. magna ≥ 50% after 48 h) and came from metal or mixed-material components. The 8 most toxic leachates, with 48 h EC(50)s ranging from 0.4 to 20 g/l, came from 2 circuit sheets (keyboard), integrated drive electronics (IDE) cable clips (computer), metal studs (computer), a circuit board (computer mouse), a cord (phone modem), mixed parts (cell phone), and a circuit board (keyboard). All 5 electronic products were represented among them. Toxicity identification evaluations (with C18 and CM resin filtrations and ethylenediaminetetraacetic acid addition) indicated that metals caused the toxicity in the majority of the most toxic leachates. Overall, this study has shown that electronic waste can leach toxic compounds even during short-term leaching with pure water.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-24
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-847] Certain Electronic Devices, Including Mobile Phones and Tablet Computers, and Components Thereof; Notice of Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is...
40 CFR 86.010-2 - Definitions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... diagnostics, means verifying that a component and/or system that receives information from a control computer... maintained. In general, limp-home operation implies that a component or system is not operating properly or... cannot be erased through human interaction with the OBD system or any onboard computer. Potential...
40 CFR 86.010-2 - Definitions.
Code of Federal Regulations, 2014 CFR
2014-07-01
... diagnostics, means verifying that a component and/or system that receives information from a control computer... maintained. In general, limp-home operation implies that a component or system is not operating properly or... cannot be erased through human interaction with the OBD system or any onboard computer. Potential...
40 CFR 86.010-2 - Definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... diagnostics, means verifying that a component and/or system that receives information from a control computer... maintained. In general, limp-home operation implies that a component or system is not operating properly or... cannot be erased through human interaction with the OBD system or any onboard computer. Potential...
40 CFR 86.010-2 - Definitions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... diagnostics, means verifying that a component and/or system that receives information from a control computer... maintained. In general, limp-home operation implies that a component or system is not operating properly or... cannot be erased through human interaction with the OBD system or any onboard computer. Potential...
40 CFR 86.010-2 - Definitions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... diagnostics, means verifying that a component and/or system that receives information from a control computer... maintained. In general, limp-home operation implies that a component or system is not operating properly or... cannot be erased through human interaction with the OBD system or any onboard computer. Potential...
Computer program uses Monte Carlo techniques for statistical system performance analysis
NASA Technical Reports Server (NTRS)
Wohl, D. P.
1967-01-01
Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
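The component-to-system propagation described above can be sketched as a toy Monte Carlo loop. The system model, distributions, and parameter values below are invented for illustration and are not taken from the original program:

```python
import random
import statistics

def system_gain(c1, c2, c3):
    # Hypothetical system model: overall performance as a function of
    # three component parameters (a stand-in for the real unit model).
    return c1 * c2 - 0.1 * c3

def monte_carlo(n=10_000, seed=42):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        # Draw each component's disturbance/misalignment from its full
        # statistical distribution (here: assumed normal).
        c1 = rng.gauss(1.0, 0.05)
        c2 = rng.gauss(2.0, 0.10)
        c3 = rng.gauss(0.5, 0.02)
        samples.append(system_gain(c1, c2, c3))
    # Simulated random sampling yields unbiased estimates of the
    # system-level performance statistics.
    return statistics.mean(samples), statistics.stdev(samples)

mean, stdev = monte_carlo()
print(f"mean={mean:.3f}  stdev={stdev:.3f}")
```

Perturbing one component's distribution and re-running immediately shows that component's effect on overall system performance, which is the use case the abstract describes.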
Design component method for sensitivity analysis of built-up structures
NASA Technical Reports Server (NTRS)
Choi, Kyung K.; Seong, Hwai G.
1986-01-01
A 'design component method' that provides a unified and systematic organization of design sensitivity analysis for built-up structures is developed and implemented. Both conventional design variables, such as thickness and cross-sectional area, and shape design variables of components of built-up structures are considered. It is shown that design of components of built-up structures can be characterized and system design sensitivity expressions obtained by simply adding contributions from each component. The method leads to a systematic organization of computations for design sensitivity analysis that is similar to the way in which computations are organized within a finite element code.
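The "add contributions from each component" idea can be illustrated with a minimal finite-difference sketch. The bar-compliance model and every name below are hypothetical, chosen only to show the per-component assembly, not taken from the paper:

```python
def component_compliance(area, load=1.0, length=1.0, modulus=1.0):
    # Hypothetical bar component: compliance = P * L / (E * A).
    return load * length / (modulus * area)

def system_sensitivity(areas, h=1e-6):
    # Sensitivity of each component's compliance with respect to its
    # own design variable (cross-sectional area), assembled component
    # by component via forward finite differences.
    sens = []
    for a in areas:
        base = component_compliance(a)
        pert = component_compliance(a + h)
        sens.append((pert - base) / h)
    return sens

print(system_sensitivity([1.0, 2.0]))
```

The loop structure mirrors the paper's organizational point: the system-level sensitivity expression is obtained by iterating over components and summing (here, collecting) their individual contributions, much as a finite element code loops over elements.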
Component architecture in drug discovery informatics.
Smith, Peter M
2002-05-01
This paper reviews the characteristics of a new model of computing that has been spurred on by the Internet, known as Netcentric computing. Developments in this model led to distributed component architectures, which, although not new ideas, are now realizable with modern tools such as Enterprise Java. The application of this approach to scientific computing, particularly in pharmaceutical discovery research, is discussed and highlighted by a particular case involving the management of biological assay data.
Cut Costs with Thin Client Computing.
ERIC Educational Resources Information Center
Hartley, Patrick H.
2001-01-01
Discusses how school districts can considerably increase the number of administrative computers in their districts without a corresponding increase in costs by using the "Thin Client" component of the Total Cost of Ownership (TCO) model. TCO and Thin Client are described, including its software and hardware components. An example of a…
19 CFR 10.14 - Fabricated components subject to the exemption.
Code of Federal Regulations, 2012 CFR
2012-04-01
....14 Section 10.14 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY... assembly for a computer is assembled in the United States by soldering American-made and foreign-made... electronic function and is ready for incorporation into the computer. The foreign-made components have...
19 CFR 10.14 - Fabricated components subject to the exemption.
Code of Federal Regulations, 2014 CFR
2014-04-01
....14 Section 10.14 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY... assembly for a computer is assembled in the United States by soldering American-made and foreign-made... electronic function and is ready for incorporation into the computer. The foreign-made components have...
19 CFR 10.14 - Fabricated components subject to the exemption.
Code of Federal Regulations, 2013 CFR
2013-04-01
....14 Section 10.14 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY... assembly for a computer is assembled in the United States by soldering American-made and foreign-made... electronic function and is ready for incorporation into the computer. The foreign-made components have...
19 CFR 10.14 - Fabricated components subject to the exemption.
Code of Federal Regulations, 2011 CFR
2011-04-01
....14 Section 10.14 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY... assembly for a computer is assembled in the United States by soldering American-made and foreign-made... electronic function and is ready for incorporation into the computer. The foreign-made components have...
Distributed Training for the Reserve Component: Instructor Handbook for Computer Conferencing.
ERIC Educational Resources Information Center
Harbour, J.; And Others
The purpose of this handbook is to provide background and teaching recommendations for instructors who will be remotely conducting Reserve Component training using asynchronous computer conferencing techniques. The recommendations in this handbook are based on an international review of the literature in distance learning and experience gained…
EON: a component-based approach to automation of protocol-directed therapy.
Musen, M A; Tu, S W; Das, A K; Shahar, Y
1996-01-01
Provision of automated support for planning protocol-directed therapy requires a computer program to take as input clinical data stored in an electronic patient-record system and to generate as output recommendations for therapeutic interventions and laboratory testing that are defined by applicable protocols. This paper presents a synthesis of research carried out at Stanford University to model the therapy-planning task and to demonstrate a component-based architecture for building protocol-based decision-support systems. We have constructed general-purpose software components that (1) interpret abstract protocol specifications to construct appropriate patient-specific treatment plans; (2) infer from time-stamped patient data higher-level, interval-based, abstract concepts; (3) perform time-oriented queries on a time-oriented patient database; and (4) allow acquisition and maintenance of protocol knowledge in a manner that facilitates efficient processing both by humans and by computers. We have implemented these components in a computer system known as EON. Each of the components has been developed, evaluated, and reported independently. We have evaluated the integration of the components as a composite architecture by implementing T-HELPER, a computer-based patient-record system that uses EON to offer advice regarding the management of patients who are following clinical trial protocols for AIDS or HIV infection. A test of the reuse of the software components in a different clinical domain demonstrated rapid development of a prototype application to support protocol-based care of patients who have breast cancer. PMID:8930854
HYSEP: A Computer Program for Streamflow Hydrograph Separation and Analysis
Sloto, Ronald A.; Crouse, Michele Y.
1996-01-01
HYSEP is a computer program that can be used to separate a streamflow hydrograph into base-flow and surface-runoff components. The base-flow component has traditionally been associated with ground-water discharge and the surface-runoff component with precipitation that enters the stream as overland runoff. HYSEP includes three methods of hydrograph separation that are referred to in the literature as the fixed-interval, sliding-interval, and local-minimum methods. The program also describes the frequency and duration of measured streamflow and computed base flow and surface runoff. Daily mean stream discharge is used as input to the program in either an American Standard Code for Information Interchange (ASCII) or binary format. Output from the program includes tables, graphs, and data files. Graphical output may be plotted on the computer screen or output to a printer, plotter, or metafile.
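A simplified sketch of the local-minimum method named above: find days that are minima of a window centered on them, then connect those minima by straight lines to estimate base flow. HYSEP's actual interval selection (derived from drainage area) and edge handling differ; this is illustrative only:

```python
def local_minimum_baseflow(q, interval):
    # q: daily mean discharge; interval: odd window width in days.
    half = interval // 2
    n = len(q)
    # Days whose discharge is the minimum of the window centered on them.
    mins = [i for i in range(n)
            if q[i] == min(q[max(0, i - half):i + half + 1])]
    base = [0.0] * n
    # Linear interpolation between successive local minima.
    for a, b in zip(mins, mins[1:]):
        for i in range(a, b + 1):
            t = (i - a) / (b - a)
            base[i] = q[a] + t * (q[b] - q[a])
    for i in range(mins[0]):           # before the first local minimum
        base[i] = q[mins[0]]
    for i in range(mins[-1], n):       # after the last local minimum
        base[i] = q[mins[-1]]
    # Base flow cannot exceed total streamflow.
    return [min(b_i, q_i) for b_i, q_i in zip(base, q)]

flow = [5, 3, 4, 10, 6, 2, 3, 8]
base = local_minimum_baseflow(flow, interval=3)
runoff = [f - b for f, b in zip(flow, base)]
print(base)
```

Subtracting the interpolated base-flow line from total discharge yields the surface-runoff component, which is exactly the separation the program reports.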
NASA Astrophysics Data System (ADS)
Shimobaba, Tomoyoshi; Nagahama, Yuki; Kakue, Takashi; Takada, Naoki; Okada, Naohisa; Endo, Yutaka; Hirayama, Ryuji; Hiyama, Daisuke; Ito, Tomoyoshi
2014-02-01
A calculation reduction method for color digital holography (DH) and computer-generated holograms (CGHs) using color space conversion is reported. Color DH and color CGHs are generally calculated in RGB space. We calculate color DH and CGHs in other color spaces (e.g., YCbCr color space) to accelerate the calculation. In YCbCr color space, an RGB image or RGB hologram is converted to a luminance component (Y), a blue-difference chroma component (Cb), and a red-difference chroma component (Cr). The human eye readily perceives small differences in the luminance component but is far less sensitive to differences in the chroma components. In this method, therefore, the luminance component is sampled at full resolution while the chroma components are down-sampled. The down-sampling allows us to accelerate the calculation of the color DH and CGHs. We compute diffraction calculations from the components and then convert the diffracted results from YCbCr color space back to RGB color space. The proposed method, which in theory can accelerate the calculations by up to a factor of 3, computes more than two times faster than the equivalent calculation in RGB color space.
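The color-space and sampling idea can be sketched as follows, using BT.601-style conversion coefficients. The toy "image row" and function names are illustrative, and the diffraction calculation itself is omitted:

```python
def rgb_to_ycbcr(r, g, b):
    # ITU-R BT.601-style conversion (full-range, floating point).
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.564 * (b - y)    # blue-difference chroma
    cr = 0.713 * (r - y)    # red-difference chroma
    return y, cb, cr

def downsample2(xs):
    # Keep every other sample: a 1-D stand-in for chroma subsampling.
    return xs[::2]

# Toy image row: luminance kept at full resolution,
# chroma reduced to half resolution before further processing.
row = [(255, 0, 0), (250, 5, 5), (0, 0, 255), (5, 5, 250)]
ycbcr = [rgb_to_ycbcr(*p) for p in row]
y_full = [v[0] for v in ycbcr]
cb_half = downsample2([v[1] for v in ycbcr])
cr_half = downsample2([v[2] for v in ycbcr])
print(len(y_full), len(cb_half))  # 4 2
```

With 2x down-sampling in both image dimensions, each chroma plane carries a quarter of the samples, which is where the theoretical factor-of-3 reduction (3 full planes versus 1 + 1/4 + 1/4 planes) comes from.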
NASA Tech Briefs, June 2000. Volume 24, No. 6
NASA Technical Reports Server (NTRS)
2000-01-01
Topics include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Test and Measurement; Physical Sciences; Materials; Computer Programs; Computers and Peripherals;
Navier-Stokes Computations With One-Equation Turbulence Model for Flows Along Concave Wall Surfaces
NASA Technical Reports Server (NTRS)
Wang, Chi R.
2005-01-01
This report presents the use of a time-marching three-dimensional compressible Navier-Stokes equation numerical solver with a one-equation turbulence model to simulate the flow fields developed along concave wall surfaces without and with a downstream extension flat wall surface. The 3-D Navier-Stokes numerical solver came from the NASA Glenn-HT code. The one-equation turbulence model was derived from the Spalart and Allmaras model. The computational approach was first calibrated with the computations of the velocity and Reynolds shear stress profiles of a steady flat plate boundary layer flow. The computational approach was then used to simulate developing boundary layer flows along concave wall surfaces without and with a downstream extension wall. The author investigated the computational results of surface friction factors, near surface velocity components, near wall temperatures, and a turbulent shear stress component in terms of turbulence modeling, computational mesh configurations, inlet turbulence level, and time iteration step. The computational results were compared with existing measurements of skin friction factors, velocity components, and shear stresses of the developing boundary layer flows. With a fine computational mesh and a one-equation model, the computational approach could predict accurately the skin friction factors, near surface velocity and temperature, and shear stress within the flows. The computed velocity components and shear stresses also showed the vortices effect on the velocity variations over a concave wall. The computed eddy viscosities at the near wall locations were also compared with the results from a two-equation turbulence modeling technique. The inlet turbulence length scale was found to have little effect on the eddy viscosities at locations near the concave wall surface. The eddy viscosities, from the one-equation and two-equation modeling, were comparable at most stream-wise stations.
The present one-equation turbulence model is an effective approach for turbulence modeling in the near solid wall surface region of flow over a concave wall.
NASA Astrophysics Data System (ADS)
Ha, J.; Chung, W.; Shin, S.
2015-12-01
Many waveform inversion algorithms have been proposed in order to construct subsurface velocity structures from seismic data sets. These algorithms have suffered from computational burden, local minima problems, and the lack of low-frequency components. Computational efficiency can be improved by the application of back-propagation techniques and advances in computing hardware. In addition, waveform inversion algorithms for obtaining long-wavelength velocity models could avoid both the local minima problem and the effect of the lack of low-frequency components in seismic data. In this study, we proposed spectrogram inversion as a technique for recovering long-wavelength velocity models. In spectrogram inversion, decomposed frequency components from spectrograms of traces, in the observed and calculated data, are utilized to generate traces with reproduced low-frequency components. Moreover, since each decomposed component can reveal different characteristics of a subsurface structure, several frequency components were utilized to analyze the velocity features in the subsurface. We performed the spectrogram inversion using a modified SEG/EAGE salt A-A' line. Numerical results demonstrate that spectrogram inversion could also recover the long-wavelength velocity features. However, inversion results varied according to the frequency components utilized. Based on the results of inversion using a decomposed single-frequency component, we noticed that robust inversion results are obtained when a dominant frequency component of the spectrogram was utilized. In addition, detailed information on recovered long-wavelength velocity models was obtained using a multi-frequency component combined with single-frequency components. Numerical examples indicate that various detailed analyses of long-wavelength velocity models can be carried out utilizing several frequency components.
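The decomposition step described above can be sketched with a standard spectrogram: compute the time-frequency representation of a trace, pick the dominant frequency bin, and take its time envelope. This is a minimal illustration on synthetic data, not the authors' inversion code; the signal parameters are invented.

```python
import numpy as np
from scipy.signal import spectrogram

def dominant_component_trace(trace, fs, nperseg=64):
    """Decompose a trace via its spectrogram and return the dominant
    frequency and that component's time envelope (illustrative only)."""
    f, t, Sxx = spectrogram(trace, fs=fs, nperseg=nperseg)
    # Pick the frequency bin carrying the most total energy.
    k = np.argmax(Sxx.sum(axis=1))
    return f[k], t, Sxx[k]

# Synthetic "trace": a 5 Hz oscillation in noise, 4 s at 100 Hz sampling
rng = np.random.default_rng(0)
fs = 100.0
ts = np.arange(0, 4, 1 / fs)
trace = np.sin(2 * np.pi * 5 * ts) + 0.1 * rng.standard_normal(ts.size)

f_dom, t_env, env = dominant_component_trace(trace, fs)
```

With `nperseg=64` the frequency resolution is fs/64 ≈ 1.6 Hz, so the dominant bin lands near the 5 Hz content of the synthetic trace.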
DOE Office of Scientific and Technical Information (OSTI.GOV)
Slavic, I.; Draskovic, R.; Tasovac, T.
1973-03-01
A computer program for the determination of trace elements in components of the water systems (bed material, suspended material, dissolved substances, plankton, algae) by nondestructive activation analysis was developed. Results of the determination of Cr, Sb, Sc, Fe, Co, Na, and La concentrations in suspended materials from the Danube river, obtained by interpretation of data with a CDC-3600 computer (64 k words), are presented. (auth)
Computational Toxicology as Implemented by the US EPA ...
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and to contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the T
A Position on a Computer Literacy Course.
ERIC Educational Resources Information Center
Self, Charles C.
A position is put forth on the appropriate content of a computer literacy course and the role of computer literacy in the community college. First, various definitions of computer literacy are examined, including the programming, computer awareness, and comprehensive approaches. Next, five essential components of a computer literacy course are…
Computing Lives And Reliabilities Of Turboprop Transmissions
NASA Technical Reports Server (NTRS)
Coy, J. J.; Savage, M.; Radil, K. C.; Lewicki, D. G.
1991-01-01
Computer program PSHFT calculates lifetimes of variety of aircraft transmissions. Consists of main program, series of subroutines applying to specific configurations, generic subroutines for analysis of properties of components, subroutines for analysis of system, and common block. Main program selects routines used in analysis and causes them to operate in desired sequence. Series of configuration-specific subroutines put in configuration data, perform force and life analyses for components (with help of generic component-property-analysis subroutines), fill property array, call up system-analysis routines, and finally print out results of analysis for system and components. Written in FORTRAN 77(IV).
Design of microstrip components by computer
NASA Technical Reports Server (NTRS)
Cisco, T. C.
1972-01-01
A number of computer programs are presented for use in the synthesis of microwave components in microstrip geometries. The programs compute the electrical and dimensional parameters required to synthesize couplers, filters, circulators, transformers, power splitters, diode switches, multipliers, diode attenuators and phase shifters. Additional programs are included to analyze and optimize cascaded transmission lines and lumped element networks, to analyze and synthesize Chebyshev and Butterworth filter prototypes, and to compute mixer intermodulation products. The programs are written in FORTRAN and the emphasis of the study is placed on the use of these programs and not on the theoretical aspects of the structures.
Rosenthal, L E
1986-10-01
Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.
van der List, Jelle P; Chawla, Harshvardhan; Joskowicz, Leo; Pearle, Andrew D
2016-11-01
Recently, there has been growing interest in surgical variables that are intraoperatively controlled by orthopaedic surgeons, including lower leg alignment, component positioning and soft tissue balancing. Since tighter control over these factors is associated with improved outcomes of unicompartmental knee arthroplasty and total knee arthroplasty (TKA), several computer navigation and robotic-assisted systems have been developed. Although mechanical axis accuracy and component positioning have been shown to improve with computer navigation, no superiority in functional outcomes has yet been shown. This could be explained by the fact that many differences exist between the number and type of surgical variables these systems control. Most systems control lower leg alignment and component positioning, while some in addition control soft tissue balancing. Finally, robotic-assisted systems have the additional advantage of improving surgical precision. A systematic search in PubMed, Embase and Cochrane Library resulted in 40 comparative studies and three registries on computer navigation reporting outcomes of 474,197 patients, and 21 basic science and clinical studies on robotic-assisted knee arthroplasty. Twenty-eight of these comparative computer navigation studies reported Knee Society Total scores in 3504 patients. Stratifying by type of surgical variables, no significant differences were noted in outcomes between computer-navigated TKA controlling for alignment and component positioning versus conventional TKA (p = 0.63). However, significantly better outcomes were noted following computer-navigated TKA that also controlled for soft tissue balancing versus conventional TKA (mean difference 4.84, 95% Confidence Interval 1.61 to 8.07, p = 0.003). A literature review of robotic systems showed that these systems can, similarly to computer navigation, reliably improve lower leg alignment, component positioning and soft tissue balancing.
Furthermore, two studies comparing robotic-assisted with computer-navigated surgery reported superiority of robotic-assisted surgery in controlling these factors. Manually controlling all these surgical variables can be difficult for the orthopaedic surgeon. Findings in this study suggest that computer navigation or robotic assistance may help manage these multiple variables and could improve outcomes. Future studies assessing the role of soft tissue balancing in knee arthroplasty and long-term follow-up studies assessing the role of computer-navigated and robotic-assisted knee arthroplasty are needed.
Video Analysis of Projectile Motion Using Tablet Computers as Experimental Tools
ERIC Educational Resources Information Center
Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.
2014-01-01
Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and "g" in order to explore the underlying laws of motion. This experiment…
This article describes the governing equations, computational algorithms, and other components entering into the Community Multiscale Air Quality (CMAQ) modeling system. This system has been designed to approach air quality as a whole by including state-of-the-science capabiliti...
Towards a Theory-Based Design Framework for an Effective E-Learning Computer Programming Course
ERIC Educational Resources Information Center
McGowan, Ian S.
2016-01-01
Built on Dabbagh (2005), this paper presents a four component theory-based design framework for an e-learning session in introductory computer programming. The framework, driven by a body of exemplars component, emphasizes the transformative interaction between the knowledge building community (KBC) pedagogical model, a mixed instructional…
ERIC Educational Resources Information Center
Hahn, H. A.; And Others
The purpose of this handbook is to provide background and guidelines for course designers and instructional developers who will be developing Reserve Component training for the United States military using asynchronous computer conferencing techniques. The recommendations in this report are based on an international review of the literature in…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-08
... Communication Devices, Portable Music and Data Processing Devices, Computers and Components Thereof; Notice of... within the United States after importation of certain wireless communication devices, portable music and... music and data processing devices, computers and components thereof that infringe one or more of claim...
The large scale microelectronics Computer-Aided Design and Test (CADAT) system
NASA Technical Reports Server (NTRS)
Gould, J. M.
1978-01-01
The CADAT system consists of a number of computer programs written in FORTRAN that provide the capability to simulate, lay out, analyze, and create the artwork for large scale microelectronics. The function of each software component of the system is described with references to specific documentation for each software component.
Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing
NASA Astrophysics Data System (ADS)
Klems, Markus; Nimis, Jens; Tai, Stefan
On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability of Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measuring the costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and in comparing these costs to those of conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.
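The kind of comparison such a framework supports can be illustrated with a toy calculation: a pay-per-use cost model (charges scale with actual usage) against an amortized conventional-IT model. Every figure below is an invented placeholder, not a real provider rate or the paper's own model.

```python
def monthly_cloud_cost(hours_used, rate_per_hour, storage_gb, rate_per_gb):
    """Pay-per-use model: cost scales with actual service usage."""
    return hours_used * rate_per_hour + storage_gb * rate_per_gb

def monthly_onprem_cost(capex, amortization_months, opex_per_month):
    """Conventional IT: amortized hardware purchase plus fixed operations."""
    return capex / amortization_months + opex_per_month

# Invented placeholder figures for a lightly-used workload
cloud = monthly_cloud_cost(hours_used=300, rate_per_hour=0.10,
                           storage_gb=500, rate_per_gb=0.02)
onprem = monthly_onprem_cost(capex=12000, amortization_months=36,
                             opex_per_month=150)
```

For an intermittently used workload the usage-proportional model wins; a decision framework like the one proposed would add benefit-side and risk components on top of this cost skeleton.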
Conceptual model of iCAL4LA: Proposing the components using comparative analysis
NASA Astrophysics Data System (ADS)
Ahmad, Siti Zulaiha; Mutalib, Ariffin Abdul
2016-08-01
This paper discusses an on-going study that initiates the process of determining the common components for a conceptual model of interactive computer-assisted learning specifically designed for low-achieving children. This group of children needs specific learning support that can be used as an alternative learning material in their learning environment. In order to develop the conceptual model, this study extracts the common components from 15 strongly justified computer-assisted learning studies. A comparative analysis was conducted to determine the most appropriate components, using a set of specific indication classifications to prioritize their applicability. The results of the extraction process reveal 17 common components for consideration. Later, based on scientific justifications, 16 of them were selected as the proposed components for the model.
Computational electronics and electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shang, C C
The Computational Electronics and Electromagnetics thrust area serves as the focal point for Engineering R and D activities for developing computer-based design and analysis tools. Representative applications include design of particle accelerator cells and beamline components; design of transmission line components; engineering analysis and design of high-power (optical and microwave) components; photonics and optoelectronics circuit design; electromagnetic susceptibility analysis; and antenna synthesis. The FY-97 effort focuses on development and validation of (1) accelerator design codes; (2) 3-D massively parallel, time-dependent EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; and (5) development of beam control algorithms coupled to beam transport physics codes. These efforts are in association with technology development in the power conversion, nondestructive evaluation, and microtechnology areas. The efforts complement technology development in Lawrence Livermore National Laboratory programs.
75 FR 25185 - Broadband Initiatives Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-07
..., excluding desktop or laptop computers, computer hardware and software (including anti-virus, anti-spyware, and other security software), audio or video equipment, computer network components... 10 desktop or laptop computers and individual workstations to be located within the rural library...
NASA Technical Reports Server (NTRS)
Muszynska, A.
1985-01-01
In rotating machinery dynamics, an orbit (Lissajous curve) represents the dynamic path of the shaft centerline motion during shaft rotation and the resulting precession. The orbit can be observed with an oscilloscope connected to XY proximity probes. The orbits can also be simulated by a computer. The software for an HP computer simulates orbits for two cases: (1) a symmetric orbit with four frequency components with different radial amplitudes and relative phase angles; and (2) a nonsymmetric orbit with two frequency components with two different vertical/horizontal amplitudes and two different relative phase angles. Each orbit carries a Keyphasor mark (one-per-turn reference). The frequencies, amplitudes, and phase angles, as well as the number of time steps for orbit computation, are chosen and entered by the user. The orbit graphs can be observed on the computer screen.
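The symmetric-orbit case (several frequency components, each with its own radial amplitude and relative phase) can be sketched in a few lines. This is a modern stand-in for the HP program described above, with illustrative parameter values, not the original code.

```python
import numpy as np

def orbit(freqs, amps, phases, n_steps=512, turns=1.0):
    """Symmetric orbit: each frequency component contributes a circular
    motion of given radial amplitude and relative phase angle."""
    t = np.linspace(0.0, turns, n_steps, endpoint=False)  # shaft revolutions
    x = np.zeros(n_steps)
    y = np.zeros(n_steps)
    for f, a, p in zip(freqs, amps, phases):
        x += a * np.cos(2 * np.pi * f * t + p)
        y += a * np.sin(2 * np.pi * f * t + p)
    # Plot y against x for the Lissajous curve; the sample at t = 0
    # plays the role of the one-per-turn Keyphasor mark.
    return x, y

# Four components with illustrative amplitudes and phases (orders 1X-4X)
x, y = orbit(freqs=[1, 2, 3, 4],
             amps=[1.0, 0.5, 0.25, 0.1],
             phases=[0.0, 0.5, 1.0, 1.5])
```

The total radial excursion is bounded by the sum of the component amplitudes, which is a quick sanity check on the output.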
NASA Technical Reports Server (NTRS)
1992-01-01
The technical effort and computer code developed during the first year are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA is described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis.
COMCAN: a computer program for common cause analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burdick, G.R.; Marshall, N.H.; Wilson, J.R.
1976-05-01
The computer program, COMCAN, searches the fault tree minimal cut sets for shared susceptibility to various secondary events (common causes) and common links between components. In the case of common causes, a location check may also be performed by COMCAN to determine whether barriers to the common cause exist between components. The program can locate common manufacturers of components having events in the same minimal cut set. A relative ranking scheme for secondary event susceptibility is included in the program.
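The common-manufacturer search can be illustrated on toy data: for each minimal cut set, check whether all of its components share the same value of some attribute. The component names and attributes below are invented for illustration; this is a sketch of the idea, not COMCAN itself.

```python
# Hypothetical minimal cut sets and per-component attributes
cut_sets = [{"P1", "P2"}, {"P1", "V3"}, {"V3", "V4"}]
attrs = {
    "P1": {"manufacturer": "AcmePumps", "location": "room-A"},
    "P2": {"manufacturer": "AcmePumps", "location": "room-B"},
    "V3": {"manufacturer": "ValveCo",   "location": "room-A"},
    "V4": {"manufacturer": "ValveCo",   "location": "room-A"},
}

def common_cause_candidates(cut_sets, attrs, key):
    """Return (cut set, shared value) pairs where every component in the
    cut set has the same value of `key` (e.g. manufacturer or location),
    flagging a potential common-cause susceptibility."""
    hits = []
    for cs in cut_sets:
        values = {attrs[c][key] for c in cs}
        if len(values) == 1:
            hits.append((cs, values.pop()))
    return hits

same_mfr = common_cause_candidates(cut_sets, attrs, "manufacturer")
```

Running the same search with `key="location"` would implement the location/barrier check in the same way.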
2014-08-01
A High Performance Computing Approach to the Simulation of Fluid-Solid Interaction Problems with Rigid and Flexible Components
Arman Pazouki; Radu Serban; Dan Negrut
Keywords: high performance computing, smoothed particle hydrodynamics, rigid body dynamics, flexible body dynamics
This work outlines a unified... are implemented to model rigid and flexible multibody dynamics. The two-way coupling of the fluid and solid phases is supported through use of
Prabhakar, P.; Sames, William J.; Dehoff, Ryan R.; ...
2015-03-28
A computational modeling approach to simulate residual stress formation during the electron beam melting (EBM) process within the additive manufacturing (AM) technologies for Inconel 718 is presented in this paper. The EBM process has demonstrated a high potential to fabricate components with complex geometries, but the resulting components are influenced by the thermal cycles observed during the manufacturing process. When processing nickel-based superalloys, very high temperatures (approx. 1000 °C) are observed in the powder bed, base plate, and build. These high temperatures, when combined with substrate adherence, can result in warping of the base plate and affect the final component by causing defects. It is important to have an understanding of the thermo-mechanical response of the entire system, that is, its mechanical behavior under the thermal loading occurring during the EBM process, prior to manufacturing a component. Therefore, computational models to predict the response of the system during the EBM process will aid in eliminating the undesired process conditions, a priori, in order to fabricate the optimum component. Such a comprehensive computational modeling approach is demonstrated to analyze warping of the base plate, stress and plastic strain accumulation within the material, and thermal cycles in the system during different stages of the EBM process.
The Student-Teacher-Computer Team: Focus on the Computer.
ERIC Educational Resources Information Center
Ontario Inst. for Studies in Education, Toronto.
Descriptions of essential computer elements, logic and programing techniques, and computer applications are provided in an introductory handbook for use by educators and students. Following a brief historical perspective, the organization of a computer system is schematically illustrated, functions of components are explained in non-technical…
RANDOM MATRIX DIAGONALIZATION--A COMPUTER PROGRAM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fuchel, K.; Greibach, R.J.; Porter, C.E.
A computer program is described which generates random matrices, diagonalizes them, and appropriately sorts the resulting eigenvalues and eigenvector components. FAP and FORTRAN listings for the IBM 7090 computer are included. (auth)
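A present-day equivalent of that procedure (generate a random symmetric matrix, diagonalize it, sort the eigenpairs) might look like the following sketch; it is an illustration, not the original FAP/FORTRAN code.

```python
import numpy as np

def random_symmetric_eigen(n, seed=0):
    """Generate a random real symmetric matrix, diagonalize it, and
    return eigenvalues in ascending order with matching eigenvectors."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((n, n))
    h = (a + a.T) / 2.0           # symmetrize so eigenvalues are real
    w, v = np.linalg.eigh(h)      # eigh returns sorted eigenvalues
    return h, w, v

h, w, v = random_symmetric_eigen(5)
```

Each column `v[:, i]` is the eigenvector belonging to `w[i]`, so `h @ v` equals `v` with its columns scaled by the eigenvalues.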
Impact design methods for ceramic components in gas turbine engines
NASA Technical Reports Server (NTRS)
Song, J.; Cuccio, J.; Kington, H.
1991-01-01
Methods currently under development to design ceramic turbine components with improved impact resistance are presented. Two different modes of impact damage are identified and characterized, i.e., structural damage and local damage. The entire computation is incorporated into the EPIC computer code. Model capability is demonstrated by simulating instrumented plate impact and particle impact tests.
Automated Grading of Rough Hardwood Lumber
Richard W. Conners; Tai-Hoon Cho; Philip A. Araman
1989-01-01
Any automatic hardwood grading system must have two components. The first of these is a computer vision system for locating and identifying defects on rough lumber. The second is a system for automatically grading boards based on the output of the computer vision system. This paper presents research results aimed at developing the first of these components. The...
Code of Federal Regulations, 2011 CFR
2011-01-01
... interest rate and foreign exchange rate contracts are computed on the basis of the credit equivalent amounts of such contracts. Credit equivalent amounts are computed for each of the following off-balance... Equivalent Amounts a. The minimum capital components for interest rate and foreign exchange rate contracts...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huff, Kathryn D.
Component level and system level abstraction of detailed computational geologic repository models have resulted in four rapid computational models of hydrologic radionuclide transport at varying levels of detail. Those models are described, as is their implementation in Cyder, a software library of interchangeable radionuclide transport models appropriate for representing natural and engineered barrier components of generic geologic repository concepts. A proof of principle demonstration was also conducted in which these models were used to represent the natural and engineered barrier components of a repository concept in a reducing, homogeneous, generic geology. This base case demonstrates integration of the Cyder open source library with the Cyclus computational fuel cycle systems analysis platform to facilitate calculation of repository performance metrics with respect to fuel cycle choices. (authors)
Lo, Ming; Hue, Chih-Wei
2008-11-01
The Character-Component Analysis Toolkit (C-CAT) software was designed to assist researchers in constructing experimental materials using traditional Chinese characters. The software package contains two sets of character stocks: one suitable for research using literate adults as subjects and one suitable for research using schoolchildren as subjects. The software can identify linguistic properties, such as the number of strokes contained, the character-component pronunciation regularity, and the arrangement of character components within a character. Moreover, it can compute a character's linguistic frequency, neighborhood size, and phonetic validity with respect to a user-selected character stock. It can also search the selected character stock for similar characters or for character components with user-specified linguistic properties.
Computational model for fuel component supply into a combustion chamber of LRE
NASA Astrophysics Data System (ADS)
Teterev, A. V.; Mandrik, P. A.; Rudak, L. V.; Misyuchenko, N. I.
2017-12-01
A 2D-3D computational model for calculating a flow inside jet injectors that feed fuel components to a combustion chamber of a liquid rocket engine is described. The model is based on the gasdynamic calculation of compressible medium. Model software provides calculation of both one- and two-component injectors. Flow simulation in two-component injectors is realized using the scheme of separate supply of “gas-gas” or “gas-liquid” fuel components. An algorithm for converting a continuous liquid medium into a “cloud” of drops is described. Application areas of the developed model and the results of 2D simulation of injectors to obtain correction factors in the calculation formulas for fuel supply are discussed.
NASA Technical Reports Server (NTRS)
Wilson, R. B.; Banerjee, P. K.
1987-01-01
This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Sections Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of computer codes that permit more accurate and efficient three-dimensional analyses of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components.
Galaxy Makers Exhibition: Re-engagement, Evaluation and Content Legacy through an Online Component
NASA Astrophysics Data System (ADS)
Borrow, J.; Harrison, C.
2017-09-01
For the Royal Society Summer Science Exhibition 2016, Durham University's Institute of Computational Cosmology created the Galaxy Makers exhibit to communicate our computational cosmology and astronomy research. In addition to the physical exhibit we created an online component to foster re-engagement, create a permanent home for our content and allow us to collect important information about participation and impact. Here we summarise the details of the exhibit and the degree of success attached to the online component. We also share suggestions for further uses and improvements that could be implemented for the online components of other science exhibitions.
Periodic component analysis as a spatial filter for SSVEP-based brain-computer interface.
Kiran Kumar, G R; Reddy, M Ramasubba
2018-06-08
Traditional spatial filters used for steady-state visual evoked potential (SSVEP) extraction, such as minimum energy combination (MEC), require estimation of the background electroencephalogram (EEG) noise components. Even though this leads to improved performance in low signal-to-noise ratio (SNR) conditions, the additional computational cost makes such algorithms slow compared to standard detection methods like canonical correlation analysis (CCA). In this paper, periodic component analysis (πCA) is presented as an alternative spatial filtering approach that extracts the SSVEP component effectively without extensive modelling of the noise. πCA can separate out components corresponding to a given frequency of interest from the background EEG by capturing temporal information, and it does not generalize SSVEP based on rigid templates. Data from ten test subjects were used to evaluate the proposed method, and the results demonstrate that periodic component analysis acts as a reliable spatial filter for SSVEP extraction. Statistical tests were performed to validate the results. The experiments show that πCA provides a significant improvement in accuracy compared to standard CCA and MEC in low SNR conditions, and overall better detection accuracy than CCA with accuracy on par with that of MEC at a lower computational cost. Hence πCA is a reliable and efficient alternative detection algorithm for SSVEP-based brain-computer interfaces (BCI).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swindeman, M. J.; Jetter, R. I.; Sham, T. -L.
One of the objectives of the high temperature design methodology activities is to develop and validate both improvements and the basic features of ASME Boiler and Pressure Vessel Code, Section III, Rules for Construction of Nuclear Facility Components, Division 5, High Temperature Reactors, Subsection HB, Subpart B (HBB). The overall scope of this task is to develop a computer program to aid assessment procedures of components under specified loading conditions in accordance with the elevated temperature design requirements for Division 5 Class A components. There are many features and alternative paths of varying complexity in HBB. The initial focus of this computer program is a basic path through the various options for a single reference material, 316H stainless steel. However, the computer program is being structured for eventual incorporation of all of the features and permitted materials of HBB. This report will first provide a description of the overall computer program, particular challenges in developing numerical procedures for the assessment, and an overall approach to computer program development. This is followed by a more comprehensive appendix, which is the draft computer program manual for the program development. The strain limits rules have been implemented in the computer program. The evaluation of creep-fatigue damage will be implemented in future work scope.
ERIC Educational Resources Information Center
Taylor, Jack A.
1983-01-01
Peripheral components for music instruction include a music keyboard, a digital music synthesizer, and music listening devices. Computers can teach sight-singing, playing an instrument, dictation, and composition. Computer programs should be interactive with students. (KC)
NASA Tech Briefs, February 2000. Volume 24, No. 2
NASA Technical Reports Server (NTRS)
2000-01-01
Topics covered include: Test and Measurement; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Bio-Medical; Mathematics and Information Sciences; Computers and Peripherals.
NASA Astrophysics Data System (ADS)
Biermann, D.; Gausemeier, J.; Heim, H.-P.; Hess, S.; Petersen, M.; Ries, A.; Wagner, T.
2014-05-01
In this contribution a framework for the computer-aided planning and optimisation of functionally graded components is presented. The framework is divided into three modules: the "Component Description", the "Expert System" for the synthesis of several process chains, and the "Modelling and Process Chain Optimisation". The Component Description module enhances a standard computer-aided design (CAD) model with a voxel-based representation of the graded properties. The Expert System synthesises process steps stored in the knowledge base to generate several alternative process chains. Each process chain is capable of producing components according to the enhanced CAD model and usually consists of a sequence of heating, cooling, and forming processes. The dependencies between the component and the applied manufacturing processes, as well as between the processes themselves, need to be considered. The Expert System utilises an ontology for that purpose. The ontology represents all dependencies in a structured way and connects the information of the knowledge base via relations. The third module performs the evaluation of the generated process chains. To accomplish this, the parameters of each process are optimised with respect to the component specification, and the result of the best parameterisation is used as the representative value. Finally, the process chain that is capable of manufacturing a functionally graded component in an optimal way with regard to the property distributions of the component description is presented by means of a dedicated specification technique.
NASA Technical Reports Server (NTRS)
Cole, Gary L.; Richard, Jacques C.
1991-01-01
An approach to simulating the internal flows of supersonic propulsion systems is presented. The approach is based on a fairly simple modification of the Large Perturbation Inlet (LAPIN) computer code. LAPIN uses a quasi-one dimensional, inviscid, unsteady formulation of the continuity, momentum, and energy equations. The equations are solved using a shock capturing, finite difference algorithm. The original code, developed for simulating supersonic inlets, includes engineering models of unstart/restart, bleed, bypass, and variable duct geometry, by means of source terms in the equations. The source terms also provide a mechanism for incorporating, with the inlet, propulsion system components such as compressor stages, combustors, and turbine stages. This requires each component to be distributed axially over a number of grid points. Because of the distributed nature of such components, this representation should be more accurate than a lumped parameter model. Components can be modeled by performance map(s), which in turn are used to compute the source terms. The general approach is described. Then, simulation of a compressor/fan stage is discussed to show the approach in detail.
Kairov, Ulykbek; Cantini, Laura; Greco, Alessandro; Molkenov, Askhat; Czerwinska, Urszula; Barillot, Emmanuel; Zinovyev, Andrei
2017-09-11
Independent Component Analysis (ICA) is a method that models gene expression data as an action of a set of statistically independent hidden factors. The output of ICA depends on a fundamental parameter: the number of components (factors) to compute. The optimal choice of this parameter, related to determining the effective data dimension, remains an open question in the application of blind source separation techniques to transcriptomic data. Here we address the question of optimizing the number of statistically independent components in the analysis of transcriptomic data for reproducibility of the components in multiple runs of ICA (within the same or within varying effective dimensions) and in multiple independent datasets. To this end, we introduce ranking of independent components based on their stability in multiple ICA computation runs and define a distinguished number of components (Most Stable Transcriptome Dimension, MSTD) corresponding to the point of the qualitative change of the stability profile. Based on a large body of data, we demonstrate that a sufficient number of dimensions is required for biological interpretability of the ICA decomposition and that the most stable components with ranks below MSTD have more chances to be reproduced in independent studies compared to the less stable ones. At the same time, we show that a transcriptomics dataset can be reduced to a relatively high number of dimensions without losing the interpretability of ICA, even though higher dimensions give rise to components driven by small gene sets. We suggest a protocol of ICA application to transcriptomics data with a possibility of prioritizing components with respect to their reproducibility that strengthens the biological interpretation. Computing too few components (much less than MSTD) is not optimal for interpretability of the results. The components ranked within MSTD range have more chances to be reproduced in independent studies.
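The stability ranking described above can be sketched in a few lines of plain Python (an illustrative reconstruction, not the authors' code; the helper names `pearson` and `stability_scores` are hypothetical): each component of a reference ICA run is scored by its best absolute correlation with the components of every other run, and components are then ranked by that score.

```python
def pearson(u, v):
    # Pearson correlation of two equal-length weight vectors.
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    du = [x - mu for x in u]
    dv = [x - mv for x in v]
    num = sum(a * b for a, b in zip(du, dv))
    den = (sum(a * a for a in du) * sum(b * b for b in dv)) ** 0.5
    return num / den if den else 0.0

def stability_scores(runs):
    """runs: list of ICA runs; each run is a list of components (weight vectors).
    Score each component of the first (reference) run by its average best-match
    absolute correlation against the components of every other run. The sign of
    an ICA component is arbitrary, hence the absolute value."""
    reference = runs[0]
    scores = []
    for comp in reference:
        best = [max(abs(pearson(comp, other)) for other in run) for run in runs[1:]]
        scores.append(sum(best) / len(best))
    # Rank components from most to least stable.
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    return scores, order
```

A component that reappears (possibly sign-flipped) across runs scores near 1; a noise-driven component scores much lower, which is the intuition behind cutting the ranking at the point where the stability profile changes qualitatively (the MSTD).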
Computer Languages: A Practical Guide to the Chief Programming Languages.
ERIC Educational Resources Information Center
Sanderson, Peter C.
All the most commonly-used high-level computer languages are discussed in this book. An introductory discussion provides an overview of the basic components of a digital computer, the general planning of a computer programing problem, and the various types of computer languages. Each chapter is self-contained, emphasizes those features of a…
Computer-Aided Facilities Management Systems (CAFM).
ERIC Educational Resources Information Center
Cyros, Kreon L.
Computer-aided facilities management (CAFM) refers to a collection of software used with increasing frequency by facilities managers. The six major CAFM components are discussed with respect to their usefulness and popularity in facilities management applications: (1) computer-aided design; (2) computer-aided engineering; (3) decision support…
Gregory Elmes; Thomas Millette; Charles B. Yuill
1991-01-01
GypsES, a decision-support and expert system for the management of Gypsy Moth addresses five related research problems in a modular, computer-based project. The modules are hazard rating, monitoring, prediction, treatment decision and treatment implementation. One common component is a geographic information system designed to function intelligently. We refer to this...
ERIC Educational Resources Information Center
Wan, Hsu-Tien; Hsu, Kuang-Yang; Sheu, Shiow-Yunn
2016-01-01
In this research, we aim to understand the effectiveness of adopting educational technologies in a computer literacy course to students in a medical university. The course was organized with three core components: Open Education Resources (OER) reading, a book club, and online game competition. These components were delivered by a learning…
Novices on the Net: An Introduction to Education Class Uses E-Mail and the Internet.
ERIC Educational Resources Information Center
Corl, Susan F.
In order to offer computer and/or technology instruction to their education students before they transfer to four-year colleges, Louisiana State University at Eunice (LSUE), a two-year college, added a computer component to an introductory education class. The component introduces pre-service teachers to electronic mail and the Internet in order…
Computing Reliabilities Of Ceramic Components Subject To Fracture
NASA Technical Reports Server (NTRS)
Nemeth, N. N.; Gyekenyesi, J. P.; Manderscheid, J. M.
1992-01-01
CARES calculates fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. Program uses results from commercial structural-analysis program (MSC/NASTRAN or ANSYS) to evaluate reliability of component in presence of inherent surface- and/or volume-type flaws. Computes measure of reliability by use of finite-element mathematical model applicable to multiple materials in sense model made function of statistical characterizations of many ceramic materials. Reliability analysis uses element stress, temperature, area, and volume outputs, obtained from two-dimensional shell and three-dimensional solid isoparametric or axisymmetric finite elements. Written in FORTRAN 77.
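As background for the reliability measure such a program reports, the two-parameter Weibull weakest-link model for volume flaws can be sketched as follows (a simplified Python illustration rather than the program's FORTRAN 77, with a hypothetical `failure_probability` helper; the actual code works from finite-element stress, temperature, area, and volume output and handles several flaw populations):

```python
import math

def failure_probability(elements, sigma0, m):
    """Weakest-link (two-parameter Weibull) fast-fracture failure probability:
        Pf = 1 - exp(-sum_i V_i * (sigma_i / sigma0)**m)
    elements: iterable of (stress, volume) pairs from a finite-element model;
    sigma0: Weibull scale parameter; m: Weibull modulus.
    Compressive (non-positive) stresses do not contribute to fracture risk."""
    risk = sum(v * (s / sigma0) ** m for s, v in elements if s > 0)
    return 1.0 - math.exp(-risk)
```

Because the element risks add in the exponent, refining the mesh leaves the predicted probability consistent, and a high Weibull modulus `m` makes the component's reliability sharply sensitive to its peak tensile stress.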
NASA Technical Reports Server (NTRS)
Kolb, Mark A.
1988-01-01
The Rubber Airplane program, which combines two symbolic processing techniques with a component-based database of design knowledge, is proposed as a computer aid for conceptual design. With object-oriented programming, programs are organized around the objects and behavior to be simulated; with constraint propagation, declarative statements designate mathematical relationships among all the equation variables. It is found that the additional level of organizational structure resulting from the arrangement of the design information in terms of design components provides greater flexibility and convenience.
Collar grids for intersecting geometric components within the Chimera overlapped grid scheme
NASA Technical Reports Server (NTRS)
Parks, Steven J.; Buning, Pieter G.; Chan, William M.; Steger, Joseph L.
1991-01-01
A method for overcoming problems with using the Chimera overset grid scheme in the region of intersecting geometry components is presented. A 'collar grid' resolves the intersection region and provides communication between the component grids. This approach is validated by comparing computed and experimental data for a flow about a wing/body configuration. Application of the collar grid scheme to the Orbiter fuselage and vertical tail intersection in a computation of the full Space Shuttle launch vehicle demonstrates its usefulness for simulation of flow about complex aerospace vehicles.
Stored program concept for analog computers
NASA Technical Reports Server (NTRS)
Hannauer, G., III; Patmore, J. R.
1971-01-01
Optimization of three-stage matrices, modularization, and black-box design techniques provides for automatically interconnecting computing component inputs and outputs in a general-purpose analog computer. The design also produces a relatively inexpensive and less complex automatic patching system.
NASA Astrophysics Data System (ADS)
Beiden, Sergey V.; Wagner, Robert F.; Campbell, Gregory; Metz, Charles E.; Chan, Heang-Ping; Nishikawa, Robert M.; Schnall, Mitchell D.; Jiang, Yulei
2001-06-01
In recent years, the multiple-reader, multiple-case (MRMC) study paradigm has become widespread for receiver operating characteristic (ROC) assessment of systems for diagnostic imaging and computer-aided diagnosis. We review how MRMC data can be analyzed in terms of the multiple components of the variance (case, reader, interactions) observed in those studies. Such information is useful for the design of pivotal studies from results of a pilot study and also for studying the effects of reader training. Recently, several of the present authors have demonstrated methods to generalize the analysis of multiple variance components to the case where unaided readers of diagnostic images are compared with readers who receive the benefit of a computer assist (CAD). For this case it is necessary to model the possibility that several of the components of variance might be reduced when readers incorporate the computer assist, compared to the unaided reading condition. We review results of this kind of analysis on three previously published MRMC studies, two of which were applications of CAD to diagnostic mammography and one was an application of CAD to screening mammography. The results for the three cases are seen to differ, depending on the reader population sampled and the task of interest. Thus, it is not possible to generalize a particular analysis of variance components beyond the tasks and populations actually investigated.
Failure detection in high-performance clusters and computers using chaotic map computations
Rao, Nageswara S.
2015-09-01
A programmable media includes a processing unit capable of independent operation in a machine that is capable of executing 10^18 floating point operations per second. The processing unit is in communication with a memory element and an interconnect that couples computing nodes. The programmable media includes a logical unit configured to execute arithmetic functions, comparative functions, and/or logical functions. The processing unit is configured to detect computing component failures, memory element failures and/or interconnect failures by executing programming threads that generate one or more chaotic map trajectories. The central processing unit or graphical processing unit is configured to detect a computing component failure, memory element failure and/or an interconnect failure through an automated comparison of signal trajectories generated by the chaotic maps.
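The detection principle lends itself to a small sketch (illustrative only, not the patented implementation; the function names are hypothetical): two nodes iterate the same chaotic logistic map from the same seed, and because the map amplifies any perturbation exponentially, even a tiny arithmetic fault quickly drives the trajectories apart.

```python
def logistic_trajectory(x0, steps, fault_at=None, fault_eps=1e-12):
    """Iterate the chaotic logistic map x -> 4x(1-x) on [0, 1]. An optional
    tiny perturbation injected at step `fault_at` models a transient
    arithmetic error in a computing component."""
    x, traj = x0, []
    for i in range(steps):
        x = 4.0 * x * (1.0 - x)
        if i == fault_at:
            x += fault_eps  # simulated single soft error
        traj.append(x)
    return traj

def nodes_agree(traj_a, traj_b, tol=1e-9):
    """Healthy nodes running identical code produce identical trajectories;
    divergence beyond tol at any step flags a faulty component."""
    return all(abs(a - b) < tol for a, b in zip(traj_a, traj_b))

healthy = logistic_trajectory(0.3, 60)
faulty = logistic_trajectory(0.3, 60, fault_at=10)
```

The sensitivity of the chaotic map does the amplification work: a 1e-12 perturbation roughly doubles each iteration, so it crosses any practical comparison tolerance within a few dozen steps, whereas fault-free nodes remain in exact agreement.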
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 2 2012-07-01 2012-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 2 2014-07-01 2014-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 2 2013-07-01 2013-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 2 2010-07-01 2010-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 2 2011-07-01 2011-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
Design for pressure regulating components
NASA Technical Reports Server (NTRS)
Wichmann, H.
1973-01-01
The design development for Pressure Regulating Components included a regulator component trade-off study with analog computer performance verification to arrive at a final optimized regulator configuration for the Space Storable Propulsion Module, under development for a Jupiter Orbiter mission. This application requires the pressure regulator to be capable of long-term fluorine exposure. In addition, individual but basically identical (for purposes of commonality) units are required for separate oxidizer and fuel pressurization. The need for dual units requires improvement in the regulation accuracy over present designs. An advanced regulator concept was prepared featuring redundant bellows, all metallic/ceramic construction, friction-free guidance of moving parts, gas damping, and the elimination of coil springs normally used for reference forces. The activities included testing of actual size seat/poppet components to determine actual discharge coefficients and flow forces. The resulting data was inserted into the computer model of the regulator. Computer simulation of the propulsion module performance over two mission profiles indicated satisfactory minimization of propellant residual requirements imposed by regulator performance uncertainties.
Free energy change of a dislocation due to a Cottrell atmosphere
Sills, R. B.; Cai, W.
2018-03-07
The free energy reduction of a dislocation due to a Cottrell atmosphere of solutes is computed using a continuum model. In this work, we show that the free energy change is composed of near-core and far-field components. The far-field component can be computed analytically using the linearized theory of solid solutions. Near the core the linearized theory is inaccurate, and the near-core component must be computed numerically. The influence of interactions between solutes in neighbouring lattice sites is also examined using the continuum model. We show that this model is able to reproduce atomistic calculations of the nickel–hydrogen system, predicting hydride formation on dislocations. The formation of these hydrides leads to dramatic reductions in the free energy. Lastly, the influence of the free energy change on a dislocation’s line tension is examined by computing the equilibrium shape of a dislocation shear loop and the activation stress for a Frank–Read source using discrete dislocation dynamics.
Free energy change of a dislocation due to a Cottrell atmosphere
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sills, R. B.; Cai, W.
The free energy reduction of a dislocation due to a Cottrell atmosphere of solutes is computed using a continuum model. In this work, we show that the free energy change is composed of near-core and far-field components. The far-field component can be computed analytically using the linearized theory of solid solutions. Near the core the linearized theory is inaccurate, and the near-core component must be computed numerically. The influence of interactions between solutes in neighbouring lattice sites is also examined using the continuum model. We show that this model is able to reproduce atomistic calculations of the nickel–hydrogen system, predicting hydride formation on dislocations. The formation of these hydrides leads to dramatic reductions in the free energy. Lastly, the influence of the free energy change on a dislocation’s line tension is examined by computing the equilibrium shape of a dislocation shear loop and the activation stress for a Frank–Read source using discrete dislocation dynamics.
A new method for computing the reliability of consecutive k-out-of-n:F systems
NASA Astrophysics Data System (ADS)
Gökdere, Gökhan; Gürcan, Mehmet; Kılıç, Muhammet Burak
2016-01-01
Consecutive k-out-of-n system models have been applied to reliability evaluation in many physical systems, such as those encountered in telecommunications, the design of integrated circuits, microwave relay stations, oil pipeline systems, vacuum systems in accelerators, computer ring networks, and spacecraft relay stations. These systems are characterized as logical connections among the components of the systems placed in lines or circles. In the literature, a great deal of attention has been paid to the reliability evaluation of consecutive k-out-of-n systems. In this paper, we propose a new method to compute the reliability of consecutive k-out-of-n:F systems with n linearly and circularly arranged components. The proposed method provides a simple way of determining the system failure probability. We also provide R code based on our proposed method to compute the reliability of linear and circular systems with a great number of components.
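For the linear case with i.i.d. components, the classical recursion for consecutive k-out-of-n:F reliability can be coded directly (a Python sketch of the standard recurrence, not the paper's proposed method or its R code; the function name is hypothetical):

```python
def linear_consecutive_kofn_f_reliability(n, k, p):
    """Reliability of a linear consecutive k-out-of-n:F system with i.i.d.
    components of reliability p: the system fails iff at least k consecutive
    components fail. Classical recursion (q = 1 - p):
        R(m) = R(m-1) - p * q**k * R(m-k-1)   for m > k,
    with R(m) = 1 for m < k and R(k) = 1 - q**k."""
    q = 1.0 - p
    R = [1.0] * max(k, 1)        # R(0) .. R(k-1) = 1: too few components to fail
    R.append(1.0 - q ** k)       # R(k): fails only if all k components fail
    for m in range(k + 1, n + 1):
        # Subtract the probability that a failing run of length k ends at
        # position m, preceded by a working component.
        R.append(R[m - 1] - p * q ** k * R[m - k - 1])
    return R[n]
```

For k = 1 this reduces to the series system, R(n) = p^n. The circular arrangement can be handled by conditioning on the states of the components adjacent to a fixed position, which reduces it to linear subproblems; only the linear recursion is shown here.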
Free energy change of a dislocation due to a Cottrell atmosphere
NASA Astrophysics Data System (ADS)
Sills, R. B.; Cai, W.
2018-06-01
The free energy reduction of a dislocation due to a Cottrell atmosphere of solutes is computed using a continuum model. We show that the free energy change is composed of near-core and far-field components. The far-field component can be computed analytically using the linearized theory of solid solutions. Near the core the linearized theory is inaccurate, and the near-core component must be computed numerically. The influence of interactions between solutes in neighbouring lattice sites is also examined using the continuum model. We show that this model is able to reproduce atomistic calculations of the nickel-hydrogen system, predicting hydride formation on dislocations. The formation of these hydrides leads to dramatic reductions in the free energy. Finally, the influence of the free energy change on a dislocation's line tension is examined by computing the equilibrium shape of a dislocation shear loop and the activation stress for a Frank-Read source using discrete dislocation dynamics.
Remote information service access system based on a client-server-service model
Konrad, Allan M.
1996-01-01
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.
Remote information service access system based on a client-server-service model
Konrad, A.M.
1997-12-09
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service. 16 figs.
Remote information service access system based on a client-server-service model
Konrad, Allan M.
1999-01-01
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.
Remote information service access system based on a client-server-service model
Konrad, A.M.
1996-08-06
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service. 16 figs.
Remote information service access system based on a client-server-service model
Konrad, Allan M.
1997-01-01
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.
Webb, Taylor W.; Kelly, Yin T.; Graziano, Michael S. A.
2016-01-01
The temporoparietal junction (TPJ) is activated in association with a large range of functions, including social cognition, episodic memory retrieval, and attentional reorienting. An ongoing debate is whether the TPJ performs an overarching, domain-general computation, or whether functions reside in domain-specific subdivisions. We scanned subjects with fMRI during five tasks known to activate the TPJ, probing social, attentional, and memory functions, and used data-driven parcellation (independent component analysis) to isolate task-related functional processes in the bilateral TPJ. We found that one dorsal component in the right TPJ, which was connected with the frontoparietal control network, was activated in all of the tasks. Other TPJ subregions were specific for attentional reorienting, oddball target detection, or social attribution of belief. The TPJ components that participated in attentional reorienting and oddball target detection appeared spatially separated, but both were connected with the ventral attention network. The TPJ component that participated in the theory-of-mind task was part of the default-mode network. Further, we found that the BOLD response in the domain-general dorsal component had a longer latency than responses in the domain-specific components, suggesting an involvement in distinct, perhaps postperceptual, computations. These findings suggest that the TPJ performs both domain-general and domain-specific computations that reside within spatially distinct functional components. PMID:27280153
Wang, Yuanjia; Chen, Huaihou
2012-01-01
We examine a generalized F-test of a nonparametric function through penalized splines and a linear mixed effects model representation. With a mixed effects model representation of penalized splines, we imbed the test of an unspecified function into a test of some fixed effects and a variance component in a linear mixed effects model with nuisance variance components under the null. The procedure can be used to test a nonparametric function or varying-coefficient with clustered data, compare two spline functions, test the significance of an unspecified function in an additive model with multiple components, and test a row or a column effect in a two-way analysis of variance model. Through a spectral decomposition of the residual sum of squares, we provide a fast algorithm for computing the null distribution of the test, which significantly improves the computational efficiency over bootstrap. The spectral representation reveals a connection between the likelihood ratio test (LRT) in a multiple variance components model and a single component model. We examine our methods through simulations, where we show that the power of the generalized F-test may be higher than the LRT, depending on the hypothesis of interest and the true model under the alternative. We apply these methods to compute the genome-wide critical value and p-value of a genetic association test in a genome-wide association study (GWAS), where the usual bootstrap is computationally intensive (up to 10^8 simulations) and asymptotic approximation may be unreliable and conservative. PMID:23020801
Wang, Yuanjia; Chen, Huaihou
2012-12-01
We examine a generalized F-test of a nonparametric function through penalized splines and a linear mixed effects model representation. With a mixed effects model representation of penalized splines, we imbed the test of an unspecified function into a test of some fixed effects and a variance component in a linear mixed effects model with nuisance variance components under the null. The procedure can be used to test a nonparametric function or varying-coefficient with clustered data, compare two spline functions, test the significance of an unspecified function in an additive model with multiple components, and test a row or a column effect in a two-way analysis of variance model. Through a spectral decomposition of the residual sum of squares, we provide a fast algorithm for computing the null distribution of the test, which significantly improves the computational efficiency over bootstrap. The spectral representation reveals a connection between the likelihood ratio test (LRT) in a multiple variance components model and a single component model. We examine our methods through simulations, where we show that the power of the generalized F-test may be higher than the LRT, depending on the hypothesis of interest and the true model under the alternative. We apply these methods to compute the genome-wide critical value and p-value of a genetic association test in a genome-wide association study (GWAS), where the usual bootstrap is computationally intensive (up to 10^8 simulations) and asymptotic approximation may be unreliable and conservative. © 2012, The International Biometric Society.
Mitigating component performance variation
Gara, Alan G.; Sylvester, Steve S.; Eastep, Jonathan M.; Nagappan, Ramkumar; Cantalupo, Christopher M.
2018-01-09
Apparatus and methods may provide for characterizing a plurality of similar components of a distributed computing system based on a maximum safe operation level associated with each component, storing the characterization data in a database, and allocating non-uniform power to each similar component based at least in part on the characterization data in the database to substantially equalize performance of the components.
The Reasoning behind the Scene: Why Do Early Childhood Educators Use Computers in Their Classrooms?
ERIC Educational Resources Information Center
Edwards, Suzy
2005-01-01
In recent times discussion surrounding the use of computers in early childhood education has emphasised the role computers play in children's everyday lives. This realisation has replaced early debate regarding the appropriateness or otherwise of computer use for young children in early childhood education. An important component of computer use…
The Computer and Its Functions; How to Communicate with the Computer.
ERIC Educational Resources Information Center
Ward, Peggy M.
A brief discussion of why it is important for students to be familiar with computers and their functions and a list of some practical applications introduce this two-part paper. Focusing on how the computer works, the first part explains the various components of the computer, different kinds of memory storage devices, disk operating systems, and…
Figueroa, José; Guarachi, Juan Pablo; Matas, José; Arnander, Magnus; Orrego, Mario
2016-04-01
Computed tomography (CT) is widely used to assess component rotation in patients with poor results after total knee arthroplasty (TKA). The purpose of this study was to simultaneously determine the accuracy and reliability of CT in measuring TKA component rotation. TKA components were implanted in dry-bone models and assigned to two groups. The first group (n = 7) had variable femoral component rotations, and the second group (n = 6) had variable tibial tray rotations. CT images were then used to assess component rotation. Accuracy of CT rotational assessment was determined by mean difference, in degrees, between implanted component rotation and CT-measured rotation. Intraclass correlation coefficient (ICC) was applied to determine intra-observer and inter-observer reliability. Femoral component accuracy showed a mean difference of 2.5° and the tibial tray a mean difference of 3.2°. There was good intra- and inter-observer reliability for both components, with a femoral ICC of 0.8 and 0.76, and tibial ICC of 0.68 and 0.65, respectively. CT rotational assessment accuracy can differ from true component rotation by approximately 3° for each component. It does, however, have good inter- and intra-observer reliability.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-30
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-841] Certain Computer and Computer... Bonding AGENCY: U.S. International Trade Commission. ACTION: Notice. SUMMARY: Notice is hereby given that the U.S. International Trade Commission has determined to review in the entirety the final initial...
19 CFR 152.106 - Computed value.
Code of Federal Regulations, 2010 CFR
2010-04-01
... will not be added to the other elements as it is not intended that any component of computed value be... 19 Customs Duties 2 2010-04-01 2010-04-01 false Computed value. 152.106 Section 152.106 Customs... (CONTINUED) CLASSIFICATION AND APPRAISEMENT OF MERCHANDISE Valuation of Merchandise § 152.106 Computed value...
2007-09-01
Technology (NIST) [7]. SUPERTRAPP is an interactive computer database designed to predict the thermodynamic and transport properties of fluid mixtures...of liquid sprays. However, the potential core computation is done for all the Raman scattering injection conditions to compare the condensed phase...spaced from the Rayleigh component suggesting that they contain the same information about the vibrational quantum energy. The intensity
NASA Tech Briefs, July 1997. Volume 21, No. 7
NASA Technical Reports Server (NTRS)
1997-01-01
Topics: Mechanical Components; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Life Sciences.
Phased Array Imaging of Complex-Geometry Composite Components.
Brath, Alex J; Simonetti, Francesco
2017-10-01
Progress in computational fluid dynamics and the availability of new composite materials are driving major advances in the design of aerospace engine components which now have highly complex geometries optimized to maximize system performance. However, shape complexity poses significant challenges to traditional nondestructive evaluation methods whose sensitivity and selectivity rapidly decrease as surface curvature increases. In addition, new aerospace materials typically exhibit an intricate microstructure that further complicates the inspection. In this context, an attractive solution is offered by combining ultrasonic phased array (PA) technology with immersion testing. Here, the water column formed between the complex surface of the component and the flat face of a linear or matrix array probe ensures ideal acoustic coupling between the array and the component as the probe is continuously scanned to form a volumetric rendering of the part. While the immersion configuration is desirable for practical testing, the interpretation of the measured ultrasonic signals for image formation is complicated by reflection and refraction effects that occur at the water-component interface. To account for refraction, the geometry of the interface must first be reconstructed from the reflected signals and subsequently used to compute suitable delay laws to focus inside the component. These calculations are based on ray theory and can be computationally intensive. Moreover, strong reflections from the interface can lead to a thick dead zone beneath the surface of the component which limits sensitivity to shallow subsurface defects. This paper presents a general approach that combines advanced computing for rapid ray tracing in anisotropic media with a 256-channel parallel array architecture. 
The full-volume inspection of complex-shape components is enabled through the combination of both reflected and transmitted signals through the part using a pair of arrays held in a yoke configuration. Experimental results are provided for specimens of increasing complexity relevant to aerospace applications such as fan blades. It is shown that PA technology can provide a robust solution to detect a variety of defects including porosity and waviness in composite parts.
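The delay-law computation described above (for each array element, find the refracted path through the water-component interface, then focus by timing) can be illustrated with a brute-force Fermat-principle search. Everything here is an illustrative assumption, not the paper's method: a 2-D geometry, an isotropic solid, round-number wave speeds, and an invented function name, far simpler than the anisotropic ray tracing the authors implement.

```python
import numpy as np

def focal_delays(elements, surface_x, surface_z, focus,
                 c_water=1480.0, c_solid=6300.0):
    """For each array element (x, z), find the interface crossing point that
    minimizes water + solid travel time to the focus (Fermat's principle),
    then convert arrival times into firing delays. Illustrative sketch only."""
    times = []
    for ex, ez in elements:
        # travel time via every candidate surface point; keep the minimum
        t_water = np.hypot(surface_x - ex, surface_z - ez) / c_water
        t_solid = np.hypot(surface_x - focus[0], surface_z - focus[1]) / c_solid
        times.append(np.min(t_water + t_solid))
    times = np.asarray(times)
    return times.max() - times  # elements on slow paths fire first
```

With a flat interface and a symmetric aperture, the outer elements get equal delays and the center element, whose path is shortest, gets the largest delay, which is the qualitative behavior a focusing delay law must have.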
Li, Xiaohui; Yu, Jianhua; Gong, Yuekun; Ren, Kaijing; Liu, Jun
2015-04-21
To assess the early postoperative clinical and radiographic outcomes after navigation-assisted or standard instrumentation total knee arthroplasty (TKA). From August 2007 to May 2008, 60 KSS-A type patients underwent 67 primary TKA operations performed by the same surgical team. Twenty-two operations were performed with an image-free navigation system (average age 64.5 years), while the remaining 45 used conventional manual procedures (average age 66 years). Preoperative demographic and functional data showed no statistical differences (P>0.05). Operative duration, blood loss volume and hospitalization days were compared between the two groups. Radiographic data included the coronal femoral component angle, coronal tibial component angle, sagittal femoral component angle, sagittal tibial component angle and coronal tibiofemoral angle at one month. Functional assessment scores were evaluated at 1, 3 and 6 months postoperatively. Operative duration was significantly longer for computer navigation (P<0.05). The average blood loss volume was 555.26 ml in the computer navigation group and 647.56 ml in the conventional manual method group (P<0.05). Hospitalization stay was shorter in the computer navigation group than in the conventional method group (7.74 vs 8.68 days) (P=0.04). Alignment deviation was better in the computer-assisted group than in the conventional manual method group (P<0.05). The percentage of patients with a coronal tibiofemoral angle within ±3° of the ideal value was 95.45% for the computer-assisted mini-invasive TKA group and 80% for the conventional TKA group (P=0.003). The Knee Society Clinical Rating Score was higher in the computer-assisted group than in the conventional manual method group at 1 and 3 months post-operation. However, no statistical inter-group difference existed at 6 months post-operation. Navigation allows a surgeon to precisely implant the components for TKA. It also offers faster functional recovery and a shorter hospitalization stay. At 6 months post-operation, there is no statistical inter-group difference in KSS scores.
Development of a small-scale computer cluster
NASA Astrophysics Data System (ADS)
Wilhelm, Jay; Smith, Justin T.; Smith, James E.
2008-04-01
An increase in demand for computing power in academia has necessitated high performance machines. The computing power of a single processor has been steadily increasing, but lags behind the demand for fast simulations. Since a single processor has hard limits on its performance, a cluster of computers can, with the proper software, multiply the performance of a single computer. Cluster computing has therefore become a much sought after technology. Typical desktop computers could be used for cluster computing, but are not intended for constant full speed operation and take up more space than rack mount servers. Specialty computers that are designed to be used in clusters meet high availability and space requirements, but can be costly. A market segment exists where custom built desktop computers can be arranged in a rack mount situation, gaining the space saving of traditional rack mount computers while remaining cost effective. To explore these possibilities, an experiment was performed to develop a computing cluster using desktop components for the purpose of decreasing computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components, multiplying the performance of a single desktop machine while minimizing occupied space and remaining cost effective.
Modeling the missile-launch tube problem in DYSCO
NASA Technical Reports Server (NTRS)
Berman, Alex; Gustavson, Bruce A.
1989-01-01
DYSCO is a versatile, general purpose dynamic analysis program which assembles equations and solves dynamics problems. The executive manages a library of technology modules which contain routines that compute the matrix coefficients of the second order ordinary differential equations of the components. The executive performs the coupling of the equations of the components and manages the solution of the coupled equations. Any new component representation may be added to the library if, given the state vector, a FORTRAN program can be written to compute M, C, K, and F. The problem described demonstrates the generality of this statement.
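The executive's role described above (each technology module returns M, C, K, and F for its component; the executive couples the component equations) is, at heart, a matrix assembly over shared degrees of freedom. A hypothetical miniature in Python (DYSCO itself is FORTRAN; all names, dict keys, and values here are invented for illustration):

```python
import numpy as np

def couple(components, n_dof):
    """Assemble global M, C, K, F from per-component matrices.
    Each component dict carries its matrices plus a map from local
    to global degrees of freedom (illustrative, not DYSCO's API)."""
    M = np.zeros((n_dof, n_dof))
    C = np.zeros((n_dof, n_dof))
    K = np.zeros((n_dof, n_dof))
    F = np.zeros(n_dof)
    for comp in components:
        g = comp["dofs"]                       # local index -> global index
        for i, gi in enumerate(g):
            F[gi] += comp["F"][i]
            for j, gj in enumerate(g):
                M[gi, gj] += comp["M"][i][j]
                C[gi, gj] += comp["C"][i][j]
                K[gi, gj] += comp["K"][i][j]
    return M, C, K, F
```

Two spring components sharing a node then add their stiffnesses at the shared degree of freedom, exactly as the coupled second-order system M q'' + C q' + K q = F requires.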
Using dynamic mode decomposition for real-time background/foreground separation in video
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kutz, Jose Nathan; Grosek, Jacob; Brunton, Steven
The technique of dynamic mode decomposition (DMD) is disclosed herein for the purpose of robustly separating video frames into background (low-rank) and foreground (sparse) components in real-time. Foreground/background separation is achieved at the computational cost of just one singular value decomposition (SVD) and one linear equation solve, thus producing results orders of magnitude faster than robust principal component analysis (RPCA). Additional techniques, including techniques for analyzing the video for multi-resolution time-scale components, and techniques for reusing computations to allow processing of streaming video in real time, are also described herein.
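The separation described above can be sketched with exact DMD in a few lines of NumPy, taking the background as the mode whose continuous-time frequency is closest to zero (a static mode has eigenvalue near 1). This is an illustrative reconstruction of the general technique, not the patented implementation; the function name and the rank r are my choices.

```python
import numpy as np

def dmd_background(X, r=2, dt=1.0):
    """Split a video matrix X (pixels x frames) into a low-rank background L
    and a sparse foreground S = X - L via exact DMD (illustrative sketch)."""
    X1, X2 = X[:, :-1], X[:, 1:]                    # snapshot pairs
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r, :]           # rank-r truncation
    Atilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    lam, W = np.linalg.eig(Atilde)
    lam = lam.astype(complex)
    Phi = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W   # DMD modes
    omega = np.log(lam) / dt                        # continuous-time frequencies
    amps = np.linalg.lstsq(Phi, X[:, 0].astype(complex), rcond=None)[0]
    bg = np.argmin(np.abs(omega))                   # |omega| ~ 0 -> background
    t = np.arange(X.shape[1]) * dt
    L = np.outer(Phi[:, bg] * amps[bg], np.exp(omega[bg] * t)).real
    return L, X - L
```

As the abstract notes, the cost is essentially one SVD plus one linear solve, which is what makes this competitive with far more expensive RPCA formulations.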
Development of high performance scientific components for interoperability of computing packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gulabani, Teena Pratap
2008-01-01
Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. A chemistry algorithm is hard and time-consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.
Computational Fatigue Life Analysis of Carbon Fiber Laminate
NASA Astrophysics Data System (ADS)
Shastry, Shrimukhi G.; Chandrashekara, C. V., Dr.
2018-02-01
In the present scenario, many traditional materials are being replaced by composite materials for their light weight and high strength. Industries such as the automotive and aerospace industries use composite materials for many of their components. Replacing components subjected to static or impact loads is less challenging than replacing components subjected to dynamic loading. Replacing components with composite materials demands many stages of parametric study; one such study is the fatigue analysis of the composite material. This paper focuses on the fatigue life analysis of the composite material using computational techniques. A composite plate with a hole at the center is considered for the study. The analysis is carried out on the (0°/90°/90°/90°/90°)s laminate sequence and the (45°/-45°)2s laminate sequence using a computer script, and the life cycles of the two lay-up sequences are compared with each other. It is observed that, for the same material and geometry of the component, cross-ply laminates show better fatigue life than angle-ply laminates.
Palmer, Tim N.; O’Shea, Michael
2015-01-01
How is the brain configured for creativity? What is the computational substrate for ‘eureka’ moments of insight? Here we argue that creative thinking arises ultimately from a synergy between low-energy stochastic and energy-intensive deterministic processing, and is a by-product of a nervous system whose signal-processing capability per unit of available energy has become highly energy optimised. We suggest that the stochastic component has its origin in thermal (ultimately quantum decoherent) noise affecting the activity of neurons. Without this component, deterministic computational models of the brain are incomplete. PMID:26528173
Toward a Fault Tolerant Architecture for Vital Medical-Based Wearable Computing.
Abdali-Mohammadi, Fardin; Bajalan, Vahid; Fathi, Abdolhossein
2015-12-01
Advancements in computers and electronic technologies have led to the emergence of a new generation of efficient small intelligent systems. The products of such technologies include smartphones and wearable devices, which have attracted the attention of medical applications. These products are used less in critical medical applications because of their resource constraints and failure sensitivity; without safety considerations, small integrated hardware can endanger patients' lives. Therefore, some principles must be proposed for constructing wearable systems in healthcare so that the existing concerns are dealt with. Accordingly, this paper proposes an architecture for constructing wearable systems in critical medical applications. The proposed architecture is a three-tier one, supporting data flow from body sensors to the cloud. The tiers of this architecture are wearable computers, mobile computing, and mobile cloud computing. One of the features of this architecture is the high fault tolerance made possible by the nature of its components. Moreover, the required protocols are presented to coordinate the components of this architecture. Finally, the reliability of this architecture is assessed by simulating the architecture and its components, and other aspects of the proposed architecture are discussed.
A Component-based Programming Model for Composite, Distributed Applications
NASA Technical Reports Server (NTRS)
Eidson, Thomas M.; Bushnell, Dennis M. (Technical Monitor)
2001-01-01
The nature of scientific programming is evolving to larger, composite applications that are composed of smaller element applications. These composite applications are more frequently being targeted for distributed, heterogeneous networks of computers. They are most likely programmed by a group of developers. Software component technology and computational frameworks are being proposed and developed to meet the programming requirements of these new applications. Historically, programming systems have had a hard time being accepted by the scientific programming community. In this paper, a programming model is outlined that attempts to organize the software component concepts and fundamental programming entities into programming abstractions that will be better understood by the application developers. The programming model is designed to support computational frameworks that manage many of the tedious programming details, but also that allow sufficient programmer control to design an accurate, high-performance application.
NASA Technical Reports Server (NTRS)
Deardorff, Glenn; Djomehri, M. Jahed; Freeman, Ken; Gambrel, Dave; Green, Bryan; Henze, Chris; Hinke, Thomas; Hood, Robert; Kiris, Cetin; Moran, Patrick;
2001-01-01
A series of NASA presentations for the Supercomputing 2001 conference are summarized. The topics include: (1) Mars Surveyor Landing Sites "Collaboratory"; (2) Parallel and Distributed CFD for Unsteady Flows with Moving Overset Grids; (3) IP Multicast for Seamless Support of Remote Science; (4) Consolidated Supercomputing Management Office; (5) Growler: A Component-Based Framework for Distributed/Collaborative Scientific Visualization and Computational Steering; (6) Data Mining on the Information Power Grid (IPG); (7) Debugging on the IPG; (8) DeBakey Heart Assist Device; (9) Unsteady Turbopump for Reusable Launch Vehicle; (10) Exploratory Computing Environments Component Framework; (11) OVERSET Computational Fluid Dynamics Tools; (12) Control and Observation in Distributed Environments; (13) Multi-Level Parallelism Scaling on NASA's Origin 1024 CPU System; (14) Computing, Information, & Communications Technology; (15) NAS Grid Benchmarks; (16) IPG: A Large-Scale Distributed Computing and Data Management System; and (17) ILab: Parameter Study Creation and Submission on the IPG.
Longitudinal train dynamics: an overview
NASA Astrophysics Data System (ADS)
Wu, Qing; Spiryagin, Maksym; Cole, Colin
2016-12-01
This paper discusses the evolution of longitudinal train dynamics (LTD) simulations, which covers numerical solvers, vehicle connection systems, air brake systems, wagon dumper systems and locomotives, resistance forces and gravitational components, vehicle in-train instabilities, and computing schemes. A number of potential research topics are suggested, such as modelling of friction, polymer, and transition characteristics for vehicle connection simulations, studies of wagon dumping operations, proper modelling of vehicle in-train instabilities, and computing schemes for LTD simulations. Evidence shows that LTD simulations have evolved with computing capabilities. Currently, advanced component models that directly describe the working principles of the operation of air brake systems, vehicle connection systems, and traction systems are available. Parallel computing is a good solution to combine and simulate all these advanced models. Parallel computing can also be used to conduct three-dimensional long train dynamics simulations.
Personal Computer Transport Analysis Program
NASA Technical Reports Server (NTRS)
DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter
2012-01-01
The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
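The "list of components in the order of their inlet dependency on other components" amounts to a topological ordering of the component network. A minimal sketch of that ordering step (hypothetical names; PCTAP itself is C++, but Python is used here for brevity):

```python
from collections import deque

def build_solution_vector(components, inlet_of):
    """Order components so each appears after everything feeding its inlet.
    `inlet_of` maps a component name to the components it depends on
    (illustrative structure, not PCTAP's actual data model)."""
    indegree = {c: len(inlet_of.get(c, [])) for c in components}
    downstream = {c: [] for c in components}
    for c, ups in inlet_of.items():
        for u in ups:
            downstream[u].append(c)     # u's outlet feeds c's inlet
    ready = deque(c for c in components if indegree[c] == 0)
    order = []
    while ready:
        c = ready.popleft()
        order.append(c)
        for d in downstream[c]:
            indegree[d] -= 1
            if indegree[d] == 0:
                ready.append(d)
    if len(order) != len(components):
        raise ValueError("cyclic flow network: no inlet-dependency ordering")
    return order
```

Updating component parameters in this order at each time step guarantees that every component sees the current outlet state of whatever feeds it.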
Systems Suitable for Information Professionals.
ERIC Educational Resources Information Center
Blair, John C., Jr.
1983-01-01
Describes computer operating systems applicable to microcomputers, noting hardware components, advantages and disadvantages of each system, local area networks, distributed processing, and a fully configured system. Lists of hardware components (disk drives, solid state disk emulators, input/output and memory components, and processors) and…
NASA Technical Reports Server (NTRS)
Nakazawa, S.
1988-01-01
This annual status report presents the results of work performed during the fourth year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of new computer codes permitting more accurate and efficient 3-D analysis of selected hot section components, i.e., combustor liners, turbine blades and turbine vanes. The computer codes embody a progression of math models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components. Volume 1 of this report discusses the special finite element models developed during the fourth year of the contract.
Computed Tomography Inspection and Analysis for Additive Manufacturing Components
NASA Technical Reports Server (NTRS)
Beshears, Ronald D.
2017-01-01
Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws and geometric features were inspected using a 2-megavolt linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed to determine the impact of additive manufacturing on inspectability of objects with complex geometries.
Future of Assurance: Ensuring that a System is Trustworthy
NASA Astrophysics Data System (ADS)
Sadeghi, Ahmad-Reza; Verbauwhede, Ingrid; Vishik, Claire
Significant efforts are put in defining and implementing strong security measures for all components of the computing environment. It is equally important to be able to evaluate the strength and robustness of these measures and establish trust among the components of the computing environment based on parameters and attributes of these elements and best practices associated with their production and deployment. Today the inventory of techniques used for security assurance and to establish trust -- audit, security-conscious development process, cryptographic components, external evaluation -- is somewhat limited. These methods have their indisputable strengths and have contributed significantly to the advancement in the area of security assurance. However, shorter product and technology development cycles and the sheer complexity of modern digital systems and processes have begun to decrease the efficiency of these techniques. Moreover, these approaches and technologies address only some aspects of security assurance and, for the most part, evaluate assurance in a general design rather than an instance of a product. Additionally, various components of the computing environment participating in the same processes enjoy different levels of security assurance, making it difficult to ensure adequate levels of protection end-to-end. Finally, most evaluation methodologies rely on the knowledge and skill of the evaluators, making reliable assessments of trustworthiness of a system even harder to achieve. The paper outlines some issues in security assurance that apply across the board, with the focus on the trustworthiness and authenticity of hardware components, and evaluates current approaches to assurance.
Universal distribution of component frequencies in biological and technological systems
Pang, Tin Yau; Maslov, Sergei
2013-01-01
Bacterial genomes and large-scale computer software projects both consist of a large number of components (genes or software packages) connected via a network of mutual dependencies. Components can be easily added or removed from individual systems, and their use frequencies vary over many orders of magnitude. We study this frequency distribution in genomes of ∼500 bacterial species and in over 2 million Linux computers and find that in both cases it is described by the same scale-free power-law distribution with an additional peak near the tail of the distribution corresponding to nearly universal components. We argue that the existence of a power law distribution of frequencies of components is a general property of any modular system with a multilayered dependency network. We demonstrate that the frequency of a component is positively correlated with its dependency degree given by the total number of upstream components whose operation directly or indirectly depends on the selected component. The observed frequency/dependency degree distributions are reproduced in a simple mathematically tractable model introduced and analyzed in this study. PMID:23530195
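The "dependency degree" above, which the authors find positively correlated with a component's frequency, can be computed by reverse reachability over the dependency network. A sketch with invented package names (one reading of the definition: for each component, count everything that directly or indirectly requires it):

```python
def dependency_degree(requires):
    """For each component, count the components that directly or indirectly
    require it. `requires` maps a component to the list of its direct
    dependencies (illustrative structure, not the paper's dataset)."""
    dependents = {}
    for c, deps in requires.items():
        for d in deps:
            dependents.setdefault(d, set()).add(c)

    def reach(c, seen):
        for d in dependents.get(c, ()):   # everything built directly on c
            if d not in seen:
                seen.add(d)
                reach(d, seen)            # ...and everything built on those
        return seen

    comps = set(requires) | set(dependents)
    return {c: len(reach(c, set())) for c in comps}
```

In a small stack where an application uses two libraries that both rest on one core library, the core library has the highest degree, matching the intuition that near-universal components sit deepest in the dependency network.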
A High Performance COTS Based Computer Architecture
NASA Astrophysics Data System (ADS)
Patte, Mathieu; Grimoldi, Raoul; Trautner, Roland
2014-08-01
Using Commercial Off The Shelf (COTS) electronic components for space applications is a long-standing idea. Indeed, the difference in processing performance and energy efficiency between radiation-hardened components and COTS components is so large that COTS components are very attractive for use in mass- and power-constrained systems. However, using COTS components in space is not straightforward, as one must account for the effects of the space environment on the COTS components' behavior. In the frame of the ESA-funded activity called High Performance COTS Based Computer, Airbus Defense and Space and its subcontractor OHB CGS have developed and prototyped a versatile COTS-based architecture for high performance processing. The rest of the paper is organized as follows: in the first section we recapitulate the interests and constraints of using COTS components for space applications; then we briefly describe existing fault mitigation architectures and present our solution for fault mitigation, based on a component called the SmartIO; in the last part of the paper we describe the prototyping activities executed during the HiP CBC project.
ERIC Educational Resources Information Center
Ciminero, Sandra Elser
The acute care pediatric/adolescent unit at Saint Joseph Hospital in Chicago, Illinois offers patients computer services consisting of recreation, general education, newspaper, and patient education components. To gather information concerning patients' experience with computers and to assess the effectiveness of computer services, data were…
Design Principles for Computer-Assisted Instruction in Histology Education: An Exploratory Study
ERIC Educational Resources Information Center
Deniz, Hasan; Cakir, Hasan
2006-01-01
The purpose of this paper is to describe the development process and the key components of a computer-assisted histology material. Computer-assisted histology material is designed to supplement traditional histology education in a large Midwestern university. Usability information of the computer-assisted instruction (CAI) material was obtained…
Addressing Small Computers in the First OS Course
ERIC Educational Resources Information Center
Nutt, Gary
2006-01-01
Small computers are emerging as important components of the contemporary computing scene. Their operating systems vary from specialized software for an embedded system to the same style of OS used on a generic desktop or server computer. This article describes a course in which systems are classified by their hardware capability and the…
NASA Tech Briefs, May 2000. Volume 24, No. 5
NASA Technical Reports Server (NTRS)
2000-01-01
Topics include: Sensors: Test and Measurement; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Composites and Plastics; Materials; Computer Programs; Mechanics;
NASA Technical Reports Server (NTRS)
Taylor, Nancy L.; Randall, Donald P.; Bowen, John T.; Johnson, Mary M.; Roland, Vincent R.; Matthews, Christine G.; Gates, Raymond L.; Skeens, Kristi M.; Nolf, Scott R.; Hammond, Dana P.
1990-01-01
The computer graphics capabilities available at the Center are introduced and their use is explained. More specifically, the manual identifies and describes the various graphics software and hardware components, details the interfaces between these components, and provides information concerning the use of these components at LaRC.
40 CFR 86.005-17 - On-board diagnostics.
Code of Federal Regulations, 2013 CFR
2013-07-01
... other available operating parameters), and functionality checks for computer output components (proper... considered acceptable. (e) Storing of computer codes. The OBD system shall record and store in computer... monitors that can be considered continuously operating monitors (e.g., misfire monitor, fuel system monitor...
Devane, P A; Horne, J G; Foley, G; Stanley, J
2017-10-01
This paper describes the methodology, validation and reliability of a new computer-assisted method which uses models of the patient's bones and the components to measure their migration and polyethylene wear from radiographs after total hip arthroplasty (THA). Models of the patient's acetabular and femoral component obtained from the manufacturer and models of the patient's pelvis and femur built from a single computed tomography (CT) scan, are used by a computer program to measure the migration of the components and the penetration of the femoral head from anteroposterior and lateral radiographs taken at follow-up visits. The program simulates the radiographic setup and matches the position and orientation of the models to outlines of the pelvis, the acetabular and femoral component, and femur on radiographs. Changes in position and orientation reflect the migration of the components and the penetration of the femoral head. Validation was performed using radiographs of phantoms simulating known migration and penetration, and the clinical feasibility of measuring migration was assessed in two patients. Migration of the acetabular and femoral components can be measured with limits of agreement (LOA) of 0.37 mm and 0.33 mm, respectively. Penetration of the femoral head can be measured with LOA of 0.161 mm. The migration of components and polyethylene wear can be measured without needing specialised radiographs. Accurate measurement may allow earlier prediction of failure after THA. Cite this article: Bone Joint J 2017;99-B:1290-7. ©2017 The British Editorial Society of Bone & Joint Surgery.
An efficient two-stage approach for image-based FSI analysis of atherosclerotic arteries
Rayz, Vitaliy L.; Mofrad, Mohammad R. K.; Saloner, David
2010-01-01
Patient-specific biomechanical modeling of atherosclerotic arteries has the potential to aid clinicians in characterizing lesions and determining optimal treatment plans. To attain high levels of accuracy, recent models use medical imaging data to determine plaque component boundaries in three dimensions, and fluid–structure interaction is used to capture mechanical loading of the diseased vessel. As the plaque components and vessel wall are often highly complex in shape, constructing a suitable structured computational mesh is very challenging and can require a great deal of time. Models based on unstructured computational meshes require less time to construct and are capable of accurately representing plaque components in three dimensions. These models unfortunately require additional computational resources and computing time for accurate and meaningful results. A two-stage modeling strategy based on unstructured computational meshes is proposed to achieve a reasonable balance between meshing difficulty and computational resource and time demand. In this method, a coarse-grained simulation of the full arterial domain is used to guide and constrain a fine-scale simulation of a smaller region of interest within the full domain. Results for a patient-specific carotid bifurcation model demonstrate that the two-stage approach can afford a large savings in both time for mesh generation and time and resources needed for computation. The effects of solid and fluid domain truncation were explored, and were shown to minimally affect accuracy of the stress fields predicted with the two-stage approach. PMID:19756798
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Shujia; Duffy, Daniel; Clune, Thomas
The call for ever-increasing model resolutions and physical processes in climate and weather models demands a continual increase in computing power. The IBM Cell processor's order-of-magnitude peak performance increase over conventional processors makes it very attractive for fulfilling this requirement. However, the Cell's characteristics, 256KB local memory per SPE and the new low-level communication mechanism, make it very challenging to port an application. As a trial, we selected the solar radiation component of the NASA GEOS-5 climate model, which: (1) is representative of column physics components (half the total computational time), (2) has an extremely high computational intensity: the ratio of computational load to main memory transfers, and (3) exhibits embarrassingly parallel column computations. In this paper, we converted the baseline code (single-precision Fortran) to C and ported it to an IBM BladeCenter QS20. For performance, we manually SIMDize four independent columns and include several unrolling optimizations. Our results show that when compared with the baseline implementation running on one core of Intel's Xeon Woodcrest, Dempsey, and Itanium2, the Cell is approximately 8.8x, 11.6x, and 12.8x faster, respectively. Our preliminary analysis shows that the Cell can also accelerate the dynamics component (~25% of total computational time). We believe these dramatic performance improvements make the Cell processor very competitive as an accelerator.
NASA Astrophysics Data System (ADS)
Jamie, Majid
2016-11-01
Singh and Mogi (2003) presented a forward modeling (FWD) program, coded in FORTRAN 77 and called "EMLCLLER", which computes the frequency-domain electromagnetic (EM) response of a large circular loop, in terms of the vertical magnetic component (Hz), over 1D layered earth models; computations in this program can be performed with variable transmitter-receiver configurations and incorporate both conduction and displacement currents. Integral equations in this program are evaluated through digital linear filters based on the Hankel transforms, together with analytic solutions based on hyper-geometric functions. Despite the capabilities of EMLCLLER, there are mistakes in this program that make its FWD results unreliable. The mistakes in EMLCLLER arise from using a wrong algorithm for computing the reflection coefficient of the EM wave in TE-mode (rTE), and from flawed algorithms for computing the phase and normalized phase values relating to Hz; in this paper the corrected forms of these computations are presented. Moreover, to illustrate how these mistakes affect FWD results, EMLCLLER and the corrected version presented in this paper, titled "EMLCLLER_Corr", are run on different two- and three-layered earth models; their FWD results, in terms of the real and imaginary parts of Hz, its normalized amplitude, and the corresponding normalized phase curves, are plotted versus frequency and compared to each other. In addition, Singh and Mogi (2003) also presented extra derivations for computing the radial component of the magnetic field (Hr) and the angular component of the electric field (Eϕ), where the numerical solution presented for Hr is incorrect; the correct numerical solution for this derivation is also presented in this paper.
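The TE-mode reflection coefficient rTE discussed above is conventionally obtained by a recursive calculation of an effective vertical wavenumber through the layer stack. The sketch below is a minimal quasi-static version of that standard recursion; it is not the EMLCLLER code, and the function name and interface are invented for illustration:

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum magnetic permeability (H/m)

def rte_layered(lam, omega, sigmas, thicknesses):
    """Quasi-static TE-mode reflection coefficient of a 1D layered earth.

    lam         : horizontal wavenumber (rad/m)
    omega       : angular frequency (rad/s)
    sigmas      : layer conductivities, top to bottom (S/m); last is the basement
    thicknesses : thicknesses of all layers except the basement half-space (m)
    """
    # vertical wavenumbers u_j = sqrt(lam^2 + i*omega*mu0*sigma_j)
    u = np.sqrt(lam**2 + 1j * omega * MU0 * np.asarray(sigmas, dtype=complex))
    # recurse upward from the basement half-space
    u_hat = u[-1]
    for j in range(len(thicknesses) - 1, -1, -1):
        t = np.tanh(u[j] * thicknesses[j])
        u_hat = u[j] * (u_hat + u[j] * t) / (u[j] + u_hat * t)
    # reflection coefficient at the air-earth interface (u = lam in air)
    return (lam - u_hat) / (lam + u_hat)
```

A useful sanity check is that a "two-layer" model with equal conductivities must reproduce the homogeneous half-space result exactly, since the internal interface then reflects nothing.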
2012-05-01
cloud computing 17 NASA Nebula Platform • Cloud computing pilot program at NASA Ames • Integrates open-source components into seamless, self...Mission support • Education and public outreach (NASA Nebula , 2010) 18 NSF Supported Cloud Research • Support for Cloud Computing in...Mell, P. & Grance, T. (2011). The NIST Definition of Cloud Computing. NIST Special Publication 800-145 • NASA Nebula (2010). Retrieved from
ERIC Educational Resources Information Center
Green, Kenneth C.
This report presents findings of a June 1998 survey of computing officials at 1,623 two- and four-year U.S. colleges and universities concerning the use of computer technology. The survey found that computing and information technology (IT) are now core components of the campus environment and classroom experience. However, key aspects of IT…
[Computer aided design and rapid manufacturing of removable partial denture frameworks].
Han, Jing; Lü, Pei-jun; Wang, Yong
2010-08-01
To introduce a method of digital modeling and fabricating removable partial denture (RPD) frameworks using self-developed software for RPD design and a rapid manufacturing system. The three-dimensional data of two partially dentate dental casts were obtained using a three-dimensional cross-section scanner. A self-developed software package for RPD design was used to decide the path of insertion and to design the different components of the RPD frameworks. The components included occlusal rest, clasp, lingual bar, polymeric retention framework and maxillary major connector. The design procedure for the components was as follows: first, determine the outline of the component; second, build the tissue surface of the component using the scanned data within the outline; third, use a preset cross section to produce the polished surface. Finally, the different RPD components were modeled respectively and connected by minor connectors to form an integrated RPD framework. The finished data were imported into a self-developed selective laser melting (SLM) machine and metal frameworks were fabricated directly. RPD frameworks for the two scanned dental casts were modeled with this self-developed program and metal RPD frameworks were successfully fabricated using the SLM method. The finished metal frameworks fit well on the plaster models. The self-developed computer aided design and computer aided manufacture (CAD-CAM) system for RPD design and fabrication has completely independent intellectual property rights. It provides a new method of manufacturing metal RPD frameworks.
ERIC Educational Resources Information Center
Stecher, Brian
A training program in computer education, tested in 89 secondary schools, focused on the use of computers as tools in all subject areas. Each school received enough computers and software from IBM to equip a full computer laboratory. The schools were organized into local networks in eight regions and received training and continuing support in these…
Johari, Masoumeh; Abdollahzadeh, Milad; Esmaeili, Farzad; Sakhamanesh, Vahideh
2018-01-01
Dental cone beam computed tomography (CBCT) images suffer from severe metal artifacts. These artifacts degrade the quality of the acquired image and in some cases make it unsuitable to use. Streaking artifacts and cavities around teeth are the main causes of degradation. In this article, we propose a new artifact reduction algorithm with three parallel components. The first component extracts teeth based on modeling the image histogram with a Gaussian mixture model. The streaking-artifact reduction component reduces artifacts by converting the image into the polar domain and applying morphological filtering. The third component fills cavities through a simple but effective morphological filtering operation. Finally, the results of these three components are combined in a fusion step to create a visually good image that is more compatible with the human visual system. Results show that the proposed algorithm reduces artifacts in dental CBCT images and produces clean images.
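The histogram-modeling step can be illustrated with a plain expectation-maximization fit of a one-dimensional Gaussian mixture to intensity samples. The sketch below is a generic NumPy version under the assumption of a fixed component count and deterministic quantile initialization; it is not the authors' implementation:

```python
import numpy as np

def fit_gmm_1d(x, k=2, iters=300):
    """Plain EM fit of a k-component 1D Gaussian mixture.
    Returns (weights, means, variances); means come out in increasing
    order when the modes are well separated."""
    x = np.asarray(x, dtype=float)
    # deterministic init: spread the initial means across sample quantiles
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)
    var = np.full(k, x.var() / k)
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        d2 = (x[:, None] - mu[None, :]) ** 2
        p = w * np.exp(-0.5 * d2 / var) / np.sqrt(2.0 * np.pi * var)
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances from responsibilities
        n = r.sum(axis=0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu[None, :]) ** 2).sum(axis=0) / n
    return w, mu, var
```

In a teeth-extraction setting, the fitted component with the highest mean would model the bright tooth/metal intensities, and a threshold between the two means separates them from background.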
Evaluation of runaway-electron effects on plasma-facing components for NET
NASA Astrophysics Data System (ADS)
Bolt, H.; Calén, H.
1991-03-01
Runaway electrons generated during disruptions can cause serious damage to plasma-facing components in a next-generation device like NET. A study was performed to quantify the response of NET plasma-facing components to runaway-electron impact. Monte Carlo computations were performed to determine the energy deposition in the component materials. Since the subsurface metal structures can be strongly heated under runaway-electron impact, damage threshold values for the thermal excursions were derived from the computed results. These damage thresholds depend strongly on the materials selection and the component design. For a carbon-molybdenum divertor with 10 and 20 mm carbon armour thickness and 1 degree electron incidence, the damage thresholds are 100 MJ/m² and 220 MJ/m², respectively. The thresholds for a carbon-copper divertor under the same conditions are about 50% lower. On the first wall, damage is anticipated for energy depositions above 180 MJ/m².
Low-Dimensional Models for Physiological Systems: Nonlinear Coupling of Gas and Liquid Flows
NASA Astrophysics Data System (ADS)
Staples, A. E.; Oran, E. S.; Boris, J. P.; Kailasanath, K.
2006-11-01
Current computational models of biological organisms focus on the details of a specific component of the organism. For example, very detailed models of the human heart, an aorta, a vein, or part of the respiratory or digestive system, are considered either independently from the rest of the body, or as interacting simply with other systems and components in the body. In actual biological organisms, these components and systems are strongly coupled and interact in complex, nonlinear ways leading to complicated global behavior. Here we describe a low-order computational model of two physiological systems, based loosely on a circulatory and respiratory system. Each system is represented as a one-dimensional fluid system with an interconnected series of mass sources, pumps, valves, and other network components, as appropriate, representing different physical organs and system components. Preliminary results from a first version of this model system are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Qishi; Zhu, Mengxia; Rao, Nageswara S
We propose an intelligent decision support system based on sensor and computer networks that incorporates various component techniques for sensor deployment, data routing, distributed computing, and information fusion. The integrated system is deployed in a distributed environment composed of both wireless sensor networks for data collection and wired computer networks for data processing in support of homeland security defense. We present the system framework, formulate the analytical problems, and develop approximate or exact solutions for the subtasks: (i) a sensor deployment strategy based on a two-dimensional genetic algorithm to achieve maximum coverage with cost constraints; (ii) a data routing scheme to achieve maximum signal strength with minimum path loss, high energy efficiency, and effective fault tolerance; (iii) a network mapping method to assign computing modules to network nodes for high-performance distributed data processing; and (iv) a binary decision fusion rule that derives threshold bounds to improve the system hit rate and false alarm rate. These component solutions are implemented and evaluated through either experiments or simulations in various application scenarios. The extensive results demonstrate that these component solutions imbue the integrated system with the desirable and useful quality of intelligence in decision making.
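The binary decision fusion in (iv) can be illustrated with the standard k-out-of-n counting rule: for independent detectors with a common per-sensor rate, the fused hit rate and false-alarm rate are binomial tail sums. This is a generic sketch of that textbook rule, not the authors' threshold-bound derivation:

```python
from math import comb

def fused_rate(p, n, k):
    """Probability that at least k of n independent detectors fire,
    given each fires with probability p. Applied to the per-sensor hit
    rate it gives the fused hit rate; applied to the per-sensor
    false-alarm rate it gives the fused false-alarm rate."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
```

With three sensors and a 2-of-3 vote, a per-sensor hit rate of 0.9 fuses to 0.972 while a per-sensor false-alarm rate of 0.1 fuses to 0.028, i.e. a majority vote improves both rates simultaneously.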
Viewing ISS Data in Real Time via the Internet
NASA Technical Reports Server (NTRS)
Myers, Gerry; Chamberlain, Jim
2004-01-01
EZStream is a computer program that enables authorized users at diverse terrestrial locations to view, in real time, data generated by scientific payloads aboard the International Space Station (ISS). The only computation/communication resource needed for use of EZStream is a computer equipped with standard Web-browser software and a connection to the Internet. EZStream runs in conjunction with the TReK software, described in a prior NASA Tech Briefs article, that coordinates multiple streams of data for the ground communication system of the ISS. EZStream includes server components that interact with TReK within the ISS ground communication system and client components that reside in the users' remote computers. Once an authorized client has logged in, a server component of EZStream pulls the requested data from a TReK application-program interface and sends the data to the client. Future EZStream enhancements will include (1) extensions that enable the server to receive and process arbitrary data streams on its own and (2) a Web-based graphical-user-interface-building subprogram that enables a client who lacks programming expertise to create customized display Web pages.
Computed Tomography Inspection and Analysis for Additive Manufacturing Components
NASA Technical Reports Server (NTRS)
Beshears, Ronald D.
2016-01-01
Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.
ERIC Educational Resources Information Center
Sayre, Scott Alan
The purpose of this study was to develop and validate a computer-based system that would allow interactive video developers to integrate and manage the design components prior to production. These components of an interactive video (IVD) program include visual information in a variety of formats, audio information, and instructional techniques,…
Symbolic Computation of Strongly Connected Components Using Saturation
NASA Technical Reports Server (NTRS)
Zhao, Yang; Ciardo, Gianfranco
2010-01-01
Finding strongly connected components (SCCs) in the state-space of discrete-state models is a critical task in formal verification of LTL and fair CTL properties, but the potentially huge number of reachable states and SCCs constitutes a formidable challenge. This paper is concerned with computing the sets of states in SCCs or terminal SCCs of asynchronous systems. Because of its advantages in many applications, we employ saturation on two previously proposed approaches: the Xie-Beerel algorithm and transitive closure. First, saturation speeds up state-space exploration when computing each SCC in the Xie-Beerel algorithm. Then, our main contribution is a novel algorithm to compute the transitive closure using saturation. Experimental results indicate that our improved algorithms achieve a clear speedup over previous algorithms in some cases. With the help of the new transitive closure computation algorithm, up to 10^150 SCCs can be explored within a few seconds.
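For contrast with the symbolic, saturation-based approach described above — which is precisely what makes 10^150 SCCs tractable — an explicit-state SCC computation enumerates every state and so scales only to small graphs. A minimal sketch of the classical Tarjan algorithm (names illustrative, not from the paper):

```python
def tarjan_scc(graph):
    """Explicit-state Tarjan SCC algorithm.
    graph: dict mapping each node to an iterable of successor nodes.
    Returns a list of SCCs, each a list of nodes."""
    index, low = {}, {}
    stack, on_stack = [], set()
    sccs = []
    counter = [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, ()):
            if w not in index:            # tree edge: recurse
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:           # back edge within current SCC
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:            # v is the root of an SCC
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            sccs.append(comp)

    for v in list(graph):
        if v not in index:
            strongconnect(v)
    return sccs
```

The recursion visits each node and edge once, so the cost is linear in the explicit graph size — exactly the cost the symbolic methods in the paper avoid paying per state.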
Computation material science of structural-phase transformation in casting aluminium alloys
NASA Astrophysics Data System (ADS)
Golod, V. M.; Dobosh, L. Yu
2017-04-01
Successive stages of computer simulation of the formation of the casting microstructure under non-equilibrium conditions of crystallization of multicomponent aluminum alloys are presented. On the basis of computer thermodynamics and heat transfer during solidification of macroscale shaped castings, the boundary conditions of local heat exchange are specified for mesoscale modeling of the non-equilibrium formation of the solid phase and of the component redistribution between phases during coalescence of secondary dendrite branches. Computer analysis of structural-phase transitions is based on the principle of the additive physico-chemical effect of the alloy components in the process of the diffusional-capillary morphological evolution of the dendrite structure and of the local dendrite heterogeneity, whose stochastic nature and extent are revealed by metallographic study and by modeling with the Monte Carlo method. The integrated computational materials science tools are focused on and implemented for analysis of the multiple-factor system of casting processes and prediction of the casting microstructure.
The NASA computer aided design and test system
NASA Technical Reports Server (NTRS)
Gould, J. M.; Juergensen, K.
1973-01-01
A family of computer programs facilitating the design, layout, evaluation, and testing of digital electronic circuitry is described. CADAT (computer aided design and test system) is intended for use by NASA and its contractors and is aimed predominantly at providing cost-effective microelectronic subsystems based on custom-designed metal oxide semiconductor (MOS) large scale integrated circuits (LSIC's). CADAT software can be easily adopted by installations with a wide variety of computer hardware configurations. Its structure permits ease of update to more powerful component programs and to newly emerging LSIC technologies. The components of the CADAT system are described, stressing the interaction of programs rather than details of coding or algorithms. The CADAT system provides computer aids to derive and document the design intent, includes powerful automatic layout software, permits detailed geometry checks and performance simulation based on mask data, and furnishes test pattern sequences for hardware testing.
SPREADSHEET-BASED PROGRAM FOR ERGONOMIC ADJUSTMENT OF NOTEBOOK COMPUTER AND WORKSTATION SETTINGS.
Nanthavanij, Suebsak; Prae-Arporn, Kanlayanee; Chanjirawittaya, Sorajak; Paripoonyo, Satirajit; Rodloy, Somsak
2015-06-01
This paper discusses a computer program, ErgoNBC, which provides suggestions regarding the ergonomic settings of a notebook computer (NBC), workstation components, and selected accessories in order to help computer users assume an appropriate work posture during NBC work. From the user's body height and NBC and workstation component data, ErgoNBC computes the recommended tilt angle of the NBC base unit, NBC screen angle, distance between the user and the NBC, seat height and work surface height. If necessary, an NBC base support, seat cushion and footrest, including their settings, are recommended. An experiment involving twenty-four university students was conducted to evaluate the recommendations provided by ErgoNBC. The Rapid Upper Limb Assessment (RULA) technique was used to analyze their work postures both before and after implementing ErgoNBC's recommendations. The results clearly showed that ErgoNBC could significantly help to improve the subjects' work postures.
Computational electronics and electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shang, C. C.
The Computational Electronics and Electromagnetics thrust area at Lawrence Livermore National Laboratory serves as the focal point for engineering R&D activities in developing computer-based design, analysis, and theory tools. Key representative applications include design of particle accelerator cells and beamline components; engineering analysis and design of high-power components; photonics and optoelectronics circuit design; EMI susceptibility analysis; and antenna synthesis. The FY-96 technology-base effort focused code development on (1) accelerator design codes; (2) 3-D massively parallel, object-oriented time-domain EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; (5) 3-D spectral-domain CEM tools; and (6) enhancement of laser drilling codes. Joint efforts with the Power Conversion Technologies thrust area include development of antenna systems for compact, high-performance radar, in addition to novel, compact Marx generators. 18 refs., 25 figs., 1 tab.
A composite computational model of liver glucose homeostasis. I. Building the composite model.
Hetherington, J; Sumner, T; Seymour, R M; Li, L; Rey, M Varela; Yamaji, S; Saffrey, P; Margoninski, O; Bogle, I D L; Finkelstein, A; Warner, A
2012-04-07
A computational model of the glucagon/insulin-driven liver glucohomeostasis function, focusing on the buffering of glucose into glycogen, has been developed. The model exemplifies an 'engineering' approach to modelling in systems biology, and was produced by linking together seven component models of separate aspects of the physiology. The component models use a variety of modelling paradigms and degrees of simplification. Model parameters were determined by an iterative hybrid of fitting to high-scale physiological data, and determination from small-scale in vitro experiments or molecular biological techniques. The component models were not originally designed for inclusion within such a composite model, but were integrated, with modification, using our published modelling software and computational frameworks. This approach facilitates the development of large and complex composite models, although, inevitably, some compromises must be made when composing the individual models. Composite models of this form have not previously been demonstrated.
Lifetime Reliability Evaluation of Structural Ceramic Parts with the CARES/LIFE Computer Program
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.
1993-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), Weibull's normal stress averaging method (NSA), or Batdorf's theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating cyclic fatigue parameter estimation and component reliability analysis with proof testing are included.
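The two-parameter Weibull strength characterization mentioned above has a compact closed form for the simplest case of a component under uniform uniaxial stress. The sketch below shows only that baseline relation (function names are illustrative; CARES/LIFE itself integrates over the stress field and component volume and supports the multiaxial theories listed):

```python
from math import exp

def weibull_survival(sigma, sigma0, m):
    """Probability that a component survives a uniform uniaxial stress
    sigma, given characteristic strength sigma0 and Weibull modulus m."""
    return exp(-(sigma / sigma0) ** m)

def weibull_failure(sigma, sigma0, m):
    """Cumulative failure probability, the complement of survival."""
    return 1.0 - weibull_survival(sigma, sigma0, m)
```

At sigma = sigma0 the failure probability is 1 - 1/e ≈ 63.2% by definition of the characteristic strength, and a high modulus m makes the transition from near-certain survival to near-certain failure very sharp.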
Advanced Placement Computer Science (with Pascal). Teacher's Guide. Volume 1. Second Edition.
ERIC Educational Resources Information Center
Farkouh, Alice; And Others
The purpose of this guide is to give teachers and supervisors a working knowledge of various approaches to enhancing pupil learning about computer science, particularly through the use of Pascal. It contains instructional units dealing with: (1) computer components; (2) computer languages; (3) compilers; (4) essential features of a Pascal program;…
ERIC Educational Resources Information Center
Hsu, Ching-Kun; Hwang, Gwo-Jen
2014-01-01
Personal computer assembly courses have been recognized as being essential in helping students understand computer structure as well as the functionality of each computer component. In this study, a context-aware ubiquitous learning approach is proposed for providing instant assistance to individual students in the learning activity of a…
ERIC Educational Resources Information Center
Dashtestani, Reza
2014-01-01
Computer literacy is a significant component of language teachers' computer-assisted language learning (CALL) knowledge. Despite its importance, limited research has been undertaken to analyze factors which might influence language teachers' computer literacy levels. This qualitative study explored the perspectives of 39 Iranian EFL teacher…
The CPU and You: Mastering the Microcomputer.
ERIC Educational Resources Information Center
Kansky, Robert
1983-01-01
Computers are both understandable and controllable. Educators need some understanding of a computer's cognitive profile, component parts, and systematic nature in order to set it to work on some of the teaching tasks that need to be done. Much computer-related vocabulary is discussed. (MP)
47 CFR 64.702 - Furnishing of enhanced services and customer-premises equipment.
Code of Federal Regulations, 2012 CFR
2012-10-01
... separate operating, marketing, installation, and maintenance personnel, and utilize separate computer... available to the separate corporation any capacity or computer system component on its computer system or... Enhanced Services and Customer-Premises Equipment by Bell Operating Companies; Telephone Operator Services...
45 CFR 310.1 - What definitions apply to this part?
Code of Federal Regulations, 2010 CFR
2010-10-01
... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...
Computer Training for Staff and Patrons.
ERIC Educational Resources Information Center
Krissoff, Alan; Konrad, Lee
1998-01-01
Describes a pilot computer training program for library staff and patrons at the University of Wisconsin-Madison. Reviews components of effective training programs and highlights core computer competencies: operating systems, hardware and software basics and troubleshooting, and search concepts and techniques. Includes an instructional outline and…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-26
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-745] Certain Wireless Communication Devices, Portable Music and Data Processing Devices, Computers and Components Thereof; Commission Decision... importation of certain wireless communication devices, portable music and data processing devices, computers...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-25
... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-745] Certain Wireless Communication Devices, Portable Music and Data Processing Devices, Computers and Components Thereof; Commission Decision... importation of certain wireless communication devices, portable music and data processing devices, computers...
SINFAC - SYSTEMS IMPROVED NUMERICAL FLUIDS ANALYSIS CODE
NASA Technical Reports Server (NTRS)
Costello, F. A.
1994-01-01
The Systems Improved Numerical Fluids Analysis Code, SINFAC, consists of additional routines added to the April 1983 revision of SINDA, a general thermal analyzer program. The purpose of the additional routines is to allow for the modeling of active heat transfer loops. The modeler can simulate the steady-state and pseudo-transient operations of 16 different heat transfer loop components including radiators, evaporators, condensers, mechanical pumps, reservoirs and many types of valves and fittings. In addition, the program contains a property analysis routine that can be used to compute the thermodynamic properties of 20 different refrigerants. SINFAC can simulate the response to transient boundary conditions. SINFAC was first developed as a method for computing the steady-state performance of two-phase systems. It was then modified using CNFRWD, SINDA's explicit time-integration scheme, to accommodate transient thermal models. However, SINFAC cannot simulate pressure drops due to time-dependent fluid acceleration, transient boil-out, or transient fill-up, except in the accumulator. SINFAC also requires the user to be familiar with SINDA. The solution procedure used by SINFAC is similar to that which an engineer would use to solve a system manually. The solution to a system requires the determination of all of the outlet conditions of each component, such as the flow rate, pressure, and enthalpy. To obtain these values, the user first estimates the inlet conditions to the first component of the system, then computes the outlet conditions from the data supplied by the manufacturer of the first component. The user then estimates the temperature at the outlet of the second component and computes the corresponding flow resistance of the second component. With the flow resistance of the second component, the user computes the conditions downstream, namely the inlet conditions of the third. The computations follow for the rest of the system, back to the first component.
On the first pass, the user finds that the calculated outlet conditions of the last component do not match the estimated inlet conditions of the first. The user then modifies the estimated inlet conditions of the first component in an attempt to match the calculated values. The user-estimated values are called State Variables. The differences between the user-estimated values and calculated values are called the Error Variables. The procedure systematically changes the State Variables until all of the Error Variables are less than the user-specified iteration limits. The solution procedure is referred to as SCX; the X denotes experimental. SCX computes each new set of State Variables in two phases, a Systems phase and a Controller phase. In the Systems phase, SCX fixes the controller positions and modifies the other State Variables by the Newton-Raphson method. Once the Newton-Raphson method has solved the problem for the fixed controller positions, SCX calculates new controller positions in the Controller phase, based on Newton's method, treating each sensor-controller pair independently but allowing all to change in one iteration. SINFAC is available by license for a period of ten (10) years to approved licensees. The licensed program product includes the source code for the additional routines to SINDA, the SINDA object code, command procedures, sample data and supporting documentation. Additional documentation may be purchased at the price below. SINFAC was created for use on a DEC VAX under VMS. Source code is written in FORTRAN 77, requires 180k of memory, and should be fully transportable. The program was developed in 1988.
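The State Variable / Error Variable iteration described above is, at its core, a Newton-Raphson root-finding loop: adjust the estimated inlet conditions until the calculated outlet conditions match. A minimal sketch of that idea follows; the `residual` function is a toy stand-in for the march through the loop components, not SINFAC's actual system model.

```python
import numpy as np

def solve_state_variables(residual, x0, tol=1e-8, max_iter=50, eps=1e-6):
    """Drive the Error Variables (residuals) below tol by adjusting the
    State Variables x with Newton-Raphson, using a finite-difference Jacobian."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)                      # Error Variables for current guess
        if np.max(np.abs(r)) < tol:
            return x
        # Finite-difference Jacobian d r_i / d x_j
        J = np.empty((len(r), len(x)))
        for j in range(len(x)):
            dx = np.zeros_like(x)
            dx[j] = eps
            J[:, j] = (residual(x + dx) - r) / eps
        x = x - np.linalg.solve(J, r)        # Newton step
    raise RuntimeError("did not converge")

# Toy "loop": the calculated outlet of the last component must match the
# estimated inlet of the first; residual = g(x) - x for a loop map g.
def residual(x):
    g = np.array([0.5 * x[0] + 0.1 * x[1] + 1.0,
                  0.2 * x[0] + 0.4 * x[1] + 2.0])
    return g - x

x_star = solve_state_variables(residual, x0=[0.0, 0.0])
```

For this affine toy loop a single Newton step closes the loop; a real two-phase system would need the full iteration.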
Cool Spot and Flare Activities of a RS CVn Binary KIC 7885570
NASA Astrophysics Data System (ADS)
Kunt, M.; Dal, H. A.
2017-12-01
We present here the results of our studies on the physical nature and chromospheric activity of an RS CVn binary, KIC 7885570, based on the Kepler Mission data. Assuming a primary component temperature of 6530 K, the temperature of the secondary component was found to be 5732±4 K. The mass ratio of the components (q) was found to be 0.43±0.01, while the inclination (i) of the system was 80.6°±0.1°. Additionally, the data were separated into 35 subsets to model the sinusoidal variation due to rotational modulation, using the SpotModel program, as the light curve analysis indicated that the secondary component is chromospherically active. It was found that there are generally two spotted areas, whose radii, longitudes and latitudes are rapidly changing, located around the latitudes of +50° and +90° on the active component. Moreover, 113 flares were detected and their parameters were computed from the available data. The One Phase Exponential Association function model was derived from the parameters of these flares. Using the regression calculations, the Plateau value was found to be 1.9815±0.1177, while the half-life value was computed as 3977.2 s. In addition, the flare frequency N1, the number of flares per hour, was estimated to be 0.00362 h-1, while the flare frequency N2, the flare-equivalent duration emitted per hour, was computed as 0.00001. Finally, the times of eclipses were computed for 278 minima of the light curves, whose analysis indicated that the chromospheric activity of the system affects these minima times. Comparing the chromospheric activity patterns with those of analogues of the secondary component, the magnetic activity level is seen to be remarkably low. However, it is still at the expected level for the secondary component's B-V color index of 0.643 mag.
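The One Phase Exponential Association model named in the abstract is commonly written as y(t) = Plateau·(1 − e^(−Kt)), with half-life ln 2 / K. Assuming that standard parameterization (the abstract does not spell it out), the quoted Plateau and half-life values imply the following sketch:

```python
import math

PLATEAU = 1.9815                 # from the regression quoted in the abstract
HALF_LIFE_S = 3977.2             # half-life in seconds, also from the abstract
K = math.log(2) / HALF_LIFE_S    # rate constant implied by the half-life

def opea(t_seconds):
    """One Phase Exponential Association: rises from 0 toward PLATEAU."""
    return PLATEAU * (1.0 - math.exp(-K * t_seconds))

# By construction, at one half-life the curve reaches half the plateau value.
y_half = opea(HALF_LIFE_S)
```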
Multidimensional computer simulation of Stirling cycle engines
NASA Technical Reports Server (NTRS)
Hall, C. A.; Porsching, T. A.; Medley, J.; Tew, R. C.
1990-01-01
The computer code ALGAE (algorithms for the gas equations) treats incompressible, thermally expandable, or locally compressible flows in complicated two-dimensional flow regions. The solution method, finite differencing schemes, and basic modeling of the field equations in ALGAE are applicable to engineering design settings of the type found in Stirling cycle engines. The use of ALGAE to model multiple components of the space power research engine (SPRE) is reported. Videotape computer simulations of the transient behavior of the working gas (helium) in the heater-regenerator-cooler complex of the SPRE demonstrate the usefulness of such a program in providing information on thermal and hydraulic phenomena in multiple component sections of the SPRE.
Computer Assisted Navigation in Knee Arthroplasty
Bae, Dae Kyung
2011-01-01
Computer assisted surgery (CAS) was used to improve the positioning of implants during total knee arthroplasty (TKA). Most studies have reported that computer assisted navigation reduced the outliers of alignment and component malpositioning. However, additional sophisticated studies are necessary to determine if the improvement of alignment will improve long-term clinical results and increase the survival rate of the implant. Knowledge of CAS-TKA technology and understanding the advantages and limitations of navigation are crucial to the successful application of the CAS technique in TKA. In this article, we review the components of navigation, classification of the system, surgical method, potential error, clinical results, advantages, and disadvantages. PMID:22162787
Euler Flow Computations on Non-Matching Unstructured Meshes
NASA Technical Reports Server (NTRS)
Gumaste, Udayan
1999-01-01
Advanced fluid solvers for predicting aerodynamic performance, with coupled treatment of multiple fields, are described. The interaction between the fluid and structural components in the bladed regions of the engine is investigated with respect to known blade failures caused by either flutter or forced vibrations. Methods are developed to describe aeroelastic phenomena for internal flows in turbomachinery by accounting for the increased geometric complexity, mutual interaction between adjacent structural components, and presence of thermal and geometric loading. The computer code developed solves the full three-dimensional aeroelastic problem of a stage. The results obtained show that flow computations can be performed on non-matching finite-volume unstructured meshes with second-order spatial accuracy.
NASA Tech Briefs, June 1996. Volume 20, No. 6
NASA Technical Reports Server (NTRS)
1996-01-01
Topics: New Computer Hardware; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.
Visual Computing Environment Workshop
NASA Technical Reports Server (NTRS)
Lawrence, Charles (Compiler)
1998-01-01
The Visual Computing Environment (VCE) is a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis.
Raster-Based Approach to Solar Pressure Modeling
NASA Technical Reports Server (NTRS)
Wright, Theodore W. II
2013-01-01
An algorithm has been developed to take advantage of the graphics processing hardware in modern computers to efficiently compute high-fidelity solar pressure forces and torques on spacecraft, taking into account the possibility of self-shading due to the articulation of spacecraft components such as solar arrays. The process is easily extended to compute other results that depend on three-dimensional attitude analysis, such as solar array power generation or free molecular flow drag. The impact of photons upon a spacecraft introduces small forces and moments. The magnitude and direction of the forces depend on the material properties of the spacecraft components being illuminated. The parts of the components being lit depend on the orientation of the craft with respect to the Sun, as well as the gimbal angles for any significant moving external parts (solar arrays, typically). Some components may shield others from the Sun. The purpose of this innovation is to enable high-fidelity computation of solar pressure and power generation effects on illuminated portions of spacecraft, taking self-shading from spacecraft attitude and movable components into account. The key idea in this innovation is to compute results dependent upon complicated geometry by using an image to break the problem into thousands or millions of sub-problems with simple geometry; the results from the simpler problems are then combined to give high-fidelity results for the full geometry. This process is performed by constructing a 3D model of a spacecraft using an appropriate graphics API (OpenGL), and running that model on a modern computer's 3D-accelerated video processor. This quickly and accurately generates a view of the model (as shown on a computer screen) that takes rotation and articulation of spacecraft components into account. When this view is interpreted as the spacecraft as seen by the Sun, then only the portions of the craft visible in the view are illuminated.
The view as shown on the computer screen is composed of up to millions of pixels. Each of those pixels is associated with a small illuminated area of the spacecraft. For each pixel, it is possible to compute its position, angle (surface normal) from the view direction, and the spacecraft material (and therefore, optical coefficients) associated with that area. With this information, the area associated with each pixel can be modeled as a simple flat plate for calculating solar pressure. The vector sum of these individual flat plate models is a high-fidelity approximation of the solar pressure forces and torques on the whole vehicle. In addition to using optical coefficients associated with each spacecraft material to calculate solar pressure, a power generation coefficient is added for computing solar array power generation from the sum of the illuminated areas. Similarly, other area-based calculations, such as free molecular flow drag, are also enabled. Because the model rendering is separated from other calculations, it is relatively easy to add a new model to explore a new vehicle or mission configuration. Adding a new model is performed by adding OpenGL code, but a future version might read a mesh file exported from a computer-aided design (CAD) system to enable very rapid turnaround for new designs.
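The per-pixel flat-plate summation can be sketched as follows. The optical model (an absorbed term along the Sun line plus a specular-reflection term along the surface normal) is a common simplification, and all arrays stand in for data the rendering pass would produce, not for the innovation's actual code.

```python
import numpy as np

def solar_pressure_force(normals, areas, absorptivity, reflectivity, sun_dir,
                         flux=1361.0, c=2.998e8):
    """Vector sum of flat-plate solar pressure forces over illuminated pixels.

    normals      : (N, 3) unit surface normals for the pixel areas
    areas        : (N,)   illuminated area per pixel [m^2]
    absorptivity : (N,)   fraction of photons absorbed (hypothetical materials)
    reflectivity : (N,)   fraction specularly reflected
    sun_dir      : (3,)   unit vector from spacecraft toward the Sun
    """
    p = flux / c                          # solar radiation pressure [N/m^2]
    cos_t = normals @ sun_dir             # illumination angle per pixel
    cos_t = np.clip(cos_t, 0.0, None)     # back-facing pixels feel no force
    # Absorbed photons push along -sun_dir; specularly reflected photons
    # transfer twice their normal momentum, pushing along -normal.
    f_abs = -p * (absorptivity * areas * cos_t)[:, None] * sun_dir
    f_ref = -2.0 * p * (reflectivity * areas * cos_t**2)[:, None] * normals
    return (f_abs + f_ref).sum(axis=0)    # total force vector [N]

# One fully absorbing 1 m^2 pixel facing the Sun head-on.
F = solar_pressure_force(normals=np.array([[0.0, 0.0, 1.0]]),
                         areas=np.array([1.0]),
                         absorptivity=np.array([1.0]),
                         reflectivity=np.array([0.0]),
                         sun_dir=np.array([0.0, 0.0, 1.0]))
```

A power generation coefficient per material could be summed the same way over `areas * cos_t`.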
Component extraction on CT volumes of assembled products using geometric template matching
NASA Astrophysics Data System (ADS)
Muramatsu, Katsutoshi; Ohtake, Yutaka; Suzuki, Hiromasa; Nagai, Yukie
2017-03-01
As a method of non-destructive internal inspection, X-ray computed tomography (CT) is used not only in medical applications but also for product inspection. Some assembled products can be divided into separate components based on density, which is known to be approximately proportional to CT values. However, components whose densities are similar cannot be distinguished using the CT-value-driven approach. In this study, we propose a new algorithm that extracts components from the CT volume by matching a surface-mesh template of the component against the set of voxels with a given CT value, rather than relying on density. The method has two main stages: rough matching and fine matching. At the rough matching stage, the positions of candidate targets are identified roughly within the CT volume, using the template of the target component. At the fine matching stage, these candidates are precisely matched against the template, allowing the correct positions of the components to be detected in the CT volume. The results of two computational experiments showed that the proposed algorithm is able to extract components of similar density within assembled products on CT volumes.
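The rough/fine two-stage search can be illustrated on a 2-D array standing in for a CT slice: a coarse grid scan proposes a candidate position, then an exhaustive local search refines it. The sum-of-squared-differences score, the stride, and the planted target below are illustrative assumptions, not the paper's method.

```python
import numpy as np

def match_template(volume, template, coarse_stride=4):
    """Two-stage template search: rough pass on a coarse grid of positions,
    fine pass in a small neighborhood around the best rough candidate."""
    th, tw = template.shape

    def score(y, x):
        patch = volume[y:y + th, x:x + tw]
        return -np.sum((patch - template) ** 2)   # higher is better

    # Rough matching: evaluate only every coarse_stride-th position.
    ys = range(0, volume.shape[0] - th + 1, coarse_stride)
    xs = range(0, volume.shape[1] - tw + 1, coarse_stride)
    y0, x0 = max(((y, x) for y in ys for x in xs), key=lambda p: score(*p))

    # Fine matching: exhaustive search near the rough candidate.
    y_lo, y_hi = max(0, y0 - coarse_stride), min(volume.shape[0] - th, y0 + coarse_stride)
    x_lo, x_hi = max(0, x0 - coarse_stride), min(volume.shape[1] - tw, x0 + coarse_stride)
    return max(((y, x)
                for y in range(y_lo, y_hi + 1)
                for x in range(x_lo, x_hi + 1)),
               key=lambda p: score(*p))

rng = np.random.default_rng(1)
vol = rng.random((40, 40))              # stand-in for one CT slice
tmpl = vol[16:24, 8:16].copy()          # plant an exact-match target at (16, 8)
pos = match_template(vol, tmpl)
```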
Application-Program-Installer Builder
NASA Technical Reports Server (NTRS)
Wolgast, Paul; Demore, Martha; Lowik, Paul
2007-01-01
A computer program builds application programming interfaces (APIs) and related software components for installing and uninstalling application programs in any of a variety of computers and operating systems that support the Java programming language in its binary form. This program is partly similar in function to commercial (e.g., Install-Shield) software. This program is intended to enable satisfaction of a quasi-industry-standard set of requirements for a set of APIs that would enable such installation and uninstallation and that would avoid the pitfalls that are commonly encountered during installation of software. The requirements include the following: 1) Properly detecting prerequisites to an application program before performing the installation; 2) Properly registering component requirements; 3) Correctly measuring the required hard-disk space, including accounting for prerequisite components that have already been installed; and 4) Correctly uninstalling an application program. Correct uninstallation includes (1) detecting whether any component of the program to be removed is required by another program, (2) not removing that component, and (3) deleting references to requirements of the to-be-removed program for components of other programs so that those components can be properly removed at a later time.
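The uninstallation bookkeeping described in requirement 4, skip any component another program still requires and clear the removed program's own requirement records, can be sketched with a simple reference map (all program and component names here are hypothetical):

```python
def uninstall(program, installed_programs, requirements):
    """Remove `program`, deleting only components no other program requires.

    installed_programs : set of installed program names
    requirements       : dict mapping program -> set of required components
    Returns the set of components actually deleted.
    """
    installed_programs.discard(program)
    mine = requirements.pop(program, set())      # clear this program's records
    still_needed = set().union(*requirements.values()) if requirements else set()
    return mine - still_needed                   # keep components others require

programs = {"editor", "viewer"}
reqs = {"editor": {"runtime", "spellcheck"},
        "viewer": {"runtime"}}
gone = uninstall("editor", programs, reqs)
# "runtime" survives because "viewer" still requires it; once "viewer" is
# removed later, "runtime" will no longer appear in any requirement set.
```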
Computer Program for the Design and Off-Design Performance of Turbojet and Turbofan Engine Cycles
NASA Technical Reports Server (NTRS)
Morris, S. J.
1978-01-01
The rapid computer program is designed to be run in a stand-alone mode or operated within a larger program. The computation is based on a simplified one-dimensional gas turbine cycle. Each component in the engine is modeled thermodynamically. The component efficiencies used in the thermodynamic modeling are scaled for off-design conditions from input design-point values, using empirical trends included in the computer code. The engine cycle program is capable of producing reasonable engine performance predictions with a minimum of computer execution time. The current computer execution time on the IBM 360/67 for one Mach number, one altitude, and one power setting is about 0.1 seconds. The principal assumption used in the calculation is that the compressor is operated along a line of maximum adiabatic efficiency on the compressor map. The fluid properties are computed for the combustion mixture, but dissociation is not included. The procedure included in the program handles only the combustion of JP-4, methane, or hydrogen.
Microcomputers and the Future.
ERIC Educational Resources Information Center
Uhlig, George E.
Dangers are inherent in predicting the future. In discussing the future of computers, specifically, it is useful to consider the brief history of computers from the development of ENIAC to microcomputers. Advances in computer technology can be seen by looking at changes in individual components, including internal and external memory, the…
Developing Crash-Resistant Electronic Services.
ERIC Educational Resources Information Center
Almquist, Arne J.
1997-01-01
Libraries' dependence on computers can lead to frustrations for patrons and staff during downtime caused by computer system failures. Advice for reducing the number of crashes is provided, focusing on improved training for systems staff, better management of library systems, and the development of computer systems using quality components which…
Comparing Server Energy Use and Efficiency Using Small Sample Sizes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coles, Henry C.; Qin, Yong; Price, Phillip N.
This report documents a demonstration that compared the energy consumption and efficiency of a limited sample size of server-type IT equipment from different manufacturers by measuring power at the server power supply power cords. The results are specific to the equipment and methods used. However, it is hoped that those responsible for IT equipment selection can use the methods described to choose models that optimize energy use efficiency. The demonstration was conducted in a data center at Lawrence Berkeley National Laboratory in Berkeley, California. It was performed with five servers of similar mechanical and electronic specifications: three from Intel and one each from Dell and Supermicro. Server IT equipment is constructed using commodity components, server manufacturer-designed assemblies, and control systems. Server compute efficiency is constrained by the commodity component specifications and integration requirements. The design freedom, outside of the commodity component constraints, provides room for the manufacturer to offer a product with competitive efficiency that meets market needs at a compelling price. A goal of the demonstration was to compare and quantify the server efficiency for three different brands. The efficiency is defined as the average compute rate (computations per unit of time) divided by the average energy consumption rate. The research team used an industry standard benchmark software package to provide a repeatable software load to obtain the compute rate and provide a variety of power consumption levels. Energy use when the servers were in an idle state (not providing computing work) was also measured. At high server compute loads, all brands, using the same key components (processors and memory), had similar results; therefore, from these results, it could not be concluded that one brand is more efficient than the others.
The test results show that the power consumption variability caused by the key components as a group is similar to that of all other components as a group. However, some differences were observed. The Supermicro server used 27 percent more power at idle compared to the other brands. The Intel server had a power supply control feature called cold redundancy, and the data suggest that cold redundancy can provide energy savings at low power levels. Test and evaluation methods that might be used by others having limited resources for IT equipment evaluation are explained in the report.
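The efficiency metric defined in the report, average compute rate divided by average power draw, reduces to computations per joule. A sketch with invented benchmark numbers (no figures here come from the actual demonstration):

```python
def server_efficiency(total_computations, elapsed_s, avg_power_w):
    """Efficiency = average compute rate / average power draw.
    Units: (computations / s) / W = computations per joule."""
    compute_rate = total_computations / elapsed_s
    return compute_rate / avg_power_w

# Hypothetical one-hour benchmark runs for two servers doing the same work:
# the server drawing less average power is the more efficient one.
eff_a = server_efficiency(total_computations=9.0e12, elapsed_s=3600, avg_power_w=250.0)
eff_b = server_efficiency(total_computations=9.0e12, elapsed_s=3600, avg_power_w=325.0)
```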
Ho, Hsiang; Milenković, Tijana; Memisević, Vesna; Aruri, Jayavani; Przulj, Natasa; Ganesan, Anand K
2010-06-15
RNA-mediated interference (RNAi)-based functional genomics is a systems-level approach to identify novel genes that control biological phenotypes. Existing computational approaches can identify individual genes from RNAi datasets that regulate a given biological process. However, currently available methods cannot identify which RNAi screen "hits" are novel components of well-characterized biological pathways known to regulate the interrogated phenotype. In this study, we describe a method to identify genes from RNAi datasets that are novel components of known biological pathways. We experimentally validate our approach in the context of a recently completed RNAi screen to identify novel regulators of melanogenesis. In this study, we utilize a PPI network topology-based approach to identify targets within our RNAi dataset that may be components of known melanogenesis regulatory pathways. Our computational approach identifies a set of screen targets that cluster topologically in a human PPI network with the known pigment regulator Endothelin receptor type B (EDNRB). Validation studies reveal that these genes impact pigment production and EDNRB signaling in pigmented melanoma cells (MNT-1) and normal melanocytes. We present an approach that identifies novel components of well-characterized biological pathways from functional genomics datasets that could not have been identified by existing statistical and computational approaches.
Learning by statistical cooperation of self-interested neuron-like computing elements.
Barto, A G
1985-01-01
Since the usual approaches to cooperative computation in networks of neuron-like computing elements do not assume that network components have any "preferences", they do not make substantive contact with game theoretic concepts, despite their use of some of the same terminology. In the approach presented here, however, each network component, or adaptive element, is a self-interested agent that prefers some inputs over others and "works" toward obtaining the most highly preferred inputs. Here we describe an adaptive element that is robust enough to learn to cooperate with other elements like itself in order to further its self-interests. It is argued that some of the longstanding problems concerning adaptation and learning by networks might be solvable by this form of cooperativity, and computer simulation experiments are described that show how networks of self-interested components that are sufficiently robust can solve rather difficult learning problems. We then place the approach in its proper historical and theoretical perspective through comparison with a number of related algorithms. A secondary aim of this article is to suggest that beyond what is explicitly illustrated here, there is a wealth of ideas from game theory and allied disciplines such as mathematical economics that can be of use in thinking about cooperative computation in both nervous systems and man-made systems.
NASA Tech Briefs, March 1996. Volume 20, No. 3
NASA Technical Reports Server (NTRS)
1996-01-01
Topics: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.
NASA Tech Briefs, August 1993. Volume 17, No. 8
NASA Technical Reports Server (NTRS)
1993-01-01
Topics include: Computer Graphics; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences; Books and Reports.
NASA Tech Briefs, September 1999. Volume 23, No. 9
NASA Technical Reports Server (NTRS)
1999-01-01
Topics discussed include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences;
NASA Tech Briefs, March 1993. Volume 17, No. 3
NASA Technical Reports Server (NTRS)
1993-01-01
Topics include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences;
NASA Technical Reports Server (NTRS)
Levy, Lionel L., Jr.; Yoshikawa, Kenneth K.
1959-01-01
A method based on linearized and slender-body theories, which is easily adapted to electronic-machine computing equipment, is developed for calculating the zero-lift wave drag of single- and multiple-component configurations from a knowledge of the second derivative of the area distribution of a series of equivalent bodies of revolution. The accuracy and computational time required of the method to calculate zero-lift wave drag is evaluated relative to another numerical method which employs the Tchebichef form of harmonic analysis of the area distribution of a series of equivalent bodies of revolution. The results of the evaluation indicate that the total zero-lift wave drag of a multiple-component configuration can generally be calculated most accurately as the sum of the zero-lift wave drag of each component alone plus the zero-lift interference wave drag between all pairs of components. The accuracy and computational time required of both methods to calculate total zero-lift wave drag at supersonic Mach numbers is comparable for airplane-type configurations. For systems of bodies of revolution both methods yield similar results with comparable accuracy; however, the present method only requires up to 60 percent of the computing time required of the harmonic-analysis method for two bodies of revolution and less time for a larger number of bodies.
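The recommended decomposition, each component's zero-lift wave drag alone plus the interference wave drag between all pairs of components, is a double sum. A sketch with hypothetical drag coefficients (the values and configuration below are invented for illustration):

```python
from itertools import combinations

def total_wave_drag(component_drag, interference_drag):
    """Total zero-lift wave drag = sum of each component's drag alone
    plus the interference drag between every pair of components.

    component_drag    : dict mapping component name -> drag of it alone
    interference_drag : dict mapping frozenset({a, b}) -> pair interference
    """
    total = sum(component_drag.values())
    for a, b in combinations(component_drag, 2):
        total += interference_drag.get(frozenset((a, b)), 0.0)
    return total

# Hypothetical wing-body-nacelle configuration; pairs with negligible
# interference are simply omitted from the interference table.
alone = {"body": 0.0040, "wing": 0.0025, "nacelle": 0.0010}
pairs = {frozenset(("body", "wing")): 0.0006,
         frozenset(("body", "nacelle")): 0.0002}
cd_total = total_wave_drag(alone, pairs)
```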
Towards early software reliability prediction for computer forensic tools (case study).
Abu Talib, Manar
2016-01-01
Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete-time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
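The described mapping of tool components to a discrete-time Markov chain can be sketched as follows. The states and transition probabilities are invented for illustration; a real assessment would derive them from the tool's architecture and COSMIC-FFP measurements.

```python
import numpy as np

# States: 0 = acquire, 1 = parse, 2 = success (absorbing), 3 = failure (absorbing).
# Each row holds the transition probabilities out of a state; the mass sent to
# state 3 models the corresponding component failing during that step.
P = np.array([
    [0.00, 0.98, 0.00, 0.02],   # acquire -> parse, or fail
    [0.00, 0.00, 0.97, 0.03],   # parse   -> success, or fail
    [0.00, 0.00, 1.00, 0.00],   # success is absorbing
    [0.00, 0.00, 0.00, 1.00],   # failure is absorbing
])

def reliability(P, start=0, success=2, steps=50):
    """Probability of eventually absorbing in the success state, approximated
    by propagating the state distribution for enough steps."""
    dist = np.zeros(P.shape[0])
    dist[start] = 1.0
    for _ in range(steps):
        dist = dist @ P
    return dist[success]

r = reliability(P)   # for this serial chain, simply 0.98 * 0.97
```

Changing the topology (e.g., adding a retry edge from failure-prone states) changes `P` and thus the predicted tool reliability, which is how alternative designs would be compared.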
Wavelet decomposition based principal component analysis for face recognition using MATLAB
NASA Astrophysics Data System (ADS)
Sharma, Mahesh Kumar; Sharma, Shashikant; Leeprechanon, Nopbhorn; Ranjan, Aashish
2016-03-01
For the realization of face recognition systems, in the static as well as the real-time frame, algorithms such as principal component analysis, independent component analysis, linear discriminant analysis, neural networks and genetic algorithms have been used for decades. This paper discusses a wavelet decomposition based principal component analysis approach to face recognition. Principal component analysis is chosen over other algorithms due to its relative simplicity, efficiency, and robustness. Face recognition means identifying a person from his facial gestures, and it resembles factor analysis in some sense, i.e. the extraction of the principal components of an image. Principal component analysis is subject to some drawbacks, mainly poor discriminatory power and, in particular, the large computational load of finding eigenvectors. These drawbacks can be greatly reduced by combining wavelet transform decomposition for feature extraction with principal component analysis for pattern representation and classification, analyzing the facial gestures in the space and time domains. From the experimental results, it is envisaged that this face recognition method achieves a significant improvement in recognition rate as well as better computational efficiency.
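A minimal sketch of such a pipeline: a one-level Haar decomposition keeps each image's low-frequency approximation (LL) band, shrinking the vectors PCA must handle, and eigen-decomposition of the feature covariance gives the principal components. Random data stands in for a face database, and the Haar step is a hand-rolled 2x2 block average rather than a wavelet library call.

```python
import numpy as np

def haar_approx(img):
    """One-level 2-D Haar decomposition, keeping only the LL (approximation)
    band: average each non-overlapping 2x2 block, halving each dimension."""
    h, w = img.shape
    img = img[:h - h % 2, :w - w % 2]
    return 0.25 * (img[0::2, 0::2] + img[0::2, 1::2] +
                   img[1::2, 0::2] + img[1::2, 1::2])

def pca_fit(X, n_components):
    """PCA on row vectors X: returns (mean, components), where components are
    the top eigenvectors of the sample covariance matrix."""
    mean = X.mean(axis=0)
    Xc = X - mean
    cov = Xc.T @ Xc / (len(X) - 1)
    vals, vecs = np.linalg.eigh(cov)            # eigenvalues ascending
    return mean, vecs[:, ::-1][:, :n_components]  # keep the largest ones

rng = np.random.default_rng(0)
faces = rng.random((20, 16, 16))                # stand-in for 20 face images
features = np.stack([haar_approx(f).ravel() for f in faces])   # 20 x 64
mean, comps = pca_fit(features, n_components=5)
projected = (features - mean) @ comps           # 5-D signature per face
```

Recognition would then compare a probe image's projection against the stored signatures, e.g. by nearest neighbor.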
NASA Tech Briefs, August 1997. Volume 21, No. 8
NASA Technical Reports Server (NTRS)
1997-01-01
Topics:Graphics and Simulation; Mechanical Components; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.
Grid Computing in K-12 Schools. Soapbox Digest. Volume 3, Number 2, Fall 2004
ERIC Educational Resources Information Center
AEL, 2004
2004-01-01
Grid computing allows large groups of computers (either in a lab, or remote and connected only by the Internet) to extend extra processing power to each individual computer to work on components of a complex request. Grid middleware, recognizing priorities set by systems administrators, allows the grid to identify and use this power without…
ERIC Educational Resources Information Center
Proctor, Tony
1988-01-01
Explores the conceptual components of a computer program designed to enhance creative thinking and reviews software that aims to stimulate creative thinking. Discusses BRAIN and ORACLE, programs intended to aid in creative problem solving. (JOW)
NASA Tech Briefs, August 1994. Volume 18, No. 8
NASA Technical Reports Server (NTRS)
1994-01-01
Topics covered include: Computer Hardware; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences; Books and Reports.
NASA Tech Briefs, June 1997. Volume 21, No. 6
NASA Technical Reports Server (NTRS)
1997-01-01
Topics include: Computer Hardware and Peripherals; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.
NASA Tech Briefs, November 1999. Volume 23, No. 11
NASA Technical Reports Server (NTRS)
1999-01-01
Topics covered include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Materials; Computer Programs; Mechanics; Machinery/Automation; Physical Sciences; Mathematics and Information Sciences; Books and Reports.
Maslia, M.L.; Randolph, R.B.
1986-01-01
The theory of anisotropic aquifer hydraulic properties and a computer program, written in Fortran 77, developed to compute the components of the anisotropic transmissivity tensor of two-dimensional groundwater flow are described. To determine the tensor components using one pumping well and three observation wells, the type-curve and straight-line approximation methods are developed. These methods are based on the equation of drawdown developed for two-dimensional nonsteady flow in an infinite anisotropic aquifer. To determine tensor components using more than three observation wells, a weighted least squares optimization procedure is described for use with the type-curve and straight-line approximation methods. The computer program described in this report allows the type-curve, straight-line approximation, and weighted least squares optimization methods to be used in conjunction with data from observation and pumping wells. Three example applications using the computer program and field data gathered during geohydrologic investigations at a site near Dawsonville, Georgia, are provided to illustrate the use of the computer program. The example applications demonstrate the use of the type-curve method using three observation wells, the weighted least squares optimization method using eight observation wells and equal weighting, and the weighted least squares optimization method using eight observation wells and unequal weighting. Results obtained using the computer program indicate major transmissivity in the range of 347-296 sq ft/day, minor transmissivity in the range of 139-99 sq ft/day, aquifer anisotropy in the range of 3.54 to 2.14, principal direction of flow in the range of N. 45.9 degrees E. to N. 58.7 degrees E., and storage coefficient in the range of 0.0063 to 0.0037. The numerical results are in good agreement with field data gathered on the weathered crystalline rocks underlying the investigation site.
Supplemental material provides definitions of variables, data requirements and corresponding formats, input data and output results for the example applications, and a listing of the Fortran 77 computer code. (Author's abstract)
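The weighted least-squares optimization step can be sketched generically: solve the weighted normal equations for the unknown tensor components. The design matrix, weights, and "true" values below are stand-ins, not the report's drawdown equations:

```python
import numpy as np

# Weighted least-squares sketch: fit three unknowns (standing in for the
# tensor components Txx, Tyy, Txy) from eight "observation wells" with
# unequal weights. The design matrix is synthetic.
rng = np.random.default_rng(1)
A = rng.random((8, 3))                       # 8 observations, 3 unknowns
true = np.array([300.0, 120.0, 40.0])        # invented "true" values
b = A @ true + rng.normal(0.0, 1.0, 8)       # noisy observations

w = np.array([1.0, 1, 2, 2, 1, 1, 3, 3])     # per-well weights
W = np.diag(w)
x = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)  # (A^T W A) x = A^T W b
print(np.round(x, 1))
```

With equal weights this reduces to ordinary least squares; unequal weights let more trusted wells dominate the fit, the distinction the report's third example exercises.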
Creation of system of computer-aided design for technological objects
NASA Astrophysics Data System (ADS)
Zubkova, T. M.; Tokareva, M. A.; Sultanov, N. Z.
2018-05-01
Due to competition in the process-equipment market, production must be flexible, retuning to various product configurations, raw materials and productivity levels depending on current market needs. This is not possible without computer-aided design (CAD). The formation of a CAD system begins with planning: synthesizing, analyzing, evaluating and converting operations, as well as visualization and decision-making operations, can be automated. Based on a formal description of the design procedures, the design route is constructed as a directed graph. The decomposition of the design process, represented by this formalized description, makes it possible to make an informed choice of CAD component for each task. An object-oriented approach allows the CAD system to be considered an independent system whose properties are inherited from its components. The first step determines the range of tasks to be performed by the system and a set of components for their implementation; the second configures the selected components. Interaction between the selected components is carried out using the CALS standards. The chosen CAD/CAE-oriented approach allows a single model to be created and stored in the subject-area database. Each integration stage is implemented as a separate functional block. The transformation of the CAD model into the internal-representation model is realized by a block that searches for the geometric parameters of the technological machine, in which an XML model of the construction is obtained using the feature method from image-recognition theory. The configuration of integrated components is divided into three consecutive steps: configuring tasks, components and interfaces. The component configuration is realized using the theory of soft computing and the Mamdani fuzzy inference algorithm.
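Mamdani fuzzy inference, named above as the component-configuration mechanism, can be sketched with two toy rules. The linguistic terms, universes, and membership functions below are invented for illustration:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function; works on scalars or arrays."""
    x = np.asarray(x, dtype=float)
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Toy Mamdani controller scoring a component's "suitability" from "cost".
cost = 3.0
y = np.linspace(0, 10, 201)                  # output universe

w_low = tri(cost, -1, 0, 6)                  # firing of "cost is low"
w_high = tri(cost, 4, 10, 11)                # firing of "cost is high"

# Rules: IF cost low THEN suitability high; IF cost high THEN suitability low.
# Clip each consequent set, aggregate with max, defuzzify by centroid.
agg = np.maximum(np.minimum(w_low, tri(y, 4, 10, 16)),
                 np.minimum(w_high, tri(y, -6, 0, 6)))
score = (y * agg).sum() / agg.sum()
print(round(score, 2))
```

The centroid defuzzification yields a crisp score per candidate component, which is what a configuration step can rank.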
NIST Libraries of Peptide Fragmentation Mass Spectra Database
National Institute of Standards and Technology Data Gateway
SRD 4 NIST Libraries of Peptide Fragmentation Mass Spectra Database (PC database for purchase) Interactive computer program for predicting thermodynamic and transport properties of pure fluids and fluid mixtures containing up to 20 components. The components are selected from a database of 196 components, mostly hydrocarbons.
Introducing Cloud Computing Topics in Curricula
ERIC Educational Resources Information Center
Chen, Ling; Liu, Yang; Gallagher, Marcus; Pailthorpe, Bernard; Sadiq, Shazia; Shen, Heng Tao; Li, Xue
2012-01-01
The demand for graduates with exposure in Cloud Computing is on the rise. For many educational institutions, the challenge is to decide on how to incorporate appropriate cloud-based technologies into their curricula. In this paper, we describe our design and experiences of integrating Cloud Computing components into seven third/fourth-year…
Neuromorphic Computing: A Post-Moore's Law Complementary Architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuman, Catherine D; Birdwell, John Douglas; Dean, Mark
2016-01-01
We describe our approach to post-Moore's law computing with three neuromorphic computing models that share a RISC philosophy, featuring simple components combined with a flexible and programmable structure. We envision these to be leveraged as co-processors, or as data filters to provide in situ data analysis in supercomputing environments.
ERIC Educational Resources Information Center
Nikirk, Martin
2006-01-01
This article discusses a computer game design and animation pilot at Washington County Technical High School as part of the advanced computer applications completer program. The focus of the instructional program is to teach students the 16 components of computer game design through a team-centered, problem-solving instructional format. Among…
The Contribution of Visualization to Learning Computer Architecture
ERIC Educational Resources Information Center
Yehezkel, Cecile; Ben-Ari, Mordechai; Dreyfus, Tommy
2007-01-01
This paper describes a visualization environment and associated learning activities designed to improve learning of computer architecture. The environment, EasyCPU, displays a model of the components of a computer and the dynamic processes involved in program execution. We present the results of a research program that analysed the contribution of…
Code of Federal Regulations, 2010 CFR
2010-04-01
... DETERMINATIONS PIA's Used in Computing Employee, Spouse and Divorced Spouse Annuities § 225.10 General. This subpart contains information about the PIA's that can be used in computing most employee, spouse and divorced spouse annuities. The Tier I PIA is used in computing the tier I component of an employee, spouse...
Space shuttle environmental and thermal control life support system computer program
NASA Technical Reports Server (NTRS)
1972-01-01
A computer program for the design and operation of the space shuttle environmental and thermal control life support system is presented. The subjects discussed are: (1) basic optimization program, (2) off design performance, (3) radiator/evaporator expendable usage, (4) component weights, and (5) computer program operating procedures.
The IBM PC as an Online Search Machine--Part 2: Physiology for Searchers.
ERIC Educational Resources Information Center
Kolner, Stuart J.
1985-01-01
Enumerates "hardware problems" associated with use of the IBM personal computer as an online search machine: purchase of machinery, unpacking of parts, and assembly into a properly functioning computer. Components that allow transformations of computer into a search machine (combination boards, printer, modem) and diagnostics software…
Tse computers. [ultrahigh speed optical processing for two dimensional binary image]
NASA Technical Reports Server (NTRS)
Schaefer, D. H.; Strong, J. P., III
1977-01-01
An ultra-high-speed computer that utilizes binary images as its basic computational entity is being developed. The basic logic components perform thousands of operations simultaneously. Technologies of the fiber optics, display, thin film, and semiconductor industries are being utilized in the building of the hardware.
NASA Tech Briefs, July 1996. Volume 20, No. 7
NASA Technical Reports Server (NTRS)
1996-01-01
Topics covered include: Mechanical Components; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports
Observing System Simulation Experiment (OSSE) for the HyspIRI Spectrometer Mission
NASA Technical Reports Server (NTRS)
Turmon, Michael J.; Block, Gary L.; Green, Robert O.; Hua, Hook; Jacob, Joseph C.; Sobel, Harold R.; Springer, Paul L.; Zhang, Qingyuan
2010-01-01
The OSSE software provides an integrated end-to-end environment to simulate an Earth observing system by iteratively running a distributed modeling workflow based on the HyspIRI Mission, including atmospheric radiative transfer, surface albedo effects, detection, and retrieval for agile exploration of the mission design space. The software enables an Observing System Simulation Experiment (OSSE) and can be used for design trade space exploration of science return for proposed instruments by modeling the whole ground truth, sensing, and retrieval chain and assessing retrieval accuracy for a particular instrument and algorithm design. The OSSE infrastructure is extensible to future National Research Council (NRC) Decadal Survey concept missions where integrated modeling can improve the fidelity of coupled science and engineering analyses for systematic analysis and science return studies. This software has a distributed architecture that gives it a distinct advantage over other similar efforts. The workflow modeling components are typically legacy computer programs implemented in a variety of programming languages, including MATLAB, Excel, and FORTRAN. Integration of these diverse components is difficult and time-consuming. In order to hide this complexity, each modeling component is wrapped as a Web Service, and each component is able to pass analysis parameterizations, such as reflectance or radiance spectra, on to the next component downstream in the service workflow chain. In this way, the interface to each modeling component becomes uniform and the entire end-to-end workflow can be run using any existing or custom workflow processing engine. The architecture lets users extend workflows as new modeling components become available, chain together the components using any existing or custom workflow processing engine, and distribute them across any Internet-accessible Web Service endpoints.
The workflow components can be hosted on any Internet-accessible machine. This has the advantages that the computations can be distributed to make best use of the available computing resources, and each workflow component can be hosted and maintained by their respective domain experts.
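The chaining idea above, in which each wrapped component accepts and returns a uniform parameterization so any workflow engine can compose them, can be sketched with plain callables standing in for Web Service endpoints (component names and numbers are invented):

```python
# Each "component" takes and returns a uniform dict of parameters, the way
# the wrapped Web Services pass parameterizations downstream.
def radiative_transfer(params):
    return {**params, "radiance": params["reflectance"] * 0.8}

def detector(params):
    return {**params, "signal": params["radiance"] + 0.01}

def retrieval(params):
    return {**params, "retrieved": params["signal"] / 0.8}

workflow = [radiative_transfer, detector, retrieval]   # the service chain
state = {"reflectance": 0.25}
for component in workflow:   # a workflow engine would invoke these over HTTP
    state = component(state)
print(round(state["retrieved"], 4))
```

Because every stage shares the same interface, reordering, extending, or distributing the chain requires no change to the components themselves.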
NASA Tech Briefs, January 2000. Volume 24, No. 1
NASA Technical Reports Server (NTRS)
2000-01-01
Topics include: Data Acquisition; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Bio-Medical; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Information Sciences; Books and reports.
NASA Tech Briefs, October 1997. Volume 21, No. 10
NASA Technical Reports Server (NTRS)
1997-01-01
Topics covered include: Sensors/Imaging; Mechanical Components; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Software; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.
NASA Tech Briefs, December 1993. Volume 17, No. 12
NASA Technical Reports Server (NTRS)
1993-01-01
Topics covered include: High-Performance Computing; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.
Classroom Laboratory Report: Using an Image Database System in Engineering Education.
ERIC Educational Resources Information Center
Alam, Javed; And Others
1991-01-01
Describes an image database system assembled using separate computer components that was developed to overcome text-only computer hardware storage and retrieval limitations for a pavement design class. (JJK)
NASA Tech Briefs, March 1994. Volume 18, No. 3
NASA Technical Reports Server (NTRS)
1994-01-01
Topics include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports
NASA Tech Briefs, March 2000. Volume 24, No. 3
NASA Technical Reports Server (NTRS)
2000-01-01
Topics include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.
NASA Tech Briefs, March 1997. Volume 21, No. 3
NASA Technical Reports Server (NTRS)
1997-01-01
Topics: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.
Color image watermarking against fog effects
NASA Astrophysics Data System (ADS)
Chotikawanid, Piyanart; Amornraksa, Thumrongrat
2017-07-01
Fog effects in various computer and camera software can partially or fully damage the watermark information within a watermarked image. In this paper, we propose a color image watermarking method based on modification of the reflectance component to resist fog effects. The reflectance component is extracted from the blue channel in the RGB color space of a host image and then used to carry the watermark signal. Watermark extraction is achieved blindly by subtracting an estimate of the original reflectance component from the watermarked component. The performance of the proposed watermarking method in terms of wPSNR and NC is evaluated and compared with a previous method. The experimental results on robustness against various levels of fog effect, from both computer software and a mobile application, demonstrate the higher robustness of our proposed method compared to the previous one.
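A retinex-style reflectance split (reflectance = log of the blue channel minus the log of a smoothed illumination estimate) can illustrate the embed/extract cycle end to end. The host image, watermark, and embedding strength below are synthetic; the published method's exact filters will differ:

```python
import numpy as np

def blur(img, r=3):
    """Mean filter via shifted copies; a crude illumination estimate."""
    acc = np.zeros_like(img)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / (2 * r + 1) ** 2

# Synthetic smooth "blue channel" and a bipolar watermark.
yy, xx = np.mgrid[0:64, 0:64].astype(float)
blue = 0.4 + 0.2 * np.sin(2 * np.pi * xx / 64) * np.cos(2 * np.pi * yy / 64)
w = np.random.default_rng(2).choice([-1.0, 1.0], size=(64, 64))

# Retinex-style split: reflectance = log(B) - log(illumination estimate).
illum = np.log(blur(blue))
refl = np.log(blue) - illum
marked = np.exp(refl + 0.02 * w + illum)   # embed in reflectance, alpha = 0.02

# Blind extraction: re-estimate the unmarked reflectance by smoothing.
refl_w = np.log(marked) - np.log(blur(marked))
w_hat = refl_w - blur(refl_w)
nc = (w_hat * w).sum() / np.sqrt((w_hat ** 2).sum() * (w ** 2).sum())
print(round(nc, 2))
```

The normalized correlation (NC) between the extracted and original watermark is the robustness measure the abstract reports.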
NASA Technical Reports Server (NTRS)
Nakazawa, S.
1987-01-01
This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Section Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of new computer codes that permit more accurate and efficient three-dimensional analysis of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components. This report is presented in two volumes. Volume 1 describes effort performed under Task 4B, Special Finite Element Special Function Models, while Volume 2 concentrates on Task 4C, Advanced Special Functions Models.
Program For Evaluation Of Reliability Of Ceramic Parts
NASA Technical Reports Server (NTRS)
Nemeth, N.; Janosik, L. A.; Gyekenyesi, J. P.; Powers, Lynn M.
1996-01-01
CARES/LIFE predicts probability of failure of monolithic ceramic component as function of service time. Assesses risk that component fractures prematurely as result of subcritical crack growth (SCG). Effect of proof testing of components prior to service also considered. Coupled to such commercially available finite-element programs as ANSYS, ABAQUS, MARC, MSC/NASTRAN, and COSMOS/M. Also retains all capabilities of previous CARES code, which includes estimation of fast-fracture component reliability and Weibull parameters from inert strength (without SCG contributing to failure) specimen data. Estimates parameters that characterize SCG from specimen data as well. Written in ANSI FORTRAN 77 to be machine-independent. Program runs on any computer in which sufficient addressable memory (at least 8MB) and FORTRAN 77 compiler available. For IBM-compatible personal computer with minimum 640K memory, limited program available (CARES/PC, COSMIC number LEW-15248).
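The fast-fracture reliability idea underlying CARES/LIFE is the two-parameter Weibull distribution; a one-element sketch with invented parameters (the real code integrates this risk over the whole finite-element model):

```python
import math

# Two-parameter Weibull failure probability for a single unit-volume
# element under uniform stress. Parameters are invented for illustration.
m, sigma0 = 10.0, 300.0          # Weibull modulus, characteristic strength (MPa)
sigma = 250.0                    # applied stress (MPa)
p_fail = 1 - math.exp(-(sigma / sigma0) ** m)
print(round(p_fail, 3))          # → 0.149
```

CARES/LIFE estimates `m` and `sigma0` from inert-strength specimen data and then layers subcritical crack growth and proof-test effects on top of this baseline.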
Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components
NASA Technical Reports Server (NTRS)
1999-01-01
Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainties or randomness also occur in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.
Survey of Turbulence Models for the Computation of Turbulent Jet Flow and Noise
NASA Technical Reports Server (NTRS)
Nallasamy, N.
1999-01-01
The report presents an overview of jet noise computation utilizing the computational fluid dynamic solution of the turbulent jet flow field. The jet flow solution obtained with an appropriate turbulence model provides the turbulence characteristics needed for the computation of jet mixing noise. A brief account of turbulence models that are relevant for the jet noise computation is presented. The jet flow solutions that have been directly used to calculate jet noise are first reviewed. Then, the turbulent jet flow studies that compute the turbulence characteristics that may be used for noise calculations are summarized. In particular, flow solutions obtained with the k-ε model, algebraic Reynolds stress model, and Reynolds stress transport equation model are reviewed. Since small-scale jet mixing noise predictions can be improved by utilizing anisotropic turbulence characteristics, turbulence models that can provide the Reynolds stress components must now be considered for jet flow computations. In this regard, algebraic stress models and Reynolds stress transport models are good candidates. Reynolds stress transport models involve more modeling and computational effort and time compared to algebraic stress models. Hence, it is recommended that an algebraic Reynolds stress model (ASM) be implemented in flow solvers to compute the Reynolds stress components.
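For contrast with the anisotropic models recommended above, the k-ε model recovers Reynolds stresses only through the isotropic Boussinesq eddy-viscosity relation; a sketch with illustrative values:

```python
# Boussinesq sketch: how a k-epsilon solution yields a Reynolds shear
# stress from mean-flow gradients. All values are invented examples.
c_mu = 0.09                       # standard k-epsilon model constant
k, eps = 0.5, 2.0                 # turbulent kinetic energy, dissipation rate
nu_t = c_mu * k**2 / eps          # eddy viscosity
dUdy = 40.0                       # mean shear
uv = -nu_t * dUdy                 # modeled shear stress component u'v'
print(round(uv, 3))               # → -0.45
```

An algebraic stress model replaces this single eddy-viscosity relation with implicit algebraic expressions that give each stress component separately, which is what anisotropic noise prediction needs.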
Metasurface optics for full-color computational imaging.
Colburn, Shane; Zhan, Alan; Majumdar, Arka
2018-02-01
Conventional imaging systems comprise large and expensive optical components that successively mitigate aberrations. Metasurface optics offers a route to miniaturize imaging systems by replacing bulky components with flat and compact implementations. The diffractive nature of these devices, however, induces severe chromatic aberrations, and current multiwavelength and narrowband achromatic metasurfaces cannot support full visible spectrum imaging (400 to 700 nm). We combine principles of both computational imaging and metasurface optics to build a system with a single metalens of numerical aperture ~0.45, which generates in-focus images under white light illumination. Our metalens exhibits a spectrally invariant point spread function that enables computational reconstruction of captured images with a single digital filter. This work connects computational imaging and metasurface optics and demonstrates the capabilities of combining these disciplines by simultaneously reducing aberrations and downsizing imaging systems using simpler optics.
Shading of a computer-generated hologram by zone plate modulation.
Kurihara, Takayuki; Takaki, Yasuhiro
2012-02-13
We propose a hologram calculation technique that enables reconstructing a shaded three-dimensional (3D) image. The amplitude distributions of zone plates, which generate the object points that constitute a 3D object, were two-dimensionally modulated. Two-dimensional (2D) amplitude modulation was determined on the basis of the Phong reflection model developed for computer graphics, which considers the specular, diffuse, and ambient reflection light components. The 2D amplitude modulation added variable and constant modulations: the former controlled the specular light component and the latter controlled the diffuse and ambient components. The proposed calculation technique was experimentally verified. The reconstructed image showed specular reflection that varied depending on the viewing position.
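The Phong weighting that drives the zone-plate modulation can be sketched for a single object point. Vectors and coefficients are arbitrary examples; only the specular term varies with the viewing direction, matching the variable/constant split described above:

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

# Phong reflection for one object point (all vectors/coefficients invented).
n = normalize(np.array([0.0, 0.0, 1.0]))    # surface normal
l = normalize(np.array([1.0, 1.0, 1.0]))    # direction to light
v = normalize(np.array([0.0, 0.0, 1.0]))    # direction to viewer
r = normalize(2 * n.dot(l) * n - l)         # mirror reflection of l

ka, kd, ks, shininess = 0.1, 0.6, 0.3, 8
intensity = (ka                                        # ambient (constant)
             + kd * max(n.dot(l), 0.0)                 # diffuse (constant)
             + ks * max(r.dot(v), 0.0) ** shininess)   # specular (view-dependent)
print(round(intensity, 3))                             # → 0.45
```

In the hologram method, the ambient and diffuse terms set a constant zone-plate amplitude while the specular term is modulated across the zone plate to vary with viewing position.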
Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment
Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon
2013-01-01
Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved. PMID:24078906
The planum temporale as a computational hub.
Griffiths, Timothy D; Warren, Jason D
2002-07-01
It is increasingly recognized that the human planum temporale is not a dedicated language processor, but is in fact engaged in the analysis of many types of complex sound. We propose a model of the human planum temporale as a computational engine for the segregation and matching of spectrotemporal patterns. The model is based on segregating the components of the acoustic world and matching these components with learned spectrotemporal representations. Spectrotemporal information derived from such a 'computational hub' would be gated to higher-order cortical areas for further processing, leading to object recognition and the perception of auditory space. We review the evidence for the model and specific predictions that follow from it.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lai, Jih-Sheng
This paper introduces control system design based softwares, SIMNON and MATLAB/SIMULINK, for power electronics system simulation. A complete power electronics system typically consists of a rectifier bridge along with its smoothing capacitor, an inverter, and a motor. The system components, featuring discrete or continuous, linear or nonlinear, are modeled in mathematical equations. Inverter control methods,such as pulse-width-modulation and hysteresis current control, are expressed in either computer algorithms or digital circuits. After describing component models and control methods, computer programs are then developed for complete systems simulation. Simulation results are mainly used for studying system performances, such as input and outputmore » current harmonics, torque ripples, and speed responses. Key computer programs and simulation results are demonstrated for educational purposes.« less
Determining effects of turbine blades on fluid motion
Linn, Rodman Ray [Los Alamos, NM; Koo, Eunmo [Los Alamos, NM
2012-05-01
Disclosed is a technique for simulating wind interaction with wind turbines. A turbine blade is divided into radial sections. The effect that each of these radial sections has on the velocities in Eulerian computational cells they overlap is determined. The effect is determined using Lagrangian techniques such that the calculations need not include wind components in the radial direction. A force on each radial section of turbine blade is determined. This force depends on the axial and azimuthal components of the fluid flow in the computational cell and the geometric properties of the turbine blade. The force on the turbine blade is fed back to effect the fluid flow in the computational cell for the next time step.
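The per-section force computation described in the patent resembles actuator-line methods: build the relative velocity from the axial and azimuthal flow components, apply lift and drag coefficients, and project back onto those directions. The numbers below are illustrative, not from the patent:

```python
import math

# Force on one radial blade section from the flow in its cell (values invented).
rho, chord, dr = 1.225, 0.5, 1.0       # air density, chord, section length
omega, r = 2.0, 10.0                   # rotor speed (rad/s), section radius
u_axial, u_azim = 8.0, 1.0             # flow components in the cell (m/s)

# Relative velocity seen by the section and its inflow angle.
v_rel = math.hypot(u_axial, omega * r - u_azim)
phi = math.atan2(u_axial, omega * r - u_azim)

cl, cd = 1.0, 0.02                     # assumed lift/drag coefficients
q = 0.5 * rho * v_rel**2 * chord * dr  # dynamic pressure times section area
lift, drag = q * cl, q * cd

# Project lift/drag back onto axial and azimuthal directions; this force
# is fed back onto the fluid in the cell for the next time step.
f_axial = lift * math.cos(phi) + drag * math.sin(phi)
f_azim = lift * math.sin(phi) - drag * math.cos(phi)
print(round(f_axial, 1), round(f_azim, 1))
```

Note that, as the patent emphasizes, no radial velocity component enters the calculation.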
Determining effects of turbine blades on fluid motion
Linn, Rodman Ray [Los Alamos, NM; Koo, Eunmo [Los Alamos, NM
2011-05-31
Disclosed is a technique for simulating wind interaction with wind turbines. A turbine blade is divided into radial sections. The effect that each of these radial sections has on the velocities in Eulerian computational cells they overlap is determined. The effect is determined using Lagrangian techniques such that the calculations need not include wind components in the radial direction. A force on each radial section of turbine blade is determined. This force depends on the axial and azimuthal components of the fluid flow in the computational cell and the geometric properties of the turbine blade. The force on the turbine blade is fed back to effect the fluid flow in the computational cell for the next time step.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pendergrass, J.H.
1977-10-01
Based on the theory developed in an earlier report, a FORTRAN computer program, DIFFUSE, was written. It computes, for design purposes, rates of transport of hydrogen isotopes by temperature-dependent, quasi-unidirectional, quasi-static combined ordinary and thermal diffusion through thin, hot thermonuclear reactor components that can be represented by composites of plane, cylindrical-shell, and spherical-shell elements when the dominant resistance to transfer is that of the bulk metal. The program is described, directions for its use are given, and a listing of the program, together with sample problem results, is presented.
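For a plane composite wall, the quasi-static transport that a code like DIFFUSE computes reduces to summing layer resistances in series; a sketch with invented layer data:

```python
# Quasi-static diffusion through a composite plane wall: layer resistances
# add in series, like thermal or electrical resistances. Values invented.
layers = [(0.001, 1e-9), (0.002, 5e-10)]   # (thickness m, diffusivity m^2/s)
resistance = sum(L / D for L, D in layers)  # total diffusive resistance (s/m)
delta_c = 1.0e-3                            # concentration difference (mol/m^3)
flux = delta_c / resistance                 # steady flux, mol/(m^2 s)
print(f"{flux:.2e}")                        # → 2.00e-10
```

Cylindrical and spherical shells change only the geometric form of each layer's resistance; temperature dependence enters through the diffusivities.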
NASA Technical Reports Server (NTRS)
Effinger, Michael; Beshears, Ron; Hufnagle, David; Walker, James; Russell, Sam; Stowell, Bob; Myers, David
2002-01-01
Nondestructive characterization techniques have been used to steer development and testing of CMCs. Computed tomography is used to determine the volumetric integrity of the CMC plates and components. Thermography is used to determine the near-surface integrity of the CMC plates and components. For process and material development, information such as density uniformity, part delamination, and dimensional tolerance conformity is generated. The information from the thermography and computed tomography is correlated, and then specimen cutting maps are superimposed on the thermography images. This enables tighter correlation of data and potential explanation of off-nominal test data. Examples of using nondestructive characterization to make decisions in process and material development and testing are presented.
Diamond turning machine controller implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garrard, K.P.; Taylor, L.W.; Knight, B.F.
The standard controller for a Pneumo ASG 2500 Diamond Turning Machine, an Allen Bradley 8200, has been replaced with a custom high-performance design. This controller consists of four major components. Axis position feedback is provided by a Zygo Axiom 2/20 laser interferometer with 0.1 micro-inch resolution. Hardware interface logic couples the computer's digital and analog I/O channels to the diamond turning machine's analog motor controllers, the laser interferometer, and other machine status and control information. It also provides front panel switches for operator override of the computer controller and implements the emergency stop sequence. The remaining two components, the control computer hardware and software, are discussed in detail below.
NASA Technical Reports Server (NTRS)
Taylor, C. M.
1977-01-01
A finite element computer program which enables the analysis of distortions and stresses occurring in components having a relative interference is presented. The program is limited to situations in which the loading is axisymmetric. Loads arising from the interference fit(s) and external, inertial, and thermal loadings are accommodated. The components may comprise several different homogeneous isotropic materials whose properties may be a function of temperature. An example illustrating the data input and program output is given.
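The simplest interference-fit case such a program generalizes, a solid shaft pressed into a same-material hub, has a classical closed-form Lamé solution that is useful as a sanity check. This is the textbook formula, not the article's finite element method, and the numbers below are hypothetical steel-like values:

```python
# Lame thick-cylinder shrink-fit check (closed form, same material).

def shrink_fit_pressure(E, delta, b, c):
    """Contact pressure (Pa) for a solid shaft of radius b in a hub of
    outer radius c, same material with Young's modulus E, and radial
    interference delta: p = E*delta*(c^2 - b^2) / (2*b*c^2)."""
    return E * delta * (c**2 - b**2) / (2.0 * b * c**2)

# Hypothetical values: steel, 25 mm shaft, 50 mm hub, 20 micron interference.
p = shrink_fit_pressure(E=200e9, delta=20e-6, b=0.025, c=0.05)
```

For these values the contact pressure comes out at 60 MPa; a finite element model adds temperature-dependent properties and combined loadings that the closed form cannot.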
Modeling of Explorative Procedures for Remote Object Identification
1991-09-01
A realistic representation of human search models is achieved by using the proprioceptive component of the haptic sensory system and a simulated foveal component of the visual system. A force-reflecting telemanipulator and a computer-simulated visual foveal component are the tools used to achieve superposition of sensory channels. Eventually this work will allow multiple applications in remote sensing.
Holistic and component plant phenotyping using temporal image sequence.
Das Choudhury, Sruti; Bashyam, Srinidhi; Qiu, Yumou; Samal, Ashok; Awada, Tala
2018-01-01
Image-based plant phenotyping facilitates the extraction of traits noninvasively by analyzing a large number of plants in a relatively short period of time. It has the potential to compute advanced phenotypes by considering the whole plant as a single object (holistic phenotypes) or as individual components, i.e., leaves and the stem (component phenotypes), to investigate the biophysical characteristics of the plants. The emergence timing, the total number of leaves present at any point in time, and the growth of individual leaves during the vegetative stage of the life cycle of maize plants are significant phenotypic expressions that best contribute to assessing plant vigor. However, an automated image-based solution to this novel problem is yet to be explored. A set of new holistic and component phenotypes is introduced in this paper. To compute the component phenotypes, it is essential to detect the individual leaves and the stem. Thus, the paper introduces a novel method to reliably detect the leaves and the stem of maize plants by analyzing 2-dimensional visible light image sequences captured from the side using a graph-based approach. The total number of leaves is counted and the length of each leaf is measured for all images in the sequence to monitor leaf growth. To evaluate the performance of the proposed algorithm, we introduce the University of Nebraska-Lincoln Component Plant Phenotyping Dataset (UNL-CPPD) and provide ground truth to facilitate new algorithm development and uniform comparison. The temporal variation of the component phenotypes regulated by genotypes and environment (i.e., greenhouse) is experimentally demonstrated for the maize plants on UNL-CPPD. Statistical models are applied to analyze the greenhouse environment impact and demonstrate the genetic regulation of the temporal variation of the holistic phenotypes on the public dataset called Panicoid Phenomap-1.
The central contribution of the paper is a novel computer vision based algorithm for automated detection of individual leaves and the stem to compute new component phenotypes along with a public release of a benchmark dataset, i.e., UNL-CPPD. Detailed experimental analyses are performed to demonstrate the temporal variation of the holistic and component phenotypes in maize regulated by environment and genetic variation with a discussion on their significance in the context of plant science.
Improved Distance Learning Environment For Marine Forces Reserve
2016-09-01
keyboard, to form a desktop computer. Laptop computers share similar components but add mobility for the user. If additional desktop computers ...for stationary computing devices such as desktop PCs and laptops include the Microsoft Windows, Mac OS, and Linux families of OSs (Hopkins...opportunities to all Marines. For active duty Marines, government-provided desktops and laptops (GPDLs) typically support DL T&E or learning resource
A PC program to optimize system configuration for desired reliability at minimum cost
NASA Technical Reports Server (NTRS)
Hills, Steven W.; Siahpush, Ali S.
1994-01-01
High reliability is desired in all engineered systems. One way to improve system reliability is to use redundant components. When redundant components are used, the problem becomes one of allocating them to achieve the best reliability without exceeding other design constraints such as cost, weight, or volume. Systems with few components can be optimized by simply examining every possible combination, but the number of combinations for most systems is prohibitive. A computerized iteration of the process is possible, but anything short of a supercomputer requires too much time to be practical. Many researchers have derived mathematical formulations for calculating the optimum configuration directly. However, most of the derivations are based on continuous functions, whereas the real system is composed of discrete entities. Therefore, these techniques are approximations of the true optimum solution. This paper describes a computer program that will determine the optimum configuration of a system of multiple redundancy of both standard and optional components. The algorithm is a pair-wise comparative progression technique which can derive the true optimum by calculating only a small fraction of the total number of combinations. A designer can quickly analyze a system with this program on a personal computer.
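The flavor of the redundancy-allocation problem can be shown with a simple greedy heuristic: repeatedly buy the parallel spare with the best reliability gain per unit cost until the budget runs out. This is a sketch for illustration only, not the paper's pair-wise comparative progression technique, and unlike that technique it is not guaranteed optimal; all numbers are hypothetical.

```python
from math import prod

def allocate_redundancy(rel, cost, budget):
    """Greedy redundancy allocation under a cost budget.

    Subsystem i with n_i parallel copies of a component of reliability
    r_i has reliability 1 - (1 - r_i)**n_i; system reliability is the
    product over subsystems.
    """
    n = [1] * len(rel)          # one copy of each component is mandatory
    spent = sum(cost)
    while True:
        best, best_gain = None, 0.0
        for i, (r, c) in enumerate(zip(rel, cost)):
            if spent + c > budget:
                continue
            old = 1 - (1 - r) ** n[i]
            new = 1 - (1 - r) ** (n[i] + 1)
            gain = (new - old) / (old * c)   # fractional gain per unit cost
            if gain > best_gain:
                best, best_gain = i, gain
        if best is None:
            break
        n[best] += 1
        spent += cost[best]
    return n, prod(1 - (1 - r) ** k for r, k in zip(rel, n))

# Two subsystems (reliabilities 0.9 and 0.8, unit costs 2 and 3), budget 10:
counts, R = allocate_redundancy([0.9, 0.8], [2.0, 3.0], budget=10.0)
```

With these inputs the greedy pass doubles both subsystems, raising system reliability from 0.72 to roughly 0.95.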
Web Program for Development of GUIs for Cluster Computers
NASA Technical Reports Server (NTRS)
Czikmantory, Akos; Cwik, Thomas; Klimeck, Gerhard; Hua, Hook; Oyafuso, Fabiano; Vinyard, Edward
2003-01-01
WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: The server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and the display of a GUI generator through the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.
yourSky: Custom Sky-Image Mosaics via the Internet
NASA Technical Reports Server (NTRS)
Jacob, Joseph
2003-01-01
yourSky (http://yourSky.jpl.nasa.gov) is a computer program that supplies custom astronomical image mosaics of sky regions specified by requesters using client computers connected to the Internet. [yourSky is an upgraded version of the software reported in Software for Generating Mosaics of Astronomical Images (NPO-21121), NASA Tech Briefs, Vol. 25, No. 4 (April 2001), page 16a.] A requester no longer has to engage in the tedious process of determining what subset of images is needed, nor even to know how the images are indexed in image archives. Instead, in response to a requester's specification of the size and location of the sky area (and optionally of the desired set and type of data, resolution, coordinate system, projection, and image format), yourSky automatically retrieves the component image data from archives totaling tens of terabytes stored on computer tape and disk drives at multiple sites and assembles the component images into a mosaic image by use of a high-performance parallel code. yourSky runs on the server computer where the mosaics are assembled. Because yourSky includes a Web-interface component, no special client software is needed: ordinary Web browser software is sufficient.
Renkawitz, Tobias; Tingart, Markus; Grifka, Joachim; Sendtner, Ernst; Kalteis, Thomas
2009-09-01
This article outlines the scientific basis and a state-of-the-art application of computer-assisted orthopedic surgery in total hip arthroplasty (THA) and provides a future perspective on this technology. Computer-assisted orthopedic surgery in primary THA has the potential to couple 3D simulations with real-time evaluations of surgical performance, which has brought these developments from the research laboratory all the way to clinical use. Imageless navigation systems, which require no additional pre- or intraoperative image acquisition, have been shown to significantly reduce the variability in positioning the acetabular component and have provided precise measurement of leg length and offset changes during THA. More recently, computer-assisted orthopedic surgery systems have opened a new frontier for accurate surgical practice in minimally invasive, tissue-preserving THA. The future generation of imageless navigation systems will switch from simple measurement tasks to real navigation tools. These software algorithms will consider the cup and stem as components of a coupled biomechanical system, navigating the orthopedic surgeon to find an optimized complementary component orientation rather than target values intraoperatively, and are expected to have a high impact on clinical practice and postoperative functionality in modern THA.
Kia, Mohammad; Wright, Timothy M; Cross, Michael B; Mayman, David J; Pearle, Andrew D; Sculco, Peter K; Westrich, Geoffrey H; Imhauser, Carl W
2018-01-01
The correct amount of external rotation of the femoral component during TKA is controversial because the resulting changes in biomechanical knee function associated with varying degrees of femoral component rotation are not well understood. We addressed this question using a computational model, which allowed us to isolate the biomechanical impact of geometric factors including bony shapes, location of ligament insertions, and implant size across three different knees after posterior-stabilized (PS) TKA. Using a computational model of the tibiofemoral joint, we asked: (1) Does external rotation unload the medial collateral ligament (MCL) and what is the effect on lateral collateral ligament tension? (2) How does external rotation alter tibiofemoral contact loads and kinematics? (3) Does 3° external rotation relative to the posterior condylar axis align the component to the surgical transepicondylar axis (sTEA) and what anatomic factors of the femoral condyle explain variations in maximum MCL tension among knees? We incorporated a PS TKA into a previously developed computational knee model applied to three neutrally aligned, nonarthritic, male cadaveric knees. The computational knee model was previously shown to corroborate coupled motions and ligament loading patterns of the native knee through a range of flexion. Implant geometries were virtually installed using hip-to-ankle CT scans through measured resection and anterior referencing surgical techniques. Collateral ligament properties were standardized across each knee model by defining stiffness and slack lengths based on the healthy population. The femoral component was externally rotated from 0° to 9° relative to the posterior condylar axis in 3° increments. At each increment, the knee was flexed under 500 N compression from 0° to 90° simulating an intraoperative examination. 
The computational model predicted collateral ligament forces, compartmental contact forces, and tibiofemoral internal/external and varus-valgus rotation through the flexion range. The computational model predicted that femoral component external rotation relative to the posterior condylar axis unloads the MCL and the medial compartment; however, these effects were inconsistent from knee to knee. When the femoral component was externally rotated by 9° rather than 0° in knees one, two, and three, the maximum force carried by the MCL decreased a respective 55, 88, and 297 N; the medial contact forces decreased at most a respective 90, 190, and 570 N; external tibial rotation in early flexion increased by a respective 4.6°, 1.1°, and 3.3°; and varus angulation of the tibia relative to the femur in late flexion increased by 8.4°, 8.0°, and 7.9°, respectively. With 3° of femoral component external rotation relative to the posterior condylar axis, the femoral component was still externally rotated by up to 2.7° relative to the sTEA in these three neutrally aligned knees. Variations in MCL force from knee to knee with 3° of femoral component external rotation were related to the ratio of the distances from the femoral insertion of the MCL to the posterior and distal cuts of the implant; the closer this ratio was to 1, the more uniform were the MCL tensions from 0° to 90° flexion. A larger ratio of distances from the femoral insertion of the MCL to the posterior and distal cuts may cause clinically relevant increases in both MCL tension and compartmental contact forces. To obtain more consistent ligament tensions through flexion, it may be important to locate the posterior and distal aspects of the femoral component with respect to the proximal insertion of the MCL such that a ratio of 1 is achieved.
Earth Orbiter 1: Wideband Advanced Recorder and Processor (WARP)
NASA Technical Reports Server (NTRS)
Smith, Terry; Kessler, John
1999-01-01
An advanced on-board spacecraft data system component is presented. The component is computer-based and provides science data acquisition, processing, storage, and base-band transmission functions. Specifically, the component is a very high rate solid state recorder, serving as a pathfinder for achieving the data handling requirements of next-generation hyperspectral imaging missions.
Visualizing and Understanding the Components of Lagrange and Newton Interpolation
ERIC Educational Resources Information Center
Yang, Yajun; Gordon, Sheldon P.
2016-01-01
This article takes a close look at Lagrange and Newton interpolation by graphically examining the component functions of each of these formulas. Although interpolation methods are often considered simply to be computational procedures, we demonstrate how the components of the polynomial terms in these formulas provide insight into where these…
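The component functions the abstract refers to are, for the Lagrange formula, the basis polynomials: the k-th component is 1 at node k and 0 at every other node, so the interpolant is just a weighted sum of them. A minimal sketch (function names and sample nodes are illustrative, not from the article):

```python
def lagrange_basis(xs, k):
    """k-th Lagrange component for nodes xs: the polynomial that equals
    1 at xs[k] and 0 at every other node."""
    def L(x):
        out = 1.0
        for j, xj in enumerate(xs):
            if j != k:
                out *= (x - xj) / (xs[k] - xj)
        return out
    return L

# The interpolating polynomial is a weighted sum of the components:
xs, ys = [0.0, 1.0, 2.0], [1.0, 3.0, 2.0]
p = lambda x: sum(y * lagrange_basis(xs, k)(x) for k, y in enumerate(ys))
```

Graphing each `lagrange_basis(xs, k)` separately is exactly the kind of component-level view the article advocates; Newton's form admits the same decomposition with nested product terms.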
Historical evolution of disease mapping in general and specifically of cancer mapping.
Howe, G M
1989-01-01
The presentation of areal data in epidemiology is illustrated by such mapping techniques as dots (spots), shading (choropleth, thematic) and isolines (isopleths). Examples are also given of computer-assisted cartography (computer graphics) which employs hardware and software components of digital computers, together with the use of geographical and demographic base maps.
Using PC Software To Enhance the Student's Ability To Learn the Exporting Process.
ERIC Educational Resources Information Center
Buckles, Tom A.; Lange, Irene
This paper describes the advantages of using computer simulations in the classroom or managerial environment and the major premise and principal components of Export to Win!, a computer simulation used in international marketing seminars. A rationale for using computer simulations argues that they improve the quality of teaching by building…
A DDC Bibliography on Computers in Information Sciences. Volume I. Information Sciences Series.
ERIC Educational Resources Information Center
Defense Documentation Center, Alexandria, VA.
The unclassified and unlimited bibliography compiles references dealing specifically with the role of computers in information sciences. The volume contains 249 annotated references grouped under two major headings: Time Shared, On-Line, and Real Time Systems, and Computer Components. The references are arranged in accession number (AD-number)…
Computer-Aided College Algebra: Learning Components that Students Find Beneficial
ERIC Educational Resources Information Center
Aichele, Douglas B.; Francisco, Cynthia; Utley, Juliana; Wescoatt, Benjamin
2011-01-01
A mixed-method study was conducted during the Fall 2008 semester to better understand the experiences of students participating in computer-aided instruction of College Algebra using the software MyMathLab. The learning environment included a computer learning system for the majority of the instruction, a support system via focus groups (weekly…
An Analysis of Creative Process Learning in Computer Game Activities through Player Experiences
ERIC Educational Resources Information Center
Inchamnan, Wilawan
2016-01-01
This research investigates the extent to which creative processes can be fostered through computer gaming. It focuses on creative components in games that have been specifically designed for educational purposes: Digital Game Based Learning (DGBL). A behavior analysis for measuring the creative potential of computer game activities and learning…
Interaction and Cognition in Asynchronous Computer Conferencing
ERIC Educational Resources Information Center
Schrire, Sarah
2004-01-01
This paper is based on a multiple-case study of the learning process in three asynchronous computer conferences. The conferences were part of the distance learning component in doctoral degree courses in computing technology in education offered at an American university. The conferences were analyzed from a number of perspectives, the emphasis in…
Students' Attitudes toward Computers at the College of Nursing at King Saud University (KSU)
ERIC Educational Resources Information Center
Samarkandi, Osama Abdulhaleem
2011-01-01
Computer knowledge and skills are becoming essential components of technology in nursing education. Saudi nurses must be prepared to utilize these technologies for the advancement of science and nursing practice in local and global communities. Little attention has been directed to students' attitudes about computer usage in academic communities in…
An Application of a Computer Instructional Management Package.
ERIC Educational Resources Information Center
Sullivan, David W.
Following the presentation of a conceptual framework for computer-based education (CBE), this paper examines the use of one aspect of CBE, computer-managed instruction (CMI), in a Major Appliance Servicing Program. The paper begins with definitions and a graphic illustration of CBE and its components and uses, i.e., CMI, tutorial or…
High-Performance Computing Data Center | Computational Science | NREL
The data center uses warm-water liquid cooling to achieve its very low PUE, captures and re-uses waste heat from computing components as its primary heating source, and employs a dry cooler that uses refrigerant in a passive cycle to dissipate heat, reducing onsite water use. Efficiency is measured through PUE.
A Method for Identifying Contours in Processing Digital Images from Computer Tomograph
NASA Astrophysics Data System (ADS)
Roşu, Şerban; Pater, Flavius; Costea, Dan; Munteanu, Mihnea; Roşu, Doina; Fratila, Mihaela
2011-09-01
The first step in digital processing of two-dimensional computed tomography images is to identify the contours of component elements. This paper presents the collaborative work of specialists in medicine, applied mathematics, and computer science on elaborating new algorithms and methods in medical 2D and 3D imaging.
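As a rough illustration of what contour identification means at the pixel level (not the authors' algorithm, which the abstract does not detail), a minimal threshold-based boundary detector marks above-threshold pixels that touch the background:

```python
def contour_pixels(img, thresh):
    """Mark contour pixels: above-threshold pixels with at least one
    4-neighbour that is below threshold or outside the image."""
    h, w = len(img), len(img[0])
    contour = set()
    for y in range(h):
        for x in range(w):
            if img[y][x] < thresh:
                continue
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if not (0 <= ny < h and 0 <= nx < w) or img[ny][nx] < thresh:
                    contour.add((y, x))
                    break
    return contour

# A 5x5 test image with a bright 3x3 square in the middle:
img = [[255 if 1 <= y <= 3 and 1 <= x <= 3 else 0 for x in range(5)]
       for y in range(5)]
boundary = contour_pixels(img, 128)   # the square's 8 edge pixels; centre excluded
```

Practical CT contouring adds noise suppression and connectivity tracing on top of this kind of boundary test.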
Report on 1984-85 Statewide Computer Survey.
ERIC Educational Resources Information Center
South Carolina State Dept. of Education, Columbia. Office of Instructional Technology.
To follow up an initial statewide computer survey (1983-84), the State Department of Education conducted an investigation of microcomputers and their use in elementary, middle, and secondary schools in South Carolina. The survey was conducted in two phases: an inventory of computer equipment was taken as a component of the Statewide Basic…
A Proposed Programming System for Knuth's Mix Computer.
ERIC Educational Resources Information Center
Akers, Max Neil
A programming system using a hypothetical computer is proposed for use in teaching machine and assembly language programming courses. Major components such as monitor, assembler, interpreter, grader, and diagnostics are described. The interpreter is programmed and documented for use on an IBM 360/67 computer. The interpreter can be used for teaching…
Enhancing Student Writing and Computer Programming with LATEX and MATLAB in Multivariable Calculus
ERIC Educational Resources Information Center
Sullivan, Eric; Melvin, Timothy
2016-01-01
Written communication and computer programming are foundational components of an undergraduate degree in the mathematical sciences. All lower-division mathematics courses at our institution are paired with computer-based writing, coding, and problem-solving activities. In multivariable calculus we utilize MATLAB and LATEX to have students explore…
The Development of Computational Thinking in a High School Chemistry Course
ERIC Educational Resources Information Center
Matsumoto, Paul S.; Cao, Jiankang
2017-01-01
Computational thinking is a component of the Science and Engineering Practices in the Next Generation Science Standards, which were adopted by some states. We describe the activities in a high school chemistry course that may develop students' computational thinking skills by primarily using Excel, a widely available spreadsheet software. These…
77 FR 64439 - Airworthiness Directives; Bell Helicopter Textron Canada (Bell) Model Helicopters
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-22
... Control System] Air Data Computer.'' TCAA issued AD CF-2005-30 to require the procedures in Bell Alert... overspeed warning system, replacing the overspeed warning computer, V ne converter, and pilot and copilot... Aircraft System/Component Code: 3417 Air Data Computer. Issued in Fort Worth, Texas, on October 12, 2012...
ERIC Educational Resources Information Center
Selverian, Melissa E. Markaridian; Lombard, Matthew
2009-01-01
A thorough review of the research relating to Human-Computer Interface (HCI) form and content factors in the education, communication and computer science disciplines reveals strong associations of meaningful perceptual "illusions" with enhanced learning and satisfaction in the evolving classroom. Specifically, associations emerge…
Teaching Computer-Aided Design of Fluid Flow and Heat Transfer Engineering Equipment.
ERIC Educational Resources Information Center
Gosman, A. D.; And Others
1979-01-01
Describes a teaching program for fluid mechanics and heat transfer which contains both computer aided learning (CAL) and computer aided design (CAD) components and argues that the understanding of the physical and numerical modeling taught in the CAL course is essential to the proper implementation of CAD. (Author/CMV)
Reconfigurable vision system for real-time applications
NASA Astrophysics Data System (ADS)
Torres-Huitzil, Cesar; Arias-Estrada, Miguel
2002-03-01
Recently, a growing community of researchers has used reconfigurable systems to solve computationally intensive problems. Reconfigurability provides optimized processors for system-on-chip designs and makes it easy to import technology to a new system through reusable modules. The main objective of this work is the investigation of a reconfigurable computer system targeted for computer vision and real-time applications. The system is intended to circumvent the inherent computational load of most window-based computer vision algorithms. It aims to support such tasks by providing an FPGA-based hardware architecture for task-specific vision applications with enough processing power, using as few hardware resources as possible, and a mechanism for building systems using this architecture. Regarding the software part of the system, a library of pre-designed and general-purpose modules that implement common window-based computer vision operations is being investigated. A common generic interface is established for these modules in order to define hardware/software components. These components can be interconnected to develop more complex applications, providing an efficient mechanism for transferring image and result data among modules. Some preliminary results are presented and discussed.
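The window-based operations these modules implement share one pattern: slide a small neighbourhood over the image and reduce it to one output pixel, which is exactly why they map well to parallel FPGA pipelines. A software sketch of the pattern (names and the mean-filter choice are illustrative, not from the paper):

```python
def window_op(img, op, size=3):
    """Apply a window-based operation over a 2-D image (list of lists);
    the border, where the window does not fit, is left at zero."""
    r = size // 2
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(r, h - r):
        for x in range(r, w - r):
            window = [img[y + dy][x + dx]
                      for dy in range(-r, r + 1) for dx in range(-r, r + 1)]
            out[y][x] = op(window)
    return out

mean3 = lambda w: sum(w) / len(w)      # 3x3 mean (smoothing) filter
img = [[1, 1, 1], [1, 10, 1], [1, 1, 1]]
out = window_op(img, mean3)
```

Swapping `op` for a max, median, or weighted sum yields dilation, median filtering, or convolution from the same skeleton, mirroring the reusable-module idea in the paper.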
Trusted Computing Technologies, Intel Trusted Execution Technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guise, Max Joseph; Wendt, Jeremy Daniel
2011-01-01
We describe the current state-of-the-art in Trusted Computing Technologies, focusing mainly on Intel's Trusted Execution Technology (TXT). This document is based on existing documentation and tests of two existing TXT-based systems: Intel's Trusted Boot and Invisible Things Lab's Qubes OS. We describe what features are lacking in current implementations, describe what a mature system could provide, and present a list of developments to watch. Critical systems perform operation-critical computations on high-importance data. In such systems, the inputs, computation steps, and outputs may be highly sensitive. Sensitive components must be protected from both unauthorized release and unauthorized alteration: unauthorized users should not access the sensitive input and sensitive output data, nor be able to alter them; the computation contains intermediate data with the same requirements, and executes algorithms that the unauthorized should not be able to know or alter. Due to various system requirements, such critical systems are frequently built from commercial hardware, employ commercial software, and require network access. These hardware, software, and network system components increase the risk that sensitive input data, computation, and output data may be compromised.
Space-Shuttle Emulator Software
NASA Technical Reports Server (NTRS)
Arnold, Scott; Askew, Bill; Barry, Matthew R.; Leigh, Agnes; Mermelstein, Scott; Owens, James; Payne, Dan; Pemble, Jim; Sollinger, John; Thompson, Hiram;
2007-01-01
A package of software has been developed to execute a raw binary image of the space shuttle flight software for simulation of the computational effects of operation of space shuttle avionics. This software can be run on inexpensive computer workstations. Heretofore, it was necessary to use real flight computers to perform such tests and simulations. The package includes a program that emulates the space shuttle orbiter general-purpose computer [consisting of a central processing unit (CPU), input/output processor (IOP), master sequence controller, and bus-control elements]; an emulator of the orbiter display electronics unit and models of the associated cathode-ray tubes, keyboards, and switch controls; computational models of the data-bus network; computational models of the multiplexer-demultiplexer components; an emulation of the pulse-code modulation master unit; an emulation of the payload data interleaver; a model of the master timing unit; a model of the mass memory unit; and a software component that ensures compatibility of telemetry and command services between the simulated space shuttle avionics and a mission control center. The software package is portable to several host platforms.
Computer-aided design of antenna structures and components
NASA Technical Reports Server (NTRS)
Levy, R.
1976-01-01
This paper discusses computer-aided design procedures for antenna reflector structures and related components. The primary design aid is a computer program that establishes cross sectional sizes of the structural members by an optimality criterion. Alternative types of deflection-dependent objectives can be selected for designs subject to constraints on structure weight. The computer program has a special-purpose formulation to design structures of the type frequently used for antenna construction. These structures, in common with many in other areas of application, are represented by analytical models that employ only the three translational degrees of freedom at each node. The special-purpose construction of the program, however, permits coding and data management simplifications that provide advantages in problem size and execution speed. Size and speed are essentially governed by the requirements of structural analysis and are relatively unaffected by the added requirements of design. Computation times to execute several design/analysis cycles are comparable to the times required by general-purpose programs for a single analysis cycle. Examples in the paper illustrate effective design improvement for structures with several thousand degrees of freedom and within reasonable computing times.
Stress Testing of Data-Communication Networks
NASA Technical Reports Server (NTRS)
Leucht, Kurt; Bedette, Guy
2006-01-01
NetStress is a computer program that stress-tests a data-communication network and components thereof. NetStress comprises two components running, respectively, in a transmitting system and a receiving system connected to a network under test.
Ko, Sungahn; Zhao, Jieqiong; Xia, Jing; Afzal, Shehzad; Wang, Xiaoyu; Abram, Greg; Elmqvist, Niklas; Kne, Len; Van Riper, David; Gaither, Kelly; Kennedy, Shaun; Tolone, William; Ribarsky, William; Ebert, David S
2014-12-01
We present VASA, a visual analytics platform consisting of a desktop application, a component model, and a suite of distributed simulation components for modeling the impact of societal threats such as weather, food contamination, and traffic on critical infrastructure such as supply chains, road networks, and power grids. Each component encapsulates a high-fidelity simulation model that together form an asynchronous simulation pipeline: a system of systems of individual simulations with a common data and parameter exchange format. At the heart of VASA is the Workbench, a visual analytics application providing three distinct features: (1) low-fidelity approximations of the distributed simulation components using local simulation proxies to enable analysts to interactively configure a simulation run; (2) computational steering mechanisms to manage the execution of individual simulation components; and (3) spatiotemporal and interactive methods to explore the combined results of a simulation run. We showcase the utility of the platform using examples involving supply chains during a hurricane as well as food contamination in a fast food restaurant chain.
Computer analysis of the leaf movements of pinto beans.
Hoshizaki, T; Hamner, K C
1969-07-01
Computer analysis was used for the detection of rhythmic components and the estimation of period length in leaf movement records. The results of this study indicated that spectral analysis can be profitably used to determine rhythmic components in leaf movements. In Pinto bean plants (Phaseolus vulgaris L.) grown for 28 days under continuous light of 750 ft-c and at a constant temperature of 28 degrees, there was only one highly significant rhythmic component in the leaf movements. The period of this rhythm was 27.3 hr. In plants grown at 20 degrees, there were two highly significant rhythmic components: one of 13.8 hr and a much stronger one of 27.3 hr. At 15 degrees, the highly significant rhythmic components were also 27.3 and 13.8 hr in length but were of equal intensity. Random movements less than 9 hr in length became very pronounced at this temperature. At 10 degrees, no significant rhythm was found in the leaf movements. At 5 degrees, the leaf movements ceased within 1 day.
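The core of such a spectral analysis is estimating power at candidate periods and picking the strongest. A toy correlation scan illustrating the idea (not the paper's actual procedure; the synthetic record and candidate periods below are hypothetical, with the 27.3 hr value borrowed only as an example):

```python
import math

def dominant_period(samples, dt, periods):
    """Correlate the mean-removed series with sin/cos at each candidate
    period and return the period with the largest power."""
    n = len(samples)
    mean = sum(samples) / n
    best_T, best_power = None, -1.0
    for T in periods:
        w = 2 * math.pi / T
        c = sum((s - mean) * math.cos(w * i * dt) for i, s in enumerate(samples))
        q = sum((s - mean) * math.sin(w * i * dt) for i, s in enumerate(samples))
        power = c * c + q * q
        if power > best_power:
            best_T, best_power = T, power
    return best_T

# Synthetic record: a pure 27.3 hr rhythm sampled every 30 min for 10 days.
samples = [math.sin(2 * math.pi * (i * 0.5) / 27.3) for i in range(480)]
```

Real leaf-movement records mix several rhythms plus noise, which is why the study reports significance levels rather than a single winning period.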
NASA Tech Briefs, July 1994. Volume 18, No. 7
NASA Technical Reports Server (NTRS)
1994-01-01
Topics covered include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-12
... Certain Wireless Communication Devices, Portable Music and Data Processing Devices, Computers and..., portable music and data processing devices, computers and components thereof. The complaint names as...
NASA Tech Briefs, November 2000. Volume 24, No. 11
NASA Technical Reports Server (NTRS)
2000-01-01
Topics covered include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Test and Measurement; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Data Acquisition.
NASA Tech Briefs, April 1996. Volume 20, No. 4
NASA Technical Reports Server (NTRS)
1996-01-01
Topics covered include: Advanced Composites and Plastics; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information; Books and Reports.
NASA Tech Briefs, October 1994. Volume 18, No. 10
NASA Technical Reports Server (NTRS)
1994-01-01
Topics: Data Acquisition and Analysis; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences; Books and Reports
Aerodynamic analysis for aircraft with nacelles, pylons, and winglets at transonic speeds
NASA Technical Reports Server (NTRS)
Boppe, Charles W.
1987-01-01
A computational method has been developed to provide an analysis for complex realistic aircraft configurations at transonic speeds. Wing-fuselage configurations with various combinations of pods, pylons, nacelles, and winglets can be analyzed along with simpler shapes such as airfoils, isolated wings, and isolated bodies. The flexibility required for the treatment of such diverse geometries is obtained by using a multiple nested grid approach in the finite-difference relaxation scheme. Aircraft components (and their grid systems) can be added or removed as required. As a result, the computational method can be used in the same manner as a wind tunnel to study high-speed aerodynamic interference effects. The multiple grid approach also provides a high boundary-point density/cost ratio. High resolution pressure distributions can be obtained. Computed results are correlated with wind tunnel and flight data using four different transport configurations. Experimental/computational component interference effects are included for cases where data are available. The computer code used for these comparisons is described in the appendices.
The Numerical Propulsion System Simulation: An Overview
NASA Technical Reports Server (NTRS)
Lytle, John K.
2000-01-01
Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, databases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.
A computational approach for coupled 1D and 2D/3D CFD modelling of pulse Tube cryocoolers
NASA Astrophysics Data System (ADS)
Fang, T.; Spoor, P. S.; Ghiaasiaan, S. M.
2017-12-01
The physics behind Stirling-type cryocoolers is complicated. One-dimensional (1D) simulation tools offer limited detail and accuracy, in particular for cryocoolers that have non-linear configurations. Multi-dimensional Computational Fluid Dynamics (CFD) methods are useful but are computationally expensive when simulating cryocooler systems in their entirety. In view of the fact that some components of a cryocooler, e.g., inertance tubes and compliance tanks, can be modelled as 1D components with little loss of critical information, a 1D-2D/3D coupled model was developed. Accordingly, one-dimensional-like components are represented by specifically developed routines. These routines can be coupled to CFD codes and provide boundary conditions for 2D/3D CFD simulations. The developed coupled model, while preserving sufficient flow field detail, is two orders of magnitude faster than equivalent 2D/3D CFD models. The predictions show good agreement with experimental data and with 2D/3D CFD models.
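The kind of 1D component routine described above can be illustrated with a lumped acoustic model of an inertance tube terminated by a compliance tank, whose input impedance could serve as a boundary condition for the 2D/3D domain. This is a hedged sketch: the gas properties, geometry, frequency, and the neglect of viscous resistance are all illustrative assumptions, not the paper's model.

```python
import numpy as np

# Working-gas and operating assumptions (illustrative, helium-like values).
rho = 16.0        # density at working pressure, kg/m^3
a = 480.0         # sound speed, m/s
f = 60.0          # operating frequency, Hz
omega = 2 * np.pi * f

# Inertance tube: length l, diameter d, cross-sectional area A.
l, d = 0.8, 3e-3
A = np.pi * d**2 / 4
L_acoustic = rho * l / A            # acoustic inertance, kg/m^4

# Compliance tank of volume V terminating the tube.
V = 0.5e-3
C_acoustic = V / (rho * a**2)       # acoustic compliance, m^3/Pa

# Lossless lumped input impedance: series inertance, shunt compliance.
Z_in = 1j * omega * L_acoustic + 1.0 / (1j * omega * C_acoustic)
print(abs(Z_in), np.degrees(np.angle(Z_in)))
```

In a coupled scheme, this impedance (or a short transmission-line network of such elements) relates pressure and volume flow at the interface, which is exactly the information a CFD boundary condition needs.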
NASA Astrophysics Data System (ADS)
Geng, Weihua; Zhao, Shan
2017-12-01
We present a new Matched Interface and Boundary (MIB) regularization method for treating charge singularity in solvated biomolecules whose electrostatics are described by the Poisson-Boltzmann (PB) equation. In a regularization method, the potential function is decomposed into two or three components; the singular component can be represented analytically by the Green's function, while the other components possess higher regularity. Our new regularization combines the efficiency of two-component schemes with the accuracy of three-component schemes. Based on this regularization, a new MIB finite difference algorithm is developed for solving both linear and nonlinear PB equations, where the nonlinearity is handled by the inexact-Newton method. Compared with the existing MIB PB solver based on a three-component regularization, the present algorithm is simpler to implement because it circumvents the work of solving a boundary value Poisson equation inside the molecular interface and of computing the related interface jump conditions numerically. Moreover, the new MIB algorithm is computationally less expensive while maintaining the same second-order accuracy. This is numerically verified by calculating the electrostatic potential and solvation energy on the Kirkwood sphere, for which analytical solutions are available, and on a series of proteins of various sizes.
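The decomposition idea above can be sketched in standard notation. This is a generic illustration with conventional symbols (point charges $q_i$ at positions $\mathbf{r}_i$, solute dielectric constant $\epsilon_m$, Gaussian units), not the paper's exact formulation:

```latex
% Generic two-component regularization (symbols are assumptions):
% the potential splits into a singular Coulomb part and a regular part.
\phi(\mathbf{r}) = \phi_s(\mathbf{r}) + \phi_r(\mathbf{r}),
\qquad
\phi_s(\mathbf{r}) = \sum_{i=1}^{N_c} \frac{q_i}{\epsilon_m\,\lvert \mathbf{r}-\mathbf{r}_i \rvert},
```

so that the Green's-function component $\phi_s$ absorbs the delta-function charge sources of the PB equation analytically, and only the smoother remainder $\phi_r$ has to be computed on the finite difference grid.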
Python as a federation tool for GENESIS 3.0.
Cornelis, Hugo; Rodriguez, Armando L; Coop, Allan D; Bower, James M
2012-01-01
The GENESIS simulation platform was one of the first broad-scale modeling systems in computational biology to encourage modelers to develop and share model features and components. Supported by a large developer community, it participated in innovative simulator technologies such as benchmarking, parallelization, and declarative model specification and was the first neural simulator to define bindings for the Python scripting language. An important feature of the latest version of GENESIS is that it decomposes into self-contained software components complying with the Computational Biology Initiative federated software architecture. This architecture allows separate scripting bindings to be defined for different necessary components of the simulator, e.g., the mathematical solvers and graphical user interface. Python is a scripting language that provides rich sets of freely available open source libraries. With clean dynamic object-oriented designs, they produce highly readable code and are widely employed in specialized areas of software component integration. We employ a simplified wrapper and interface generator to examine an application programming interface and make it available to a given scripting language. This allows independent software components to be 'glued' together and connected to external libraries and applications from user-defined Python or Perl scripts. We illustrate our approach with three examples of Python scripting. (1) Generate and run a simple single-compartment model neuron connected to a stand-alone mathematical solver. (2) Interface a mathematical solver with GENESIS 3.0 to explore a neuron morphology from either an interactive command-line or graphical user interface. (3) Apply scripting bindings to connect the GENESIS 3.0 simulator to external graphical libraries and an open source three dimensional content creation suite that supports visualization of models based on electron microscopy and their conversion to computational models. 
Employed in this way, the stand-alone software components of the GENESIS 3.0 simulator provide a framework for progressive federated software development in computational neuroscience.
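The component-gluing pattern described above can be illustrated with a short script in the style of example (1): a scripting-language object stands in for a wrapped stand-alone mathematical solver driving a single-compartment model neuron. This is a hedged sketch; the class and parameter names are hypothetical stand-ins, not the actual GENESIS 3.0 bindings.

```python
class Solver:
    """Stand-in for a wrapped stand-alone mathematical solver component."""

    def __init__(self, dt):
        self.dt, self.t, self.vm = dt, 0.0, -65.0

    def step(self, i_inject, g_leak=0.3, e_leak=-65.0, cm=1.0):
        # Forward-Euler update of a passive single-compartment membrane
        # (units here are nominal: mV, nA, uS, nF).
        dv = (i_inject - g_leak * (self.vm - e_leak)) / cm
        self.vm += self.dt * dv
        self.t += self.dt
        return self.vm

# "Glue" script: configure the solver and run a current-injection sweep.
solver = Solver(dt=0.01)
trace = [solver.step(i_inject=1.0) for _ in range(1000)]
# The membrane depolarizes from rest toward e_leak + i_inject/g_leak.
print(round(trace[-1], 2))
```

In the federated architecture, the class above would instead be generated by the wrapper/interface generator from a C component's API, but the calling pattern from Python is the same.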
WMT: The CSDMS Web Modeling Tool
NASA Astrophysics Data System (ADS)
Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.
2015-12-01
The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can: design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse, and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation.
Once a simulation completes, its output, in NetCDF, is packaged and uploaded to a data server where it is stored and from which a user can download it as a single compressed archive file.
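The JSON-encoded model descriptions exchanged by the server layers can be sketched as follows. This is a hedged illustration of the pattern only: the component names, parameter keys, port identifiers, and the example route are assumptions, not the real WMT metadata schema or API.

```python
import json

# A coupled-model description: components with parameters, plus connections
# wiring a "provides" port of one component to a "uses" port of another.
model = {
    "name": "delta-progradation",
    "components": [
        {"id": "hydrotrend", "parameters": {"run_duration": 100}},
        {"id": "cem", "parameters": {"grid_spacing": 200.0}},
    ],
    "connections": [
        {"from": "hydrotrend.discharge", "to": "cem.river_inflow"},
    ],
}

payload = json.dumps(model)

# In a live setting the client would submit this to the API layer, e.g.
# (hypothetical route, shown for shape only):
#   requests.post("https://csdms.example.edu/wmt-api/models/new",
#                 data=payload, headers={"Content-Type": "application/json"})
print(len(json.loads(payload)["components"]))
```

The round-trip through `json.dumps`/`json.loads` mirrors how the browser client and the `wmt-db`/`wmt-api` layers stay decoupled: each side only agrees on the message shape, not on implementation.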
Cheng, Tao; Zhang, Guoyou; Zhang, Xianlong
2011-12-01
The aim of computer-assisted surgery is to improve accuracy and limit the range of surgical variability. However, a worldwide debate exists regarding the importance and usefulness of computer-assisted navigation for total knee arthroplasty (TKA). The main purpose of this study is to summarize and compare the radiographic outcomes of TKA performed using imageless computer-assisted navigation compared with conventional techniques. An electronic search of the PubMed, EMBASE, Web of Science, and Cochrane Library databases was made, in addition to a manual search of major orthopedic journals. A meta-analysis of 29 quasi-randomized/randomized controlled trials (quasi-RCTs/RCTs) and 11 prospective comparative studies was conducted through a random effects model. A priori sources of clinical heterogeneity were additionally evaluated by subgroup analysis with regard to radiographic methods. When the outlier cut-off value of the lower limb axis was defined as ±2° or ±3° from neutral, the postoperative full-length radiographs demonstrated a risk ratio of 0.54 or 0.39, respectively, in favor of the navigated group. When the cut-off value used for the alignment in the coronal and sagittal plane was 2° or 3°, imageless navigation significantly reduced the outlier rate of the femoral and tibial components compared with the conventional group. Notably, computed tomography scans demonstrated no statistically significant differences between the two groups regarding outliers in the rotational alignment of the femoral and tibial components; however, there was strong statistical heterogeneity. Our results indicated that imageless computer-assisted navigation systems improve the lower limb axis and component orientation in the coronal and sagittal planes, but not the rotational alignment, in TKA.
Further multicenter clinical trials with long-term follow-up are needed to determine differences in the clinical and functional outcomes of knee arthroplasties performed using computer-assisted techniques.
Reconfigurable Analog PDE Computation for Baseband and RF Computation
2017-03-01
waveguiding PDEs. One-dimensional ladder topologies enable linear delays, linear-phase analog filters, as well as analog beamforming, potentially at RF... performance. This discussion focuses on ODE/PDE analog computation available in SoC FPAA structures. One such computation is a ladder filter (Fig... Implementation of a one-dimensional ladder filter for computing inductor (L) and capacitor (C) lines. These components can be implemented in CABs or as
USSR Report, Military Affairs Foreign Military Review No 6, June 1986
1986-11-20
computers used for an objective accounting of the difference in current firing conditions from standard hold an important place in integrated fire... control systems of modern tanks of capitalist countries. Mechanical ballistic computers gave way in the early 1970's to electronic computers, initially... made with analog components. Then digital ballistic computers were created, installed in particular in the M1 Abrams and Leopard-2 tanks. The basic
Hybrid, experimental and computational, investigation of mechanical components
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1996-07-01
Computational and experimental methodologies have unique features for the analysis and solution of a wide variety of engineering problems. Computations provide results that depend on selection of input parameters such as geometry, material constants, and boundary conditions which, for correct modeling purposes, have to be appropriately chosen. In addition, it is relatively easy to modify the input parameters in order to computationally investigate different conditions. Experiments provide solutions which characterize the actual behavior of the object of interest subjected to specific operating conditions. However, it is impractical to experimentally perform parametric investigations. This paper discusses the use of a hybrid, computational and experimental, approach for study and optimization of mechanical components. Computational techniques are used for modeling the behavior of the object of interest while it is experimentally tested using noninvasive optical techniques. Comparisons are performed through a fringe predictor program used to facilitate the correlation between both techniques. In addition, experimentally obtained quantitative information, such as displacements and shape, can be applied in the computational model in order to improve this correlation. The result is a validated computational model that can be used for performing quantitative analyses and structural optimization. Practical application of the hybrid approach is illustrated with a representative example which demonstrates the viability of the approach as an engineering tool for structural analysis and optimization.
Kavlock, Robert; Dix, David
2010-02-01
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models.
The models and underlying data are being made publicly available through the Aggregated Computational Toxicology Resource (ACToR), the Distributed Structure-Searchable Toxicity (DSSTox) Database Network, and other U.S. EPA websites. While initially focused on improving the hazard identification process, the CTRP is placing increasing emphasis on using high-throughput bioactivity profiling data in systems modeling to support quantitative risk assessments, and in developing complementary higher throughput exposure models. This integrated approach will enable analysis of life-stage susceptibility, and understanding of the exposures, pathways, and key events by which chemicals exert their toxicity in developing systems (e.g., endocrine-related pathways). The CTRP will be a critical component in next-generation risk assessments utilizing quantitative high-throughput data and providing a much higher capacity for assessing chemical toxicity than is currently available.
Hydrodynamic design of generic pump components
NASA Technical Reports Server (NTRS)
Eastland, A. H. J.; Dodson, H. C.
1991-01-01
Inducer and impeller base geometries were defined for a fuel pump for a generic generator cycle. Blade surface data and inlet flowfield definition are available in sufficient detail to allow computational fluid dynamics analysis of the two components.
Program For Analysis Of Metal-Matrix Composites
NASA Technical Reports Server (NTRS)
Murthy, P. L. N.; Mital, S. K.
1994-01-01
METCAN (METal matrix Composite ANalyzer) is a computer program used to computationally simulate the nonlinear behavior of high-temperature metal-matrix composite structural components in specific applications, providing comprehensive analyses of thermal and mechanical performance. Written in FORTRAN 77.
NASA Tech Briefs, August 2000. Volume 24, No. 8
NASA Technical Reports Server (NTRS)
2000-01-01
Topics include: Simulation/Virtual Reality; Test and Measurement; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Medical Design.
Computer models and output, Spartan REM: Appendix B
NASA Technical Reports Server (NTRS)
Marlowe, D. S.; West, E. J.
1984-01-01
A computer model of the Spartan Release Engagement Mechanism (REM) is presented in a series of numerical charts and engineering drawings. A crack growth analysis code is used to predict the fracture mechanics of critical components.
The Numerical Propulsion System Simulation: A Multidisciplinary Design System for Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Lytle, John K.
1999-01-01
Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of the design process, providing the designer with critical information about the components early in the design process. This paper describes the development of the Numerical Propulsion System Simulation (NPSS), a multidisciplinary system of analysis tools that is focused on extending the simulation capability from components to the full system. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.
NASA Technical Reports Server (NTRS)
Dahl, Milo D.
2010-01-01
Codes for predicting supersonic jet mixing and broadband shock-associated noise were assessed using a database containing noise measurements of a jet issuing from a convergent nozzle. Two types of codes were used to make predictions. Fast running codes containing empirical models were used to compute both the mixing noise component and the shock-associated noise component of the jet noise spectrum. One Reynolds-averaged, Navier-Stokes-based code was used to compute only the shock-associated noise. To enable the comparisons of the predicted component spectra with data, the measured total jet noise spectra were separated into mixing noise and shock-associated noise components. Comparisons were made for 1/3-octave spectra and some power spectral densities using data from jets operating at 24 conditions covering essentially 6 fully expanded Mach numbers with 4 total temperature ratios.
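Separating a measured total spectrum into its mixing and shock-associated components, as described above, rests on the fact that uncorrelated noise sources add in mean-square (power) units, not in decibels. The following hedged sketch shows that arithmetic with illustrative band levels; it is the generic dB bookkeeping, not the paper's separation procedure.

```python
import numpy as np

def db_to_power(spl_db):
    """Convert sound pressure level in dB to relative mean-square power."""
    return 10.0 ** (np.asarray(spl_db) / 10.0)

def power_to_db(p):
    """Convert relative mean-square power back to dB."""
    return 10.0 * np.log10(p)

# Illustrative 1/3-octave band levels, dB (not measured data).
total_spl = np.array([110.0, 112.0, 115.0, 113.0])
mixing_spl = np.array([109.0, 110.0, 111.0, 112.0])

# Shock-associated component = total minus mixing, subtracted in power.
shock_power = db_to_power(total_spl) - db_to_power(mixing_spl)
shock_spl = power_to_db(shock_power)
print(np.round(shock_spl, 1))
```

Note that a band where the mixing component nearly equals the total yields a large negative power difference sensitivity, which is why component separation is hardest where one source dominates.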
NASA Astrophysics Data System (ADS)
Welty, N.; Rudolph, M.; Schäfer, F.; Apeldoorn, J.; Janovsky, R.
2013-07-01
This paper presents a computational methodology to predict the satellite system-level effects resulting from impacts of untrackable space debris particles. This approach seeks to improve on traditional risk assessment practices by looking beyond the structural penetration of the satellite and predicting the physical damage to internal components and the associated functional impairment caused by untrackable debris impacts. The proposed method combines a debris flux model with the Schäfer-Ryan-Lambert ballistic limit equation (BLE), which accounts for the inherent shielding of components positioned behind the spacecraft structure wall. Individual debris particle impact trajectories and component shadowing effects are considered and the failure probabilities of individual satellite components as a function of mission time are calculated. These results are correlated to expected functional impairment using a Boolean logic model of the system functional architecture considering the functional dependencies and redundancies within the system.
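The chain from debris flux to component failure probability to system-level impairment can be sketched numerically. This is a hedged toy model under stated assumptions: penetrating impacts arrive as a Poisson process, component failures are independent, and the flux values and two-string redundancy layout are illustrative, not taken from the paper or from the Schäfer-Ryan-Lambert BLE.

```python
import math

def p_fail(penetrating_flux_per_m2_yr, exposed_area_m2, years):
    """Probability of at least one penetrating impact (Poisson arrivals)."""
    expected_hits = penetrating_flux_per_m2_yr * exposed_area_m2 * years
    return 1.0 - math.exp(-expected_hits)

# Boolean system logic: two redundant power units (both must fail) feeding
# one non-redundant on-board computer (a series element).
p_power_unit = p_fail(2e-4, 0.3, 7.0)   # per unit, over a 7-year mission
p_obc = p_fail(5e-5, 0.1, 7.0)

p_power_branch = p_power_unit ** 2                          # parallel (AND)
p_system = 1.0 - (1.0 - p_power_branch) * (1.0 - p_obc)     # series (OR)
print(f"{p_system:.2e}")
```

In a full assessment the per-component fluxes would themselves come from a debris flux model filtered through a ballistic limit equation that accounts for structure-wall shielding and shadowing; here they are simply assumed inputs.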
Duan, Xian-Chun; Wang, Yong-Zhong; Zhang, Jun-Ru; Luo, Huan; Zhang, Heng; Xia, Lun-Zhu
2011-08-01
The objective was to establish a dynamics model for the extraction of the lipophilic components of Panax notoginseng with supercritical carbon dioxide (CO2). Based on the theory of counter-flow mass transfer and on the molecular mass transfer between the material and the supercritical CO2 fluid, expressed as a differential mass-conservation equation, a dynamics model was established and solved, and its predictions were compared with the experimental extraction process. The computed results of the model were essentially consistent with the experiments. The supercritical fluid extraction dynamics model established in this work describes the mass-transfer mechanism by which the lipophilic components of Panax notoginseng dissolve during extraction, agrees with the actual extraction process, and provides guidance for the industrial scale-up of supercritical CO2 fluid extraction.
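A differential mass balance of the general kind invoked above can be sketched as follows. This is a generic two-phase packed-bed formulation with assumed symbols (fluid-phase concentration $c$, solid-phase loading $q$, interstitial velocity $u$, overall mass-transfer coefficient $k$, specific interfacial area $a$, equilibrium concentration $c^{*}$, bed void fraction $\varepsilon$), not the equation set of the paper:

```latex
% Generic counter-current mass-conservation sketch (symbols are assumptions):
\varepsilon\,\frac{\partial c}{\partial t} + u\,\frac{\partial c}{\partial z}
  = k\,a\,\bigl(c^{*} - c\bigr),
\qquad
(1-\varepsilon)\,\frac{\partial q}{\partial t}
  = -\,k\,a\,\bigl(c^{*} - c\bigr),
```

where the source term $k\,a\,(c^{*}-c)$ couples the fluid and solid phases: solute leaves the plant material exactly as fast as it accumulates in the supercritical solvent.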
Tracking by Identification Using Computer Vision and Radio
Mandeljc, Rok; Kovačič, Stanislav; Kristan, Matej; Perš, Janez
2013-01-01
We present a novel system for detection, localization, and tracking of multiple people, which fuses a multi-view computer vision approach with a radio-based localization system. The proposed fusion combines the best of both worlds: excellent computer-vision-based localization and strong identity information provided by the radio system. It is therefore able to perform tracking by identification, which makes it impervious to propagated identity switches. We present a comprehensive methodology for the evaluation of systems that perform person localization in a world coordinate system and use it to evaluate the proposed system as well as its components. Experimental results on a challenging indoor dataset, which involves multiple people walking around a realistically cluttered room, confirm that the proposed fusion of both systems significantly outperforms its individual components. Compared to the radio-based system, it achieves better localization results, while at the same time it successfully prevents the propagation of identity switches that occur in pure computer-vision-based tracking. PMID:23262485
Patwary, Nurmohammed; Preza, Chrysanthe
2015-01-01
A depth-variant (DV) image restoration algorithm for wide-field fluorescence microscopy, using an orthonormal basis decomposition of DV point-spread functions (PSFs), is investigated in this study. The efficient PSF representation is based on a previously developed principal component analysis (PCA), which is computationally intensive. We present an approach developed to reduce the number of DV PSFs required for the PCA computation, thereby making the PCA-based approach computationally tractable for thick samples. Restoration results from both synthetic and experimental images are consistent and show that the proposed algorithm efficiently addresses depth-induced aberration using a small number of principal components. Comparison of the PCA-based algorithm with a previously developed strata-based DV restoration algorithm demonstrates that the proposed method improves performance by 50% in terms of accuracy and simultaneously reduces the processing time by 64% using comparable computational resources. PMID:26504634
Computer network defense system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Urias, Vincent; Stout, William M. S.; Loverro, Caleb
A method and apparatus for protecting virtual machines. A computer system creates a copy of a group of the virtual machines in an operating network in a deception network to form a group of cloned virtual machines in the deception network when the group of the virtual machines is accessed by an adversary. The computer system creates an emulation of components from the operating network in the deception network. The components are accessible by the group of the cloned virtual machines as if the group of the cloned virtual machines was in the operating network. The computer system moves network connections for the group of the virtual machines in the operating network used by the adversary from the group of the virtual machines in the operating network to the group of the cloned virtual machines, enabling protecting the group of the virtual machines from actions performed by the adversary.
From serological to computer cross-matching in nine hospitals.
Georgsen, J; Kristensen, T
1998-01-01
In 1991 it was decided to reorganise the transfusion service of the County of Funen. The aims were to standardise and improve the quality of blood components, laboratory procedures, and the transfusion service, and to reduce the number of outdated blood units. Part of the efficiency gains was reinvested in a dedicated computer system making it possible, among other things, to change the cross-match procedures from serological to computer cross-matching according to the ABCD-concept. This communication describes how this transition was performed in terms of laboratory techniques, education of personnel, and implementation of the computer system, and indicates the results obtained. The Funen Transfusion Service has by now performed more than 100,000 red cell transfusions based on ABCD cross-matching and has not encountered any problems. Major results are the significant reductions in cross-match procedures and blood grouping, as well as in the number of outdated blood components.
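The rule an electronic (computer) cross-match automates can be sketched as follows: issue a unit without serological testing only when the patient has a confirmed blood group on file, no clinically significant antibodies, and the unit is ABO/RhD compatible. This is a hedged illustration using the standard ABO compatibility table; the record fields are invented for the example and are not the ABCD-concept's actual data model.

```python
ABO_COMPATIBLE = {           # recipient -> acceptable red-cell unit groups
    "O":  {"O"},
    "A":  {"A", "O"},
    "B":  {"B", "O"},
    "AB": {"AB", "A", "B", "O"},
}

def electronic_crossmatch(patient, unit):
    """Return True if a unit may be issued on computer cross-match alone."""
    if not patient["group_confirmed_twice"]:
        return False                  # fall back to serological cross-match
    if patient["antibody_screen_positive"]:
        return False                  # antibodies require serological work-up
    if unit["abo"] not in ABO_COMPATIBLE[patient["abo"]]:
        return False
    # RhD-negative recipients should receive RhD-negative red cells.
    if patient["rhd"] == "neg" and unit["rhd"] != "neg":
        return False
    return True

patient = {"abo": "A", "rhd": "neg",
           "group_confirmed_twice": True, "antibody_screen_positive": False}
print(electronic_crossmatch(patient, {"abo": "O", "rhd": "neg"}))  # True
print(electronic_crossmatch(patient, {"abo": "B", "rhd": "neg"}))  # False
```

The point of encoding the rule is exactly the one the abstract reports: once the prerequisites are in the database, the serological cross-match step can be skipped safely and at scale.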
Users matter : multi-agent systems model of high performance computing cluster users.
DOE Office of Scientific and Technical Information (OSTI.GOV)
North, M. J.; Hood, C. S.; Decision and Information Sciences
2005-01-01
High performance computing clusters have been a critical resource for computational science for over a decade and have more recently become integral to large-scale industrial analysis. Despite their well-specified components, the aggregate behavior of clusters is poorly understood. The difficulties arise from complicated interactions between cluster components during operation. These interactions have been studied by many researchers, some of whom have identified the need for holistic multi-scale modeling that simultaneously includes network level, operating system level, process level, and user level behaviors. Each of these levels presents its own modeling challenges, but the user level is the most complex due to the adaptability of human beings. In this vein, there are several major user modeling goals, namely descriptive modeling, predictive modeling and automated weakness discovery. This study shows how multi-agent techniques were used to simulate a large-scale computing cluster at each of these levels.
Component-specific modeling. [jet engine hot section components
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Maffeo, R. J.; Tipton, M. T.; Weber, G.
1992-01-01
Accomplishments are described for a 3 year program to develop methodology for component-specific modeling of aircraft hot section components (turbine blades, turbine vanes, and burner liners). These accomplishments include: (1) engine thermodynamic and mission models, (2) geometry model generators, (3) remeshing, (4) specialty three-dimensional inelastic structural analysis, (5) computationally efficient solvers, (6) adaptive solution strategies, (7) engine performance parameters/component response variables decomposition and synthesis, (8) integrated software architecture and development, and (9) validation cases for software developed.
NASA Technical Reports Server (NTRS)
Goldstein, Arthur W
1947-01-01
The performance of the turbine component of an NACA research jet engine was investigated with cold air. The interaction and the matching of the turbine with the NACA eight-stage compressor were computed with the combination considered as a jet engine. The over-all performance of the engine was then determined. The internal aerodynamics were studied to the extent of investigating the performance of the first stator ring and its influence on the turbine performance. For this ring, the stream-filament method for computing velocity distribution permitted efficient sections to be designed, but the design condition of free-vortex flow with uniform axial velocities was not obtained.
Halftoning method for the generation of motion stimuli
NASA Technical Reports Server (NTRS)
Mulligan, Jeffrey B.; Stone, Leland S.
1989-01-01
This paper describes a novel computer-graphic technique for the generation of a broad class of motion stimuli for vision research, which uses color table animation in conjunction with a single base image. Using this technique, contrast and temporal frequency can be varied with a negligible amount of computation, once a single base image is produced. Since only two bit planes are needed to display a single drifting grating, an eight-bit/pixel display can be used to generate four-component plaids, in which each component of the plaid has independently programmable contrast and temporal frequency. Because the contrasts and temporal frequencies of the various components are mutually independent, a large number of two-dimensional stimulus motions can be produced from a single image file.
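The bit-plane packing can be illustrated in software. The sketch below (the spatial frequencies and the 4-level phase quantization are illustrative assumptions, not the paper's exact scheme) packs four 2-bit quantized gratings into one 8-bit base image and builds the 256-entry color table that, recomputed each frame, sets every component's contrast and temporal phase:

```python
import numpy as np

# Pack four gratings, each quantized to 2 bits of spatial phase, into one
# 8-bit base image. Frequencies (fx, fy) in cycles per image are invented.
H = W = 64
yy, xx = np.mgrid[0:H, 0:W]
base = np.zeros((H, W), dtype=np.uint8)
for k, (fx, fy) in enumerate([(4, 0), (0, 4), (3, 3), (2, 5)]):
    phase = 2*np.pi*(fx*xx + fy*yy) / W                     # spatial phase
    levels = (np.floor((phase % (2*np.pi)) / (np.pi/2)) % 4).astype(np.uint8)
    base |= levels << (2*k)                                  # 2 bits per grating

def palette(contrasts, phases):
    """256-entry gray lookup table: decode the four 2-bit phase levels from
    each index and sum the component sinusoids, each with its own contrast
    and temporal phase. Animation rewrites only this table, not the image."""
    idx = np.arange(256)
    gray = np.zeros(256)
    for k, (c, ph) in enumerate(zip(contrasts, phases)):
        level = (idx >> (2*k)) & 3          # 2-bit quantized spatial phase
        gray += c * np.cos(np.pi/2 * level + ph)
    return gray
```

Advancing each component's temporal phase by its own increment per frame drifts the gratings independently, which is the point of the technique.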
Extreme-Scale De Novo Genome Assembly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Georganas, Evangelos; Hofmeyr, Steven; Egan, Rob
De novo whole genome assembly reconstructs genomic sequence from short, overlapping, and potentially erroneous DNA segments and is one of the most important computations in modern genomics. This work presents HipMer, a high-quality end-to-end de novo assembler designed for extreme scale analysis, via efficient parallelization of the Meraculous code. Genome assembly software has many components, each of which stresses different components of a computer system. This chapter explains the computational challenges involved in each step of the HipMer pipeline, the key distributed data structures, and communication costs in detail. We present performance results of assembling the human genome and the large hexaploid wheat genome on large supercomputers up to tens of thousands of cores.
NASA Technical Reports Server (NTRS)
Tezduyar, Tayfun E.
1998-01-01
This is the final report covering our work at the University of Minnesota. The report describes our research progress and accomplishments in the development of high performance computing methods and tools for 3D finite element computation of aerodynamic characteristics and fluid-structure interactions (FSI) arising in airdrop systems, namely ram-air parachutes and round parachutes. This class of simulations involves complex geometries, flexible structural components, deforming fluid domains, and unsteady flow patterns. The key components of our simulation toolkit are a stabilized finite element flow solver, a nonlinear structural dynamics solver, an automatic mesh moving scheme, and an interface between the fluid and structural solvers; all of these have been developed within a parallel message-passing paradigm.
Life Prediction for a CMC Component Using the NASALIFE Computer Code
NASA Technical Reports Server (NTRS)
Gyekenyesi, John Z.; Murthy, Pappu L. N.; Mital, Subodh K.
2005-01-01
The computer code, NASALIFE, was used to provide estimates for life of an SiC/SiC stator vane under varying thermomechanical loading conditions. The primary intention of this effort is to show how the computer code NASALIFE can be used to provide reasonable estimates of life for practical propulsion system components made of advanced ceramic matrix composites (CMC). Simple loading conditions provided readily observable and acceptable life predictions. Varying the loading conditions such that low cycle fatigue and creep were affected independently provided expected trends in the results for life due to varying loads and life due to creep. Analysis was based on idealized empirical data for the 9/99 Melt Infiltrated SiC fiber reinforced SiC.
NASA Technical Reports Server (NTRS)
Kemp, Victoria R.
1992-01-01
A fluid-dynamic, digital-transient computer model of an integrated, parallel propulsion system was developed for the CDC mainframe and the SUN workstation computers. Since all STME component designs were used for the integrated system, computer subroutines were written characterizing the performance and geometry of all the components used in the system, including the manifolds. Three transient analysis reports were completed. The first report evaluated the feasibility of integrated engine systems with regard to start and cutoff transient behavior. The second report evaluated turbopump-out and combined thrust chamber/turbopump-out conditions. The third report presented sensitivity study results for staggered gas generator spin start and for pump performance characteristics.
NASA Astrophysics Data System (ADS)
Speidel, Steven
1992-08-01
Our ultimate goal is to develop neural-like cognitive sensory processing within non-neuronal systems. Toward this end, computational models are being developed for selectively attending to the task-relevant parts of composite sensory excitations in an example sound processing application. Significant stimulus partials are selectively attended through the use of generalized neural adaptive beamformers. Computational components are being tested by experiment in the laboratory and also by use of recordings from sensor deployments in the ocean. Results will be presented. These computational components are being integrated into a comprehensive processing architecture that simultaneously attends memory according to stimuli, attends stimuli according to memory, and attends stimuli and memory according to an ongoing thought process. The proposed neural architecture is potentially very fast when implemented in special hardware.
Using a Modular Construction Kit for the Realization of an Interactive Computer Graphics Course.
ERIC Educational Resources Information Center
Klein, Reinhard; Hanisch, Frank
Recently, platform independent software components, like JavaBeans, have appeared that allow writing reusable components and composing them in a visual builder tool into new applications. This paper describes the use of such models to transform an existing course into a modular construction kit consisting of components of teaching text and program…
Jeff Palmer; Adrienn Andersch; Jan Wiedenbeck; Urs. Buehlmann
2014-01-01
WoodCite is a Microsoft® Access-based application that allows wood component manufacturers to develop product price quotations for their current and potential customers. The application was developed by the U.S. Forest Service and Virginia Polytechnic Institute and State University, in cooperation with the Wood Components Manufacturers Association.
Computational mechanics and physics at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
South, Jerry C., Jr.
1987-01-01
An overview is given of computational mechanics and physics at NASA Langley Research Center. Computational analysis is a major component and tool in many of Langley's diverse research disciplines, as well as in the interdisciplinary research. Examples are given for algorithm development and advanced applications in aerodynamics, transition to turbulence and turbulence simulation, hypersonics, structures, and interdisciplinary optimization.
ERIC Educational Resources Information Center
Hunt, Darwin P.
The use of systems theory as a conceptual framework is proposed as useful when considering computers as a machine component in teaching. Skinner's proposal that the label "computer" is inaccurate and counterproductive when used to refer to a machine being used for teaching is discussed. It is suggested that the alternative label…
ERIC Educational Resources Information Center
Balajthy, Ernest
Intended for reading and language arts teachers at all educational levels, this guide presents information to be used by teachers in constructing their own computer assisted educational software using the BASIC programming language and Apple computers. Part 1 provides an overview of the components of traditional tutorial and drill-and-practice…
The Nature-Computer Camp. Final Evaluation Report, 1982-1983. E.C.I.A. Chapter 2.
ERIC Educational Resources Information Center
District of Columbia Public Schools, Washington, DC. Div. of Quality Assurance.
This report presents a description and evaluation of the Nature-Computer Camp (NCC), an environmental and computer science program designed for sixth grade students in the District of Columbia public schools. Among the major components of the program were: planning for administration of operating the camp and for instruction in environmental…
Teachers' Helpers: Experimental Evidence from Costa Rica on Computers for English Language Learning
ERIC Educational Resources Information Center
Humpage, Sarah; Álvarez-Marinelli, Horacio
2014-01-01
Computers have taken an increasingly prominent role in education around the world in recent years in developed and developing countries alike. As developing country governments have turned their focus from increasing enrollment to improving the quality of education in their schools, many have made access to computers a key component to their…
An approximation formula for a class of fault-tolerant computers
NASA Technical Reports Server (NTRS)
White, A. L.
1986-01-01
An approximation formula is derived for the probability of failure for fault-tolerant process-control computers. These computers use redundancy and reconfiguration to achieve high reliability. Finite-state Markov models capture the dynamic behavior of component failure and system recovery, and the approximation formula permits an estimation of system reliability by an easy examination of the model.
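The modeling approach can be illustrated concretely (this is not White's actual formula or parameters, just an invented example in the same spirit): a small finite-state Markov model with an absorbing failure state, whose failure probability over a mission is obtained by powering the transition matrix.

```python
import numpy as np

# Illustrative finite-state Markov model of a reconfigurable triplex computer
# (all probabilities are invented): state 0 = three good units, state 1 =
# one latent fault awaiting reconfiguration, state 2 = reconfigured duplex,
# state 3 = absorbing system-failure state. A second fault arriving before
# reconfiguration (state 1 -> 3) models near-coincident faults.
p_fault = 1e-4   # per-step probability that a given unit fails
p_recover = 0.9  # per-step probability a latent fault is reconfigured away

P = np.array([
    [1 - 3*p_fault, 3*p_fault,                 0.0,           0.0],
    [0.0,           1 - p_recover - 2*p_fault, p_recover,     2*p_fault],
    [0.0,           0.0,                       1 - 2*p_fault, 2*p_fault],
    [0.0,           0.0,                       0.0,           1.0],
])

def failure_probability(steps):
    """Probability of having reached the failed state after `steps` transitions."""
    start = np.array([1.0, 0.0, 0.0, 0.0])
    return (start @ np.linalg.matrix_power(P, steps))[3]
```

An approximation formula of the kind derived in the paper replaces this exact matrix computation with a closed-form estimate that can be read off the model's rates.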
Software for computing plant biomass: BIOPAK users guide.
Joseph E. Means; Heather A. Hansen; Greg J. Koerper; Paul B Alaback; Mark W. Klopsch
1994-01-01
BIOPAK is a menu-driven package of computer programs for IBM-compatible personal computers that calculates the biomass, area, height, length, or volume of plant components (leaves, branches, stem, crown, and roots). The routines were written in FoxPro, Fortran, and C. BIOPAK was created to facilitate linking of a diverse array of vegetation datasets with the...
Developing Student Programming and Problem-Solving Skills with Visual Basic
ERIC Educational Resources Information Center
Siegle, Del
2009-01-01
Although most computer users will never need to write a computer program, many students enjoy the challenge of creating one. Computer programming enhances students' problem solving by forcing students to break a problem into its component pieces and reassemble it in a generic format that can be understood by a nonsentient entity. It promotes…
All-optical reservoir computing.
Duport, François; Schneider, Bendix; Smerieri, Anteo; Haelterman, Marc; Massar, Serge
2012-09-24
Reservoir Computing is a novel computing paradigm that uses a nonlinear recurrent dynamical system to carry out information processing. Recent electronic and optoelectronic Reservoir Computers based on an architecture with a single nonlinear node and a delay loop have shown performance on standardized tasks comparable to state-of-the-art digital implementations. Here we report an all-optical implementation of a Reservoir Computer, made of off-the-shelf components for optical telecommunications. It uses the saturation of a semiconductor optical amplifier as nonlinearity. The present work shows that, within the Reservoir Computing paradigm, all-optical computing with state-of-the-art performance is possible.
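The single-nonlinear-node-plus-delay-loop architecture can be emulated digitally. The sketch below is a toy time-multiplexed reservoir in which a tanh nonlinearity and arbitrary constants stand in for the saturating semiconductor optical amplifier; a linear readout trained by ridge regression on the state matrix would complete the computer.

```python
import numpy as np

# Toy software emulation of a delay-loop reservoir with a single nonlinear
# node. N virtual nodes live along the delay line; an input mask multiplexes
# each sample u(t) across them. All constants are illustrative.
rng = np.random.default_rng(0)
N = 50
mask = rng.uniform(-1, 1, N)   # input mask applied along the delay loop

def run_reservoir(u, alpha=0.9, beta=0.5):
    """Return the (len(u), N) matrix of virtual-node states for input u."""
    states = np.zeros((len(u), N))
    x = np.zeros(N)
    for k, uk in enumerate(u):
        # each virtual node sees the masked input plus the delayed state of
        # its neighbor in the loop; tanh stands in for the SOA saturation
        x = np.tanh(alpha * np.roll(x, 1) + beta * mask * uk)
        states[k] = x
    return states
```

Training then consists only of fitting output weights on `states`, which is what makes the paradigm attractive for analog hardware: the reservoir itself is never trained.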
Two Computer Programs for the Statistical Evaluation of a Weighted Linear Composite.
ERIC Educational Resources Information Center
Sands, William A.
1978-01-01
Two computer programs (one batch, one interactive) are designed to provide statistics for a weighted linear combination of several component variables. Both programs provide mean, variance, standard deviation, and a validity coefficient. (Author/JKS)
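The statistics these programs report can be reproduced in a few lines. The sketch below (hypothetical function name; NumPy stands in for the original batch and interactive programs) computes the mean, variance, standard deviation, and validity coefficient of a weighted linear composite:

```python
import numpy as np

def composite_statistics(X, weights, criterion):
    """Statistics of the weighted linear composite C = X @ weights.

    X:         (n_subjects, n_components) component score matrix
    weights:   (n_components,) weight vector
    criterion: (n_subjects,) external criterion scores

    Returns (mean, variance, standard deviation, validity coefficient),
    where the validity coefficient is the correlation of C with the criterion.
    """
    composite = X @ weights
    validity = np.corrcoef(composite, criterion)[0, 1]
    return (composite.mean(), composite.var(ddof=1),
            composite.std(ddof=1), validity)
```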
Faster, Better, Cheaper: A Decade of PC Progress.
ERIC Educational Resources Information Center
Crawford, Walt
1997-01-01
Reviews the development of personal computers and how computer components have changed in price and value. Highlights include disk drives; keyboards; displays; memory; color graphics; modems; CPU (central processing unit); storage; direct mail vendors; and future possibilities. (LRW)
Computer Use in Research Exercises: Some Suggested Procedures for Undergraduate Political Science.
ERIC Educational Resources Information Center
Comer, John
1979-01-01
Describes some procedures designed to assist instructors in developing a research component using the computer. Benefits include development of research skills, kindling student interest in the field of political science, and recruitment potential. (Author/CK)
NASA Tech Briefs, July 2000. Volume 24, No. 7
NASA Technical Reports Server (NTRS)
2000-01-01
Topics covered include: Data Acquisition; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Test and Measurement; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.
Mission definition study for Stanford relativity satellite. Volume 3: Appendices
NASA Technical Reports Server (NTRS)
1971-01-01
An analysis is presented for the cost of the mission as a function of the following variables: amount of redundancy in the spacecraft, amount of care taken in building the spacecraft (functional and environmental tests, screening of components, quality control, etc.), and the number of flights necessary to accomplish the mission. Thermal analysis and mathematical models for the experimental components are presented. The results of computer structural and stress analyses for supports and cylinders are discussed. Reliability, quality control, and control system simulation by computer are also considered.
NASA Technical Reports Server (NTRS)
Stroupe, Ashley W.; Okon, Avi; Robinson, Matthew; Huntsberger, Terry; Aghazarian, Hrand; Baumgartner, Eric
2004-01-01
Robotic Construction Crew (RCC) is a heterogeneous multi-robot system for autonomous acquisition, transport, and precision mating of components in construction tasks. RCC minimizes the use of resources that are constrained in a space environment, such as computation, power, communication, and sensing. A behavior-based architecture provides adaptability and robustness despite low computational requirements. RCC successfully performs several construction-related tasks in an emulated outdoor environment despite high levels of uncertainty in motion and sensing. Quantitative results are provided for formation keeping in component transport, precision instrument placement, and construction tasks.
Enabling computer decisions based on EEG input.
Culpepper, Benjamin J; Keller, Robert M
2003-12-01
Multilayer neural networks were successfully trained to classify segments of 12-channel electroencephalogram (EEG) data into one of five classes corresponding to five cognitive tasks performed by a subject. Independent component analysis (ICA) was used to segregate obvious artifact EEG components from other sources, and a frequency-band representation was used to represent the sources computed by ICA. Examples of results include an 85% accuracy rate on differentiation between two tasks, using a segment of EEG only 0.05 s long and a 95% accuracy rate using a 0.5-s-long segment.
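The frequency-band representation of an ICA source can be sketched as follows (the band edges and function name are assumptions; the ICA step itself, e.g. scikit-learn's FastICA, is omitted):

```python
import numpy as np

def band_powers(source, fs, bands=((4, 8), (8, 13), (13, 30))):
    """Mean spectral power of one ICA source in each frequency band.

    source: 1-D time series (one independent component)
    fs:     sampling rate in Hz
    bands:  (low, high) band edges in Hz; the defaults are the conventional
            theta/alpha/beta bands, an assumption rather than the paper's choice.
    """
    freqs = np.fft.rfftfreq(len(source), d=1.0/fs)
    power = np.abs(np.fft.rfft(source))**2
    return [power[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands]
```

Concatenating these band powers across the retained sources yields the feature vector a multilayer network can classify.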
Video analysis of projectile motion using tablet computers as experimental tools
NASA Astrophysics Data System (ADS)
Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.
2014-01-01
Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and g in order to explore the underlying laws of motion. This experiment can easily be performed by students themselves, providing more autonomy in their problem-solving processes than traditional learning approaches. We believe that this autonomy and the authenticity of the experimental tool both foster their motivation.
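The component-by-component analysis reduces to fitting the vertical data to the constant-acceleration law y(t) = y0 + v0*t - (g/2)*t^2, so the quadratic coefficient of the fit equals -g/2. A sketch with synthetic, noiseless measurements standing in for the app's exported data:

```python
import numpy as np

# Hypothetical video-analysis export: vertical position samples of the ball
# (synthetic here, generated from known initial conditions).
g_true, v0, y0 = 9.81, 4.0, 1.2
t = np.linspace(0.0, 0.8, 25)
y = y0 + v0*t - 0.5*g_true*t**2

# Fit a quadratic; the leading coefficient is -g/2.
a2, a1, a0 = np.polyfit(t, y, 2)
g_est = -2.0 * a2
```

With real tablet data the same fit recovers g to within the measurement noise, which is what lets students determine g themselves.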
Smith, Zachary J; Strombom, Sven; Wachsmann-Hogiu, Sebastian
2011-08-29
A multivariate optical computer has been constructed consisting of a spectrograph, digital micromirror device, and photomultiplier tube that is capable of determining absolute concentrations of individual components of a multivariate spectral model. We present experimental results on ternary mixtures, showing accurate quantification of chemical concentrations based on integrated intensities of fluorescence and Raman spectra measured with a single point detector. We additionally show in simulation that point measurements based on principal component spectra retain the ability to classify cancerous from noncancerous T cells.
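A digital analogue of this multivariate computation is linear unmixing of a measured spectrum against known pure-component spectra. The sketch below uses synthetic Gaussian bands; all numbers are illustrative, not the paper's data (the instrument performs the equivalent projection optically, with a single point detector):

```python
import numpy as np

# Synthetic pure-component spectra: three Gaussian "bands" on a common axis.
wavelengths = np.linspace(0.0, 1.0, 200)
pure = np.stack([np.exp(-((wavelengths - c) / 0.08)**2)
                 for c in (0.25, 0.5, 0.75)], axis=1)    # shape (200, 3)

# A ternary mixture spectrum and its least-squares unmixing.
true_conc = np.array([0.2, 0.5, 0.3])
mixture = pure @ true_conc
est_conc, *_ = np.linalg.lstsq(pure, mixture, rcond=None)
```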
Analysis of whisker-toughened CMC structural components using an interactive reliability model
NASA Technical Reports Server (NTRS)
Duffy, Stephen F.; Palko, Joseph L.
1992-01-01
Realizing wider utilization of ceramic matrix composites (CMC) requires the development of advanced structural analysis technologies. This article focuses on the use of interactive reliability models to predict component probability of failure. The deterministic Willam-Warnke failure criterion serves as the theoretical basis for the reliability model presented here. The model has been implemented into a test-bed software program. This computer program has been coupled to a general-purpose finite element program. A simple structural problem is presented to illustrate the reliability model and the computer algorithm.
Advances in Human-Computer Interaction: Graphics and Animation Components for Interface Design
NASA Astrophysics Data System (ADS)
Cipolla Ficarra, Francisco V.; Nicol, Emma; Cipolla-Ficarra, Miguel; Richardson, Lucy
We present an analysis of communicability methodology in graphics and animation components for interface design, called CAN (Communicability, Acceptability and Novelty). This methodology has been under development between 2005 and 2010, obtaining excellent results in cultural heritage, education and microcomputing contexts, in studies where there is a bi-directional interrelation between ergonomics, usability, user-centered design, software quality and human-computer interaction. We also present heuristic results about iconography and layout design in blogs and websites of the following countries: Spain, Italy, Portugal and France.
1986-07-01
[Table-of-contents fragments only; recoverable topics: functions and applications of an off-line computer-aided operation management system, hardware components, system comparisons, plant visits, computer-aided operation management systems reviewed for analysis of basic functions, and progress of software system installation.]
MAGIC Computer Simulation. Volume 2: Analyst Manual, Part 1
1971-05-01
A review of the subject MAGIC Computer Simulation User and Analyst Manuals has been conducted based upon a request received from the US Army. The MAGIC computer simulation generates target description data consisting of item-by-item listings of the target's components and air...
Specialized computer architectures for computational aerodynamics
NASA Technical Reports Server (NTRS)
Stevenson, D. K.
1978-01-01
In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general purpose computers, a cost that is high in terms of dollar expenditure, elapsed time, or both. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.
CICE, The Los Alamos Sea Ice Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunke, Elizabeth; Lipscomb, William; Jones, Philip
The Los Alamos sea ice model (CICE) is the result of an effort to develop a computationally efficient sea ice component for a fully coupled atmosphere–land–ocean–ice global climate model. It was originally designed to be compatible with the Parallel Ocean Program (POP), an ocean circulation model developed at Los Alamos National Laboratory for use on massively parallel computers. CICE has several interacting components: a vertical thermodynamic model that computes local growth rates of snow and ice due to vertical conductive, radiative and turbulent fluxes, along with snowfall; an elastic-viscous-plastic model of ice dynamics, which predicts the velocity field of the ice pack based on a model of the material strength of the ice; an incremental remapping transport model that describes horizontal advection of the areal concentration, ice and snow volume and other state variables; and a ridging parameterization that transfers ice among thickness categories based on energetic balances and rates of strain. It also includes a biogeochemical model that describes evolution of the ice ecosystem. The CICE sea ice model is used for climate research as one component of complex global earth system models that include atmosphere, land, ocean and biogeochemistry components. It is also used for operational sea ice forecasting in the polar regions and in numerical weather prediction models.
Niazi, Muaz A
2014-01-01
The body structure of snakes is composed of numerous natural components, thereby making it resilient, flexible, adaptive, and dynamic. In contrast, current computer animations as well as physical implementations of snake-like autonomous structures are typically designed to use either a single component or a relatively small number of components. As a result, not only are these artificial structures constrained by the dimensions of the constituent components, but they often also require relatively more computationally intensive algorithms to model and animate. Still, these animations often lack life-like resilience and adaptation. This paper presents a solution to the problem of modeling snake-like structures by proposing an agent-based, self-organizing algorithm resulting in an emergent and surprisingly resilient dynamic structure involving a minimum of interagent communication. Extensive simulation experiments demonstrate the effectiveness as well as the resilience of the proposed approach. The ideas originating from the proposed algorithm can not only be used for developing self-organizing animations but can also have practical applications such as in the form of complex, autonomous, evolvable robots with self-organizing, mobile components with minimal individual computational capabilities. The work also demonstrates the utility of exploratory agent-based modeling (EABM) in the engineering of artificial life-like complex adaptive systems.
OVERSMART Reporting Tool for Flow Computations Over Large Grid Systems
NASA Technical Reports Server (NTRS)
Kao, David L.; Chan, William M.
2012-01-01
Structured grid solvers such as NASA's OVERFLOW compressible Navier-Stokes flow solver can generate large data files that contain convergence histories for flow equation residuals, turbulence model equation residuals, component forces and moments, and component relative motion dynamics variables. Most of today's large-scale problems can extend to hundreds of grids, and over 100 million grid points. However, due to the lack of efficient tools, only a small fraction of the information contained in these files is analyzed. OVERSMART (OVERFLOW Solution Monitoring And Reporting Tool) provides a comprehensive report of solution convergence of flow computations over large, complex grid systems. It produces a one-page executive summary of the behavior of flow equation residuals, turbulence model equation residuals, and component forces and moments. Under the automatic option, a matrix of commonly viewed plots such as residual histograms, composite residuals, sub-iteration bar graphs, and component forces and moments is automatically generated. Specific plots required by the user can also be prescribed via a command file or a graphical user interface. Output is directed to the user's computer screen and/or to an html file for archival purposes. The current implementation has been targeted for the OVERFLOW flow solver, which is used to obtain a flow solution on structured overset grids. The OVERSMART framework allows easy extension to other flow solvers.
Real-time machine vision system using FPGA and soft-core processor
NASA Astrophysics Data System (ADS)
Malik, Abdul Waheed; Thörnberg, Benny; Meng, Xiaozhou; Imran, Muhammad
2012-06-01
This paper presents a machine vision system for real-time computation of the distance and angle of a camera from reference points in the environment. Image pre-processing, component labeling and feature extraction modules were modeled at Register Transfer (RT) level and synthesized for implementation on field programmable gate arrays (FPGA). The extracted image component features were sent from the hardware modules to a soft-core processor, MicroBlaze, for computation of distance and angle. A CMOS imaging sensor operating at a clock frequency of 27 MHz was used in our experiments to produce a video stream at the rate of 75 frames per second. The image component labeling and feature extraction modules ran in parallel with a total latency of 13 ms. The MicroBlaze was interfaced with the component labeling and feature extraction modules through a Fast Simplex Link (FSL). The latency for computing the distance and angle of the camera from the reference points was measured to be 2 ms on the MicroBlaze, running at a 100 MHz clock frequency. In this paper, we present the performance analysis, device utilization and power consumption for the designed system. The FPGA-based machine vision system that we propose has high frame speed, low latency and a power consumption that is much lower compared to commercially available smart camera solutions.
Computational dynamics of soft machines
NASA Astrophysics Data System (ADS)
Hu, Haiyan; Tian, Qiang; Liu, Cheng
2017-06-01
Soft machine refers to a kind of mechanical system made of soft materials to complete sophisticated missions, such as handling a fragile object and crawling along a narrow tunnel corner, under low cost control and actuation. Hence, soft machines have raised great challenges to computational dynamics. In this review article, recent studies of the authors on the dynamic modeling, numerical simulation, and experimental validation of soft machines are summarized in the framework of multibody system dynamics. The dynamic modeling approaches are presented first for the geometric nonlinearities of coupled overall motions and large deformations of a soft component, the physical nonlinearities of a soft component made of hyperelastic or elastoplastic materials, and the frictional contacts/impacts of soft components, respectively. Then the computation approach is outlined for the dynamic simulation of soft machines governed by a set of differential-algebraic equations of very high dimensions, with an emphasis on the efficient computations of the nonlinear elastic force vector of finite elements. The validations of the proposed approaches are given via three case studies, including the locomotion of a soft quadrupedal robot, the spinning deployment of a solar sail of a spacecraft, and the deployment of a mesh reflector of a satellite antenna, as well as the corresponding experimental studies. Finally, some remarks are made for future studies.
Scalable Robust Principal Component Analysis Using Grassmann Averages.
Hauberg, Søren; Feragen, Aasa; Enficiaud, Raffi; Black, Michael J
2016-11-01
In large datasets, manual data verification is impossible, and we must expect the number of outliers to increase with data size. While principal component analysis (PCA) can reduce data size, and scalable solutions exist, it is well-known that outliers can arbitrarily corrupt the results. Unfortunately, state-of-the-art approaches for robust PCA are not scalable. We note that in a zero-mean dataset, each observation spans a one-dimensional subspace, giving a point on the Grassmann manifold. We show that the average subspace corresponds to the leading principal component for Gaussian data. We provide a simple algorithm for computing this Grassmann Average (GA), and show that the subspace estimate is less sensitive to outliers than PCA for general distributions. Because averages can be efficiently computed, we immediately gain scalability. We exploit robust averaging to formulate the Robust Grassmann Average (RGA) as a form of robust PCA. The resulting Trimmed Grassmann Average (TGA) is appropriate for computer vision because it is robust to pixel outliers. The algorithm has linear computational complexity and minimal memory requirements. We demonstrate TGA for background modeling, video restoration, and shadow removal. We show scalability by performing robust PCA on the entire Star Wars IV movie; a task beyond any current method. Source code is available online.
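As an illustration of the subspace-averaging idea, the following is a minimal NumPy sketch (not the authors' released implementation; the iteration scheme and parameter names are our own). Sign-aligning each observation with the current estimate before averaging is equivalent to the norm-weighted average of the unit vectors spanning each one-dimensional subspace.

```python
import numpy as np

def grassmann_average(X, n_iter=20, seed=0):
    """Estimate the leading principal direction of zero-mean data X (n x p)
    by averaging the one-dimensional subspaces spanned by the observations.
    Each row is sign-aligned with the current estimate before averaging,
    which amounts to a norm-weighted average of unit vectors."""
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(X.shape[1])
    q /= np.linalg.norm(q)
    for _ in range(n_iter):
        signs = np.sign(X @ q)           # align each subspace representative
        signs[signs == 0] = 1.0
        mu = (signs[:, None] * X).mean(axis=0)
        nrm = np.linalg.norm(mu)
        if nrm == 0.0:                   # degenerate case: all mass cancels
            break
        q = mu / nrm
    return q
```

Because each step is a single pass of averaging, the cost per iteration is linear in the number of observations, which is the source of the scalability claimed above.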
Links, Jonathan M; Schwartz, Brian S; Lin, Sen; Kanarek, Norma; Mitrani-Reiser, Judith; Sell, Tara Kirk; Watson, Crystal R; Ward, Doug; Slemp, Cathy; Burhans, Robert; Gill, Kimberly; Igusa, Tak; Zhao, Xilei; Aguirre, Benigno; Trainor, Joseph; Nigg, Joanne; Inglesby, Thomas; Carbone, Eric; Kendra, James M
2018-02-01
Policy-makers and practitioners have a need to assess community resilience in disasters. Prior efforts conflated resilience with community functioning, combined resistance and recovery (the components of resilience), and relied on a static model for what is inherently a dynamic process. We sought to develop linked conceptual and computational models of community functioning and resilience after a disaster. We developed a system dynamics computational model that predicts community functioning after a disaster. The computational model outputted the time course of community functioning before, during, and after a disaster, which was used to calculate resistance, recovery, and resilience for all US counties. The conceptual model explicitly separated resilience from community functioning and identified all key components for each, which were translated into a system dynamics computational model with connections and feedbacks. The components were represented by publicly available measures at the county level. Baseline community functioning, resistance, recovery, and resilience evidenced a range of values and geographic clustering, consistent with hypotheses based on the disaster literature. The work is transparent, motivates ongoing refinements, and identifies areas for improved measurements. After validation, such a model can be used to identify effective investments to enhance community resilience. (Disaster Med Public Health Preparedness. 2018;12:127-137).
Individualized Educational Programming for the Mentally Retarded.
ERIC Educational Resources Information Center
Singh, Nirbhay N.; Ahrens, Michael G.
1980-01-01
The minimal components of a model which utilizes a computer for summarizing individual performance records for teaching educational skills to the mentally retarded are described. The most important components are assessment, individual and group programming, continuous data collection, and program evaluation. (Author)
Stress analysis under component relative interference fit
NASA Technical Reports Server (NTRS)
Taylor, C. M.
1978-01-01
Finite-element computer program enables analysis of distortions and stresses occurring in components having relative interference. Program restricts itself to simple elements and axisymmetric loading situations. External inertial and thermal loads may be applied in addition to forces arising from interference conditions.
The Mechanization of Design and Manufacturing.
ERIC Educational Resources Information Center
Gunn, Thomas G.
1982-01-01
Describes changes in the design of products and in planning, managing, and coordinating their manufacture. Focuses on discrete-products manufacturing industries, encompassing the fabrication and assembly of automobiles, aircraft, computers and microelectronic components of computers, furniture, appliances, foods, clothing, building materials, and…
An Analog Computer for Electronic Engineering Education
ERIC Educational Resources Information Center
Fitch, A. L.; Iu, H. H. C.; Lu, D. D. C.
2011-01-01
This paper describes a compact analog computer and proposes its use in electronic engineering teaching laboratories to develop student understanding of applications in analog electronics, electronic components, engineering mathematics, control engineering, safe laboratory and workshop practices, circuit construction, testing, and maintenance. The…
DOT National Transportation Integrated Search
2003-10-01
The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : WSDOT deployment". This document defines the objective, approach,...
AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.
As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...
Computer program determines chemical equilibria in complex systems
NASA Technical Reports Server (NTRS)
Gordon, S.; Zeleznik, F. J.
1966-01-01
Computer program numerically solves nonlinear algebraic equations for chemical equilibrium based on iteration equations independent of choice of components. This program calculates theoretical performance for frozen and equilibrium composition during expansion and Chapman-Jouguet flame properties, studies combustion, and designs hardware.
ERIC Educational Resources Information Center
Standing, Roy A.
1982-01-01
Reviews the basic concepts and technology behind the functions computers perform, describes the miniaturization of computer components, discusses the development of the microprocessor and the microcomputer, and makes projections concerning the future of the microcomputer market. Information is provided on the features, costs, and manufacturers of…
AIR TOXICS EMISSIONS FROM ELECTRONICS INCINERATION
The purpose of this project is to examine the emissions of air toxics from the combustion of electronics equipment, primarily personal computer components. Due to a shortage of recycling programs for personal computers and other personal electronics equipment, most of these mate...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 12 CFR § 502.27 (Banks and Banking, Calculation of Assessments): How does OTS determine the risk/complexity component for a savings and loan holding company? (a) OTS computes the risk/complexity component for responsible savings and loan...
High-sensitivity observations of 28 pulsars
NASA Technical Reports Server (NTRS)
Weisberg, J. M.; Armstrong, B. K.; Backus, P. R.; Cordes, J. M.; Boriakoff, V.
1986-01-01
Average 430-MHz pulse profiles and, where possible, modulation indices and pulse-nulling fractions are computed for 28 pulsars. Morphological classifications are determined for most of the pulsars. It is found that core emission components tend to have lower modulation indices than conal components, and that pulsars having only a core component never exhibit pulse nulling. PSR 1612+07 is shown to undergo mode changes.
JMS Proxy and C/C++ Client SDK
NASA Technical Reports Server (NTRS)
Wolgast, Paul; Pechkam, Paul
2007-01-01
JMS Proxy and C/C++ Client SDK (JMS signifies "Java messaging service" and "SDK" signifies "software development kit") is a software package for developing interfaces that enable legacy programs (here denoted "clients") written in the C and C++ languages to communicate with each other via a JMS broker. This package consists of two main components: the JMS proxy server component and the client C library SDK component. The JMS proxy server component implements a native Java process that receives and responds to requests from clients. This component can run on any computer that supports Java and a JMS client. The client C library SDK component is used to develop a JMS client program running in each affected C or C++ environment, without need for running a Java virtual machine in the affected computer. A C client program developed by use of this SDK has most of the quality-of-service characteristics of standard Java-based client programs, including the following: Durable subscriptions; Asynchronous message receipt; Such standard JMS message qualities as "TimeToLive," "Message Properties," and "DeliveryMode" (as the quoted terms are defined in previously published JMS documentation); and Automatic reconnection of a JMS proxy to a restarted JMS broker.
ERIC Educational Resources Information Center
Zadahmad, Manouchehr; Yousefzadehfard, Parisa
2016-01-01
Mobile Cloud Computing (MCC) aims to improve all mobile applications such as m-learning systems. This study presents an innovative method to use web technology and software engineering's best practices to provide m-learning functionalities hosted in a MCC-learning system as service. Components hosted by MCC are used to empower developers to create…
Defense Acquisitions Acronyms and Terms
2012-12-01
Computer-Aided Design CADD Computer-Aided Design and Drafting CAE Component Acquisition Executive; Computer-Aided Engineering CAIV Cost As an...Radiation to Ordnance HFE Human Factors Engineering HHA Health Hazard Assessment HNA Host-Nation Approval HNS Host-Nation Support HOL High-Order...Engineering Change Proposal VHSIC Very High Speed Integrated Circuit VLSI Very Large Scale Integration VOC Volatile Organic Compound W WAN Wide
Accelerating Climate and Weather Simulations through Hybrid Computing
NASA Technical Reports Server (NTRS)
Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark
2011-01-01
Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.
Importance of balanced architectures in the design of high-performance imaging systems
NASA Astrophysics Data System (ADS)
Sgro, Joseph A.; Stanton, Paul C.
1999-03-01
Imaging systems employed in demanding military and industrial applications, such as automatic target recognition and computer vision, typically require real-time high-performance computing resources. While high-performance computing systems have traditionally relied on proprietary architectures and custom components, recent advances in general-purpose microprocessor technology have produced an abundance of low-cost components suitable for use in high-performance computing systems. A common pitfall in the design of high-performance imaging systems, particularly systems employing scalable multiprocessor architectures, is the failure to balance computational and memory bandwidth. The performance of standard cluster designs, for example, in which several processors share a common memory bus, is typically constrained by memory bandwidth. The characteristic symptom of this problem is the failure of system performance to scale as more processors are added. The problem is exacerbated if I/O and memory functions share the same bus. The recent introduction of microprocessors with large internal caches and high-performance external memory interfaces makes it practical to design high-performance imaging systems with balanced computational and memory bandwidth. Real-world examples of such designs are presented, along with a discussion of adapting algorithm design to best utilize the available memory bandwidth.
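The balance argument above can be made concrete with a roofline-style calculation: a kernel's attainable throughput is capped by either peak compute or by memory bandwidth times arithmetic intensity, whichever is smaller. The peak figures below are hypothetical, not measurements of any particular system.

```python
# Roofline-style check of whether a kernel is compute- or bandwidth-bound.
# All peak numbers are illustrative assumptions.
peak_flops = 50e9                 # peak compute rate, FLOP/s
peak_bw = 10e9                    # peak memory bandwidth, bytes/s
ai = 0.25                         # kernel arithmetic intensity, FLOPs per byte
attainable = min(peak_flops, peak_bw * ai)  # bandwidth-bound: 2.5 GFLOP/s
```

When `attainable` is set by the bandwidth term, as here, adding processors that share the same memory bus cannot raise throughput, which is exactly the scaling failure described in the abstract.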
Puniya, Bhanwar Lal; Allen, Laura; Hochfelder, Colleen; Majumder, Mahbubul; Helikar, Tomáš
2016-01-01
Dysregulation in signal transduction pathways can lead to a variety of complex disorders, including cancer. Computational approaches such as network analysis are important tools to understand system dynamics as well as to identify critical components that could be further explored as therapeutic targets. Here, we performed perturbation analysis of a large-scale signal transduction model in extracellular environments that stimulate cell death, growth, motility, and quiescence. Each of the model's components was perturbed under both loss-of-function and gain-of-function mutations. Using 1,300 simulations under both types of perturbations across various extracellular conditions, we identified the most and least influential components based on the magnitude of their influence on the rest of the system. Based on the premise that the most influential components might serve as better drug targets, we characterized them for biological functions, housekeeping genes, essential genes, and druggable proteins. The most influential components under all environmental conditions were enriched with several biological processes. The inositol pathway was found to be most influential under inactivating perturbations, whereas the kinase and small cell lung cancer pathways were identified as the most influential under activating perturbations. The most influential components were enriched with essential genes and druggable proteins. Moreover, known cancer drug targets were also classified as influential components based on the affected components in the network. Additionally, the systemic perturbation analysis of the model revealed a network motif of most influential components which affect each other. Furthermore, our analysis predicted novel combinations of cancer drug targets with various effects on other most influential components.
We found that the combinatorial perturbation consisting of PI3K inactivation and overactivation of IP3R1 can lead to increased activity levels of apoptosis-related components and tumor-suppressor genes, suggesting that this combinatorial perturbation may lead to a better target for decreasing cell proliferation and inducing apoptosis. Finally, our approach shows a potential to identify and prioritize therapeutic targets through systemic perturbation analysis of large-scale computational models of signal transduction. Although some components of the presented computational results have been validated against independent gene expression data sets, more laboratory experiments are warranted to more comprehensively validate the presented results. PMID:26904540
Using an architectural approach to integrate heterogeneous, distributed software components
NASA Technical Reports Server (NTRS)
Callahan, John R.; Purtilo, James M.
1995-01-01
Many computer programs cannot be easily integrated because their components are distributed and heterogeneous, i.e., they are implemented in diverse programming languages, use different data representation formats, or their runtime environments are incompatible. In many cases, programs are integrated by modifying their components or interposing mechanisms that handle communication and conversion tasks. For example, remote procedure call (RPC) helps integrate heterogeneous, distributed programs. When configuring such programs, however, mechanisms like RPC must be used explicitly by software developers in order to integrate collections of diverse components. Each collection may require a unique integration solution. This paper describes improvements to the concepts of software packaging and some of our experiences in constructing complex software systems from a wide variety of components in different execution environments. Software packaging is a process that automatically determines how to integrate a diverse collection of computer programs based on the types of components involved and the capabilities of available translators and adapters in an environment. Software packaging provides a context that relates such mechanisms to software integration processes and reduces the cost of configuring applications whose components are distributed or implemented in different programming languages. Our software packaging tool subsumes traditional integration tools like UNIX make by providing a rule-based approach to software integration that is independent of execution environments.
Fast, Exact Bootstrap Principal Component Analysis for p > 1 million
Fisher, Aaron; Caffo, Brian; Schwartz, Brian; Zipunnikov, Vadim
2015-01-01
Many have suggested a bootstrap procedure for estimating the sampling variability of principal component analysis (PCA) results. However, when the number of measurements per subject (p) is much larger than the number of subjects (n), calculating and storing the leading principal components from each bootstrap sample can be computationally infeasible. To address this, we outline methods for fast, exact calculation of bootstrap principal components, eigenvalues, and scores. Our methods leverage the fact that all bootstrap samples occupy the same n-dimensional subspace as the original sample. As a result, all bootstrap principal components are limited to the same n-dimensional subspace and can be efficiently represented by their low dimensional coordinates in that subspace. Several uncertainty metrics can be computed solely based on the bootstrap distribution of these low dimensional coordinates, without calculating or storing the p-dimensional bootstrap components. Fast bootstrap PCA is applied to a dataset of sleep electroencephalogram recordings (p = 900, n = 392), and to a dataset of brain magnetic resonance images (MRIs) (p ≈ 3 million, n = 352). For the MRI dataset, our method allows for standard errors for the first 3 principal components based on 1000 bootstrap samples to be calculated on a standard laptop in 47 minutes, as opposed to approximately 4 days with standard methods. PMID:27616801
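The low-dimensional trick described above can be sketched in a few lines of NumPy (function and parameter names are ours, not the paper's code): since every bootstrap resample of subjects lies in the n-dimensional column space of the data, PCA of each resample can be done on n x n coordinates and mapped back through a fixed basis.

```python
import numpy as np

def bootstrap_pca(X, n_boot=100, k=3, seed=0):
    """Fast bootstrap PCA for p >> n. X is p x n (columns = subjects),
    assumed centered. Every bootstrap sample lies in the n-dimensional
    column space of X, so PCA of each resample is computed on the n x n
    low-dimensional coordinates A and mapped back through the basis U."""
    rng = np.random.default_rng(seed)
    p, n = X.shape
    U, s, Vt = np.linalg.svd(X, full_matrices=False)   # U: p x n basis
    A = s[:, None] * Vt                                # n x n coordinates
    components = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)               # resample subjects
        Ub, _, _ = np.linalg.svd(A[:, idx], full_matrices=False)
        components.append(U @ Ub[:, :k])               # p-dim bootstrap PCs
    return components
```

Each iteration costs an SVD of an n x n matrix rather than a p x n one; for uncertainty metrics alone, the final projection through `U` can even be skipped, as the abstract notes.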
NASA Astrophysics Data System (ADS)
Song, Bowen; Zhang, Guopeng; Wang, Huafeng; Zhu, Wei; Liang, Zhengrong
2013-02-01
Various types of features, e.g., geometric features, texture features, projection features, etc., have been introduced for polyp detection and differentiation tasks via computer-aided detection and diagnosis (CAD) for computed tomography colonography (CTC). Although these features together cover more information of the data, some of them are statistically highly related to others, which makes the feature set redundant and burdens the computation task of CAD. In this paper, we propose a new dimension reduction method which combines hierarchical clustering and principal component analysis (PCA) for the false positive (FP) reduction task. First, we group all the features based on their similarity using hierarchical clustering, and then PCA is employed within each group. Different numbers of principal components are selected from each group to form the final feature set. A support vector machine is used to perform the classification. The results show that when three principal components were chosen from each group, we achieved an area under the receiver operating characteristic curve of 0.905, which is as high as with the original feature set. Meanwhile, the computation time is reduced by 70% and the feature set size is reduced by 77%. It can be concluded that the proposed method captures the most important information of the feature set and that the classification accuracy is not affected after the dimension reduction. The result is promising, and further investigation, such as automatic threshold setting, is worthwhile and in progress.
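The cluster-then-PCA pipeline can be sketched as follows; this is a generic illustration of the technique, not the paper's code, and the 1 − |correlation| distance and all names are our illustrative choices.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def group_pca_reduce(F, n_groups=4, n_comp=1):
    """Cluster correlated features with hierarchical clustering, then keep
    n_comp principal-component scores per cluster. F: samples x features."""
    C = np.corrcoef(F, rowvar=False)
    D = 1.0 - np.abs(C)                      # similar features -> small distance
    iu = np.triu_indices_from(D, k=1)
    Z = linkage(D[iu], method='average')     # condensed distance vector
    labels = fcluster(Z, t=n_groups, criterion='maxclust')
    blocks = []
    for g in np.unique(labels):
        G = F[:, labels == g]
        Gc = G - G.mean(axis=0)
        _, _, Vt = np.linalg.svd(Gc, full_matrices=False)
        m = min(n_comp, G.shape[1])
        blocks.append(Gc @ Vt[:m].T)         # PCA scores for this cluster
    return np.hstack(blocks)
```

Applying PCA within clusters rather than globally is what lets a small number of components per group retain most of the discriminative information while shrinking the feature set.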
NASA Astrophysics Data System (ADS)
Aneri, Parikh; Sumathy, S.
2017-11-01
Cloud computing provides services over the internet, delivering application resources and data to users based on their demand. Cloud computing is built on a consumer-provider model: the cloud provider offers resources which consumers access through the cloud computing model in order to build their applications according to their demand. A cloud data center is a bulk of resources in a shared-pool architecture for cloud users to access. Virtualization is the heart of the cloud computing model; it provides virtual machines with application-specific configurations, and applications are free to choose their own configuration. On the one hand there is a huge number of resources, and on the other hand the system has to serve a huge number of requests effectively. Therefore, the resource allocation and scheduling policies play a very important role in allocating and managing resources in this cloud computing model. This paper proposes a load balancing policy based on the Hungarian algorithm. The Hungarian algorithm provides a dynamic load balancing policy with a monitor component, which helps to increase cloud resource utilization by monitoring the algorithm's state and altering it based on artificial intelligence. CloudSim, used in this proposal, is an extensible toolkit that simulates the cloud computing environment.
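At its core, the Hungarian algorithm solves a minimum-cost assignment problem. A minimal sketch of that core step, with an entirely hypothetical cost matrix (the paper's monitor component and CloudSim integration are not shown):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical cost matrix for illustration:
# cost[i, j] = estimated completion time of request i on virtual machine j.
cost = np.array([
    [4.0, 2.0, 8.0],
    [4.0, 3.0, 7.0],
    [3.0, 1.0, 6.0],
])
rows, cols = linear_sum_assignment(cost)   # Hungarian (Kuhn-Munkres) method
total = cost[rows, cols].sum()             # minimal total completion time
```

`linear_sum_assignment` returns one optimal one-to-one pairing of requests to machines; a load balancer would rebuild `cost` from monitored VM state and re-solve as requests arrive.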
Computational Toxicology at the US EPA | Science Inventory ...
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, EPA is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in America’s air, water, and hazardous-waste sites. The ORD Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the EPA Science to Achieve Results (STAR) program. Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast™) and exposure (ExpoCast™), and creating virtual liver (v-Liver™) and virtual embryo (v-Embryo™) systems models. The models and underlying data are being made publicly available t
Ahmed, Afaz Uddin; Arablouei, Reza; Hoog, Frank de; Kusy, Branislav; Jurdak, Raja; Bergmann, Neil
2018-05-29
Channel state information (CSI) collected during WiFi packet transmissions can be used for localization of commodity WiFi devices in indoor environments with multipath propagation. To this end, the angle of arrival (AoA) and time of flight (ToF) of all dominant multipath components need to be estimated. A two-dimensional (2D) version of the multiple signal classification (MUSIC) algorithm has been shown to solve this problem using a 2D grid search, which is computationally expensive and is therefore not suited for real-time localization. In this paper, we propose using a modified matrix pencil (MMP) algorithm instead. Specifically, we show that the AoA and ToF estimates can be found independently of each other using the one-dimensional (1D) MMP algorithm, and the results can be accurately paired to obtain the AoA-ToF pairs for all multipath components. Thus, the 2D estimation problem reduces to running 1D estimation multiple times, substantially reducing the computational complexity. We identify and resolve the problem of degenerate performance when two or more multipath components have the same AoA. In addition, we propose a packet aggregation model that uses the CSI data from multiple packets to improve performance under noisy conditions. Simulation results show that our algorithm achieves a two-orders-of-magnitude reduction in computation time over the 2D MUSIC algorithm while achieving similar accuracy. The high accuracy and low computational complexity of our approach make it suitable for applications that require location estimation to run on resource-constrained embedded devices in real time.
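The 1D estimation step rests on the classical matrix pencil method for extracting complex exponentials from a uniformly sampled sequence; the pole angles then encode the AoA or ToF phase slopes. Below is a generic matrix pencil sketch (not the authors' MMP code; the pairing and degeneracy-handling steps of the paper are omitted).

```python
import numpy as np

def matrix_pencil_1d(x, M, L=None):
    """Estimate the M complex poles z_k of x[n] = sum_k a_k * z_k**n
    using the 1-D matrix pencil method. The poles are the nonzero
    eigenvalues of pinv(Y0) @ Y1 built from shifted Hankel matrices."""
    N = len(x)
    L = N // 2 if L is None else L                        # pencil parameter
    Y = np.array([x[i:i + L + 1] for i in range(N - L)])  # Hankel data matrix
    Y0, Y1 = Y[:, :-1], Y[:, 1:]
    ev = np.linalg.eigvals(np.linalg.pinv(Y0) @ Y1)
    return ev[np.argsort(-np.abs(ev))][:M]                # M dominant poles
```

Running this once over subcarriers and once over antennas gives the two independent 1D estimates that the paper then pairs into AoA-ToF tuples.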
Preparation of forefinger's sequence on keyboard orients ocular fixations on computer screen.
Coutté, Alexandre; Olivier, Gérard; Faure, Sylvane; Baccino, Thierry
2014-08-01
This study examined the links between attention, hand movements, and eye movements when performed in different spatial areas. Participants performed a visual search task on a computer screen while preparing to press two keyboard keys sequentially with their index finger. Results showed that the planning of the manual sequence influenced the latency of the first saccade and the placement of the first fixation. In particular, even if the first fixation placement was influenced by the combination of both components of the prepared manual sequence in some trials, it was affected principally by the first component of the prepared manual sequence. Moreover, the probability that the first fixation placement reflected a combination of both components of the manual sequence was correlated with the speed of the second component. This finding suggests that the preparation of the second component of the sequence influences simultaneous oculomotor behavior when motor control of the manual sequence relies on proactive motor planning. These results are discussed in light of the current debate in eye-hand coordination research.
Principal component regression analysis with SPSS.
Liu, R X; Kuang, J; Gong, Q; Hou, X L
2003-06-01
The paper introduces the indices used in multicollinearity diagnosis, the basic principle of principal component regression, and the method for determining the 'best' equation. An example is used to describe how to perform principal component regression analysis with SPSS 10.0, including all calculation steps of the principal component regression and all operations of the linear regression, factor analysis, descriptives, compute variable, and bivariate correlations procedures in SPSS 10.0. Principal component regression analysis can be used to overcome the disturbance of multicollinearity; a simplified, faster, and accurate statistical analysis is achieved through principal component regression with SPSS.
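The computation behind the SPSS procedure can be sketched outside SPSS as well; the NumPy function below is our illustrative sketch of principal component regression (standardize, project onto the first k components, regress, back-transform), not a reproduction of the SPSS steps.

```python
import numpy as np

def pcr_fit(X, y, k):
    """Principal component regression: regress y on the first k principal
    components of standardized X, then map the coefficients back to the
    original predictors. Returns intercept b0 and coefficient vector beta."""
    mu, sd = X.mean(axis=0), X.std(axis=0)
    Z = (X - mu) / sd                        # standardize predictors
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    T = Z @ Vt[:k].T                         # component scores
    gamma = np.linalg.lstsq(T, y - y.mean(), rcond=None)[0]
    beta = (Vt[:k].T @ gamma) / sd           # back-transform coefficients
    b0 = y.mean() - mu @ beta
    return b0, beta
```

Dropping the trailing components (small singular values) is exactly what suppresses the multicollinearity disturbance: directions in which the predictors carry almost no variance no longer inflate the coefficients.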
DAKOTA Design Analysis Kit for Optimization and Terascale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Dalbey, Keith R.; Eldred, Michael S.
2010-02-24
The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes (computational models) and iterative analysis methods. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and analysis of computational models on high performance computers. A user provides a set of DAKOTA commands in an input file and launches DAKOTA. DAKOTA invokes instances of the computational models, collects their results, and performs systems analyses. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, polynomial chaos, stochastic collocation, and epistemic methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as hybrid optimization, surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. Services for parallel computing, simulation interfacing, approximation modeling, fault tolerance, restart, and graphics are also included.
NASA Technical Reports Server (NTRS)
Goradia, S. H.; Lilley, D. E.
1975-01-01
Theoretical and experimental studies are described that were conducted to develop a new generalized method for predicting the profile drag of single-component airfoil sections with sharp trailing edges. The method solves for the flow in the wake from the airfoil trailing edge to a large distance downstream; the profile drag of a given airfoil section can then easily be obtained from the momentum balance once the shape of the velocity profile at a large distance from the trailing edge has been computed. Computer program subroutines were developed for the computation of the profile drag and the flow in the airfoil wake on a CDC 6600 computer. The required inputs to the computer program consist of the free-stream conditions and the characteristics of the boundary layers at the airfoil trailing edge, or at the point of incipient separation in the neighborhood of the trailing edge. The method is quite general and hence can be extended to the solution of the profile drag for multi-component airfoil sections.
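The momentum-balance step can be illustrated numerically: far downstream, the drag coefficient follows from integrating the momentum deficit of the wake velocity profile, Cd = (2/c) ∫ (u/U)(1 − u/U) dy. The Gaussian wake deficit below is an illustrative stand-in for the computed far-wake profile, not output of the paper's subroutines.

```python
import numpy as np

# Momentum-balance estimate of profile drag from a far-wake velocity profile.
U, c = 1.0, 1.0                                  # freestream speed, chord
y = np.linspace(-2.0, 2.0, 2001)                 # traverse across the wake
u = U * (1.0 - 0.1 * np.exp(-(y / 0.3) ** 2))    # assumed wake profile
f = (u / U) * (1.0 - u / U)                      # momentum-deficit integrand
Cd = (2.0 / c) * np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(y))  # trapezoid rule
```

For this deficit the integral has a closed form (a·b·√π − a²·b·√(π/2) with a = 0.1, b = 0.3), so the numerical quadrature can be checked against Cd ≈ 0.0988.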
Hermida, Juan C; Flores-Hernandez, Cesar; Hoenecke, Heinz R; D'Lima, Darryl D
2014-03-01
This study undertook a computational analysis of a wedged glenoid component for correction of retroverted glenoid arthritic deformity to determine whether a wedge-shaped glenoid component design with a built-in correction for version reduces excessive stresses in the implant, cement, and glenoid bone. Recommendations for correcting retroversion deformity are asymmetric reaming of the anterior glenoid, bone grafting of the posterior glenoid, or a glenoid component with posterior augmentation. Eccentric reaming has the disadvantages of removing normal bone, reducing structural support for the glenoid component, and increasing the risk of bone perforation by the fixation pegs. Bone grafting to correct retroverted deformity does not consistently generate successful results. Finite element models of 2 scapulae, representing a normal and an arthritic retroverted glenoid, were implanted with a standard glenoid component (in retroversion or neutral alignment) or a wedged component. Glenohumeral forces representing in vivo loading were applied, and stresses and strains were computed in the bone, cement, and glenoid component. The retroverted glenoid components generated the highest compressive stresses and decreased cyclic fatigue life predictions for trabecular bone. Correction of retroversion by the wedged glenoid component significantly decreased stresses and predicted greater bone fatigue life. The cement volume estimated to survive 10 million cycles was the lowest for the retroverted components and the highest for neutrally implanted glenoid components and for wedged components. A wedged glenoid implant is a viable option to correct severe arthritic retroversion, reducing the need for eccentric reaming and the risk of implant failure. Copyright © 2014 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Mosby, Inc. All rights reserved.
Advances and trends in computational structural mechanics
NASA Technical Reports Server (NTRS)
Noor, A. K.
1986-01-01
Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.
Time-dependent reliability analysis of ceramic engine components
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.
1993-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.
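As a rough illustration of the ideas behind CARES/LIFE (not its actual implementation), the two-parameter Weibull strength distribution and a power-law slow-crack-growth transformation can be combined to estimate time-dependent failure probability. The function names, the particular SCG relation, and all parameter values below are assumptions made for this sketch.

```python
import math

def fast_fracture_pf(stress, sigma0, m):
    """Two-parameter Weibull CDF: probability of fast-fracture failure."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

def static_fatigue_pf(stress, t, sigma0, m, N, B):
    """Failure probability after time t under constant stress, folding
    power-law slow crack growth into an equivalent inert strength.
    N and B are the assumed SCG exponent and material parameter."""
    sigma_eq = (stress ** N * t / B + stress ** (N - 2)) ** (1.0 / (N - 2))
    return fast_fracture_pf(sigma_eq, sigma0, m)

# Illustrative numbers (MPa, seconds): sustained load raises failure risk.
p0 = fast_fracture_pf(200.0, sigma0=300.0, m=10.0)
p1 = static_fatigue_pf(200.0, t=1.0e3, sigma0=300.0, m=10.0, N=20.0, B=1.0e3)
```

At t = 0 the SCG expression reduces exactly to the fast-fracture probability, and the predicted failure probability grows monotonically with time under load.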
A high speed buffer for LV data acquisition
NASA Technical Reports Server (NTRS)
Cavone, Angelo A.; Sterlina, Patrick S.; Clemmons, James I., Jr.; Meyers, James F.
1987-01-01
The laser velocimeter (autocovariance) buffer interface is a data acquisition subsystem designed specifically for the acquisition of data from a laser velocimeter. The subsystem acquires data from up to six laser velocimeter components in parallel, measures the times between successive data points for each of the components, establishes and maintains a coincident condition between any two or three components, and acquires data from other instrumentation systems simultaneously with the laser velocimeter data points. The subsystem is designed to control the entire data acquisition process based on initial setup parameters obtained from a host computer and to be independent of the computer during the acquisition. On completion of the acquisition cycle, the interface transfers the contents of its memory to the host under direction of the host via a single 16-bit parallel DMA channel.
Computational Methods for Biomolecular Electrostatics
Dong, Feng; Olsen, Brett; Baker, Nathan A.
2008-01-01
An understanding of intermolecular interactions is essential for insight into how cells develop, operate, communicate and control their activities. Such interactions include several components: contributions from linear, angular, and torsional forces in covalent bonds, van der Waals forces, as well as electrostatics. Among the various components of molecular interactions, electrostatics are of special importance because of their long range and their influence on polar or charged molecules, including water, aqueous ions, and amino or nucleic acids, which are some of the primary components of living systems. Electrostatics, therefore, play important roles in determining the structure, motion and function of a wide range of biological molecules. This chapter presents a brief overview of electrostatic interactions in cellular systems with a particular focus on how computational tools can be used to investigate these types of interactions. PMID:17964951
Moving target, distributed, real-time simulation using Ada
NASA Technical Reports Server (NTRS)
Collins, W. R.; Feyock, S.; King, L. A.; Morell, L. J.
1985-01-01
Research on a precompiler solution is described for the moving target compiler problem encountered when trying to run parallel simulation algorithms on several microcomputers. The precompiler is under development at NASA-Lewis for simulating jet engines. Since the behavior of any component of a jet engine, e.g., the fan inlet, rear duct, forward sensor, etc., depends on the previous behaviors, not the current behaviors, of other components, the behaviors can be modeled on different processors provided the outputs of the processors reach other processors in appropriate time intervals. The simulator works in compute and transfer modes. The Ada procedure sets for the behaviors of the different components are divided up and routed by the precompiler, which in effect produces a multitasking program. The subroutines are synchronized after each computation cycle.
Channel Model Optimization with Reflection Residual Component for Indoor MIMO-VLC System
NASA Astrophysics Data System (ADS)
Chen, Yong; Li, Tengfei; Liu, Huanlin; Li, Yichao
2017-12-01
A fast channel modeling method is studied to solve the problem of reflection channel gain for multiple-input multiple-output visible light communication (MIMO-VLC) systems. To reduce the computational complexity, which grows with the number of reflections, no more than three reflections are considered in the VLC model. We model a higher-order reflection link as a composition of multiple line-of-sight links and introduce a reflection residual component to characterize higher-order reflections (more than two). Computer simulation results are presented for the point-to-point channel impulse response, received optical power, and received signal-to-noise ratio. Theoretical analysis and simulation results show that the proposed method effectively reduces the computational complexity of higher-order reflections in channel modeling.
The RISC (Reduced Instruction Set Computer) Architecture and Computer Performance Evaluation.
1986-03-01
...time where the main emphasis of the evaluation process is put on the software. The model is intended to provide a tool for computer architects to use... program, or 3) was to be implemented in random logic more effectively than the equivalent sequence of software instructions. Both data and address... definition is the IEEE Standard 729-1983, stating computer architecture as: "The process of defining a collection of hardware and software components and...
Computer Description of the M561 Utility Truck
1984-10-01
...GIFT Computer Code Sustainability Predictions for Army Spare Components Requirements for Combat (SPARC)... used as input to the GIFT computer code to generate target vulnerability data... analysis requires input from the Geometric Information for Targets (GIFT) computer code. This report documents the combinatorial geometry (Com-Geom...
Computer ethics: A capstone course
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fisher, T.G.; Abunawass, A.M.
1994-12-31
This paper presents a capstone course on computer ethics required for all computer science majors in our program. The course was designed to encourage students to evaluate their own personal value systems in terms of the established values in computer science as represented by the ACM Code of Ethics. The structure, activities, and topics of the course as well as assessment of the students are presented. Observations on various course components and student evaluations of the course are also presented.
Kim, Young-Hoo; Park, Jang-Won; Kim, Jun-Shik
2018-01-01
Proponents of computer-assisted TKA suggest that better alignment of the TKAs will lead to improved long-term patient functional outcome and survivorship of the implants. However, there is little evidence about whether the improved position and alignment of the knee components obtained using computer navigation improve patient function and the longevity of the TKA. The purpose of this study was to determine whether (1) clinical results; (2) radiographic and CT scan results; and (3) the survival rate of TKA components would be better in patients having computer-assisted TKA than in patients having conventional TKA. In addition, we determined whether (4) complication rates would be lower in patients with computer-assisted TKA than in patients with conventional TKA. We performed a randomized trial between October 2000 and October 2002 in patients undergoing same-day bilateral TKA; in this trial, one knee was operated on using navigation, and the other knee was operated on without navigation. All 296 patients who underwent same-day bilateral TKA during that period were enrolled. Of those, 282 patients (95%) were accounted for at a mean of 15 years (range, 14-16 years). A total of 79% (223 of 282) were women, and the mean age of the patients at the time of the index arthroplasty was 59 ± 7 years (range, 48-64 years). Knee Society knee score, WOMAC score, and UCLA activity score were obtained preoperatively and at latest followup. Radiographic measurements were performed, including femorotibial angle, position of femoral and tibial components, level of joint line, and posterior condylar offset. Aseptic loosening was defined as a complete radiolucent line > 1 mm in width around any component or migration of any component. Assessors and patients were blind to treatment assignment.
The Knee Society knee (92 ± 8 versus 93 ± 7 points; 95% confidence interval [CI], 92-98; p = 0.461) and function scores (80 ± 11 versus 80 ± 11 points; 95% CI, 73-87; p = 1.000), WOMAC score (14 ± 7 versus 15 ± 8 points; 95% CI, 14-18; p = 0.991), range of knee motion (128° ± 9° versus 127° ± 10°; 95% CI, 100-140; p = 0.780), and UCLA patient activity score (6 versus 6 points; 95% CI, 4-8; p = 1.000) were not different between the two groups at 15 years followup. There were no differences in any radiographic parameters of alignment (on radiography or CT scan) between the two groups. The frequency of aseptic loosening was not different between the two groups (p = 0.918). Kaplan-Meier survivorship of the TKA components was 99% in both groups (95% CI, 93-100) at 15 years as the endpoint of revision or aseptic loosening (p = 0.982). Anterior femoral notching was observed in 11 knees (4%) in the computer-assisted TKA group and none in the conventional TKA group (p = 0.046). In this randomized trial, with data presented at a minimum of 14 years of followup, we found no benefit to computer navigation in TKA in terms of pain, function, or survivorship. Unless another study at long-term followup identifies an advantage to survivorship, pain, and function, we do not recommend the widespread use of computer navigation in TKA because of its risks (in this series, we observed femoral notching; others have observed pin site fractures) and attendant costs. Level I, therapeutic study.
Hierarchical nonlinear behavior of hot composite structures
NASA Technical Reports Server (NTRS)
Murthy, P. L. N.; Chamis, C. C.; Singhal, S. N.
1993-01-01
Hierarchical computational procedures are described to simulate the multiple scale thermal/mechanical behavior of high temperature metal matrix composites (HT-MMC) in the following three broad areas: (1) behavior of HT-MMC's from micromechanics to laminate via METCAN (Metal Matrix Composite Analyzer), (2) tailoring of HT-MMC behavior for optimum specific performance via MMLT (Metal Matrix Laminate Tailoring), and (3) HT-MMC structural response for hot structural components via HITCAN (High Temperature Composite Analyzer). Representative results from each area are presented to illustrate the effectiveness of computational simulation procedures and accompanying computer codes. The sample case results show that METCAN can be used to simulate material behavior such as the entire creep span; MMLT can be used to concurrently tailor the fabrication process and the interphase layer for optimum performance such as minimum residual stresses; and HITCAN can be used to predict the structural behavior such as the deformed shape due to component fabrication. These codes constitute virtual portable desk-top test laboratories for characterizing HT-MMC laminates, tailoring the fabrication process, and qualifying structural components made from them.
A computer-based feedback only intervention with and without a moderation skills component.
Weaver, Cameron C; Leffingwell, Thad R; Lombardi, Nathaniel J; Claborn, Kasey R; Miller, Mary E; Martens, Matthew P
2014-01-01
Research on the efficacy of computer-delivered feedback-only interventions (FOIs) for college alcohol misuse has been mixed. Limitations to these FOIs include participant engagement and variation in the use of a moderation skills component. The current investigation sought to address these limitations using a novel computer-delivered FOI, the Drinkers Assessment and Feedback Tool for College Students (DrAFT-CS). Heavy drinking college students (N=176) were randomly assigned to DrAFT-CS, DrAFT-CS plus moderation skills (DrAFT-CS+), moderation skills only (MSO), or assessment only (AO) group, and were assessed at 1-month follow-up (N=157). Participants in the DrAFT-CS and DrAFT-CS+groups reported significantly lower estimated blood alcohol concentrations (eBACs) on typical heaviest drinking day than participants in the AO group. The data also supported the incorporation of a moderation skills component within FOIs, such that participants in DrAFT-CS+group reported significantly fewer drinks per week and drinks per heaviest drinking occasion than participants in the AO group. © 2013.
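The eBAC outcome reported above is commonly estimated with the Widmark formula. A minimal sketch follows; the constants (14 g of ethanol per standard drink, distribution ratios of 0.68/0.55, elimination of about 0.015 g/dL per hour) and the function name are illustrative textbook assumptions, not the instrument this study used.

```python
def widmark_ebac(std_drinks, weight_kg, sex, hours):
    """Estimated blood alcohol concentration (g/dL) via the classic
    Widmark formula, with a linear elimination term."""
    r = 0.68 if sex == "male" else 0.55     # Widmark body-water distribution ratio
    grams = std_drinks * 14.0               # ~14 g ethanol per standard drink
    ebac = 100.0 * grams / (weight_kg * 1000.0 * r)   # g per 100 mL
    return max(0.0, ebac - 0.015 * hours)   # ~0.015 g/dL eliminated per hour

# Hypothetical "typical heaviest drinking day": 8 drinks over 4 hours.
peak_day = widmark_ebac(std_drinks=8, weight_kg=75, sex="male", hours=4)
```

The estimate is floored at zero, since after enough hours the elimination term would otherwise drive the value negative.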
NASA Astrophysics Data System (ADS)
Pan, M.-Ch.; Chu, W.-Ch.; Le, Duc-Do
2016-12-01
The paper presents an alternative Vold-Kalman filter order tracking (VKF_OT) method, an adaptive angular-velocity VKF_OT technique, to extract and characterize order components adaptively for condition monitoring and fault diagnosis of rotating machinery. The order/spectral waveforms to be tracked are solved recursively with a Kalman filter based on one-step state prediction. The paper comprises the theoretical derivation of the computation scheme, its numerical implementation, and a parameter investigation. The adaptive VKF_OT scheme is compared with two other schemes by processing synthetic signals composed of designated order components. Processing parameters that influence tracking behavior, such as the weighting factor and the correlation matrix of the process noise, and data conditions such as the sampling frequency, are explored. The merits of the proposed scheme, such as its adaptive nature and computational efficiency, are addressed, although the computations were performed off line. The proposed scheme can simultaneously extract multiple spectral components and effectively decouples close and crossing orders associated with multi-axial reference rotating speeds.
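The recursive one-step-prediction idea at the heart of the VKF_OT scheme can be illustrated with a plain scalar Kalman filter tracking a slowly drifting order amplitude. This is a generic sketch under an assumed random-walk state model, not the authors' angular-velocity VKF_OT algorithm; all parameter values are assumptions.

```python
import random

def scalar_kalman(zs, q, r, x0=0.0, p0=1.0):
    """One-step-prediction scalar Kalman filter for the model
    x_k = x_{k-1} + w_k (process var q),  z_k = x_k + v_k (measurement var r)."""
    x, p = x0, p0
    out = []
    for z in zs:
        p = p + q                # predict: one-step state prediction
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # update with the innovation
        p = (1.0 - k) * p
        out.append(x)
    return out

random.seed(1)
truth = [1.0 + 0.001 * k for k in range(200)]        # slowly drifting amplitude
zs = [t + random.gauss(0.0, 0.3) for t in truth]     # noisy observations
est = scalar_kalman(zs, q=1e-4, r=0.09)
```

With the process variance much smaller than the measurement variance, the filter averages heavily and recovers the drifting amplitude to within a few hundredths despite 0.3-sigma measurement noise.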
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jadaan, O.M.; Powers, L.M.; Nemeth, N.N.
1995-08-01
A probabilistic design methodology which predicts the fast fracture and time-dependent failure behavior of thermomechanically loaded ceramic components is discussed using the CARES/LIFE integrated design computer program. Slow crack growth (SCG) is assumed to be the mechanism responsible for delayed failure behavior. Inert strength and dynamic fatigue data obtained from testing coupon specimens (O-ring and C-ring specimens) are initially used to calculate the fast fracture and SCG material parameters as a function of temperature using the parameter estimation techniques available with the CARES/LIFE code. Finite element analysis (FEA) is used to compute the stress distributions for the tube as a function of applied pressure. Knowing the stress and temperature distributions and the fast fracture and SCG material parameters, the lifetime for a given tube can be computed. A stress-failure probability-time to failure (SPT) diagram is subsequently constructed for these tubes. Such a diagram can be used by design engineers to estimate the time to failure at a given failure probability level for a component subjected to a given thermomechanical load.
NASA Technical Reports Server (NTRS)
Daigle, Matthew John; Goebel, Kai Frank
2010-01-01
Model-based prognostics captures system knowledge in the form of physics-based models of components, and how they fail, in order to obtain accurate predictions of end of life (EOL). EOL is predicted based on the estimated current state distribution of a component and expected profiles of future usage. In general, this requires simulations of the component using the underlying models. In this paper, we develop a simulation-based prediction methodology that achieves computational efficiency by performing only the minimal number of simulations needed in order to accurately approximate the mean and variance of the complete EOL distribution. This is performed through the use of the unscented transform, which predicts the means and covariances of a distribution passed through a nonlinear transformation. In this case, the EOL simulation acts as that nonlinear transformation. In this paper, we review the unscented transform, and describe how this concept is applied to efficient EOL prediction. As a case study, we develop a physics-based model of a solenoid valve, and perform simulation experiments to demonstrate improved computational efficiency without sacrificing prediction accuracy.
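The unscented transform mentioned above is straightforward to sketch in the scalar case: three sigma points are propagated through the nonlinearity and reweighted to approximate the output mean and variance. This is a generic illustration of the transform, not the authors' EOL prediction code; the kappa value and test function are assumptions.

```python
import math

def unscented_transform_1d(mean, var, f, kappa=2.0):
    """Scalar unscented transform: propagate (mean, var) through a
    nonlinear function f using 3 sigma points instead of a large
    number of Monte Carlo simulations."""
    s = math.sqrt((1.0 + kappa) * var)
    pts = [mean, mean + s, mean - s]
    w0 = kappa / (1.0 + kappa)
    wi = 1.0 / (2.0 * (1.0 + kappa))
    ws = [w0, wi, wi]
    ys = [f(x) for x in pts]
    y_mean = sum(w * y for w, y in zip(ws, ys))
    y_var = sum(w * (y - y_mean) ** 2 for w, y in zip(ws, ys))
    return y_mean, y_var

# For x ~ N(0, 1) and f(x) = x^2, three points recover E[f] = 1, Var[f] = 2.
mean_sq, var_sq = unscented_transform_1d(0.0, 1.0, lambda x: x * x)
```

In the paper's setting, f would be the EOL simulation itself, so each sigma point costs one component-model simulation rather than one function evaluation.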
Computer-Aided Modeling and Analysis of Power Processing Systems (CAMAPPS). Phase 1: Users handbook
NASA Technical Reports Server (NTRS)
Kim, S.; Lee, J.; Cho, B. H.; Lee, F. C.
1986-01-01
The EASY5 macro component models developed for the spacecraft power system simulation are described. A brief explanation about how to use the macro components with the EASY5 Standard Components to build a specific system is given through an example. The macro components are ordered according to the following functional group: converter power stage models, compensator models, current-feedback models, constant frequency control models, load models, solar array models, and shunt regulator models. Major equations, a circuit model, and a program listing are provided for each macro component.
40 CFR 86.1803-01 - Definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... operator prior to procurement. Auxiliary Emission Control Device (AECD) means any element of design which... components are those components which are designed primarily for emission control, or whose failure may... of design means any control system (i.e., computer software, electronic control system, emission...
40 CFR 86.1803-01 - Definitions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... prior to procurement. Auxiliary Emission Control Device (AECD) means any element of design which senses... components are those components which are designed primarily for emission control, or whose failure may... of design means any control system (i.e., computer software, electronic control system, emission...
Proceedings of the 3rd Annual Conference on Aerospace Computational Control, volume 1
NASA Technical Reports Server (NTRS)
Bernard, Douglas E. (Editor); Man, Guy K. (Editor)
1989-01-01
Conference topics included definition of tool requirements, advanced multibody component representation descriptions, model reduction, parallel computation, real time simulation, control design and analysis software, user interface issues, testing and verification, and applications to spacecraft, robotics, and aircraft.
DOT National Transportation Integrated Search
2004-01-01
The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : state of Utah". This document defines the objective, approach, an...
Telecommunications Technology in the 1980s.
ERIC Educational Resources Information Center
Baer, Walter S.
This paper describes some of the advances in telecommunications technology that can be anticipated during the 1980's in the areas of computer and component technologies, computer influences on telecommunications systems and services, communications terminals, transmission and switching systems, and local distribution. Specific topics covered…
Software For Monitoring A Computer Network
NASA Technical Reports Server (NTRS)
Lee, Young H.
1992-01-01
SNMAT is rule-based expert-system computer program designed to assist personnel in monitoring status of computer network and identifying defective computers, workstations, and other components of network. Also assists in training network operators. Network for SNMAT located at Space Flight Operations Center (SFOC) at NASA's Jet Propulsion Laboratory. Intended to serve as data-reduction system providing windows, menus, and graphs, enabling users to focus on relevant information. SNMAT expected to be adaptable to other computer networks; for example in management of repair, maintenance, and security, or in administration of planning systems, billing systems, or archives.
Guest Editor's Introduction: Special section on dependable distributed systems
NASA Astrophysics Data System (ADS)
Fetzer, Christof
1999-09-01
We rely more and more on computers. For example, the Internet reshapes the way we do business. A `computer outage' can cost a company a substantial amount of money. Not only with respect to the business lost during an outage, but also with respect to the negative publicity the company receives. This is especially true for Internet companies. After recent computer outages of Internet companies, we have seen a drastic fall of the shares of the affected companies. There are multiple causes for computer outages. Although computer hardware becomes more reliable, hardware related outages remain an important issue. For example, some of the recent computer outages of companies were caused by failed memory and system boards, and even by crashed disks - a failure type which can easily be masked using disk mirroring. Transient hardware failures might also look like software failures and, hence, might be incorrectly classified as such. However, many outages are software related. Faulty system software, middleware, and application software can crash a system. Dependable computing systems are systems we can rely on. Dependable systems are, by definition, reliable, available, safe and secure [3]. This special section focuses on issues related to dependable distributed systems. Distributed systems have the potential to be more dependable than a single computer because the probability that all computers in a distributed system fail is smaller than the probability that a single computer fails. However, if a distributed system is not built well, it is potentially less dependable than a single computer since the probability that at least one computer in a distributed system fails is higher than the probability that one computer fails. For example, if the crash of any computer in a distributed system can bring the complete system to a halt, the system is less dependable than a single-computer system. Building dependable distributed systems is an extremely difficult task. 
There is no silver bullet solution. Instead one has to apply a variety of engineering techniques [2]: fault-avoidance (minimize the occurrence of faults, e.g. by using a proper design process), fault-removal (remove faults before they occur, e.g. by testing), fault-evasion (predict faults by monitoring and reconfigure the system before failures occur), and fault-tolerance (mask and/or contain failures). Building a system from scratch is an expensive and time consuming effort. To reduce the cost of building dependable distributed systems, one would choose to use commercial off-the-shelf (COTS) components whenever possible. The usage of COTS components has several potential advantages beyond minimizing costs. For example, through the widespread usage of a COTS component, design failures might be detected and fixed before the component is used in a dependable system. Custom-designed components have to mature without the widespread in-field testing of COTS components. COTS components have various potential disadvantages when used in dependable systems. For example, minimizing the time to market might lead to the release of components with inherent design faults (e.g. use of `shortcuts' that only work most of the time). In addition, the components might be more complex than needed and, hence, potentially have more design faults than simpler components. However, given economic constraints and the ability to cope with some of the problems using fault-evasion and fault-tolerance, only for a small percentage of systems can one justify not using COTS components. Distributed systems built from current COTS components are asynchronous systems in the sense that there exists no a priori known bound on the transmission delay of messages or the execution time of processes. When designing a distributed algorithm, one would like to make sure (e.g. by testing or verification) that it is correct, i.e. satisfies its specification. 
Many distributed algorithms make use of consensus (eventually all non-crashed processes have to agree on a value), leader election (a crashed leader is eventually replaced by a new leader, but at any time there is at most one leader) or a group membership detection service (a crashed process is eventually suspected to have crashed but only crashed processes are suspected). From a theoretical point of view, the service specifications given for such services are not implementable in asynchronous systems. In particular, for each implementation one can derive a counter example in which the service violates its specification. From a practical point of view, the consensus, the leader election, and the membership detection problem are solvable in asynchronous distributed systems. In this special section, Raynal and Tronel show how to bridge this difference by showing how to implement the group membership detection problem with a negligible probability [1] to fail in an asynchronous system. The group membership detection problem is specified by a liveness condition (L) and a safety property (S): (L) if a process p crashes, then eventually every non-crashed process q has to suspect that p has crashed; and (S) if a process q suspects p, then p has indeed crashed. One can show that either (L) or (S) is implementable, but one cannot implement both (L) and (S) at the same time in an asynchronous system. In practice, one only needs to implement (L) and (S) such that the probability that (L) or (S) is violated becomes negligible. Raynal and Tronel propose and analyse a protocol that implements (L) with certainty and that can be tuned such that the probability that (S) is violated becomes negligible. Designing and implementing distributed fault-tolerant protocols for asynchronous systems is a difficult but not an impossible task. A fault-tolerant protocol has to detect and mask certain failure classes, e.g. crash failures and message omission failures. 
There is a trade-off between the performance of a fault-tolerant protocol and the failure classes the protocol can tolerate. One wants to tolerate as many failure classes as needed to satisfy the stochastic requirements of the protocol [1] while still maintaining a sufficient performance. Since clients of a protocol have different requirements with respect to the performance/fault-tolerance trade-off, one would like to be able to customize protocols such that one can select an appropriate performance/fault-tolerance trade-off. In this special section Hiltunen et al describe how one can compose protocols from micro-protocols in their Cactus system. They show how a group RPC system can be tailored to the needs of a client. In particular, they show how considering additional failure classes affects the performance of a group RPC system. References [1] Cristian F 1991 Understanding fault-tolerant distributed systems Communications of ACM 34 (2) 56-78 [2] Heimerdinger W L and Weinstock C B 1992 A conceptual framework for system fault tolerance Technical Report 92-TR-33, CMU/SEI [3] Laprie J C (ed) 1992 Dependability: Basic Concepts and Terminology (Vienna: Springer)
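The liveness/safety trade-off in the group membership detection discussion above can be made concrete with a toy heartbeat-based failure detector. The timeout, process names, and timestamps are illustrative assumptions; real detectors must also cope with clock drift and scheduling effects.

```python
def suspects(last_heartbeat, now, timeout):
    """Suspect every process whose most recent heartbeat is older than
    `timeout`.  Liveness (L) holds with certainty: a crashed process
    stops sending heartbeats, so it is eventually suspected.  Safety (S)
    holds only probabilistically: a live process whose heartbeat is
    delayed beyond `timeout` is wrongly suspected, and raising the
    timeout shrinks that probability at the cost of slower detection."""
    return {p for p, t in last_heartbeat.items() if now - t > timeout}

hb = {"p1": 9.8, "p2": 4.0}          # p2 stopped sending long ago
aggressive = suspects(hb, now=10.0, timeout=1.0)    # fast detection
conservative = suspects(hb, now=10.0, timeout=7.0)  # fewer false suspicions
```

With the aggressive timeout p2 is suspected immediately; with the conservative one neither process is suspected yet, illustrating how the timeout tunes the probability of violating (S).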
Component Position and Metal Ion Levels in Computer-Navigated Hip Resurfacing Arthroplasty.
Mann, Stephen M; Kunz, Manuela; Ellis, Randy E; Rudan, John F
2017-01-01
Metal ion levels are used as a surrogate marker for wear in hip resurfacing arthroplasties. Improper component position, particularly on the acetabular side, plays an important role in problems with the bearing surfaces, such as edge loading, impingement on the acetabular component rim, lack of fluid-film lubrication, and acetabular component deformation. There are few data regarding femoral component position and its possible implications for wear and failure rates. The purpose of this investigation was to determine both femoral and acetabular component positions in our cohort of mechanically stable hip resurfacing arthroplasties and to determine whether these were related to metal ion levels. One hundred fourteen patients who had undergone a computer-assisted metal-on-metal hip resurfacing were prospectively followed. Cobalt and chromium levels, Harris Hip and UCLA activity scores, and measures of acetabular and femoral component position and of femoral and acetabular angles were recorded. Significant changes included increases in the position of the acetabular component compared with the native acetabulum; an increase in femoral vertical offset; and decreases in global offset, gluteus medius activation angle, and abductor arm angle (P < .05). Multiple regression analysis found no significant predictors of cobalt and chromium metal ion levels. Femoral and acetabular components placed in acceptable position failed to predict increased metal ion levels, and increased levels did not adversely impact patient function or satisfaction. Further research is necessary to clarify factors contributing to prosthesis wear. Copyright © 2016 Elsevier Inc. All rights reserved.
Model reduction by weighted Component Cost Analysis
NASA Technical Reports Server (NTRS)
Kim, Jae H.; Skelton, Robert E.
1990-01-01
Component Cost Analysis considers any given system driven by a white noise process as an interconnection of different components, and assigns a metric called 'component cost' to each component. These component costs measure the contribution of each component to a predefined quadratic cost function. A reduced-order model of the given system may be obtained by deleting those components that have the smallest component costs. The theory of Component Cost Analysis is extended to include finite-bandwidth colored noises. The results also apply when actuators have dynamics of their own. Closed-form analytical expressions of component costs are also derived for a mechanical system described by its modal data. This is very useful to compute the modal costs of very high order systems. A numerical example for MINIMAST system is presented.
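The deletion criterion the abstract describes (rank components by their contribution to a quadratic output cost, then drop the cheapest) can be illustrated with a small numerical sketch. The system matrices, the use of SciPy's Lyapunov solver, and the two-mode truncation below are all illustrative assumptions, not material from the paper:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical 4-mode stable modal system x' = A x + B w, output y = C x,
# driven by unit-intensity white noise w (illustrative numbers only).
A = np.diag([-1.0, -2.0, -5.0, -20.0])
B = np.array([[1.0], [1.0], [1.0], [1.0]])
C = np.array([[1.0, 0.5, 0.2, 0.05]])

# Steady-state state covariance X solves A X + X A^T + B B^T = 0.
X = solve_continuous_lyapunov(A, -B @ B.T)

# Component cost of each mode: its contribution to the quadratic output
# cost E[y^2] = tr(C X C^T), i.e. the diagonal of X C^T C.
costs = np.diag(X @ C.T @ C)

# Reduce the model by deleting the modes with the smallest component costs.
keep = np.argsort(costs)[::-1][:2]
A_r, B_r, C_r = A[np.ix_(keep, keep)], B[keep], C[:, keep]
```

With these numbers the slow, strongly observed modes carry most of the cost, so the reduced model retains them and drops the fast, weakly observed ones.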
Cyber-workstation for computational neuroscience.
Digiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C; Fortes, Jose; Sanchez, Justin C
2010-01-01
A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility are important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper briefly describes the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavioral task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface.
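As a sketch of the kind of model block the abstract names (a recursive least-squares regressor), here is a minimal textbook RLS implementation; the class name, parameter names, and default values are hypothetical illustrations, not the CW's actual API:

```python
import numpy as np

# Minimal recursive least-squares (RLS) regressor with exponential
# forgetting; a generic sketch, not the Cyber-Workstation's model code.
class RLS:
    def __init__(self, n_features, lam=0.99, delta=100.0):
        self.w = np.zeros(n_features)        # weight estimate
        self.P = np.eye(n_features) * delta  # inverse correlation matrix
        self.lam = lam                       # forgetting factor

    def update(self, x, d):
        """One recursive update from input vector x and desired output d."""
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)         # gain vector
        e = d - self.w @ x                   # a priori prediction error
        self.w += k * e                      # correct the weights
        self.P = (self.P - np.outer(k, Px)) / self.lam
        return e
```

Fed a stream of (input, target) pairs, the weights converge to the least-squares fit, which is why such a block is a natural baseline decoder in a brain-machine interface loop.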
Cyber-Workstation for Computational Neuroscience
DiGiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C.; Fortes, Jose; Sanchez, Justin C.
2009-01-01
PMID:20126436
Determining the Requisite Components of Visual Threat Detection to Improve Operational Performance
2014-04-01
cognitive processes, and may be enhanced by focusing training development on the principal components such as causal reasoning. The second report will...discuss the development and evaluation of a research-based training exemplar. Visual threat detection pervades many military contexts, but is also... developing computer-controlled exercises to study the primary components of visual threat detection. Similarly, civilian law enforcement officers were
Herd-Level Mastitis-Associated Costs on Canadian Dairy Farms
Aghamohammadi, Mahjoob; Haine, Denis; Kelton, David F.; Barkema, Herman W.; Hogeveen, Henk; Keefe, Gregory P.; Dufour, Simon
2018-01-01
Mastitis imposes considerable and recurring economic losses on the dairy industry worldwide. The main objective of this study was to estimate herd-level costs incurred by expenditures and production losses associated with mastitis on Canadian dairy farms in 2015, based on producer reports. Previously published mastitis economic frameworks were used to develop an economic model with the most important cost components. Components investigated were divided between clinical mastitis (CM), subclinical mastitis (SCM), and other cost components (i.e., preventive measures and product quality). A questionnaire was mailed to 374 dairy producers randomly selected from the Canadian National Dairy Study 2015 to collect data on these cost components, and 145 dairy producers returned a completed questionnaire. For each herd, costs due to the different mastitis-related components were computed by applying the values reported by the dairy producer to the developed economic model. Then, for each herd, the proportion of costs attributable to a specific component was computed by dividing the absolute costs for that component by total herd mastitis-related costs. Median self-reported CM incidence was 19 cases/100 cow-years and mean self-reported bulk milk somatic cell count was 184,000 cells/mL. Most producers reported using post-milking teat disinfection (97%) and dry cow therapy (93%), and a substantial proportion of producers reported using pre-milking teat disinfection (79%) and wearing gloves during milking (77%). Mastitis costs were substantial (662 CAD per milking cow per year for a typical Canadian dairy farm), with a large portion of the costs (48%) attributed to SCM, and 34 and 15% due to CM and implementation of preventive measures, respectively. For SCM, the two most important cost components were the subsequent milk yield reduction and culling (72 and 25% of SCM costs, respectively).
For CM, first, second, and third most important cost components were culling (48% of CM costs), milk yield reduction following the CM events (34%), and discarded milk (11%), respectively. This study is the first since 1990 to investigate costs of mastitis in Canada. The model developed in the current study can be used to compute mastitis costs at the herd and national level in Canada. PMID:29868620
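The per-herd proportion calculation described above (absolute costs for a component divided by total herd mastitis-related costs) amounts to the following arithmetic; the component values below are purely hypothetical, not figures from the study:

```python
# Hypothetical per-herd mastitis cost components in CAD/year
# (illustrative values only, not data from the study).
herd_costs = {
    "subclinical_milk_loss": 22000.0,
    "subclinical_culling": 7500.0,
    "clinical_culling": 11000.0,
    "clinical_milk_loss": 7800.0,
    "discarded_milk": 2500.0,
    "prevention": 5200.0,
}

# Total herd mastitis-related costs.
total = sum(herd_costs.values())

# Proportion attributable to each component = component cost / total.
proportions = {name: cost / total for name, cost in herd_costs.items()}
```

Summing the proportions over all components recovers 1.0 by construction, which is what lets the study report component shares (e.g. 48% SCM) that are comparable across herds of different sizes.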
ERIC Educational Resources Information Center
Wofford, Jennifer
2009-01-01
Computing is anticipated to have an increasingly expansive impact on the sciences overall, becoming the third, crucial component of a "golden triangle" that includes mathematics and experimental and theoretical science. However, even more true with computing than with math and science, we are not preparing our students for this new reality. It is…
ERIC Educational Resources Information Center
Rubin, Michael Rogers
1988-01-01
The second of three articles on abusive data collection and usage practices and their effect on personal privacy discusses the evolution of data protection laws worldwide and compares the scope, major provisions, and enforcement components of the laws. A chronology of key events in the regulation of computer databanks is included. (1 reference)…
Electro-Optic Computing Architectures: Volume II. Components and System Design and Analysis
1998-02-01
The objective of the Electro-Optic Computing Architecture (EOCA) program was to develop multi-function electro-optic interfaces and optical...interconnect units to enhance the performance of parallel processor systems and form the building blocks for future electro-optic computing architectures...Specifically, three multi-function interface modules were targeted for development - an Electro-Optic Interface (EOI), an Optical Interconnection Unit
Active Computer Network Defense: An Assessment
2001-04-01
sufficient base of knowledge in information technology can be assumed to be working on some form of computer network warfare, even if only defensive in...the Defense Information Infrastructure (DII) to attack. Transmission Control Protocol/Internet Protocol (TCP/IP) networks are inherently resistant to...aims to create this part of information superiority, and computer network defense is one of its fundamental components. Most of these efforts center
Translations on USSR Science and Technology Physical Sciences and Technology No. 18
1977-09-19
and Avetik Gukasyan discuss component arrangement alternatives. COPYRIGHT: Notice not available 8545 CSO: 1870 CYBERNETICS, COMPUTERS AND AUTOMATION TECHNOLOGY 'PROYEKT' COMPUTER-ASSISTED DESIGN SYSTEM...throughout the world are struggling. The "Proyekt" system, produced in the Institute of Cybernetics, assists in automating the design and manufacture of
How to Build a Quantum Computer
NASA Astrophysics Data System (ADS)
Sanders, Barry C.
2017-11-01
Quantum computer technology is progressing rapidly with dozens of qubits and hundreds of quantum logic gates now possible. Although current quantum computer technology is distant from being able to solve computational problems beyond the reach of non-quantum computers, experiments have progressed well beyond simply demonstrating the requisite components. We can now operate small quantum logic processors with connected networks of qubits and quantum logic gates, which is a great stride towards functioning quantum computers. This book aims to be accessible to a broad audience with basic knowledge of computers, electronics and physics. The goal is to convey key notions relevant to building quantum computers and to present state-of-the-art quantum-computer research in various media such as trapped ions, superconducting circuits, photonics and beyond.
Simulation of an enhanced TCAS 2 system in operation
NASA Technical Reports Server (NTRS)
Rojas, R. G.; Law, P.; Burnside, W. D.
1987-01-01
Described is a computer simulation of a Boeing 737 aircraft equipped with an enhanced Traffic and Collision Avoidance System (TCAS II). In particular, an algorithm is developed which permits the computer simulation of the tracking of a target airplane by a Boeing 737 which has a TCAS II array mounted on top of its fuselage. This algorithm has four main components: the target path, the noise source, the alpha-beta filter, and threat detection. The implementation of each of these four components is described, and the areas where the present algorithm needs to be improved are also mentioned.
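The alpha-beta filter named as one of the algorithm's four components can be sketched as follows; the gains and update form are a generic textbook version, not the values used in the simulation:

```python
# Minimal single-axis alpha-beta tracking filter: predict position from the
# current rate estimate, then correct position and rate from the measurement
# residual. Gains alpha/beta are illustrative, not from the report.
def alpha_beta_track(measurements, dt=1.0, alpha=0.85, beta=0.005):
    x, v = measurements[0], 0.0          # initial position estimate and rate
    estimates = []
    for z in measurements:
        x_pred = x + dt * v              # predict position one step ahead
        r = z - x_pred                   # innovation (measurement residual)
        x = x_pred + alpha * r           # correct position
        v = v + (beta / dt) * r          # correct rate
        estimates.append(x)
    return estimates
```

For a target moving at constant rate the residual drives the rate estimate toward the true rate, smoothing the noisy measurements the abstract's "noise source" component would inject.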
Re-Tooling the Agency's Engineering Predictive Practices for Durability and Damage Tolerance
NASA Technical Reports Server (NTRS)
Piascik, Robert S.; Knight, Norman F., Jr.
2017-01-01
Over the past decade, the Agency has placed less emphasis on testing and has increasingly relied on computational methods to assess durability and damage tolerance (D&DT) behavior when evaluating design margins for fracture-critical components. With increased emphasis on computational D&DT methods as the standard practice, it is paramount that capabilities of these methods are understood, the methods are used within their technical limits, and validation by well-designed tests confirms understanding. The D&DT performance of a component is highly dependent on parameters in the neighborhood of the damage. This report discusses D&DT method vulnerabilities.
NASA Astrophysics Data System (ADS)
Kulkarni, Malhar; Kulkarni, Irawati; Dangarikar, Chaitali; Bhattacharyya, Pushpak
Glosses and examples are essential components of computational lexical databases like Wordnet. These two components of the lexical database can be used in building domain ontologies, semantic relations, phrase structure rules, etc., and can help automatic or manual word sense disambiguation (WSD) tasks. The present paper aims to highlight the importance of gloss in the process of WSD based on experiences from building Sanskrit Wordnet. This paper presents a survey of Sanskrit synonymy lexica, the use of Navya-Nyāya terminology in developing a gloss, and the kinds of patterns that evolved that are useful for the computational purpose of WSD with special reference to Sanskrit.
NASA Astrophysics Data System (ADS)
Ness, P. H.; Jacobson, H.
1984-10-01
The thrust of 'group technology' is toward the exploitation of similarities in component design and manufacturing process plans to achieve assembly line flow cost efficiencies for small batch production. The systematic method devised for the identification of similarities in component geometry and processing steps is a coding and classification scheme implemented by interactive CAD/CAM systems. This coding and classification scheme has led to significant increases in computer processing power, allowing rapid searches and retrievals on the basis of a 30-digit code together with user-friendly computer graphics.
Monkey search algorithm for ECE components partitioning
NASA Astrophysics Data System (ADS)
Kuliev, Elmar; Kureichik, Vladimir; Kureichik, Vladimir, Jr.
2018-05-01
The paper considers one of the important design problems – the partitioning of electronic computer equipment (ECE) components (blocks). This problem belongs to the NP-hard class and has a combinatorial and logic nature. The partitioning problem is formulated as the partition of a graph into parts. To solve it, the authors suggest a bioinspired approach based on a monkey search algorithm. Computational experiments with the developed software show the algorithm's efficiency and identify recommended settings for obtaining more effective solutions than a genetic algorithm provides.
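The cut-size objective behind such a graph partitioning can be made concrete with a plain hill-climbing sketch. This illustrates only the cost function being minimized (edges crossing the partition), not the authors' monkey search or genetic algorithm; the function names and settings are hypothetical:

```python
import random

# Objective for a bipartition: number of edges whose endpoints fall in
# different parts of the partition.
def cut_size(edges, part):
    return sum(1 for u, v in edges if part[u] != part[v])

# Naive hill climbing over single-node moves, keeping both parts nonempty.
def partition_search(nodes, edges, iters=200, seed=0):
    rng = random.Random(seed)
    part = {n: rng.randint(0, 1) for n in nodes}
    if len(set(part.values())) == 1:       # ensure both parts start nonempty
        part[nodes[0]] ^= 1
    best = cut_size(edges, part)
    for _ in range(iters):
        n = rng.choice(nodes)
        side = sum(part.values())
        if part[n] == 1 and side == 1:     # moving n would empty part 1
            continue
        if part[n] == 0 and side == len(nodes) - 1:  # would empty part 0
            continue
        part[n] ^= 1                       # tentatively move n across the cut
        c = cut_size(edges, part)
        if c <= best:
            best = c                       # accept equal or improving moves
        else:
            part[n] ^= 1                   # revert a worsening move
    return part, best
```

Bioinspired methods such as monkey search explore this same search space, but with climb/watch/jump moves designed to escape the local minima that plain hill climbing gets stuck in.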
A COTS-Based Replacement Strategy for Aging Avionics Computers
2001-12-01
Communication Control Unit. [Figure labels: COTS Microprocessor, Real-Time Operating System, New Native Code, Native Code Objects, Native Code Thread, Legacy Function, Virtual Component Environment, Context Switch Thunk, Add-in Replace.]
Improving Perceptual Skills with 3-Dimensional Animations.
ERIC Educational Resources Information Center
Johns, Janet Faye; Brander, Julianne Marie
1998-01-01
Describes three-dimensional computer-aided design (CAD) models for every component in a representative mechanical system; the CAD models made it easy to generate 3-D animations that are ideal for teaching perceptual skills in multimedia computer-based technical training. Fifteen illustrations are provided. (AEF)
Computer Disaster Recovery Planning.
ERIC Educational Resources Information Center
Clark, Orvin R.
Arguing that complete, reliable, up-to-date system documentation is critical for every data processing environment, this paper on computer disaster recovery planning begins by discussing the importance of such documentation both for recovering from a systems crash, and for system maintenance and enhancement. The various components of system…
Computer Assistance for Writing Interactive Programs: TICS.
ERIC Educational Resources Information Center
Kaplow, Roy; And Others
1973-01-01
Investigators developed an on-line, interactive programming system--the Teacher-Interactive Computer System (TICS)--to provide assistance to those who were not programmers but nevertheless wished to write interactive instructional programs. TICS had two components: an author system and a delivery system. Underlying assumptions were that…
ERIC Educational Resources Information Center
Anderson, Cheryl A.
Designed to answer basic questions educators have about microcomputer hardware and software and their applications in teaching, this paper describes the revolution in computer technology that has resulted from the development of the microchip processor and provides information on the major computer components, i.e., input, central processing unit,…
Computer-Aided Design of Low-Noise Microwave Circuits
NASA Astrophysics Data System (ADS)
Wedge, Scott William
1991-02-01
Devoid of most natural and manmade noise, microwave frequencies have detection sensitivities limited by internally generated receiver noise. Low-noise amplifiers are therefore critical components in radio astronomical antennas, communications links, radar systems, and even home satellite dishes. A general technique to accurately predict the noise performance of microwave circuits has been lacking. Current noise analysis methods have been limited to specific circuit topologies or neglect correlation, a strong effect in microwave devices. Presented here are generalized methods, developed for computer-aided design implementation, for the analysis of linear noisy microwave circuits comprised of arbitrarily interconnected components. Included are descriptions of efficient algorithms for the simultaneous analysis of noisy and deterministic circuit parameters based on a wave variable approach. The methods are therefore particularly suited to microwave and millimeter-wave circuits. Noise contributions from lossy passive components and active components with electronic noise are considered. Also presented is a new technique for the measurement of device noise characteristics that offers several advantages over current measurement methods.
Distributed user interfaces for clinical ubiquitous computing applications.
Bång, Magnus; Larsson, Anders; Berglund, Erik; Eriksson, Henrik
2005-08-01
Ubiquitous computing with multiple interaction devices requires new interface models that support user-specific modifications to applications and facilitate the fast development of active workspaces. We have developed NOSTOS, a computer-augmented work environment for clinical personnel, to explore new user interface paradigms for ubiquitous computing. NOSTOS uses several devices, such as digital pens, an active desk, and walk-up displays, that allow the system to track documents and activities in the workplace. We present the distributed user interface (DUI) model that allows standalone applications to distribute their user interface components to several devices dynamically at run-time. This mechanism permits clinicians to develop their own user interfaces and forms for clinical information systems to match their specific needs. We discuss the underlying technical concepts of DUIs and show how service discovery, component distribution, events and layout management are dealt with in the NOSTOS system. Our results suggest that DUIs--and similar network-based user interfaces--will be a prerequisite of future mobile user interfaces and essential to developing clinical multi-device environments.
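The component-distribution idea (standalone applications handing UI components to discovered devices at run-time) can be caricatured in a few lines; the class and method names below are hypothetical illustrations, not the NOSTOS API:

```python
# Toy sketch of run-time UI component distribution: devices are discovered
# into a registry, then an application routes components to them by name.
class DeviceRegistry:
    def __init__(self):
        self.devices = {}                  # device name -> list of components

    def discover(self, name):
        """Register a device (e.g. a walk-up display) as available."""
        self.devices.setdefault(name, [])

    def distribute(self, component, device):
        """Attach a UI component to a discovered device at run-time."""
        if device not in self.devices:
            raise KeyError(f"unknown device: {device}")
        self.devices[device].append(component)

registry = DeviceRegistry()
registry.discover("active_desk")
registry.distribute("patient_order_form", "active_desk")
```

A real DUI system layers events and layout management on top of this routing step so the distributed components stay synchronized with the owning application.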
NASA Astrophysics Data System (ADS)
Thomas, W. A.; McAnally, W. H., Jr.
1985-07-01
TABS-2 is a generalized numerical modeling system for open-channel flows, sedimentation, and constituent transport. It consists of more than 40 computer programs to perform modeling and related tasks. The major modeling components--RMA-2V, STUDH, and RMA-4--calculate two-dimensional, depth-averaged flows, sedimentation, and dispersive transport, respectively. The other programs in the system perform digitizing, mesh generation, data management, graphical display, output analysis, and model interfacing tasks. Utilities include file management and automatic generation of computer job control instructions. TABS-2 has been applied to a variety of waterways, including rivers, estuaries, bays, and marshes. It is designed for use by engineers and scientists who may not have a rigorous computer background. Use of the various components is described in Appendices A-O. The bound version of the report does not include the appendices. A looseleaf form with Appendices A-O is distributed to system users.
Research in the design of high-performance reconfigurable systems
NASA Technical Reports Server (NTRS)
Mcewan, S. D.; Spry, A. J.
1985-01-01
Computer aided design and computer aided manufacturing have the potential for greatly reducing the cost and lead time in the development of VLSI components. This potential paves the way for the design and fabrication of a wide variety of economically feasible high level functional units. It was observed that current computer systems have only a limited capacity to absorb new VLSI component types other than memory, microprocessors, and a relatively small number of other parts. The first purpose is to explore a system design which is capable of effectively incorporating a considerable number of VLSI part types and will both increase the speed of computation and reduce the attendant programming effort. A second purpose is to explore design techniques for VLSI parts which when incorporated by such a system will result in speeds and costs which are optimal. The proposed work may lay the groundwork for future efforts in the extensive simulation and measurements of the system's cost effectiveness and lead to prototype development.
Computational simulation of acoustic fatigue for hot composite structures
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.; Murthy, Pappu L. N.; Chamis, Christos C.; Nagpal, Vinod K.; Sutjahjo, Edhi
1991-01-01
Predictive methods/computer codes for the computational simulation of acoustic fatigue resistance of hot composite structures subjected to acoustic excitation emanating from an adjacent vibrating component are discussed. Select codes developed over the past two decades at the NASA Lewis Research Center are used. The codes include computation of acoustic noise generated from a vibrating component, degradation in material properties of a composite laminate at use temperature, dynamic response of acoustically excited hot multilayered composite structure, degradation in the first ply strength of the excited structure due to acoustic loading, and acoustic fatigue resistance of the excited structure, including the propulsion environment. Effects of the laminate lay-up and environment on the acoustic fatigue life are evaluated. The results show that, by keeping the angled plies on the outer surface of the laminate, a substantial increase in the acoustic fatigue life is obtained. The effect of environment (temperature and moisture) is to relieve the residual stresses leading to an increase in the acoustic fatigue life of the excited panel.
Recent Advances in Photonic Devices for Optical Computing and the Role of Nonlinear Optics-Part II
NASA Technical Reports Server (NTRS)
Abdeldayem, Hossin; Frazier, Donald O.; Witherow, William K.; Banks, Curtis E.; Paley, Mark S.
2007-01-01
The twentieth century has been the era of semiconductor materials and electronic technology, while this millennium is expected to be the age of photonic materials and all-optical technology. Optical technology has led to countless optical devices that have become indispensable in our daily lives in storage area networks, parallel processing, optical switches, all-optical data networks, holographic storage devices, and biometric devices at airports. This chapter intends to bring some awareness of the state of the art of optical technologies that have potential for optical computing, and to demonstrate the role of nonlinear optics in many of these components. Our intent in this chapter is to present an overview of the current status of optical computing and a brief evaluation of the recent advances and performance of the following key components necessary to build an optical computing system: all-optical logic gates, adders, optical processors, optical storage, holographic storage, optical interconnects, spatial light modulators and optical materials.
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1985-01-01
Accomplishments are described for the second-year effort of a 3-year program to develop methodology for component-specific modeling of aircraft engine hot section components (turbine blades, turbine vanes, and burner liners). These accomplishments include: (1) engine thermodynamic and mission models; (2) geometry model generators; (3) remeshing; (4) specialty 3-D inelastic structural analysis; (5) computationally efficient solvers; (6) adaptive solution strategies; (7) engine performance parameters/component response variables decomposition and synthesis; (8) integrated software architecture and development; and (9) validation cases for the software developed.
Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sussman, Alan
2014-10-21
This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.
Integrated Computer System of Management in Logistics
NASA Astrophysics Data System (ADS)
Chwesiuk, Krzysztof
2011-06-01
This paper aims at presenting a concept of an integrated computer system of management in logistics, particularly in supply and distribution chains. Consequently, the paper includes the basic idea of the concept of computer-based management in logistics and components of the system, such as CAM and CIM systems in production processes, and management systems for storage, materials flow, and for managing transport, forwarding and logistics companies. The platform which integrates computer-aided management systems is that of electronic data interchange.
CompGC: Efficient Offline/Online Semi-Honest Two-Party Computation
2016-07-06
negligible function µ(·) such that for every κ ∈ N: Pr[Expt^priv_{A,S}(κ) = 1] ≤ 1/2 + µ(κ). 4.1 Component-Based Secure Two-Party Computation. We now...automating secure two-party computations. In Ehab Al-Shaer, Angelos D. Keromytis, and Vitaly Shmatikov, editors, ACM CCS 10, pages 451–462. ACM Press...computation. In Yan Chen, George Danezis, and Vitaly Shmatikov, editors, ACM CCS 11, pages 715–724. ACM Press, October 2011. [MGBF14] Benjamin Mood, Debayan
NASA Technical Reports Server (NTRS)
Matthews, Christine G.; Posenau, Mary-Anne; Leonard, Desiree M.; Avis, Elizabeth L.; Debure, Kelly R.; Stacy, Kathryn; Vonofenheim, Bill
1992-01-01
The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.
Optical Character Recognition.
ERIC Educational Resources Information Center
Converso, L.; Hocek, S.
1990-01-01
This paper describes computer-based optical character recognition (OCR) systems, focusing on their components (the computer, the scanner, the OCR, and the output device); how the systems work; and features to consider in selecting a system. A list of 26 questions to ask to evaluate systems for potential purchase is included. (JDD)
State-of-the-Art Opportunities. Hispanic Special Report: Careers in Engineering.
ERIC Educational Resources Information Center
Heller, Michele
1992-01-01
Although the demand for electrical, defense, and computer science engineers has dropped sharply, opportunities exist for Hispanics in computer communication and integration, miniaturization of electronic components, environmental, and genetic and biomedical engineering. Engineers should diversify their skills to adapt to the changing field. (KS)
The Benefits of Multimedia Computer Software for Students with Disabilities.
ERIC Educational Resources Information Center
Green, Douglas W.
This paper assesses the current state of research and informed opinion on the benefits of multimedia computer software for students with disabilities. Topics include: a definition of multimedia; advantages of multimedia; Multiple Intelligence Theory, which states that intellectual abilities consist of seven components; motivation and behavior…
A Computer Model of the Cardiovascular System for Effective Learning.
ERIC Educational Resources Information Center
Rothe, Carl F.
1979-01-01
Described is a physiological model which solves a set of interacting, possibly nonlinear, differential equations through numerical integration on a digital computer. Sample printouts are supplied and explained for effects on the components of a cardiovascular system when exercise, hemorrhage, and cardiac failure occur. (CS)
EPA’s National Center for Computational Toxicology is engaged in high-profile research efforts to improve the ability to more efficiently and effectively prioritize and screen thousands of environmental chemicals for potential toxicity. A central component of these efforts invol...
21 CFR 892.1200 - Emission computed tomography system.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Emission computed tomography system. 892.1200 Section 892.1200 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES... analysis and display equipment, patient and equipment supports, radionuclide anatomical markers, component...