Science.gov

Sample records for honeywell computers

  1. Honeywell Modular Automation System Computer Software Documentation

    SciTech Connect

    CUNNINGHAM, L.T.

    1999-09-27

    This document provides computer software documentation for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). The system will be used to control the new thermal stabilization furnaces in HA-211 and the vertical denitration calciner in HC-230C-2.

  2. Evaluation of Honeywell Recoverable Computer System (RCS) in Presence of Electromagnetic Effects

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar

    1997-01-01

    The design and development of a Closed-Loop System to study and evaluate the performance of the Honeywell Recoverable Computer System (RCS) in electromagnetic environments (EME) is presented. The development of a Windows-based software package to handle the time-critical communication of data and commands between the RCS and flight simulation code in real time, while meeting the stringent hard deadlines, is also presented. The performance results of the RCS while exercising flight control laws under ideal conditions as well as in the presence of electromagnetic fields are also discussed.

  3. HONEY -- The Honeywell Camera

    NASA Astrophysics Data System (ADS)

    Clayton, C. A.; Wilkins, T. N.

    The Honeywell model 3000 colour graphic recorder system (hereafter referred to simply as Honeywell) has been bought by Starlink for producing publishable quality photographic hardcopy from the IKON image displays. Full colour and black & white images can be recorded on positive or negative 35mm film. The Honeywell consists of a built-in high resolution flat-faced monochrome video monitor, a red/green/blue colour filter mechanism and a 35mm camera. The device works on the direct video signals from the IKON. This means that changing the brightness or contrast on the IKON monitor will not affect any photographs that you take. The video signals from the IKON consist of separate red, green and blue signals. When you take a picture, the Honeywell takes the red, green and blue signals in turn and displays three pictures consecutively on its internal monitor. It takes an exposure through each of three filters (red, green and blue) onto the film in the camera. This builds up the complete colour picture on the film. Honeywell systems are installed at nine Starlink sites, namely Belfast (locally funded), Birmingham, Cambridge, Durham, Leicester, Manchester, Rutherford, ROE and UCL.

  4. Honeywell: Comfort and economy

    SciTech Connect

    Lukaszewski, J.

    1995-12-31

    Presentations of a company usually begin by classifying it as one operating in the consumer market or one serving the professional market. Honeywell does not fit such simple criteria. We are a company supplying products, systems, and services related to automatic control engineering in the broadest sense, and our operations span many apparently diverse fields: automatic control in aeronautics, heavy power engineering, construction of apartment buildings and detached houses, heat engineering, and others. Nevertheless, our targets are always the same: maximizing the efficiency and reliability of the process lines controlled by our systems, and securing the best comfort of work and rest for the people who occupy the buildings controlled by our devices. At the same time, the use of energy sources and natural environmental resources must be as sensible as possible.

  5. PREVENTIVE MAINTENANCE. HONEYWELL PLANNING GUIDE.

    ERIC Educational Resources Information Center

    Honeywell, Inc., Minneapolis, Minn.

    THIS HONEYWELL PAMPHLET DISCUSSES SOME ASPECTS OF PREVENTIVE MAINTENANCE OF AUTOMATIC CONTROLS, HEATING, VENTILATING, AND AIR CONDITIONING, AND COMPARES IN-PLANT WITH CONTRACT SERVICE, CONCLUDING THAT CONTRACT SERVICE IS PREFERABLE AND DESCRIBING A NUMBER OF MAINTENANCE PLANS WHICH THEY FURNISH. PREVENTIVE MAINTENANCE PROVIDES--(1) MORE EFFICIENT…

  6. R and D productivity improvement at Honeywell: A case study

    NASA Technical Reports Server (NTRS)

    Lyons, W. E.

    1985-01-01

    Described are the problems encountered when computer-aided design/documentation was applied to a large design program at Honeywell, how a study team was established to solve them, the techniques the team used, and the resulting solutions. These techniques may be applied to other problem areas in the R&D process to improve productivity.

  7. Honeywell optical investigations on FLASH program

    NASA Astrophysics Data System (ADS)

    O'Rourke, Ken; Peterson, Eric; Yount, Larry

    1995-05-01

    The increasing performance and reduced life cycle cost requirements placed on commercial and military transport aircraft are resulting in more complex, highly integrated aircraft control and management systems. The use of fiber optic data transmission media can make significant contributions toward achieving these performance and cost goals. The Honeywell portion of Task 2A on the Fly-by-Light Advanced System Hardware (FLASH) program is evaluating a Primary Flight Control System (PFCS) using pilot and copilot inputs from Active Hand Controllers (AHC) which are optically linked to the Primary Flight Control Computers (PFCC). Customer involvement is an important element of the Task 2A activity: establishing customer requirements and perspectives on productization of systems developed under FLASH is key to future product success. The Honeywell elements of the PFCS demonstrator provide a command path that is optically interfaced from crew inputs to the commands of distributed, smart actuation subsystems. Optical communication architectures are implemented using several protocols, including the new AS-1773A 20 Mbps data bus standard. The interconnecting fiber optic cable plant is provided by our Task 1A teammate McDonnell Douglas Aerospace (West). Fiber optic cable plant fabrication uses processes, tools, and materials reflecting the advances in manufacturing required to make fly-by-light avionics systems marketable.

  8. Overview Of Diffractive Optics At Honeywell

    NASA Astrophysics Data System (ADS)

    Cox, J. Allen

    1988-05-01

    Interest in holographic, or diffractive, optics has been rekindled in the last few years with demonstrated advances in three areas: computer-aided design (CAD) tools, VLSI lithographic and dry etching processes, and mathematical modeling of diffractive elements [1]. The availability of CAD tools and electron-beam lithography led first to the emergence of computer-generated holography (CGH). CGH work at Honeywell was started and brought to maturity by Arnold [2] in 1980-1983. However, because of their inherently low diffraction efficiency (~10%), lithographic CGHs have found a place in only a relatively few practical applications, such as testing diamond-turned aspherics, and thus CGHs have not been widely accepted within industry. The first step in changing this situation came in the 1970s with numerical approaches to rigorously solving the vector field equations for diffraction from blazed gratings [3]. The extensive numerical results from these models not only showed that high diffraction efficiencies are possible with etched surface profiles, but also indicated the sensitivity to various profile configurations and design parameters. Veldkamp et al. [1,4-6] at MIT Lincoln Laboratory have taken the final step necessary to establish the practical feasibility of diffractive optics by using reactive ion etching techniques to produce the surface profiles prescribed by the numerical models and delineated by CGH lithographic masks. With this combined approach, they have demonstrated the feasibility of high-efficiency diffractive elements for a variety of diverse applications, such as a CO2 laser radar telescope [4], coherent beam addition of laser diode arrays [5], and on-axis, broadband, aspheric lens elements for infrared imagers [6]. These elements are fabricated using well-established VLSI lithographic and dry etching techniques. Moreover, the ability to replicate each diffractive element provides the potential for high-volume, low-cost producibility. With this precedent, Honeywell…

  9. Honeywell FLASH fiber optic motherboard evaluations

    NASA Astrophysics Data System (ADS)

    Stange, Kent

    1996-10-01

    The use of fiber optic data transmission media can make significant contributions toward achieving the increasing performance and reduced life cycle cost requirements placed on commercial and military transport aircraft. For complete end-to-end fiber optic transmission, photonics technologies and techniques need to be understood and applied internally to the aircraft line replaceable units as well as externally on the interconnecting aircraft cable plant. During a portion of the Honeywell contribution to Task 2A on the Fly-by-Light Advanced System Hardware program, evaluations were done on a fiber optic transmission media implementation internal to a Primary Flight Control Computer (PFCC). The PFCC internal fiber optic transmission media implementation included a fiber optic backplane, an optical card-edge connector, and an optical source/detector coupler installation. The performance of these optical media components was evaluated under typical aircraft environmental stresses of temperature, vibration, and humidity. These optical media components represent key technologies for end-to-end fiber optic transmission to the computer on commercial and military transport aircraft. The evaluations and technical readiness assessments of these technologies will enable better perspectives on productization of fly-by-light systems requiring their use.

  10. Child Care Recommendations for Honeywell Employees.

    ERIC Educational Resources Information Center

    Johnston, Chris, Ed.

    Quality child care is an issue affecting both an industry as a whole and the individuals within that industry. Employees' absenteeism, morale, and motivation are closely linked to concern for their child's well-being and this concern will ultimately affect both production and company success. In recognition of this issue, the Honeywell Corporation…

  11. 77 FR 51695 - Airworthiness Directives; Honeywell International Inc. Turbofan Engines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-27

    ... (1) This AD applies to Honeywell International Inc. TFE731-20R, -20AR, -20BR, -40, -40AR, -40R, -50R.... SUMMARY: We are adopting a new airworthiness directive (AD) for all Honeywell International Inc. TFE731-20R, -20AR, -20BR, -40, -40AR, -40R, -50R, and -60 turbofan engines. This AD was prompted by a...

  12. Fiber optic gyro development at Honeywell

    NASA Astrophysics Data System (ADS)

    Sanders, Glen A.; Sanders, Steven J.; Strandjord, Lee K.; Qiu, Tiequn; Wu, Jianfeng; Smiciklas, Marc; Mead, Derek; Mosor, Sorin; Arrizon, Alejo; Ho, Waymon; Salit, Mary

    2016-05-01

    Two major architectures of fiber optic gyroscopes have been under development at Honeywell in recent years. The interferometric fiber optic gyro (IFOG) has been in production and deployment for various high-performance space and marine applications. Different designs, offering very low noise and performance ranging from better than navigation grade to ultra-precise, have been tested and produced. The resonator fiber optic gyro (RFOG) is also under development, primarily for its attractive potential for civil navigation use, but also because of its scalability to other performance levels. New techniques to address optical backscatter and laser frequency noise have been developed and demonstrated. Development of novel, enhanced RFOG architectures using hollow-core fiber, silicon optical bench technology, and highly stable multifrequency laser sources is discussed.

  13. 32. INTERIOR VIEW TO THE WEST OF A HONEYWELL WALL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    32. INTERIOR VIEW TO THE WEST OF A HONEYWELL WALL PRESSURE GAUGE IN ROOM 105, THE CONTROL ROOM. - Nevada Test Site, Pluto Facility, Disassembly Building, Area 26, Wahmonie Flats, Cane Spring Road, Mercury, Nye County, NV

  14. 76 FR 70334 - Airworthiness Directives; Honeywell International Inc. Turboshaft Engines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-14

    ... Policies and Procedures (44 FR 11034, February 26, 1979), (3) Will not affect intrastate aviation in Alaska... site: http://portal.honeywell.com/wps/portal/aero . (l) Material Incorporated by Reference None....

  15. 75 FR 22519 - Airworthiness Directives; Honeywell International Inc., Primus EPIC and Primus APEX Flight...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-29

    ... Policies and Procedures (44 FR 11034, February 26, 1979), and (3) Will not have a significant economic...- 3343; e-mail AeroTechSupport@Honeywell.com ; Internet http://portal.honeywell.com/wps/portal/aero....honeywell.com/wps/portal/aero . (3) You may review copies of the service information at the FAA,...

  16. 52. VIEW OF HONEYWELL PROPELLANT UTILIZATION TEST SET (FOREGROUND) AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    52. VIEW OF HONEYWELL PROPELLANT UTILIZATION TEST SET (FOREGROUND) AND GENERAL ELECTRIC AIRBORNE BEACON EQUIPMENT TEST SET LOCATED IMMEDIATELY SOUTH OF DEMULTIPLEX BAY, IN THE SOUTHWEST CORNER OF THE TELEMETRY ROOM (ROOM 106) - Vandenberg Air Force Base, Space Launch Complex 3, Launch Operations Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  17. 76 FR 73489 - Airworthiness Directives; Honeywell International Inc. Turbofan Engines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-29

    ... ``significant rule'' under DOT Regulatory Policies and Procedures (44 FR 11034, February 26, 1979), (3) Will not...: (800) 601-3099; Web site: http://portal.honeywell.com/wps/portal/aero. You may review copies of the... 52181, Phoenix, AZ 85072-2181, phone: (800) 601-3099; Web site:...

  18. The Honeywell Studies: How Managers Learn to Manage.

    ERIC Educational Resources Information Center

    Zemke, Ron

    1985-01-01

    Describes how a group of Honeywell Corporation human resources specialists determined how the organization's management development process could be improved to support the needs of the business. Discusses corporate management development strategy, curriculum, conference center, on-the-job experiences, and how experience, relationships, and…

  19. 75 FR 77664 - Honeywell International, Inc., Automation and Control Solutions Division, Including On-Site...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-13

    ... of Honeywell International, Inc., Automation and Control Solutions Division, Rock Island, Illinois. The notice was published in the Federal Register on August 13, 2010 (75 FR 49531). At the request of a... Employment and Training Administration Honeywell International, Inc., Automation and Control...

  20. Honeywell Cascade Distiller System Performance Testing Interim Results

    NASA Technical Reports Server (NTRS)

    Callahan, Michael R.; Sargusingh, Miriam

    2014-01-01

    The ability to recover and purify water through physiochemical processes is crucial for realizing long-term human space missions, including both planetary habitation and space travel. Because of their robust nature, distillation systems have been actively pursued as one of the technologies for water recovery. The Cascade Distillation System (CDS) is a vacuum rotary distillation system with potential for greater reliability and lower energy costs than existing distillation systems. The CDS was previously under development through Honeywell and NASA. In 2009, an assessment was performed to collect data to support down-selection and development of a primary distillation technology for application in a lunar outpost water recovery system. Based on the results of this testing, an expert panel concluded that the CDS showed adequate development maturity, TRL-4, together with the best product water quality and competitive weight and power estimates to warrant further development. The Advanced Exploration Systems (AES) Water Recovery Project (WRP) worked to address the weaknesses identified by the panel, namely bearing design and heat pump power efficiency. Testing at the NASA-JSC Advanced Exploration System Water Laboratory (AES Water Lab) using a prototype Cascade Distillation Subsystem (CDS) wastewater processor (Honeywell International, Torrance, Calif.), with test support equipment and a control system developed by Johnson Space Center, was performed to evaluate the performance of the system with the upgrades. The CDS was also challenged with ISS analog waste streams and a subset of those being considered for Exploration architectures. This paper details interim results of the AES WRP CDS performance testing.

  1. 78 FR 56859 - Foreign-Trade Zone 75-Phoenix, Arizona, Authorization of Limited Production Activity, Honeywell...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-16

    ... Activity, Honeywell Aerospace, Inc. (Aircraft Engines, Systems and Components), Phoenix and Tempe, Arizona... comment (78 FR 27951-27952, 05-13-2013). Based on the FTZ Board's determination in this proceeding,...

  2. Qualification test procedures and results for Honeywell solar collector subsystem, single-family residence

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The test procedures and results in qualifying the Honeywell single family residence solar collector subsystem are presented. Testing was done in the following areas: pressure, service loads, hail, solar degradation, pollutants, thermal degradation, and outgassing.

  3. 83. DETAIL OF HONEYWELL AIRCONDITIONING CONTROLS IN SLC3E CONTROL ROOM ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    83. DETAIL OF HONEYWELL AIR-CONDITIONING CONTROLS IN SLC-3E CONTROL ROOM - Vandenberg Air Force Base, Space Launch Complex 3, Launch Operations Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  4. 76 FR 58049 - Atomic Safety and Licensing Board; Honeywell International, Inc.; Metropolis Works Uranium...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-19

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Atomic Safety and Licensing Board; Honeywell International, Inc.; Metropolis Works Uranium... assurance for its Metropolis Works uranium conversion facility in Metropolis, Illinois. \\1\\ LBP-11-19,...

  5. Behavior of Capstone and Honeywell microturbine generators during load changes

    SciTech Connect

    Yinger, Robert J.

    2001-07-01

    This report describes test measurements of the behavior of two microturbine generators (MTGs) under transient conditions. The tests were conducted under three different operating conditions: grid-connect; stand-alone single MTG with load banks; and two MTGs running in parallel with load banks. Tests were conducted with both the Capstone 30-kW and Honeywell Parallon 75-kW MTGs. All tests were conducted at the Southern California Edison/University of California, Irvine (UCI) test facility. In the grid-connected mode, several test runs were conducted with different set-point changes both up and down, and a startup and shutdown were recorded for each MTG. For the stand-alone mode, load changes were initiated by changing load-bank values (both watts and VARs). For the parallel mode, tests involved changes in the load-bank settings as well as changes in the power set point of the MTG running in grid-connect mode. Detailed graphs of the test results are presented. It should be noted that these tests were done using a specific hardware and software configuration; use of different software and hardware could result in different performance characteristics for the same units.

  6. Detailed analysis of Honeywell In-Space Accelerometer data - STS-32. [crystal microstructure response to different types of residual acceleration

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Alexander, J. I. D.; Schoess, Jeff

    1993-01-01

    The Honeywell In-Space Accelerometer (HISA) system collected data in the mid-deck area of the Shuttle Columbia during the flight of STS-32, January 1990. The resulting data were to be used to investigate the response of crystal microstructure to different types of residual acceleration. The HISA is designed to detect and record transient and oscillatory accelerations. The sampling and electronics package stored averaged accelerations over two sampling periods; two sampling rates were available: 1 Hz and 50 Hz. Analysis of the HISA data followed the CMMR Acceleration Data Processing Guide, considering in-house computer modelling of a float-zone indium crystal growth experiment. Characteristic examples of HISA data showing the response to the primary reaction control system, Orbiter Maneuvering System operations, and crew treadmill activity are presented. Various orbiter structural modes are excited by these and other activities.

  7. Ice Particle Analysis of the Honeywell ALF502 Engine Booster

    NASA Technical Reports Server (NTRS)

    Bidwell, Colin S.; Rigby, David L.

    2015-01-01

    A flow and ice particle trajectory analysis was performed for the booster of the Honeywell ALF502 engine. The analysis focused on two closely related conditions, one of which produced an icing event and one of which did not, during testing of the ALF502 engine in the Propulsion Systems Lab (PSL) at NASA Glenn Research Center. The flow analysis was generated using the NASA Glenn GlennHT flow solver, and the particle analysis was generated using the NASA Glenn LEWICE3D v3.63 ice accretion software. The inflow conditions for the two cases were similar, the main differences being that the icing-event condition was 6.8 K colder than the non-icing-event case and that the inflow ice water content (IWC) for the non-icing-event case was 50% less than for the icing-event case. The particle analysis, which considered sublimation, evaporation, and phase change, was generated for a 5 micron ice particle with a sticky impact model and for a 24 micron median volume diameter (MVD), 7-bin ice particle distribution with a supercooled large droplet (SLD) splash model used to simulate ice particle breakup. The particle analysis did not consider the effect of the runback and re-impingement of water resulting from the heated spinner and anti-icing system. The results showed that the amount of impingement on the components was similar for the same particle size and impact model under the icing and non-icing event conditions. This was attributed to the similar aerodynamic conditions in the booster for the two cases. The particle temperature and melt fraction were higher at the same location and particle size for the non-icing event than for the icing event, due to the higher incoming inflow temperature of the non-event case. The 5 micron ice particle case produced higher impact temperatures and higher melt fractions on the components downstream of the fan than the 24 micron MVD case because the average particle size generated by the particle…

  8. 78 FR 1735 - Airworthiness Directives; Honeywell International Inc. Air Data Pressure Transducers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-09

    ... Executive Order 12866, (2) Is not a ``significant rule'' under DOT Regulatory Policies and Procedures (44 FR... International Inc. Air Data Pressure Transducers AGENCY: Federal Aviation Administration (FAA), DOT. ACTION... certain Honeywell International Inc. air data pressure transducers as installed on various aircraft....

  9. 76 FR 41312 - Honeywell International Inc.; Establishment of Atomic Safety And Licensing Board

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-13

    ... COMMISSION Honeywell International Inc.; Establishment of Atomic Safety And Licensing Board Pursuant to delegation by the Commission dated December 29, 1972, published in the Federal Register, 37 FR 28,710 (1972... rule, which the NRC promulgated in August 2007 (72 FR 49,139). Issued at Rockville, Maryland, this...

  10. 76 FR 81986 - Honeywell International, Inc., Automation and Control Solutions Division, Including On-Site...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ..., 2010 (75 FR 49531). The notice was amended on December 7, 2010 to include several on-site leased worker firms. The amended notice was published in the Federal Register on December 13, 2010 (75 FR 77664-77665... Employment and Training Administration Honeywell International, Inc., Automation and Control...

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, HONEYWELL POWER SYSTEMS, INC. PARALLON 75 KW TURBOGENERATOR

    EPA Science Inventory

    The Environmental Technology Verification report discusses the technology and performance of the Parallon 75kW Turbogenerator manufactured by Honeywell Power Systems, Inc., formerly AlliedSignal Power Systems, Inc. The unit uses a natural-gas-fired turbine to power an electric ge...

  12. 77 FR 40637 - Honeywell International, Scanning and Mobility Division, Formerly Known as Hand Held Products...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... in the Federal Register on June 17, 2011 (Vol. 76 FR 117). At the request of the state workforce... 17, 2011 (Vol. 76 FR 117). In order to ensure that the worker group is properly identified, the... Employment and Training Administration Honeywell International, Scanning and Mobility Division,...

  13. 76 FR 39918 - Honeywell International, Inc., Metropolis Works; License Amendment Request and Request for a Hearing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-07

    ... FR 49139, August 28, 2007). The E-Filing process requires participants to submit and serve all... COMMISSION Honeywell International, Inc., Metropolis Works; License Amendment Request and Request for a... Metropolis Works Facility site located in Metropolis, Illinois. License No. SUB-526 authorizes the...

  14. Neighbors: A Partnership Project between the St. Louis Park, Minnesota Schools and the Military Avionics Division of Honeywell.

    ERIC Educational Resources Information Center

    Erickson, Cindy; Bengtson, Wayne

    1984-01-01

    A partnership between Honeywell and a Minnesota school district benefited both organizations through shared resources and provision of staff development programs. Details on how this collaborative project was designed and implemented are discussed. (DF)

  15. Test evaluation of the Honeywell GG1111 Single-Degree-of-Freedom (SDF) strapdown gyroscope

    NASA Astrophysics Data System (ADS)

    Vinnins, M. F.; Apps, R. G.

    1984-10-01

    Test results from the evaluation of a Honeywell GG1111 single-degree-of-freedom strapdown gyroscope are presented. Tests include both static and constant-rate tests in servo and analog-torque-to-balance modes. Results of multiposition drift tests, drift stability, cool-down sensitivity, temperature sensitivity, torque generator linearity, scale factor stability, and torque generator sensitivity to IA rate changes are presented and discussed.

  16. 75 FR 34347 - Airworthiness Directives; Honeywell International Inc. Auxiliary Power Unit Models GTCP36-150(R...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-17

    ... published the proposed AD in the Federal Register on December 23, 2009 (74 FR 68196). That action proposed... Policies and Procedures (44 FR 11034, February 26, 1979); and (3) Will not have a significant economic...) for Honeywell International Inc. auxiliary power unit (APU) models GTCP36- 150(R) and...

  17. LEWICE3D/GlennHT Particle Analysis of the Honeywell ALF502 Low Pressure Compressor

    NASA Technical Reports Server (NTRS)

    Bidwell, Colin S.; Rigby, David L.

    2015-01-01

    A flow and ice particle trajectory analysis was performed for the booster of the Honeywell ALF502 engine. The analysis focused on two closely related conditions, one of which produced a rollback and one of which did not, during testing in the Propulsion Systems Lab at NASA Glenn Research Center. The flow analysis was generated using the NASA Glenn GlennHT flow solver, and the particle analysis was generated using the NASA Glenn LEWICE3D v3.56 ice accretion software. The flow and particle analysis used a 3D steady-flow, mixing-plane approach to model the transport of flow and particles through the engine. The inflow conditions for the rollback case were: airspeed, 145 m/s; static pressure, 33,373 Pa; static temperature, 253.3 K. The inflow conditions for the non-rollback case were: airspeed, 153 m/s; static pressure, 34,252 Pa; static temperature, 260.1 K. Both cases were subjected to an ice particle cloud with a median volume diameter of 24 microns, an ice water content of 2.0 g/m3, and a relative humidity of 100 percent. The most significant difference between the rollback and non-rollback conditions was the inflow static temperature, which was 6.8 K higher for the non-rollback case.

  18. Solar energy system performance evaluation: Honeywell OTS 41, Shenandoah (Newnan), Georgia

    NASA Astrophysics Data System (ADS)

    Mathur, A. K.; Pederson, S.

    1982-08-01

    The operation and technical performance of the Solar Operational Test Site (OTS 41) located at Shenandoah, Georgia, are described, based on the analysis of the data collected between January and August 1981. The following topics are discussed: system description, performance assessment, operating energy, energy savings, system maintenance, and conclusions. The solar energy system at OTS 41 is a hydronic heating and cooling system consisting of 702 square feet of liquid-cooled flat-plate collectors; a 1000-gallon thermal storage tank; a 3-ton capacity organic Rankine-cycle-engine-assisted air conditioner; a water-to-air heat exchanger for solar space heating; a finned-tube coil immersed in the storage tank to preheat water for a gas-fired hot water heater; and associated piping, pumps, valves, and controls. The solar system has six basic modes of operation and several combination modes. The system operation is controlled automatically by a Honeywell-designed microprocessor-based control system, which also provides diagnostics.

  19. Solar energy system performance evaluation: final report for Honeywell OTS 41, Shenandoah (Newnan), Georgia

    SciTech Connect

    Mathur, A K; Pederson, S

    1982-08-01

    The operation and technical performance of the Solar Operational Test Site (OTS 41) located at Shenandoah, Georgia, are described, based on the analysis of data collected between January and August 1981. The following topics are discussed: system description, performance assessment, operating energy, energy savings, system maintenance, and conclusions. The solar energy system at OTS 41 is a hydronic heating and cooling system consisting of 702 square feet of liquid-cooled flat-plate collectors; a 1000-gallon thermal storage tank; a 3-ton capacity organic Rankine-cycle-engine-assisted air conditioner; a water-to-air heat exchanger for solar space heating; a finned-tube coil immersed in the storage tank to preheat water for a gas-fired hot water heater; and associated piping, pumps, valves, and controls. The solar system has six basic modes of operation and several combination modes. The system operation is controlled automatically by a Honeywell-designed microprocessor-based control system, which also provides diagnostics. Based on the instrumented test data monitored and collected during the 7 months of the Operational Test Period, the solar system collected 53 MMBtu of the 219 MMBtu of total incident solar energy and provided 11.4 MMBtu for cooling, 8.6 MMBtu for heating, and 8.1 MMBtu for domestic hot water. The projected net annual energy savings due to the solar system were approximately 50 MMBtu of fossil energy (49,300 cubic feet of natural gas) and a loss of 280 kWh(e) of electrical energy.

  20. AN EXPLORATORY STUDY OF THE DEVELOPMENT AND UTILIZATION OF A GRADE TWO BRAILLE TRANSLATOR FOR THE HONEYWELL 222 HIGH SPEED BRAILLE PRINTER. FINAL REPORT.

    ERIC Educational Resources Information Center

    NELSON, CALVIN C.

    IN ORDER TO EXPLORE THE TECHNICAL AND PRACTICAL PROBLEMS INVOLVED IN BRINGING THE HONEYWELL MODEL 222 MODIFIED BRAILLE PRINTER TO FULL UTILIZATION, THREE OBJECTIVES WERE DEVELOPED--(1) EXPLORATION OF THE PROBLEMS RELATED TO THE DEVELOPMENT OF A TRANSLATOR SYSTEM, (2) EXPLORATION OF A SYSTEM FOR DIRECT INPUT OF GRADE TWO BRAILLE SO THAT THE…

  1. Characterization of a Recoverable Flight Control Computer System

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar; Torres, Wilfredo

    1999-01-01

    The design and development of a Closed-Loop System to study and evaluate the performance of the Honeywell Recoverable Computer System (RCS) in electromagnetic environments (EME) is presented. The development of a Windows-based software package to handle the time-critical communication of data and commands between the RCS and flight simulation code in real time, while meeting the stringent hard deadlines, is also described. The performance results of the RCS and the characteristics of its upset recovery scheme while exercising flight control laws, under ideal conditions as well as in the presence of electromagnetic fields, are also discussed.

  2. Fire Protection. Honeywell Planning Guide.

    ERIC Educational Resources Information Center

    Honeywell, Inc., Minneapolis, Minn.

    A general discussion of fire alarms and protection is provided by a manufacturer of automated monitoring and control systems. Background information describes old and new fire alarm systems, comparing system components, wage savings, and cost analysis. Different kinds of automatic systems are listed, including--(1) local system, (2) auxiliary…

  3. A computer program for calculating symmetrical aerodynamic characteristics and lateral-directional stability derivatives of wing-body combinations with blowing jets

    NASA Technical Reports Server (NTRS)

    Lan, C. E.; Mehrotra, S. C.; Fox, C. H., Jr.

    1978-01-01

    The information necessary for using a computer program that calculates the aerodynamic characteristics under symmetrical flight conditions, and the lateral-directional stability derivatives, of wing-body combinations with upper-surface-blowing (USB) or over-wing-blowing (OWB) jets is described. Two new features were added to the program: (1) a fuselage modeled as an arbitrary body of revolution, so that the effect of wing-body interference can now be investigated; and (2) calculation of all nine lateral-directional stability derivatives. The program is written in FORTRAN and runs on CDC Cyber 175 and Honeywell 66/60 computers.

  4. Computers and Computer Resources.

    ERIC Educational Resources Information Center

    Bitter, Gary

    1980-01-01

    This resource directory provides brief evaluative descriptions of six popular home computers and lists selected sources of educational software, computer books, and magazines. For a related article on microcomputers in the schools, see p53-58 of this journal issue. (SJL)

  5. 77 FR 29697 - Honeywell Metropolis Works; Grant of Exemption for Honeywell Metropolis Works License

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-18

    ... on May 21, 1991, (56 FR 23391) and became mandatory for all licensees in January 1994. One of the... ML061780260. Federal Register Notice of Availability of EA and FONSI--71 FR 45862, August 10, 2006. 10. ICRP... in certain work areas, posting airborne radioactivity warning signs outside the work areas,...

  6. Computers and Computer Cultures.

    ERIC Educational Resources Information Center

    Papert, Seymour

    1981-01-01

    Instruction using computers is viewed as different from most other approaches to education, by allowing more than right or wrong answers, by providing models for systematic procedures, by shifting the boundary between formal and concrete processes, and by influencing the development of thinking in many new ways. (MP)

  7. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  8. Computer Music

    NASA Astrophysics Data System (ADS)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  9. Cooling Computers.

    ERIC Educational Resources Information Center

    Birken, Marvin N.

    1967-01-01

    Numerous decisions must be made in the design of computer air conditioning, each determined by a combination of economics, physical, and esthetic characteristics, and computer requirements. Several computer air conditioning systems are analyzed--(1) underfloor supply and overhead return, (2) underfloor plenum and overhead supply with computer unit…

  10. Pygmalion's Computer.

    ERIC Educational Resources Information Center

    Peelle, Howard A.

    Computers have undoubtedly entered the educational arena, mainly in the areas of computer-assisted instruction (CAI) and artificial intelligence, but whether educators should embrace computers and exactly how they should use them are matters of great debate. The use of computers in support of educational administration is widely accepted.…

  11. Computational dosimetry

    SciTech Connect

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry", which is interpreted as the sub-discipline of computational physics devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, while another very important application is the support that it can give to the design, optimization and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper emphasis is given to the discussion of potential pitfalls in the applications of computational dosimetry and recommendations are given for their avoidance. The need for comparison of calculated and experimental data whenever possible is strongly stressed.

  12. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Reviews the performance of computer programs for writing poetry and prose, including MARK V. SHANEY, MELL, POETRY GENERATOR, THUNDER THOUGHT, and ORPHEUS. Discusses the writing principles of the programs. Provides additional information on computer magnification techniques. (YP)

  13. Computational Toxicology

    EPA Science Inventory

    Computational toxicology’ is a broad term that encompasses all manner of computer-facilitated informatics, data-mining, and modeling endeavors in relation to toxicology, including exposure modeling, physiologically based pharmacokinetic (PBPK) modeling, dose-response modeling, ...

  14. Female Computer

    NASA Technical Reports Server (NTRS)

    1964-01-01

    Melba Roy heads the group of NASA mathematicians, known as 'computers,' who track the Echo satellites. Roy's computations help produce the orbital element timetables by which millions can view the satellite from Earth as it passes overhead.

  15. Cloud Computing

    SciTech Connect

    Pete Beckman and Ian Foster

    2009-12-04

    Chicago Matters: Beyond Burnham (WTTW). Chicago has become a world center of "cloud computing." Argonne experts Pete Beckman and Ian Foster explain what "cloud computing" is and how you probably already use it on a daily basis.

  16. Quantum computing

    PubMed Central

    Li, Shu-Shen; Long, Gui-Lu; Bai, Feng-Shan; Feng, Song-Lin; Zheng, Hou-Zhi

    2001-01-01

    Quantum computing is a quickly growing research field. This article introduces the basic concepts of quantum computing, recent developments in quantum searching, and decoherence in a possible quantum dot realization. PMID:11562459

  17. Computer Algebra.

    ERIC Educational Resources Information Center

    Pavelle, Richard; And Others

    1981-01-01

    Describes the nature and use of computer algebra and its applications to various physical sciences. Includes diagrams illustrating, among others, a computer algebra system and flow chart of operation of the Euclidean algorithm. (SK)

  18. Computer Ease.

    ERIC Educational Resources Information Center

    Drenning, Susan; Getz, Lou

    1992-01-01

    Computer Ease is an intergenerational program designed to put an Ohio elementary school's computer lab, software library, staff, and students at the disposal of older adults desiring to become computer literate. Three 90-minute instructional sessions allow seniors to experience 1-to-1 high-tech instruction by enthusiastic, nonthreatening…

  19. Parallel computers

    SciTech Connect

    Treveaven, P.

    1989-01-01

    This book presents an introduction to object-oriented, functional, and logic parallel computing on which the fifth generation of computer systems will be based. Coverage includes concepts for parallel computing languages, a parallel object-oriented system (DOOM) and its language (POOL), an object-oriented multilevel VLSI simulator using POOL, and implementation of lazy functional languages on parallel architectures.

  20. Computer Manual.

    ERIC Educational Resources Information Center

    Illinois State Office of Education, Springfield.

    This manual, designed to provide the teacher with methods of understanding the computer and its potential in the classroom, includes four units with exercises and an answer sheet. Unit 1 covers computer fundamentals, the minicomputer, programming languages, an introduction to BASIC, and control instructions. Variable names and constants described…

  1. Computer Literacy.

    ERIC Educational Resources Information Center

    San Marcos Unified School District, CA.

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: After viewing many computer-literacy programs, we believe San Marcos Junior High School has developed a unique program which will truly develop computer literacy. Our hope is to give all students a comprehensive look at computers as they go through their two years here. They will not only learn the…

  2. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  3. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  4. Flight evaluation of a computer aided low-altitude helicopter flight guidance system

    NASA Technical Reports Server (NTRS)

    Swenson, Harry N.; Jones, Raymond D.; Clark, Raymond

    1993-01-01

    The Flight Systems Development branch of the U.S. Army's Avionics Research and Development Activity (AVRADA) and NASA Ames Research Center have developed for flight testing a Computer Aided Low-Altitude Helicopter Flight (CALAHF) guidance system. The system includes a trajectory-generation algorithm which uses dynamic programming and a helmet-mounted display (HMD) presentation of a pathway-in-the-sky, a phantom aircraft, and flight-path vector/predictor guidance symbology. The trajectory-generation algorithm uses knowledge of the global mission requirements, a digital terrain map, aircraft performance capabilities, and precision navigation information to determine a trajectory between mission way points that seeks valleys to minimize threat exposure. This system was developed and evaluated through extensive use of piloted simulation and has demonstrated a 'pilot centered' concept of automated and integrated navigation and terrain mission planning flight guidance. The system has shown significant improvements in pilot situational awareness and mission effectiveness, as well as a decrease in the training and proficiency time required for near-terrain, nighttime, adverse-weather operations. AVRADA's NUH-60A STAR (Systems Testbed for Avionics Research) helicopter was specially modified, in house, for the flight evaluation of the CALAHF system. The near-terrain trajectory-generation algorithm runs on a multiprocessor flight computer. Global Positioning System (GPS) data are integrated with Inertial Navigation Unit (INU) data in the flight computer to provide a precise navigation solution. The near-terrain trajectory and the aircraft state information are passed to a Silicon Graphics computer to provide the graphical 'pilot centered' guidance, presented on a Honeywell Integrated Helmet And Display Sighting System (IHADSS). The system design, piloted simulation, and initial flight test results are presented.

  5. Computer Literacy: Teaching Computer Ethics.

    ERIC Educational Resources Information Center

    Troutner, Joanne

    1986-01-01

    Suggests learning activities for teaching computer ethics in three areas: (1) equal access; (2) computer crime; and (3) privacy. Topics include computer time, advertising, class enrollments, copyright law, sabotage ("worms"), the Privacy Act of 1974 and the Freedom of Information Act of 1966. (JM)

  6. Computational psychiatry.

    PubMed

    Montague, P Read; Dolan, Raymond J; Friston, Karl J; Dayan, Peter

    2012-01-01

    Computational ideas pervade many areas of science and have an integrative explanatory role in neuroscience and cognitive science. However, computational depictions of cognitive function have had surprisingly little impact on the way we assess mental illness because diseases of the mind have not been systematically conceptualized in computational terms. Here, we outline goals and nascent efforts in the new field of computational psychiatry, which seeks to characterize mental dysfunction in terms of aberrant computations over multiple scales. We highlight early efforts in this area that employ reinforcement learning and game theoretic frameworks to elucidate decision-making in health and disease. Looking forwards, we emphasize a need for theory development and large-scale computational phenotyping in human subjects.

  7. Computed Tomography

    NASA Astrophysics Data System (ADS)

    Castellano, Isabel; Geleijns, Jacob

    After its clinical introduction in 1973, computed tomography developed from an x-ray modality for axial imaging in neuroradiology into a versatile three-dimensional imaging modality for a wide range of applications in, for example, oncology, vascular radiology, cardiology, traumatology, and even interventional radiology. Computed tomography is applied for diagnosis, follow-up studies, and screening of healthy subpopulations with specific risk factors. This chapter provides a general introduction to computed tomography, covering a short history of the technique, technology, image quality, dosimetry, room shielding, quality control, and quality criteria.

  8. Computer Software.

    ERIC Educational Resources Information Center

    Kay, Alan

    1984-01-01

    Discusses the nature and development of computer software. Programing, programing languages, types of software (including dynamic spreadsheets), and software of the future are among the topics considered. (JN)

  9. Computer News

    ERIC Educational Resources Information Center

    Science Activities: Classroom Projects and Curriculum Ideas, 2007

    2007-01-01

    This article presents several news stories about computers and technology. (1) Applied Science Associates of Narragansett, Rhode Island is providing computer modeling technology to help locate the remains of the USS Bonhomme Richard, which sank in 1779 after claiming a Revolutionary War victory. (2) Whyville, the leading edu-tainment virtual world…

  10. Cafeteria Computers.

    ERIC Educational Resources Information Center

    Dervarics, Charles

    1992-01-01

    By relying on new computer hardware and software, school food service departments can keep better records of daily food consumption, free and reduced-price meals, inventory, production, and other essentials. The most commonly used systems fall into two basic categories: point-of-sale computers and behind-the-counter systems. State funding efforts…

  11. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1989-01-01

    Discussed are three examples of computer graphics including biomorphs, Truchet tilings, and fractal popcorn. The graphics are shown and the basic algorithm using multiple iteration of a particular function or mathematical operation is described. An illustration of a snail shell created by computer graphics is presented. (YP)

  12. Computer Insecurity.

    ERIC Educational Resources Information Center

    Wilson, David L.

    1994-01-01

    College administrators recently appealed to students and faculty to change their computer passwords after security experts announced that tens of thousands had been stolen by computer hackers. Federal officials are investigating. Such attacks are not uncommon, but the most effective solutions are either inconvenient or cumbersome. (MSE)

  13. Computer Graphics.

    ERIC Educational Resources Information Center

    Halpern, Jeanne W.

    1970-01-01

    Computer graphics have been called the most exciting development in computer technology. At the University of Michigan, three kinds of graphics output equipment are now being used: symbolic printers, line plotters or drafting devices, and cathode-ray tubes (CRT). Six examples are given that demonstrate the range of graphics use at the University.…

  14. Computing Life

    ERIC Educational Resources Information Center

    National Institute of General Medical Sciences (NIGMS), 2009

    2009-01-01

    Computer advances now let researchers quickly search through DNA sequences to find gene variations that could lead to disease, simulate how flu might spread through one's school, and design three-dimensional animations of molecules that rival any video game. By teaming computers and biology, scientists can answer new and old questions that could…

  15. Grid Computing

    NASA Astrophysics Data System (ADS)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  16. Computational astrophysics

    NASA Technical Reports Server (NTRS)

    Miller, Richard H.

    1987-01-01

    Astronomy is an area of applied physics in which unusually beautiful objects challenge the imagination to explain observed phenomena in terms of known laws of physics. It is a field that has stimulated the development of physical laws and of mathematical and computational methods. Current computational applications are discussed in terms of stellar and galactic evolution, galactic dynamics, and particle motions.

  17. Computational Pathology

    PubMed Central

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  18. I, Computer

    ERIC Educational Resources Information Center

    Barack, Lauren

    2005-01-01

    What child hasn't chatted with friends through a computer? But chatting with a computer? Some Danish scientists have literally put a face on their latest software program, bringing to virtual life storyteller Hans Christian Andersen, who engages users in actual conversations. The digitized Andersen resides at the Hans Christian Andersen Museum in…

  19. Personal Computers.

    ERIC Educational Resources Information Center

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  20. Sort computation

    NASA Technical Reports Server (NTRS)

    Dorband, John E.

    1988-01-01

    Sorting has long been used to organize data in preparation for further computation, but sort computation allows some types of computation to be performed during the sort. Sort aggregation and sort distribution are the two basic forms of sort computation. Sort aggregation generates an accumulative or aggregate result for each group of records and places this result in one of the records. An aggregate operation can be any operation that is both associative and commutative, i.e., any operation whose result does not depend on the order of the operands or the order in which the operations are performed. Sort distribution copies the value from a field of a specific record in a group into that field in every record of that group.
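
    The two forms described in the abstract can be sketched serially in a few lines (the function names `sort_aggregate` and `sort_distribute` are mine; the abstract has massively parallel hardware in mind, which this sketch does not model):

    ```python
    from itertools import groupby
    from operator import itemgetter

    def sort_aggregate(records, key, field, op):
        """Sort by `key`, then fold the associative-commutative `op` over
        `field` within each group, leaving the aggregate in the group's
        first record (the other records keep their original values)."""
        out = []
        for _, grp in groupby(sorted(records, key=itemgetter(key)),
                              key=itemgetter(key)):
            grp = [dict(r) for r in grp]      # copy so inputs are untouched
            agg = grp[0][field]
            for r in grp[1:]:
                agg = op(agg, r[field])
            grp[0][field] = agg
            out.extend(grp)
        return out

    def sort_distribute(records, key, field):
        """Sort by `key`, then copy `field` from each group's first record
        into every record of that group."""
        out = []
        for _, grp in groupby(sorted(records, key=itemgetter(key)),
                              key=itemgetter(key)):
            grp = [dict(r) for r in grp]
            for r in grp[1:]:
                r[field] = grp[0][field]
            out.extend(grp)
        return out

    recs = [{"k": "b", "v": 2}, {"k": "a", "v": 1}, {"k": "b", "v": 5}]
    print(sort_aggregate(recs, "k", "v", lambda a, b: a + b))
    # group "b" now carries the sum 7 in its first record
    ```

    Because the aggregate operation is associative and commutative, the fold inside each group could be evaluated in any order, which is what makes the approach parallelizable.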

  1. LHC Computing

    SciTech Connect

    Lincoln, Don

    2015-07-28

    The LHC is the world’s highest energy particle accelerator, and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format, and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

  2. Advanced computing

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Advanced concepts in hardware, software and algorithms are being pursued for application in next generation space computers and for ground based analysis of space data. The research program focuses on massively parallel computation and neural networks, as well as optical processing and optical networking which are discussed under photonics. Also included are theoretical programs in neural and nonlinear science, and device development for magnetic and ferroelectric memories.

  3. Computational chemistry

    NASA Technical Reports Server (NTRS)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  4. Chromatin Computation

    PubMed Central

    Bryant, Barbara

    2012-01-01

    In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this “chromatin computer” to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal – and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines. PMID:22567109
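
    A toy rendering of the model this abstract describes (my own minimal formalization, not the paper's): nucleosomes as a list of mark symbols, and a "complex" as a read-write rule that rewrites a window of adjacent nucleosomes.

    ```python
    def step(tape, rules):
        """Fire the leftmost applicable rule once; return True if one fired.
        Each rule is (pattern, replacement) over adjacent 'nucleosomes',
        with equal lengths so the tape length is fixed."""
        for i in range(len(tape)):
            for pattern, replacement in rules:
                w = len(pattern)
                if tuple(tape[i:i + w]) == pattern:
                    tape[i:i + w] = replacement
                    return True
        return False

    def run(tape, rules, max_steps=1000):
        """Apply rules until none fires (or a step budget is exhausted)."""
        steps = 0
        while steps < max_steps and step(tape, rules):
            steps += 1
        return tape

    # Spreading rule: a marked nucleosome ("M") converts an adjacent
    # unmodified one ("u") -- a classic heterochromatin-spreading motif.
    rules = [(("M", "u"), ("M", "M"))]
    print("".join(run(list("Muuu"), rules)))  # MMMM
    ```

    Even this tiny rule set shows why the model is expressive: local read-write rules over a one-dimensional string are exactly the ingredients of a Turing machine, which is the basis of the universality claim in the abstract.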

  5. Computational structures for robotic computations

    NASA Technical Reports Server (NTRS)

    Lee, C. S. G.; Chang, P. R.

    1987-01-01

    The computational problem of inverse kinematics and inverse dynamics of robot manipulators by taking advantage of parallelism and pipelining architectures is discussed. For the computation of inverse kinematic position solution, a maximum pipelined CORDIC architecture has been designed based on a functional decomposition of the closed-form joint equations. For the inverse dynamics computation, an efficient p-fold parallel algorithm to overcome the recurrence problem of the Newton-Euler equations of motion to achieve the time lower bound of O(log₂ n) has also been developed.
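
    The CORDIC units mentioned in the abstract compute trigonometric functions with nothing but shifts and adds, which is what makes them easy to pipeline. A floating-point sketch of the rotation-mode iteration (illustrative only, not the paper's fixed-point pipeline):

    ```python
    import math

    def cordic_cos_sin(theta, n=32):
        """Rotation-mode CORDIC: drive the residual angle z to zero using
        only multiplications by 2**-i (shifts in hardware) and additions.
        Converges for |theta| <= ~1.74 rad."""
        # Precompute the elementary rotation angles and the gain correction K.
        angles = [math.atan(2.0 ** -i) for i in range(n)]
        K = 1.0
        for i in range(n):
            K /= math.sqrt(1.0 + 2.0 ** (-2 * i))
        x, y, z = K, 0.0, theta     # start pre-scaled so no final multiply
        for i in range(n):
            d = 1.0 if z >= 0.0 else -1.0
            x, y, z = (x - d * y * 2.0 ** -i,
                       y + d * x * 2.0 ** -i,
                       z - d * angles[i])
        return x, y                 # (cos(theta), sin(theta))
    ```

    Each of the n iterations is identical and independent of the data values except for the sign bit of z, so in hardware the loop unrolls into n pipeline stages, one rotation per stage.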

  6. [DNA computing].

    PubMed

    Błasiak, Janusz; Krasiński, Tadeusz; Popławski, Tomasz; Sakowski, Sebastian

    2011-01-01

    Biocomputers may offer an alternative to traditional "silicon-based" computers, whose continued development may be limited by further miniaturization (imposed by the Heisenberg uncertainty principle) and by the growing volume of information exchanged between the central processing unit and the main memory (the von Neumann bottleneck). The idea of DNA computing came true for the first time in 1994, when Adleman solved the Hamiltonian Path Problem using short DNA oligomers and DNA ligase. In the early 2000s a series of biocomputer models was presented, beginning with the seminal work of Shapiro and his colleagues, who presented a molecular two-state finite automaton in which the restriction enzyme FokI constituted the hardware and short DNA oligomers served as software as well as input/output signals. DNA molecules also provided the energy for this machine. DNA computing can be exploited in many applications, from studies of gene expression patterns to the diagnosis and therapy of cancer. DNA computing is still being pursued in research both in vitro and in vivo, and the promising results of this research give hope for a breakthrough in computer science. PMID:21735816
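
    For orientation, the Hamiltonian Path Problem that Adleman's experiment solved with DNA can be stated in a few lines of ordinary code (a sketch of the problem, not of the biochemistry; the graph below is a made-up example, not Adleman's actual seven-vertex instance):

    ```python
    from itertools import permutations

    def hamiltonian_path(n, edges, start, end):
        """Brute-force search for a directed path from `start` to `end`
        visiting every vertex exactly once. Adleman's 1994 experiment
        solved an instance of this problem chemically: DNA ligation
        enumerated candidate paths in massive parallel, and filtering
        steps discarded paths of the wrong length or coverage."""
        edge_set = set(edges)
        for middle in permutations(set(range(n)) - {start, end}):
            path = (start, *middle, end)
            if all(hop in edge_set for hop in zip(path, path[1:])):
                return path
        return None

    # A small made-up digraph on vertices 0..3.
    edges = [(0, 1), (1, 2), (2, 3), (0, 2), (3, 0)]
    print(hamiltonian_path(4, edges, start=0, end=3))  # (0, 1, 2, 3)
    ```

    The factorial blow-up of the `permutations` loop is exactly what made the massive parallelism of DNA chemistry attractive for this problem.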

  7. Computational mechanics

    SciTech Connect

    Goudreau, G.L.

    1993-03-01

The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  8. Computational Psychiatry

    PubMed Central

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  9. Computational mechanics

    SciTech Connect

    Raboin, P J

    1998-01-01

The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  10. Computer viruses

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1988-01-01

The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.
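One of the listed defenses, digitally authenticated object programs, can be approximated by recording a cryptographic hash of a trusted program and refusing to run any copy whose hash has drifted, for instance because a virus appended itself. A minimal sketch (real authentication additionally signs the hash; the "program" bytes here are invented):

```python
import hashlib

# Record a fingerprint of the trusted program once, then check every
# copy against it before execution. A virus that modifies the program
# in any way changes the SHA-256 digest.

def fingerprint(program: bytes) -> str:
    return hashlib.sha256(program).hexdigest()

trusted = b"PRINT 'HELLO'"                       # hypothetical program
known_good = fingerprint(trusted)

infected = trusted + b"; COPY SELF TO OTHERS"    # virus appends itself
print(fingerprint(trusted) == known_good)        # True
print(fingerprint(infected) == known_good)       # False
```

The scheme detects tampering but not a bad original: it only certifies that the bytes match what was trusted when the fingerprint was taken, which is why the article pairs it with digital hygiene about sources.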

  11. LHC Computing

    ScienceCinema

    Lincoln, Don

    2016-07-12

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

  12. Computational vision

    NASA Technical Reports Server (NTRS)

    Barrow, H. G.; Tenenbaum, J. M.

    1981-01-01

    The range of fundamental computational principles underlying human vision that equally apply to artificial and natural systems is surveyed. There emerges from research a view of the structuring of vision systems as a sequence of levels of representation, with the initial levels being primarily iconic (edges, regions, gradients) and the highest symbolic (surfaces, objects, scenes). Intermediate levels are constrained by information made available by preceding levels and information required by subsequent levels. In particular, it appears that physical and three-dimensional surface characteristics provide a critical transition from iconic to symbolic representations. A plausible vision system design incorporating these principles is outlined, and its key computational processes are elaborated.

  13. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  14. 77 FR 9868 - Airworthiness Directives; Honeywell International Inc. Turbofan Engines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-21

    ... Regulatory Policies and Procedures (44 FR 11034, February 26, 1979), (3) Will not affect intrastate aviation... turbofan engines. This proposed AD was prompted by a report of a rim/web separation of a first stage low... uncontained disk separation, leading to fuel tank penetration, fire, personal injury, and damage to...

  15. 77 FR 32009 - Airworthiness Directives; Honeywell International, Inc. Turbofan Engines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-31

    .... Guidance on performing a detailed functional test of the overspeed system can be found in the applicable... engine and perform a detailed functional test of the overspeed system. Guidance on performing a detailed... functional test of the overspeed system. Guidance on performing a detailed functional test of the...

  16. Honeywell militarized color liquid crystal displays for the F-16

    NASA Astrophysics Data System (ADS)

    Wood, Ted

    1996-05-01

A fully militarized color multifunction display for the F-16 has been completed and is in the first months of production. This high performance display is a tightly integrated ensemble of optical, electronic, mechanical, and thermal designs. Many of the elements are critically interdependent, requiring fine-tuning to achieve the exceptional performance required by the F-16 environment. With no cooling air available on the F-16, the thermal requirements, both specified and implicit, dominated the design process. The high luminance requirements, in combination with a high resolution display, concentrated a great deal of heat in the display module. As a result, thermal efficiency and management were paramount. Temperature stability and performance of the liquid crystal material itself, stability of the polarizers, optical and electronic efficiency and heat extraction design required intense scrutiny.

  17. Reliability and the design process at Honeywell Avionics Division

    NASA Technical Reports Server (NTRS)

    Bezat, A.

    1981-01-01

    The division's philosophy for designed-in reliability and a comparison of reliability programs for space, manned military aircraft, and commercial aircraft, are presented. Topics include: the reliability interface with design and production; the concept phase through final proposal; the design, development, test and evaluation phase; the production phase; and the commonality among space, military, and commercial avionics.

  18. 77 FR 51892 - Airworthiness Directives; Honeywell International Inc. Turbofan Engines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-28

    ... published in the Federal Register on February 21, 2012 (77 FR 9868). That NPRM proposed to require replacing...: Are consistent with the intent that was proposed in the NPRM (77 FR 9868, February 21, 2012) for... proposed in the NPRM (77 FR 9868, February 21, 2012). We also determined that these changes will...

  19. 77 FR 14312 - Airworthiness Directives; Honeywell International, Inc. Turbofan Engines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-09

    ... FAA, Engine & Propeller Directorate, 12 New England Executive Park, Burlington, MA. For information on the availability of this material at the FAA, call 781-238-7125. Examining the AD Docket You may... Policies and Procedures (44 FR 11034, February 26, 1979), (3) Will not affect intrastate aviation in...

  20. 77 FR 1043 - Airworthiness Directives; Honeywell International Inc. Turbofan Engines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-09

    ... referenced service information at the FAA, Engine & Propeller Directorate, 12 New England Executive Park, Burlington, MA 01803. For information on the availability of this material at the FAA, call (781) 238-7125... 12866, (2) Is not a ``significant rule'' under the DOT Regulatory Policies and Procedures (44 FR...

  1. Computer Routing.

    ERIC Educational Resources Information Center

    Malone, Roger

    1991-01-01

    Computerized bus-routing systems plot the most efficient routes, cut the time it takes to draw routes, and generate reports quickly and accurately. However, school districts often underestimate the amount of work necessary to get information into the computer database. (MLF)
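Under the hood, plotting "the most efficient routes" is a shortest-path computation on a road graph. A standard Dijkstra sketch on a hypothetical neighborhood graph, with edge weights as minutes of driving (all stop names and weights invented):

```python
import heapq

# Dijkstra's algorithm on a directed road graph: the priority queue
# always expands the cheapest frontier node, so the first time we pop
# the goal we have its shortest route.

def shortest_route(graph, start, goal):
    queue = [(0, start, [start])]   # (cost so far, node, path taken)
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, weight in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
    return None                      # goal unreachable

roads = {
    "depot": [("oak", 4), ("elm", 2)],
    "elm":   [("oak", 1), ("school", 7)],
    "oak":   [("school", 3)],
}
print(shortest_route(roads, "depot", "school"))
# (6, ['depot', 'elm', 'oak', 'school'])
```

Production routing systems layer constraints on top of this (capacity, bell times, turn restrictions), which is where the database work the article warns about comes in.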

  2. Computer Corner.

    ERIC Educational Resources Information Center

    Mason, Margie

    1985-01-01

    This article: describes how to prevent pins on game paddles from breaking; suggests using needlepoint books for ideas to design computer graphics; lists a BASIC program to create a Christmas tree, with extension activities; suggests a LOGO Christmas activity; and describes a book on the development of microcomputers. (JN)

  3. Business Computers.

    ERIC Educational Resources Information Center

    Canipe, Stephen L.

    A brief definition of some fundamentals of microcomputers and of the ways they may be used in small businesses can help potential buyers make informed purchases. Hardware (the mechanical devices from which computers are made) described here are the video display, keyboard, central processing unit, "random access" and "read only" memories, cassette…

  4. Computational trigonometry

    SciTech Connect

    Gustafson, K.

    1994-12-31

By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
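For a symmetric positive definite matrix, Gustafson's first antieigenvalue is μ₁ = min over x of ⟨Ax, x⟩ / (‖Ax‖ ‖x‖), the cosine of the largest angle by which A can turn a vector, with closed form 2√(λ_min λ_max)/(λ_min + λ_max). A numerical sketch on a 2×2 example (the matrix is illustrative):

```python
import math

# Antieigenvalue of an SPD matrix: the minimum of cos(angle between x
# and Ax) over all nonzero x. For a 2x2 matrix we can scan the unit
# circle directly and compare against the SPD closed form.

def cos_angle(A, x):
    ax = (A[0][0] * x[0] + A[0][1] * x[1],
          A[1][0] * x[0] + A[1][1] * x[1])
    dot = ax[0] * x[0] + ax[1] * x[1]
    return dot / (math.hypot(*ax) * math.hypot(*x))

A = [[1.0, 0.0], [0.0, 4.0]]          # SPD, eigenvalues 1 and 4

# Minimize over unit directions x = (cos t, sin t) on a fine grid.
mu = min(cos_angle(A, (math.cos(t), math.sin(t)))
         for t in (k * math.pi / 4000 for k in range(1, 4000)))

closed_form = 2 * math.sqrt(1.0 * 4.0) / (1.0 + 4.0)   # = 0.8
print(mu, closed_form)
```

The companion quantity sin φ(A) = (λ_max − λ_min)/(λ_max + λ_min) equals the classical Kantorovich convergence ratio for steepest descent, which is the trigonometric reading of iterative convergence the abstract refers to.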

  5. Computational Physics.

    ERIC Educational Resources Information Center

    Borcherds, P. H.

    1986-01-01

    Describes an optional course in "computational physics" offered at the University of Birmingham. Includes an introduction to numerical methods and presents exercises involving fast-Fourier transforms, non-linear least-squares, Monte Carlo methods, and the three-body problem. Recommends adding laboratory work into the course in the future. (TW)

  6. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)

  7. Networking computers.

    PubMed

    McBride, D C

    1997-03-01

    This decade the role of the personal computer has shifted dramatically from a desktop device designed to increase individual productivity and efficiency to an instrument of communication linking people and machines in different places with one another. A computer in one city can communicate with another that may be thousands of miles away. Networking is how this is accomplished. Just like the voice network used by the telephone, computer networks transmit data and other information via modems over these same telephone lines. A network can be created over both short and long distances. Networks can be established within a hospital or medical building or over many hospitals or buildings covering many geographic areas. Those confined to one location are called LANs, local area networks. Those that link computers in one building to those at other locations are known as WANs, or wide area networks. The ultimate wide area network is the one we've all been hearing so much about these days--the Internet, and its World Wide Web. Setting up a network is a process that requires careful planning and commitment. To avoid potential pitfalls and to make certain the network you establish meets your needs today and several years down the road, several steps need to be followed. This article reviews the initial steps involved in getting ready to network.
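Whatever the distance, LANs and WANs present the same programming interface at the endpoints: a socket. A minimal sketch of one message crossing a link, here an in-process socket pair standing in for the network (the message contents are invented):

```python
import socket

# Two connected stream endpoints, as a client and server would hold
# after a TCP connection is established; socket.socketpair() gives us
# both ends in one process so the example is self-contained.

client, server = socket.socketpair()
client.sendall(b"lab results: ready")   # one endpoint transmits
message = server.recv(1024)             # the other receives
client.close()
server.close()
print(message.decode())                  # lab results: ready
```

Real deployments differ only in how the two endpoints are connected (a hospital LAN, leased lines, or the Internet), not in this send/receive pattern.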

  8. Computational Estimation

    ERIC Educational Resources Information Center

    Fung, Maria G.; Latulippe, Christine L.

    2010-01-01

    Elementary school teachers are responsible for constructing the foundation of number sense in youngsters, and so it is recommended that teacher-training programs include an emphasis on number sense to ensure the development of dynamic, productive computation and estimation skills in students. To better prepare preservice elementary school teachers…

  9. Computational Musicology.

    ERIC Educational Resources Information Center

    Bel, Bernard; Vecchione, Bernard

    1993-01-01

Asserts that a revolution has been occurring in musicology since the 1970s. Contends that music has changed from being only a source of emotion to appearing more open to science and techniques based on computer technology. Describes recent research and other writings about the topic and provides an extensive bibliography. (CFR)

  10. Amorphous Computing

    NASA Astrophysics Data System (ADS)

    Sussman, Gerald

    2002-03-01

Digital computers have always been constructed to behave as precise arrangements of reliable parts, and our techniques for organizing computations depend upon this precision and reliability. Two emerging technologies, however, are beginning to undercut these assumptions about constructing and programming computers. These technologies -- microfabrication and bioengineering -- will make it possible to assemble systems composed of myriad information-processing units at almost no cost, provided: 1) that not all the units need to work correctly; and 2) that there is no need to manufacture precise geometrical arrangements or interconnection patterns among them. Microelectronic mechanical components are becoming so inexpensive to manufacture that we can anticipate combining logic circuits, microsensors, actuators, and communications devices integrated on the same chip to produce particles that could be mixed with bulk materials, such as paints, gels, and concrete. Imagine coating bridges or buildings with smart paint that can sense and report on traffic and wind loads and monitor structural integrity of the bridge. A smart paint coating on a wall could sense vibrations, monitor the premises for intruders, or cancel noise. Even more striking, there has been such astounding progress in understanding the biochemical mechanisms in individual cells that it appears we'll be able to harness these mechanisms to construct digital-logic circuits. Imagine a discipline of cellular engineering that could tailor-make biological cells that function as sensors and actuators, as programmable delivery vehicles for pharmaceuticals, as chemical factories for the assembly of nanoscale structures. Fabricating such systems seems to be within our reach, even if it is not yet within our grasp. Fabrication, however, is only part of the story.
We can envision producing vast quantities of individual computing elements, whether microfabricated particles, engineered cells, or macromolecular computing
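The premise that "not all the units need to work correctly" can be made concrete with a toy aggregate: many particles each sense a binary condition, a fixed fraction are faulty and answer at random, and the paint reports the majority vote. All parameters below are hypothetical:

```python
import random

# Majority vote over many unreliable sensors. With 20% of particles
# broken and answering at random, the aggregate reading is still
# correct with overwhelming probability for thousands of particles.

def paint_reading(true_value, n=10001, faulty_fraction=0.2, seed=1):
    rng = random.Random(seed)        # fixed seed for reproducibility
    votes = 0
    for _ in range(n):
        if rng.random() < faulty_fraction:
            votes += rng.choice([0, 1])   # faulty particle: noise
        else:
            votes += true_value           # working particle: truth
    return 1 if votes * 2 > n else 0      # majority (n is odd)

print(paint_reading(1), paint_reading(0))  # 1 0
```

This is the amorphous-computing bet in miniature: buy reliability of the aggregate statistically rather than engineering reliability into each part.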

  11. Bacteria as computers making computers

    PubMed Central

    Danchin, Antoine

    2009-01-01

Various efforts to integrate biological knowledge into networks of interactions have produced a lively microbial systems biology. Putting molecular biology and computer sciences in perspective, we review another trend in systems biology, in which recursivity and information replace the usual concepts of differential equations, feedback and feedforward loops and the like. Noting that the processes of gene expression separate the genome from the cell machinery, we analyse the role of the separation between machine and program in computers. However, computers do not make computers. For cells to make cells requires a specific organization of the genetic program, which we investigate using available knowledge. Microbial genomes are organized into a paleome (the name emphasizes the role of the corresponding functions from the time of the origin of life), comprising a constructor and a replicator, and a cenome (emphasizing community-relevant genes), made up of genes that permit life in a particular context. The cell duplication process supposes rejuvenation of the machine and replication of the program. The paleome also possesses genes that enable information to accumulate in a ratchet-like process down the generations. Systems biology must include the dynamics of information creation in its future developments. PMID:19016882

  12. RATIO COMPUTER

    DOEpatents

    Post, R.F.

    1958-11-11

An electronic computer circuit is described for producing an output voltage proportional to the product or quotient of the voltages of a pair of input signals. In essence, the disclosed invention provides a computer having two channels adapted to receive separate input signals and each having amplifiers with like fixed amplification factors and like negative feedback amplifiers. One of the channels receives a constant signal for comparison purposes, whereby a difference signal is produced to control the amplification factors of the variable feedback amplifiers. The output of the other channel is thereby proportional to the product or quotient of input signals depending upon the relation of input to fixed signals in the first mentioned channel.
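The feedback principle in the claim can be sketched as a discrete-time toy model (not the patent's actual circuit): the comparison channel drives a shared gain until its output matches the constant reference; at equilibrium the gain is ref/v1, so the second channel's output ref·v2/v1 is proportional to the quotient of the inputs.

```python
# Toy model of the patent's feedback principle. Channel 1 carries input
# v1; the difference between its output g*v1 and the constant reference
# steers a gain g shared by both channels. At the fixed point g*v1 =
# ref, i.e. g = ref/v1, so channel 2 outputs g*v2 = ref*v2/v1.
# The loop converges when rate*v1 < 2 (rate here is illustrative).

def ratio(v1, v2, ref=1.0, rate=0.05, steps=2000):
    g = 1.0
    for _ in range(steps):
        error = ref - g * v1     # difference signal from channel 1
        g += rate * error        # feedback adjusts the shared gain
    return g * v2                # channel 2 output ~ ref * v2 / v1

print(ratio(2.0, 6.0))  # ~ 3.0 (i.e. 6/2)
```

Placing the controlled gain before or after a channel is what flips the circuit between quotient-like and product-like behavior in the patent's description.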

  13. Computational Combustion

    SciTech Connect

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel and homogeneous charge, compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations are described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

  14. Computer Game

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Using NASA studies of advanced lunar exploration and colonization, KDT Industries, Inc. and Wesson International have developed MOONBASE, a computer game. The player, or team commander, must build and operate a lunar base using NASA technology. He has 10 years to explore the surface, select a site and assemble structures brought from Earth into an efficient base. The game was introduced in 1991 by Texas Space Grant Consortium.

  15. Computer centers

    NASA Astrophysics Data System (ADS)

    The National Science Foundation has renewed grants to four of its five supercomputer centers. Average annual funding will rise from $10 million to $14 million so facilities can be upgraded and training and education expanded. As cooperative projects, the centers also receive money from states, universities, computer vendors and industry. The centers support research in fluid dynamics, atmospheric modeling, engineering geophysics and many other scientific disciplines.

  16. Singularity computations

    NASA Technical Reports Server (NTRS)

    Swedlow, J. L.

    1976-01-01

    An approach is described for singularity computations based on a numerical method for elastoplastic flow to delineate radial and angular distribution of field quantities and measure the intensity of the singularity. The method is applicable to problems in solid mechanics and lends itself to certain types of heat flow and fluid motion studies. Its use is not limited to linear, elastic, small strain, or two-dimensional situations.

  17. New computer architectures

    SciTech Connect

    Tiberghien, J.

    1984-01-01

    This book presents papers on supercomputers. Topics considered include decentralized computer architecture, new programming languages, data flow computers, reduction computers, parallel prefix calculations, structural and behavioral descriptions of digital systems, instruction sets, software generation, personal computing, and computer architecture education.

  18. Computer vision

    NASA Technical Reports Server (NTRS)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.

  19. Computational crystallization.

    PubMed

    Altan, Irem; Charbonneau, Patrick; Snell, Edward H

    2016-07-15

    Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed.

  20. Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.; Astsatryan, H. V.

    2015-07-01

Present astronomical archives that contain billions of objects, both Galactic and extragalactic, and the vast amount of data on them allow new studies and discoveries. Astrophysical Virtual Observatories (VO) use available databases and current observing material as a collection of interoperating data archives and software tools to form a research environment in which complex research programs can be conducted. Most modern databases now give VO access to the stored information, which also makes possible fast analysis and management of these data. Cross-correlations reveal new objects and new samples. Very often tens of thousands of sources hide a few very interesting ones that can be discovered only by comparing various physical characteristics. VO is a prototype of Grid technologies that allows distributed data computation, analysis and imaging. Particularly important are data reduction and analysis systems: spectral analysis, SED building and fitting, modelling, variability studies, cross-correlations, etc. Computational astrophysics has become an indissoluble part of astronomy, and most modern research is done by means of it.

  1. Computational introspection

    SciTech Connect

    Batali, J.

    1983-02-01

Introspection is the process of thinking about one's own thoughts and feelings. In this paper, the author discusses recent attempts to make computational systems that exhibit introspective behavior. Each attempt presents a system capable of manipulating representations of its own program and current context. He argues that introspective ability is crucial for intelligent systems--without it an agent cannot represent certain problems that it must be able to solve. A theory of intelligent action would describe how and why certain actions intelligently achieve an agent's goals. The agent would both embody and represent this theory: the theory would be implemented as the agent's program, and the importance of introspection suggests that the agent also represent its theory of action to itself.
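A toy illustration of that requirement: an agent that both embodies its theory of action (it acts from a rule table) and represents it to itself (it can query the table before acting). The rule table and names below are invented, and real introspective architectures manipulate far richer representations than a dictionary:

```python
# Minimal "embody and represent" agent: the same rule table is both
# the program that drives action and a data structure the agent can
# inspect, e.g. to notice it has no rule for a percept before acting.

class Agent:
    def __init__(self):
        self.program = {"obstacle": "turn", "clear": "advance"}

    def act(self, percept):
        """Embodiment: the table is executed as the agent's program."""
        return self.program.get(percept, "halt")

    def introspect(self, percept):
        """Representation: ask whether a rule exists, and what it does."""
        return percept in self.program, self.act(percept)

agent = Agent()
print(agent.introspect("obstacle"))  # (True, 'turn')
print(agent.introspect("fog"))       # (False, 'halt')
```

The second query is the paper's point in miniature: without a representation of its own program, the agent could fall back to "halt" but could never recognize, let alone repair, the gap in its rules.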

  2. Computer vision

    SciTech Connect

    Not Available

    1982-01-01

    This paper discusses material from areas such as artificial intelligence, psychology, computer graphics, and image processing. The intent is to assemble a selection of this material in a form that will serve both as a senior/graduate-level academic text and as a useful reference to those building vision systems. This book has a strong artificial intelligence flavour, emphasising the belief that both the intrinsic image information and the internal model of the world are important in successful vision systems. The book is organised into four parts, based on descriptions of objects at four different levels of abstraction. These are: generalised images-images and image-like entities; segmented images-images organised into subimages that are likely to correspond to interesting objects; geometric structures-quantitative models of image and world structures; relational structures-complex symbolic descriptions of image and world structures. The book contains author and subject indexes.

  3. Advanced Computing for Medicine.

    ERIC Educational Resources Information Center

    Rennels, Glenn D.; Shortliffe, Edward H.

    1987-01-01

    Discusses contributions that computers and computer networks are making to the field of medicine. Emphasizes the computer's speed in storing and retrieving data. Suggests that doctors may soon be able to use computers to advise on diagnosis and treatment. (TW)

  4. Computer security in DOE distributed computing systems

    SciTech Connect

    Hunteman, W.J.

    1990-01-01

    The modernization of DOE facilities amid limited funding is creating pressure on DOE facilities to find innovative approaches to their daily activities. Distributed computing systems are becoming cost-effective solutions to improved productivity. This paper defines and describes typical distributed computing systems in the DOE. The special computer security problems present in distributed computing systems are identified and compared with traditional computer systems. The existing DOE computer security policy supports only basic networks and traditional computer systems and does not address distributed computing systems. A review of the existing policy requirements is followed by an analysis of the policy as it applies to distributed computing systems. Suggested changes in the DOE computer security policy are identified and discussed. The long lead time in updating DOE policy will require guidelines for applying the existing policy to distributed systems. Some possible interim approaches are identified and discussed. 2 refs.

  5. Computers and occupational therapy.

    PubMed

    English, C B

    1975-01-01

    The benefits and applications of computer science for occupational therapy are explored and a basic, functional description of the computer and computer programming is presented. Potential problems and advantages of computer utilization are compared and examples of existing computer systems in health fields are cited. Methods for successfully introducing computers are discussed.

  6. The assumptions of computing

    SciTech Connect

    Huggins, J.K.

    1994-12-31

    The use of computers, like any technological activity, is not content-neutral. Users of computers constantly interact with assumptions regarding worthwhile activity which are embedded in any computing system. Directly questioning these assumptions in the context of computing allows us to develop an understanding of responsible computing.

  7. Computers in Teaching English.

    ERIC Educational Resources Information Center

    Davis, James E., Ed.; Davis, Hazel K., Ed.

    1983-01-01

    The 26 articles in this journal issue discuss the use of the computer in the English classroom. Among the topics and applications discussed are (1) computer assisted invention, (2) word processing, (3) overcoming computer anxiety, (4) using computers in technical writing classes, (5) grading student essays by computer, (6) the experiences of an…

  8. The Old Computers' Home.

    ERIC Educational Resources Information Center

    Angier, Natalie

    1983-01-01

    The Computer Museum in Marlborough, Massachusetts houses old and not-so-old calculators, famous old computers and parts of computers, photographs and assorted memorabilia, computer-generated murals, and even a computer made of Tinkertoys that plays tick-tack-toe. The development of the museum and selected exhibits is described. (Author/JN)

  9. Democratizing Computer Science

    ERIC Educational Resources Information Center

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  10. Computers for Everybody.

    ERIC Educational Resources Information Center

    Willis, Jerry; Miller, Merl

    This book explains how computers can be used in the home, office or school, and provides a consumer's guide to computer equipment for the novice user. The first sections of the book offer a brief sketch of computer history, a listing of entertaining and easily available computer programs, a step-by-step guide to buying a computer, and advice on…

  11. GeoComputation 2009

    SciTech Connect

    Xue, Yong; Hoffman, Forrest M; Liu, Dingsheng

    2009-01-01

    The tremendous computing requirements of today's algorithms and the high costs of high-performance supercomputers drive us to share computing resources. The emerging computational Grid technologies are expected to make feasible the creation of a computational environment handling many PetaBytes of distributed data, tens of thousands of heterogeneous computing resources, and thousands of simultaneous users from multiple research institutions.

  12. Tying into Computers.

    ERIC Educational Resources Information Center

    Canipe, Stephen L.

    Topics in this paper include: sources of computer programs, public domain software, copyright violations, purposes of computers in classrooms (drill/practice and interactive learning), computer assisted instruction, flow charts, and computer clubs (such as App-le-kations in Charlotte, North Carolina). A complete listing of two computer programs…

  13. Computational thinking and thinking about computing.

    PubMed

    Wing, Jeannette M

    2008-10-28

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing.

  14. Computational thinking and thinking about computing

    PubMed Central

    Wing, Jeannette M.

    2008-01-01

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing. PMID:18672462

  15. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    EPA Science Inventory

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  16. PR Educators Stress Computers.

    ERIC Educational Resources Information Center

    Fleming, Charles A.

    1988-01-01

    Surveys the varied roles computers play in public relations education. Asserts that, because computers are used extensively in the public relations field, students should become acquainted with the varied capabilities of computers and their role in public relations practice. (MM)

  17. On Teaching Computer Programming.

    ERIC Educational Resources Information Center

    Er, M. C.

    1984-01-01

    Points out difficulties associated with teaching introductory computer programing courses, discussing the importance of computer programing and explains activities associated with its use. Possible solutions to help teachers resolve problem areas in computer instruction are also offered. (ML)

  18. Computers: Instruments of Change.

    ERIC Educational Resources Information Center

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  19. Selecting Appropriate Computing Tools.

    ERIC Educational Resources Information Center

    Tetlow, William L.

    1990-01-01

    Selecting computer tools requires analyzing information requirements and audiences, assessing existing institutional research and computing capacities, creating or improving a planning database, using computer experts, determining software needs, obtaining sufficient resources for independent operations, acquiring quality, and insisting on…

  20. Avoiding Computer Viruses.

    ERIC Educational Resources Information Center

    Rowe, Joyce; And Others

    1989-01-01

    The threat of computer sabotage is a real concern to business teachers and others responsible for academic computer facilities. Teachers can minimize the possibility. Eight suggestions for avoiding computer viruses are given. (JOW)

  1. Computer Viruses: An Overview.

    ERIC Educational Resources Information Center

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  2. Computing technology in the 1980's. [computers

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1978-01-01

    Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, there appear to be new advances possible, with the rate of introduction of improved devices at least equal to the historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.

  3. Symbolic and algebraic computation

    SciTech Connect

    Not Available

    1990-01-01

    This book contains papers under the following headings: Foundations of symbolic computation; Computational logic; Systems; Algorithms on polynomials; and Integral and differential equations.

  4. Computer Lab Configuration.

    ERIC Educational Resources Information Center

    Wodarz, Nan

    2003-01-01

    Describes the layout and elements of an effective school computer lab. Includes configuration, storage spaces, cabling and electrical requirements, lighting, furniture, and computer hardware and peripherals. (PKP)

  5. Computer hardware fault administration

    DOEpatents

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
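    The abstract above describes a two-step mechanism: identify a defective link in the first network, then route traffic between the affected nodes over the second, independent network. The following is a minimal illustrative sketch of that idea; the class, method names, and link-set graph model are assumptions for illustration, not the patent's actual interface.

    ```python
    # Illustrative sketch of dual-network fault routing: traffic normally uses
    # the primary network, but a link marked defective is bypassed via the
    # secondary network. All names here are hypothetical.

    class DualNetworkRouter:
        def __init__(self, primary_links, secondary_links):
            # Each network is modeled as a set of undirected links between nodes.
            self.primary = {frozenset(l) for l in primary_links}
            self.secondary = {frozenset(l) for l in secondary_links}
            self.defective = set()

        def mark_defective(self, a, b):
            """Record a defective link in the primary network."""
            self.defective.add(frozenset((a, b)))

        def route(self, a, b):
            """Choose which network carries traffic between adjacent nodes a and b."""
            link = frozenset((a, b))
            if link in self.primary and link not in self.defective:
                return "primary"
            if link in self.secondary:
                return "secondary"
            raise ValueError("no direct link between nodes")

    router = DualNetworkRouter(
        primary_links=[(0, 1), (1, 2)],
        secondary_links=[(0, 1), (1, 2)],
    )
    router.mark_defective(0, 1)
    print(router.route(0, 1))  # secondary: routed around the defective link
    print(router.route(1, 2))  # primary: healthy link, normal routing
    ```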

  6. Computational aerodynamics and supercomputers

    NASA Technical Reports Server (NTRS)

    Ballhaus, W. F., Jr.

    1984-01-01

    Some of the progress in computational aerodynamics over the last decade is reviewed. The Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans are described.

  7. Computational physics with PetaFlops computers

    NASA Astrophysics Data System (ADS)

    Attig, Norbert

    2009-04-01

    Driven by technology, Scientific Computing is rapidly entering the PetaFlops era. The Jülich Supercomputing Centre (JSC), one of three German national supercomputing centres, is focusing on the IBM Blue Gene architecture to provide computer resources of this class to its users, the majority of whom are computational physicists. Details of the system will be discussed and applications will be described which significantly benefit from this new architecture.

  8. Computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An overview of computational fluid dynamics (CFD) activities at the Langley Research Center is given. The role of supercomputers in CFD research, algorithm development, multigrid approaches to computational fluid flows, aerodynamics computer programs, computational grid generation, turbulence research, and studies of rarefied gas flows are among the topics that are briefly surveyed.

  9. Computer-Based Learning.

    ERIC Educational Resources Information Center

    Brown, Peggy, Ed.

    1981-01-01

    Three essays on the ways in which colleges and universities use the computer as a teaching tool are presented, along with descriptions of 10 school programs that reflect the diversity of computer applications across the United States. In "A Place for Computing in Liberal Education," Karl L. Zinn likens the computer to personal resource tools, such…

  10. Computers and Employment.

    ERIC Educational Resources Information Center

    McConnell, Sheila; And Others

    1996-01-01

    Includes "Role of Computers in Reshaping the Work Force" (McConnell); "Semiconductors" (Moris); "Computer Manufacturing" (Warnke); "Commercial Banking Transformed by Computer Technology" (Morisi); "Software, Engineering Industries: Threatened by Technological Change?" (Goodman); "Job Creation and the Emerging Home Computer Market" (Freeman); and…

  11. Computer Viruses. Technology Update.

    ERIC Educational Resources Information Center

    Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.

    This document provides general information on computer viruses, how to help protect a computer network from them, and measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…

  12. My Computer Romance

    ERIC Educational Resources Information Center

    Campbell, Gardner

    2007-01-01

    In this article, the author relates the big role of computers in his life as a writer. The author narrates that he has been using a computer for nearly twenty years now. He relates that computers have set his writing free. When he started writing, he was just using an electric typewriter. He also relates that his romance with computers is also a…

  13. Overview 1993: Computational applications

    NASA Technical Reports Server (NTRS)

    Benek, John A.

    1993-01-01

    Computational applications include projects that apply or develop computationally intensive computer programs. Such programs typically require supercomputers to obtain solutions in a timely fashion. This report describes two CSTAR projects involving Computational Fluid Dynamics (CFD) technology. The first, the Parallel Processing Initiative, is a joint development effort and the second, the Chimera Technology Development, is a transfer of government developed technology to American industry.

  14. Computer Innovations in Education.

    ERIC Educational Resources Information Center

    Molnar, Andrew R.

    Computers in education are put in context by a brief review of current social and technological trends, a short history of the development of computers and the vast expansion of their use, and a brief description of computers and their use. Further chapters describe instructional applications, administrative uses, uses of computers for libraries…

  15. Elementary School Computer Literacy.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY.

    This curriculum guide presents lessons for computer literacy instruction in the elementary grades. The first section of the guide includes 22 lessons on hardware, covering such topics as how computers work, keyboarding, word processing, and computer peripherals. The 13 lessons in the second section cover social topics related to the computer,…

  16. The Computer Manpower Evolution

    ERIC Educational Resources Information Center

    Rooney, Joseph J.

    1975-01-01

    Advances and employment outlook in the field of computer science are discussed as well as the problems related to improving the quality of computer education. Specific computer jobs discussed include: data processing machine repairers, systems analysts, programmers, computer and peripheral equipment operators, and keypunch operators. (EA)

  17. The Glass Computer

    ERIC Educational Resources Information Center

    Paesler, M. A.

    2009-01-01

    Digital computers use different kinds of memory, each of which is either volatile or nonvolatile. On most computers only the hard drive memory is nonvolatile, i.e., it retains all information stored on it when the power is off. When a computer is turned on, an operating system stored on the hard drive is loaded into the computer's memory cache and…

  18. How Computer Graphics Work.

    ERIC Educational Resources Information Center

    Prosise, Jeff

    This document presents the principles behind modern computer graphics without straying into the arcane languages of mathematics and computer science. Illustrations accompany the clear, step-by-step explanations that describe how computers draw pictures. The 22 chapters of the book are organized into 5 sections. "Part 1: Computer Graphics in…

  19. Computers and Conceptual Change.

    ERIC Educational Resources Information Center

    Olson, John

    A systematic study was conducted with a group of 10- to 12-year-olds using computer assisted instruction in a unit on fire which included a computer simulation of combustion. Three research questions were addressed to learn more about how the computer experience challenged the students' preconceptions: what the students thought the computer knew,…

  20. French Computer Terminology.

    ERIC Educational Resources Information Center

    Gray, Eugene F.

    1985-01-01

    Characteristics, idiosyncrasies, borrowings, and other aspects of the French terminology for computers and computer-related matters are discussed and placed in the context of French computer use. A glossary provides French equivalent terms or translations of English computer terminology. (MSE)

  1. The Story of Computers.

    ERIC Educational Resources Information Center

    Meadow, Charles T.

    The aim of this book is to interest young people from the ages of ten to fourteen in computers, particularly to show them that computers are exciting machines, controlled by people, and to dispel myths that computers can do magic. This is not a detailed exposition of computers, nor is it a textbook. It is an attempt to impart flavor and general…

  2. Parallel computing works

    SciTech Connect

    Not Available

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five year project that focused on answering the question: "Can parallel computers be used to do large-scale scientific computations?" As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  3. (Computer vision and robotics)

    SciTech Connect

    Jones, J.P.

    1989-02-13

    The traveler attended the Fourth Aalborg International Symposium on Computer Vision at Aalborg University, Aalborg, Denmark. The traveler presented three invited lectures entitled "Concurrent Computer Vision on a Hypercube Multicomputer," "The Butterfly Accumulator and its Application in Concurrent Computer Vision on Hypercube Multicomputers," and "Concurrency in Mobile Robotics at ORNL," and a ten-minute editorial entitled "Is Concurrency an Issue in Computer Vision?" The traveler obtained information on current R&D efforts elsewhere in concurrent computer vision.

  4. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    SciTech Connect

    2007-06-27

    This CRADA was established at the start of FY02 with $200 K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. 'No cost' extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  5. Future Computer Requirements for Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  6. Computers and Computation. Readings from Scientific American.

    ERIC Educational Resources Information Center

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is much easier…

  7. Understanding student computational thinking with computational modeling

    NASA Astrophysics Data System (ADS)

    Aiken, John M.; Caballero, Marcos D.; Douglas, Scott S.; Burk, John B.; Scanlon, Erin M.; Thoms, Brian D.; Schatz, Michael F.

    2013-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". 9th Grade students taking a physics course that employed the Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than observational) terms tended to have more success in the programming exercise.
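    The assignment described above asks students to build a computational model of a baseball in motion, where the force determines the change in velocity at each time step. A minimal plain-Python sketch of such a model is shown below; VPython is not used here, and the time step and initial conditions are illustrative assumptions.

    ```python
    # Minimal Euler-step model of a baseball in free flight, in the spirit of
    # the VPython assignment described above. Only gravity acts, so the force
    # changes vy each step while vx stays constant (the causal force-motion
    # view the study discusses). Initial conditions are illustrative.

    g = -9.8      # gravitational acceleration, m/s^2
    dt = 0.01     # time step, s

    x, y = 0.0, 1.0       # initial position, m
    vx, vy = 30.0, 10.0   # initial velocity, m/s

    while y > 0.0:
        vy += g * dt      # force updates velocity...
        x += vx * dt      # ...and velocity updates position
        y += vy * dt

    print(f"range ~ {x:.1f} m")
    ```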

  8. Heterotic computing: exploiting hybrid computational devices.

    PubMed

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications.

  9. Cloud Computing for radiologists.

    PubMed

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.

  10. Polymorphous computing fabric

    DOEpatents

    Wolinski, Christophe Czeslaw; Gokhale, Maya B.; McCabe, Kevin Peter

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  11. Computational aerodynamics and design

    NASA Technical Reports Server (NTRS)

    Ballhaus, W. F., Jr.

    1982-01-01

    The role of computational aerodynamics in design is reviewed with attention given to the design process; the proper role of computations; the importance of calibration, interpretation, and verification; the usefulness of a given computational capability; and the marketing of new codes. Examples of computational aerodynamics in design are given with particular emphasis on the Highly Maneuverable Aircraft Technology. Finally, future prospects are noted, with consideration given to the role of advanced computers, advances in numerical solution techniques, turbulence models, complex geometries, and computational design procedures. Previously announced in STAR as N82-33348

  12. Scientific Grid computing.

    PubMed

    Coveney, Peter V

    2005-08-15

    We introduce a definition of Grid computing which is adhered to throughout this Theme Issue. We compare the evolution of the World Wide Web with current aspirations for Grid computing and indicate areas that need further research and development before a generally usable Grid infrastructure becomes available. We discuss work that has been done in order to make scientific Grid computing a viable proposition, including the building of Grids, middleware developments, computational steering and visualization. We review science that has been enabled by contemporary computational Grids, and associated progress made through the widening availability of high performance computing.

  13. Structural mechanics computations on parallel computing platforms

    SciTech Connect

    Kulak, R.F.; Plaskacz, E.J.; Pfeiffer, P.A.

    1995-06-01

    With recent advances in parallel supercomputers and network-connected workstations, the solution to large scale structural engineering problems has now become tractable. High-performance computer architectures, which are usually available at large universities and national laboratories, now can solve large nonlinear problems. At the other end of the spectrum, network connected workstations can be configured to become a distributed-parallel computer. This approach is attractive to small, medium and large engineering firms. This paper describes the development of a parallelized finite element computer program for the solution of static, nonlinear structural mechanics problems.

  14. Cognitive Computing for Security.

    SciTech Connect

    Debenedictis, Erik; Rothganger, Fredrick; Aimone, James Bradley; Marinella, Matthew; Evans, Brian Robert; Warrender, Christina E.; Mickel, Patrick

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  15. Computer Intrusions and Attacks.

    ERIC Educational Resources Information Center

    Falk, Howard

    1999-01-01

    Examines some frequently encountered unsolicited computer intrusions, including computer viruses, worms, Java applications, trojan horses or vandals, e-mail spamming, hoaxes, and cookies. Also discusses virus-protection software, both for networks and for individual users. (LRW)

  16. Computers in Manufacturing.

    ERIC Educational Resources Information Center

    Hudson, C. A.

    1982-01-01

    Advances in factory computerization (computer-aided design and computer-aided manufacturing) are reviewed, including discussions of robotics, human factors engineering, and the sociological impact of automation. (JN)

  17. ICASE Computer Science Program

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  18. Computational Toxicology (S)

    EPA Science Inventory

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. Th...

  19. Computers in Schools.

    ERIC Educational Resources Information Center

    Moore, John W.; Moore, Elizabeth A.

    1988-01-01

    Surveys the types of computers being used in high school and university chemistry courses. Identifies types of hardware found on Apple and MS-DOS computers. Makes recommendations for the upgrading of current equipment. (ML)

  20. The Interpersonal Computer.

    ERIC Educational Resources Information Center

    Neal, John S.

    1994-01-01

    Provides helpful tips regarding classroom use of computers in a cooperative-learning environment. Comments on student computer ratio, interdependence, individual accountability, heterogeneous grouping, shared responsibility, and good working relationships. (ZWH)

  1. Space Spurred Computer Graphics

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Dicomed Corporation was asked by NASA in the early 1970s to develop processing capabilities for recording images sent from Mars by Viking spacecraft. The company produced a film recorder which increased the intensity levels and the capability for color recording. This development led to a strong technology base resulting in sophisticated computer graphics equipment. Dicomed systems are used to record CAD (computer aided design) and CAM (computer aided manufacturing) equipment, to update maps and produce computer generated animation.

  2. Introduction to Quantum Computation

    NASA Astrophysics Data System (ADS)

    Ekert, A.

    A computation is a physical process. It may be performed by a piece of electronics or on an abacus, or in your brain, but it is a process that takes place in nature and as such it is subject to the laws of physics. Quantum computers are machines that rely on characteristically quantum phenomena, such as quantum interference and quantum entanglement, in order to perform computation. In this series of lectures I want to elaborate on the computational power of such machines.
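    Quantum interference, one of the phenomena the lectures build on, can be illustrated with a tiny state-vector calculation: applying a Hadamard gate twice to |0⟩ returns |0⟩ because the amplitudes of the two computational paths interfere. The pure-Python sketch below is an illustrative toy, not a full simulator.

    ```python
    # Toy single-qubit state vector [a, b] = a|0> + b|1>, showing interference:
    # H|0> gives an equal superposition, and a second H brings the amplitudes
    # back to |0> by constructive/destructive interference.

    import math

    def hadamard(state):
        """Apply H: [a, b] -> [(a + b)/sqrt(2), (a - b)/sqrt(2)]."""
        a, b = state
        s = 1 / math.sqrt(2)
        return [s * (a + b), s * (a - b)]

    state = [1.0, 0.0]          # |0>
    state = hadamard(state)     # equal superposition of |0> and |1>
    state = hadamard(state)     # paths interfere: back to |0>
    probs = [abs(a) ** 2 for a in state]
    print(probs)  # ~[1.0, 0.0]
    ```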

  3. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  4. Quantum computing and probability.

    PubMed

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  5. Nanoelectronics: Metrology and Computation

    SciTech Connect

    Lundstrom, Mark; Clark, Jason V.; Klimeck, Gerhard; Raman, Arvind

    2007-09-26

    Research in nanoelectronics poses new challenges for metrology, but advances in theory, simulation and computing and networking technology provide new opportunities to couple simulation and metrology. This paper begins with a brief overview of current work in computational nanoelectronics. Three examples of how computation can assist metrology will then be discussed. The paper concludes with a discussion of how cyberinfrastructure can help connect computing and metrology using the nanoHUB (www.nanoHUB.org) as a specific example.

  6. Expanding Computer Service with Personal Computers.

    ERIC Educational Resources Information Center

    Bomzer, Herbert

    1983-01-01

    A planning technique, the mission justification document, and the evaluation procedures developed at Central Michigan University to ensure the orderly growth of computer-dependent resources within the constraints of tight budgets are described. (Author/MLW)

  7. Computer Literacy. Focus 11.

    ERIC Educational Resources Information Center

    Benderson, Albert

    One of a series on the responses by the Educational Testing Service (ETS) and others to critical problems in education, this overview addresses a variety of issues related to computer literacy. Topics discussed include the pace of the transition to the computer age, the development of microprocessors, and the computer as fad or revolution.…

  8. BNL ATLAS Grid Computing

    ScienceCinema

    Michael Ernst

    2016-07-12

    As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

  9. Computers in Scientific Instrumentation.

    ERIC Educational Resources Information Center

    Enke, C. G.

    1982-01-01

    Computer applications in scientific instrumentation are traced from early data processing to modern computer-based instruments. Probable pathways toward instruments with increased "intelligence" include, among others, implementation of hierarchical computer networks and microprocessor controllers and the simplification of programming. The…

  10. Dietary Interviewing by Computer.

    ERIC Educational Resources Information Center

    Slack, Warner V.; And Others

    1976-01-01

    A computer based dietary interviewing program enhanced self awareness for overweight participants. In a three part interview designed for direct interaction between patient and computer, questions dealt with general dietary behavior and details of food intake. The computer assisted the patient in planning a weight reducing diet of approximately…

  11. Coping with Computing Success.

    ERIC Educational Resources Information Center

    Breslin, Richard D.

    Elements of computing success of Iona College, the challenges it currently faces, and the strategies conceived to cope with future computing needs are discussed. The college has mandated computer literacy for students and offers nine degrees in the computerized information system/management information system areas. Since planning is needed in…

  12. The fifth generation computer

    SciTech Connect

    Moto-Oka, T.; Kitsuregawa, M.

    1985-01-01

    The leader of Japan's Fifth Generation computer project, known as the 'Apollo' project, and a young computer scientist elucidate in this book how the idea came about, international reactions, the basic technology, prospects for realization, and the capabilities of the Fifth Generation computer. Topics considered include forecasting, research programs, planning, and technology impacts.

  13. The Computer Goes Home.

    ERIC Educational Resources Information Center

    Cirone, Bill

    2001-01-01

    A partnership between local community, business, and education leaders and the Santa Barbara County Education Office spawned Computers for Families--a program putting computers in needy families' homes. Adjudicated youth at a residential boys' camp gain vocational skills when refurbishing donated computers for families' use. (MLH)

  14. Writing, Thinking and Computers.

    ERIC Educational Resources Information Center

    Hartley, James

    1993-01-01

    Reviews the potential of word processors for changing the ways in which students process written text and think about writing. Three levels of computer-aided writing are considered: simple word processors, computer-aided writing programs, and higher-level computer-aided processing. Improvements in writing quality are also discussed. (41 references) (LRW)

  15. Mathematics Unipac. Computers.

    ERIC Educational Resources Information Center

    Parma City School District, OH.

    Five activities are presented in this student workbook designed for exploration of a career in computers, and the mathematics related to that career. Included are activities on basic computer language, flow charts, and basic computer programming. Each activity includes the objective, materials needed, and information for completing the activity.…

  16. Personal Computers on Campus.

    ERIC Educational Resources Information Center

    Waldrop, M. Mitchell

    1985-01-01

    Examines issues involving the use of on-line databases, magnetic and optical data storage, digital telecommunications, and microcomputers on college campuses. These issues include access to computers and computer networking, and educational uses of the computers. Examples of efforts at four universities are included. (JN)

  17. Understanding Computer Terms.

    ERIC Educational Resources Information Center

    Lilly, Edward R.

    Designed to assist teachers and administrators approaching the subject of computers for the first time to acquire a feel for computer terminology, this document presents a computer term glossary on three levels. (1) The terms most frequently used, called a "basic vocabulary," are presented first in three paragraphs which explain their meanings:…

  18. Computer Applications for Children.

    ERIC Educational Resources Information Center

    Dulsky, Dwight; And Others

    1993-01-01

    Four articles discuss computer-assisted instruction, including (1) a middle school art and computer departments project that used LOGO to create rose window designs; (2) student journals; (3) the application of Piagetian constructivism and Vygotskian social interaction to LOGO learning; and (4) computer lab writing workshops for elementary school…

  19. BBC Computer Literacy Project.

    ERIC Educational Resources Information Center

    Salkeld, Bob

    1982-01-01

    Describes the development and purpose of The Computer Programme, a 10-part television series produced by the BBC's Continuing Education Television Department and designed to provide a beginner's guide to the world of computers and computing. The history of the project is reviewed and the creation of the BBC microcomputer is recounted. (Author/JL)

  20. Education for Computers

    ERIC Educational Resources Information Center

    Heslep, Robert D.

    2012-01-01

    The computer engineers who refer to the education of computers do not have a definite idea of education and do not bother to justify the fuzzy ones to which they allude. Hence, they logically cannot specify the features a computer must have in order to be educable. This paper puts forth a non-standard, but not arbitrary, concept of education that…

  1. BNL ATLAS Grid Computing

    SciTech Connect

    Michael Ernst

    2008-10-02

    As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

  2. The Computer Delusion.

    ERIC Educational Resources Information Center

    Oppenheimer, Todd

    1997-01-01

    Challenges research and prevailing attitudes that maintain that computers improve teaching and academic achievement. Criticizes and questions research methodology, computer literacy education, the need for computer skills to make a competitive workforce, support from the business community resulting from technology programs, and Internet use. (LRW)

  3. Computational Thinking Patterns

    ERIC Educational Resources Information Center

    Ioannidou, Andri; Bennett, Vicki; Repenning, Alexander; Koh, Kyu Han; Basawapatna, Ashok

    2011-01-01

    The iDREAMS project aims to reinvent Computer Science education in K-12 schools, by using game design and computational science for motivating and educating students through an approach we call Scalable Game Design, starting at the middle school level. In this paper we discuss the use of Computational Thinking Patterns as the basis for our…

  4. Teaching with Personal Computers.

    ERIC Educational Resources Information Center

    Stonebraker, Peter W.; Coye, Ray W.

    1988-01-01

    A discussion of computer integration in the college curriculum describes its impact on courses and classroom interaction, looks at levels of computer integration, and examines strategies for developing such a course. It is proposed that, although computer integration is inevitable, it will cause a turbulent transition period for classroom…

  5. Getting To Know Computers.

    ERIC Educational Resources Information Center

    Lundgren, Mary Beth

    Originally written for adult new readers involved in literacy programs, this book is also helpful to those individuals who want a basic book about computers. It uses the carefully controlled vocabulary with which adult new readers are familiar. Chapter 1 addresses the widespread use of computers. Chapter 2 discusses what a computer is and…

  6. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1990-01-01

    Four applications of microcomputers in the chemical laboratory are presented. Included are "Mass Spectrometer Interface with an Apple II Computer,""Interfacing the Spectronic 20 to a Computer,""A pH-Monitoring and Control System for Teaching Laboratories," and "A Computer-Aided Optical Melting Point Device." Software, instrumentation, and uses are…

  7. Computed Tomography (CT) - Spine

    MedlinePlus

    Computed tomography (CT) of the spine is a diagnostic imaging ... What is CT Scanning of the Spine? Computed tomography, more commonly known as a CT or CAT ...

  8. Computing environment logbook

    DOEpatents

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
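
    The mechanism described, a searchable history of events where selected events can be undone, can be sketched in a few lines. All names below are illustrative, not taken from the patent:

```python
# Illustrative sketch of the logbook mechanism: every logged event carries
# an undo action, the history is searchable, and a selected past event can
# be undone.  Names are hypothetical, not from the patent.
class Logbook:
    def __init__(self):
        self.events = []  # history of (description, undo_callback) pairs

    def log(self, description, undo):
        """Record an event together with the action that reverses it."""
        self.events.append((description, undo))

    def search(self, text):
        """Return indices of past events whose description contains text."""
        return [i for i, (d, _) in enumerate(self.events) if text in d]

    def undo(self, index):
        """Undo one selected past event and drop it from the history."""
        description, undo_action = self.events.pop(index)
        undo_action()
        return description

# Usage: log a change to some state, then find and undo it.
state = {"x": 1}
book = Logbook()
book.log("set x=2", lambda old=state["x"]: state.update(x=old))
state["x"] = 2
book.undo(book.search("set x")[0])
assert state["x"] == 1
```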

  9. Quantum walk computation

    SciTech Connect

    Kendon, Viv

    2014-12-04

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms, to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer.
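
    For readers who want to experiment before such hardware exists, a small coined quantum walk on the line is easy to simulate classically. A minimal pure-Python sketch (illustrative only, not code from the cited work):

```python
# Discrete-time (coined) quantum walk on the line with a Hadamard coin,
# simulated with a dictionary of complex amplitudes.
from math import sqrt

def walk_step(amp):
    """One Hadamard-coin toss followed by the coin-conditioned shift."""
    s = 1 / sqrt(2)
    new = {}
    for (pos, coin), a in amp.items():
        # Hadamard coin: |0> -> (|0>+|1>)/sqrt(2), |1> -> (|0>-|1>)/sqrt(2)
        for c2, a2 in ((0, s * a), (1, s * a if coin == 0 else -s * a)):
            npos = pos - 1 if c2 == 0 else pos + 1  # shift left/right
            new[npos, c2] = new.get((npos, c2), 0) + a2
    return new

# Symmetric initial state at the origin.
amp = {(0, 0): 1 / sqrt(2), (0, 1): 1j / sqrt(2)}
for _ in range(20):
    amp = walk_step(amp)

# Position distribution; unitarity means probabilities still sum to 1.
prob = {}
for (pos, _), a in amp.items():
    prob[pos] = prob.get(pos, 0.0) + abs(a) ** 2
assert abs(sum(prob.values()) - 1) < 1e-9
```

    Unlike a classical random walk, the resulting distribution spreads ballistically and is strongly peaked away from the origin, which is the behaviour that walk-based algorithms exploit.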

  10. On Understanding Computers.

    ERIC Educational Resources Information Center

    Olds, Henry F., Jr.; And Others

    1983-01-01

    Three articles discuss the use of computers in education: (1) "References for a Broader Vision" (Henry F. Olds, Jr.); (2) "What Every Teacher Should Know About Computer Simulations" (David Grady); and (3) "The Computer as Palette and Model Builder" (Interview of Alan Kay). (CJ)

  11. The Challenge of Computers.

    ERIC Educational Resources Information Center

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  12. Computer Center: CIBE Systems.

    ERIC Educational Resources Information Center

    Crovello, Theodore J.

    1982-01-01

    Differentiates between computer systems and Computers in Biological Education (CIBE) systems (computer system intended for use in biological education). Describes several CIBE stand alone systems: single-user microcomputer; single-user microcomputer/video-disc; multiuser microcomputers; multiuser maxicomputer; and local and long distance computer…

  13. Advanced Computing for Science.

    ERIC Educational Resources Information Center

    Hut, Piet; Sussman, Gerald Jay

    1987-01-01

    Discusses some of the contributions that high-speed computing is making to the study of science. Emphasizes the use of computers in exploring complicated systems without the simplification required in traditional methods of observation and experimentation. Provides examples of computer assisted investigations in astronomy and physics. (TW)

  14. Computers in Engineering Teaching.

    ERIC Educational Resources Information Center

    Rushby, N. J.

    This bibliography cites 26 books, papers, and reports dealing with various uses of computers in engineering education; and describes several computer programs available for use in teaching aeronautical, chemical, civil, electrical and electronic, mechanical, and nuclear engineering. Each computer program entry is presented by name, author,…

  15. Computer animation challenges for computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Vines, Mauricio; Lee, Won-Sook; Mavriplis, Catherine

    2012-07-01

    Computer animation requirements differ from those of traditional computational fluid dynamics (CFD) investigations in that visual plausibility and rapid frame update rates trump physical accuracy. We present an overview of the main techniques for fluid simulation in computer animation, starting with Eulerian grid approaches, the Lattice Boltzmann method, Fourier transform techniques and Lagrangian particle introduction. Adaptive grid methods, precomputation of results for model reduction, parallelisation and computation on graphical processing units (GPUs) are reviewed in the context of accelerating simulation computations for animation. A survey of current specific approaches for the application of these techniques to the simulation of smoke, fire, water, bubbles, mixing, phase change and solid-fluid coupling is also included. Adding plausibility to results through particle introduction, turbulence detail and concentration on regions of interest by level set techniques has elevated the degree of accuracy and realism of recent animations. Basic approaches are described here. Techniques to control the simulation to produce a desired visual effect are also discussed. Finally, some references to rendering techniques and haptic applications are mentioned to provide the reader with a complete picture of the challenges of simulating fluids in computer animation.

  16. The science of computing - Parallel computation

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1985-01-01

    Although parallel computation architectures have been known for computers since the 1920s, it was only in the 1970s that microelectronic components technologies advanced to the point where it became feasible to incorporate multiple processors in one machine. Concomitantly, the development of algorithms for parallel processing also lagged due to hardware limitations. The speed of computing with solid-state chips is limited by gate switching delays. The physical limit implies that a 1 Gflop operational speed is the maximum for sequential processors. A computer recently introduced features a 'hypercube' architecture with 128 processors connected in networks at 5, 6 or 7 points per grid, depending on the design choice. Its computing speed rivals that of supercomputers, but at a fraction of the cost. The added speed with less hardware is due to parallel processing, which utilizes algorithms representing different parts of an equation that can be broken into simpler statements and processed simultaneously. Present, highly developed computer languages like FORTRAN, PASCAL, COBOL, etc., rely on sequential instructions. Thus, increased emphasis will now be directed at parallel processing algorithms to exploit the new architectures.
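
    The decomposition described, breaking one computation into simpler parts processed simultaneously, can be sketched with a modern process pool. This is a hedged illustration in today's idiom; the hypercube machine in the abstract used message passing between processors rather than a shared pool:

```python
# Split one large computation (a sum of squares) into independent chunks,
# evaluate the chunks simultaneously in worker processes, then combine.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    # Break [0, n) into one chunk per worker; each chunk is an independent
    # "simpler statement" that can be processed in parallel.
    step = (n + workers - 1) // workers
    chunks = [(i, min(i + step, n)) for i in range(0, n, step)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    assert parallel_sum_of_squares(1000) == sum(i * i for i in range(1000))
```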

  17. Neural Computation and the Computational Theory of Cognition

    ERIC Educational Resources Information Center

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  18. An artificial muscle computer

    NASA Astrophysics Data System (ADS)

    O'Brien, Benjamin Marc; Anderson, Iain Alexander

    2013-03-01

    We have built an artificial muscle computer based on Wolfram's "2, 3" Turing machine architecture, the simplest known universal Turing machine. Our computer uses artificial muscles for its instruction set, output buffers, and memory write and addressing mechanisms. The computer is very slow and large (0.15 Hz, ~1 m³); however, by using only 13 artificial muscle relays, it is capable of solving any computable problem given sufficient memory, time, and reliability. The development of this computer shows that artificial muscles can think—paving the way for soft robots with reflexes like those seen in nature.
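
    Wolfram's actual "2, 3" rule table is not reproduced here, but the mechanism the relays implement, a tape, a head, and a small state-transition table, can be sketched with a generic simulator driven by a toy rule set:

```python
# Generic Turing-machine simulator in the spirit of the "2, 3" machine.
# The real Wolfram 2,3 rule table is not reproduced; the sample below is
# a toy 2-state, 2-symbol machine used purely as an illustration.
def run_tm(rules, state, steps):
    """rules maps (state, symbol) -> (new_state, write_symbol, move)."""
    tape, head = {}, 0          # blank cells read as symbol 0
    for _ in range(steps):
        sym = tape.get(head, 0)
        if (state, sym) not in rules:
            break               # no matching rule: the machine halts
        state, write, move = rules[(state, sym)]
        tape[head] = write
        head += move            # move is +1 (right) or -1 (left)
    return state, tape

# Toy machine: alternate between two states, writing 1s while moving right.
toy_rules = {("A", 0): ("B", 1, +1), ("B", 0): ("A", 1, +1)}
final_state, final_tape = run_tm(toy_rules, "A", 4)
assert final_state == "A"
assert [final_tape[i] for i in range(4)] == [1, 1, 1, 1]
```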

  19. Scalable optical quantum computer

    SciTech Connect

    Manykin, E A; Mel'nichenko, E V

    2014-12-31

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  20. Computer algebra and operators

    NASA Technical Reports Server (NTRS)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.
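
    A key point when manipulating "the algebra generated by operators" is that factors do not commute, so (A + B)² keeps AB and BA as distinct terms. A tiny self-contained illustration, not tied to any particular computer algebra system:

```python
# Words over the operator symbols stand for ordered products, so the
# expansion of a power lists every ordered word exactly once.
from itertools import product

def expand_power(terms, n):
    """Expand (t1 + t2 + ...)**n without assuming the ti commute."""
    return ["".join(word) for word in product(terms, repeat=n)]

# (A + B)**2 has four terms; AB and BA would merge only if A, B commuted.
assert expand_power(["A", "B"], 2) == ["AA", "AB", "BA", "BB"]
```

    A normal-form step for a specific algebra would then rewrite each word, for example using a commutation relation, which is the third capability the abstract lists.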

  1. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  2. Richard Feynman and computation

    NASA Astrophysics Data System (ADS)

    Hey, Tony

    1999-04-01

    The enormous contribution of Richard Feynman to modern physics is well known, both to teaching through his famous Feynman Lectures on Physics, and to research with his Feynman diagram approach to quantum field theory and his path integral formulation of quantum mechanics. Less well known perhaps is his long-standing interest in the physics of computation and this is the subject of this paper. Feynman lectured on computation at Caltech for most of the last decade of his life, first with John Hopfield and Carver Mead, and then with Gerry Sussman. The story of how these lectures came to be written up as the Feynman Lectures on Computation is briefly recounted. Feynman also discussed the fundamentals of computation with other legendary figures of the computer science and physics community such as Ed Fredkin, Rolf Landauer, Carver Mead, Marvin Minsky and John Wheeler. He was also instrumental in stimulating developments in both nanotechnology and quantum computing. During the 1980s Feynman re-visited long-standing interests both in parallel computing with Geoffrey Fox and Danny Hillis, and in reversible computation and quantum computing with Charles Bennett, Norman Margolus, Tom Toffoli and Wojciech Zurek. This paper records Feynman's links with the computational community and includes some reminiscences about his involvement with the fundamentals of computing.

  3. ALMA correlator computer systems

    NASA Astrophysics Data System (ADS)

    Pisano, Jim; Amestica, Rodrigo; Perez, Jesus

    2004-09-01

    We present a design for the computer systems which control, configure, and monitor the Atacama Large Millimeter Array (ALMA) correlator and process its output. Two distinct computer systems implement this functionality: a rack-mounted PC controls and monitors the correlator, and a cluster of 17 PCs process the correlator output into raw spectral results. The correlator computer systems interface to other ALMA computers via gigabit Ethernet networks utilizing CORBA and raw socket connections. ALMA Common Software provides the software infrastructure for this distributed computer environment. The control computer interfaces to the correlator via multiple CAN busses and the data processing computer cluster interfaces to the correlator via sixteen dedicated high speed data ports. An independent array-wide hardware timing bus connects to the computer systems and the correlator hardware ensuring synchronous behavior and imposing hard deadlines on the control and data processor computers. An aggregate correlator output of 1 gigabyte per second with 16 millisecond periods and computational data rates of approximately 1 billion floating point operations per second define other hard deadlines for the data processing computer cluster.
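
    The quoted figures imply a concrete per-period budget: at 1 GB/s in 16 ms periods, the cluster must consume 16 MB of data (and perform roughly 16 million floating point operations) before each deadline. A quick arithmetic check:

```python
# Back-of-envelope check on the real-time figures quoted in the abstract.
rate_bytes_per_s = 1_000_000_000        # 1 gigabyte per second output
flops_per_s = 1_000_000_000             # ~1 Gflop/s for the cluster
period_ms = 16                          # hard real-time period

bytes_per_period = rate_bytes_per_s * period_ms // 1000
flops_per_period = flops_per_s * period_ms // 1000

# Each 16 ms deadline covers 16 MB of data and ~16 Mflop of work.
assert bytes_per_period == 16_000_000
assert flops_per_period == 16_000_000
```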

  4. Multidisciplinary computational aerosciences

    NASA Technical Reports Server (NTRS)

    Kutler, Paul

    1992-01-01

    As the challenges of single disciplinary computational physics are met, such as computational fluid dynamics, computational structural mechanics, computational propulsion, computational aeroacoustics, computational electromagnetics, etc., scientists have begun investigating the combination of these single disciplines into what is being called multidisciplinary computational aerosciences (MCAS). The combination of several disciplines not only offers simulation realism but also formidable computational challenges. The solution of such problems will require computers orders of magnitude larger than those currently available. Such computer power can only be supplied by massively parallel machines because of the current speed-of-light limitation of conventional serial systems. Even with such machines, MCAS problems will require hundreds of hours for their solution. To efficiently utilize such a machine, research is required in three areas that include parallel architectures, systems software, and applications software. The main emphasis of this paper is the applications software element. Examples that demonstrate application software for multidisciplinary problems currently being solved at NASA Ames Research Center are presented. Pacing items for MCAS are discussed such as solution methodology, physical modeling, computer power, and multidisciplinary validation experiments.

  5. Computer-assisted psychotherapy

    PubMed Central

    Wright, Jesse H.; Wright, Andrew S.

    1997-01-01

    The rationale for using computers in psychotherapy includes the possibility that therapeutic software could improve the efficiency of treatment and provide access for greater numbers of patients. Computers have not been able to reliably duplicate the type of dialogue typically used in clinician-administered therapy. However, computers have significant strengths that can be used to advantage in designing treatment programs. Software developed for computer-assisted therapy generally has been well accepted by patients. Outcome studies have usually demonstrated treatment effectiveness for this form of therapy. Future development of computer tools may be influenced by changes in health care financing and rapid growth of new technologies. An integrated care delivery model incorporating the unique attributes of both clinicians and computers should be adopted for computer-assisted therapy. PMID:9292446

  6. Hyperswitch Communication Network Computer

    NASA Technical Reports Server (NTRS)

    Peterson, John C.; Chow, Edward T.; Priel, Moshe; Upchurch, Edwin T.

    1993-01-01

    Hyperswitch Communications Network (HCN) computer is prototype multiple-processor computer being developed. Incorporates improved version of hyperswitch communication network described in "Hyperswitch Network For Hypercube Computer" (NPO-16905). Designed to support high-level software and expansion of itself. HCN computer is message-passing, multiple-instruction/multiple-data computer offering significant advantages over older single-processor and bus-based multiple-processor computers, with respect to price/performance ratio, reliability, availability, and manufacturing. Design of HCN operating-system software provides flexible computing environment accommodating both parallel and distributed processing. Also achieves balance among following competing factors: performance in processing and communications, ease of use, and tolerance of (and recovery from) faults.

  7. Navier-Stokes Computations on Commodity Computers

    NASA Technical Reports Server (NTRS)

    Vatsa, Veer N.; Faulkner, Thomas R.

    1998-01-01

    In this paper we discuss and demonstrate the feasibility of solving high-fidelity, nonlinear computational fluid dynamics (CFD) problems of practical interest on commodity machines, namely Pentium Pro PC's. Such calculations have now become possible due to the progress in computational power and memory of the off-the-shelf commodity computers, along with the growth in bandwidth and communication speeds of networks. A widely used CFD code known as TLNS3D, which was developed originally on large shared memory computers, was selected for this effort. This code has recently been ported to massively parallel processor (MPP) type machines, where natural partitioning along grid blocks is adopted in which one or more blocks are distributed to each of the available processors. In this paper, a similar approach is adapted to port this code to a cluster of Pentium Pro computers. The message passing among the processors is accomplished through the use of standard message passing interface (MPI) libraries. Scaling studies indicate a fairly high level of parallelism on such clusters of commodity machines, thus making solutions to Navier-Stokes equations for practical problems more affordable.
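
    The distribution scheme described, one or more grid blocks per processor, can be sketched as a greedy load balancer. This is only an illustration of the idea; TLNS3D's actual partitioning logic may differ:

```python
# Assign grid blocks to ranks: largest remaining block goes to the
# currently least-loaded rank, so every rank gets one or more blocks
# with roughly equal total work.  (Illustrative, not TLNS3D's code.)
def partition_blocks(block_sizes, nprocs):
    loads = [0] * nprocs
    assignment = [[] for _ in range(nprocs)]
    for b in sorted(range(len(block_sizes)), key=lambda i: -block_sizes[i]):
        p = loads.index(min(loads))   # least-loaded rank so far
        assignment[p].append(b)
        loads[p] += block_sizes[b]
    return assignment

# Six blocks of varying size spread over two ranks; every block is placed.
ranks = partition_blocks([8, 4, 4, 2, 1, 1], 2)
assert sorted(b for r in ranks for b in r) == [0, 1, 2, 3, 4, 5]
```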

  8. Computational Biology and High Performance Computing 2000

    SciTech Connect

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  9. Computers and neurosurgery.

    PubMed

    Shaikhouni, Ammar; Elder, J Bradley

    2012-11-01

    At the turn of the twentieth century, the only computational device used in neurosurgical procedures was the brain of the surgeon. Today, most neurosurgical procedures rely at least in part on the use of a computer to help perform surgeries accurately and safely. The techniques that revolutionized neurosurgery were mostly developed after the 1950s. Just before that era, the transistor was invented in the late 1940s, and the integrated circuit was invented in the late 1950s. During this time, the first automated, programmable computational machines were introduced. The rapid progress in the field of neurosurgery not only occurred hand in hand with the development of modern computers, but one also can state that modern neurosurgery would not exist without computers. The focus of this article is the impact modern computers have had on the practice of neurosurgery. Neuroimaging, neuronavigation, and neuromodulation are examples of tools in the armamentarium of the modern neurosurgeon that owe each step in their evolution to progress made in computer technology. Advances in computer technology central to innovations in these fields are highlighted, with particular attention to neuroimaging. Developments over the last 10 years in areas of sensors and robotics that promise to transform the practice of neurosurgery further are discussed. Potential impacts of advances in computers related to neurosurgery in developing countries and underserved regions are also discussed. As this article illustrates, the computer, with its underlying and related technologies, is central to advances in neurosurgery over the last half century.

  10. Quantum analogue computing.

    PubMed

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.
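    The exponential cost described above can be made concrete: simulating n qubits on a classical digital machine means storing 2^n complex amplitudes, so each added qubit (each extra bit of precision, in the analogue-encoding view) doubles the size of the computer. A minimal sketch:

```python
def amplitudes_needed(n_qubits: int) -> int:
    # An n-qubit pure state lives in a 2^n-dimensional Hilbert space,
    # so a classical simulator must track 2^n complex amplitudes.
    return 2 ** n_qubits

# Each extra qubit doubles the storage, mirroring the abstract's point
# that an extra bit of precision doubles the size of the computer.
for n in (1, 10, 20, 30):
    print(n, amplitudes_needed(n))
```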

  12. Architecture Adaptive Computing Environment

    NASA Technical Reports Server (NTRS)

    Dorband, John E.

    2006-01-01

    Architecture Adaptive Computing Environment (aCe) is a software system that includes a language, compiler, and run-time library for parallel computing. aCe was developed to enable programmers to write programs, more easily than was previously possible, for a variety of parallel computing architectures. Heretofore, it has been perceived to be difficult to write parallel programs for parallel computers and more difficult to port the programs to different parallel computing architectures. In contrast, aCe is supportable on all high-performance computing architectures. Currently, it is supported on LINUX clusters. aCe uses parallel programming constructs that facilitate writing of parallel programs. Such constructs were used in single-instruction/multiple-data (SIMD) programming languages of the 1980s, including Parallel Pascal, Parallel Forth, C*, *LISP, and MasPar MPL. In aCe, these constructs are extended and implemented for both SIMD and multiple-instruction/multiple-data (MIMD) architectures. Two new constructs incorporated in aCe are those of (1) scalar and virtual variables and (2) pre-computed paths. The scalar-and-virtual-variables construct increases flexibility in optimizing memory utilization in various architectures. The pre-computed-paths construct enables the compiler to pre-compute part of a communication operation once, rather than computing it every time the communication operation is performed.

  13. Parallel computation and computers for artificial intelligence

    SciTech Connect

    Kowalik, J.S.

    1988-01-01

    This book discusses Parallel Processing in Artificial Intelligence; Parallel Computing using Multilisp; Execution of Common Lisp in a Parallel Environment; Qlisp; Restricted AND-Parallel Execution of Logic Programs; PARLOG: Parallel Programming in Logic; and Data-driven Processing of Semantic Nets. Attention is also given to: Application of the Butterfly Parallel Processor in Artificial Intelligence; On the Range of Applicability of an Artificial Intelligence Machine; Low-level Vision on Warp and the Apply Programming Model; AHR: A Parallel Computer for Pure Lisp; FAIM-1: An Architecture for Symbolic Multi-processing; and Overview of AI Application Oriented Parallel Processing Research in Japan.

  14. Computational approaches to computational aero-acoustics

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.

    1996-01-01

    The various techniques by which the goal of computational aeroacoustics (the calculation and prediction of the noise generated by a fluctuating fluid flow) may be achieved are reviewed. The governing equations for compressible fluid flow are presented. The direct numerical simulation approach is shown to be computationally intensive for high Reynolds number viscous flows. Therefore, other approaches, such as the acoustic analogy, vortex models and various perturbation techniques that aim to break the analysis into a viscous part and an acoustic part are presented. The choice of the approach is shown to be problem dependent.
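    The acoustic analogy mentioned above is conventionally written in Lighthill's form, an exact rearrangement of the compressible flow equations into a wave equation driven by a quadrupole source (the notation below is the standard one, not taken from the paper itself):

```latex
% Lighthill's acoustic analogy: density fluctuations \rho' radiate as if
% driven by the quadrupole source T_{ij}.
\left( \frac{\partial^{2}}{\partial t^{2}} - c_{0}^{2}\nabla^{2} \right)\rho'
   = \frac{\partial^{2} T_{ij}}{\partial x_{i}\,\partial x_{j}},
\qquad
T_{ij} = \rho u_{i} u_{j} + \bigl(p' - c_{0}^{2}\rho'\bigr)\delta_{ij} - \tau_{ij}
```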

  15. COMPUTATIONAL SCIENCE CENTER

    SciTech Connect

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  16. Computational engine structural analysis

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Johns, R. H.

    1986-01-01

    A significant research activity at the NASA Lewis Research Center is the computational simulation of complex multidisciplinary engine structural problems. This simulation is performed using computational engine structural analysis (CESA) which consists of integrated multidisciplinary computer codes in conjunction with computer post-processing for problem-specific application. A variety of the computational simulations of specific cases are described in some detail in this paper. These case studies include: (1) aeroelastic behavior of bladed rotors, (2) high velocity impact of fan blades, (3) blade-loss transient response, (4) rotor/stator/squeeze-film/bearing interaction, (5) blade-fragment/rotor-burst containment, and (6) structural behavior of advanced swept turboprops. These representative case studies are selected to demonstrate the breadth of the problems analyzed and the role of the computer, including post-processing and graphical display of voluminous output data.

  17. Personal computers on campus.

    PubMed

    Waldrop, M M

    1985-04-26

    Colleges and universities are becoming test beds for the much-heralded "information society" as they incorporate a new series of information technologies. These include on-line databases, magnetic and optical data storage, digital telecommunications, computer networks, and, most visibly and dramatically, personal computers. The transition is presenting administrators and faculty with major challenges, however. This article discusses some of the issues involved, including access to computers and to computer networking, managing the transition, and the educational uses of personal computers. A final section discusses efforts at Massachusetts Institute of Technology, Brown University, and Carnegie-Mellon University to shape a new-generation personal computer, the so-called "scholar's workstation." PMID:17746874

  18. Adiabatic topological quantum computing

    NASA Astrophysics Data System (ADS)

    Cesare, Chris; Landahl, Andrew J.; Bacon, Dave; Flammia, Steven T.; Neels, Alice

    2015-07-01

    Topological quantum computing promises error-resistant quantum computation without active error correction. However, there is a worry that during the process of executing quantum gates by braiding anyons around each other, extra anyonic excitations will be created that will disorder the encoded quantum information. Here, we explore this question in detail by studying adiabatic code deformations on Hamiltonians based on topological codes, notably Kitaev's surface codes and the more recently discovered color codes. We develop protocols that enable universal quantum computing by adiabatic evolution in a way that keeps the energy gap of the system constant with respect to the computation size and introduces only simple local Hamiltonian interactions. This allows one to perform holonomic quantum computing with these topological quantum computing systems. The tools we develop allow one to go beyond numerical simulations and understand these processes analytically.

  19. Trends; Integrating computer systems

    SciTech Connect

    de Buyl, M.

    1991-11-04

    This paper reports that computers are invaluable tools in assisting E and P managers with their information management and analysis tasks. Oil companies and software houses are striving to adapt their products and work practices to capitalize on the rapid evolution in computer hardware performance and affordability. Ironically, an investment in computers aimed at reducing risk and cost also contains an element of added risk and cost. Hundreds of millions of dollars have been spent by the oil industry in purchasing hardware and software and in developing software. Unfortunately, these investments may not have completely fulfilled the industry's expectations. The lower return on computer science investments is due to: Unmet expectations in productivity gains. Premature computer hardware and software obsolescence. Inefficient data transfer between software applications. Hidden costs of computer support personnel and vendors.

  20. New computing systems and their impact on computational mechanics

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1989-01-01

    Recent advances in computer technology that are likely to impact computational mechanics are reviewed. The technical needs for computational mechanics technology are outlined. The major features of new and projected computing systems, including supersystems, parallel processing machines, special-purpose computing hardware, and small systems are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed, and a novel partitioning strategy is outlined for maximizing the degree of parallelism on multiprocessor computers with a shared memory.

  1. Computer aided production engineering

    SciTech Connect

    Not Available

    1986-01-01

    This book presents the following contents: CIM in avionics; computer analysis of product designs for robot assembly; a simulation decision mould for manpower forecast and its application; development of flexible manufacturing system; advances in microcomputer applications in CAD/CAM; an automated interface between CAD and process planning; CAM and computer vision; low friction pneumatic actuators for accurate robot control; robot assembly of printed circuit boards; information systems design for computer integrated manufacture; and a CAD engineering language to aid manufacture.

  2. Sensor sentinel computing device

    DOEpatents

    Damico, Joseph P.

    2016-08-02

    Technologies pertaining to authenticating data output by sensors in an industrial environment are described herein. A sensor sentinel computing device receives time-series data from a sensor by way of a wireline connection. The sensor sentinel computing device generates a validation signal that is a function of the time-series signal. The sensor sentinel computing device then transmits the validation signal to a programmable logic controller in the industrial environment.
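    The abstract does not specify how the validation signal is derived from the time-series; one plausible construction, offered purely as an illustration and not as the patented method, is a keyed hash (HMAC) over the packed samples, which would let the programmable logic controller detect tampered sensor data:

```python
import hashlib
import hmac
import struct

def validation_signal(samples, key: bytes) -> bytes:
    # Illustrative assumption: the "function of the time-series signal"
    # is an HMAC-SHA256 tag over the IEEE-754 packed samples.
    payload = struct.pack(f"{len(samples)}d", *samples)
    return hmac.new(key, payload, hashlib.sha256).digest()

# A controller holding the same key can recompute and compare the tag.
tag = validation_signal([1.0, 2.5, 3.1], key=b"shared-secret")
```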

  3. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.

  4. Computation of Standard Errors

    PubMed Central

    Dowd, Bryan E; Greene, William H; Norton, Edward C

    2014-01-01

    Objectives: We discuss the problem of computing the standard errors of functions involving estimated parameters and provide the relevant computer code for three different computational approaches using two popular computer packages. Study Design: We show how to compute the standard errors of several functions of interest: the predicted value of the dependent variable for a particular subject, and the effect of a change in an explanatory variable on the predicted value of the dependent variable for an individual subject and the average effect for a sample of subjects. Empirical Application: Using a publicly available dataset, we explain three different methods of computing standard errors: the delta method, Krinsky–Robb, and bootstrapping. We provide computer code for Stata 12 and LIMDEP 10/NLOGIT 5. Conclusions: In most applications, choice of the computational method for standard errors of functions of estimated parameters is a matter of convenience. However, when computing standard errors of the sample average of functions that involve both estimated parameters and nonstochastic explanatory variables, it is important to consider the sources of variation in the function's values. PMID:24800304
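    Of the three approaches named (delta method, Krinsky–Robb, bootstrapping), the bootstrap is the easiest to sketch without the paper's Stata/LIMDEP code. A minimal stand-alone version, with made-up illustrative data:

```python
import random
import statistics

def bootstrap_se(data, stat, n_boot=2000, seed=0):
    # Bootstrap standard error: resample the data with replacement,
    # recompute the statistic each time, and take the standard
    # deviation of the replicates.
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        resample = [rng.choice(data) for _ in data]
        reps.append(stat(resample))
    return statistics.stdev(reps)

# Illustrative data (not from the paper): SE of the sample mean.
data = [2.1, 3.4, 1.9, 4.0, 2.8, 3.1, 2.5, 3.7]
se_mean = bootstrap_se(data, statistics.mean)
print(round(se_mean, 3))
```

For the mean this should land near the textbook value s/sqrt(n), which is one way to sanity-check a bootstrap implementation.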

  5. The Social Computer

    NASA Astrophysics Data System (ADS)

    Serugendo, Giovanna Di Marzo; Risoldi, Matteo; Solemayni, Mohammad

    The following sections are included: * Introduction * Problem and Research Questions * State of the Art * TSC Structure and Computational Awareness * Methodology and Research Directions * Case Study: Democracy * Conclusions

  6. Computing by Observing Changes

    NASA Astrophysics Data System (ADS)

    Cavaliere, Matteo; Leupold, Peter

    Computing by Observing is a paradigm for the implementation of models of Natural Computing. It was inspired by the setup of experiments in biochemistry. One central feature is an observer that translates the evolution of an underlying observed system into sequences over a finite alphabet. We take a step toward more realistic observers by allowing them to notice only an occurring change in the observed system rather than to read the system's entire configuration. Compared to previous implementations of the Computing by Observing paradigm, this decreases the computational power; but with relatively simple systems we still obtain the language class generated by matrix grammars.

  7. Optimization of computations

    SciTech Connect

    Mikhalevich, V.S.; Sergienko, I.V.; Zadiraka, V.K.; Babich, M.D.

    1994-11-01

    This article examines some topics of optimization of computations, which have been discussed at 25 seminar-schools and symposia organized by the V.M. Glushkov Institute of Cybernetics of the Ukrainian Academy of Sciences since 1969. We describe the main directions in the development of computational mathematics and present some of our own results that reflect a certain design conception of speed-optimal and accuracy-optimal (or nearly optimal) algorithms for various classes of problems, as well as a certain approach to optimization of computer computations.

  8. Cloud Computing: An Overview

    NASA Astrophysics Data System (ADS)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with minimum resources, Internet service providers invented cloud computing. Within a few years, the emerging cloud computing has become the hottest technology. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to public services, from a cost-saving tool to a revenue generator, and from ISPs to telecom. This paper introduces the concept, history, pros and cons of cloud computing as well as the value chain and standardization efforts.

  9. Forensic webwatch: Forensic computing.

    PubMed

    Bouhaidar, R

    2005-02-01

    With the rapid and continuous development of information technology, policing faces new challenges. As computer equipment becomes cheaper and the internet more readily available, computer crime and criminal exploitation is on the increase. Investigating such crimes requires identification, preservation, analysis and presentation of digital evidence, the key elements of forensic computing. This is helped by the fact that Locard's principle is applicable to this branch of science as much as in other areas of forensic science. This webwatch considers the ever evolving area of Forensic Computing.

  10. Computational approaches to vision

    NASA Technical Reports Server (NTRS)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  11. Beyond Computer Planning: Managing Educational Computer Innovations.

    ERIC Educational Resources Information Center

    Washington, Wenifort

    The vast underutilization of technology in educational environments suggests the need for more research to develop models to successfully adopt and diffuse computer systems in schools. Of 980 surveys mailed to various Ohio public schools, 529 were completed and returned to help determine current attitudes and perceptions of teachers and…

  12. Educational Computer Utilization and Computer Communications.

    ERIC Educational Resources Information Center

    Singh, Jai P.; Morgan, Robert P.

    As part of an analysis of educational needs and telecommunications requirements for future educational satellite systems, three studies were carried out. 1) The role of the computer in education was examined and both current status and future requirements were analyzed. Trade-offs between remote time sharing and remote batch process were explored…

  13. Computer Confrontation: Suppes and Albrecht

    ERIC Educational Resources Information Center

    Suppes, Patrick; Albrecht, Bob

    1973-01-01

    Two well-known computer specialists argue about the function of computers in schools. Patrick Suppes believes mastery of basic skills is the prime function of computers. Bob Albrecht believes computers should be learning devices and not drill masters. (DS)

  14. Introduction: The Computer After Me

    NASA Astrophysics Data System (ADS)

    Pitt, Jeremy

    The following sections are included: * Introduction * Computer Awareness in Science Fiction * Computer Awareness and Self-Awareness * How many senses does a computer have? * Does a computer know that it is a computer? * Does metal know when it is weakening? * Why Does Computer Awareness Matter? * Chapter Overviews * Summary and Conclusions

  15. Computer Awareness for Rural Educators.

    ERIC Educational Resources Information Center

    Barker, Bruce O.

    The meteoric rise of the computer age is a challenge for public educators, many of whom are still unfamiliar with basic computer technology. Yet many educators are finding that they can correct their misconceptions about computers by becoming "computer aware." Computer awareness comes from gaining a knowledge of computer history; a basic…

  16. Computers in Composition Instruction.

    ERIC Educational Resources Information Center

    Shostak, Robert, Ed.

    This volume consists of nine conference papers and journal articles concerned with microcomputer applications in the teaching of writing. After a general introduction entitled "Computer-Assisted Composition Instruction: The State of the Art," by Robert Shostak, four papers are devoted to how computers may help with the writing process. In…

  17. Computer Virus Protection

    ERIC Educational Resources Information Center

    Rajala, Judith B.

    2004-01-01

    A computer virus is a program--a piece of executable code--that has the unique ability to replicate. Like biological viruses, computer viruses can spread quickly and are often difficult to eradicate. They can attach themselves to just about any type of file, and are spread by replicating and being sent from one individual to another. Simply having…

  18. Preventing Computer Glitches

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2009-01-01

    It has been said that a computer lets a person make more mistakes faster than any other invention in human history, with the possible exceptions of handguns and tequila. Computers also make mistakes on their own, whether they're glitches, conflicts, bugs, crashes, or failures. Avoiding glitches is considerably less frustrating than trying to fix…

  19. Flexible Animation Computer Program

    NASA Technical Reports Server (NTRS)

    Stallcup, Scott S.

    1990-01-01

    FLEXAN (Flexible Animation), computer program animating structural dynamics on Evans and Sutherland PS300-series graphics workstation with VAX/VMS host computer. Typical application is animation of spacecraft undergoing structural stresses caused by thermal and vibrational effects. Displays distortions in shape of spacecraft. Program displays single natural mode of vibration, mode history, or any general deformation of flexible structure. Written in FORTRAN 77.

  20. Computer Anxiety and Instruction.

    ERIC Educational Resources Information Center

    Baumgarte, Roger

    While the computer is commonly viewed as a tool for simplifying and enriching lives, many individuals react to this technology with feelings of anxiety, paranoia, and alienation. These reactions may have potentially serious career and educational consequences. Fear of computers reflects a generalized fear of current technology and is most…

  1. Profiling Computing Coordinators.

    ERIC Educational Resources Information Center

    Edwards, Sigrid; Morton, Allan

    The people responsible for managing school computing resources in Australia have become known as Computing Coordinators. To date there has been no large systematic study of the role, responsibilities and characteristics of this position. This paper represents a first attempt to provide information on the functions and attributes of the Computing…

  2. Computational physics: a perspective.

    PubMed

    Stoneham, A M

    2002-06-15

    Computing comprises three distinct strands: hardware, software and the ways they are used in real or imagined worlds. Its use in research is more than writing or running code. Having something significant to compute and deploying judgement in what is attempted and achieved are especially challenging. In science or engineering, one must define a central problem in computable form, run such software as is appropriate and, last but by no means least, convince others that the results are both valid and useful. These several strands are highly interdependent. A major scientific development can transform disparate aspects of information and computer technologies. Computers affect the way we do science, as well as changing our personal worlds. Access to information is being transformed, with consequences beyond research or even science. Creativity in research is usually considered uniquely human, with inspiration a central factor. Scientific and technological needs are major forces in innovation, and these include hardware and software opportunities. One can try to define the scientific needs for established technologies (atomic energy, the early semiconductor industry), for rapidly developing technologies (advanced materials, microelectronics) and for emerging technologies (nanotechnology, novel information technologies). Did these needs define new computing, or was science diverted into applications of then-available codes? Regarding credibility, why is it that engineers accept computer realizations when designing engineered structures, whereas predictive modelling of materials has yet to achieve industrial confidence outside very special cases? The tensions between computing and traditional science are complex, unpredictable and potentially powerful.

  3. Campus Computing Strategies.

    ERIC Educational Resources Information Center

    McCredie, John W., Ed.

    Ten case studies that describe the planning process and strategies employed by colleges who use computing and communication systems are presented, based on a 1981-1982 study conducted by EDUCOM. An introduction by John W. McCredie summarizes several current and future effects of the rapid spread and integration of computing and communication…

  4. Computer-assisted Crystallization.

    ERIC Educational Resources Information Center

    Semeister, Joseph J., Jr.; Dowden, Edward

    1989-01-01

    To avoid a tedious task for recording temperature, a computer was used for calculating the heat of crystallization for the compound sodium thiosulfate. Described are the computer-interfacing procedures. Provides pictures of laboratory equipment and typical graphs from experiments. (YP)

  5. COMPUTER MODELS/EPANET

    EPA Science Inventory

    Pipe network flow analysis was among the first civil engineering applications programmed for solution on the early commercial mainframe computers in the 1960s. Since that time, advancements in analytical techniques and computing power have enabled us to solve systems with tens o...
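    The kind of computation involved can be illustrated with the classic Hardy Cross loop-correction method, a hand technique from the era those early mainframe programs automated (EPANET itself uses a more sophisticated gradient solver). The two-pipe loop below is a toy assumption, not an EPANET example:

```python
def hardy_cross_single_loop(r1, r2, q_total, iters=50):
    # Balance flow between two parallel pipes (resistances r1, r2)
    # so head losses around the loop cancel: r1*q1^2 = r2*q2^2.
    q1 = q_total / 2.0           # initial guess: split flow evenly
    for _ in range(iters):
        q2 = q_total - q1
        # Loop head-loss imbalance (positive through pipe 1)
        h = r1 * q1 * abs(q1) - r2 * q2 * abs(q2)
        dh = 2.0 * (r1 * abs(q1) + r2 * abs(q2))
        q1 -= h / dh             # Hardy Cross correction
    return q1, q_total - q1

# Pipe 2 is four times as resistive, so it should carry half the
# flow of pipe 1 (head losses equalize when q1 = 2 * q2).
q1, q2 = hardy_cross_single_loop(r1=1.0, r2=4.0, q_total=3.0)
```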

  6. Computers and Personal Privacy.

    ERIC Educational Resources Information Center

    Ware, Willis H.

    Privacy is an issue that arises from the intersection of a demand for improved recordkeeping processes, and computing technology as the response to the demand. Recordkeeping in the United States centers on information about people. Modern day computing technology has the ability to maintain, store, and retrieve records quickly; however, this…

  7. Computer Assisted Learning Feature.

    ERIC Educational Resources Information Center

    Davies, Peter; Minogue, Claire

    1988-01-01

    Discusses the goals of the Computer Working Party in Great Britain, presenting their assessment of current computer hardware and the market for economics software. Examines "Running the British Economy," a macroeconomic policy simulation that investigates the links between values and policy objectives and encourages questioning of economic models.…

  8. Administration of Computer Resources.

    ERIC Educational Resources Information Center

    Franklin, Gene F.

    Computing at Stanford University has, until recently, been performed at one of five facilities. The Stanford hospital operates an IBM 370/135 mainly for administrative use. The university business office has an IBM 370/145 for its administrative needs and support of the medical clinic. Under the supervision of the Stanford Computation Center are…

  9. African Studies Computer Resources.

    ERIC Educational Resources Information Center

    Kuntz, Patricia S.

    African studies computer resources that are readily available in the United States with linkages to Africa are described, highlighting those most directly corresponding to African content. Africanists can use the following four fundamental computer systems: (1) Internet/Bitnet; (2) Fidonet; (3) Usenet; and (4) dial-up bulletin board services. The…

  10. Decoding Technology: Computer Shortcuts

    ERIC Educational Resources Information Center

    Walker, Tim; Donohue, Chip

    2008-01-01

    For the typical early childhood administrator, there will never be enough hours in a day to finish the work that needs to be done. This includes numerous hours spent on a computer tracking enrollment, managing the budget, researching curriculum ideas online, and many other administrative tasks. Improving an administrator's computer efficiency can…

  11. Computer Series, 78.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1986-01-01

    Presents six brief articles dealing with the use of computers in teaching various topics in chemistry. Describes hardware and software applications which relate to protein graphics, computer simulated metabolism, interfaces between microcomputers and measurement devices, courseware available for spectrophotometers, and the calculation of elemental…

  12. Computer Series, 112.

    ERIC Educational Resources Information Center

    Birk, James P., Ed.

    1990-01-01

Four microcomputer applications are presented, including: "Computer Simulated Process of 'Lead Optimization': A Student-Interactive Program," "A PROLOG Program for the Generation of Molecular Formulas," "Determination of Inflection Points from Experimental Data," and "LAOCOON PC: NMR Simulation on a Personal Computer." Software, availability,…

  13. The Overdominance of Computers

    ERIC Educational Resources Information Center

    Monke, Lowell W.

    2006-01-01

    Most schools are unwilling to consider decreasing computer use at school because they fear that without screen time, students will not be prepared for the demands of a high-tech 21st century. Monke argues that having young children spend a significant amount of time on computers in school is harmful, particularly when children spend so much…

  14. ELECTRONIC DIGITAL COMPUTER

    DOEpatents

    Stone, J.J. Jr.; Bettis, E.S.; Mann, E.R.

    1957-10-01

The electronic digital computer is designed to solve systems involving a plurality of simultaneous linear equations. The computer can solve a system which converges rather rapidly when using von Seidel's (Gauss-Seidel) method of approximation, and performs the summations required to solve for the unknown terms by a method of successive approximations.
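The successive-approximation scheme the patent abstract describes is, in modern terms, the Gauss-Seidel iteration. A minimal sketch of that iteration (illustrative only; the matrix and tolerances are assumptions, not the patent's circuit):

```python
def gauss_seidel(a, b, tol=1e-10, max_iter=1000):
    """Solve a x = b by Gauss-Seidel successive approximation.

    Converges when the coefficient matrix is strictly diagonally
    dominant -- the 'rapidly converging' systems the abstract mentions.
    """
    n = len(b)
    x = [0.0] * n
    for _ in range(max_iter):
        delta = 0.0
        for i in range(n):
            # Use the freshest values of x as soon as they are updated.
            s = sum(a[i][j] * x[j] for j in range(n) if j != i)
            new_xi = (b[i] - s) / a[i][i]
            delta = max(delta, abs(new_xi - x[i]))
            x[i] = new_xi
        if delta < tol:
            break
    return x

# A small diagonally dominant system (assumed example data)
a = [[4.0, 1.0, 1.0], [1.0, 5.0, 2.0], [1.0, 2.0, 6.0]]
b = [6.0, 8.0, 9.0]
x = gauss_seidel(a, b)
```

Each unknown is updated from the latest available values of the others, which is exactly the kind of summation-and-update loop the patent describes performing in hardware.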

  15. Computer analysis of arteriograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Armstrong, J. H.; Beckenbach, E. B.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.; Sanmarco, M. E.

    1977-01-01

    A computer system has been developed to quantify the degree of atherosclerosis in the human femoral artery. The analysis involves first scanning and digitizing angiographic film, then tracking the outline of the arterial image and finally computing the relative amount of roughness or irregularity in the vessel wall. The image processing system and method are described.
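One simple way to quantify the "roughness or irregularity" of a digitized vessel-wall outline — illustrative only, not the cited system's actual algorithm — is the RMS deviation of the edge profile from a moving-average smoothing of itself:

```python
def roughness(edge, window=5):
    """RMS deviation of a digitized edge profile from its moving
    average. A smooth wall scores near zero; an irregular wall
    scores higher. (Sketch; window size is an assumption.)"""
    n = len(edge)
    half = window // 2
    dev_sq = 0.0
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        smooth = sum(edge[lo:hi]) / (hi - lo)
        dev_sq += (edge[i] - smooth) ** 2
    return (dev_sq / n) ** 0.5

# Hypothetical edge profiles (radial positions along the wall)
smooth_wall = [10.0] * 20
rough_wall = [10.0 + (1.0 if i % 2 else -1.0) for i in range(20)]
```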

  16. Teaching Computer Applications.

    ERIC Educational Resources Information Center

    Lundgren, Carol A.; And Others

    This document, which is designed to provide classroom teachers at all levels with practical ideas for a computer applications course, examines curricular considerations, teaching strategies, delivery techniques, and assessment methods applicable to a course focusing on applications of computers in business. The guide is divided into three…

  17. Computers, Networks and Education.

    ERIC Educational Resources Information Center

    Kay, Alan C.

    1991-01-01

    Discussed is how globally networked, easy-to-use computers can enhance learning only within an educational environment that encourages students to question "facts" and seek challenges. The strengths and weaknesses of computers used as amplifiers for learning are described. (KR)

  18. Can Computers Create?

    ERIC Educational Resources Information Center

    Hausman, Carl R.

    1985-01-01

    To be creative, an act must have as its outcome something new in the way it is intelligible and valuable. Computers have restricted contexts of information and have no ability to weigh bits of information. Computer optimists presuppose either determinism or indeterminism, either of which abandons creativity. (MT)

  19. Computer Dictionary and Handbook.

    ERIC Educational Resources Information Center

    Sippl, Charles J.; Sippl, Charles P.

    Designed to provide a useful tool for the instruction and increased awareness of the growing number of computer users throughout the world, this edition of the dictionary was based on an information search which provided 22,000 separate definitions and concept explanations. Appended are an introduction to computer system principles and procedures,…

  20. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell, Ed.

    1988-01-01

    Describes three situations in which computer software was used in a chemistry laboratory. Discusses interfacing voltage output instruments with Apple II computers and using spreadsheet programs to simulate gas chromatography and analysis of kinetic data. Includes information concerning procedures, hardware, and software used in each situation. (CW)

  1. Teaching Using Computer Games

    ERIC Educational Resources Information Center

    Miller, Lee Dee; Shell, Duane; Khandaker, Nobel; Soh, Leen-Kiat

    2011-01-01

    Computer games have long been used for teaching. Current reviews lack categorization and analysis using learning models which would help instructors assess the usefulness of computer games. We divide the use of games into two classes: game playing and game development. We discuss the Input-Process-Outcome (IPO) model for the learning process when…

  2. Uncertainty in Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.

    2003-01-01

    An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.

  3. Quantum Analog Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1998-01-01

Quantum analog computing is based upon the similarity between the mathematical formalism of quantum mechanics and the phenomena to be computed. It exploits the dynamical convergence of several competing phenomena to an attractor which can represent an extremum of a function, an image, a solution to a system of ODEs, or a stochastic process.
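The idea of a dynamical system whose attractor represents the extremum of a function can be sketched classically as a gradient flow (a toy analogue of the convergence the abstract describes, not the quantum formalism itself; the function and step size are assumptions):

```python
def relax_to_attractor(grad, x0, step=0.01, steps=5000):
    """Euler-integrate dx/dt = -grad f(x). The fixed point of the
    flow is a minimum of f, so 'computing' here means letting the
    dynamics converge to the attractor."""
    x = x0
    for _ in range(steps):
        x = x - step * grad(x)
    return x

# f(x) = (x - 2)^2 has its extremum at the attractor x = 2.
x_star = relax_to_attractor(lambda x: 2.0 * (x - 2.0), x0=10.0)
```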

  4. Connecting Kids and Computers

    ERIC Educational Resources Information Center

    Giles, Rebecca McMahon

    2006-01-01

    Exposure to cell phones, DVD players, video games, computers, digital cameras, and iPods has made today's young people more technologically advanced than those of any previous generation. As a result, parents are now concerned that their children are spending too much time in front of the computer. In this article, the author focuses her…

  5. Computer Yearbook 72.

    ERIC Educational Resources Information Center

    1972

    Recent and expected developments in the computer industry are discussed in this 628-page yearbook, successor to "The Punched Card Annual." The first section of the report is an overview of current computer hardware and software and includes articles about future applications of mainframes, an analysis of the software industry, and a summary of the…

  6. Computer Aided Art Major.

    ERIC Educational Resources Information Center

    Gibson, Jim

The Computer Aided Art program offered at Northern State University (Aberdeen, South Dakota) is coordinated with the traditional art major. The program is designed to familiarize students with a wide range of art-related computer hardware and software and their applications and to prepare students for problem-solving with unfamiliar…

  7. Computer Support Technician.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

This publication contains 18 subjects appropriate for use in a competency list for the occupation of computer support technician, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 18 units are as…

  8. Computer Programmer/Analyst.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This publication contains 25 subjects appropriate for use in a competency list for the occupation of computer programmer/analyst, 1 of 12 occupations within the business/computer technologies cluster. Each unit consists of a number of competencies; a list of competency builders is provided for each competency. Titles of the 25 units are as…

  9. Computer Communications and Learning.

    ERIC Educational Resources Information Center

    Bellman, Beryl L.

    1992-01-01

    Computer conferencing offers many opportunities for linking college students and faculty at a distance. From the Binational English and Spanish Telecommunications Network (BESTNET) has evolved a variety of bilingual video/computer/face-to-face instructional packages to serve institutions and nontraditional students on several continents. (MSE)

  10. Computer controlled antenna system

    NASA Technical Reports Server (NTRS)

    Raumann, N. A.

    1972-01-01

    The application of small computers using digital techniques for operating the servo and control system of large antennas is discussed. The advantages of the system are described. The techniques were evaluated with a forty foot antenna and the Sigma V computer. Programs have been completed which drive the antenna directly without the need for a servo amplifier, antenna position programmer or a scan generator.

  11. Computers and Classroom Culture.

    ERIC Educational Resources Information Center

    Schofield, Janet Ward

    This book explores the meaning of computer technology in schools. The book is based on data gathered from a two-year observation of more than 30 different classrooms in an urban high school: geometry classes in which students used artificially intelligent tutors; business classes in which students learned word processing; and computer science…

  12. Minnesota Educational Computing Consortium.

    ERIC Educational Resources Information Center

    Haugo, John E.

    The state of Minnesota has established the Minnesota Educational Computing Consortium (MECC) to coordinate the state's educational computing activities. The Consortium is governed by a board of directors representing the State Department of Education, the State Junior Colleges, the State Colleges, the State University and the public and is…

  13. Learning through Computer Simulations.

    ERIC Educational Resources Information Center

    Braun, Ludwig

    Prior to the relatively easy access to computers which began in the mid-1960's, simulation was a tool only of researchers. Even now, students are frequently excluded from direct laboratory experiences for many reasons. However, computer simulation can open up these experiences, providing a powerful teaching tool for individuals, for small and…

  14. The Case for Computers.

    ERIC Educational Resources Information Center

    BCEL Newsletter for the Business Community, 1985

    1985-01-01

    Given the large gap between the millions of adults in need of basic literacy skills and the shortage of teachers, tutors, and funds to serve them, computer-assisted instruction (CAI) has the potential to reduce the gap. The merits of CAI include its holding power, provision of a positive learning environment, opportunity to learn about computers,…

  15. Computer Processor Allocator

    2004-03-01

The Compute Processor Allocator (CPA) provides an efficient and reliable mechanism for managing and allotting processors in a massively parallel (MP) computer. It maintains information in a database on the health, configuration, and allocation of each processor. This persistent information is factored into each allocation decision. The CPA runs in a distributed fashion to avoid a single point of failure.

  16. Computers as Cognitive Tools.

    ERIC Educational Resources Information Center

    Lajoie, Susanne P., Ed.; Derry, Sharon J., Ed.

    This book provides exemplars of the types of computer-based learning environments represented by the theoretical camps within the field and the practical applications of the theories. The contributors discuss a variety of computer applications to learning, ranging from school-related topics such as geometry, algebra, biology, history, physics, and…

  17. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1989-01-01

    Discussed are some uses of computers in chemistry classrooms. Described are: (1) interactive chromatographic analysis software; (2) computer interface for a digital frequency-period-counter-ratio meter and analog interface based on a voltage-to-frequency converter; and (3) use of spectrometer/microcomputer arrangement for teaching atomic theory.…

  18. Programming the social computer.

    PubMed

    Robertson, David; Giunchiglia, Fausto

    2013-03-28

    The aim of 'programming the global computer' was identified by Milner and others as one of the grand challenges of computing research. At the time this phrase was coined, it was natural to assume that this objective might be achieved primarily through extending programming and specification languages. The Internet, however, has brought with it a different style of computation that (although harnessing variants of traditional programming languages) operates in a style different to those with which we are familiar. The 'computer' on which we are running these computations is a social computer in the sense that many of the elementary functions of the computations it runs are performed by humans, and successful execution of a program often depends on properties of the human society over which the program operates. These sorts of programs are not programmed in a traditional way and may have to be understood in a way that is different from the traditional view of programming. This shift in perspective raises new challenges for the science of the Web and for computing in general.

  19. K-12 Computer Networking.

    ERIC Educational Resources Information Center

    ERIC Review, 1993

    1993-01-01

    The "ERIC Review" is published three times a year and announces research results, publications, and new programs relevant to each issue's theme topic. This issue explores computer networking in elementary and secondary schools via two principal articles: "Plugging into the 'Net'" (Michael B. Eisenberg and Donald P. Ely); and "Computer Networks for…

  20. Logic via Computer Programming.

    ERIC Educational Resources Information Center

    Wieschenberg, Agnes A.

This paper poses the question "How do we teach logical thinking and sophisticated mathematics to unsophisticated college students?" One answer among many is through the writing of computer programs. The writing of computer algorithms is mathematical problem solving and logic in disguise, and it may attract students who would otherwise stop…

  1. Chippy's Computer Words.

    ERIC Educational Resources Information Center

    Willing, Kathlene R.; Girard, Suzanne

    Intended for young children just becoming familiar with computers, this naming book introduces and reinforces new computer vocabulary and concepts. The 20 words are presented alphabetically, along with illustrations, providing room for different activities in which children can match and name the pictures and words. The 20 vocabulary items are…

  2. Computer Series, 97.

    ERIC Educational Resources Information Center

    Kay, Jack G.; And Others

    1988-01-01

Describes two applications of the microcomputer for laboratory exercises. Explores radioactive decay using the Bateman equations on a Macintosh computer. Provides examples and screen dumps of data. Investigates polymer configurations using a Monte Carlo simulation on an IBM personal computer. (MVL)

  3. Computer Technology for Industry

    NASA Technical Reports Server (NTRS)

    1979-01-01

In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already-developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effects by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC) (registered trademark), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.

  4. Videodisc-Computer Interfaces.

    ERIC Educational Resources Information Center

    Zollman, Dean

    1984-01-01

    Lists microcomputer-videodisc interfaces currently available from 26 sources, including home use systems connected through remote control jack and industrial/educational systems utilizing computer ports and new laser reflective and stylus technology. Information provided includes computer and videodisc type, language, authoring system, educational…

  5. Computed body tomography.

    PubMed

    Alfidi, R J; Haaga, J R

    1976-12-01

Only the surface of the diagnostic possibilities inherent in CT imaging has been scratched. Solid organ pathology is readily visible in most instances by computed tomography. With further extension of present knowledge and development of newer contrast agents, the ability of computed body tomography to image a wide range of diseases appears almost limitless.

  6. Marketing via Computer Diskette.

    ERIC Educational Resources Information Center

    Thombs, Michael

    This report describes the development and evaluation of an interactive marketing diskette which describes the characteristics, advantages, and application procedures for each of the major computer-based graduate programs at Nova University. Copies of the diskettes were distributed at the 1988 Florida Instructional Computing Conference and were…

  7. Computers in Fundamental Mathematics.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Div. of Computer Information Services.

    This manual provides a resource for mathematics teachers who wish to take advantage of the extensive library of computer software materials available to expand and strengthen classroom instruction. Classroom management procedures, software duplication guidelines, copying procedures of diskettes for MS DOS and Apple II computers, and methods for…

  8. Learning with Ubiquitous Computing

    ERIC Educational Resources Information Center

    Rosenheck, Louisa

    2008-01-01

    If ubiquitous computing becomes a reality and is widely adopted, it will inevitably have an impact on education. This article reviews the background of ubiquitous computing and current research projects done involving educational "ubicomp." Finally it explores how ubicomp may and may not change education in both formal and informal settings and…

  9. The Economics of Computers.

    ERIC Educational Resources Information Center

    Sharpe, William F.

    A microeconomic theory is applied in this book to computer services and costs and for the benefit of those who are decision-makers in the selection, financing, and use of computers. Subtopics of the theory discussed include value and demand; revenue and profits; time and risk; and costs, inputs, and outputs. Application of the theory is explained…

  10. Computations in Plasma Physics.

    ERIC Educational Resources Information Center

    Cohen, Bruce I.; Killeen, John

    1983-01-01

Discusses contributions of computers to research in magnetic and inertial-confinement fusion, charged-particle-beam propagation, and space sciences. Considers use in design/control of laboratory and spacecraft experiments and in data acquisition; and reviews major plasma computational methods and some of the important physics problems they…

  11. A Home Computer Primer.

    ERIC Educational Resources Information Center

    Stone, Antonia

    1982-01-01

    Provides general information on currently available microcomputers, computer programs (software), hardware requirements, software sources, costs, computer games, and programing. Includes a list of popular microcomputers, providing price category, model, list price, software (cassette, tape, disk), monitor specifications, amount of random access…

  12. Computational Modeling of Tires

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Tanner, John A. (Compiler)

    1995-01-01

This document contains presentations and discussions from the joint UVA/NASA Workshop on Computational Modeling of Tires. The workshop attendees represented NASA, the Army and Air Force, tire companies, commercial software developers, and academia. The workshop objectives were to assess the state of technology in the computational modeling of tires and to provide guidelines for future research.

  13. Computer Experience of Nurses.

    PubMed

    Schleder Gonçalves, Luciana; Cândida Castro, Talita; Fialek, Soraya

    2015-01-01

This study aimed to identify the computing experience of nurses in southern Brazil through exploratory survey research. The results, obtained from the application of The Staggers Nursing Computer Experience Questionnaire®, were analyzed by statistical tests. The survey was conducted with nurses working both in hospitals and in public health in a capital city in southern Brazil. Novice nurses predominate in the application of computer tools in their practices, but most declare using computers in both their professional and personal activities. We conclude that computers and health information systems are part of the working reality of the participants and are considered indispensable resources for their activities, while noting limitations on the potential use of these tools. This study reflects on how the issue has been addressed in nursing schools and on the challenges of including Nursing Informatics in curricula in Brazil. PMID:26262313

  14. Symmetry Effects in Computation

    NASA Astrophysics Data System (ADS)

    Yao, Andrew Chi-Chih

    2008-12-01

    The concept of symmetry has played a key role in the development of modern physics. For example, using symmetry, C.N. Yang and other physicists have greatly advanced our understanding of the fundamental laws of physics. Meanwhile, computer scientists have been pondering why some computational problems seem intractable, while others are easy. Just as in physics, the laws of computation sometimes can only be inferred indirectly by considerations of general principles such as symmetry. The symmetry properties of a function can indeed have a profound effect on how fast the function can be computed. In this talk, we present several elegant and surprising discoveries along this line, made by computer scientists using symmetry as their primary tool. Note from Publisher: This article contains the abstract only.

  15. Indirection and computer security.

    SciTech Connect

    Berg, Michael J.

    2011-09-01

    The discipline of computer science is built on indirection. David Wheeler famously said, 'All problems in computer science can be solved by another layer of indirection. But that usually will create another problem'. We propose that every computer security vulnerability is yet another problem created by the indirections in system designs and that focusing on the indirections involved is a better way to design, evaluate, and compare security solutions. We are not proposing that indirection be avoided when solving problems, but that understanding the relationships between indirections and vulnerabilities is key to securing computer systems. Using this perspective, we analyze common vulnerabilities that plague our computer systems, consider the effectiveness of currently available security solutions, and propose several new security solutions.

  16. (Computer) Vision without Sight

    PubMed Central

    Manduchi, Roberto; Coughlan, James

    2012-01-01

    Computer vision holds great promise for helping persons with blindness or visual impairments (VI) to interpret and explore the visual world. To this end, it is worthwhile to assess the situation critically by understanding the actual needs of the VI population and which of these needs might be addressed by computer vision. This article reviews the types of assistive technology application areas that have already been developed for VI, and the possible roles that computer vision can play in facilitating these applications. We discuss how appropriate user interfaces are designed to translate the output of computer vision algorithms into information that the user can quickly and safely act upon, and how system-level characteristics affect the overall usability of an assistive technology. Finally, we conclude by highlighting a few novel and intriguing areas of application of computer vision to assistive technology. PMID:22815563

  17. Next-generation computers

    SciTech Connect

    Torrero, E.A.

    1985-01-01

Developments related to tomorrow's computers are discussed, taking into account advances toward the fifth generation in Japan, the challenge to U.S. supercomputers, plans concerning the creation of supersmart computers for the U.S. military, a U.S. industry response to the Japanese challenge, a survey of U.S. and European research, Great Britain, the European Common Market, codifying human knowledge for machine reading, software engineering, next-generation software, plans for obtaining the million-transistor chip, and fabrication issues for next-generation circuits. Other topics explored are related to a status report regarding artificial intelligence, an assessment of the technical challenges, aspects of sociotechnology, and defense advanced research projects. Attention is also given to expert systems, speech recognition, computer vision, function-level programming and automated programming, computing at the speed limit, VLSI, and superpower computers.

  18. Onward to Petaflops Computing

    NASA Technical Reports Server (NTRS)

    Bailey, David H.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

With programs such as the US High Performance Computing and Communications Program (HPCCP), the attention of scientists and engineers worldwide has been focused on the potential of very high performance scientific computing, namely systems that are hundreds or thousands of times more powerful than those typically available in desktop systems at any given point in time. Extending the frontiers of computing in this manner has resulted in remarkable advances, both in computing technology itself and in the various scientific and engineering disciplines that utilize these systems. Within a month or two, a sustained rate of 1 Tflop/s (also written 1 teraflops, or 10^12 floating-point operations per second) is likely to be achieved by the 'ASCI Red' system at Sandia National Laboratories in New Mexico. With this objective in sight, it is reasonable to ask what lies ahead for high-end computing.

  19. Pen-based computers: Computers without keys

    NASA Technical Reports Server (NTRS)

    Conklin, Cheryl L.

    1994-01-01

The National Space Transportation System (NSTS) is comprised of many diverse and highly complex systems incorporating the latest technologies. Data collection associated with ground processing of the various Space Shuttle system elements is extremely challenging due to the many separate processing locations where data is generated. This presents a significant problem when the timely collection, transfer, collation, and storage of data is required. This paper describes how new technology, referred to as pen-based computers, is being used to transform the data collection process at Kennedy Space Center (KSC). Pen-based computers have streamlined procedures, increased data accuracy, and now provide more complete information than previous methods. The end result is the elimination of Shuttle processing delays associated with data deficiencies.

  20. Computational electromagnetics and parallel dense matrix computations

    SciTech Connect

    Forsman, K.; Kettunen, L.; Gropp, W.; Levine, D.

    1995-06-01

    We present computational results using CORAL, a parallel, three-dimensional, nonlinear magnetostatic code based on a volume integral equation formulation. A key feature of CORAL is the ability to solve, in parallel, the large, dense systems of linear equations that are inherent in the use of integral equation methods. Using the Chameleon and PSLES libraries ensures portability and access to the latest linear algebra solution technology.

  1. Coping with distributed computing

    SciTech Connect

    Cormell, L.

    1992-09-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent, he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing some examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions or products for distributed computing and management will be given.

  2. Computing with synthetic protocells.

    PubMed

    Courbet, Alexis; Molina, Franck; Amar, Patrick

    2015-09-01

    In this article we present a new kind of computing device that uses biochemical reactions networks as building blocks to implement logic gates. The architecture of a computing machine relies on these generic and composable building blocks, computation units, that can be used in multiple instances to perform complex boolean functions. Standard logical operations are implemented by biochemical networks, encapsulated and insulated within synthetic vesicles called protocells. These protocells are capable of exchanging energy and information with each other through transmembrane electron transfer. In the paradigm of computation we propose, protoputing, a machine can solve only one problem and therefore has to be built specifically. Thus, the programming phase in the standard computing paradigm is represented in our approach by the set of assembly instructions (specific attachments) that directs the wiring of the protocells that constitute the machine itself. To demonstrate the computing power of protocellular machines, we apply it to solve a NP-complete problem, known to be very demanding in computing power, the 3-SAT problem. We show how to program the assembly of a machine that can verify the satisfiability of a given boolean formula. Then we show how to use the massive parallelism of these machines to verify in less than 20 min all the valuations of the input variables and output a fluorescent signal when the formula is satisfiable or no signal at all otherwise.
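The verification task the protocellular machine performs massively in parallel — checking every valuation of the input variables of a Boolean formula — can be sketched sequentially as a brute-force 3-SAT check (illustrative only; the clause encoding is an assumption, not the article's chemistry):

```python
from itertools import product

def satisfiable(clauses, n_vars):
    """Brute-force SAT check over all valuations, the search the
    protocell machine explores in parallel. A literal k > 0 means
    variable k is true; k < 0 means its negation."""
    for valuation in product([False, True], repeat=n_vars):
        if all(any(valuation[abs(l) - 1] == (l > 0) for l in clause)
               for clause in clauses):
            return True  # analogous to the fluorescent output signal
    return False

# (x1 or x2 or x3) and (not x1 or not x2 or x3): satisfiable
sat = satisfiable([[1, 2, 3], [-1, -2, 3]], 3)
# x1 and (not x1): unsatisfiable, no signal
unsat = satisfiable([[1], [-1]], 1)
```

The exponential number of valuations is what makes 3-SAT demanding sequentially, and what the massive parallelism of the protocell assembly is meant to absorb.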

  3. Community Cloud Computing

    NASA Astrophysics Data System (ADS)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  4. Computers as tools

    SciTech Connect

    Eriksson, I.V.

    1994-12-31

    The following message was recently posted on a bulletin board and clearly shows the relevance of the conference theme: "The computer and digital networks seem poised to change whole regions of human activity -- how we record knowledge, communicate, learn, work, understand ourselves and the world. What's the best framework for understanding this digitalization, or virtualization, of seemingly everything? ... Clearly, symbolic tools like the alphabet, book, and mechanical clock have changed some of our most fundamental notions -- self, identity, mind, nature, time, space. Can we say what the computer, a purely symbolic 'machine,' is doing to our thinking in these areas? Or is it too early to say, given how much more powerful and less expensive the technology seems destined to become in the next few decades?" (Verity, 1994) Computers certainly affect our lives and way of thinking, but what have computers to do with ethics? A narrow approach would be that on the one hand people can and do abuse computer systems, and on the other hand people can be abused by them. Well-known examples of the former are computer crimes such as the theft of money, services and information. The latter can be exemplified by violation of privacy, health hazards and computer monitoring. Broadening the concept from computers to information systems (ISs) and information technology (IT) gives a wider perspective. Computers are just the hardware part of information systems, which also include software, people and data. Information technology is the concept preferred today. It extends to communication, which is an essential part of information processing. Now let us repeat the question: What has IT to do with ethics? Verity mentioned changes in "how we record knowledge, communicate, learn, work, understand ourselves and the world".

  5. NEMAR plotting computer program

    NASA Technical Reports Server (NTRS)

    Myler, T. R.

    1981-01-01

    A FORTRAN coded computer program which generates CalComp plots of trajectory parameters is examined. The trajectory parameters are calculated and placed on a data file by the Near Earth Mission Analysis Routine computer program. The plot program accesses the data file and generates the plots as defined by inputs to the plot program. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included. Although this plot program utilizes a random access data file, a data file of the same type and formatted in 102 numbers per record could be generated by any computer program and used by this plot program.
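The fixed-record interface the abstract describes (any program that writes 102 numbers per record can feed the plot program) can be sketched as follows. The binary, little-endian single-precision layout here is my illustrative assumption, not the original FORTRAN-era file format.

```python
import io
import struct

RECORD_LEN = 102  # the plot program expects 102 numbers per record

def write_record(fh, values):
    """Append one 102-number record to the data file."""
    assert len(values) == RECORD_LEN
    fh.write(struct.pack(f"<{RECORD_LEN}f", *values))

def read_record(fh, index):
    """Random access: seek straight to record `index` and unpack it."""
    fh.seek(index * RECORD_LEN * 4)   # 4 bytes per single-precision float
    return struct.unpack(f"<{RECORD_LEN}f", fh.read(RECORD_LEN * 4))

# Any producer program can generate the file; here we fake three records.
buf = io.BytesIO()
for rec in range(3):
    write_record(buf, [float(rec)] * RECORD_LEN)
print(read_record(buf, 2)[0])   # 2.0
```

Because every record has the same length, any record can be located by arithmetic on its index, which is what makes the file random-access.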

  6. Interactive computer graphics

    NASA Astrophysics Data System (ADS)

    Purser, K.

    1980-08-01

    Design layouts have traditionally been done on a drafting board by drawing a two-dimensional representation with section cuts and side views to describe the exact three-dimensional model. With the advent of computer graphics, a three-dimensional model can be created directly. The computer stores the exact three-dimensional model, which can be examined from any angle and at any scale. A brief overview of interactive computer graphics, how models are made and some of the benefits/limitations are described.

  7. Computational biology for ageing.

    PubMed

    Wieser, Daniela; Papatheodorou, Irene; Ziehm, Matthias; Thornton, Janet M

    2011-01-12

    High-throughput genomic and proteomic technologies have generated a wealth of publicly available data on ageing. Easy access to these data, and their computational analysis, is of great importance in order to pinpoint the causes and effects of ageing. Here, we provide a description of the existing databases and computational tools on ageing that are available for researchers. We also describe the computational approaches to data interpretation in the field of ageing including gene expression, comparative and pathway analyses, and highlight the challenges for future developments. We review recent biological insights gained from applying bioinformatics methods to analyse and interpret ageing data in different organisms, tissues and conditions.

  8. High Performance Computing Today

    SciTech Connect

    Dongarra, Jack; Meuer,Hans; Simon,Horst D.; Strohmaier,Erich

    2000-04-01

    In the last 50 years, the field of scientific computing has seen a rapid change of vendors, architectures, technologies and the usage of systems. Despite all these changes, the evolution of performance on a large scale seems to be a very steady and continuous process. Moore's Law is often cited in this context. If the authors plot the peak performance of various computers of the last 5 decades in Figure 1 that could have been called the supercomputers of their time, they indeed see how well this law holds for almost the complete lifespan of modern computing. On average they see an increase in performance of two orders of magnitude every decade.
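The quoted trend is easy to put in numbers. The annual factor below is simple arithmetic on the two-orders-of-magnitude-per-decade figure, not a value taken from the paper.

```python
# Two orders of magnitude (100x) per decade implies an annual growth
# factor of 100 ** (1/10).
annual = 100 ** (1 / 10)
print(round(annual, 3))   # 1.585, i.e. roughly +58% per year

# Over the five decades the survey spans, the same rate compounds to
print(100 ** 5)           # a factor of 10**10
```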

  9. Human Computer Interaction

    NASA Astrophysics Data System (ADS)

    Bhagwani, Akhilesh; Sengar, Chitransh; Talwaniper, Jyotsna; Sharma, Shaan

    2012-08-01

    The paper basically deals with the study of HCI (human-computer interaction) and BCI (brain-computer interface) technology that can be used for capturing brain signals and translating them into commands that allow humans to control (just by thinking) devices such as computers, robots, rehabilitation technology and virtual reality environments. The BCI serves as a direct communication pathway between the brain and an external device. BCIs are often aimed at assisting, augmenting, or repairing human cognitive or sensory-motor functions. The paper also deals with many advantages of BCI technology along with some of its applications and some major drawbacks.

  10. Optoelectronic reservoir computing.

    PubMed

    Paquot, Y; Duport, F; Smerieri, A; Dambre, J; Schrauwen, B; Haelterman, M; Massar, S

    2012-01-01

    Reservoir computing is a recently introduced, highly efficient bio-inspired approach for processing time-dependent data. The basic scheme of reservoir computing consists of a nonlinear recurrent dynamical system coupled to a single input layer and a single output layer. Within these constraints many implementations are possible. Here we report an optoelectronic implementation of reservoir computing based on a recently proposed architecture consisting of a single nonlinear node and a delay line. Our implementation is sufficiently fast for real-time information processing. We illustrate its performance on tasks of practical importance such as nonlinear channel equalization and speech recognition, and obtain results comparable to state-of-the-art digital implementations.
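The single-nonlinear-node-plus-delay-line architecture can be mimicked in a few lines of software. The masked input, the ring shift standing in for propagation along the delay line, and all parameter values below are illustrative assumptions of this sketch, not the paper's optoelectronic setup.

```python
import numpy as np

# N "virtual nodes" live along the delay line; a random mask gives each
# node a differently weighted copy of the input u(t), and np.roll passes
# state around the ring as the delay line would.
rng = np.random.default_rng(0)
N = 50
mask = rng.uniform(-1, 1, N)
gain, feedback = 0.5, 0.5   # illustrative values

def reservoir_states(u):
    """Drive the reservoir with input sequence u; return the state history."""
    x = np.zeros(N)
    states = []
    for ut in u:
        x = np.tanh(gain * mask * ut + feedback * np.roll(x, 1))
        states.append(x.copy())
    return np.array(states)

# Simplest memory task: train a linear readout to recall the previous input.
u = rng.uniform(-0.2, 0.2, 500)
X = reservoir_states(u)
target = np.roll(u, 1)                  # target[t] = u[t-1]
W = np.linalg.lstsq(X[10:], target[10:], rcond=None)[0]
pred = X[10:] @ W
corr = np.corrcoef(pred, target[10:])[0, 1]
print(corr > 0.9)
```

Only the linear readout `W` is trained; the reservoir itself stays fixed, which is what makes the scheme attractive for hardware implementations such as the optoelectronic one reported here.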

  11. Computational Tractability - Beyond Turing?

    NASA Astrophysics Data System (ADS)

    Marcer, Peter; Rowlands, Peter

    A fundamental problem in the theory of computing concerns whether descriptions of systems at all times remain tractable, that is whether the complexity that inevitably results can be reduced to a polynomial form (P) or whether some problems lead to a non-polynomial (NP) exponential growth in complexity. Here, we propose that the universal computational rewrite system that can be shown to be responsible ultimately for the development of mathematics, physics, chemistry, biology and even human consciousness, is so structured that Nature will always be structured as P at any scale and so will be computationally tractable.

  12. Computational biology for ageing

    PubMed Central

    Wieser, Daniela; Papatheodorou, Irene; Ziehm, Matthias; Thornton, Janet M.

    2011-01-01

    High-throughput genomic and proteomic technologies have generated a wealth of publicly available data on ageing. Easy access to these data, and their computational analysis, is of great importance in order to pinpoint the causes and effects of ageing. Here, we provide a description of the existing databases and computational tools on ageing that are available for researchers. We also describe the computational approaches to data interpretation in the field of ageing including gene expression, comparative and pathway analyses, and highlight the challenges for future developments. We review recent biological insights gained from applying bioinformatics methods to analyse and interpret ageing data in different organisms, tissues and conditions. PMID:21115530

  13. Research in Computational Astrobiology

    NASA Technical Reports Server (NTRS)

    Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.

    2003-01-01

    We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple time step algorithms, statistical methods for analysis of astrophysical data via optimal partitioning methods, electronic structure calculations on water-nucleic acid complexes, incorporation of structural information into genomic sequence analysis methods and calculations of shock-induced formation of polycyclic aromatic hydrocarbon compounds.

  14. Convergence: Computing and communications

    SciTech Connect

    Catlett, C.

    1996-12-31

    This paper highlights the operations of the National Center for Supercomputing Applications (NCSA). NCSA is developing and implementing a national strategy to create, use, and transfer advanced computing and communication tools and information technologies for science, engineering, education, and business. The primary focus of the presentation is historical and expected growth in computing capacity, personal computer performance, and Internet and World Wide Web sites. Data are presented to show changes over the past 10 to 20 years in these areas. 5 figs., 4 tabs.

  15. Teaching Physics with Computers

    NASA Astrophysics Data System (ADS)

    Botet, R.; Trizac, E.

    2005-09-01

    Computers are now so common in our everyday life that it is difficult to imagine the computer-free scientific life of the years before the 1980s. And yet, in spite of an unquestionable rise, the use of computers in the realm of education is still in its infancy. This is not a problem with students: for the new generation, the pre-computer age seems as far in the past as the age of the dinosaurs. It may instead be more a question of teacher attitude. Traditional education is based on centuries of polished concepts and equations, while computers require us to think differently about our method of teaching, and to revise the content accordingly. Our brains do not work in terms of numbers, but use abstract and visual concepts; hence, communication between computer and man boomed when computers escaped the world of numbers to reach a visual interface. From this time on, computers have generated new knowledge and, more importantly for teaching, new ways to grasp concepts. Therefore, just as real experiments were the starting point for theory, virtual experiments can be used to understand theoretical concepts. But there are important differences. Some of them are fundamental: a virtual experiment may allow for the exploration of length and time scales together with a level of microscopic complexity not directly accessible to conventional experiments. Others are practical: numerical experiments are completely safe, unlike some dangerous but essential laboratory experiments, and are often less expensive. Finally, some numerical approaches are suited only to teaching, as the concept necessary for the physical problem, or its solution, lies beyond the scope of traditional methods. For all these reasons, computers open physics courses to novel concepts, bringing education and research closer. In addition, and this is not a minor point, they respond naturally to the basic pedagogical needs of interactivity, feedback, and individualization of instruction.
This is why one can

  16. Optoelectronic Reservoir Computing

    NASA Astrophysics Data System (ADS)

    Paquot, Y.; Duport, F.; Smerieri, A.; Dambre, J.; Schrauwen, B.; Haelterman, M.; Massar, S.

    2012-02-01

    Reservoir computing is a recently introduced, highly efficient bio-inspired approach for processing time-dependent data. The basic scheme of reservoir computing consists of a nonlinear recurrent dynamical system coupled to a single input layer and a single output layer. Within these constraints many implementations are possible. Here we report an optoelectronic implementation of reservoir computing based on a recently proposed architecture consisting of a single nonlinear node and a delay line. Our implementation is sufficiently fast for real-time information processing. We illustrate its performance on tasks of practical importance such as nonlinear channel equalization and speech recognition, and obtain results comparable to state-of-the-art digital implementations.

  17. Computer aided surface representation

    SciTech Connect

    Barnhill, R.E.

    1990-02-19

    The central research problem of this project is the effective representation, computation, and display of surfaces interpolating to information in three or more dimensions. If the given information is located on another surface, then the problem is to construct a "surface defined on a surface". Sometimes properties of an already defined surface are desired, which is "geometry processing". Visualization of multivariate surfaces is possible by means of contouring higher dimensional surfaces. These problems and more are discussed below. The broad sweep from constructive mathematics through computational algorithms to computer graphics illustrations is utilized in this research. The breadth and depth of this research activity makes this research project unique.

  18. Computer surety: computer system inspection guidance. [Contains glossary]

    SciTech Connect

    Not Available

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  19. Neural computation and the computational theory of cognition.

    PubMed

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism-neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation.

  20. Impact of Classroom Computer Use on Computer Anxiety.

    ERIC Educational Resources Information Center

    Lambert, Matthew E.; And Others

    Increasing use of computer programs for undergraduate psychology education has raised concern over the impact of computer anxiety on educational performance. Additionally, some researchers have indicated that classroom computer use can exacerbate pre-existing computer anxiety. To evaluate the relationship between in-class computer use and computer…

  1. Neural computation and the computational theory of cognition.

    PubMed

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism-neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation. PMID:23126542

  2. Ant-based computing.

    PubMed

    Michael, Loizos

    2009-01-01

    A biologically and physically plausible model for ants and pheromones is proposed. It is argued that the mechanisms described in this model are sufficiently powerful to reproduce the necessary components of universal computation. The claim is supported by illustrating the feasibility of designing arbitrary logic circuits, showing that the interactions of ants and pheromones lead to the expected behavior, and presenting computer simulation results to verify the circuits' working. The conclusions of this study can be taken as evidence that coherent deterministic and centralized computation can emerge from the collective behavior of simple distributed Markovian processes such as those followed by biological ants, but also, more generally, by artificial agents with limited computational and communication abilities. PMID:19239348
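The claim that ant-and-pheromone interactions can realize arbitrary logic circuits can be illustrated with a toy threshold abstraction. The model below (trails carrying additive pheromone levels, gates firing above a threshold) is my own simplification for illustration, not the paper's biologically plausible model.

```python
# Toy abstraction: each active input trail deposits one unit of pheromone
# at a gate's junction; the output trail is recruited only when the total
# pheromone clears the gate's threshold. Thresholds are illustrative.
def and_gate(a, b, threshold=1.5):
    pheromone = (1.0 if a else 0.0) + (1.0 if b else 0.0)
    return pheromone >= threshold

def or_gate(a, b, threshold=0.5):
    pheromone = (1.0 if a else 0.0) + (1.0 if b else 0.0)
    return pheromone >= threshold

def not_gate(a):
    # negation would need an inhibitory (repellent) pheromone; modeled
    # here simply as logical inversion
    return not a

# Arbitrary circuits then follow by composition, e.g. XOR:
xor = lambda a, b: and_gate(or_gate(a, b), not_gate(and_gate(a, b)))
print([xor(a, b) for a in (False, True) for b in (False, True)])
# [False, True, True, False]
```

Since {AND, OR, NOT} is a universal gate set, composing such junctions suffices in principle for any Boolean circuit, which is the sense in which the collective behavior supports universal computation.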

  3. Computers Transform an Industry.

    ERIC Educational Resources Information Center

    Simich, Jack

    1982-01-01

    Describes the use of computer technology in the graphics communication industry. Areas that are examined include typesetting, color scanners, communications satellites, page make-up systems, and the business office. (CT)

  4. Computed Tomography (CT) -- Head

    MedlinePlus


  5. Computed Tomography (CT) -- Sinuses

    MedlinePlus


  6. Quantum steady computation

    SciTech Connect

    Castagnoli, G.

    1991-08-10

    This paper reports that current conceptions of quantum mechanical computers inherit from conventional digital machines two apparently interacting features, machine imperfection and temporal development of the computational process. On account of machine imperfection, the process would become ideally reversible only in the limiting case of zero speed. Therefore the process is irreversible in practice and cannot be considered to be a fundamental quantum one. By giving up classical features and using a linear, reversible and non-sequential representation of the computational process - not realizable in classical machines - the process can be identified with the mathematical form of a quantum steady state. This form of steady quantum computation would seem to have an important bearing on the notion of cognition.

  7. Computer Assisted Bioreview.

    ERIC Educational Resources Information Center

    Ballou, Walter E.; And Others

    1984-01-01

    Documentation and complete listing are provided for an Apple computer program designed to assist students in reviewing biology course content. All questions and answers are contained in data statements which are read into two separate question/answer arrays. (JN)

  8. Computers boost structural technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Venneri, Samuel L.

    1989-01-01

    Derived from matrix methods of structural analysis and finite element methods developed over the last three decades, computational structures technology (CST) blends computer science, numerical analysis, and approximation theory into structural analysis and synthesis. Recent significant advances in CST include stochastic-based modeling, strategies for performing large-scale structural calculations on new computing systems, and the integration of CST with other disciplinary modules for multidisciplinary analysis and design. New methodologies have been developed at NASA for integrated fluid-thermal structural analysis and integrated aerodynamic-structure-control design. The need for multiple views of data for different modules also led to the development of a number of sophisticated data-base management systems. For CST to play a role in the future development of structures technology and in the multidisciplinary design of future flight vehicles, major advances and computational tools are needed in a number of key areas.

  9. Cloud computing security.

    SciTech Connect

    Shin, Dongwan; Claycomb, William R.; Urias, Vincent E.

    2010-10-01

    Cloud computing is a paradigm rapidly being embraced by government and industry as a solution for cost-savings, scalability, and collaboration. While a multitude of applications and services are available commercially for cloud-based solutions, research in this area has yet to fully embrace the full spectrum of potential challenges facing cloud computing. This tutorial aims to provide researchers with a fundamental understanding of cloud computing, with the goals of identifying a broad range of potential research topics, and inspiring a new surge in research to address current issues. We will also discuss real implementations of research-oriented cloud computing systems for both academia and government, including configuration options, hardware issues, challenges, and solutions.

  10. Computers in the Classroom.

    ERIC Educational Resources Information Center

    Sigg, S. F.

    1986-01-01

    Some ways computers are used in classrooms today and their potential as a tool to enhance instruction are explored. Considered are graphics, lesson planning, test preparation, drill and practice, remedial work, demonstrations, simulations, and introduction to programming. (MNS)

  11. Cars, Computers, and Curriculum.

    ERIC Educational Resources Information Center

    Suhor, Charles

    1983-01-01

    After comparing the effect of the automobile on society and education in the early 1900's, the author states that present day arguments for drastic educational change because of the power of the computer are premature. (MLF)

  12. Computer security engineering management

    SciTech Connect

    McDonald, G.W.

    1988-01-01

    For best results, computer security should be engineered into a system during its development rather than being appended later on. This paper addresses the implementation of computer security in eight stages through the life cycle of the system; starting with the definition of security policies and ending with continuing support for the security aspects of the system throughout its operational life cycle. Security policy is addressed relative to successive decomposition of security objectives (through policy, standard, and control stages) into system security requirements. This is followed by a discussion of computer security organization and responsibilities. Next the paper directs itself to analysis and management of security-related risks, followed by discussion of design and development of the system itself. Discussion of security test and evaluation preparations, and approval to operate (certification and accreditation), is followed by discussion of computer security training for users, and finally by coverage of life cycle support for the security of the system.

  13. Brain-Computer Symbiosis

    PubMed Central

    Schalk, Gerwin

    2009-01-01

    The theoretical groundwork of the 1930’s and 1940’s and the technical advance of computers in the following decades provided the basis for dramatic increases in human efficiency. While computers continue to evolve, and we can still expect increasing benefits from their use, the interface between humans and computers has begun to present a serious impediment to full realization of the potential payoff. This article is about the theoretical and practical possibility that direct communication between the brain and the computer can be used to overcome this impediment by improving or augmenting conventional forms of human communication. It is about the opportunity that the limitations of our body’s input and output capacities can be overcome using direct interaction with the brain, and it discusses the assumptions, possible limitations, and implications of a technology that I anticipate will be a major source of pervasive changes in the coming decades. PMID:18310804

  14. Modular missile borne computers

    NASA Technical Reports Server (NTRS)

    Ramseyer, R.; Arnold, R.; Applewhite, H.; Berg, R.

    1980-01-01

    The modular missile borne computer's architecture with emphasis on how that architecture evolved is discussed. A careful analysis is given of both the physical constraints and the processing requirements.

  15. Computers and Early Learning.

    ERIC Educational Resources Information Center

    Banet, Bernard

    1978-01-01

    This paper discusses the effect that microelectronic technology will have on elementary education in the decades ahead, and some of the uses of computers as learning aids for young children, including interactive games, tutorial systems, creative activity, and simulation. (MP)

  16. Computers, Networks and Work.

    ERIC Educational Resources Information Center

    Sproull, Lee; Kiesler, Sara

    1991-01-01

    Discussed are how computer networks can affect the nature of work and the relationships between managers and employees. The differences between face-to-face exchanges and electronic interactions are described. (KR)

  17. Communications, Computers and Networks.

    ERIC Educational Resources Information Center

    Dertouzos, Michael L.

    1991-01-01

    The infrastructure created by fusing computing and communications technologies is described. The effect of this infrastructure on the economy and society of the United States is discussed. The importance of knowing the value and role of information is emphasized. (KR)

  18. CMS computing model evolution

    NASA Astrophysics Data System (ADS)

    Grandi, C.; Bonacorsi, D.; Colling, D.; Fisk, I.; Girone, M.

    2014-06-01

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and more intelligent use of the networking.

  19. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Larrabee, C. E., Jr.; And Others

    1988-01-01

    Describes the use of computer spreadsheet programs in physical chemistry classrooms. Stresses the application of bar graphs to fundamental quantum and statistical mechanics. Lists the advantages of the use of spreadsheets and gives examples of possible uses. (ML)

  20. Goldstone (GDSCC) administrative computing

    NASA Technical Reports Server (NTRS)

    Martin, H.

    1981-01-01

    The GDSCC Data Processing Unit provides various administrative computing services for Goldstone. Those activities, including finance, manpower and station utilization, deep-space station scheduling, and engineering change order (ECO) control, are discussed.

  1. Biomarkers in Computational Toxicology

    EPA Science Inventory

    Biomarkers are a means to evaluate chemical exposure and/or the subsequent impacts on toxicity pathways that lead to adverse health outcomes. Computational toxicology can integrate biomarker data with knowledge of exposure, chemistry, biology, pharmacokinetics, toxicology, and e...

  2. Computational Aeroacoustics: An Overview

    NASA Technical Reports Server (NTRS)

    Tam, Christopher K. W.

    2003-01-01

    An overview of recent advances in computational aeroacoustics (CAA) is presented. CAA algorithms must be neither dispersive nor dissipative. They should propagate waves supported by the Euler equations with the correct group velocities. Computation domains are inevitably finite in size. To avoid the reflection of acoustic and other outgoing waves at the boundaries of the computation domain, it is required that special boundary conditions be imposed at the boundary region. These boundary conditions either absorb all the outgoing waves without reflection or allow the waves to exit smoothly. High-order schemes invariably support spurious short waves. These spurious waves tend to pollute the numerical solution. They must be selectively damped or filtered out. All these issues and relevant computation methods are briefly reviewed. Jet screech tones are known to have caused structural fatigue in military combat aircraft. Numerical simulation of the jet screech phenomenon is presented as an example of a successful application of CAA.
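The group-velocity requirement can be stated concretely in the standard dispersion-relation-preserving (DRP) framing; the particular formulation below is a textbook statement of that analysis, assumed here rather than quoted from the overview.

```latex
% Model problem: 1-D advection, u_t + c\,u_x = 0.
% A central finite-difference stencil of half-width m replaces the exact
% wavenumber k by a numerical wavenumber \bar{k}:
\bar{k}\,\Delta x \;=\; \sum_{j=1}^{m} a_j \,\sin(j k \Delta x),
% so the scheme propagates waves with numerical phase and group velocities
c_p \;=\; c\,\frac{\bar{k}}{k}, \qquad
c_g \;=\; c\,\frac{d\bar{k}}{dk}.
% A scheme is effectively non-dispersive over the band of k\,\Delta x where
% \bar{k}(k) \approx k (hence c_g \approx c); DRP-type CAA schemes choose
% the coefficients a_j to maximize that band rather than the formal order.
```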

  3. The Computer in Lexicography

    ERIC Educational Resources Information Center

    Bailey, Richard W.; Robinson, Jay L.

    1973-01-01

Expanded version of a paper presented to the section on Computer Research in Language and Literature of the Midwest Modern Language Association, October 24, 1969. Article is part of "Lexicography and Dialect Geography, Festgabe for Hans Kurath." (DD)

  4. Quantum information and computation

    SciTech Connect

    Bennett, C.H.

    1995-10-01

A new quantum theory of communication and computation is emerging, in which the stuff transmitted or processed is not classical information, but arbitrary superpositions of quantum states. © 1995 American Institute of Physics.

  5. Resurrecting the computer graveyard

    SciTech Connect

    McAdams, C.L.

    1995-02-01

What eventually happens to all the old monitors, circuit boards, and other peripheral computer equipment when they die or simply become obsolete? Disposal has been made difficult by a US EPA landfill ban on some circuit boards with high lead content, and on other computer parts that contain what some call dangerous amounts of toxic substances. With few other options, most companies simply store the obsolete equipment. Often referred to as "computer graveyards" or "dinosaur graveyards," these increasingly numerous office purgatories can seem like permanent fixtures in the modern workplace. Not to worry, according to a new and expanding branch of the recycling industry. While there are some companies cashing in on the refurbishment and resale of these old computers, a growing number are successfully recycling the component parts--selling the plastics, metal, and circuit boards--and doing it safely.

  6. Computer Center: Software Review.

    ERIC Educational Resources Information Center

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with color graphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  7. Quantum computational webs

    SciTech Connect

    Gross, D.; Eisert, J.

    2010-10-15

    We discuss the notion of quantum computational webs: These are quantum states universal for measurement-based computation, which can be built up from a collection of simple primitives. The primitive elements--reminiscent of building blocks in a construction kit--are (i) one-dimensional states (computational quantum wires) with the power to process one logical qubit and (ii) suitable couplings, which connect the wires to a computationally universal web. All elements are preparable by nearest-neighbor interactions in a single pass, of the kind accessible in a number of physical architectures. We provide a complete classification of qubit wires, a physically well-motivated class of universal resources that can be fully understood. Finally, we sketch possible realizations in superlattices and explore the power of coupling mechanisms based on Ising or exchange interactions.

  8. Computer Programs (Turbomachinery)

    NASA Technical Reports Server (NTRS)

    1978-01-01

NASA computer programs are extensively used in design of industrial equipment. Available from the Computer Software Management and Information Center (COSMIC) at the University of Georgia, these programs are employed as analysis tools in design, test and development processes, providing savings in time and money. For example, two NASA computer programs are used daily in the design of turbomachinery by Delaval Turbine Division, Trenton, New Jersey. The company uses the NASA spline interpolation routine for analysis of turbine blade vibration and the performance of compressors and condensers. A second program, the NASA print plot routine, analyzes turbine rotor response and produces graphs for project reports. The photos show examples of Delaval test operations in which the computer programs play a part. In the large photo below, a 24-inch turbine blade is undergoing test; in the smaller photo, a steam turbine rotor is being prepared for stress measurements under actual operating conditions; the "spaghetti" is wiring for test instrumentation.

  9. Computer Aided Braille Trainer

    PubMed Central

    Sibert, Thomas W.

    1984-01-01

    The problems involved in teaching visually impaired persons to Braille are numerous. Training while the individual is still sighted and using a computer to assist is one way of shortening the learning curve. Such a solution is presented here.

  10. [Computers in surgery].

    PubMed

    Grande, M; Torquati, A; Bellisario, A

    1990-01-01

The introduction of computers in medicine, particularly in surgery, has many implications for their optimal utilization. Possible applications, as well as the advantages and limits of such a recording system in a surgical ward, are examined. Emphasis is placed on the clinical data management model and on how the different structures of the system are related. Computer processing of these data provides valid material for clinical and surgical research as well as for statistical studies.

  11. Computation in electron microscopy.

    PubMed

    Kirkland, Earl J

    2016-01-01

    Some uses of the computer and computation in high-resolution transmission electron microscopy are reviewed. The theory of image calculation using Bloch wave and multislice methods with and without aberration correction is reviewed and some applications are discussed. The inverse problem of reconstructing the specimen structure from an experimentally measured electron microscope image is discussed. Some future directions of software development are given. PMID:26697863

  12. SLAC B Factory computing

    SciTech Connect

    Kunz, P.F.

    1992-02-01

    As part of the research and development program in preparation for a possible B Factory at SLAC, a group has been studying various aspects of HEP computing. In particular, the group is investigating the use of UNIX for all computing, from data acquisition, through analysis, and word processing. A summary of some of the results of this study will be given, along with some personal opinions on these topics.

  13. Recent computational chemistry

    SciTech Connect

    Onishi, Taku

    2015-12-31

Thanks to earlier developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules and design functional materials by computation. Because limits and open problems remain in theory, cooperation between theory and computation is becoming ever more important for clarifying unknown quantum mechanisms and discovering more efficient functional materials; it is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  14. Computer Games and Instruction

    ERIC Educational Resources Information Center

    Tobias, Sigmund, Ed.; Fletcher, J. D., Ed.

    2011-01-01

    There is intense interest in computer games. A total of 65 percent of all American households play computer games, and sales of such games increased 22.9 percent last year. The average amount of game playing time was found to be 13.2 hours per week. The popularity and market success of games is evident from both the increased earnings from games,…

  15. Computer Models of Proteins

    NASA Technical Reports Server (NTRS)

    2000-01-01

Dr. Marc Pusey (seated) and Dr. Craig Kundrot use computers to analyze x-ray maps and generate three-dimensional models of protein structures. With this information, scientists at Marshall Space Flight Center can learn how proteins are made and how they work. The computer screen depicts a protein structure as a ball-and-stick model. Other models depict the actual volume occupied by the atoms, or the ribbon-like structures that are crucial to a protein's function.

  16. Computer-assisted instruction

    NASA Technical Reports Server (NTRS)

    Atkinson, R. C.

    1974-01-01

    The results are presented of a project of research and development on strategies for optimizing the instructional process, and dissemination of information about the applications of such research to the instructional medium of computer-assisted instruction. Accomplishments reported include construction of the author language INSTRUCT, construction of a practical CAI course in the area of computer science, and a number of investigations into the individualization of instruction, using the course as a vehicle.

  17. Partnership in Computational Science

    SciTech Connect

    Huray, Paul G.

    1999-02-24

This is the final report for the "Partnership in Computational Science" (PICS) award of $500,000 for the period January 1, 1993 through December 31, 1993. A copy of the proposal with its budget is attached as Appendix A. This report first describes the significance of the DOE award in building high-performance computing infrastructure in the Southeast, then describes the work accomplished under this grant and lists the publications resulting from it.

  18. Professionalism in Computer Forensics

    NASA Astrophysics Data System (ADS)

    Irons, Alastair D.; Konstadopoulou, Anastasia

The paper addresses the need to consider issues of professionalism in computer forensics, so that the discipline can develop and its credibility be assured from the differing perspectives of practitioners, the criminal justice system, and the public. Professionalism in computer forensics must be examined and developed in order to promote the discipline and maintain its credibility.

  19. Reconfigurable environmentally adaptive computing

    NASA Technical Reports Server (NTRS)

    Coxe, Robin L. (Inventor); Galica, Gary E. (Inventor)

    2008-01-01

    Described are methods and apparatus, including computer program products, for reconfigurable environmentally adaptive computing technology. An environmental signal representative of an external environmental condition is received. A processing configuration is automatically selected, based on the environmental signal, from a plurality of processing configurations. A reconfigurable processing element is reconfigured to operate according to the selected processing configuration. In some examples, the environmental condition is detected and the environmental signal is generated based on the detected condition.

  20. Computers and clinical arrhythmias.

    PubMed

    Knoebel, S B; Lovelace, D E

    1983-02-01

    Cardiac arrhythmias are ubiquitous in normal and abnormal hearts. These disorders may be life-threatening or benign, symptomatic or unrecognized. Arrhythmias may be the precursor of sudden death, a cause or effect of cardiac failure, a clinical reflection of acute or chronic disorders, or a manifestation of extracardiac conditions. Progress is being made toward unraveling the diagnostic and therapeutic problems involved in arrhythmogenesis. Many of the advances would not be possible, however, without the availability of computer technology. To preserve the proper balance and purposeful progression of computer usage, engineers and physicians have been exhorted not to work independently in this field. Both should learn some of the other's trade. The two disciplines need to come together to solve important problems with computers in cardiology. The intent of this article was to acquaint the practicing cardiologist with some of the extant and envisioned computer applications and some of the problems with both. We conclude that computer-based database management systems are necessary for sorting out the clinical factors of relevance for arrhythmogenesis, but computer database management systems are beset with problems that will require sophisticated solutions. The technology for detecting arrhythmias on routine electrocardiograms is quite good but human over-reading is still required, and the rationale for computer application in this setting is questionable. Systems for qualitative, continuous monitoring and review of extended time ECG recordings are adequate with proper noise rejection algorithms and editing capabilities. The systems are limited presently for clinical application to the recognition of ectopic rhythms and significant pauses. Attention should now be turned to the clinical goals for detection and quantification of arrhythmias. We should be asking the following questions: How quantitative do systems need to be? 
Are computers required for the detection of

  1. Research in computer science

    NASA Technical Reports Server (NTRS)

    Ortega, J. M.

    1985-01-01

    Synopses are given for NASA supported work in computer science at the University of Virginia. Some areas of research include: error seeding as a testing method; knowledge representation for engineering design; analysis of faults in a multi-version software experiment; implementation of a parallel programming environment; two computer graphics systems for visualization of pressure distribution and convective density particles; task decomposition for multiple robot arms; vectorized incomplete conjugate gradient; and iterative methods for solving linear equations on the Flex/32.

  2. The promise of analog computation

    NASA Astrophysics Data System (ADS)

    MacLennan, B. J.

    2014-10-01

Future computing paradigms and technologies will have to be more like the physical processes by which they are realized, and because these processes are primarily continuous, post-Moore's law computing will involve an increased use of analog computation. Traditionally, analog computers have computed ordinary differential equations in time, but analog field computation permits massively parallel temporal integration of partial differential equations. In principle many different physical media - not just electronics - can be exploited to implement the basic operations of analog computing, a small number of which are sufficient to approximate a wide variety of analog computations, thus providing a basis for universal analog computation and general-purpose analog computers. The contentious issue of the computational power of analog computers is best addressed on its own terms, rather than within the context of Church-Turing computation, which distorts the relevant questions and their answers.
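The integrator-based style of computation described in this abstract can be illustrated with a digital emulation of a classic analog setup: two integrators wired in a loop to solve a simple harmonic oscillator. The time step and wiring below are illustrative assumptions for the sketch, not details taken from the paper:

```python
import numpy as np

# Wire two "integrators" together as on a classic analog computer:
#   x' = v,  v' = -x   (simple harmonic oscillator)
# Each integrator continuously accumulates its input signal; here the
# continuous accumulation is emulated with a small discrete time step.
dt = 1e-4
t = np.arange(0.0, 2 * np.pi, dt)
x, v = 1.0, 0.0
xs = []
for _ in t:
    x += v * dt          # integrator 1: x = integral of v dt
    v += -x * dt         # integrator 2: v = integral of (-x) dt
    xs.append(x)

# After one full period the state returns (approximately) to x = 1.
assert abs(xs[-1] - 1.0) < 1e-2
```

On a real analog machine, the two updates happen simultaneously and continuously in the hardware; the discrete loop is only a convenient way to see the feedback structure.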

  3. Computing with neural synchrony.

    PubMed

    Brette, Romain

    2012-01-01

    Neurons communicate primarily with spikes, but most theories of neural computation are based on firing rates. Yet, many experimental observations suggest that the temporal coordination of spikes plays a role in sensory processing. Among potential spike-based codes, synchrony appears as a good candidate because neural firing and plasticity are sensitive to fine input correlations. However, it is unclear what role synchrony may play in neural computation, and what functional advantage it may provide. With a theoretical approach, I show that the computational interest of neural synchrony appears when neurons have heterogeneous properties. In this context, the relationship between stimuli and neural synchrony is captured by the concept of synchrony receptive field, the set of stimuli which induce synchronous responses in a group of neurons. In a heterogeneous neural population, it appears that synchrony patterns represent structure or sensory invariants in stimuli, which can then be detected by postsynaptic neurons. The required neural circuitry can spontaneously emerge with spike-timing-dependent plasticity. Using examples in different sensory modalities, I show that this allows simple neural circuits to extract relevant information from realistic sensory stimuli, for example to identify a fluctuating odor in the presence of distractors. This theory of synchrony-based computation shows that relative spike timing may indeed have computational relevance, and suggests new types of neural network models for sensory processing with appealing computational properties.
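A toy sketch of the synchrony idea in this abstract: neurons with heterogeneous thresholds respond asynchronously to a slowly varying stimulus but synchronously to a sharp common event, which a downstream coincidence detector could exploit. The threshold values and stimuli below are invented for illustration and are not taken from the paper:

```python
import numpy as np

def spike_times(stimulus, threshold, dt=1e-3):
    """Times at which the signal first crosses the neuron's threshold upward."""
    above = stimulus >= threshold
    crossings = np.flatnonzero(~above[:-1] & above[1:]) + 1
    return crossings * dt

t = np.linspace(0, 1, 1000)

# Two neurons with different thresholds listening to the same slow ramp
# cross their thresholds at different times (no synchrony) ...
ramp = t
times_a = spike_times(ramp, 0.3)
times_b = spike_times(ramp, 0.7)
assert abs(times_a[0] - times_b[0]) > 0.1

# ... but a sharp common event drives both across threshold together,
# which a downstream coincidence detector can pick out.
step = np.where(t > 0.5, 1.0, 0.0)
times_a = spike_times(step, 0.3)
times_b = spike_times(step, 0.7)
assert abs(times_a[0] - times_b[0]) < 0.01
```

In the paper's terms, the step stimulus lies in the "synchrony receptive field" of this heterogeneous pair, while the ramp does not.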

  4. Is thinking computable?

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1990-01-01

Strong artificial intelligence claims that conscious thought can arise in computers containing the right algorithms even though none of the programs or components of those computers understands what is going on. As proof, it asserts that brains are finite webs of neurons, each with a definite function governed by the laws of physics; this web has a set of equations that can be solved (or simulated) by a sufficiently powerful computer. Strong AI claims the Turing test as a criterion of success. A recent debate in Scientific American concludes that the Turing test is not sufficient, but leaves intact the underlying premise that thought is a computable process. The recent book by Roger Penrose, however, offers a sharp challenge, arguing that the laws of quantum physics may govern mental processes and that these laws may not be computable. In every area of mathematics and physics, Penrose finds evidence of nonalgorithmic human activity and concludes that mental processes are inherently more powerful than computational processes.

  5. Computing with Neural Synchrony

    PubMed Central

    Brette, Romain

    2012-01-01

    Neurons communicate primarily with spikes, but most theories of neural computation are based on firing rates. Yet, many experimental observations suggest that the temporal coordination of spikes plays a role in sensory processing. Among potential spike-based codes, synchrony appears as a good candidate because neural firing and plasticity are sensitive to fine input correlations. However, it is unclear what role synchrony may play in neural computation, and what functional advantage it may provide. With a theoretical approach, I show that the computational interest of neural synchrony appears when neurons have heterogeneous properties. In this context, the relationship between stimuli and neural synchrony is captured by the concept of synchrony receptive field, the set of stimuli which induce synchronous responses in a group of neurons. In a heterogeneous neural population, it appears that synchrony patterns represent structure or sensory invariants in stimuli, which can then be detected by postsynaptic neurons. The required neural circuitry can spontaneously emerge with spike-timing-dependent plasticity. Using examples in different sensory modalities, I show that this allows simple neural circuits to extract relevant information from realistic sensory stimuli, for example to identify a fluctuating odor in the presence of distractors. This theory of synchrony-based computation shows that relative spike timing may indeed have computational relevance, and suggests new types of neural network models for sensory processing with appealing computational properties. PMID:22719243

  6. Word To Compute By: Keeping Up with Personal Computing Terminology.

    ERIC Educational Resources Information Center

    Crawford, Walt

    1997-01-01

    Reviews current terminology related to personal computing, referring to a previous article written five years ago. Highlights include Macintosh processors; Pentium processors; computer memory; computer buses; video displays; storage devices; newer media; and miscellaneous terms and concepts. (LRW)

  7. Publishing Trends in Educational Computing.

    ERIC Educational Resources Information Center

    O'Hair, Marilyn; Johnson, D. LaMont

    1989-01-01

    Describes results of a survey of secondary school and college teachers that was conducted to determine subject matter that should be included in educational computing journals. Areas of interest included computer applications; artificial intelligence; computer-aided instruction; computer literacy; computer-managed instruction; databases; distance…

  8. Quantum computing with trapped ions

    SciTech Connect

    Hughes, R.J.

    1998-01-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  9. Computer Technology in Adult Education.

    ERIC Educational Resources Information Center

    Slider, Patty; Hodges, Kathy; Carter, Cea; White, Barbara

    This publication provides materials to help adult educators use computer technology in their teaching. Section 1, Computer Basics, contains activities and materials on these topics: increasing computer literacy, computer glossary, parts of a computer, keyboard, disk care, highlighting text, scrolling and wrap-around text, setting up text,…

  10. New Computing in Higher Education.

    ERIC Educational Resources Information Center

    Gilbert, Steven W.; Green, Kenneth C.

    1986-01-01

    With the advent of the computer revolution, major changes are underway in the ways campuses deal with computing and computers. One change is the gathering momentum of faculty members and administrators to become computer users. Another is the vast amount of individual and institutional effort invested in plans for integrating computing into the…

  11. Science Teaching and Computer Languages.

    ERIC Educational Resources Information Center

    Bork, Alfred M.

    Computer languages are analyzed and compared from the standpoint of the science teacher using computers in the classroom. Computers have three basic uses in teaching, to compute, to instruct, and to motivate; effective computer languages should be responsive to these three modes. Widely-used languages, including FORTRAN, ALGOL, PL/1, and APL, are…

  12. Computer-aided design and computer science technology

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  13. Computer automated design and computer automated manufacture.

    PubMed

    Brncick, M

    2000-08-01

    The introduction of computer aided design and computer aided manufacturing into the field of prosthetics and orthotics did not arrive without concern. Many prosthetists feared that the computer would provide other allied health practitioners who had little or no experience in prosthetics the ability to fit and manage amputees. Technicians in the field felt their jobs may be jeopardized by automated fabrication techniques. This has not turned out to be the case. Prosthetists who use CAD-CAM techniques are finding they have more time for patient care and clinical assessment. CAD-CAM is another tool for them to provide better care for the patients/clients they serve. One of the factors that deterred the acceptance of CAD-CAM techniques in its early stages was that of cost. It took a significant investment in software and hardware for the prosthetists to begin to use the new systems. This new technique was not reimbursed by insurance coverage. Practitioners did not have enough information about this new technique to make a sound decision on their investment of time and money. Ironically, it is the need to hold health care costs down that may prove to be the catalyst for the increased use of CAD-CAM in the field. Providing orthoses and prostheses to patients who require them is a very labor intensive process. Practitioners are looking for better, faster, and more economical ways in which to provide their services under the pressure of managed care. CAD-CAM may be the answer. The author foresees shape sensing departments in hospitals where patients would be sent to be digitized, similar to someone going for radiograph or ultrasound. Afterwards, an orthosis or prosthesis could be provided from a central fabrication facility at a remote site, most likely on the same day. Not long ago, highly skilled practitioners with extensive technical ability would custom make almost every orthosis. One now practices in an atmosphere where off-the-shelf orthoses are the standard. 
This

  14. Parallel Computing in SCALE

    SciTech Connect

    DeHart, Mark D; Williams, Mark L; Bowman, Stephen M

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  15. Making It in Computer Sales.

    ERIC Educational Resources Information Center

    Davidson, Robert L., III

    1987-01-01

    Discusses some of the possibilities for careers in computer sales. Describes some of the attributes of quality computer salespersons, as illustrated by interviews with two experts on computer sales. (TW)

  16. Experimental verification of quantum computation

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie; Fitzsimons, Joseph F.; Kashefi, Elham; Walther, Philip

    2013-11-01

    Quantum computers are expected to offer substantial speed-ups over their classical counterparts and to solve problems intractable for classical computers. Beyond such practical significance, the concept of quantum computation opens up fundamental questions, among them the issue of whether quantum computations can be certified by entities that are inherently unable to compute the results themselves. Here we present the first experimental verification of quantum computation. We show, in theory and experiment, how a verifier with minimal quantum resources can test a significantly more powerful quantum computer. The new verification protocol introduced here uses the framework of blind quantum computing and is independent of the experimental quantum-computation platform used. In our scheme, the verifier is required only to generate single qubits and transmit them to the quantum computer. We experimentally demonstrate this protocol using four photonic qubits and show how the verifier can test the computer's ability to perform quantum computation.

  17. Computer Use and Computer Anxiety in Older Korean Americans.

    PubMed

    Yoon, Hyunwoo; Jang, Yuri; Xie, Bo

    2016-09-01

Responding to the limited literature on computer use in ethnic minority older populations, the present study examined predictors of computer use and computer anxiety in older Korean Americans. Separate regression models were estimated for computer use and computer anxiety with the common sets of predictors: (a) demographic variables (age, gender, marital status, and education), (b) physical health indicators (chronic conditions, functional disability, and self-rated health), and (c) sociocultural factors (acculturation and attitudes toward aging). Approximately 60% of the participants were computer users, and they had significantly lower levels of computer anxiety than non-users. A higher likelihood of computer use and lower levels of computer anxiety were commonly observed among individuals with younger age, male gender, advanced education, more positive ratings of health, and higher levels of acculturation. In addition, positive attitudes toward aging were found to reduce computer anxiety. Findings provide implications for developing computer training and education programs for the target population. PMID:25698717

  18. The ATLAS computing model & distributed computing evolution

    NASA Astrophysics Data System (ADS)

    Jones, Roger W. L.; Atlas Collaboration

    2012-12-01

Despite only a brief availability of beam-related data, the typical usage patterns and operational requirements of the ATLAS computing model have been exercised, and the model as originally constructed remains remarkably unchanged. Resource requirements have been revised, and cosmic ray running has exercised much of the model in both duration and volume. The operational model has been adapted in several ways to increase performance and meet the as-delivered functionality of the available middleware. There are also changes reflecting the emerging roles of the different data formats. The model continues to evolve with a heightened focus on end-user performance; the key tools developed in the operational system are outlined, with an emphasis on those under recent development.

  19. Computational chemistry on commodity-type computers.

    PubMed

    Nicklaus, M C; Williams, R W; Bienfait, B; Billings, E S; Hodoscek, M

    1998-01-01

    A number of inexpensive computers were benchmarked with the ab initio program Gaussian 94, using both small standard test jobs and larger density functional (DFT) calculations. Several varieties of Pentium (x86) and Alpha CPU based systems were tested. Most of them were running under the open source code operating system Linux. They were compared with several workstations and supercomputers. The most powerful of today's commodity-type processors surpassed current supercomputers in speed. The choice of compilers and compilation options was often found to have a larger influence on job CPU times than details of the hardware. Especially on the x86 type machines, the jobs always ran faster the less memory (RAM) they were given. The fastest machine on a per-CPU basis was an Alpha/Linux system. For the DFT calculation, it was close to twice as fast as a Cray J90 supercomputer. PMID:9770303

  20. Highly parallel computer architecture for robotic computation

    NASA Technical Reports Server (NTRS)

Fijany, Amir (Inventor); Bejczy, Antal K. (Inventor)

    1991-01-01

    In a computer having a large number of single instruction multiple data (SIMD) processors, each of the SIMD processors has two sets of three individual processor elements controlled by a master control unit and interconnected among a plurality of register file units where data is stored. The register files input and output data in synchronism with a minor cycle clock under control of two slave control units controlling the register file units connected to respective ones of the two sets of processor elements. Depending upon which ones of the register file units are enabled to store or transmit data during a particular minor clock cycle, the processor elements within an SIMD processor are connected in rings or in pipeline arrays, and may exchange data with the internal bus or with neighboring SIMD processors through interface units controlled by respective ones of the two slave control units.

  1. Computational Physics' Greatest Hits

    NASA Astrophysics Data System (ADS)

    Bug, Amy

    2011-03-01

The digital computer has worked its way so effectively into our profession that now, roughly 65 years after its invention, it is virtually impossible to find a field of experimental or theoretical physics unaided by computational innovation. It is tough to think of another device about which one can make that claim. In the session ``What is computational physics?'' speakers will distinguish computation within the field of computational physics from this ubiquitous importance across all subfields of physics. This talk will recap the invited session ``Great Advances...Past, Present and Future'' in which five dramatic areas of discovery (five of our ``greatest hits'') are chronicled: The physics of many-boson systems via Path Integral Monte Carlo, the thermodynamic behavior of a huge number of diverse systems via Monte Carlo Methods, the discovery of new pharmaceutical agents via molecular dynamics, predictive simulations of global climate change via detailed, cross-disciplinary earth system models, and an understanding of the formation of the first structures in our universe via galaxy formation simulations. The talk will also identify ``greatest hits'' in our field from the teaching and research perspectives of other members of DCOMP, including its Executive Committee.

  2. Verifiable Quantum Computing

    NASA Astrophysics Data System (ADS)

    Kashefi, Elham

    Over the next five to ten years we will see a state of flux as quantum devices become part of the mainstream computing landscape. However adopting and applying such a highly variable and novel technology is both costly and risky as this quantum approach has an acute verification and validation problem: On the one hand, since classical computations cannot scale up to the computational power of quantum mechanics, verifying the correctness of a quantum-mediated computation is challenging; on the other hand, the underlying quantum structure resists classical certification analysis. Our grand aim is to settle these key milestones to make the translation from theory to practice possible. Currently the most efficient ways to verify a quantum computation is to employ cryptographic methods. I will present the current state of the art of various existing protocols where generally there exists a trade-off between the practicality of the scheme versus their generality, trust assumptions and security level. EK gratefully acknowledges funding through EPSRC Grants EP/N003829/1 and EP/M013243/1.

  3. Computer Assisted Surgery

    NASA Astrophysics Data System (ADS)

    Arámbula Cosío, F.; Padilla Castañeda, M. A.

    2003-09-01

Computer assisted surgery (CAS) systems can provide different levels of assistance to a surgeon during training and execution of a surgical procedure. This is done through the integration of measurements taken on medical images, computer graphics techniques, and positioning or tracking mechanisms which accurately locate the surgical instruments inside the operating site. According to the type of assistance that is provided to the surgeon, CAS systems can be classified as image guided surgery systems, assistant robots for surgery, and training simulators for surgery. This work presents the main characteristics of CAS systems and describes the development of a computer simulator for training on Transurethral Resection of the Prostate (TURP), based on a computer model of the prostate gland which is able to simulate, in real time, deformations and resections of tissue. The model is constructed as a 3D mesh with physical properties such as elasticity. We describe the main characteristics of the prostate model and its performance. The prostate model will also be used in the development of a CAS system designed to assist the surgeon during a real TURP procedure. The system will provide 3D views of the shape of the patient's prostate and the position of the surgical instrument during the operation. The development of new computer graphics models able to simulate, in real time, the mechanical behavior of an organ during a surgical procedure can significantly improve the training and execution of other minimally invasive surgical procedures such as laparoscopic gall bladder surgery.

  4. Optical computer motherboards

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz P.; Xu, Guoda; Bartha, John M.; Gruntman, Michael A.

    1997-09-01

    In this paper, we investigate the application of precision plastic optics into a communication/computer sub-system, such as a hybrid computer motherboard. We believe that using optical waveguides for next-generation computer motherboards can provide a high performance alternative for present multi-layer printed circuit motherboards. In response to this demand, we suggest our novel concept of a hybrid motherboard based on an internal-fiber-coupling (IFC) wavelength-division-multiplexing (WDM) optical backplane. The IFC/WDM backplane provides dedicated Tx/Rx connections, and applies low-cost, high-performance components, including CD LDs, GRIN plastic fibers, molding housing, and nonimaging optics connectors. Preliminary motherboard parameters are: speed 100 MHz/100 m, or 1 GHz/10 m; fiber loss approximately 0.01 dB/m; almost zero fan-out/fan-in optical power loss, and eight standard wavelength channels. The proposed hybrid computer motherboard, based on innovative optical backplane technology, should solve low-speed, low-parallelism bottlenecks in present electric computer motherboards.

  5. Extensible Computational Chemistry Environment

    2012-08-09

ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of researchers being able to effectively utilize complex computational chemistry codes and massively parallel high performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-its-kind end-to-end problem solving environment for all phases of computational chemistry research: setting up calculations with sophisticated GUI and direct manipulation visualization tools, submitting and monitoring calculations on remote high performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis including creating publication quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.

  6. Computational Systems Biology

    SciTech Connect

    McDermott, Jason E.; Samudrala, Ram; Bumgarner, Roger E.; Montogomery, Kristina; Ireton, Renee

    2009-05-01

Computational systems biology is the term that we use to describe computational methods to identify, infer, model, and store relationships between the molecules, pathways, and cells (“systems”) involved in a living organism. Based on this definition, the field of computational systems biology has been in existence for some time. However, the recent confluence of high throughput methodology for biological data gathering, genome-scale sequencing and computational processing power has driven a reinvention and expansion of this field. The expansions include not only modeling of small metabolic (Ishii 2004; Ekins 2006; Lafaye 2005) and signaling systems (Stevenson-Paulik 2006; Lafaye 2005) but also modeling of the relationships between biological components in very large systems, including whole cells and organisms (Ideker 2001; Pe'er 2001; Pilpel 2001; Ideker 2002; Kelley 2003; Shannon 2003; Ideker 2004; Schadt 2003; Schadt 2006; McDermott 2002; McDermott 2005). Generally these models provide a general overview of one or more aspects of these systems and leave the determination of details to experimentalists focused on smaller subsystems. The promise of such approaches is that they will elucidate patterns, relationships and general features that are not evident from examining specific components or subsystems. These predictions are either interesting in and of themselves (for example, the identification of an evolutionary pattern), or are interesting and valuable to researchers working on a particular problem (for example, highlighting a previously unknown functional pathway). Two events have occurred to bring the field of computational systems biology to the forefront. One is the advent of high throughput methods that have generated large amounts of information about particular systems in the form of genetic studies, gene expression analyses (both protein and

  7. Optoelectronic Reservoir Computing

    PubMed Central

    Paquot, Y.; Duport, F.; Smerieri, A.; Dambre, J.; Schrauwen, B.; Haelterman, M.; Massar, S.

    2012-01-01

Reservoir computing is a recently introduced, highly efficient bio-inspired approach for processing time-dependent data. The basic scheme of reservoir computing consists of a nonlinear recurrent dynamical system coupled to a single input layer and a single output layer. Within these constraints many implementations are possible. Here we report an optoelectronic implementation of reservoir computing based on a recently proposed architecture consisting of a single nonlinear node and a delay line. Our implementation is sufficiently fast for real-time information processing. We illustrate its performance on tasks of practical importance such as nonlinear channel equalization and speech recognition, and obtain results comparable to state-of-the-art digital implementations. PMID:22371825
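The delay-line scheme described in this abstract can be sketched in software: a single nonlinear node whose state circulates through a ring of "virtual" nodes, with a simple ridge-regression readout. This is a minimal illustrative sketch, not the paper's optoelectronic implementation; all parameter values and the memory task are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50              # virtual nodes in the delay line (illustrative)
feedback = 0.5      # feedback strength (illustrative)
scale = 1.0         # input scaling (illustrative)

def run_reservoir(u):
    """Drive the reservoir with input sequence u; return the state matrix."""
    mask = rng.uniform(-1, 1, N)       # fixed random input mask
    x = np.zeros(N)
    states = np.empty((len(u), N))
    for t, ut in enumerate(u):
        # each virtual node mixes the masked input with the delayed state
        x = np.tanh(scale * ut * mask + feedback * np.roll(x, 1))
        states[t] = x
    return states

# Train a linear (ridge) readout on a simple memory task:
# reproduce the input delayed by 3 steps.
u = rng.uniform(-1, 1, 500)
target = np.roll(u, 3)
S = run_reservoir(u)
ridge = 1e-6
W = np.linalg.solve(S.T @ S + ridge * np.eye(N), S.T @ target)
pred = S @ W
```

Only the readout weights `W` are trained; the recurrent part stays fixed, which is the defining simplification of reservoir computing.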

  8. Constructing computer virus phylogenies

    SciTech Connect

    Goldberg, L.A.; Goldberg, P.W.; Phillips, C.A.; Sorkin, G.B.

    1996-03-01

There has been much recent algorithmic work on the problem of reconstructing the evolutionary history of biological species. Computer virus specialists are interested in finding the evolutionary history of computer viruses--a virus is often written using code fragments from one or more other viruses, which are its immediate ancestors. A phylogeny for a collection of computer viruses is a directed acyclic graph whose nodes are the viruses and whose edges map ancestors to descendants and satisfy the property that each code fragment is "invented" only once. To provide a simple explanation for the data, we consider the problem of constructing such a phylogeny with a minimal number of edges. In general, this optimization problem cannot be solved in quasi-polynomial time unless NQP=QP; we present positive and negative results for associated approximation problems. When tree solutions exist, they can be constructed and randomly sampled in polynomial time.
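The phylogeny model in this abstract can be illustrated with a toy sketch (this is not the paper's minimal-edge algorithm, which the abstract notes is hard in general): each virus is a set of code fragments, we assume a known discovery order, credit each fragment to the earliest virus containing it ("invented only once"), and draw an ancestor edge whenever a later virus reuses that fragment.

```python
def build_phylogeny(viruses):
    """viruses: dict name -> set of code fragments, in discovery order.
    Returns a set of (ancestor, descendant) edges forming a DAG."""
    inventor = {}   # fragment -> first virus that contains it
    edges = set()
    for name, frags in viruses.items():
        for f in frags:
            if f not in inventor:
                inventor[f] = name              # f is "invented" here
            elif inventor[f] != name:
                edges.add((inventor[f], name))  # reuse implies ancestry
    return edges

# Hypothetical example: B reuses A's boot patch, C reuses B's payload.
viruses = {
    "A": {"boot_patch", "payload1"},
    "B": {"boot_patch", "payload2"},
    "C": {"payload2", "stealth"},
}
print(build_phylogeny(viruses))   # two edges: A -> B and B -> C
```

Because edges always point from an earlier virus to a later one, the result is acyclic by construction; finding the *minimum* such edge set is the hard optimization problem the abstract studies.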

  9. Computer integrated laboratory testing

    NASA Technical Reports Server (NTRS)

    Dahl, Charles C.

    1992-01-01

The objective is the integration of computers into the Engineering Materials Science Laboratory course, where existing test equipment is not computerized. The first lab procedure is to demonstrate and produce a material phase change curve. The second procedure is a demonstration of the modulus of elasticity and related stress-strain curve, plastic performance, maximum and failure strength. The process of recording data by sensors that are connected to a data logger which adds a time base, and the data logger in turn connected to a computer, places the materials labs into a computer integrated mode with minimum expense and maximum flexibility. The sensor signals are input into a spreadsheet for tabular records, curve generation, and graph printing.
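Post-processing of the logged stress-strain data described here could look like the following sketch: estimate the modulus of elasticity as the least-squares slope of the initial linear-elastic region, and read off the maximum strength. The numbers and the cutoff for the elastic region are illustrative assumptions, not data from the course.

```python
import numpy as np

# Hypothetical stress-strain record from the data logger (illustrative).
strain = np.array([0.000, 0.001, 0.002, 0.003, 0.004, 0.006, 0.010])
stress = np.array([0.0,   200.0, 400.0, 600.0, 790.0, 900.0, 950.0])  # MPa

# Assume the linear-elastic region ends at 0.3% strain.
elastic = strain <= 0.003
E = np.polyfit(strain[elastic], stress[elastic], 1)[0]   # slope = modulus, MPa

print(f"E = {E/1000:.0f} GPa, max strength = {stress.max():.0f} MPa")
```

The same slope-of-the-linear-region calculation is what a spreadsheet trendline over the elastic points would produce.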

  10. Computer networking at FERMILAB

    SciTech Connect

    Chartrand, G.

    1986-05-01

Management aspects of data communications facilities at Fermilab are described. Local area networks include Ferminet, a broadband CATV system which serves as a backbone-type carrier for high-speed data traffic between major network nodes; the Micom network, four Micom Micro-600/2A port selectors accessed via private twisted-pair cables, dedicated telephone circuits, or Micom 800/2 statistical multiplexors; and Decnet/Ethernet, several small local area networks which provide host-to-host communications for about 35 VAX computer systems. Wide area (off site) computer networking includes an off site Micom network which provides access to all of Fermilab's computer systems for 10 universities via leased lines or modem; Tymnet, used by many European and Japanese collaborations; Physnet, used for shared data processing task communications by large collaborations of universities; Bitnet, used for file transfer, electronic mail, and communications with CERN; and Mfenet, for access to supercomputers. Plans to participate in Hepnet are also addressed. 3 figs. (DWL)

  11. Human-computer interface

    DOEpatents

    Anderson, Thomas G.

    2004-12-21

    The present invention provides a method of human-computer interfacing. Force feedback allows intuitive navigation and control near a boundary between regions in a computer-represented space. For example, the method allows a user to interact with a virtual craft, then push through the windshield of the craft to interact with the virtual world surrounding the craft. As another example, the method allows a user to feel transitions between different control domains of a computer representation of a space. The method can provide for force feedback that increases as a user's locus of interaction moves near a boundary, then perceptibly changes (e.g., abruptly drops or changes direction) when the boundary is traversed.
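The boundary behavior described in this patent abstract can be sketched as a simple force profile (our own illustrative formula, not the patented method): the resisting force grows as the locus of interaction nears the boundary, then drops abruptly once the boundary is traversed.

```python
def boundary_force(x, boundary=1.0, ramp=0.2, k=10.0):
    """Force opposing motion toward `boundary` along one axis.
    `ramp` is the distance over which force builds; `k` is peak force.
    All parameter values are illustrative."""
    d = boundary - x
    if d <= 0:
        return 0.0                  # past the boundary: force released
    if d >= ramp:
        return 0.0                  # far from the boundary: free motion
    return k * (ramp - d) / ramp    # ramps up to k at the boundary

assert boundary_force(0.5) == 0.0    # well inside the region
assert boundary_force(1.05) == 0.0   # pushed through: force drops away
```

The abrupt drop at `d <= 0` is what gives the user the perceptible "push-through" transition between control domains.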

  12. Computational Analysis of Behavior.

    PubMed

    Egnor, S E Roian; Branson, Kristin

    2016-07-01

    In this review, we discuss the emerging field of computational behavioral analysis-the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.

  13. The future of computing

    NASA Astrophysics Data System (ADS)

    Simmons, Michelle

    2016-05-01

    Down-scaling has been the leading paradigm of the semiconductor industry since the invention of the first transistor in 1947. However miniaturization will soon reach the ultimate limit, set by the discreteness of matter, leading to intensified research in alternative approaches for creating logic devices. This talk will discuss the development of a radical new technology for creating atomic-scale devices which is opening a new frontier of research in electronics globally. We will introduce single atom transistors where we can measure both the charge and spin of individual dopants with unique capabilities in controlling the quantum world. To this end, we will discuss how we are now demonstrating atom by atom, the best way to build a quantum computer - a new type of computer that exploits the laws of physics at very small dimensions in order to provide an exponential speed up in computational processing power.

  14. Visual Computing Environment Workshop

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles (Compiler)

    1998-01-01

    The Visual Computing Environment (VCE) is a framework for intercomponent and multidisciplinary computational simulations. Many current engineering analysis codes simulate various aspects of aircraft engine operation. For example, existing computational fluid dynamics (CFD) codes can model the airflow through individual engine components such as the inlet, compressor, combustor, turbine, or nozzle. Currently, these codes are run in isolation, making intercomponent and complete system simulations very difficult to perform. In addition, management and utilization of these engineering codes for coupled component simulations is a complex, laborious task, requiring substantial experience and effort. To facilitate multicomponent aircraft engine analysis, the CFD Research Corporation (CFDRC) is developing the VCE system. This system, which is part of NASA's Numerical Propulsion Simulation System (NPSS) program, can couple various engineering disciplines, such as CFD, structural analysis, and thermal analysis.

  15. Stopping computer crimes

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1989-01-01

    Two new books about intrusions and computer viruses remind us that attacks against our computers on networks are the actions of human beings. Cliff Stoll's book about the hacker who spent a year, beginning in Aug. 1986, attempting to use the Lawrence Berkeley Computer as a stepping-stone for access to military secrets is a spy thriller that illustrates the weaknesses of our password systems and the difficulties in compiling evidence against a hacker engaged in espionage. Pamela Kane's book about viruses that attack IBM PC's shows that viruses are the modern version of the old problem of a Trojan horse attack. It discusses the most famous viruses and their countermeasures, and it comes with a floppy disk of utility programs that will disinfect your PC and thwart future attack.

  16. Optical quantum computing.

    PubMed

    O'Brien, Jeremy L

    2007-12-01

    In 2001, all-optical quantum computing became feasible with the discovery that scalable quantum computing is possible using only single-photon sources, linear optical elements, and single-photon detectors. Although it was in principle scalable, the massive resource overhead made the scheme practically daunting. However, several simplifications were followed by proof-of-principle demonstrations, and recent approaches based on cluster states or error encoding have dramatically reduced this worrying resource overhead, making an all-optical architecture a serious contender for the ultimate goal of a large-scale quantum computer. Key challenges will be the realization of high-efficiency sources of indistinguishable single photons, low-loss, scalable optical circuits, high-efficiency single-photon detectors, and low-loss interfacing of these components.

  18. Auto covariance computer

    NASA Technical Reports Server (NTRS)

    Hepner, T. E.; Meyers, J. F. (Inventor)

    1985-01-01

    A laser velocimeter covariance processor which calculates the auto covariance and cross covariance functions for a turbulent flow field based on Poisson sampled measurements in time from a laser velocimeter is described. The device will process a block of data that is up to 4096 data points in length and return a 512 point covariance function with 48-bit resolution along with a 512 point histogram of the interarrival times which is used to normalize the covariance function. The device is designed to interface and be controlled by a minicomputer from which the data is received and the results returned. A typical 4096 point computation takes approximately 1.5 seconds to receive the data, compute the covariance function, and return the results to the computer.
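The normalization-by-histogram idea in this abstract can be sketched in software as a slotted autocovariance (an illustrative reconstruction, not the hardware algorithm): lag products of the Poisson-sampled velocities are accumulated into discrete lag bins, and a histogram of pair counts per bin, analogous to the device's interarrival-time histogram, normalizes the result.

```python
import numpy as np

def slotted_autocovariance(t, u, dt, nlags):
    """t: sorted Poisson sample times; u: velocity samples;
    dt: lag-slot width; returns nlags normalized covariance values."""
    u = u - u.mean()
    acc = np.zeros(nlags)
    counts = np.zeros(nlags)
    for i in range(len(t)):
        for j in range(i, len(t)):
            lag = int((t[j] - t[i]) / dt)
            if lag >= nlags:
                break               # times are sorted; later j only larger
            acc[lag] += u[i] * u[j]
            counts[lag] += 1
    return acc / np.maximum(counts, 1)   # normalize by the pair histogram

# Synthetic Poisson-sampled record (illustrative).
rng = np.random.default_rng(1)
t = np.cumsum(rng.exponential(0.001, 1000))   # random arrival times
u = rng.normal(0.0, 1.0, 1000)                # white-noise "velocities"
R = slotted_autocovariance(t, u, dt=0.002, nlags=32)
```

Dividing by the per-slot pair count is essential because Poisson sampling puts unequal numbers of sample pairs into each lag bin.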

  19. Hierarchical Approximate Bayesian Computation

    PubMed Central

    Turner, Brandon M.; Van Zandt, Trisha

    2013-01-01

    Approximate Bayesian computation (ABC) is a powerful technique for estimating the posterior distribution of a model’s parameters. It is especially important when the model to be fit has no explicit likelihood function, which happens for computational (or simulation-based) models such as those that are popular in cognitive neuroscience and other areas in psychology. However, ABC is usually applied only to models with few parameters. Extending ABC to hierarchical models has been difficult because high-dimensional hierarchical models add computational complexity that conventional ABC cannot accommodate. In this paper we summarize some current approaches for performing hierarchical ABC and introduce a new algorithm called Gibbs ABC. This new algorithm incorporates well-known Bayesian techniques to improve the accuracy and efficiency of the ABC approach for estimation of hierarchical models. We then use the Gibbs ABC algorithm to estimate the parameters of two models of signal detection, one with and one without a tractable likelihood function. PMID:24297436
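The core ABC idea underlying the hierarchical methods discussed above can be shown with a minimal rejection-ABC sketch (this is the basic technique, not the paper's Gibbs ABC algorithm; the model, prior, and tolerance are illustrative assumptions): accept a proposed parameter whenever data simulated under it lands close enough to the observed summary statistic.

```python
import numpy as np

rng = np.random.default_rng(2)
observed = rng.normal(3.0, 1.0, 100)      # data with unknown mean mu = 3
obs_stat = observed.mean()                # summary statistic

def abc_posterior(n_draws=20000, eps=0.05):
    """Rejection ABC: keep prior draws whose simulated summary is within eps."""
    mu = rng.uniform(0.0, 6.0, n_draws)             # prior on mu
    sims = rng.normal(mu, 1.0, (100, n_draws))      # 100 simulated obs per draw
    keep = np.abs(sims.mean(axis=0) - obs_stat) < eps
    return mu[keep]

post = abc_posterior()
print(post.mean())   # should sit near the true mean of 3
```

No likelihood is ever evaluated; only the ability to simulate from the model is required, which is exactly why ABC suits simulation-based models without tractable likelihoods.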

  20. Computers, Nanotechnology and Mind

    NASA Astrophysics Data System (ADS)

    Ekdahl, Bertil

    2008-10-01

In 1958, two years after the Dartmouth conference where the term artificial intelligence was coined, Herbert Simon and Allen Newell asserted the existence of "machines that think, that learn and create." They further prophesied that the machines' capacity would increase and be on par with the human mind. Now, 50 years later, computers perform many more tasks than one could imagine in the 1950s, but virtually no computer can do more than could the first digital computer, developed by John von Neumann in the 1940s. Computers still follow algorithms; they do not create them. However, the development of nanotechnology seems to have given rise to new hopes. With nanotechnology, two things are supposed to happen. First, the small scale will make it possible to construct huge computer memories, which are supposed to be the precondition for building an artificial brain; second, nanotechnology will make it possible to scan the brain, which in turn will make reverse engineering possible: the mind will be decoded by studying the brain. The consequence of such a belief is that the brain is no more than a calculator, i.e., all that the mind can do is in principle the result of arithmetical operations. Computers are equivalent to formal systems, which in turn answered Hilbert's idea that proofs should contain ideal statements to which operations cannot be applied in a contentual way. The advocates of artificial intelligence would place content in a machine that is developed not only to be free of content but that also cannot contain content. In this paper I argue that the hope for artificial intelligence is in vain.

  1. Computer Component Tester

    NASA Technical Reports Server (NTRS)

    1984-01-01

Carlos Horvath of the Burroughs Corporation, inspired by information published in NASA Tech Briefs, developed the AC/DC tester which checks out ECL (Emitter Coupled Logic) devices and their functionality within the computer. Each ECL device has a specific task in the computer's operation; the tester determines whether the device is performing that function properly. Horvath's invention allows rapid manual checking without the extensive programming required by other test methods; thus the ECL tester makes it easier to find out what is malfunctioning, and does the job faster.

  2. Adventures in Computational Grids

    NASA Technical Reports Server (NTRS)

    Walatka, Pamela P.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    Sometimes one supercomputer is not enough. Or your local supercomputers are busy, or not configured for your job. Or you don't have any supercomputers. You might be trying to simulate worldwide weather changes in real time, requiring more compute power than you could get from any one machine. Or you might be collecting microbiological samples on an island, and need to examine them with a special microscope located on the other side of the continent. These are the times when you need a computational grid.

  3. Computational peptide vaccinology.

    PubMed

    Söllner, Johannes

    2015-01-01

    Immunoinformatics focuses on modeling immune responses for better understanding of the immune system and in many cases for proposing agents able to modify the immune system. The most classical of these agents are vaccines derived from living organisms such as smallpox or polio. More modern vaccines comprise recombinant proteins, protein domains, and in some cases peptides. Generating a vaccine from peptides however requires technologies and concepts very different from classical vaccinology. Immunoinformatics therefore provides the computational tools to propose peptides suitable for formulation into vaccines. This chapter introduces the essential biological concepts affecting design and efficacy of peptide vaccines and discusses current methods and workflows applied to design successful peptide vaccines using computers.

  4. Computer Programs for Construction

    NASA Technical Reports Server (NTRS)

    1978-01-01

A NASA computer program aids Hudson Engineering Corporation, Houston, Texas in the design and construction of huge petrochemical processing plants like the one shown, which is located at Ju'aymah, Saudi Arabia. The pipes handling the flow of chemicals are subject to a variety of stresses, such as weight and variations in temperature. Hudson Engineering uses a COSMIC piping flexibility analysis computer program to analyze and ensure the necessary strength and flexibility of the pipes. This program helps the company realize substantial savings in reduced engineering time.

  5. Computer networking for scientists.

    PubMed

    Jennings, D M; Landweber, L H; Fuchs, I H; Farber, D J; Adrion, W R

    1986-02-28

    Scientific research has always relied on communication for gathering and providing access to data; for exchanging information; for holding discussions, meetings, and seminars; for collaborating with widely dispersed researchers; and for disseminating results. The pace and complexity of modern research, especially collaborations of researchers in different institutions, has dramatically increased scientists' communications needs. Scientists now need immediate access to data and information, to colleagues and collaborators, and to advanced computing and information services. Furthermore, to be really useful, communication facilities must be integrated with the scientist's normal day-to-day working environment. Scientists depend on computing and communications tools and are handicapped without them. PMID:17740290

  6. Balancing Computation and Experiment

    SciTech Connect

    Farber, Rob

    2007-04-01

    How do you know when the science being performed on a supercomputer reflects what actually occurs inside a test tube, or in real life? The answer can be stated simply enough: “run the model on the computer, get the result, and then perform an experiment to test the result”. The adage “easier said than done” truly applies – especially when the focus is on innovative science to benefit government and industry. The trick in this case is to find the right balance of science-driven computing integrated with experiment.

  7. Highly parallel computation

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.; Tichy, Walter F.

    1990-01-01

Highly parallel computing architectures are the only means to achieve the computation rates demanded by advanced scientific problems. A decade of research has demonstrated the feasibility of such machines, and current research focuses on which architectures are best suited to scientific computation. Both multiple instruction multiple datastream (MIMD) and single instruction multiple datastream (SIMD) designs have produced good results to date; neither shows a decisive advantage for most near-homogeneous scientific problems. For scientific problems with many dissimilar parts, more speculative architectures such as neural networks or data flow may be needed.

  8. Computational quantum chemistry website

    SciTech Connect

    1997-08-22

    This report contains the contents of a web page related to research on the development of quantum chemistry methods for computational thermochemistry and the application of quantum chemistry methods to problems in material chemistry and chemical sciences. Research programs highlighted include: Gaussian-2 theory; Density functional theory; Molecular sieve materials; Diamond thin-film growth from buckyball precursors; Electronic structure calculations on lithium polymer electrolytes; Long-distance electronic coupling in donor/acceptor molecules; and Computational studies of NOx reactions in radioactive waste storage.

  9. Exercises in Molecular Computing

    PubMed Central

    2014-01-01

    Conspectus The successes of electronic digital logic have transformed every aspect of human life over the last half-century. The word “computer” now signifies a ubiquitous electronic device, rather than a human occupation. Yet evidently humans, large assemblies of molecules, can compute, and it has been a thrilling challenge to develop smaller, simpler, synthetic assemblies of molecules that can do useful computation. When we say that molecules compute, what we usually mean is that such molecules respond to certain inputs, for example, the presence or absence of other molecules, in a precisely defined but potentially complex fashion. The simplest way for a chemist to think about computing molecules is as sensors that can integrate the presence or absence of multiple analytes into a change in a single reporting property. Here we review several forms of molecular computing developed in our laboratories. When we began our work, combinatorial approaches to using DNA for computing were used to search for solutions to constraint satisfaction problems. We chose to work instead on logic circuits, building bottom-up from units based on catalytic nucleic acids, focusing on DNA secondary structures in the design of individual circuit elements, and reserving the combinatorial opportunities of DNA for the representation of multiple signals propagating in a large circuit. Such circuit design directly corresponds to the intuition about sensors transforming the detection of analytes into reporting properties. While this approach was unusual at the time, it has been adopted since by other groups working on biomolecular computing with different nucleic acid chemistries. We created logic gates by modularly combining deoxyribozymes (DNA-based enzymes cleaving or combining other oligonucleotides), in the role of reporting elements, with stem–loops as input detection elements. For instance, a deoxyribozyme that normally exhibits an oligonucleotide substrate recognition region is

  10. Computation in gene networks

    NASA Astrophysics Data System (ADS)

    Ben-Hur, Asa; Siegelmann, Hava T.

    2004-03-01

    Genetic regulatory networks have the complex task of controlling all aspects of life. Using a model of gene expression based on piecewise linear differential equations, we show that this process can be considered a process of computation. This is demonstrated by showing that the model can simulate memory-bounded Turing machines. The simulation is robust with respect to perturbations of the system, an important property for both analog computers and biological systems. Robustness is achieved using a condition that ensures that the model equations, which are generally chaotic, follow predictable dynamics.
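
    The piecewise-linear (Glass-type) dynamics the abstract refers to can be illustrated with a minimal sketch. This is a toy two-gene system with hypothetical regulatory logic, not the authors' Turing-machine construction: each gene is synthesized at a rate given by a Boolean function of which genes are above threshold, and decays linearly.

```python
import numpy as np

# Toy Glass-type piecewise-linear gene network (illustrative only).
# Dynamics: dx/dt = synthesis(x) - x, where synthesis is a Boolean
# function of which genes are above the expression threshold.

THETA = 0.5  # hypothetical expression threshold

def step(x, dt=0.01):
    on = x > THETA                        # Boolean expression state
    # hypothetical logic: gene 0 is activated by gene 1,
    # gene 1 is repressed by gene 0 (a negative feedback loop)
    synth = np.array([1.0 if on[1] else 0.0,
                      0.0 if on[0] else 1.0])
    return x + dt * (synth - x)           # forward-Euler integration step

x = np.array([0.2, 0.8])
for _ in range(2000):
    x = step(x)
```

    Because each update is a convex combination of the current state and a synthesis vector in {0, 1}, trajectories that start in the unit box stay there, which is one simple sense in which the dynamics remain predictable.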

  11. Distributed computing systems programme

    SciTech Connect

    Duce, D.

    1984-01-01

    Publication of this volume coincides with the completion of the U.K. Science and Engineering Research Council's coordinated programme of research in Distributed Computing Systems (DCS) which ran from 1977 to 1984. The volume is based on presentations made at the programme's final conference. The first chapter explains the origins and history of DCS and gives an overview of the programme and its achievements. The remaining sixteen chapters review particular research themes (including imperative and declarative languages, and performance modelling), and describe particular research projects in technical areas including local area networks, design, development and analysis of concurrent systems, parallel algorithm design, functional programming and non-von Neumann computer architectures.

  12. Demonstration of blind quantum computing.

    PubMed

    Barz, Stefanie; Kashefi, Elham; Broadbent, Anne; Fitzsimons, Joseph F; Zeilinger, Anton; Walther, Philip

    2012-01-20

    Quantum computers, besides offering substantial computational speedups, are also expected to preserve the privacy of a computation. We present an experimental demonstration of blind quantum computing in which the input, computation, and output all remain unknown to the computer. We exploit the conceptual framework of measurement-based quantum computation that enables a client to delegate a computation to a quantum server. Various blind delegated computations, including one- and two-qubit gates and the Deutsch and Grover quantum algorithms, are demonstrated. The client only needs to be able to prepare and transmit individual photonic qubits. Our demonstration is crucial for unconditionally secure quantum cloud computing and might become a key ingredient for real-life applications, especially when considering the challenges of making powerful quantum computers widely available.

  13. Center for computer security: Computer Security Group conference. Summary

    SciTech Connect

    Not Available

    1982-06-01

    Topics covered include: computer security management; detection and prevention of computer misuse; certification and accreditation; protection of computer security, perspective from a program office; risk analysis; secure accreditation systems; data base security; implementing R and D; key notarization system; DOD computer security center; the Sandia experience; inspector general's report; and backup and contingency planning. (GHT)

  14. The Relationship between Computer Anxiety and Computer Self-Efficacy

    ERIC Educational Resources Information Center

    Simsek, Ali

    2011-01-01

    This study examined the relationship between computer anxiety and computer self-efficacy of students and teachers in elementary and secondary schools. The sample included a total of 845 subjects from two private school systems in Turkey. The Oetting's Computer Anxiety Scale was used to measure computer anxiety whereas the Murphy's Computer…

  15. Redirecting Under-Utilised Computer Laboratories into Cluster Computing Facilities

    ERIC Educational Resources Information Center

    Atkinson, John S.; Spenneman, Dirk H. R.; Cornforth, David

    2005-01-01

    Purpose: To provide administrators at an Australian university with data on the feasibility of redirecting under-utilised computer laboratories facilities into a distributed high performance computing facility. Design/methodology/approach: The individual log-in records for each computer located in the computer laboratories at the university were…

  16. An introduction to computer viruses

    SciTech Connect

    Brown, D.R.

    1992-03-01

    This report on computer viruses is based upon a thesis written for the Master of Science degree in Computer Science from the University of Tennessee in December 1989 by David R. Brown. This thesis is entitled An Analysis of Computer Virus Construction, Proliferation, and Control and is available through the University of Tennessee Library. This paper contains an overview of the computer virus arena that can help the reader to evaluate the threat that computer viruses pose. The extent of this threat can only be determined by evaluating many different factors. These factors include the relative ease with which a computer virus can be written, the motivation involved in writing a computer virus, the damage and overhead incurred by infected systems, and the legal implications of computer viruses, among others. Based upon the research, the development of a computer virus seems to require more persistence than technical expertise. This is a frightening proclamation to the computing community. The education of computer professionals to the dangers that viruses pose to the welfare of the computing industry as a whole is stressed as a means of inhibiting the current proliferation of computer virus programs. Recommendations are made to assist computer users in preventing infection by computer viruses. These recommendations support solid general computer security practices as a means of combating computer viruses.

  17. Synchronizing compute node time bases in a parallel computer

    DOEpatents

    Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

    2015-01-27

    Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.
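
    The claimed sequence can be mimicked in a toy shared-memory simulation (the real system uses a tree network and hardware wakeup units on compute nodes; the latency values below are invented): after all nodes enter the global barrier, the root sends a pulse, and on waking each node sets its time base to its precomputed root-to-node latency.

```python
import threading

# Toy simulation of the patented sequence: barrier, root pulse, wakeup,
# then time base := precomputed root-to-node transmission latency.

latency = {0: 0, 1: 3, 2: 5, 3: 6}         # hypothetical tree-path latencies
time_base = {}
barrier = threading.Barrier(len(latency))  # stands in for local+global barriers
pulse = threading.Event()                  # stands in for the pulse signal

def node(rank):
    barrier.wait()                       # enter the global barrier operation
    if rank == 0:
        pulse.set()                      # root broadcasts the pulse
    pulse.wait()                         # wakeup unit wakes the pulse waiter
    time_base[rank] = latency[rank]      # time base = latency from the root

threads = [threading.Thread(target=node, args=(r,)) for r in latency]
for t in threads:
    t.start()
for t in threads:
    t.join()
```
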

  18. Synchronizing compute node time bases in a parallel computer

    DOEpatents

    Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

    2014-12-30

    Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.

  19. EDITORIAL: Computational materials science Computational materials science

    NASA Astrophysics Data System (ADS)

    Kahl, Gerhard; Kresse, Georg

    2011-10-01

    Special issue in honour of Jürgen Hafner
    On 30 September 2010, Jürgen Hafner, one of the most prominent and influential members within the solid state community, retired. His remarkably broad scientific oeuvre has made him one of the founding fathers of modern computational materials science: more than 600 scientific publications, numerous contributions to books, and a highly cited monograph, which has become a standard reference in the theory of metals, witness not only the remarkable productivity of Jürgen Hafner but also his impact in theoretical solid state physics. In an effort to duly acknowledge Jürgen Hafner's lasting impact in this field, a Festsymposium was held on 27-29 September 2010 at the Universität Wien. The organizers of this symposium (and authors of this editorial) are proud to say that a large number of highly renowned scientists in theoretical condensed matter theory—co-workers, friends and students—accepted the invitation to this celebration of Hafner's jubilee. Some of these speakers also followed our invitation to submit their contribution to this Festschrift, published in Journal of Physics: Condensed Matter, a journal which Jürgen Hafner served in 2000-2003 and 2003-2006 as a member of the Advisory Editorial Board and member of the Executive Board, respectively. In the subsequent article, Volker Heine, friend and co-worker of Jürgen Hafner over many decades, gives an account of Hafner's impact in the field of theoretical condensed matter physics.
    Computational materials science contents
    Theoretical study of structural, mechanical and spectroscopic properties of boehmite (γ-AlOOH) D Tunega, H Pašalić, M H Gerzabek and H Lischka
    Ethylene epoxidation catalyzed by chlorine-promoted silver oxide M O Ozbek, I Onal and R A Van Santen
    First-principles study of Cu2ZnSnS4 and the related band offsets for photovoltaic applications A Nagoya, R Asahi and G Kresse
    Renormalization group study of random quantum magnets István A Kovács and

  20. "Starry Night" on Computer.

    ERIC Educational Resources Information Center

    Freifeld, Susan

    1998-01-01

    Discusses an exploration of depth in landscape painting using Vincent van Gogh's "Starry Night" as an example. Used computer drawing software for children to allow students to create their own interpretations of "Starry Night" while exploring means of portraying depth in two-dimensional art. (DSK)

  1. Computing Logarithms by Hand

    ERIC Educational Resources Information Center

    Reed, Cameron

    2016-01-01

    How can old-fashioned tables of logarithms be computed without technology? Today, of course, no practicing mathematician, scientist, or engineer would actually use logarithms to carry out a calculation, let alone worry about deriving them from scratch. But high school students may be curious about the process. This article develops a…
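
    One pre-technology route to such tables uses nothing but repeated square roots, in the spirit of Briggs: for large n, x**(1/2**n) is approximately 1 + ln(x)/2**n, so 2**n * (x**(1/2**n) - 1) approximates ln(x). A sketch (the article's own derivation may differ):

```python
import math

# Approximate logarithms using only square roots (Briggs-style sketch).

def ln_by_roots(x, n=20):
    y = x
    for _ in range(n):
        y = math.sqrt(y)                 # y = x ** (1 / 2**n) after the loop
    return (y - 1.0) * (2 ** n)          # 2**n * (x**(1/2**n) - 1) ~ ln(x)

def log10_by_roots(x, n=20):
    return ln_by_roots(x, n) / ln_by_roots(10.0, n)   # change of base
```

    For example, log10_by_roots(2.0) agrees with log10(2) = 0.30103 to about five decimal places.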

  2. Low-Carbon Computing

    ERIC Educational Resources Information Center

    Hignite, Karla

    2009-01-01

    Green information technology (IT) is grabbing more mainstream headlines--and for good reason. Computing, data processing, and electronic file storage collectively account for a significant and growing share of energy consumption in the business world and on higher education campuses. With greater scrutiny of all activities that contribute to an…

  3. Introduction to Computer Applications.

    ERIC Educational Resources Information Center

    Johnson, Robin; McKnight, Terri; Tackett, Beverly

    This document is designed for high school teachers to use in teaching a course that introduces students to computing through hands-on experience with databases, spreadsheets, desktop publishing, and word processing. The document begins with a rationale, brief course description, list of course objectives, and list of 10 innovative teaching…

  4. American History. Computer Programs.

    ERIC Educational Resources Information Center

    Lengel, James G.

    1983-01-01

    THE FOLLOWING IS THE FULL TEXT OF THIS DOCUMENT: Seven interactive computer programs are available to help with the study of American History. They cover the period of the 17th century up through the present day, and involve a variety of approaches to instruction. These programs were conceived and programmed by Jim Lengel, a former state social…

  5. Accounting & Computing Curriculum Guide.

    ERIC Educational Resources Information Center

    Avani, Nathan T.; And Others

    This curriculum guide consists of materials for use in teaching a competency-based accounting and computing course that is designed to prepare students for employability in the following occupational areas: inventory control clerk, invoice clerk, payroll clerk, traffic clerk, general ledger bookkeeper, accounting clerk, account information clerk,…

  6. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, R.; Imbriale, W.; Liewer, P.; Lyons, J.; Manshadi, F.; Patterson, J.

    1987-01-01

    The Hypercube Matrix Computation (Year 1986-1987) task investigated the applicability of a parallel computing architecture to the solution of large scale electromagnetic scattering problems. Two existing electromagnetic scattering codes were selected for conversion to the Mark III Hypercube concurrent computing environment. They were selected so that the underlying numerical algorithms utilized would be different thereby providing a more thorough evaluation of the appropriateness of the parallel environment for these types of problems. The first code was a frequency domain method of moments solution, NEC-2, developed at Lawrence Livermore National Laboratory. The second code was a time domain finite difference solution of Maxwell's equations to solve for the scattered fields. Once the codes were implemented on the hypercube and verified to obtain correct solutions by comparing the results with those from sequential runs, several measures were used to evaluate the performance of the two codes. First, a comparison was provided of the problem size possible on the hypercube with 128 megabytes of memory for a 32-node configuration with that available in a typical sequential user environment of 4 to 8 megabytes. Then, the performance of the codes was analyzed for the computational speedup attained by the parallel architecture.
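
    The speedup evaluation mentioned above rests on standard bookkeeping; a sketch with illustrative numbers (not measurements from the task) follows.

```python
# Standard parallel-performance metrics of the kind used to evaluate
# hypercube runs. The example timings are invented for illustration.

def speedup(t_serial, t_parallel):
    """Ratio of sequential to parallel execution time."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_nodes):
    """Speedup normalized by node count (1.0 = ideal scaling)."""
    return speedup(t_serial, t_parallel) / n_nodes

s = speedup(64.0, 2.5)         # e.g. a hypothetical 32-node run
e = efficiency(64.0, 2.5, 32)  # fraction of ideal scaling achieved
```
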

  7. Diagnosis by Computer.

    ERIC Educational Resources Information Center

    McKean, Kevin

    1982-01-01

    Describes a computer program to aid physicians in diagnosing diseases. The program begins by recording available information about the patient and consults a library of about 4,000 characteristics associated with approximately 550 maladies and makes a tally to determine the most likely disease candidate. (Author/JN)
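
    The tally scheme described above can be sketched in a few lines: each malady is a set of characteristic findings, and the score is the overlap with the patient's findings. The diseases and findings below are invented for illustration, not drawn from the program's 4,000-characteristic library.

```python
# Toy tally-based differential diagnosis (illustrative data only).

diseases = {
    "influenza": {"fever", "cough", "myalgia"},
    "common cold": {"cough", "sneezing", "sore throat"},
    "measles": {"fever", "rash", "conjunctivitis"},
}
patient = {"fever", "cough", "myalgia"}   # recorded patient findings

# Tally how many characteristics of each malady the patient exhibits.
tally = {name: len(signs & patient) for name, signs in diseases.items()}
best = max(tally, key=tally.get)          # most likely disease candidate
```
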

  8. Teaching Astronomy With Computers.

    ERIC Educational Resources Information Center

    Carr, Claire J.; Carr, Everett Q.

    1982-01-01

    Details of the services offered by a school planetarium in upstate New York are provided. The use of computers includes a full-day simulation of a voyage to Mars and back, the finding of great-circle distances, and deriving sunrise and sunset times, moon positions, and eclipse seasons. (MP)

  9. Computer Series, 103.

    ERIC Educational Resources Information Center

    Birk, James P., Ed.

    1989-01-01

    Describes two computer programs for chemistry: (1) "A Microcomputer Simulation of Fractal Electrodeposition" (provides for the study of fractal aggregrates, BASIC 4.0); and (2) "Counters on Grids" (a game to illustrate the distribution of energy). Notes the programs are available from the authors. (MVL)

  10. Radiation Hardening of Computers

    NASA Technical Reports Server (NTRS)

    Nichols, D. K.; Smith, L. S.; Zoutendyk, J. A.; Giddings, A. E.; Hewlett, F. W.; Treece, R. K.

    1986-01-01

    Single-event upsets reduced by use of oversize transistors. Computers made less susceptible to ionizing radiation by replacing bipolar integrated circuits with properly designed, complementary metal-oxide-semiconductor (CMOS) circuits. CMOS circuit chips made highly resistant to single-event upset (SEU), especially when certain feedback resistors are incorporated. Redesigned chips also consume less power than original chips.

  11. The Computer as Interpreter.

    ERIC Educational Resources Information Center

    Denning, J. David

    1990-01-01

    Reported are the results of a study at the Royal British Columbia Museum where an interactive computer program, "Explore the Seashore," was added to an exhibit to test its effects on visitor enjoyment and learning. Details of the presentation and structure of the hypercard media are presented. (CW)

  12. Computers and Chinese Linguistics.

    ERIC Educational Resources Information Center

    Kierman, Frank A.; Barber, Elizabeth

    This survey of the field of Chinese language computational linguistics was prepared as a background study for the Chinese Linguistics Project at Princeton. Since the authors' main purpose was "critical reconnaissance," quantitative emphasis is on systems with which they are most familiar. The complexity of the Chinese writing system has presented…

  13. Hypercube matrix computation task

    NASA Technical Reports Server (NTRS)

    Calalo, Ruel H.; Imbriale, William A.; Jacobi, Nathan; Liewer, Paulett C.; Lockhart, Thomas G.; Lyzenga, Gregory A.; Lyons, James R.; Manshadi, Farzin; Patterson, Jean E.

    1988-01-01

    A major objective of the Hypercube Matrix Computation effort at the Jet Propulsion Laboratory (JPL) is to investigate the applicability of a parallel computing architecture to the solution of large-scale electromagnetic scattering problems. Three scattering analysis codes are being implemented and assessed on a JPL/California Institute of Technology (Caltech) Mark 3 Hypercube. The codes, which utilize different underlying algorithms, give a means of evaluating the general applicability of this parallel architecture. The three analysis codes being implemented are a frequency domain method of moments code, a time domain finite difference code, and a frequency domain finite elements code. These analysis capabilities are being integrated into an electromagnetics interactive analysis workstation which can serve as a design tool for the construction of antennas and other radiating or scattering structures. The first two years of work on the Hypercube Matrix Computation effort is summarized. It includes both new developments and results as well as work previously reported in the Hypercube Matrix Computation Task: Final Report for 1986 to 1987 (JPL Publication 87-18).

  14. Computer Algebra versus Manipulation

    ERIC Educational Resources Information Center

    Zand, Hossein; Crowe, David

    2004-01-01

    In the UK there is increasing concern about the lack of skill in algebraic manipulation that is evident in students entering mathematics courses at university level. In this note we discuss how the computer can be used to ameliorate some of the problems. We take as an example the calculations needed in three dimensional vector analysis in polar…

  15. Home, Hearth and Computing.

    ERIC Educational Resources Information Center

    Seelig, Anita

    1982-01-01

    Advantages of having children use microcomputers at school and home include learning about sophisticated concepts early in life without a great deal of prodding, playing games that expand knowledge, and becoming literate in computer knowledge needed later in life. Includes comments from parents on their experiences with microcomputers and…

  16. Computers in Cardiology.

    ERIC Educational Resources Information Center

    Feldman, Charles L.

    The utilization of computers in the interpretation of electrocardiograms (EKG's) and vectorcardiograms is the subject of this report. A basic introduction into the operations of the electrocardiograms and vectorcardiograms is provided via an illustrated text. A historical development of the EKG starts with the 1950's with the first attempts to use…

  17. Computer Modeling and Simulation

    SciTech Connect

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model’s general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation, generally applicable in practice and based on differences in epistemic strategies and scopes.

  18. Women in Computer Sciences.

    ERIC Educational Resources Information Center

    Rose, Clare; Menninger, Sally Ann

    The keynote address of a conference that focused on the future of women in science and engineering fields and the opportunities available to them in the computer sciences is presented. Women's education in the sciences and education and entry into the job market in these fields has steadily been increasing. Excellent employment opportunities are…

  19. Thomas Jefferson's Computer.

    ERIC Educational Resources Information Center

    Smith, Catherine F.

    1996-01-01

    Notes that taken together, Thomas Jefferson's contributions to the history of writing technology demonstrate a virtual "computer." Links Jefferson's development of writing technology to his democratic political philosophy. Argues that this link should interest writing teachers. Suggests that Jeffersonian optimism effectively counters Foucaultian…

  20. Computer Narrative Assessment Reports.

    ERIC Educational Resources Information Center

    Mathews, Walter M.

    The use of narrative test reports overcomes the major barrier to understanding reports, understanding the language that is used. Early attempts to utilize the computer in generating narrative reports include: (1) Teaching Information Processing System (TIPS), involving periodic collection of information from students regarding courses, which is…

  1. Computer Series, 32.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1982-01-01

    Ten computer programs (available from authors) and a noncomputer calculation of the electron in one-dimensional, one-Bohr box are described, including programs for analytical chemistry, space group generation using Pascal, mass-spectral search system (Applesoft), microcomputer-simulated liquid chromatography, voltammetry/amperometric titrations,…

  2. Computer Literacy 113.

    ERIC Educational Resources Information Center

    Huse, Vanessa

    A course syllabus and descriptive outline are provided for an introductory, three-credit college course at Lon Morris College, Jacksonville, Texas, covering logical operations and the development of basic algorithmic processes using the BASIC computer programming language. Overall goals, specific objectives, and a brief description of lesson…

  3. Carbon nanotube computer.

    PubMed

    Shulaker, Max M; Hills, Gage; Patil, Nishant; Wei, Hai; Chen, Hong-Yu; Wong, H-S Philip; Mitra, Subhasish

    2013-09-26

    The miniaturization of electronic devices has been the principal driving force behind the semiconductor industry, and has brought about major improvements in computational power and energy efficiency. Although advances with silicon-based electronics continue to be made, alternative technologies are being explored. Digital circuits based on transistors fabricated from carbon nanotubes (CNTs) have the potential to outperform silicon by improving the energy-delay product, a metric of energy efficiency, by more than an order of magnitude. Hence, CNTs are an exciting complement to existing semiconductor technologies. Owing to substantial fundamental imperfections inherent in CNTs, however, only very basic circuit blocks have been demonstrated. Here we show how these imperfections can be overcome, and demonstrate the first computer built entirely using CNT-based transistors. The CNT computer runs an operating system that is capable of multitasking: as a demonstration, we perform counting and integer-sorting simultaneously. In addition, we implement 20 different instructions from the commercial MIPS instruction set to demonstrate the generality of our CNT computer. This experimental demonstration is the most complex carbon-based electronic system yet realized. It is a considerable advance because CNTs are prominent among a variety of emerging technologies that are being considered for the next generation of highly energy-efficient electronic systems.

  4. Classroom Computer Capsule.

    ERIC Educational Resources Information Center

    Kraines, David P.; And Others

    1991-01-01

    This article describes a calculus lesson that illustrates the nature of cycles in simple systems of nonlinear differential equations through the use of the Lotka-Volterra predator-prey model as incorporated in the computer software package, Phaser (version 1.0). (JJK)

  5. Powered Tate Pairing Computation

    NASA Astrophysics Data System (ADS)

    Kang, Bo Gyeong; Park, Je Hong

    In this letter, we provide a simple proof of bilinearity for the eta pairing. Based on it, we show an efficient method to compute the powered Tate pairing as well. Although the efficiency of our method is equivalent to that of the Tate pairing computed via the eta pairing approach, ours is more general in principle.
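
    The property at issue, bilinearity, can be stated generically for a pairing e on inputs P and Q (notation ours, not the letter's):

```latex
e(aP,\, bQ) = e(P, Q)^{ab} \qquad \text{for all integers } a, b,
```

    so any fixed power e(P, Q)^N (a "powered" pairing) inherits the same bilinearity, which is what makes precomputing the powered Tate pairing meaningful.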

  6. Cloud Computing Explained

    ERIC Educational Resources Information Center

    Metz, Rosalyn

    2010-01-01

    While many talk about the cloud, few actually understand it. Three organizations' definitions come to the forefront when defining the cloud: Gartner, Forrester, and the National Institutes of Standards and Technology (NIST). Although both Gartner and Forrester provide definitions of cloud computing, the NIST definition is concise and uses…

  7. Computers in Abstract Algebra

    ERIC Educational Resources Information Center

    Nwabueze, Kenneth K.

    2004-01-01

    The current emphasis on flexible modes of mathematics delivery involving new information and communication technology (ICT) at the university level is perhaps a reaction to the recent change in the objectives of education. Abstract algebra seems to be one area of mathematics virtually crying out for computer instructional support because of the…

  8. Computers and Young Children

    ERIC Educational Resources Information Center

    Lacina, Jan

    2007-01-01

    Technology is a way of life for most Americans. A recent study published by the National Writing Project (2007) found that Americans believe that computers have a positive effect on writing skills. The importance of learning to use technology ranked just below learning to read and write, and 74 percent of the survey respondents noted that children…

  9. Animatronics, Children and Computation

    ERIC Educational Resources Information Center

    Sempere, Andrew

    2005-01-01

    In this article, we present CTRL_SPACE: a design for a software environment with companion hardware, developed to introduce preliterate children to basic computational concepts by means of an animatronic face, whose individual features serve as an analogy for a programmable object. In addition to presenting the environment, this article briefly…

  10. (Some) Computer Futures: Mainframes.

    ERIC Educational Resources Information Center

    Joseph, Earl C.

    Possible futures for the world of mainframe computers can be forecast through studies identifying forces of change and their impact on current trends. Some new prospects for the future have been generated by advances in information technology; for example, recent United States successes in applied artificial intelligence (AI) have created new…

  11. Computer Exercises in Meteorology.

    ERIC Educational Resources Information Center

    Trapasso, L. Michael; Conner, Glen; Stallins, Keith

    Beginning with Western Kentucky University's (Bowling Green) fall 1999 semester, exercises required for the geography and meteorology course used computers for learning. This course enrolls about 250 students per year, most of whom choose it to fulfill a general education requirement. Of the 185 geography majors, it is required for those who…

  12. Computer Series, 115.

    ERIC Educational Resources Information Center

    Birk, James P., Ed.

    1990-01-01

    Reviewed are six computer programs which may be useful in teaching college level chemistry. Topics include dynamic data storage in FORTRAN, "KC?DISCOVERER," pH of acids and bases, calculating percent boundary surfaces for orbitals, and laboratory interfacing with PT Nomograph for the Macintosh. (CW)

  13. Teaching Accounting with Computers.

    ERIC Educational Resources Information Center

    Shaoul, Jean

    This paper addresses the numerous ways that computers may be used to enhance the teaching of accounting and business topics. It focuses on the pedagogical use of spreadsheet software to improve the conceptual coverage of accounting principles and practice, increase student understanding by involvement in the solution process, and reduce the amount…

  14. Computationally efficient control allocation

    NASA Technical Reports Server (NTRS)

    Durham, Wayne (Inventor)

    2001-01-01

    A computationally efficient method for calculating near-optimal solutions to the three-objective, linear control allocation problem is disclosed. The control allocation problem is that of distributing the effort of redundant control effectors to achieve some desired set of objectives. The problem is deemed linear if control effectiveness is affine with respect to the individual control effectors. The optimal solution is the one that exploits the collective maximum capability of the effectors within their individual physical limits. Computational efficiency is measured by the number of floating-point operations required for solution. The method presented returned optimal solutions in more than 90% of the cases examined; non-optimal solutions returned by the method were typically much less than 1% different from optimal, and the errors tended to become smaller than 0.01% as the number of controls was increased. The magnitude of the errors returned by the present method was much smaller than those that resulted from either pseudoinverse or cascaded generalized inverse solutions. The computational complexity of the method presented varied linearly with increasing numbers of controls; the number of required floating-point operations increased 5.5 to seven times faster than did the minimum-norm solution (the pseudoinverse), and at about the same rate as did the cascaded generalized inverse solution. The computational requirements of the method presented were much better than those of previously described facet-searching methods, which increase in proportion to the square of the number of controls.
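
    The pseudoinverse baseline mentioned in this abstract can be sketched in a few lines. This is a minimal illustration of minimum-norm allocation with limit clipping, not the patented facet-based method; the effectiveness matrix B, the desired moments m, and the ±1 effector limits are made-up example values.

```python
import numpy as np

# Minimal pseudoinverse (minimum-norm) allocation baseline, for contrast
# with the near-optimal method described above; this is NOT the patented
# method. B, m, and the +/-1 limits are made-up example values.

def pseudoinverse_allocate(B, m_desired, u_min, u_max):
    """Minimum-norm effector commands for the desired moments, clipped to limits."""
    u = np.linalg.pinv(B) @ m_desired   # least-squares / minimum-norm solution
    return np.clip(u, u_min, u_max)     # enforce individual effector limits

# 3 objectives (e.g. roll, pitch, yaw moments), 5 redundant effectors
rng = np.random.default_rng(0)
B = rng.standard_normal((3, 5))         # control-effectiveness matrix: m = B @ u
m = np.array([0.2, -0.1, 0.05])         # desired moments
u = pseudoinverse_allocate(B, m, -1.0, 1.0)
print(u.shape)  # (5,)
```

    Clipping after the pseudoinverse can fail to attain moments near the effector limits, which is precisely the shortcoming that direct, limit-aware allocation methods like the one described above address.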

  15. Computer-Assisted Organizing

    ERIC Educational Resources Information Center

    Brunner, David James

    2009-01-01

    Organizing refers to methods of distributing physical and symbolic tasks among multiple agents in order to achieve goals. My dissertation investigates the dynamics of organizing in hybrid information processing systems that incorporate both humans and computers. To explain the behavior of these hybrid systems, I develop and partially test a theory…

  16. Computer Technology for Industry

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Shell Oil Company used a COSMIC program called VISCEL to ensure the accuracy of the company's new computer code for analyzing polymers and chemical compounds. Shell reported that there were no other programs available that could provide the necessary calculations. Shell produces chemicals for plastic products used in the manufacture of automobiles, housewares, appliances, film, textiles, electronic equipment and furniture.

  17. Computing and data processing

    NASA Technical Reports Server (NTRS)

    Smarr, Larry; Press, William; Arnett, David W.; Cameron, Alastair G. W.; Crutcher, Richard M.; Helfand, David J.; Horowitz, Paul; Kleinmann, Susan G.; Linsky, Jeffrey L.; Madore, Barry F.

    1991-01-01

    The applications of computers and data processing to astronomy are discussed. Among the topics covered are the emerging national information infrastructure, workstations and supercomputers, supertelescopes, digital astronomy, astrophysics in a numerical laboratory, community software, archiving of ground-based observations, dynamical simulations of complex systems, plasma astrophysics, and the remote control of fourth dimension supercomputers.

  18. The Primary Computer Dictionary.

    ERIC Educational Resources Information Center

    Girard, Suzanne; Willing, Kathlene

    Suitable for children from kindergarten to grade three, this dictionary is designed to introduce young children to computer terminology at a level that they will understand and find useful. It is also suitable for parents as a home resource, for library use, and as a handbook for teachers. The first sentence of each definition contains the kernel…

  19. Computers in Science Fiction.

    ERIC Educational Resources Information Center

    Kurland, Michael

    1984-01-01

    Science fiction writers' perceptions of the "thinking machine" are examined through a review of Baum's Oz books, Heinlein's "Beyond This Horizon," science fiction magazine articles, and works about robots including Asimov's "I, Robot." The future of computers in science fiction is discussed and suggested readings are listed. (MBR)

  20. Computer Designed Instruction & Testing.

    ERIC Educational Resources Information Center

    New Mexico State Univ., Las Cruces.

    Research findings on computer designed instruction and testing at the college level are discussed in 13 papers from the first Regional Conference on University Teaching at New Mexico State University. Titles and authors are as follows: "Don't Bother Me with Instructional Design, I'm Busy Programming! Suggestions for More Effective Educational…

  1. Activity: Computer Talk.

    ERIC Educational Resources Information Center

    Clearing: Nature and Learning in the Pacific Northwest, 1985

    1985-01-01

    Presents an activity in which students create a computer program capable of recording and projecting paper use at school. Includes instructional strategies and background information such as requirements for pounds of paper/tree, energy needs, water consumption, and paper value at the recycling center. A sample program is included. (DH)

  2. Computer Based Library Orientation.

    ERIC Educational Resources Information Center

    Machalow, Robert

    This document presents computer-based lessons used to teach basic library skills to college students at York College of the City University of New York. The information for library orientation has been entered on a disk which must be used in conjunction with a word processing program, the Applewriter IIe, and an Apple IIe microcomputer. The…

  3. Computer Series, 42.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1983-01-01

    Eleven separate reports on computer programs (with sources indicated) and related materials are presented. These include chemical applications of interactive function translator, numerical optimization on microcomputer, Apple pH meter, axes-drawing program for Hewlett-Packard digital plotters, inexpensive chart recorder for freshmen laboratory,…

  4. The Computer in Lexicography.

    ERIC Educational Resources Information Center

    Bailey, Richard W.; Robinson, Jay L.

    Proposals for the use of the computer in the humanities often ask more of the machine than it can reasonably yield, and the enthusiastic generation of data for dictionary projects may well overburden the editors who must eventually cope with it. Procedures in lexicography are not well enough defined for a substantial burden to be placed on the…

  5. Computer/Information Science

    ERIC Educational Resources Information Center

    Birman, Ken; Roughgarden, Tim; Seltzer, Margo; Spohrer, Jim; Stolterman, Erik; Kearsley, Greg; Koszalka, Tiffany; de Jong, Ton

    2013-01-01

    Scholars representing the field of computer/information science were asked to identify what they considered to be the most exciting and imaginative work currently being done in their field, as well as how that work might change our understanding. The scholars included Ken Birman, Jennifer Rexford, Tim Roughgarden, Margo Seltzer, Jim Spohrer, and…

  6. Interfaces for Advanced Computing.

    ERIC Educational Resources Information Center

    Foley, James D.

    1987-01-01

    Discusses the coming generation of supercomputers that will have the power to make elaborate "artificial realities" that facilitate user-computer communication. Illustrates these technological advancements with examples of the use of head-mounted monitors which are connected to position and orientation sensors, and gloves that track finger and…

  7. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Batt, Russell H., Ed.

    1989-01-01

    Describes two chemistry computer programs: (1) "Eureka: A Chemistry Problem Solver" (problem files may be written by the instructor, MS-DOS 2.0, IBM with 384K); and (2) "PC-File+" (database management, IBM with 416K and two floppy drives). (MVL)

  8. Computer Series, 89.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1988-01-01

    Describes five computer software packages; four for MS-DOS Systems and one for Apple II. Included are SPEC20, an interactive simulation of a Bausch and Lomb Spectronic-20; a database for laboratory chemicals and programs for visualizing Boltzmann-like distributions, orbital plot for the hydrogen atom and molecular orbital theory. (CW)

  9. Syllabus Computer in Astronomy

    NASA Astrophysics Data System (ADS)

    Hojaev, Alisher S.

    2015-08-01

    One of the most important and topical training courses in the curricula for undergraduate students at the National University of Uzbekistan is ‘Computer Methods in Astronomy’. It covers two semesters and includes both lecture and practice classes. Based on long-term experience we prepared a tutorial for students which contains a description of modern computer applications in astronomy. The main directions of computer application in the field of astronomy are briefly as follows: 1) automating the process of observation, data acquisition and processing; 2) creating and storing databases (the results of observations, experiments and theoretical calculations), including their generalization, classification and cataloging, and working with large databases; 3) solving theoretical problems (physical and mathematical modeling of astronomical objects and phenomena, derivation of model parameters from solutions of the corresponding equations, numerical simulations) and creating the appropriate software; 4) use in the educational process (e-textbooks, presentations, virtual labs, remote education, testing), amateur astronomy and popularization of science; 5) use as a means of communication and data transfer, for presenting and disseminating research results (web journals), and for creating virtual information systems (local and global computer networks). During the classes special attention is paid to practical training and to the independent, individual work of students.

  10. Finding epitopes with computers.

    PubMed

    Malito, Enrico; Rappuoli, Rino

    2013-10-24

    The goal of structural vaccinology is to enable the design and engineering of improved antigens. In a recent issue of Chemistry & Biology, Gourlay and colleagues provided evidence that structure-based computational methods allow prediction of B cell epitopes, a crucial step for antigen selection and optimization in vaccine development.

  11. Computing Cosmic Cataclysms

    NASA Technical Reports Server (NTRS)

    Centrella, Joan M.

    2010-01-01

    The final merger of two black holes releases a tremendous amount of energy, more than the combined light from all the stars in the visible universe. This energy is emitted in the form of gravitational waves, and observing these sources with gravitational wave detectors requires that we know the pattern or fingerprint of the radiation emitted. Since black hole mergers take place in regions of extreme gravitational fields, we need to solve Einstein's equations of general relativity on a computer in order to calculate these wave patterns. For more than 30 years, scientists have tried to compute these wave patterns. However, their computer codes were plagued by problems that caused them to crash. This situation has changed dramatically in the past few years, with a series of amazing breakthroughs. This talk will take you on the quest for these gravitational wave patterns, showing how a spacetime is constructed on a computer to build a simulation laboratory for binary black hole mergers. We will focus on the recent advances that are revealing these waveforms, and the dramatic new potential for discoveries that arises when these sources are observed.

  12. Computing in the Clouds

    ERIC Educational Resources Information Center

    Johnson, Doug

    2010-01-01

    Web-based applications offer teachers, students, and school districts a convenient way to accomplish a wide range of tasks, from accounting to word processing, for free. Cloud computing has the potential to offer staff and students better services at a lower cost than the technology deployment models they're using now. Saving money and improving…

  13. Computer Vision Systems

    NASA Astrophysics Data System (ADS)

    Gunasekaran, Sundaram

    Food quality is of paramount consideration for all consumers, and its importance is perhaps second only to food safety. By some definitions, food safety is also incorporated into the broad categorization of food quality. Hence, the need for careful and accurate evaluation of food quality is at the forefront of research and development in both academia and industry. Among the many available methods for food quality evaluation, computer vision has proven to be the most powerful, especially for nondestructively extracting and quantifying many features that have direct relevance to food quality assessment and control. Furthermore, computer vision systems serve to rapidly evaluate the most readily observable food quality attributes: the external characteristics such as color, shape, size, and surface texture. In addition, it is now possible, using advanced computer vision technologies, to “see” inside a food product and/or package to examine important quality attributes ordinarily unavailable to human evaluators. With rapid advances in electronic hardware and other associated imaging technologies, the cost-effectiveness and speed of computer vision systems have greatly improved, and many practical systems are already in place in the food industry.
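
    The external-characteristic measurements this abstract describes (color, size) reduce to simple image operations. The following is a toy sketch on a synthetic image, assuming a single-channel threshold is enough to segment the item; the image, colors, and threshold are invented for illustration, and real inspection systems use calibrated imaging and far more robust segmentation.

```python
import numpy as np

# Toy sketch of the external-quality measurements described above:
# segment a "food item" from a synthetic image by a simple color
# threshold, then measure its size (pixel area) and mean color.
# The image, colors, and threshold are made up for illustration.

img = np.zeros((64, 64, 3), dtype=np.uint8)
img[16:48, 16:48] = (200, 120, 40)      # orange-ish 32x32 "food item"

red = img[:, :, 0].astype(float)
mask = red > 100                         # crude segmentation by red channel
area = int(mask.sum())                   # size: number of item pixels
mean_color = img[mask].mean(axis=0)      # average RGB over the item

print(area)                              # 1024
print(mean_color)                        # mean RGB of the item
```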

  14. Computers and Schools.

    ERIC Educational Resources Information Center

    Shears, Lawrie, Ed.

    This book recounts what happened when a set of 25 laptop computers was introduced into each of 10 Australian schools (Camberwell, Anglican Girls Grammar School; Caulfield Grammar School; Frankston High School; Jamieson Park Secondary College; Kilvington Baptist Girls Grammar School; Korowa Anglican Girls' School; Mallacoota P-12 College; Methodist…

  15. Spies in the Computer.

    ERIC Educational Resources Information Center

    Sweetland, Robert D.

    1984-01-01

    Discusses a computer program requiring students to find an object on a grid by using location coordinates. The program, which provides a wide range of problem-solving activities, is a highly motivating approach to hands-on metric measurement, rounding off numbers, and plotting points on Cartesian coordinate systems. Includes complete program…

  16. Computer Programming: BASIC.

    ERIC Educational Resources Information Center

    Fisher, Patience; And Others

    This guide was prepared to help teachers of the Lincoln Public School's introductory computer programming course in BASIC to make the necessary adjustments for changes made in the course since the purchase of microcomputers and such peripheral devices as television monitors and disk drives, and the addition of graphics. Intended to teach a…

  17. Computer Series, 110.

    ERIC Educational Resources Information Center

    Birk, James P., Ed.

    1990-01-01

    Described are two applications of computers in chemistry. The multilinear equations and procedures used in obtaining a least-squares solution for curve fitting in chemistry are presented. Also discussed is the program "ATORB" which can be used to generate graphs of probable electron clouds for various atoms and molecules. (CW)

  18. Computationally modeling interpersonal trust

    PubMed Central

    Lee, Jin Joo; Knox, W. Bradley; Wormwood, Jolie B.; Breazeal, Cynthia; DeSteno, David

    2013-01-01

    We present a computational model capable of predicting—above human accuracy—the degree of trust a person has toward their novel partner by observing the trust-related nonverbal cues expressed in their social interaction. We summarize our prior work, in which we identify nonverbal cues that signal untrustworthy behavior and also demonstrate the human mind's readiness to interpret those cues to assess the trustworthiness of a social robot. We demonstrate that domain knowledge gained from our prior work using human-subjects experiments, when incorporated into the feature engineering process, permits a computational model to outperform both human predictions and a baseline model built in naiveté of this domain knowledge. We then present the construction of hidden Markov models to investigate temporal relationships among the trust-related nonverbal cues. By interpreting the resulting learned structure, we observe that models built to emulate different levels of trust exhibit different sequences of nonverbal cues. From this observation, we derived sequence-based temporal features that further improve the accuracy of our computational model. Our multi-step research process presented in this paper combines the strength of experimental manipulation and machine learning to not only design a computational trust model but also to further our understanding of the dynamics of interpersonal trust. PMID:24363649
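
    The hidden-Markov-model step this abstract describes can be illustrated with a minimal sketch: score a sequence of discrete nonverbal-cue symbols under two HMMs and classify by the better-scoring model. All parameters and cue labels below are invented for illustration; the authors' actual models, features, and training procedure differ.

```python
import numpy as np

# Illustrative sketch only: this is NOT the authors' model. We score a
# sequence of discrete nonverbal-cue symbols under two hidden Markov
# models (one meant to emulate high trust, one low trust) and classify
# the interaction by whichever model explains the sequence better.

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of an observation sequence via the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]          # joint prob of state and first symbol
    s = alpha.sum()
    log_p, alpha = np.log(s), alpha / s
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate states, weight by emission
        s = alpha.sum()                # rescale to avoid numerical underflow
        log_p += np.log(s)
        alpha /= s
    return log_p

# 2 hidden states; 3 cue symbols (e.g. 0=smile, 1=lean-back, 2=face-touch)
pi = np.array([0.6, 0.4])                            # initial state distribution
A_hi = np.array([[0.9, 0.1], [0.2, 0.8]])            # "high-trust" transitions
B_hi = np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])  # "high-trust" emissions
A_lo = np.array([[0.5, 0.5], [0.5, 0.5]])            # "low-trust" transitions
B_lo = np.array([[0.2, 0.3, 0.5], [0.4, 0.4, 0.2]])  # "low-trust" emissions

seq = [0, 0, 1, 2, 2]                                # observed cue sequence
label = ("high-trust" if forward_loglik(seq, pi, A_hi, B_hi)
         > forward_loglik(seq, pi, A_lo, B_lo) else "low-trust")
print(label)
```

    In the pipeline the abstract describes, per-model likelihoods and sequence statistics derived from the learned structure would feed a downstream trust classifier rather than a hard comparison like the one above.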

  19. Computer Security Risk Assessment

    1992-02-11

    LAVA/CS (LAVA for Computer Security) is an application of the Los Alamos Vulnerability Assessment (LAVA) methodology specific to computer and information security. The software serves as a generic tool for identifying vulnerabilities in computer and information security safeguards systems. Although it does not perform a full risk assessment, the results from its analysis may provide valuable insights into security problems. LAVA/CS assumes that the system is exposed both to natural and environmental hazards and to deliberate malevolent actions by either insiders or outsiders. In answering the LAVA/CS questionnaire, the user identifies missing safeguards in 34 areas ranging from password management to personnel security and internal audit practices. Specific safeguards protecting a generic set of assets (or targets) from a generic set of threats (or adversaries) are considered. There are four generic assets: the facility, the organization's environment; the hardware, all computer-related hardware; the software, the information in machine-readable form stored either on-line or on transportable media; and the documents and displays, the information in human-readable form stored as hard-copy materials (manuals, reports, listings in full-size or microform), film, and screen displays. Two generic threats are considered: natural and environmental hazards (storms, fires, power abnormalities, water, and accidental maintenance damage); and on-site human threats, both intentional and accidental acts attributable to a perpetrator on the facility's premises.

  20. Computer Series, 13.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1981-01-01

    Provides short descriptions of chemists' applications of computers in instruction: an interactive instructional program for Instrumental-Qualitative Organic Analysis; question-and-answer exercises in organic chemistry; computerized organic nomenclature drills; integration of theoretical and descriptive materials; acid-base titration simulation;…