Sled, Elizabeth A.; Sheehy, Lisa M.; Felson, David T.; Costigan, Patrick A.; Lam, Miu; Cooke, T. Derek V.
2010-01-01
The objective of the study was to evaluate the reliability of frontal plane lower limb alignment measures using a landmark-based method by (1) comparing inter- and intra-reader reliability between measurements of alignment obtained manually with those using a computer program, and (2) determining inter- and intra-reader reliability of computer-assisted alignment measures from full-limb radiographs. An established method for measuring alignment was used, involving selection of 10 femoral and tibial bone landmarks. 1) To compare manual and computer methods, we used digital images and matching paper copies of five alignment patterns simulating healthy and malaligned limbs drawn using AutoCAD. Seven readers were trained in each system. Paper copies were measured manually and repeat measurements were performed daily for 3 days, followed by a similar routine with the digital images using the computer. 2) To examine the reliability of computer-assisted measures from full-limb radiographs, 100 images (200 limbs) were selected as a random sample from 1,500 full-limb digital radiographs which were part of the Multicenter Osteoarthritis (MOST) Study. Three trained readers used the software program to measure alignment twice from the batch of 100 images, with two or more weeks between batch handling. Manual and computer measures of alignment showed excellent agreement (intraclass correlations [ICCs] 0.977 – 0.999 for computer analysis; 0.820 – 0.995 for manual measures). The computer program applied to full-limb radiographs produced alignment measurements with high inter- and intra-reader reliability (ICCs 0.839 – 0.998). In conclusion, alignment measures using a bone landmark-based approach and a computer program were highly reliable between multiple readers. PMID:19882339
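The agreement statistics reported above can be made concrete with a short computation. The sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single measures) in plain Python; the ratings matrix of three readers measuring limb alignment on five limbs is invented for illustration and is not the study's data.

```python
# Minimal ICC(2,1) sketch (two-way random effects, absolute agreement).
# The ratings below are illustrative, not the study's measurements.

def icc_2_1(scores):
    """scores: list of rows (subjects), each a list of k rater scores."""
    n = len(scores)            # subjects
    k = len(scores[0])         # raters
    grand = sum(sum(r) for r in scores) / (n * k)
    row_means = [sum(r) / k for r in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for r in scores for x in r)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                  # between-subject mean square
    msc = ss_cols / (k - 1)                  # between-rater mean square
    mse = ss_err / ((n - 1) * (k - 1))       # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Three readers measuring alignment (degrees) on five limbs:
ratings = [
    [178.2, 178.4, 178.3],
    [182.1, 182.0, 182.2],
    [175.5, 175.8, 175.6],
    [180.0, 179.9, 180.1],
    [184.3, 184.5, 184.2],
]
icc = icc_2_1(ratings)
print(f"ICC(2,1) = {icc:.3f}")
```

With small within-reader scatter relative to the between-limb differences, the ICC lands near 1.0, the regime the abstract reports.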
Hortness, J.E.
2004-01-01
The U.S. Geological Survey (USGS) measures discharge in streams using several methods. However, measurement of peak discharges is often impossible or impractical because of difficult access, the inherent danger of making measurements during flood events, and the unpredictable timing of flood events. Thus, many peak discharge values are calculated after the fact by use of indirect methods. The most common indirect method for estimating peak discharges in streams is the slope-area method. This, like other indirect methods, requires measuring the flood profile through detailed surveys. Processing the survey data for efficient entry into computer streamflow models can be time demanding; SAM 2.1 is a program designed to expedite that process. The SAM 2.1 computer program is designed to be run in the field on a portable computer. The program processes digital surveying data obtained from an electronic surveying instrument during slope-area measurements. After all measurements have been completed, the program generates files to be input into the SAC (Slope-Area Computation program; Fulford, 1994) or HEC-RAS (Hydrologic Engineering Center-River Analysis System; Brunner, 2001) computer streamflow models so that an estimate of the peak discharge can be calculated.
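As a rough illustration of the slope-area idea, the sketch below applies the Manning equation to a single surveyed cross section using the water-surface slope taken from high-water marks. Production slope-area computations (SAC, HEC-RAS) balance energy across several cross sections; the geometry, fall, reach length, and roughness value here are invented.

```python
# Hedged one-section sketch of a slope-area peak-discharge estimate.
# Real computations use multiple cross sections and energy balancing.

def manning_discharge(area_ft2, wetted_perimeter_ft, slope, n):
    """Q = (1.486/n) * A * R^(2/3) * S^(1/2), US customary units (ft, s)."""
    r = area_ft2 / wetted_perimeter_ft            # hydraulic radius
    return (1.486 / n) * area_ft2 * r ** (2.0 / 3.0) * slope ** 0.5

# Surveyed high-water marks give a fall of 0.8 ft over a 400 ft reach:
slope = 0.8 / 400.0
q_peak = manning_discharge(area_ft2=250.0, wetted_perimeter_ft=60.0,
                           slope=slope, n=0.035)
print(f"estimated peak discharge: {q_peak:.0f} ft^3/s")
```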
Uranium dioxide fuel cladding strain investigation with the use of CYGRO-2 computer program
NASA Technical Reports Server (NTRS)
Smith, J. R.
1973-01-01
Previously irradiated UO2 thermionic fuel pins in which gross fuel-cladding strain occurred were modeled with the use of a computer program to define controlling parameters which may contribute to cladding strain. The computed strain was compared with measured strain, and the computer input data were studied in an attempt to get agreement with measured strain. Because of the limitations of the program and uncertainties in input data, good agreement with measured cladding strain was not attained. A discussion of these limitations is presented.
Culvert analysis program for indirect measurement of discharge
Fulford, Janice M.; ,
1993-01-01
A program based on the U.S. Geological Survey (USGS) methods for indirectly computing peak discharges through culverts allows users to employ input data formats used by the water-surface profile program (WSPRO). The program can be used to compute discharge rating surfaces or curves that describe the behavior of flow through a particular culvert, or to compute discharges from upstream measurements. It solves the gradually varied flow equations and has been adapted slightly to provide solutions that minimize the need for the user to distinguish between different flow regimes. The program source is written in Fortran 77 and has been run on minicomputers and personal computers. The program does not use or require graphics capability, a color monitor, or a mouse.
Foresters' Metric Conversions program (version 1.0). [Computer program]
Jefferson A. Palmer
1999-01-01
The conversion of scientific measurements has become commonplace in the fields of engineering, research, and forestry. Foresters' Metric Conversions is a Windows-based computer program that quickly converts user-defined measurements from English to metric and from metric to English. Foresters' Metric Conversions was derived from the publication "Metric...
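The heart of such a converter is a factor table consulted in both directions. The sketch below is a minimal stand-in: the factors shown are standard published values, but the selection of forestry units is an assumption, not the program's actual list.

```python
# Hedged sketch of a bidirectional English/metric conversion table for
# forestry measurements. Unit selection is illustrative.

CONVERSIONS = {
    ("in", "cm"): 2.54,             # diameter at breast height
    ("ft", "m"): 0.3048,            # tree height
    ("ac", "ha"): 0.404686,         # stand area
    ("ft2/ac", "m2/ha"): 0.229568,  # basal area per unit area
}

def convert(value, src, dst):
    """Look up the factor directly, or invert it for the reverse direction."""
    if (src, dst) in CONVERSIONS:
        return value * CONVERSIONS[(src, dst)]
    if (dst, src) in CONVERSIONS:
        return value / CONVERSIONS[(dst, src)]
    raise KeyError(f"no factor for {src} -> {dst}")

print(convert(10.0, "in", "cm"))             # English -> metric
print(round(convert(1.0, "ha", "ac"), 3))    # metric -> English (inverted)
```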
Bennett, J M; Booty, M J
1966-01-01
A computational method of determining n and k for an evaporated film from the measured reflectance, transmittance, and film thickness has been programmed for an IBM 7094 computer. The method consists of modifications to the NOTS multilayer film program. The basic program computes normal incidence reflectance, transmittance, phase change on reflection, and other parameters from the optical constants and thicknesses of all materials. In the modification, n and k for the film are varied in a prescribed manner, and the computer picks from among these values one n and one k which yield reflectance and transmittance values almost equalling the measured values. Results are given for films of silicon and aluminum.
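The vary-and-match procedure the abstract describes can be sketched with the standard Airy formulas for a free-standing absorbing film in air at normal incidence, a simplification of the NOTS multilayer model. The thickness, wavelength, and search grid below are invented, and the "measured" R and T are synthesized from known n and k so the grid search can be checked against itself.

```python
import cmath

def film_RT(n, k, d_nm, wavelength_nm):
    """Normal-incidence R and T of a free-standing absorbing film in air
    (Airy formulas, complex index N = n - ik)."""
    N = complex(n, -k)
    r1 = (1 - N) / (1 + N)                        # air -> film Fresnel coefficient
    beta = 2 * cmath.pi * N * d_nm / wavelength_nm
    phase = cmath.exp(-2j * beta)
    r = (r1 - r1 * phase) / (1 - r1 * r1 * phase)             # film-in-air: r2 = -r1
    t = ((1 - r1 * r1) * cmath.exp(-1j * beta)) / (1 - r1 * r1 * phase)
    return abs(r) ** 2, abs(t) ** 2

# Lossless sanity check: R + T = 1 when k = 0.
assert abs(sum(film_RT(2.0, 0.0, 120.0, 550.0)) - 1.0) < 1e-9

# "Measured" values, synthesized here from n = 2.00, k = 0.05:
R_meas, T_meas = film_RT(2.00, 0.05, d_nm=120.0, wavelength_nm=550.0)

# Vary n and k on a grid and keep the pair whose computed R and T best
# match the measured values -- the essence of the modified program.
best = min(
    ((n / 100.0, k / 100.0) for n in range(150, 251, 5) for k in range(0, 21)),
    key=lambda nk: sum(
        (a - b) ** 2
        for a, b in zip(film_RT(*nk, 120.0, 550.0), (R_meas, T_meas))
    ),
)
print(f"recovered n = {best[0]:.2f}, k = {best[1]:.2f}")
```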
Three-dimensional vector modeling and restoration of flat finite wave tank radiometric measurements
NASA Technical Reports Server (NTRS)
Truman, W. M.; Balanis, C. A.
1977-01-01
The three-dimensional vector interaction between a microwave radiometer and a wave tank was modeled. Computer programs for predicting the response of the radiometer to the brightness temperature characteristics of the surroundings were developed along with a computer program that can invert (restore) the radiometer measurements. It is shown that the computer programs can be used to simulate the viewing of large bodies of water and are applicable to radiometer measurements received from satellites monitoring the ocean. The water temperature, salinity, and wind speed can be determined.
Analysis of Compton continuum measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gold, R.; Olson, I. K.
1970-01-01
Five computer programs: COMPSCAT, FEND, GABCO, DOSE, and COMPLOT, have been developed and used for the analysis and subsequent reduction of measured energy distributions of Compton recoil electrons to continuous gamma spectra. In addition to detailed descriptions of these computer programs, the relationship amongst these codes is stressed. The manner in which these programs function is illustrated by tracing a sample measurement through a complete cycle of the data-reduction process.
Student Achievement in Computer Programming: Lecture vs Computer-Aided Instruction
ERIC Educational Resources Information Center
Tsai, San-Yun W.; Pohl, Norval F.
1978-01-01
This paper discusses a study of the differences in student learning achievement, as measured by four different types of common performance evaluation techniques, in a college-level computer programming course under three teaching/learning environments: lecture, computer-aided instruction, and lecture supplemented with computer-aided instruction.…
1992-02-01
...develops and maintains computer programs for the Department of the Navy. It provides life cycle support for over 50 computer programs installed at over... the computer programs. Table 4 presents a list of possible product or output measures of functionality for ACDS Block 0 programs. Examples of output... were identified as important "causes" of process performance. Functionality of the computer programs was the result or "effect" of the combination of...
NASA Technical Reports Server (NTRS)
Pickett, G. F.; Wells, R. A.; Love, R. A.
1977-01-01
A computer user's manual describing the operation and the essential features of the Modal Calculation Program is presented. The Modal Calculation Program calculates the amplitude and phase of modal structures by means of acoustic pressure measurements obtained from microphones placed at selected locations within the fan inlet duct. In addition, the Modal Calculation Program also calculates the first-order errors in the modal coefficients that are due to tolerances in microphone location coordinates and inaccuracies in the acoustic pressure measurements.
NASA Technical Reports Server (NTRS)
Pickett, G. F.; Wells, R. A.; Love, R. A.
1977-01-01
A computer user's manual describing the operation and the essential features of the microphone location program is presented. The Microphone Location Program determines microphone locations that ensure accurate and stable results from the equation system used to calculate modal structures. As part of the computational procedure for the Microphone Location Program, a first-order measure of the stability of the equation system was indicated by a matrix 'conditioning' number.
High level language for measurement complex control based on the computer E-100I
NASA Technical Reports Server (NTRS)
Zubkov, B. V.
1980-01-01
A high level language was designed to control the process of conducting an experiment using the computer "Elektronika-100I". Program examples are given to control the measuring and actuating devices. The procedure of including these programs in the suggested high level language is described.
An experimental and theoretical investigation of deposition patterns from an agricultural airplane
NASA Technical Reports Server (NTRS)
Morris, D. J.; Croom, C. C.; Vandam, C. P.; Holmes, B. J.
1984-01-01
A flight test program has been conducted with a representative agricultural airplane to provide data for validating a computer program model which predicts aerially applied particle deposition. Test procedures and the data from this test are presented and discussed. The computer program features are summarized, and comparisons of predicted and measured particle deposition are presented. Applications of the computer program for spray pattern improvement are illustrated.
Computer-aided programming for message-passing systems: Problems and a solution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, M.Y.; Gajski, D.D.
1989-12-01
As the number of processors and the complexity of problems to be solved increase, programming multiprocessing systems becomes more difficult and error-prone. Program development tools are necessary since programmers are not able to develop complex parallel programs efficiently. Parallel models of computation, parallelization problems, and tools for computer-aided programming (CAP) are discussed. As an example, a CAP tool that performs scheduling and inserts communication primitives automatically is described. It also generates the performance estimates and other program quality measures to help programmers in improving their algorithms and programs.
1980-05-01
COMPARISON OF BUILDING LOADS ANALYSIS AND SYSTEM THERMODYNAMICS (BLAST) COMPUTER PROGRAM... Building Loads Analysis and System Thermodynamics (BLAST) computer program. A dental clinic and a battalion headquarters and classroom building were... Building and HVAC System Data, Computer Simulation, Comparison of Actual and Simulated Results, Analysis and Findings
Identification of Program Signatures from Cloud Computing System Telemetry Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, Nicole M.; Greaves, Mark T.; Smith, William P.
Malicious cloud computing activity can take many forms, including running unauthorized programs in a virtual environment. Detection of these malicious activities while preserving the privacy of the user is an important research challenge. Prior work has shown the potential viability of using cloud service billing metrics as a mechanism for proxy identification of malicious programs. Previously, this novel detection method was evaluated in a synthetic and isolated computational environment. In this paper we demonstrate the ability of billing metrics to identify programs in an active cloud computing environment, including multiple virtual machines running on the same hypervisor. The open source cloud computing platform OpenStack is used for private cloud management at Pacific Northwest National Laboratory. OpenStack provides a billing tool (Ceilometer) to collect system telemetry measurements. We identify four different programs running on four virtual machines under the same cloud user account. Programs were identified with up to 95% accuracy. This accuracy is dependent on the distinctiveness of telemetry measurements for the specific programs we tested. Future work will examine the scalability of this approach for a larger selection of programs to better understand the uniqueness needed to identify a program. Additionally, future work should address the separation of signatures when multiple programs are running on the same virtual machine.
ERIC Educational Resources Information Center
Weber, Eric G.
2012-01-01
The purpose of this study was to determine the impact of a one-to-one laptop computer program on the literacy achievement of eighth-grade students with above average, average, and below average measured cognitive skill levels who are eligible and not eligible for free or reduced price lunch program participation. The study analyzed, student…
ERIC Educational Resources Information Center
Culp, G. H.; And Others
Over 100 interactive computer programs for use in general and organic chemistry at the University of Texas at Austin have been prepared. The rationale for the programs is based upon the belief that computer-assisted instruction (CAI) can improve education by, among other things, freeing teachers from routine tasks, measuring entry skills,…
Implementation of a computer database testing and analysis program.
Rouse, Deborah P
2007-01-01
The author is the coordinator of a computer software database testing and analysis program implemented in an associate degree nursing program. Computer software database programs help support the testing development and analysis process. Critical thinking is measurable and promoted with their use. The reader of this article will learn what is involved in procuring and implementing a computer database testing and analysis program in an academic nursing program. The use of the computerized database for testing and analysis will be approached as a method to promote and evaluate the nursing student's critical thinking skills and to prepare the nursing student for the National Council Licensure Examination.
Computer program determines performance efficiency of remote measuring systems
NASA Technical Reports Server (NTRS)
Merewether, E. K.
1966-01-01
Computer programs control and evaluate instrumentation system performance for numerous rocket engine test facilities and prescribe calibration and maintenance techniques to maintain the systems within process specifications. Similar programs can be written for other test equipment in an industry such as the petrochemical industry.
BROJA-2PID: A Robust Estimator for Bivariate Partial Information Decomposition
NASA Astrophysics Data System (ADS)
Makkeh, Abdullah; Theis, Dirk; Vicente, Raul
2018-04-01
Makkeh, Theis, and Vicente found in [8] that the Cone Programming model is the most robust way to compute the Bertschinger et al. partial information decomposition (BROJA PID) measure [1]. We developed production-quality, robust software that computes the BROJA PID measure based on the Cone Programming model. In this paper, we prove the important property of strong duality for the Cone Program and prove an equivalence between the Cone Program and the original convex problem. We then describe our software in detail and explain how to use it.
NASA Technical Reports Server (NTRS)
Kleckner, R. J.; Rosenlieb, J. W.; Dyba, G.
1980-01-01
The results of a series of full-scale hardware tests comparing predictions of the SPHERBEAN computer program with measured data are presented. The SPHERBEAN program predicts the thermomechanical performance characteristics of high-speed lubricated double-row spherical roller bearings. The degree of correlation between performance predicted by SPHERBEAN and measured data is demonstrated. Experimental and calculated performance data are compared over a range in speed up to 19,400 rpm (0.8 MDN) under pure radial, pure axial, and combined loads.
Computer System Performance Measurement Techniques for ARTS III Computer Systems
DOT National Transportation Integrated Search
1973-12-01
The potential contribution of direct system measurement in the evolving ARTS 3 Program is discussed and software performance measurement techniques are comparatively assessed in terms of credibility of results, ease of implementation, volume of data,...
Operator's manual for a computer controlled impedance measurement system
NASA Astrophysics Data System (ADS)
Gordon, J.
1987-02-01
Operating instructions for a computer controlled impedance measurement system based on Hewlett-Packard instrumentation are given. Hardware details, program listings, flowcharts, and a practical application are included.
Estimating Relative Positions of Outer-Space Structures
NASA Technical Reports Server (NTRS)
Balian, Harry; Breckenridge, William; Brugarolas, Paul
2009-01-01
A computer program estimates the relative position and orientation of two structures from measurements, made by use of electronic cameras and laser range finders on one structure, of distances and angular positions of fiducial objects on the other structure. The program was written specifically for use in determining errors in the alignment of large structures deployed in outer space from a space shuttle. The program is based partly on equations for transformations among the various coordinate systems involved in the measurements and on equations that account for errors in the transformation operators. It computes a least-squares estimate of the relative position and orientation. Sequential least-squares estimates, acquired at a measurement rate of 4 Hz, are averaged by passing them through a fourth-order Butterworth filter. The program is executed in a computer aboard the space shuttle, and its position and orientation estimates are displayed to astronauts on a graphical user interface.
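A minimal 2-D, small-angle sketch of the least-squares pose estimate described above, assuming matched fiducial points: measured positions are modeled as the model points rotated by theta plus a translation (tx, ty), linearized as x' = x - theta*y + tx and y' = y + theta*x + ty. The fiducial coordinates and true offset below are invented, and the fourth-order Butterworth smoothing of the sequential 4 Hz estimates is not shown.

```python
# Hedged 2-D sketch of least-squares relative position/orientation
# estimation from matched fiducial points (small-angle linearization).

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def estimate_pose(model, measured):
    """Least-squares (tx, ty, theta) via the normal equations AtA z = Atb.
    Each matched point contributes rows [1 0 -y] and [0 1 x]."""
    AtA = [[0.0] * 3 for _ in range(3)]
    Atb = [0.0] * 3
    for (x, y), (xm, ym) in zip(model, measured):
        for a, r in (([1.0, 0.0, -y], xm - x), ([0.0, 1.0, x], ym - y)):
            for i in range(3):
                Atb[i] += a[i] * r
                for j in range(3):
                    AtA[i][j] += a[i] * a[j]
    return solve3(AtA, Atb)

model = [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0), (0.0, 5.0)]
true_tx, true_ty, true_th = 0.30, -0.20, 0.01
measured = [(x - true_th * y + true_tx, y + true_th * x + true_ty)
            for x, y in model]
tx, ty, th = estimate_pose(model, measured)
print(f"tx={tx:.3f} ty={ty:.3f} theta={th:.4f}")
```

With noise-free synthetic measurements the estimator recovers the offset exactly; with real measurements it returns the least-squares fit.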
Study Of Flow About A Helicopter Rotor
NASA Technical Reports Server (NTRS)
Tauber, Michael E.; Owen, F. Kevin
1989-01-01
Noninvasive instrument verifies computer program predicting velocities. Laser velocimeter measurements confirm predictions of transonic flow field around tip of helicopter-rotor blade. Report discusses measurements, which yield high-resolution orthogonal velocity components of flow field at rotor-tip. Mach numbers from 0.85 to 0.95, and use of measurements in verifying ability of computer program ROT22 to predict transonic flow field, including occurrences, strengths, and locations of shock waves causing high drag and noise.
How Do We Really Compute with Units?
ERIC Educational Resources Information Center
Fiedler, B. H.
2010-01-01
The methods that we teach students for computing with units of measurement are often not consistent with the practice of professionals. For professionals, the vast majority of computations with quantities of measure are performed within programs on electronic computers, for which an accounting for the units occurs only once, in the design of the…
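One way to make the "accounting for the units occurs only once" point concrete: a toy quantity type that carries base-unit exponents through arithmetic, so the unit bookkeeping is designed once in the type rather than repeated in every formula. This is an invented illustration, not any particular professional package.

```python
# Toy quantity type: values carry a dict of base-unit exponents, and
# arithmetic combines the exponents automatically.

class Qty:
    def __init__(self, value, units):
        self.value, self.units = value, dict(units)

    def __mul__(self, other):
        u = dict(self.units)
        for k, e in other.units.items():
            u[k] = u.get(k, 0) + e
        return Qty(self.value * other.value, {k: e for k, e in u.items() if e})

    def __truediv__(self, other):
        u = dict(self.units)
        for k, e in other.units.items():
            u[k] = u.get(k, 0) - e
        return Qty(self.value / other.value, {k: e for k, e in u.items() if e})

    def __repr__(self):
        return f"{self.value} {self.units}"

distance = Qty(120.0, {"m": 1})
time = Qty(60.0, {"s": 1})
speed = distance / time
print(speed)  # prints: 2.0 {'m': 1, 's': -1}
```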
NAVSIM 2: A computer program for simulating aided-inertial navigation for aircraft
NASA Technical Reports Server (NTRS)
Bjorkman, William S.
1987-01-01
NAVSIM II, a computer program for analytical simulation of aided-inertial navigation for aircraft, is described. The description is supported by a discussion of the program's application to the design and analysis of aided-inertial navigation systems as well as instructions for utilizing the program and for modifying it to accommodate new models, constraints, algorithms and scenarios. NAVSIM II simulates an airborne inertial navigation system built around a strapped-down inertial measurement unit and aided in its function by GPS, Doppler radar, altimeter, airspeed, and position-fix measurements. The measurements are incorporated into the navigation estimate via a UD-form Kalman filter. The simulation was designed and implemented using structured programming techniques and with particular attention to user-friendly operation.
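The measurement-update step at the heart of such aided-inertial filtering can be sketched in scalar form; the real program uses a UD-factorized multistate Kalman filter, and the altitude numbers below are invented.

```python
# Hedged scalar sketch of a Kalman measurement update: an inertial
# altitude estimate is corrected by an altimeter fix.

def kalman_update(x, p, z, r):
    """Fuse state estimate x (variance p) with measurement z (variance r)."""
    k = p / (p + r)                      # Kalman gain
    return x + k * (z - x), (1.0 - k) * p

x, p = 1005.0, 25.0                      # inertial estimate: 1005 m, var 25 m^2
z, r = 1000.0, 5.0                       # altimeter fix: 1000 m, var 5 m^2
x, p = kalman_update(x, p, z, r)
print(f"fused altitude: {x:.2f} m, variance {p:.2f} m^2")
```

The fused estimate lands between the prediction and the measurement, weighted toward the less uncertain source, and the variance shrinks.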
Chu, Adeline; Mastel-Smith, Beth
2010-01-01
Technology has a great impact on nursing practice. With the increasing numbers of older Americans using computers and the Internet in recent years, nurses have the capability to deliver effective and efficient health education to their patients and the community. Based on the theoretical framework of Bandura's self-efficacy theory, the pilot project reported findings from a 5-week computer course on Internet health searches in older adults, 65 years or older, at a senior activity learning center. Twelve participants were recruited and randomized to either the intervention or the control group. Measures of computer anxiety, computer confidence, and computer self-efficacy scores were analyzed at baseline, at the end of the program, and 6 weeks after the completion of the program. Analysis was conducted with repeated-measures analysis of variance. Findings showed participants who attended a structured computer course on Internet health information retrieval reported lowered anxiety and increased confidence and self-efficacy at the end of the 5-week program and 6 weeks after the completion of the program as compared with participants who were not in the program. The study demonstrated that a computer course can help reduce anxiety and increase confidence and self-efficacy in online health searches in older adults.
UFO (UnFold Operator) computer program abstract
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kissel, L.; Biggs, F.
UFO (UnFold Operator) is an interactive user-oriented computer program designed to solve a wide range of problems commonly encountered in physical measurements. This document provides a summary of the capabilities of version 3A of UFO.
Computing Spacecraft Solar-Cell Damage by Charged Particles
NASA Technical Reports Server (NTRS)
Gaddy, Edward M.
2006-01-01
General EQFlux is a computer program that converts the measure of the damage done to solar cells in outer space by impingement of electrons and protons having many different kinetic energies into the measure of the damage done by an equivalent fluence of electrons, each having kinetic energy of 1 MeV. Prior to the development of General EQFlux, there was no single computer program offering this capability: For a given type of solar cell, it was necessary to either perform the calculations manually or to use one of three Fortran programs, each of which was applicable to only one type of solar cell. The problem in developing General EQFlux was to rewrite and combine the three programs into a single program that could perform the calculations for three types of solar cells and run in a Windows environment with a Windows graphical user interface. In comparison with the three prior programs, General EQFlux is easier to use.
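The conversion such a program automates reduces to a weighted sum over the incident spectrum, sketched below. The relative damage coefficients and the spectrum are invented placeholders: real coefficients depend on the cell type and on which damage parameter (e.g., Pmax or Voc) is being matched.

```python
# Hedged sketch of collapsing a particle spectrum to an equivalent
# 1-MeV electron fluence via relative damage coefficients (RDCs).
# All numbers below are invented placeholders.

def equivalent_1mev_fluence(spectrum, rdc):
    """spectrum: {energy_MeV: fluence e-/cm^2};
    rdc: {energy_MeV: damage relative to a 1-MeV electron}."""
    return sum(fluence * rdc[e] for e, fluence in spectrum.items())

electron_spectrum = {0.5: 2.0e13, 1.0: 1.0e13, 3.0: 5.0e12}
rdc_electrons = {0.5: 0.4, 1.0: 1.0, 3.0: 2.8}   # placeholder RDCs

phi_eq = equivalent_1mev_fluence(electron_spectrum, rdc_electrons)
print(f"equivalent 1-MeV electron fluence: {phi_eq:.3e} e-/cm^2")
```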
NASA Technical Reports Server (NTRS)
Mullins, N. E.
1972-01-01
The GEODYN Orbit Determination and Geodetic Parameter Estimation System consists of a set of computer programs designed to determine and analyze definitive satellite orbits and their associated geodetic and measurement parameters. This manual describes the Support Programs used by the GEODYN System. The mathematics and programming descriptions are detailed. The operational procedures of each program are presented. GEODYN ancillary analysis programs may be grouped into three different categories: (1) orbit comparison - DELTA (2) data analysis using reference orbits - GEORGE, and (3) pass geometry computations - GROUNDTRACK. All of the above three programs use one or more tapes written by the GEODYN program in either a data reduction or orbit generator run.
Computer analysis of digital well logs
Scott, James H.
1984-01-01
A comprehensive system of computer programs has been developed by the U.S. Geological Survey for analyzing digital well logs. The programs are operational on a minicomputer in a research well-logging truck, making it possible to analyze and replot the logs while at the field site. The minicomputer also serves as a controller of digitizers, counters, and recorders during acquisition of well logs. The analytical programs are coordinated with the data acquisition programs in a flexible system that allows the operator to make changes quickly and easily in program variables such as calibration coefficients, measurement units, and plotting scales. The programs are designed to analyze the following well-logging measurements: natural gamma-ray, neutron-neutron, dual-detector density with caliper, magnetic susceptibility, single-point resistance, self potential, resistivity (normal and Wenner configurations), induced polarization, temperature, sonic delta-t, and sonic amplitude. The computer programs are designed to make basic corrections for depth displacements, tool response characteristics, hole diameter, and borehole fluid effects (when applicable). Corrected well-log measurements are output to magnetic tape or plotter with measurement units transformed to petrophysical and chemical units of interest, such as grade of uranium mineralization in percent eU3O8, neutron porosity index in percent, and sonic velocity in kilometers per second.
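The correct-then-scale pipeline the abstract lists can be sketched for one log type. The dead-time, borehole, and calibration coefficients below are invented placeholders, not USGS calibration values.

```python
# Hedged sketch of a natural gamma-ray log correction chain: dead-time
# correction, a borehole attenuation factor, then scaling to equivalent
# uranium grade. Coefficients are illustrative placeholders.

def gamma_to_grade(counts_per_sec, dead_time_s=5e-6,
                   hole_factor=1.15, k_factor=1.8e-4):
    n_true = counts_per_sec / (1.0 - counts_per_sec * dead_time_s)  # dead time
    n_corr = n_true * hole_factor          # hole diameter / borehole fluid
    return n_corr * k_factor               # calibration to percent eU3O8

grade = gamma_to_grade(2000.0)
print(f"grade: {grade:.3f} percent eU3O8")
```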
Handheld Computer Use in U.S. Family Practice Residency Programs
Criswell, Dan F.; Parchman, Michael L.
2002-01-01
Objective: The purpose of the study was to evaluate the uses of handheld computers (also called personal digital assistants, or PDAs) in family practice residency programs in the United States. Study Design: In November 2000, the authors mailed a questionnaire to the program directors of all American Academy of Family Physicians (AAFP) and American College of Osteopathic Family Practice (ACOFP) residency programs in the United States. Measurements: Data and patterns of the use and non-use of handheld computers were identified. Results: Approximately 50 percent (306 of 610) of the programs responded to the survey. Two-thirds of the programs reported that handheld computers were used in their residencies, and an additional 14 percent had plans for implementation within 24 months. Both the Palm and the Windows CE operating systems were used, with the Palm operating system the most common. Military programs had the highest rate of use (8 of 10 programs, 80 percent), and osteopathic programs had the lowest (23 of 55 programs, 42 percent). Of programs that reported handheld computer use, 45 percent had required handheld computer applications that were used uniformly by all users. Funding for handheld computers and related applications was non-budgeted in 76 percent of the programs in which handheld computers were used. In programs providing a budget for handheld computers, the average annual budget per user was $461.58. Interested faculty or residents, rather than computer information services personnel, performed upkeep and maintenance of handheld computers in 72 percent of the programs in which the computers were used. In addition to the installed calendar, memo pad, and address book, the most common clinical uses of handheld computers in the programs were as medication reference tools, electronic textbooks, and clinical computational or calculator-type programs. Conclusions: Handheld computers are widely used in family practice residency programs in the United States.
Although handheld computers were designed as electronic organizers, in family practice residencies they are used as medication reference tools, electronic textbooks, and clinical computational programs and to track activities that were previously associated with desktop database applications. PMID:11751806
NASA Tech Briefs, June 2000. Volume 24, No. 6
NASA Technical Reports Server (NTRS)
2000-01-01
Topics include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Test and Measurement; Physical Sciences; Materials; Computer Programs; Computers and Peripherals;
Measuring Speed Using a Computer--Several Techniques.
ERIC Educational Resources Information Center
Pearce, Jon M.
1988-01-01
Introduces three different techniques to facilitate the measurement of speed and the associated kinematics and dynamics using a computer. Discusses sensing techniques using optical or ultrasonic sensors, interfacing with a computer, software routines for the interfaces, and other applications. Provides circuit diagrams, pictures, and a program to…
Computing quantum hashing in the model of quantum branching programs
NASA Astrophysics Data System (ADS)
Ablayev, Farid; Ablayev, Marat; Vasiliev, Alexander
2018-02-01
We investigate the branching program complexity of quantum hashing. We consider a quantum hash function that maps elements of a finite field into quantum states. We require that this function is preimage-resistant and collision-resistant. We consider two complexity measures for Quantum Branching Programs (QBP): the number of qubits and the number of computational steps. We show that the quantum hash function can be computed efficiently. Moreover, we prove that such a QBP construction is optimal. That is, we prove lower bounds that match the constructed quantum hash function computation.
ERIC Educational Resources Information Center
Hansen, David L.; Morgan, Robert L.
2008-01-01
This research evaluated effects of a multi-media computer-based instruction (CBI) program designed to teach grocery store purchasing skills to three high-school students with intellectual disabilities. A multiple baseline design across participants used measures of computer performance mastery and grocery store probes to evaluate the CBI. All…
Influence of direct computer experience on older adults' attitudes toward computers.
Jay, G M; Willis, S L
1992-07-01
This research examined whether older adults' attitudes toward computers became more positive as a function of computer experience. The sample comprised 101 community-dwelling older adults aged 57 to 87. The intervention involved a 2-week computer training program in which subjects learned to use a desktop publishing software program. A multidimensional computer attitude measure was used to assess differential attitude change and maintenance of change following training. The results indicated that older adults' computer attitudes are modifiable and that direct computer experience is an effective means of change. Attitude change as a function of training was found for the attitude dimensions targeted by the intervention program: computer comfort and efficacy. In addition, maintenance of attitude change was established for at least two weeks following training.
Computer-Aided Instruction in Automated Instrumentation.
ERIC Educational Resources Information Center
Stephenson, David T.
1986-01-01
Discusses functions of automated instrumentation systems, i.e., systems which combine electrical measuring instruments and a controlling computer to measure responses of a unit under test. The computer-assisted tutorial then described is programmed for use on such a system--a modern microwave spectrum analyzer--to introduce engineering students to…
NASA Technical Reports Server (NTRS)
Stocks, Dana R.
1986-01-01
The Dynamic Gas Temperature Measurement System compensation software accepts digitized data from two different diameter thermocouples and computes a compensated frequency response spectrum for one of the thermocouples. Detailed discussions of the physical system, analytical model, and computer software are presented in this volume and in Volume 1 of this report under Task 3. Computer program software restrictions and test cases are also presented. Compensated and uncompensated data may be presented in either the time or frequency domain. Time domain data are presented as instantaneous temperature vs time. Frequency domain data may be presented in several forms such as power spectral density vs frequency.
Computer Programs (Turbomachinery)
NASA Technical Reports Server (NTRS)
1978-01-01
NASA computer programs are extensively used in the design of industrial equipment. Available from the Computer Software Management and Information Center (COSMIC) at the University of Georgia, these programs are employed as analysis tools in design, test, and development processes, providing savings in time and money. For example, two NASA computer programs are used daily in the design of turbomachinery by Delaval Turbine Division, Trenton, New Jersey. The company uses the NASA spline interpolation routine for analysis of turbine blade vibration and the performance of compressors and condensers. A second program, the NASA print plot routine, analyzes turbine rotor response and produces graphs for project reports. The photos show examples of Delaval test operations in which the computer programs play a part. In the large photo below, a 24-inch turbine blade is undergoing test; in the smaller photo, a steam turbine rotor is being prepared for stress measurements under actual operating conditions; the "spaghetti" is wiring for test instrumentation.
A FORTRAN Program for Computing Refractive Index Using the Double Variation Method.
ERIC Educational Resources Information Center
Blanchard, Frank N.
1984-01-01
Describes a computer program which calculates a best estimate of refractive index and dispersion from a large number of observations using the double variation method of measuring refractive index along with Sellmeier constants of the immersion oils. Program listing with examples will be provided on written request to the author. (Author/JM)
McKay, E
2000-01-01
An innovative research program was devised to investigate the interactive effect of instructional strategies enhanced with text-plus-textual metaphors or text-plus-graphical metaphors, and cognitive style, on the acquisition of programming concepts. The Cognitive Styles Analysis (CSA) program (Riding, 1991) was used to establish the participants' cognitive style. The QUEST Interactive Test Analysis System (Adams and Khoo, 1996) provided the cognitive performance measuring tool, which ensured an absence of measurement error in the programming knowledge testing instruments. Therefore, reliability of the instrumentation was assured through the calibration techniques utilized by the QUEST estimates, providing predictability of the research design. A means analysis of the QUEST data, using the Cohen (1977) approach to effect size and statistical power, further quantified the significance of the findings. The experimental methodology adopted for this research links the disciplines of instructional science, cognitive psychology, and objective measurement to provide reliable mechanisms for beneficial use in the evaluation of cognitive performance by the education, training, and development sectors. Furthermore, the research outcomes will be of interest to educators, cognitive psychologists, communications engineers, and computer scientists specializing in computer-human interaction.
Methods for evaluating and ranking transportation energy conservation programs
NASA Astrophysics Data System (ADS)
Santone, L. C.
1981-04-01
The energy conservation programs are assessed in terms of petroleum savings, incremental costs to consumers, probability of technical and market success, and external impacts due to environmental, economic, and social factors. Three ranking functions and a policy matrix are used to evaluate the programs. The net-present-value measure, which computes the present worth of petroleum savings less the present worth of costs, is modified by dividing by the present value of DOE funding to obtain a net present value per program dollar. The comprehensive ranking function takes external impacts into account. Procedures are described for making computations of the ranking functions and the attributes that require computation. Computations are made for the electric vehicle, Stirling engine, gas turbine, and MPG mileage guide programs.
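The net-present-value-per-program-dollar ranking measure described above can be sketched in a few lines. The function names and the 10% discount rate below are illustrative assumptions, not taken from the report:

```python
def present_value(cash_flows, rate):
    """Discount a series of annual cash flows (year 0 first) to present value."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def npv_per_program_dollar(savings, costs, funding, rate=0.1):
    """Present worth of petroleum savings less present worth of costs,
    divided by the present value of DOE program funding."""
    return (present_value(savings, rate)
            - present_value(costs, rate)) / present_value(funding, rate)
```

A program whose discounted savings exceed discounted costs by twice its discounted funding would score 2.0 on this measure.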
An acceptable role for computers in the aircraft design process
NASA Technical Reports Server (NTRS)
Gregory, T. J.; Roberts, L.
1980-01-01
Some of the reasons why the computerization trend is not wholly accepted are explored for two typical cases: computer use in the technical specialties and computer use in aircraft synthesis. The factors that limit acceptance are traced in part, to the large resources needed to understand the details of computer programs, the inability to include measured data as input to many of the theoretical programs, and the presentation of final results without supporting intermediate answers. Other factors are due solely to technical issues such as limited detail in aircraft synthesis and major simplifying assumptions in the technical specialties. These factors and others can be influenced by the technical specialist and aircraft designer. Some of these factors may become less significant as the computerization process evolves, but some issues, such as understanding large integrated systems, may remain issues in the future. Suggestions for improved acceptance include publishing computer programs so that they may be reviewed, edited, and read. Other mechanisms include extensive modularization of programs and ways to include measured information as part of the input to theoretical approaches.
Software For Computing Reliability Of Other Software
NASA Technical Reports Server (NTRS)
Nikora, Allen; Antczak, Thomas M.; Lyu, Michael
1995-01-01
Computer Aided Software Reliability Estimation (CASRE) computer program developed for use in measuring reliability of other software. Easier for non-specialists in reliability to use than many other currently available programs developed for same purpose. CASRE incorporates mathematical modeling capabilities of public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in Windows software environment. Provides menu-driven command interface; enabling and disabling of menu options guides user through (1) selection of set of failure data, (2) execution of mathematical model, and (3) analysis of results from model. Written in C language.
Sum and mean. Standard programs for activation analysis.
Lindstrom, R M
1994-01-01
Two computer programs in use for over a decade in the Nuclear Methods Group at NIST illustrate the utility of standard software: programs widely available and widely used, in which (ideally) well-tested public algorithms produce results that are well understood, and thereby capable of comparison, within the community of users. Sum interactively computes the position, net area, and uncertainty of the area of spectral peaks, and can give better results than automatic peak search programs when peaks are very small, very large, or unusually shaped. Mean combines unequal measurements of a single quantity, tests for consistency, and obtains the weighted mean and six measures of its uncertainty.
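The combination of unequal measurements that a program like Mean performs can be sketched as a generic inverse-variance weighted mean with internal and external uncertainties and a chi-square consistency statistic. This is not NIST's actual code, and the function name is hypothetical:

```python
import math

def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean of unequal measurements.
    Returns the mean, the internal uncertainty (from the quoted sigmas alone),
    the external uncertainty (scaled by the observed scatter), and the
    chi-square statistic used to test consistency of the measurements."""
    w = [1.0 / s ** 2 for s in sigmas]
    mean = sum(wi * x for wi, x in zip(w, values)) / sum(w)
    internal = 1.0 / math.sqrt(sum(w))
    chi2 = sum(wi * (x - mean) ** 2 for wi, x in zip(w, values))
    external = internal * math.sqrt(chi2 / (len(values) - 1))
    return mean, internal, external, chi2
```

A chi-square much larger than the number of degrees of freedom signals inconsistent measurements, in which case the external uncertainty is the more honest one to quote.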
Computing Programs for Determining Traffic Flows from Roundabouts
NASA Astrophysics Data System (ADS)
Boroiu, A. A.; Tabacu, I.; Ene, A.; Neagu, E.; Boroiu, A.
2017-10-01
For modelling road traffic at the level of a road network, it is necessary to specify the flows of all traffic currents at each intersection. These data can be obtained by direct measurements at traffic-light intersections, but in the case of a roundabout this is not possible directly, and neither the literature nor the traffic modelling software offers ways to solve this issue. Two sets of formulas are proposed by which all traffic flows in roundabouts with 3 or 4 arms are calculated based on the streams that can be measured. The objective of this paper is to develop computational programs that operate with these formulas. For each of the two sets of analytical relations, a computational program was developed in the Java programming language. The obtained results fully confirm the applicability of the calculation programs. The final stage in capitalizing on these programs will be to make them available as web pages in HTML format, so that they can be accessed and used on the Internet. The achievements presented in this paper are an important step toward providing a necessary tool for traffic modelling, because these computational programs can be easily integrated into specialized software.
Engineering and programming manual: Two-dimensional kinetic reference computer program (TDK)
NASA Technical Reports Server (NTRS)
Nickerson, G. R.; Dang, L. D.; Coats, D. E.
1985-01-01
The Two Dimensional Kinetics (TDK) computer program is a primary tool in applying the JANNAF liquid rocket thrust chamber performance prediction methodology. The development of a methodology that includes all aspects of rocket engine performance from analytical calculation to test measurements, that is physically accurate and consistent, and that serves as an industry and government reference is presented. Recent interest in rocket engines that operate at high expansion ratio, such as most Orbit Transfer Vehicle (OTV) engine designs, has required an extension of the analytical methods used by the TDK computer program. Thus, the version of TDK that is described in this manual is in many respects different from the 1973 version of the program. This new material reflects the new capabilities of the TDK computer program, the most important of which are described.
Computation of transonic potential flow about 3 dimensional inlets, ducts, and bodies
NASA Technical Reports Server (NTRS)
Reyhner, T. A.
1982-01-01
An analysis was developed and a computer code, P465 Version A, written for the prediction of transonic potential flow about three dimensional objects including inlet, duct, and body geometries. Finite differences and line relaxation are used to solve the complete potential flow equation. The coordinate system used for the calculations is independent of body geometry. Cylindrical coordinates are used for the computer code. The analysis is programmed in extended FORTRAN 4 for the CYBER 203 vector computer. The programming of the analysis is oriented toward taking advantage of the vector processing capabilities of this computer. Comparisons of computed results with experimental measurements are presented to verify the analysis. Descriptions of program input and output formats are also presented.
ERIC Educational Resources Information Center
Mahaffey, Michael L.; McKillip, William D.
This manual is designed for teachers using the Career Oriented Mathematics units on owning an automobile and driving as a career, retail sales, measurement, and area-perimeter. The volume begins with a discussion of the philosophy and scheduling of the program which is designed to improve students' attitudes and ability in computation by…
NASA Tech Briefs, February 2000. Volume 24, No. 2
NASA Technical Reports Server (NTRS)
2000-01-01
Topics covered include: Test and Measurement; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Bio-Medical; Mathematics and Information Sciences; Computers and Peripherals.
DepositScan, a Scanning Program to Measure Spray Deposition Distributions
USDA-ARS?s Scientific Manuscript database
DepositScan, a scanning program was developed to quickly measure spray deposit distributions on water sensitive papers or Kromekote cards which are widely used for determinations of pesticide spray deposition quality on target areas. The program is installed in a portable computer and works with a ...
The Impact of the Measures of Academic Progress (MAP) Program on Student Reading Achievement
ERIC Educational Resources Information Center
Cordray, David S.; Pion, Georgine M.; Brandt, Chris; Molefe, Ayrin
2013-01-01
One of the most widely used commercially available systems incorporating benchmark assessment and training in differentiated instruction is the Northwest Evaluation Association's (NWEA) Measures of Academic Progress (MAP) program. The MAP program involves two components: (1) computer-adaptive assessments administered to students three to four…
Simulation of n-qubit quantum systems. V. Quantum measurements
NASA Astrophysics Data System (ADS)
Radtke, T.; Fritzsche, S.
2010-02-01
The FEYNMAN program has been developed during the last years to support case studies on the dynamics and entanglement of n-qubit quantum registers. Apart from basic transformations and (gate) operations, it currently supports a good number of separability criteria and entanglement measures, quantum channels, as well as the parametrizations of various frequently applied objects in quantum information theory, such as (pure and mixed) quantum states, hermitian and unitary matrices, or classical probability distributions. With the present update of the FEYNMAN program, we provide simple access to (the simulation of) quantum measurements. This includes not only the widely applied projective measurements upon the eigenspaces of some given operator, but also single-qubit measurements in various pre- and user-defined bases, as well as support for two-qubit Bell measurements. In addition, we support generalized and POVM measurements. Knowing the importance of measurements for many quantum information protocols, e.g., one-way computing, we hope that this update makes the FEYNMAN code an attractive and versatile tool for both research and education. New version program summary. Program title: FEYNMAN. Catalogue identifier: ADWE_v5_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWE_v5_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 27 210. No. of bytes in distributed program, including test data, etc.: 1 960 471. Distribution format: tar.gz. Programming language: Maple 12. Computer: Any computer with Maple software installed. Operating system: Any system that supports Maple; the program has been tested under Microsoft Windows XP and Linux. Classification: 4.15. Catalogue identifier of previous version: ADWE_v4_0. Journal reference of previous version: Comput. Phys. Commun.
179 (2008) 647. Does the new version supersede the previous version?: Yes. Nature of problem: During the last decade, the field of quantum information science has largely contributed to our understanding of quantum mechanics, and has also provided new and efficient protocols that are based on quantum entanglement. To further analyze the amount and transfer of entanglement in n-qubit quantum protocols, symbolic and numerical simulations need to be handled efficiently. Solution method: Using the computer algebra system Maple, we developed a set of procedures in order to support the definition, manipulation, and analysis of n-qubit quantum registers. These procedures also help to deal with (unitary) logic gates and (nonunitary) quantum operations and measurements that act upon the quantum registers. All commands are organized in a hierarchical order and can be used interactively in order to simulate and analyze the evolution of n-qubit quantum systems, both in ideal and noisy quantum circuits. Reasons for new version: Until the present, the FEYNMAN program supported the basic data structures and operations of n-qubit quantum registers [1], a good number of separability and entanglement measures [2], quantum operations (noisy channels) [3], as well as the parametrizations of various frequently applied objects, such as (pure and mixed) quantum states, hermitian and unitary matrices, or classical probability distributions [4]. With the current extension, we here add all necessary features to simulate quantum measurements, including projective measurements in various single-qubit bases and the two-qubit Bell basis, and POVM measurements. Together with the previously implemented functionality, this greatly enhances the possibilities of analyzing quantum information protocols in which measurements play a central role, e.g., one-way computation.
Running time: Most commands require ⩽10 seconds of processor time on a Pentium 4 processor (⩾2 GHz) or newer, if they work with quantum registers of five or fewer qubits. Moreover, about 5-20 MB of working memory is typically needed (in addition to the memory for the Maple environment itself). However, especially when working with symbolic expressions, the requirements on CPU time and memory depend critically on the size of the quantum registers, owing to the exponential growth of the dimension of the associated Hilbert space. For example, complex (symbolic) noise models, i.e. those with several Kraus operators, may result in very large expressions that dramatically slow down the evaluation of, e.g., distance measures or the final-state entropy. In these cases, Maple's assume facility sometimes helps to reduce the complexity of the symbolic expressions, but more often than not only a numerical evaluation is feasible. Since the various commands can be applied to quite different scenarios, no general scaling rule can be given for the CPU time or memory requirements. References: [1] T. Radtke, S. Fritzsche, Comput. Phys. Commun. 173 (2005) 91. [2] T. Radtke, S. Fritzsche, Comput. Phys. Commun. 175 (2006) 145. [3] T. Radtke, S. Fritzsche, Comput. Phys. Commun. 176 (2007) 617. [4] T. Radtke, S. Fritzsche, Comput. Phys. Commun. 179 (2008) 647.
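As an illustration of the projective single-qubit measurements this update simulates, here is a minimal NumPy sketch of the Born rule on an n-qubit state vector. It is a plain-Python analogue, not the FEYNMAN Maple code, and the function name is hypothetical:

```python
import numpy as np

def measure_qubit(state, qubit, n):
    """Projectively measure one qubit of an n-qubit state vector in the
    computational basis. Returns the Born-rule probabilities of outcomes
    0 and 1 and the renormalized post-measurement state vectors."""
    psi = state.reshape([2] * n)
    # Probability of outcome 0: squared norm of the slice where `qubit` is |0>
    p0 = np.sum(np.abs(np.take(psi, 0, axis=qubit)) ** 2)
    p1 = 1.0 - p0
    post = []
    for outcome, p in ((0, p0), (1, p1)):
        proj = np.zeros_like(psi)           # project onto the outcome subspace
        idx = [slice(None)] * n
        idx[qubit] = outcome
        proj[tuple(idx)] = np.take(psi, outcome, axis=qubit)
        post.append(proj.reshape(-1) / np.sqrt(p) if p > 0 else None)
    return (p0, p1), post
```

Measuring the first qubit of a Bell state with this routine yields each outcome with probability 1/2 and collapses the register to |00> or |11> accordingly.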
REST: a computer system for estimating logging residue by using the line-intersect method
A. Jeff Martin
1975-01-01
A computer program was designed to accept logging-residue measurements obtained by line-intersect sampling and transform them into summaries useful for the land manager. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
Automated validation of a computer operating system
NASA Technical Reports Server (NTRS)
Dervage, M. M.; Milberg, B. A.
1970-01-01
Programs apply selected input/output loads to complex computer operating system and measure performance of that system under such loads. Technique lends itself to checkout of computer software designed to monitor automated complex industrial systems.
Analysis and calculation of lightning-induced voltages in aircraft electrical circuits
NASA Technical Reports Server (NTRS)
Plumer, J. A.
1974-01-01
Techniques to calculate the transfer functions relating lightning-induced voltages in aircraft electrical circuits to aircraft physical characteristics and lightning current parameters are discussed. The analytical work was carried out concurrently with an experimental program of measurements of lightning-induced voltages in the electrical circuits of an F89-J aircraft. A computer program, ETCAL, developed earlier to calculate resistive and inductive transfer functions is refined to account for skin effect, providing results more valid over a wider range of lightning waveshapes than formerly possible. A computer program, WING, is derived to calculate the resistive and inductive transfer functions between a basic aircraft wing and a circuit conductor inside it. Good agreement is obtained between transfer inductances calculated by WING and those reduced from measured data by ETCAL. This computer program shows promise of expansion to permit eventual calculation of potential lightning-induced voltages in electrical circuits of complete aircraft in the design stage.
NASA Technical Reports Server (NTRS)
Maine, R. E.; Iliff, K. W.
1980-01-01
A new formulation is proposed for the problem of parameter estimation of dynamic systems with both process and measurement noise. The formulation gives estimates that are maximum likelihood asymptotically in time. The means used to overcome the difficulties encountered by previous formulations are discussed. It is then shown how the proposed formulation can be efficiently implemented in a computer program. A computer program using the proposed formulation is available in a form suitable for routine application. Examples with simulated and real data are given to illustrate that the program works well.
Possible 6-qubit NMR quantum computer device material; simulator of the NMR line width
NASA Astrophysics Data System (ADS)
Hashi, K.; Kitazawa, H.; Shimizu, T.; Goto, A.; Eguchi, S.; Ohki, S.
2002-12-01
For an NMR quantum computer, the splitting of an NMR spectrum must be larger than the line width. In order to find the best device material for a solid-state NMR quantum computer, we have developed a simulation program to calculate the NMR line width due to the nuclear dipole field by the second-moment method. The program utilizes the lattice information prepared by commercial software for drawing crystal structures. By applying this program, we can estimate the NMR line width due to the nuclear dipole field without measurements and find a candidate material for a 6-qubit solid-state NMR quantum computer device.
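For dipolar broadening, the second-moment method reduces to a lattice sum over neighboring nuclei. The sketch below computes only the geometric sum Σ_k (1 - 3cos²θ_k)² / r_k⁶ that enters the Van Vleck second-moment formula; the physical prefactors (gyromagnetic ratio, ħ, spin factors) are omitted, and the function name is hypothetical:

```python
import math

def dipolar_lattice_sum(positions, field_axis=(0.0, 0.0, 1.0)):
    """Geometric lattice sum sum_k (1 - 3 cos^2 theta_k)^2 / r_k^6 entering the
    Van Vleck second moment of the dipolar NMR line width. `positions` are
    neighbor coordinates relative to the nucleus at the origin; theta_k is the
    angle between the internuclear vector and the applied field axis."""
    bx, by, bz = field_axis
    bnorm = math.sqrt(bx * bx + by * by + bz * bz)
    total = 0.0
    for x, y, z in positions:
        r = math.sqrt(x * x + y * y + z * z)
        cos_t = (x * bx + y * by + z * bz) / (r * bnorm)
        total += (1.0 - 3.0 * cos_t ** 2) ** 2 / r ** 6
    return total
```

A neighbor along the field axis contributes (1 - 3)² = 4 per unit r⁻⁶, while one perpendicular to it contributes 1, which is why the line width depends on crystal orientation.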
ERIC Educational Resources Information Center
Denner, Jill; Werner, Linda; Ortiz, Eloy
2012-01-01
Computer game programming has been touted as a promising strategy for engaging children in the kinds of thinking that will prepare them to be producers, not just users of technology. But little is known about what they learn when programming a game. In this article, we present a strategy for coding student games, and summarize the results of an…
NASA Technical Reports Server (NTRS)
Plankey, B.
1981-01-01
A computer program called ECPVER (Energy Consumption Program - Verification) was developed to simulate all energy loads for any number of buildings. The program computes simulated daily, monthly, and yearly energy consumption which can be compared with actual meter readings for the same time period. Such comparison can lead to validation of the model under a variety of conditions, which allows it to be used to predict future energy saving due to energy conservation measures. Predicted energy saving can then be compared with actual saving to verify the effectiveness of those energy conservation changes. This verification procedure is planned to be an important advancement in the Deep Space Network Energy Project, which seeks to reduce energy cost and consumption at all DSN Deep Space Stations.
Computer Processing Of Tunable-Diode-Laser Spectra
NASA Technical Reports Server (NTRS)
May, Randy D.
1991-01-01
Tunable-diode-laser spectrometer measuring transmission spectrum of gas operates under control of computer, which also processes measurement data. Measurements in three channels processed into spectra. Computer controls current supplied to tunable diode laser, stepping it through small increments of wavelength while processing spectral measurements at each step. Program includes library of routines for general manipulation and plotting of spectra, least-squares fitting of direct-transmission and harmonic-absorption spectra, and deconvolution for determination of laser linewidth and for removal of instrumental broadening of spectral lines.
Computers and Technological Forecasting
ERIC Educational Resources Information Center
Martino, Joseph P.
1971-01-01
Forecasting is becoming increasingly automated, thanks in large measure to the computer. It is now possible for a forecaster to submit his data to a computation center and call for the appropriate program. (No knowledge of statistics is required.) (Author)
HYSEP: A Computer Program for Streamflow Hydrograph Separation and Analysis
Sloto, Ronald A.; Crouse, Michele Y.
1996-01-01
HYSEP is a computer program that can be used to separate a streamflow hydrograph into base-flow and surface-runoff components. The base-flow component has traditionally been associated with ground-water discharge and the surface-runoff component with precipitation that enters the stream as overland runoff. HYSEP includes three methods of hydrograph separation that are referred to in the literature as the fixed-interval, sliding-interval, and local-minimum methods. The program also describes the frequency and duration of measured streamflow and computed base flow and surface runoff. Daily mean stream discharge is used as input to the program in either an American Standard Code for Information Interchange (ASCII) or binary format. Output from the program includes tables, graphs, and data files. Graphical output may be plotted on the computer screen or output to a printer, plotter, or metafile.
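The fixed-interval method can be sketched in a few lines: base flow within each consecutive interval of 2N* days is set to the minimum daily discharge in that interval. This is a simplified illustration of the published algorithm (the derivation of the interval width from drainage area is omitted), and the function name is hypothetical:

```python
def fixed_interval_baseflow(discharge, width):
    """Fixed-interval hydrograph separation (sketch): assign to every day in
    each consecutive block of `width` days the minimum daily discharge of
    that block. `width` is the odd interval length 2N* in days."""
    base = []
    for start in range(0, len(discharge), width):
        block = discharge[start:start + width]
        base.extend([min(block)] * len(block))
    return base
```

Surface runoff for each day is then the measured discharge minus this base-flow value; by construction the base flow never exceeds the streamflow.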
HIFiRE-1 Turbulent Shock Boundary Layer Interaction - Flight Data and Computations
NASA Technical Reports Server (NTRS)
Kimmel, Roger L.; Prabhu, Dinesh
2015-01-01
The Hypersonic International Flight Research Experimentation (HIFiRE) program is a hypersonic flight test program executed by the Air Force Research Laboratory (AFRL) and Australian Defence Science and Technology Organisation (DSTO). This flight contained a cylinder-flare induced shock boundary layer interaction (SBLI). Computations of the interaction were conducted for a number of times during the ascent. The DPLR code used for predictions was calibrated against ground test data prior to exercising the code at flight conditions. Generally, the computations predicted the upstream influence and interaction pressures very well. Plateau pressures on the cylinder were predicted well at all conditions. Although the experimental heat transfer showed a large amount of scatter, especially at low heating levels, the measured heat transfer agreed well with computations. The primary discrepancy between the experiment and computation occurred in the pressures measured on the flare during second stage burn. Measured pressures exhibited large overshoots late in the second stage burn, the mechanism of which is unknown. The good agreement between flight measurements and CFD helps validate the philosophy of calibrating CFD against ground test, prior to exercising it at flight conditions.
An Assessment of Research-Doctorate Programs in the United States: Mathematical & Physical Sciences.
ERIC Educational Resources Information Center
Jones, Lyle V., Ed.; And Others
The quality of doctoral-level chemistry (N=145), computer science (N=58), geoscience (N=91), mathematics (N=115), physics (N=123), and statistics/biostatistics (N=64) programs at United States universities was assessed, using 16 measures. These measures focused on variables related to: program size; characteristics of graduates; reputational…
Consistent and efficient processing of ADCP streamflow measurements
Mueller, David S.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan
2016-01-01
The use of Acoustic Doppler Current Profilers (ADCPs) from a moving boat is a commonly used method for measuring streamflow. Currently, the algorithms used to compute the average depth, compute edge discharge, identify invalid data, and estimate velocity and discharge for invalid data vary among manufacturers. These differences could result in different discharges being computed from identical data. A consistent computational algorithm, automated filtering, and quality assessment of ADCP streamflow measurements, independent of the ADCP manufacturer, are being developed in a software program that can process ADCP moving-boat discharge measurements regardless of the ADCP used to collect the data.
ERIC Educational Resources Information Center
Ellis, Ashley F.
2014-01-01
The purpose of this mixed methods program evaluation study was to investigate the ways in which one public school district and its teachers implemented a Bring Your Own Technology (BYOT) initiative. This study also measured teachers' computer self-efficacy, as measured by Cassidy and Eachus' (2002) Computer User Self-Efficacy Scale, and…
The Use of Computers and Video Games in Brain Damage Therapy.
ERIC Educational Resources Information Center
Lorimer, David
The use of computer assisted therapy (CAT) in the rehabilitation of individuals with brain damage is examined. Hardware considerations are explored, and the variety of software programs available for brain injury rehabilitation is discussed. Structured testing and treatment programs in time measurement, memory, and direction finding are described,…
High performance computing and communications: Advancing the frontiers of information technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-12-31
This report, which supplements the President's Fiscal Year 1997 Budget, describes the interagency High Performance Computing and Communications (HPCC) Program. The HPCC Program will celebrate its fifth anniversary in October 1996 with an impressive array of accomplishments to its credit. Over its five-year history, the HPCC Program has focused on developing high performance computing and communications technologies that can be applied to computation-intensive applications. Major highlights for FY 1996: (1) High performance computing systems enable practical solutions to complex problems with accuracies not possible five years ago; (2) HPCC-funded research in very large scale networking techniques has been instrumental in the evolution of the Internet, which continues exponential growth in size, speed, and availability of information; (3) The combination of hardware capability measured in gigaflop/s, networking technology measured in gigabit/s, and new computational science techniques for modeling phenomena has demonstrated that very large scale accurate scientific calculations can be executed across heterogeneous parallel processing systems located thousands of miles apart; (4) Federal investments in HPCC software R and D support researchers who pioneered the development of parallel languages and compilers, high performance mathematical, engineering, and scientific libraries, and software tools--technologies that allow scientists to use powerful parallel systems to focus on Federal agency mission applications; and (5) HPCC support for virtual environments has enabled the development of immersive technologies, where researchers can explore and manipulate multi-dimensional scientific and engineering problems. Educational programs fostered by the HPCC Program have brought into classrooms new science and engineering curricula designed to teach computational science.
This document contains a small sample of the significant HPCC Program accomplishments in FY 1996.
NASA Tech Briefs, May 2000. Volume 24, No. 5
NASA Technical Reports Server (NTRS)
2000-01-01
Topics include: Sensors: Test and Measurement; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Composites and Plastics; Materials; Computer Programs; Mechanics;
DOT National Transportation Integrated Search
1978-09-01
This report documents comparisons between extensive rail freight service measurements (previously presented in Volume II) and simulations of the same operations using a sophisticated train performance calculator computer program. The comparisons cove...
Mairesse, Olivier; Hofmans, Joeri; Theuns, Peter
2008-05-01
We propose a free, easy-to-use computer program that does not require prior knowledge of computer programming to generate and run experiments using textual or pictorial stimuli. Although the FM Experiment Builder suite was initially programmed for building and conducting FM experiments, it can also be applied to non-FM experiments that require randomized, single-, or multifactorial designs. The program is highly configurable, allowing multilingual use and a wide range of different response formats. The outputs of the experiments are Microsoft Excel compatible .xls files that allow easy copy-and-paste of the results into Weiss's FM CalSTAT program (2006) or any other statistical package. Its Java-based structure is compatible with both Windows and Macintosh operating systems, and its compactness (< 1 MB) makes it easily distributable over the Internet.
Synfuel program analysis. Volume 2: VENVAL users manual
NASA Astrophysics Data System (ADS)
Muddiman, J. B.; Whelan, J. W.
1980-07-01
This volume is intended for program analysts and is a users manual for the VENVAL model. It contains specific explanations as to input data requirements and programming procedures for the use of this model. VENVAL is a generalized computer program to aid in evaluation of prospective private sector production ventures. The program can project interrelated values of installed capacity, production, sales revenue, operating costs, depreciation, investment, debt, earnings, taxes, return on investment, depletion, and cash flow measures. It can also compute related public sector and other external costs and revenues if unit costs are furnished.
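The chain from revenue through taxes to cash flow that such a model projects can be sketched in a few lines. This is a hypothetical illustration of the bookkeeping with made-up figures, not VENVAL's actual formulation:

```python
# Hedged sketch of a one-year venture cash-flow projection of the kind
# VENVAL performs. All names and figures are illustrative assumptions.

def project_year(revenue, operating_cost, depreciation, tax_rate):
    """Return (taxes, net_earnings, cash_flow) for one operating year."""
    taxable_income = revenue - operating_cost - depreciation
    taxes = max(taxable_income, 0.0) * tax_rate
    net_earnings = taxable_income - taxes
    # Depreciation is a non-cash charge, so it is added back for cash flow.
    cash_flow = net_earnings + depreciation
    return taxes, net_earnings, cash_flow

taxes, earnings, cash = project_year(
    revenue=120.0, operating_cost=70.0, depreciation=10.0, tax_rate=0.40)
roi = earnings / 100.0  # return on an assumed 100-unit investment
```

Each projected year feeds the next in a real model; the single-year step above only shows how the earnings, tax, and cash-flow measures interrelate.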
Potential-Field Geophysical Software for the PC
1995-01-01
The computer programs of the Potential-Field Software Package run under the DOS operating system on IBM-compatible personal computers. They are used for the processing, display, and interpretation of potential-field geophysical data (gravity- and magnetic-field measurements) and other data sets that can be represented as grids or profiles. These programs have been developed on a variety of computer systems over a period of 25 years by the U.S. Geological Survey.
Parallel computers - Estimate errors caused by imprecise data
NASA Technical Reports Server (NTRS)
Kreinovich, Vladik; Bernat, Andrew; Villa, Elsa; Mariscal, Yvonne
1991-01-01
A new approach to the problem of estimating errors caused by imprecise data is proposed in the context of software engineering. The ideal solution would be a software device in which the computer is capable of computing the errors of arbitrary programs. The software engineering aspect of the problem is to describe such a device in software terms and then to provide the user with precise numbers accompanied by error estimates. The feasibility of a program capable of computing both a quantity and its error estimate over the range of possible measurement errors is demonstrated.
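One standard way to realize such a device is interval arithmetic: run the computation on ranges of possible measured values and read the error estimate off the resulting bounds. The minimal sketch below is a hypothetical illustration of that idea, not the authors' software:

```python
# Minimal interval-arithmetic sketch: propagate measurement uncertainty
# through a computation by operating on [lo, hi] bounds.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product's bounds come from the extreme endpoint products.
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def half_width(self):
        """A simple error estimate: half the width of the interval."""
        return (self.hi - self.lo) / 2.0

# A measurement of 2.0 +/- 0.1, squared, yields bounds [3.61, 4.41].
x = Interval(1.9, 2.1)
y = x * x
```

Interval bounds are guaranteed but can be pessimistic when a variable appears more than once; more refined error-propagation schemes tighten them.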
Computer-aided injection molding system
NASA Astrophysics Data System (ADS)
Wang, K. K.; Shen, S. F.; Cohen, C.; Hieber, C. A.; Isayev, A. I.
1982-10-01
Achievements are reported in cavity-filling simulation, modeling viscoelastic effects, measuring and predicting frozen-in birefringence in molded parts, measuring residual stresses and associated mechanical properties of molded parts, and developing an interactive mold-assembly design program and an automatic NC machining data generation and verification program. The Cornell Injection Molding Program (CIMP) consortium is discussed as are computer user manuals that have been published by the consortium. Major tasks which should be addressed in future efforts are listed, including: (1) predict and experimentally determine the post-filling behavior of thermoplastics; (2) simulate and experimentally investigate the injection molding of thermosets and filled materials; and (3) further investigate residual stresses, orientation and mechanical properties.
NASA Technical Reports Server (NTRS)
Dayton, J. A., Jr.; Kosmahl, H. G.; Ramins, P.; Stankiewicz, N.
1979-01-01
Experimental and analytical results are compared for two high performance, octave bandwidth TWT's that use depressed collectors (MDC's) to improve the efficiency. The computations were carried out with advanced, multidimensional computer programs that are described here in detail. These programs model the electron beam as a series of either disks or rings of charge and follow their multidimensional trajectories from the RF input of the ideal TWT, through the slow wave structure, through the magnetic refocusing system, to their points of impact in the depressed collector. Traveling wave tube performance, collector efficiency, and collector current distribution were computed and the results compared with measurements for a number of TWT-MDC systems. Power conservation and correct accounting of TWT and collector losses were observed. For the TWT's operating at saturation, very good agreement was obtained between the computed and measured collector efficiencies. For a TWT operating 3 and 6 dB below saturation, excellent agreement between computed and measured collector efficiencies was obtained in some cases but only fair agreement in others. However, deviations can largely be explained by small differences in the computed and actual spent beam energy distributions. The analytical tools used here appear to be sufficiently refined to design efficient collectors for this class of TWT. However, for maximum efficiency, some experimental optimization (e.g., collector voltages and aperture sizes) will most likely be required.
NASA Technical Reports Server (NTRS)
Pate, T. H.
1982-01-01
Geographic coverage frequency and geographic shot density for a satellite borne Doppler lidar wind velocity measuring system are measured. The equations of motion of the light path on the ground were derived and a computer program devised to compute shot density and coverage frequency by latitude-longitude sections. The equations for the coverage boundaries were derived and a computer program developed to plot these boundaries, thus making it possible, after an application of a map coloring algorithm, to actually see the areas of multiple coverage. A theoretical cross-swath shot density function that gives close approximations in certain cases was also derived. This information should aid in the design of an efficient data-processing system for the Doppler lidar.
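The shot-density tally by latitude-longitude sections can be illustrated with a simple binning step. This is a hedged sketch of the bookkeeping only; the grid size and shot locations are made-up, and the real program derives shot positions from the orbital equations of motion:

```python
# Sketch of binning lidar shot locations into latitude-longitude sections
# to tally shot density per section. Illustrative grid and data.

def bin_shots(shots, lat_step=10.0, lon_step=10.0):
    """Count shots per (lat, lon) cell; keys are the cell's SW corner."""
    counts = {}
    for lat, lon in shots:
        cell = (lat // lat_step * lat_step, lon // lon_step * lon_step)
        counts[cell] = counts.get(cell, 0) + 1
    return counts

counts = bin_shots([(12.3, 45.6), (14.9, 41.0), (25.0, 45.0)])
# Two shots fall in the (10, 40) section and one in the (20, 40) section.
```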
A dc model for power switching transistors suitable for computer-aided design and analysis
NASA Technical Reports Server (NTRS)
Wilson, P. M.; George, R. T., Jr.; Owen, H. A.; Wilson, T. G.
1979-01-01
A model for bipolar junction power switching transistors whose parameters can be readily obtained by the circuit design engineer, and which can be conveniently incorporated into standard computer-based circuit analysis programs is presented. This formulation results from measurements which may be made with standard laboratory equipment. Measurement procedures, as well as a comparison between actual and computed results, are presented.
Pilot of a computer-based brief multiple-health behavior intervention for college students.
Moore, Michele J; Werch, Chudley E; Bian, Hui
2012-01-01
Given the documented multiple health risks college students engage in, and the dearth of effective programs addressing them, the authors developed a computer-based brief multiple-health behavior intervention. This study reports immediate outcomes and feasibility of a pilot of this program. Two hundred students attending a midsized university participated. Participants were randomly assigned to the intervention or control program, both delivered via computer. Immediate feedback was collected with the computer program. Results indicate that the intervention had an early positive impact on alcohol and cigarette use intentions, as well as related constructs underlying the Behavior-Image Model specific to each of the 3 substances measured. Based on the implementation process, the program proved to be feasible to use and acceptable to the population. Results support the potential efficacy of the intervention to positively impact behavioral intentions and linkages between health promoting and damaging behaviors among college students.
Analysis of reference transactions using packaged computer programs.
Calabretta, N; Ross, R
1984-01-01
Motivated by a continuing education class attended by the authors on the measurement of reference desk activities, the reference department at Scott Memorial Library initiated a project to gather data on reference desk transactions and to analyze the data by using packaged computer programs. The programs utilized for the project were SPSS (Statistical Package for the Social Sciences) and SAS (Statistical Analysis System). The planning, implementation and development of the project are described.
HOPI: on-line injection optimization program
DOE Office of Scientific and Technical Information (OSTI.GOV)
LeMaire, J L
1977-10-26
A method of matching the beam from the 200 MeV linac to the AGS without the necessity of making emittance measurements is presented. An on-line computer program written on the PDP10 computer performs the matching by modifying independently the horizontal and vertical emittance. Experimental results show success with this method, which can be applied to any matching section.
ERIC Educational Resources Information Center
Esit, Omer
2011-01-01
This study investigated the effectiveness of an intelligent computer-assisted language learning (ICALL) program on Turkish learners' vocabulary learning. Within the scope of this research, an ICALL application with a morphological analyser (Your Verbal Zone, YVZ) was developed and used in an English language preparatory class to measure its…
BASIC Computer Scoring Program for the Leadership Scale for Sports.
ERIC Educational Resources Information Center
Garland, Daniel J.
This paper describes a computer scoring program, written in Commodore BASIC, that offers an efficient approach to the scoring of the Leadership Scale for Sports (LSS). The LSS measures: (1) the preferences of athletes for specific leader behaviors from the coach; (2) the perception of athletes regarding the actual leader behavior of their coach;…
Computer programs for optical dendrometer measurements of standing tree profiles
Jacob R. Beard; Thomas G. Matney; Emily B. Schultz
2015-01-01
Tree profile equations are effective volume predictors. Diameter data for building these equations are collected from felled trees using diameter tapes and calipers or from standing trees using optical dendrometers. Developing and implementing a profile function from the collected data is a tedious and error prone task. This study created a computer program, Profile...
Study of inducer load and stress, volume 2
NASA Technical Reports Server (NTRS)
1972-01-01
A program of analysis, design, fabrication and testing has been conducted to develop computer programs for predicting rocket engine turbopump inducer hydrodynamic loading, stress magnitude and distribution, and vibration characteristics. Methods of predicting blade loading, stress, and vibration characteristics were selected from a literature search and used as a basis for the computer programs. An inducer, representative of typical rocket engine inducers, was designed, fabricated, and tested with special instrumentation selected to provide measurements of blade surface pressures and stresses. Data from the tests were compared with predicted values and the computer programs were revised as required to improve correlation. For Volume 1 see N71-20403. For Volume 2 see N71-20404.
Ryhänen, Anne M; Siekkinen, Mervi; Rankinen, Sirkku; Korvenranta, Heikki; Leino-Kilpi, Helena
2010-04-01
The aim of this systematic review was to analyze what kind of Internet or interactive computer-based patient education programs have been developed and to analyze the effectiveness of these programs in the field of breast cancer patient education. Patient education for breast cancer patients is an important intervention to empower the patient. However, we know very little about the effects and potential of Internet-based patient education in the empowerment of breast cancer patients. Complete databases were searched covering the period from the beginning of each database to November 2008. Studies were included if they concerned patient education for breast cancer patients with Internet or interactive computer programs and were based on randomized controlled trials, clinical trials, or quasi-experimental studies. We identified 14 articles involving 2374 participants. The design was a randomized controlled trial in nine papers, a clinical trial in two, and quasi-experimental in three. Seven of the studies randomized participants to experimental and control groups, two grouped participants by ethnic and racial differences and by mode of Internet use, and three measured the same group with pre- and post-tests after use of a computer program. The interventions were described as interactive computer or multimedia programs and use of the Internet. The methodological solutions of the studies varied, and the effects of the studies were diverse except for knowledge-related issues. Internet or interactive computer-based patient education programs in the care of breast cancer patients may have a positive effect in increasing breast cancer knowledge. The results suggest a positive relationship between use of Internet or computer-based patient education programs and the knowledge level of patients with breast cancer, but a diverse relationship between patients' participation and other outcome measures. There is a need to develop and research more Internet-based patient education.
Investigations of flowfields found in typical combustor geometries
NASA Technical Reports Server (NTRS)
Lilley, D. G.
1982-01-01
Measurements and computations are being applied to an axisymmetric swirling flow, emerging from swirl vanes at angle phi, entering a large chamber test section via a sudden expansion of various side-wall angles alpha. New features are: the turbulence measurements are being performed on swirling as well as nonswirling flow; and all measurements and computations are also being performed on a confined jet flowfield with realistic downstream blockage. Recent activity falls into three categories: (1) Time-mean flowfield characterization by five-hole pitot probe measurements and by flow visualization; (2) Turbulence measurements by a variety of single- and multi-wire hot-wire probe techniques; and (3) Flowfield computations using the computer code developed during the previous year's research program.
Measurement of Refractive Index Using a Michelson Interferometer.
ERIC Educational Resources Information Center
Fendley, J. J.
1982-01-01
Describes a novel and simple method of measuring the refractive index of transparent plates using a Michelson interferometer. Since it is necessary to use a computer program when determining the refractive index, undergraduates could be given the opportunity of writing their own programs. (Author/JN)
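The abstract does not reproduce the method's details, but the simplest Michelson-based determination illustrates the computation students would program: inserting a plate of thickness t in one arm adds 2t(n - 1) of optical path (the beam traverses the plate twice), shifting the fringes by N = 2t(n - 1)/lambda. This normal-incidence version is a simplification for illustration; practical plate measurements typically rotate the sample instead:

```python
# Hedged sketch: refractive index from the fringe count when a thin plate
# is inserted in one Michelson arm at normal incidence.
# Relation assumed: N * lambda = 2 * t * (n - 1).

def refractive_index(fringes, thickness_m, wavelength_m):
    """Solve N*lambda = 2*t*(n - 1) for n."""
    return 1.0 + fringes * wavelength_m / (2.0 * thickness_m)

# 100 fringes for a 50-micrometer plate at 633 nm (illustrative numbers).
n = refractive_index(fringes=100, thickness_m=50e-6, wavelength_m=633e-9)
```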
Data management of a multilaboratory field program using distributed processing. [PRECP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tichler, J.L.
The PRECP program is a multilaboratory research effort conducted by the US Department of Energy as a part of the National Acid Precipitation Assessment Program (NAPAP). The primary objective of PRECP is to provide essential information for the quantitative description of chemical wet deposition as a function of air pollution loadings, geographic location, and atmospheric processing. The program is broken into four closely interrelated sectors: Diagnostic Modeling; Field Measurements; Laboratory Measurements; and Climatological Evaluation. Data management tasks are: compile databases of the data collected in field studies; verify the contents of data sets; make data available to program participants either on-line or by means of computer tapes; perform requested analyses, graphical displays, and data aggregations; provide an index of what data is available; and provide documentation for field programs both as part of the computer database and as data reports.
Research papers and publications (1981-1987): Workload research program
NASA Technical Reports Server (NTRS)
Hart, Sandra G. (Compiler)
1987-01-01
An annotated bibliography of the research reports written by participants in NASA's Workload Research Program since 1981 is presented, representing the results of theoretical and applied research conducted at Ames Research Center and at universities and industrial laboratories funded by the program. The major program elements included: 1) developing an understanding of the workload concept; 2) providing valid, reliable, and practical measures of workload; and 3) creating a computer model to predict workload. The goal is to provide workload-related design principles, measures, guidelines, and computational models. The research results are transferred to user groups by establishing close ties with manufacturers, civil and military operators of aerospace systems, and regulatory agencies; publishing scientific articles; participating in and sponsoring workshops and symposia; providing information, guidelines, and computer models; and contributing to the formulation of standards. In addition, the methods and theories developed have been applied to specific operational and design problems at the request of a number of industry and government agencies.
Influence of Smartphones and Software on Acoustic Voice Measures
GRILLO, ELIZABETH U.; BROSIOUS, JENNA N.; SORRELL, STACI L.; ANAND, SUPRAJA
2016-01-01
This study assessed the within-subject variability of voice measures captured using different recording devices (i.e., smartphones and head mounted microphone) and software programs (i.e., Analysis of Dysphonia in Speech and Voice (ADSV), Multi-dimensional Voice Program (MDVP), and Praat). Correlations between the software programs that calculated the voice measures were also analyzed. Results demonstrated no significant within-subject variability across devices and software and that some of the measures were highly correlated across software programs. The study suggests that certain smartphones may be appropriate to record daily voice measures representing the effects of vocal loading within individuals. In addition, even though different algorithms are used to compute voice measures across software programs, some of the programs and measures share a similar relationship. PMID:28775797
Glang, Ann
2010-01-01
Objective The purpose of this study was to evaluate the “Bike Smart” program, an eHealth software program that teaches bicycle safety behaviors to young children. Methods Participants were 206 elementary students in grades kindergarten to 3. A random control design was employed to evaluate the program, with students assigned to either the treatment condition (Bike Smart) or the control condition (a video on childhood safety). Outcome measures included computer-based knowledge items (safety rules, helmet placement, hazard discrimination) and a behavioral measure of helmet placement. Results Results demonstrated that regardless of gender, cohort, and grade the participants in the treatment group showed greater gains than control participants in both the computer-presented knowledge items (p < .01) and the observational helmet measure (p < .05). Conclusions Findings suggest that the Bike Smart program can be a low cost, effective component of safety training packages that include both skills-based and experiential training. PMID:19755497
NASA Technical Reports Server (NTRS)
1974-01-01
This report presents the derivation, description, and operating instructions for a computer program (TEKVAL) which measures the economic value of advanced technology features applied to long range commercial passenger aircraft. The program consists of three modules: an airplane sizing routine, a direct operating cost routine, and an airline return-on-investment routine. These modules are linked such that they may be operated sequentially or individually, with one routine generating the input for the next or with the option of externally specifying the input for either of the economic routines. A very simple airplane sizing technique was previously developed, based on the Breguet range equation. For this program, that sizing technique has been greatly expanded and combined with the formerly separate DOC and ROI programs to produce TEKVAL.
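The Breguet relation at the core of such a sizing routine is compact enough to state directly. The function below is the generic textbook form for jet aircraft with illustrative transport-class numbers, not TEKVAL's actual implementation:

```python
# Breguet range equation for jet aircraft:
#   R = (V / c) * (L/D) * ln(W_initial / W_final)
# where V is cruise speed, c is specific fuel consumption (1/h),
# L/D is the lift-to-drag ratio, and W_i/W_f the weight ratio.

import math

def breguet_range_km(velocity_kmh, sfc_per_h, lift_to_drag,
                     w_initial, w_final):
    return (velocity_kmh / sfc_per_h) * lift_to_drag * \
        math.log(w_initial / w_final)

# Illustrative numbers: 900 km/h cruise, SFC 0.6 1/h, L/D of 17,
# burning 60 t of fuel from a 300 t takeoff weight.
r = breguet_range_km(velocity_kmh=900.0, sfc_per_h=0.6, lift_to_drag=17.0,
                     w_initial=300_000.0, w_final=240_000.0)
```

A sizing routine inverts this kind of relation: given a required range, it solves for the fuel fraction and hence the aircraft weights.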
A dc model for power switching transistors suitable for computer-aided design and analysis
NASA Technical Reports Server (NTRS)
Wilson, P. M.; George, R. T., Jr.; Owen, H. A., Jr.; Wilson, T. G.
1979-01-01
The proposed dc model for bipolar junction power switching transistors is based on measurements which may be made with standard laboratory equipment. Those nonlinearities which are of importance to power electronics design are emphasized. Measurement procedures are discussed in detail. A model formulation adapted for use with a computer program is presented, and a comparison between actual and computer-generated results is made.
Clock Agreement Among Parallel Supercomputer Nodes
Jones, Terry R.; Koenig, Gregory A.
2014-04-30
This dataset presents measurements that quantify the clock synchronization time-agreement characteristics among several high performance computers including the current world's most powerful machine for open science, the U.S. Department of Energy's Titan machine sited at Oak Ridge National Laboratory. These ultra-fast machines derive much of their computational capability from extreme node counts (over 18000 nodes in the case of the Titan machine). Time-agreement is commonly utilized by parallel programming applications and tools, distributed programming application and tools, and system software. Our time-agreement measurements detail the degree of time variance between nodes and how that variance changes over time. The dataset includes empirical measurements and the accompanying spreadsheets.
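The time-agreement statistic the dataset describes can be sketched simply: given each node's clock offset from a reference at successive sampling instants, report the node-to-node spread per instant and watch how it drifts. The offsets below are illustrative, not values from the Titan measurements:

```python
# Sketch of a clock time-agreement measure: per-sample spread (max - min)
# of node clock offsets from a common reference. Illustrative data.

def spread_per_sample(offsets_by_node):
    """offsets_by_node: {node: [offset at t0, offset at t1, ...]} in
    seconds. Returns the across-node spread at each sampling instant."""
    series = list(offsets_by_node.values())
    n_samples = len(series[0])
    return [max(s[i] for s in series) - min(s[i] for s in series)
            for i in range(n_samples)]

spreads = spread_per_sample({
    "node0": [0.0e-6, 1.0e-6],
    "node1": [2.0e-6, 1.5e-6],
    "node2": [-1.0e-6, 0.5e-6],
})
# The spread tightens from 3.0 us to 1.0 us between the two samples.
```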
LFSPMC: Linear feature selection program using the probability of misclassification
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.; Marion, B. P.
1975-01-01
The computational procedure and associated computer program for a linear feature selection technique are presented. The technique assumes that: a finite number, m, of classes exists; each class is described by an n-dimensional multivariate normal density function of its measurement vectors; the mean vector and covariance matrix for each density function are known (or can be estimated); and the a priori probability for each class is known. The technique produces a single linear combination of the original measurements which minimizes the one-dimensional probability of misclassification defined by the transformed densities.
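A hedged special case makes the technique concrete. For two classes with equal priors and a shared covariance S, the Fisher direction b = S^-1 (m1 - m2) is the optimal linear combination, and the one-dimensional probability of misclassification reduces to Phi(-delta/2), where delta is the Mahalanobis distance between the class means. The sketch below illustrates that case only, not the m-class program itself:

```python
# Two-class, equal-covariance sketch of linear feature selection:
# project onto the Fisher direction and evaluate the resulting 1-D
# probability of misclassification, P(err) = Phi(-delta / 2).

import math

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def misclassification_probability(m1, m2, s_inv):
    """2-D case; s_inv is the inverse shared covariance (nested lists)."""
    d = [m1[0] - m2[0], m1[1] - m2[1]]
    b = [s_inv[0][0] * d[0] + s_inv[0][1] * d[1],
         s_inv[1][0] * d[0] + s_inv[1][1] * d[1]]   # Fisher direction
    delta2 = b[0] * d[0] + b[1] * d[1]              # squared Mahalanobis
    return normal_cdf(-math.sqrt(delta2) / 2.0)

# Unit covariance, means two units apart: delta = 2, P(err) = Phi(-1).
p_err = misclassification_probability(
    m1=[2.0, 0.0], m2=[0.0, 0.0], s_inv=[[1.0, 0.0], [0.0, 1.0]])
```

With unequal covariances, as the abstract allows, the optimal direction is no longer the Fisher vector and the program must search the projection space numerically.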
Measured Radiation Patterns of the Boeing 91-Element ICAPA Antenna With Comparison to Calculations
NASA Technical Reports Server (NTRS)
Lambert, Kevin M.; Burke, Thomas (Technical Monitor)
2003-01-01
This report presents measured antenna patterns of the Boeing 91-Element Integrated Circuit Active Phased Array (ICAPA) Antenna at 19.85 GHz. These patterns were taken in support of various communication experiments that were performed using the antenna as a testbed. The goal here is to establish a foundation of the performance of the antenna for the experiments. An independent variable used in the communication experiments was the scan angle of the antenna. Therefore, the results presented here are patterns as a function of scan angle, at the stated frequency. Only a limited number of scan angles could be measured. Therefore, a computer program was written to simulate the pattern performance of the antenna at any scan angle. This program can be used to facilitate further study of the antenna. The computed patterns from this program are compared to the measured patterns as a means of validating the model.
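How a pattern-versus-scan-angle simulation works can be illustrated with the array factor of a uniform linear array. The element count and spacing below are generic assumptions for illustration, not the ICAPA antenna's actual geometry or the report's model:

```python
# Hedged sketch: normalized array factor (dB) of a uniform linear array
# steered to scan angle theta0. Illustrative geometry, not the ICAPA model.

import cmath
import math

def array_factor_db(theta_deg, theta0_deg, n_elements=16, spacing_wl=0.5):
    """Array factor at theta for a linear array phased to scan theta0."""
    k_d = 2.0 * math.pi * spacing_wl   # k * d for spacing in wavelengths
    psi = k_d * (math.sin(math.radians(theta_deg)) -
                 math.sin(math.radians(theta0_deg)))
    af = sum(cmath.exp(1j * n * psi) for n in range(n_elements))
    return 20.0 * math.log10(abs(af) / n_elements + 1e-12)

peak = array_factor_db(20.0, 20.0)   # the pattern peaks at the scan angle
off = array_factor_db(0.0, 20.0)    # away from the scan angle it falls off
```

Sweeping theta for each scan angle reproduces the kind of computed pattern compared against the measurements in the report.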
NASA Technical Reports Server (NTRS)
Jones, J. E.; Richmond, J. H.
1974-01-01
An integral equation formulation is applied to predict pitch- and roll-plane radiation patterns of a thin VHF/UHF (very high frequency/ultra high frequency) annular slot communications antenna operating at several locations in the nose region of the space shuttle orbiter. Digital computer programs used to compute radiation patterns are given and the use of the programs is illustrated. Experimental verification of computed patterns is given from measurements made on 1/35-scale models of the orbiter.
Topics in the optimization of millimeter-wave mixers
NASA Technical Reports Server (NTRS)
Siegel, P. H.; Kerr, A. R.; Hwang, W.
1984-01-01
A user oriented computer program for the analysis of single-ended Schottky diode mixers is described. The program is used to compute the performance of a 140 to 220 GHz mixer and excellent agreement with measurements at 150 and 180 GHz is obtained. A sensitivity analysis indicates the importance of various diode and mount characteristics on the mixer performance. A computer program for the analysis of varactor diode multipliers is described. The diode operates in either the reverse biased varactor mode or with substantial forward current flow where the conversion mechanism is predominantly resistive. A description and analysis of a new H-plane rectangular waveguide transformer is reported. The transformer is made quickly and easily in split-block waveguide using a standard slitting saw. It is particularly suited for use in the millimeter-wave band, replacing conventional electroformed stepped transformers. A theoretical analysis of the transformer is given and good agreement is obtained with measurements made at X-band.
Simulating smokers' acceptance of modifications in a cessation program.
Spoth, R
1992-01-01
Recent research has underscored the importance of assessing barriers to smokers' acceptance of cessation programs. This paper illustrates the use of computer simulations to gauge smokers' response to program modifications which may produce barriers to participation. It also highlights methodological issues encountered in conducting this work. Computer simulations were based on conjoint analysis, a consumer research method which enables measurement of smokers' relative preference for various modifications of cessation programs. Results from two studies are presented in this paper. The primary study used a randomly selected sample of 218 adult smokers who participated in a computer-assisted phone interview. Initially, the study assessed smokers' relative utility rating of 30 features of cessation programs. Utility data were used in computer-simulated comparisons of a low-cost, self-help oriented program under development and five other existing programs. A baseline version of the program under development and two modifications (for example, use of a support group with a higher level of cost) were simulated. Both the baseline version and modifications received a favorable response vis-à-vis comparison programs. Modifications requiring higher program costs were, however, associated with moderately reduced levels of favorable consumer response. The second study used a sample of 70 smokers who responded to an expanded set of smoking cessation program features focusing on program packaging. This secondary study incorporated in-person, computer-assisted interviews at a shopping mall, with smokers viewing an artist's mock-up of various program options on display. A similar pattern of responses to simulated program modifications emerged, with monetary cost apparently playing a key role. The significance of conjoint-based computer simulation as a tool in program development or dissemination, salient methodological issues, and implications for further research are discussed.
PMID:1738813
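The core of a conjoint-based simulation is additive: each program profile scores the sum of the part-worth utilities of its features, and profiles are then compared. The attributes, levels, and utility values below are invented for illustration; the study's 30-feature utility set is not reproduced here:

```python
# Hedged sketch of conjoint-based program comparison: sum part-worth
# utilities per profile. Attribute names and utilities are made up.

utilities = {
    ("cost", "low"): 1.2, ("cost", "high"): -0.8,
    ("format", "self-help"): 0.5, ("format", "support-group"): 0.9,
}

def profile_utility(profile):
    """Total utility of a program profile {attribute: level}."""
    return sum(utilities[(attr, level)] for attr, level in profile.items())

baseline = profile_utility({"cost": "low", "format": "self-help"})
modified = profile_utility({"cost": "high", "format": "support-group"})
# Adding a support group raises appeal, but the higher cost outweighs it
# here, mirroring the cost effect the abstract reports.
```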
NASA Technical Reports Server (NTRS)
Buchele, D. R.
1977-01-01
A computer program to calculate the temperature profile of a flame or hot gas was presented in detail. Emphasis was on profiles found in jet engine or rocket engine exhaust streams containing H2O or CO2 radiating gases. The temperature profile was assumed axisymmetric with an assumed functional form controlled by two variable parameters. The parameters were calculated using measurements of gas radiation at two wavelengths in the infrared. The program also gave some information on the pressure profile. A method of selection of wavelengths was given that is likely to lead to an accurate determination of the parameters. The program is written in FORTRAN IV language and runs in less than 60 seconds on a Univac 1100 computer.
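The two-wavelength principle behind the program can be illustrated with the simplest case: under Wien's approximation I(lambda, T) = c1 * lambda^-5 * exp(-c2 / (lambda * T)), the ratio of radiances at two wavelengths fixes a single temperature. The program fits a parametrized axisymmetric profile rather than one temperature; the sketch below shows only the underlying inversion:

```python
# Hedged sketch of two-wavelength (ratio) thermometry under Wien's
# approximation; a simplification of the profile-fitting program.

import math

C2 = 1.4388e-2  # second radiation constant, m*K

def wien_radiance(lam, t):
    """Wien-approximation spectral radiance, up to a constant factor."""
    return lam ** -5 * math.exp(-C2 / (lam * t))

def temperature_from_ratio(i1, i2, lam1, lam2):
    """Invert the Wien radiance ratio I1/I2 for temperature."""
    ln_ratio = math.log(i1 / i2) - 5.0 * math.log(lam2 / lam1)
    return -C2 * (1.0 / lam1 - 1.0 / lam2) / ln_ratio

# Round trip at two infrared wavelengths (near H2O/CO2 bands, chosen
# for illustration) recovers an assumed 1800 K gas temperature.
lam1, lam2, t_true = 2.7e-6, 4.3e-6, 1800.0
t_est = temperature_from_ratio(wien_radiance(lam1, t_true),
                               wien_radiance(lam2, t_true), lam1, lam2)
```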
1992-09-01
to acquire or develop effective simulation tools to observe the behavior of a RISC implementation as it executes different types of programs. We choose... Computer performance is measured by the amount of time required to execute a program. Performance encompasses two types of time: elapsed time and CPU time. Elapsed time is the time required to execute a program from start to finish. It includes latency of input/output activities such as
Ocean Tide Loading Computation
NASA Technical Reports Server (NTRS)
Agnew, Duncan Carr
2005-01-01
September 15, 2003 through May 15, 2005. This grant funds the maintenance, updating, and distribution of programs for computing ocean tide loading, to enable the corrections for such loading to be more widely applied in space-geodetic and gravity measurements. These programs, developed under funding from the CDP and DOSE programs, incorporate the most recent global tidal models developed from TOPEX/Poseidon data, and also local tide models for regions around North America; the design of the algorithm and software makes it straightforward to combine local and global models.
NASA Technical Reports Server (NTRS)
Knauber, R. N.
1982-01-01
This report describes a FORTRAN IV coded computer program for post-flight evaluation of a launch vehicle upper stage on-off reaction control system. Aerodynamic and thrust misalignment disturbances are computed as well as the total disturbing moments in pitch, yaw, and roll. Effective thrust misalignment angle time histories of the rocket booster motor are calculated. Disturbing moments are integrated and used to estimate the required control system total impulse. Effective control system specific impulse is computed for the boost and coast phases using measured control fuel usage. This method has been used for more than fifteen years for analyzing the NASA Scout launch vehicle second and third-stage reaction control system performance. The computer program is set up in FORTRAN IV for a CDC CYBER 175 system. With slight modification it can be used on other machines having a FORTRAN compiler. The program has optional CALCOMP plotting output. With this option the program requires 19K words of memory and has 786 cards. Running time on a CDC CYBER 175 system is less than three (3) seconds for a typical problem.
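The impulse bookkeeping in the abstract reduces to two steps: integrate a force (or moment) time history for total impulse, then divide by the measured fuel consumed for an effective specific impulse. The time history and fuel mass below are illustrative, not Scout flight data:

```python
# Sketch of post-flight impulse bookkeeping: trapezoidal integration of a
# thrust time history, then effective specific impulse from fuel usage.

def total_impulse(times_s, forces_n):
    """Trapezoidal integral of force over time, in N*s."""
    return sum((forces_n[i] + forces_n[i + 1]) / 2.0
               * (times_s[i + 1] - times_s[i])
               for i in range(len(times_s) - 1))

G0 = 9.80665  # standard gravity, m/s^2

# Illustrative: 100 N held for 2 s gives 200 N*s of impulse.
impulse = total_impulse([0.0, 1.0, 2.0], [100.0, 100.0, 100.0])
isp_s = impulse / (0.1 * G0)  # effective Isp for 0.1 kg of fuel used
```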
A CS1 pedagogical approach to parallel thinking
NASA Astrophysics Data System (ADS)
Rague, Brian William
Almost all collegiate programs in Computer Science offer an introductory course in programming primarily devoted to communicating the foundational principles of software design and development. The ACM designates this introduction to computer programming course for first-year students as CS1, during which methodologies for solving problems within a discrete computational context are presented. Logical thinking is highlighted, guided primarily by a sequential approach to algorithm development and made manifest by typically using the latest, commercially successful programming language. In response to the most recent developments in accessible multicore computers, instructors of these introductory classes may wish to include training on how to design workable parallel code. Novel issues arise when programming concurrent applications which can make teaching these concepts to beginning programmers a seemingly formidable task. Student comprehension of design strategies related to parallel systems should be monitored to ensure an effective classroom experience. This research investigated the feasibility of integrating parallel computing concepts into the first-year CS classroom. To quantitatively assess student comprehension of parallel computing, an experimental educational study using a two-factor mixed group design was conducted to evaluate two instructional interventions in addition to a control group: (1) topic lecture only, and (2) topic lecture with laboratory work using a software visualization Parallel Analysis Tool (PAT) specifically designed for this project. A new evaluation instrument developed for this study, the Perceptions of Parallelism Survey (PoPS), was used to measure student learning regarding parallel systems. 
The results from this educational study show a statistically significant main effect among the repeated measures, implying that student comprehension levels of parallel concepts as measured by the PoPS improve immediately after the delivery of any initial three-week CS1 level module when compared with student comprehension levels just prior to starting the course. Survey results measured during the ninth week of the course reveal that performance levels remained high compared to pre-course performance scores. A second result produced by this study reveals no statistically significant interaction effect between the intervention method and student performance as measured by the evaluation instrument over three separate testing periods. However, visual inspection of survey score trends and the low p-value generated by the interaction analysis (0.062) indicate that further studies may verify improved concept retention levels for the lecture w/PAT group.
Attitude, Gender and Achievement in Computer Programming
ERIC Educational Resources Information Center
Baser, Mustafa
2013-01-01
The aim of this research was to explore the relationship among students' attitudes toward programming, gender and academic achievement in programming. The scale used for measuring students' attitudes toward programming was developed by the researcher and consisted of 35 five-point Likert type items in four subscales. The scale was administered to…
Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing.
Hayashi, Masahito; Morimae, Tomoyuki
2015-11-27
We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends Alice each qubit of them one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding the security, whatever Bob does, Bob cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, malicious Bob does not necessarily send the copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.
Verifiable Measurement-Only Blind Quantum Computing with Stabilizer Testing
NASA Astrophysics Data System (ADS)
Hayashi, Masahito; Morimae, Tomoyuki
2015-11-01
We introduce a simple protocol for verifiable measurement-only blind quantum computing. Alice, a client, can perform only single-qubit measurements, whereas Bob, a server, can generate and store entangled many-qubit states. Bob generates copies of a graph state, which is a universal resource state for measurement-based quantum computing, and sends Alice each qubit of them one by one. Alice adaptively measures each qubit according to her program. If Bob is honest, he generates the correct graph state, and, therefore, Alice can obtain the correct computation result. Regarding the security, whatever Bob does, Bob cannot get any information about Alice's computation because of the no-signaling principle. Furthermore, malicious Bob does not necessarily send the copies of the correct graph state, but Alice can check the correctness of Bob's state by directly verifying the stabilizers of some copies.
Computation of Flow Through Water-Control Structures Using Program DAMFLO.2
Sanders, Curtis L.; Feaster, Toby D.
2004-01-01
As part of its mission to collect, analyze, and store streamflow data, the U.S. Geological Survey computes flow through several dam structures throughout the country. Flows are computed using hydraulic equations that describe flow through sluice and Tainter gates, crest gates, lock gates, spillways, locks, pumps, and siphons, which are calibrated using flow measurements. The program DAMFLO.2 was written to compute, tabulate, and plot flow through dam structures using data that describe the physical properties of dams and various hydraulic parameters and ratings that use time-varying data, such as lake elevations or gate openings. The program uses electronic computer files of time-varying data, such as lake elevation or gate openings, retrieved from the U.S. Geological Survey Automated Data Processing System. Computed time-varying flow data from DAMFLO.2 are output in flat files, which can be entered into the Automated Data Processing System database. All computations are made in units of feet and seconds. DAMFLO.2 uses the procedures and language developed by the SAS Institute Inc.
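The gate-flow computations described above rest on standard hydraulic rating equations. As a hedged sketch, one common textbook form for a sluice gate is Q = Cd · b · w · √(2gh); the coefficient and dimensions below are illustrative placeholders, not the calibrated values DAMFLO.2 would use:

```python
import math

def sluice_gate_flow(cd, gate_width_ft, gate_opening_ft, head_ft, g=32.174):
    """Discharge (ft^3/s) through a sluice gate,
    Q = Cd * b * w * sqrt(2 * g * h), in units of feet and seconds,
    matching the convention noted in the abstract.  Cd is a discharge
    coefficient calibrated against flow measurements."""
    return cd * gate_width_ft * gate_opening_ft * math.sqrt(2.0 * g * head_ft)

# Hypothetical gate: Cd 0.60, 10 ft wide, opened 2 ft, 9 ft of head.
q = sluice_gate_flow(cd=0.60, gate_width_ft=10.0, gate_opening_ft=2.0, head_ft=9.0)
print(f"{q:.1f} ft^3/s")
```

In practice the coefficient varies with submergence and gate geometry, which is why the program calibrates its ratings against field flow measurements.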
NASA Tech Briefs, November 2000. Volume 24, No. 11
NASA Technical Reports Server (NTRS)
2000-01-01
Topics covered include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Test and Measurement; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Data Acquisition.
ERIC Educational Resources Information Center
Rivera, William M., Comp.; Walker, Sharon M., Comp.
Among the 46 papers in this proceedings are the following 36 selected titles: "The Intergenerational Exercise/Movement Program" (Ansello); "Using Computers for Adult Literacy Instruction" (Askov et al.); "Measuring Adults' Attitudes toward Computers" (Delcourt, Lewis); "Issues in Computers and Adult Learning" (Gerver); "Preassessment of Adult…
Performance of a computer-based assessment of cognitive function measures in two cohorts of seniors
USDA-ARS?s Scientific Manuscript database
Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials, however its performance in these settings has not been systematically evaluated. The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool f...
Some Measurement and Instruction Related Considerations Regarding Computer Assisted Testing.
ERIC Educational Resources Information Center
Oosterhof, Albert C.; Salisbury, David F.
The Assessment Resource Center (ARC) at Florida State University provides computer assisted testing (CAT) for approximately 4,000 students each term. Computer capabilities permit a small proctoring staff to administer tests simultaneously to large numbers of students. Programs provide immediate feedback for students and generate a variety of…
Jeffries, B F; Tarlton, M; De Smet, A A; Dwyer, S J; Brower, A C
1980-02-01
A computer program was created to identify and accept spatial data regarding the location of the thoracic and lumbar vertebral bodies on scoliosis films. With this information, the spine can be mathematically reconstructed and a scoliotic angle calculated. There was a 0.968 positive correlation between the computer and manual methods of measuring scoliosis. The computer method was more reproducible with a standard deviation of only 1.3 degrees. Computerized measurement of scoliosis also provides better evaluation of the true shape of the curve.
Effectiveness of computer ergonomics interventions for an engineering company: a program evaluation.
Goodman, Glenn; Landis, James; George, Christina; McGuire, Sheila; Shorter, Crystal; Sieminski, Michelle; Wilson, Tamika
2005-01-01
Ergonomic principles at the computer workstation may reduce the occurrence of work related injuries commonly associated with intensive computer use. A program implemented in 2001 by an occupational therapist and a physical therapist utilized these preventative measures with education about ergonomics, individualized evaluations of computer workstations, and recommendations for ergonomic and environmental changes. This study examined program outcomes and perceived effectiveness based on review of documents, interviews, and surveys of the employees and the plant manager. The program was deemed successful as shown by 59% of all therapist recommendations and 74% of ergonomic recommendations being implemented by the company, with an 85% satisfaction rate for the ergonomic interventions and an overall employee satisfaction rate of 70%. Eighty-one percent of the physical problems reported by employees were resolved to their satisfaction one year later. Successful implementation of ergonomics programs depends upon effective communication and education of the consumers, and the support, cooperation, and collaboration of management and employees.
Zhang, Z L; Li, J P; Li, G; Ma, X C
2017-02-09
Objective: To establish and validate a computer program used to aid the detection of dental proximal caries in cone beam computed tomography (CBCT) images. Methods: According to the characteristics of caries lesions in X-ray images, a computer-aided detection program for proximal caries was established with Matlab and Visual C++. The whole process for caries lesion detection included image import and preprocessing, measuring the average gray value of the air area, choosing a region of interest and calculating its gray value, and defining the caries areas. The program was used to examine 90 proximal surfaces from 45 extracted human teeth collected from Peking University School and Hospital of Stomatology. The teeth were then scanned with a CBCT scanner (Promax 3D). The proximal surfaces of the teeth were respectively assessed by the caries detection program and scored by a human observer for the extent of lesions on a 6-level scale. With histologic examination serving as the reference standard, the performances of the caries detection program and the human observer were assessed with receiver operating characteristic (ROC) curves. Student's t-test was used to analyze the areas under the ROC curves (AUC) for differences between the caries detection program and the human observer. Spearman correlation coefficients were used to analyze the detection accuracy of caries depth. Results: For the diagnosis of proximal caries in CBCT images, the AUC values of the human observer and the caries detection program were 0.632 and 0.703, respectively. There was a statistically significant difference between the AUC values (P=0.023). The correlation between program performance and the gold standard (correlation coefficient rs=0.525) was higher than that between observer performance and the gold standard (rs=0.457), and there was a statistically significant difference between the correlation coefficients (P=0.000).
Conclusions: The program that automatically detects dental proximal caries lesions could improve the diagnostic value of CBCT images.
NASA Astrophysics Data System (ADS)
Simmons, B. E.
1981-08-01
This report derives equations predicting satellite ephemeris error as a function of measurement errors of space-surveillance sensors. These equations lend themselves to rapid computation with modest computer resources. They are applicable over prediction times such that measurement errors, rather than uncertainties of atmospheric drag and of Earth shape, dominate in producing ephemeris error. This report describes the specialization of these equations underlying the ANSER computer program, SEEM (Satellite Ephemeris Error Model). The intent is that this report be of utility to users of SEEM for interpretive purposes, and to computer programmers who may need a mathematical point of departure for limited generalization of SEEM.
Benchmark radar targets for the validation of computational electromagnetics programs
NASA Technical Reports Server (NTRS)
Woo, Alex C.; Wang, Helen T. G.; Schuh, Michael J.; Sanders, Michael L.
1993-01-01
Results are presented of a set of computational electromagnetics validation measurements referring to three-dimensional perfectly conducting smooth targets, performed for the Electromagnetic Code Consortium. Plots are presented for both the low- and high-frequency measurements of the NASA almond, an ogive, a double ogive, a cone-sphere, and a cone-sphere with a gap.
NASA Technical Reports Server (NTRS)
Simmons, D. B.
1975-01-01
The DOMONIC system has been modified to run on the Univac 1108 and the CDC 6600 as well as the IBM 370 computer system. The DOMONIC monitor system has been implemented to gather data which can be used to optimize the DOMONIC system and to predict the reliability of software developed using DOMONIC. The areas of quality metrics, error characterization, program complexity, program testing, validation and verification are analyzed. A software reliability model for estimating program completion levels and one on which to base system acceptance have been developed. The DAVE system which performs flow analysis and error detection has been converted from the University of Colorado CDC 6400/6600 computer to the IBM 360/370 computer system for use with the DOMONIC system.
The effect of a computer-related ergonomic intervention program on learners in a school environment.
Sellschop, Ingrid; Myezwa, Hellen; Mudzi, Witness; Mbambo-Kekana, Nonceba
2015-01-01
The interest in school ergonomic intervention programs and their effects on musculoskeletal pain is increasing around the world. The objective of this longitudinal randomized control trial was to implement and measure the effects of a computer-related ergonomics intervention on grade eight learners in a school environment in Johannesburg, South Africa (a developing country). The sample comprised a control group (n = 66) and an intervention group (n = 61). The outcome measures used were posture assessment using the Rapid Upper Limb Assessment tool (RULA) and the prevalence of musculoskeletal pain using a visual analogue scale (VAS). Measurements were done at baseline, three months, and six months post intervention. The results showed that the posture of the intervention group changed significantly from Action Level 4 to Action Levels 2 and 3, indicating a sustained improvement of learners' postural positions whilst using computers. The intervention group showed a significant reduction in the prevalence of musculoskeletal pain from 42.6% at baseline to 18% six months post intervention (p < 0.003). In conclusion, the results indicated that a computer-related intervention program for grade eight learners in a school environment is effective and that behavioural changes can be made that are sustainable over a period of six months.
UDATE1: A computer program for the calculation of uranium-series isotopic ages
Rosenbauer, R.J.
1991-01-01
UDATE1 is a FORTRAN-77 program with an interface for an Apple Macintosh computer that calculates isotope activities from measured count rates to date geologic materials by uranium-series disequilibria. Dates on pure samples can be determined directly by the accumulation of 230Th from 234U and of 231Pa from 235U. Dates for samples contaminated by clays containing abundant natural thorium can be corrected by the program using various mixing models. Input to the program and file management are made simple and user friendly by a series of Macintosh modal dialog boxes. © 1991.
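For the simplest case mentioned above (230Th growing in from 234U, no detrital correction), the closed-form age equation is t = −ln(1 − R)/λ230, where R is the measured 230Th/234U activity ratio. A hedged sketch, assuming 234U/238U secular equilibrium and a commonly cited 230Th half-life of roughly 75.6 kyr:

```python
import math

TH230_HALF_LIFE_YR = 75_584.0  # approximate; published values cluster near 75.6 kyr
LAMBDA_230 = math.log(2) / TH230_HALF_LIFE_YR

def th230_age(activity_ratio):
    """Age (years) from a measured 230Th/234U activity ratio, assuming
    no initial 230Th and 234U/238U in secular equilibrium.  This is the
    simplest closed-form case; a program like the one described above
    additionally applies mixing-model corrections for detrital thorium."""
    if not 0.0 <= activity_ratio < 1.0:
        raise ValueError("activity ratio must be in [0, 1) for a finite age")
    return -math.log(1.0 - activity_ratio) / LAMBDA_230

print(round(th230_age(0.5)))  # a ratio of 0.5 corresponds to one half-life
```

A ratio approaching 1 drives the age toward infinity, which is why the method is limited to materials younger than a few half-lives.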
DART: A Microcomputer Program for Response Latency Analysis.
ERIC Educational Resources Information Center
Greene, John O.; Greene, Barry F.
1987-01-01
Discusses how chronometric measures such as the DART (Display And Response Timing) computer program, have become virtually indispensable in testing cognitive theories of human social behavior. Describes how the DART (1) provides a way to collect response latency data; and (2) allows measurement of response latencies to a set of user-specified,…
Program helps quickly calculate deviated well path
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gardner, M.P.
1993-11-22
A BASIC computer program quickly calculates the angle and measured depth of a simple directional well given only the true vertical depth and total displacement of the target. Many petroleum engineers and geologists need a quick, easy method to calculate the angle and measured depth necessary to reach a target in a proposed deviated well bore. Too many of the existing programs are large and require much input data. The drilling literature is full of equations and methods to calculate the course of well paths from surveys taken after a well is drilled. Very little information, however, covers how to calculate well bore trajectories for proposed wells from limited data. Furthermore, many of the equations are quite complex and difficult to use. A figure lists a computer program with the equations to calculate the well bore trajectory necessary to reach a given displacement and true vertical depth (TVD) for a simple build plan. It can be run on an IBM compatible computer with MS-DOS version 5 or higher, QBasic, or any BASIC that does not require line numbers. The QBasic 4.5 compiler will also run the program. The equations are based on conventional geometry and trigonometry.
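The geometry involved can be illustrated with an even simpler case than the build plan the program handles: an idealized straight slant hole from surface to target, where the angle and measured depth follow directly from the TVD and displacement. This sketch deliberately omits the curved build section, so it is a simplification of, not a substitute for, the published program:

```python
import math

def slant_well(tvd_ft, displacement_ft):
    """Angle (degrees from vertical) and measured depth (ft) for an
    idealized straight slant hole from surface to target.  The program
    described above additionally models the curved build section; this
    sketch omits it for clarity."""
    angle = math.degrees(math.atan2(displacement_ft, tvd_ft))
    md = math.hypot(tvd_ft, displacement_ft)
    return angle, md

# Hypothetical target: 8,000 ft TVD, 3,000 ft horizontal displacement.
angle, md = slant_well(tvd_ft=8000.0, displacement_ft=3000.0)
print(f"angle {angle:.1f} deg, measured depth {md:.0f} ft")
```

A real build-and-hold design replaces the kink at surface with an arc of fixed build radius, which lengthens the measured depth slightly and shifts the hold angle.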
NASA Tech Briefs, August 2000. Volume 24, No. 8
NASA Technical Reports Server (NTRS)
2000-01-01
Topics include: Simulation/Virtual Reality; Test and Measurement; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Medical Design.
NASA Technical Reports Server (NTRS)
Knauber, R. N.
1982-01-01
A FORTRAN IV coded computer program is presented for post-flight analysis of a missile's control surface response. It includes preprocessing of digitized telemetry data for time lags, biases, non-linear calibration changes and filtering. Measurements include autopilot attitude rate and displacement gyro output and four control surface deflections. Simple first order lags are assumed for the pitch, yaw and roll axes of control. Each actuator is also assumed to be represented by a first order lag. Mixing of pitch, yaw and roll commands to four control surfaces is assumed. A pseudo-inverse technique is used to obtain the pitch, yaw and roll components from the four measured deflections. This program has been used for over 10 years on the NASA/SCOUT launch vehicle for post-flight analysis and was helpful in detecting incipient actuator stall due to excessive hinge moments. The program is currently set up for a CDC CYBER 175 computer system. It requires 34K words of memory and contains 675 cards. A sample problem presented herein including the optional plotting requires eleven (11) seconds of central processor time.
Measurement of fault latency in a digital avionic mini processor, part 2
NASA Technical Reports Server (NTRS)
Mcgough, J.; Swern, F.
1983-01-01
The results of fault injection experiments utilizing a gate-level emulation of the central processor unit of the Bendix BDX-930 digital computer are described. Several earlier programs were reprogrammed, expanding the instruction set to capitalize on the full power of the BDX-930 computer. As a final demonstration of fault coverage an extensive, 3-axis, high performance flight control computation was added. The stages in the development of a CPU self-test program emphasizing the relationship between fault coverage, speed, and quantity of instructions were demonstrated.
Toward using alpha and theta brain waves to quantify programmer expertise.
Crk, Igor; Kluthe, Timothy
2014-01-01
Empirical studies of programming language learnability and usability have thus far depended on indirect measures of human cognitive performance, attempting to capture what is at its essence a purely cognitive exercise through various indicators of comprehension, such as the correctness of coding tasks or the time spent working out the meaning of code and producing acceptable solutions. Understanding program comprehension is essential to understanding the inherent complexity of programming languages, and ultimately, having a measure of mental effort based on direct observation of the brain at work will illuminate the nature of the work of programming. We provide evidence of direct observation of the cognitive effort associated with programming tasks, through a carefully constructed empirical study using a cross-section of undergraduate computer science students and an inexpensive, off-the-shelf brain-computer interface device. This study presents a link between expertise and programming language comprehension, draws conclusions about the observed indicators of cognitive effort using recent cognitive theories, and proposes directions for future work that is now possible.
Algorithm for Atmospheric Corrections of Aircraft and Satellite Imagery
NASA Technical Reports Server (NTRS)
Fraser, Robert S.; Kaufman, Yoram J.; Ferrare, Richard A.; Mattoo, Shana
1989-01-01
A simple and fast atmospheric correction algorithm is described which is used to correct radiances of scattered sunlight measured by aircraft and/or satellite above a uniform surface. The atmospheric effect, the basic equations, a description of the computational procedure, and a sensitivity study are discussed. The program is designed to take the measured radiances, view and illumination directions, and the aerosol and gaseous absorption optical thickness to compute the radiance just above the surface, the irradiance on the surface, and surface reflectance. Alternatively, the program will compute the upward radiance at a specific altitude for a given surface reflectance, view and illumination directions, and aerosol and gaseous absorption optical thickness. The algorithm can be applied for any view and illumination directions and any wavelength in the range 0.48 micron to 2.2 micron. The relation between the measured radiance and surface reflectance, which is expressed as a function of atmospheric properties and measurement geometry, is computed using a radiative transfer routine. The results of the computations are presented in a table which forms the basis of the correction algorithm. The algorithm can be used for atmospheric corrections in the presence of a rural aerosol. The sensitivity of the derived surface reflectance to uncertainties in the model and input data is discussed.
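The relation between measured radiance and surface reflectance that the abstract describes is often written, for a uniform Lambertian surface, as ρ_TOA = ρ_path + T·ρ_s/(1 − s·ρ_s), which inverts in closed form. A hedged sketch of that inversion; the atmospheric quantities here stand in for the tabulated values the program computes, and the numbers are illustrative only:

```python
def surface_reflectance(rho_toa, rho_path, transmittance, spherical_albedo):
    """Invert the standard uniform-surface relation
        rho_toa = rho_path + T * rho_s / (1 - s * rho_s)
    for surface reflectance rho_s.  The path reflectance, two-way
    transmittance T, and spherical albedo s play the role of the
    tabulated atmospheric properties described in the abstract."""
    y = rho_toa - rho_path
    return y / (transmittance + spherical_albedo * y)

# Illustrative mid-visible values for a moderately hazy atmosphere.
rho_s = surface_reflectance(rho_toa=0.15, rho_path=0.05,
                            transmittance=0.80, spherical_albedo=0.10)
print(f"surface reflectance ~ {rho_s:.4f}")
```

In the full algorithm these coefficients depend on wavelength, aerosol optical thickness, and view and illumination geometry, which is why they are precomputed with a radiative transfer routine and stored in a lookup table.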
Algorithm for atmospheric corrections of aircraft and satellite imagery
NASA Technical Reports Server (NTRS)
Fraser, R. S.; Ferrare, R. A.; Kaufman, Y. J.; Markham, B. L.; Mattoo, S.
1992-01-01
A simple and fast atmospheric correction algorithm is described which is used to correct radiances of scattered sunlight measured by aircraft and/or satellite above a uniform surface. The atmospheric effect, the basic equations, a description of the computational procedure, and a sensitivity study are discussed. The program is designed to take the measured radiances, view and illumination directions, and the aerosol and gaseous absorption optical thickness to compute the radiance just above the surface, the irradiance on the surface, and surface reflectance. Alternatively, the program will compute the upward radiance at a specific altitude for a given surface reflectance, view and illumination directions, and aerosol and gaseous absorption optical thickness. The algorithm can be applied for any view and illumination directions and any wavelength in the range 0.48 micron to 2.2 microns. The relation between the measured radiance and surface reflectance, which is expressed as a function of atmospheric properties and measurement geometry, is computed using a radiative transfer routine. The results of the computations are presented in a table which forms the basis of the correction algorithm. The algorithm can be used for atmospheric corrections in the presence of a rural aerosol. The sensitivity of the derived surface reflectance to uncertainties in the model and input data is discussed.
Energy Use and Power Levels in New Monitors and Personal Computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberson, Judy A.; Homan, Gregory K.; Mahajan, Akshay
2002-07-23
Our research was conducted in support of the EPA ENERGY STAR Office Equipment program, whose goal is to reduce the amount of electricity consumed by office equipment in the U.S. The most energy-efficient models in each office equipment category are eligible for the ENERGY STAR label, which consumers can use to identify and select efficient products. As the efficiency of each category improves over time, the ENERGY STAR criteria need to be revised accordingly. The purpose of this study was to provide reliable data on the energy consumption of the newest personal computers and monitors that the EPA can use to evaluate revisions to current ENERGY STAR criteria as well as to improve the accuracy of ENERGY STAR program savings estimates. We report the results of measuring the power consumption and power management capabilities of a sample of new monitors and computers. These results will be used to improve estimates of program energy savings and carbon emission reductions, and to inform revisions of the ENERGY STAR criteria for these products. Our sample consists of 35 monitors and 26 computers manufactured between July 2000 and October 2001; it includes cathode ray tube (CRT) and liquid crystal display (LCD) monitors, Macintosh and Intel-architecture computers, desktop and laptop computers, and integrated computer systems, in which power consumption of the computer and monitor cannot be measured separately. For each machine we measured power consumption when off, on, and in each low-power level. We identify trends in and opportunities to reduce power consumption in new personal computers and monitors. Our results include a trend among monitor manufacturers to provide a single very low low-power level, well below the current ENERGY STAR criteria for sleep power consumption.
These very low sleep power results mean that energy consumed when monitors are off or in active use has become more important in terms of its contribution to the overall unit energy consumption (UEC). Current ENERGY STAR monitor and computer criteria do not specify off or on power, but our results suggest opportunities for saving energy in these modes. Also, significant differences between CRT and LCD technology, and between field-measured and manufacturer-reported power levels, reveal the need for standard methods and metrics for measuring and comparing monitor power consumption.
Ozone measurement systems improvements studies
NASA Technical Reports Server (NTRS)
Thomas, R. W.; Guard, K.; Holland, A. C.; Spurling, J. F.
1974-01-01
Results are summarized of an initial study of techniques for measuring atmospheric ozone, carried out as the first phase of a program to improve ozone measurement techniques. The study concentrated on two measurement systems, the electrochemical cell (ECC) ozonesonde and the Dobson ozone spectrophotometer, and consisted of two tasks. The first task consisted of error modeling and system error analysis of the two measurement systems. Under the second task a Monte-Carlo model of the Dobson ozone measurement technique was developed and programmed for computer operation.
Computer program for the calculation of grain size statistics by the method of moments
Sawyer, Michael B.
1977-01-01
A computer program is presented for a Hewlett-Packard Model 9830A desk-top calculator (1) which calculates statistics using weight or point count data from a grain-size analysis. The program uses the method of moments in contrast to the more commonly used but less inclusive graphic method of Folk and Ward (1957). The merits of the program are: (1) it is rapid; (2) it can accept data in either grouped or ungrouped format; (3) it allows direct comparison with grain-size data in the literature that have been calculated by the method of moments; (4) it utilizes all of the original data rather than percentiles from the cumulative curve as in the approximation technique used by the graphic method; (5) it is written in the computer language BASIC, which is easily modified and adapted to a wide variety of computers; and (6) when used in the HP-9830A, it does not require punching of data cards. The method of moments should be used only if the entire sample has been measured and the worker defines the measured grain-size range. (1) Use of brand names in this paper does not imply endorsement of these products by the U.S. Geological Survey.
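The method of moments described above computes the mean, sorting (standard deviation), skewness, and kurtosis directly from the full grouped distribution rather than from graphic percentiles. A minimal sketch in Python, with hypothetical sieve data; the original is in BASIC, but the arithmetic is the same:

```python
import math

def moment_statistics(midpoints_phi, weights):
    """Grain-size statistics by the method of moments, from grouped
    data: class midpoints (phi units) and weight percentages or
    frequencies, as in the program described above.  Uses every class
    rather than percentiles read off the cumulative curve."""
    total = sum(weights)
    mean = sum(w * m for m, w in zip(midpoints_phi, weights)) / total
    var = sum(w * (m - mean) ** 2 for m, w in zip(midpoints_phi, weights)) / total
    sd = math.sqrt(var)
    skew = sum(w * (m - mean) ** 3 for m, w in zip(midpoints_phi, weights)) / (total * sd ** 3)
    kurt = sum(w * (m - mean) ** 4 for m, w in zip(midpoints_phi, weights)) / (total * sd ** 4)
    return mean, sd, skew, kurt

# Hypothetical sieve analysis: phi-class midpoints and weight percents.
mean, sd, skew, kurt = moment_statistics([0.5, 1.5, 2.5, 3.5], [10, 40, 40, 10])
print(f"mean {mean:.2f} phi, sorting {sd:.2f} phi, skewness {skew:.2f}")
```

As the text cautions, these moments are only meaningful when the entire sample has been measured, since open-ended tail classes bias the higher moments.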
NASA Technical Reports Server (NTRS)
1984-01-01
The structure of the program, the five priority levels, the drive routines, the stepwise drive plan, the figure routines, meander X and Y, the range-of-measurement table, the optimization of the figure drive, the figure drive plan, the dialogue routines, stack processing, the drive for the main terminal, the protocol routines, the drive for the microterminal, the drive for the experiment computer, and the main program are discussed.
Parental Perceptions and Recommendations of Computing Majors: A Technology Acceptance Model Approach
ERIC Educational Resources Information Center
Powell, Loreen; Wimmer, Hayden
2017-01-01
Currently, there are more technology related jobs then there are graduates in supply. The need to understand user acceptance of computing degrees is the first step in increasing enrollment in computing fields. Additionally, valid measurement scales for predicting user acceptance of Information Technology degree programs are required. The majority…
Contextual Fraction as a Measure of Contextuality.
Abramsky, Samson; Barbosa, Rui Soares; Mansfield, Shane
2017-08-04
We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e., tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programming; it is monotonic with respect to the "free" operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.
Contextual Fraction as a Measure of Contextuality
NASA Astrophysics Data System (ADS)
Abramsky, Samson; Barbosa, Rui Soares; Mansfield, Shane
2017-08-01
We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e., tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programming; it is monotonic with respect to the "free" operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.
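As an illustrative note: for a two-party, two-setting, two-outcome (Bell-type) scenario, the linear program mentioned in the abstract can be written down directly. The noncontextual fraction is the largest total weight of a sub-distribution over deterministic global assignments whose induced table fits under the empirical table, and the contextual fraction is one minus that. A sketch, assuming SciPy's `linprog` (the function name and table layout are this sketch's own conventions, not the authors'):

```python
from itertools import product

import numpy as np
from scipy.optimize import linprog

def contextual_fraction(e):
    """Contextual fraction of a (2,2,2) empirical model via an LP.

    e[(x, y)][(a, b)] -- probability of outcomes (a, b) given settings
    (x, y); all of x, y, a, b range over {0, 1}.
    """
    contexts = list(product(range(2), repeat=2))   # (x, y) setting pairs
    outcomes = list(product(range(2), repeat=2))   # (a, b) outcome pairs
    # A global assignment fixes one outcome per measurement:
    # (Alice setting 0, Alice setting 1, Bob setting 0, Bob setting 1).
    assignments = list(product(range(2), repeat=4))
    rows, rhs = [], []
    for (x, y) in contexts:
        for (a, b) in outcomes:
            rows.append([1.0 if (g[x] == a and g[2 + y] == b) else 0.0
                         for g in assignments])
            rhs.append(e[(x, y)][(a, b)])
    # maximize the total weight of the noncontextual part
    res = linprog(c=-np.ones(len(assignments)),
                  A_ub=np.array(rows), b_ub=np.array(rhs),
                  bounds=(0, None), method="highs")
    return 1.0 + res.fun  # = 1 - noncontextual fraction
```

A product of uniformly random local outcomes gives contextual fraction 0, while a PR box (outcomes correlated as a XOR b = x AND y) gives 1, consistent with its strong contextuality.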
NASA Tech Briefs, July 2000. Volume 24, No. 7
NASA Technical Reports Server (NTRS)
2000-01-01
Topics covered include: Data Acquisition; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Test and Measurement; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.
Evaluation of strains in bituminous surfaces : stiffness-fatigue investigation.
DOT National Transportation Integrated Search
1973-01-01
The study was designed to determine if strains in Virginia's thin asphaltic pavements were high enough to cause early fatigue failure. Strains were computed with the Chevron multilayer computer program, and also measured on selected highways using el...
NASA Technical Reports Server (NTRS)
Lansing, F. L.; Chai, V. W.; Lascu, D.; Urbenajo, R.; Wong, P.
1978-01-01
The engineering manual provides a complete companion documentation about the structure of the main program and subroutines, the preparation of input data, the interpretation of output results, access and use of the program, and the detailed description of all the analytic, logical expressions and flow charts used in computations and program structure. A numerical example is provided and solved completely to show the sequence of computations followed. The program is carefully structured to reduce both user's time and costs without sacrificing accuracy. The user would expect a cost of CPU time of approximately $5.00 per building zone excluding printing costs. The accuracy, on the other hand, measured by deviation of simulated consumption from watt-hour meter readings, was found by many simulation tests not to exceed + or - 10 percent margin.
NASA Technical Reports Server (NTRS)
Bareiss, L. E.
1978-01-01
The paper presents a compilation of the results of a systems level Shuttle/payload contamination analysis and related computer modeling activities. The current technical assessment of the contamination problems anticipated during the Spacelab program are discussed and recommendations are presented on contamination abatement designs and operational procedures based on experience gained in the field of contamination analysis and assessment, dating back to the pre-Skylab era. The ultimate test of the Shuttle/Payload Contamination Evaluation program will be through comparison of predictions with measured levels of contamination during actual flight.
Perceptions of the Pure Pallet Program
2006-03-01
These values are used in computing the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy by comparing them with that item's simple correlations... values are provided in Table 32 of Appendix G. The Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy was computed, resulting in a value of .92... This comparison is expressed as an index with values between zero and one. Kaiser declares, as quoted by Spicer, that measures in the 0.90s are marvelous.
Software for Use with Optoelectronic Measuring Tool
NASA Technical Reports Server (NTRS)
Ballard, Kim C.
2004-01-01
A computer program has been written to facilitate and accelerate the process of measurement by use of the apparatus described in "Optoelectronic Tool Adds Scale Marks to Photographic Images" (KSC-12201). The tool contains four laser diodes that generate parallel beams of light spaced apart at a known distance. The beams of light are used to project bright spots that serve as scale marks that become incorporated into photographic images (including film and electronic images). The sizes of objects depicted in the images can readily be measured by reference to the scale marks. The computer program is applicable to a scene that contains the laser spots and that has been imaged in a square pixel format that can be imported into a graphical user interface (GUI) generated by the program. It is assumed that the laser spots and the distance(s) to be measured all lie in the same plane and that the plane is perpendicular to the line of sight of the camera used to record the image.
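As an illustrative note (not the KSC program itself): under the stated assumptions of coplanar spots, square pixels, and a perpendicular line of sight, the measurement reduces to a single pixels-per-unit ratio fixed by the known spot spacing. A minimal sketch, with all names and coordinates hypothetical:

```python
import math

def real_length(spot_a, spot_b, pt1, pt2, spot_spacing):
    """Convert a pixel distance to real units using two laser-spot
    scale marks a known real-world distance apart in the object plane.

    spot_a, spot_b -- (x, y) pixel coordinates of the two laser spots
    pt1, pt2       -- (x, y) pixel endpoints of the feature to measure
    spot_spacing   -- known real-world distance between the spots
    """
    px_per_unit = math.dist(spot_a, spot_b) / spot_spacing
    return math.dist(pt1, pt2) / px_per_unit
```

For example, if the spots are 100 pixels apart and physically 5 cm apart, a 40-pixel feature measures 2 cm.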
Corona performance of a compact 230-kV line
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chartier, V.L.; Blair, D.E.; Easley, M.D.
Permitting requirements and the acquisition of new rights-of-way for transmission facilities have in recent years become increasingly difficult for most utilities, including Puget Sound Power and Light Company. In order to maintain a high degree of reliability of service while being responsive to public concerns regarding the siting of high voltage (HV) transmission facilities, Puget Power has found it necessary to rely more heavily upon the use of compact lines in franchise corridors. Compaction does, however, precipitate increased levels of audible noise (AN) and radio and TV interference (RI and TVI) due to corona on the conductors and insulator assemblies. Puget Power relies upon the Bonneville Power Administration (BPA) Corona and Field Effects computer program to calculate AN and RI for new lines. Since there was some question of the program's ability to accurately represent quiet 230-kV compact designs, a joint project was undertaken with BPA to verify the program's algorithms. Long-term measurements made on an operating Puget Power 230-kV compact line confirmed the accuracy of BPA's AN model; however, the RI measurements were much lower than predicted by the BPA and other computer programs. This paper also describes how the BPA computer program can be used to calculate the voltage needed to expose insulator assemblies to the correct electric field in single test setups in HV laboratories.
A nonproprietary, nonsecret program for calculating Stirling cryocoolers
NASA Technical Reports Server (NTRS)
Martini, W. R.
1985-01-01
A design program for an integrated Stirling cycle cryocooler was written on an IBM-PC computer. The program is easy to use and shows the trends and itemizes the losses. The calculated results were compared with some measured performance values. The program predicts somewhat optimistic performance and needs to be calibrated more with experimental measurements. Adding a multiplier to the friction factor can bring the calculated results in line with the limited test results so far available. The program is offered as a good framework on which to build a truly useful design program for all types of cryocoolers.
Advanced Simulation and Computing: A Summary Report to the Director's Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, M G; Peck, T
2003-06-01
It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, was reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress in all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called ''Advanced Simulation and Computing.'' Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management; that appraisal is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures to more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates. Each measure has an evaluation plan and identifies expected documentation to be included in the ''Assessment File''.
WINDOWS: a program for the analysis of spectral data from foil activation measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stallmann, F.W.; Eastham, J.F.; Kam, F.B.K.
The computer program WINDOWS, together with its subroutines, is described for the analysis of neutron spectral data from foil activation measurements. In particular, the program handles the unfolding of the neutron differential spectrum; estimated windows and detector contributions; upper and lower bounds for an integral response; and group fluxes obtained from neutron transport calculations. 116 references. (JFP)
Composite Failures: A Comparison of Experimental Test Results and Computational Analysis Using XFEM
2016-09-30
NUWC-NPT Technical Report 12,218, 30 September 2016: Composite Failures: A Comparison of Experimental Test Results and Computational Analysis Using XFEM. ...availability of measurement techniques, experimental testing of composite materials has largely outpaced the computational modeling ability, forcing...
Reliability model derivation of a fault-tolerant, dual, spare-switching, digital computer system
NASA Technical Reports Server (NTRS)
1974-01-01
A computer based reliability projection aid, tailored specifically for application in the design of fault-tolerant computer systems, is described. Its more pronounced characteristics include the facility for modeling systems with two distinct operational modes, measuring the effect of both permanent and transient faults, and calculating conditional system coverage factors. The underlying conceptual principles, mathematical models, and computer program implementation are presented.
Research Education in Undergraduate Occupational Therapy Programs.
ERIC Educational Resources Information Center
Petersen, Paul; And Others
1992-01-01
Of 63 undergraduate occupational therapy programs surveyed, the 38 responses revealed some common areas covered: elementary descriptive statistics, validity, reliability, and measurement. Areas underrepresented include statistical analysis with or without computers, research design, and advanced statistics. (SK)
NASA Technical Reports Server (NTRS)
Bradley, P. F.; Throckmorton, D. A.
1981-01-01
A study was completed to determine the sensitivity of computed convective heating rates to uncertainties in the thermal protection system thermal model. The parameters considered were: density, thermal conductivity, and specific heat of both the reusable surface insulation and its coating; coating thickness and emittance; and temperature measurement uncertainty. The assessment used a modified version of the computer program that calculates heating rates from temperature time histories. The original version of the program solves the direct one-dimensional heating problem; the modified version is set up to solve the inverse problem and was used in thermocouple data reduction for Shuttle flight data. Both nominal and altered thermal models were used to determine the necessity for accurate knowledge of the thermal protection system's material thermal properties. For many thermal properties, the sensitivity (the inaccuracy introduced into the calculated convective heating rate by an altered property) was very low.
Thermoelectric property measurements with computer controlled systems
NASA Technical Reports Server (NTRS)
Chmielewski, A. B.; Wood, C.
1984-01-01
A joint JPL-NASA program to develop an automated system to measure the thermoelectric properties of newly developed materials is described. Consideration is given to the difficulties created by signal drift in measurements of Hall voltage and the Large Delta T Seebeck coefficient. The benefits of a computerized system were examined with respect to error reduction and time savings for human operators. It is shown that the time required to measure Hall voltage can be reduced by a factor of 10 when a computer is used to fit a curve to the ratio of the measured signal and its standard deviation. The accuracy of measurements of the Large Delta T Seebeck coefficient and thermal diffusivity was also enhanced by the use of computers.
Arithmetic 400. A Computer Educational Program.
ERIC Educational Resources Information Center
Firestein, Laurie
"ARITHMETIC 400" is the first of the next generation of educational programs designed to encourage thinking about arithmetic problems. Presented in video game format, performance is a measure of correctness, speed, accuracy, and fortune as well. Play presents a challenge to individuals at various skill levels. The program, run on an Apple…
User's manual for computer program BASEPLOT
Sanders, Curtis L.
2002-01-01
The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.
Phase Calibration for the Block 1 VLBI System
NASA Technical Reports Server (NTRS)
Roth, M. G.; Runge, T. F.
1983-01-01
Very Long Baseline Interferometry (VLBI) in the DSN provides support for spacecraft navigation, Earth orientation measurements, and synchronization of network time and frequency standards. An improved method for calibrating instrumental phase shifts has recently been implemented as a computer program in the Block 1 system. The new calibration program, called PRECAL, performs calibrations over intervals as small as 0.4 seconds and greatly reduces the amount of computer processing required to perform phase calibration.
Collection and processing of data from a phase-coherent meteor radar
NASA Technical Reports Server (NTRS)
Backof, C. A., Jr.; Bowhill, S. A.
1974-01-01
An analysis of the measurement accuracy requirement of a high resolution meteor radar for observing short period, atmospheric waves is presented, and a system which satisfies the requirements is described. A medium scale, real time computer is programmed to perform all echo recognition and coordinate measurement functions. The measurement algorithms are exercised on noisy data generated by a program which simulates the hardware system, in order to find the effects of noise on the measurement accuracies.
NASA Technical Reports Server (NTRS)
Carter, J. E.
1977-01-01
A computer program called STAYLAM is presented for the computation of the compressible laminar boundary-layer flow over a yawed infinite wing including distributed suction. This program is restricted to the transonic speed range or less due to the approximate treatment of the compressibility effects. The prescribed suction distribution is permitted to change discontinuously along the chord measured perpendicular to the wing leading edge. Estimates of transition are made by considering leading edge contamination, cross flow instability, and instability of the Tollmien-Schlichting type. A program listing is given in addition to user instructions and a sample case.
Communications oriented programming of parallel iterative solutions of sparse linear systems
NASA Technical Reports Server (NTRS)
Patrick, M. L.; Pratt, T. W.
1986-01-01
Parallel algorithms are developed for a class of scientific computational problems by partitioning the problems into smaller problems which may be solved concurrently. The effectiveness of the resulting parallel solutions is determined by the amount and frequency of communication and synchronization and the extent to which communication can be overlapped with computation. Three different parallel algorithms for solving the same class of problems are presented, and their effectiveness is analyzed from this point of view. The algorithms are programmed using a new programming environment. Run-time statistics and experience obtained from the execution of these programs assist in measuring the effectiveness of these algorithms.
R2 effect-size measures for mediation analysis
Fairchild, Amanda J.; MacKinnon, David P.; Taborga, Marcia P.; Taylor, Aaron B.
2010-01-01
R2 effect-size measures are presented to assess variance accounted for in mediation models. The measures offer a means to evaluate both component paths and the overall mediated effect in mediation models. Statistical simulation results indicate acceptable bias across varying parameter and sample-size combinations. The measures are applied to a real-world example using data from a team-based health promotion program to improve the nutrition and exercise habits of firefighters. SAS and SPSS computer code are also provided for researchers to compute the measures in their own data. PMID:19363189
R2 effect-size measures for mediation analysis.
Fairchild, Amanda J; Mackinnon, David P; Taborga, Marcia P; Taylor, Aaron B
2009-05-01
R(2) effect-size measures are presented to assess variance accounted for in mediation models. The measures offer a means to evaluate both component paths and the overall mediated effect in mediation models. Statistical simulation results indicate acceptable bias across varying parameter and sample-size combinations. The measures are applied to a real-world example using data from a team-based health promotion program to improve the nutrition and exercise habits of firefighters. SAS and SPSS computer code are also provided for researchers to compute the measures in their own data.
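As an illustrative note: one R-squared measure discussed in the mediation literature takes the variance in Y shared with the mediator M and subtracts the increment that M adds over X alone. A sketch of that idea for a single-mediator model (this is a generic reconstruction, not the authors' SAS/SPSS code; the function name is this sketch's own):

```python
import numpy as np

def r2_mediated(x, m, y):
    """A mediation R-squared effect size for X -> M -> Y:
    r^2(M, Y) minus the increment in Y-variance explained when M is
    added to X (illustrative form only)."""
    r_my = np.corrcoef(m, y)[0, 1]
    r_xy = np.corrcoef(x, y)[0, 1]
    # R^2 of Y regressed on both X and M (with an intercept)
    X = np.column_stack([np.ones_like(x), x, m])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2_y_mx = 1.0 - resid.var() / y.var()
    return r_my ** 2 - (r2_y_mx - r_xy ** 2)
```

On simulated data with a genuine mediated path the measure comes out positive, reflecting variance in Y attributable to the X-to-M-to-Y chain.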
The use of wireless laptop computers for computer-assisted learning in pharmacokinetics.
Munar, Myrna Y; Singh, Harleen; Belle, Donna; Brackett, Carolyn C; Earle, Sandra B
2006-02-15
To implement computer-assisted learning workshops into pharmacokinetics courses in a doctor of pharmacy (PharmD) program. Workshops were designed for students to utilize computer software programs on laptop computers to build pharmacokinetic models to predict drug concentrations resulting from various dosage regimens. In addition, students were able to visualize through graphing programs how altering different parameters changed drug concentration-time curves. Surveys were conducted to measure students' attitudes toward computer technology before and after implementation. Finally, traditional examinations were used to evaluate student learning. Doctor of pharmacy students responded favorably to the use of wireless laptop computers in problem-based pharmacokinetic workshops. Eighty-eight percent (n = 61/69) and 82% (n = 55/67) of PharmD students completed surveys before and after computer implementation, respectively. Prior to implementation, 95% of students agreed that computers would enhance learning in pharmacokinetics. After implementation, 98% of students strongly agreed (p < 0.05) that computers enhanced learning. Examination results were significantly higher after computer implementation (89% with computers vs. 84% without computers; p = 0.01). Implementation of wireless laptop computers in a pharmacokinetic course enabled students to construct their own pharmacokinetic models that could respond to changing parameters. Students had greater comprehension and were better able to interpret results and provide appropriate recommendations. Computer-assisted pharmacokinetic techniques can be powerful tools when making decisions about drug therapy.
The Use of Wireless Laptop Computers for Computer-Assisted Learning in Pharmacokinetics
Munar, Myrna Y.; Singh, Harleen; Belle, Donna; Brackett, Carolyn C.; Earle, Sandra B.
2006-01-01
Objective To implement computer-assisted learning workshops into pharmacokinetics courses in a doctor of pharmacy (PharmD) program. Design Workshops were designed for students to utilize computer software programs on laptop computers to build pharmacokinetic models to predict drug concentrations resulting from various dosage regimens. In addition, students were able to visualize through graphing programs how altering different parameters changed drug concentration-time curves. Surveys were conducted to measure students’ attitudes toward computer technology before and after implementation. Finally, traditional examinations were used to evaluate student learning. Assessment Doctor of pharmacy students responded favorably to the use of wireless laptop computers in problem-based pharmacokinetic workshops. Eighty-eight percent (n = 61/69) and 82% (n = 55/67) of PharmD students completed surveys before and after computer implementation, respectively. Prior to implementation, 95% of students agreed that computers would enhance learning in pharmacokinetics. After implementation, 98% of students strongly agreed (p < 0.05) that computers enhanced learning. Examination results were significantly higher after computer implementation (89% with computers vs. 84% without computers; p = 0.01). Conclusion Implementation of wireless laptop computers in a pharmacokinetic course enabled students to construct their own pharmacokinetic models that could respond to changing parameters. Students had greater comprehension and were better able to interpret results and provide appropriate recommendations. Computer-assisted pharmacokinetic techniques can be powerful tools when making decisions about drug therapy. PMID:17136147
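As an illustrative note: the kind of pharmacokinetic model students built in these workshops can be sketched in a few lines. A minimal one-compartment, first-order-elimination model for repeated IV bolus dosing, using superposition (the parameter values below are illustrative, not from the course):

```python
import math

def conc_multidose(dose, vd, ke, tau, n_doses, t_after_last):
    """Plasma concentration for repeated IV bolus dosing under a
    one-compartment model with first-order elimination.

    dose         -- dose per administration (mg)
    vd           -- volume of distribution (L)
    ke           -- elimination rate constant (1/h)
    tau          -- dosing interval (h)
    n_doses      -- number of doses given so far
    t_after_last -- time since the last dose (h)
    """
    c0 = dose / vd  # concentration contributed by each bolus
    # superposition of n doses: accumulation factor
    accum = (1 - math.exp(-n_doses * ke * tau)) / (1 - math.exp(-ke * tau))
    return c0 * accum * math.exp(-ke * t_after_last)
```

Varying `ke` or `tau` and replotting the concentration-time curve is exactly the sort of what-if exploration the abstract describes.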
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heroux, Michael; Lethin, Richard
Programming models and environments play the essential role in high performance computing of enabling the conception, design, implementation and execution of science and engineering application codes. Programmer productivity is strongly influenced by the effectiveness of our programming models and environments, as is software sustainability, since our codes have lifespans measured in decades; the advent of new computing architectures, increased concurrency, concerns for resilience, and the increasing demands for high-fidelity, multi-physics, multi-scale and data-intensive computations mean that we have new challenges to address as part of our fundamental R&D requirements. Fortunately, we also have new tools and environments that make design, prototyping and delivery of new programming models easier than ever. The combination of new and challenging requirements and new, powerful toolsets enables significant synergies for the next generation of programming models and environments R&D. This report presents the topics discussed and results from the 2014 DOE Office of Science Advanced Scientific Computing Research (ASCR) Programming Models & Environments Summit, and subsequent discussions among the summit participants and contributors to topics in this report.
Dynamic gas temperature measurement system. Volume 2: Operation and program manual
NASA Technical Reports Server (NTRS)
Purpura, P. T.
1983-01-01
The hot section technology (HOST) dynamic gas temperature measurement system computer program acquires data from two type B thermocouples of different diameters. The analysis method determines the in situ value of an aerodynamic parameter T, containing the heat transfer coefficient, from the transfer function of the two thermocouples. This aerodynamic parameter is used to compute a frequency response spectrum and to compensate the dynamic portion of the signal of the smaller thermocouple. The calculations for the aerodynamic parameter and the data compensation technique are discussed. Compensated data are presented in either the time domain (as dynamic temperature vs. time) or the frequency domain.
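As an illustrative note: a fine-wire thermocouple behaves roughly as a first-order lag with time constant tau, so once tau is known (the HOST system infers it in situ from the two probe diameters) the gas temperature can be reconstructed as T_gas = T_meas + tau * dT_meas/dt. A sketch of that compensation step only, with tau assumed given:

```python
import numpy as np

def compensate(t, t_meas, tau):
    """First-order dynamic compensation of a thermocouple signal.

    t      -- sample times (s)
    t_meas -- measured thermocouple temperatures at those times
    tau    -- first-order time constant of the probe (s), assumed known
    Returns the reconstructed gas temperature T_meas + tau * dT/dt.
    """
    dTdt = np.gradient(t_meas, t)  # finite-difference derivative
    return t_meas + tau * dTdt
```

In practice differentiation amplifies noise, which is why the HOST approach works with frequency-response spectra rather than raw time derivatives.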
Computer-aided engineering of semiconductor integrated circuits
NASA Astrophysics Data System (ADS)
Meindl, J. D.; Dutton, R. W.; Gibbons, J. F.; Helms, C. R.; Plummer, J. D.; Tiller, W. A.; Ho, C. P.; Saraswat, K. C.; Deal, B. E.; Kamins, T. I.
1980-07-01
Economical procurement of small quantities of high performance custom integrated circuits for military systems is impeded by inadequate process, device and circuit models that handicap low cost computer aided design. The principal objective of this program is to formulate physical models of fabrication processes, devices and circuits to allow total computer-aided design of custom large-scale integrated circuits. The basic areas under investigation are (1) thermal oxidation, (2) ion implantation and diffusion, (3) chemical vapor deposition of silicon and refractory metal silicides, (4) device simulation and analytic measurements. This report discusses the fourth year of the program.
MXLKID: a maximum likelihood parameter identifier. [In LRLTRAN for CDC 7600
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gavel, D.T.
MXLKID (MaXimum LiKelihood IDentifier) is a computer program designed to identify unknown parameters in a nonlinear dynamic system. Using noisy measurement data from the system, the maximum likelihood identifier computes a likelihood function (LF). Identification of system parameters is accomplished by maximizing the LF with respect to the parameters. The main body of this report briefly summarizes the maximum likelihood technique and gives instructions and examples for running the MXLKID program. MXLKID is implemented in LRLTRAN on the CDC 7600 computer at LLNL. A detailed mathematical description of the algorithm is given in the appendices. 24 figures, 6 tables.
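As an illustrative note (a toy stand-in, not MXLKID itself): with Gaussian measurement noise, maximizing the likelihood is equivalent to minimizing a negative log-likelihood built from model residuals. A sketch for a one-parameter exponential-decay system, using a simple grid search in place of MXLKID's optimizer:

```python
import numpy as np

def neg_log_likelihood(theta, t, y, sigma):
    """Gaussian negative log-likelihood (up to a constant) for the toy
    model y(t) = exp(-theta * t) + noise with noise std sigma."""
    model = np.exp(-theta * t)
    return 0.5 * np.sum((y - model) ** 2) / sigma ** 2

def identify(t, y, sigma, grid):
    """Maximum-likelihood estimate of theta by minimizing the NLL
    over a grid of candidate values."""
    nll = [neg_log_likelihood(th, t, y, sigma) for th in grid]
    return grid[int(np.argmin(nll))]
```

Real identifiers use gradient-based maximization and handle vector parameters and process noise; the grid search here just makes the maximize-the-likelihood idea concrete.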
Military Vision Research Program
2011-07-01
Accomplishments emanating from this research: 3 novel computer-based tasks have been developed that measure visual distortions; these tests are based... 10-1-0392. TITLE: Military Vision Research Program. PRINCIPAL INVESTIGATOR: Dr. Darlene Dartt. CONTRACTING ORGANIZATION: The Schepens Eye Research...
Computers in medical education 1: evaluation of a problem-orientated learning package.
Devitt, P; Palmer, E
1998-04-01
A computer-based learning package has been developed, aimed at expanding students' knowledge base as well as improving data-handling abilities and clinical problem-solving skills. The program was evaluated by monitoring its use by students, canvassing users' opinions, and measuring its effectiveness as a learning tool compared to tutorials on the same material. Evaluation was undertaken using three methods: initially, by a questionnaire on computers as a learning tool and the applicability of the content; second, through monitoring by the computer of student use, decisions, and performance; finally, through pre- and post-test assessment of fifth-year students who either used a computer package or attended a tutorial on equivalent material. Most students provided positive comments on the learning material and expressed a willingness to see computer-aided learning (CAL) introduced into the curriculum. Over a 3-month period, 26 modules in the program were used on 1246 occasions. Objective measurement showed a significant gain in knowledge, data handling, and problem-solving skills. Computer-aided learning is a valuable learning resource that deserves better attention in medical education. When used appropriately, the computer can be an effective learning resource, not only for the delivery of knowledge, but also to help students develop their problem-solving skills.
Designing for deeper learning in a blended computer science course for middle school students
NASA Astrophysics Data System (ADS)
Grover, Shuchi; Pea, Roy; Cooper, Stephen
2015-04-01
The focus of this research was to create and test an introductory computer science course for middle school. Titled "Foundations for Advancing Computational Thinking" (FACT), the course aims to prepare and motivate middle school learners for future engagement with algorithmic problem solving. FACT was also piloted as a seven-week course on Stanford's OpenEdX MOOC platform for blended in-class learning. Unique aspects of FACT include balanced pedagogical designs that address the cognitive, interpersonal, and intrapersonal aspects of "deeper learning"; a focus on pedagogical strategies for mediating and assessing for transfer from block-based to text-based programming; curricular materials for remedying misperceptions of computing; and "systems of assessments" (including formative and summative quizzes and tests, directed as well as open-ended programming assignments, and a transfer test) to get a comprehensive picture of students' deeper computational learning. Empirical investigations, accomplished over two iterations of a design-based research effort with students (aged 11-14 years) in a public school, sought to examine student understanding of algorithmic constructs, and how well students transferred this learning from Scratch to text-based languages. Changes in student perceptions of computing as a discipline were measured. Results and mixed-method analyses revealed that students in both studies (1) achieved substantial learning gains in algorithmic thinking skills, (2) were able to transfer their learning from Scratch to a text-based programming context, and (3) achieved significant growth toward a more mature understanding of computing as a discipline. Factor analyses of prior computing experience, multivariate regression analyses, and qualitative analyses of student projects and artifact-based interviews were conducted to better understand the factors affecting learning outcomes. 
Prior computing experiences (as measured by a pretest) and math ability were found to be strong predictors of learning outcomes.
Cancer-meter: measure and cure.
Kashyap, Sunil Kumar; Sharma, Birendra Kumar; Banerjee, Amitabh
2017-05-01
This paper presents a theory and system called the "Cancer-Meter". The idea arises from the statement that "cancer is curable if it is measurable", and the Cancer-Meter is intended to show that such measurement is possible. The paper proposes the Cancer-Meter in two parts, one theoretical and one electronic, corresponding to measurement and treatment. The first part is defined mathematically, while the second is based on computer programming and on electrical and electronic engineering. Thus, the Cancer-Meter is a programmed electrical-electronic device intended both to measure and to treat cancer.
N'Gom, Moussa; Lien, Miao-Bin; Estakhri, Nooshin M; Norris, Theodore B; Michielssen, Eric; Nadakuditi, Raj Rao
2017-05-31
Complex Semi-Definite Programming (SDP) is introduced as a novel approach to phase retrieval enabled control of monochromatic light transmission through highly scattering media. In a simple optical setup, a spatial light modulator is used to generate a random sequence of phase-modulated wavefronts, and the resulting intensity speckle patterns in the transmitted light are acquired on a camera. The SDP algorithm allows computation of the complex transmission matrix of the system from this sequence of intensity-only measurements, without need for a reference beam. Once the transmission matrix is determined, optimal wavefronts are computed that focus the incident beam to any position or sequence of positions on the far side of the scattering medium, without the need for any subsequent measurements or wavefront shaping iterations. The number of measurements required and the degree of enhancement of the intensity at focus is determined by the number of pixels controlled by the spatial light modulator.
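The focusing step described above can be illustrated with a minimal numpy sketch (this is not the authors' SDP code; the matrix sizes and names are hypothetical): once the transmission matrix T is known, the phase-only wavefront that maximizes intensity at a chosen output position is the conjugate phase of the corresponding row of T.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical complex transmission matrix: n_out camera pixels x n_in SLM pixels.
n_out, n_in = 64, 256
T = (rng.standard_normal((n_out, n_in))
     + 1j * rng.standard_normal((n_out, n_in))) / np.sqrt(2 * n_in)

def focusing_wavefront(T, m):
    """Phase-only SLM pattern that focuses at output position m:
    each pixel cancels the phase of the corresponding entry of row m,
    so the contributions add in phase at the focus."""
    return np.exp(-1j * np.angle(T[m]))

m = 10
w = focusing_wavefront(T, m)
I_focus = np.abs(T @ w) ** 2                               # with optimal wavefront
I_random = np.abs(T @ np.exp(1j * rng.uniform(0, 2 * np.pi, n_in))) ** 2

# Intensity enhancement at the focus relative to a random-speckle background:
enhancement = I_focus[m] / I_random.mean()
```

For a random matrix of this kind the phase-only enhancement scales roughly with the number of controlled SLM pixels, consistent with the abstract's remark that the degree of enhancement is set by the pixel count.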
An evaluation of Space Shuttle STS-2 payload bay acoustic data and comparison with predictions
NASA Technical Reports Server (NTRS)
Wilby, J. F.; Piersol, A. G.; Wilby, E. G.
1982-01-01
Space average sound pressure levels computed from measurements at 18 locations in the payload bay of the Space Shuttle orbiter vehicle during the STS-2 launch were compared with predicted levels obtained using the PACES computer program. The comparisons were performed over the frequency range 12.5 Hz to 1000 Hz, since the test data at higher frequencies are contaminated by instrumentation background noise. In general, the PACES computer program tends to overpredict the space average sound levels in the payload bay, although the magnitude of the discrepancy is usually small. Furthermore, the discrepancy depends to some extent on the manner in which the payload is modeled analytically, and on the method used to determine the "measured" space average sound pressure levels. Thus the difference between predicted and measured sound levels, averaged over the 20 one-third octave bands from 12.5 Hz to 1000 Hz, varies from 1 dB to 3.5 dB.
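The "space average" of levels measured at discrete locations is conventionally an energy (mean-square pressure) average rather than an arithmetic average of decibel values. A small sketch of that computation (the example levels are hypothetical, not STS-2 data):

```python
import math

def space_average_spl(levels_db):
    """Energy (mean-square pressure) average of sound pressure levels:
    L_avg = 10*log10(mean(10^(L_i/10))), in dB."""
    n = len(levels_db)
    return 10.0 * math.log10(sum(10.0 ** (L / 10.0) for L in levels_db) / n)

# Hypothetical one-third-octave-band levels at a few payload-bay locations:
levels = [128.0, 131.0, 126.5, 129.0]
avg = space_average_spl(levels)
```

Because the averaging is done on mean-square pressures, the result is pulled toward the loudest locations, so it sits above the arithmetic mean of the dB values.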
NASA Technical Reports Server (NTRS)
Farley, Douglas L.
2005-01-01
NASA's Aviation Safety and Security Program is pursuing research in on-board Structural Health Management (SHM) technologies for purposes of reducing or eliminating aircraft accidents due to system and component failures. Under this program, NASA Langley Research Center (LaRC) is developing a strain-based structural health-monitoring concept that incorporates a fiber optic-based measuring system for acquiring strain values. This fiber optic-based measuring system provides for the distribution of thousands of strain sensors embedded in a network of fiber optic cables. The resolution of strain value at each discrete sensor point requires a computationally demanding data reduction software process that, when hosted on a conventional processor, is not suitable for near real-time measurement. This report describes the development and integration of an alternative computing environment using dedicated computing hardware for performing the data reduction. Performance comparison between the existing and the hardware-based system is presented.
A Fortran Program to Aid in Mineral Identification Using Optical Properties.
ERIC Educational Resources Information Center
Blanchard, Frank N.
1980-01-01
Describes a search-and-match computer program which retrieves from a user-generated mineral file those minerals which are not incompatible with the observed or measured optical properties of an unknown. Careful selection of input lists makes it unlikely that the program will fail when reasonably accurate observations are recorded. (Author/JN)
Prediction of quantitative intrathoracic fluid volume to diagnose pulmonary oedema using LabVIEW.
Urooj, Shabana; Khan, M; Ansari, A Q; Lay-Ekuakille, Aimé; Salhan, Ashok K
2012-01-01
Pulmonary oedema is a life-threatening disease that requires special attention in research and clinical diagnosis. Computer-based techniques are rarely used to quantify the intrathoracic fluid volume (IFV) for diagnostic purposes. This paper discusses a software program developed to detect and diagnose pulmonary oedema using LabVIEW. The software takes as input anthropometric dimensions and physiological parameters, chiefly transthoracic electrical impedance (TEI). This technique is accurate and faster than existing manual techniques. The LabVIEW software was used to compute the parameters required to quantify IFV, and an equation relating per cent control and IFV was obtained. The results of predicted TEI and measured TEI were compared with previously reported data to validate the developed program. It was found that the predicted values of TEI obtained from the computer-based technique were much closer to the measured values of TEI. Six new subjects were enrolled to measure and predict transthoracic impedance and hence quantify IFV. A similar difference was also observed between the measured and predicted values of TEI for the new subjects.
NASA Rotor 37 CFD Code Validation: Glenn-HT Code
NASA Technical Reports Server (NTRS)
Ameri, Ali A.
2010-01-01
In order to advance the goals of NASA aeronautics programs, it is necessary to continuously evaluate and improve the computational tools used for research and design at NASA. One such code is the Glenn-HT code which is used at NASA Glenn Research Center (GRC) for turbomachinery computations. Although the code has been thoroughly validated for turbine heat transfer computations, it has not been utilized for compressors. In this work, Glenn-HT was used to compute the flow in a transonic compressor and comparisons were made to experimental data. The results presented here are in good agreement with this data. Most of the measures of performance are well within the measurement uncertainties and the exit profiles of interest agree with the experimental measurements.
ERIC Educational Resources Information Center
Palestis, Ernest
1997-01-01
Describes the award-winning technology endeavors and parent involvement programs developed in the Mine Hill School District (New Jersey). Topics include the multiyear plan, community and board of education support, funding, measuring student learning outcomes, and evening computer education programs for parents and children. (LRW)
NASA Technical Reports Server (NTRS)
Borysow, Aleksandra
1998-01-01
Accurate knowledge of certain collision-induced absorption continua of molecular pairs such as H2-H2, H2-He, H2-CH4, CO2-CO2, etc., is a prerequisite for most spectral analyses and modelling attempts of atmospheres of planets and cold stars. We collect and regularly update simple, state-of-the-art computer programs for the calculation of the absorption coefficient of such molecular pairs over a broad range of temperatures and frequencies, for the various rotovibrational bands. The computational results are in agreement with the existing laboratory measurements of such absorption continua, recorded with a spectral resolution of a few wavenumbers, but reliable computational results may be expected even in the far wings, and at temperatures for which laboratory measurements do not exist. Detailed information is given concerning the systems thus studied, the temperature and frequency ranges considered, the rotovibrational bands thus modelled, and how one may obtain copies of the FORTRAN77 computer programs by e-mail.
Omega flight-test data reduction sequence. [computer programs for reduction of navigation data
NASA Technical Reports Server (NTRS)
Lilley, R. W.
1974-01-01
Computer programs for Omega data conversion, summary, and preparation for distribution are presented. Program logic and sample data formats are included, along with operational instructions for each program. Flight data (or data collected in flight format in the laboratory) is provided by the Ohio University Omega receiver base in the form of 6-bit binary words representing the phase of an Omega station with respect to the receiver's local clock. All eight Omega stations are measured in each 10-second Omega time frame. In addition, an event-marker bit and a time-slot D synchronizing bit are recorded. Program FDCON is used to remove data from the flight recorder tape and place it on data-processing cards for later use. Program FDSUM provides for computer plotting of selected LOPs, for single-station phase plots, and for printout of basic signal statistics for each Omega channel. Mean phase and standard deviation are printed, along with data from which a phase distribution can be plotted for each Omega station. Program DACOP simply copies the Omega data deck a controlled number of times, for distribution to users.
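The phase summary step in FDSUM can be illustrated with circular statistics, which avoid wraparound bias when averaging phase samples near a cycle boundary. The 6-bit-word-to-cycle scaling below is an assumption made for illustration, not the documented Omega recording format, and the actual FDSUM computation may differ:

```python
import math

def phase_stats(words, bits=6):
    """Convert phase words (assumed here to be fractions of a cycle,
    word/2^bits) to a circular mean phase and a circular standard
    deviation, both expressed in cycles."""
    n = 2 ** bits
    angles = [2 * math.pi * w / n for w in words]
    c = sum(math.cos(a) for a in angles) / len(angles)
    s = sum(math.sin(a) for a in angles) / len(angles)
    mean_phase = (math.atan2(s, c) / (2 * math.pi)) % 1.0
    r = math.hypot(c, s)                        # mean resultant length
    circ_std = math.sqrt(-2 * math.log(r)) / (2 * math.pi)
    return mean_phase, circ_std

# Samples straddling the wrap (62, 63 are just below a full cycle):
mean_phase, circ_std = phase_stats([62, 63, 0, 1, 2])
```

A naive arithmetic mean of these words would give roughly half a cycle; the circular mean correctly lands at the wrap point.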
Operation of the HP2250 with the HP9000 series 200 using PASCAL 3.0
NASA Technical Reports Server (NTRS)
Perry, John; Stroud, C. W.
1986-01-01
A computer program has been written to provide an interface between the HP Series 200 desktop computers, operating under HP Standard Pascal 3.0, and the HP2250 Data Acquisition and Control System. Pascal 3.0 for the HP9000 desktop computer gives a number of procedures for handling bus communication at various levels. It is necessary, however, to reach the lowest possible level in Pascal to handle the bus protocols required by the HP2250. This makes programming extremely complex since these protocols are not documented. The program described solves those problems and allows the user to immediately program, simply and efficiently, any measurement and control language (MCL/50) application with a few procedure calls. The complete set of procedures is available on a 5 1/4 inch diskette from Cosmic. Included in this group of procedures is an Exerciser which allows the user to exercise his HP2250 interactively. The exerciser operates in a fashion similar to the Series 200 operating system programs, but is adapted to the requirements of the HP2250. The programs on the diskette and the user's manual assume the user is acquainted with both the MCL/50 programming language and HP Standard Pascal 3.0 for the HP series 200 desktop computers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frank, R.N.
1990-02-28
The Inspection Shop at Lawrence Livermore Lab recently purchased a Sheffield Apollo RS50 Direct Computer Control Coordinate Measuring Machine. The performance of the machine was specified to conform to the B89 standard, which relies heavily upon using the measuring machine in its intended manner to verify its accuracy (rather than parametric tests). Although it would be possible to use the interactive measurement system to perform these tasks, a more thorough and efficient job can be done by creating Function Library programs for certain tasks which integrate Hewlett-Packard Basic 5.0 language and calls to proprietary analysis and machine control routines. This combination provides efficient use of the measuring machine with a minimum of keyboard input, plus an analysis of the data with respect to the B89 standard rather than a CMM analysis which would require subsequent interpretation. This paper discusses some characteristics of the Sheffield machine control and analysis software and my use of H-P Basic language to create automated measurement programs to support the B89 performance evaluation of the CMM. 1 ref.
Parallel Wavefront Analysis for a 4D Interferometer
NASA Technical Reports Server (NTRS)
Rao, Shanti R.
2011-01-01
This software provides a programming interface for automating data collection with a PhaseCam interferometer from 4D Technology, and for distributing the image-processing algorithm across a cluster of general-purpose computers. Multiple instances of 4Sight (4D Technology's proprietary software) run on a networked cluster of computers. Each connects to a single server (the controller) and waits for instructions. The controller directs the interferometer to capture several images, then assigns each image to a different computer for processing. When the image processing is finished, the server directs one of the computers to collate and combine the processed images, saving the resulting measurement in a file on a disk. The available software captures approximately 100 images and analyzes them immediately. This software separates the capture and analysis processes, so that analysis can be done at a different time and faster, by running the algorithm in parallel across several processors. The PhaseCam family of interferometers can measure an optical system in milliseconds, but it takes many seconds to process the data so that it is usable. In characterizing an adaptive optics system, like the next generation of astronomical observatories, thousands of measurements are required, and the processing time quickly becomes excessive. A programming interface distributes data processing for a PhaseCam interferometer across a Windows computing cluster. A scriptable controller program coordinates data acquisition from the interferometer, storage on networked hard disks, and parallel processing. Idle time of the interferometer is minimized. This architecture is implemented in Python and JavaScript, and may be altered to fit a customer's needs.
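The capture/analysis split can be sketched with a worker pool. This is an illustrative stand-in, not the 4Sight/controller implementation: it uses a thread pool on one machine rather than a networked cluster, and a toy per-frame step (row-wise phase unwrapping) in place of the proprietary algorithm.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def unwrap_frame(frame):
    """Stand-in for the per-image processing step: a simple row-wise
    unwrap of a wrapped-phase image."""
    return np.unwrap(frame, axis=1)

def process_batch(frames, workers=4):
    """Farm per-frame analysis out to a worker pool, then collate the
    processed frames into one averaged measurement (the 'combine' step
    the controller assigns to one node in the text above)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        processed = list(pool.map(unwrap_frame, frames))
    return np.mean(processed, axis=0)

# Hypothetical batch of wrapped-phase frames from the interferometer:
rng = np.random.default_rng(1)
frames = [rng.uniform(-np.pi, np.pi, (4, 8)) for _ in range(10)]
result = process_batch(frames)
```

The design point is the same as in the abstract: capture is fast and serial, processing is slow and embarrassingly parallel per frame, so decoupling them keeps the instrument busy.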
Solar radiation measurement project
NASA Technical Reports Server (NTRS)
Ioup, J. W.
1981-01-01
The Xavier solar radiation measurement project and station are described. Measurements of the total solar radiation on a horizontal surface from an Eppley pyranometer were collected into computer data files. Total radiation in watt hours was converted from ten minute intervals to hourly intervals. Graphs of this total radiation data are included. A computer program in Fortran was written to calculate the total extraterrestrial radiation on a horizontal surface for each day of the month. Educational and social benefits of the project are cited.
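The extraterrestrial-radiation calculation mentioned above is commonly done with the standard declination and sunset-hour-angle formulas; a sketch follows (this is not the original Fortran program, and the site latitude and days are illustrative):

```python
import math

GSC = 1367.0  # solar constant, W/m^2

def extraterrestrial_daily(lat_deg, day_of_year):
    """Daily extraterrestrial radiation on a horizontal surface (Wh/m^2),
    using Cooper's declination formula and the sunset hour angle."""
    phi = math.radians(lat_deg)
    # Solar declination, radians:
    delta = math.radians(23.45) * math.sin(2 * math.pi * (284 + day_of_year) / 365)
    # Sunset hour angle, clamped for polar day/night:
    x = -math.tan(phi) * math.tan(delta)
    ws = math.acos(max(-1.0, min(1.0, x)))
    # Eccentricity correction of the Earth-Sun distance:
    e0 = 1 + 0.033 * math.cos(2 * math.pi * day_of_year / 365)
    return (24 / math.pi) * GSC * e0 * (
        math.cos(phi) * math.cos(delta) * math.sin(ws)
        + ws * math.sin(phi) * math.sin(delta)
    )

# Xavier University is near 30 N latitude; days near the solstices:
h0_summer = extraterrestrial_daily(30.0, 172)
h0_winter = extraterrestrial_daily(30.0, 355)
```

Comparing measured pyranometer totals against this extraterrestrial ceiling is the usual way to express a clearness index for each day.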
NASA Technical Reports Server (NTRS)
Howell, W. E.
1974-01-01
The structural performance of a boron-epoxy reinforced titanium drag strut, which contains a bonded scarf joint and was designed to the criteria of the Boeing 747 transport, was evaluated. An experimental and analytical investigation was conducted. The strut was exposed to two lifetimes of spectrum loading and was statically loaded to the tensile and compressive design ultimate loads. Throughout the test program no evidence of any damage in the drag strut was detected by strain gage measurements, ultrasonic inspection, or visual observation. An analytical study of the bonded joint was made using the NASA structural analysis computer program NASTRAN. A comparison of the strains predicted by the NASTRAN computer program with the experimentally determined values shows excellent agreement. The NASTRAN computer program is a viable tool for studying, in detail, the stresses and strains induced in a bonded joint.
Obtaining Reliable Predictions of Terrestrial Energy Coupling From Real-Time Solar Wind Measurement
NASA Technical Reports Server (NTRS)
Weimer, Daniel R.
2001-01-01
The first draft of a manuscript titled "Variable time delays in the propagation of the interplanetary magnetic field" has been completed, for submission to the Journal of Geophysical Research. In the preparation of this manuscript all data and analysis programs had been updated to the highest temporal resolution possible, at 16 seconds or better. The program which computes the "measured" IMF propagation time delays from these data has also undergone another improvement. In another significant development, a technique has been developed in order to predict IMF phase plane orientations, and the resulting time delays, using only measurements from a single satellite at L1. The "minimum variance" method is used for this computation. Further work will be done on optimizing the choice of several parameters for the minimum variance calculation.
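The minimum variance method referenced above is the standard Sonnerup-Cahill analysis: the estimated normal to the IMF phase plane is the eigenvector of the magnetic-field covariance matrix having the smallest eigenvalue. A sketch on synthetic data (not the manuscript's code; the field construction is contrived so the true normal is known):

```python
import numpy as np

def minimum_variance_normal(B):
    """Minimum-variance analysis: B is an (N, 3) array of magnetic-field
    samples; returns the unit eigenvector of the field covariance matrix
    with the smallest eigenvalue, i.e. the estimated phase-plane normal."""
    M = np.cov(B, rowvar=False)           # 3x3 covariance of the components
    eigvals, eigvecs = np.linalg.eigh(M)  # eigh returns ascending eigenvalues
    return eigvecs[:, 0]

# Synthetic field that varies only within a plane of known normal:
true_normal = np.array([1.0, 2.0, 2.0]) / 3.0
e1 = np.cross(true_normal, [0.0, 0.0, 1.0]); e1 /= np.linalg.norm(e1)
e2 = np.cross(true_normal, e1)
t = np.linspace(0, 10, 500)
B = np.outer(np.sin(t), e1) + np.outer(np.cos(2 * t), e2)
n_est = minimum_variance_normal(B)
```

Given the normal n, a solar-wind velocity V, and the satellite-to-Earth separation r, the propagation delay follows as (n . r) / (n . V), which is how a tilted phase plane produces the variable time delays the manuscript describes.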
NASA Astrophysics Data System (ADS)
Parker, Tehri Davenport
1997-09-01
This study designed, implemented, and evaluated an environmental education hypermedia program for use in a residential environmental education facility. The purpose of the study was to ascertain whether a hypermedia program could increase student knowledge and positive attitudes toward the environment and environmental education. A student/computer interface, based on the theory of social cognition, was developed to direct student interactions with the computer. A quasi-experimental research design was used. Students were randomly assigned to either the experimental or control group. The experimental group used the hypermedia program to learn about the topic of energy. The control group received the same conceptual information from a teacher/naturalist. An Environmental Awareness Quiz was administered to measure differences in the students' cognitive understanding of energy issues. Students participated in one on one interviews to discuss their attitudes toward the lesson and the overall environmental education experience. Additionally, members of the experimental group were tape recorded while they used the hypermedia program. These tapes were analyzed to identify aspects of the hypermedia program that promoted student learning. The findings of this study suggest that computers, and hypermedia programs, can be integrated into residential environmental education facilities, and can assist environmental educators in meeting their goals for students. The study found that the hypermedia program was as effective as the teacher/naturalist for teaching about environmental education material. Students who used the computer reported more positive attitudes toward the lesson on energy, and thought that they had learned more than the control group. Students in the control group stated that they did not learn as much as the computer group. 
The majority of students had positive attitudes toward the inclusion of computers in the camp setting, and stated that they were a good way to learn about environmental education material. This study also identified lack of social skills as a barrier to social cognition among mixed gender groups using the computer program.
Human Factors Research in Aircrew Performance and Training: 1990 Annual Summary Report
1991-06-01
[Garbled table-of-contents residue; recoverable items: "Development of a Methodology for Measuring Both Conscious and Subconscious Aspects of Aircrew Coordination in Army Helicopter Crews"; a model first implemented on a Perkin-Elmer minicomputer in the FORTRAN programming language and later reprogrammed using the TOSS software and an IBM personal computer; follow-on work to be conducted by the UAFDL.]
Measurement of hard tissue density based on image density of intraoral radiograph
NASA Astrophysics Data System (ADS)
Katsumata, Akitoshi; Fukui, Tatsumasa; Shimoda, Shinji; Kobayashi, Kaoru; Hayashi, Tatsuro
2018-02-01
We developed a DentalSCOPE computer program to measure the bone mineral density (BMD) of the alveolar bone. Measuring the mineral density of alveolar bone may help identify patients at risk of medication-related osteonecrosis of the jaw (MRONJ), because osteoporosis medications significantly affect alveolar bone mineral density. The BMD of alveolar bone was compared between dual-energy X-ray absorptiometry (DEXA) and the DentalSCOPE program. A high correlation coefficient was found between the DentalSCOPE measurement and the DEXA measurement.
AAFE RADSCAT data reduction programs user's guide
NASA Technical Reports Server (NTRS)
Claassen, J. P.
1976-01-01
Theory, design and operation of the computer programs which automate the reduction of joint radiometer and scatterometer observations are presented. The programs reduce scatterometer measurements to the normalized scattering coefficient; whereas the radiometer measurements are converted into antenna temperatures. The programs are both investigator and user oriented. Supplementary parameters are provided to aid in the interpretation of the observations. A hierarchy of diagnostics is available to evaluate the operation of the instrument, the conduct of the experiments and the quality of the records. General descriptions of the programs and their data products are also presented. This document therefore serves as a user's guide to the programs and is therefore intended to serve both the experimenter and the program operator.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. House Committee on Ways and Means.
This hearing on legislation designed to encourage contributions of computers and computer equipment to elementary and secondary schools emphasizes California's experience with a state-level program. Testimony is included from the following witnesses: Kay Pacheco, Alameda County Office of Education; Michael D. Rashkin, Apple Computer, Inc.; Barbara…
[Application of virtual instrumentation technique in toxicological studies].
Moczko, Jerzy A
2005-01-01
Research investigations frequently require a direct connection of measuring equipment to the computer. Virtual instrumentation technique considerably facilitates the programming of sophisticated acquisition-and-analysis procedures. In the standard approach these two steps are performed sequentially with separate software tools: the acquired data are transferred, via the export/import procedures of one program, to another program which executes the next step of analysis. This procedure is cumbersome, time consuming, and may be a potential source of errors. In 1987 National Instruments Corporation introduced the LabVIEW language, based on the concept of graphical programming. Unlike conventional textual languages, it allows the researcher to concentrate on the problem being solved rather than on syntactical rules. Programs developed in LabVIEW are called virtual instruments (VI) and are portable among different computer platforms such as PCs, Macintoshes, Sun SPARCstations, Concurrent PowerMAX stations, and HP PA/RISC workstations. This portability ensures that programs prepared for one particular platform are also usable on another. This paper describes the basic principles of connecting research equipment to computer systems.
A computer program for geochemical analysis of acid-rain and other low-ionic-strength, acidic waters
Johnsson, P.A.; Lord, D.G.
1987-01-01
ARCHEM, a computer program written in FORTRAN 77, is designed primarily for use in the routine geochemical interpretation of low-ionic-strength, acidic waters. On the basis of chemical analyses of the water, and either laboratory or field determinations of pH, temperature, and dissolved oxygen, the program calculates the equilibrium distribution of major inorganic aqueous species and of inorganic aluminum complexes. The concentration of the organic anion is estimated from the dissolved organic carbon concentration. Ionic ferrous iron is calculated from the dissolved oxygen concentration. Ionic balances and comparisons of computed with measured specific conductances are performed as checks on the analytical accuracy of chemical analyses. ARCHEM may be tailored easily to fit different sampling protocols, and may be run on multiple sample analyses. (Author's abstract)
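The ionic-balance check that ARCHEM performs is conventionally expressed as a percent charge-balance error between summed cation and anion equivalents; a sketch with hypothetical concentrations (not ARCHEM's actual routine):

```python
def charge_balance_error(cations_meq, anions_meq):
    """Percent charge-balance error from summed cation and anion
    concentrations in meq/L:
    CBE = 100 * (sum_cations - sum_anions) / (sum_cations + sum_anions)."""
    sc, sa = sum(cations_meq), sum(anions_meq)
    return 100.0 * (sc - sa) / (sc + sa)

# Hypothetical dilute acidic sample, meq/L:
# cations H+, Ca2+, Mg2+, Na+; anions SO4^2-, NO3-, Cl-.
cations = [0.050, 0.040, 0.015, 0.020]
anions = [0.070, 0.030, 0.022]
cbe = charge_balance_error(cations, anions)
```

A balance within a few percent is the usual acceptance criterion; larger errors flag either an analytical problem or an unmeasured species such as the organic anion estimated above.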
STX--Fortran-4 program for estimates of tree populations from 3P sample-tree-measurements
L. R. Grosenbaugh
1967-01-01
Describes how to use an improved and greatly expanded version of an earlier computer program (1964) that converts dendrometer measurements of 3P-sample trees to population values in terms of whatever units user desires. Many new options are available, including that of obtaining a product-yield and appraisal report based on regression coefficients supplied by user....
BASIC Programming In Water And Wastewater Analysis
NASA Technical Reports Server (NTRS)
Dreschel, Thomas
1988-01-01
Collection of computer programs assembled for use in water-analysis laboratories. First program calculates quality-control parameters used in routine water analysis. Second calculates line of best fit for standard concentrations and absorbances entered. Third calculates specific conductance from conductivity measurement and temperature at which measurement taken. Fourth calculates any one of four types of residue measured in water. Fifth, sixth, and seventh calculate results of titrations commonly performed on water samples. Eighth converts measurements to actual dissolved-oxygen concentration using oxygen-saturation values for fresh and salt water. Ninth and tenth perform calculations of two other common titrimetric analyses. Eleventh calculates oil and grease residue from water sample. Last two use spectrophotometric measurements of absorbance at different wavelengths and residue measurements. Programs included in collection written for Hewlett-Packard 2647F in H-P BASIC.
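The third program's conversion is conventionally a linear temperature compensation of the raw conductivity reading to 25 °C; a sketch (the coefficient and readings are illustrative defaults, not taken from the original BASIC code):

```python
def specific_conductance(conductivity_uS, temp_c, alpha=0.0191):
    """Convert a raw conductivity reading (uS/cm) at temp_c to specific
    conductance at 25 C using a linear temperature coefficient; a value
    of roughly 2% per degree C is a common default for natural waters."""
    return conductivity_uS / (1.0 + alpha * (temp_c - 25.0))

# A reading of 480 uS/cm taken at 18 C corrects upward toward 25 C:
sc = specific_conductance(480.0, 18.0)
```
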
Home-Based Computer Gaming in Vestibular Rehabilitation of Gaze and Balance Impairment.
Szturm, Tony; Reimer, Karen M; Hochman, Jordan
2015-06-01
Disease or damage of the vestibular sense organs cause a range of distressing symptoms and functional problems that could include loss of balance, gaze instability, disorientation, and dizziness. A novel computer-based rehabilitation system with therapeutic gaming application has been developed. This method allows different gaze and head movement exercises to be coupled to a wide range of inexpensive, commercial computer games. It can be used in standing, and thus graded balance demands using a sponge pad can be incorporated into the program. A case series pre- and postintervention study was conducted of nine adults diagnosed with peripheral vestibular dysfunction who received a 12-week home rehabilitation program. The feasibility and usability of the home computer-based therapeutic program were established. Study findings revealed that using head rotation to interact with computer games, when coupled to demanding balance conditions, resulted in significant improvements in standing balance, dynamic visual acuity, gaze control, and walking performance. Perception of dizziness as measured by the Dizziness Handicap Inventory also decreased significantly. These preliminary findings provide support that a low-cost home game-based exercise program is well suited to train standing balance and gaze control (with active and passive head motion).
NASA Technical Reports Server (NTRS)
1979-01-01
The computer program Linear SCIDNT which evaluates rotorcraft stability and control coefficients from flight or wind tunnel test data is described. It implements the maximum likelihood method to maximize the likelihood function of the parameters based on measured input/output time histories. Linear SCIDNT may be applied to systems modeled by linear constant-coefficient differential equations. This restriction in scope allows the application of several analytical results which simplify the computation and improve its efficiency over the general nonlinear case.
Software For Computer-Security Audits
NASA Technical Reports Server (NTRS)
Arndt, Kate; Lonsford, Emily
1994-01-01
Information relevant to potential breaches of security gathered efficiently. Automated Auditing Tools for VAX/VMS program includes following automated software tools performing noted tasks: Privileged ID Identification, program identifies users and their privileges to circumvent existing computer security measures; Critical File Protection, critical files not properly protected identified; Inactive ID Identification, identifications of users no longer in use found; Password Lifetime Review, maximum lifetimes of passwords of all identifications determined; and Password Length Review, minimum allowed length of passwords of all identifications determined. Written in DEC VAX DCL language.
1980-01-01
[Report documentation-page residue; the only recoverable fragment notes that, for trapezoidal panels, a formula for PAR can be derived for the case where equal spanwise and chordwise divisions are used (the formula itself is garbled beyond recovery).]
Visual Basic Programming Impact on Cognitive Style of College Students: Need for Prerequisites
ERIC Educational Resources Information Center
White, Garry L.
2012-01-01
This research investigated the impact learning a visual programming language, Visual Basic, has on hemispheric cognitive style, as measured by the Hemispheric Mode Indicator (HMI). The question to be answered is: will a computer programming course help students improve their cognitive abilities in order to perform well? The cognitive styles for…
Computer Generated Diffraction Patterns Of Rough Surfaces
NASA Astrophysics Data System (ADS)
Rakels, Jan H.
1989-03-01
It is generally accepted that optical methods are the most promising for the in-process measurement of surface finish. These methods have the advantages of non-contact operation and fast data acquisition. In the Micro-Engineering Centre at the University of Warwick, an optical sensor has been devised which can measure the rms roughness, slope, and wavelength of turned and precision-ground surfaces. The operation of this device is based upon the Kirchhoff-Fresnel diffraction integral. Application of this theory to ideal turned surfaces is straightforward, and indeed the theoretically calculated diffraction patterns are in close agreement with patterns produced by an actual optical instrument. Since it is mathematically difficult to introduce real surface profiles into the diffraction integral, a computer program has been devised which simulates the operation of the optical sensor. The program produces a diffraction pattern as a graphical output. Comparison between computer-generated and actual diffraction patterns of the same surfaces shows a high correlation.
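Under the Kirchhoff approximation at normal incidence, the far-field (Fraunhofer) pattern of a reflecting profile reduces to the squared FFT magnitude of the surface-induced phase; a sketch (not the Warwick program; the wavelength and profile parameters are hypothetical):

```python
import numpy as np

def diffraction_pattern(heights, wavelength=0.6328e-6):
    """Far-field intensity from a 1-D surface profile under the Kirchhoff
    approximation at normal incidence: on reflection the field picks up a
    phase 2*k*h(x), and the pattern is the squared FFT magnitude."""
    k = 2 * np.pi / wavelength
    field = np.exp(2j * k * heights)
    return np.abs(np.fft.fftshift(np.fft.fft(field))) ** 2

# Ideal "turned" surface: sawtooth of 2-um feed period, 50-nm peak height.
x = np.linspace(0, 512e-6, 4096, endpoint=False)
h = 50e-9 * ((x / 2e-6) % 1.0)
pattern = diffraction_pattern(h)
```

For a periodic turned profile the pattern collapses into discrete diffraction orders whose spacing encodes the feed wavelength and whose envelope encodes roughness and slope, which is what the sensor exploits.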
Interactive information processing for NASA's mesoscale analysis and space sensor program
NASA Technical Reports Server (NTRS)
Parker, K. G.; Maclean, L.; Reavis, N.; Wilson, G.; Hickey, J. S.; Dickerson, M.; Karitani, S.; Keller, D.
1985-01-01
The Atmospheric Sciences Division (ASD) of the Systems Dynamics Laboratory at NASA's Marshall Space Flight Center (MSFC) is currently involved in interactive information processing for the Mesoscale Analysis and Space Sensor (MASS) program. Specifically, the ASD is engaged in the development and implementation of new space-borne remote sensing technology to observe and measure mesoscale atmospheric processes. These space measurements and conventional observational data are being processed together to gain an improved understanding of the mesoscale structure and the dynamical evolution of the atmosphere relative to cloud development and precipitation processes. To satisfy its vast data processing requirements, the ASD has developed a Researcher Computer System consisting of three primary computer systems which provides over 20 scientists with a wide range of capabilities for processing and displaying large volumes of remote sensing data. Each of the computers performs a specific function according to its unique capabilities.
Computer quantitation of coronary angiograms
NASA Technical Reports Server (NTRS)
Ledbetter, D. C.; Selzer, R. H.; Gordon, R. M.; Blankenhorn, D. H.; Sanmarco, M. E.
1978-01-01
A computer technique is being developed at the Jet Propulsion Laboratory to automate the measurement of coronary stenosis. A Vanguard 35mm film transport is optically coupled to a Spatial Data System vidicon/digitizer which in turn is controlled by a DEC PDP 11/55 computer. Programs have been developed to track the edges of the arterial shadow, to locate normal and atherosclerotic vessel sections, and to measure percent stenosis. Multiple frame analysis techniques are being investigated that involve, on the one hand, averaging stenosis measurements from adjacent frames and, on the other hand, averaging adjacent frame images directly and then measuring stenosis from the averaged image. For the latter case, geometric transformations are used to force registration of vessel images whose spatial orientation changes.
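The percent-stenosis measure and the two multiple-frame strategies can be illustrated with a small sketch. The helper names and diameter values below are hypothetical; edge tracking and image registration are replaced here by ready-made vessel-diameter profiles:

```python
import numpy as np

def percent_stenosis(diameters, normal_diameter):
    """Percent diameter stenosis relative to an adjacent normal segment."""
    minimal = min(diameters)
    return 100.0 * (1.0 - minimal / normal_diameter)

def averaged_stenosis(per_frame_diameters, normal_diameter):
    """Strategy 1: measure stenosis in each frame, then average the results."""
    return float(np.mean([percent_stenosis(d, normal_diameter)
                          for d in per_frame_diameters]))

def stenosis_from_averaged_image(per_frame_diameters, normal_diameter):
    """Strategy 2 (proxy): average the registered frames first, then measure.
    Averaging diameter profiles stands in for averaging the images."""
    mean_profile = np.mean(per_frame_diameters, axis=0)
    return percent_stenosis(mean_profile, normal_diameter)

# Diameter profiles (mm) along a vessel segment in three adjacent frames.
frames = [[3.0, 1.4, 2.9], [3.1, 1.6, 3.0], [2.9, 1.5, 2.8]]
```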
A Survey of Quantum Programming Languages: History, Methods, and Tools
2008-01-01
and entanglement, to achieve computational solutions to certain problems in less time (fewer computational cycles) than is possible using classical... superposition of quantum bits, entanglement, destructive measurement, and the no-cloning theorem. These differences must be thoroughly understood and even... computers using well-known languages such as C, C++, Java, and rapid prototyping languages such as Maple, Mathematica, and Matlab. A good on-line
The ALL-OUT Library; A Design for Computer-Powered, Multidimensional Services.
ERIC Educational Resources Information Center
Sleeth, Jim; LaRue, James
1983-01-01
Preliminary description of design of electronic library and home information delivery system highlights potentials of personal computer interface program (applying for service, assuring that users are valid, checking for measures, searching, locating titles) and incorporation of concepts used in other information systems (security checks,…
Bibliography. Citations Obtained through the National Library of Medicine's MEDLARS Program.
ERIC Educational Resources Information Center
Journal of Medical Education, 1980
1980-01-01
A bibliography from the National Library of Medicine's MEDLARS Program covers: accreditation, certification and licensure; computers; continuing education; curriculum; educational measurement; faculty; forensic medicine; history; internship and residency; medical education in other countries; minority groups, sex and age factors; and premedical…
Computer program for analysis of coupled-cavity traveling wave tubes
NASA Technical Reports Server (NTRS)
Connolly, D. J.; Omalley, T. A.
1977-01-01
A flexible, accurate, large signal computer program was developed for the design of coupled cavity traveling wave tubes. The program is written in FORTRAN IV for an IBM 360/67 time sharing system. The beam is described by a disk model and the slow wave structure by a sequence of cavities, or cells. The computational approach is arranged so that each cavity may have geometrical or electrical parameters different from those of its neighbors. This allows the program user to simulate a tube of almost arbitrary complexity. Input and output couplers, severs, complicated velocity tapers, and other features peculiar to one or a few cavities may be modeled by a correct choice of input data. The beam-wave interaction is handled by an approach in which the radio frequency fields are expanded in solutions to the transverse magnetic wave equation. All significant space harmonics are retained. The program was used to perform a design study of the traveling-wave tube developed for the Communications Technology Satellite. Good agreement was obtained between the predictions of the program and the measured performance of the flight tube.
NASA Technical Reports Server (NTRS)
Tibbetts, J. G.
1980-01-01
Detailed instructions for using the near field cruise noise prediction program, a program listing, and a sample case with output are presented. The total noise for free field lossless conditions at selected observer locations is obtained by summing the contributions from up to nine acoustic sources. These noise sources, selected at the user's option, include the fan/compressor, turbine, core (combustion), jet, shock, and airframe (trailing edge and turbulent boundary layers). The effects of acoustic suppression materials such as engine inlet treatment may also be included in the noise prediction. The program is available for use on the NASA/Langley Research Center CDC computer. Comparisons of the program predictions with measured data are also given, and some possible reasons for their lack of agreement are presented.
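Summing contributions from several independent sources, as the program does for its nine acoustic sources, is done on a mean-square (energy) basis rather than by adding decibel levels directly. A minimal sketch with illustrative levels:

```python
import math

def total_noise_level(source_levels_db):
    """Combine incoherent acoustic sources: convert each level to a
    mean-square pressure ratio, add, and convert back to decibels."""
    total = sum(10.0 ** (level / 10.0) for level in source_levels_db)
    return 10.0 * math.log10(total)

# Illustrative fan, turbine, core, jet, shock, and airframe levels (dB)
# at one observer location; these are not values from the report.
levels = [92.0, 88.0, 85.0, 95.0, 80.0, 83.0]
overall = total_noise_level(levels)
```

Two equal sources combine to a level 3 dB above either one, which is the familiar sanity check for this kind of summation.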
NASA Technical Reports Server (NTRS)
Whitaker, Mike
1991-01-01
Severe precipitation static problems affecting the communication equipment onboard the P-3B aircraft were recently studied. The study was conducted after precipitation static created potential safety-of-flight problems on Naval Reserve aircraft. A specially designed flight test program was conducted in order to measure, record, analyze, and characterize potential precipitation static problem areas. The test program successfully characterized the precipitation static interference problems while the P-3B was flown in moderate to extreme precipitation conditions. Data up to 400 MHz were collected on the effects of engine charging, precipitation static, and extreme cross fields. These data were collected using a computer controlled acquisition system consisting of a signal generator, RF spectrum and audio analyzers, data recorders, and instrumented static dischargers. The test program is outlined, and the computer-controlled data acquisition system used during flight and ground testing is described in detail. The correlation between results recorded during the flight test program and those measured during ground testing is also discussed.
Higgins, Eleanor L; Raskind, Marshall H
2004-12-01
This study was conducted to assess the effectiveness of two programs developed by the Frostig Center Research Department to improve the reading and spelling of students with learning disabilities (LD): a computer Speech Recognition-based Program (SRBP) and a computer and text-based Automaticity Program (AP). Twenty-eight LD students with reading and spelling difficulties (aged 8 to 18) received each program for 17 weeks and were compared with 16 students in a contrast group who did not receive either program. After adjusting for age and IQ, both the SRBP and AP groups showed significant differences over the contrast group in improving word recognition and reading comprehension. Neither program showed significant differences over contrasts in spelling. The SRBP also improved the performance of the target group when compared with the contrast group on phonological elision and nonword reading efficiency tasks. The AP showed significant differences in all process and reading efficiency measures.
Physical Medicine and Rehabilitation Resident Use of iPad Mini Mobile Devices.
Niehaus, William; Boimbo, Sandra; Akuthota, Venu
2015-05-01
Previous research on the use of tablet devices in residency programs has been undertaken in radiology and medicine or with standard-sized tablet devices. With new, smaller tablet devices, there is an opportunity to assess their effect on resident behavior. This prospective study attempts to evaluate resident behavior after receiving a smaller tablet device. To evaluate whether smaller tablet computers facilitate residents' daily tasks. Prospective study that administered surveys to evaluate tablet computer use. Residency program. Thirteen physical medicine and rehabilitation residents. Residents were provided 16-GB iPad Minis and surveyed using REDCap to collect usage information at baseline, 3, and 6 months. Survey analysis was conducted using SAS (SAS Institute, Cary, NC) for descriptive analysis. To evaluate multiple areas of resident education, the following tasks were selected: accessing e-mail, logging duty hours, logging procedures, researching clinical information, accessing medical journals, reviewing didactic presentations, and completing evaluations. Then, measurements were taken of: (1) residents' response to how tablet computers made it easier to access the aforementioned tasks; and (2) residents' response to how tablet computers affected the frequency they performed the aforementioned tasks. After being provided tablet computers, our physical medicine and rehabilitation residents reported significantly greater access to e-mail, medical journals, and didactic material. Also, receiving tablet computers was reported to increase the frequency that residents accessed e-mail, researched clinical information, accessed medical journals, reviewed didactic presentations, and completed evaluations. After receiving a tablet computer, residents reported an increase in the use of calendar programs, note-taking programs, PDF readers, online storage programs, and file organization programs.
These physical medicine and rehabilitation residents reported tablet computers increased access to e-mail, presentation material, and medical journals. Tablet computers also were reported to increase the frequency residents were able to complete tasks associated with residency training. Copyright © 2015 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.
Pocket computers: a new aid to nutritional support.
Colley, C M; Fleck, A; Howard, J P
1985-01-01
A program has been written to run on a pocket computer (Sharp PC-1500) that can be used at the bedside to predict the nutritional requirements of patients with a wide range of clinical conditions. The predictions of the program showed good correlation with measured values for energy and nitrogen requirements. The program was used, with good results, in the management of over 100 patients needing nutritional support. The calculation of nutritional requirements for each patient individually facilitates more appropriate treatment and may also produce financial savings when compared with administration of a standard feeding regimen to all patients. PMID:3922512
Computation of records of streamflow at control structures
Collins, Dannie L.
1977-01-01
Traditional methods of computing streamflow records on large, low-gradient streams require a continuous record of water-surface slope over a natural channel reach. This slope must be of sufficient magnitude to be accurately measured with available stage measuring devices. On highly regulated streams, this slope approaches zero during periods of low flow and accurate measurement is difficult. Methods are described to calibrate multipurpose regulating control structures to more accurately compute streamflow records on highly-regulated streams. Hydraulic theory, assuming steady, uniform flow during a computational interval, is described for five different types of flow control. The controls are: Tainter gates, hydraulic turbines, fixed spillways, navigation locks, and crest gates. Detailed calibration procedures are described for the five different controls as well as for several flow regimes for some of the controls. The instrumentation package and computer programs necessary to collect and process the field data are discussed. Two typical calibration procedures and measurement data are presented to illustrate the accuracy of the methods. (Woodard-USGS)
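The report's calibrations are structure-specific, but the flavor of the approach can be sketched with a generic orifice-type rating for flow under a gate, Q = Cd·b·hg·√(2gH), applied over steady computational intervals. The coefficient and dimensions below are illustrative assumptions, not values from the report:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def gate_discharge(cd, width, gate_opening, upstream_head):
    """Generic orifice-type rating for submerged flow under a gate:
    Q = Cd * b * hg * sqrt(2 g H). Cd is calibrated per structure."""
    return cd * width * gate_opening * math.sqrt(2.0 * G * upstream_head)

def interval_mean_flow(openings, heads, cd=0.7, width=12.0):
    """Steady, uniform flow is assumed within each computational interval;
    the period mean is the average of the interval discharges."""
    flows = [gate_discharge(cd, width, hg, h)
             for hg, h in zip(openings, heads)]
    return sum(flows) / len(flows)
```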
CBT for depression: a pilot RCT comparing mobile phone vs. computer.
Watts, Sarah; Mackenzie, Anna; Thomas, Cherian; Griskaitis, Al; Mewton, Louise; Williams, Alishia; Andrews, Gavin
2013-02-07
This paper reports the results of a pilot randomized controlled trial comparing the delivery modality (mobile phone/tablet or fixed computer) of a cognitive behavioural therapy intervention for the treatment of depression. The aim was to establish whether a previously validated computerized program (The Sadness Program) remained efficacious when delivered via a mobile application. 35 participants were recruited with Major Depression (80% female) and randomly allocated to access the program using a mobile app (on either a mobile phone or iPad) or a computer. Participants completed 6 lessons, weekly homework assignments, and received weekly email contact from a clinical psychologist or psychiatrist until completion of lesson 2. After lesson 2 email contact was only provided in response to participant request, or in response to a deterioration in psychological distress scores. The primary outcome measure was the Patient Health Questionnaire 9 (PHQ-9). Of the 35 participants recruited, 68.6% completed 6 lessons and 65.7% completed the 3-month follow-up. Attrition was handled using mixed-model repeated-measures ANOVA. Both the Mobile and Computer Groups were associated with statistically significant benefits in the PHQ-9 at post-test. At the 3-month follow-up, the reduction seen for both groups remained significant. These results provide evidence to indicate that delivering a CBT program using a mobile application can result in clinically significant improvements in outcomes for patients with depression. Australian New Zealand Clinical Trials Registry ACTRN 12611001257954.
Computed gray levels in multislice and cone-beam computed tomography.
Azeredo, Fabiane; de Menezes, Luciane Macedo; Enciso, Reyes; Weissheimer, Andre; de Oliveira, Rogério Belle
2013-07-01
Gray level is the range of shades of gray in the pixels, representing the x-ray attenuation coefficient that allows for tissue density assessments in computed tomography (CT). An in-vitro study was performed to investigate the relationship between computed gray levels in 3 cone-beam CT (CBCT) scanners and 1 multislice spiral CT device using 5 software programs. Six materials (air, water, wax, acrylic, plaster, and gutta-percha) were scanned with the CBCT and CT scanners, and the computed gray levels for each material at predetermined points were measured with OsiriX Medical Imaging software (Geneva, Switzerland), OnDemand3D (CyberMed International, Seoul, Korea), E-Film (Merge Healthcare, Milwaukee, Wis), Dolphin Imaging (Dolphin Imaging & Management Solutions, Chatsworth, Calif), and InVivo Dental Software (Anatomage, San Jose, Calif). The repeatability of these measurements was calculated with intraclass correlation coefficients, and the gray levels were averaged to represent each material. Repeated analysis of variance tests were used to assess the differences in gray levels among scanners and materials. There were no differences in mean gray levels with the different software programs. There were significant differences in gray levels between scanners for each material evaluated (P <0.001). The software programs were reliable and had no influence on the CT and CBCT gray level measurements. However, the gray levels might have discrepancies when different CT and CBCT scanners are used. Therefore, caution is essential when interpreting or evaluating CBCT images because of the significant differences in gray levels between different CBCT scanners, and between CBCT and CT values. Copyright © 2013 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
Study to design and develop remote manipulator system
NASA Technical Reports Server (NTRS)
Hill, J. W.; Sword, A. J.
1973-01-01
Human performance measurement techniques for remote manipulation tasks and remote sensing techniques for manipulators are described. For common manipulation tasks, performance is monitored by means of an on-line computer capable of measuring the joint angles of both master and slave arms as a function of time. The computer programs allow measurements of the operator's strategy and physical quantities such as task time and power consumed. The results are printed out after a test run to compare different experimental conditions. For tracking tasks, we describe a method of displaying errors in three dimensions and measuring the end-effector position in three dimensions.
Development of weight/sizing design synthesis computer program. Volume 3: User Manual
NASA Technical Reports Server (NTRS)
Garrison, J. M.
1973-01-01
The user manual for the weight/sizing design synthesis program is presented. The program is applied to an analysis of the basic weight relationships for the space shuttle which contribute significant portions of the inert weight. The relationships measure the parameters of load, geometry, material, and environment. A verbal description of the processes simulated, data input procedures, output data, and values present in the program is included.
A Computerised English Language Proofing Cloze Program.
ERIC Educational Resources Information Center
Coniam, David
1997-01-01
Describes a computer program that takes multiple-choice cloze passages and compiles them into proofreading exercises. Results reveal that such a computerized test type can be used to accurately measure the proficiency of students of English as a Second Language in Hong Kong. (14 references) (Author/CK)
NASA Astrophysics Data System (ADS)
Wilcox, William Edward, Jr.
1995-01-01
A computer program (LIDAR-PC) and associated atmospheric spectral databases have been developed which accurately simulate the laser remote sensing of the atmosphere and the system performance of a direct-detection Lidar or tunable Differential Absorption Lidar (DIAL) system. This simulation program allows, for the first time, the use of several different large atmospheric spectral databases to be coupled with Lidar parameter simulations on the same computer platform to provide a real-time, interactive, and easy-to-use design tool for atmospheric Lidar simulation and modeling. LIDAR-PC has been used for a range of different Lidar simulations and compared to experimental Lidar data. In general, the simulations agreed very well with the experimental measurements. In addition, the simulation offered, for the first time, the analysis and comparison of experimental Lidar data to easily determine the range-resolved attenuation coefficient of the atmosphere and the effect of telescope overlap factor. The software and databases operate on an IBM-PC or compatible computer platform, and thus are very useful to the research community for Lidar analysis. The complete Lidar and atmospheric spectral transmission modeling program uses the HITRAN database for high-resolution molecular absorption lines of the atmosphere, the BACKSCAT/LOWTRAN computer databases and models for the effects of aerosol and cloud backscatter and attenuation, and the range-resolved Lidar equation. The program can calculate the Lidar backscattered signal-to-noise for a slant path geometry from space and simulate the effect of high resolution, tunable, single frequency, and moderate line width lasers on the Lidar/DIAL signal. The program was used to model and analyze the experimental Lidar data obtained from several measurements.
A fixed wavelength, Ho:YSGG aerosol Lidar (Sugimoto, 1990) developed at USF and a tunable Ho:YSGG DIAL system (Cha, 1991) for measuring atmospheric water vapor at 2.1 μm were analyzed. The simulations agreed very well with the measurements, and also yielded, for the first time, the ability to easily deduce the atmospheric attenuation coefficient, alpha, from the Lidar data. Simulations and analysis of other Lidar measurements included that of a 1.57 μm OPO aerosol Lidar system developed at USF (Harrell, 1995) and of the NASA LITE (Laser-in-Space Technology Experiment) Lidar recently flown in the Space Shuttle. Finally, an extensive series of laboratory experiments was conducted with the 1.57 μm OPO Lidar system to test calculations of the telescope/laser overlap and the effect of different telescope sizes and designs. The simulations agreed well with the experimental data for the telescope diameter and central obscuration test cases. The LIDAR-PC programs are available on the Internet from the USF Lidar Home Page Web site, http://www.cas.usf.edu/physics/lidar.html/.
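The range-resolved lidar equation at the core of such a simulation can be sketched as follows. This is the standard single-scatter form, P(R) ∝ O(R)·(A/R²)·β(R)·exp(−2∫α dr); all parameter values are illustrative, and the actual LIDAR-PC program adds database-driven molecular and aerosol terms from HITRAN and BACKSCAT/LOWTRAN:

```python
import numpy as np

def lidar_return(ranges, beta, alpha, p0=1.0, area=0.1, overlap=None):
    """Single-scatter, range-resolved lidar equation:
    P(R) = P0 * O(R) * (A / R^2) * beta(R) * exp(-2 * integral of alpha)."""
    if overlap is None:
        overlap = np.ones_like(ranges)           # full telescope/laser overlap
    dr = np.diff(ranges, prepend=0.0)
    two_way_transmission = np.exp(-2.0 * np.cumsum(alpha * dr))
    return p0 * overlap * (area / ranges ** 2) * beta * two_way_transmission

R = np.linspace(100.0, 5000.0, 50)     # range gates (m)
beta = np.full_like(R, 1e-6)           # backscatter coefficient (1/m/sr)
alpha = np.full_like(R, 1e-4)          # attenuation coefficient (1/m)
signal = lidar_return(R, beta, alpha)
```

For a homogeneous atmosphere the return falls off with both 1/R² geometry and two-way attenuation, which is exactly the dependence used to invert measured returns for alpha.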
Ground temperature measurement by PRT-5 for maps experiment
NASA Technical Reports Server (NTRS)
Gupta, S. K.; Tiwari, S. N.
1978-01-01
A simple algorithm and computer program were developed for determining the actual surface temperature from the effective brightness temperature as measured remotely by a radiation thermometer called PRT-5. This procedure allows the computation of atmospheric correction to the effective brightness temperature without performing detailed radiative transfer calculations. Model radiative transfer calculations were performed to compute atmospheric corrections for several values of the surface and atmospheric parameters individually and in combination. Polynomial regressions were performed between the magnitudes or deviations of these parameters and the corresponding computed corrections to establish simple analytical relations between them. Analytical relations were also developed to represent combined correction for simultaneous variation of parameters in terms of their individual corrections.
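The regression shortcut described above, replacing a full radiative-transfer run with a fitted polynomial, can be sketched briefly. The training values below are invented for illustration; in the actual procedure the corrections came from model radiative transfer calculations:

```python
import numpy as np

# Hypothetical training set: deviations of one atmospheric parameter from
# its reference value versus the radiative-transfer-computed correction (K).
param_deviation = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
computed_correction = np.array([-0.9, -0.42, 0.0, 0.38, 0.85])

# Fit a low-order polynomial relating parameter deviation to correction.
coeffs = np.polyfit(param_deviation, computed_correction, deg=2)

def corrected_surface_temp(brightness_temp, deviation):
    """Apply the regression-based atmospheric correction to the effective
    brightness temperature, avoiding a full radiative-transfer calculation."""
    return brightness_temp + np.polyval(coeffs, deviation)
```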
Medical education as a science: the quality of evidence for computer-assisted instruction.
Letterie, Gerard S
2003-03-01
A marked increase in the number of computer programs for computer-assisted instruction in the medical sciences has occurred over the past 10 years. The quality of both the programs and the literature that describe these programs has varied considerably. The purposes of this study were to evaluate the published literature that described computer-assisted instruction in medical education and to assess the quality of evidence for its implementation, with particular emphasis on obstetrics and gynecology. Reports published between 1988 and 2000 on computer-assisted instruction in medical education were identified through a search of MEDLINE and Educational Resource Identification Center and a review of the bibliographies of the articles that were identified. Studies were selected if they included a description of computer-assisted instruction in medical education, regardless of the type of computer program. Data were extracted with a content analysis of 210 reports. The reports were categorized according to study design (comparative, prospective, descriptive, review, or editorial), type of computer-assisted instruction, medical specialty, and measures of effectiveness. Computer-assisted instruction programs included online technologies, CD-ROMs, video laser disks, multimedia work stations, virtual reality, and simulation testing. Studies were identified in all medical specialties, with a preponderance in internal medicine, general surgery, radiology, obstetrics and gynecology, pediatrics, and pathology. Ninety-six percent of the articles described a favorable impact of computer-assisted instruction in medical education, regardless of the quality of the evidence. Of the 210 reports that were identified, 60% were noncomparative, descriptive reports of new techniques in computer-assisted instruction, and 15% and 14% were reviews and editorials, respectively, of existing technology. 
Eleven percent of studies were comparative and included some form of assessment of the effectiveness of the computer program. These assessments included pre- and posttesting and questionnaires to score program quality, perceptions of the medical students and/or residents regarding the program, and impact on learning. In one half of these comparative studies, computer-assisted instruction was compared with traditional modes of teaching, such as text and lectures. Six studies compared performance before and after the computer-assisted instruction. Improvements were shown in 5 of the studies. In the remainder of the studies, computer-assisted instruction appeared to result in similar test performance. Regardless of study design or outcome, most articles described enthusiastic endorsement of the programs by the participants, including medical students, residents, and practicing physicians. Only 1 study included cost analysis. Thirteen of the articles were in obstetrics and gynecology. Computer-assisted instruction has assumed an increasing role in medical education. In spite of enthusiastic endorsement and continued improvements in software, few studies of good design clearly demonstrate improvement in medical education over traditional modalities. There are no comparative studies in obstetrics and gynecology that demonstrate a clear-cut advantage. Future studies of computer-assisted instruction that include comparisons and cost assessments to gauge their effectiveness over traditional methods may better define their precise role.
Measurement and Precision, Experimental Version.
ERIC Educational Resources Information Center
Harvard Univ., Cambridge, MA. Harvard Project Physics.
This document is an experimental version of a programed text on measurement and precision. Part I contains 24 frames dealing with precision and significant figures encountered in various mathematical computations and measurements. Part II begins with a brief section on experimental data, covering such points as (1) establishing the zero point, (2)…
Gardner, William; Morton, Suzanne; Byron, Sepheen C; Tinoco, Aldo; Canan, Benjamin D; Leonhart, Karen; Kong, Vivian; Scholle, Sarah Hudson
2014-01-01
Objective: To determine whether quality measures based on computer-extracted EHR data can reproduce findings based on data manually extracted by reviewers. Data Sources: We studied 12 measures of care indicated for adolescent well-care visits for 597 patients in three pediatric health systems. Study Design: Observational study. Data Collection/Extraction Methods: Manual reviewers collected quality data from the EHR. Site personnel programmed their EHR systems to extract the same data from structured fields in the EHR according to national health IT standards. Principal Findings: Overall performance measured via computer-extracted data was 21.9 percent, compared with 53.2 percent for manual data. Agreement measures were high for immunizations. Otherwise, agreement between computer extraction and manual review was modest (Kappa = 0.36) because computer-extracted data frequently missed care events (sensitivity = 39.5 percent). Measure validity varied by health care domain and setting. A limitation of our findings is that we studied only three domains and three sites. Conclusions: The accuracy of computer-extracted EHR quality reporting depends on the use of structured data fields, with the highest agreement found for measures and in the setting that had the greatest concentration of structured fields. We need to improve documentation of care, data extraction, and adaptation of EHR systems to practice workflow. PMID:24471935
Processing EOS MLS Level-2 Data
NASA Technical Reports Server (NTRS)
Snyder, W. Van; Wu, Dong; Read, William; Jiang, Jonathan; Wagner, Paul; Livesey, Nathaniel; Schwartz, Michael; Filipiak, Mark; Pumphrey, Hugh; Shippony, Zvi
2006-01-01
A computer program performs level-2 processing of thermal-microwave-radiance data from observations of the limb of the Earth by the Earth Observing System (EOS) Microwave Limb Sounder (MLS). The purpose of the processing is to estimate the composition and temperature of the atmosphere versus altitude from 8 to 90 km. "Level-2" as used here is a specialist's term signifying both vertical profiles of geophysical parameters along the measurement track of the instrument and processing performed by this or other software to generate such profiles. Designed to be flexible, the program is controlled via a configuration file that defines all aspects of processing, including contents of state and measurement vectors, configurations of forward models, measurement and calibration data to be read, and the manner of inverting the models to obtain the desired estimates. The program can operate in a parallel form in which one instance of the program acts as a master, coordinating the work of multiple slave instances on a cluster of computers, each slave operating on a portion of the data. Optionally, the configuration file can be made to instruct the software to produce files of simulated radiances based on state vectors formed from sets of geophysical data-product files taken as input.
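The master/slave arrangement can be sketched as follows. The real system distributes chunks of the measurement track across a cluster of computers; this toy version uses threads and a trivial stand-in "retrieval" purely to show the pattern, and every name in it is hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor

def retrieve_profile(chunk):
    """Stand-in for inverting the forward model on one chunk of limb scans;
    here it just reduces each scan to a single number."""
    return [sum(scan) / len(scan) for scan in chunk]

def process_level2(scans, n_workers=4):
    """Master/worker pattern: the master splits the measurement track into
    chunks and each worker processes its portion independently."""
    size = max(1, len(scans) // n_workers)
    chunks = [scans[i:i + size] for i in range(0, len(scans), size)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(retrieve_profile, chunks)   # order is preserved
    return [profile for chunk_result in results for profile in chunk_result]
```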
Additional Support for the Information Systems Analyst Exam as a Valid Program Assessment Tool
ERIC Educational Resources Information Center
Carpenter, Donald A.; Snyder, Johnny; Slauson, Gayla Jo; Bridge, Morgan K.
2011-01-01
This paper presents a statistical analysis to support the notion that the Information Systems Analyst (ISA) exam can be used as a program assessment tool in addition to measuring student performance. It compares ISA exam scores earned by students in one particular Computer Information Systems program with scores earned by the same students on the…
Three-dimensional transonic potential flow about complex 3-dimensional configurations
NASA Technical Reports Server (NTRS)
Reyhner, T. A.
1984-01-01
An analysis has been developed and a computer code written to predict three-dimensional subsonic or transonic potential flow fields about lifting or nonlifting configurations. Possible configurations include inlets, nacelles, nacelles with ground planes, S-ducts, turboprop nacelles, wings, and wing-pylon-nacelle combinations. The solution of the full partial differential equation for compressible potential flow written in terms of a velocity potential is obtained using finite differences, line relaxation, and multigrid. The analysis uses either a cylindrical or Cartesian coordinate system. The computational mesh is not body fitted. The analysis has been programmed in FORTRAN for both the CDC CYBER 203 and the CRAY-1 computers. Comparisons of computed results with experimental measurements are presented. Descriptions of the program input and output formats are included.
Edwards, Roger L; Edwards, Sandra L; Bryner, James; Cunningham, Kelly; Rogers, Amy; Slattery, Martha L
2008-04-01
We describe a computer-assisted data collection system developed for a multicenter cohort study of American Indian and Alaska Native people. The study computer-assisted participant evaluation system or SCAPES is built around a central database server that controls a small private network with touch screen workstations. SCAPES encompasses the self-administered questionnaires, the keyboard-based stations for interviewer-administered questionnaires, a system for inputting medical measurements, and administrative tasks such as data exporting, backup and management. Elements of SCAPES hardware/network design, data storage, programming language, software choices, questionnaire programming including the programming of questionnaires administered using audio computer-assisted self-interviewing (ACASI), and participant identification/data security system are presented. Unique features of SCAPES are that data are promptly made available to participants in the form of health feedback; data can be quickly summarized for tribes for health monitoring and planning at the community level; and data are available to study investigators for analyses and scientific evaluation.
Free oscilloscope web app using a computer mic, built-in sound library, or your own files
NASA Astrophysics Data System (ADS)
Ball, Edward; Ruiz, Frances; Ruiz, Michael J.
2017-07-01
We have developed an online oscilloscope program which allows users to see waveforms by utilizing their computer microphones, selecting from our library of over 30 audio files, and opening any *.mp3 or *.wav file on their computers. The oscilloscope displays real-time signals against time. The oscilloscope has been calibrated so one can make accurate frequency measurements of periodic waves to within 1%. The web app is ideal for computer projection in class.
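The app's internals are not described, but a common way to measure the frequency of a periodic wave from microphone samples is to locate the peak of the FFT magnitude spectrum, which comfortably meets the stated 1% accuracy for steady tones. A sketch under that assumption:

```python
import numpy as np

def estimate_frequency(samples, sample_rate):
    """Estimate the dominant frequency of a periodic signal from the peak
    of its magnitude spectrum. Resolution is sample_rate / len(samples)."""
    spectrum = np.abs(np.fft.rfft(samples))
    spectrum[0] = 0.0                      # ignore the DC component
    peak_bin = int(np.argmax(spectrum))
    return peak_bin * sample_rate / len(samples)

# One second of a 440 Hz test tone sampled at 44.1 kHz (1 Hz resolution).
rate = 44100
t = np.arange(rate) / rate
tone = np.sin(2 * np.pi * 440.0 * t)
```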
Estimating normal mixture parameters from the distribution of a reduced feature vector
NASA Technical Reports Server (NTRS)
Guseman, L. F.; Peters, B. C., Jr.; Swasdee, M.
1976-01-01
A FORTRAN computer program was written and tested. The measurements consisted of 1000 randomly chosen vectors representing 1, 2, 3, 7, and 10 subclasses in equal portions. In the first experiment, the vectors are computed from the input means and covariances. In the second experiment, the vectors are 16 channel measurements. The starting covariances were constructed as if there were no correlation between separate passes. The biases obtained from each run are listed.
Digital image processing of vascular angiograms
NASA Technical Reports Server (NTRS)
Selzer, R. H.; Blankenhorn, D. H.; Beckenbach, E. S.; Crawford, D. W.; Brooks, S. H.
1975-01-01
A computer image processing technique was developed to estimate the degree of atherosclerosis in the human femoral artery. With an angiographic film of the vessel as input, the computer was programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements were combined into an atherosclerosis index, which was found to correlate well with both visual and chemical estimates of atherosclerotic disease.
ERIC Educational Resources Information Center
Texas Education Agency, Austin. Div. of Educational Assessment.
This document lists the objectives for the Texas educational assessment program in mathematics. Eighteen objectives for exit level mathematics are listed, by category: number concepts (4); computation (3); applied computation (5); statistical concepts (3); geometric concepts (2); and algebraic concepts (1). Then general specifications are listed…
ERIC Educational Resources Information Center
Blikstein, Paulo; Worsley, Marcelo
2016-01-01
New high-frequency multimodal data collection technologies and machine learning analysis techniques could offer new insights into learning, especially when students have the opportunity to generate unique, personalized artifacts, such as computer programs, robots, and solutions to engineering challenges. To date, most of the work on learning analytics…
Performance Measures in Courses Using Computer-Aided Personalized System of Instruction
ERIC Educational Resources Information Center
Springer, C. R.; Pear, J. J.
2008-01-01
Archived data from four courses taught with computer-aided personalized system of instruction (CAPSI)--an online, self-paced, instructional program--were used to explore the relationship between objectively rescored final exam grades, peer reviewing, and progress rate--i.e., the rate at which students completed unit tests. There was a strong…
A Scheme for Text Analysis Using Fortran.
ERIC Educational Resources Information Center
Koether, Mary E.; Coke, Esther U.
Using string-manipulation algorithms, FORTRAN computer programs were designed for analysis of written material. The programs measure length of a text and its complexity in terms of the average length of words and sentences, map the occurrences of keywords or phrases, calculate word frequency distribution and certain indicators of style. Trials of…
Quest: The Interactive Test Analysis System.
ERIC Educational Resources Information Center
Adams, Raymond J.; Khoo, Siek-Toon
The Quest program offers a comprehensive test and questionnaire analysis environment, providing the data analyst with access to the most recent developments in Rasch measurement theory as well as a range of traditional analysis procedures. This manual helps the user use Quest to construct and validate variables based on…
Computer-assisted uncertainty assessment of k0-NAA measurement results
NASA Astrophysics Data System (ADS)
Bučar, T.; Smodiš, B.
2008-10-01
In quantifying measurement uncertainty of results obtained by the k0-based neutron activation analysis (k0-NAA), a number of parameters should be considered and appropriately combined in deriving the final budget. To facilitate this process, a program ERON (ERror propagatiON) was developed, which computes uncertainty propagation factors from the relevant formulae and calculates the combined uncertainty. The program calculates the uncertainty of the final result (the mass fraction of an element in the measured sample) taking into account the relevant neutron flux parameters such as α and f, including their uncertainties. Nuclear parameters and their uncertainties are taken from the IUPAC database (V.P. Kolotov and F. De Corte, Compilation of k0 and related data for NAA). Furthermore, the program allows for uncertainty calculations of the measured parameters needed in k0-NAA: α (determined with either the Cd-ratio or the Cd-covered multi-monitor method), f (using the Cd-ratio or the bare method), Q0 (using the Cd-ratio or internal comparator method) and k0 (using the Cd-ratio, internal comparator or the Cd-subtraction method). The results of calculations can be printed or exported to text or MS Excel format for further analysis. Special care was taken to make the calculation engine portable so that it can be incorporated into other applications (e.g., a DLL or WWW server). The theoretical basis and the program are described in detail, and typical results obtained under real measurement conditions are presented.
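The combination step that ERON automates follows the usual law of propagation of uncertainty for uncorrelated inputs: each relative uncertainty is weighted by its propagation factor and the results are summed in quadrature. A minimal sketch, with an entirely hypothetical uncertainty budget (the factors and values below are illustrative, not taken from the paper):

```python
import math

def combined_uncertainty(budget):
    """Combined relative uncertainty for uncorrelated inputs:
    quadrature sum of each relative uncertainty weighted by its
    propagation (sensitivity) factor."""
    return math.sqrt(sum((factor * u) ** 2 for factor, u in budget))

# Hypothetical budget: (propagation factor, relative uncertainty in %)
budget = [
    (1.0, 1.2),  # net peak area, sample
    (1.0, 0.8),  # net peak area, comparator
    (1.0, 0.5),  # k0 factor
    (0.3, 2.0),  # flux ratio f
    (0.1, 5.0),  # epithermal shape factor alpha
]
u_c = combined_uncertainty(budget)  # combined relative uncertainty, %
```

Note how the small propagation factors on f and α damp large input uncertainties, which is exactly why computing the factors (rather than naively adding all uncertainties) matters.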
Application-Program-Installer Builder
NASA Technical Reports Server (NTRS)
Wolgast, Paul; Demore, Martha; Lowik, Paul
2007-01-01
A computer program builds application programming interfaces (APIs) and related software components for installing and uninstalling application programs in any of a variety of computers and operating systems that support the Java programming language in its binary form. This program is partly similar in function to commercial software (e.g., InstallShield). This program is intended to enable satisfaction of a quasi-industry-standard set of requirements for a set of APIs that would enable such installation and uninstallation and that would avoid the pitfalls that are commonly encountered during installation of software. The requirements include the following: 1) Properly detecting prerequisites to an application program before performing the installation; 2) Properly registering component requirements; 3) Correctly measuring the required hard-disk space, including accounting for prerequisite components that have already been installed; and 4) Correctly uninstalling an application program. Correct uninstallation includes (1) detecting whether any component of the program to be removed is required by another program, (2) not removing that component, and (3) deleting references to requirements of the to-be-removed program for components of other programs so that those components can be properly removed at a later time.
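Requirement 3 above (measuring required disk space while crediting prerequisites that are already installed) reduces to simple set arithmetic over the component list. A minimal sketch with hypothetical component names and sizes:

```python
def required_space(components, installed):
    """Disk space needed for an install: total of all component sizes,
    crediting prerequisite components that are already present."""
    return sum(size for name, size in components.items()
               if name not in installed)

# Hypothetical component sizes in MB
components = {"app-core": 120, "java-runtime": 85, "help-files": 10}
needed = required_space(components, installed={"java-runtime"})  # 130 MB
```

The same bookkeeping, run in reverse over reference counts, supports requirement 4: a shared component is removed only when no remaining program still requires it.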
Bradley, D. Nathan
2012-01-01
The slope-area method is a technique for estimating the peak discharge of a flood after the water has receded (Dalrymple and Benson, 1967). This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks (HWMs) on trees or buildings. These indicators of flood stage are combined with measurements of the cross-sectional geometry of the stream, estimates of channel roughness, and a mathematical model that balances the total energy of the flow between cross sections. This is in contrast to a “direct” measurement of discharge during the flood, where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a gage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the stream gage, resulting in more accurate computation of high flows. The Slope-Area Computation program (SAC; Fulford, 1994) is an implementation of the slope-area method that computes a peak-discharge estimate from inputs of water-surface slope (from surveyed HWMs), channel geometry, and estimated channel roughness. SAC is a command line program written in Fortran that reads input data from a formatted text file and prints results to another formatted text file. Preparing the input file can be time-consuming and prone to errors. This document describes the SAC graphical user interface (GUI), a cross-platform “wrapper” application that prepares the SAC input file, executes the program, and helps the user interpret the output. The SAC GUI is an update and enhancement of the slope-area method (SAM; Hortness, 2004; Berenbrock, 1996), an earlier spreadsheet tool used to aid field personnel in the completion of a slope-area measurement.
The SAC GUI reads survey data; develops a plan-view plot, a water-surface profile, and cross-section plots; and generates the SAC input file. The SAC GUI also generates HEC-2 files that can be imported into HEC-RAS.
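The core slope-area computation that SAC implements can be sketched as follows. This is a deliberately simplified first approximation in US customary units: it takes the friction slope as fall over reach length and ignores the velocity-head and expansion-loss terms that the full procedure iterates on, and the surveyed quantities are hypothetical:

```python
import math

def conveyance(area, wetted_perimeter, n):
    """Manning conveyance K = (1.486/n) * A * R^(2/3), US units:
    area in ft^2, wetted perimeter in ft, n dimensionless."""
    r = area / wetted_perimeter  # hydraulic radius, ft
    return (1.486 / n) * area * r ** (2.0 / 3.0)

def slope_area_discharge(k_up, k_down, fall, reach_length):
    """First-approximation peak discharge Q = K_geom * sqrt(S),
    using the geometric mean of the two cross-section conveyances
    and S = fall / reach length (velocity head neglected)."""
    k_geom = math.sqrt(k_up * k_down)
    slope = fall / reach_length
    return k_geom * math.sqrt(slope)

# Hypothetical surveyed reach: two cross sections, fall from HWMs
k1 = conveyance(area=850.0, wetted_perimeter=120.0, n=0.035)
k2 = conveyance(area=910.0, wetted_perimeter=128.0, n=0.035)
q = slope_area_discharge(k1, k2, fall=1.8, reach_length=600.0)  # cfs
```

In the full method the friction slope is corrected for the change in velocity head between sections, which is why SAC iterates rather than computing Q in one pass.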
Visualizing ultrasound through computational modeling
NASA Technical Reports Server (NTRS)
Guo, Theresa W.
2004-01-01
The Doppler Ultrasound Hematocrit Project (DHP) hopes to find non-invasive methods of determining a person's blood characteristics. Because of the limits of microgravity and the space travel environment, it is important to find non-invasive methods of evaluating the health of persons in space. Presently, there is no well-developed method of determining blood composition non-invasively. This project hopes to use ultrasound and Doppler signals to evaluate hematocrit, the percentage by volume of red blood cells within whole blood. These non-invasive techniques may also be developed for use on earth for trauma patients, where invasive measures might be detrimental. Computational modeling is a useful tool for collecting preliminary information and predictions for the laboratory research. We hope to find and develop a computer program that will be able to simulate the ultrasound signals the project will work with. Simulated models of test conditions will more easily show what might be expected from laboratory results and thus help the research group make informed decisions before and during experimentation. There are several existing Matlab-based computer programs available, designed to interpret and simulate ultrasound signals. These programs will be evaluated to find which is best suited for the project's needs. The criteria of evaluation are: 1) the program must be able to specify transducer properties and transmitting and receiving signals; 2) the program must be able to simulate ultrasound signals through different attenuating mediums; 3) the program must be able to process moving targets in order to simulate the Doppler effects associated with blood flow; 4) the program should be user friendly and adaptable to various models. After a computer program is chosen, two simulation models will be constructed. These models will simulate and interpret an RF data signal and a Doppler signal.
Measurement of hydraulic conductivity of unsaturated soils with thermocouple psychrometers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daniel, D.E.
1982-11-01
A method of measuring the hydraulic conductivity of unsaturated soil using the instantaneous profile method, with psychrometric probes to measure water potential, is developed and described. Soil is compacted into cylindrical tubes, and the tubes are sealed and instrumented with thermocouple psychrometers. The soil is moistened or dried from one end of the tube. Psychrometers are read periodically. Hydraulic conductivity is computed from the psychrometer readings and the appropriate moisture characteristic curve for the soil and then plotted as a function of water potential, water content, or degree of saturation. Hydraulic conductivities of six soils were measured at water potentials as low as -80 bar. The measured hydraulic conductivities and moisture characteristic curves were used along with the known boundary flux in a computer program to calculate the final water content profiles. Computed and measured final water content profiles agreed tolerably well.
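The instantaneous profile calculation described above amounts to Darcy's law with the flux inferred from storage changes between successive readings and the gradient taken from psychrometer-derived heads. A minimal sketch with hypothetical readings (sign conventions and the potential-to-head conversion are simplified here):

```python
def water_flux(theta_before, theta_after, dz, dt):
    """Flux past the base of a profile segment from the change in
    stored water between two readings (instantaneous profile method):
    q = d(storage)/dt, storage = sum of water content * layer depth."""
    storage_change = sum((a - b) * dz
                         for b, a in zip(theta_before, theta_after))
    return storage_change / dt  # cm/s

def hydraulic_conductivity(flux, head_upper, head_lower, separation):
    """Darcy's law solved for K: K = q / (dH/dz), with total heads
    (cm of water) derived from the psychrometer potentials."""
    gradient = (head_upper - head_lower) / separation
    return flux / gradient

# Hypothetical readings: two 5 cm layers wetting over one hour
flux = water_flux([0.20, 0.20], [0.25, 0.23], dz=5.0, dt=3600.0)
k_unsat = hydraulic_conductivity(flux, head_upper=-100.0,
                                 head_lower=-300.0, separation=10.0)
```

Repeating this at successive times, as the soil wets or dries, traces out K over a range of water potentials, which is how a single column yields a whole K(ψ) curve.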
VAPEPS user's reference manual, version 5.0
NASA Technical Reports Server (NTRS)
Park, D. M.
1988-01-01
This is the reference manual for the VibroAcoustic Payload Environment Prediction System (VAPEPS). The system consists of a computer program and a vibroacoustic database. The purpose of the system is to collect measurements of vibroacoustic data taken from flight events and ground tests, and to retrieve this data and provide a means of using the data to predict future payload environments. This manual describes the operating language of the program. Topics covered include database commands, Statistical Energy Analysis (SEA) prediction commands, stress prediction command, and general computational commands.
Computing Reliabilities Of Ceramic Components Subject To Fracture
NASA Technical Reports Server (NTRS)
Nemeth, N. N.; Gyekenyesi, J. P.; Manderscheid, J. M.
1992-01-01
CARES calculates fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. Program uses results from commercial structural-analysis program (MSC/NASTRAN or ANSYS) to evaluate reliability of component in presence of inherent surface- and/or volume-type flaws. Computes measure of reliability by use of finite-element mathematical model applicable to multiple materials, in the sense that the model is made a function of statistical characterizations of many ceramic materials. Reliability analysis uses element stress, temperature, area, and volume outputs, obtained from two-dimensional shell and three-dimensional solid isoparametric or axisymmetric finite elements. Written in FORTRAN 77.
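The weakest-link reliability measure that programs of this kind compute is commonly expressed with a two-parameter Weibull model summed over the finite elements. A minimal sketch for volume flaws only; the element data and Weibull parameters are hypothetical, and the real program's statistical treatment (multiaxial stresses, surface flaws, temperature dependence) is far more complete:

```python
import math

def failure_probability(elements, sigma0, m):
    """Weakest-link fast-fracture estimate with a two-parameter
    Weibull distribution for volume flaws:
    Pf = 1 - exp(-sum_i V_i * (sigma_i / sigma0)^m)."""
    risk = sum(v * (s / sigma0) ** m for v, s in elements if s > 0)
    return 1.0 - math.exp(-risk)

# Hypothetical (element volume mm^3, max principal stress MPa) pairs
elements = [(2.0, 180.0), (1.5, 220.0), (3.0, 90.0)]
pf = failure_probability(elements, sigma0=350.0, m=10.0)
```

Because the stress enters at the Weibull modulus power m, the most highly stressed elements dominate the sum; lightly stressed material contributes almost nothing to the failure probability.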
Review of NASA antiskid braking research
NASA Technical Reports Server (NTRS)
Tanner, J. A.
1982-01-01
NASA antiskid braking system research programs are reviewed. These programs include experimental studies of four antiskid systems on the Langley Landing Loads Track, flight tests with a DC-9 airplane, and computer simulation studies. Results from these research efforts include identification of factors contributing to degraded antiskid performance under adverse weather conditions, tire tread temperature measurements during antiskid braking on dry runway surfaces, and an assessment of the accuracy of various brake pressure-torque computer models. This information should lead to the development of better antiskid systems in the future.
Acoustic Detection Of Loose Particles In Pressure Sensors
NASA Technical Reports Server (NTRS)
Kwok, Lloyd C.
1995-01-01
Particle-impact-noise-detector (PIND) apparatus used in conjunction with computer program analyzing output of apparatus to detect extraneous particles trapped in pressure sensors. PIND tester essentially shaker equipped with microphone measuring noise in pressure sensor or other object being shaken. Shaker applies controlled vibration. Output of microphone recorded and expressed in terms of voltage, yielding history of noise subsequently processed by computer program. Data taken at sampling rate sufficiently high to enable identification of all impacts of particles on sensor diaphragm and on inner surfaces of sensor cavities.
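Identifying individual particle impacts in the recorded microphone voltage history can be sketched as a thresholding pass with a dead time, so that the ringing from one impact is not counted as several. The threshold and timing values below are hypothetical, not those of the actual PIND analysis program:

```python
def detect_impacts(voltages, sample_rate, threshold, dead_time=1e-4):
    """Flag particle impacts: samples whose |voltage| exceeds the
    threshold, merging hits closer together than dead_time (s) so
    one impact's ringing is not double-counted."""
    impacts = []
    last_hit = float("-inf")
    for i, v in enumerate(voltages):
        t = i / sample_rate
        if abs(v) > threshold:
            if t - last_hit > dead_time:
                impacts.append(t)
            last_hit = t
    return impacts

# Synthetic 10 ms record at 100 kHz with two spikes (hypothetical)
voltages = [0.0] * 1000
voltages[100], voltages[101], voltages[500] = 0.5, 0.6, 0.7
impacts = detect_impacts(voltages, sample_rate=100_000, threshold=0.3)
```

A sufficiently high sampling rate, as the abstract notes, is what makes each diaphragm or cavity-wall impact resolvable as a distinct event rather than a smeared envelope.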
Computation of iodine species concentrations in water
NASA Technical Reports Server (NTRS)
Schultz, John R.; Mudgett, Paul D.; Flanagan, David T.; Sauer, Richard L.
1994-01-01
During an evaluation of the use of iodine as a water disinfectant and the development of methods for measuring various iodine species in water onboard Space Station Freedom, it became necessary to compute the concentration of the various species based on equilibrium principles alone. Of particular concern was the case when various amounts of iodine, iodide, strong acid, and strong base are added to water. Such solutions can be used to evaluate the performance of various monitoring methods being considered. The authors of this paper present an overview of aqueous iodine chemistry, a set of nonlinear equations which can be used to model the above case, and a computer program for solving this system of equations using the Newton-Raphson method. The program was validated by comparing results over a range of concentrations and pH values with those previously presented by Gottardi for a given pH. Use of this program indicated that there are multiple roots in many cases, and selecting an appropriate initial guess is important. Comparison of program results with laboratory results for the case when only iodine is added to water indicates the program gives high pH values for the iodine concentrations normally used for water disinfection. Extending the model to include the effects of iodate formation results in the computed pH values being closer to those observed, but the model with iodate does not agree well for the case in which base is added in addition to iodine to raise the pH. Potential explanations include failure to obtain equilibrium conditions in the lab, inaccuracies in published values for the equilibrium constants, an inadequate model of iodine chemistry, and/or the lack of adequate analytical methods for measuring the various iodine species in water.
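The Newton-Raphson approach the authors describe can be illustrated on the simplest piece of aqueous iodine chemistry, the triiodide equilibrium I2 + I- <-> I3-. The sketch below is not the paper's full model (which includes hydrolysis species and pH); the equilibrium constant is an approximate literature value used purely for illustration, and the analytic Jacobian shows why a good initial guess matters when several roots exist:

```python
def solve_iodine(c_i2, c_iodide, K=723.0, tol=1e-12, max_iter=50):
    """Newton-Raphson for the triiodide equilibrium I2 + I- <-> I3-
    (K assumed ~723 L/mol at 25 C, illustrative only).
    Unknowns x = [I2], y = [I-]; z = [I3-] = K*x*y.
    Mass balances: x + z = c_i2 and y + z = c_iodide."""
    x, y = c_i2, c_iodide  # initial guess: no complexation
    for _ in range(max_iter):
        z = K * x * y
        f1 = x + z - c_i2
        f2 = y + z - c_iodide
        # Analytic Jacobian of (f1, f2) with respect to (x, y)
        j11, j12 = 1 + K * y, K * x
        j21, j22 = K * y, 1 + K * x
        det = j11 * j22 - j12 * j21
        dx = (f1 * j22 - f2 * j12) / det
        dy = (f2 * j11 - f1 * j21) / det
        x, y = x - dx, y - dy
        if abs(dx) < tol and abs(dy) < tol:
            break
    return x, y, K * x * y

x, y, z = solve_iodine(c_i2=1e-3, c_iodide=1e-3)
```

Both mass balances are satisfied at convergence, and roughly a third of the added iodine ends up complexed as triiodide at these (hypothetical) millimolar concentrations.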
Wiksten, Denise Lebsack; Patterson, Patricia; Antonio, Kimberly; De La Cruz, Daniel; Buxton, Barton P.
1998-01-01
Objective: To evaluate the effectiveness of an interactive athletic training educational curriculum (IATEC) computer program as compared with traditional lecture instruction. Instructions on assessment of the quadriceps angle (Q-angle) were compared. Dependent measures consisted of cognitive knowledge, practical skill assessment, and attitudes toward the 2 methods of instruction. Design and Setting: Sixty-six subjects were selected and then randomly assigned to 3 different groups: traditional lecture, IATEC, and control. The traditional lecture group (n = 22) received a 50-minute lecture/demonstration covering the same instructional content as the Q-angle module of the IATEC program. The IATEC group (n = 20; 2 subjects were dropped from this group due to scheduling conflicts) worked independently for 50 to 65 minutes using the Q-angle module of the IATEC program. The control group (n = 22) received no instruction. Subjects: Subjects were recruited from an undergraduate athletic training education program and were screened for prior knowledge of the Q-angle. Measurements: A 9-point multiple choice examination was used to determine cognitive knowledge of the Q-angle. A 12-point yes-no checklist was used to determine whether or not the subjects were able to correctly measure the Q-angle. The Allen Attitude Toward Computer-Assisted Instruction Semantic Differential Survey was used to assess student attitudes toward the 2 methods of instruction. The survey examined overall attitudes, in addition to 3 subscales: comfort, creativity, and function. The survey was scored from 1 to 7, with 7 being the most favorable and 1 being the least favorable. Results: Results of a 1-way ANOVA on cognitive knowledge of the Q-angle revealed that the traditional lecture and IATEC groups performed significantly better than the control group, and the traditional lecture group performed significantly better than the IATEC group. 
Results of a 1-way ANOVA on practical skill performance revealed that the traditional lecture and IATEC groups performed significantly better than the control group, but there were no significant differences between the traditional lecture and IATEC groups on practical skill performance. Results of a t test indicated significantly more favorable attitudes (P < .05) for the traditional lecture group when compared with the IATEC group for comfort, creativity, and function. Conclusions: Our results suggest that use of the IATEC computer module is an effective means of instruction; however, use of the IATEC program alone may not be sufficient for educating students in cognitive knowledge. Further research is needed to determine the effectiveness of the IATEC computer program as a supplement to traditional lecture instruction in athletic training education. PMID:16558517
High-Speed GPU-Based Fully Three-Dimensional Diffuse Optical Tomographic System
Saikia, Manob Jyoti; Kanhirodan, Rajan; Mohan Vasu, Ram
2014-01-01
We have developed a graphics processing unit (GPU-)based high-speed fully 3D system for diffuse optical tomography (DOT). The reduction in execution time of the 3D DOT algorithm, a severely ill-posed problem, is made possible through the use of (1) an algorithmic improvement that uses the Broyden approach for updating the Jacobian matrix and thereby updating the parameter matrix and (2) the multinode multithreaded GPU and CUDA (Compute Unified Device Architecture) software architecture. Two different GPU implementations of DOT programs are developed in this study: (1) a conventional C language program augmented by GPU CUDA and CULA routines (C GPU), (2) a MATLAB program supported by the MATLAB parallel computing toolkit for GPU (MATLAB GPU). The computation time of the algorithm on the host CPU and the GPU system is presented for the C and MATLAB implementations. The forward computation uses the finite element method (FEM) and the problem domain is discretized into 14610, 30823, and 66514 tetrahedral elements. The reconstruction time so achieved for one iteration of the DOT reconstruction for 14610 elements is 0.52 seconds for a C based GPU program for 2-plane measurements. The corresponding MATLAB based GPU program took 0.86 seconds. The maximum number of reconstructed frames so achieved is 2 frames per second. PMID:24891848
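The algorithmic improvement named in the abstract, Broyden's rank-1 Jacobian update, avoids recomputing the full (expensive) Jacobian at every iteration by patching it with the latest step and residual change. A minimal sketch of the update and its defining secant property, demonstrated on a toy two-variable function rather than the DOT forward model:

```python
import numpy as np

def broyden_update(J, dx, df):
    """Broyden 'good' rank-1 update:
    J_new = J + ((df - J dx) dx^T) / (dx^T dx).
    The updated J_new satisfies the secant condition J_new dx = df,
    so it mimics the true Jacobian along the step just taken."""
    dx = dx.reshape(-1, 1)
    df = df.reshape(-1, 1)
    return J + (df - J @ dx) @ dx.T / (dx.T @ dx)

# Toy nonlinear map standing in for the DOT forward model
def f(x):
    return np.array([x[0] ** 2, x[0] * x[1]])

x0 = np.array([1.0, 2.0])
J = np.array([[2.0, 0.0], [2.0, 1.0]])  # exact Jacobian of f at x0
x1 = np.array([1.1, 2.1])               # one parameter-update step
J1 = broyden_update(J, x1 - x0, f(x1) - f(x0))
```

Since each update costs only a matrix-vector product and an outer product, the per-iteration cost drops from a full Jacobian assembly to O(n^2), which is the saving the authors exploit on the GPU.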
NASA Technical Reports Server (NTRS)
Schmidt, G.; Ruster, R.; Czechowsky, P.
1983-01-01
The SOUSY-VHF-Radar operates at a frequency of 53.5 MHz in a valley in the Harz mountains, Germany, 90 km from Hanover. The radar controller, which is programmed by a 16-bit computer, holds 1024 program steps in core and controls, via 8 channels, the whole radar system: in particular the master oscillator, the transmitter, the transmit-receive switch, the receiver, the analog-to-digital converter, and the hardware adder. The high-sensitivity receiver has a dynamic range of 70 dB and a video bandwidth of 1 MHz. Phase-coding schemes are applied, in particular for investigations at mesospheric heights, in order to carry out measurements with the maximum duty cycle and the maximum height resolution. The computer takes the data from the adder and stores it on magnetic tape or disc. The radar controller is programmed by the computer using simple FORTRAN IV statements. After the program has been loaded and the computer has started the radar controller, it runs automatically, stopping at the program end. In case of errors or failures occurring during radar operation, the radar controller is shut off either by a safety circuit, a power-failure circuit, or a parity-check system.
1997-06-27
This is a computer-generated model of a ground-based casting. The objective of the thermophysical properties program is to measure thermophysical properties of commercial casting alloys for use in computer programs that predict solidification behavior. This could reduce trial and error in casting design and promote less scrap, sounder castings, and less weight. In order for the computer models to reliably simulate the details of industrial alloy solidification, the input thermophysical property data must be absolutely reliable. Recently Auburn University and TPRL Inc. formed a teaming relationship to establish reliable measurement techniques for the most critical properties of commercially important alloys: transformation temperatures, thermal conductivity, electrical conductivity, specific heat, latent heat, density, solid fraction evolution, surface tension, and viscosity. A new initiative with the American Foundrymen's Society has been started to measure the thermophysical properties of commercial ferrous and non-ferrous casting alloys and make the thermophysical property data widely available. Development of casting processes for the new gamma titanium aluminide alloys as well as existing titanium alloys will remain a trial-and-error procedure until accurate thermophysical properties can be obtained. These molten alloys react with their containers on earth and change their composition, invalidating the measurements even while the data are being acquired in terrestrial laboratories. However, measurements on the molten alloys can be accomplished in space using freely floating droplets which are completely untouched by any container. These data are expected to be exceptionally precise because of the absence of impurity contamination and buoyancy convection effects.
Although long-duration orbital experiments will be required for the large-scale industrial alloy measurement program that results from this research, short-duration experiments on NASA's KC-135 low-g aircraft are already providing preliminary data and experience.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickens, J.K.
1988-04-01
This document provides a discussion of the development of the FORTRAN Monte Carlo program SCINFUL (for scintillator full response), a program designed to provide a calculated full response anticipated for either an NE-213 (liquid) scintillator or an NE-110 (solid) scintillator. The program may also be used to compute angle-integrated spectra of charged particles (p, d, t, ³He, and α) following neutron interactions with ¹²C. Extensive comparisons with a variety of experimental data are given. There is generally good overall agreement (<10% differences) between results from SCINFUL calculations and measured detector responses, i.e., N(E_r) vs. E_r, where E_r is the response pulse height; calculated spectra reproduce measured detector responses with an accuracy which, at least partly, depends upon how well the experimental configuration is known. For E_n < 16 MeV and for E_r > 15% of the maximum pulse-height response, calculated spectra are within ±5% of experiment on the average. For E_n up to 50 MeV, similar good agreement is obtained with experiment for E_r > 30% of maximum response. For E_n up to 75 MeV the calculated shape of the response agrees with measurements, but the calculations underpredict the measured response by up to 30%. 65 refs., 64 figs., 3 tabs.
Programming model for distributed intelligent systems
NASA Technical Reports Server (NTRS)
Sztipanovits, J.; Biegl, C.; Karsai, G.; Bogunovic, N.; Purves, B.; Williams, R.; Christiansen, T.
1988-01-01
A programming model and architecture which was developed for the design and implementation of complex, heterogeneous measurement and control systems is described. The Multigraph Architecture integrates artificial intelligence techniques with conventional software technologies, offers a unified framework for distributed and shared memory based parallel computational models and supports multiple programming paradigms. The system can be implemented on different hardware architectures and can be adapted to strongly different applications.
Evaluation of thermal network correction program using test temperature data
NASA Technical Reports Server (NTRS)
Ishimoto, T.; Fink, L. C.
1972-01-01
An evaluation process to determine the accuracy of a computer program for thermal network correction is discussed. The evaluation is required since factors such as inaccuracies of temperatures, insufficient number of temperature points over a specified time period, lack of one-to-one correlation between temperature sensor and nodal locations, and incomplete temperature measurements are not present in the computer-generated information. The mathematical models used in the evaluation are those that describe a physical system composed of both a conventional and a heat pipe platform. A description of the models used, the results of the evaluation of the thermal network correction, and input instructions for the thermal network correction program are presented.
Robbins, L G
2000-01-01
Graduate school programs in genetics have become so full that courses in statistics have often been eliminated. In addition, typical introductory statistics courses for the "statistics user" rather than the nascent statistician are laden with methods for analysis of measured variables while genetic data are most often discrete numbers. These courses are often seen by students and genetics professors alike as largely irrelevant cookbook courses. The powerful methods of likelihood analysis, although commonly employed in human genetics, are much less often used in other areas of genetics, even though current computational tools make this approach readily accessible. This article introduces the MLIKELY.PAS computer program and the logic of do-it-yourself maximum-likelihood statistics. The program itself, course materials, and expanded discussions of some examples that are only summarized here are available at http://www.unisi.it/ricerca/dip/bio_evol/sitomlikely/mlikely.html. PMID:10628965
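The do-it-yourself likelihood approach the article advocates can be shown on a classic discrete-data example: estimating a recombination fraction from backcross counts by Newton-Raphson on the score function. The counts below are hypothetical, and this sketch is not code from MLIKELY.PAS:

```python
import math

def log_likelihood(theta, n_recomb, n_parental):
    """Binomial log-likelihood for a backcross: each offspring is
    recombinant with probability theta."""
    return n_recomb * math.log(theta) + n_parental * math.log(1 - theta)

def mle_newton(n_recomb, n_parental, theta=0.25, tol=1e-10):
    """Newton-Raphson on the score dL/dtheta, stepping by
    score / information until the step is negligible."""
    for _ in range(100):
        score = n_recomb / theta - n_parental / (1 - theta)
        info = n_recomb / theta ** 2 + n_parental / (1 - theta) ** 2
        step = score / info
        theta += step
        if abs(step) < tol:
            return theta
    return theta

# Hypothetical backcross: 18 recombinant, 82 parental offspring
theta_hat = mle_newton(n_recomb=18, n_parental=82)
```

Here the iteration reproduces the closed-form answer n_recomb / n, but the same machinery handles models with no closed form, which is the article's point about likelihood methods for discrete genetic data.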
The Preliminary Development of a Robotic Laser System Used for Ophthalmic Surgery
1988-01-01
proposed design, there is not sufficient computer time to ensure a zero probability of error. But, what's more important than a zero probability of…even zero proved to shorten the computation time. 4.3.6 The User Interface To put things in perspective, the step by step procedure for using the routine…was measured from the identified slice. The sectional area was measured using a Summagraphic digitizing pad and the Sigma-scan program from Jandel
Correlation of predicted and measured thermal stresses on a truss-type aircraft structure
NASA Technical Reports Server (NTRS)
Jenkins, J. M.; Schuster, L. S.; Carter, A. L.
1978-01-01
A test structure representing a portion of a hypersonic vehicle was instrumented with strain gages and thermocouples. This test structure was then subjected to laboratory heating representative of supersonic and hypersonic flight conditions. A finite element computer model of this structure was developed using several types of elements with the NASA structural analysis (NASTRAN) computer program. Temperature inputs from the test were used to generate predicted model thermal stresses and these were correlated with the test measurements.
Systems identification using a modified Newton-Raphson method: A FORTRAN program
NASA Technical Reports Server (NTRS)
Taylor, L. W., Jr.; Iliff, K. W.
1972-01-01
A FORTRAN program is offered which computes a maximum likelihood estimate of the parameters of any linear, constant-coefficient, state space model. For the case considered, the maximum likelihood estimate can be identical to that which simultaneously minimizes the weighted mean square difference between the computed and measured response of a system and the weighted square of the difference between the estimated and a priori parameter values. A modified Newton-Raphson or quasilinearization method is used to perform the minimization, which typically requires several iterations. A starting technique is used which ensures convergence for any initial values of the unknown parameters. The program and its operation are described in sufficient detail to enable the user to apply the program to his particular problem with a minimum of difficulty.
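A one-parameter version of this iterative scheme is easy to sketch. The Python fragment below is an illustrative Gauss-Newton analogue of the modified Newton-Raphson update (not the FORTRAN program itself), fitting the decay rate of a first-order system to noise-free simulated measurements:

```python
import math

def model(a, ts):
    """Response of the first-order system x' = -a*x, x(0) = 1, sampled at times ts."""
    return [math.exp(-a * t) for t in ts]

def gauss_newton(a0, ts, y_meas, iters=20):
    """Newton-Raphson-style iteration for one unknown parameter: linearize the
    response about the current estimate and solve the resulting least-squares step."""
    a = a0
    for _ in range(iters):
        y = model(a, ts)
        s = [-t * math.exp(-a * t) for t in ts]       # sensitivity dy/da
        r = [ym - yi for ym, yi in zip(y_meas, y)]    # residuals (measured - computed)
        a += sum(si * ri for si, ri in zip(s, r)) / sum(si * si for si in s)
    return a

ts = [0.1 * k for k in range(1, 21)]
y_meas = model(0.8, ts)                 # noise-free "measurements" with true a = 0.8
print(gauss_newton(0.3, ts, y_meas))    # converges to ~0.8 in a few iterations
```

The full program works the same way but with a parameter vector, a weighting matrix on the residuals, and an a priori penalty term added to the cost.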
Brohet, C R; Richman, H G
1979-06-01
Automated processing of electrocardiograms by the Veterans Administration program was evaluated for both agreement with physician interpretation and interpretative accuracy as assessed with nonelectrocardiographic criteria. One thousand unselected electrocardiograms were analyzed by two reviewer groups, one familiar and the other unfamiliar with the computer program. A significant number of measurement errors involving repolarization changes and left axis deviation occurred; however, interpretative disagreements related to statistical decision were largely language-related. Use of a printout with a more traditional format resulted in agreement with physician interpretation by both reviewer groups in more than 80 percent of cases. Overall sensitivity based on agreement with nonelectrocardiographic criteria was significantly greater with use of the computer program than with use of the conventional criteria utilized by the reviewers. This difference was particularly evident in the subgroup analysis of myocardial infarction and left ventricular hypertrophy. The degree of overdiagnosis of left ventricular hypertrophy and posteroinferior infarction was initially unacceptable, but this difficulty was corrected by adjustment of probabilities. Clinical acceptability of the Veterans Administration program appears to require greater physician education than that needed for other computer programs of electrocardiographic analysis; the flexibility of interpretation by statistical decision offers the potential for better diagnostic accuracy.
Li, Xiangrui; Lu, Zhong-Lin
2012-02-29
Display systems based on conventional computer graphics cards are capable of generating images with 8-bit gray level resolution. However, most experiments in vision research require displays with more than 12 bits of luminance resolution. Several solutions are available. Bits++ (1) and DataPixx (2) use the Digital Visual Interface (DVI) output from graphics cards and high-resolution (14- or 16-bit) digital-to-analog converters to drive analog display devices. The VideoSwitcher (3) described here combines analog video signals from the red and blue channels of graphics cards with different weights using a passive resistor network (4) and an active circuit to deliver identical video signals to the three channels of color monitors. The method provides an inexpensive way to enable high-resolution monochromatic displays using conventional graphics cards and analog monitors. It can also provide trigger signals that can be used to mark stimulus onsets, making it easy to synchronize visual displays with physiological recordings or response time measurements. Although computer keyboards and mice are frequently used in measuring response times (RT), the accuracy of these measurements is quite low. The RTbox is a specialized hardware and software solution for accurate RT measurements. Connected to the host computer through a USB connection, the driver of the RTbox is compatible with all conventional operating systems. It uses a microprocessor and high-resolution clock to record the identities and timing of button events, which are buffered until the host computer retrieves them. The recorded button events are not affected by potential timing uncertainties or biases associated with data transmission and processing in the host computer. The asynchronous storage greatly simplifies the design of user programs. Several methods are available to synchronize the clocks of the RTbox and the host computer.
The RTbox can also receive external triggers and be used to measure RT with respect to external events. Both VideoSwitcher and RTbox are available for users to purchase. The relevant information and many demonstration programs can be found at http://lobes.usc.edu/.
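The channel-mixing idea can be made concrete with a quick count of achievable gray levels. This sketch assumes an illustrative 128:1 weighting between the coarse and fine channels; the actual VideoSwitcher resistor ratio may differ:

```python
import math

# Count the distinct luminance steps obtainable by mixing two 8-bit channels
# with an assumed 128:1 weighting (coarse channel steps are 128x larger).
levels = {128 * coarse + fine for coarse in range(256) for fine in range(256)}
print(len(levels), round(math.log2(len(levels)), 2))   # 32896 levels, ~15 bits
```

Because adjacent coarse steps overlap the fine channel's range, the combined staircase is dense, which is what lifts an 8-bit card to better than 12-bit monochromatic luminance resolution.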
Assessing Text Readability Using Cognitively Based Indices
ERIC Educational Resources Information Center
Crossley, Scott A.; Greenfield, Jerry; McNamara, Danielle S.
2008-01-01
Many programs designed to compute the readability of texts are narrowly based on surface-level linguistic features and take too little account of the processes which a reader brings to the text. This study is an exploratory examination of the use of Coh-Metrix, a computational tool that measures cohesion and text difficulty at various levels of…
ERIC Educational Resources Information Center
Bayley-Hamlet, Simone O.
2017-01-01
The purpose of this study was to examine the effect of Imagine Learning, a computer assisted language learning (CALL) program, on addressing reading achievement for English language learners (ELLs). This is a measurement used in the Accessing Comprehension and Communication in English State-to-State (ACCESS for ELLs or ACCESS) reading scale…
Using Interactive Multimedia to Teach Pedestrian Safety: An Exploratory Study
ERIC Educational Resources Information Center
Glang, Ann; Noell, John; Ary, Dennis; Swartz, Lynne
2005-01-01
Objectives: To evaluate an interactive multimedia (IMM) program that teaches young children safe pedestrian skills. Methods: The program uses IMM (animation and video) to teach children critical skills for crossing streets safely. A computer-delivered video assessment and a real-life street simulation were used to measure the effectiveness of the…
ERIC Educational Resources Information Center
Kynigos, Chronis
1993-01-01
Used two 12-year-old children to investigate deductive and inductive reasoning in plane geometry. A LOGO microworld was programmed to measure distances and turns relative to points on the plane. Learning environments like this may enhance formation of inductive geometrical understandings. (Contains 44 references.) (LDR)
TT : a program that implements predictor sort design and analysis
S. P. Verrill; D. W. Green; V. L. Herian
1997-01-01
In studies on wood strength, researchers sometimes replace experimental unit allocation via random sampling with allocation via sorts based on nondestructive measurements of strength predictors such as modulus of elasticity and specific gravity. This report documents TT, a computer program that implements recently published methods to increase the sensitivity of such...
1978-05-01
Program is a cooperative venture between RADC and some sixty-five universities eligible to participate in the program. Syracuse University (Department of Electrical and Computer Engineering), Purdue University (School of Electrical Engineering), Georgia Institute of Technology (School of Electrical Engineering), and State University of New York at Buffalo (Department of Electrical Engineering) act as prime contractor schools with other
ERIC Educational Resources Information Center
Creel, Jo Anne; Denson, Cornelius; New, Ray
2007-01-01
Secondary vocational-technical education programs in Mississippi are faced with many challenges resulting from sweeping educational reforms at the national and state levels. Schools and teachers are increasingly being held accountable for providing true learning activities to every student in the classroom. This accountability is measured through…
de Ruijter, D; Smit, E S; de Vries, H; Hoving, C
2016-05-01
Dutch practice nurses sub-optimally adhere to evidence-based smoking cessation guidelines. Web-based computer-tailoring could be effective in improving their guideline adherence. Therefore, this paper aims to describe the development of a web-based computer-tailored program and the design of a randomized controlled trial testing its (cost-)effectiveness. Theoretically grounded in the I-Change Model and Self-Determination Theory, and based on the results of a qualitative needs assessment among practice nurses, a web-based computer-tailored program was developed including three modules with tailored advice, an online forum, modules with up-to-date information about smoking cessation, Frequently Asked Questions (FAQs) and project information, and a counseling checklist. The program's effects are assessed by comparing an intervention group (access to all modules) with a control group (access to FAQs, project information and counseling checklist only). Smoking cessation guideline adherence and behavioral predictors (i.e. intention, knowledge, attitude, self-efficacy, social influence, action and coping planning) are measured at baseline and at 6- and 12-month follow-up. Additionally, the program's indirect effects on smokers' quit rates and the number of quit attempts are assessed after 6 and 12 months. This paper describes the development of a web-based computer-tailored adherence support program for practice nurses and the study design of a randomized controlled trial testing its (cost-)effectiveness. This program potentially contributes to improving the quality of smoking cessation care in Dutch general practices. If proven effective, the program could be adapted for use by other healthcare professionals, increasing the public health benefits of improved smoking cessation counseling for smokers. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Tech Briefs, September 2000. Volume 24, No. 9
NASA Technical Reports Server (NTRS)
2000-01-01
Topics include: Sensors; Test and Measurement; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Bio-Medical; semiconductors/ICs; Books and Reports.
NASA Tech Briefs, December 1994. Volume 18, No. 12
NASA Technical Reports Server (NTRS)
1994-01-01
Topics: Test and Measurement; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports
Development and implementation of an automated quantitative film digitizer quality control program
NASA Astrophysics Data System (ADS)
Fetterly, Kenneth A.; Avula, Ramesh T. V.; Hangiandreou, Nicholas J.
1999-05-01
A semi-automated, quantitative film digitizer quality control program that is based on the computer analysis of the image data from a single digitized test film was developed. This program includes measurements of the geometric accuracy, optical density performance, signal to noise ratio, and presampled modulation transfer function. The variability of the measurements was less than plus or minus 5%. Measurements were made on a group of two clinical and two laboratory laser film digitizers during a trial period of approximately four months. Quality control limits were established based on clinical necessity, vendor specifications and digitizer performance. During the trial period, one of the digitizers failed the performance requirements and was corrected by calibration.
NASA Technical Reports Server (NTRS)
Butler, C. M.; Hogge, J. E.
1978-01-01
Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness of fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily, monthly, and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model and (3) perform some goodness of fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continuation of the development of research on processing air quality data.
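The daily-statistics and correlation steps described above are straightforward to reproduce. A minimal stdlib-only sketch with made-up hourly readings (not the original software or data):

```python
from statistics import mean, stdev

# Hourly readings for two hypothetical pollutants over one day (illustrative values).
ozone = [20, 24, 30, 41, 55, 62, 58, 44, 33, 25, 22, 21]
no2   = [35, 33, 28, 22, 15, 12, 14, 20, 27, 31, 34, 36]

daily_mean_ozone = mean(ozone)          # a daily statistical measure of location

def pearson(x, y):
    """Simple (sample) correlation coefficient between two series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

print(round(daily_mean_ozone, 1), round(pearson(ozone, no2), 3))
```

The two series mirror each other over the day, so the correlation coefficient comes out strongly negative, the kind of relationship a scatter diagram of the paired readings would show directly.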
Mueller, David S.
2016-06-21
The software program QRev applies common and consistent computational algorithms, combined with automated filtering and quality assessment of the data, to improve the quality and efficiency of streamflow measurements and to help ensure that U.S. Geological Survey streamflow measurements are consistent, accurate, and independent of the manufacturer of the instrument used to make the measurement. Software from different manufacturers uses different algorithms for various aspects of the data processing and discharge computation. The algorithms used by QRev to filter data, interpolate data, and compute discharge are documented and compared to the algorithms used in the manufacturers' software. QRev applies consistent algorithms and creates a data structure that is independent of the data source. QRev saves an extensible markup language (XML) file that can be imported into databases or electronic field notes software. This report is the technical manual for version 2.8 of QRev.
Thermodynamics of computation and information distance
NASA Astrophysics Data System (ADS)
Bennett, Charles H.; Gacs, Peter; Li, Ming; Vitanyi, Paul M. R. B.; Zurek, Wojciech H.
1993-06-01
Intuitively, the minimal information distance between x and y is the length of the shortest program for a universal computer to transform x into y and y into x. This measure is shown to be, up to a logarithmic additive term, equal to the maximum of the conditional Kolmogorov complexities, E1(x,y) = max(K(y|x), K(x|y)). Any reasonable distance to measure similarity of pictures should be an effectively approximable, symmetric, positive function of x and y satisfying a reasonable normalization condition and obeying the triangle inequality. It turns out that E1 is minimal up to an additive constant among all such distances. Hence it is a universal 'picture distance', which accounts for any effective similarity between pictures. A third information distance, based on the idea that the aim should be for dissipationless, and hence reversible, computations, is given by the length E2(x,y) = KR(y|x) = KR(x|y) of the shortest reversible program that transforms x into y and y into x on a universal reversible computer. It is shown that E2 = E1 as well, up to a logarithmic additive term. It is remarkable that three such differently motivated definitions turn out to define one and the same notion. Another information distance, E3, is obtained by minimizing the total amount of information flowing in and out during a reversible computation in which the program is not retained; in other words, the number of extra bits (apart from x) that must be irreversibly supplied at the beginning, plus the number of garbage bits (apart from y) that must be irreversibly erased at the end of the computation to obtain a 'clean' y. This distance is within a logarithmic additive term of the sum of the conditional complexities, E3(x,y) = K(y|x) + K(x|y).
Using the physical theory of reversible computation, the simple difference K(x) - K(y) is shown to be an appropriate (universal, antisymmetric, and transitive) measure of the amount of thermodynamic work required to transform string x into string y by the most efficient process.
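Since Kolmogorov complexity is uncomputable, practical analogues of these distances replace K with a real compressor. The sketch below uses zlib to form a normalized compression distance in the spirit of E1(x,y) = max(K(y|x), K(x|y)); this compressor-based approximation comes from later literature and is not part of the paper itself:

```python
import zlib

def c(data: bytes) -> int:
    """Compressed length as a crude stand-in for Kolmogorov complexity K."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a computable analogue of the
    max-of-conditional-complexities information distance."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"abcabcabc" * 50
b = b"abcabcabc" * 50 + b"xyz"
d_similar = ncd(a, b)                       # near-identical strings: small distance
d_different = ncd(a, bytes(range(256)) * 2) # unrelated data: large distance
print(d_similar < d_different)
```

A compressor can only approximate K from above, so these numbers are upper-bound proxies, but the ordering they induce already captures "effective similarity" well enough for many clustering tasks.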
NASA Technical Reports Server (NTRS)
Roberts, Floyd E., III
1994-01-01
Software provides for control and acquisition of data from optical pyrometer. There are six individual programs in PYROLASER package. Provides quick and easy way to set up, control, and program standard Pyrolaser. Temperature and emissivity measurements either collected as if Pyrolaser were in manual operating mode or displayed on real-time strip charts and stored in standard spreadsheet format for posttest analysis. Shell supplied to allow test-specific macros to be added to system easily. Written using LabVIEW software for use on Macintosh-series computers running System 6.0.3 or later, Sun Sparc-series computers running OpenWindows 3.0 or MIT's X Window System (X11R4 or X11R5), and IBM PC or compatible computers running Microsoft Windows 3.1 or later.
Hands-on work fine-tunes X-band PIN-diode duplexer
NASA Astrophysics Data System (ADS)
Schneider, P.
1985-06-01
Computer-aided design (CAD) programs for fabricating PIN-diode duplexers are useful in avoiding time-consuming cut-and-try techniques. Nevertheless, to attain minimum insertion loss, only experimentation yields the optimum microstrip circuitry. A PIN-diode duplexer, consisting of two SPST PIN-diode switches and a pair of 3-dB Lange microstrip couplers, designed for an X-band transmit/receive module exemplifies what is possible when computer-derived designs and experimentation are used together. Differences between the measured and computer-generated figures for insertion loss can be attributed to several factors not included in the CAD program - for example, radiation and connector losses. Mechanical tolerances of the microstrip PC board and variations in the SMA connector-to-microstrip transition contribute to the discrepancy.
Concentrator optical characterization using computer mathematical modelling and point source testing
NASA Technical Reports Server (NTRS)
Dennison, E. W.; John, S. L.; Trentelman, G. F.
1984-01-01
The optical characteristics of a paraboloidal solar concentrator are analyzed using the intercept factor curve (a format for image data) to describe the results of a mathematical model and to represent reduced data from experimental testing. This procedure makes it possible not only to test an assembled concentrator, but also to evaluate single optical panels or to conduct non-solar tests of an assembled concentrator. The use of three-dimensional ray tracing computer programs to calculate the mathematical model is described. These ray tracing programs can include any type of optical configuration, from simple paraboloids to arrays of spherical facets, and can be adapted to microcomputers or larger computers, which can graphically display real-time comparisons of calculated and measured data.
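The core geometric fact such ray-tracing programs exploit, that a paraboloid sends axis-parallel rays through its focus, can be checked in a few lines. A minimal two-dimensional sketch (assumed geometry, not the authors' code):

```python
def axis_crossing(x, f):
    """Trace a vertical downward ray hitting the paraboloid z = x^2/(4f) at radius x
    and return the height at which the reflected ray crosses the optical axis."""
    z = x * x / (4 * f)                              # surface height at the hit point
    nx, nz = -x / (2 * f), 1.0                       # surface normal (unnormalized)
    mag = (nx * nx + nz * nz) ** 0.5
    nx, nz = nx / mag, nz / mag
    dx, dz = 0.0, -1.0                               # incoming ray, parallel to the axis
    dot = dx * nx + dz * nz
    rx, rz = dx - 2 * dot * nx, dz - 2 * dot * nz    # specular reflection
    t = -x / rx                                      # ray parameter where it reaches x = 0
    return z + t * rz

f = 2.5
print([round(axis_crossing(x, f), 6) for x in (0.5, 1.0, 2.0)])   # all equal to f
```

A real three-dimensional program generalizes this to off-axis rays and faceted surfaces, and accumulates the hit pattern at the receiver to build the intercept factor curve.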
TETRA-COM: a comprehensive SPSS program for estimating the tetrachoric correlation.
Lorenzo-Seva, Urbano; Ferrando, Pere J
2012-12-01
We provide an SPSS program that implements descriptive and inferential procedures for estimating tetrachoric correlations. These procedures have two main purposes: (1) bivariate estimation in contingency tables and (2) constructing a correlation matrix to be used as input for factor analysis (in particular, the SPSS FACTOR procedure). In both cases, the program computes accurate point estimates, as well as standard errors and confidence intervals that are correct for any population value. For purpose (1), the program computes the contingency table together with five other measures of association. For purpose (2), the program checks the positive definiteness of the matrix, and if it is found not to be Gramian, performs a nonlinear smoothing procedure at the user's request. The SPSS syntax, a short manual, and data files related to this article are available as supplemental materials from brm.psychonomic-journals.org/content/supplemental.
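For a feel of what such a procedure estimates, the classical cosine-pi formula gives a quick closed-form tetrachoric estimate from a 2x2 table. This is a rough approximation, not the accurate point-estimation procedure the SPSS program implements:

```python
import math

def tetrachoric_cos_pi(a, b, c, d):
    """Cosine-pi approximation to the tetrachoric correlation for the 2x2 table
    [[a, b], [c, d]]; a crude alternative to full maximum-likelihood estimation."""
    if b * c == 0:
        return 1.0
    odds = (a * d) / (b * c)                 # odds ratio of the table
    return math.cos(math.pi / (1 + math.sqrt(odds)))

# A strongly concordant table: the estimate should be clearly positive.
print(round(tetrachoric_cos_pi(40, 10, 10, 40), 3))
```

The approximation is exact only when the marginal splits are 50/50; for skewed margins the likelihood-based estimate (with its standard errors and confidence intervals) is the one to report.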
A computer system for the storage and retrieval of gravity data, Kingdom of Saudi Arabia
Godson, Richard H.; Andreasen, Gordon H.
1974-01-01
A computer system has been developed for the systematic storage and retrieval of gravity data. All pertinent facts relating to gravity station measurements and computed Bouguer values may be retrieved either by project name or by geographical coordinates. Features of the system include visual display in the form of printer listings of gravity data and printer plots of station locations. The retrieved data format interfaces with the format of GEOPAC, a system of computer programs designed for the analysis of geophysical data.
Application of Computer Axial Tomography (CAT) to measuring crop canopy geometry. [corn and soybeans
NASA Technical Reports Server (NTRS)
Bauer, M. E.; Vanderbilt, V. C. (Principal Investigator); Kilgore, R. W.
1981-01-01
The feasibility of using the principles of computer axial tomography (CAT) to quantify the structure of crop canopies was investigated because six variables are needed to describe the position-orientation with time of a small piece of canopy foliage. Several cross sections were cut through the foliage of healthy, green corn and soybean canopies in the dent and full pod development stages, respectively. A photograph of each cross section representing the intersection of a plane with the foliage was enlarged, and the air-foliage boundaries delineated by the plane were digitized. A computer program was written and used to reconstruct the cross section of the canopy. The approach used in applying optical computer axial tomography to measuring crop canopy geometry shows promise of being able to provide needed geometric information for input data to canopy reflectance models. The difficulty of using the CAT scanner to measure large canopies of crops like corn is discussed, and a solution is proposed involving the measurement of plants one at a time.
Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A; Robert, Christian P; Marin, Jean-Michel; Balding, David J; Guillemaud, Thomas; Estoup, Arnaud
2008-12-01
Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and provides its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC. The software DIY ABC is freely available at http://www.montpellier.inra.fr/CBGP/diyabc.
Maximum likelihood convolutional decoding (MCD) performance due to system losses
NASA Technical Reports Server (NTRS)
Webster, L.
1976-01-01
A model for predicting the computational performance of a maximum likelihood convolutional decoder (MCD) operating in a noisy carrier reference environment is described. This model is used to develop a subroutine that will be utilized by the Telemetry Analysis Program to compute the MCD bit error rate. When this computational model is averaged over noisy reference phase errors using a high-rate interpolation scheme, the results are found to agree quite favorably with experimental measurements.
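The averaging step can be sketched numerically. The fragment below averages an ideal-coherent BPSK error rate over a Gaussian carrier phase error by simple quadrature; the report's decoder model and high-rate interpolation scheme are more elaborate, and all parameter values here are illustrative:

```python
import math

def q(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def ber_bpsk(ebno_linear, phase_err):
    """BPSK bit error rate conditioned on a carrier phase error (radians)."""
    return q(math.sqrt(2 * ebno_linear) * math.cos(phase_err))

def avg_ber(ebno_db, sigma_phi, n=2001):
    """Average the conditional BER over a zero-mean Gaussian phase error:
    a simplified stand-in for the noisy-carrier-reference averaging."""
    ebno = 10 ** (ebno_db / 10)
    lo, hi = -4 * sigma_phi, 4 * sigma_phi
    step = (hi - lo) / (n - 1)
    total = 0.0
    for i in range(n):
        phi = lo + i * step
        w = math.exp(-phi * phi / (2 * sigma_phi ** 2)) / (sigma_phi * math.sqrt(2 * math.pi))
        total += w * ber_bpsk(ebno, phi) * step
    return total

print(avg_ber(7.0, 0.2) > ber_bpsk(10 ** 0.7, 0.0))   # phase noise degrades the BER
```

Because cos(phi) <= 1, every nonzero phase error raises the conditional error rate, so the averaged BER always exceeds the perfect-reference value; the decoder model in the report captures how much.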
Computer modeling and simulators as part of university training for NPP operating personnel
NASA Astrophysics Data System (ADS)
Volman, M.
2017-01-01
This paper considers aspects of a program for training future nuclear power plant personnel developed by the NPP Department of Ivanovo State Power Engineering University. Computer modeling is used for numerical experiments on the kinetics of nuclear reactors in Mathcad. Simulation modeling is carried out on computer-based and full-scale simulators of a water-cooled power reactor to simulate neutron-physical reactor measurements and the start-up and shutdown process.
An interactive multimedia program to prevent HIV transmission in men with intellectual disability.
Wells, Jennifer; Clark, Khaya; Sarno, Karen
2014-05-01
The efficacy of a computer-based interactive multimedia HIV/AIDS prevention program for men with intellectual disability (ID) was examined using a quasi-experimental within-subjects design. Thirty-seven men with mild to moderate intellectual disability evaluated the program. The pretest and posttest instruments assessed HIV/AIDS knowledge (high-risk fluids, HIV transmission, and condom facts) and condom application skills. All outcome measures showed statistically significant gains from pretest to posttest, with medium to large effect sizes. In addition, a second study was conducted with twelve service providers who work with men with ID. Service providers reviewed the HIV/AIDS prevention program, completed a demographics questionnaire, and a program satisfaction survey. Overall, service providers rated the program highly on several outcome measures (stimulation, relevance, and usability).
Measuring Financial Gains from Genetically Superior Trees
George Dutrow; Clark Row
1976-01-01
Planting genetically superior loblolly pines will probably yield high profits. Forest economists have made computer simulations that predict financial gains expected from a tree improvement program under actual field conditions.
ERIC Educational Resources Information Center
Mississippi Research and Curriculum Unit for Vocational and Technical Education, State College.
This document, which is intended for use by community and junior colleges throughout Mississippi, contains curriculum frameworks for two programs in the state's postsecondary-level computer information systems technology cluster: computer programming and network support. Presented in the introduction are program descriptions and suggested course…
Differences in Computed Individual-Tree Volumes Caused by Differences in Field Measurements
James A. Westfall
2008-01-01
Individual-tree volumes are primarily predicted using volume equations that rely on measured tree attributes. In the northeastern United States, the Forest Inventory and Analysis program determines tree volume using dbh, bole height, proportion of cull, and species information. These measurements are subject to variability due to a host of factors. The sensitivity of...
Human voice quality measurement in noisy environments.
Ueng, Shyh-Kuang; Luo, Cheng-Ming; Tsai, Tsung-Yu; Yeh, Hsuan-Chen
2015-01-01
Computerized acoustic voice measurement is essential for the diagnosis of vocal pathologies. Previous studies showed that ambient noises have significant influences on the accuracy of voice quality assessment. This paper presents a voice quality assessment system that can accurately measure qualities of voice signals even when the input voice data are contaminated by low-frequency noises. The ambient noises in our living rooms and laboratories are collected and the frequencies of these noises are analyzed. Based on the analysis, a filter is designed to reduce the noise level of the input voice signal. Then, improved numerical algorithms are employed to extract voice parameters from the voice signal to reveal the health of the voice signal. Compared with MDVP and Praat, the proposed method outperforms these two widely used programs in measuring fundamental frequency and harmonic-to-noise ratio, and its performance is comparable to both in computing jitter and shimmer. The proposed voice quality assessment method is resistant to low-frequency noises, and it can measure human voice quality in environments filled with noise from air-conditioners, ceiling fans, and the cooling fans of computers.
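One of the parameters mentioned, jitter, has a simple common definition that is easy to sketch. This is one of several variants used by tools such as Praat, and the period values below are made up:

```python
def jitter_percent(periods):
    """Local jitter (%): mean absolute difference between consecutive glottal
    periods, relative to the mean period."""
    diffs = [abs(a - b) for a, b in zip(periods, periods[1:])]
    return 100 * (sum(diffs) / len(diffs)) / (sum(periods) / len(periods))

# Periods (in ms) of a mildly irregular voice signal (illustrative values).
periods = [8.0, 8.1, 7.9, 8.05, 7.95, 8.0]
print(round(jitter_percent(periods), 3))   # ~1.5% cycle-to-cycle variation
```

Because the measure depends on detecting each glottal cycle boundary, low-frequency ambient noise that shifts those boundaries corrupts the estimate, which is why the filtering stage described above matters.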
Automated Routines for Calculating Whole-Stream Metabolism: Theoretical Background and User's Guide
Bales, Jerad D.; Nardi, Mark R.
2007-01-01
In order to standardize methods and facilitate rapid calculation and archival of stream-metabolism variables, the Stream Metabolism Program was developed to calculate gross primary production, net ecosystem production, respiration, and selected other variables from continuous measurements of dissolved-oxygen concentration, water temperature, and other user-supplied information. Methods for calculating metabolism from continuous measurements of dissolved-oxygen concentration and water temperature are fairly well known, but a standard set of procedures and computation software for all aspects of the calculations were not available previously. The Stream Metabolism Program addresses this deficiency with a stand-alone executable computer program written in Visual Basic .NET, which runs in the Microsoft Windows environment. All equations and assumptions used in the development of the software are documented in this report. Detailed guidance on application of the software is presented, along with a summary of the data required to use the software. Data from either a single station or paired (upstream, downstream) stations can be used with the software to calculate metabolism variables.
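The single-station mass-balance idea behind such calculations can be sketched compactly. A deliberately simplified fragment with hypothetical numbers; the Stream Metabolism Program's actual equations and assumptions are the ones documented in the report:

```python
# Single-station open-water sketch: net ecosystem production from a diel
# dissolved-oxygen record, NEP_t ~ dDO/dt - k*(DO_sat - DO).

def nep_series(do, do_sat, k, dt_hours):
    """Net ecosystem production (mg O2/L/h) at each interval of a DO time series,
    after removing an assumed first-order atmospheric reaeration term."""
    out = []
    for i in range(len(do) - 1):
        d_do = (do[i + 1] - do[i]) / dt_hours        # observed rate of DO change
        reaeration = k * (do_sat - do[i])            # atmospheric exchange term
        out.append(d_do - reaeration)
    return out

do = [7.8, 7.6, 7.5, 7.9, 8.6, 9.1, 8.8, 8.2]   # hourly DO, night into day (mg/L)
print([round(v, 3) for v in nep_series(do, do_sat=8.5, k=0.1, dt_hours=1.0)])
```

Negative values during the dark hours reflect respiration; positive daytime values reflect gross primary production in excess of respiration, which is how the diel curve is partitioned into the reported metabolism variables.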
Space shuttle propulsion estimation development verification, volume 1
NASA Technical Reports Server (NTRS)
Rogers, Robert M.
1989-01-01
The results of the Propulsion Estimation Development Verification are summarized. A computer program developed under a previous contract (NAS8-35324) was modified to include improved models for the Solid Rocket Booster (SRB) internal ballistics, the Space Shuttle Main Engine (SSME) power coefficient model, the vehicle dynamics using quaternions, and an improved Kalman filter algorithm based on the U-D factorized algorithm. As additional output, the estimated propulsion performance for each device is computed with the associated 1-sigma bounds. The outputs of the estimation program are provided as graphical plots. An additional effort was expended to examine the use of the estimation approach to evaluate single-engine test data. In addition to the propulsion estimation program PFILTER, a program was developed to produce a best estimate of trajectory (BET). This program, LFILTER, also uses the U-D factorized form of the Kalman filter, as does PFILTER. The necessary definitions and equations explaining the Kalman filtering approach for the PFILTER program, the models used in this application for dynamics and measurements, the program description, and program operation are presented.
The benchmark aeroelastic models program: Description and highlights of initial results
NASA Technical Reports Server (NTRS)
Bennett, Robert M.; Eckstrom, Clinton V.; Rivera, Jose A., Jr.; Dansberry, Bryan E.; Farmer, Moses G.; Durham, Michael H.
1991-01-01
An experimental effort in aeroelasticity, called the Benchmark Models Program, was implemented. The primary purpose of this program is to provide the necessary data to evaluate computational fluid dynamics codes for aeroelastic analysis. It also focuses on increasing the understanding of the physics of unsteady flows and on providing data for empirical design. An overview is given of this program, and some results obtained in the initial tests are highlighted. The completed tests include measurement of unsteady pressures during flutter of a rigid wing with a NACA 0012 airfoil section and dynamic response measurements of a flexible rectangular wing with a thick circular-arc airfoil undergoing shock boundary-layer oscillations.
Li, Xiang; Samei, Ehsan; Segars, W. Paul; Sturgeon, Gregory M.; Colsher, James G.; Toncheva, Greta; Yoshizumi, Terry T.; Frush, Donald P.
2011-01-01
Purpose: Radiation-dose awareness and optimization in CT can greatly benefit from a dose-reporting system that provides dose and risk estimates specific to each patient and each CT examination. As the first step toward patient-specific dose and risk estimation, this article aimed to develop a method for accurately assessing radiation dose from CT examinations. Methods: A Monte Carlo program was developed to model a CT system (LightSpeed VCT, GE Healthcare). The geometry of the system, the energy spectra of the x-ray source, the three-dimensional geometry of the bowtie filters, and the trajectories of source motions during axial and helical scans were explicitly modeled. To validate the accuracy of the program, a cylindrical phantom was built to enable dose measurements at seven different radial distances from its central axis. Simulated radial dose distributions in the cylindrical phantom were validated against ion chamber measurements for single axial scans at all combinations of tube potential and bowtie filter settings. The accuracy of the program was further validated using two anthropomorphic phantoms (a pediatric one-year-old phantom and an adult female phantom). Computer models of the two phantoms were created based on their CT data and were voxelized for input into the Monte Carlo program. Simulated dose at various organ locations was compared against measurements made with thermoluminescent dosimetry chips for both single axial and helical scans. Results: For the cylindrical phantom, simulations differed from measurements by −4.8% to 2.2%. For the two anthropomorphic phantoms, the discrepancies between simulations and measurements ranged between (−8.1%, 8.1%) and (−17.2%, 13.0%) for the single axial scans and the helical scans, respectively. Conclusions: The authors developed an accurate Monte Carlo program for assessing radiation dose from CT examinations. 
When combined with computer models of actual patients, the program can provide accurate dose estimates for specific patients. PMID:21361208
Post-Flight Estimation of Motion of Space Structures: Part 1
NASA Technical Reports Server (NTRS)
Brugarolas, Paul; Breckenridge, William
2008-01-01
A computer program estimates the relative positions and orientations of two space structures from data on the angular positions and distances of fiducial objects on one structure as measured by a target tracking electronic camera and laser range finders on another structure. The program is written specifically for determining the relative alignments of two antennas, connected by a long truss, deployed in outer space from a space shuttle. The program is based partly on transformations among the various coordinate systems involved in the measurements and on a nonlinear mathematical model of vibrations of the truss. The program implements a Kalman filter that blends the measurement data with data from the model. Using time series of measurement data from the tracking camera and range finders, the program generates time series of data on the relative position and orientation of the antennas. A similar program described in a prior NASA Tech Briefs article was used onboard for monitoring the structures during flight. The present program is more precise and designed for use on Earth in post-flight processing of the measurement data to enable correction, for antenna motions, of scientific data acquired by use of the antennas.
NASA Astrophysics Data System (ADS)
Lin, Chern-Sheng; Ho, Chien-Wa; Chang, Kai-Chieh; Hung, San-Shan; Shei, Hung-Jung; Yeh, Mau-Shiun
2006-06-01
This study describes the design and combination of an eye-controlled and a head-controlled human-machine interface system. This system is a highly effective human-machine interface, detecting head movement by changing positions and numbers of light sources on the head. When the users utilize the head-mounted display to browse a computer screen, the system will catch the images of the user's eyes with CCD cameras, which can also measure the angle and position of the light sources. In the eye-tracking system, the program in the computer will locate each center point of the pupils in the images, and record the information on moving traces and pupil diameters. In the head gesture measurement system, the user wears a double-source eyeglass frame, so the system catches images of the user's head by using a CCD camera in front of the user. The computer program will locate the center point of the head, transferring it to the screen coordinates, and then the user can control the cursor by head motions. We combine the eye-controlled and head-controlled human-machine interface system for the virtual reality applications.
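The pupil-locating step described above can be sketched in a minimal way; the paper does not give its exact algorithm, so the thresholded-centroid approach below is an assumption (one common technique), and the function name and toy frame are illustrative only.

```python
def pupil_center(image, threshold=50):
    """image: 2D list of grayscale values (0 = black).
    Returns the (row, col) centroid of all pixels darker than
    threshold, or None if no pixel qualifies."""
    rows = 0
    cols = 0
    n = 0
    for r, line in enumerate(image):
        for c, v in enumerate(line):
            if v < threshold:
                rows += r
                cols += c
                n += 1
    if n == 0:
        return None
    return rows / n, cols / n

# A toy 5x5 frame: bright background with a dark 2x2 "pupil".
frame = [[200] * 5 for _ in range(5)]
for r in (2, 3):
    for c in (2, 3):
        frame[r][c] = 10
center = pupil_center(frame)
```

A real tracker would also fit the pupil boundary (e.g. an ellipse) to estimate diameter, which the study records alongside the moving traces.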
NASA Technical Reports Server (NTRS)
Hrach, F. J.; Arpasi, D. J.; Bruton, W. M.
1975-01-01
A self-learning, sensor fail-operational, control system for the TF30-P-3 afterburning turbofan engine was designed and evaluated. The sensor fail-operational control system includes a digital computer program designed to operate in conjunction with the standard TF30-P-3 bill-of-materials control. Four engine measurements and two compressor face measurements are tested. If any engine measurements are found to have failed, they are replaced by values synthesized from computer-stored information. The control system was evaluated by using a realtime, nonlinear, hybrid computer engine simulation at sea level static condition, at a typical cruise condition, and at several extreme flight conditions. Results indicate that the addition of such a system can improve the reliability of an engine digital control system.
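A minimal sketch of the fail-operational idea follows, assuming a simple range check and substitution of a value synthesized from stored good readings; the actual TF30-P-3 logic synthesizes values from computer-stored engine information and is more elaborate, and the names and limits below are hypothetical.

```python
def screen(reading, good_history, lo, hi):
    """Range-check one sensor reading.
    Returns (value_to_use, failed_flag); a failed reading is replaced
    by a value synthesized from stored good readings (here, their mean).
    """
    if lo <= reading <= hi:
        good_history.append(reading)
        return reading, False
    # Sensor declared failed: substitute a synthesized value.
    synthesized = sum(good_history) / len(good_history)
    return synthesized, True

# Recent good turbine-temperature readings (illustrative units).
history = [1480.0, 1500.0, 1520.0]
# An obviously out-of-range reading triggers substitution.
value, failed = screen(-40.0, history, lo=800.0, hi=2000.0)
```

In the engine application the synthesized value would feed the control law in place of the failed measurement, keeping the control operational.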
ERIC Educational Resources Information Center
Ehrich, Roger W.; McCreary, Faith; Reaux, Ray; Rowland, Keith; Ramsey, Amy
The U.S. Department of Education is supporting a 3-year program involving Virginia Tech's computer department and a rural public elementary school. The project seeks to determine whether immersive access to networked computing by students and their families has measurable effects on long-term student achievement. A fifth-grade classroom was…
ERIC Educational Resources Information Center
Cady, Donna; Terrell, Steven R.
2008-01-01
Females are underrepresented in technology-related careers and educational programs; many researchers suggest this can be traced back to negative feelings of computer self-efficacy developed as early as the age of 10. This study investigated the effect of embedding technology into a 5th grade science classroom and measuring its effect on…
NASA Astrophysics Data System (ADS)
Israel, Maya; Wherfel, Quentin M.; Shehab, Saadeddine; Ramos, Evan A.; Metzger, Adam; Reese, George C.
2016-07-01
This paper describes the development, validation, and uses of the Collaborative Computing Observation Instrument (C-COI), a web-based analysis instrument that classifies individual and/or collaborative behaviors of students during computing problem-solving (e.g. coding, programming). The C-COI analyzes data gathered through video and audio screen recording software that captures students' computer screens as they program, and their conversations with their peers or adults. The instrument allows researchers to organize and quantify these data to track behavioral patterns that could be further analyzed for deeper understanding of persistence and/or collaborative interactions. The article provides a rationale for the C-COI including the development of a theoretical framework for measuring collaborative interactions in computer-mediated environments. This theoretical framework relied on the computer-supported collaborative learning literature related to adaptive help seeking, the joint problem-solving space in which collaborative computing occurs, and conversations related to outcomes and products of computational activities. Instrument development and validation also included ongoing advisory board feedback from experts in computer science, collaborative learning, and K-12 computing as well as classroom observations to test out the constructs in the C-COI. These processes resulted in an instrument with rigorous validation procedures and a high inter-rater reliability.
Data acquisition, processing and firing aid software for multichannel EMP simulation
NASA Astrophysics Data System (ADS)
Eumurian, Gregoire; Arbaud, Bruno
1986-08-01
Electromagnetic compatibility testing yields a large quantity of data for systematic analysis. An automated data acquisition system has been developed. It is based on standard EMP instrumentation which allows a pre-established program to be followed whilst orientating the measurements according to the results obtained. The system is controlled by a computer running interactive programs (multitask windows, scrollable menus, mouse, etc.) which handle the measurement channels, files, displays and process data in addition to providing an aid to firing.
NASA Technical Reports Server (NTRS)
Homan, D. J.
1977-01-01
A computer program written to calculate the proximity aerodynamic force and moment coefficients of the Orbiter/Shuttle Carrier Aircraft (SCA) vehicles based on flight instrumentation is described. The ground reduced aerodynamic coefficients and instrumentation errors (GRACIE) program was developed as a tool to aid in flight test verification of the Orbiter/SCA separation aerodynamic data base. The program calculates the force and moment coefficients of each vehicle in proximity to the other, using the load measurement system data, flight instrumentation data and the vehicle mass properties. The uncertainty in each coefficient is determined, based on the quoted instrumentation accuracies. A subroutine manipulates the Orbiter/747 Carrier Separation Aerodynamic Data Book to calculate a comparable set of predicted coefficients for comparison to the calculated flight test data.
ERIC Educational Resources Information Center
Neeson, John F.; Austin, Stephen
1975-01-01
Describes a method for the measurement of the velocity of sound in various liquids based on the Raman-Nath theory of light-sound interaction. Utilizes an analog computer program to calculate the intensity of light scattered into various diffraction orders. (CP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Touzani, Samir; Taylor, Cody
Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today's methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The rising availability of "smart" meters and devices that report near-real-time data, combined with new analytical approaches to quantifying savings, offers the potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer M&V capabilities, and program administrators want to understand how they might help programs quickly and accurately measure energy savings. This paper presents the results of recent testing of the ability to use automation to streamline some parts of M&V. We detail metrics to assess the performance of these new M&V approaches and a framework to compute the metrics. We also discuss the accuracy, cost, and time trade-offs between more traditional M&V and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. Finally, we discuss the potential evolution of M&V and early results of pilots currently underway to incorporate M&V automation into ratepayer-funded programs and professional implementation and evaluation practice.
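Two metrics widely used to grade such automated baseline models, and plausible candidates for the framework described (assumed here as representative, e.g. in the spirit of ASHRAE Guideline 14, not taken from the paper), are the coefficient of variation of the root-mean-square error, CV(RMSE), and the normalized mean bias error, NMBE:

```python
def cvrmse(actual, predicted):
    """CV(RMSE) in percent: RMSE of the model normalized by the mean load."""
    n = len(actual)
    mean = sum(actual) / n
    rmse = (sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n) ** 0.5
    return 100.0 * rmse / mean

def nmbe(actual, predicted):
    """NMBE in percent: net bias of the model normalized by the mean load."""
    n = len(actual)
    mean = sum(actual) / n
    return 100.0 * sum(a - p for a, p in zip(actual, predicted)) / (n * mean)

# Toy daily energy use (kWh): metered vs. model-predicted baseline.
kwh_actual    = [100.0, 110.0, 95.0, 105.0]
kwh_predicted = [ 98.0, 112.0, 97.0, 103.0]
```

CV(RMSE) captures scatter while NMBE captures systematic over- or under-prediction; a model can have zero bias yet still be too noisy to verify savings, which is why both are reported.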
Corona performance of a compact 230-kV line
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chartier, V.L.; Blair, D.E.; Easley, M.D.
Permitting requirements and the acquisition of new rights-of-way for transmission facilities have in recent years become increasingly difficult for most utilities, including Puget Sound Power and Light Company. In order to maintain a high degree of reliability of service while being responsive to public concerns regarding the siting of high-voltage (HV) transmission facilities, Puget Power has found it necessary to rely more heavily upon the use of compact lines in franchise corridors. Compaction does, however, precipitate increased levels of audible noise (AN) and radio and TV interference (RI and TVI) due to corona on the conductors and insulator assemblies. Puget Power relies upon the Bonneville Power Administration (BPA) Corona and Field Effects computer program to calculate AN and RI for new lines. Since there was some question of the program's ability to accurately represent quiet 230-kV compact designs, a joint project was undertaken with BPA to verify the program's algorithms. Long-term measurements made on an operating Puget Power 230-kV compact line confirmed the accuracy of BPA's AN model; however, the RI measurements were much lower than predicted by the BPA and other programs. This paper also describes how the BPA computer program can be used to calculate the voltage needed to expose insulator assemblies to the correct electric field in single test setups in HV laboratories.
Processing data from soil assessment surveys with the computer program SOILS.
John W. Hazard; Jeralyn Snellgrove; J. Michael Geist
1985-01-01
Program SOILS processes data from soil assessment surveys following a design adopted by the Pacific Northwest Region of the USDA Forest Service. It accepts measurements from line transects and associated soil subsamples and generates estimates of the percentages of the sampled area falling in each soil condition class. Total disturbance is calculated by combining...
ERIC Educational Resources Information Center
Griffith, Jennifer M.; Sorenson, James R.; Bowling, J. Michael; Jennings-Grant, Tracey
2005-01-01
The Enhancing Patient Prenatal Education study tested the feasibility and educational impact of an interactive program for patient prenatal genetic screening and testing education. Patients at two private practices and one public health clinic participated (N = 207). The program collected knowledge and measures of anxiety before and after use of…
Mikkelsen, Sigurd; Vilstrup, Imogen; Lassen, Christina Funch; Kryger, Ann Isabel; Thomsen, Jane Frølund; Andersen, Johan Hviid
2007-01-01
Objective To examine the validity and potential biases in self‐reports of computer, mouse and keyboard usage times, compared with objective recordings. Methods A study population of 1211 people was asked in a questionnaire to estimate the average time they had worked with computer, mouse and keyboard during the past four working weeks. During the same period, a software program recorded these activities objectively. The study was part of a one‐year follow‐up study from 2000–1 of musculoskeletal outcomes among Danish computer workers. Results Self‐reports on computer, mouse and keyboard usage times were positively associated with objectively measured activity, but the validity was low. Self‐reports explained only between a quarter and a third of the variance of objectively measured activity, and were even lower for one measure (keyboard time). Self‐reports overestimated usage times. Overestimation was large at low levels and declined with increasing levels of objectively measured activity. Mouse usage time proportion was an exception with a near 1:1 relation. Variability in objectively measured activity, arm pain, gender and age influenced self‐reports in a systematic way, but the effects were modest and sometimes in different directions. Conclusion Self‐reported durations of computer activities are positively associated with objective measures but they are quite inaccurate. Studies using self‐reports to establish relations between computer work times and musculoskeletal pain could be biased and lead to falsely increased or decreased risk estimates. PMID:17387136
Mikkelsen, Sigurd; Vilstrup, Imogen; Lassen, Christina Funch; Kryger, Ann Isabel; Thomsen, Jane Frølund; Andersen, Johan Hviid
2007-08-01
To examine the validity and potential biases in self-reports of computer, mouse and keyboard usage times, compared with objective recordings. A study population of 1211 people was asked in a questionnaire to estimate the average time they had worked with computer, mouse and keyboard during the past four working weeks. During the same period, a software program recorded these activities objectively. The study was part of a one-year follow-up study from 2000-1 of musculoskeletal outcomes among Danish computer workers. Self-reports on computer, mouse and keyboard usage times were positively associated with objectively measured activity, but the validity was low. Self-reports explained only between a quarter and a third of the variance of objectively measured activity, and were even lower for one measure (keyboard time). Self-reports overestimated usage times. Overestimation was large at low levels and declined with increasing levels of objectively measured activity. Mouse usage time proportion was an exception with a near 1:1 relation. Variability in objectively measured activity, arm pain, gender and age influenced self-reports in a systematic way, but the effects were modest and sometimes in different directions. Self-reported durations of computer activities are positively associated with objective measures but they are quite inaccurate. Studies using self-reports to establish relations between computer work times and musculoskeletal pain could be biased and lead to falsely increased or decreased risk estimates.
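The "variance explained" figure quoted in the abstract is the squared correlation between self-reported and objectively recorded times; a minimal computation with toy numbers (not the study's data) looks like this:

```python
def r_squared(x, y):
    """Squared Pearson correlation: fraction of variance in y
    explained by a linear relation with x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

# Toy weekly computer-use times (hours): objective recording vs. questionnaire.
recorded    = [5.0, 10.0, 15.0, 20.0, 25.0]
self_report = [14.0, 12.0, 20.0, 19.0, 30.0]
r2 = r_squared(recorded, self_report)
```

In the study this quantity came out between roughly 0.25 and 0.33, which is why the authors judge self-reports to be positively associated with, but poor substitutes for, objective measures.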
More-Realistic Digital Modeling of a Human Body
NASA Technical Reports Server (NTRS)
Rogge, Renee
2010-01-01
A MATLAB computer program has been written to enable improved modeling of a human body (relative to an older program) for purposes of designing space suits and other hardware with which an astronaut must interact. The older program implements a kinematic model based on traditional anthropometric measurements that do provide important volume and surface information. The present program generates a three-dimensional (3D) whole-body model from 3D body-scan data. The program utilizes thin-plate spline theory to reposition the model without need for additional scans.
EMI Measurement and Mitigation Testing for the ARPA Hybrid Electric Vehicle Program
1996-08-27
Communication range is reduced, computers malfunction, or monitoring systems fail. Various electric vehicles (EVs) were measured to evaluate their potential EMI emissions when used in today's hostile commercial electromagnetic environment.
NASA Tech Briefs, February 1997. Volume 2, No. 2
NASA Technical Reports Server (NTRS)
1997-01-01
Topics include: Test and Measurement; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports
NASA Tech Briefs, February 1994. Volume 18, No. 2
NASA Technical Reports Server (NTRS)
1994-01-01
Topics covered include: Test and Measurement; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery; Fabrication Technology; Mathematics and Information Sciences; Life Sciences; Books and Reports
Computer-Guided Deep Brain Stimulation Programming for Parkinson's Disease.
Heldman, Dustin A; Pulliam, Christopher L; Urrea Mendoza, Enrique; Gartner, Maureen; Giuffrida, Joseph P; Montgomery, Erwin B; Espay, Alberto J; Revilla, Fredy J
2016-02-01
Pilot study to evaluate computer-guided deep brain stimulation (DBS) programming designed to optimize stimulation settings using objective motion sensor-based motor assessments. Seven subjects (five males; 54-71 years) with Parkinson's disease (PD) and recently implanted DBS systems participated in this pilot study. Within two months of lead implantation, each subject returned to the clinic to undergo computer-guided programming and parameter selection. A motion sensor was placed on the index finger of the more affected hand. Software guided a monopolar survey during which monopolar stimulation on each contact was iteratively increased followed by an automated assessment of tremor and bradykinesia. After completing assessments at each setting, a software algorithm determined stimulation settings designed to minimize symptom severities, side effects, and battery usage. Optimal DBS settings were chosen based on average severity of motor symptoms measured by the motion sensor. Settings chosen by the software algorithm identified a therapeutic window and improved tremor and bradykinesia by an average of 35.7% compared with baseline in the "off" state (p < 0.01). Motion sensor-based computer-guided DBS programming identified stimulation parameters that significantly improved tremor and bradykinesia with minimal clinician involvement. Automated motion sensor-based mapping is worthy of further investigation and may one day serve to extend programming to populations without access to specialized DBS centers. © 2015 International Neuromodulation Society.
Cognitive training in Parkinson disease: cognition-specific vs nonspecific computer training.
Zimmermann, Ronan; Gschwandtner, Ute; Benz, Nina; Hatz, Florian; Schindler, Christian; Taub, Ethan; Fuhr, Peter
2014-04-08
In this study, we compared a cognition-specific computer-based cognitive training program with a motion-controlled computer sports game that is not cognition-specific for their ability to enhance cognitive performance in various cognitive domains in patients with Parkinson disease (PD). Patients with PD were trained with either a computer program designed to enhance cognition (CogniPlus, 19 patients) or a computer sports game with motion-capturing controllers (Nintendo Wii, 20 patients). The effect of training in 5 cognitive domains was measured by neuropsychological testing at baseline and after training. Group differences over all variables were assessed with multivariate analysis of variance, and group differences in single variables were assessed with 95% confidence intervals of mean difference. The groups were similar regarding age, sex, and educational level. Patients with PD who were trained with Wii for 4 weeks performed better in attention (95% confidence interval: -1.49 to -0.11) than patients trained with CogniPlus. In our study, patients with PD derived at least the same degree of cognitive benefit from non-cognition-specific training involving movement as from cognition-specific computerized training. For patients with PD, game consoles may be a less expensive and more entertaining alternative to computer programs specifically designed for cognitive training. This study provides Class III evidence that, in patients with PD, cognition-specific computer-based training is not superior to a motion-controlled computer game in improving cognitive performance.
Clinical application of a light-pen computer system for quantitative angiography
NASA Technical Reports Server (NTRS)
Alderman, E. L.
1975-01-01
The important features in a clinical system for quantitative angiography were examined. The human interface for data input, whether an electrostatic pen, sonic pen, or light-pen must be engineered to optimize the quality of margin definition. The computer programs which the technician uses for data entry and computation of ventriculographic measurements must be convenient to use on a routine basis in a laboratory performing multiple studies per day. The method used for magnification correction must be continuously monitored.
Queueing Network Models for Parallel Processing of Task Systems: an Operational Approach
NASA Technical Reports Server (NTRS)
Mak, Victor W. K.
1986-01-01
Computer performance modeling of possibly complex computations running on highly concurrent systems is considered. Earlier works in this area either dealt with a very simple program structure or resulted in methods with exponential complexity. An efficient procedure is developed to compute the performance measures for series-parallel-reducible task systems using queueing network models. The procedure is based on the concept of hierarchical decomposition and a new operational approach. Numerical results for three test cases are presented and compared to those of simulations.
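The operational approach mentioned above builds on measurement-based identities rather than stochastic assumptions. The two most basic identities, sketched here with toy observed quantities (illustrative only, not the paper's decomposition procedure), are the utilization law U = X·S and Little's law N = X·R:

```python
def utilization(completions, busy_time, observation_time):
    """Utilization law: U = X * S, from directly observable counts."""
    X = completions / observation_time   # throughput (jobs per second)
    S = busy_time / completions          # mean service demand per job (s)
    return X * S

def mean_queue_length(throughput, mean_response_time):
    """Little's law: mean number in system N = X * R."""
    return throughput * mean_response_time

# Toy observation window: 500 jobs complete in 100 s; the device is
# busy for 40 of those seconds.
U = utilization(completions=500, busy_time=40.0, observation_time=100.0)
# With throughput 5 jobs/s and mean response time 0.4 s per job:
N = mean_queue_length(throughput=5.0, mean_response_time=0.4)
```

Operational laws like these hold for any single observation interval, which is what makes them attractive building blocks for queueing-network performance models of measured systems.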
Horizon sensor errors calculated by computer models compared with errors measured in orbit
NASA Technical Reports Server (NTRS)
Ward, K. A.; Hogan, R.; Andary, J.
1982-01-01
Using a computer program to model the earth's horizon and to duplicate the signal processing procedure employed by the ESA (Earth Sensor Assembly), errors due to radiance variation have been computed for a particular time of the year. Errors actually occurring in flight at the same time of year are inferred from integrated rate gyro data for a satellite of the TIROS series of NASA weather satellites (NOAA-A). The predicted performance is compared with actual flight history.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-05
ACTION: Notice of Computer Matching Program (CMP). SUMMARY: In accordance with the requirements of the... Privacy Act of 1974: CMS Computer Matching Program Match No. 2013-01; HHS Computer Matching Program Match No. 1312... Description of the Matching Program. A. General. The Computer Matching and Privacy Protection Act of 1988 (Pub...
NASA Astrophysics Data System (ADS)
McCubbine, Jack; Tontini, Fabio Caratori; Stagpoole, Vaughan; Smith, Euan; O'Brien, Grant
2018-01-01
A Python program (Gsolve) with a graphical user interface has been developed to assist with routine data processing of relative gravity measurements. Gsolve calculates the gravity at each measurement site of a relative gravity survey, which is referenced to at least one known gravity value. The tidal effects of the sun and moon, gravimeter drift and tares in the data are all accounted for during the processing of the survey measurements. The calculation is based on a least squares formulation where the difference between the absolute gravity at each surveyed location and parameters relating to the dynamics of the gravimeter are minimized with respect to the relative gravity observations, and some supplied gravity reference site values. The program additionally allows the user to compute free air gravity anomalies, with respect to the GRS80 and GRS67 reference ellipsoids, from the determined gravity values and calculate terrain corrections at each of the surveyed sites using a prism formula and a user supplied digital elevation model. This paper reviews the mathematical framework used to reduce relative gravimeter survey observations to gravity values. It then goes on to detail how the processing steps can be implemented using the software.
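The core of the reduction Gsolve performs can be illustrated with the simplest possible case, a single A-B-A loop with a linear meter drift; the real program solves a full least-squares system over all sites, drift, and tares, and the values below are invented for illustration.

```python
def gravity_from_loop(g_A, rA0, t0, rB, t1, rA2, t2):
    """Drift-corrected gravity at site B from an A-B-A loop.
    Base station A is read at times t0 and t2, site B at t1.
    Times in hours; readings and gravity in mGal."""
    drift = (rA2 - rA0) / (t2 - t0)        # linear meter drift (mGal/h)
    dg = (rB - rA0) - drift * (t1 - t0)    # drift-corrected tie A -> B
    return g_A + dg

# Known gravity at base A plus one loop of relative readings.
g_B = gravity_from_loop(g_A=981000.000,
                        rA0=3520.120, t0=0.0,
                        rB=3532.450,  t1=1.0,
                        rA2=3520.160, t2=2.0)
```

With many overlapping loops the same idea becomes an overdetermined linear system, which is why Gsolve minimizes the misfit in a least-squares sense rather than solving loop by loop.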
NASA Technical Reports Server (NTRS)
Otterman, J.; Fraser, R. S.
1976-01-01
Programs for computing atmospheric transmission and scattering of solar radiation were used to compute the ratios of the Earth-atmosphere system (space) directional reflectivities in the vertical direction to the surface reflectivity, for the four bands of the LANDSAT multispectral scanner (MSS). These ratios are presented as graphs for two water vapor levels, as a function of the surface reflectivity, for various sun elevation angles. Space directional reflectivities in the vertical direction are reported for selected arid regions in Asia, Africa and Central America from the spectral radiance levels measured by the LANDSAT MSS. From these space reflectivities, surface vertical reflectivities were computed applying the pertinent graphs. These surface reflectivities were used to estimate the surface albedo for the entire solar spectrum. The estimated albedos are in the range 0.34-0.52, higher than the values reported by most previous researchers from space measurements, but are consistent with laboratory measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laros, James H.; Grant, Ryan; Levenhagen, Michael J.
Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.
Vacuum ultraviolet line radiation measurements of a shock-heated nitrogen plasma
NASA Technical Reports Server (NTRS)
Mcclenahan, J. O.
1972-01-01
Line radiation, in the wavelength region from 1040 to 2500 A from nitrogen plasmas, was measured at conditions typical of those produced in the shock layer in front of vehicles entering the earth's atmosphere at superorbital velocities. The radiation was also predicted with a typical radiation transport computer program to determine whether such calculations adequately model plasmas for the conditions tested. The results of the comparison show that the radiant intensities of the lines between 1040 and 1700 A are actually lower than are predicted by such computer models.
Frequency-Domain Identification Of Aeroelastic Modes
NASA Technical Reports Server (NTRS)
Acree, C. W., Jr.; Tischler, Mark B.
1991-01-01
Report describes flight measurements and frequency-domain analyses of aeroelastic vibrational modes of wings of XV-15 tilt-rotor aircraft. Begins with description of flight-test methods, followed by brief discussion of methods of analysis, which include Fourier-transform computations using chirp z-transforms, use of coherence and other spectral functions, and methods and computer programs to obtain frequencies and damping coefficients from measurements. Includes brief description of results of flight tests and comparisons among various experimental and theoretical results. Ends with section on conclusions and recommended improvements in techniques.
Improved coordinates of features in the vicinity of the Viking lander site on Mars
NASA Technical Reports Server (NTRS)
Davies, M. E.; Dole, S. H.
1980-01-01
The measurement of longitude of the Viking 1 landing site and the accuracy of the coordinates of features in the area around the landing site are discussed. The longitude must be measured photogrammetrically from the small crater, Airy 0, which defines the 0 deg meridian on Mars. The computer program, GIANT, which was used to perform the analytical triangulations, and the photogrammetric computation of the longitude of the Viking 1 lander site are described. Improved coordinates of features in the vicinity of the Viking 1 lander site are presented.
Handheld computer use in U.S. family practice residency programs.
Criswell, Dan F; Parchman, Michael L
2002-01-01
The purpose of the study was to evaluate the uses of handheld computers (also called personal digital assistants, or PDAs) in family practice residency programs in the United States. In November 2000, the authors mailed a questionnaire to the program directors of all American Academy of Family Physicians (AAFP) and American College of Osteopathic Family Practice (ACOFP) residency programs in the United States. Data and patterns of the use and non-use of handheld computers were identified. Approximately 50 percent (306 of 610) of the programs responded to the survey. Two thirds of the programs reported that handheld computers were used in their residencies, and an additional 14 percent had plans for implementation within 24 months. Both the Palm and the Windows CE operating systems were used, with the Palm operating system the most common. Military programs had the highest rate of use (8 of 10 programs, 80 percent), and osteopathic programs had the lowest (23 of 55 programs, 42 percent). Of programs that reported handheld computer use, 45 percent had required handheld computer applications that are used uniformly by all users. Funding for handheld computers and related applications was non-budgeted in 76 percent of the programs in which handheld computers were used. In programs providing a budget for handheld computers, the average annual budget per user was $461.58. Interested faculty or residents, rather than computer information services personnel, performed upkeep and maintenance of handheld computers in 72 percent of the programs in which the computers are used. In addition to the installed calendar, memo pad, and address book, the most common clinical uses of handheld computers in the programs were as medication reference tools, electronic textbooks, and clinical computational or calculator-type programs. Handheld computers are widely used in family practice residency programs in the United States.
Although handheld computers were designed as electronic organizers, in family practice residencies they are used as medication reference tools, electronic textbooks, and clinical computational programs and to track activities that were previously associated with desktop database applications.
The RANDOM computer program: A linear congruential random number generator
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1986-01-01
The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters for an LCG on a microcomputer is described. This document describes the following: (1) The RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) The RANCYCLE and the ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
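The linear congruential form the report discusses can be sketched briefly; this is an illustrative Python reimplementation, not code from the FORTRAN program, and the Park-Miller parameters shown (a = 16807, m = 2^31 - 1) are a well-known example, not necessarily those selected in the report:

```python
from itertools import islice

def lcg(seed, a=16807, c=0, m=2**31 - 1):
    """Yield the sequence x_{n+1} = (a * x_n + c) mod m."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

# First few values of the generator, starting from seed 1.
sample = list(islice(lcg(seed=1), 5))
```

Parameter selection matters because poor choices of a, c, and m produce short cycles or correlated output, which is exactly what the RANCYCLE and ARITH companion programs help evaluate.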
Devane, P A; Horne, J G; Foley, G; Stanley, J
2017-10-01
This paper describes the methodology, validation and reliability of a new computer-assisted method which uses models of the patient's bones and the components to measure their migration and polyethylene wear from radiographs after total hip arthroplasty (THA). Models of the acetabular and femoral components, obtained from the manufacturer, and models of the patient's pelvis and femur, built from a single computed tomography (CT) scan, are used by a computer program to measure the migration of the components and the penetration of the femoral head from anteroposterior and lateral radiographs taken at follow-up visits. The program simulates the radiographic setup and matches the position and orientation of the models to outlines of the pelvis, the acetabular and femoral components, and the femur on radiographs. Changes in position and orientation reflect the migration of the components and the penetration of the femoral head. Validation was performed using radiographs of phantoms simulating known migration and penetration, and the clinical feasibility of measuring migration was assessed in two patients. Migration of the acetabular and femoral components can be measured with limits of agreement (LOA) of 0.37 mm and 0.33 mm, respectively. Penetration of the femoral head can be measured with LOA of 0.161 mm. The migration of components and polyethylene wear can be measured without needing specialised radiographs. Accurate measurement may allow earlier prediction of failure after THA. Cite this article: Bone Joint J 2017;99-B:1290-7. ©2017 The British Editorial Society of Bone & Joint Surgery.
A PC program for estimating measurement uncertainty for aeronautics test instrumentation
NASA Technical Reports Server (NTRS)
Blumenthal, Philip Z.
1995-01-01
A personal computer program was developed which provides aeronautics and operations engineers at Lewis Research Center with a uniform method to quickly provide values for the uncertainty in test measurements and research results. The software package used for performing the calculations is Mathcad 4.0, a Windows version of a program which provides an interactive user interface for entering values directly into equations with immediate display of results. The error contribution from each component of the system is identified individually in terms of the parameter measured. The final result is given in common units, SI units, and percent of full scale range. The program also lists the specifications for all instrumentation and calibration equipment used for the analysis. It provides a presentation-quality printed output which can be used directly for reports and documents.
DIALOG: An executive computer program for linking independent programs
NASA Technical Reports Server (NTRS)
Glatt, C. R.; Hague, D. S.; Watson, D. A.
1973-01-01
A very large scale computer programming procedure called the DIALOG executive system was developed for the CDC 6000 series computers. The executive computer program, DIALOG, controls the sequence of execution and the data management functions for a library of independent computer programs. Communication of common information is accomplished by DIALOG through a dynamically constructed and maintained data base of common information. Each computer program maintains its individual identity and is unaware of its contribution to the large scale program. This feature makes any computer program a candidate for use with the DIALOG executive system. The installation and uses of the DIALOG executive system are described.
Empirical flow parameters - a tool for hydraulic model validity assessment : [summary].
DOT National Transportation Integrated Search
2013-10-01
Hydraulic modeling assembles models based on generalizations of parameter values from textbooks, professional literature, computer program documentation, and engineering experience. Actual measurements adjacent to the model location are seldom availa...
NASA Tech Briefs, April 2000. Volume 24, No. 4
NASA Technical Reports Server (NTRS)
2000-01-01
Topics covered include: Imaging/Video/Display Technology; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Bio-Medical; Test and Measurement; Mathematics and Information Sciences; Books and Reports.
ERIC Educational Resources Information Center
Physics Education, 1982
1982-01-01
Describes: (1) an apparatus which provides a simple method for measuring Stefan's constant; (2) a simple phase shifting circuit; (3) a radioactive decay computer program (for ZX81); and (4) phase difference between transformer voltages. (Author/JN)
Research on computer systems benchmarking
NASA Technical Reports Server (NTRS)
Smith, Alan Jay (Principal Investigator)
1996-01-01
This grant addresses the topic of research on computer systems benchmarking and is more generally concerned with performance issues in computer systems. This report reviews work in those areas during the period of NASA support under this grant. The bulk of the work performed concerned benchmarking and analysis of CPUs, compilers, caches, and benchmark programs. The first part of this work concerned the issue of benchmark performance prediction. A new approach to benchmarking and machine characterization was reported, using a machine characterizer that measures the performance of a given system in terms of a Fortran abstract machine. Another report focused on analyzing compiler performance, examining the performance impact of optimization within the same abstract-machine methodology for CPU performance characterization. Benchmark programs are analyzed in another paper. A machine-independent model of program execution was developed to characterize both machine performance and program execution. By merging these machine and program characterizations, execution time can be estimated for arbitrary machine/program combinations. The work was continued into the domain of parallel and vector machines, including the issue of caches in vector processors and multiprocessors. All of the aforementioned accomplishments, as well as smaller efforts supported by this grant, are summarized more specifically in this report.
NASA Technical Reports Server (NTRS)
Gupta, S. K.; Tiwari, S. N.
1976-01-01
A simple procedure and computer program were developed for retrieving the surface temperature from the measurement of upwelling infrared radiance in a single spectral region in the atmosphere. The program evaluates the total upwelling radiance at any altitude in the region of the CO fundamental band (2070-2220 1/cm) for several values of surface temperature. Actual surface temperature is inferred by interpolation of the measured upwelling radiance between the computed values of radiance for the same altitude. Sensitivity calculations were made to determine the effect of uncertainty in various surface, atmospheric and experimental parameters on the inferred value of surface temperature. It is found that the uncertainties in water vapor concentration and surface emittance are the most important factors affecting the accuracy of the inferred value of surface temperature.
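The retrieval step described above can be sketched as follows; this is an illustrative Python reimplementation assuming linear interpolation between the computed radiance values (the report does not specify the interpolation scheme), with hypothetical temperature and radiance tables:

```python
def infer_temperature(measured, temps, radiances):
    """Invert a radiance table: temps and radiances are paired,
    monotonically increasing sequences of trial surface temperatures
    and the corresponding computed upwelling radiances."""
    pairs = list(zip(temps, radiances))
    for (t0, r0), (t1, r1) in zip(pairs, pairs[1:]):
        if r0 <= measured <= r1:
            # Linear interpolation between the two bracketing entries.
            return t0 + (t1 - t0) * (measured - r0) / (r1 - r0)
    raise ValueError("measured radiance outside computed range")

# Hypothetical table: radiance computed at three trial temperatures (K).
surface_temp = infer_temperature(15.0, [280.0, 290.0, 300.0], [10.0, 20.0, 30.0])
```

The same bracketing-and-interpolation idea applies regardless of the spectral band; the physics enters only through the forward computation of the radiance table.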
PC-CUBE: A Personal Computer Based Hypercube
NASA Technical Reports Server (NTRS)
Ho, Alex; Fox, Geoffrey; Walker, David; Snyder, Scott; Chang, Douglas; Chen, Stanley; Breaden, Matt; Cole, Terry
1988-01-01
PC-CUBE is an ensemble of IBM PCs or close compatibles connected in the hypercube topology with ordinary computer cables. Communication occurs at a rate of 115.2 kbaud via the RS-232 serial links. Available for PC-CUBE are the Crystalline Operating System III (CrOS III), the Mercury Operating System, and CUBIX and PLOTIX, which are parallel I/O and graphics libraries. A CrOS performance monitor was developed to facilitate the measurement of a program's communication and computation time and their effects on performance. Also available are CXLISP, a parallel version of the XLISP interpreter; GRAFIX, some graphics routines for the EGA and CGA; and a general execution profiler for determining execution time spent by program subroutines. PC-CUBE provides a programming environment similar to all hypercube systems running CrOS III, Mercury, and CUBIX. In addition, every node (personal computer) has its own graphics display monitor and storage devices. These allow data to be displayed or stored at every processor, which has much instructional value and enables easier debugging of applications. Some application programs taken from the book Solving Problems on Concurrent Processors (Fox 88) were implemented with graphics enhancement on PC-CUBE. The applications range from solving the Mandelbrot set, the Laplace equation, the wave equation, and long-range force interaction, to WaTor, an ecological simulation.
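The hypercube cabling pattern is easy to state precisely; this is an illustrative sketch (not code from PC-CUBE) of the standard addressing rule: in a d-dimensional hypercube, node i is linked to the d nodes whose binary labels differ from i in exactly one bit.

```python
def neighbors(node, dim):
    """Return the nodes cabled to `node` in a dim-dimensional hypercube:
    flip each of the dim address bits in turn."""
    return [node ^ (1 << bit) for bit in range(dim)]

# A 3-dimensional cube of 8 PCs: node 0 is wired to nodes 1, 2, and 4.
links_from_0 = neighbors(0, 3)
```

This rule is why an 8-PC ensemble needs only 3 serial links per machine, and why messages between any two nodes need at most d hops.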
Quantitative morphometrical characterization of human pronuclear zygotes.
Beuchat, A; Thévenaz, P; Unser, M; Ebner, T; Senn, A; Urner, F; Germond, M; Sorzano, C O S
2008-09-01
Identification of embryos with high implantation potential remains a challenge in in vitro fertilization (IVF). Subjective pronuclear (PN) zygote scoring systems have been developed for that purpose. The aim of this work was to provide a software tool that enables objective measuring of morphological characteristics of the human PN zygote. A computer program was created to analyse zygote images semi-automatically, providing precise morphological measurements. The accuracy of this approach was first validated by comparing zygotes from two different IVF centres with computer-assisted measurements or subjective scoring. Computer-assisted measurement and subjective scoring were then compared for their ability to classify zygotes with high and low implantation probability by using a linear discriminant analysis. Zygote images coming from the two IVF centres were analysed with the software, resulting in a series of precise measurements of 24 variables. Using subjective scoring, the cytoplasmic halo was the only feature which was significantly different between the two IVF centres. Computer-assisted measurements revealed significant differences between centres in PN centring, PN proximity, cytoplasmic halo and features related to nucleolar precursor bodies distribution. The zygote classification error achieved with the computer-assisted measurements (0.363) was slightly lower than that of the subjective ones (0.393). A precise and objective characterization of the morphology of human PN zygotes can be achieved by the use of an advanced image analysis tool. This computer-assisted analysis allows for a better morphological characterization of human zygotes and can be used for classification.
A CAD (Classroom Assessment Design) of a Computer Programming Course
ERIC Educational Resources Information Center
Hawi, Nazir S.
2012-01-01
This paper presents a CAD (classroom assessment design) of an entry-level undergraduate computer programming course "Computer Programming I". CAD has been the product of a long experience in teaching computer programming courses including teaching "Computer Programming I" 22 times. Each semester, CAD is evaluated and modified…
NASA Technical Reports Server (NTRS)
Pan, Y. S.; Drummond, J. P.; Mcclinton, C. R.
1978-01-01
Two parabolic flow computer programs, SHIP (a finite-difference program) and COMOC (a finite-element program), are used for predicting three-dimensional turbulent reacting flow fields in supersonic combustors. The theoretical foundations of the two computer programs are described, and the programs are then applied to a three-dimensional turbulent mixing experiment. The cold (nonreacting) flow experiment was performed to study the mixing of helium jets with a supersonic airstream in a rectangular duct. Surveys of the flow field at an upstream station were used as the initial data by the programs; surveys at a downstream station provided a comparison to assess program accuracy. Both computer programs predicted the experimental results and data trends reasonably well. However, the comparison between the computations from the two programs indicated that SHIP was more accurate in computation and more efficient in both computer storage and computing time than COMOC.
HOME COMPUTER USE AND THE DEVELOPMENT OF HUMAN CAPITAL*
Malamud, Ofer; Pop-Eleches, Cristian
2012-01-01
This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but show improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven’s Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors. PMID:22719135
Computer program CDCID: an automated quality control program using CDC update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singer, G.L.; Aguilar, F.
1984-04-01
A computer program, CDCID, has been developed in coordination with a quality control program to provide a highly automated method of documenting changes to computer codes at EG and G Idaho, Inc. The method uses the standard CDC UPDATE program in such a manner that updates and their associated documentation are easily made and retrieved in various formats. The method allows each card image of a source program to point to the document which describes it, who created the card, and when it was created. The method described is applicable to the quality control of computer programs in general. The computer program described is executable only on CDC computing systems, but the program could be modified and applied to any computing system with an adequate updating program.
ERIC Educational Resources Information Center
Terry, Janet L.; Geske, Joel
A case study investigated how journalism and mass communication faculty members diffused and used computing technology in teaching. Subjects, 21 tenured and tenure-track faculty members in a mid-sized journalism and mass communication department, completed an indepth questionnaire designed to measure the general attitude of the faculty towards…
Demonstration of Multi- and Single-Reader Sample Size Program for Diagnostic Studies software.
Hillis, Stephen L; Schartz, Kevin M
2015-02-01
The recently released software Multi- and Single-Reader Sample Size Program for Diagnostic Studies, written by Kevin Schartz and Stephen Hillis, performs sample size computations for diagnostic reader-performance studies. The program computes the sample size needed to detect a specified difference in a reader performance measure between two modalities, when using the analysis methods initially proposed by Dorfman, Berbaum, and Metz (DBM) and Obuchowski and Rockette (OR), and later unified and improved by Hillis and colleagues. A commonly used reader performance measure is the area under the receiver-operating-characteristic curve. The program can be used with common reader-performance measures that can be estimated parametrically or nonparametrically. The program has an easy-to-use, step-by-step, intuitive interface that walks the user through the entry of the needed information. Features of the software include the following: (1) choice of several study designs; (2) choice of inputs obtained from either OR or DBM analyses; (3) choice of three different inference situations: both readers and cases random, readers fixed and cases random, and readers random and cases fixed; (4) choice of two types of hypotheses: equivalence or noninferiority; (5) choice of two output formats: power for specified case and reader sample sizes, or a listing of case-reader combinations that provide a specified power; (6) choice of single or multi-reader analyses; and (7) functionality in Windows, Mac OS, and Linux.
Three Dimensional Measurements And Display Using A Robot Arm
NASA Astrophysics Data System (ADS)
Swift, Thomas E.
1984-02-01
The purpose of this paper is to describe a project which makes three dimensional measurements of an object using a robot arm. A program was written to determine the X-Y-Z coordinates of the end point of a Minimover-5 robot arm which was interfaced to a TRS-80 Model III microcomputer. This program was used in conjunction with computer graphics subroutines that draw a projected three dimensional object. The robot arm was directed to touch points on an object, and lines were then drawn on the screen of the microcomputer between consecutive points as they were entered. A representation of the entire object is in this way constructed on the screen. The three dimensional graphics subroutines have the ability to rotate the projected object about any of the three axes, and to scale the object to any size. This project has applications in the computer-aided design and manufacturing fields because it can accurately measure the features of an irregularly shaped object.
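The rotation capability mentioned above can be sketched with the standard axis-rotation formula; this is an illustrative Python version (not the original TRS-80 code) for rotating a measured point about the Z axis, and the other two axes follow the same pattern with the roles of the coordinates permuted:

```python
import math

def rotate_z(point, theta):
    """Rotate (x, y, z) about the Z axis by angle theta (radians):
    x' = x*cos(theta) - y*sin(theta), y' = x*sin(theta) + y*cos(theta)."""
    x, y, z = point
    c, s = math.cos(theta), math.sin(theta)
    return (x * c - y * s, x * s + y * c, z)

# Rotating the point (1, 0, 0) by 90 degrees carries it to (0, 1, 0).
rotated = rotate_z((1.0, 0.0, 0.0), math.pi / 2)
```

Scaling, the other operation the graphics subroutines support, is simply multiplication of each coordinate by a common factor before projection.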
Cost-Effectiveness and Cost-Utility of Internet-Based Computer Tailoring for Smoking Cessation
Evers, Silvia MAA; de Vries, Hein; Hoving, Ciska
2013-01-01
Background Although effective smoking cessation interventions exist, information is limited about their cost-effectiveness and cost-utility. Objective To assess the cost-effectiveness and cost-utility of an Internet-based multiple computer-tailored smoking cessation program and tailored counseling by practice nurses working in Dutch general practices compared with an Internet-based multiple computer-tailored program only and care as usual. Methods The economic evaluation was embedded in a randomized controlled trial, for which 91 practice nurses recruited 414 eligible smokers. Smokers were randomized to receive multiple tailoring and counseling (n=163), multiple tailoring only (n=132), or usual care (n=119). Self-reported cost and quality of life were assessed during a 12-month follow-up period. Prolonged abstinence and 24-hour and 7-day point prevalence abstinence were assessed at 12-month follow-up. The trial-based economic evaluation was conducted from a societal perspective. Uncertainty was accounted for by bootstrapping (1000 times) and sensitivity analyses. Results No significant differences were found between the intervention arms with regard to baseline characteristics or effects on abstinence, quality of life, and addiction level. However, participants in the multiple tailoring and counseling group reported significantly more annual health care–related costs than participants in the usual care group. Cost-effectiveness analysis, using prolonged abstinence as the outcome measure, showed that the mere multiple computer-tailored program had the highest probability of being cost-effective. Compared with usual care, in this group €5100 had to be paid for each additional abstinent participant. With regard to cost-utility analyses, using quality of life as the outcome measure, usual care was probably most efficient. 
Conclusions To our knowledge, this was the first study to determine the cost-effectiveness and cost-utility of an Internet-based smoking cessation program with and without counseling by a practice nurse. Although the Internet-based multiple computer-tailored program seemed to be the most cost-effective treatment, the cost-utility was probably highest for care as usual. However, to ease the interpretation of cost-effectiveness results, future research should aim at identifying an acceptable cutoff point for the willingness to pay per abstinent participant. PMID:23491820
Sobie, Eric A
2011-09-13
This two-part lecture introduces students to the scientific computing language MATLAB. Prior computer programming experience is not required. The lectures present basic concepts of computer programming logic that tend to cause difficulties for beginners in addition to concepts that relate specifically to the MATLAB language syntax. The lectures begin with a discussion of vectors, matrices, and arrays. Because many types of biological data, such as fluorescence images and DNA microarrays, are stored as two-dimensional objects, processing these data is a form of array manipulation, and MATLAB is especially adept at handling such array objects. The students are introduced to basic commands in MATLAB, as well as built-in functions that provide useful shortcuts. The second lecture focuses on the differences between MATLAB scripts and MATLAB functions and describes when one method of programming organization might be preferable to the other. The principles are illustrated through the analysis of experimental data, specifically measurements of intracellular calcium concentration in live cells obtained using confocal microscopy.
NASA Technical Reports Server (NTRS)
Cowings, Patricia S.; Naifeh, Karen; Thrasher, Chet
1988-01-01
This report contains the source code and documentation for a computer program used to process impedance cardiography data. The cardiodynamic measures derived from impedance cardiography are ventricular stroke volume, cardiac output, cardiac index, and Heather index. The program digitizes data collected from the Minnesota Impedance Cardiograph, electrocardiography (ECG), and respiratory cycles and then stores these data on hard disk. It computes the cardiodynamic functions using interactive graphics and stores the means and standard deviations of each 15-sec data epoch on floppy disk. This software was designed on a Digital PRO380 microcomputer and used version 2.0 of P/OS, with (minimally) a 4-channel, 16-bit analog/digital (A/D) converter. Applications software is written in FORTRAN 77, and uses Digital's Pro-Tool Kit Real Time Interface Library, CORE Graphic Library, and laboratory routines. Source code can be readily modified to accommodate alternative detection, A/D conversion, and interactive graphics. The object code utilizing overlays and multitasking has a maximum of 50 Kbytes.
Internet (WWW) based system of ultrasonic image processing tools for remote image analysis.
Zeng, Hong; Fei, Ding-Yu; Fu, Cai-Ting; Kraft, Kenneth A
2003-07-01
Ultrasonic Doppler color imaging can provide anatomic information and simultaneously render flow information within blood vessels for diagnostic purpose. Many researchers are currently developing ultrasound image processing algorithms in order to provide physicians with accurate clinical parameters from the images. Because researchers use a variety of computer languages and work on different computer platforms to implement their algorithms, it is difficult for other researchers and physicians to access those programs. A system has been developed using World Wide Web (WWW) technologies and HTTP communication protocols to publish our ultrasonic Angle Independent Doppler Color Image (AIDCI) processing algorithm and several general measurement tools on the Internet, where authorized researchers and physicians can easily access the program using web browsers to carry out remote analysis of their local ultrasonic images or images provided from the database. In order to overcome potential incompatibility between programs and users' computer platforms, ActiveX technology was used in this project. The technique developed may also be used for other research fields.
NASA Astrophysics Data System (ADS)
Pellas, Nikolaos; Peroutseas, Efstratios
2017-01-01
Students in secondary education often struggle to understand basic programming concepts. Despite all that is known about the benefits of programming, little published evidence shows how high school students can learn basic programming concepts through innovative instructional formats in ways that build or enhance their computational thinking skills. This gap has contributed to a lack of student motivation and interest in Computer Science courses. This case study presents the opinions of twenty-eight (n = 28) high school students who participated voluntarily in a 3D game-like environment created in Second Life. This environment was combined with the 2D programming environment of Scratch4SL for the implementation of programming concepts (i.e., sequence and concurrent programming commands) in a blended instructional format. An instructional framework based on Papert's theory of Constructionism, intended to help students coordinate and better manage the learning material in collaborative, practice-based learning activities, is also proposed. Using a mixed-method design, students' participation in a focus group (qualitative data) and their motivation based on their experiences (quantitative data) were measured before and after several learning tasks. Findings indicated that an instructional design framework based on Constructionism is meaningful for acquiring or strengthening students' social, cognitive, higher-order, and computational thinking skills. Educational implications and recommendations for future research are also discussed.
A handheld computer as part of a portable in vivo knee joint load monitoring system
Szivek, JA; Nandakumar, VS; Geffre, CP; Townsend, CP
2009-01-01
In vivo measurement of loads and pressures acting on articular cartilage in the knee joint during various activities and rehabilitative therapies following focal defect repair will provide a means of designing activities that encourage faster and more complete healing of focal defects. It was the goal of this study to develop a totally portable monitoring system that could be used during various activities and allow continuous monitoring of forces acting on the knee. In order to make the monitoring system portable, a handheld computer with custom software, a USB-powered miniature wireless receiver, and a battery-powered coil were developed to replace a currently used computer, AC-powered benchtop receiver, and power supply. A Dell handheld running the Windows Mobile operating system (OS), programmed using LabVIEW, was used to collect strain measurements. Measurements collected by the handheld-based system connected to the miniature wireless receiver were compared with the measurements collected by a hardwired system and a computer-based system during benchtop testing and in vivo testing. The newly developed handheld-based system had a maximum accuracy of 99% when compared to the computer-based system. PMID:19789715
Evaluating young children's cognitive capacities through computer versus hand drawings.
Olsen, J
1992-09-01
Young normal and handicapped children, aged 3 to 6 years, were taught to draw a scene of a house, garden, and sky with a computer drawing program that uses icons and is operated by a mouse. The drawings were rated by a team of experts on a 7-category scale. The children's computer- and hand-produced drawings were compared with one another and with results on cognitive, visual, and fine motor tests. The computer drawing program made it possible for the children to accurately draw closed shapes, to get instant feedback on the adequacy of the drawing, and to make corrections with ease. It was hypothesized that these features would compensate for the young children's limitations in such cognitive skills as memory, concentration, planning, and accomplishment, as well as for their weak motor skills. In addition, it was hypothesized that traditional cognitive ratings of hand drawings may underestimate young children's intellectual ability, because drawing by hand demands motor, memory, concentration, and planning skills that are more developed than those actually shown by young children. To test the latter hypothesis, the children completed a training program in using a computer to make drawings. The results show that cognitive processes such as planning, analysis, and synthesis can be investigated by means of a computer drawing program in a way not possible using traditional pencil-and-paper drawings. It can be said that the method used here made it possible to measure cognitive abilities "under the floor" of what is ordinarily measurable by means of traditional hand drawings.
System enhancements of Mesoscale Analysis and Space Sensor (MASS) computer system
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
The interactive information processing for the mesoscale analysis and space sensor (MASS) program is reported. The development and implementation of new spaceborne remote sensing technology to observe and measure atmospheric processes is described. The space measurements and conventional observational data are processed together to gain an improved understanding of the mesoscale structure and dynamical evolution of the atmosphere relative to cloud development and precipitation processes. A Research Computer System consisting of three primary computers was developed (HP-1000F, Perkin-Elmer 3250, and Harris/6) which provides a wide range of capabilities for processing and displaying interactively large volumes of remote sensing data. The development of a MASS data base management and analysis system on the HP-1000F computer and extending these capabilities by integration with the Perkin-Elmer and Harris/6 computers using the MSFC's Apple III microcomputer workstations is described. The objectives are: to design hardware enhancements for computer integration and to provide data conversion and transfer between machines.
Quantitative Assay for Starch by Colorimetry Using a Desktop Scanner
ERIC Educational Resources Information Center
Matthews, Kurt R.; Landmark, James D.; Stickle, Douglas F.
2004-01-01
A procedure for producing a standard curve for starch concentration measurement by image analysis, using a color scanner and a computer for data acquisition and color analysis, is described. Color analysis is performed by a Visual Basic program that measures red, green, and blue (RGB) color intensities for pixels within the scanner image.
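The channel-averaging step can be sketched in a few lines. This Python stand-in for the paper's Visual Basic routine (the pixel values below are invented for illustration) averages the R, G, and B intensities over a scanned region; one channel would then be plotted against known starch concentrations to build the standard curve.

```python
def mean_rgb(pixels):
    """Mean red, green, and blue intensity of an iterable of (r, g, b) pixels."""
    n = 0
    totals = [0, 0, 0]
    for r, g, b in pixels:
        totals[0] += r
        totals[1] += g
        totals[2] += b
        n += 1
    return tuple(t / n for t in totals)

# Synthetic "scanned well" region; in practice these pixels would come
# from the scanner image of a starch-iodine sample.
region = [(40, 60, 200), (42, 58, 198), (38, 62, 202)]
r, g, b = mean_rgb(region)
```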
Is a Web Survey as Effective as a Mail Survey? A Field Experiment Among Computer Users
ERIC Educational Resources Information Center
Kiernan, Nancy; Kiernan, Michaela; Oyler, Mary; Gilles, Carolyn
2005-01-01
With the exponential increase in Web access, program evaluators need to understand the methodological benefits and barriers of using the Web to collect survey data from program participants. In this experimental study, the authors examined whether a Web survey can be as effective as the more established mail survey on three measures of survey…
John Pitlick; Yantao Cui; Peter Wilcock
2009-01-01
This manual provides background information and instructions on the use of a spreadsheet-based program for Bedload Assessment in Gravel-bed Streams (BAGS). The program implements six bed load transport equations developed specifically for gravel-bed rivers. Transport capacities are calculated on the basis of field measurements of channel geometry, reach-average slope,...
Bradley, D. Nathan
2013-01-01
The peak discharge of a flood can be estimated from the elevation of high-water marks near the inlet and outlet of a culvert after the flood has occurred. This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks on trees or buildings. When combined with the cross-sectional geometry of the channel upstream from the culvert and the culvert size, shape, roughness, and orientation, the high-water marks define a water-surface profile that can be used to estimate the peak discharge by using the methods described by Bodhaine (1968). This type of measurement is in contrast to a “direct” measurement of discharge made during the flood where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a streamgage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the streamgage, resulting in more accurate computation of high flows. The Culvert Analysis Program (CAP) (Fulford, 1998) is a command-line program written in Fortran for computing peak discharges and culvert rating surfaces or curves. CAP reads input data from a formatted text file and prints results to another formatted text file. Preparing and correctly formatting the input file may be time-consuming and prone to errors. This document describes the CAP graphical user interface (GUI)—a modern, cross-platform, menu-driven application that prepares the CAP input file, executes the program, and helps the user interpret the output.
Long-Term Pavement Performance Automated Faulting Measurement
DOT National Transportation Integrated Search
2015-02-01
This study focused on identifying transverse joint locations on jointed plain concrete pavements using an automated joint detection algorithm and computing faulting at these locations using Long-Term Pavement Performance (LTPP) Program profile data c...
ERIC Educational Resources Information Center
School Science Review, 1984
1984-01-01
Presents 26 activities, experiments, demonstrations, games, and computer programs for biology, chemistry, and physics. Background information, laboratory procedures, equipment lists, and instructional strategies are given. Topics include eye measurements, nutrition, soil test tube rack, population dynamics, angular momentum, transition metals,…
Developing an Index to Measure Health System Performance: Measurement for Districts of Nepal.
Kandel, N; Fric, A; Lamichhane, J
2014-01-01
Various frameworks for measuring health system performance have been proposed and discussed. The scope of performance indicators is broad, ranging from examining the national health system to individual patients at various levels of the health system. Development of an innovative and simple index is essential to capture the multidimensionality of health systems. We used indicators which also serve as proxies for the set of activities whose primary goal is to maintain and improve health: eleven MDG indicators, representing all dimensions of health, from which the index was developed. These indicators are computed with a methodology similar to that of the human development index. We used published data for Nepal to compute the index for the districts of Nepal as an illustration. To validate our findings, we compared the indices of these districts with other development indices for Nepal. An index for each district was computed from the eleven indicators. The indices were then compared with the human development index and with socio-economic and infrastructure development indices, and the findings showed similar distributions across districts. Districts categorized as low or high performing on health system performance also have correspondingly low or high human development, socio-economic, and infrastructure indices. This methodology of computing an index from various indicators could assist policy makers and program managers in prioritizing activities based on performance. Validation of the findings against other development indicators shows that this can be one of the tools for assessing health system performance for policy makers, program managers and others.
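The HDI-style computation the authors describe can be sketched as follows. This assumes min–max ("goalpost") normalization of each indicator followed by a simple average, in the spirit of the classic human development index methodology; the indicator names and goalpost values are invented for illustration, not taken from the paper.

```python
def dimension_index(value, minimum, maximum):
    """Normalize one indicator onto [0, 1] between fixed goalposts."""
    return (value - minimum) / (maximum - minimum)

def composite_index(values, goalposts):
    """Average the normalized indicators into a single district index."""
    parts = [dimension_index(v, lo, hi) for v, (lo, hi) in zip(values, goalposts)]
    return sum(parts) / len(parts)

# Two illustrative indicators, e.g. immunization coverage (%) and skilled
# birth attendance (%), both with goalposts 0-100:
district_index = composite_index([80.0, 60.0], [(0.0, 100.0), (0.0, 100.0)])
```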
Efficacy of a short cognitive training program in patients with multiple sclerosis
Pérez-Martín, María Yaiza; González-Platas, Montserrat; Eguía-del Río, Pablo; Croissier-Elías, Cristina; Jiménez Sosa, Alejandro
2017-01-01
Background Cognitive impairment is a common feature in multiple sclerosis (MS) and may have a substantial impact on quality of life. Evidence about the effectiveness of neuropsychological rehabilitation is still limited, but current data suggest that computer-assisted cognitive training improves cognitive performance. Objective The objective of this study was to evaluate the efficacy of combined computer-assisted training supported by home-based neuropsychological training to improve attention, processing speed, memory and executive functions over 3 consecutive months. Methods In this evaluator-blinded, randomized controlled study, 62 MS patients with clinically stable disease and mild-to-moderate levels of cognitive impairment were randomized to receive a computer-assisted neuropsychological training program (n=30) or no intervention (control group [CG]; n=32). The cognitive assessment included the Brief Repeatable Battery of Neuropsychological Tests. Other secondary measures included subjective cognitive impairment, anxiety and depression, fatigue and quality of life measures. Results The treatment group (TG) showed significant improvements in measures of verbal memory, working memory and phonetic fluency after the intervention, and repeated-measures analysis of covariance revealed a positive effect in most of the functions. The CG did not show changes. The TG showed a significant reduction in anxiety symptoms and significant improvement in quality of life. There were no improvements in fatigue levels or depressive symptoms. Conclusion Cognitive intervention with computer-assisted training supported by home training between face-to-face sessions is a useful tool to treat patients with MS and improve functions such as verbal memory, working memory and phonetic fluency. PMID:28223806
Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A.; Robert, Christian P.; Marin, Jean-Michel; Balding, David J.; Guillemaud, Thomas; Estoup, Arnaud
2008-01-01
Summary: Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and provides its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC. Availability: The software DIY ABC is freely available at http://www.montpellier.inra.fr/CBGP/diyabc. Contact: j.cornuet@imperial.ac.uk Supplementary information: Supplementary data are also available at http://www.montpellier.inra.fr/CBGP/diyabc PMID:18842597
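The core idea behind ABC can be sketched as a rejection sampler; this toy version illustrates the general algorithm, not DIY ABC's actual implementation (the identity "simulator" and uniform prior below are invented for the example). Parameter draws from the prior are kept only when their simulated summary statistic falls within a tolerance of the observed one, and the accepted draws approximate the posterior.

```python
import random

def abc_rejection(observed, simulate, sample_prior, n_draws, tol):
    """Basic ABC rejection: accept theta when |S(sim(theta)) - S(obs)| <= tol."""
    accepted = []
    for _ in range(n_draws):
        theta = sample_prior()
        if abs(simulate(theta) - observed) <= tol:
            accepted.append(theta)
    return accepted

# Toy run: uniform prior on [0, 1], identity simulator, observed statistic 0.5.
random.seed(1)
posterior = abc_rejection(0.5, lambda t: t, random.random, 2000, 0.05)
```

In realistic use the simulator generates genetic data under a demographic scenario and the summary statistics are multidimensional, but the accept/reject skeleton is the same.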
Acceleration of Radiance for Lighting Simulation by Using Parallel Computing with OpenCL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zuo, Wangda; McNeil, Andrew; Wetter, Michael
2011-09-06
We report on the acceleration of annual daylighting simulations for fenestration systems in the Radiance ray-tracing program. The algorithm was optimized to reduce both the redundant data input/output operations and the floating-point operations. To further accelerate the simulation speed, the calculation for matrix multiplications was implemented using parallel computing on a graphics processing unit. We used OpenCL, which is a cross-platform parallel programming language. Numerical experiments show that the combination of the above measures can speed up the annual daylighting simulations 101.7 times or 28.6 times when the sky vector has 146 or 2306 elements, respectively.
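The operation being accelerated can be pictured as one matrix-vector product per timestep, applied to each sky vector over the year; this plain-Python sketch uses illustrative names (not Radiance's actual data structures) to show the computation that the OpenCL version moves onto the GPU.

```python
def matvec(m, v):
    """Dense matrix-vector product: the kernel worth parallelizing."""
    return [sum(a * b for a, b in zip(row, v)) for row in m]

def annual_illuminance(transfer_matrix, sky_vectors):
    """One matvec per timestep; the paper's sky vectors have 146 or 2306 elements."""
    return [matvec(transfer_matrix, s) for s in sky_vectors]
```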
Comparisons of AEROX computer program predictions of lift and induced drag with flight test data
NASA Technical Reports Server (NTRS)
Axelson, J.; Hill, G. C.
1981-01-01
The AEROX aerodynamic computer program, which provides accurate predictions of induced drag and trim drag for the full angle-of-attack range and for Mach numbers from 0.4 to 3.0, is described. This capability is demonstrated by comparing flight test data with AEROX predictions for 17 different tactical aircraft. Values of minimum (skin friction, pressure, and zero-lift wave) drag coefficients and lift coefficient offset due to camber (when required) were input from the flight test data to produce total lift and drag curves. The comparisons of trimmed lift-drag polars show excellent agreement between the AEROX predictions and the in-flight measurements.
CAD of control systems: Application of nonlinear programming to a linear quadratic formulation
NASA Technical Reports Server (NTRS)
Fleming, P.
1983-01-01
The familiar suboptimal regulator design approach is recast as a constrained optimization problem and incorporated in a Computer Aided Design (CAD) package where both design objective and constraints are quadratic cost functions. This formulation permits the separate consideration of, for example, model following errors, sensitivity measures and control energy as objectives to be minimized or limits to be observed. Efficient techniques for computing the interrelated cost functions and their gradients are utilized in conjunction with a nonlinear programming algorithm. The effectiveness of the approach and the degree of insight into the problem which it affords is illustrated in a helicopter regulation design example.
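A quadratic cost of the kind used for both the design objective and the constraints has the form J = xᵀQx; a minimal sketch of evaluating such a cost (the state vector and weighting matrix here are illustrative, not from the paper):

```python
def quadratic_cost(x, q):
    """Evaluate x^T Q x for a state (or error) vector x and weighting matrix Q."""
    n = len(x)
    return sum(x[i] * q[i][j] * x[j] for i in range(n) for j in range(n))
```

In the formulation described, separate costs of this form (model-following error, sensitivity, control energy) are either minimized or held below limits by the nonlinear programming algorithm.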
Experiences with leak rate calculations methods for LBB application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grebner, H.; Kastner, W.; Hoefler, A.
1997-04-01
In this paper, three leak rate computer programs for the application of leak before break analysis are described and compared. The programs are compared to each other and to results of an HDR Reactor experiment and two real crack cases. The programs analyzed are PIPELEAK, FLORA, and PICEP. Generally, the different leak rate models are in agreement. To obtain reasonable agreement between measured and calculated leak rates, it was necessary to also use data from detailed crack investigations.
Application Portable Parallel Library
NASA Technical Reports Server (NTRS)
Cole, Gary L.; Blech, Richard A.; Quealy, Angela; Townsend, Scott
1995-01-01
Application Portable Parallel Library (APPL) computer program is subroutine-based message-passing software library intended to provide consistent interface to variety of multiprocessor computers on market today. Minimizes effort needed to move application program from one computer to another. User develops application program once and then easily moves application program from parallel computer on which created to another parallel computer. ("Parallel computer" also include heterogeneous collection of networked computers). Written in C language with one FORTRAN 77 subroutine for UNIX-based computers and callable from application programs written in C language or FORTRAN 77.
NASA Technical Reports Server (NTRS)
Davis, S. J.; Rosenstein, H.
1975-01-01
The Comprehensive Airship Sizing and Performance Computer Program (CASCOMP), developed and used in the design and evaluation of advanced lighter-than-air (LTA) craft, is described. The program defines design details such as engine size and number, component weight buildups, required power, and the physical dimensions of airships that are designed to meet specified mission requirements. The program is used in a comparative parametric evaluation of six advanced lighter-than-air concepts. The results indicate that fully buoyant conventional airships have the lightest required gross lift when designed for speeds less than 100 knots, and the partially buoyant concepts are superior above 100 knots. When compared on the basis of specific productivity, which is a measure of direct operating cost, the partially buoyant lifting body/tilting prop-rotor concept is optimum.
The study of microstrip antenna arrays and related problems
NASA Technical Reports Server (NTRS)
Lo, Y. T.
1986-01-01
In February, an initial computer program to be used in analyzing the four-element array module was completed. This program performs the analysis of modules composed of four rectangular patches which are corporately fed by a microstrip line network terminated in four identical load impedances. Currently, a rigorous full-wave analysis of various types of microstrip line feed structures and patches is being performed. These analyses include microstrip line feeds between layers with different electrical parameters. A method of moments was implemented for the case of a single dielectric layer and microstrip-line-fed rectangular patches, in which the primary source is assumed to be a magnetic current ribbon across the line some distance from the patch. Measured values are compared with those computed by the program.
White, Timothy C.; Sauter, Edward A.; Stewart, Duff C.
2014-01-01
Intermagnet is an international oversight group which exists to establish a global network for geomagnetic observatories. This group establishes data standards and standard operating procedures for members and prospective members. Intermagnet has proposed a new One-Second Data Standard, for that emerging geomagnetic product. The standard specifies that all data collected must have a time stamp accuracy of ±10 milliseconds of the top-of-the-second Coordinated Universal Time. Therefore, the U.S. Geological Survey Geomagnetism Program has designed and executed several tests on its current data collection system, the Personal Computer Data Collection Platform. Tests are designed to measure the time shifts introduced by individual components within the data collection system, as well as to measure the time shift introduced by the entire Personal Computer Data Collection Platform. Additional testing designed for Intermagnet will be used to validate further such measurements. Current results of the measurements showed a 5.0–19.9 millisecond lag for the vertical channel (Z) of the Personal Computer Data Collection Platform and a 13.0–25.8 millisecond lag for horizontal channels (H and D) of the collection system. These measurements represent a dynamically changing delay introduced within the U.S. Geological Survey Personal Computer Data Collection Platform.
DIALOG: An executive computer program for linking independent programs
NASA Technical Reports Server (NTRS)
Glatt, C. R.; Hague, D. S.; Watson, D. A.
1973-01-01
A very large scale computer programming procedure called the DIALOG Executive System has been developed for the Univac 1100 series computers. The executive computer program, DIALOG, controls the sequence of execution and data management function for a library of independent computer programs. Communication of common information is accomplished by DIALOG through a dynamically constructed and maintained data base of common information. The unique feature of the DIALOG Executive System is the manner in which computer programs are linked. Each program maintains its individual identity and as such is unaware of its contribution to the large scale program. This feature makes any computer program a candidate for use with the DIALOG Executive System. The installation and use of the DIALOG Executive System are described at Johnson Space Center.
Programming the social computer.
Robertson, David; Giunchiglia, Fausto
2013-03-28
The aim of 'programming the global computer' was identified by Milner and others as one of the grand challenges of computing research. At the time this phrase was coined, it was natural to assume that this objective might be achieved primarily through extending programming and specification languages. The Internet, however, has brought with it a different style of computation that (although harnessing variants of traditional programming languages) operates in a style different to those with which we are familiar. The 'computer' on which we are running these computations is a social computer in the sense that many of the elementary functions of the computations it runs are performed by humans, and successful execution of a program often depends on properties of the human society over which the program operates. These sorts of programs are not programmed in a traditional way and may have to be understood in a way that is different from the traditional view of programming. This shift in perspective raises new challenges for the science of the Web and for computing in general.
Toward full life cycle control: Adding maintenance measurement to the SEL
NASA Technical Reports Server (NTRS)
Rombach, H. Dieter; Ulery, Bradford T.; Valett, Jon D.
1992-01-01
Organization-wide measurement of software products and processes is needed to establish full life cycle control over software products. The Software Engineering Laboratory (SEL)--a joint venture between NASA GSFC, the University of Maryland, and Computer Sciences Corporation--started measurement of software development more than 15 years ago. Recently, the measurement of maintenance was added to the scope of the SEL. In this article, the maintenance measurement program is presented as an addition to the already existing and well-established SEL development measurement program and evaluated in terms of its immediate benefits and long-term improvement potential. Immediate benefits of this program for the SEL include an increased understanding of the maintenance domain, the differences and commonalities between development and maintenance, and the cause-effect relationships between development and maintenance. Initial results from a sample maintenance study are presented to substantiate these benefits. The long-term potential of this program includes the use of maintenance baselines to better plan and manage future projects and to improve development and maintenance practices for future projects wherever warranted.
Han, Zifa; Leung, Chi Sing; So, Hing Cheung; Constantinides, Anthony George
2017-08-15
A commonly used measurement model for locating a mobile source is time-difference-of-arrival (TDOA). As each TDOA measurement defines a hyperbola, it is not straightforward to compute the mobile source position due to the nonlinear relationship in the measurements. This brief exploits the Lagrange programming neural network (LPNN), which provides a general framework to solve nonlinear constrained optimization problems, for the TDOA-based localization. The local stability of the proposed LPNN solution is also analyzed. Simulation results are included to evaluate the localization accuracy of the LPNN scheme by comparing with the state-of-the-art methods and the optimality benchmark of Cramér-Rao lower bound.
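The measurement model itself is easy to state: each TDOA is a range difference divided by the propagation speed, and the set of source positions consistent with one measurement traces a hyperbola with the two sensors as foci. The sketch below shows only this measurement model (sensor coordinates are invented), not the LPNN solver the paper develops.

```python
import math

def tdoa(source, sensor_i, sensor_ref, c=343.0):
    """Time-difference-of-arrival (seconds) between sensor_i and a reference sensor."""
    return (math.dist(source, sensor_i) - math.dist(source, sensor_ref)) / c

# A source equidistant from the two sensors gives zero TDOA: it lies on the
# degenerate branch of the hyperbola (the perpendicular bisector of the pair).
```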
ERIC Educational Resources Information Center
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-01-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new…
Eberl, Dennis D.; Drits, V.A.; Srodon, J.
2000-01-01
GALOPER is a computer program that simulates the shapes of crystal size distributions (CSDs) from crystal growth mechanisms. This manual describes how to use the program. The theory behind the program's operation has been described previously (Eberl, Drits, and Srodon, 1998). CSDs that can be simulated using GALOPER include those that result from growth mechanisms operating in an open system, such as constant-rate nucleation and growth, nucleation with a decaying nucleation rate and growth, surface-controlled growth, supply-controlled growth, and constant-rate and random growth; and those that result from mechanisms operating in a closed system, such as Ostwald ripening, random ripening, and crystal coalescence. In addition, CSDs for two types of weathering reactions can be simulated. The operation of associated programs is also described, including two statistical programs used for comparing calculated with measured CSDs, a program used for calculating lognormal CSDs, and a program for arranging measured crystal sizes into size groupings (bins).
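The last step mentioned, arranging measured crystal sizes into size groupings, is a simple histogram; a minimal sketch (bin width and sizes invented for illustration, not GALOPER's actual routine):

```python
def binned_csd(sizes, bin_width):
    """Count crystals per size bin; bin k covers [k*bin_width, (k+1)*bin_width)."""
    counts = {}
    for s in sizes:
        k = int(s // bin_width)
        counts[k] = counts.get(k, 0) + 1
    return counts
```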
Tailored program evaluation: Past, present, future.
Suggs, L Suzanne; Cowdery, Joan E; Carroll, Jennifer B
2006-11-01
This paper discusses measurement issues related to the evaluation of computer-tailored health behavior change programs. As the first generation of commercially available tailored products is utilized in health promotion programming, programmers and researchers are becoming aware of the unique challenges that the evaluation of these programs presents. A project is presented that used an online tailored health behavior assessment (HBA) in a worksite setting. Process and outcome evaluation methods are described and include the challenges faced, and strategies proposed and implemented, for meeting them. Implications for future research in tailored program development, implementation, and evaluation are also discussed.
M.S.L.A.P. Modular Spectral Line Analysis Program documentation
NASA Technical Reports Server (NTRS)
Joseph, Charles L.; Jenkins, Edward B.
1991-01-01
MSLAP is a software package for analyzing spectra, providing the basic structure to identify spectral features, make quantitative measurements of these features, and store the measurements for convenient access. MSLAP can be used to measure not only the zeroth moment (equivalent width) of a profile, but also the first and second moments. Optical depths and the corresponding column densities across the profile can be measured as well for sufficiently high resolution data. The software was developed for an interactive, graphical analysis in which the computer carries most of the computational and data-organizational burden and the investigator is responsible only for judgment decisions. It employs sophisticated statistical techniques for determining the best polynomial fit to the continuum and for calculating the uncertainties.
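The moments mentioned can be written down directly: on a uniform wavelength grid, the zeroth moment of the normalized absorption depth is the equivalent width, the first moment is the line centroid, and the second is the profile's variance about that centroid. A hedged numerical sketch (a plain rectangle-rule version, not MSLAP's implementation):

```python
def profile_moments(wavelengths, flux, continuum=1.0):
    """Equivalent width (zeroth moment), centroid (first moment), and
    variance (second central moment) of a profile on an even wavelength grid."""
    dlam = wavelengths[1] - wavelengths[0]
    depth = [1.0 - f / continuum for f in flux]
    w0 = sum(depth) * dlam                                         # equivalent width
    w1 = sum(d * l for d, l in zip(depth, wavelengths)) * dlam / w0
    w2 = sum(d * (l - w1) ** 2 for d, l in zip(depth, wavelengths)) * dlam / w0
    return w0, w1, w2
```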
System of Programmed Modules for Measuring Photographs with a Gamma-Telescope
NASA Technical Reports Server (NTRS)
Averin, S. A.; Veselova, G. V.; Navasardyan, G. V.
1978-01-01
Physical experiments using tracking cameras resulted in hundreds of thousands of stereo photographs of events being received. To process such a large volume of information, automatic and semiautomatic measuring systems are required. At the Institute of Space Research of the Academy of Science of the USSR, a system for processing film information from the spark gamma-telescope was developed. The system is based on a BPS-75 projector in line with the minicomputer Elektronika 1001. The report describes this system. The various computer programs available to the operators are discussed.
Ground-Based Photometric Measurements HAES Program Support.
1979-01-31
photometric system such as the MTP can be optimized to a certain extent, but fundamental limitations remain. Contents include: Introduction; Background and Relevance; Measurement Requirements; MTP Optical Design; Digital Photon-Counting Data System. Figures include the photometer optical head, a block diagram of the modular photometer and its digital data and control systems, and a flow diagram of the computer program used to analyze three beam…
Yang, Yea-Ru; Chen, Yi-Hua; Chang, Heng-Chih; Chan, Rai-Chi; Wei, Shun-Hwa; Wang, Ray-Yau
2015-10-01
We investigated the effects of a computer-generated interactive visual feedback training program on the recovery from pusher syndrome in stroke patients. Assessor-blinded, pilot randomized controlled study. A total of 12 stroke patients with pusher syndrome were randomly assigned to either the experimental group (N = 7, computer-generated interactive visual feedback training) or control group (N = 5, mirror visual feedback training). The scale for contraversive pushing for severity of pusher syndrome, the Berg Balance Scale for balance performance, and the Fugl-Meyer assessment scale for motor control were the outcome measures. Patients were assessed pre- and posttraining. A comparison of pre- and posttraining assessment results revealed that both training programs led to the following significant changes: decreased severity of pusher syndrome scores (decreases of 4.0 ± 1.1 and 1.4 ± 1.0 in the experimental and control groups, respectively); improved balance scores (increases of 14.7 ± 4.3 and 7.2 ± 1.6 in the experimental and control groups, respectively); and higher scores for lower extremity motor control (increases of 8.4 ± 2.2 and 5.6 ± 3.3 in the experimental and control groups, respectively). Furthermore, the computer-generated interactive visual feedback training program produced significantly better outcomes in the improvement of pusher syndrome (p < 0.01) and balance (p < 0.05) compared with the mirror visual feedback training program. Although both training programs were beneficial, the computer-generated interactive visual feedback training program more effectively aided recovery from pusher syndrome compared with mirror visual feedback training. © The Author(s) 2014.
CUGatesDensity—Quantum circuit analyser extended to density matrices
NASA Astrophysics Data System (ADS)
Loke, T.; Wang, J. B.
2013-12-01
CUGatesDensity is an extension of the original quantum circuit analyser CUGates (Loke and Wang, 2011) [7] to provide explicit support for the use of density matrices. The new package enables simulation of quantum circuits involving statistical ensemble of mixed quantum states. Such analysis is of vital importance in dealing with quantum decoherence, measurements, noise and error correction, and fault tolerant computation. Several examples involving mixed state quantum computation are presented to illustrate the use of this package. Catalogue identifier: AEPY_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEPY_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 5368 No. of bytes in distributed program, including test data, etc.: 143994 Distribution format: tar.gz Programming language: Mathematica. Computer: Any computer installed with a copy of Mathematica 6.0 or higher. Operating system: Any system with a copy of Mathematica 6.0 or higher installed. Classification: 4.15. Nature of problem: To simulate arbitrarily complex quantum circuits comprised of single/multiple qubit and qudit quantum gates with mixed state registers. Solution method: A density matrix representation for mixed states and a state vector representation for pure states are used. The construct is based on an irreducible form of matrix decomposition, which allows a highly efficient implementation of general controlled gates with multiple conditionals. Running time: The examples provided in the notebook CUGatesDensity.nb take approximately 30 s to run on a laptop PC.
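The central object the package adds, the density matrix of a statistical ensemble, ρ = Σᵢ pᵢ|ψᵢ⟩⟨ψᵢ|, is straightforward to construct; the plain-Python illustration below (not CUGatesDensity's Mathematica implementation) builds the maximally mixed single-qubit state from an equal-weight ensemble of |0⟩ and |1⟩.

```python
def outer(psi):
    """Projector |psi><psi| as a nested list of complex numbers."""
    return [[a * b.conjugate() for b in psi] for a in psi]

def density_matrix(ensemble):
    """rho = sum_i p_i |psi_i><psi_i| for an ensemble of (probability, state) pairs."""
    dim = len(ensemble[0][1])
    rho = [[0j] * dim for _ in range(dim)]
    for p, psi in ensemble:
        op = outer(psi)
        for r in range(dim):
            for c in range(dim):
                rho[r][c] += p * op[r][c]
    return rho

def trace(m):
    return sum(m[i][i] for i in range(len(m)))

# Equal-weight mixture of |0> and |1> gives the maximally mixed qubit state.
rho = density_matrix([(0.5, [1 + 0j, 0j]), (0.5, [0j, 1 + 0j])])
```

A valid density matrix has unit trace, which is a convenient sanity check on the construction.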
NASA Astrophysics Data System (ADS)
Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro
2016-08-01
We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which is necessary for interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some method to limit the calculation to neighbor particles is required. FDPS provides all of these functions, which are necessary for efficient parallel execution of particle-based simulations, as "templates" that are independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program of O(N^2) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10^7) to 300 ms (N = 10^9). These are currently limited by the time for the calculation of the domain decomposition and the communication necessary for the interaction calculation. We discuss how we can overcome these bottlenecks.
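The kind of simple, sequential program FDPS is designed to parallelize can be sketched as plain direct summation; this toy 2D version (units, masses, and the softening length are invented) has exactly the O(N^2) cost that the framework then hides behind tree algorithms and domain decomposition.

```python
def accelerations(positions, masses, g=1.0, eps=1e-3):
    """Direct-summation gravitational accelerations in 2D, O(N^2) pairs.
    eps is a Plummer-style softening length to avoid singularities."""
    n = len(positions)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        xi, yi = positions[i]
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - xi
            dy = positions[j][1] - yi
            r2 = dx * dx + dy * dy + eps * eps
            inv_r3 = r2 ** -1.5
            acc[i][0] += g * masses[j] * dx * inv_r3
            acc[i][1] += g * masses[j] * dy * inv_r3
    return acc
```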
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high-performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.
NASA Technical Reports Server (NTRS)
Holland, C.; Brodie, I.
1985-01-01
A test stand has been set up to measure the current fluctuation noise properties of B- and M-type dispenser cathodes in a typical TWT gun structure. Noise techniques were used to determine the work function distribution on the cathode surfaces. Significant differences between the B and M types and significant changes in the work function distribution during activation and life are found. In turn, knowledge of the expected work function can be used to accurately determine the cathode operating temperatures in a TWT structure. Noise measurements also demonstrate more sensitivity to space charge effects than the Miram method. Full automation of the measurements and computations is now required to speed up data acquisition and reduction. The complete set of equations for the space-charge-limited diode was programmed so that, given four of the five measurable variables (J, J₀, T, D, and V), the fifth could be computed. Using this program, we estimated that an rms fluctuation in the diode spacing d in the frequency range of 145 Hz to about 20 kHz of only about 10⁻⁵ A would account for the observed noise in a space-charge-limited diode with 1 mm spacing.
Schinke, Steven P; Cole, Kristin C A; Fang, Lin
2009-01-01
This study evaluated a gender-specific, computer-mediated intervention program to prevent underage drinking among early adolescent girls. Study participants were adolescent girls and their mothers from New York, New Jersey, and Connecticut. Participants completed pretests online and were randomly assigned to intervention and control arms. Intervention-arm girls and their mothers interacted with a computer program aimed at enhancing mother-daughter relationships and teaching girls skills for managing conflict, resisting media influences, refusing alcohol and drugs, and correcting peer norms about underage drinking, smoking, and drug use. After intervention, all participants (control and intervention) completed posttest and follow-up measurements. Two months following program delivery and relative to control-arm participants, intervention-arm girls and mothers had improved their mother-daughter communication skills and their perceptions and applications of parental monitoring and rule-setting relative to girls' alcohol use. Also at follow-up, intervention-arm girls had improved their conflict management and alcohol use-refusal skills; reported healthier normative beliefs about underage drinking; demonstrated greater self-efficacy about their ability to avoid underage drinking; reported less alcohol consumption in the past 7 days, 30 days, and year; and expressed lower intentions to drink as adults. Study findings modestly support the viability of a mother-daughter, computer-mediated program to prevent underage drinking among adolescent girls. The data have implications for the further development of gender-specific approaches to combat increases in alcohol and other substance use among American girls.
Meet EPA Engineer Gayle Hagler, Ph.D.
Gayle develops innovative ways to measure air pollution through field studies, data analysis, and computer modeling. She is deeply involved with a research program that explores near-roadway air pollution sources and other local air pollution emissions.
Evaluating Internal Communication: The ICA Communication Audit.
ERIC Educational Resources Information Center
Goldhaber, Gerald M.
1978-01-01
The ICA Communication Audit is described in detail as an effective measurement procedure that can help an academic institution to evaluate its internal communication system. Tools, computer programs, analysis, and feedback procedures are described and illustrated. (JMF)
User's manual for MMLE3, a general FORTRAN program for maximum likelihood parameter estimation
NASA Technical Reports Server (NTRS)
Maine, R. E.; Iliff, K. W.
1980-01-01
A user's manual for the FORTRAN IV computer program MMLE3 is presented. It is a maximum likelihood parameter estimation program capable of handling general bilinear dynamic equations of arbitrary order with measurement noise and/or state noise (process noise). The theory and use of the program are described. The basic MMLE3 program is quite general and, therefore, applicable to a wide variety of problems. The basic program can interact with a set of user-written, problem-specific routines to simplify the use of the program on specific systems. A set of user routines for the aircraft stability and control derivative estimation problem is provided with the program.
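As a toy illustration of maximum likelihood parameter estimation for a dynamic system with measurement noise (not the MMLE3 algorithm itself, which handles general bilinear systems of arbitrary order), one can fit the single parameter of a scalar model x_{k+1} = a·x_k by minimizing the Gaussian negative log-likelihood of the measurements:

```python
import math, random

def neg_log_likelihood(a, y, x0, sigma):
    # Gaussian measurement noise: the NLL is (up to a constant) the sum
    # of squared residuals between the model trajectory and the data.
    nll, x = 0.0, x0
    for yk in y:
        x *= a  # propagate the noise-free state model x_{k+1} = a * x_k
        nll += 0.5 * ((yk - x) / sigma) ** 2 + math.log(sigma * math.sqrt(2 * math.pi))
    return nll

def estimate_a(y, x0, sigma, lo=0.5, hi=1.5, steps=2000):
    # Brute-force 1-D search over candidate parameter values; a real
    # estimator would use a Newton or Gauss-Newton iteration instead.
    grid = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return min(grid, key=lambda a: neg_log_likelihood(a, y, x0, sigma))
```

With synthetic data generated at a = 0.9 plus small Gaussian noise, the grid minimizer of the NLL recovers the parameter closely.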
Computer assisted screening, correction, and analysis of historical weather measurements
NASA Astrophysics Data System (ADS)
Burnette, Dorian J.; Stahle, David W.
2013-04-01
A computer program, Historical Observation Tools (HOB Tools), has been developed to facilitate many of the calculations used by historical climatologists to develop instrumental and documentary temperature and precipitation datasets and makes them readily accessible to other researchers. The primitive methodology used by the early weather observers makes the application of standard techniques difficult. HOB Tools provides a step-by-step framework to visually and statistically assess, adjust, and reconstruct historical temperature and precipitation datasets. These routines include the ability to check for undocumented discontinuities, adjust temperature data for poor thermometer exposures and diurnal averaging, and assess and adjust daily precipitation data for undercount. This paper provides an overview of the Visual Basic.NET program and a demonstration of how it can assist in the development of extended temperature and precipitation datasets using modern and early instrumental measurements from the United States.
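One of the routines described is a check for undocumented discontinuities in a station record. A simplified sketch of such a check, a two-sample t-like step-change statistic scanned over candidate breakpoints (the tests HOB Tools actually implements may differ, e.g. standard homogeneity tests):

```python
def step_change_statistic(series, k):
    # Compare means before and after index k; a large |t|-like value
    # flags a candidate undocumented discontinuity (station move,
    # instrument change). Simplified stand-in for formal tests.
    before, after = series[:k], series[k:]
    mean = lambda xs: sum(xs) / len(xs)
    var = lambda xs, m: sum((x - m) ** 2 for x in xs) / max(len(xs) - 1, 1)
    mb, ma = mean(before), mean(after)
    pooled = ((var(before, mb) * (len(before) - 1) + var(after, ma) * (len(after) - 1))
              / (len(series) - 2)) or 1e-12
    return (ma - mb) / (pooled ** 0.5 * (1 / len(before) + 1 / len(after)) ** 0.5)

def most_likely_breakpoint(series):
    # Scan all interior breakpoints and return the most extreme one.
    ks = range(2, len(series) - 1)
    return max(ks, key=lambda k: abs(step_change_statistic(series, k)))
```

On a series with a clean 2-degree jump at index 20, the scan recovers the breakpoint exactly; a real workflow would then check the date against station metadata before adjusting.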
Design of the aerosol sampling manifold for the Southern Great Plains site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leifer, R.; Knuth, R.H.; Guggenheim, S.F.
1995-04-01
To meet the needs of the ARM program, the Environmental Measurements Laboratory (EML) has the responsibility to establish a surface aerosol measurements program at the Southern Great Plains (SGP) site in Lamont, OK. At the present time, EML has scheduled installation of five instruments at SGP: a single wavelength nephelometer, an optical particle counter (OPC), a condensation particle counter (CPC), an optical absorption monitor (OAM), and an ozone monitor. ARM's operating protocol requires that all the observational data be placed online and sent to the main computer facility in real time. EML currently maintains a computer file containing back trajectory (BT) analyses for the SGP site. These trajectories are used to characterize air mass types as they pass over the site. EML is continuing to calculate and store the resulting trajectory analyses for future use by the ARM science team.
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2013 CFR
2013-07-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2012 CFR
2012-01-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2014 CFR
2014-01-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2010 CFR
2010-07-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2011 CFR
2011-01-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
Increasing the computational efficiency of digital cross correlation by a vectorization method
NASA Astrophysics Data System (ADS)
Chang, Ching-Yuan; Ma, Chien-Ching
2017-08-01
This study presents a vectorization method for use in MATLAB programming aimed at increasing the computational efficiency of digital cross correlation in sound and images, resulting in speedups of 6.387 and 36.044 times, respectively, over looped implementations. This work bridges the gap between matrix operations and loop iteration, preserving flexibility and efficiency in program testing. This paper uses numerical simulation to verify the speedup of the proposed vectorization method, as well as experiments to measure the quantitative transient displacement response subjected to dynamic impact loading. The experiment involved the use of a high-speed camera as well as a fiber optic system to measure the transient displacement in a cantilever beam under impact from a steel ball. Experimental measurement data obtained from the two methods are in excellent agreement in both the time and frequency domains, with discrepancies of only 0.68%. Numerical and experimental results demonstrate the efficacy of the proposed vectorization method with regard to computational speed in signal processing and high precision in the correlation algorithm. We also present the source code with which to build MATLAB-executable functions on Windows as well as Linux platforms, and provide a series of examples to demonstrate the application of the proposed vectorization method.
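The reported speedup comes from replacing explicit loops with whole-array operations. The same idea can be sketched in Python/NumPy (the paper uses MATLAB; these function names are illustrative): full cross-correlation computed by a double loop versus a single vectorized call, which agree element for element:

```python
import numpy as np

def xcorr_loop(a, b):
    # Looped full cross-correlation over lags -(len(b)-1) .. len(a)-1.
    n, m = len(a), len(b)
    out = []
    for lag in range(-(m - 1), n):
        s = 0.0
        for j in range(m):
            i = lag + j
            if 0 <= i < n:
                s += a[i] * b[j]
        out.append(s)
    return np.array(out)

def xcorr_vec(a, b):
    # Vectorized equivalent: correlation is convolution with b reversed.
    return np.convolve(a, b[::-1], mode="full")
```

The vectorized form pushes the inner loops into compiled array code, which is where MATLAB and NumPy gain their speed over interpreted loops.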
Opcode counting for performance measurement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gara, Alan; Satterfield, David L.; Walkup, Robert E.
Methods, systems and computer program products are disclosed for measuring a performance of a program running on a processing unit of a processing system. In one embodiment, the method comprises informing a logic unit of each instruction in the program that is executed by the processing unit, assigning a weight to each instruction, assigning the instructions to a plurality of groups, and analyzing the plurality of groups to measure one or more metrics. In one embodiment, each instruction includes an operating code portion, and the assigning includes assigning the instructions to the groups based on the operating code portions of the instructions. In an embodiment, each type of instruction is assigned to a respective one of the plurality of groups. These groups may be combined into a plurality of sets of the groups.
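In essence, the claimed method tallies weighted instruction counts into opcode-based groups. A sketch of that bookkeeping (the opcode names, weights, and group labels here are illustrative, not from the patent):

```python
def measure(instructions, weights, groups):
    # instructions: sequence of executed opcodes, e.g. ["fmadd", "ld", ...]
    # weights: per-opcode cost weight (default 1.0 if unlisted)
    # groups: mapping opcode -> group name ("other" if unlisted)
    totals = {}
    for op in instructions:
        g = groups.get(op, "other")
        totals[g] = totals.get(g, 0.0) + weights.get(op, 1.0)
    return totals
```

For a trace of two fused multiply-adds (weight 2 each) and three memory operations, the grouped totals come out as 4 "flops" and 3 "mem", which a performance tool could then turn into metrics such as flop/byte ratios.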
Opcode counting for performance measurement
Gara, Alan; Satterfield, David L; Walkup, Robert E
2013-10-29
Methods, systems and computer program products are disclosed for measuring a performance of a program running on a processing unit of a processing system. In one embodiment, the method comprises informing a logic unit of each instruction in the program that is executed by the processing unit, assigning a weight to each instruction, assigning the instructions to a plurality of groups, and analyzing the plurality of groups to measure one or more metrics. In one embodiment, each instruction includes an operating code portion, and the assigning includes assigning the instructions to the groups based on the operating code portions of the instructions. In an embodiment, each type of instruction is assigned to a respective one of the plurality of groups. These groups may be combined into a plurality of sets of the groups.
Opcode counting for performance measurement
Gara, Alan; Satterfield, David L.; Walkup, Robert E.
2015-08-11
Methods, systems and computer program products are disclosed for measuring a performance of a program running on a processing unit of a processing system. In one embodiment, the method comprises informing a logic unit of each instruction in the program that is executed by the processing unit, assigning a weight to each instruction, assigning the instructions to a plurality of groups, and analyzing the plurality of groups to measure one or more metrics. In one embodiment, each instruction includes an operating code portion, and the assigning includes assigning the instructions to the groups based on the operating code portions of the instructions. In an embodiment, each type of instruction is assigned to a respective one of the plurality of groups. These groups may be combined into a plurality of sets of the groups.
Opcode counting for performance measurement
Gara, Alan; Satterfield, David L.; Walkup, Robert E.
2016-10-18
Methods, systems and computer program products are disclosed for measuring a performance of a program running on a processing unit of a processing system. In one embodiment, the method comprises informing a logic unit of each instruction in the program that is executed by the processing unit, assigning a weight to each instruction, assigning the instructions to a plurality of groups, and analyzing the plurality of groups to measure one or more metrics. In one embodiment, each instruction includes an operating code portion, and the assigning includes assigning the instructions to the groups based on the operating code portions of the instructions. In an embodiment, each type of instruction is assigned to a respective one of the plurality of groups. These groups may be combined into a plurality of sets of the groups.
Department of Defense High Performance Computing Modernization Program. 2007 Annual Report
2008-03-01
Applications of Time-Accurate CFD in Order to Account for Blade-Row Interactions and Distortion Transfer in the Design of… (Directorate, Kirtland AFB, NM; …Patterson AFB, OH). Direct Numerical Simulations of Active Control for Low-Pressure Turbine Blades, Herman Fasel, University of Arizona, Tucson, AZ (Air Force…). …interactions with the rotor wake. These HI-ARMS computations compare favorably with available wind tunnel test measurements of surface and flowfield…
NASA Technical Reports Server (NTRS)
Goldhirsh, J.
1977-01-01
Disdrometer measurements and radar reflectivity measurements were injected into a computer program to estimate the path attenuation of the signal. Predicted attenuations when compared with the directly measured ones showed generally good correlation on a case by case basis and very good agreement statistically. The utility of using radar in conjunction with disdrometer measurements for predicting fade events and long term fade distributions associated with earth-satellite telecommunications is demonstrated.
Crosswords to computers: a critical review of popular approaches to cognitive enhancement.
Jak, Amy J; Seelye, Adriana M; Jurick, Sarah M
2013-03-01
Cognitive enhancement strategies have gained recent popularity and have the potential to benefit clinical and non-clinical populations. As technology advances and the number of cognitively healthy adults seeking methods of improving or preserving cognitive functioning grows, the role of electronic (e.g., computer and video game based) cognitive training becomes more relevant and warrants greater scientific scrutiny. This paper serves as a critical review of empirical evaluations of publicly available electronic cognitive training programs. Many studies have found that electronic training approaches result in significant improvements in trained cognitive tasks. Fewer studies have demonstrated improvements in untrained tasks within the trained cognitive domain, non-trained cognitive domains, or on measures of everyday function. Successful cognitive training programs will elicit effects that generalize to untrained, practical tasks for extended periods of time. Unfortunately, many studies of electronic cognitive training programs are hindered by methodological limitations such as lack of an adequate control group, long-term follow-up and ecologically valid outcome measures. Despite these limitations, evidence suggests that computerized cognitive training has the potential to positively impact one's sense of social connectivity and self-efficacy.
The IAEA neutron coincidence counting (INCC) and the DEMING least-squares fitting programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krick, M.S.; Harker, W.C.; Rinard, P.M.
1998-12-01
Two computer programs are described: (1) the INCC (IAEA or International Neutron Coincidence Counting) program and (2) the DEMING curve-fitting program. The INCC program is an IAEA version of the Los Alamos NCC (Neutron Coincidence Counting) code. The DEMING program is an upgrade of earlier Windows® and DOS codes with the same name. The versions described are INCC 3.00 and DEMING 1.11. The INCC and DEMING codes provide inspectors with the software support needed to perform calibration and verification measurements with all of the neutron coincidence counting systems used in IAEA inspections for the nondestructive assay of plutonium and uranium.
NASA Technical Reports Server (NTRS)
Hazelton, R. C.; Yadlowsky, E. J.; Churchill, R. J.; Parker, L. W.; Sellers, B.
1981-01-01
The effect of differential charging of spacecraft thermal control surfaces is assessed by studying the dynamics of the charging process. A program to experimentally validate a computer model of the charging process was established. Time-resolved measurements of the surface potential were obtained for samples of Kapton and Teflon irradiated with a monoenergetic electron beam. Results indicate that the computer model and experimental measurements agree well and that for Teflon, secondary emission is the governing factor. Experimental data indicate that bulk conductivities play a significant role in the charging of Kapton.
NASA Astrophysics Data System (ADS)
Ceres, M.; Heselton, L. R., III
1981-11-01
This manual describes the computer programs for the FIREFINDER Digital Topographic Data Verification-Library-Dubbing System (FFDTDVLDS), and will assist in the maintenance of these programs. The manual contains detailed flow diagrams and associated descriptions for each computer program routine and subroutine. Complete computer program listings are also included. This information should be used when changes are made in the computer programs. The operating system has been designed to minimize operator intervention.
Measurement of Loneliness Among Clients Representing Four Stages of Cancer: An Exploratory Study.
1985-03-01
…status, and membership in organizations for each client were entered into an SPSS program on a mainframe computer. The means and a one-way analysis of…
ERIC Educational Resources Information Center
Amodeo, Luiza B.; Emslie, Julia Rosa
Mathematics anxiety and competence of 57 Anglo and Hispanic pre-service teachers were measured before and after a 30-hour workshop using the training program EQUALS. Students were divided into three groups: elementary, secondary, and library media. Students in the library media class served as the control group; the other two groups were the…
AV Programs for Computer Know-How.
ERIC Educational Resources Information Center
Mandell, Phyllis Levy
1985-01-01
Lists 44 audiovisual programs (most released between 1983 and 1984) grouped in seven categories: computers in society, introduction to computers, computer operations, languages and programming, computer graphics, robotics, computer careers. Excerpts from "School Library Journal" reviews, price, and intended grade level are included. Names…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, G.; Mansur, D.L.; Ruhter, W.D.
1994-10-01
This report presents the details of the Lawrence Livermore National Laboratory safeguards and security program. This program is focused on developing new technology, such as x- and gamma-ray spectrometry, for measurement of special nuclear materials. This program supports the Office of Safeguards and Security in the following five areas: safeguards technology, safeguards and decision support, computer security, automated physical security, and automated visitor access control systems.
NASA Technical Reports Server (NTRS)
Tompkins, F. G.
1983-01-01
The report presents guidance for the NASA Computer Security Program Manager and the NASA Center Computer Security Officials as they develop training requirements and implement computer security training programs. NASA audiences are categorized based on the computer security knowledge required to accomplish identified job functions. Training requirements, in terms of training subject areas, are presented for both computer security program management personnel and computer resource providers and users. Sources of computer security training are identified.
The Tacitness of Tacitus. A Methodological Approach to European Thought. No. 46.
ERIC Educational Resources Information Center
Bierschenk, Bernhard
This study presents the analysis of verbal flows by means of volume-elasticity measures, together with the analysis of information flow structures and their representation in the form of a metaphysical cube. A special-purpose system of computer programs (PERTEX) was used to establish the language space in which the textual flow patterns occurred containing…
The Gaertner L119 ellipsometer and its use in the measurement of thin films
NASA Technical Reports Server (NTRS)
Linkous, M.
1973-01-01
An introduction to the study of ellipsometry is presented, with special attention given to the Gaertner model L119 ellipsometer and the techniques of measuring thin films with this instrument. Values obtained from the ellipsometer are analyzed by a computer program for a determination of optical constants and thickness of the film.
ERIC Educational Resources Information Center
Wen, Pey-Shan
2009-01-01
Individuals with moderate to severe TBI often need extensive rehabilitation. To verify the effectiveness of intervention and design rehabilitation programs that meet individual's needs, precise and efficient outcome measures are crucial. Current assessments for TBI either focus on measuring impairments, such as neuropsychological tests or lack of…
NASA Technical Reports Server (NTRS)
Morin, Bruce L.
2010-01-01
Pratt & Whitney has developed a Broadband Fan Noise Prediction System (BFaNS) for turbofan engines. This system computes the noise generated by turbulence impinging on the leading edges of the fan and fan exit guide vane, and noise generated by boundary-layer turbulence passing over the fan trailing edge. BFaNS has been validated on three fan rigs that were tested during the NASA Advanced Subsonic Technology Program (AST). The predicted noise spectra agreed well with measured data. The predicted effects of fan speed, vane count, and vane sweep also agreed well with measurements. The noise prediction system consists of two computer programs: Setup_BFaNS and BFaNS. Setup_BFaNS converts user-specified geometry and flow-field information into a BFaNS input file. From this input file, BFaNS computes the inlet and aft broadband sound power spectra generated by the fan and FEGV. The output file from BFaNS contains the inlet, aft and total sound power spectra from each noise source. This report is the second volume of a three-volume set documenting the Broadband Fan Noise Prediction System: Volume 1: Setup_BFaNS User's Manual and Developer's Guide; Volume 2: BFaNS User's Manual and Developer's Guide; and Volume 3: Validation and Test Cases. The present volume begins with an overview of the Broadband Fan Noise Prediction System, followed by step-by-step instructions for installing and running BFaNS. It concludes with technical documentation of the BFaNS computer program.
NASA Technical Reports Server (NTRS)
Morin, Bruce L.
2010-01-01
Pratt & Whitney has developed a Broadband Fan Noise Prediction System (BFaNS) for turbofan engines. This system computes the noise generated by turbulence impinging on the leading edges of the fan and fan exit guide vane, and noise generated by boundary-layer turbulence passing over the fan trailing edge. BFaNS has been validated on three fan rigs that were tested during the NASA Advanced Subsonic Technology Program (AST). The predicted noise spectra agreed well with measured data. The predicted effects of fan speed, vane count, and vane sweep also agreed well with measurements. The noise prediction system consists of two computer programs: Setup_BFaNS and BFaNS. Setup_BFaNS converts user-specified geometry and flow-field information into a BFaNS input file. From this input file, BFaNS computes the inlet and aft broadband sound power spectra generated by the fan and FEGV. The output file from BFaNS contains the inlet, aft and total sound power spectra from each noise source. This report is the first volume of a three-volume set documenting the Broadband Fan Noise Prediction System: Volume 1: Setup_BFaNS User's Manual and Developer's Guide; Volume 2: BFaNS User's Manual and Developer's Guide; and Volume 3: Validation and Test Cases. The present volume begins with an overview of the Broadband Fan Noise Prediction System, followed by step-by-step instructions for installing and running Setup_BFaNS. It concludes with technical documentation of the Setup_BFaNS computer program.
Wiksten, D L; Patterson, P; Antonio, K; De La Cruz, D; Buxton, B P
1998-07-01
To evaluate the effectiveness of an interactive athletic training educational curriculum (IATEC) computer program as compared with traditional lecture instruction. Instructions on assessment of the quadriceps angle (Q-angle) were compared. Dependent measures consisted of cognitive knowledge, practical skill assessment, and attitudes toward the 2 methods of instruction. Sixty-six subjects were selected and then randomly assigned to 3 different groups: traditional lecture, IATEC, and control. The traditional lecture group (n = 22) received a 50-minute lecture/demonstration covering the same instructional content as the Q-angle module of the IATEC program. The IATEC group (n = 20; 2 subjects were dropped from this group due to scheduling conflicts) worked independently for 50 to 65 minutes using the Q-angle module of the IATEC program. The control group (n = 22) received no instruction. Subjects were recruited from an undergraduate athletic training education program and were screened for prior knowledge of the Q-angle. A 9-point multiple choice examination was used to determine cognitive knowledge of the Q-angle. A 12-point yes-no checklist was used to determine whether or not the subjects were able to correctly measure the Q-angle. The Allen Attitude Toward Computer-Assisted Instruction Semantic Differential Survey was used to assess student attitudes toward the 2 methods of instruction. The survey examined overall attitudes, in addition to 3 subscales: comfort, creativity, and function. The survey was scored from 1 to 7, with 7 being the most favorable and 1 being the least favorable. Results of a 1-way ANOVA on cognitive knowledge of the Q-angle revealed that the traditional lecture and IATEC groups performed significantly better than the control group, and the traditional lecture group performed significantly better than the IATEC group. 
Results of a 1-way ANOVA on practical skill performance revealed that the traditional lecture and IATEC groups performed significantly better than the control group, but there were no significant differences between the traditional lecture and IATEC groups on practical skill performance. Results of a t test indicated significantly more favorable attitudes (P < .05) for the traditional lecture group when compared with the IATEC group for comfort, creativity, and function. Our results suggest that use of the IATEC computer module is an effective means of instruction; however, use of the IATEC program alone may not be sufficient for educating students in cognitive knowledge. Further research is needed to determine the effectiveness of the IATEC computer program as a supplement to traditional lecture instruction in athletic training education.
Raffaelli, Marcela; Armstrong, Jessica; Tran, Steve P; Griffith, Aisha N; Walker, Kathrin; Gutierrez, Vanessa
2016-06-01
Computer-assisted data collection offers advantages over traditional paper and pencil measures; however, little guidance is available regarding the logistics of conducting computer-assisted data collection with adolescents in group settings. To address this gap, we draw on our experiences conducting a multi-site longitudinal study of adolescent development. Structured questionnaires programmed on laptop computers using Audio Computer Assisted Self-Interviewing (ACASI) were administered to groups of adolescents in community-based and afterschool programs. Although implementing ACASI required additional work before entering the field, we benefited from reduced data processing time, high data quality, and high levels of youth motivation. Preliminary findings from an ethnically diverse sample of 265 youth indicate favorable perceptions of using ACASI. Using our experiences as a case study, we provide recommendations on selecting an appropriate data collection device (including hardware and software), preparing and testing the ACASI, conducting data collection in the field, and managing data. Copyright © 2016 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.
Ranking Surgical Residency Programs: Reputation Survey or Outcomes Measures?
Wilson, Adam B; Torbeck, Laura J; Dunnington, Gary L
2015-01-01
The release of general surgery residency program rankings by Doximity and U.S. News & World Report accentuates the need to define and establish measurable standards of program quality. This study evaluated the extent to which program rankings based solely on peer nominations correlated with familiar program outcomes measures. Publicly available data were collected for all 254 general surgery residency programs. To generate a rudimentary outcomes-based program ranking, surgery programs were rank-ordered according to an average percentile rank that was calculated using board pass rates and the prevalence of alumni publications. A Kendall τ-b rank correlation computed the linear association between program rankings based on reputation alone and those derived from outcomes measures to validate whether reputation was a reasonable surrogate for globally judging program quality. For the 218 programs with complete data eligible for analysis, the mean board pass rate was 72% with a standard deviation of 14%. A total of 60 programs were placed in the 75th percentile or above for the number of publications authored by program alumni. The correlational analysis reported a significant correlation of 0.428, indicating only a moderate association between programs ranked by outcomes measures and those ranked according to reputation. Seventeen programs that were ranked in the top 30 according to reputation were also ranked in the top 30 based on outcomes measures. This study suggests that reputation alone does not fully capture a representative snapshot of a program's quality. Rather, the use of multiple quantifiable indicators and attributes unique to programs ought to be given more consideration when assigning ranks to denote program quality. It is advised that the interpretation and subsequent use of program rankings be met with caution until further studies can rigorously demonstrate best practices for awarding program standings. 
Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
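The Kendall τ-b comparison of the two rankings can be sketched as follows. With no tied ranks, τ-b reduces to the plain concordant-minus-discordant count; the program ranks below are invented for illustration, since the study's per-program data are not given in this abstract.

```python
# Kendall rank correlation between two rankings of the same programs.
# Assumes no ties, in which case tau-b equals the simple tau below.
def kendall_tau(rank_a, rank_b):
    """Concordant/discordant pair count over all pairs of items."""
    n = len(rank_a)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            sign = (rank_a[i] - rank_a[j]) * (rank_b[i] - rank_b[j])
            if sign > 0:
                concordant += 1    # the pair is ordered the same way in both
            elif sign < 0:
                discordant += 1    # the pair is ordered oppositely
    return (concordant - discordant) / (n * (n - 1) / 2)

reputation_rank = [1, 2, 3, 4, 5, 6, 7, 8]   # hypothetical ranks
outcomes_rank   = [2, 1, 4, 3, 6, 5, 8, 7]   # hypothetical ranks
tau = kendall_tau(reputation_rank, outcomes_rank)
```

A value near 1 would indicate the reputation ranking is a good surrogate for the outcomes ranking; the study's observed 0.428 sits well below that.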
NASA Technical Reports Server (NTRS)
Ketchum, Eleanor A. (Inventor)
2000-01-01
A computer-implemented method and apparatus for determining the position of a vehicle to within 100 km, autonomously, from magnetic field measurements and attitude data, without a priori knowledge of position. An inverted dipole solution yielding two possible position solutions for each measurement of magnetic field data is deterministically calculated by a program-controlled processor solving the inverted first-order spherical harmonic representation of the geomagnetic field for two unit position vectors 180 degrees apart and a vehicle distance from the center of the earth. Correction schemes such as successive substitution and the Newton-Raphson method are applied to each dipole solution. The two position solutions for each measurement are saved separately. Velocity vectors for the position solutions are calculated so that a total energy difference for each of the two resultant position paths is computed. The position path with the smaller absolute total energy difference is chosen as the true position path of the vehicle.
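The Newton-Raphson correction mentioned above can be sketched for the simplest part of the problem: recovering the radial distance from a measured field magnitude under a pure dipole model. The field constant and the starting guess are illustrative assumptions, not the patent's values.

```python
import math

B0 = 3.0e-5      # tesla; approximate equatorial surface dipole field (assumption)
RE = 6371.0      # km; mean Earth radius

def dipole_field(r_km, mag_lat_rad):
    """First-order (dipole) geomagnetic field magnitude at distance r
    and magnetic latitude lambda: B = B0 (RE/r)^3 sqrt(1 + 3 sin^2 lambda)."""
    return B0 * (RE / r_km) ** 3 * math.sqrt(1.0 + 3.0 * math.sin(mag_lat_rad) ** 2)

def solve_radius(b_measured, mag_lat_rad, r0=7000.0, tol=1e-9, max_iter=50):
    """Newton-Raphson on f(r) = dipole_field(r) - b_measured.
    Since dipole_field is proportional to r^-3, df/dr = -3 * dipole_field / r."""
    r = r0
    for _ in range(max_iter):
        f = dipole_field(r, mag_lat_rad) - b_measured
        df = -3.0 * dipole_field(r, mag_lat_rad) / r
        step = f / df
        r -= step
        if abs(step) < tol:
            break
    return r
```

The full method iterates a similar correction on both candidate dipole positions (180 degrees apart) before the energy-difference test selects the true path.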
Automatic data partitioning on distributed memory multicomputers. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Gupta, Manish
1992-01-01
Distributed-memory parallel computers are increasingly being used to provide high levels of performance for scientific applications. Unfortunately, such machines are not very easy to program. A number of research efforts seek to alleviate this problem by developing compilers that take over the task of generating communication. The communication overheads and the extent of parallelism exploited in the resulting target program are determined largely by the manner in which data is partitioned across different processors of the machine. Most of the compilers provide no assistance to the programmer in the crucial task of determining a good data partitioning scheme. A novel approach is presented, the constraints-based approach, to the problem of automatic data partitioning for numeric programs. In this approach, the compiler identifies some desirable requirements on the distribution of various arrays being referenced in each statement, based on performance considerations. These desirable requirements are referred to as constraints. For each constraint, the compiler determines a quality measure that captures its importance with respect to the performance of the program. The quality measure is obtained through static performance estimation, without actually generating the target data-parallel program with explicit communication. Each data distribution decision is taken by combining all the relevant constraints. The compiler attempts to resolve any conflicts between constraints such that the overall execution time of the parallel program is minimized. This approach has been implemented as part of a compiler called Paradigm, which accepts Fortran 77 programs and specifies the partitioning scheme to be used for each array in the program. We have obtained results on some programs taken from the Linpack and Eispack libraries, and the Perfect Benchmarks.
These results are quite promising, and demonstrate the feasibility of automatic data partitioning for a significant class of scientific application programs with regular computations.
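The constraint-combining step described above can be sketched as a toy cost model: each statement contributes a preferred distribution (a "constraint") together with a quality measure, modeled here as the estimated cost penalty if the preference is violated, and the driver picks the distribution with the lowest total penalty. All numbers and the two-way rows/columns choice are invented for illustration; the thesis's actual cost model is far richer.

```python
# Each constraint: a preferred array distribution plus a quality measure
# (the estimated communication cost incurred if the preference is violated).
constraints = [
    {"prefer": "rows",    "penalty_if_violated": 120.0},  # e.g. a row-wise sweep
    {"prefer": "columns", "penalty_if_violated":  45.0},  # e.g. a column reduction
    {"prefer": "rows",    "penalty_if_violated":  80.0},
]

def total_penalty(distribution, constraints):
    """Sum the penalties of every constraint the chosen distribution violates."""
    return sum(c["penalty_if_violated"] for c in constraints
               if c["prefer"] != distribution)

# Conflict resolution: pick the distribution minimizing estimated cost.
best = min(("rows", "columns"), key=lambda d: total_penalty(d, constraints))
```

Here the two row-preferring constraints outweigh the single column constraint, so a row distribution wins despite the conflict.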
Computer programs: Operational and mathematical, a compilation
NASA Technical Reports Server (NTRS)
1973-01-01
Several computer programs which are available through the NASA Technology Utilization Program are outlined. Presented are: (1) Computer operational programs which can be applied to resolve procedural problems swiftly and accurately. (2) Mathematical applications for the resolution of problems encountered in numerous industries. Although the functions which these programs perform are not new and similar programs are available in many large computer center libraries, this collection may be of use to centers with limited systems libraries and for instructional purposes for new computer operators.
Computer Simulation Of An In-Process Surface Finish Sensor.
NASA Astrophysics Data System (ADS)
Rakels, Jan H.
1987-01-01
It is generally accepted that optical methods are the most promising for the in-process measurement of surface finish. These methods have the advantages of being non-contacting and of fast data acquisition. Furthermore, these optical instruments can be easily retrofitted on existing machine tools. In the Micro-Engineering Centre at the University of Warwick, an optical sensor has been developed which can measure the rms roughness, slope and wavelength of turned and precision-ground surfaces during machining. The operation of this device is based upon the Kirchhoff-Fresnel diffraction integral. Application of this theory to ideal turned and ground surfaces is straightforward, and indeed the calculated diffraction patterns are in close agreement with patterns produced by an actual optical instrument. Since it is mathematically difficult to introduce real machine-tool behaviour into the diffraction integral, a computer program has been devised which simulates the operation of the optical sensor. The program produces a diffraction pattern as graphical output. Comparison between computer-generated and actual diffraction patterns of the same surfaces shows a high correlation. The main aim of this program is to construct an atlas which maps known machine-tool errors to optical diffraction patterns. This atlas can then be used for machine-tool condition diagnostics. It has been found that optical monitoring is very sensitive to minor defects; therefore machine-tool deterioration can be detected before it becomes detrimental.
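A minimal sketch of such a simulation, under the Kirchhoff approximation at normal incidence: the reflected field picks up a phase of 2kh(x) from the surface profile h(x), and the far-field diffraction pattern is the Fourier transform of that field. The laser wavelength, feed rate, and mark amplitude below are invented stand-ins for an ideal turned surface.

```python
import numpy as np

wavelength = 0.6328e-6            # m; HeNe laser (assumption)
k = 2 * np.pi / wavelength

# Ideal turned surface: periodic tool marks modelled as a sinusoid.
n = 4096
x = np.linspace(0.0, 2e-3, n, endpoint=False)   # 2 mm of surface
feed = 100e-6                                   # mark spacing (20 periods in window)
amplitude = 20e-9                               # 20 nm peak height
h = amplitude * np.sin(2 * np.pi * x / feed)

# Kirchhoff approximation: reflected field phase is 2*k*h(x);
# the far field is the Fourier transform of that field.
field = np.exp(2j * k * h)
pattern = np.abs(np.fft.fftshift(np.fft.fft(field))) ** 2
pattern /= pattern.max()          # normalized intensity
```

For a shallow surface the specular (central) order dominates, with diffraction orders at multiples of the feed frequency; real machine-tool errors would redistribute energy among these orders, which is what the atlas exploits.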
Computer Literacy Project. A General Orientation in Basic Computer Concepts and Applications.
ERIC Educational Resources Information Center
Murray, David R.
This paper proposes a two-part, basic computer literacy program for university faculty, staff, and students with no prior exposure to computers. The program described would introduce basic computer concepts and computing center service programs and resources; provide fundamental preparation for other computer courses; and orient faculty towards…
NASA Astrophysics Data System (ADS)
Czerepicki, A.; Koniak, M.
2017-06-01
The paper presents a method of modelling the aging processes of lithium-ion batteries, its implementation as a computer application, and results for battery state estimation. The authors use a previously developed behavioural battery model, which was built using battery operating characteristics obtained from experiment. This model was implemented in the form of a computer program using a database to store battery characteristics. The battery aging process is a new, extended functionality of the model. The computer simulation algorithm uses real measurements of battery capacity as a function of the number of battery charge and discharge cycles. The simulation takes into account incomplete charge or discharge cycles, which are characteristic of transport powered by electricity. The developed model was used to simulate battery state estimation for different load profiles, obtained by measuring the movement of selected means of transport.
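The capacity-versus-cycles lookup and the handling of incomplete cycles can be sketched as follows. The measured curve is invented, and counting each partial cycle as its depth-of-discharge fraction of a full equivalent cycle is a common simplification standing in for the authors' actual scheme.

```python
import numpy as np

# Measured capacity (Ah) vs. full equivalent charge/discharge cycles --
# invented numbers standing in for the experimental characteristics.
cycles_meas   = np.array([0, 200, 400, 600, 800, 1000])
capacity_meas = np.array([40.0, 38.5, 37.2, 35.8, 34.1, 32.0])

def capacity_after(equivalent_cycles):
    """Interpolate the aging curve at a (possibly fractional) cycle count."""
    return float(np.interp(equivalent_cycles, cycles_meas, capacity_meas))

def accumulate_cycles(depth_of_discharge_events):
    """Incomplete cycles, typical of electric transport: each event counts
    as its depth-of-discharge fraction of one full equivalent cycle."""
    return sum(depth_of_discharge_events)
```

For example, two half-discharges and a full discharge would accumulate two equivalent cycles, moving the battery that far along the measured aging curve.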
Multicolor pyrometer for materials processing in space
NASA Technical Reports Server (NTRS)
Frish, M. B.; Frank, J.; Baker, J. E.; Foutter, R. R.; Beerman, H.; Allen, M. G.
1990-01-01
This report documents the work performed by Physical Sciences Inc. (PSI), under contract to NASA JPL, during a 2.5-year SBIR Phase 2 Program. The program goals were to design, construct, and program a prototype passive imaging pyrometer capable of measuring, as accurately as possible, and controlling the temperature distribution across the surface of a moving object suspended in space. These goals were achieved and the instrument was delivered to JPL in November 1989. The pyrometer utilizes an optical system which operates at short wavelengths compared to the peak of the black-body spectrum for the temperature range of interest, thus minimizing errors associated with a lack of knowledge about the heated sample's emissivity. To cover temperatures from 900 to 2500 K, six wavelengths are available. The preferred wavelength for measurement of a particular temperature decreases as the temperature increases. Images at all six wavelengths are projected onto a single CCD camera concurrently. The camera and optical system have been calibrated to relate the measured intensity at each pixel to the temperature of the heated object. The output of the camera is digitized by a frame grabber installed in a personal computer and analyzed automatically to yield temperature information. The data can be used in a feedback loop to alter the status of computer-activated switches and thereby control a heating system.
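The short-wavelength strategy described above can be illustrated with the Wien approximation to Planck's law, which is accurate precisely in the regime the pyrometer exploits (lambda*T much less than C2) and makes the radiance-to-temperature inversion closed-form. This is a textbook sketch, not PSI's calibration procedure.

```python
import math

C1 = 3.7418e-16   # W*m^2; first radiation constant
C2 = 1.4388e-2    # m*K;  second radiation constant

def wien_radiance(wavelength_m, temp_k, emissivity=1.0):
    """Wien approximation to spectral radiance: valid when lambda*T << C2,
    i.e. at wavelengths short of the black-body peak."""
    return emissivity * C1 / wavelength_m ** 5 * math.exp(-C2 / (wavelength_m * temp_k))

def invert_temperature(radiance, wavelength_m, emissivity=1.0):
    """Closed-form inversion of the Wien approximation for temperature."""
    return C2 / (wavelength_m * math.log(emissivity * C1 / (radiance * wavelength_m ** 5)))
```

At short wavelengths the exponential dominates, so a fractional error in assumed emissivity perturbs the inferred temperature by only roughly (lambda*T/C2) of that fraction, which is why unknown emissivity matters less there.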
Alabama NASA EPSCoR Preparation Grant Program: Grant No. NCC5-391
NASA Technical Reports Server (NTRS)
Gregory, John C.
2003-01-01
The funded research projects under the Experimental Program to Stimulate Cooperative Research (EPSCoR) grant program and the student fellowship awards are summarized in this report. The projects include: 1) Crystallization of Dehydratase/DcoH: A Target in Lung Disease; 2) Measuring Velocity Profiles in Liquid Metals using an Ultrasonic Doppler Velocimeter; 3) Synthesis, Structure, and Properties of New Thermoelectric Materials; 4) Computational Determination of Structures and Reactivity of Phenol-Formaldehyde Resins; 5) Synthesis of Microbial Polyesters in the NASA Bioreactor; 6) Visualization of Flow-Fields in Magnetocombustion; 7) Synthesis of Fluorescent Saccharide Derivatives. The student fellowship awards include: 1) Distributed Fusion of Satellite Images; 2) Study of the Relationship between Urban Development, Local Climate, and Water Quality for the Atlanta, Georgia Metrop; 3) Computer Simulation of the Effectiveness of a Spring-Loaded Exercise Device.
Preliminary demonstration of a robust controller design method
NASA Technical Reports Server (NTRS)
Anderson, L. R.
1980-01-01
Alternative computational procedures for obtaining a feedback control law which yields a control signal based on measurable quantities are evaluated. The three methods evaluated are: (1) the standard linear quadratic regulator design model; (2) minimization of the norm of the feedback matrix K via nonlinear programming, subject to the constraint that the closed-loop eigenvalues be in a specified domain in the complex plane; and (3) maximization of the angles between the closed-loop eigenvectors in combination with minimization of the norm of K, also via constrained nonlinear programming. The third, or robust, design method was chosen to yield a closed-loop system whose eigenvalues are insensitive to small changes in the A and B matrices. The relationship between orthogonality of closed-loop eigenvectors and the sensitivity of closed-loop eigenvalues is described. Computer programs are described.
2010-05-01
[Fragmentary excerpt from a report on federal cloud computing programs. Recoverable figure titles: Figure 2, Cloud Computing Deployment Models; Figure 3, NIST Essential Characteristics; Figure 4, NASA Nebula Container. The surviving text fragments discuss the DOD's Rapid Access Computing Environment (RACE) program, NASA's Nebula program, and the Department of Transportation's CARS program, including lessons learned.]
Mueller, David S.
2016-05-12
The software program QRev computes the discharge from moving-boat acoustic Doppler current profiler measurements using data collected with any of the Teledyne RD Instruments or SonTek bottom-tracking acoustic Doppler current profilers. The computation of discharge is independent of the manufacturer of the acoustic Doppler current profiler because QRev applies consistent algorithms independent of the data source. In addition, QRev automates filtering and quality checking of the collected data and provides feedback to the user on potential quality issues with the measurement. Various statistics and characteristics of the measurement, in addition to a simple uncertainty assessment, are provided to the user to assist them in properly rating the measurement. QRev saves an extensible markup language file that can be imported into databases or electronic field notes software. The user interacts with QRev through a tablet-friendly graphical user interface. This report is the manual for version 2.8 of QRev.
CPMIP: measurements of real computational performance of Earth system models in CMIP6
NASA Astrophysics Data System (ADS)
Balaji, Venkatramani; Maisonnave, Eric; Zadeh, Niki; Lawrence, Bryan N.; Biercamp, Joachim; Fladrich, Uwe; Aloisio, Giovanni; Benson, Rusty; Caubel, Arnaud; Durachta, Jeffrey; Foujols, Marie-Alice; Lister, Grenville; Mocavero, Silvia; Underwood, Seth; Wright, Garrett
2017-01-01
A climate model represents a multitude of processes on a variety of timescales and space scales: a canonical example of multi-physics multi-scale modeling. The underlying climate system is physically characterized by sensitive dependence on initial conditions, and natural stochastic variability, so very long integrations are needed to extract signals of climate change. Algorithms generally possess weak scaling and can be I/O and/or memory-bound. Such weak-scaling, I/O, and memory-bound multi-physics codes present particular challenges to computational performance. Traditional metrics of computational efficiency such as performance counters and scaling curves do not tell us enough about real sustained performance from climate models on different machines. They also do not provide a satisfactory basis for comparative information across models. We introduce a set of metrics that can be used for the study of computational performance of climate (and Earth system) models. These measures do not require specialized software or specific hardware counters, and should be accessible to anyone. They are independent of platform and underlying parallel programming models. We show how these metrics can be used to measure actually attained performance of Earth system models on different machines, and identify the most fruitful areas of research and development for performance engineering. We present results for these measures for a diverse suite of models from several modeling centers, and propose to use these measures as a basis for a CPMIP, a computational performance model intercomparison project (MIP).
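Two of the CPMIP metrics illustrate why no hardware counters are needed: simulated years per day (SYPD) and core-hours per simulated year (CHSY) follow from bookkeeping quantities any modeling center already records. The example numbers are invented.

```python
def sypd(simulated_years, wallclock_days):
    """SYPD: simulated years per wallclock day -- the speed metric."""
    return simulated_years / wallclock_days

def chsy(core_count, simulated_years, wallclock_hours):
    """CHSY: core-hours consumed per simulated year -- the cost metric."""
    return core_count * wallclock_hours / simulated_years

# Example: a model advancing 10 simulated years in 2 days on 1000 cores.
speed = sypd(10, 2)           # simulated years per day
cost = chsy(1000, 10, 48)     # core-hours per simulated year
```

Speed and cost pull in opposite directions as core counts grow, which is exactly the trade-off such platform-independent metrics make comparable across machines and models.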
An analysis of the 70-meter antenna hydrostatic bearing by means of computer simulation
NASA Technical Reports Server (NTRS)
Bartos, R. D.
1993-01-01
Recently, the computer program 'A Computer Solution for Hydrostatic Bearings with Variable Film Thickness,' used to design the hydrostatic bearing of the 70-meter antennas, was modified to improve the accuracy with which the program predicts the film height profile and oil pressure distribution between the hydrostatic bearing pad and the runner. This article describes the modified program, the theory upon which its computations are based, and the computer simulation results, together with a discussion of those results.
Choi, Catherine J; Lefebvre, Daniel R; Yoon, Michael K
2016-06-01
The aim of this article is to validate the accuracy of Facial Assessment by Computer Evaluation (FACE) program in eyelid measurements. Sixteen subjects between the ages of 27 and 65 were included with IRB approval. Clinical measurements of upper eyelid margin reflex distance (MRD1) and inter-palpebral fissure (IPF) were obtained. Photographs were then taken with a digital single lens reflex camera with built-in pop-up flash (dSLR-pop) and a dSLR with lens-mounted ring flash (dSLR-ring) with the cameras upright, rotated 90, 180, and 270 degrees. The images were analyzed using both the FACE and ImageJ software to measure MRD1 and IPF. Thirty-two eyes of sixteen subjects were included. Comparison of clinical measurement of MRD1 and IPF with FACE measurements of photos in upright position showed no statistically significant differences for dSLR-pop (MRD1: p = 0.0912, IPF: p = 0.334) and for dSLR-ring (MRD1: p = 0.105, IPF: p = 0.538). One-to-one comparison of MRD1 and IPF measurements in four positions obtained with FACE versus ImageJ for dSLR-pop showed moderate to substantial agreement for MRD1 (intraclass correlation coefficient = 0.534 upright, 0.731 in 90 degree rotation, 0.627 in 180 degree rotation, 0.477 in 270 degree rotation) and substantial to excellent agreement in IPF (ICC = 0.740, 0.859, 0.849, 0.805). In photos taken with dSLR-ring, there was excellent agreement of all MRD1 (ICC = 0.916, 0.932, 0.845, 0.812) and IPF (ICC = 0.937, 0.938, 0.917, 0.888) values. The FACE program is a valid method for measuring margin reflex distance and inter-palpebral fissure.
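The intraclass correlation coefficients reported in these agreement studies can be computed from a subjects-by-raters matrix. The sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single measurement, in Shrout-Fleiss terms); the abstracts do not state which ICC form was used, so this form is an assumption, and the example matrix is invented.

```python
import numpy as np

def icc_2_1(x):
    """ICC(2,1) from an n-subjects x k-raters data matrix,
    via the standard two-way ANOVA mean squares."""
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)          # per-subject means
    col_means = x.mean(axis=0)          # per-rater means
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between raters
    resid = x - row_means[:, None] - col_means[None, :] + grand
    mse = np.sum(resid ** 2) / ((n - 1) * (k - 1))          # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

A constant offset between two raters lowers ICC(2,1) slightly (absolute agreement penalizes systematic bias), while a consistency-type ICC would not.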
Computer program documentation for the patch subsampling processor
NASA Technical Reports Server (NTRS)
Nieves, M. J.; Obrien, S. O.; Oney, J. K. (Principal Investigator)
1981-01-01
The programs presented are intended to provide a way to extract a sample from a full-frame scene and summarize it in a useful way. The sample in each case was chosen to fill a 512-by-512 pixel (sample-by-line) image since this is the largest image that can be displayed on the Integrated Multivariant Data Analysis and Classification System. This sample size provides one megabyte of data for manipulation and storage and contains about 3% of the full-frame data. A patch image processor computes means for 256 32-by-32 pixel squares which constitute the 512-by-512 pixel image. Thus, 256 measurements are available for 8 vegetation indexes over a 100-mile square.
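The patch image processor's core reduction, 256 means over the 32-by-32 pixel squares of a 512-by-512 image, is a clean reshape-and-average in array terms. The pixel values below are synthetic stand-ins for scene data.

```python
import numpy as np

# A 512x512 sample reduced to 256 patch means of 32x32 blocks,
# as the patch image processor does (synthetic pixel values).
image = np.arange(512 * 512, dtype=float).reshape(512, 512)

# Give each 32x32 patch its own pair of axes, then average over them:
# (16 row-blocks, 32 rows, 16 col-blocks, 32 cols) -> (16, 16).
patch_means = image.reshape(16, 32, 16, 32).mean(axis=(1, 3))
```

Each of the 256 entries of `patch_means` is one of the measurements available to the vegetation-index computations.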
NASA Technical Reports Server (NTRS)
Ludewig, M.; Omori, S.; Rao, G. L.
1974-01-01
Tests were conducted to determine experimental pressure drop and velocity data for water flowing through woven screens. The materials used were Dutch twill and square weave fabrics. Pressure drop measurements were made at four locations in a rectangular channel. The data are presented as the change in pressure versus the average entry velocity, which is determined by dividing the volumetric flow rate by the screen area open to flow. The equations of continuity and momentum are presented. A computer program listing, an extension of a theoretical model, and data from that computer program are included.
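Pressure-drop data of this kind are commonly fitted to a viscous-plus-inertial form, dP = aV + bV^2; that form and all coefficients below are assumptions for illustration, not taken from this report.

```python
import numpy as np

# Hypothetical pressure-drop data for flow through a woven screen,
# generated from dP = a*V + b*V**2 (viscous + inertial terms).
velocity = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])    # m/s, entry velocity
a_true, b_true = 150.0, 900.0                          # invented coefficients
dp = a_true * velocity + b_true * velocity ** 2        # Pa

# Least-squares fit with no constant term: design matrix columns V, V^2.
A = np.column_stack([velocity, velocity ** 2])
(a_fit, b_fit), *_ = np.linalg.lstsq(A, dp, rcond=None)
```

With noisy channel measurements the same fit recovers the screen's effective viscous and inertial loss coefficients from the tabulated dP-versus-V data.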
DE and NLP Based QPLS Algorithm
NASA Astrophysics Data System (ADS)
Yu, Xiaodong; Huang, Dexian; Wang, Xiong; Liu, Bo
As a novel evolutionary computing technique, Differential Evolution (DE) has been considered an effective optimization method for complex optimization problems and has achieved many successful applications in engineering. In this paper, a new algorithm for Quadratic Partial Least Squares (QPLS) based on Nonlinear Programming (NLP) is presented. DE is used to solve the NLP so as to calculate the optimal input weights and the parameters of the inner relationship. The simulation results, based on the soft measurement of the diesel oil solidifying point on a real crude distillation unit, demonstrate the superiority of the proposed algorithm over linear PLS and over QPLS based on Sequential Quadratic Programming (SQP) in terms of fitting accuracy and computational cost.
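The DE optimizer at the heart of the method can be sketched in its classic DE/rand/1/bin form; the QPLS model it would train is omitted, and it is demonstrated here on a plain sphere function with invented control parameters.

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=20, f_weight=0.5, cr=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin sketch: mutation from three random vectors,
    binomial crossover, greedy selection."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds], float)
    hi = np.array([b[1] for b in bounds], float)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fitness = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: base vector plus scaled difference of two others.
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     size=3, replace=False)]
            mutant = np.clip(a + f_weight * (b - c), lo, hi)
            # Binomial crossover, guaranteeing at least one mutant gene.
            mask = rng.random(dim) < cr
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            # Greedy selection.
            ft = f(trial)
            if ft <= fitness[i]:
                pop[i], fitness[i] = trial, ft
    best = int(np.argmin(fitness))
    return pop[best], fitness[best]
```

In the paper's setting, the decision vector would hold the QPLS input weights and inner-relationship parameters, and f would be the model's fitting error.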
ERIC Educational Resources Information Center
Liberman, Eva; And Others
Many library operations involving large data banks lend themselves readily to computer operation. In setting up library computer programs, in changing or expanding programs, cost in programming and time delays could be substantially reduced if the programmers had access to library computer programs being used by other libraries, providing similar…
A performance evaluation of the IBM 370/XT personal computer
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Triantafyllopoulos, Spiros
1984-01-01
An evaluation of the IBM 370/XT personal computer is given. This evaluation focuses primarily on the use of the 370/XT for scientific and technical applications and applications development. A measurement of the capabilities of the 370/XT was performed by means of test programs which are presented. Also included is a review of facilities provided by the operating system (VM/PC), along with comments on the IBM 370/XT hardware configuration.
PERFORMANCE OF A COMPUTER-BASED ASSESSMENT OF COGNITIVE FUNCTION MEASURES IN TWO COHORTS OF SENIORS
Espeland, Mark A.; Katula, Jeffrey A.; Rushing, Julia; Kramer, Arthur F.; Jennings, Janine M.; Sink, Kaycee M.; Nadkarni, Neelesh K.; Reid, Kieran F.; Castro, Cynthia M.; Church, Timothy; Kerwin, Diana R.; Williamson, Jeff D.; Marottoli, Richard A.; Rushing, Scott; Marsiske, Michael; Rapp, Stephen R.
2013-01-01
Background Computer-administered assessment of cognitive function is being increasingly incorporated in clinical trials; however, its performance in these settings has not been systematically evaluated. Design The Seniors Health and Activity Research Program (SHARP) pilot trial (N=73) developed a computer-based tool for assessing memory performance and executive functioning. The Lifestyle Interventions and Independence for Seniors (LIFE) investigators incorporated this battery in a full scale multicenter clinical trial (N=1635). We describe relationships that test scores have with those from interviewer-administered cognitive function tests and risk factors for cognitive deficits and describe performance measures (completeness, intra-class correlations). Results Computer-based assessments of cognitive function had consistent relationships across the pilot and full scale trial cohorts with interviewer-administered assessments of cognitive function, age, and a measure of physical function. In the LIFE cohort, their external validity was further demonstrated by associations with other risk factors for cognitive dysfunction: education, hypertension, diabetes, and physical function. Acceptable levels of data completeness (>83%) were achieved on all computer-based measures; however, rates of missing data were higher among older participants (odds ratio=1.06 for each additional year; p<0.001) and those who reported no current computer use (odds ratio=2.71; p<0.001). Intra-class correlations among clinics were at least as low (ICC≤0.013) as for interviewer measures (ICC≤0.023), reflecting good standardization. All cognitive measures loaded onto the first principal component (global cognitive function), which accounted for 40% of the overall variance. Conclusion Our results support the use of computer-based tools for assessing cognitive function in multicenter clinical trials of older individuals. PMID:23589390
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-06-11
This program is a graphical user interface for measuring and performing interactive analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.
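The core similarity metric behind the scoreboard and histograms is the fractional Hamming distance between binary signatures. The 8-bit signatures below are invented toys; real PUF signatures are much longer.

```python
import numpy as np

def hamming_distance(sig_a, sig_b):
    """Fractional Hamming distance between two equal-length binary signatures."""
    a = np.asarray(sig_a, dtype=np.uint8)
    b = np.asarray(sig_b, dtype=np.uint8)
    return np.count_nonzero(a != b) / a.size

# Intra-chip ("noise") distance: repeated reads of one PUF should be near 0;
# inter-chip distance: signatures of different PUFs should be near 0.5.
reference  = np.array([0, 1, 1, 0, 1, 0, 0, 1])
noisy_read = np.array([0, 1, 1, 0, 0, 0, 0, 1])   # one flipped bit
other_chip = np.array([1, 0, 1, 1, 0, 0, 1, 1])
```

Plotting these two distance populations over many reads and many chips gives exactly the noise and inter-chip histograms the GUI draws; good PUFs keep the populations well separated.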
Visualization of anthropometric measures of workers in computer 3D modeling of work place.
Mijović, B; Ujević, D; Baksa, S
2001-12-01
In this work, 3D visualization of a work place was performed by means of a computer-made 3D machine model and computer animation of a worker. By visualizing 3D characters in inverse kinematic and dynamic relation with the operating part of a machine, the biomechanical characteristics of the worker's body were determined. The dimensions of the machine were determined by inspection of technical documentation as well as by direct measurements and camera recordings of the machine. On the basis of the measured body height of workers, all relevant anthropometric measures were determined by a computer program developed by the authors. From the anthropometric measures, the vision fields, and the reach zones, the exact postures of workers performing the technological procedures were determined when designing the work places. The minimal and maximal rotation angles and the translations of the upper and lower arm, which are the basis for the analysis of worker loading, were analyzed. The dimensions of the space occupied by the body are obtained by computer anthropometric analysis of movement, e.g., range of arms and position of legs, head, and back. The influence of work place design on correct worker postures during work was examined, so that energy consumption and fatigue can be reduced to a minimum.
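Deriving "all relevant anthropometric measures" from measured body height is typically done with fixed stature proportions. The ratios below are the commonly cited Drillis-Contini values, used as stand-ins because the authors' own proportion table is not given in this abstract.

```python
# Classic stature proportions (Drillis-Contini), each a fraction of body
# height H. These are stand-in values, not the authors' own table.
STATURE_RATIOS = {
    "eye_height":      0.936,
    "shoulder_height": 0.818,
    "hip_height":      0.530,
    "upper_arm":       0.186,
    "forearm":         0.146,
    "hand":            0.108,
}

def anthropometry_from_stature(height_m):
    """Estimate segment dimensions (metres) from measured body height."""
    return {name: ratio * height_m for name, ratio in STATURE_RATIOS.items()}
```

These estimated segment lengths are what a work-place model needs to position the animated worker and to check reach zones and vision fields against the machine geometry.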
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 5 2012-07-01 2012-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 5 2014-07-01 2014-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 5 2013-07-01 2013-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 5 2010-07-01 2010-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 5 2011-07-01 2011-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
NASA Astrophysics Data System (ADS)
Stoilescu, Dorian; Egodawatte, Gunawardena
2010-12-01
Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are more interested in the use of computers than in programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they view computer science as a career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In computer programming, female students were not as involved in computing activities, whereas male students were heavily involved. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoying playing with computers. The myth of the geek as the typical profile of a successful computer science student was not found to be true.
Computer program for the computation of total sediment discharge by the modified Einstein procedure
Stevens, H.H.
1985-01-01
Two versions of a computer program to compute total sediment discharge by the modified Einstein procedure are presented. The FORTRAN 77 language version is for use on the PRIME computer, and the BASIC language version is for use on most microcomputers. The program contains built-in limitations and input-output options that closely follow the original modified Einstein procedure. Program documentation and listings of both versions of the program are included. (USGS)
Bibliography: Citations Obtained through the National Library of Medicine's MEDLARS Program.
ERIC Educational Resources Information Center
Journal of Medical Education, 1979
1979-01-01
Approximately 370 references are cited in the following areas of medical education: accreditation, computers, continuing education, curriculum, educational measurement, research and evaluation, faculty, foreign graduates, forensic medicine, history, minority groups, foreign education, specialties, teaching methods, etc. (LBH)
Bibliography. Citations Obtained Through the National Library of Medicine's MEDLARS Program
ERIC Educational Resources Information Center
Journal of Medical Education, 1978
1978-01-01
Approximately 200 MEDLARS references are cited dealing with: accreditation and licensure; computers; continuing education; curriculum; educational measurement, and research and development; forensic medicine; graduate education; history; internship and residency; foreign medical education; minority groups; schools; specialism; students; teaching…
Programming the Navier-Stokes computer: An abstract machine model and a visual editor
NASA Technical Reports Server (NTRS)
Middleton, David; Crockett, Tom; Tomboulian, Sherry
1988-01-01
The Navier-Stokes computer is a parallel computer designed to solve Computational Fluid Dynamics problems. Each processor contains several floating point units which can be configured under program control to implement a vector pipeline with several inputs and outputs. Since the development of an effective compiler for this computer appears to be very difficult, machine level programming seems necessary and support tools for this process have been studied. These support tools are organized into a graphical program editor. A programming process is described by which appropriate computations may be efficiently implemented on the Navier-Stokes computer. The graphical editor would support this programming process, verifying various programmer choices for correctness and deducing values such as pipeline delays and network configurations. Step by step details are provided and demonstrated with two example programs.
PROGRAM HTVOL: The Determination of Tree Crown Volume by Layers
Joseph C. Mawson; Jack Ward Thomas; Richard M. DeGraaf
1976-01-01
A FORTRAN IV computer program calculates, from a few field measurements, the volume of the crowns of trees or large shrubs in layers of a specified thickness. Each tree is assigned one of 15 solid forms, formed by using one of five side shapes (a circle, an ellipse, a neiloid, a triangle, or a parabolalike shape), and one of three bottom shapes (a...
2014-01-01
Background The rapid increase and high prevalence of childhood obesity is a serious problem for public health. Community-based interventions have been developed to combat the childhood obesity epidemic. However, little is known about the efficacy of these programs. Therefore, there is an urgent need to determine the effect of community-based intervention on changes in lifestyle and surrogate measures of adiposity. Methods/design Parallel intervention study including 2249 children aged 8 to 10 years (4th and 5th grade of elementary school) from 4 Spanish towns. The THAO-Child Health Program, a community-based intervention, was implemented in 2 towns. Body weight, height, and waist circumferences were measured. Children recorded their dietary intake on a computer-based 24-h recall. All children also completed validated computer-based questionnaires to estimate physical activity, diet quality, eating behaviors, and quality of life and sleep. Additionally, parental diet quality and physical activity were assessed by validated questionnaires. Discussion This study will provide insight into the efficacy of the THAO-Child Health Program to promote a healthy lifestyle. Additionally, it will evaluate whether lifestyle changes are accompanied by favorable weight management. Trial registration Trial Registration Number ISRCTN68403446 PMID:25174356
Computer Electronics. Florida Vocational Program Guide.
ERIC Educational Resources Information Center
University of South Florida, Tampa. Dept. of Adult and Vocational Education.
This packet contains a program guide and Career Merit Achievement Plan (Career MAP) for the implementation of a computer electronics technology (computer service technician) program in Florida secondary and postsecondary schools. The program guide describes the program content and structure, provides a program description, lists job titles under…
NASA Technical Reports Server (NTRS)
Forssen, B.; Wang, Y. S.; Crocker, M. J.
1981-01-01
Several aspects were studied. The SEA theory was used to develop a theoretical model to predict the transmission loss through an aircraft window. This work mainly consisted of the writing of two computer programs. One program predicts the sound transmission through a plexiglass window (the case of a single partition). The other program applies to the case of a plexiglass window with a window shade added (the case of a double partition with an air gap). The sound transmission through a structure was measured in experimental studies using several different methods in order that the accuracy and complexity of all the methods could be compared. Also, the measurements were conducted on the simple model of a fuselage (a cylindrical shell), on a real aircraft fuselage, and on stiffened panels.
Use of computer programs STLK1 and STWT1 for analysis of stream-aquifer hydraulic interaction
Desimone, Leslie A.; Barlow, Paul M.
1999-01-01
Quantifying the hydraulic interaction of aquifers and streams is important in the analysis of stream base flow, flood-wave effects, and contaminant transport between surface- and ground-water systems. This report describes the use of two computer programs, STLK1 and STWT1, to analyze the hydraulic interaction of streams with confined, leaky, and water-table aquifers during periods of stream-stage fluctuations and uniform, areal recharge. The computer programs are based on analytical solutions to the ground-water-flow equation in stream-aquifer settings and calculate ground-water levels, seepage rates across the stream-aquifer boundary, and bank storage that result from arbitrarily varying stream stage or recharge. Analysis of idealized, hypothetical stream-aquifer systems is used to show how aquifer type, aquifer boundaries, and aquifer and streambank hydraulic properties affect aquifer response to stresses. Published data from alluvial and stratified-drift aquifers in Kentucky, Massachusetts, and Iowa are used to demonstrate application of the programs to field settings. Analytical models of these three stream-aquifer systems are developed on the basis of available hydrogeologic information. Stream-stage fluctuations and recharge are applied to the systems as hydraulic stresses. The models are calibrated by matching ground-water levels calculated with computer program STLK1 or STWT1 to measured ground-water levels. The analytical models are used to estimate hydraulic properties of the aquifer, aquitard, and streambank; to evaluate hydrologic conditions in the aquifer; and to estimate seepage rates and bank-storage volumes resulting from flood waves and recharge. Analysis of field examples demonstrates the accuracy and limitations of the analytical solutions and programs when applied to actual ground-water systems and the potential uses of the analytical methods as alternatives to numerical modeling for quantifying stream-aquifer interactions.
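The phrase "arbitrarily varying stream stage" points at the superposition idea behind such programs: the head response is built by convolving stage increments with a unit step-response function derived from the analytical solution. A minimal sketch of that convolution; the exponential kernel below is illustrative only, not the actual STLK1/STWT1 step response:

```python
# Superposition sketch: head response to an arbitrary stream-stage record
# is the sum of unit step responses to every past stage change. The kernel
# is an assumed illustrative exponential, not the programs' analytical one.
import math

def head_response(stage, unit_step_response):
    """stage: stream stage at equal time steps; returns head at each step."""
    n = len(stage)
    dstage = [stage[0]] + [stage[i] - stage[i - 1] for i in range(1, n)]
    head = []
    for t in range(n):
        # superpose the step response of every past stage change
        head.append(sum(dstage[k] * unit_step_response(t - k)
                        for k in range(t + 1)))
    return head

# Illustrative kernel: fraction of a stage change felt after t steps
usr = lambda t: 1.0 - math.exp(-0.5 * t)

# Unit rise in stage at step 1; head climbs toward the new stage
h = head_response([0.0, 1.0, 1.0, 1.0, 1.0], usr)
```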
Electrostatic Discharge Issues in International Space Station Program EVAs
NASA Technical Reports Server (NTRS)
Bacon, John B.
2009-01-01
EVA activity in the ISS program encounters several dangerous ESD conditions. For many years the ISS program has worked aggressively to find ways to mitigate or eliminate the associated risks. Investments have included: (1) major mods to EVA tools, suit connectors, and analytical tools; (2) the Floating Potential Measurement Unit; (3) Plasma Contactor Units; (4) certification of new ISS flight attitudes; (5) teraflops of computation; (6) thousands of hours of work by scores of specialists; (7) monthly management attention at the highest program levels. The risks are now mitigated to a level that is orders of magnitude safer than in prior operations.
NASA Technical Reports Server (NTRS)
Fischer, James R.; Grosch, Chester; Mcanulty, Michael; Odonnell, John; Storey, Owen
1987-01-01
NASA's Office of Space Science and Applications (OSSA) gave a select group of scientists the opportunity to test and implement their computational algorithms on the Massively Parallel Processor (MPP) located at Goddard Space Flight Center, beginning in late 1985. One year later, the Working Group presented its report, which addressed the following: algorithms, programming languages, architecture, programming environments, the relationship to theory, and measured performance. The findings point to a number of demonstrated computational techniques for which the MPP architecture is ideally suited. For example, besides executing much faster on the MPP than on conventional computers, systolic VLSI simulation (where distances are short), lattice simulation, neural network simulation, and image problems were found to be easier to program on the MPP's architecture than on a CYBER 205 or even a VAX. The report also makes technical recommendations covering all aspects of MPP use, and recommendations concerning the future of the MPP and machines based on similar architectures, expansion of the Working Group, and study of the role of future parallel processors for space station, EOS, and the Great Observatories era.
Frequency modulation television analysis: Threshold impulse analysis. [with computer program
NASA Technical Reports Server (NTRS)
Hodge, W. H.
1973-01-01
A computer program is developed to calculate the FM threshold impulse rates as a function of the carrier-to-noise ratio for a specified FM system. The system parameters and a vector of 1024 integers, representing the probability density of the modulating voltage, are required as input parameters. The computer program is utilized to calculate threshold impulse rates for twenty-four sets of measured probability data supplied by NASA and for sinusoidal and Gaussian modulating waveforms. As a result of the analysis several conclusions are drawn: (1) The use of preemphasis in an FM television system improves the threshold by reducing the impulse rate. (2) Sinusoidal modulation produces a total impulse rate which is a practical upper bound for the impulse rates of TV signals providing the same peak deviations. (3) As the moment of the FM spectrum about the center frequency of the predetection filter increases, the impulse rate tends to increase. (4) A spectrum having an expected frequency above (below) the center frequency of the predetection filter produces a higher negative (positive) than positive (negative) impulse rate.
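The calculation the abstract describes — an impulse (click) rate as a function of carrier-to-noise ratio, averaged over a tabulated probability density of the modulating voltage — can be sketched with the classical Rice click-rate approximation. The formula and parameter values below are textbook assumptions, not the program's actual threshold model:

```python
# Sketch only: expected FM click (impulse) rate averaged over a tabulated
# pdf of the modulating voltage. The per-voltage rate uses Rice's classical
# unmodulated-carrier approximation N = (B_if / (2*sqrt(3))) * erfc(sqrt(rho));
# a modulation-dependent model would make the rate vary per bin.
import math

def expected_impulse_rate(pdf_counts, b_if_hz, cnr_linear):
    """pdf_counts: vector of integers (e.g. 1024 bins) proportional to the
    probability density of the modulating voltage. Returns clicks/second."""
    total = sum(pdf_counts)
    rice_rate = (b_if_hz / (2.0 * math.sqrt(3.0))) * math.erfc(math.sqrt(cnr_linear))
    # Weighted average over the voltage pdf (flat dependence here, so this
    # reduces to the Rice rate itself).
    return sum((c / total) * rice_rate for c in pdf_counts)

rate = expected_impulse_rate([1] * 1024, b_if_hz=30e6, cnr_linear=10.0)
```

Consistent with the abstract's conclusions, the rate falls sharply as carrier-to-noise ratio rises above threshold.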
Packaging printed circuit boards: A production application of interactive graphics
NASA Technical Reports Server (NTRS)
Perrill, W. A.
1975-01-01
The structure and use of an Interactive Graphics Packaging Program (IGPP), conceived to apply computer graphics to the design of packaging electronic circuits onto printed circuit boards (PCB), were described. The intent was to combine the data storage and manipulative power of the computer with the imaginative, intuitive power of a human designer. The hardware includes a CDC 6400 computer and two CDC 777 terminals with CRT screens, light pens, and keyboards. The program is written in FORTRAN 4 extended with the exception of a few functions coded in COMPASS (assembly language). The IGPP performs four major functions for the designer: (1) data input and display, (2) component placement (automatic or manual), (3) conductor path routing (automatic or manual), and (4) data output. The most complex PCB packaged to date measured 16.5 cm by 19 cm and contained 380 components, two layers of ground planes and four layers of conductors mixed with ground planes.
Turbine Blade and Endwall Heat Transfer Measured in NASA Glenn's Transonic Turbine Blade Cascade
NASA Technical Reports Server (NTRS)
Giel, Paul W.
2000-01-01
Higher operating temperatures increase the efficiency of aircraft gas turbine engines, but can also degrade internal components. High-pressure turbine blades just downstream of the combustor are particularly susceptible to overheating. Computational fluid dynamics (CFD) computer programs can predict the flow around the blades so that potential hot spots can be identified and appropriate cooling schemes can be designed. Various blade and cooling schemes can be examined computationally before any hardware is built, thus saving time and effort. Often though, the accuracy of these programs has been found to be inadequate for predicting heat transfer. Code and model developers need highly detailed aerodynamic and heat transfer data to validate and improve their analyses. The Transonic Turbine Blade Cascade was built at the NASA Glenn Research Center at Lewis Field to help satisfy the need for this type of data.
Another Program For Generating Interactive Graphics
NASA Technical Reports Server (NTRS)
Costenbader, Jay; Moleski, Walt; Szczur, Martha; Howell, David; Engelberg, Norm; Li, Tin P.; Misra, Dharitri; Miller, Philip; Neve, Leif; Wolf, Karl;
1991-01-01
VAX/Ultrix version of Transportable Applications Environment Plus (TAE+) computer program provides integrated, portable software environment for developing and running interactive window, text, and graphical-object-based application software systems. Enables programmer or nonprogrammer to construct easily custom software interface between user and application program and to move resulting interface program and its application program to different computers. When used throughout company for wide range of applications, makes both application program and computer seem transparent, with noticeable improvements in learning curve. Available in form suitable for following six different groups of computers: DEC VAX station and other VMS VAX computers, Macintosh II computers running AUX, Apollo Domain Series 3000, DEC VAX and reduced-instruction-set-computer workstations running Ultrix, Sun 3- and 4-series workstations running Sun OS and IBM RT/PC's and PS/2 computers running AIX, and HP 9000 S
Adolescents' Chunking of Computer Programs.
ERIC Educational Resources Information Center
Magliaro, Susan; Burton, John K.
To investigate what children learn during computer programming instruction, students attending a summer computer camp were asked to recall either single lines or chunks of computer programs from either coherent or scrambled programs. The 16 subjects, ages 12 to 17, were divided into three instructional groups: (1) beginners, who were taught to…
Advanced Certification Program for Computer Graphic Specialists. Final Performance Report.
ERIC Educational Resources Information Center
Parkland Coll., Champaign, IL.
A pioneer program in computer graphics was implemented at Parkland College (Illinois) to meet the demand for specialized technicians to visualize data generated on high performance computers. In summer 1989, 23 students were accepted into the pilot program. Courses included C programming, calculus and analytic geometry, computer graphics, and…
32 CFR 505.13 - Computer Matching Agreement Program.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 3 2013-07-01 2013-07-01 false Computer Matching Agreement Program. 505.13... AUTHORITIES AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.13 Computer Matching Agreement Program. (a) General provisions. (1) Pursuant to the Privacy Act and this part, DA records may be subject to computer...
32 CFR 505.13 - Computer Matching Agreement Program.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 3 2012-07-01 2009-07-01 true Computer Matching Agreement Program. 505.13... AUTHORITIES AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.13 Computer Matching Agreement Program. (a) General provisions. (1) Pursuant to the Privacy Act and this part, DA records may be subject to computer...
32 CFR 505.13 - Computer Matching Agreement Program.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 3 2014-07-01 2014-07-01 false Computer Matching Agreement Program. 505.13... AUTHORITIES AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.13 Computer Matching Agreement Program. (a) General provisions. (1) Pursuant to the Privacy Act and this part, DA records may be subject to computer...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-06
...: Computer Matching Program AGENCY: Treasury Inspector General for Tax Administration, Treasury. ACTION... Internal Revenue Service (IRS) concerning the conduct of TIGTA's computer matching program. DATES... INFORMATION: TIGTA's computer matching program assists in the detection and deterrence of fraud, waste, and...
78 FR 50146 - Privacy Act of 1974: Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-16
... DEPARTMENT OF VETERANS AFFAIRS Privacy Act of 1974: Computer Matching Program AGENCY: Department of Veterans Affairs. ACTION: Notice of Computer Match Program. SUMMARY: Pursuant to 5 U.S.C. 552a... to conduct a computer matching program with the Internal Revenue Service (IRS). Data from the...
76 FR 47299 - Privacy Act of 1974: Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-04
... DEPARTMENT OF VETERANS AFFAIRS Privacy Act of 1974: Computer Matching Program AGENCY: Department of Veterans Affairs. ACTION: Notice of Computer Match Program. SUMMARY: Pursuant to 5 U.S.C. 552a... to conduct a computer matching program with the Internal Revenue Service (IRS). Data from the...
32 CFR 505.13 - Computer Matching Agreement Program.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 3 2011-07-01 2009-07-01 true Computer Matching Agreement Program. 505.13... AUTHORITIES AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.13 Computer Matching Agreement Program. (a) General provisions. (1) Pursuant to the Privacy Act and this part, DA records may be subject to computer...
ERIC Educational Resources Information Center
Garg, Deepti; Garg, Ajay K.
2007-01-01
This study applied the Theory of Reasoned Action and the Technology Acceptance Model to measure outcomes of general education courses (GECs) under the University of Botswana Computer and Information Skills (CIS) program. An exploratory model was validated for responses from 298 students. The results suggest that resources currently committed to…
Missile Systems Maintenance, AFSC 411XOB/C.
1988-04-01
technician's rating. A statistical measurement of their agreement, known as the interrater reliability, was assessed through components of variance of the senior technician's ratings.
1-to-1 Computing: A Measure of Success
ERIC Educational Resources Information Center
O'Hanlon, Charlene
2007-01-01
When Texas' Technology Immersion Project began in the spring of 2004, a grant from the US Department of Education allowed a parallel project to launch--eTxTIP--to evaluate and measure the success of the program, which equips middle school students in high-risk, high-need areas with laptops. Data is beginning to come in on several of the first…
Mason F. Patterson; P. Eric Wiseman; Matthew F. Winn; Sang-mook Lee; Philip A. Araman
2011-01-01
UrbanCrowns is a software program developed by the USDA Forest Service that computes crown attributes using a side-view digital photograph and a few basic field measurements. From an operational standpoint, it is not known how well the software performs under varying photographic conditions for trees of diverse size, which could impact measurement reproducibility and...
Estimating Thruster Impulses From IMU and Doppler Data
NASA Technical Reports Server (NTRS)
Lisano, Michael E.; Kruizinga, Gerhard L.
2009-01-01
A computer program implements a thrust impulse measurement (TIM) filter, which processes data on changes in velocity and attitude of a spacecraft to estimate the small impulsive forces and torques exerted by the thrusters of the spacecraft reaction control system (RCS). The velocity-change data are obtained from line-of-sight-velocity data from Doppler measurements made from the Earth. The attitude-change data are telemetered from an inertial measurement unit (IMU) aboard the spacecraft. The TIM filter estimates the three-axis thrust vector for each RCS thruster, thereby enabling reduction of cumulative navigation error attributable to inaccurate prediction of thrust vectors. The filter has been augmented with a simple mathematical model to compensate for large temperature fluctuations in the spacecraft thruster catalyst bed in order to estimate thrust more accurately at deadbanding cold-firing levels. Also, rigorous consider-covariance estimation is applied in the TIM to account for the expected uncertainty in the moment of inertia and the location of the center of gravity of the spacecraft. The TIM filter was built with, and depends upon, a sigma-point consider-filter algorithm implemented in a Python-language computer program.
Ogata, Y; Nishizawa, K
1995-10-01
An automated smear counting and data processing system for a life science laboratory was developed to facilitate routine surveys and eliminate human errors by using a notebook computer. This system was composed of a personal computer, a liquid scintillation counter, and a well-type NaI(Tl) scintillation counter. The radioactivity of smear samples was automatically measured by these counters. The personal computer received raw signals from the counters through an RS-232C interface. The software for the computer evaluated the surface density of each radioisotope and printed out that value along with other items as a report. The software was programmed in the Pascal language. This system was successfully applied to routine surveys for contamination in our facility.
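The "surface density" evaluation the software performed reduces, in the standard smear-survey treatment, to a net count rate corrected for counting efficiency, removal factor, and wiped area. A minimal sketch of that conversion; the efficiency, removal factor, and wipe area values are assumptions, not taken from the original Pascal code:

```python
# Minimal sketch of the standard smear-survey conversion (not the original
# Pascal software): surface contamination density from a wipe-sample count.
# Counting efficiency, removal factor, and wipe area are assumed values.

def surface_density_bq_cm2(gross_cps, background_cps, counting_efficiency,
                           removal_factor=0.1, wipe_area_cm2=100.0):
    """Bq/cm^2 on the surveyed surface."""
    net_cps = max(0.0, gross_cps - background_cps)
    activity_bq = net_cps / counting_efficiency   # activity on the smear paper
    return activity_bq / (removal_factor * wipe_area_cm2)

d = surface_density_bq_cm2(gross_cps=25.0, background_cps=5.0,
                           counting_efficiency=0.4)
```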
Measurement of Satellite Impact Test Fragments for Modeling Orbital Debris
NASA Technical Reports Server (NTRS)
Hill, Nicole M.
2009-01-01
There are over 13,000 pieces of catalogued objects 10 cm and larger in orbit around Earth [ODQN, January 2009, p. 12]. More than 6000 of these objects are fragments from explosions and collisions. As the earth-orbiting object count increases, debris-generating collisions in the future become a statistical inevitability. To aid in understanding this collision risk, the NASA Orbital Debris Program Office has developed computer models that calculate quantity and orbits of debris both currently in orbit and in future epochs. In order to create a reasonable computer model of the orbital debris environment, it is important to understand the mechanics of creation of debris as a result of a collision. The measurement of the physical characteristics of debris resulting from ground-based, hypervelocity impact testing aids in understanding the sizes and shapes of debris produced from potential impacts in orbit. To advance the accuracy of fragment shape/size determination, the NASA Orbital Debris Program Office recently implemented a computerized measurement system. The goal of this system is to improve knowledge and understanding of the relation between commonly used dimensions and overall shape. The technique developed involves scanning a single fragment with a hand-held laser device, measuring its size properties using a sophisticated software tool, and creating a three-dimensional computer model to demonstrate how the object might appear in orbit. This information is used to aid optical techniques in shape determination. This more automated and repeatable method provides higher accuracy in the size and shape determination of debris.
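Among the "commonly used dimensions" for debris fragments is the characteristic length used by the NASA Orbital Debris Program Office: the average of the longest dimension and the two largest extents orthogonal to it. A sketch of the computation; how the measurement software actually stores the scanned extents is an assumption:

```python
# Characteristic length as used in orbital-debris size characterization:
# the mean of three mutually orthogonal maximum extents of the fragment.

def characteristic_length(x_mm, y_mm, z_mm):
    """x: longest extent; y: longest extent perpendicular to x;
    z: extent perpendicular to both. Returns L_c in mm."""
    return (x_mm + y_mm + z_mm) / 3.0

lc = characteristic_length(120.0, 60.0, 30.0)   # mm
```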
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 2 2013-07-01 2013-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 2 2014-07-01 2014-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 2 2012-07-01 2012-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 2 2010-07-01 2010-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 2 2011-07-01 2011-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
NASA Technical Reports Server (NTRS)
Roskam, J.
1983-01-01
The measurement of the transmission loss characteristics of panels using the acoustic intensity technique is presented. The theoretical formulation, installation of hardware, modifications to the test facility, and development of computer programs and test procedures are described. A listing of all the programs is also provided. The initial test results indicate that the acoustic intensity technique is easily adapted to measure transmission loss characteristics of panels. Use of this method will give average transmission loss values. The fixtures developed to position the microphones along the grid points are very useful in plotting the intensity maps of vibrating panels.
Ascent guidance algorithm using lidar wind measurements
NASA Technical Reports Server (NTRS)
Cramer, Evin J.; Bradt, Jerre E.; Hardtla, John W.
1990-01-01
The formulation of a general nonlinear programming guidance algorithm that incorporates wind measurements in the computation of ascent guidance steering commands is discussed. A nonlinear programming (NLP) algorithm that is designed to solve a very general problem has the potential to address the diversity demanded by future launch systems. Using B-splines for the command functional form allows the NLP algorithm to adjust the shape of the command profile to achieve optimal performance. The algorithm flexibility is demonstrated by simulation of ascent with dynamic loading constraints through a set of random wind profiles with and without wind sensing capability.
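The abstract's use of B-splines for the command functional form can be illustrated with de Boor's algorithm for evaluating a spline command profile. The knot vector and control values below are illustrative; the guidance algorithm's actual parameterization is not reproduced here:

```python
# Sketch of a B-spline command profile: de Boor's algorithm evaluates the
# spline at a parameter value. Knots/coefficients here are illustrative.

def deboor(t, degree, knots, coeffs):
    """Evaluate a B-spline with the given knots/coefficients at t."""
    # find knot span k with knots[k] <= t < knots[k+1]
    k = degree
    while k < len(knots) - degree - 2 and t >= knots[k + 1]:
        k += 1
    d = [coeffs[j + k - degree] for j in range(degree + 1)]
    for r in range(1, degree + 1):
        for j in range(degree, r - 1, -1):
            i = j + k - degree
            denom = knots[i + degree - r + 1] - knots[i]
            alpha = 0.0 if denom == 0.0 else (t - knots[i]) / denom
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[degree]

# Quadratic command profile on a clamped knot vector
knots = [0, 0, 0, 1, 2, 3, 3, 3]
coeffs = [0.0, 1.0, 2.0, 1.0, 0.0]
pitch = deboor(1.5, 2, knots, coeffs)
```

The NLP algorithm would then adjust `coeffs` to reshape the command profile, which is the flexibility the abstract credits to the B-spline form.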
WE-A-BRE-01: Debate: To Measure or Not to Measure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moran, J; Miften, M; Mihailidis, D
2014-06-15
Recent studies have highlighted some of the limitations of patient-specific pre-treatment IMRT QA measurements with respect to assessing plan deliverability. Pre-treatment QA measurements are frequently performed with detectors in phantoms that do not involve any patient heterogeneities or with an EPID without a phantom. Other techniques have been developed where measurement results are used to recalculate the patient-specific dose volume histograms. Measurements continue to play a fundamental role in understanding the initial and continued performance of treatment planning and delivery systems. Less attention has been focused on the role of computational techniques in a QA program, such as calculation with independent dose calculation algorithms or recalculation of the delivery with machine log files or EPID measurements. This session will explore the role of pre-treatment measurements compared to other methods such as computational and transit dosimetry techniques. Efficiency and practicality of the two approaches will also be presented and debated. The speakers will present a history of IMRT quality assurance and debate each other regarding which types of techniques are needed today and for future quality assurance. Examples will be shared of situations where overall quality needed to be assessed with calculation techniques in addition to measurements. Elements where measurements continue to be crucial, such as a thorough end-to-end test involving measurement, will be discussed. Operational details that can reduce the gamma tool effectiveness and accuracy for patient-specific pre-treatment IMRT/VMAT QA will be described. Finally, a vision for the future of IMRT and VMAT plan QA will be discussed from a safety perspective.
Learning Objectives: (1) Understand the advantages and limitations of measurement and calculation approaches for pre-treatment measurements for IMRT and VMAT planning; (2) Learn about the elements of a balanced quality assurance program involving modulated techniques; (3) Learn how to use tools and techniques such as an end-to-end test to enhance your IMRT and VMAT QA program.
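The "gamma tool" named in the session combines a dose-difference criterion with a distance-to-agreement criterion. A minimal 1D sketch for clarity (clinical tools work in 2D/3D; the 3%/3 mm criteria and global normalization below are conventional assumptions, not taken from the session):

```python
# 1D gamma-index sketch: a measured point passes if some reference point
# lies within the combined dose-difference / distance-to-agreement metric.
import math

def gamma_1d(ref, meas, spacing_mm, dose_crit=0.03, dist_crit_mm=3.0):
    """ref, meas: dose samples on the same grid. Returns per-point gamma."""
    norm = max(ref)                      # global normalization
    gammas = []
    for i, dm in enumerate(meas):
        best = float("inf")
        for j, dr in enumerate(ref):
            dd = (dm - dr) / (dose_crit * norm)       # dose-difference term
            dx = (i - j) * spacing_mm / dist_crit_mm  # distance term
            best = min(best, math.hypot(dd, dx))
        gammas.append(best)
    return gammas

g = gamma_1d([1.0, 2.0, 3.0], [1.0, 2.0, 3.0], spacing_mm=1.0)
pass_rate = sum(v <= 1.0 for v in g) / len(g)
```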
Schinke, Steven P.; Cole, Kristin C. A.; Fang, Lin
2009-01-01
Objective: This study evaluated a gender-specific, computer-mediated intervention program to prevent underage drinking among early adolescent girls. Method: Study participants were adolescent girls and their mothers from New York, New Jersey, and Connecticut. Participants completed pretests online and were randomly divided between intervention and control arms. Intervention-arm girls and their mothers interacted with a computer program aimed to enhance mother-daughter relationships and to teach girls skills for managing conflict, resisting media influences, refusing alcohol and drugs, and correcting peer norms about underage drinking, smoking, and drug use. After intervention, all participants (control and intervention) completed posttest and follow-up measurements. Results: Two months following program delivery and relative to control-arm participants, intervention-arm girls and mothers had improved their mother-daughter communication skills and their perceptions and applications of parental monitoring and rule-setting relative to girls' alcohol use. Also at follow-up, intervention-arm girls had improved their conflict management and alcohol use-refusal skills; reported healthier normative beliefs about underage drinking; demonstrated greater self-efficacy about their ability to avoid underage drinking; reported less alcohol consumption in the past 7 days, 30 days, and year; and expressed lower intentions to drink as adults. Conclusions: Study findings modestly support the viability of a mother-daughter, computer-mediated program to prevent underage drinking among adolescent girls. The data have implications for the further development of gender-specific approaches to combat increases in alcohol and other substance use among American girls. PMID:19118394
Szabo, Bence T; Aksoy, Seçil; Repassy, Gabor; Csomo, Krisztian; Dobo-Nagy, Csaba; Orhan, Kaan
2017-06-09
The aim of this study was to compare the paranasal sinus volumes obtained by manual and semiautomatic imaging software programs using both CT and CBCT imaging. 121 computed tomography (CT) and 119 cone beam computed tomography (CBCT) examinations were selected from the databases of the authors' institutes. The Digital Imaging and Communications in Medicine (DICOM) images were imported into 3-dimensional imaging software, in which hand mode and semiautomatic tracing methods were used to measure the volumes of both maxillary sinuses and the sphenoid sinus. The determined volumetric means were compared to previously published averages. Isometric CBCT-based volume determination results were closer to the real volume conditions, whereas the non-isometric CT-based volume measurements defined coherently lower volumes. By comparing the 2 volume measurement modes, the values gained from hand mode were closer to the literature data. Furthermore, CBCT-based image measurement results corresponded to the known averages. Our results suggest that CBCT images provide reliable volumetric information that can be depended on for artificial organ construction, and which may aid the guidance of the operator prior to or during the intervention.
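The isometric vs. non-isometric distinction matters because a segmented volume is just labeled-voxel count times voxel volume, so anisotropic CT voxels scale the estimate differently than isometric CBCT voxels. A sketch of that computation; the mask and voxel spacings below are illustrative, not from the study's software:

```python
# Sketch: segmented-region volume = (labeled voxels) x (voxel volume).
# Mask and voxel spacings are illustrative only.

def segmented_volume_mm3(mask, spacing_mm):
    """mask: nested lists of 0/1 labels; spacing_mm: (dx, dy, dz) per voxel."""
    dx, dy, dz = spacing_mm
    voxels = sum(v for plane in mask for row in plane for v in row)
    return voxels * dx * dy * dz

mask = [[[1, 1], [1, 0]], [[1, 0], [0, 0]]]          # 4 labeled voxels
iso = segmented_volume_mm3(mask, (0.3, 0.3, 0.3))    # isometric, CBCT-like
aniso = segmented_volume_mm3(mask, (0.5, 0.5, 1.0))  # anisotropic, CT-like
```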
Computer programs for computing particle-size statistics of fluvial sediments
Stevens, H.H.; Hubbell, D.W.
1986-01-01
Two versions of computer programs for inputting data and computing particle-size statistics of fluvial sediments are presented. The FORTRAN 77 language versions are for use on the Prime computer, and the BASIC language versions are for use on microcomputers. The size-statistics programs compute Inman, Trask, and Folk statistical parameters from phi values and sizes determined for 10 specified percent-finer values from inputted size and percent-finer data. The programs also determine the percentage gravel, sand, silt, and clay, and the Meyer-Peter effective diameter. Documentation and listings for both versions of the programs are included. (Author's abstract)
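The Folk statistics the programs compute are standard formulas on phi percentiles. A sketch of the Folk and Ward graphic mean and inclusive graphic standard deviation (sorting); the Inman and Trask parameters follow similar percentile recipes, not shown here:

```python
# Folk & Ward graphic statistics from phi percentiles (standard formulas):
# mean = (phi16 + phi50 + phi84) / 3
# sorting = (phi84 - phi16)/4 + (phi95 - phi5)/6.6

def folk_ward(phi):
    """phi: dict mapping percent-finer (5, 16, 50, 84, 95) -> phi value."""
    mean = (phi[16] + phi[50] + phi[84]) / 3.0
    sorting = (phi[84] - phi[16]) / 4.0 + (phi[95] - phi[5]) / 6.6
    return mean, sorting

mean, sorting = folk_ward({5: 0.5, 16: 1.0, 50: 2.0, 84: 3.0, 95: 3.5})
```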
NEMAR plotting computer program
NASA Technical Reports Server (NTRS)
Myler, T. R.
1981-01-01
A FORTRAN coded computer program which generates CalComp plots of trajectory parameters is examined. The trajectory parameters are calculated and placed on a data file by the Near Earth Mission Analysis Routine computer program. The plot program accesses the data file and generates the plots as defined by inputs to the plot program. Program theory, user instructions, output definitions, subroutine descriptions and detailed FORTRAN coding information are included. Although this plot program utilizes a random access data file, a data file of the same type and formatted in 102 numbers per record could be generated by any computer program and used by this plot program.
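The note that any data file "formatted in 102 numbers per record" could feed the plot program suggests fixed-length records addressed by seeking to a computed byte offset. A sketch of that access pattern; the 8-byte little-endian float encoding is an assumption for illustration, not the NEMAR file format:

```python
# Sketch of fixed-length random-access records: with 102 numbers per
# record, any record is read by seeking to record_index * record_bytes.
# The 8-byte double encoding is an assumption, not the NEMAR format.
import struct, io

NUMS_PER_RECORD = 102
RECORD_BYTES = NUMS_PER_RECORD * 8          # 8-byte doubles

def read_record(f, record_index):
    f.seek(record_index * RECORD_BYTES)
    return struct.unpack("<%dd" % NUMS_PER_RECORD, f.read(RECORD_BYTES))

# Build a two-record file in memory and read back record 1
buf = io.BytesIO()
for rec in range(2):
    buf.write(struct.pack("<%dd" % NUMS_PER_RECORD,
                          *[float(rec * 1000 + i) for i in range(NUMS_PER_RECORD)]))
rec1 = read_record(buf, 1)
```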
NASA Technical Reports Server (NTRS)
Smith, Tamara A.; Pavli, Albert J.; Kacynski, Kenneth J.
1987-01-01
The Joint Army, Navy, NASA, Air Force (JANNAF) rocket-engine performance-prediction procedure is based on the use of various reference computer programs. One of the reference programs for nozzle analysis is the Two-Dimensional Kinetics (TDK) Program. The purpose of this report is to calibrate the JANNAF procedure that has been incorporated into the December 1984 version of the TDK program for the high-area-ratio rocket-engine regime. The calibration was accomplished by modeling the performance of a 1030:1 rocket nozzle tested at NASA Lewis. A detailed description of the test conditions and TDK input parameters is given. The results indicate that the computer code predicts delivered vacuum specific impulse to within 0.12 to 1.9 percent of the experimental data. Vacuum thrust coefficient predictions were within ±1.3 percent of experimental results. Predictions of wall static pressure were within approximately ±5 percent of the measured values.
An innovative approach to compensator design
NASA Technical Reports Server (NTRS)
Mitchell, J. R.
1972-01-01
The primary goal is to present a computer-aided compensator design technique for control systems from a frequency-domain point of view. The approach is to describe the open-loop frequency response by n discrete frequency points, which yield n functions of the compensator coefficients. Several of these functions are chosen so that the system specifications are properly portrayed; mathematical programming is then used to improve all functions whose values fall below minimum standards. To this end, several definitions for measuring the performance of a system in the frequency domain are given. Next, theorems governing the number of compensator coefficients necessary to make improvements in a given number of functions are proved. A mathematical programming tool for aiding in the solution of the problem is then developed, and, for applying the constraint-improvement algorithm, generalized gradients of the constraints are derived. Finally, the necessary theory is incorporated in a computer program called CIP (compensator improvement program).
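The core idea of sampling the open-loop response at n frequencies to obtain n constraint functions of the compensator coefficients can be sketched as follows. The plant, compensator structure, and gain bound here are illustrative assumptions, not taken from the report:

```python
import numpy as np

def open_loop_gain(coeffs, w):
    """|C(jw)G(jw)| for a first-order compensator C(s) = (a1*s + a0)/(b1*s + 1)
    and an assumed plant G(s) = 1/(s*(s + 1)).  coeffs = (a0, a1, b1)."""
    a0, a1, b1 = coeffs
    s = 1j * w
    C = (a1 * s + a0) / (b1 * s + 1.0)
    G = 1.0 / (s * (s + 1.0))
    return np.abs(C * G)

# Each chosen frequency point contributes one constraint function of the
# compensator coefficients, e.g. "loop gain at least 10 below 0.1 rad/s".
freqs = np.array([0.01, 0.05, 0.1])

def constraints(coeffs):
    return open_loop_gain(coeffs, freqs) - 10.0   # >= 0 when satisfied
```

A mathematical-programming step would then adjust the coefficients to raise whichever of these functions are negative, which is the role the CIP algorithm plays in the report.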
Automated Performance Prediction of Message-Passing Parallel Programs
NASA Technical Reports Server (NTRS)
Block, Robert J.; Sarukkai, Sekhar; Mehra, Pankaj; Woodrow, Thomas S. (Technical Monitor)
1995-01-01
The increasing use of massively parallel supercomputers to solve large-scale scientific problems has generated a need for tools that can predict scalability trends of applications written for these machines. Much work has been done to create simple models that represent important characteristics of parallel programs, such as latency, network contention, and communication volume. But many of these methods still require substantial manual effort to represent an application in the model's format. The MK toolkit described in this paper is the result of an ongoing effort to automate the formation of analytic expressions of program execution time, with a minimum of programmer assistance. In this paper we demonstrate the feasibility of our approach by extending previous work to detect and model communication patterns automatically, with and without overlapped computations. The predictions derived from these models agree, within reasonable limits, with execution times of programs measured on the Intel iPSC/860 and Paragon. Further, we demonstrate the use of MK in selecting optimal computational grain size and studying various scalability metrics.
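The kind of analytic expression such tools derive typically combines a computation term that shrinks with processor count and a communication term that does not. A minimal sketch of one such model for a nearest-neighbor code; all constants and the model form are illustrative assumptions, not output of the toolkit:

```python
def predicted_time(n, p, alpha=50e-6, beta=1e-8, t_op=1e-7):
    """Hypothetical execution-time model T(n, p) for a 1-D
    nearest-neighbor exchange pattern.

    alpha -- per-message latency (s)
    beta  -- per-byte transfer cost (s/byte)
    t_op  -- time per element of local computation (s)
    """
    local = (n // p) * t_op                    # computation on n/p elements
    halo_bytes = 8 * 1024                      # exchange a 1024-double halo
    comm = 2 * (alpha + beta * halo_bytes)     # two neighbor messages
    return local + comm
```

Evaluating such an expression across p exposes the scalability trend directly: the communication term puts a floor under the runtime, which is where grain-size selection comes in.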
Rorrer, Audrey S
2016-04-01
This paper describes the approach and process undertaken to develop evaluation capacity among the leaders of a federally funded undergraduate research program. An evaluation toolkit was developed for Computer and Information Sciences and Engineering Research Experiences for Undergraduates (CISE REU) programs to address the ongoing need for evaluation capacity among principal investigators who manage program evaluation. The toolkit was the result of collaboration within the CISE REU community, with the purpose of providing targeted instructional resources and tools for quality program evaluation. Challenges included balancing the desire for standardized assessment with the responsibility to account for individual program contexts. Toolkit contents included instructional materials about evaluation practice, a standardized applicant management tool, and a modulated outcomes measure. Resulting benefits of toolkit deployment were cost-effective, sustainable evaluation tools, a community evaluation forum, and aggregate measurement of key program outcomes for the national program. Lessons learned included the imperative of understanding the evaluation context, engaging stakeholders, and building stakeholder trust. Results from project measures are presented along with a discussion of guidelines for facilitating evaluation capacity building that will serve a variety of contexts. Copyright © 2016. Published by Elsevier Ltd.
User's guide to the NOZL3D and NOZLIC computer programs
NASA Technical Reports Server (NTRS)
Thomas, P. D.
1980-01-01
Complete FORTRAN listings and running instructions are given for a set of computer programs that perform an implicit numerical solution to the unsteady Navier-Stokes equations to predict the flow characteristics and performance of nonaxisymmetric nozzles. The set includes the NOZL3D program, which performs the flow computations; the NOZLIC program, which sets up the flow field initial conditions for general nozzle configurations, and also generates the computational grid for simple two dimensional and axisymmetric configurations; and the RGRIDD program, which generates the computational grid for complicated three dimensional configurations. The programs are designed specifically for the NASA-Langley CYBER 175 computer, and employ auxiliary disk files for primary data storage. Input instructions and computed results are given for four test cases that include two dimensional, three dimensional, and axisymmetric configurations.
Computer program for pulsed thermocouples with corrections for radiation effects
NASA Technical Reports Server (NTRS)
Will, H. A.
1981-01-01
A pulsed thermocouple was used for measuring gas temperatures above the melting point of common thermocouples. This was done by allowing the thermocouple to heat until it approaches its melting point and then turning on the protective cooling gas. This method required a computer to extrapolate the thermocouple data to the higher gas temperatures. A method that includes the effect of radiation in the extrapolation is described. Computations of gas temperature are provided, along with the estimate of the final thermocouple wire temperature. Results from tests on high temperature combustor research rigs are presented.
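The extrapolation rests on an energy balance for the wire: its heating rate reflects convective gain from the gas minus radiative loss to the surroundings, so the gas temperature can be inferred from the measured rate of temperature rise. A minimal sketch of inverting that balance; the lumped wire model and every parameter value are illustrative assumptions, not taken from the report:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def gas_temperature(T, dTdt, h, eps, T_wall, rho_c_d):
    """Infer gas temperature from the wire heating rate, including a
    radiation-loss term.  Energy balance per unit surface area of a
    cylindrical wire of diameter d (rho_c_d = rho * c * d):

        (rho*c*d/4) dT/dt = h*(Tg - T) - eps*SIGMA*(T**4 - T_wall**4)

    Solving for the gas temperature Tg:
    """
    radiation = eps * SIGMA * (T**4 - T_wall**4)
    return T + ((rho_c_d / 4.0) * dTdt + radiation) / h
```

Omitting the radiation term would bias the inferred gas temperature low at high wire temperatures, which is the correction the abstract describes.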
OASIS connections: results from an evaluation study.
Czaja, Sara J; Lee, Chin Chin; Branham, Janice; Remis, Peggy
2012-10-01
The objectives of this study were to evaluate a community-based basic computer and Internet training program designed for older adults, provide recommendations for program refinement, and gather preliminary information on program sustainability. The program was developed by the OASIS Institute, a nonprofit agency serving older adults and implemented in 4 cities by community trainers across the United States. One hundred and ninety-six adults aged 40-90 years were assigned to the training or a wait-list control group. Knowledge of computers and the Internet, attitudes toward computers, and computer/Internet use were assessed at baseline, posttraining, and 3 months posttraining. The program was successful in increasing the computer/Internet skills of the trainees. The data indicated a significant increase in computer and Internet knowledge and comfort with computers among those who received the training. Further, those who completed the course reported an increase in both computer and Internet use 3 months posttraining. The findings indicate that a community-based computer and Internet training program delivered by community instructors can be effective in terms of increasing computer and Internet skills and comfort with computer technology among older adults.