Program risk analysis handbook
NASA Technical Reports Server (NTRS)
Batson, R. G.
1987-01-01
NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook to provide both analyst and manager with a guide for selecting the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given, followed by twelve distinct approaches to risk assessment. For each approach, the steps involved, strengths and weaknesses, time required, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. A bibliography (150 entries) and a program risk analysis checklist are provided.
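The encoding step the handbook describes can be illustrated with the simplest of the common distribution types: a triangular distribution fit to an expert's three-point estimate. The sketch below is illustrative only; the estimates and the choice of distribution are assumptions, not material from the handbook.

```python
# Hedged sketch: encode an expert's three-point estimate (optimistic,
# most likely, pessimistic) as a triangular distribution, one of the
# simple subjective-probability encodings a risk handbook would cover.

def triangular_moments(low, mode, high):
    """Mean and variance of a triangular distribution."""
    mean = (low + mode + high) / 3.0
    var = (low**2 + mode**2 + high**2
           - low*mode - low*high - mode*high) / 18.0
    return mean, var

def triangular_cdf(x, low, mode, high):
    """Probability that the uncertain quantity is <= x."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    if x <= mode:
        return (x - low)**2 / ((high - low) * (mode - low))
    return 1.0 - (high - x)**2 / ((high - low) * (high - mode))

# Illustrative cost estimate (in $M): optimistic 8, most likely 10,
# pessimistic 15.
mean, var = triangular_moments(8, 10, 15)
```

The mean (11 here) sits above the most likely value whenever the pessimistic tail is longer, which is one reason the handbook's encoding step matters: a manager who plans to the mode systematically underestimates expected cost.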
Directed Incremental Symbolic Execution
NASA Technical Reports Server (NTRS)
Person, Suzette; Yang, Guowei; Rungta, Neha; Khurshid, Sarfraz
2011-01-01
The last few years have seen a resurgence of interest in the use of symbolic execution -- a program analysis technique developed more than three decades ago to analyze program execution paths. Scaling symbolic execution and other path-sensitive analysis techniques to large systems remains challenging despite recent algorithmic and technological advances. An alternative to solving the problem of scalability is to reduce the scope of the analysis. One approach that is widely studied in the context of regression analysis is to analyze the differences between two related program versions. While such an approach is intuitive in theory, finding efficient and precise ways to identify program differences and characterize their effects on program execution has proved challenging in practice. In this paper, we present Directed Incremental Symbolic Execution (DiSE), a novel technique for detecting and characterizing the effects of program changes. The novelty of DiSE lies in combining the efficiency of static analysis, used to compute program difference information, with the precision of symbolic execution, used to explore program execution paths and generate path conditions affected by the differences. DiSE is a complementary technique to other reduction or bounding techniques developed to improve symbolic execution. Furthermore, DiSE does not require analysis results to be carried forward as the software evolves -- only the source code for two related program versions is required. A case study of our implementation of DiSE illustrates its effectiveness at detecting and characterizing the effects of program changes.
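The static half of the DiSE idea can be sketched in miniature: diff two straight-line versions, then propagate forward along data dependences to mark every statement whose behavior the change may affect. This is a toy reconstruction, not the authors' implementation; the program representation and the two versions are invented for illustration.

```python
# Toy sketch of change-impact computation over straight-line code:
# find textually changed statements, then taint forward along
# data dependences. Statements left unmarked are unimpacted.
import re

def parse(stmts):
    """Each statement is 'target = expr'; crudely extract used names."""
    prog = []
    for s in stmts:
        tgt, expr = [p.strip() for p in s.split("=", 1)]
        uses = set(re.findall(r"[a-zA-Z_]\w*", expr))
        prog.append((tgt, uses))
    return prog

def impacted(old, new):
    """Indices of statements in `new` possibly affected by the edit."""
    changed = {i for i, s in enumerate(new) if i >= len(old) or s != old[i]}
    tainted, affected = set(), set()
    for i, (tgt, uses) in enumerate(parse(new)):
        if i in changed or uses & tainted:
            affected.add(i)
            tainted.add(tgt)   # downstream readers of tgt are impacted
    return affected

v1 = ["a = x + 1", "b = a * 2", "c = y - 3"]
v2 = ["a = x + 2", "b = a * 2", "c = y - 3"]
```

Here `impacted(v1, v2)` marks the changed statement and its dependent (`b`), while `c` stays unimpacted; in DiSE proper, only the paths through the marked region would be explored symbolically.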
A Cost-Effectiveness/Benefit Analysis Model for Postsecondary Vocational Programs. Technical Report.
ERIC Educational Resources Information Center
Kim, Jin Eun
A cost-effectiveness/benefit analysis is defined as a technique for measuring the outputs of existing and new programs in relation to their specified program objectives, against the costs of those programs. In terms of its specific use, the technique is conceptualized as a systems analysis method, an evaluation method, and a planning tool for…
Regression Verification Using Impact Summaries
NASA Technical Reports Server (NTRS)
Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana
2013-01-01
Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version.
Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decompositions in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete; they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound. In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries.
The dependence analyses that facilitate the generation of the impact summaries, we believe, could be used in conjunction with other abstraction and decomposition based approaches [10, 12] as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: - A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure. - A proof that our approach is sound and complete with respect to the depth bound of symbolic execution. - An implementation of our technique using the LLVM compiler infrastructure, the KLEE symbolic virtual machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6]. - An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
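The final step of the pipeline above, checking impact summaries for equivalence, can be imitated in a toy setting. In the sketch below an exhaustive check over a small bounded input domain plays the role of the off-the-shelf decision procedure; the two program versions and the impact predicate are invented for illustration.

```python
# Toy stand-in for equivalence checking of impacted behaviors:
# only inputs that satisfy the impact condition are compared, which
# mirrors restricting the SMT query to the impact summaries.
from itertools import product

def v1(x, y):
    return x + x if x > 0 else y   # impacted region: x > 0

def v2(x, y):
    return 2 * x if x > 0 else y   # refactored, same behavior

def equivalent_on_impacted(f, g, domain, impacted_pred):
    """Bounded check that f == g on all impacted inputs."""
    for args in product(domain, repeat=2):
        if impacted_pred(*args) and f(*args) != g(*args):
            return False
    return True

ok = equivalent_on_impacted(v1, v2, range(-4, 5), lambda x, y: x > 0)
bad = equivalent_on_impacted(v1, lambda x, y: 2 * x + 1 if x > 0 else y,
                             range(-4, 5), lambda x, y: x > 0)
```

The unimpacted `else` branch is never compared, which is exactly the source of the savings the paper reports; soundness rests on the proof that the unimpacted behaviors agree by construction.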
NECAP 4.1: NASA's Energy-Cost Analysis Program input manual
NASA Technical Reports Server (NTRS)
Jensen, R. N.
1982-01-01
The computer program NECAP (NASA's Energy Cost Analysis Program) is described. The program is a versatile building design and energy analysis tool which has embodied within it state of the art techniques for performing thermal load calculations and energy use predictions. With the program, comparisons of building designs and operational alternatives for new or existing buildings can be made. The major feature of the program is the response factor technique for calculating the heat transfer through the building surfaces which accounts for the building's mass. The program expands the response factor technique into a space response factor to account for internal building temperature swings; this is extremely important in determining true building loads and energy consumption when internal temperatures are allowed to swing.
NASA Technical Reports Server (NTRS)
Zamora, M. A.
1977-01-01
Consumables analysis/crew training simulator interface requirements were defined. Two aspects were investigated: consumables analysis support techniques to crew training simulator for advanced spacecraft programs, and the applicability of the above techniques to the crew training simulator for the space shuttle program in particular.
Automatic differentiation evaluated as a tool for rotorcraft design and optimization
NASA Technical Reports Server (NTRS)
Walsh, Joanne L.; Young, Katherine C.
1995-01-01
This paper investigates the use of automatic differentiation (AD) as a means for generating sensitivity analyses in rotorcraft design and optimization. This technique transforms an existing computer program into a new program that performs sensitivity analysis in addition to the original analysis. The original FORTRAN program calculates a set of dependent (output) variables from a set of independent (input) variables; the new FORTRAN program calculates the partial derivatives of the dependent variables with respect to the independent variables. The AD technique is a systematic implementation of the chain rule of differentiation; this method produces derivatives to machine accuracy at a cost comparable with that of finite-differencing methods. For this study, an analysis code that consists of the Langley-developed hover analysis HOVT, the comprehensive rotor analysis CAMRAD/JA, and associated preprocessors is processed through the AD preprocessor ADIFOR 2.0. The resulting derivatives are compared with derivatives obtained from finite-differencing techniques. The derivatives obtained with ADIFOR 2.0 are exact to within machine accuracy and, unlike those obtained with finite-differencing techniques, do not depend on the selection of a step size.
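ADIFOR works by source transformation of FORTRAN, but the underlying mechanism, applying the chain rule operation by operation, is easy to sketch with forward-mode dual numbers. The function differentiated below is an invented example, not a rotorcraft analysis.

```python
# Minimal forward-mode AD sketch: each value carries its derivative,
# and every arithmetic operation applies the chain rule, so the final
# derivative is exact to machine precision with no step-size choice.
import math

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.dot * o.val + self.val * o.dot)  # product rule
    __rmul__ = __mul__

def sin(d):
    return Dual(math.sin(d.val), math.cos(d.val) * d.dot)

# d/dx [x*sin(x) + 3x] at x = 2, seeded with dx/dx = 1
x = Dual(2.0, 1.0)
y = x * sin(x) + 3 * x
```

`y.dot` equals sin(2) + 2·cos(2) + 3 exactly (up to rounding), with no dependence on a finite-difference step, which is the advantage the paper measures against finite differencing.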
Development Context Driven Change Awareness and Analysis Framework
NASA Technical Reports Server (NTRS)
Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha; Wang, Yurong; Elbaum, Sebastian
2014-01-01
Recent work on workspace monitoring allows conflict prediction early in the development process; however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. There are various client applications such as task prioritization, early conflict detection, and providing advice on testing that can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis can enable DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.
NASA Technical Reports Server (NTRS)
Kalben, P.
1972-01-01
The FORTRAN IV program developed to analyze the flow field associated with scramjet exhaust systems is presented. Instructions for preparing input and interpreting output are described. The program analyzes steady three-dimensional supersonic flow by the reference plane characteristic technique. The governing equations and numerical techniques employed are presented in Volume 1 of this report.
Developing a Systematic Patent Search Training Program
ERIC Educational Resources Information Center
Zhang, Li
2009-01-01
This study aims to develop a systematic patent training program using patent analysis and citation analysis techniques applied to patents held by the University of Saskatchewan. The results indicate that the target audience will be researchers in life sciences, and aggregated patent database searching and advanced search techniques should be…
A Survey of New Trends in Symbolic Execution for Software Testing and Analysis
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Visser, Willem
2009-01-01
Symbolic execution is a well-known program analysis technique that represents the values of program inputs with symbolic values instead of concrete (initialized) data and executes the program by manipulating program expressions involving the symbolic values. Symbolic execution was proposed more than three decades ago, but it has recently found renewed interest in the research community, due in part to progress in decision procedures, the availability of powerful computers, and new algorithmic developments. We provide a survey of some of the new research trends in symbolic execution, with particular emphasis on applications to test generation and program analysis. We first describe an approach that handles complex programming constructs such as input data structures, arrays, as well as multi-threading. We follow with a discussion of abstraction techniques that can be used to limit the (possibly infinite) number of symbolic configurations that need to be analyzed for the symbolic execution of looping programs. Furthermore, we describe recent hybrid techniques that combine concrete and symbolic execution to overcome some of the inherent limitations of symbolic execution, such as handling native code or availability of decision procedures for the application domain. Finally, we give a short survey of interesting new applications, such as predictive testing, invariant inference, program repair, analysis of parallel numerical programs and differential symbolic execution.
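The core idea the survey describes, executing over a symbolic input and forking at each branch to accumulate a path condition, can be imitated with a toy path explorer. This is an illustration only, not a real engine: branch conditions are carried as text, and feasibility checking (the decision-procedure step) is omitted.

```python
# Toy path exploration: re-run the program, forcing each combination
# of branch outcomes, and record the branch decisions taken as the
# path condition for that execution path.
from itertools import product

def program(branch):
    # The program under analysis, phrased against a `branch` callback
    # over a symbolic input x (an invented example).
    if branch("x > 0"):
        if branch("x > 10"):
            return "big"
        return "small"
    return "non-positive"

def explore(prog, max_branches=4):
    paths = []
    for outcomes in product([True, False], repeat=max_branches):
        taken = []
        def branch(cond):
            decision = outcomes[len(taken)]
            taken.append((cond, decision))
            return decision
        result = prog(branch)
        if (taken, result) not in paths:   # deduplicate shared prefixes
            paths.append((taken, result))
    return paths

paths = explore(program)
```

The three recorded paths correspond to the path conditions x > 0 ∧ x > 10, x > 0 ∧ ¬(x > 10), and ¬(x > 0); a real engine would hand each conjunction to a decision procedure to generate a concrete test input per path.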
Application of a sensitivity analysis technique to high-order digital flight control systems
NASA Technical Reports Server (NTRS)
Paduano, James D.; Downing, David R.
1987-01-01
A sensitivity analysis technique for multiloop flight control systems is studied. This technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system. It then uses the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed; then it is extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example: the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. Results show that linear models of real systems can be analyzed by this sensitivity technique, if it is applied with care. A computer program called SVA was written to implement the singular-value sensitivity analysis techniques; computational methods and considerations therefore form an integral part of many of the discussions. A user's guide to the program is included. SVA is a fully public-domain program, running on the NASA/Dryden Elxsi computer.
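The stability measure itself, the minimum singular value of the return difference matrix I + L, is easy to demonstrate numerically. The 2x2 loop transfer matrix below is a made-up example (not the X-29 model), and the gradient is taken by central finite differences rather than the paper's analytic singular-value gradient equations.

```python
# Sketch: minimum singular value of I + L(k) as a relative-stability
# measure, and its sensitivity to a gain parameter k.
import math

def singular_values_2x2(A):
    """Singular values of a 2x2 matrix via eigenvalues of A^T A."""
    (a, b), (c, d) = A
    p, q = a*a + c*c, b*b + d*d        # diagonal of A^T A
    r = a*b + c*d                      # off-diagonal of A^T A
    tr, det = p + q, p*q - r*r
    disc = math.sqrt(max(tr*tr/4 - det, 0.0))
    return sorted(math.sqrt(max(tr/2 + s*disc, 0.0)) for s in (+1, -1))

def return_difference(k):
    # I + L for an illustrative loop transfer matrix L(k)
    L = [[-0.5*k, 0.2], [0.1, -0.8*k]]
    return [[1 + L[0][0], L[0][1]], [L[1][0], 1 + L[1][1]]]

def min_sv(k):
    return singular_values_2x2(return_difference(k))[0]

# finite-difference sensitivity of the stability measure to the gain k
k, h = 1.0, 1e-6
grad = (min_sv(k + h) - min_sv(k - h)) / (2 * h)
```

Here the gradient is negative: increasing the gain degrades the stability margin of this toy loop, which is precisely the kind of diagnosis the singular-value gradients are meant to provide for high-order systems.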
A linear circuit analysis program with stiff systems capability
NASA Technical Reports Server (NTRS)
Cook, C. H.; Bavuso, S. J.
1973-01-01
Several existing network analysis programs have been modified and combined to employ a variable topological approach to circuit translation. Efficient numerical integration techniques are used for transient analysis.
Reliability techniques for computer executive programs
NASA Technical Reports Server (NTRS)
1972-01-01
Computer techniques for increasing the stability and reliability of executive and supervisory systems were studied. Program segmentation characteristics are discussed along with a validation system which is designed to retain the natural top down outlook in coding. An analysis of redundancy techniques and roll back procedures is included.
A Categorization of Dynamic Analyzers
NASA Technical Reports Server (NTRS)
Lujan, Michelle R.
1997-01-01
Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers, in contrast, are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) metrics; 2) models; and 3) monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed.
Familiarization with the tools that measure, model, and monitor programs provides a framework for understanding a program's dynamic behavior from different perspectives through analysis of the input/output data.
Linear Programming for Vocational Education Planning. Interim Report.
ERIC Educational Resources Information Center
Young, Robert C.; And Others
The purpose of the paper is to define, for potential users of vocational education management information systems, a quantitative analysis technique and its utilization to facilitate more effective planning of vocational education programs. Defining linear programming (LP) as a management technique used to solve complex resource allocation problems…
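A minimal LP of the kind the paper envisions can be solved without specialized software by enumerating the vertices of the feasible region. The planning scenario and all numbers below are invented for illustration: allocate sections of two vocational programs to maximize expected job placements under budget and staffing limits.

```python
# Brute-force 2-variable linear program: the optimum of an LP lies at a
# vertex of the feasible polygon, so intersect constraint pairs and keep
# the feasible intersection with the best objective value.
from itertools import combinations

# constraints a1*x1 + a2*x2 <= b
constraints = [
    (4.0, 6.0, 48.0),   # budget, in $10k units
    (1.0, 1.0, 10.0),   # instructors available
    (-1.0, 0.0, 0.0),   # x1 >= 0
    (0.0, -1.0, 0.0),   # x2 >= 0
]
objective = (30.0, 40.0)  # expected placements per section (illustrative)

def vertices(cons):
    for (a1, a2, b), (c1, c2, d) in combinations(cons, 2):
        det = a1*c2 - a2*c1
        if abs(det) < 1e-12:
            continue                      # parallel constraint lines
        x = (b*c2 - a2*d) / det           # Cramer's rule
        y = (a1*d - b*c1) / det
        if all(p*x + q*y <= r + 1e-9 for p, q, r in cons):
            yield x, y

best = max(vertices(constraints),
           key=lambda v: objective[0]*v[0] + objective[1]*v[1])
```

For this data the optimum runs both programs (6 and 4 sections) rather than spending everything on the higher-payoff one, the characteristic non-obvious trade-off that makes LP useful for program planning.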
The composite sequential clustering technique for analysis of multispectral scanner data
NASA Technical Reports Server (NTRS)
Su, M. Y.
1972-01-01
The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
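The second stage of the composite technique, the generalized K-means pass, can be sketched compactly. In this sketch the initial centers are hard-coded and the two-band data points are invented; in the paper the initial clusters come from the sequential variance analysis of stage one.

```python
# Generalized K-means sketch: assign each observation to the nearest
# center, recompute centers as group means, repeat until stable.
def kmeans(points, centers, iters=50):
    groups = [[] for _ in centers]
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)),
                    key=lambda i: sum((a - b)**2
                                      for a, b in zip(p, centers[i])))
            groups[i].append(p)
        new = [tuple(sum(col) / len(g) for col in zip(*g)) if g
               else centers[i] for i, g in enumerate(groups)]
        if new == centers:
            break                      # centers stopped moving
        centers = new
    return centers, groups

# two obvious spectral clusters in a toy 2-band data set
pts = [(1.0, 1.1), (0.9, 1.0), (1.1, 0.9),
       (5.0, 5.2), (5.1, 4.9), (4.9, 5.0)]
centers, groups = kmeans(pts, [(0.0, 0.0), (6.0, 6.0)])
```

With good initial clusters from stage one, this iteration converges in very few passes, which is the motivation for the composite design over running K-means from arbitrary starting centers.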
Comprehension-Driven Program Analysis (CPA) for Malware Detection in Android Phones
2015-07-01
Final technical report, Iowa State University, July 2015. Surviving fragments of the report documentation page describe, among the contributions, a machine analysis system to detect novel, sophisticated Android malware, and an innovative library summarization technique and its incorporation in…
Quadrant Analysis as a Strategic Planning Technique in Curriculum Development and Program Marketing.
ERIC Educational Resources Information Center
Lynch, James; And Others
1996-01-01
Quadrant analysis, a widely-used research technique, is suggested as useful in college or university strategic planning. The technique uses consumer preference data and produces information suitable for a wide variety of curriculum and marketing decisions. Basic quadrant analysis design is described, and advanced variations are discussed, with…
ERIC Educational Resources Information Center
Montoya, Isaac D.
2008-01-01
Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on proportion of correctly…
NASA Technical Reports Server (NTRS)
Coen, Peter G.
1991-01-01
A new computer technique for the analysis of transport aircraft sonic boom signature characteristics was developed. This new technique, based on linear theory methods, combines the previously separate equivalent area and F function development with a signature propagation method using a single geometry description. The new technique was implemented in a stand-alone computer program and was incorporated into an aircraft performance analysis program. Through these implementations, both configuration designers and performance analysts are given new capabilities to rapidly analyze an aircraft's sonic boom characteristics throughout the flight envelope.
WAATS: A computer program for Weights Analysis of Advanced Transportation Systems
NASA Technical Reports Server (NTRS)
Glatt, C. R.
1974-01-01
A historical weight estimating technique for advanced transportation systems is presented. The classical approach to weight estimation is discussed and sufficient data is presented to estimate weights for a large spectrum of flight vehicles including horizontal and vertical takeoff aircraft, boosters and reentry vehicles. A computer program, WAATS (Weights Analysis for Advanced Transportation Systems) embracing the techniques discussed has been written and user instructions are presented. The program was developed for use in the ODIN (Optimal Design Integration System) system.
Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.
ERIC Educational Resources Information Center
Carlson, David H.
1986-01-01
This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…
PCC Framework for Program-Generators
NASA Technical Reports Server (NTRS)
Kong, Soonho; Choi, Wontae; Yi, Kwangkeun
2009-01-01
In this paper, we propose a proof-carrying code framework for program-generators. The enabling technique is abstract parsing, a static string analysis technique, which is used as a component for generating and validating certificates. Our framework provides an efficient solution for certifying program-generators whose safety properties are expressed in terms of the grammar representing the generated program. The fixed-point solution of the analysis is generated and attached to the program-generator on the code producer side. The consumer receives the code with a fixed-point solution and validates that the received fixed point is indeed a fixed point of the received code. This validation can be done in a single pass.
NASA Technical Reports Server (NTRS)
Watson, H. K.
1971-01-01
A digital computer program determines tolerance values of an end-to-end signal chain or flow path, given a preselected probability value. The technique is useful in the synthesis and analysis phases of subsystem design processes.
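The statistical flavor of this kind of tolerance analysis is easy to illustrate: instead of summing per-stage tolerances worst-case, combine them root-sum-square, which corresponds to a chosen probability level when stage errors are independent. The stage values below are invented, and RSS is a standard stand-in here, not necessarily the exact method of the original program.

```python
# Worst-case vs. root-sum-square tolerance of an end-to-end signal
# chain. RSS assumes independent, roughly normal per-stage errors, so
# the combined tolerance is met with the probability implied by the
# per-stage sigma level rather than with certainty.
import math

def worst_case(tols):
    return sum(tols)

def rss(tols):
    return math.sqrt(sum(t * t for t in tols))

stage_tols_db = [0.5, 0.3, 0.4, 0.2]   # per-stage gain tolerance, dB
wc = worst_case(stage_tols_db)          # 1.4 dB, pessimistic
st = rss(stage_tols_db)                 # ~0.73 dB statistical tolerance
```

The statistical figure is roughly half the worst-case stack-up for this chain, which is why probability-based tolerancing can relax per-stage requirements without changing the end-to-end specification.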
Scheduling: A guide for program managers
NASA Technical Reports Server (NTRS)
1994-01-01
The following topics are discussed concerning scheduling: (1) milestone scheduling; (2) network scheduling; (3) program evaluation and review technique; (4) critical path method; (5) developing a network; (6) converting an ugly duckling to a swan; (7) a network scheduling problem; (8) network scheduling when resources are limited; (9) multi-program considerations; (10) influence on program performance; (11) line-of-balance technique; (12) time management; (13) recapitulation; and (14) analysis.
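The critical path method listed above reduces to a forward pass over the activity network followed by a backward walk along the binding predecessors. The activity network below is a made-up example, not one from the guide.

```python
# Minimal critical-path computation: earliest finish by forward pass,
# then walk backward along the predecessor that determines each start.
# activity: (duration, predecessors); dict order here is a valid
# topological order, which the forward pass assumes.
acts = {
    "design":  (4, []),
    "build":   (6, ["design"]),
    "test":    (3, ["build"]),
    "docs":    (2, ["design"]),
    "deliver": (1, ["test", "docs"]),
}

def critical_path(acts):
    ef = {}                                  # earliest finish times
    for a in acts:
        dur, preds = acts[a]
        ef[a] = dur + max((ef[p] for p in preds), default=0)
    path, cur = [], max(ef, key=ef.get)      # start from latest finisher
    while cur:
        path.append(cur)
        _, preds = acts[cur]
        cur = max(preds, key=ef.get) if preds else None
    return ef[path[0]], list(reversed(path))

length, path = critical_path(acts)
```

The "docs" activity has slack and never appears on the critical path; only delays along design-build-test-deliver move the program end date, which is the management insight CPM exists to surface.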
Application of Economic Analysis to School-Wide Positive Behavior Support (SWPBS) Programs
ERIC Educational Resources Information Center
Blonigen, Bruce A.; Harbaugh, William T.; Singell, Larry D.; Horner, Robert H.; Irvin, Larry K.; Smolkowski, Keith S.
2008-01-01
The authors discuss how to use economic techniques to evaluate educational programs and show how to apply basic cost analysis to implementation of school-wide positive behavior support (SWPBS). A description of cost analysis concepts used for economic program evaluation is provided, emphasizing the suitability of these concepts for evaluating…
Methodologies for Evaluating the Impact of Contraceptive Social Marketing Programs.
ERIC Educational Resources Information Center
Bertrand, Jane T.; And Others
1989-01-01
An overview of the evaluation issues associated with contraceptive social marketing programs is provided. Methodologies covered include survey techniques, cost-effectiveness analyses, retail audits of sales data, time series analysis, nested logit analysis, and discriminant analysis. (TJH)
A program to form a multidisciplinary data base and analysis for dynamic systems
NASA Technical Reports Server (NTRS)
Taylor, L. W.; Suit, W. T.; Mayo, M. H.
1984-01-01
Diverse sets of experimental data and analysis programs have been assembled for the purpose of facilitating research in systems identification, parameter estimation and state estimation techniques. The data base analysis programs are organized to make it easy to compare alternative approaches. Additional data and alternative forms of analysis will be included as they become available.
NASA Technical Reports Server (NTRS)
Manford, J. S.; Bennett, G. R.
1985-01-01
The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include: approach for rigorous analysis of operations functions, use of the resources of a large computer network, and providing for efficient research and access to information.
Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis
NASA Technical Reports Server (NTRS)
Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.
2017-01-01
This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
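The quantity PRECiSA bounds, the gap between a floating-point evaluation and the exact result, can be measured directly for one concrete input using exact rational arithmetic. This is not PRECiSA itself: a static tool bounds the error symbolically over whole input ranges, whereas the sketch below only measures it at a single point.

```python
# Measure the round-off error of one floating-point Horner evaluation
# exactly, by redoing the computation in exact rational arithmetic
# over the same (already rounded) inputs.
from fractions import Fraction

def horner_float(coeffs, x):
    acc = 0.0
    for c in coeffs:
        acc = acc * x + c          # each op may round
    return acc

def horner_exact(coeffs, x):
    acc = Fraction(0)
    for c in coeffs:
        acc = acc * Fraction(x) + Fraction(c)   # exact, no rounding
    return acc

coeffs, x = [1.0, -3.0, 0.1], 0.2
fl = horner_float(coeffs, x)
err = abs(Fraction(fl) - horner_exact(coeffs, x))
```

Here `err` is the exact accumulated rounding error of this one evaluation (a handful of ulps); a verified tool instead produces a certificate that the error stays below a bound for every input in a stated range.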
NASA Technical Reports Server (NTRS)
1971-01-01
Computational techniques were developed and assimilated for the design optimization. The resulting computer program was then used to perform initial optimization and sensitivity studies on a typical thermal protection system (TPS) to demonstrate its application to the space shuttle TPS design. The program was developed in Fortran IV for the CDC 6400 but was subsequently converted to the Fortran V language to be used on the Univac 1108. The program allows for improvement and update of the performance prediction techniques. The program logic involves subroutines which handle the following basic functions: (1) a driver which calls for input, output, and communication between program and user and between the subroutines themselves; (2) thermodynamic analysis; (3) thermal stress analysis; (4) acoustic fatigue analysis; and (5) weights/cost analysis. In addition, a system total cost is predicted based on system weight and historical cost data of similar systems. Two basic types of input are provided, both of which are based on trajectory data. These are vehicle attitude (altitude, velocity, and angles of attack and sideslip), for external heat and pressure loads calculation, and heating rates and pressure loads as a function of time.
Analysis of distortion data from TF30-P-3 mixed compression inlet test
NASA Technical Reports Server (NTRS)
King, R. W.; Schuerman, J. A.; Muller, R. G.
1976-01-01
A program was conducted to reduce and analyze inlet and engine data obtained during testing of a TF30-P-3 engine operating behind a mixed compression inlet. Previously developed distortion analysis techniques were applied to the data to assist in the development of a new distortion methodology. Instantaneous distortion techniques were refined as part of the distortion methodology development. A technique for estimating maximum levels of instantaneous distortion from steady state and average turbulence data was also developed as part of the program.
Three Techniques for Task Analysis: Examples from the Nuclear Utilities.
ERIC Educational Resources Information Center
Carlisle, Kenneth E.
1984-01-01
Discusses three task analysis techniques utilized at the Palo Verde Nuclear Generating Station to review training programs: analysis of (1) job positions, (2) procedures, and (3) instructional presentations. All of these include task breakdown, relationship determination, and task restructuring. (MBR)
NASA Astrophysics Data System (ADS)
Vasant, P.; Ganesan, T.; Elamvazuthi, I.
2012-11-01
Fairly reasonable results have been obtained in the past for non-linear engineering problems using optimization techniques such as neural networks, genetic algorithms, and fuzzy logic independently. Increasingly, hybrid techniques are being used to solve non-linear problems and obtain better output. This paper discusses the use of a neuro-genetic hybrid technique to optimize geological structure mapping, known as seismic survey. It involves the minimization of an objective function subject to geophysical and operational constraints. In this work, the optimization was initially performed using genetic programming, followed by hybrid neuro-genetic programming approaches. Comparative studies and analysis were then carried out on the optimized results. The results indicate that the hybrid neuro-genetic technique produced better results compared to the stand-alone genetic programming method.
NECAP: NASA's Energy-Cost Analysis Program. Part 1: User's manual
NASA Technical Reports Server (NTRS)
Henninger, R. H. (Editor)
1975-01-01
NECAP is a sophisticated building design and energy analysis tool which has embodied within it all of the latest ASHRAE state-of-the-art techniques for performing thermal load calculation and energy usage predictions. It is a set of six individual computer programs: a response factor program, a data verification program, a thermal load analysis program, a variable temperature program, a system and equipment simulation program, and an owning and operating cost program. Each segment of NECAP is described, and instructions are set forth for preparing the required input data and for interpreting the resulting reports.
NASA Technical Reports Server (NTRS)
Rummler, D. R.
1976-01-01
The results are presented of investigations to apply regression techniques to the development of methodology for creep-rupture data analysis. Regression analysis techniques are applied to the explicit description of the creep behavior of materials for space shuttle thermal protection systems. A regression analysis technique is compared with five parametric methods for analyzing three simulated and twenty real data sets, and a computer program for the evaluation of creep-rupture data is presented.
ERIC Educational Resources Information Center
Sinacore, James M.; And Others
1992-01-01
It is argued that there is a benefit to applying techniques of exploratory data analysis (EDA) to program evaluation. The evaluation of a rehabilitation program for people with rheumatoid arthritis (20 subjects and 21 comparisons) through EDA supports the argument, indicating outcomes more precisely than conventional analysis of variance. (SLD)
MULGRES: a computer program for stepwise multiple regression analysis
A. Jeff Martin
1971-01-01
MULGRES is a computer program source deck that is designed for multiple regression analysis employing the technique of stepwise deletion in the search for most significant variables. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
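The stepwise-deletion idea can be sketched generically. The following is an illustrative backward-elimination regression in Python, not the MULGRES source deck itself; the deletion threshold and the synthetic data are assumptions made for the example:

```python
import numpy as np

def stepwise_deletion(X, y, t_drop=2.0):
    """Backward elimination: repeatedly fit ordinary least squares and
    delete the predictor with the smallest |t|-statistic until every
    surviving variable exceeds t_drop. A generic sketch of the technique."""
    cols = list(range(X.shape[1]))
    while cols:
        Xc = X[:, cols]
        beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
        resid = y - Xc @ beta
        dof = len(y) - len(cols)
        s2 = resid @ resid / dof                      # residual variance
        cov = s2 * np.linalg.inv(Xc.T @ Xc)           # coefficient covariance
        t = np.abs(beta) / np.sqrt(np.diag(cov))      # t-statistics
        worst = int(np.argmin(t))
        if t[worst] >= t_drop:
            break
        cols.pop(worst)   # stepwise deletion of the least significant variable
    return cols

# Two informative predictors and one pure-noise column.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.5, size=200)
kept = stepwise_deletion(X, y)
```

In a run of this sketch the two informative columns survive the deletion loop because their t-statistics are far above the threshold.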
Statistical Symbolic Execution with Informed Sampling
NASA Technical Reports Server (NTRS)
Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco
2014-01-01
Symbolic execution techniques have been proposed recently for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but have scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
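The core sampling idea, stripped of the symbolic machinery, can be sketched as Monte Carlo estimation of the probability of reaching a target event. The toy program and the uniform input profile below are illustrative assumptions, not part of the Symbolic PathFinder implementation:

```python
import random

def target_event(x, y):
    """Toy program under analysis; the return value marks the branch
    of interest (e.g., an assert violation)."""
    if x > 0 and y > 0:
        return x * x + y * y < 0.25   # event of interest
    return False

def estimate_probability(trials=100000, seed=7):
    """Monte Carlo estimate of the probability of reaching the target
    event under uniform inputs on [-1, 1] x [-1, 1]. An input-level
    analogue of sampling symbolic program paths."""
    rng = random.Random(seed)
    hits = sum(
        target_event(rng.uniform(-1, 1), rng.uniform(-1, 1))
        for _ in range(trials)
    )
    return hits / trials

p = estimate_probability()
```

The true probability here is the area of a quarter-disc of radius 0.5 over the total input area, roughly 0.049, so the estimate should land near that value.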
ERIC Educational Resources Information Center
Calsyn, Robert J.; And Others
1977-01-01
After arguing that treatment programs for the elderly need to be evaluated with better research designs, the authors illustrate how interrupted time series analysis can be used to evaluate programs for the elderly when random assignment to experimental and control groups is not possible. (Author)
Computer program uses Monte Carlo techniques for statistical system performance analysis
NASA Technical Reports Server (NTRS)
Wohl, D. P.
1967-01-01
Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
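A minimal sketch of this style of Monte Carlo performance analysis follows; the component names, distributions, and tolerance model are illustrative assumptions, not taken from the original program:

```python
import random
import statistics

def system_error(samples=20000, seed=42):
    """Monte Carlo estimate of overall system performance from
    component-level statistics (illustrative tolerance model)."""
    rng = random.Random(seed)
    results = []
    for _ in range(samples):
        # Each component's disturbance drawn from its own distribution.
        alignment = rng.gauss(0.0, 0.5)     # normally distributed error
        bias = rng.uniform(-0.2, 0.2)       # uniform misalignment
        drift = rng.expovariate(10.0)       # one-sided drift, mean 0.1
        results.append(alignment + bias + drift)
    return statistics.mean(results), statistics.stdev(results)

mean, sd = system_error()
```

Because every sample draws from the full component statistics, the resulting mean and spread are unbiased estimates of the system-level error distribution, which is the point the abstract makes.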
An algol program for dissimilarity analysis: a divisive-omnithetic clustering technique
Tipper, J.C.
1979-01-01
Clustering techniques are properly used to generate hypotheses about patterns in data. Of the hierarchical techniques, those which are divisive and omnithetic possess many theoretically optimal properties. One such method, dissimilarity analysis, is implemented here in ALGOL 60 and determined to be computationally competitive with most other methods.
The NASTRAN theoretical manual
NASA Technical Reports Server (NTRS)
1981-01-01
Designed to accommodate additions and modifications, this commentary on NASTRAN describes the problem solving capabilities of the program in a narrative fashion and presents developments of the analytical and numerical procedures that underlie the program. Seventeen major sections and numerous subsections cover: the organizational aspects of the program, utility matrix routines, static structural analysis, heat transfer, dynamic structural analysis, computer graphics, special structural modeling techniques, error analysis, interaction between structures and fluids, and aeroelastic analysis.
Basic research for the geodynamics program
NASA Technical Reports Server (NTRS)
1984-01-01
Some objectives of this geodynamic program are: (1) optimal utilization of laser and VLBI observations as reference frames for geodynamics, (2) utilization of range difference observations in geodynamics, and (3) estimation techniques in crustal deformation analysis. The determination of Earth rotation parameters from different space geodetic systems is studied. Also reported on is the utilization of simultaneous laser range differences for the determination of baseline variation. An algorithm for the analysis of regional or local crustal deformation measurements is proposed along with other techniques and testing procedures. Some results of the reference frame comparisons, in terms of the pole coordinates obtained from different techniques, are presented.
Statistical Evaluation of Time Series Analysis Techniques
NASA Technical Reports Server (NTRS)
Benignus, V. A.
1973-01-01
The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.
Thermal radiation analysis system TRASYS 2: User's manual
NASA Technical Reports Server (NTRS)
Goble, R. G.; Jensen, C. L.
1980-01-01
The Thermal Radiation Analyzer System (TRASYS) program put thermal radiation analysis on the same basis as thermal analysis using program systems such as MITAS and SINDA. The user is provided the powerful options of writing his own executive, or driver, logic and choosing, among several available options, the most desirable solution technique(s) for the problem at hand. This User's Manual serves the twofold purpose of instructing the user in all applications and providing a convenient reference book that presents the features and capabilities in a concise, easy-to-find manner.
Wind Tunnel Test Technique and Instrumentation Development at LaRC
NASA Technical Reports Server (NTRS)
Putnam, Lawrence E.
1999-01-01
LaRC has an aggressive test technique development program underway. This program has been developed using 3rd Generation R&D management techniques and is a closely coordinated program between suppliers and wind tunnel operators; wind tunnel customers' informal input relative to their needs has been an essential ingredient in developing the research portfolio. An attempt has been made to balance this portfolio to meet both near-term and long-term test technique needs. Major efforts are underway to develop techniques for determining model wing twist and the location of boundary layer transition in the NTF (National Transonic Facility). The foundation of all new instrumentation developments, procurements, and upgrades will be based on uncertainty analysis.
Estimating the cost of major ongoing cost plus hardware development programs
NASA Technical Reports Server (NTRS)
Bush, J. C.
1990-01-01
Approaches are developed for forecasting the cost of major hardware development programs while these programs are in the design and development C/D phase. Three approaches are developed: a schedule assessment technique for bottom-line summary cost estimation, a detailed cost estimation approach, and an intermediate cost element analysis procedure. The schedule assessment technique was developed using historical cost/schedule performance data.
NASA Technical Reports Server (NTRS)
Brooner, W. G.; Nichols, D. A.
1972-01-01
Development of a scheme for utilizing remote sensing technology in an operational program for regional land use planning and land resource management program applications. The scheme utilizes remote sensing imagery as one of several potential inputs to derive desired and necessary data, and considers several alternative approaches to the expansion and/or reduction and analysis of data, using automated data handling techniques. Within this scheme is a five-stage program development which includes: (1) preliminary coordination, (2) interpretation and encoding, (3) creation of data base files, (4) data analysis and generation of desired products, and (5) applications.
ERIC Educational Resources Information Center
Boden, Andrea; Archwamety, Teara; McFarland, Max
This review used meta-analytic techniques to integrate findings from 30 independent studies that compared programmed instruction to conventional methods of instruction at the secondary level. The meta-analysis demonstrated that programmed instruction resulted in higher achievement when compared to conventional methods of instruction (average…
Simulated trajectories error analysis program, version 2. Volume 2: Programmer's manual
NASA Technical Reports Server (NTRS)
Vogt, E. D.; Adams, G. L.; Working, M. M.; Ferguson, J. B.; Bynum, M. R.
1971-01-01
A series of three computer programs for the mathematical analysis of navigation and guidance of lunar and interplanetary trajectories was developed. All three programs require the integration of n-body trajectories for both interplanetary and lunar missions. The virtual mass technique is used in all three programs. The user's manual contains the information necessary to operate the programs. The input and output quantities of the programs are described. Sample cases are given and discussed.
Optimization Techniques for Analysis of Biological and Social Networks
2012-03-28
analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: implement the proposed algorithms, test and fine... alternative mathematical programming formulations, their theoretical analysis, the development of exact algorithms, and heuristics. Originally, clusters... systematic fashion under a unifying theoretical and algorithmic framework. Optimization, Complex Networks, Social Network Analysis, Computational
A Change Impact Analysis to Characterize Evolving Program Behaviors
NASA Technical Reports Server (NTRS)
Rungta, Neha Shyam; Person, Suzette; Branchaud, Joshua
2012-01-01
Change impact analysis techniques estimate the potential effects of changes made to software. Directed Incremental Symbolic Execution (DiSE) is an intraprocedural technique for characterizing the impact of software changes on program behaviors. DiSE first estimates the impact of the changes on the source code using program slicing techniques, and then uses the impact sets to guide symbolic execution to generate path conditions that characterize impacted program behaviors. DiSE, however, cannot reason about the flow of impact between methods and will fail to generate path conditions for certain impacted program behaviors. In this work, we present iDiSE, an extension to DiSE that performs an interprocedural analysis. iDiSE combines static and dynamic calling context information to efficiently generate impacted program behaviors across calling contexts. Information about impacted program behaviors is useful for testing, verification, and debugging of evolving programs. We present a case study of our implementation of the iDiSE algorithm to demonstrate its efficiency at computing impacted program behaviors. Traditional notions of coverage are insufficient for characterizing the testing effort used to validate evolving program behaviors because they do not take into account the impact of changes to the code. In this work we present novel definitions of impacted coverage metrics that are useful for evaluating the testing effort required to test evolving programs. We then describe how the notions of impacted coverage can be used to configure techniques such as DiSE and iDiSE in order to support regression testing related tasks. We also discuss how DiSE and iDiSE can be configured for debugging, i.e., finding the root cause of errors introduced by changes made to the code. In our empirical evaluation we demonstrate that the configurations of DiSE and iDiSE can be used to support various software maintenance tasks.
Battery Test Manual For 48 Volt Mild Hybrid Electric Vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Lee Kenneth
2017-03-01
This manual details the U.S. Advanced Battery Consortium and U.S. Department of Energy Vehicle Technologies Program goals, test methods, and analysis techniques for a 48 Volt Mild Hybrid Electric Vehicle system. The test methods are outlined starting with characterization tests, followed by life tests. The final section details standardized analysis techniques for 48 V systems that allow for the comparison of different programs that use this manual. An example test plan is included, along with guidance for filling in gap table numbers.
Analysis of the Apollo spacecraft operational data management system. Executive summary
NASA Technical Reports Server (NTRS)
1971-01-01
A study was made of Apollo, Skylab, and several other data management systems to determine those techniques which could be applied to the management of operational data for future manned spacecraft programs. The results of the study are presented and include: (1) an analysis of present data management systems, (2) a list of requirements for future operational data management systems, (3) an evaluation of automated data management techniques, and (4) a plan for data management applicable to future space programs.
Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques
2018-04-30
Title: Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques. Subject: Monthly Progress Report. Period of... Resources: N/A. TOTAL: $18,687. TECHNICAL STATUS REPORT. Abstract: The program goal is analysis of sea ice dynamical behavior using Koopman Mode Decomposition (KMD) techniques. The work in the program's first month consisted of improvements to data processing code, inclusion of additional arctic sea ice
Runtime Speculative Software-Only Fault Tolerance
2012-06-01
reliability of RSFT, an in-depth analysis of its window of vulnerability is also discussed and measured via simulated fault injection. The performance... propagation of faults through the entire program. For optimal performance, these techniques have to use heroic alias analysis to find the minimum set of... affect program output. No program source code or alias analysis is needed to analyze the fault propagation ahead of time. 2.3 Limitations of Existing
Execution models for mapping programs onto distributed memory parallel computers
NASA Technical Reports Server (NTRS)
Sussman, Alan
1992-01-01
The problem of exploiting the parallelism available in a program to efficiently employ the resources of the target machine is addressed. The problem is discussed in the context of building a mapping compiler for a distributed memory parallel machine. The paper describes using execution models to drive the process of mapping a program in the most efficient way onto a particular machine. Through analysis of the execution models for several mapping techniques for one class of programs, we show that the selection of the best technique for a particular program instance can make a significant difference in performance. On the other hand, the results of benchmarks from an implementation of a mapping compiler show that our execution models are accurate enough to select the best mapping technique for a given program.
Strategies for Evaluating a Freshman Studies Program.
ERIC Educational Resources Information Center
Ketkar, Kusum; Bennett, Shelby D.
1989-01-01
The study developed an economic model for the evaluation of Seton Hall University's freshman studies program. Two techniques used to evaluate the economic success of the program are break-even analysis and the elasticity coefficient. (Author/MLW)
AIDS Education for Tanzanian Youth: A Mediation Analysis
ERIC Educational Resources Information Center
Stigler, Melissa H.; Kugler, K. C.; Komro, K. A.; Leshabari, M. T.; Klepp, K. I.
2006-01-01
Mediation analysis is a statistical technique that can be used to identify mechanisms by which intervention programs achieve their effects. This paper presents the results of a mediation analysis of Ngao, an acquired immunodeficiency syndrome (AIDS) education program that was implemented with school children in Grades 6 and 7 in Tanzania in the…
A Simplified Shuttle Payload Thermal Analyzer /SSPTA/ program
NASA Technical Reports Server (NTRS)
Bartoszek, J. T.; Huckins, B.; Coyle, M.
1979-01-01
A simple thermal analysis program for Space Shuttle payloads has been developed to accommodate the user who requires an easily understood but dependable analytical tool. The thermal analysis program includes several thermal subprograms traditionally employed in spacecraft thermal studies, a data management system for data generated by the subprograms, and a master program to coordinate the data files and thermal subprograms. The language and logic used to run the thermal analysis program are designed for the small user. In addition, analytical and storage techniques which conserve computer time and minimize core requirements are incorporated into the program.
NASA Astrophysics Data System (ADS)
Spicer, James B.; Dagdigian, Paul; Osiander, Robert; Miragliotta, Joseph A.; Zhang, Xi-Cheng; Kersting, Roland; Crosley, David R.; Hanson, Ronald K.; Jeffries, Jay
2003-09-01
The research center established by the Army Research Office under the Multidisciplinary University Research Initiative program pursues a multidisciplinary approach to investigate and advance the use of complementary analytical techniques for sensing of explosives and/or explosive-related compounds as they occur in the environment. The techniques being investigated include Terahertz (THz) imaging and spectroscopy, Laser-Induced Breakdown Spectroscopy (LIBS), Cavity Ring Down Spectroscopy (CRDS), and Resonance Enhanced Multiphoton Ionization (REMPI). This suite of techniques encompasses a diversity of sensing approaches that can be applied to detection of explosives in condensed phases, such as adsorbed species in soil, or can be used for vapor phase detection above the source. Some techniques allow for remote detection while others have highly specific and sensitive analysis capabilities. This program is addressing a range of fundamental technical issues associated with trace detection of explosive-related compounds using these techniques. For example, while both LIBS and THz can be used to carry out remote analysis of condensed phase analyte from a distance in excess of several meters, the sensitivities of these techniques to surface-adsorbed explosive-related compounds are not currently known. In current implementations, both CRDS and REMPI require sample collection techniques that have not been optimized for environmental applications. Early program elements will pursue the fundamental advances required for these techniques, including signature identification for explosive-related compounds/interferents and trace analyte extraction. Later program tasks will explore simultaneous application of two or more techniques to assess the benefits of sensor fusion.
Summary of 1971 water remote sensing investigations
NASA Technical Reports Server (NTRS)
Tilton, E. L., III
1972-01-01
The Earth Resources Laboratory sea remote sensing program has concentrated on project planning, data acquisition procedures, and data preparation techniques to establish a firm procedural basis for the program. Most of these procedural elements were established and proven during the three missions conducted in 1971. It is anticipated that the program in 1972 will see the analysis completed on the Mississippi Sound series and the first series of Eastern Gulf experiments allowing increased emphasis to be given to more intensive technique development studies, the interrelationship of parameters for the measurement and prediction of water circulation, and the demonstration of the application of these techniques.
Safety considerations in the design and operation of large wind turbines
NASA Technical Reports Server (NTRS)
Reilly, D. H.
1979-01-01
The engineering and safety techniques used to assure the reliable and safe operation of large wind turbine generators are described, using the Mod 2 Wind Turbine System Program as an example. The techniques involve a careful definition of the wind turbine's natural and operating environments, use of proven structural design criteria and analysis techniques, an evaluation of potential failure modes and hazards, and use of a fail-safe and redundant component engineering philosophy. The role of an effective quality assurance program, tailored to specific hardware criticality, and the checkout and validation program developed to assure system integrity are described.
CHEMICAL ANALYSIS OF WET SCRUBBERS UTILIZING ION CHROMATOGRAPHY
The report describes the key elements required to develop a sampling and analysis program for a wet scrubber using ion chromatography as the main analytical technique. The first part of the report describes a sampling program for two different types of wet scrubbers: the venturi/...
1988-09-01
analysis phase of the software life cycle (16:1-1). While editing a SADT diagram, the tool should be able to check whether or not structured analysis... diagrams are valid for the SADT's syntax, produce error messages, do error recovery, and perform editing suggestions. Thus, this tool must have the... directed editors are editors which use the syntax of the programming language while editing a program. While text editors treat programs as text, syntax
Approximation concepts for efficient structural synthesis
NASA Technical Reports Server (NTRS)
Schmit, L. A., Jr.; Miura, H.
1976-01-01
It is shown that efficient structural synthesis capabilities can be created by using approximation concepts to mesh finite element structural analysis methods with nonlinear mathematical programming techniques. The history of the application of mathematical programming techniques to structural design optimization problems is reviewed. Several rather general approximation concepts are described along with the technical foundations of the ACCESS 1 computer program, which implements several approximation concepts. A substantial collection of structural design problems involving truss and idealized wing structures is presented. It is concluded that since the basic ideas employed in creating the ACCESS 1 program are rather general, its successful development supports the contention that the introduction of approximation concepts will lead to the emergence of a new generation of practical and efficient, large scale, structural synthesis capabilities in which finite element analysis methods and mathematical programming algorithms will play a central role.
FLUT - A program for aeroelastic stability analysis. [of aircraft structures in subsonic flow
NASA Technical Reports Server (NTRS)
Johnson, E. H.
1977-01-01
A computer program (FLUT) that can be used to evaluate the aeroelastic stability of aircraft structures in subsonic flow is described. The algorithm synthesizes data from a structural vibration analysis with an unsteady aerodynamics analysis and then performs a complex eigenvalue analysis to assess the system stability. The theoretical basis of the program is discussed with special emphasis placed on some innovative techniques which improve the efficiency of the analysis. User information needed to efficiently and successfully utilize the program is provided. In addition to identifying the required input, the flow of the program execution and some possible sources of difficulty are included. The use of the program is demonstrated with a listing of the input and output for a simple example.
Forest Fire History... A Computer Method of Data Analysis
Romain M. Meese
1973-01-01
A series of computer programs is available to extract information from the individual Fire Reports (U.S. Forest Service Form 5100-29). The programs use a statistical technique to fit a continuous distribution to a set of sampled data. The goodness-of-fit program is applicable to data other than the fire history. Data summaries illustrate analysis of fire occurrence,...
Nationwide forestry applications program. Analysis of forest classification accuracy
NASA Technical Reports Server (NTRS)
Congalton, R. G.; Mead, R. A.; Oderwald, R. G.; Heinen, J. (Principal Investigator)
1981-01-01
The development of LANDSAT classification accuracy assessment techniques, and of a computerized system for assessing wildlife habitat from land cover maps are considered. A literature review on accuracy assessment techniques and an explanation for the techniques development under both projects are included along with listings of the computer programs. The presentations and discussions at the National Working Conference on LANDSAT Classification Accuracy are summarized. Two symposium papers which were published on the results of this project are appended.
NASA Technical Reports Server (NTRS)
Ratliff, A. W.; Smith, S. D.; Penny, N. M.
1972-01-01
A summary is presented of the various documents that discuss and describe the computer programs and analysis techniques which are available for rocket nozzle and exhaust plume calculations. The basic method of characteristics program is discussed, along with such auxiliary programs as the plume impingement program, the plot program and the thermochemical properties program.
ERIC Educational Resources Information Center
Hand, Michael L.
1990-01-01
Use of the bootstrap resampling technique (BRT) is assessed in its application to resampling analysis associated with measurement of payment allocation errors by federally funded Family Assistance Programs. The BRT is applied to a food stamp quality control database in Oregon. This analysis highlights the outlier-sensitivity of the…
Economic Analysis of Equal Educational Opportunity Programs.
ERIC Educational Resources Information Center
Mela, Ken
1997-01-01
Presents methods for assessing the impact and economic viability of federal equal-educational-opportunity programs, particularly in higher education. Techniques for gathering needed data and analyzing them are offered in the context of a hypothetical community college Veterans Upward Bound (VUB) program and two real VUB programs. (MSE)
NASA Technical Reports Server (NTRS)
1994-01-01
This manual presents a series of recommended techniques that can increase overall operational effectiveness of both flight and ground based NASA systems. It provides a set of tools that minimizes risk associated with: (1) restoring failed functions (both ground and flight based); (2) conducting complex and highly visible maintenance operations; and (3) sustaining a technical capability to support the NASA mission using aging equipment or facilities. It considers (1) program management - key elements of an effective maintainability effort; (2) design and development - techniques that have benefited previous programs; (3) analysis and test - quantitative and qualitative analysis processes and testing techniques; and (4) operations and operational design techniques that address NASA field experience. This document is a valuable resource for continuous improvement ideas in executing the systems development process in accordance with the NASA 'better, faster, smaller, and cheaper' goal without compromising safety.
SRB Environment Evaluation and Analysis. Volume 2: RSRB Joint Filling Test/Analysis Improvements
NASA Technical Reports Server (NTRS)
Knox, E. C.; Woods, G. Hamilton
1991-01-01
Following the Challenger accident a very comprehensive solid rocket booster (SRB) redesign program was initiated. One objective of the program was to develop expertise at NASA/MSFC in the techniques for analyzing the flow of hot gases in the SRB joints. Several test programs were undertaken to provide a data base of joint performance with manufactured defects in the joints to allow hot gases to fill the joints. This data base was used also to develop the analytical techniques. Some of the test programs were Joint Environment Simulator (JES), Nozzle Joint Environment Simulator (NJES), Transient Pressure Test Article (TPTA), and Seventy-Pound Charge (SPC). In 1988 the TPTA test hardware was moved from the Utah site to MSFC and several RSRM tests were scheduled, to be followed by tests for the ASRM program. REMTECH Inc. supported these activities with pretest estimates of the flow conditions in the test joints, and post-test analysis and evaluation of the measurements. During this support REMTECH identified deficiencies in the gas-measurement instrumentation that existed in the TPTA hardware, made recommendations for its replacement, and identified improvements to the analytical tools used in the test support. Only one test was completed under the TPTA RSRM test program, and those scheduled for the ASRM were rescheduled to a time after the expiration of this contract. The attention of this effort was directed toward improvements in the analytical techniques in preparation for when the ASRM program begins.
The Shock and Vibration Digest. Volume 16, Number 1
1984-01-01
...investigation of the measurement of frequency band average loss factors of structural components for use in the statistical energy analysis method of...stiffness. Matrix methods. Key Words: Finite element technique; Statistical energy analysis; Experimental techniques; Framed structures; Computer programs. In order to further understand the practical application of statistical energy analysis, a two-section plate-like frame structure is...
Cooperative analysis expert situation assessment research
NASA Technical Reports Server (NTRS)
Mccown, Michael G.
1987-01-01
For the past few decades, Rome Air Development Center (RADC) has been conducting research in Artificial Intelligence (AI). When the recent advances in hardware technology made many AI techniques practical, the Intelligence and Reconnaissance Directorate of RADC initiated an applications program entitled Knowledge Based Intelligence Systems (KBIS). The goal of the program is the development of a generic Intelligent Analyst System, an open machine with the framework for intelligence analysis, natural language processing, and man-machine interface techniques, needing only the specific problem domain knowledge to be operationally useful. The development of KBIS is described.
NASA/ASEE Summer Faculty Fellowship Program, 1990, Volume 1
NASA Technical Reports Server (NTRS)
Bannerot, Richard B. (Editor); Goldstein, Stanley H. (Editor)
1990-01-01
The 1990 Johnson Space Center (JSC) NASA/American Society for Engineering Education (ASEE) Summer Faculty Fellowship Program was conducted by the University of Houston-University Park and JSC. A compilation of the final reports on the research projects is presented. The topics covered include: the Space Station; the Space Shuttle; exobiology; cell biology; culture techniques; control systems design; laser induced fluorescence; spacecraft reliability analysis; reduced gravity; biotechnology; microgravity applications; regenerative life support systems; imaging techniques; cardiovascular system; physiological effects; extravehicular mobility units; mathematical models; bioreactors; computerized simulation; microgravity simulation; and dynamic structural analysis.
NASA Technical Reports Server (NTRS)
Gallardo, V. C.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.
1981-01-01
An analytical technique was developed to predict the behavior of a rotor system subjected to sudden unbalance. The technique is implemented in the Turbine Engine Transient Rotor Analysis (TETRA) computer program using the component element method. The analysis was particularly aimed toward blade-loss phenomena in gas turbine engines. A dual-rotor, casing, and pylon structure can be modeled by the computer program. Blade tip rubs, Coriolis forces, and mechanical clearances are included. The analytical system was verified by modeling and simulating actual test conditions for a rig test as well as a full-engine, blade-release demonstration.
An investigation of a mathematical model for atmospheric absorption spectra
NASA Technical Reports Server (NTRS)
Niple, E. R.
1979-01-01
A computer program that calculates absorption spectra for slant paths through the atmosphere is described. The program uses an efficient convolution technique (Romberg integration) to simulate instrument resolution effects. A brief information analysis is performed on a set of calculated spectra to illustrate how such techniques may be used to explore the quality of the information in a spectrum.
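The Romberg technique named above can be illustrated with a minimal, self-contained sketch of Romberg integration itself: trapezoid estimates refined by Richardson extrapolation. The integrand and level count are illustrative, not taken from the program described.

```python
import math

def romberg(f, a, b, n=6):
    """Romberg integration: composite-trapezoid estimates on successively
    halved step sizes, accelerated by Richardson extrapolation."""
    R = [[0.0] * (n + 1) for _ in range(n + 1)]
    h = b - a
    R[0][0] = 0.5 * h * (f(a) + f(b))
    for i in range(1, n + 1):
        h /= 2.0
        # trapezoid rule refined with 2**(i-1) new midpoints
        s = sum(f(a + (2 * k - 1) * h) for k in range(1, 2 ** (i - 1) + 1))
        R[i][0] = 0.5 * R[i - 1][0] + h * s
        # Richardson extrapolation across the row
        for j in range(1, i + 1):
            R[i][j] = R[i][j - 1] + (R[i][j - 1] - R[i - 1][j - 1]) / (4 ** j - 1)
    return R[n][n]

print(romberg(math.sin, 0.0, math.pi))  # very close to the exact value 2
```

With only six halvings, the extrapolation already reaches near machine precision on smooth integrands, which is why the technique is attractive for repeated convolution evaluations.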
GSFC VLBI Analysis Center Annual Report
NASA Technical Reports Server (NTRS)
Gordon, David; Ma, Chopo; MacMillan, Dan
1999-01-01
The GSFC VLBI group, located at NASA's Goddard Space Flight Center in Greenbelt, MD, is a part of the NASA Space Geodesy Program. Since its inception in the mid-1970s, this group has been involved with and been a leader in most aspects of geodetic and astrometric VLBI. Current major activities include coordination of the international geodetic observing program; coordination and analysis of the CORE program; VLBI technique development; and all types of data processing, analysis, and research activities.
Modeling and prototyping of biometric systems using dataflow programming
NASA Astrophysics Data System (ADS)
Minakova, N.; Petrov, I.
2018-01-01
The development of biometric systems is a labor-intensive process, so the creation and analysis of supporting approaches and techniques is an urgent task at present. This article presents a technique for modeling and prototyping biometric systems based on dataflow programming. The technique includes three main stages: the development of functional blocks, the creation of a dataflow graph, and the generation of a prototype. A specially developed software modeling environment that implements this technique is described, and the implementation of an iris localization subsystem is demonstrated as an example of its use. A modification of dataflow programming is suggested to solve the problem of the undefined order of block activation. The main advantages of the presented technique are the ability to visually display and design the model of the biometric system, the rapid creation of a working prototype, and the reuse of previously developed functional blocks.
Simulation/Emulation Techniques: Compressing Schedules With Parallel (HW/SW) Development
NASA Technical Reports Server (NTRS)
Mangieri, Mark L.; Hoang, June
2014-01-01
NASA has always been in the business of balancing new technologies and techniques to achieve human space travel objectives. NASA's Kedalion engineering analysis lab has been validating and using many contemporary avionics HW/SW development and integration techniques, which represent new paradigms to NASA's heritage culture. Kedalion has validated many of the Orion HW/SW engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, inserting new techniques and skills into the Multi-Purpose Crew Vehicle (MPCV) Orion program. Using contemporary agile techniques, commercial-off-the-shelf (COTS) products, early rapid prototyping, in-house expertise and tools, and extensive use of simulators and emulators, NASA has achieved cost-effective paradigms that are currently serving the Orion program effectively. Elements of long-lead custom hardware on the Orion program have necessitated early use of simulators and emulators in advance of deliverable hardware to achieve parallel design and development on a compressed schedule.
Ozone measurement systems improvements studies
NASA Technical Reports Server (NTRS)
Thomas, R. W.; Guard, K.; Holland, A. C.; Spurling, J. F.
1974-01-01
Results are summarized of an initial study of techniques for measuring atmospheric ozone, carried out as the first phase of a program to improve ozone measurement techniques. The study concentrated on two measurement systems, the electrochemical cell (ECC) ozonesonde and the Dobson ozone spectrophotometer, and consisted of two tasks. The first task consisted of error modeling and system error analysis of the two measurement systems. Under the second task a Monte-Carlo model of the Dobson ozone measurement technique was developed and programmed for computer operation.
A hybrid nonlinear programming method for design optimization
NASA Technical Reports Server (NTRS)
Rajan, S. D.
1986-01-01
Solutions to engineering design problems formulated as nonlinear programming (NLP) problems usually require the use of more than one optimization technique. Moreover, the interaction between the user (analysis/synthesis) program and the NLP system can lead to interface, scaling, or convergence problems. An NLP solution system is presented that seeks to solve these problems by providing a programming system to ease the user-system interface. A simple set of rules is used to select an optimization technique or to switch from one technique to another in an attempt to detect, diagnose, and solve some potential problems. Numerical examples involving finite element based optimal design of space trusses and rotor bearing systems are used to illustrate the applicability of the proposed methodology.
Study to determine cloud motion from meteorological satellite data
NASA Technical Reports Server (NTRS)
Clark, B. B.
1972-01-01
Processing techniques were tested for deducing cloud motion vectors from overlapped portions of pairs of pictures made from meteorological satellites. This was accomplished by programming and testing techniques for estimating pattern motion by means of cross correlation analysis with emphasis placed upon identifying and reducing errors resulting from various factors. Techniques were then selected and incorporated into a cloud motion determination program which included a routine which would select and prepare sample array pairs from the preprocessed test data. The program was then subjected to limited testing with data samples selected from the Nimbus 4 THIR data provided by the 11.5 micron channel.
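The pattern-motion estimation step described above can be sketched as a brute-force normalized cross-correlation search over candidate shifts. The image size, search window, and synthetic displacement below are illustrative, not taken from the Nimbus 4 data.

```python
import numpy as np

def cloud_motion(img1, img2, max_shift=5):
    """Estimate the (dy, dx) displacement between two image patches by
    maximizing the normalized cross-correlation over a small search window.
    Operational code would also filter noise and reject weak correlations."""
    best, best_shift = -2.0, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(img2, -dy, axis=0), -dx, axis=1)
            c = np.corrcoef(img1.ravel(), shifted.ravel())[0, 1]
            if c > best:
                best, best_shift = c, (dy, dx)
    return best_shift, best

rng = np.random.default_rng(0)
frame1 = rng.random((32, 32))
frame2 = np.roll(np.roll(frame1, 3, axis=0), -2, axis=1)  # known motion (3, -2)
shift, corr = cloud_motion(frame1, frame2)
print(shift)  # recovers (3, -2)
```

Dividing the correlation peak's displacement by the time between the picture pair converts the recovered shift into a motion vector.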
Edge delamination of composite laminates subject to combined tension and torsional loading
NASA Technical Reports Server (NTRS)
Hooper, Steven J.
1990-01-01
Delamination is a common failure mode of laminated composite materials. Edge delamination is important since it results in reduced stiffness and strength of the laminate. The tension/torsion load condition is of particular significance to the structural integrity of composite helicopter rotor systems. Material coupons can easily be tested under this type of loading in servo-hydraulic tension/torsion test stands using techniques very similar to those used for the Edge Delamination Tensile Test (EDT) delamination specimen. Edge delamination of specimens loaded in tension was successfully analyzed by several investigators using both classical laminate theory and quasi-three-dimensional (Q3D) finite element techniques. The former analysis technique can be used to predict the total strain energy release rate, while the latter technique enables the calculation of the mixed-mode strain energy release rates. The Q3D analysis is very efficient since it produces a three-dimensional solution to a two-dimensional domain. A computer program was developed which generates PATRAN commands to generate the finite element model. PATRAN is a pre- and post-processor which is commonly used with a variety of finite element programs such as MSC/NASTRAN. The program creates a sufficiently dense mesh at the delamination crack tips to support a mixed-mode fracture mechanics analysis. The program creates a coarse mesh in those regions where the gradients in the stress field are low (away from the delamination regions). A transition mesh is defined between these regions. This program is capable of generating a mesh for an arbitrarily oriented matrix crack. This program significantly reduces the modeling time required to generate these finite element meshes, thus providing a realistic tool with which to investigate the tension/torsion problem.
Error analysis of Dobson spectrophotometer measurements of the total ozone content
NASA Technical Reports Server (NTRS)
Holland, A. C.; Thomas, R. W. L.
1975-01-01
A study of techniques for measuring atmospheric ozone is reported. This study represents the second phase of a program designed to improve techniques for the measurement of atmospheric ozone. This phase of the program studied the sensitivity of Dobson direct-sun measurements, and the ozone amounts inferred from those measurements, to variation in the atmospheric temperature profile. The study used the plane-parallel Monte-Carlo model developed and tested under the initial phase of this program, and a series of standard model atmospheres.
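The sensitivity-study idea can be sketched generically: perturb the inputs of a toy retrieval formula with assumed random errors and observe the resulting spread in the inferred quantity. The formula, constants, and error magnitudes below are hypothetical stand-ins, not the actual Dobson retrieval.

```python
import random
import statistics

def toy_retrieval(n_value, mu, d_alpha=1.388):
    """Illustrative direct-sun retrieval: inferred ozone amount from a
    measured N-value, air-mass factor mu, and a differential absorption
    coefficient d_alpha. The formula and constant are assumptions."""
    return n_value / (mu * d_alpha)

def monte_carlo_error(n_trials=20000, seed=1):
    """Monte Carlo error analysis: draw perturbed inputs from assumed
    Gaussian error models and report the mean and spread of the output."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_trials):
        n_value = rng.gauss(1.5, 0.01)  # assumed measurement noise
        mu = rng.gauss(2.0, 0.02)       # assumed air-mass uncertainty
        results.append(toy_retrieval(n_value, mu))
    return statistics.mean(results), statistics.stdev(results)

mean, sigma = monte_carlo_error()
print(f"retrieved ~ {mean:.4f} +/- {sigma:.4f} (arbitrary units)")
```

Rerunning with different assumed input-error magnitudes shows which input dominates the output uncertainty, which is the point of such a sensitivity study.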
A PERT/CPM of the Computer Assisted Completion of The Ministry September Report. Research Report.
ERIC Educational Resources Information Center
Feeney, J. D.
Using two statistical analysis techniques (the Program Evaluation and Review Technique and the Critical Path Method), this study analyzed procedures for compiling the required yearly report of the Metropolitan Separate School Board (Catholic) of Toronto, Canada. The computer-assisted analysis organized the process of completing the report more…
Interactive Graphics Analysis for Aircraft Design
NASA Technical Reports Server (NTRS)
Townsend, J. C.
1983-01-01
Program uses higher-order far-field drag minimization. Computer program WDES WDEM is a preliminary aerodynamic design tool for one or two interacting, subsonic lifting surfaces. The subcritical wing design code employs a higher-order far-field drag minimization technique. Linearized aerodynamic theory is used. Program written in FORTRAN IV.
NASA Technical Reports Server (NTRS)
Foss, W. E., Jr.
1981-01-01
A computer technique to determine the mission radius and maneuverability characteristics of combat aircraft was developed. The technique was used to determine critical operational requirements and the areas in which research programs would be expected to yield the most beneficial results. In turn, the results of research efforts were evaluated in terms of aircraft performance on selected mission segments and for complete mission profiles. Extensive use of the technique in evaluation studies indicates that the calculated performance is essentially the same as that obtained by the proprietary programs in use throughout the aircraft industry.
Construction of dynamic stochastic simulation models using knowledge-based techniques
NASA Technical Reports Server (NTRS)
Williams, M. Douglas; Shiva, Sajjan G.
1990-01-01
Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).
Static Analysis of Programming Exercises: Fairness, Usefulness and a Method for Application
ERIC Educational Resources Information Center
Nutbrown, Stephen; Higgins, Colin
2016-01-01
This article explores the suitability of static analysis techniques based on the abstract syntax tree (AST) for the automated assessment of early/mid degree level programming. Focus is on fairness, timeliness and consistency of grades and feedback. Following investigation into manual marking practises, including a survey of markers, the assessment…
Potential Uses of Occupational Analysis Data By Air Force Management Engineering Teams.
ERIC Educational Resources Information Center
McFarland, Barry P.
Both the occupational analysis program and the management engineering program are primarily concerned with task-level descriptions of time spent to perform tasks required in the Air Force, the first being personnel specialty code oriented and the second being work center oriented. However, two separate and independent techniques have been developed…
Menu-Driven Solver Of Linear-Programming Problems
NASA Technical Reports Server (NTRS)
Viterna, L. A.; Ferencz, D.
1992-01-01
Program assists inexperienced user in formulating linear-programming problems. A Linear Program Solver (ALPS) is a full-featured LP analysis program. It solves plain linear-programming problems as well as more complicated mixed-integer and pure-integer programs, and also contains an efficient technique for the solution of purely binary linear-programming problems. Written entirely in IBM's APL2/PC software, Version 1.01. The packaged program contains licensed material, property of IBM (copyright 1988, all rights reserved).
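The pure-binary problem class that ALPS handles with a special-purpose technique can be illustrated by the simplest possible method, exhaustive enumeration. The knapsack-style instance below is invented for illustration and says nothing about ALPS's actual algorithm.

```python
from itertools import product

def solve_binary_lp(c, A, b):
    """Maximize c.x subject to A.x <= b with each x[j] in {0, 1},
    by enumerating all 2**n candidate vectors. Practical only for
    small n; real solvers prune the search instead."""
    n = len(c)
    best_val, best_x = None, None
    for x in product((0, 1), repeat=n):
        feasible = all(sum(a[j] * x[j] for j in range(n)) <= bi
                       for a, bi in zip(A, b))
        if feasible:
            val = sum(c[j] * x[j] for j in range(n))
            if best_val is None or val > best_val:
                best_val, best_x = val, x
    return best_x, best_val

# three 0/1 variables, one resource constraint of capacity 5
x, v = solve_binary_lp(c=[3, 4, 2], A=[[2, 3, 1]], b=[5])
print(x, v)  # → (1, 1, 0) 7
```

Mixed-integer and pure-integer programs generalize this by allowing larger integer ranges, which is why they need branch-and-bound rather than enumeration.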
1976-09-01
The purpose of this research effort was to determine the financial management educational needs of USAF graduate logistics positions. Goal analysis...was used to identify financial management techniques, and task analysis was used to develop a method to identify the use of financial management techniques...positions. The survey identified financial management techniques in five areas: cost accounting, capital budgeting, working capital, financial forecasting, and programming. (Author)
A Sensitivity Analysis of Circular Error Probable Approximation Techniques
1992-03-01
SENSITIVITY ANALYSIS OF CIRCULAR ERROR PROBABLE APPROXIMATION TECHNIQUES. THESIS Presented to the Faculty of the School of Engineering of the Air Force...programming skills. Major Paul Auclair patiently advised me in this endeavor, and Major Andy Howell added numerous insightful contributions. I thank my...techniques. The two most accurate techniques require numerical integration and can take several hours to run on a personal computer [2:1-2,4-6]. Some
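For the common special case of equal, uncorrelated normal errors in both axes, the circular error probable (CEP) has a closed form (the median of a Rayleigh distribution), while a Monte Carlo estimate covers the general case. This sketch illustrates the quantity being approximated; it is not one of the thesis's specific approximation techniques.

```python
import math
import random

def cep_rayleigh(sigma):
    """Exact CEP when x and y errors are independent N(0, sigma^2):
    the Rayleigh median, approximately 1.1774 * sigma."""
    return sigma * math.sqrt(2.0 * math.log(2.0))

def cep_monte_carlo(sigma_x, sigma_y, n=200000, seed=7):
    """Monte Carlo CEP: the median miss distance of simulated impacts.
    Works for unequal sigmas, where simple closed forms break down."""
    rng = random.Random(seed)
    misses = sorted(math.hypot(rng.gauss(0, sigma_x), rng.gauss(0, sigma_y))
                    for _ in range(n))
    return misses[n // 2]

print(cep_rayleigh(10.0))           # ≈ 11.774
print(cep_monte_carlo(10.0, 10.0))  # ≈ 11.77, agreeing with the closed form
```

Approximation techniques of the kind the thesis compares trade the cost of numerical integration or simulation for quick formulas in the unequal-sigma case.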
Social marketing and public health intervention.
Lefebvre, R C; Flora, J A
1988-01-01
The rapid proliferation of community-based health education programs has out-paced the knowledge base of behavior change strategies that are appropriate and effective for public health interventions. However, experiences from a variety of large-scale studies suggest that principles and techniques of social marketing may help bridge this gap. This article discusses eight essential aspects of the social marketing process: the use of a consumer orientation to develop and market intervention techniques, exchange theory as a model from which to conceptualize service delivery and program participation, audience analysis and segmentation strategies, the use of formative research in program design and pretesting of intervention materials, channel analysis for devising distribution systems and promotional campaigns, employment of the "marketing mix" concept in intervention planning and implementation, development of a process tracking system, and a management process of problem analysis, planning, implementation, feedback and control functions. Attention to such variables could result in more cost-effective programs that reach larger numbers of the target audience.
CPM and PERT in Library Management.
ERIC Educational Resources Information Center
Main, Linda
1989-01-01
Discusses two techniques of systems analysis--Critical Path Method (CPM) and Program Evaluation Review Techniques (PERT)--and their place in library management. An overview of CPM and PERT charting procedures is provided. (11 references) (Author/MES)
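The core CPM computation, a forward pass for earliest finish times and a backward pass for latest finish times, with zero-slack tasks forming the critical path, can be sketched on a toy activity network. The task names and durations below are invented.

```python
def critical_path(tasks):
    """Critical Path Method on {name: (duration, [predecessors])}.
    Returns (project length, list of zero-slack tasks)."""
    early = {}  # earliest finish time per task

    def ef(t):
        if t not in early:
            dur, preds = tasks[t]
            early[t] = max((ef(p) for p in preds), default=0) + dur
        return early[t]

    length = max(ef(t) for t in tasks)
    # backward pass: latest finish times, propagated to predecessors
    late = {t: length for t in tasks}
    for t in sorted(tasks, key=ef, reverse=True):
        dur, preds = tasks[t]
        for p in preds:
            late[p] = min(late[p], late[t] - dur)
    critical = [t for t in tasks if late[t] - early[t] == 0]
    return length, critical

# toy network: A precedes B and C; both precede D
net = {"A": (2, []), "B": (4, ["A"]), "C": (1, ["A"]), "D": (3, ["B", "C"])}
print(critical_path(net))  # → (9, ['A', 'B', 'D'])
```

PERT extends the same network computation with optimistic/likely/pessimistic duration estimates instead of single fixed durations.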
Programmable Logic Application Notes
NASA Technical Reports Server (NTRS)
Katz, Richard
2000-01-01
This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter continues a series of notes concentrating on analysis techniques, with this issue's section discussing Digital Timing Analysis Tools and Techniques. Articles in this issue include: SX and SX-A Series Devices Power Sequencing; JTAG and SX/SX-A/SX-S Series Devices; Analysis Techniques (i.e., notes on digital timing analysis tools and techniques); Status of the Radiation Hard Reconfigurable Field Programmable Gate Array Program; Input Transition Times; Apollo Guidance Computer Logic Study; RT54SX32S Prototype Data Sets; A54SX32A - 0.22 micron/UMC Test Results; Ramtron FM1608 FRAM; and Analysis of VHDL Code and Synthesizer Output.
The evolution of optics education at the U.S. National Optical Astronomy Observatory
NASA Astrophysics Data System (ADS)
Pompea, Stephen M.; Walker, Constance E.; Sparks, Robert T.
2014-07-01
The last decade of optics education at the U.S. National Optical Astronomy Observatory will be described in terms of program planning, assessment of community needs, identification of networks and strategic partners, the establishment of specific program goals and objectives, and program metrics and evaluation. A number of NOAO's optics education programs for formal and informal audiences will be described, including our Hands-On Optics program, illumination engineering/dark skies energy education programs, afterschool programs, adaptive optics education program, student outreach, and Galileoscope program. Particular emphasis will be placed on techniques for funding and sustaining high-quality programs. The use of educational gap analysis to identify the key needs of the formal and informal educational systems will be emphasized as a technique that has helped us to maximize our educational program effectiveness locally, regionally, nationally, and in Chile.
Efficient Ada multitasking on a RISC register window architecture
NASA Technical Reports Server (NTRS)
Kearns, J. P.; Quammen, D.
1987-01-01
This work addresses the problem of reducing context switch overhead on a processor which supports a large register file - a register file much like that which is part of the Berkeley RISC processors and several other emerging architectures (which are not necessarily reduced instruction set machines in the purest sense). Such a reduction in overhead is particularly desirable in a real-time embedded application, in which task-to-task context switch overhead may result in failure to meet crucial deadlines. A storage management technique by which a context switch may be implemented as cheaply as a procedure call is presented. The essence of this technique is the avoidance of the save/restore of registers on the context switch. This is achieved through analysis of the static source text of an Ada tasking program. Information gained during that analysis directs the optimized storage management strategy for that program at run time. A formal verification of the technique in terms of an operational control model and an evaluation of the technique's performance via simulations driven by synthetic Ada program traces are presented.
Reliability analysis of laminated CMC components through shell subelement techniques
NASA Technical Reports Server (NTRS)
Starlinger, Alois; Duffy, Stephen F.; Gyekenyesi, John P.
1992-01-01
An updated version of the integrated design program Composite Ceramics Analysis and Reliability Evaluation of Structures (C/CARES) was developed for the reliability evaluation of ceramic matrix composites (CMC) laminated shell components. The algorithm is now split into two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The interface program creates a neutral data base which is then read by the reliability module. This neutral data base concept allows easy data transfer between different computer systems. The new interface program from the finite-element code Matrix Automated Reduction and Coupling (MARC) also includes the option of using hybrid laminates (a combination of plies of different materials or different layups) and allows for variations in temperature fields throughout the component. In the current version of C/CARES, a subelement technique was implemented, enabling stress gradients within an element to be taken into account. The noninteractive reliability function is now evaluated at each Gaussian integration point instead of using averaging techniques. As a result of the increased number of stress evaluation points, considerable improvements in the accuracy of reliability analyses were realized.
A Technique for the Analysis of Auto Exhaust.
ERIC Educational Resources Information Center
Sothern, Ray D.; And Others
Developed for presentation at the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April, 1971, this outline explains a technique for separating the complex mixture of hydrocarbons contained in automotive exhausts. A Golay column and subambient temperature programming technique are…
A methodology for producing reliable software, volume 1
NASA Technical Reports Server (NTRS)
Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.
1976-01-01
An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.
Integrating computer programs for engineering analysis and design
NASA Technical Reports Server (NTRS)
Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.
1983-01-01
The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.
NASA Technical Reports Server (NTRS)
Hayden, W. L.; Robinson, L. H.
1972-01-01
Spectral analysis of angle-modulated communication systems is studied by: (1) performing a literature survey of candidate power-spectrum computational techniques, determining the computational requirements, and formulating a mathematical model satisfying these requirements; (2) implementing the model on a UNIVAC 1230 digital computer as the Spectral Analysis Program (SAP); and (3) developing the hardware specifications for a data acquisition system which will acquire an input modulating signal for SAP. The SAP computational technique uses an extended fast Fourier transform and represents a generalized approach for simple and complex modulating signals.
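The FFT-based power-spectrum computation at the heart of a program like SAP can be sketched as follows. The signal, sampling rate, carrier, and modulation parameters are illustrative, not drawn from the report.

```python
import numpy as np

def power_spectrum(signal, fs):
    """One-sided power spectrum of a real signal via the FFT.
    Returns (frequencies, power); normalization is illustrative."""
    n = len(signal)
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    power = (np.abs(spec) ** 2) / n
    return freqs, power

# angle-modulated (phase-modulated) carrier: sidebands at +/- 5 Hz
# around a 50 Hz carrier, with modulation index 0.5
fs = 1024
t = np.arange(1024) / fs
sig = np.cos(2 * np.pi * 50 * t + 0.5 * np.sin(2 * np.pi * 5 * t))
freqs, power = power_spectrum(sig, fs)
print(freqs[np.argmax(power)])  # → 50.0 (carrier line dominates)
```

For this small modulation index the carrier line dominates and Bessel-function sidebands appear at multiples of the 5 Hz modulating frequency, the structure such a spectral-analysis program is built to reveal.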
Earth rotation, station coordinates and orbit determination from satellite laser ranging
NASA Astrophysics Data System (ADS)
Murata, Masaaki
The Project MERIT, a special program of international collaboration to Monitor Earth Rotation and Intercompare the Techniques of observation and analysis, has come to an end with great success. Its major objective was to evaluate the ultimate potential of space techniques such as VLBI and satellite laser ranging, in contrast with the other conventional techniques, in the determination of the rotational dynamics of the earth. The National Aerospace Laboratory (NAL) has officially participated in the project as an associate analysis center for the satellite laser technique for the period of the MERIT Main Campaign (September 1983-October 1984). In this paper, the NAL analysis center results are presented.
NASA Technical Reports Server (NTRS)
Reichel, R. H.; Hague, D. S.; Jones, R. T.; Glatt, C. R.
1973-01-01
This computer program manual describes in two parts the automated combustor design optimization code AUTOCOM. The program code is written in the FORTRAN IV language. The input data setup and the program outputs are described, and a sample engine case is discussed. The program structure and programming techniques are also described, along with AUTOCOM program analysis.
Toward synthesizing executable models in biology.
Fisher, Jasmin; Piterman, Nir; Bodik, Rastislav
2014-01-01
Over the last decade, executable models of biological behaviors have repeatedly provided new scientific discoveries, uncovered novel insights, and directed new experimental avenues. These models are computer programs whose execution mechanistically simulates aspects of the cell's behaviors. If the observed behavior of the program agrees with the observed biological behavior, then the program explains the phenomena. This approach has proven beneficial for gaining new biological insights and directing new experimental avenues. One advantage of this approach is that techniques for analysis of computer programs can be applied to the analysis of executable models. For example, one can confirm that a model agrees with experiments for all possible executions of the model (corresponding to all environmental conditions), even if there are a huge number of executions. Various formal methods have been adapted for this context, for example, model checking or symbolic analysis of state spaces. To avoid manual construction of executable models, one can apply synthesis, a method to produce programs automatically from high-level specifications. In the context of biological modeling, synthesis would correspond to extracting executable models from experimental data. We survey recent results about the usage of the techniques underlying synthesis of computer programs for the inference of biological models from experimental data. We describe synthesis of biological models from curated mutation experiment data, inferring network connectivity models from phosphoproteomic data, and synthesis of Boolean networks from gene expression data. While much work has been done on automated analysis of similar datasets using machine learning and artificial intelligence, using synthesis techniques provides new opportunities such as efficient computation of disambiguating experiments, as well as the ability to produce different kinds of models automatically from biological data.
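A synchronous Boolean network, the kind of executable model the survey describes synthesizing from gene expression data, can be sketched in a few lines. The three-gene network below is invented for illustration, not inferred from real data.

```python
def step(state, rules):
    """One synchronous update of a Boolean network: every gene's next
    value is a Boolean function of the current state, applied in parallel."""
    return {gene: rule(state) for gene, rule in rules.items()}

# toy regulatory logic: A activates B, B activates C, C represses A
rules = {
    "A": lambda s: not s["C"],
    "B": lambda s: s["A"],
    "C": lambda s: s["B"],
}

state = {"A": True, "B": False, "C": False}
for _ in range(6):
    state = step(state, rules)
print(state)  # the 3-gene loop cycles back to the initial state
```

Executing the model exhaustively over all 2^3 initial states enumerates every possible behavior, which is what lets analysis techniques such as model checking confirm agreement with experiments across all conditions.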
Reduction and analysis of data collected during the electromagnetic tornado experiment
NASA Technical Reports Server (NTRS)
Davisson, L. D.
1976-01-01
Techniques for data processing and analysis are described to support tornado detection by analysis of radio frequency interference in various frequency bands, and sea state determination from short pulse radar measurements. Activities include: strip chart recording of tornado data; the development and implementation of computer programs for digitalization and analysis of the data; data reduction techniques for short pulse radar data, and the simulation of radar returns from the sea surface by computer models.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1990-01-01
Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
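Fitting the three model families the standard names (linear, quadratic, and exponential) and comparing residuals can be sketched as follows. The synthetic data and the pick-the-smallest-residual rule are illustrative, not prescribed by the standard.

```python
import numpy as np

def fit_trend(t, y):
    """Fit linear, quadratic, and exponential models to time-series data
    and return each model's residual sum of squares (RSS)."""
    fits = {}
    for name, deg in (("linear", 1), ("quadratic", 2)):
        coeffs = np.polyfit(t, y, deg)
        fits[name] = float(np.sum((np.polyval(coeffs, t) - y) ** 2))
    # exponential y = a * exp(b * t) via log-linear least squares (y > 0)
    b, log_a = np.polyfit(t, np.log(y), 1)
    fits["exponential"] = float(np.sum((np.exp(log_a + b * t) - y) ** 2))
    return fits

t = np.arange(10, dtype=float)
y = 2.0 * np.exp(0.3 * t)        # synthetic exponentially growing trend
fits = fit_trend(t, y)
print(min(fits, key=fits.get))   # → exponential
```

In practice a trend analyst would also plot the data and residuals, since a blind best-RSS rule can overfit short series.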
An exploratory investigation of weight estimation techniques for hypersonic flight vehicles
NASA Technical Reports Server (NTRS)
Cook, E. L.
1981-01-01
The three basic methods of weight prediction (fixed-fraction, statistical correlation, and point stress analysis) and some of the computer programs that have been developed to implement them are discussed. A modified version of the WAATS (Weights Analysis of Advanced Transportation Systems) program is presented, along with input data forms and an example problem.
A Meta-Analytic Review of Components Associated with Parent Training Program Effectiveness
ERIC Educational Resources Information Center
Kaminski, Jennifer Wyatt; Valle, Linda Anne; Filene, Jill H.; Boyle, Cynthia L.
2008-01-01
This component analysis used meta-analytic techniques to synthesize the results of 77 published evaluations of parent training programs (i.e., programs that included the active acquisition of parenting skills) to enhance behavior and adjustment in children aged 0-7. Characteristics of program content and delivery method were used to predict effect…
Modifications Of Hydrostatic-Bearing Computer Program
NASA Technical Reports Server (NTRS)
Hibbs, Robert I., Jr.; Beatty, Robert F.
1991-01-01
Several modifications made to enhance utility of HBEAR, computer program for analysis and design of hydrostatic bearings. Modifications make program applicable to more realistic cases and reduce time and effort necessary to arrive at a suitable design. Uses search technique to iterate on size of orifice to obtain required pressure ratio.
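The search iteration on orifice size can be illustrated with a bisection search. The `pressure_ratio` surrogate below is entirely hypothetical; it only stands in for the hydrostatic-bearing flow model inside HBEAR, whose actual form is not given here:

```python
def pressure_ratio(d_orifice):
    # Hypothetical surrogate model: ratio falls as orifice diameter
    # grows. Stands in for HBEAR's internal bearing-flow calculation.
    return 1.0 / (1.0 + (d_orifice / 0.05) ** 2)

def size_orifice(target, lo=0.001, hi=0.5, tol=1e-6):
    """Bisection on orifice diameter until pressure_ratio meets target.
    Assumes pressure_ratio is monotonically decreasing on [lo, hi]."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if pressure_ratio(mid) > target:
            lo = mid   # ratio still too high: enlarge the orifice
        else:
            hi = mid
    return 0.5 * (lo + hi)

d = size_orifice(0.5)
print(round(d, 3))  # the surrogate gives ratio 0.5 at d = 0.05
```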
Microcomputer Applications for Teaching Microeconomic Concepts: Some Old and New Approaches.
ERIC Educational Resources Information Center
Smith, L. Murphy; Smith, L. C., Jr.
1989-01-01
Presents microcomputer programs and programing techniques and demonstrates how these programs can be used by teachers to explain economics concepts and to help students make judgments. Each microcomputer application is supplemented by traditional graphic and mathematical analysis. Discusses applications dealing with supply, demand, elasticity,…
Dual Nozzle Aerodynamic and Cooling Analysis Study.
1981-02-27
program and to the aerodynamic model computer program. This procedure was used to define two secondary nozzle contours for the baseline configuration...both the dual-throat and dual-expander concepts. Advanced analytical techniques were utilized to provide quantitative estimates of the bleed flow...preliminary heat transfer analysis of both concepts, and (5) engineering analysis of data from the NASA/MSFC hot-fire testing of a dual-throat
Acoustic prediction methods for the NASA generalized advanced propeller analysis system (GAPAS)
NASA Technical Reports Server (NTRS)
Padula, S. L.; Block, P. J. W.
1984-01-01
Classical methods of propeller performance analysis are coupled with state-of-the-art Aircraft Noise Prediction Program (ANOPP) techniques to yield a versatile design tool for novel quiet, efficient propellers: the NASA Generalized Advanced Propeller Analysis System (GAPAS). ANOPP is a collection of modular specialized programs. GAPAS as a whole addresses blade geometry and aerodynamics, rotor performance and loading, and subsonic propeller noise.
Graphic analysis of resources by numerical evaluation techniques (Garnet)
Olson, A.C.
1977-01-01
An interactive computer program for graphical analysis has been developed by the U.S. Geological Survey. The program embodies five goals: (1) economical use of computer resources, (2) simplicity for user applications, (3) interactive on-line use, (4) minimal core requirements, and (5) portability. It is designed to aid (1) the rapid analysis of point-located data, (2) structural mapping, and (3) estimation of area resources. © 1977.
System analysis in rotorcraft design: The past decade
NASA Technical Reports Server (NTRS)
Galloway, Thomas L.
1988-01-01
Rapid advances in the technology of electronic digital computers and the need for an integrated synthesis approach in developing future rotorcraft programs have led to increased emphasis on system analysis techniques in rotorcraft design. The task in systems analysis is to deal with complex, interdependent, and conflicting requirements in a structured manner so that rational and objective decisions can be made. Whether the results are wisdom or rubbish depends upon the validity and, sometimes more importantly, the consistency of the inputs, the correctness of the analysis, and a sensible choice of measures of effectiveness used to draw conclusions. In rotorcraft design this means combining design requirements, technology assessment, and sensitivity analysis; techniques currently in use by NASA and Army organizations in developing research programs and vehicle specifications for rotorcraft are reviewed. These procedures span simple graphical approaches to comprehensive analysis on large mainframe computers. Examples of recent applications to military and civil missions are highlighted.
Analysis on laser plasma emission for characterization of colloids by video-based computer program
NASA Astrophysics Data System (ADS)
Putri, Kirana Yuniati; Lumbantoruan, Hendra Damos; Isnaeni
2016-02-01
Laser-induced breakdown detection (LIBD) is a sensitive technique for characterization of colloids with small size and low concentration. There are two types of detection, optical and acoustic. Optical LIBD employs a CCD camera to capture the plasma emission and uses the information to quantify the colloids. This technique requires sophisticated technology which is often costly. In order to build a simple, home-made LIBD system, a dedicated computer program based on MATLAB™ for analyzing laser plasma emission was developed. The analysis was conducted by counting the number of plasma emissions (breakdowns) during a certain period of time. Breakdown probability provided information on colloid size and concentration. Validation experiments showed that the computer program performed well in analyzing the plasma emissions. A graphical user interface (GUI) was also developed to make the program more user-friendly.
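The breakdown-counting step reduces to a fraction of shots whose captured frame crosses a detection threshold. The intensity values and threshold below are invented; this is only an illustration of the probability estimate, not the MATLAB program itself:

```python
def breakdown_probability(frame_peaks, threshold):
    """Fraction of laser shots whose peak plasma-emission intensity
    exceeds the detection threshold (one camera frame per shot)."""
    breakdowns = sum(1 for peak in frame_peaks if peak > threshold)
    return breakdowns / len(frame_peaks)

# Hypothetical peak intensities for 10 shots; the threshold separates
# genuine breakdown flashes from camera background.
peaks = [3, 120, 95, 2, 110, 4, 130, 1, 105, 2]
p = breakdown_probability(peaks, threshold=50)
print(p)  # 0.5
```

In LIBD practice this probability is then mapped to colloid size and concentration via calibration curves, which are not modeled here.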
An Algebra-Based Introductory Computational Neuroscience Course with Lab.
Fink, Christian G
2017-01-01
A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.
SYSTID - A flexible tool for the analysis of communication systems.
NASA Technical Reports Server (NTRS)
Dawson, C. T.; Tranter, W. H.
1972-01-01
Description of the System Time Domain Simulation (SYSTID) computer-aided analysis program which is specifically structured for communication systems analysis. The SYSTID program is user oriented so that very little knowledge of computer techniques and very little programming ability are required for proper application. The program is designed so that the user can go from a system block diagram to an accurate simulation by simply programming a single English language statement for each block in the system. The mathematical and functional models available in the SYSTID library are presented. An example problem is given which illustrates the ease of modeling communication systems. Examples of the outputs available are presented, and proposed improvements are summarized.
NASA Technical Reports Server (NTRS)
Ricks, Wendell R.
1995-01-01
Pairwise comparison (PWC) is a computer program that collects data for psychometric scaling techniques now used in cognitive research. It applies the technique of pairwise comparisons, one of many techniques commonly used to acquire the data necessary for such analyses. PWC administers the task, collects data from the test subject, and formats the data for analysis. Written in Turbo Pascal v6.0.
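The pairwise-comparison task PWC administers can be illustrated as follows (in Python rather than Turbo Pascal). The stimuli and the `choose` callback are hypothetical stand-ins for the test subject's responses:

```python
from itertools import combinations

def administer_pwc(stimuli, choose):
    """Present every unordered pair of stimuli once and tally wins.
    `choose(a, b)` stands in for the test subject's preference."""
    wins = {s: 0 for s in stimuli}
    for a, b in combinations(stimuli, 2):
        wins[choose(a, b)] += 1
    return wins

# Hypothetical subject who always prefers the alphabetically earlier item.
stimuli = ["A", "B", "C", "D"]
wins = administer_pwc(stimuli, choose=min)
print(wins)  # {'A': 3, 'B': 2, 'C': 1, 'D': 0}
```

The resulting win counts are the raw material that scaling techniques (e.g. Thurstone-style methods) would then transform into psychometric scale values.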
NASA Technical Reports Server (NTRS)
Cooke, C. H.
1975-01-01
STICAP (Stiff Circuit Analysis Program) is a FORTRAN 4 computer program written for the CDC-6400-6600 computer series and SCOPE 3.0 operating system. It provides the circuit analyst a tool for automatically computing the transient responses and frequency responses of large linear time-invariant networks, both stiff and nonstiff (algorithms and numerical integration techniques are described). The circuit description and user's program input language is engineer-oriented, simplifying the task of using the program. Engineering theories underlying STICAP are examined. A user's manual is included which explains user interaction with the program and gives results of typical circuit design applications. Also, the program structure is depicted from a systems programmer's viewpoint, and flow charts and other software documentation are given.
[Application of virtual instrumentation technique in toxicological studies].
Moczko, Jerzy A
2005-01-01
Research investigations frequently require direct connection of measuring equipment to the computer. Virtual instrumentation technique considerably facilitates the programming of sophisticated acquisition-and-analysis procedures. In the standard approach these two steps are performed sequentially with separate software tools: the acquired data are transferred, using the export/import procedures of one program, to another program that executes the next step of the analysis. This procedure is cumbersome, time-consuming, and a potential source of errors. In 1987 National Instruments Corporation introduced the LabVIEW language, based on the concept of graphical programming. Contrary to conventional textual languages, it allows the researcher to concentrate on the problem being solved and omit all syntactical rules. Programs developed in LabVIEW are called virtual instruments (VI) and are portable among different computer platforms such as PCs, Macintoshes, Sun SPARCstations, Concurrent PowerMAX stations, and HP PA/RISC workstations. This flexibility warrants that programs prepared for one particular platform will also be appropriate for another. In the presented paper, basic principles of the connection of research equipment to computer systems are described.
Burton, R; Mauk, D
1993-03-01
By integrating customer satisfaction planning and industrial engineering techniques when examining internal costs and efficiencies, materiel managers are better able to determine which concepts will best meet their customers' needs. Defining your customer(s), applying industrial engineering techniques, completing work sampling studies, itemizing the recommendations and benefits of each alternative, performing feasibility and cost-analysis matrices, and utilizing resources through productivity monitoring will get you on the right path toward selecting concepts to use. This article reviews the above procedures as they applied to one hospital's decision-making process to determine whether to incorporate a stockless inventory program. Through an analysis of customer demand, the hospital realized that stockless was the way to go, but not by outsourcing the function: the hospital incorporated an in-house stockless inventory program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brady, D.N.; Church, B.W.; White, M.G.
Soil sampling activities during 1974 were concentrated in Area 5 of the Nevada Test Site (NTS). Area 5 has been assigned the highest priority because of the number of atmospheric test events held and a wide distribution of contaminants. Improved sampling techniques are described. Preliminary data analysis aided in designing a program to infer 239-240Pu results by Ge(Li) scanning techniques. (auth)
Solid oxide fuel cell simulation and design optimization with numerical adjoint techniques
NASA Astrophysics Data System (ADS)
Elliott, Louie C.
This dissertation reports on the application of numerical optimization techniques as applied to fuel cell simulation and design. Due to the "multi-physics" inherent in a fuel cell, which results in a highly coupled and non-linear behavior, an experimental program to analyze and improve the performance of fuel cells is extremely difficult. This program applies new optimization techniques with computational methods from the field of aerospace engineering to the fuel cell design problem. After an overview of fuel cell history, importance, and classification, a mathematical model of solid oxide fuel cells (SOFC) is presented. The governing equations are discretized and solved with computational fluid dynamics (CFD) techniques including unstructured meshes, non-linear solution methods, numerical derivatives with complex variables, and sensitivity analysis with adjoint methods. Following the validation of the fuel cell model in 2-D and 3-D, the results of the sensitivity analysis are presented. The sensitivity derivative for a cost function with respect to a design variable is found with three increasingly sophisticated techniques: finite difference, direct differentiation, and adjoint. A design cycle is performed using a simple optimization method to improve the value of the implemented cost function. The results from this program could improve fuel cell performance and lessen the world's dependence on fossil fuels.
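Of the three sensitivity techniques named (finite difference, direct differentiation with complex variables, adjoint), the complex-step derivative is the easiest to demonstrate: perturbing the input along the imaginary axis avoids subtractive cancellation, so the step can be made arbitrarily small. The cost function below is a made-up smooth surrogate, not the fuel-cell objective:

```python
import cmath

def f(x):
    # Hypothetical smooth cost function standing in for the fuel-cell
    # objective; its analytic derivative is 2*x + cos(x).
    return x ** 2 + cmath.sin(x)

x0 = 1.3
h = 1e-20
# Complex step: derivative = Im f(x + i*h) / h, no cancellation error.
d_complex = f(x0 + 1j * h).imag / h
# Central finite difference for comparison; step cannot be made tiny.
hf = 1e-6
d_fd = (f(x0 + hf).real - f(x0 - hf).real) / (2 * hf)
exact = 2 * x0 + cmath.cos(x0).real
print(abs(d_complex - exact) < 1e-12, abs(d_fd - exact) < 1e-6)
```

The adjoint method the dissertation ultimately uses gives all design-variable sensitivities from one extra solve, but it requires access to the discretized residual equations and so cannot be sketched this compactly.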
Runtime Analysis of Linear Temporal Logic Specifications
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Havelund, Klaus
2001-01-01
This report presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL-to-Büchi-automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
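The idea of an automaton observing a finite program trace can be illustrated for one fixed response property, G(p -> F q). This hand-coded two-state monitor is a sketch of finite-trace checking under one common semantics, not the paper's general LTL-to-Büchi translation:

```python
def monitor_response(trace, p, q):
    """Check the response property G(p -> F q) over a finite trace:
    every p-event must eventually be answered by a q-event, with a
    simultaneous q counting as an answer (one chosen convention).
    Each trace element is the set of propositions true in that state."""
    pending = False                # an unanswered p obligation
    for state in trace:
        if q in state:
            pending = False        # obligation discharged
        if p in state and q not in state:
            pending = True         # new obligation raised
    return not pending             # finite-trace verdict at trace end

ok  = [{"req"}, set(), {"ack"}, {"req"}, {"ack"}]
bad = [{"req"}, {"ack"}, {"req"}, set()]
print(monitor_response(ok, "req", "ack"),
      monitor_response(bad, "req", "ack"))  # True False
```

The real tool generates such observers automatically from arbitrary LTL formulae rather than hand-coding one per property.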
The PAWS and STEM reliability analysis programs
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Stevenson, Philip H.
1988-01-01
The PAWS and STEM programs are new design/validation tools. These programs provide a flexible, user-friendly, language-based interface for the input of Markov models describing the behavior of fault-tolerant computer systems. These programs produce exact solutions of the probability of system failure and provide a conservative estimate of the number of significant digits in the solution. PAWS uses a Pade approximation as a solution technique; STEM uses a Taylor series as a solution technique. Both programs have the capability to solve numerically stiff models. PAWS and STEM possess complementary properties with regard to their input space, and an additional strength of these programs is that they accept input compatible with the SURE program. If used in conjunction with SURE, PAWS and STEM provide a powerful suite of programs to analyze the reliability of fault-tolerant computer systems.
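The Taylor-series solution idea behind STEM can be sketched for a toy Markov model: the state-probability matrix at time t is the matrix exponential exp(Qt) of the generator Q. This truncated series is our illustration, safe only for small, non-stiff generators, and is not the actual STEM code:

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def expm_taylor(Q, t, terms=30):
    """Truncated Taylor series for exp(Q*t): sum over k of (Q*t)^k / k!."""
    n = len(Q)
    Qt = [[Q[i][j] * t for j in range(n)] for i in range(n)]
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # I
    term = [row[:] for row in result]                               # (Qt)^0/0!
    for k in range(1, terms):
        term = mat_mul(term, Qt)
        term = [[x / k for x in row] for row in term]               # (Qt)^k/k!
        result = [[result[i][j] + term[i][j] for j in range(n)]
                  for i in range(n)]
    return result

# Hypothetical 2-state model: working -> failed at rate 0.01 per hour.
Q = [[-0.01, 0.01],
     [ 0.0,  0.0 ]]
P = expm_taylor(Q, t=100.0)
print(round(P[0][1], 4))  # P(failed by t=100) = 1 - exp(-1), about 0.6321
```

For stiff models (rates differing by many orders of magnitude), a plain truncated series like this loses accuracy, which is exactly why PAWS and STEM implement more careful numerics.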
Application of a substructuring technique to the problem of crack extension and closure
NASA Technical Reports Server (NTRS)
Armen, H., Jr.
1974-01-01
A substructuring technique, originally developed for the efficient reanalysis of structures, is incorporated into the methodology associated with the plastic analysis of structures. An existing finite-element computer program that accounts for elastic-plastic material behavior under cyclic loading was modified to account for changing kinematic constraint conditions: crack growth and intermittent contact of crack surfaces in two-dimensional regions. Application of the analysis is presented for a center-crack panel problem to demonstrate the efficiency and accuracy of the technique.
2017-12-01
carefully to ensure only the minimum information needed for effective management control is requested. Requires cost-benefit analysis and PM...baseline offers metrics that highlight performance trends and program variances. This information provides Program Managers and higher levels of...The existing training philosophy is effective only if the managers using the information have well-trained and experienced personnel that can
The design of large petal-type paraboloidal solar collectors for the ASTEC Program requires a capability for determining the distortion and stress...analysis of a parabolic curved beam is given along with a numerical solution and digital program. The dynamic response of the ASTEC flight-test vehicle is discussed on the basis of modal analysis.
Reliability analysis of laminated CMC components through shell subelement techniques
NASA Technical Reports Server (NTRS)
Starlinger, A.; Duffy, S. F.; Gyekenyesi, J. P.
1992-01-01
An updated version of the integrated design program C/CARES (composite ceramic analysis and reliability evaluation of structures) was developed for the reliability evaluation of CMC laminated shell components. The algorithm is now split in two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The new interface program from the finite-element code MARC also includes the option of using hybrid laminates and allows for variations in temperature fields throughout the component.
NASA standard: Trend analysis techniques
NASA Technical Reports Server (NTRS)
1988-01-01
This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
NASA Technical Reports Server (NTRS)
Page, Juliet A.; Hodgdon, Kathleen K.; Krecker, Peg; Cowart, Robbie; Hobbs, Chris; Wilmer, Clif; Koening, Carrie; Holmes, Theresa; Gaugler, Trent; Shumway, Durland L.;
2014-01-01
The Waveforms and Sonic boom Perception and Response (WSPR) Program was designed to test and demonstrate the applicability and effectiveness of techniques to gather data relating human subjective response to multiple low-amplitude sonic booms. It was in essence a practice session for future wider scale testing on naive communities, using a purpose built low-boom demonstrator aircraft. The low-boom community response pilot experiment was conducted in California in November 2011. The WSPR team acquired sufficient data to assess and evaluate the effectiveness of the various physical and psychological data gathering techniques and analysis methods.
Munson, Mark; Lieberman, Harvey; Tserlin, Elina; Rocnik, Jennifer; Ge, Jie; Fitzgerald, Maria; Patel, Vinod; Garcia-Echeverria, Carlos
2015-08-01
Herein, we report a novel and general method, lead optimization attrition analysis (LOAA), to benchmark two distinct small-molecule lead series using a relatively unbiased, simple technique and commercially available software. We illustrate this approach with data collected during lead optimization of two independent oncology programs as a case study. Easily generated graphics and attrition curves enabled us to calibrate progress and support go/no-go decisions on each program. We believe that this data-driven technique could be used broadly by medicinal chemists and management to guide strategic decisions during drug discovery.
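An attrition curve of the kind described reduces to survival fractions per optimization gate. The compound counts below are invented, and `attrition_curve` is our name for the calculation, not part of the published LOAA method:

```python
def attrition_curve(stage_counts):
    """Fraction of compounds surviving each successive gate,
    relative to the starting pool."""
    start = stage_counts[0]
    return [n / start for n in stage_counts]

# Hypothetical counts of compounds entering each lead-optimization gate.
series_a = [120, 60, 24, 6]
print(attrition_curve(series_a))  # [1.0, 0.5, 0.2, 0.05]
```

Comparing two such curves side by side is what lets a team calibrate progress between lead series.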
Development of solution techniques for nonlinear structural analysis
NASA Technical Reports Server (NTRS)
Vos, R. G.; Andrews, J. S.
1974-01-01
Nonlinear structural solution methods in the current research literature are classified according to the order of the solution scheme, and it is shown that the analytical tools for these methods are uniformly derivable by perturbation techniques. A new perturbation formulation is developed for treating an arbitrary nonlinear material, in terms of a finite-difference generated stress-strain expansion. Nonlinear geometric effects are included in an explicit manner by appropriate definition of an applicable strain tensor. A new finite-element pilot computer program PANES (Program for Analysis of Nonlinear Equilibrium and Stability) is presented for treatment of problems involving material and geometric nonlinearities, as well as certain forms of nonconservative loading.
Issues in NASA program and project management
NASA Technical Reports Server (NTRS)
Hoban, Francis T. (Editor)
1989-01-01
This new collection of papers on aerospace management issues contains a history of NASA program and project management, some lessons learned in the areas of management and budget from the Space Shuttle Program, an analysis of tools needed to keep large multilayer programs organized and on track, and an update of resources for NASA managers. A wide variety of opinions and techniques are presented.
Discrepancy Analysis and Continuity Matrix: Tools for Measuring the Impact of Inservice Training.
ERIC Educational Resources Information Center
Kite, R. Hayman
Within an inservice training program there is a functional interdependent relationship among problems, causes, and solutions. During a sequence of eight steps to ascertain program impact, a "continuity matrix", a management technique that assists in dealing with the problem/solution paradox is created. A successful training program must: (1) aim…
Preliminary Analysis of LORAN-C System Reliability for Civil Aviation.
1981-09-01
overview of the analysis technique. Section 3 describes the computerized LORAN-C coverage model which is used extensively in the reliability analysis...Xth Plenary Assembly, Geneva, 1963, published by International Telecommunications Union. Braff, R., Computer program to calculate a Markov Chain Reliability Model, unpublished work, MITRE Corporation...Preliminary Analysis of LORAN-C System Reliability, Program Engineering & Maintenance Service, Washington, D.C.
STS-1 environmental control and life support system. Consumables and thermal analysis
NASA Technical Reports Server (NTRS)
Steines, G.
1980-01-01
The Environmental Control and Life Support Systems (ECLSS)/thermal systems analysis for the Space Transportation System 1 Flight (STS-1) was performed using the shuttle environmental consumables usage requirements evaluation (SECURE) computer program. This program employs a nodal technique utilizing the Fortran Environmental Analysis Routines (FEAR). The output parameters evaluated were consumable quantities, fluid temperatures, heat transfer and rejection, and cabin atmospheric pressure. Analysis of these indicated that adequate margins exist for the nonpropulsive consumables and related thermal environment.
United States Air Force Summer Faculty Research Program (1986). Program Management Report
1986-12-01
become better acquainted with experimental techniques. Obtained new insights into aerodynamic research programs of interest to the Air Force. Broadened his...Provided in-depth analysis and new insights into aerodynamic data. He looked at some new radiations that we are considering for use with printed circuit...1979-1983 period through an AFOSR Minigrant Program. On 1 September 1983, AFOSR replaced the Minigrant Program with a new Research Initiation Program
Computer Code for Transportation Network Design and Analysis
DOT National Transportation Integrated Search
1977-01-01
This document describes the results of research into the application of the mathematical programming technique of decomposition to practical transportation network problems. A computer code called Catnap (for Control Analysis Transportation Network A...
Selected Logistics Models and Techniques.
1984-09-01
TI-59 Programmable Calculator LCC...Program 27; TI-59 Programmable Calculator LCC Model 30; Unmanned Spacecraft Cost Model 31...TABLE OF CONTENTS (CONT'D) (Subject Index): LOGISTICS...LOGISTICS ANALYSIS MODEL/TECHNIQUE DATA. MODEL/TECHNIQUE NAME: TI-59 Programmable Calculator LCC Model. TYPE MODEL: Cost Estimating. DEVELOPED BY:
Basic research for the geodynamics program
NASA Technical Reports Server (NTRS)
Mueller, I. I.
1985-01-01
The current technical objectives for the geodynamics program consist of (1) optimal utilization of laser and Very Long Baseline Interferometry (VLBI) observations for reference frames for geodynamics; (2) utilization of range difference observations in geodynamics; and (3) estimation techniques in crustal deformation analysis.
2013-06-01
In this research, we examine the Naval Sea Logistics Command's Continuous Integrated Logistics Support Targeted Allowancing Technique (CILS TAT) and the feasibility of program re-implementation. We conduct an analysis of this allowancing method's effectiveness onboard U.S. Navy Ballistic Missile Defense (BMD) ships, measure the costs associated with performing a CILS TAT, and provide recommendations concerning possible improvements to the
Naval War College Review. Volume 67, Number 1, Winter 2014
2014-01-01
“squishier” terms, phrases, and concepts yielded by qualitative techniques, such as grounded theory, ethnography, case studies, or content analysis...variety of established, qualitative techniques (i.e., grounded theory, content analysis, and survey research) to “triangulate” the game’s findings...maritime research, regional studies, distance education, war gaming, and education/programs at the operational level of war. Each of these intricate
NASA Technical Reports Server (NTRS)
Perangelo, H. J.; Milordi, F. W.
1976-01-01
Analysis techniques used in the automated telemetry station (ATS) for on-line data reduction are encompassed in a broad range of software programs. The concepts that form the basis for the algorithms used are mathematically described. The control the user has in interfacing with the various on-line programs is discussed. The various programs are applied to an analysis of flight data which includes unimodal and bimodal response signals excited via a swept-frequency shaker and/or random aerodynamic forces. A nonlinear response error-modeling analysis approach is described. Preliminary results from the analysis of a hard-spring nonlinear resonant system are also included.
Light aircraft crash safety program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.
1974-01-01
NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.
An analysis of thermal response factors and how to reduce their computational time requirement
NASA Technical Reports Server (NTRS)
Wiese, M. R.
1982-01-01
The RESFAC2 version of the Thermal Response Factor Program (RESFAC) is the result of numerous modifications and additions to the original RESFAC. These modifications and additions have significantly reduced the program's computational time requirement. As a result of this work, the program is more efficient and its code is both readable and understandable. This report describes what a thermal response factor is; analyzes the original matrix algebra calculations and root-finding techniques; presents a new root-finding technique and streamlined matrix algebra; and supplies ten validation cases and their results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koch, C.D.; Pirkle, F.L.; Schmidt, J.S.
1981-01-01
A Principal Components Analysis (PCA) program has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black and white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into geological processes underlying the data may be obtained.
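The core PCA step, extracting a dominant linear combination from correlated variates, can be sketched with power iteration on the covariance matrix. The two-channel radiometric counts are hypothetical, and this pure-Python sketch is an illustration of the technique, not the NURE program:

```python
def dominant_pc(data, iters=200):
    """First principal component via power iteration on the sample
    covariance matrix of row-per-observation data."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in data]  # center
    C = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
          for b in range(d)] for a in range(d)]                  # covariance
    v = [1.0] * d
    for _ in range(iters):                                       # power iteration
        w = [sum(C[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Hypothetical two-channel radiometric counts, strongly correlated 1:2.
data = [[1.0, 2.1], [2.0, 3.9], [3.0, 6.2], [4.0, 8.1], [5.0, 9.8]]
v = dominant_pc(data)
print(round(abs(v[1] / v[0]), 1))  # component ratio near 2.0
```

Subsequent components would be found the same way after deflating the covariance matrix; the full program then maps observations onto these combinations for histograms and outlier maps.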
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, E.; Engebrecht-Metzger, C.; Horowitz, S.
As Building America (BA) has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, E.; Engebrecht, C. Metzger; Horowitz, S.
As Building America has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.
Recent developments with the ORSER system
NASA Technical Reports Server (NTRS)
Baumer, G. M.; Turner, B. J.; Myers, W. L.
1981-01-01
Additions to the ORSER remote sensing data processing package are described. The ORSER package consists of about 35 individual programs that are grouped into preprocessing, data analysis, and display subsystems. Additional data format, data management, data transformation, and geometric correction programs were added to the preprocessing subsystem. Enhancements to the data analysis techniques include a maximum likelihood classifier (MAXCLASS) and a new version of the STATS program which makes delineation of training areas easier and allows for detection of outlier points. Ongoing developments are also described.
A Proposed Study Program for the Enhancement of Performance of Clocks in the DCS Timing System.
1982-08-31
INTRODUCTION This technical note presents a proposed program of test and analysis with a goal of using prediction techniques to enhance the...future digital DCS and methods of satisfying them so that the reader will understand: (1) what is needed from this program to enhance the performance...over a number of years, using both simulation and analysis, have recommended that all major nodes of the DCS be referenced to Coordinated Universal Time
Acoustic Source Bearing Estimation (ASBE) computer program development
NASA Technical Reports Server (NTRS)
Wiese, Michael R.
1987-01-01
A new bearing estimation algorithm (Acoustic Source Analysis Technique - ASAT) and an acoustic analysis computer program (Acoustic Source Bearing Estimation - ASBE) are described, which were developed by Computer Sciences Corporation for NASA Langley Research Center. The ASBE program is used by the Acoustics Division/Applied Acoustics Branch and the Instrument Research Division/Electro-Mechanical Instrumentation Branch to analyze acoustic data and estimate the azimuths from which the source signals radiated. Included are the input and output from a benchmark test case.
Develop Advanced Nonlinear Signal Analysis Topographical Mapping System
NASA Technical Reports Server (NTRS)
Jong, Jen-Yi
1997-01-01
During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis was developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal that is often unidentifiable using conventional signal analysis methods. Currently, due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis is performed manually, requiring immense man-hours and extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced Nonlinear Signal Analysis Topographical Mapping System (ATMS) to provide automatic, unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS has been successfully developed under this contract. In summary, the program objectives to design, develop, test, and conduct performance evaluation of an automated engine diagnostic system have been achieved. Software implementation of the entire ATMS on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program lies in its fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results demonstrate that ATMS can significantly reduce the time and man-hours required to analyze engine test and flight data and to evaluate large volumes of dynamic test data.
Techniques for hot structures testing
NASA Technical Reports Server (NTRS)
Deangelis, V. Michael; Fields, Roger A.
1990-01-01
Hot structures testing has been conducted since the early 1960s, beginning with the Mach 6 X-15 airplane. Early hot structures test programs at NASA-Ames-Dryden focused on the operational testing required to support the X-15 flight test program, and early hot structures research projects focused on developing laboratory test techniques to simulate flight thermal profiles. More recent efforts involved numerous large and small hot structures test programs that served to develop test methods and measurement techniques and to provide data for correlating test results with predictions from analytical codes. In November 1988, a workshop was sponsored that focused on the correlation of hot structures test data with analysis. This paper draws limited material from the workshop and provides more formal documentation of topics related to hot structures test techniques used at NASA-Ames-Dryden. Topics covered include data acquisition and test control, quartz lamp heater systems, current strain and temperature sensors, and hot structures test techniques used to simulate the flight thermal environment in the laboratory.
Design automation techniques for custom LSI arrays
NASA Technical Reports Server (NTRS)
Feller, A.
1975-01-01
The standard cell design automation technique is described as an approach for generating random-logic PMOS, CMOS, or CMOS/SOS custom large-scale integration arrays with low initial nonrecurring costs and a quick turnaround time (design cycle). The system is composed of predesigned circuit functions, or cells, and computer programs capable of automatic placement and interconnection of the cells in accordance with an input net list. The program generates a set of instructions to drive an automatic precision artwork generator. A series of supporting design automation and simulation programs is described, including programs for verifying the correctness of the logic on the arrays, performing DC and dynamic analysis of MOS devices, and generating test sequences.
A survey of application: genomics and genetic programming, a new frontier.
Khan, Mohammad Wahab; Alam, Mansaf
2012-08-01
The aim of this paper is to provide an introduction to the rapidly developing field of genetic programming (GP). Particular emphasis is placed on the application of GP to genomics. First, the basic methodology of GP is introduced. This is followed by a review of applications in the areas of gene network inference, gene expression data analysis, SNP analysis, epistasis analysis, and gene annotation. Finally, the paper concludes by suggesting potential avenues of future research on genetic programming, opportunities to extend the technique, and areas for possible practical applications. Copyright © 2012 Elsevier Inc. All rights reserved.
Multidisciplinary design optimization using multiobjective formulation techniques
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Pagaldipti, Narayanan S.
1995-01-01
This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. The accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high-speed aircraft. Results obtained show significant improvements in the aircraft's aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
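The cross-check described above, validating analytic sensitivities against finite differences, can be sketched for a simple hypothetical objective (the function and step size here are illustrative, not from the report):

```python
import math

def f(x):
    # Hypothetical objective function standing in for an aerodynamic response
    return x**2 * math.sin(x)

def analytic_sensitivity(x):
    # d/dx [x^2 sin x] = 2x sin x + x^2 cos x
    return 2 * x * math.sin(x) + x**2 * math.cos(x)

def finite_difference(func, x, h=1e-6):
    # Central-difference approximation, O(h^2) truncation error
    return (func(x + h) - func(x - h)) / (2 * h)

x0 = 1.3
exact = analytic_sensitivity(x0)
approx = finite_difference(f, x0)
print(abs(exact - approx) < 1e-6)  # → True
```

Agreement between the two values establishes confidence in the analytic derivation before it is embedded in the optimization loop.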
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr. (Principal Investigator)
1984-01-01
Several papers addressing image analysis and pattern recognition techniques for satellite imagery are presented. Texture classification, image rectification and registration, spatial parameter estimation, and surface fitting are discussed.
Lean and Efficient Software: Whole-Program Optimization of Executables
2015-09-30
libraries. Many levels of library interfaces—where some libraries are dynamically linked and some are provided in binary form only—significantly limit...software at build time. The opportunity: Our objective in this project is to substantially improve the performance, size, and robustness of binary ...executables by using static and dynamic binary program analysis techniques to perform whole-program optimization directly on compiled programs
Gregory A. Reams; Ronald E. McRoberts; Paul C. van Deusen; [Editors]
2001-01-01
Documents progress in developing techniques in remote sensing, statistics, information management, and analysis required for full implementation of the national Forest Inventory and Analysis program's annual forest inventory system.
Structuring an Internal Evaluation Process.
ERIC Educational Resources Information Center
Gordon, Sheila C.; Heinemann, Harry N.
1980-01-01
The design of an internal program evaluation system requires (1) formulation of program, operational, and institutional objectives; (2) establishment of evaluation criteria; (3) choice of data collection and evaluation techniques; (4) analysis of results; and (5) integration of the system into the mainstream of operations. (SK)
Survey of Airport Access Analysis Techniques - Models, Data and a Research Program
DOT National Transportation Integrated Search
1972-06-01
The report points up the differences and similarities between airport access travel and general urban trip making. Models and surveys developed for, or applicable, to airport access planning are reviewed. A research program is proposed which would ge...
ERIC Educational Resources Information Center
Thornton, Teresa; Leahy, Jessica
2012-01-01
Social network analysis (SNA) is a social science research tool that has not been applied to educational programs. This analysis is critical to documenting the changes in social capital and networks that result from community based K-12 educational collaborations. We review SNA and show an application of this technique in a school-centered,…
Using Block-local Atomicity to Detect Stale-value Concurrency Errors
NASA Technical Reports Server (NTRS)
Artho, Cyrille; Havelund, Klaus; Biere, Armin
2004-01-01
Data races do not cover all kinds of concurrency errors. This paper presents a data-flow-based technique to find stale-value errors, which are not found by low-level and high-level data race algorithms. Stale values denote copies of shared data where the copy is no longer synchronized. The algorithm to detect such values works as a consistency check that does not require any assumptions or annotations of the program. It has been implemented as a static analysis in JNuke. The analysis is sound and requires only a single execution trace if implemented as a run-time checking algorithm. Being based on an analysis of Java bytecode, it encompasses the full program semantics, including arbitrarily complex expressions. Related techniques are more complex and more prone to over-reporting.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gaps found between the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on the applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
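As a rough illustration of the kind of Monte Carlo baseline described above, the following sketch samples hypothetical satisfaction drivers, combines them with assumed weights (not the actual ACSI coefficients), and summarizes the simulated index distribution:

```python
import random
random.seed(42)

# Hypothetical driver weights; the real ACSI model uses different
# drivers and estimated coefficients.
WEIGHTS = {"quality": 0.5, "expectations": 0.3, "value": 0.2}

def simulate_index(n_trials=10_000):
    """Monte Carlo baseline: sample each driver score (0-100 scale) with
    assumed noise, combine with weights, and summarize the distribution."""
    outcomes = []
    for _ in range(n_trials):
        score = sum(w * random.gauss(80, 5) for w in WEIGHTS.values())
        outcomes.append(score)
    outcomes.sort()
    mean = sum(outcomes) / n_trials
    p5 = outcomes[int(0.05 * n_trials)]
    p95 = outcomes[int(0.95 * n_trials)]
    return mean, p5, p95

mean, lo, hi = simulate_index()
print(f"baseline index: {mean:.1f} (90% interval {lo:.1f}-{hi:.1f})")
```

The percentile interval is what a dashboard would plot as the predicted range of future index scores; sensitivity analysis would then vary one driver's mean at a time and re-run the simulation.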
The Statistical Package for the Social Sciences (SPSS) as an adjunct to pharmacokinetic analysis.
Mather, L E; Austin, K L
1983-01-01
Computer techniques for numerical analysis are well known to pharmacokineticists. Powerful techniques for data file management have been developed by social scientists but have, in general, been ignored by pharmacokineticists because of their apparent lack of ability to interface with pharmacokinetic programs. Extensive use has been made of the Statistical Package for the Social Sciences (SPSS) for its data handling capabilities, but at the same time, techniques have been developed within SPSS to interface with pharmacokinetic programs of the users' choice and to carry out a variety of user-defined pharmacokinetic tasks within SPSS commands, apart from the expected variety of statistical tasks. Because it is based on a ubiquitous package, this methodology has all of the benefits of excellent documentation, interchangeability between different types and sizes of machines and true portability of techniques and data files. An example is given of the total management of a pharmacokinetic study previously reported in the literature by the authors.
NASA Technical Reports Server (NTRS)
Wang, T.; Simon, T. W.
1988-01-01
Development of a recent experimental program to investigate the effects of streamwise curvature on boundary layer transition required making a bendable, heated, and instrumented test wall, a rather nonconventional surface. The present paper describes this surface, the design choices made in its development, and how uncertainty analysis was used, beginning early in the test program, to make those design choices. Published uncertainty analysis techniques were found to be of great value, but it became clear that another step, herein called the pre-test analysis, would aid program development. Finally, it is shown how the uncertainty analysis was used to determine whether the test surface was qualified for service.
Visualization of Concurrent Program Executions
NASA Technical Reports Server (NTRS)
Artho, Cyrille; Havelund, Klaus; Honiden, Shinichi
2007-01-01
Various program analysis techniques are efficient at discovering failures and properties. However, it is often difficult to evaluate results, such as program traces. This calls for abstraction and visualization tools. We propose an approach based on UML sequence diagrams, addressing shortcomings of such diagrams for concurrency. The resulting visualization is expressive and provides all the necessary information at a glance.
Program Analyzes Radar Altimeter Data
NASA Technical Reports Server (NTRS)
Vandemark, Doug; Hancock, David; Tran, Ngan
2004-01-01
A computer program has been written to perform several analyses of radar altimeter data. The program was designed to improve on previous methods of analysis of altimeter engineering data by (1) facilitating and accelerating the analysis of large amounts of data in a more direct manner and (2) improving the ability to estimate performance of radar-altimeter instrumentation and provide data corrections. The data in question are openly available to the international scientific community and can be downloaded from anonymous file-transfer-protocol (FTP) locations that are accessible via links from altimetry Web sites. The software estimates noise in range measurements, estimates corrections for electromagnetic bias, and performs statistical analyses on various parameters for comparison of different altimeters. Whereas prior techniques used to perform similar analyses of altimeter range noise require comparison of data from repetitions of satellite ground tracks, the present software uses a high-pass filtering technique to obtain similar results from single satellite passes. Elimination of the requirement for repeat-track analysis facilitates the analysis of large amounts of satellite data to assess subtle variations in range noise.
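The single-pass noise-estimation idea can be illustrated with a toy high-pass filter: subtract a centered moving average from a synthetic range series and take the RMS of the residual. The window length, signal model, and noise level below are illustrative assumptions, not the program's actual parameters:

```python
import math
import random
random.seed(7)

def highpass_residual(series, window=9):
    """Remove the slowly varying signal with a centered moving average,
    leaving the high-frequency residual attributed to range noise."""
    half = window // 2
    residual = []
    for i in range(half, len(series) - half):
        local_mean = sum(series[i - half:i + half + 1]) / window
        residual.append(series[i] - local_mean)
    return residual

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

# Synthetic single-pass range series: smooth geophysical signal plus noise
track = [math.sin(i / 40.0) * 2.0 + random.gauss(0, 0.05) for i in range(500)]
noise_estimate = rms(highpass_residual(track))
print(f"estimated range noise (RMS): {noise_estimate:.3f}")
```

Because the geophysical signal varies slowly along-track, the residual is dominated by instrument noise, which is why a single pass suffices where repeat-track differencing was needed before.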
Pease, J M; Morselli, M F
1987-01-01
This paper deals with a computer program adapted to a statistical method for analyzing an unlimited quantity of binary-recorded data of an independent circular variable (e.g., wind direction) and a linear variable (e.g., maple sap flow volume). Circular variables cannot be statistically analyzed with linear methods unless they have been transformed. The program calculates a critical quantity, the acrophase angle (φ0). The technique is adapted from original mathematics [1] and is written in Fortran 77 for easier conversion between computer systems. Correlation analysis can be performed following the program, or regression, which, because of the circular nature of the independent variable, becomes periodic regression. The technique was tested on a file of approximately 4050 data pairs.
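A minimal sketch of periodic regression on a circular independent variable, fitting y = M + A·cos(θ − φ0) by linearizing with cos θ and sin θ terms (this assumes angles evenly spread over the full circle so the least-squares terms decouple; the data are hypothetical, not from the paper):

```python
import math

def periodic_fit(angles, values):
    """Fit y = M + A*cos(theta - phi0) by linearizing to
    y = M + a*cos(theta) + b*sin(theta); assumes angles evenly
    spread over the full circle so the normal equations decouple."""
    n = len(values)
    M = sum(values) / n
    a = 2.0 / n * sum(y * math.cos(t) for t, y in zip(angles, values))
    b = 2.0 / n * sum(y * math.sin(t) for t, y in zip(angles, values))
    amplitude = math.hypot(a, b)
    phi0 = math.atan2(b, a)  # the acrophase angle
    return M, amplitude, phi0

# Synthetic circular/linear data pairs (hypothetical numbers)
n = 360
angles = [2 * math.pi * i / n for i in range(n)]
values = [10 + 3 * math.cos(t - 1.0) for t in angles]
M, A, phi0 = periodic_fit(angles, values)
print(round(M, 3), round(A, 3), round(phi0, 3))  # → 10.0 3.0 1.0
```

The cos/sin transformation is exactly what makes a circular predictor usable in an otherwise linear regression framework.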
2013-05-30
In this research, we examine the Naval Sea Logistics Command’s Continuous Integrated Logistics Support-Targeted Allowancing Technique (CILS-TAT) and... the feasibility of program re-implementation. We conduct an analysis of this allowancing method’s effectiveness onboard U.S. Navy Ballistic Missile...Defense (BMD) ships, measure the costs associated with performing a CILS-TAT, and provide recommendations concerning possible improvements to the
Mathematical analysis techniques for modeling the space network activities
NASA Technical Reports Server (NTRS)
Foster, Lisa M.
1992-01-01
The objective of the present work was to explore and identify mathematical analysis techniques, and in particular, the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small scale version of the system was modeled, variables were identified, data was gathered, and comparisons were made between actual and theoretical data.
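A toy linear program in the spirit of the scheduling application above: maximize weighted contact time for two users subject to capacity and visibility limits (all numbers hypothetical, not TDRSS data). Since an LP optimum lies at a vertex of the feasible region, a two-variable instance can be solved by enumerating constraint intersections:

```python
from itertools import combinations

# maximize 3*x1 + 2*x2
# subject to x1 + x2 <= 10   (total relay contact hours available)
#            x1      <= 7    (user 1 visibility limit)
#            x2      <= 6    (user 2 visibility limit)
#            x1, x2  >= 0
constraints = [  # (a1, a2, b) meaning a1*x1 + a2*x2 <= b
    (1, 1, 10),
    (1, 0, 7),
    (0, 1, 6),
    (-1, 0, 0),
    (0, -1, 0),
]

def intersect(c1, c2):
    """Solve the 2x2 system where both constraints hold with equality."""
    (a, b, e), (c, d, f) = c1, c2
    det = a * d - b * c
    if abs(det) < 1e-12:
        return None  # parallel constraints
    return ((e * d - b * f) / det, (a * f - e * c) / det)

def feasible(p):
    return all(a * p[0] + b * p[1] <= rhs + 1e-9 for a, b, rhs in constraints)

# An LP optimum lies at a vertex of the feasible polytope, so enumerate
# pairwise constraint intersections and keep the best feasible one.
best = max(
    (p for c1, c2 in combinations(constraints, 2)
     if (p := intersect(c1, c2)) and feasible(p)),
    key=lambda p: 3 * p[0] + 2 * p[1],
)
print(best, 3 * best[0] + 2 * best[1])  # → (7.0, 3.0) 27.0
```

Production-scale network models use simplex or interior-point solvers rather than vertex enumeration, but the formulation — variables, objective, capacity constraints — is the same.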
Calculation of three-dimensional, inviscid, supersonic, steady flows
NASA Technical Reports Server (NTRS)
Moretti, G.
1981-01-01
A detailed description of a computational program for the evaluation of three-dimensional, supersonic, inviscid, steady flow past airplanes is presented. Emphasis is placed on how a powerful, automatic mapping technique is coupled to the fluid mechanical analysis. Each of the three constituents of the analysis (body geometry, mapping technique, and gas dynamical effects) was carefully coded and is described. Results of computations based on sample geometries, with discussion, are also presented.
Home visit program improves technique survival in peritoneal dialysis.
Martino, Francesca; Adıbelli, Z; Mason, G; Nayak, A; Ariyanon, W; Rettore, E; Crepaldi, Carlo; Rodighiero, Mariapia; Ronco, Claudio
2014-01-01
Peritoneal dialysis (PD) is a home therapy, and technique survival is related to the adherence to PD prescription at home. The presence of a home visit program could improve PD outcomes. We evaluated its effects on clinical outcome during 1 year of follow-up. This was a case-control study. The case group included all 96 patients who performed PD in our center on January 1, 2013, and who attended a home visit program; the control group included all 92 patients who performed PD on January 1, 2008. The home visit program consisted of several additional visits to reinforce patients' confidence in PD management in their own environment. Outcomes were defined as technique failure, peritonitis episode, and hospitalization. Clinical and dialysis features were evaluated for each patient. The case group was significantly older (p = 0.048), with a lower grade of autonomy (p = 0.033), but a better hemoglobin level (p = 0.02) than the control group. During the observational period, we had 11 episodes of technique failure. We found a significant reduction in the rate of technique failure in the case group (p = 0.004). Furthermore, survival analysis showed a significant extension of PD treatment in the patients supported by the home visit program (52 vs. 48.8 weeks, p = 0.018). We did not find any difference between the two groups in terms of peritonitis and hospitalization rate; however, trends toward a reduction of Gram-positive peritonitis rates as well as prevalence and duration of hospitalization related to PD problems were identified in the case group. The retrospective nature of the analysis was a limitation of this study. The home visit program improves the survival of PD patients and could reduce the rate of Gram-positive peritonitis and hospitalization. Video Journal Club "Cappuccino with Claudio Ronco" at http://www.karger.com/?doi=365168.
Supporting flight data analysis for Space Shuttle Orbiter Experiments at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Green, M. J.; Budnick, M. P.; Yang, L.; Chiasson, M. P.
1983-01-01
The Space Shuttle Orbiter Experiments program is responsible for collecting flight data to extend the research and technology base for future aerospace vehicle design. The Infrared Imagery of Shuttle (IRIS), Catalytic Surface Effects, and Tile Gap Heating experiments sponsored by Ames Research Center are part of this program. The paper describes the software required to process the flight data which support these experiments. In addition, data analysis techniques, developed in support of the IRIS experiment, are discussed. Using the flight data base, the techniques have provided information useful in analyzing and correcting problems with the experiment, and in interpreting the IRIS image obtained during the entry of the third Shuttle mission.
NASA and CFD - Making investments for the future
NASA Technical Reports Server (NTRS)
Hessenius, Kristin A.; Richardson, P. F.
1992-01-01
From a NASA perspective, CFD is a new tool for fluid flow simulation and prediction with virtually none of the inherent limitations of other ground-based simulation techniques. A primary goal of NASA's CFD research program is to develop efficient and accurate computational techniques for utilization in the design and analysis of aerospace vehicles. The program in algorithm development has systematically progressed through the hierarchy of engineering simplifications of the Navier-Stokes equations, starting with the inviscid formulations such as transonic small disturbance, full potential, and Euler.
Signal Detection Techniques for Diagnostic Monitoring of Space Shuttle Main Engine Turbomachinery
NASA Technical Reports Server (NTRS)
Coffin, Thomas; Jong, Jen-Yi
1986-01-01
An investigation to develop, implement, and evaluate signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery is reviewed. A brief description of the Space Shuttle Main Engine (SSME) test/measurement program is presented. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques have been implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. A unique coherence function (the hyper-coherence) was developed through the course of this investigation, which appears promising as a diagnostic tool. This technique and several other non-linear methods of signal analysis are presented and illustrated by application. Software for application of these techniques has been installed on the signal processing system at the NASA/MSFC Systems Dynamics Laboratory.
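Hyper-coherence itself is a specialized nonlinear extension; the ordinary magnitude-squared coherence it builds on can be sketched as follows, with segment-averaged cross-spectra estimated at a single DFT bin (the signal parameters are illustrative, not SSME data):

```python
import cmath
import math
import random
random.seed(1)

def dft_bin(segment, k):
    """Single DFT coefficient, evaluated naively."""
    n = len(segment)
    return sum(x * cmath.exp(-2j * math.pi * k * i / n)
               for i, x in enumerate(segment))

def coherence(x, y, k, seg_len=64):
    """Magnitude-squared coherence at DFT bin k, averaged over segments:
    |<X Y*>|^2 / (<|X|^2> <|Y|^2>). Values near 1 indicate a linear
    relationship between the signals at that frequency."""
    sxy = 0j
    sxx = syy = 0.0
    for start in range(0, len(x) - seg_len + 1, seg_len):
        X = dft_bin(x[start:start + seg_len], k)
        Y = dft_bin(y[start:start + seg_len], k)
        sxy += X * Y.conjugate()
        sxx += abs(X) ** 2
        syy += abs(Y) ** 2
    return abs(sxy) ** 2 / (sxx * syy)

# Two signals sharing a tone at bin 4 of each 64-sample segment, plus noise
n = 64 * 32
tone = [math.sin(2 * math.pi * 4 * i / 64) for i in range(n)]
x = [t + random.gauss(0, 0.1) for t in tone]
y = [0.5 * t + random.gauss(0, 0.1) for t in tone]
print(coherence(x, y, 4) > 0.9)   # strong coherence at the shared tone
print(coherence(x, y, 13) < 0.5)  # little coherence where only noise exists
```

Hyper-coherence extends this idea by relating harmonics of a fundamental to the fundamental itself, which is what makes it useful for detecting nonlinear mechanical signatures.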
Arthropod surveillance programs: Basic components, strategies, and analysis
USDA-ARS?s Scientific Manuscript database
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium “Advancements in arthro...
Educational Geographers and Applied Geography.
ERIC Educational Resources Information Center
Frazier, John W.
1979-01-01
Describes the development of applied geography programs and restructuring of curricula with an emphasis on new technique and methodology courses, though retaining the liberal arts role. Educational geographers can help the programs to succeed through curriculum analysis, auditing, advising students, and liaison with other geography sources. (CK)
Using Job Analysis Techniques to Understand Training Needs for Promotores de Salud.
Ospina, Javier H; Langford, Toshiko A; Henry, Kimberly L; Nelson, Tristan Q
2018-04-01
Despite the value of community health worker programs, such as Promotores de Salud, for addressing health disparities in the Latino community, little consensus has been reached to formally define the unique roles and duties associated with the job, thereby creating unique job training challenges. Understanding the job tasks and worker attributes central to this work is a critical first step for developing the training and evaluation systems of promotores programs. Here, we present the process and findings of a job analysis conducted for promotores working for Planned Parenthood. We employed a systematic approach, the combination job analysis method, to define the job in terms of its work and worker requirements, identifying key job tasks, as well as the worker attributes necessary to effectively perform them. Our results suggest that the promotores' job encompasses a broad range of activities and requires an equally broad range of personal characteristics to perform. These results played an important role in the development of our training and evaluation protocols. In this article, we introduce the technique of job analysis, provide an overview of the results from our own application of this technique, and discuss how these findings can be used to inform a training and performance evaluation system. This article provides a template for other organizations implementing similar community health worker programs and illustrates the value of conducting a job analysis for clarifying job roles, developing and evaluating job training materials, and selecting qualified job candidates.
40 CFR 85.2120 - Maintenance and submittal of records.
Code of Federal Regulations, 2010 CFR
2010-07-01
... testing program, including all production part sampling techniques used to verify compliance of the... subsequent analyses of that data; (7) A description of all the methodology, analysis, testing and/or sampling techniques used to ascertain the emission critical parameter specifications of the original equipment part...
Validation of helicopter noise prediction techniques
NASA Technical Reports Server (NTRS)
Succi, G. P.
1981-01-01
The current techniques of helicopter rotor noise prediction attempt to describe the details of the noise field precisely and remove the empiricisms and restrictions inherent in previous methods. These techniques require detailed inputs of the rotor geometry, operating conditions, and blade surface pressure distribution. The purpose of this paper is to review those techniques in general and the Farassat/Nystrom analysis in particular. The predictions of the Farassat/Nystrom noise computer program, using both measured and calculated blade surface pressure data, are compared to measured noise level data. This study is based on a contract from NASA to Bolt Beranek and Newman Inc. with measured data from the AH-1G Helicopter Operational Loads Survey flight test program supplied by Bell Helicopter Textron.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1990-01-01
The present conference on flight testing encompasses avionics, flight-testing programs, technologies for flight-test predictions and measurements, testing tools, analysis methods, targeting techniques, and flightline testing. Specific issues addressed include flight testing of a digital terrain-following system, a digital Doppler rate-of-descent indicator, a high-technology testbed, a low-altitude air-refueling flight-test program, techniques for in-flight frequency-response testing for helicopters, limit-cycle oscillation and flight-flutter testing, and the research flight test of a scaled unmanned air vehicle. Also addressed are AV-8B V/STOL performance analysis, incorporating pilot-response time in failure-case testing, the development of pitot-static flightline testing, targeting techniques for ground-based hover testing, a low-profile microsensor for aerodynamic pressure measurement, and the use of a variable-capacitance accelerometer for flight-test measurements.
NASA Technical Reports Server (NTRS)
Dyer, M. K.; Little, D. G.; Hoard, E. G.; Taylor, A. C.; Campbell, R.
1972-01-01
An approach that might be used for determining the applicability of NASA management techniques to benefit almost any type of down-to-earth enterprise is presented. A study was made to determine the following: (1) the practicality of adopting NASA contractual quality management techniques to the U.S. Geological Survey Outer Continental Shelf lease management function; (2) the applicability of failure mode effects analysis to the drilling, production, and delivery systems in use offshore; (3) the impact on industrial offshore operations and onshore management operations required to apply recommended NASA techniques; and (4) the probable changes required in laws or regulations in order to implement recommendations. Several management activities that have been applied to space programs are identified, and their institution for improved management of offshore and onshore oil and gas operations is recommended.
Sun Series program for the REEDA System [predicting orbital lifetime using sunspot values]
NASA Technical Reports Server (NTRS)
Shankle, R. W.
1980-01-01
Modifications made to data bases and to four programs in a series of computer programs (Sun Series), which run on the REEDA HP minicomputer system to aid NASA's solar activity predictions used in orbital lifetime prediction, are described. These programs utilize various mathematical smoothing techniques and perform statistical and graphical analysis of various solar activity data bases residing on the REEDA System.
NASA Technical Reports Server (NTRS)
Kenigsberg, I. J.; Dean, M. W.; Malatino, R.
1974-01-01
The correlation achieved with each program provides the material for a discussion of modeling techniques developed for general application to finite-element dynamic analyses of helicopter airframes. Included are the selection of static and dynamic degrees of freedom, cockpit structural modeling, and the extent of flexible-frame modeling in the transmission support region and in the vicinity of large cut-outs. The sensitivity of predicted results to these modeling assumptions is discussed. Both the Sikorsky Finite-Element Airframe Vibration Analysis Program (FRAN/Vibration Analysis) and the NASA Structural Analysis Program (NASTRAN) have been correlated with data taken in full-scale vibration tests of a modified CH-53A helicopter.
Intrasystem Analysis Program (IAP) code summaries
NASA Astrophysics Data System (ADS)
Dobmeier, J. J.; Drozd, A. L. S.; Surace, J. A.
1983-05-01
This report contains detailed descriptions and capabilities of the codes that comprise the Intrasystem Analysis Program. The four codes are: Intrasystem Electromagnetic Compatibility Analysis Program (IEMCAP), General Electromagnetic Model for the Analysis of Complex Systems (GEMACS), Nonlinear Circuit Analysis Program (NCAP), and Wire Coupling Prediction Models (WIRE). IEMCAP is used for computer-aided evaluation of electromagnetic compatibility (EMC) at all stages of an Air Force system's life cycle, applicable to aircraft, space/missile, and ground-based systems. GEMACS utilizes a Method of Moments (MOM) formalism with the Electric Field Integral Equation (EFIE) for the solution of electromagnetic radiation and scattering problems. The code employs both full matrix decomposition and Banded Matrix Iteration solution techniques and is expressly designed for large problems. NCAP is a circuit analysis code which uses the Volterra approach to solve for the transfer functions and node voltages of weakly nonlinear circuits. The WIRE programs apply multiconductor transmission line theory to the prediction of cable coupling for specific classes of problems.
Computer program for analysis of imperfection sensitivity of ring stiffened shells of revolution
NASA Technical Reports Server (NTRS)
Cohen, G. A.
1971-01-01
A FORTRAN 4 digital computer program is presented for the initial postbuckling and imperfection sensitivity analysis of bifurcation buckling modes for ring-stiffened orthotropic multilayered shells of revolution. The boundary value problem for the second-order contribution to the buckled state was solved by the forward integration technique using the Runge-Kutta method. The effects of nonlinear prebuckling states and live pressure loadings are included.
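The forward-integration approach the abstract describes can be sketched with a classical fourth-order Runge-Kutta stepper. The equation below (y'' = -y, so y = sin t) is a toy stand-in for the program's shell equations, chosen only so the exact answer is known; the function names and step count are illustrative, not from the original FORTRAN code.

```python
import math

def rk4_step(f, t, y, h):
    """One classical Runge-Kutta step for a first-order system y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k1)])
    k3 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h*ki for yi, ki in zip(y, k3)])
    return [yi + h/6*(a + 2*b + 2*c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def integrate(f, t0, y0, t1, n):
    """Forward-integrate from t0 to t1 in n RK4 steps."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y = rk4_step(f, t, y, h)
        t += h
    return y

# Toy stand-in for the second-order buckling contribution: y'' = -y,
# written as the first-order system [y, y']' = [y', -y].
f = lambda t, y: [y[1], -y[0]]
y_end = integrate(f, 0.0, [0.0, 1.0], math.pi/2, 200)  # exact: [1, 0]
```

A boundary value problem would wrap this forward integration in a shooting loop, adjusting the unknown initial slope until the far-end condition is met.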
W. Keith Moser; Greg Liknes; Mark Hansen; Kevin Nimerfro
2005-01-01
The Forest Inventory and Analysis program at the North Central Research Station focuses on understanding the forested ecosystems in the North Central and Northern Great Plains States through analyzing the results of annual inventories. The program also researches techniques for data collection and analysis. The FIA process measures the above-ground vegetation and the...
NASA Technical Reports Server (NTRS)
Landmann, A. E.; Tillema, H. F.; Marshall, S. E.
1989-01-01
The application of selected analysis techniques to low frequency cabin noise associated with advanced propeller engine installations is evaluated. Three design analysis techniques were chosen for evaluation: finite element analysis, statistical energy analysis (SEA), and a power flow method using elements of SEA (computer program Propeller Aircraft Interior Noise). An overview of the three procedures is provided. Data from tests of a 727 airplane (modified to accept a propeller engine) were used for comparison with predictions. Comparisons of predicted and measured levels at the end of the first year's effort showed reasonable agreement, leading to the conclusion that each technique had value for propeller engine noise predictions on large commercial transports. However, variations in agreement were large enough to warrant caution and led to recommendations for further work with each technique. Assessment of the second year's results leads to the conclusion that the selected techniques can accurately predict trends and can be useful to a designer, but that absolute level predictions remain unreliable due to the complexity of the aircraft structure and low modal densities.
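The SEA power-balance idea behind two of the evaluated techniques can be illustrated in its simplest form: two coupled subsystems whose steady-state energies satisfy a 2x2 linear system. The loss factors and input powers below are invented values for illustration, not parameters from the 727 study.

```python
def sea_two_subsystems(omega, eta1, eta2, eta12, eta21, P1, P2):
    """Steady-state SEA power balance for two coupled subsystems:
        P1 = omega*((eta1 + eta12)*E1 - eta21*E2)
        P2 = omega*((eta2 + eta21)*E2 - eta12*E1)
    Solved directly (Cramer's rule) for the subsystem energies E1, E2."""
    a11 = omega * (eta1 + eta12)
    a12 = -omega * eta21
    a21 = -omega * eta12
    a22 = omega * (eta2 + eta21)
    det = a11 * a22 - a12 * a21
    E1 = (P1 * a22 - P2 * a12) / det
    E2 = (a11 * P2 - a21 * P1) / det
    return E1, E2

# Hypothetical loss factors; only subsystem 1 (e.g. the driven panel) has input power.
E1, E2 = sea_two_subsystems(omega=1000.0, eta1=0.01, eta2=0.02,
                            eta12=0.005, eta21=0.002, P1=1.0, P2=0.0)
```

Power-flow methods extend this balance to networks of many subsystems; the energy ratio E2/E1 is what a designer reads off as transmitted vibration level.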
Overview: Applications of numerical optimization methods to helicopter design problems
NASA Technical Reports Server (NTRS)
Miura, H.
1984-01-01
There are a number of helicopter design problems that are well suited to applications of numerical design optimization techniques. Adequate implementation of this technology will provide high pay-offs. There are a number of numerical optimization programs available, and there are many excellent response/performance analysis programs developed or being developed. But integration of these programs in a form that is usable in the design phase should be recognized as important. It is also necessary to attract the attention of engineers engaged in the development of analysis capabilities and to make them aware that analysis capabilities are much more powerful if integrated into design-oriented codes. Frequently, the shortcomings of analysis capabilities are revealed by coupling them with an optimization code. Most of the published work has addressed problems in preliminary system design, rotor system/blade design or airframe design. Very few published results were found in acoustics, aerodynamics and control system design. Currently major efforts are focused on vibration reduction, and aerodynamics/acoustics applications appear to be growing fast. The development of a computer program system to integrate the multiple disciplines required in helicopter design with numerical optimization techniques is needed. Activities in Britain, Germany and Poland are identified, but no published results from France, Italy, the USSR or Japan were found.
Multidisciplinary Techniques and Novel Aircraft Control Systems
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Rogers, James L.; Raney, David L.
2000-01-01
The Aircraft Morphing Program at NASA Langley Research Center explores opportunities to improve airframe designs with smart technologies. Two elements of this basic research program are multidisciplinary design optimization (MDO) and advanced flow control. This paper describes examples where MDO techniques such as sensitivity analysis, automatic differentiation, and genetic algorithms contribute to the design of novel control systems. In the test case, the design and use of distributed shape-change devices to provide low-rate maneuvering capability for a tailless aircraft is considered. The ability of MDO to add value to control system development is illustrated using results from several years of research funded by the Aircraft Morphing Program.
NASA Technical Reports Server (NTRS)
Hoffman, Edward J. (Editor); Lawbaugh, William M. (Editor)
1996-01-01
Papers address the following topics: NASA's project management development process; Better decisions through structural analysis; NASA's commercial technology management system; Today's management techniques and tools; Program control in NASA - needs and opportunities; and Resources for NASA managers.
Using NASA and the Space Program to Help High School and College Students Learn Chemistry.
ERIC Educational Resources Information Center
Kelter, Paul B.; And Others
1987-01-01
Discusses the current state of space-related research and manufacturing techniques. Focuses on the areas of spectroscopy, materials processing, electrochemistry, and analysis. Provides examples and classroom application for using these aspects of the space program to teach chemistry. (TW)
Health Lifestyles: Audience Segmentation Analysis for Public Health Interventions.
ERIC Educational Resources Information Center
Slater, Michael D.; Flora, June A.
This paper is concerned with the application of market research techniques to segment large populations into homogeneous units in order to improve the reach, utilization, and effectiveness of health programs. The paper identifies seven distinctive patterns of health attitudes, social influences, and behaviors using cluster analytic techniques in a…
Analytical aids in land management planning
David R. Betters
1978-01-01
Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...
Forest inventory and stratified estimation: a cautionary note
John Coulston
2008-01-01
The Forest Inventory and Analysis (FIA) Program uses stratified estimation techniques to produce estimates of forest attributes. Stratification must be unbiased and stratification procedures should be examined to identify any potential bias. This note explains simple techniques for identifying potential bias, discriminating between sample bias and stratification bias,...
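The stratified estimator the note examines can be sketched on a synthetic two-stratum "forest": the estimate is the area-weighted sum of within-stratum sample means. All numbers, names, and the post-stratification setup below are illustrative, not FIA procedures.

```python
import random

random.seed(42)

# Synthetic per-plot volumes: stratum 1 = forested plots, stratum 2 = non-forest.
population = [random.gauss(120, 15) for _ in range(8000)]
population += [random.gauss(10, 4) for _ in range(2000)]
strata = [1] * 8000 + [2] * 2000

def stratified_mean(sample, sample_strata, weights):
    """Stratified estimate: sum of stratum area weights times
    within-stratum sample means."""
    est = 0.0
    for h, w in weights.items():
        vals = [y for y, s in zip(sample, sample_strata) if s == h]
        est += w * (sum(vals) / len(vals))
    return est

# Simple random sample of plots, post-stratified with the known area
# weights (0.8 forested, 0.2 non-forest). Biased weights would bias
# the estimate -- the kind of problem the note's diagnostics target.
idx = random.sample(range(10000), 500)
sample = [population[i] for i in idx]
sample_strata = [strata[i] for i in idx]

true_mean = sum(population) / len(population)
est = stratified_mean(sample, sample_strata, {1: 0.8, 2: 0.2})
```

Repeating the sampling step many times and comparing the average estimate to the population mean is one simple way to check the estimator for the bias the note warns about.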
ERIC Educational Resources Information Center
Shin, YoungJu; Miller-Day, Michelle; Pettigrew, Jonathan; Hecht, Michael L.; Krieger, Janice L.
2014-01-01
Enhancing the delivery quality of school-based, evidence-based prevention programs is one key to ensuring uniform program effects on student outcomes. Program evaluations often focus on content dosage when implementing prevention curricula, however, less is known about implementation quality of prevention content, especially among teachers who may…
Modeling and Analysis of Power Processing Systems (MAPPS), initial phase 2
NASA Technical Reports Server (NTRS)
Yu, Y.; Lee, F. C.; Wangenheim, H.; Warren, D.
1977-01-01
The overall objective of the program is to provide the engineering tools to reduce the analysis, design, and development effort, and thus the cost, in achieving the required performances for switching regulators and dc-dc converter systems. The program was both tutorial and application-oriented. Various analytical methods were described in detail and supplemented with examples, and those with standardization appeal were reduced into computer-based subprograms. Major program efforts included those concerning small and large signal control-dependent performance analysis and simulation, control circuit design, power circuit design and optimization, system configuration study, and system performance simulation. Techniques including discrete time domain, conventional frequency domain, Lagrange multiplier, nonlinear programming, and control design synthesis were employed in these efforts. To enhance interactive conversation between the modeling and analysis subprograms and the user, a working prototype of the Data Management Program was also developed to facilitate expansion as future subprogram capabilities increase.
SEP thrust subsystem performance sensitivity analysis
NASA Technical Reports Server (NTRS)
Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.
1973-01-01
This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.
Beyond Constant Comparison Qualitative Data Analysis: Using NVivo
ERIC Educational Resources Information Center
Leech, Nancy L.; Onwuegbuzie, Anthony J.
2011-01-01
The purposes of this paper are to outline seven types of qualitative data analysis techniques, to present step-by-step guidance for conducting these analyses via a computer-assisted qualitative data analysis software program (i.e., NVivo9), and to present screenshots of the data analysis process. Specifically, the following seven analyses are…
Shock and vibration technology with applications to electrical systems
NASA Technical Reports Server (NTRS)
Eshleman, R. L.
1972-01-01
A survey is presented of shock and vibration technology for electrical systems developed by the aerospace programs. The shock environment is surveyed along with new techniques for modeling, computer simulation, damping, and response analysis. Design techniques based on the use of analog computers, shock spectra, optimization, and nonlinear isolation are discussed. Shock mounting of rotors for performance and survival, and vibration isolation techniques are reviewed.
The integrated manual and automatic control of complex flight systems
NASA Technical Reports Server (NTRS)
Schmidt, D. K.
1986-01-01
The topics of research in this program include pilot/vehicle analysis techniques, identification of pilot dynamics, and control and display synthesis techniques for optimizing aircraft handling qualities. The project activities are discussed. The current technical activity is directed at extending and validating the active display synthesis procedure, and the pilot/vehicle analysis of the NLR rate-command flight configurations in the landing task. Two papers published by the researchers are attached as appendices.
Managing complex processing of medical image sequences by program supervision techniques
NASA Astrophysics Data System (ADS)
Crubezy, Monica; Aubry, Florent; Moisan, Sabine; Chameroy, Virginie; Thonnat, Monique; Di Paola, Robert
1997-05-01
Our objective is to offer clinicians wider access to evolving medical image processing (MIP) techniques, crucial to improve assessment and quantification of physiological processes, but difficult to handle for non-specialists in MIP. Based on artificial intelligence techniques, our approach consists in the development of a knowledge-based program supervision system, automating the management of MIP libraries. It comprises a library of programs, a knowledge base capturing the expertise about programs and data, and a supervision engine. It selects, organizes and executes the appropriate MIP programs given a goal to achieve and a data set, with dynamic feedback based on the results obtained. It also advises users in the development of new procedures chaining MIP programs. We have experimented with the approach for an application of factor analysis of medical image sequences as a means of predicting the response of osteosarcoma to chemotherapy, with both MRI and NM dynamic image sequences. As a result, our program supervision system frees clinical end-users from performing tasks outside their competence, permitting them to concentrate on clinical issues. Therefore our approach enables a better exploitation of the possibilities offered by MIP and higher quality results, both in terms of robustness and reliability.
Image analysis library software development
NASA Technical Reports Server (NTRS)
Guseman, L. F., Jr.; Bryant, J.
1977-01-01
The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.
Information transfer satellite concept study. Volume 4: computer manual
NASA Technical Reports Server (NTRS)
Bergin, P.; Kincade, C.; Kurpiewski, D.; Leinhaupel, F.; Millican, F.; Onstad, R.
1971-01-01
The Satellite Telecommunications Analysis and Modeling Program (STAMP) provides the user with a flexible and comprehensive tool for the analysis of ITS system requirements. While obtaining minimum cost design points, the program enables the user to perform studies over a wide range of user requirements and parametric demands. The program utilizes a total system approach wherein the ground uplink and downlink, the spacecraft, and the launch vehicle are simultaneously synthesized. A steepest descent algorithm is employed to determine the minimum total system cost design subject to the fixed user requirements and imposed constraints. In the process of converging to the solution, the pertinent subsystem tradeoffs are resolved. This report documents STAMP through a technical analysis and a description of the principal techniques employed in the program.
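The steepest descent idea STAMP employs can be sketched on a toy convex cost. The two-variable "system cost" below is hypothetical and far simpler than a coupled uplink/spacecraft/launch-vehicle model; the function names and step size are ours.

```python
def steepest_descent(grad, x0, step=0.1, iters=500):
    """Basic fixed-step steepest-descent loop: x <- x - step * grad(x)."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Toy convex cost in two design variables (not the STAMP cost model):
# cost(x) = (x0 - 3)^2 + 2*(x1 + 1)^2, minimized at (3, -1).
grad = lambda x: [2 * (x[0] - 3), 4 * (x[1] + 1)]
x_opt = steepest_descent(grad, [0.0, 0.0])
```

A production code like STAMP would add a line search and handle the fixed requirements and constraints the abstract mentions, e.g. by projecting or penalizing infeasible steps.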
Orion FSW V and V and Kedalion Engineering Lab Insight
NASA Technical Reports Server (NTRS)
Mangieri, Mark L.
2010-01-01
NASA, along with its prime Orion contractor and its subcontractors, is adapting an avionics system paradigm borrowed from the manned commercial aircraft industry for use in manned space flight systems. Integrated Modular Avionics (IMA) techniques have been proven as a robust avionics solution for manned commercial aircraft (B737/777/787, MD 10/90). This presentation will outline current approaches to adapt IMA, along with its heritage FSW V&V paradigms, into NASA's manned space flight program for Orion. NASA's Kedalion engineering analysis lab is on the forefront of validating many of these contemporary IMA-based techniques. Kedalion has already validated many of the proposed Orion FSW V&V paradigms using Orion's precursory Flight Test Article (FTA) Pad Abort 1 (PA-1) program. The Kedalion lab will evolve its architectures, tools, and techniques in parallel with the evolving Orion program.
Meteoritic Sulfur Isotopic Analysis
NASA Technical Reports Server (NTRS)
Thiemens, Mark H.
1996-01-01
Funds were requested to continue our program in meteoritic sulfur isotopic analysis. We have recently detected a potential nucleosynthetic sulfur isotopic anomaly. We will search for potential carriers. The documentation of bulk systematics and the possible relation to nebular chemistry and oxygen isotopes will be explored. Analytical techniques for delta(sup 33)S, delta(sup 34)S, delta(sup 36)S isotopic analysis were improved. Analysis of sub-milligram samples is now possible. A possible relation between sulfur isotopes and oxygen was detected, with similar group systematics noted, particularly in the case of aubrites, ureilites and enstatite chondrites. A possible nucleosynthetic excess S-33 has been noted in bulk ureilites and an oldhamite separate from Norton County. High energy proton (approximately 1 GeV) bombardments of iron foils were done to experimentally determine S-33, S-36 spallogenic yields for quantitation of isotopic measurements in iron meteorites. Techniques for measurement of mineral separates were perfected and an analysis program initiated. The systematic behavior of bulk sulfur isotopes will continue to be explored.
40 CFR 141.705 - Approved laboratories.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Cryptosporidium analysis by an equivalent State laboratory certification program. (b) E. coli. Any laboratory... coliform or fecal coliform analysis under § 141.74 is approved for E. coli analysis under this subpart when the laboratory uses the same technique for E. coli that the laboratory uses for § 141.74. (c...
40 CFR 141.705 - Approved laboratories.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Cryptosporidium analysis by an equivalent State laboratory certification program. (b) E. coli. Any laboratory... coliform or fecal coliform analysis under § 141.74 is approved for E. coli analysis under this subpart when the laboratory uses the same technique for E. coli that the laboratory uses for § 141.74. (c...
40 CFR 141.705 - Approved laboratories.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Cryptosporidium analysis by an equivalent State laboratory certification program. (b) E. coli. Any laboratory... coliform or fecal coliform analysis under § 141.74 is approved for E. coli analysis under this subpart when the laboratory uses the same technique for E. coli that the laboratory uses for § 141.74. (c...
40 CFR 141.705 - Approved laboratories.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Cryptosporidium analysis by an equivalent State laboratory certification program. (b) E. coli. Any laboratory... coliform or fecal coliform analysis under § 141.74 is approved for E. coli analysis under this subpart when the laboratory uses the same technique for E. coli that the laboratory uses for § 141.74. (c...
40 CFR 141.705 - Approved laboratories.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Cryptosporidium analysis by an equivalent State laboratory certification program. (b) E. coli. Any laboratory... coliform or fecal coliform analysis under § 141.74 is approved for E. coli analysis under this subpart when the laboratory uses the same technique for E. coli that the laboratory uses for § 141.74. (c...
POLO: a user's guide to Probit Or LOgit analysis.
Jacqueline L. Robertson; Robert M. Russell; N.E. Savin
1980-01-01
This user's guide provides detailed instructions for the use of POLO (Probit Or LOgit), a computer program for the analysis of quantal response data such as that obtained from insecticide bioassays by the techniques of probit or logit analysis. Dosage-response lines may be compared for parallelism or...
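The logit branch of the analysis POLO performs can be sketched as a Newton-Raphson fit of a two-parameter logistic to grouped quantal (dose-response) counts. This is not POLO itself: the bioassay numbers, function name, and iteration count below are invented for illustration.

```python
import math

def fit_logit(doses, n, r, iters=25):
    """Fit P(response) = 1/(1+exp(-(a + b*x))), x = log10(dose), to grouped
    quantal data (n subjects, r responders per dose) by Newton-Raphson on
    the binomial log-likelihood."""
    xs = [math.log10(d) for d in doses]
    a, b = 0.0, 0.0
    for _ in range(iters):
        # Score vector (ga, gb) and Fisher information (haa, hab, hbb).
        ga = gb = haa = hab = hbb = 0.0
        for x, ni, ri in zip(xs, n, r):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))
            w = ni * p * (1 - p)
            ga += ri - ni * p
            gb += (ri - ni * p) * x
            haa += w
            hab += w * x
            hbb += w * x * x
        det = haa * hbb - hab * hab
        a += (hbb * ga - hab * gb) / det     # Newton step: H^{-1} * score
        b += (-hab * ga + haa * gb) / det
    return a, b

# Hypothetical bioassay: doses, subjects, and responders at each dose.
doses = [1.0, 2.0, 4.0, 8.0, 16.0]
n = [50, 50, 50, 50, 50]
r = [2, 10, 26, 40, 48]
a, b = fit_logit(doses, n, r)
ld50 = 10 ** (-a / b)   # dose at which the fitted response probability is 50%
```

The probit variant replaces the logistic with the normal CDF; comparing two fitted lines for parallelism, as the guide describes, amounts to testing whether their slopes b differ.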
NASA's program on icing research and technology
NASA Technical Reports Server (NTRS)
Reinmann, John J.; Shaw, Robert J.; Ranaudo, Richard J.
1989-01-01
NASA's program in aircraft icing research and technology is reviewed. The program relies heavily on computer codes and modern applied physics technology in seeking icing solutions on a finer scale than those offered in earlier programs. Three major goals of this program are to offer new approaches to ice protection, to improve our ability to model the response of an aircraft to an icing encounter, and to provide improved techniques and facilities for ground and flight testing. This paper reviews the following program elements: (1) new approaches to ice protection; (2) numerical codes for deicer analysis; (3) measurement and prediction of ice accretion and its effect on aircraft and aircraft components; (4) special wind tunnel test techniques for rotorcraft icing; (5) improvements of icing wind tunnels and research aircraft; (6) ground de-icing fluids used in winter operation; (7) fundamental studies in icing; and (8) droplet sizing instruments for icing clouds.
L-O-S-T: Logging Optimization Selection Technique
Jerry L. Koger; Dennis B. Webster
1984-01-01
L-O-S-T is a FORTRAN computer program developed to systematically quantify, analyze, and improve user selected harvesting methods. Harvesting times and costs are computed for road construction, landing construction, system move between landings, skidding, and trucking. A linear programming formulation utilizing the relationships among marginal analysis, isoquants, and...
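The linear-programming step in a harvesting-cost formulation can be sketched for two decision variables by enumerating the vertices of the feasible polygon, since an LP optimum lies at a vertex. The revenue coefficients and resource limits below are hypothetical, not L-O-S-T data.

```python
from itertools import combinations

def solve_lp_2d(c, constraints):
    """Maximize c.x over {x : a.x <= b for each (a, b)} in 2-D by checking
    every intersection of two constraint boundaries (candidate vertices)."""
    best, best_x = None, None
    for (a1, b1), (a2, b2) in combinations(constraints, 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < 1e-12:
            continue  # parallel boundaries never meet
        x = ((b1 * a2[1] - b2 * a1[1]) / det,
             (a1[0] * b2 - a2[0] * b1) / det)
        # Keep the vertex only if it satisfies every constraint.
        if all(a[0] * x[0] + a[1] * x[1] <= b + 1e-9 for a, b in constraints):
            val = c[0] * x[0] + c[1] * x[1]
            if best is None or val > best:
                best, best_x = val, x
    return best, best_x

# Hypothetical harvest plan: x0 = skidder-hours, x1 = truck-hours.
# Maximize net revenue 40*x0 + 30*x1 subject to resource limits.
constraints = [
    ((1.0, 1.0), 100.0),    # total crew-hours available
    ((2.0, 1.0), 150.0),    # equipment availability
    ((-1.0, 0.0), 0.0),     # x0 >= 0
    ((0.0, -1.0), 0.0),     # x1 >= 0
]
best, x = solve_lp_2d((40.0, 30.0), constraints)
```

Real formulations with many variables use the simplex method instead of enumeration; the marginal-analysis and isoquant relationships the abstract mentions correspond to the dual prices of the binding constraints.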
A Primer on the Financial Management of Experiential Learning Assessment Programs.
ERIC Educational Resources Information Center
MacTaggart, Terrence
1983-01-01
The success and failure of experiential learning assessment programs rests not only on their academic quality, but also on their financial management. Types of cost and the meaning of cost-effectiveness are discussed. Break-even analysis, cost-reduction activities, and revenue-enhancement techniques are described. (Author/MLW)
Communicative Competence of the Fourth Year Students: Basis for Proposed English Language Program
ERIC Educational Resources Information Center
Tuan, Vu Van
2017-01-01
This study on the level of communicative competence, covering linguistic/grammatical and discourse competence, has aimed at constructing a proposed English language program for 5 key universities in Vietnam. The descriptive method was employed, together with comparative techniques and correlational analysis. The researcher treated the surveyed data…
ARCHITECTURAL PROGRAMMING--STATE OF THE ART.
ERIC Educational Resources Information Center
EVANS, BENJAMIN H.
In response to a need for a more thorough and rigorous study and analysis process in environmental functions prior to the design of new buildings, a study was undertaken to identify the emerging techniques of architectural programing practice. The study included correspondence and review of periodicals, questionnaires and visitations, and a…
On 3D inelastic analysis methods for hot section components
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Chen, P. C.; Dame, L. T.; Holt, R. V.; Huang, H.; Hartle, M.; Gellin, S.; Allen, D. H.; Haisler, W. E.
1986-01-01
Accomplishments are described for the 2-year program to develop advanced 3-D inelastic structural stress analysis methods and solution strategies for more accurate and cost-effective analysis of combustors, turbine blades and vanes. The approach was to develop a matrix of formulation elements and constitutive models. Three constitutive models were developed in conjunction with optimized iterating techniques, accelerators, and convergence criteria within a framework of dynamic time incrementing. Three formulation models were developed: an eight-noded mid-surface shell element, a nine-noded mid-surface shell element and a twenty-noded isoparametric solid element. A separate computer program was developed for each combination of constitutive model and formulation model. Each program provides a functional stand-alone capability for performing cyclic nonlinear structural analysis. In addition, the analysis capabilities incorporated into each program can be abstracted in subroutine form for incorporation into other codes or to form new combinations.
The 3D inelastic analysis methods for hot section components
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Maffeo, R. J.; Tipton, M. T.; Weber, G.
1992-01-01
A two-year program to develop advanced 3D inelastic structural stress analysis methods and solution strategies for more accurate and cost effective analysis of combustors, turbine blades, and vanes is described. The approach was to develop a matrix of formulation elements and constitutive models. Three constitutive models were developed in conjunction with optimized iterating techniques, accelerators, and convergence criteria within a framework of dynamic time incrementing. Three formulation models were developed: an eight-noded midsurface shell element; a nine-noded midsurface shell element; and a twenty-noded isoparametric solid element. A separate computer program has been developed for each combination of constitutive model-formulation model. Each program provides a functional stand alone capability for performing cyclic nonlinear structural analysis. In addition, the analysis capabilities incorporated into each program can be abstracted in subroutine form for incorporation into other codes or to form new combinations.
Systems Analysis in Small Educational Systems: A Case Study.
ERIC Educational Resources Information Center
Vazquez-Abad, Jesus; And Others
1982-01-01
The use of systems analysis in transforming a graduate program in educational technology from a lecture-based system to a self-instructional one is described. Several operational research techniques are illustrated. A bibliography of 10 items is included. (CHC)
Programs for analysis and resizing of complex structures. [computerized minimum weight design
NASA Technical Reports Server (NTRS)
Haftka, R. T.; Prasad, B.
1978-01-01
The paper describes the PARS (Programs for Analysis and Resizing of Structures) system. PARS is a user oriented system of programs for the minimum weight design of structures modeled by finite elements and subject to stress, displacement, flutter and thermal constraints. The system is built around SPAR - an efficient and modular general purpose finite element program, and consists of a series of processors that communicate through the use of a data base. An efficient optimizer based on the Sequence of Unconstrained Minimization Technique (SUMT) with an extended interior penalty function and Newton's method is used. Several problems are presented for demonstration of the system capabilities.
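The SUMT idea the abstract names can be sketched in one dimension. Note the hedge: PARS uses an *extended* interior penalty (defined to behave well near and beyond the constraint boundary); the plain logarithmic barrier below is the textbook version, on a toy problem of our choosing.

```python
def sumt_min(x0, mu0=1.0, shrink=10.0, outer=5, inner=30):
    """SUMT with a logarithmic interior penalty and 1-D Newton steps.
    Toy problem: minimize f(x) = x^2 subject to x >= 1, via the barrier
    subproblem phi(x) = x^2 - mu*log(x - 1), with mu driven toward 0."""
    x, mu = x0, mu0
    for _ in range(outer):
        for _ in range(inner):
            d1 = 2 * x - mu / (x - 1)        # phi'(x)
            d2 = 2 + mu / (x - 1) ** 2       # phi''(x), always > 0 here
            step = d1 / d2                   # Newton step
            while x - step <= 1.0:           # keep the iterate strictly feasible
                step /= 2.0
            x -= step
        mu /= shrink                         # tighten the barrier
    return x

x_opt = sumt_min(2.0)   # converges toward the constrained optimum x = 1
```

Each outer iteration solves an unconstrained subproblem; as mu shrinks, the barrier minimizers trace a path approaching the constrained optimum from inside the feasible region.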
Cross validation issues in multiobjective clustering
Brusco, Michael J.; Steinley, Douglas
2018-01-01
The implementation of multiobjective programming methods in combinatorial data analysis is an emergent area of study with a variety of pragmatic applications in the behavioural sciences. Most notably, multiobjective programming provides a tool for analysts to model trade-offs among competing criteria in clustering, seriation, and unidimensional scaling tasks. Although multiobjective programming has considerable promise, the technique can produce numerically appealing results that lack empirical validity. With this issue in mind, the purpose of this paper is to briefly review viable areas of application for multiobjective programming and, more importantly, to outline the importance of cross-validation when using this method in cluster analysis. PMID:19055857
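The trade-off among competing clustering criteria can be made concrete with a tiny example: score every threshold split of one-dimensional data on two objectives (compactness and cluster-size balance, both minimized) and keep the non-dominated solutions. Data, objectives, and names are all invented for illustration.

```python
def wcss(cluster):
    """Within-cluster sum of squared deviations from the cluster mean."""
    if not cluster:
        return 0.0
    m = sum(cluster) / len(cluster)
    return sum((x - m) ** 2 for x in cluster)

def pareto_front(solutions):
    """Keep solutions not dominated on (criterion1, criterion2), both minimized."""
    return [s for s in solutions
            if not any(o[0] <= s[0] and o[1] <= s[1] and o != s
                       for o in solutions)]

data = sorted([1.0, 1.2, 1.9, 4.0, 4.1, 9.0])
# Every threshold split into two clusters, scored on two competing criteria:
# total WCSS (compactness) and absolute size difference (balance).
candidates = []
for k in range(1, len(data)):
    left, right = data[:k], data[k:]
    candidates.append((wcss(left) + wcss(right),
                       abs(len(left) - len(right)), k))
front = pareto_front(candidates)
```

No single split wins on both criteria, which is exactly the situation where a numerically appealing compromise can still lack empirical validity; cross-validating each front member is the paper's remedy.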
NASA Astrophysics Data System (ADS)
Viswanathan, V. K.
1980-11-01
The optical design and analysis of the LASL carbon dioxide laser fusion systems required the use of techniques quite different from those currently used in conventional optical design problems. The necessity for this is explored, and the method that has been successfully used at Los Alamos to understand these systems is discussed with examples. This method involves characterizing the various optical components in their mounts by a Zernike polynomial set and using fast Fourier transform techniques to propagate the beam, taking into account diffraction and other nonlinear effects that occur in these types of systems. The various programs used for analysis are briefly discussed.
Computed Tomography Inspection and Analysis for Additive Manufacturing Components
NASA Technical Reports Server (NTRS)
Beshears, Ronald D.
2016-01-01
Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.
Environmental Definition Program Cross Sectional Analysis; Summary of Data and Analysis Techniques
1975-12-31
[Abstract recovered only in fragments] The program's objectives included characterizing the environment at selected locations and characterizing cloud and precipitation systems during certain tests and experiments conducted at Wallops Flight Center. The LWCA was on site at Wallops Flight Center assisting the special flight measurements, and other computer programs were developed for data reduction. A table of station coordinates (including Kiev, Simferopol, Perm, Aktyubinsk, and Semipalatinsk) is not legible in the source.
Static analysis techniques for semiautomatic synthesis of message passing software skeletons
Sottile, Matthew; Dagit, Jason; Zhang, Deli; ...
2015-06-29
The design of high-performance computing architectures demands performance analysis of large-scale parallel applications to derive various parameters concerning hardware design and software development. The process of analyzing and benchmarking the performance of an application can be done in several ways with varying degrees of fidelity. One of the most cost-effective ways is to do a coarse-grained study of large-scale parallel applications through the use of program skeletons. The concept of a “program skeleton” that we discuss in this article is an abstracted program that is derived from a larger program where source code that is determined to be irrelevant for the purposes of the skeleton is removed. In this work, we develop a semiautomatic approach for extracting program skeletons based on compiler program analysis. Finally, we demonstrate correctness of our skeleton extraction process by comparing details from communication traces, as well as show the performance speedup of using skeletons by running simulations in the SST/macro simulator.
Sawja: Static Analysis Workshop for Java
NASA Astrophysics Data System (ADS)
Hubert, Laurent; Barré, Nicolas; Besson, Frédéric; Demange, Delphine; Jensen, Thomas; Monfort, Vincent; Pichardie, David; Turpin, Tiphaine
Static analysis is a powerful technique for automatic verification of programs but raises major engineering challenges when developing a full-fledged analyzer for a realistic language such as Java. Efficiency and precision of such a tool rely partly on low level components which only depend on the syntactic structure of the language and therefore should not be redesigned for each implementation of a new static analysis. This paper describes the Sawja library: a static analysis workshop fully compliant with Java 6 which provides OCaml modules for efficiently manipulating Java bytecode programs. We present the main features of the library, including i) efficient functional data-structures for representing a program with implicit sharing and lazy parsing, ii) an intermediate stack-less representation, and iii) fast computation and manipulation of complete programs. We provide experimental evaluations of the different features with respect to time, memory and precision.
LOX/Hydrocarbon Combustion Instability Investigation
NASA Technical Reports Server (NTRS)
Jensen, R. J.; Dodson, H. C.; Claflin, S. E.
1989-01-01
The LOX/Hydrocarbon Combustion Instability Investigation Program was structured to determine if the use of light hydrocarbon combustion fuels with liquid oxygen (LOX) produces combustion performance and stability behavior similar to the LOX/hydrogen propellant combination. In particular, methane was investigated to determine if that fuel can be rated for combustion instability using the same techniques as previously used for LOX/hydrogen. These techniques included fuel temperature ramping and stability bomb tests. The hot-fire program probed the combustion behavior of methane from ambient to subambient temperatures. Very interesting results were obtained from this program that have potential importance to future LOX/methane development programs. A thorough and carefully reasoned documentation of the experimental data obtained is provided. The hot-fire test logic and the associated tests are discussed. Subscale performance and stability rating testing was accomplished using 40,000-lb-thrust-class hardware. Stability rating tests used both bombs and fuel temperature ramping techniques. The test program was successful in generating data for the evaluation of the methane stability characteristics relative to hydrogen and for anchoring stability models. Data correlations, performance analysis, stability analyses, and key stability margin enhancement parameters are discussed.
NASA Technical Reports Server (NTRS)
Chin, J.; Barbero, P.
1975-01-01
The revision of an existing digital program to analyze the stability of models mounted on a two-cable mount system used in a transonic dynamics wind tunnel is presented. The program revisions and analysis of an active feedback control system to be used for controlling the free-flying models are treated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keepin, G.R.
Over the years the Los Alamos safeguards program has developed, tested, and implemented a broad range of passive and active nondestructive analysis (NDA) instruments (based on gamma- and x-ray detection and neutron counting) that are now widely employed in safeguarding nuclear materials of all forms. Here we very briefly review the major categories of gamma-ray- and neutron-based NDA techniques, give some representative examples of NDA instruments currently in use, and cite a few notable instances of state-of-the-art NDA technique development. Historical aspects and a broad overview of the safeguards program are also presented.
1988-06-01
[Abstract recovered only in fragments] The report concerns computer-assisted instruction and artificial intelligence: the system monitors a student while he/she tries to perform given tasks. Means-ends analysis, a classic technique for solving search problems in Artificial Intelligence, has been used...
An Analysis of and a Prescription for the Capital Improvement Programming Process for Small Cities.
1980-12-01
[Abstract recovered only in fragments] The capital improvement programming process receives thorough analysis and discussion in texts relating to business finance, managerial finance, and management accounting. One evaluation approach involves the use of the present-value technique, which is adequately explained in the literature of basic business finance and management. Cited works include "...Theory of the Firm," Prentice-Hall, 1963, and DeMoville, W., "Capital Budgeting in Municipalities," Management Accounting, v. 59, no. 1, pp. 17-20, 28, July...
A Monte Carlo study of Weibull reliability analysis for space shuttle main engine components
NASA Technical Reports Server (NTRS)
Abernethy, K.
1986-01-01
The incorporation of a number of additional capabilities into an existing Weibull analysis computer program and the results of a Monte Carlo computer simulation study to evaluate the usefulness of the Weibull methods, using samples with a very small number of failures and extensive censoring, are discussed. Since the censoring mechanism inherent in the Space Shuttle Main Engine (SSME) data is hard to analyze, it was decided to use a random censoring model, generating censoring times from a uniform probability distribution. Some of the statistical techniques and computer programs that are used in the SSME Weibull analysis are described. The previously documented methods were supplemented by adding computer calculations of approximate confidence intervals (using iterative methods) for several parameters of interest. These calculations are based on a likelihood ratio statistic which is asymptotically a chi-squared statistic with one degree of freedom. The assumptions built into the computer simulations are described, as are the simulation program and the techniques used in it. Simulation results are tabulated for various combinations of Weibull shape parameters and numbers of failures in the samples.
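As a sketch of the sampling scheme this abstract describes, the following minimal Python example draws Weibull failure times, censors them with uniformly distributed censoring times, and recovers the shape and scale by censored maximum likelihood. All names and the parameter choices are illustrative assumptions, not code from the SSME program:

```python
import math
import random

def simulate_censored_weibull(n, shape, scale, cens_hi, rng):
    """Draw Weibull failure times and uniform censoring times; return (time, failed) pairs."""
    data = []
    for _ in range(n):
        t = scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)  # inverse-CDF sample
        c = rng.uniform(0.0, cens_hi)                                 # random censoring time
        data.append((min(t, c), t <= c))
    return data

def weibull_mle_censored(data):
    """Maximum-likelihood shape/scale for randomly censored Weibull data.

    The shape estimate solves g(b) = 0 by bisection, where
    g(b) = sum(t^b ln t)/sum(t^b) - 1/b - mean(ln t over failures);
    g is monotonically increasing, so the root is unique.
    """
    times = [t for t, _ in data]
    fails = [t for t, failed in data if failed]
    r = len(fails)
    c = sum(math.log(t) for t in fails) / r

    def g(b):
        s1 = sum(t ** b * math.log(t) for t in times)
        s0 = sum(t ** b for t in times)
        return s1 / s0 - 1.0 / b - c

    lo, hi = 0.05, 20.0
    for _ in range(80):                      # bisection on the monotone score
        mid = 0.5 * (lo + hi)
        if g(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    shape = 0.5 * (lo + hi)
    scale = (sum(t ** shape for t in times) / r) ** (1.0 / shape)
    return shape, scale

rng = random.Random(0)
data = simulate_censored_weibull(3000, 2.0, 1.0, 2.0, rng)
shape_hat, scale_hat = weibull_mle_censored(data)
```

With a large sample the estimates land close to the true shape 2.0 and scale 1.0; with the very small failure counts the study targets, the same estimator becomes noisy, which is exactly what the Monte Carlo study quantifies.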
Pyrotechnic Shock Analysis Using Statistical Energy Analysis
2015-10-23
[Abstract recovered only in fragments] The approach models pyrotechnic shock using statistical energy analysis (SEA) subsystems, and a couple of validation examples are provided to demonstrate the new approach. Key words: peak ratio, phase perturbation. The references include "Ballistic Shock Prediction Models and Techniques for Use in the Crusader Combat Vehicle Program," 11th Annual US Army Ground Vehicle Survivability...
Methods of analysis and resources available for genetic trait mapping.
Ott, J
1999-01-01
Methods of genetic linkage analysis are reviewed and put in context with other mapping techniques. Sources of information are outlined (books, web sites, computer programs). Special consideration is given to statistical problems in canine genetic mapping (heterozygosity, inbreeding, marker maps).
NASA Technical Reports Server (NTRS)
Smetana, F. O.; Summery, D. C.; Johnson, W. D.
1972-01-01
Techniques quoted in the literature for the extraction of stability derivative information from flight test records are reviewed. A recent technique developed at NASA's Langley Research Center was regarded as the most productive yet developed. Results of tests of the sensitivity of this procedure to various types of data noise and to the accuracy of the estimated values of the derivatives are reported. Computer programs for providing these initial estimates are given. The literature review also includes a discussion of flight test measuring techniques, instrumentation, and piloting techniques.
A survey of compiler optimization techniques
NASA Technical Reports Server (NTRS)
Schneck, P. B.
1972-01-01
Major optimization techniques of compilers are described and grouped into three categories: machine dependent, architecture dependent, and architecture independent. Machine-dependent optimizations tend to be local and are performed upon short spans of generated code by using particular properties of an instruction set to reduce the time or space required by a program. Architecture-dependent optimizations are global and are performed while generating code. These optimizations consider the structure of a computer, but not its detailed instruction set. Architecture-independent optimizations are also global but are based on analysis of the program flow graph and the dependencies among statements of the source program. A conceptual review of a universal optimizer that performs architecture-independent optimizations at the source-code level is also presented.
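A minimal illustration of the third category: constant folding is an architecture-independent optimization that works purely on the program's structure, independent of any instruction set. The toy expression-tree representation below is a hypothetical sketch, not taken from the survey:

```python
# Constant folding over a tiny expression tree: operator nodes whose
# operands are both constants are replaced by their computed value,
# together with two simple algebraic identities.

OPS = {"add": lambda a, b: a + b,
       "sub": lambda a, b: a - b,
       "mul": lambda a, b: a * b}

def fold(node):
    """Recursively fold constants; nodes are ('const', k), ('var', name), or (op, l, r)."""
    if node[0] in OPS:
        left, right = fold(node[1]), fold(node[2])
        if left[0] == "const" and right[0] == "const":
            return ("const", OPS[node[0]](left[1], right[1]))
        if node[0] == "mul" and right == ("const", 1):  # x * 1 -> x
            return left
        if node[0] == "add" and right == ("const", 0):  # x + 0 -> x
            return left
        return (node[0], left, right)
    return node

expr = ("add", ("const", 2), ("mul", ("const", 3), ("const", 4)))
folded = fold(expr)  # the whole subtree collapses to a single constant
```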
Confirmatory factor analysis using Microsoft Excel.
Miles, Jeremy N V
2005-11-01
This article presents a method for using Microsoft (MS) Excel for confirmatory factor analysis (CFA). CFA is often seen as an impenetrable technique, and thus, when it is taught, there is frequently little explanation of the mechanisms or underlying calculations. The aim of this article is to demonstrate that this is not the case; it is relatively straightforward to produce a spreadsheet in MS Excel that can carry out simple CFA. It is possible, with few or no programming skills, to effectively program a CFA and, thus, to gain insight into the workings of the procedure.
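The spreadsheet idea carries over to a few lines of code: a one-factor CFA fits loadings so that the model-implied covariance matrix (loadings outer product plus a diagonal of uniquenesses) matches the observed one, which is the discrepancy a spreadsheet Solver would be asked to minimize. The fitting routine and starting values below are illustrative assumptions, not the article's spreadsheet:

```python
# One-factor CFA: implied covariance = lam*lam' + diag(theta).
# Off-diagonal entries depend only on the loadings, so fit the loadings
# by gradient descent on the off-diagonal squared error, then read the
# uniquenesses off the diagonal.

def fit_one_factor(S, iters=4000, lr=0.2):
    p = len(S)
    lam = [0.5] * p                          # starting values for the loadings
    for _ in range(iters):
        grad = [0.0] * p
        for i in range(p):
            for j in range(p):
                if i != j:
                    resid = S[i][j] - lam[i] * lam[j]
                    grad[i] += -2.0 * resid * lam[j]
        lam = [l - lr * g for l, g in zip(lam, grad)]
    theta = [S[i][i] - lam[i] ** 2 for i in range(p)]  # uniquenesses
    return lam, theta

# Synthetic covariance matrix built from known loadings 0.8, 0.7, 0.6
# with unit variances, so the fit should recover those values.
true = [0.8, 0.7, 0.6]
S = [[1.0 if i == j else true[i] * true[j] for j in range(3)] for i in range(3)]
lam, theta = fit_one_factor(S)
```

With three indicators the one-factor model is just identified on the off-diagonals, so the descent converges to the generating loadings (up to a joint sign flip).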
Failure Modes and Effects Analysis (FMEA): A Bibliography
NASA Technical Reports Server (NTRS)
2000-01-01
Failure modes and effects analysis (FMEA) is a bottom-up analytical process that identifies process hazards, which helps managers understand vulnerabilities of systems, as well as assess and mitigate risk. It is one of several engineering tools and techniques available to program and project managers aimed at increasing the likelihood of safe and successful NASA programs and missions. This bibliography references 465 documents in the NASA STI Database that contain the major concepts, failure modes or failure analysis, in either the basic index or the major subject terms.
Structural mode significance using INCA. [Interactive Controls Analysis computer program
NASA Technical Reports Server (NTRS)
Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.
1990-01-01
Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.
NASA Technical Reports Server (NTRS)
Murthy, T. Sreekanta; Kvaternik, Raymond G.
1991-01-01
A NASA/industry rotorcraft structural dynamics program known as Design Analysis Methods for VIBrationS (DAMVIBS) was initiated at Langley Research Center in 1984 with the objective of establishing the technology base needed by the industry for developing an advanced finite-element-based vibrations design analysis capability for airframe structures. As a part of the in-house activities contributing to that program, a study was undertaken to investigate the use of formal, nonlinear programming-based, numerical optimization techniques for airframe vibrations design work. Considerable progress has been made in connection with that study since its inception in 1985. This paper presents a unified summary of the experiences and results of that study. The formulation and solution of airframe optimization problems are discussed. Particular attention is given to describing the implementation of a new computational procedure based on MSC/NASTRAN and CONstrained function MINimization (CONMIN) in a computer program system called DYNOPT for the optimization of airframes subject to strength, frequency, dynamic response, and fatigue constraints. The results from the application of the DYNOPT program to the Bell AH-1G helicopter are presented and discussed.
Research-Informed Curriculum Design for a Master's-Level Program in Project Management
ERIC Educational Resources Information Center
Bentley, Yongmei; Richardson, Diane; Duan, Yanqing; Philpott, Elly; Ong, Vincent; Owen, David
2013-01-01
This article reports on the application of Research-Informed Curriculum Design (RICD) for the development and implementation of an MSc Program in Project Management. The research focused on contemporary issues in project management and provided an analysis of project management approaches, tools, and techniques currently used in organizations.…
Eyes on Target: Intelligence Support to an Effects-based Approach
2007-01-01
[Abstract recovered only in fragments] ...and theoretical support for such techniques is available from the field of neurolinguistic programming, which involves analysis of word choice...
Marketing Merit Aid: The Response of Flagship Campuses to State Merit Aid Programs
ERIC Educational Resources Information Center
Ness, Erik C.; Lips, Adam J. A.
2011-01-01
This study examines the differences in the portfolio of institutional scholarships and the marketing of these awards between flagship campuses with and without state merit aid programs. Using content analysis techniques to analyze institutional websites of the 16 Southern Regional Education Board (SREB) flagship campuses, three thematic responses…
An Application of Indian Health Service Standards for Alcoholism Programs.
ERIC Educational Resources Information Center
Burns, Thomas R.
1984-01-01
Discusses Phoenix-area applications of 1981 Indian Health Service standards for alcoholism programs. Results of standard statistical techniques note areas of deficiency through application of a one-tailed z test at .05 level of significance. Factor analysis sheds further light on design of standards. Implications for revisions are suggested.…
A proposed streamflow-data program for North Dakota
Crosby, O.A.
1970-01-01
An evaluation of the streamflow data available in North Dakota was made to provide guidelines for planning future programs. The basic steps in the evaluation procedure were (1) definition of the long-term goals of the streamflow data program in quantitative form, (2) examination and analysis of all available data to determine which goals have already been met, and (3) consideration of alternate programs and techniques to meet the remaining objectives. None of the goals could be met by generalization of the data for gaged basins by regression analysis. This fact indicates that significant changes should be made in the present data program to obtain better areal coverage to achieve the goals set. A streamflow data program based on the guidelines developed in this study is proposed for the future.
1983-12-01
[Abstract recovered only in fragments] ...while at the same time improving its operational efficiency. Through their integration and use, System Program Managers have a comprehensive analytical... The NRLA program is hosted on the CREATE Operating System and contains approximately 5500 lines of computer code. It consists of a main...associated with alternative maintenance plans. As the technological complexity of weapons systems has increased, new and innovative logistical support...
Theoretical basis of the DOE-2 building energy use analysis program
NASA Astrophysics Data System (ADS)
Curtis, R. B.
1981-04-01
A user-oriented, public-domain computer program was developed that enables architects and engineers to perform design and retrofit studies of the energy use of buildings under realistic weather conditions. DOE-2.1A has been named by the US DOE as the standard evaluation technique for the Congressionally mandated building energy performance standards (BEPS). A number of program design decisions were made that determine the breadth of applicability of DOE-2.1. Such design decisions are intrinsic to all building energy use analysis computer programs and determine the types of buildings or the kinds of HVAC systems that can be modeled. In particular, the weighting-factor method used in DOE-2 has both advantages and disadvantages relative to other computer programs.
Implementing a Reliability Centered Maintenance Program at NASA's Kennedy Space Center
NASA Technical Reports Server (NTRS)
Tuttle, Raymond E.; Pete, Robert R.
1998-01-01
Maintenance practices have long focused on time-based "preventive maintenance" techniques. Components were changed out and parts replaced based on how long they had been in place instead of what condition they were in. A reliability centered maintenance (RCM) program seeks to offer equal or greater reliability at decreased cost by ensuring that only applicable, effective maintenance is performed and, in large part, by replacing time-based maintenance with condition-based maintenance. A significant portion of this program involved introducing non-intrusive technologies, such as vibration analysis, oil analysis and I/R cameras, to an existing labor force and management team.
Fracture network evaluation program (FraNEP): A software for analyzing 2D fracture trace-line maps
NASA Astrophysics Data System (ADS)
Zeeb, Conny; Gomez-Rivas, Enrique; Bons, Paul D.; Virgo, Simon; Blum, Philipp
2013-10-01
Fractures, such as joints, faults and veins, strongly influence the transport of fluids through rocks by either enhancing or inhibiting flow. Techniques used for the automatic detection of lineaments from satellite images and aerial photographs, LIDAR technologies and borehole televiewers have significantly enhanced data acquisition. The analysis of such data is often performed manually or with different analysis software. Here we present a novel program for the analysis of 2D fracture networks called FraNEP (Fracture Network Evaluation Program). The program was developed using Visual Basic for Applications in Microsoft Excel™ and combines features from different existing software and characterization techniques. The main novelty of FraNEP is the ability to analyse trace-line maps of fracture networks applying (1) scanline sampling, (2) window sampling or (3) the circular scanline and window method, without the need to switch programs. Additionally, binning problems are avoided by using cumulative distributions, rather than probability density functions. FraNEP is a time-efficient tool for the characterisation of fracture network parameters, such as density, intensity and mean length. Furthermore, fracture strikes can be visualized using rose diagrams and a fitting routine evaluates the distribution of fracture lengths. As an example of its application, we use FraNEP to analyse a case study of lineament data from a satellite image of the Oman Mountains.
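Of the three sampling methods named, scanline sampling is the simplest to illustrate: count the fracture traces crossing a sampling line and divide by the line's length to obtain the linear intensity P10. The sketch below is illustrative Python, not FraNEP code:

```python
# Scanline sampling for a horizontal scanline at y = y_line spanning
# [x_min, x_max]: a trace segment is counted when it straddles the line
# and its intersection abscissa falls within the scanline.

def scanline_intensity(fractures, y_line, x_min, x_max):
    """fractures: list of ((x1, y1), (x2, y2)) trace segments."""
    hits = 0
    for (x1, y1), (x2, y2) in fractures:
        if (y1 - y_line) * (y2 - y_line) < 0:      # segment straddles the line
            t = (y_line - y1) / (y2 - y1)
            x_int = x1 + t * (x2 - x1)             # intersection abscissa
            if x_min <= x_int <= x_max:
                hits += 1
    return hits, hits / (x_max - x_min)

traces = [((1, -1), (1, 2)),    # crosses the scanline at x = 1
          ((3, -2), (3, 1)),    # crosses at x = 3
          ((5, 2), (5, 3))]     # entirely above the line: no crossing
n, p10 = scanline_intensity(traces, y_line=0.5, x_min=0.0, x_max=10.0)
```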
Multiprocessor smalltalk: Implementation, performance, and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pallas, J.I.
1990-01-01
Multiprocessor Smalltalk demonstrates the value of object-oriented programming on a multiprocessor. Its implementation and analysis shed light on three areas: concurrent programming in an object-oriented language without special extensions, implementation techniques for adapting to multiprocessors, and performance factors in the resulting system. Adding parallelism to Smalltalk code is easy, because programs already use control abstractions like iterators. Smalltalk's basic control and concurrency primitives (lambda expressions, processes and semaphores) can be used to build parallel control abstractions, including parallel iterators, parallel objects, atomic objects, and futures. Language extensions for concurrency are not required. This implementation demonstrates that it is possible to build an efficient parallel object-oriented programming system and illustrates techniques for doing so. Three modification tools (serialization, replication, and reorganization) adapted the Berkeley Smalltalk interpreter to the Firefly multiprocessor. Multiprocessor Smalltalk's performance shows that the combination of multiprocessing and object-oriented programming can be effective: speedups (relative to the original serial version) exceed 2.0 for five processors on all the benchmarks; the median efficiency is 48%. Analysis shows both where performance is lost and how to improve and generalize the experimental results. Changes in the interpreter to support concurrency add at most 12% overhead; better access to per-process variables could eliminate much of that. Changes in the user code to express concurrency add as much as 70% overhead; this overhead could be reduced to 54% if blocks (lambda expressions) were reentrant. Performance is also lost when the program cannot keep all five processors busy.
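The "parallel iterator built from futures" idea can be sketched in Python rather than Smalltalk: submit one task per element, then collect the futures in submission order so results stay aligned with the input. The pmap helper is an illustrative assumption, not code from the system described:

```python
# A parallel-iterator control abstraction built from futures: each element
# is mapped concurrently, and results are joined in submission order.

from concurrent.futures import ThreadPoolExecutor

def pmap(fn, items, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(fn, x) for x in items]   # fork one task per item
        return [f.result() for f in futures]            # join in submission order

squares = pmap(lambda x: x * x, range(8))
```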
Kim, Yun-Jeong; Chae, Joon-Seok; Chang, Jun Keun; Kang, Seong Ho
2005-08-12
We have developed a novel method for the ultra-fast analysis of genetically modified organisms (GMOs) in soybeans by microchip capillary gel electrophoresis (MCGE) using programmed field strength gradients (PFSG) in a conventional glass double-T microchip. Under the programmed electric field strength and a 0.3% poly(ethylene oxide) sieving matrix, the GMO in soybeans was analyzed within only 11 s on the microchip. The MCGE-PFSG method programs changes in the electric field strength during the GMO analysis, and it was also applied to the ultra-fast analysis of PCR products. Compared to MCGE using a conventional, constantly applied electric field, the MCGE-PFSG analysis generated faster results without loss of resolving power and reproducibility for specific DNA fragments (100- and 250-bp DNA) of GM soybeans. The MCGE-PFSG technique may prove to be a new tool in GMO analysis due to its speed, simplicity, and high efficiency.
Computational Aspects of Heat Transfer in Structures
NASA Technical Reports Server (NTRS)
Adelman, H. M. (Compiler)
1982-01-01
Techniques for the computation of heat transfer and associated phenomena in complex structures are examined with an emphasis on reentry flight vehicle structures. Analysis methods, computer programs, thermal analysis of large space structures and high speed vehicles, and the impact of computer systems are addressed.
Quality Assessment of College Admissions Processes.
ERIC Educational Resources Information Center
Fisher, Caroline; Weymann, Elizabeth; Todd, Amy
2000-01-01
This study evaluated the admissions process for a Master's in Business Administration Program using such quality improvement techniques as customer surveys, benchmarking, and gap analysis. Analysis revealed that student dissatisfaction with the admissions process may be a factor influencing declining enrollment. Cycle time and number of student…
Methods for geochemical analysis
Baedecker, Philip A.
1987-01-01
The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.
CometBoards Users Manual Release 1.0
NASA Technical Reports Server (NTRS)
Guptill, James D.; Coroneos, Rula M.; Patnaik, Surya N.; Hopkins, Dale A.; Berke, Lazlo
1996-01-01
Several nonlinear mathematical programming algorithms for structural design applications are available at present. These include the sequence of unconstrained minimizations technique, the method of feasible directions, and the sequential quadratic programming technique. The optimality criteria technique and the fully utilized design concept are two other structural design methods. A project was undertaken to bring all these design methods under a common computer environment so that a designer can select any one of these tools that may be suitable for his/her application. To facilitate selection of a design algorithm, to validate and check out the computer code, and to ascertain the relative merits of the design tools, modest finite element structural analysis programs based on the concept of stiffness and integrated force methods have been coupled to each design method. The code that contains both these design and analysis tools, by reading input information from analysis and design data files, can cast the design of a structure as a minimum-weight optimization problem. The code can then solve it with a user-specified optimization technique and a user-specified analysis method. This design code is called CometBoards, which is an acronym for Comparative Evaluation Test Bed of Optimization and Analysis Routines for the Design of Structures. This manual describes for the user a step-by-step procedure for setting up the input data files and executing CometBoards to solve a structural design problem. The manual includes the organization of CometBoards; instructions for preparing input data files; the procedure for submitting a problem; illustrative examples; and several demonstration problems. A set of 29 structural design problems have been solved by using all the optimization methods available in CometBoards. A summary of the optimum results obtained for these problems is appended to this users manual. 
CometBoards, at present, is available for Posix-based Cray and Convex computers, Iris and Sun workstations, and the VM/CMS system.
Debugging Techniques Used by Experienced Programmers to Debug Their Own Code.
1990-09-01
[Abstract recovered only in fragments] ...Davis, and Schultz (1987) also compared experts and novices, but focused on the way a computer program is represented cognitively and how that...of theories in the emerging computer programming domain (Fisher, 1987). In protocol analysis, subjects are asked to talk/think aloud as they solve...
Experimenters' reference based upon Skylab experiment management
NASA Technical Reports Server (NTRS)
1974-01-01
The methods and techniques for experiment development and integration that evolved during the Skylab Program are described to facilitate transferring this experience to experimenters in future manned space programs. Management responsibilities and the sequential process of experiment evolution from initial concept through definition, development, integration, operation and postflight analysis are outlined and amplified, as appropriate. Emphasis is placed on specific lessons learned on Skylab that are worthy of consideration by future programs.
Flight test derived heating math models for critical locations on the orbiter during reentry
NASA Technical Reports Server (NTRS)
Hertzler, E. K.; Phillips, P. W.
1983-01-01
An analysis technique was developed for expanding the aerothermodynamic envelope of the Space Shuttle without subjecting the vehicle to sustained flight at more stressing heating conditions. A transient analysis program was developed to take advantage of the transient maneuvers that were flown as part of this analysis technique. Heat rates were derived from flight test data for various locations on the orbiter. The flight derived heat rates were used to update heating models based on predicted data. Future missions were then analyzed based on these flight adjusted models. A technique for comparing flight and predicted heating rate data and the extrapolation of the data to predict the aerothermodynamic environment of future missions is presented.
ZIP3D: An elastic and elastic-plastic finite-element analysis program for cracked bodies
NASA Technical Reports Server (NTRS)
Shivakumar, K. N.; Newman, J. C., Jr.
1990-01-01
ZIP3D is an elastic and an elastic-plastic finite element program to analyze cracks in three dimensional solids. The program may also be used to analyze uncracked bodies or multi-body problems involving contacting surfaces. For crack problems, the program has several unique features including the calculation of mixed-mode strain energy release rates using the three dimensional virtual crack closure technique, the calculation of the J integral using the equivalent domain integral method, the capability to extend the crack front under monotonic or cyclic loading, and the capability to close or open the crack surfaces during cyclic loading. The theories behind the various aspects of the program are explained briefly. Line-by-line data preparation is presented. Input data and results for an elastic analysis of a surface crack in a plate and for an elastic-plastic analysis of a single-edge-crack-tension specimen are also presented.
ERIC Educational Resources Information Center
Dimick, Barbara
1995-01-01
Marketing techniques in youth services are useful for designing programs, collections, and services and for determining customer needs. The marketing mix--product, place, price, and practice--provides a framework for service analysis. (AEF)
NASA Technical Reports Server (NTRS)
Creason, A. S.; Miranda, F. A.
1996-01-01
Knowledge of the microwave properties at cryogenic temperatures of components fabricated using High-Temperature Superconductors (HTS) is useful in the design of HTS-based microwave circuits. Therefore, fast and reliable characterization techniques have been developed to study the aforementioned properties. In this paper, we discuss computer analysis techniques employed in the cryogenic characterization of HTS-based resonators. The revised data analysis process requires minimal user input and organizes the data in a form that is easily accessible by the user for further examination. These programs retrieve data generated during the cryogenic characterization at microwave frequencies of HTS-based resonators and use it to calculate parameters such as the loaded and unloaded quality factors (Q and Q(sub o), respectively), the resonant frequency (f(sub o)), and the coupling coefficient (k), which are important quantities in the evaluation of HTS resonators. While the data are also stored for further use, the programs allow the user to obtain a graphical representation of any of the measured parameters as a function of temperature soon after the completion of the cryogenic measurement cycle. Although these programs were developed to study planar HTS-based resonators operating in the reflection mode, they could also be used in the cryogenic characterization of two-port (i.e., reflection/transmission) resonators.
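One of the quantities such programs extract, the loaded quality factor, can be estimated from a resonance trace as Q = f0 divided by the 3-dB (half-power) bandwidth. The following sketch on a synthetic Lorentzian trace is illustrative Python, not code from the programs described:

```python
# Loaded-Q estimate from a power-vs-frequency trace: locate the peak,
# find the two half-power frequencies by linear interpolation, and take
# Q = f0 / (f_hi - f_lo).

def loaded_q(freqs, power):
    k0 = max(range(len(power)), key=lambda i: power[i])   # resonance peak
    half = power[k0] / 2.0

    def crossing(index_range):
        prev = None
        for i in index_range:
            if prev is not None and (power[prev] - half) * (power[i] - half) <= 0:
                # interpolate between the two samples bracketing half power
                f1, f2 = freqs[prev], freqs[i]
                p1, p2 = power[prev], power[i]
                return f1 + (half - p1) * (f2 - f1) / (p2 - p1)
            prev = i
        return None

    f_lo = crossing(range(k0, -1, -1))        # walk left from the peak
    f_hi = crossing(range(k0, len(power)))    # walk right from the peak
    return freqs[k0] / (f_hi - f_lo)

# Synthetic Lorentzian trace: f0 = 10 GHz, true loaded Q = 1000.
f0, q_true = 10.0e9, 1000.0
freqs = [f0 - 30e6 + i * 0.01e6 for i in range(6001)]
power = [1.0 / (1.0 + (2.0 * q_true * (f - f0) / f0) ** 2) for f in freqs]
q_est = loaded_q(freqs, power)
```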
NASA Astrophysics Data System (ADS)
Mert, Bayram Ali; Dag, Ahmet
2017-12-01
In this study, firstly, a practical and educational geostatistical program (JeoStat) was developed, and then an example analysis of porosity parameter distribution, using oilfield data, was presented. With this program, two- or three-dimensional variogram analysis can be performed using normal, log-normal or indicator-transformed data. In these analyses, JeoStat offers seven commonly used theoretical variogram models (Spherical, Gaussian, Exponential, Linear, Generalized Linear, Hole Effect and Paddington Mix) to the users. These theoretical models can be easily and quickly fitted to experimental models using a mouse. JeoStat uses the ordinary kriging interpolation technique for computation of point or block estimates, and cross-validation testing for validation of the fitted theoretical model. All the results obtained in the analysis, as well as all the graphics such as histograms, variograms and kriging estimation maps, can be saved to the hard drive, including digitised graphics and maps. As such, the numerical values of any point in the map can be monitored using a mouse and text boxes. This program is available to students, researchers, consultants and corporations of any size, free of charge. The JeoStat software package and source code are available at: http://www.jeostat.com/JeoStat_2017.0.rar.
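Of the seven theoretical models listed, the spherical model is the most widely used in practice. A minimal sketch of evaluating it (the nugget, sill, and range values below are hypothetical, and the convention γ(0) = 0 is assumed):

```python
import numpy as np

def spherical_variogram(h, nugget, sill, a):
    """Spherical model: gamma(h) = c0 + c*(1.5*h/a - 0.5*(h/a)**3) for
    0 < h <= a, and c0 + c beyond the range a, with c = sill - nugget;
    gamma(0) = 0 by convention."""
    h = np.asarray(h, dtype=float)
    hr = np.minimum(h / a, 1.0)
    gamma = nugget + (sill - nugget) * (1.5 * hr - 0.5 * hr**3)
    return np.where(h == 0.0, 0.0, gamma)

# Hypothetical porosity variogram: nugget 0.1, sill 1.0, range 300 m
g = spherical_variogram([0.0, 150.0, 300.0, 450.0], 0.1, 1.0, 300.0)
```

Fitting, as in JeoStat, amounts to adjusting the nugget, sill, and range until this curve matches the experimental variogram points.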
Symbolic Analysis of Concurrent Programs with Polymorphism
NASA Technical Reports Server (NTRS)
Rungta, Neha Shyam
2010-01-01
The current trend of multi-core and multi-processor computing is causing a paradigm shift from inherently sequential to highly concurrent and parallel applications. Certain thread interleavings, data input values, or combinations of both often cause errors in the system. Systematic verification techniques such as explicit state model checking and symbolic execution are extensively used to detect errors in such systems [7, 9]. Explicit state model checking enumerates possible thread schedules and input data values of a program in order to check for errors [3, 9]. To partially mitigate the state space explosion from data input values, symbolic execution techniques substitute data input values with symbolic values [5, 7, 6]. Explicit state model checking and symbolic execution techniques used in conjunction with exhaustive search techniques such as depth-first search are unable to detect errors in medium- to large-sized concurrent programs because the number of behaviors caused by data and thread non-determinism is extremely large. We present an overview of abstraction-guided symbolic execution for concurrent programs that detects errors manifested by a combination of thread schedules and data values [8]. The technique generates a set of key program locations relevant in testing the reachability of the target locations. The symbolic execution is then guided along these locations in an attempt to generate a feasible execution path to the error state. This allows the execution to focus on parts of the behavior space more likely to contain an error.
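The core idea of symbolic execution, replacing concrete inputs with symbolic values and accumulating a path condition at each branch, can be sketched for a single integer input whose path condition stays a simple interval. This is a deliberately tiny model (real engines track thread schedules and use constraint solvers over much richer theories):

```python
from itertools import product

def symbolic_paths(branches, lo=float("-inf"), hi=float("inf")):
    """Tiny symbolic-execution sketch for one integer input x. Each
    branch is (op, bound) with op '<' or '>'. Both outcomes of every
    branch are explored; the path condition is kept as an interval on
    x, and paths whose interval becomes empty are pruned as infeasible."""
    feasible = []
    for decisions in product([True, False], repeat=len(branches)):
        cur_lo, cur_hi = lo, hi
        for (op, bound), taken in zip(branches, decisions):
            if op == ">":
                if taken:                        # x > bound
                    cur_lo = max(cur_lo, bound + 1)
                else:                            # x <= bound
                    cur_hi = min(cur_hi, bound)
            else:
                if taken:                        # x < bound
                    cur_hi = min(cur_hi, bound - 1)
                else:                            # x >= bound
                    cur_lo = max(cur_lo, bound)
        if cur_lo <= cur_hi:
            feasible.append((decisions, (cur_lo, cur_hi)))
    return feasible

# if (x > 10) { if (x < 5) { error(); } }  -- the error path is infeasible
paths = symbolic_paths([(">", 10), ("<", 5)])   # 3 feasible paths survive
```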
Old Wine in New Bottles: The Quality of Work Life in Schools and School Districts.
ERIC Educational Resources Information Center
Bacharach, Samuel B.; Mitchell, Stephen M.
This essay reviews quality of work life as a management technique and argues that quality-of-work-life programs, conceptualized multidimensionally, offer a unique mechanism for improving working conditions in schools and within districts. A brief analysis of major management ideologies concludes that some techniques advocated under the label of…
ERIC Educational Resources Information Center
STOLLER, DAVID S.
An international group for the Organization for Economic Cooperation and Development (OECD) met to exchange methods of educational planning, techniques, and progress, and to discuss means of making education available to all socioeconomic levels of society. Rapidly expanding industrial, technological, military, and administrative programs in all…
Conservation and restoration of forested wetlands: new techniques and perspectives
James Johnston; Steve Hartley; Antonio Martucci
2000-01-01
A partnership of state and federal agencies and private organizations is developing advanced spatial analysis techniques applied to the conservation and restoration of forested wetlands. The project goal is to develop an application to assist decisionmakers in defining the eligibility of land sites for entry in the Wetland Reserve Program (WRP) of the U.S. Department of...
Development and evaluation of the impulse transfer function technique
NASA Technical Reports Server (NTRS)
Mantus, M.
1972-01-01
The development of the test/analysis technique known as the impulse transfer function (ITF) method is discussed. This technique, when implemented with proper data processing systems, should become a valuable supplement to conventional dynamic testing and analysis procedures that will be used in the space shuttle development program. The method can relieve many of the problems associated with extensive and costly testing of the shuttle for transient loading conditions. In addition, the time history information derived from impulse testing has the potential for being used to determine modal data for the structure under investigation. The technique could be very useful in determining the time-varying modal characteristics of structures subjected to thermal transients, where conventional mode surveys are difficult to perform.
Benchmarking in Universities: League Tables Revisited
ERIC Educational Resources Information Center
Turner, David
2005-01-01
This paper examines the practice of benchmarking universities using a "league table" approach. Taking the example of the "Sunday Times University League Table", the author reanalyses the descriptive data on UK universities. Using a linear programming technique, data envelope analysis (DEA), the author uses the re-analysis to…
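Data envelope analysis casts each institution's efficiency as a linear program: in the input-oriented CCR multiplier form, one maximizes the weighted outputs of the unit under evaluation subject to its weighted inputs equalling one and no unit exceeding unit efficiency. A minimal sketch, assuming SciPy's `linprog` is available (the university input/output figures below are hypothetical):

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0, multiplier form:
    maximize u . y0 subject to v . x0 = 1 and u . Yj - v . Xj <= 0
    for every unit j, with u, v >= 0."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[j0], np.zeros(m)])        # linprog minimizes
    A_ub = np.hstack([Y, -X])                        # u.Yj - v.Xj <= 0
    A_eq = np.concatenate([np.zeros(s), X[j0]])[None, :]
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=[(0, None)] * (s + m))
    return -res.fun

# Three hypothetical universities: one input (spend), one output (score)
X = [[10.0], [20.0], [40.0]]
Y = [[5.0], [15.0], [20.0]]
eff = [dea_efficiency(X, Y, j) for j in range(3)]   # unit 1 is efficient
```

With a single input and output this reduces to comparing output/input ratios against the best performer; the LP form generalizes to many inputs and outputs, which is what makes it attractive for re-analyzing league-table data.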
Suspect screening (SSA) and non-targeted analysis (NTA) methods using high-resolution mass spectrometry (HRMS) offer new approaches to efficiently generate exposure data for chemicals in a variety of environmental and biological media. These techniques aid characterization of the...
Static analysis of class invariants in Java programs
NASA Astrophysics Data System (ADS)
Bonilla-Quintero, Lidia Dionisia
2011-12-01
This paper presents a technique for the automatic inference of class invariants from Java bytecode. Class invariants are very important both for compiler optimization and as an aid to programmers in their efforts to reduce the number of software defects. We present the original DC-invariant analysis from Adam Webber, discuss its shortcomings, and suggest several different ways to improve it. To apply the DC-invariant analysis to identify DC-invariant assertions, all that one needs is a monotonic method analysis function and a suitable assertion domain. The DC-invariant algorithm is very general; however, the method analysis can be highly tuned to the problem at hand. For example, one could choose shape analysis as the method analysis function and use the DC-invariant analysis simply to extend it to an analysis that would yield class-wide invariants describing the shapes of linked data structures. We have a prototype implementation: a system we refer to as "the analyzer" that infers DC-invariant unary and binary relations and provides them to the user in a human-readable format. The analyzer uses those relations to identify unnecessary array bounds checks in Java programs and to perform null-reference analysis. It uses Adam Webber's relational constraint technique for the class-invariant binary relations. Early results with the analyzer were very imprecise in the presence of "dirty-called" methods. A dirty-called method is one that is called, either directly or transitively, from any constructor of the class, or from any method of the class at a point at which a disciplined field has been altered. This result was unexpected and forced an extensive search for improved techniques. An important contribution of this paper is the suggestion of several ways to improve the results by changing the way dirty-called methods are handled. The new techniques expand the set of class invariants that can be inferred over Webber's original results.
The technique that produces the best results uses in-line analysis. Final results are promising: we can infer sound class invariants for full-scale applications, not just toy ones.
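The flavor of such invariant inference can be sketched as a fixed-point computation over an interval domain: a property holds as a class invariant if it is established by the constructor and preserved by every method. This toy illustrates the general idea of a monotonic method analysis, not Webber's DC-invariant algorithm itself, and the class and its methods are hypothetical:

```python
def infer_invariant(method_effects, init):
    """Toy class-invariant inference over an interval domain: start
    from the constructor's interval for a field and repeatedly apply
    each method's transfer function, joining the result in, until a
    fixed point is reached. The fixed point holds across every method
    call sequence, i.e. it is a class invariant."""
    inv = init
    changed = True
    while changed:
        changed = False
        for effect in method_effects:
            lo, hi = effect(inv)
            joined = (min(inv[0], lo), max(inv[1], hi))
            if joined != inv:
                inv, changed = joined, True
    return inv

# Hypothetical field `size`: the constructor sets it to 0, add()
# increments it but a guard caps it at 10, clear() resets it to 0.
add = lambda iv: (min(iv[0] + 1, 10), min(iv[1] + 1, 10))
clear = lambda iv: (0, 0)
inv = infer_invariant([add, clear], init=(0, 0))   # -> (0, 10)
```

An invariant like 0 <= size <= 10 is exactly the kind of fact that lets an optimizer remove array bounds checks on accesses indexed by `size`.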
Symbolic Execution Enhanced System Testing
NASA Technical Reports Server (NTRS)
Davies, Misty D.; Pasareanu, Corina S.; Raman, Vishwanath
2012-01-01
We describe a testing technique that uses information computed by symbolic execution of a program unit to guide the generation of inputs to the system containing the unit, in such a way that the unit's, and hence the system's, coverage is increased. The symbolic execution computes unit constraints at run-time, along program paths obtained by system simulations. We use machine learning techniques, treatment learning and function fitting, to approximate the system input constraints that will lead to the satisfaction of the unit constraints. Execution of system input predictions either uncovers new code regions in the unit under analysis or provides information that can be used to improve the approximation. We have implemented the technique and demonstrated its effectiveness on several examples, including one from the aerospace domain.
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tool/technique requirements that would support specific ESDA goal types. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
NASA Technical Reports Server (NTRS)
Anderson, B. H.
1983-01-01
A broad program to develop advanced, reliable, and user-oriented three-dimensional viscous design techniques for supersonic inlet systems, and to encourage their transfer into the general user community, is discussed. Features of the program include: (1) develop effective methods of computing three-dimensional flows within a zonal modeling methodology; (2) ensure reasonable agreement between said analysis and selective sets of benchmark validation data; (3) develop user orientation into said analysis; and (4) explore and develop advanced numerical methodology.
Program for an improved hypersonic temperature-sensing probe
NASA Technical Reports Server (NTRS)
Reilly, Richard J.
1993-01-01
Under a NASA Dryden-sponsored contract in the mid 1960s, temperatures of up to 2200 C were successfully measured using a fluid oscillator. The current program, although limited in scope, explores the problem areas which must be solved if this technique is to be extended to 10,000 R. The potential for measuring extremely high temperatures, using fluid oscillator techniques, stems from the fact that the measuring element is the fluid itself. The containing structure of the oscillator need not be brought to equilibrium temperature with the fluid for temperature measurement, provided that a suitable calibration can be arranged. This program concentrated on review of high-temperature material developments since the original program was completed. Other areas of limited study included related pressure instrumentation requirements, dissociation, rarefied gas effects, and analysis of sensor time response.
Floating-point system quantization errors in digital control systems
NASA Technical Reports Server (NTRS)
Phillips, C. L.; Vallely, D. P.
1978-01-01
This paper considers digital controllers (filters) operating in floating-point arithmetic in either open-loop or closed-loop systems. A quantization error analysis technique is developed, and is implemented by a digital computer program that is based on a digital simulation of the system. The program can be integrated into existing digital simulations of a system.
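The flavor of such an analysis can be sketched by simulating the same filter twice, once in full double precision and once with every arithmetic result rounded to a short mantissa, then comparing the outputs. This digital simulation stands in for the paper's error-analysis technique rather than reproducing it, and the filter coefficients and 12-bit wordlength below are hypothetical:

```python
import math

def quantize(x, bits):
    """Round x to the nearest float with `bits` mantissa bits,
    mimicking a short-wordlength floating-point arithmetic unit."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)                 # x = m * 2**e with 0.5 <= |m| < 1
    return math.ldexp(round(m * 2**bits) / 2**bits, e)

def filter_response(u, a, b, bits=None):
    """First-order digital filter y[k] = a*y[k-1] + b*u[k]; if `bits`
    is given, every arithmetic result is quantized, as in a
    short-wordlength floating-point controller."""
    q = (lambda v: quantize(v, bits)) if bits else (lambda v: v)
    y, out = 0.0, []
    for uk in u:
        y = q(q(a * y) + q(b * uk))
        out.append(y)
    return out

u = [1.0] * 50
exact = filter_response(u, a=0.9, b=0.1)            # double precision
coarse = filter_response(u, a=0.9, b=0.1, bits=12)  # 12-bit mantissa
err = max(abs(e - c) for e, c in zip(exact, coarse))
```

Comparing `err` across wordlengths and feedback coefficients shows how quantization noise is amplified by the closed-loop dynamics, which is the quantity such an analysis technique aims to bound.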
Hsieh, Ming-Yeh; Lynch, Georgina; Madison, Charles
2018-04-27
This study examined intervention techniques used with children with autism spectrum disorder (ASD) by speech-language pathologists (SLPs) in the United States and Taiwan working in clinic/hospital settings. The research questions addressed intervention techniques used with children with ASD, intervention techniques used with different age groups (under and above 8 years old), and training received before using the intervention techniques. The survey was distributed through the American Speech-Language-Hearing Association to selected SLPs across the United States. In Taiwan, the survey (Chinese version) was distributed through the Taiwan Speech-Language Pathologist Union to certified SLPs. Results revealed that SLPs in the United States and Taiwan used four common intervention techniques: Social Skill Training, Augmentative and Alternative Communication, Picture Exchange Communication System, and Social Stories. Taiwanese SLPs reported SLP preparation program training across these common intervention strategies. In the United States, SLPs reported training via SLP preparation programs, training from peer therapists, and self-teaching. Most SLPs reported using established or emerging evidence-based practices as defined by the National Professional Development Center (2014) and the National Standards Report (2015). Future research should address comparison of SLP preparation programs to examine the impact of preprofessional training on use of evidence-based practices to treat ASD.
Gleberzon, Brian J
2002-01-01
In a previous article, the author reported on the recommendations gathered from student projects between 1996 and 1999 investigating their preferences for including certain chiropractic Name technique systems in the curriculum at the Canadian Memorial Chiropractic College (CMCC). These results were found to be congruent with the professional treatment techniques used by Canadian chiropractors. This article reports on the data obtained during the 2000 and 2001 academic years, comparing these results to those previously gathered. In addition, because of the implementation of a new curriculum during this time period, there was a unique opportunity to observe whether or not student perceptions differed between those students in the "old" curricular program and those students in the "new" curricular program. The results gathered indicate that students in both curricular programs show an interest in learning Thompson Terminal Point, Activator Methods, Gonstead, and Active Release Therapy techniques in the core curriculum, as an elective, or during continuing educational programs provided by the college. Students continue to show less interest in learning CranioSacral Therapy, SacroOccipital Technique, Logan Basic, Applied Kinesiology and Chiropractic BioPhysics. Over time, student interest has moved away from Palmer HIO and other upper cervical techniques, and students show a declining interest in being offered instruction in either Network Spinal Analysis or Torque Release Technique. Since these findings reflect the practice activities of Canadian chiropractors, they may have implications not only for pedagogical decision-making processes at CMCC, but also for professional standards of care.
Analysis of Thick Sandwich Shells with Embedded Ceramic Tiles
NASA Technical Reports Server (NTRS)
Davila, Carlos G.; Smith, C.; Lumban-Tobing, F.
1996-01-01
The Composite Armored Vehicle (CAV) is an advanced technology demonstrator of an all-composite ground combat vehicle. The CAV upper hull is made of a tough light-weight S2-glass/epoxy laminate with embedded ceramic tiles that serve as armor. The tiles are bonded to a rubber mat with a carefully selected, highly viscoelastic adhesive. The integration of armor and structure offers an efficient combination of ballistic protection and structural performance. The analysis of this anisotropic construction, with its inherent discontinuous and periodic nature, however, poses several challenges. The present paper describes a shell-based 'element-layering' technique that properly accounts for these effects and for the concentrated transverse shear flexibility in the rubber mat. One of the most important advantages of the element-layering technique over advanced higher-order elements is that it is based on conventional elements. This advantage allows the models to be portable to other structural analysis codes, a prerequisite in a program that involves the computational facilities of several manufacturers and government laboratories. The element-layering technique was implemented into an auto-layering program that automatically transforms a conventional shell model into a multi-layered model. The effects of tile layer homogenization, tile placement patterns, and tile gap size on the analysis results are described.
NASA Technical Reports Server (NTRS)
Wingrove, R. C.
1994-01-01
This program was developed by Ames Research Center, in cooperation with the National Transportation Safety Board, as a technique for deriving time histories of an aircraft's motion from Air Traffic Control (ATC) radar records. This technique uses the radar range and azimuth data, along with the downlinked altitude data, to derive an expanded set of data which includes airspeed, lift, attitude angles (pitch, roll, and heading), etc. This technique should prove useful as a source of data in the investigation of commercial airline accidents and in the analysis of accidents involving aircraft which do not have onboard data recorders (e.g., military, short-haul, and general aviation). The technique used to determine the aircraft motions involves smoothing of raw radar data. These smoothed results, in combination with other available information (wind profiles and aircraft performance data), are used to derive the expanded set of data. This program uses a cubic least-square fit to smooth the raw data. This moving-arc procedure provides a smoothed time history of the aircraft position, the inertial velocities, and accelerations. Using known winds, these inertial data are transformed to aircraft stability axes to provide true airspeed, thrust-drag, lift, and roll angle. Further derivation, based on aircraft dependent performance data, can determine the aircraft angle of attack, pitch, and heading angle. Results of experimental tests indicate that values derived from ATC radar records using this technique agree favorably with airborne measurements. This program is written in FORTRAN IV to be executed in the batch mode, and has been implemented on a CDC 6000 series computer with a central memory requirement of 64k (octal) of 60 bit words.
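The moving-arc smoothing step described above can be sketched as a sliding-window cubic least-squares fit whose coefficients at the window center give position, velocity, and acceleration directly (the synthetic radar track, noise level, and window size below are hypothetical, not taken from the program):

```python
import numpy as np

def moving_arc_smooth(t, x, half_window=10):
    """Moving-arc smoothing: at each sample, fit a cubic least-squares
    polynomial over a sliding window centered (where possible) on the
    sample, then read position, velocity, and acceleration off the
    fitted coefficients at the window center."""
    pos, vel, acc = [], [], []
    for i in range(len(t)):
        lo = max(0, i - half_window)
        hi = min(len(t), i + half_window + 1)
        c = np.polyfit(t[lo:hi] - t[i], x[lo:hi], 3)  # cubic in tau = t - t[i]
        pos.append(c[3])        # fitted position at tau = 0
        vel.append(c[2])        # first derivative at tau = 0
        acc.append(2.0 * c[1])  # second derivative at tau = 0
    return np.array(pos), np.array(vel), np.array(acc)

# Hypothetical noisy radar track for x(t) = 100 t - 2 t^2 (m, s)
t = np.linspace(0.0, 10.0, 101)
rng = np.random.default_rng(0)
x = 100.0 * t - 2.0 * t**2 + rng.normal(0.0, 0.2, t.size)
pos, vel, acc = moving_arc_smooth(t, x)
```

The same smoothed derivatives, applied to each position coordinate and rotated into aircraft stability axes with known winds, are what yield the airspeed, lift, and attitude estimates described above.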
Automata-Based Verification of Temporal Properties on Running Programs
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Havelund, Klaus; Lan, Sonie (Technical Monitor)
2001-01-01
This paper presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL to Buchi automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
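For the common response property G(request → F grant), an observer over finite traces reduces to tracking whether a request is still awaiting its grant, with the finite-trace semantics rejecting any execution that ends with a pending request. A minimal hand-written sketch of such an observer (illustrating the monitoring idea only, not the paper's LTL-to-automata translation algorithm):

```python
def monitor_response(trace, trigger, response):
    """Finite-trace observer for G(trigger -> F response): a two-state
    automaton tracking whether a trigger is awaiting its response.
    A step containing both events counts as answered. The finite trace
    satisfies the property iff nothing is pending at its end."""
    pending = False                      # the automaton's single bit of state
    for events in trace:                 # each step is a set of event names
        if response in events:
            pending = False
        elif trigger in events:
            pending = True
    return not pending

ok = monitor_response([{"req"}, set(), {"ack"}], "req", "ack")       # True
bad = monitor_response([{"req"}, {"ack"}, {"req"}], "req", "ack")    # False
```

A runtime-analysis framework feeds the observer one step per program event, so a violation is flagged as soon as the trace ends (or, for safety properties, as soon as a bad prefix is seen).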
Large Advanced Space Systems (LASS) computer-aided design program additions
NASA Technical Reports Server (NTRS)
Farrell, C. E.
1982-01-01
The LSS preliminary and conceptual design requires extensive iterative analysis because of the effects of structural, thermal, and control intercoupling. A computer-aided design program that permits integrating and interfacing the required large space system (LSS) analyses is discussed. The primary objective of this program is the implementation of modeling techniques and analysis algorithms that permit interactive design and tradeoff studies of LSS concepts. Eight software modules were added to the program. The existing rigid-body controls module was modified to include solar pressure effects. The new model generator modules and the appendage synthesizer module are integrated (interfaced) to permit interactive definition and generation of LSS concepts. The mass properties module permits interactive specification of discrete masses and their locations. The other modules permit interactive analysis of orbital transfer requirements, the antenna primary beam, and attitude control requirements.
Program Analysis Techniques for Efficient Software Model Checking
2011-02-28
MSFC Skylab experimenter's reference
NASA Technical Reports Server (NTRS)
1974-01-01
The methods and techniques for experiment development and integration that evolved during the Skylab Program are described to facilitate transferring this experience to experimenters in future manned space programs. Management responsibilities and the sequential process of experiment evolution from initial concept through definition, development, integration, operation and postflight analysis are outlined in the main text and amplified, as appropriate, in appendixes. Emphasis is placed on specific lessons learned on Skylab that are worthy of consideration by future programs.
Linear combination reading program for capture gamma rays
Tanner, Allan B.
1971-01-01
This program computes a weighting function, Qj, which gives a scalar output value of unity when applied to the spectrum of a desired element and a minimum value (considering statistics) when applied to spectra of materials not containing the desired element. Intermediate values are obtained for materials containing the desired element, in proportion to the amount of the element they contain. The program is written in the BASIC language in a format specific to the Hewlett-Packard 2000A Time-Sharing System, and is an adaptation of an earlier program for linear combination reading for X-ray fluorescence analysis (Tanner and Brinkerhoff, 1971). Following the program is a sample run from a study of the application of the linear combination technique to capture-gamma-ray analysis for calcium (report in preparation).
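A weighting function of this kind can be sketched as a minimum-norm solution of linear constraints: unit response to the desired element's reference spectrum and zero response to the others (the five-channel spectra below are hypothetical illustrations, not data from the report, and the statistical-variance minimization is simplified to a minimum-norm criterion):

```python
import numpy as np

def combination_weights(spectra, desired):
    """Minimum-norm weighting function w such that w . S_desired = 1
    and w . S_other = 0 for every other reference spectrum, so the
    scalar readout w . x tracks the desired element's contribution."""
    S = np.asarray(spectra, dtype=float)     # rows are reference spectra
    target = np.zeros(S.shape[0])
    target[desired] = 1.0
    w, *_ = np.linalg.lstsq(S, target, rcond=None)  # minimum-norm solution
    return w

# Hypothetical 5-channel capture-gamma reference spectra (Ca, Fe, Si):
S = np.array([[4.0, 1.0, 0.2, 0.0, 0.1],
              [0.5, 3.0, 1.0, 0.2, 0.0],
              [0.1, 0.4, 2.5, 1.5, 0.3]])
w = combination_weights(S, desired=0)
readout = S @ w                          # ~ [1, 0, 0]
mix = (0.6 * S[0] + 0.4 * S[1]) @ w      # ~ 0.6 for a 60% Ca mixture
```

As the abstract describes, a mixed sample then reads out in proportion to its content of the desired element, which is the `mix` value above.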
Structural performance analysis and redesign
NASA Technical Reports Server (NTRS)
Whetstone, W. D.
1978-01-01
Program performs stress, buckling, and vibrational analysis of large, linear, finite-element systems in excess of 50,000 degrees of freedom. Cost, execution time, and storage requirements are kept reasonable through the use of sparse-matrix solution techniques and other computational and data management procedures designed for problems of very large size.
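The sparse-matrix approach can be illustrated with a modern equivalent: storing only the nonzero diagonals of a banded stiffness matrix and using a sparse direct solver, so memory grows linearly rather than quadratically with the number of degrees of freedom. A sketch assuming SciPy is available (the 1-D tridiagonal system below is a stand-in for a real finite-element model):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

# Sparse tridiagonal "stiffness" system K u = f for a 1-D bar with
# fixed ends: only the three nonzero diagonals are stored.
n = 1001
K = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
f = np.zeros(n)
f[n // 2] = 1.0            # unit load at mid-span
u = spsolve(K, f)          # sparse direct factorization and solve
```

A dense n-by-n matrix at this size would already need about a thousand times more storage; at 50,000 degrees of freedom the dense approach is infeasible, which is exactly the motivation stated in the abstract.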
Methods for the design and analysis of power optimized finite-state machines using clock gating
NASA Astrophysics Data System (ADS)
Chodorowski, Piotr
2017-11-01
The paper discusses two methods for designing power-optimized FSMs, both using clock gating techniques. The main objective of the research was to write a program capable of automatically generating hardware descriptions of finite-state machines in VHDL, together with testbenches to support power analysis. The creation of the relevant output files is detailed step by step. The program was tested using the LGSynth91 FSM benchmark package. An analysis of the generated circuits shows that the second method presented in this paper leads to a significant reduction in power consumption.
Methods for evaluating a mature substance abuse prevention/early intervention program.
Becker, L R; Hall, M; Fisher, D A; Miller, T R
2000-05-01
The authors describe methods for work in progress to evaluate four workplace prevention and/or early intervention programs designed to change occupational norms and reduce substance abuse at a major U.S. transportation company. The four programs are an employee assistance program, random drug testing, managed behavioral health care, and a peer-led intervention program. An elaborate mixed-methods evaluation combines data collection and analysis techniques from several traditions. A process-improvement evaluation focuses on the peer-led component to describe its evolution, document the implementation process for those interested in replicating it, and provide information for program improvement. An outcome-assessment evaluation examines impacts of the four programs on job performance measures (e.g., absenteeism, turnover, injury, and disability rates) and includes a cost-offset and employer cost-savings analysis. Issues related to using archival data, combining qualitative and quantitative designs, and working in a corporate environment are discussed.
Tools and techniques for computational reproducibility.
Piccolo, Stephen R; Frampton, Michael B
2016-07-11
When reporting research findings, scientists document the steps they followed so that others can verify and build upon the research. When those steps have been described in sufficient detail that others can retrace the steps and obtain similar results, the research is said to be reproducible. Computers play a vital role in many research disciplines and present both opportunities and challenges for reproducibility. Computers can be programmed to execute analysis tasks, and those programs can be repeated and shared with others. The deterministic nature of most computer programs means that the same analysis tasks, applied to the same data, will often produce the same outputs. However, in practice, computational findings often cannot be reproduced because of complexities in how software is packaged, installed, and executed-and because of limitations associated with how scientists document analysis steps. Many tools and techniques are available to help overcome these challenges; here we describe seven such strategies. With a broad scientific audience in mind, we describe the strengths and limitations of each approach, as well as the circumstances under which each might be applied. No single strategy is sufficient for every scenario; thus we emphasize that it is often useful to combine approaches.
Transonic Free-To-Roll Analysis of the F/A-18E and F-35 Configurations
NASA Technical Reports Server (NTRS)
Owens, D. Bruce; McConnell, Jeffrey K.; Brandon, Jay M.; Hall, Robert M.
2004-01-01
The free-to-roll technique is used as a tool for predicting areas of uncommanded lateral motions. Recently, the NASA/Navy/Air Force Abrupt Wing Stall Program extended the use of this technique to the transonic speed regime. Using this technique, this paper evaluates various wing configurations on the pre-production F/A-18E aircraft and the Joint Strike Fighter (F-35) aircraft. The configurations investigated include leading and trailing edge flap deflections, fences, leading edge flap gap seals, and vortex generators. These tests were conducted in the NASA Langley 16-Foot Transonic Tunnel. The analysis used a modification of a figure-of-merit developed during the Abrupt Wing Stall Program to discern configuration effects. The results showed how the figure-of-merit can be used to schedule wing flap deflections to avoid areas of uncommanded lateral motion. The analysis also used both static and dynamic wind tunnel data to provide insight into the uncommanded lateral behavior. The dynamic data was extracted from the time history data using parameter identification techniques. In general, modifications to the pre-production F/A-18E resulted in shifts in angle-of-attack where uncommanded lateral activity occurred. Sealing the gap between the inboard and outboard leading-edge flaps on the Navy version of the F-35 eliminated uncommanded lateral activity or delayed the activity to a higher angle-of-attack.
NASA Technical Reports Server (NTRS)
Bennett, R. L.
1975-01-01
The analytical techniques and computer program developed in the fully-coupled rotor vibration study are described. The rotor blade natural frequency and mode shape analysis was implemented in a digital computer program designated DF1758. The program computes collective, cyclic, and scissor modes for a single blade within a specified range of frequency for specified values of rotor RPM and collective angle. The analysis includes effects of blade twist, cg offset from reference axis, and shear center offset from reference axis. Coupled inplane, out-of-plane, and torsional vibrations are considered. Normalized displacements, shear forces and moments may be printed out and Calcomp plots of natural frequencies as a function of rotor RPM may be produced.
Microgravity sciences application visiting scientist program
NASA Technical Reports Server (NTRS)
Glicksman, Martin; Vanalstine, James
1995-01-01
Marshall Space Flight Center pursues scientific research in the area of low-gravity effects on materials and processes. To support these Government-performed research responsibilities, a number of supplementary research tasks were accomplished by a group of specialized visiting scientists. They participated in work on contemporary research problems with specific objectives related to current or future space flight experiments, and they defined and established independent research programs, grounded in scientific peer review and in the relevance of the research to NASA microgravity goals, thereby implementing a portion of the national program. The programs included research in the following areas: protein crystal growth, X-ray crystallography and computer analysis of protein crystal structure, optimization and analysis of protein crystal growth techniques, and design and testing of flight hardware.
Application of Decomposition to Transportation Network Analysis
DOT National Transportation Integrated Search
1976-10-01
This document reports preliminary results of five potential applications of the decomposition techniques from mathematical programming to transportation network problems. The five application areas are (1) the traffic assignment problem with fixed de...
NASA Technical Reports Server (NTRS)
Puttkamer, J. V.
1973-01-01
An analysis has been conducted to find out whether the management techniques developed in connection with the Apollo project could be used for dealing with such urgent problems of modern society as the crisis of the cities, the increasing environmental pollution, and the steadily growing traffic. Basic concepts and definitions of program and system management are discussed together with details regarding the employment of these concepts in connection with the solution of the problems of the Apollo program. Principles and significance of a systems approach are considered, giving attention to planning, system analysis, system integration, and project management. An application of the methods of project management to the problems of the civil sector is possible if the special characteristics of each particular case are taken into account.
Development of an Optimum Interpolation Analysis Method for the CYBER 205
NASA Technical Reports Server (NTRS)
Nestler, M. S.; Woollen, J.; Brin, Y.
1985-01-01
A state-of-the-art technique to assimilate the diverse observational database obtained during FGGE, and thus to create initial conditions for numerical forecasts, is described. The GLA optimum interpolation (OI) analysis method analyzes pressure, winds, and temperature at sea level; mixing ratio at six mandatory pressure levels up to 300 mb; and heights and winds at twelve levels up to 50 mb. Conversion to the CYBER 205 required a major rewrite of the Amdahl OI code to take advantage of the CYBER vector processing capabilities. Structured programming methods were used to write the programs, resulting in a modular, understandable code. Among the contributors to the increased speed of the CYBER code are a vectorized covariance-calculation routine, an extremely fast matrix equation solver, and an innovative data search and sort technique.
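The OI update that such an analysis performs can be sketched in a few lines (a generic optimum-interpolation formula with toy numbers, not the GLA or CYBER 205 code; all names are illustrative):

```python
import numpy as np

def oi_analysis(xb, y, H, B, R):
    """Generic optimum-interpolation (OI) update: blend a background
    field xb with observations y using a background-error covariance B
    and an observation-error covariance R."""
    # Gain matrix: W = B H^T (H B H^T + R)^(-1)
    W = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + W @ (y - H @ xb)

# Two grid points, one observation located at the first point.
xb = np.array([10.0, 12.0])              # background (first guess)
H  = np.array([[1.0, 0.0]])              # observation operator
B  = np.array([[1.0, 0.5], [0.5, 1.0]])  # correlated background errors
R  = np.array([[1.0]])                   # observation error variance
xa = oi_analysis(xb, np.array([12.0]), H, B, R)
# The observed point moves halfway toward the observation (gain 0.5),
# and the correlated neighbouring point is nudged as well.
```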
Parallel line analysis: multifunctional software for the biomedical sciences
NASA Technical Reports Server (NTRS)
Swank, P. R.; Lewis, M. L.; Damron, K. L.; Morrison, D. R.
1990-01-01
An easy to use, interactive FORTRAN program for analyzing the results of parallel line assays is described. The program is menu driven and consists of five major components: data entry, data editing, manual analysis, manual plotting, and automatic analysis and plotting. Data can be entered from the terminal or from previously created data files. The data editing portion of the program is used to inspect and modify data and to statistically identify outliers. The manual analysis component is used to test the assumptions necessary for parallel line assays using analysis of covariance techniques and to determine potency ratios with confidence limits. The manual plotting component provides a graphic display of the data on the terminal screen or on a standard line printer. The automatic portion runs through multiple analyses without operator input. Data may be saved in a special file to expedite input at a future time.
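The core potency computation of a parallel line assay, fitting a common slope to the standard and test dose-response lines and reading the relative potency off the horizontal shift between them, can be sketched as follows (hypothetical data and function names, not the FORTRAN program itself):

```python
def potency_ratio(std, test):
    """Parallel-line assay sketch. `std` and `test` are lists of
    (log10_dose, response) pairs; returns the estimated relative
    potency of the test preparation. Illustrative only."""
    def stats(pts):
        n = len(pts)
        mx = sum(x for x, _ in pts) / n
        my = sum(y for _, y in pts) / n
        sxx = sum((x - mx) ** 2 for x, _ in pts)
        sxy = sum((x - mx) * (y - my) for x, y in pts)
        return mx, my, sxx, sxy

    mxs, mys, sxxs, sxys = stats(std)
    mxt, myt, sxxt, sxyt = stats(test)
    b = (sxys + sxyt) / (sxxs + sxxt)            # pooled common slope
    # Horizontal displacement M between the two parallel lines:
    M = ((myt - b * mxt) - (mys - b * mxs)) / b
    return 10 ** M                               # relative potency

# Hypothetical assay: the test line is shifted up by 3 response units
# against a common slope of 10, so M = 0.3 and the ratio is 10^0.3.
std  = [(0.0, 10.0), (1.0, 20.0), (2.0, 30.0)]
test = [(0.0, 13.0), (1.0, 23.0), (2.0, 33.0)]
```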
CORSS: Cylinder Optimization of Rings, Skin, and Stringers
NASA Technical Reports Server (NTRS)
Finckenor, J.; Rogers, P.; Otte, N.
1994-01-01
Launch vehicle designs typically make extensive use of cylindrical skin stringer construction. Structural analysis methods are well developed for preliminary design of this type of construction. This report describes an automated, iterative method to obtain a minimum weight preliminary design. Structural optimization has been researched extensively, and various programs have been written for this purpose. Their complexity and ease of use depends on their generality, the failure modes considered, the methodology used, and the rigor of the analysis performed. This computer program employs closed-form solutions from a variety of well-known structural analysis references and joins them with a commercially available numerical optimizer called the 'Design Optimization Tool' (DOT). Any ring and stringer stiffened shell structure of isotropic materials that has beam type loading can be analyzed. Plasticity effects are not included. It performs a more limited analysis than programs such as PANDA, but it provides an easy and useful preliminary design tool for a large class of structures. This report briefly describes the optimization theory, outlines the development and use of the program, and describes the analysis techniques that are used. Examples of program input and output, as well as the listing of the analysis routines, are included.
Coupled rotor/airframe vibration analysis
NASA Technical Reports Server (NTRS)
Sopher, R.; Studwell, R. E.; Cassarino, S.; Kottapalli, S. B. R.
1982-01-01
A coupled rotor/airframe vibration analysis, developed as a design tool for predicting helicopter vibrations and as a research tool to quantify the effects of structural properties, aerodynamic interactions, and vibration reduction devices on vehicle vibration levels, is described. The analysis consists of a base program utilizing an impedance matching technique to represent the coupled rotor/airframe dynamics of the system, supported by inputs from several external programs supplying sophisticated rotor and airframe aerodynamic and structural dynamic representations. The theoretical background, computer program capabilities, and limited correlation results are presented in this report. Correlation results using scale model wind tunnel results show that the analysis can adequately predict trends of vibration variations with airspeed and higher harmonic control effects. Predictions of absolute values of vibration levels were found to be very sensitive to modal characteristics, and results were not representative of measured values.
Krstacic, Goran; Krstacic, Antonija; Smalcelj, Anton; Milicic, Davor; Jembrek-Gostovic, Mirjana
2007-04-01
Dynamic analysis techniques may quantify abnormalities in heart rate variability (HRV) based on nonlinear and fractal analysis (chaos theory). The article emphasizes the clinical and prognostic significance of dynamic changes in short-time series applied to patients with coronary heart disease (CHD) during the exercise electrocardiograph (ECG) test. The subjects were included in the series after complete cardiovascular diagnostic evaluation. Series of R-R and ST-T intervals were obtained from exercise ECG data after digital sampling. The rescaled range analysis method determined the fractal dimension of the intervals. To quantify the fractal long-range correlation properties of heart rate variability, the detrended fluctuation analysis technique was used. Approximate entropy (ApEn) was applied to quantify the regularity and complexity of the time series, as well as the unpredictability of fluctuations in the time series. It was found that the short-term fractal scaling exponent (alpha(1)) is significantly lower in patients with CHD (0.93 +/- 0.07 vs 1.09 +/- 0.04; P < 0.001). The patients with CHD had a higher fractal dimension in each exercise test stage separately, as well as in the exercise program as a whole. ApEn was significantly lower in the CHD group in both R-R and ST-T ECG intervals (P < 0.001). The nonlinear dynamic methods could also have clinical and prognostic applicability in short-time ECG series. Dynamic analysis based on chaos theory during the exercise ECG test points to multifractal time series in CHD patients, who lose normal fractal characteristics and regularity in HRV. Nonlinear analysis techniques may complement traditional ECG analysis.
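The approximate entropy (ApEn) measure applied in the study can be sketched generically (a textbook implementation, not the authors' code; the template length m and tolerance r are illustrative choices):

```python
import math

def approximate_entropy(series, m=2, r=0.2):
    """ApEn sketch for a short interval series: compare how often
    length-m templates match (within tolerance r, Chebyshev distance)
    against how often the matches persist at length m + 1."""
    def phi(m):
        n = len(series) - m + 1
        templates = [series[i:i + m] for i in range(n)]
        counts = []
        for t in templates:
            # Count templates matching t (self-match included).
            c = sum(max(abs(a - b) for a, b in zip(t, u)) <= r
                    for u in templates)
            counts.append(c / n)
        return sum(math.log(c) for c in counts) / n
    return phi(m) - phi(m + 1)

# A perfectly regular (alternating) series has ApEn near zero,
# whereas an irregular series scores higher.
regular = [1.0, 2.0] * 20
print(approximate_entropy(regular, m=2, r=0.5))
```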
NASA Technical Reports Server (NTRS)
Dunn, A. R.
1975-01-01
Computer techniques for data analysis of sunspot observations are presented. Photographic spectra were converted to digital form and analyzed. Methods of determining magnetic field strengths, i.e., the Zeeman effect, are discussed. Errors originating with telescope equipment and the magnetograph are treated. Flow charts of test programs and procedures of the data analysis are shown.
ERIC Educational Resources Information Center
Aragón, Sonia; Lapresa, Daniel; Arana, Javier; Anguera, M. Teresa; Garzón, Belén
2017-01-01
Polar coordinate analysis is a powerful data reduction technique based on the Zsum statistic, which is calculated from adjusted residuals obtained by lag sequential analysis. Its use has been greatly simplified since the addition of a module in the free software program HOISAN for performing the necessary computations and producing…
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis
Steele, Joe; Bastola, Dhundy
2014-01-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base–base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel–Ziv techniques from data compression. PMID:23904502
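A minimal word-analysis sketch of the D2 statistic discussed above, the unnormalized inner product of two sequences' k-mer count vectors (illustrative only; real tools normalize and correct for composition):

```python
from collections import Counter

def kmer_counts(seq, k):
    """Count overlapping k-mers (words) in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2_statistic(a, b, k=3):
    """Alignment-free D2 similarity: inner product of the k-mer count
    vectors of two sequences. Shared words raise the score; no
    alignment (and hence no dynamic programming) is involved."""
    ca, cb = kmer_counts(a, k), kmer_counts(b, k)
    return sum(ca[w] * cb[w] for w in ca if w in cb)

s1 = "ACGTACGTAC"
s2 = "ACGTTTTTTT"
print(d2_statistic(s1, s2, k=3))   # shares ACG and CGT (2*1 each) -> 4
```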
Probabilistic/Fracture-Mechanics Model For Service Life
NASA Technical Reports Server (NTRS)
Watkins, T., Jr.; Annis, C. G., Jr.
1991-01-01
Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.
NASA Technical Reports Server (NTRS)
1988-01-01
Having recognized at an early stage the critical importance of maintaining detector capabilities which utilize state of the art techniques, a joint program was formulated. This program has involved coordination of a broad range of efforts and activities including joint experiments, collaboration in theoretical studies, instrument design, calibrations, and data analysis. Summaries of the progress made to date are presented. A representative bibliography is also included.
Health economics evaluation of a gastric cancer early detection and treatment program in China.
Li, Dan; Yuan, Yuan; Sun, Li-Ping; Fang, Xue; Zhou, Bao-Sen
2014-01-01
To use health economics methodology to assess the gastric cancer screening program in Zhuanghe, China, and so provide a basis for health policy decisions on expanding the program of early detection and treatment. The expense of an early detection and treatment program for gastric cancer patients found by screening was assessed, along with the costs of traditional treatment in a hospital of Zhuanghe. Three major techniques of medical economics, namely cost-effectiveness analysis (CEA), cost-benefit analysis (CBA), and cost-utility analysis (CUA), were used to assess the screening program. Results from CEA showed that for every 25,235 Yuan invested in the Zhuanghe screening program, one gastric cancer patient could be saved. Data from CUA showed a cost of 1,370 Yuan per QALY saved. Results from CBA showed that the total cost was 1,945,206 Yuan with a benefit of 8,669,709 Yuan, for a CBR of 4.46. The early detection and treatment program for gastric cancer appears economical and beneficial to society. We suggest that it should be carried out in more areas at high risk for gastric cancer.
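The three evaluation techniques reduce to simple ratios; a sketch using the totals reported in the abstract (function names are illustrative):

```python
def cost_effectiveness(total_cost, cases_found):
    """CEA: cost per gastric cancer patient detected."""
    return total_cost / cases_found

def cost_utility(total_cost, qalys_gained):
    """CUA: cost per quality-adjusted life year (QALY) saved."""
    return total_cost / qalys_gained

def cost_benefit_ratio(benefit, cost):
    """CBA: benefit-cost ratio; > 1 means benefits exceed costs."""
    return benefit / cost

# Using the totals reported for the Zhuanghe program:
cbr = cost_benefit_ratio(8_669_709, 1_945_206)
print(round(cbr, 2))   # 4.46, matching the reported CBR
```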
The Situation Analysis Study of the family planning program in Kenya.
Miller, R A; Ndhlovu, L; Gachara, M M; Fisher, A A
1991-01-01
A new, relatively "quick and clean" operations research approach called a "situation analysis" was developed for examining the strengths and weaknesses of the family planning program of Kenya. Field research teams visited a stratified random sample of 99 of the Ministry of Health's approximately 775 service delivery points. Observation techniques and interviewing were used to collect information on program components and on the quality of care provided to new family planning clients during the observation day. As late as 1986, the Kenya program was rated "weak" and "poor" in the international literature. The Kenya Situation Analysis Study found a functioning, integrated maternal and child health/family planning program serving large numbers of clients, with an emphasis on oral contraceptives and Depo-Provera (and an underemphasis on permanent methods). Although a number of program problems were revealed by the study, overall, in terms of performance, a rating of "moderate" is suggested as more appropriate for Kenya's national family planning program today. In terms of the quality of care, a "moderate to moderate-high" rating is suggested.
West Coast tree improvement programs: a break-even, cost-benefit analysis
F. Thomas Ledig; Richard L Porterfield
1981-01-01
Three tree improvement programs were analyzed by a break-even cost-benefit technique: one for ponderosa pine in the Pacific Northwest, and two for Douglas-fir in the Pacific Northwest, one of low intensity and the other of high intensity. A return of 8 percent on investment appears feasible by using short rotations or by accompanying tree improvement with thinning....
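A break-even cost-benefit analysis of this kind discounts future harvest gains back to the present and asks whether the net present value is non-negative at the target rate of return; a sketch with hypothetical numbers (not the study's data):

```python
def npv(rate, cashflows):
    """Net present value of (year, amount) cash flows discounted at
    `rate`; the break-even question is whether NPV >= 0 at the
    required return (8 percent in the study)."""
    return sum(amt / (1 + rate) ** yr for yr, amt in cashflows)

# Hypothetical program: invest 1000 units now; the harvest gain after a
# 30-year rotation exactly matches an 8% compounded return, so the
# program just breaks even at the 8% discount rate.
flows = [(0, -1000.0), (30, 1000.0 * 1.08 ** 30)]
# npv(0.08, flows) is approximately 0; shorter rotations raise it.
```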
ERIC Educational Resources Information Center
Sayre, Scott Alan
The ultimate goal of the science of artificial intelligence (AI) is to establish programs that will use algorithmic computer techniques to imitate the heuristic thought processes of humans. Most AI programs, especially expert systems, organize their knowledge into three specific areas: data storage, a rule set, and a control structure. Limitations…
NASA Technical Reports Server (NTRS)
Newsom, Jerry R.
1991-01-01
Control-Structures Interaction (CSI) technology embraces the understanding of the interaction between the spacecraft structure and the control system, and the creation and validation of concepts, techniques, and tools, for enabling the interdisciplinary design of an integrated structure and control system, rather than the integration of a structural design and a control system design. The goal of this program is to develop validated CSI technology for integrated design/analysis and qualification of large flexible space systems and precision space structures. A description of the CSI technology program is presented.
Reliability analysis of the F-8 digital fly-by-wire system
NASA Technical Reports Server (NTRS)
Brock, L. D.; Goodman, H. A.
1981-01-01
The F-8 Digital Fly-by-Wire (DFBW) flight test program, intended to provide the technology for advanced control systems giving aircraft enhanced performance and operational capability, is addressed. A detailed analysis of the experimental system was performed to estimate the probabilities of two significant safety-critical events: (1) loss of the primary flight control function, causing reversion to the analog bypass system; and (2) loss of the aircraft due to failure of the electronic flight control system. The analysis covers appraisal of risks due to random equipment failure, generic faults in the design of the system or its software, and induced failure due to external events. A unique diagrammatic technique was developed which details the combinatorial reliability equations for the entire system, promotes understanding of system failure characteristics, and identifies the most likely failure modes. The technique provides a systematic method of applying basic probability equations and is augmented by a computer program, written in a modular fashion, that duplicates the structure of these equations.
Fostering multiple repertoires in undergraduate behavior analysis students
Polson, David A. D.
1995-01-01
Eight techniques used by the author in teaching an introductory applied behavior analysis course are described: (a) a detailed study guide, (b) frequent tests, (c) composition of practice test questions, (d) in-class study groups, (e) fluency building with a computerized flash-card program, (f) bonus marks for participation during question-and-answer sessions, (g) student presentations that summarize and analyze recently published research, and (h) in-class behavior analysis of comic strips. Together, these techniques require an extensive amount of work by students. Nevertheless, students overwhelmingly prefer this approach to the traditional lecture-midterm-final format, and most earn an A as their final course grade. PMID:22478226
Reliability/safety analysis of a fly-by-wire system
NASA Technical Reports Server (NTRS)
Brock, L. D.; Goodman, H. A.
1980-01-01
An analysis technique has been developed to estimate the reliability of a very complex, safety-critical system by constructing a diagram of the reliability equations for the total system. This diagram has many of the characteristics of a fault-tree or success-path diagram, but is much easier to construct for complex redundant systems. The diagram provides insight into system failure characteristics and identifies the most likely failure modes. A computer program aids in the construction of the diagram and the computation of reliability. Analysis of the NASA F-8 Digital Fly-by-Wire Flight Control System is used to illustrate the technique.
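The combinatorial reliability equations for redundant channels build on the k-of-n survival probability; a minimal sketch of that building block (a generic formula with hypothetical numbers, not the NASA program):

```python
from math import comb

def k_of_n_reliability(k, n, p):
    """Probability that at least k of n identical, independent channels
    survive, each with survival probability p. This binomial sum is the
    basic element of combinatorial reliability equations for
    redundant systems."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k, n + 1))

# Triplex computer set that keeps working while 2 of 3 channels
# survive, each channel surviving with probability 0.99:
print(k_of_n_reliability(2, 3, 0.99))   # 0.999702
```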
Classification software technique assessment
NASA Technical Reports Server (NTRS)
Jayroe, R. R., Jr.; Atkinson, R.; Dasarathy, B. V.; Lybanon, M.; Ramapryian, H. K.
1976-01-01
A catalog of software options is presented for the use of local user communities to obtain software for analyzing remotely sensed multispectral imagery. The resources required to utilize a particular software program are described. Descriptions of how a particular program analyzes data and the performance of that program for an application and data set provided by the user are shown. An effort is made to establish a statistical performance base for various software programs with regard to different data sets and analysis applications, to determine the status of the state-of-the-art.
NASA Technical Reports Server (NTRS)
Rea, F. G.; Pittenger, J. L.; Conlon, R. J.; Allen, J. D.
1975-01-01
Techniques developed for identifying launch vehicle system requirements for NASA automated space missions are discussed. Emphasis is placed on development of computer programs and investigation of astrionics for OSS missions and Scout. The Earth Orbit Mission Program - 1 which performs linear error analysis of launch vehicle dispersions for both vehicle and navigation system factors is described along with the Interactive Graphic Orbit Selection program which allows the user to select orbits which satisfy mission requirements and to evaluate the necessary injection accuracy.
Planetary geosciences, 1989-1990
NASA Technical Reports Server (NTRS)
Zuber, Maria T. (Editor); James, Odette B. (Editor); Lunine, Jonathan I. (Editor); Macpherson, Glenn J. (Editor); Phillips, Roger J. (Editor)
1992-01-01
NASA's Planetary Geosciences Programs (the Planetary Geology and Geophysics and the Planetary Material and Geochemistry Programs) provide support and an organizational framework for scientific research on solid bodies of the solar system. These research and analysis programs support scientific research aimed at increasing our understanding of the physical, chemical, and dynamic nature of the solid bodies of the solar system: the Moon, the terrestrial planets, the satellites of the outer planets, the rings, the asteroids, and the comets. This research is conducted using a variety of methods: laboratory experiments, theoretical approaches, data analysis, and Earth analog techniques. Through research supported by these programs, we are expanding our understanding of the origin and evolution of the solar system. This document is intended to provide an overview of the more significant scientific findings and discoveries made this year by scientists supported by the Planetary Geosciences Program. To a large degree, these results and discoveries are the measure of success of the programs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atzeni, Simone; Ahn, Dong; Gopalakrishnan, Ganesh
2017-01-12
Archer is built on top of the LLVM/Clang compilers that support OpenMP. It applies static and dynamic analysis techniques to detect data races in OpenMP programs while incurring very low runtime and memory overhead. Static analyses identify data-race-free OpenMP regions and exclude them from runtime analysis, which is performed by the ThreadSanitizer included in LLVM/Clang.
Restructuring the rotor analysis program C-60
NASA Technical Reports Server (NTRS)
1985-01-01
The continuing evolution of the rotary wing industry demands increasing analytical capabilities. To keep up with this demand, software must be structured to accommodate change. The approach discussed for meeting this demand is to restructure an existing analysis. The motivational factors, basic principles, application techniques, and practical lessons from experience with this restructuring effort are reviewed.
Cognition, Corpora, and Computing: Triangulating Research in Usage-Based Language Learning
ERIC Educational Resources Information Center
Ellis, Nick C.
2017-01-01
Usage-based approaches explore how we learn language from our experience of language. Related research thus involves the analysis of the usage from which learners learn and of learner usage as it develops. This program involves considerable data recording, transcription, and analysis, using a variety of corpus and computational techniques, many of…
Seeking Social Capital and Expertise in a Newly-Formed Research Community: A Co-Author Analysis
ERIC Educational Resources Information Center
Forte, Christine E.
2017-01-01
This exploratory study applies social network analysis techniques to existing, publicly available data to understand collaboration patterns within the co-author network of a federally-funded, interdisciplinary research program. The central questions asked: What underlying social capital structures can be determined about a group of researchers…
Engineering Analysis of Stresses in Railroad Rails.
DOT National Transportation Integrated Search
1981-10-01
One portion of the Federal Railroad Administration's (FRA) Track Performance Improvement Program is the development of engineering and analytic techniques required for the design and maintenance of railroad track of increased integrity and safety. Un...
NASA Technical Reports Server (NTRS)
Head, J. W. (Editor)
1978-01-01
Developments reported at a meeting of principal investigators for NASA's planetary geology program are summarized. Topics covered include: constraints on solar system formation; asteroids, comets, and satellites; constraints on planetary interiors; volatiles and regoliths; instrument development techniques; planetary cartography; geological and geochemical constraints on planetary evolution; fluvial processes and channel formation; volcanic processes; eolian processes; radar studies of planetary surfaces; cratering as a process, landform, and dating method; and the Tharsis region of Mars. Activities at a planetary geology field conference on eolian processes are reported, and techniques recommended for the presentation and analysis of crater size-frequency data are included.
NASA Technical Reports Server (NTRS)
1976-01-01
After the 1973 Staten Island disaster, in which 40 people were killed repairing a liquid natural gas storage tank, the New York Fire Commissioner requested NASA's help in drawing up a comprehensive plan covering the design, construction, and operation of liquid natural gas facilities. Two programs are underway. The first transfers comprehensive risk management techniques and procedures in the form of an instruction document that includes determining liquid-gas risks through engineering analysis and tests, controlling these risks by setting up redundant fail-safe techniques, and establishing criteria calling for decisions that eliminate or accept certain risks. The second program prepares a liquid gas safety manual (the first of its kind).
Analysis and synthesis of abstract data types through generalization from examples
NASA Technical Reports Server (NTRS)
Wild, Christian
1987-01-01
The discovery of general patterns of behavior from a set of input/output examples can be a useful technique in the automated analysis and synthesis of software systems. These generalized descriptions of the behavior form a set of assertions which can be used for validation, program synthesis, program testing and run-time monitoring. Describing the behavior is characterized as a learning process in which general patterns can be easily characterized. The learning algorithm must choose a transform function and define a subset of the transform space which is related to equivalence classes of behavior in the original domain. An algorithm for analyzing the behavior of abstract data types is presented and several examples are given. The use of the analysis for purposes of program synthesis is also discussed.
Using Runtime Analysis to Guide Model Checking of Java Programs
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Norvig, Peter (Technical Monitor)
2001-01-01
This paper describes how two runtime analysis algorithms, an existing data race detection algorithm and a new deadlock detection algorithm, have been implemented to analyze Java programs. Runtime analysis is based on the idea of executing the program once, and observing the generated run to extract various kinds of information. This information can then be used to predict whether other, different runs may violate some properties of interest, in addition, of course, to demonstrating whether the generated run itself violates such properties. These runtime analyses can be performed stand-alone to generate a set of warnings. It is furthermore demonstrated how these warnings can be used to guide a model checker, thereby reducing the search space. The described techniques have been implemented in the home-grown Java model checker called PathFinder.
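The data race detection side can be illustrated with an Eraser-style lockset check, which is in the spirit of (but not identical to) the algorithm implemented in PathFinder; all names here are illustrative:

```python
def lockset_warnings(trace):
    """Eraser-style lockset sketch: for each shared variable, maintain
    the intersection of lock sets held at every access; an empty
    intersection means no single lock consistently protects it, so a
    race is possible. `trace` is a list of (thread, op, target) events
    with op in {'acquire', 'release', 'access'}."""
    held = {}        # thread -> set of locks currently held
    candidate = {}   # shared variable -> candidate lockset
    warnings = []
    for thread, op, target in trace:
        locks = held.setdefault(thread, set())
        if op == "acquire":
            locks.add(target)
        elif op == "release":
            locks.discard(target)
        elif op == "access":
            cand = candidate.setdefault(target, set(locks))
            cand &= locks                # refine the candidate lockset
            if not cand and target not in warnings:
                warnings.append(target)  # no common lock protects it
    return warnings

# t1 accesses x under lock L, then t2 accesses x holding nothing:
trace = [("t1", "acquire", "L"), ("t1", "access", "x"),
         ("t1", "release", "L"), ("t2", "access", "x")]
print(lockset_warnings(trace))   # ['x']: potential data race on x
```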
van Oorsouw, Wietske M W J; Embregts, Petri J C M; Bosman, Anna M T; Jahoda, Andrew
2009-01-01
The last decades have seen increased emphasis on the quality of training for direct-care staff serving people with intellectual disabilities. Nevertheless, it is unclear what the key aspects of effective training are. Therefore, the aim of the present meta-analysis was to establish the ingredients (i.e., goals, format, and techniques) for staff training that are related to improvements in staff behaviour. Our literature search concentrated on studies that were published over a period of 20 years. Fifty-five studies met the criteria, resulting in 502 single-subject designs and 13 n>1 designs. Results revealed important information relevant to further improvement of clinical practice: (a) the combination of in-service training with coaching-on-the-job is the most powerful format, (b) in in-service formats, one should use multiple techniques, and verbal feedback is particularly recommended, and (c) in coaching-on-the-job formats, verbal feedback should be part of the program, as well as praise and correction. To maximize effectiveness, program developers should carefully prepare training goals, training format, and training techniques, which will pay dividends in clinical practice.
A FORTRAN program for the analysis of linear continuous and sample-data systems
NASA Technical Reports Server (NTRS)
Edwards, J. W.
1976-01-01
A FORTRAN digital computer program which performs the general analysis of linearized control systems is described. State variable techniques are used to analyze continuous, discrete, and sampled data systems. Analysis options include the calculation of system eigenvalues, transfer functions, root loci, root contours, frequency responses, power spectra, and transient responses for open- and closed-loop systems. A flexible data input format allows the user to define systems in a variety of representations. Data may be entered by inputting explicit data matrices or matrices constructed in user written subroutines, by specifying transfer function block diagrams, or by using a combination of these methods.
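A miniature state-variable analysis of the kind the program performs, eigenvalues plus a transient (step) response, can be sketched with a hypothetical first-order system (illustrative only, not the FORTRAN program):

```python
import numpy as np

def eigenvalues_and_step(A, B, C, t, dt=1e-3):
    """State-variable sketch: return the eigenvalues of A and a
    forward-Euler-integrated step response of dx/dt = Ax + Bu,
    y = Cx, with a unit step input u = 1."""
    eig = np.linalg.eigvals(A)
    x = np.zeros(A.shape[0])
    for _ in range(round(t / dt)):
        x = x + dt * (A @ x + B.flatten())   # u = 1 (unit step)
    return eig, float(C @ x)

# Hypothetical stable first-order lag: dx/dt = -x + u, y = x.
A = np.array([[-1.0]]); B = np.array([[1.0]]); C = np.array([1.0])
eig, y = eigenvalues_and_step(A, B, C, t=5.0)
# eigenvalue -1; the step response approaches 1 - e^(-5), about 0.993
```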
Systems Engineering in NASA's R&TD Programs
NASA Technical Reports Server (NTRS)
Jones, Harry
2005-01-01
Systems engineering is largely the analysis and planning that support the design, development, and operation of systems. The most common application of systems engineering is in guiding systems development projects that use a phased process of requirements, specifications, design, and development. This paper investigates how systems engineering techniques should be applied in research and technology development programs for advanced space systems. These programs should include anticipatory engineering of future space flight systems and a project portfolio selection process, as well as systems engineering for multiple development projects.
2014-06-19
urgent and compelling. Recent efforts in this area automate program analysis techniques using model checking and symbolic execution [2, 5-7]. These... bounded model checking tool for x86 binary programs developed at the Air Force Institute of Technology (AFIT). Jiseki creates a bit-vector logic model based... assume there are n different paths through the function foo. The program could potentially call the function foo a bounded number of times, resulting in n
Deep anisotropic shell program for tire analysis
NASA Technical Reports Server (NTRS)
Andersen, C. M.
1981-01-01
A finite element program was constructed to model the mechanical response of a tire, treated as a deep anisotropic shell, to specified static loads. The program is based on a Sanders-Budiansky type shell theory with the effects of transverse shear deformation and bending-extensional coupling included. A displacement formulation is used together with a total Lagrangian description of the deformation. Sixteen-node quadrilateral elements with bicubic shape functions are employed. The Noor basis reduction technique and various types of symmetry considerations serve to improve the computational efficiency.
NASA Technical Reports Server (NTRS)
1974-01-01
An analysis of low cost management approaches for the development of the Earth Observatory Satellite (EOS) is presented. The factors of the program which tend to increase costs are identified. The NASA/Industry interface is stressed to show how the interface can be improved to produce reduced program costs. Techniques and examples of cost reduction which can be applied to the EOS program are tabulated. Specific recommendations for actions to be taken to reduce costs in prescribed areas are submitted.
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Martensen, Anna L.
1992-01-01
FTC, the Fault-Tree Compiler program, is a reliability-analysis software tool used to calculate the probability of the top event of a fault tree. Five different types of gates are allowed in the fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. The high-level input language of FTC is easy to understand and use. The program supports a hierarchical fault-tree-definition feature that simplifies the description of the tree and reduces execution time. The solution technique is implemented in FORTRAN, and the user interface in Pascal. FTC was written to run on a DEC VAX computer under the VMS operating system.
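The gate semantics described above can be sketched as follows, assuming statistically independent basic events. This is an illustrative Python sketch, not the FORTRAN/Pascal implementation of FTC; all function names are hypothetical.

```python
from itertools import combinations
from math import prod

# p is a list of the failure probabilities of the events feeding a gate,
# assumed mutually independent.

def gate_and(p):        # all inputs occur
    return prod(p)

def gate_or(p):         # at least one input occurs
    return 1 - prod(1 - q for q in p)

def gate_xor(p):        # exactly one input occurs
    total = 0.0
    for i in range(len(p)):
        term = p[i]
        for j in range(len(p)):
            if j != i:
                term *= 1 - p[j]
        total += term
    return total

def gate_invert(p):     # single-input NOT gate
    return 1 - p[0]

def gate_m_of_n(m, p):  # at least m of the n inputs occur
    n = len(p)
    total = 0.0
    for k in range(m, n + 1):
        for idx in combinations(range(n), k):
            term = 1.0
            for j in range(n):
                term *= p[j] if j in idx else 1 - p[j]
            total += term
    return total

# Top-event probability for a small hypothetical tree:
# top = OR( AND(e1, e2), 2-of-3(e3, e4, e5) )
top = gate_or([gate_and([1e-3, 2e-3]),
               gate_m_of_n(2, [0.01, 0.01, 0.01])])
```

The hierarchical definition the abstract mentions corresponds to nesting gate calls, as in the `top` expression: a sub-tree's output probability simply becomes an input probability of its parent gate.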
Simple methods of exploiting the underlying structure of rule-based systems
NASA Technical Reports Server (NTRS)
Hendler, James
1986-01-01
Much recent work in the field of expert systems research has aimed at exploiting the underlying structure of the rule base for purposes of analysis. Such techniques as Petri nets and DAGs have been proposed as representational structures that will allow complete analysis. Much has been made of proving isomorphisms between the rule bases and the mechanisms, and of examining the theoretical power of this analysis. In this paper we describe some early work on a new system which has much simpler (and thus, one hopes, more easily achieved) aims and less formality. The technique being examined is a very simple one: OPS5 programs are analyzed in a purely syntactic way and an FSA description is generated. In this paper we describe the technique and some user interface tools which exploit this structure.
Composite Failures: A Comparison of Experimental Test Results and Computational Analysis Using XFEM
2016-09-30
NUWC-NPT Technical Report 12,218, 30 September 2016. Composite Failures: A Comparison of Experimental Test Results and Computational Analysis Using XFEM. ... availability of measurement techniques, experimental testing of composite materials has largely outpaced the computational modeling ability, forcing
Comprehensive Flood Plain Studies Using Spatial Data Management Techniques.
1978-06-01
Hydrologic Engineering Center computer programs that forecast urban storm water quality and dynamic in-stream water quality response to waste ... determination. Water Quality: The water quality analysis planned for the pilot study includes urban storm water quality forecasting and in-stream ... analysis is performed under the direction of Tony Thomas, Chief, Research Branch, by Jess Abbott for storm water quality analysis, R. G. Willey for
ERIC Educational Resources Information Center
LaPlante, Josephine M.; Durham, Taylor R.
A revised edition of PS-14, "An Introduction to Benefit-Cost Analysis for Evaluating Public Programs," presents concepts and techniques of benefit-cost analysis as tools that can be used to assist in deciding between alternatives. The goals of the new edition include teaching students to think about the possible benefits and costs of each…
Falsafi-Zadeh, Sajad; Karimi, Zahra; Galehdari, Hamid
2012-01-01
Molecular dynamics simulation is a practical and powerful technique for the analysis of protein structure. Several programs have been developed to facilitate such investigations; among them, Visual Molecular Dynamics (VMD) is one of the most frequently used. One of the beneficial properties of VMD is its extensibility through the design of new plug-ins. We introduce here a new facility of VMD for distance analysis and radius of gyration of biopolymers such as proteins and DNA. Availability: The database is available for free at http://trc.ajums.ac.ir/HomePage.aspx/?TabID/=12618/&Site/=trc.ajums.ac/&Lang/=fa-IR PMID:22553393
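The radius-of-gyration quantity such a plug-in computes has a standard mass-weighted definition, sketched below in Python. The actual VMD plug-in (written in VMD's Tcl plug-in framework) is not reproduced here; the function name and data layout are illustrative.

```python
from math import sqrt, dist

def radius_of_gyration(coords, masses):
    """Mass-weighted radius of gyration of a set of atoms:
    Rg = sqrt( sum_i m_i * |r_i - r_com|^2 / sum_i m_i ),
    where r_com is the center of mass."""
    M = sum(masses)
    com = tuple(sum(m * c[k] for m, c in zip(masses, coords)) / M
                for k in range(3))
    return sqrt(sum(m * dist(c, com) ** 2
                    for m, c in zip(masses, coords)) / M)
```

For example, two unit-mass atoms at (0, 0, 0) and (2, 0, 0) have a center of mass at (1, 0, 0) and a radius of gyration of 1.0.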
The remote measurement of tornado-like flows employing a scanning laser Doppler system
NASA Technical Reports Server (NTRS)
Jeffreys, H. B.; Bilbro, J. W.; Dimarzio, C.; Sonnenschein, C.; Toomey, D.
1977-01-01
The paper deals with a scanning laser Doppler velocimeter system employed in a test program for measuring naturally occurring tornado-like phenomena, known as dust devils. A description of the system and the test program is followed by a discussion of the data processing techniques and data analysis. The system uses a stable 15-W CO2 laser with the beam expanded and focused by a 12-inch telescope. Range resolution is obtained by focusing the optical system. The velocity of each volume of air (scanned in a horizontal plane) is determined from spectral analysis of the heterodyne signal. Results derived from the measurement program and data/system analyses are examined.
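The conversion from heterodyne Doppler shift to line-of-sight velocity follows the backscatter relation v = λ f_D / 2. A minimal sketch, assuming the standard 10.6 µm CO2 emission line (the abstract gives the laser type but not the exact system constants):

```python
def doppler_velocity(f_shift_hz, wavelength_m=10.6e-6):
    """Line-of-sight air velocity from the heterodyne Doppler shift
    of backscattered CO2 laser light: v = lambda * f_D / 2.
    The factor of 2 accounts for the round trip to the scatterer."""
    return wavelength_m * f_shift_hz / 2
```

A measured 1 MHz Doppler shift thus corresponds to a line-of-sight velocity of about 5.3 m/s, which sets the scale of the spectral resolution needed to resolve dust-devil flow fields.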
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, G.
1992-12-28
The following topics were among those completed at the Air Force Faculty Research Summer Program: Experiences Using Model-Based Techniques for the Development of a Large Parallel Instrumentation System; Data Reduction of Laser Induced Fluorescence in Rocket Motor Exhausts; Feasibility of Wavelet Analysis for Plume Data Study; Characterization of Seagrass Meadows in St. Andrew (Crooked Island) Sound, Northern Gulf of Mexico; A Preliminary Study of the Weathering of Jet Fuels in Soil Monitored by SFE with GC Analysis; A Preliminary Numerical Model of Groundwater Flow at the MADE2 Site.
NASA Technical Reports Server (NTRS)
1972-01-01
The QL module of the Performance Analysis and Design Synthesis (PADS) computer program is described. Execution of this module is initiated when and if subroutine PADSI calls subroutine GROPE. Subroutine GROPE controls the high level logical flow of the QL module. The purpose of the module is to determine a trajectory that satisfies the necessary variational conditions for optimal performance. The module achieves this by solving a nonlinear multi-point boundary value problem. The numerical method employed is described. It is an iterative technique that converges quadratically when it does converge. The three basic steps of the module are: (1) initialization, (2) iteration, and (3) culmination. For Volume 1 see N73-13199.
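Quadratic convergence of the kind described is characteristic of Newton-type iteration. The following toy one-dimensional sketch illustrates only that convergence structure; the QL module's actual multi-point boundary-value solver is far more elaborate and is not reproduced here.

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton iteration x_{k+1} = x_k - f(x_k)/f'(x_k).
    Near a simple root the error is roughly squared each step
    (quadratic convergence), mirroring the QL module's
    initialize / iterate / culminate structure."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:   # culmination: converged
            return x
    raise RuntimeError("iteration did not converge")
```

In a shooting formulation of a boundary-value problem, `f` would be the residual in the terminal boundary conditions as a function of the guessed initial conditions; here a scalar root-finding example stands in for that residual.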
Parallel program debugging with flowback analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, Jongdeok.
1989-01-01
This thesis describes the design and implementation of an integrated debugging system for parallel programs running on shared-memory multiprocessors. The goal of the debugging system is to present to the programmer a graphical view of the dynamic program dependences while keeping the execution-time overhead low. The author first describes the use of flowback analysis to provide information on causal relationships between events in a program's execution without re-executing the program for debugging. Execution-time overhead is kept low by recording only a small amount of trace during a program's execution. He uses semantic analysis and a technique called incremental tracing to keep the time and space overhead low. As part of the semantic analysis, he uses a static program dependence graph structure that reduces the amount of work done at compile time and takes advantage of the dynamic information produced during execution time. The cornerstone of the incremental tracing concept is to generate a coarse trace during execution and fill incrementally, during the interactive portion of the debugging session, the gap between the information gathered in the coarse trace and the information needed to do the flowback analysis using the coarse trace. He then describes how to extend the flowback analysis to parallel programs. The flowback analysis can span process boundaries; i.e., the most recent modification to a shared variable might be traced to a different process than the one that contains the current reference. The static and dynamic program dependence graphs of the individual processes are tied together with synchronization and data dependence information to form complete graphs that represent the entire program.
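The core flowback query, finding the most recent modification to a variable anywhere in an interleaved multi-process trace, can be sketched as follows. This is an illustrative simplification; the thesis's graph-based, incrementally traced implementation is not reproduced.

```python
def flowback(trace, read_index):
    """Given an interleaved trace of (process_id, op, var) events,
    with op in {'read', 'write'}, return the index of the most
    recent write to the variable read at read_index. The write may
    belong to a different process than the reader, which is exactly
    the cross-process case flowback analysis must handle."""
    _, _, var = trace[read_index]
    for i in range(read_index - 1, -1, -1):
        pid, op, v = trace[i]
        if op == 'write' and v == var:
            return i
    return None  # no prior write: value comes from initialization
```

For example, if process 1 writes `x`, process 2 then overwrites `x`, and process 1 reads `x`, the flowback of the read points at process 2's write, crossing the process boundary.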
Knowledge Based Consultation for Finite Element Structural Analysis.
1980-05-01
In recent years, techniques of ... involved in Artificial Intelligence at Stanford University developed the program MYCIN [2], for clinical consultation of diseases that require ... and Rules. The basic backward-chaining logic, characteristic of the Artificial Intelligence approach to the problem of knowledge representation, was
Multispan Elevated Guideway Design for Passenger Transport Vehicles : Volume 1. Text.
DOT National Transportation Integrated Search
1975-04-01
Analysis techniques, a design procedure and design data are described for passenger vehicle, simply supported, single span and multiple span elevated guideway structures. Analyses and computer programs are developed to determine guideway deflections,...
NASA Technical Reports Server (NTRS)
Comfort, R. H.; Horwitz, J. L.
1986-01-01
Temperature and density analysis in the Automated Analysis Program (for the global empirical model) were modified to use flow velocities produced by the flow velocity analysis. Revisions were started to construct an interactive version of the technique for temperature and density analysis used in the automated analysis program. A study of ion and electron heating at high altitudes in the outer plasmasphere was initiated. The analysis of the electron gun experiments on SCATHA was also extended to include eclipse operations in order to test a hypothesis that there are interactions between the 50 to 100 eV beam and spacecraft-generated photoelectrons. The MASSCOMP software to be used in taking and displaying data in the two-ion plasma experiment was tested and is now working satisfactorily. Papers published during the report period are listed.
An investigation of dynamic-analysis methods for variable-geometry structures
NASA Technical Reports Server (NTRS)
Austin, F.
1980-01-01
Selected space structure configurations were reviewed in order to define dynamic analysis problems associated with variable geometry. The dynamics of a beam being constructed from a flexible base and the relocation of the completed beam by rotating the remote manipulator system about the shoulder joint were selected. Equations of motion were formulated in physical coordinates for both of these problems, and FORTRAN programs were developed to generate solutions by numerically integrating the equations. These solutions served as a standard of comparison to gauge the accuracy of approximate solution techniques that were developed and studied. Good control was achieved in both problems. Unstable control system coupling with the system flexibility did not occur. An approximate method was developed for each problem to enable the analyst to investigate variable geometry effects during a short time span using standard fixed geometry programs such as NASTRAN. The average angle and average length techniques are discussed.
Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Tutorial
DOE Office of Scientific and Technical Information (OSTI.GOV)
C. L. Smith; S. T. Beck; S. T. Wood
2008-08-01
The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of computer programs that were developed to create and analyze probabilistic risk assessments (PRAs). This volume is the tutorial manual for the SAPHIRE system. In this document, a series of lessons are provided that guide the user through basic steps common to most analyses performed with SAPHIRE. The tutorial is divided into two major sections covering both basic and advanced features. The section covering basic topics contains lessons that lead the reader through development of a probabilistic hypothetical problem involving a vehicle accident, highlighting the program's most fundamental features. The advanced features section contains additional lessons that expand on fundamental analysis features of SAPHIRE and provide insights into more complex analysis techniques. Together, these two elements provide an overview into the operation and capabilities of the SAPHIRE software.
Earth orientation from lunar laser ranging and an error analysis of polar motion services
NASA Technical Reports Server (NTRS)
Dickey, J. O.; Newhall, X. X.; Williams, J. G.
1985-01-01
Lunar laser ranging (LLR) data are obtained on the basis of the timing of laser pulses travelling from observatories on earth to retroreflectors placed on the moon's surface during the Apollo program. The modeling and analysis of the LLR data can provide valuable insights into earth's dynamics. The ability to model the lunar orbit accurately over the full 13-year observation span makes it possible to conduct relatively long-term studies of variations in the earth's rotation. A description is provided of general analysis techniques, and the calculation of universal time (UT1) from LLR is discussed. Attention is also given to a summary of intercomparisons with different techniques, polar motion results and intercomparisons, and a polar motion error analysis.
NASA Technical Reports Server (NTRS)
Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.
1974-01-01
A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.
Quantum optimization for training support vector machines.
Anguita, Davide; Ridella, Sandro; Rivieccio, Fabio; Zunino, Rodolfo
2003-01-01
Refined concepts, such as Rademacher estimates of model complexity and nonlinear criteria for weighting empirical classification errors, represent recent and promising approaches to characterize the generalization ability of Support Vector Machines (SVMs). The advantages of those techniques lie in both improving the SVM representation ability and yielding tighter generalization bounds. On the other hand, they often make Quadratic-Programming algorithms no longer applicable, and SVM training cannot benefit from efficient, specialized optimization techniques. The paper considers the application of Quantum Computing to solve the problem of effective SVM training, especially in the case of digital implementations. The presented research compares the behavioral aspects of conventional and enhanced SVMs; experiments on both synthetic and real-world problems support the theoretical analysis. At the same time, the related differences between Quadratic-Programming and Quantum-based optimization techniques are considered.
Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis.
Cohnstaedt, Lee W; Rochon, Kateryn; Duehl, Adrian J; Anderson, John F; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C; Obenauer, Peter J; Campbell, James F; Lysyk, Tim J; Allan, Sandra A
2012-03-01
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium "Advancements in arthropod monitoring technology, techniques, and analysis" presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones, and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae), respectively. After sampling, extrapolating real-world population numbers from trap capture data is possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles.
PAPR reduction in FBMC using an ACE-based linear programming optimization
NASA Astrophysics Data System (ADS)
van der Neut, Nuan; Maharaj, Bodhaswar TJ; de Lange, Frederick; González, Gustavo J.; Gregorio, Fernando; Cousseau, Juan
2014-12-01
This paper presents four novel techniques for peak-to-average power ratio (PAPR) reduction in filter bank multicarrier (FBMC) modulation systems. The approach extends current PAPR-reduction active constellation extension (ACE) methods, as used in orthogonal frequency division multiplexing (OFDM), to an FBMC implementation as the main contribution. The four techniques introduced can be split into two groups: linear programming optimization ACE-based techniques and smart gradient-project (SGP) ACE techniques. The linear programming (LP)-based techniques compensate for the symbol overlaps by utilizing a frame-based approach and provide a theoretical upper bound on achievable performance for the overlapping ACE techniques. The overlapping ACE techniques, on the other hand, can handle symbol-by-symbol processing. Furthermore, as a result of FBMC properties, the proposed techniques do not require side-information transmission. The PAPR performance of the techniques is shown to match, or in some cases improve on, current PAPR techniques for FBMC. Initial analysis of the computational complexity of the SGP techniques indicates that the complexity issues with PAPR reduction in FBMC implementations can be addressed. The out-of-band interference introduced by the techniques is investigated. As a result, it is shown that the interference can be compensated for whilst still maintaining decent PAPR performance. Additional results are also provided by means of a study of the PAPR reduction of the proposed techniques at a fixed clipping probability. The bit error rate (BER) degradation is investigated to ensure that the trade-off in terms of BER degradation is not too severe. As illustrated by exhaustive simulations, the SGP ACE-based techniques proposed are ideal candidates for practical implementation in systems employing the low-complexity polyphase implementation of FBMC modulators. The methods are shown to offer significant PAPR reduction and increase the feasibility of FBMC as a replacement modulation system for OFDM.
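The PAPR quantity these techniques minimize has a simple definition, PAPR = max|x[n]|² / E[|x[n]|²], usually quoted in dB. A minimal sketch of the metric itself (illustrative only; the ACE and SGP optimizations are not reproduced):

```python
from math import log10

def papr_db(samples):
    """Peak-to-average power ratio of a complex baseband signal,
    PAPR = max|x[n]|^2 / mean|x[n]|^2, expressed in dB.
    A constant-envelope signal has a PAPR of 0 dB; multicarrier
    signals, where many subcarriers can add in phase, do not."""
    powers = [abs(s) ** 2 for s in samples]
    return 10 * log10(max(powers) / (sum(powers) / len(powers)))
```

ACE-style methods reduce this metric by moving constellation points outward, within regions that do not degrade decisions, so that the time-domain peaks shrink without transmitting side information.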
SNAP/SHOT Your Ability to Support That Next Application.
ERIC Educational Resources Information Center
Jones, Ernest L.
SNAP/SHOT (System Network Analysis Program-Simulated Host Overview Technique) is a discrete simulation of a network and/or host model available through IBM at the Raleigh System Center. The simulator provides an analysis of a total IBM Communications System. Input data must be obtained from RMF, SMF, and the CICS Analyzer to determine the existing…
ERIC Educational Resources Information Center
Williams, Stacy L.
2006-01-01
A comprehensive meta-analysis of the research following Glass, McGraw, and Smith's (1981) technique integrated findings from twenty-five comparative studies from 1990 to 2003 targeting student achievement and distance education in allied health professions. Student achievement was assessed through course grades and resulted in an overall effect…
Something Old, Something New: MBA Program Evaluation Using Shift-Share Analysis and Google Trends
ERIC Educational Resources Information Center
Davis, Sarah M.; Rodriguez, A. E.
2014-01-01
Shift-share analysis is a decomposition technique that is commonly used to measure attributes of regional change. In this method, regional change is decomposed into its relevant functional and competitive parts. This paper introduces traditional shift-share method and its extensions with examples of its applicability and usefulness for program…
ERIC Educational Resources Information Center
Freeman, Robert R.; And Others
The main results of the survey-and-analysis stage include a substantial collection of preliminary data on the language-sciences information user community, its professional specialties and information channels, its indexing tools, and its terminologies. The prospects and techniques for the development of a modern, discipline-based information…
Frachon, E; Hamon, S; Nicolas, L; de Barjac, H
1991-01-01
Gas-liquid chromatography of fatty acid methyl esters and numerical analysis were carried out with 114 Bacillus sphaericus strains. Since only two clusters harbored mosquitocidal strains, this technique could be developed in screening programs to limit bioassays on mosquito larvae. It also allows differentiation of highly homologous strains. PMID:1781697
NASA Astrophysics Data System (ADS)
Srivastava, Anjali
The determination of accurate material composition of a kidney stone is crucial for understanding the formation of the kidney stone as well as for preventive therapeutic strategies. Radiation-probing instrumental activation analysis techniques are excellent tools for the identification of the materials present in the kidney stone. X-ray fluorescence (XRF) and neutron activation analysis (NAA) experiments were performed and different kidney stones were analyzed. The interactions of X-ray photons and neutrons with matter are complementary in nature, resulting in distinctly different materials detection. This is the first approach to utilize combined X-ray fluorescence and neutron activation analysis for a comprehensive analysis of kidney stones. Presently, experimental studies in conjunction with analytical techniques were used to determine the exact composition of the kidney stone. The open-source program Python Multi-Channel Analyzer was used to unfold the XRF spectrum. A new type of experimental set-up was developed and utilized for XRF and NAA analysis of the kidney stone. To verify the experimental results with analytical calculation, several sets of kidney stones were analyzed using the XRF and NAA techniques. The elements identified by the XRF technique are Br, Cu, Ga, Ge, Mo, Nb, Ni, Rb, Se, Sr, Y, and Zr; those identified by neutron activation analysis (NAA) are Au, Br, Ca, Er, Hg, I, K, Na, Pm, Sb, Sc, Sm, Tb, Yb, and Zn. This thesis presents a new approach for accurate detection of the material composition of kidney stones using the XRF and NAA instrumental activation analysis techniques.
Rozen, Warren Matthew; Spychal, Robert T.; Hunter-Smith, David J.
2016-01-01
Background Accurate volumetric analysis is an essential component of preoperative planning in both reconstructive and aesthetic breast procedures towards achieving symmetrization and patient-satisfactory outcome. Numerous comparative studies and reviews of individual techniques have been reported. However, a unifying review of all techniques comparing their accuracy, reliability, and practicality has been lacking. Methods A review of the published English literature dating from 1950 to 2015 using databases, such as PubMed, Medline, Web of Science, and EMBASE, was undertaken. Results Since Bouman’s first description of water displacement method, a range of volumetric assessment techniques have been described: thermoplastic casting, direct anthropomorphic measurement, two-dimensional (2D) imaging, and computed tomography (CT)/magnetic resonance imaging (MRI) scans. However, most have been unreliable, difficult to execute and demonstrate limited practicability. Introduction of 3D surface imaging has revolutionized the field due to its ease of use, fast speed, accuracy, and reliability. However, its widespread use has been limited by its high cost and lack of high level of evidence. Recent developments have unveiled the first web-based 3D surface imaging program, 4D imaging, and 3D printing. Conclusions Despite its importance, an accurate, reliable, and simple breast volumetric analysis tool has been elusive until the introduction of 3D surface imaging technology. However, its high cost has limited its wide usage. Novel adjunct technologies, such as web-based 3D surface imaging program, 4D imaging, and 3D printing, appear promising. PMID:27047788
Ismail, Abdussalaam Iyanda; Abdul Majid, Abdul Halim; Zakaria, Mohd Normani; Abdullah, Nor Azimah Chew; Hamzah, Sulaiman; Mukari, Siti Zamratol-Mai Sarah
2018-06-01
The current study aims to examine the effects of human resource (measured as health workers' perception of UNHS), screening equipment, program layout, and screening techniques on healthcare practitioners' awareness (measured as knowledge) of universal newborn hearing screening (UNHS) in Malaysian non-public hospitals. Via a cross-sectional approach, the current study collected data using a validated questionnaire to obtain information on the awareness of the UNHS program among health practitioners and to test the formulated hypotheses. Of 63 questionnaires distributed to the health professionals, 51 (an 81% response rate) were returned and usable for statistical analysis. The survey instruments covering healthcare practitioners' awareness, human resource, program layout, screening instrument, and screening techniques were adapted and scaled with a 7-point Likert scale ranging from 1 (little) to 7 (many). The Partial Least Squares (PLS) algorithm and bootstrapping techniques were employed to test the hypotheses of the study. With the results involving beta values, t-values, and p-values (i.e. β=0.478, t=1.904, p<0.10; β=0.809, t=3.921, p<0.01; β=-0.436, t=1.870, p<0.10), human resource (measured with training), functional equipment, and program layout are held to be significant predictors of enhanced knowledge of health practitioners. Likewise, program layout, human resource, screening technique, and screening instrument explain 71% of the variance in health practitioners' awareness. Health practitioners' awareness is explained by program layout, human resource, and screening instrument with effect sizes (f2) of 0.065, 0.621, and 0.211 respectively, indicating that program layout, human resource, and screening instrument have small, large, and medium effects on health practitioners' awareness respectively.
However, screening technique has no effect on health practitioners' awareness, which explains why its t-statistic is not significant. Having started the UNHS program in 2003, non-public hospitals have more experienced and well-trained employees dealing with the screening tools and instruments, and the program layout is well structured in these hospitals. Yet the issue of homogeneity exists. Non-public hospitals charge for the service they render and, in turn, ensure quality service, given that they are profit-driven and/or profit-making establishments that have no option other than the provision of value-added and innovative services. The employees in the non-public hospitals have less screening to carry out, given the low number of babies delivered in private hospitals. In addition, the non-significant relationship between screening techniques and healthcare practitioners' awareness of the UNHS program reflects the fact that the techniques practiced in public and non-public hospitals are similar and standardized. Limitations and suggestions are discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
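The reported effect sizes follow Cohen's f² convention, with conventional thresholds of 0.02 (small), 0.15 (medium), and 0.35 (large). A sketch of the computation and labeling (function names are illustrative):

```python
def cohens_f2(r2_full, r2_reduced):
    """Cohen's f2 effect size of one predictor in a regression:
    f2 = (R2_full - R2_reduced) / (1 - R2_full),
    where R2_reduced omits the predictor of interest."""
    return (r2_full - r2_reduced) / (1 - r2_full)

def effect_label(f2):
    """Cohen's conventional thresholds: 0.02 small, 0.15 medium,
    0.35 large."""
    if f2 >= 0.35:
        return 'large'
    if f2 >= 0.15:
        return 'medium'
    if f2 >= 0.02:
        return 'small'
    return 'negligible'
```

Applied to the values quoted above, f² of 0.065, 0.621, and 0.211 label as small, large, and medium respectively, matching the abstract's classification of program layout, human resource, and screening instrument.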
On the Information Content of Program Traces
NASA Technical Reports Server (NTRS)
Frumkin, Michael; Hood, Robert; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
Program traces are used for analysis of program performance, memory utilization, and communications as well as for program debugging. The trace contains records of execution events generated by monitoring units inserted into the program. The trace size limits the resolution of execution events and restricts the user's ability to analyze the program execution. We present a study of the information content of program traces and develop a coding scheme which reduces the trace size to the limit given by the trace entropy. We apply the coding to the traces of AIMS-instrumented programs executed on the IBM SP2 and the SGI Power Challenge and compare it with other coding methods. Our technique shows that the size of the trace can be reduced by more than a factor of 5.
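The entropy limit referred to here is the Shannon entropy of the trace's event distribution, the lower bound on average bits per event achievable by any lossless coding. A minimal sketch (illustrative; the paper's actual coding scheme is not reproduced):

```python
from collections import Counter
from math import ceil, log2

def trace_entropy_bits(events):
    """Shannon entropy of the empirical event distribution, in bits
    per event: H = -sum_s p(s) * log2(p(s)). No lossless code can
    average fewer bits per event than this."""
    counts = Counter(events)
    n = len(events)
    return -sum(c / n * log2(c / n) for c in counts.values())

def fixed_width_bits(events):
    """Bits per event under a naive fixed-width encoding of the
    distinct event types, for comparison with the entropy bound."""
    return ceil(log2(len(set(events))))
```

For a heavily skewed trace (a few hot events dominating), the entropy is far below the fixed-width cost, which is where an entropy-approaching coder gains its factor-of-5-plus compression.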
NASA Technical Reports Server (NTRS)
Stricker, L. T.
1973-01-01
The DORCA Applications study has been directed at development of a data bank management computer program identified as DORMAN. Because of the size of the DORCA data files and the manipulations required on that data to support analyses with the DORCA program, automated data techniques to replace time-consuming manual input generation are required. The Dynamic Operations Requirements and Cost Analysis (DORCA) program was developed for use by NASA in planning future space programs. Both programs are designed for implementation on the UNIVAC 1108 computing system. The purpose of this Executive Summary Report is to define for the NASA management the basic functions of the DORMAN program and its capabilities.
Accuracy of remotely sensed data: Sampling and analysis procedures
NASA Technical Reports Server (NTRS)
Congalton, R. G.; Oderwald, R. G.; Mead, R. A.
1982-01-01
A review and update of the discrete multivariate analysis techniques used for accuracy assessment is given. A listing of the computer program written to implement these techniques is given. New work on evaluating accuracy assessment using Monte Carlo simulation with different sampling schemes is given. The results of matrices from the mapping effort of the San Juan National Forest is given. A method for estimating the sample size requirements for implementing the accuracy assessment procedures is given. A proposed method for determining the reliability of change detection between two maps of the same area produced at different times is given.
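The discrete multivariate accuracy measures involved typically include overall accuracy and the kappa coefficient of agreement computed from an error (confusion) matrix. A minimal sketch of those two statistics (illustrative; this is not the listed FORTRAN program):

```python
def accuracy_and_kappa(matrix):
    """Overall accuracy and the kappa coefficient from a square
    error matrix (rows = map classes, columns = reference classes).
    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    (the diagonal proportion) and p_e is chance agreement from the
    row and column marginals."""
    n = sum(sum(row) for row in matrix)
    k = len(matrix)
    po = sum(matrix[i][i] for i in range(k)) / n
    pe = sum(sum(matrix[i]) * sum(row[i] for row in matrix)
             for i in range(k)) / n ** 2
    return po, (po - pe) / (1 - pe)
```

For example, a 2-class matrix [[45, 5], [5, 45]] gives an overall accuracy of 0.90 but a kappa of 0.80, since chance agreement alone would yield 0.50.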
Update on Chemical Analysis of Recovered Hydrazine Family Fuels for Recycling
NASA Technical Reports Server (NTRS)
Davis, C. L.
1997-01-01
The National Aeronautics and Space Administration, Kennedy Space Center, has developed a program to re-use and/or recycle hypergolic propellants recovered from propellant systems. As part of this effort, new techniques were developed to analyze recovered propellants. At the 1996 PDCS, the paper 'Chemical Analysis of Recovered Hydrazine Family Fuels For Recycling' presented analytical techniques used in accordance with KSC specifications which define what recovered propellants are acceptable for recycling. This paper is a follow up to the 1996 paper. Lower detection limits and response linearity were examined for two gas chromatograph methods.
NASA Astrophysics Data System (ADS)
Antinah; Kusmayadi, T. A.; Husodo, B.
2018-05-01
This study aims to determine the effect of learning model on student achievement in terms of interpersonal intelligence. The compared learning models are LC7E and the Direct learning model. This type of research is quasi-experimental with a 2x3 factorial design. The population in this study is Grade XI students of Wonogiri Vocational Schools. The sample selection was done by stratified cluster random sampling. Data collection techniques used questionnaires, documentation, and tests. The data analysis technique used two-way analysis of variance with unequal cells, preceded by prerequisite analyses for the balance test, normality test, and homogeneity test. The conclusions of this research are: 1) student learning achievement in mathematics under the LC7E learning model is better than under direct learning; 2) mathematics learning achievement of students with a high level of interpersonal intelligence is better than that of students with medium or low interpersonal intelligence, and students with medium interpersonal intelligence achieve better than those with low interpersonal intelligence on linear programming; 3) the LC7E learning model resulted in better mathematics learning achievement than the direct learning model for each level of students' interpersonal intelligence on the linear program material.
Sefcik, Justine S; Petrovsky, Darina; Streur, Megan; Toles, Mark; O'Connor, Melissa; Ulrich, Connie M; Marcantonio, Sherry; Coburn, Ken; Naylor, Mary D; Moriarty, Helene
2018-03-01
The purpose of this study was to explore participants' experience in the Health Quality Partners (HQP) Care Coordination Program that contributed to their continued engagement. Older adults with multiple chronic conditions often have limited engagement in health care services and face fragmented health care delivery. This can lead to increased risk for disability, mortality, poor quality of life, and increased health care utilization. A qualitative descriptive design with two focus groups was conducted with a total of 20 older adults enrolled in HQP's Care Coordination Program. Conventional content analysis was the analytical technique. The overarching theme resulting from the analysis was "in our corner," with subthemes "opportunities to learn and socialize" and "dedicated nurses," suggesting that these are the primary contributing factors to engagement in HQP's Care Coordination Program. Study findings suggest that nurses play an integral role in patient engagement among older adults enrolled in a care coordination program.
Jindani, Farah A; Khalsa, G F S
2015-07-01
To understand how individuals with symptoms of posttraumatic stress disorder (PTSD) perceive a trauma-sensitive Kundalini yoga (KY) program. Digitally recorded telephone interviews 30-60 minutes in duration were conducted with 40 individuals with PTSD participating in an 8-week KY treatment program. Interviews were transcribed verbatim and analyzed using qualitative thematic analysis techniques. Qualitative analysis identifies three major themes: self-observed changes, new awareness, and the yoga program itself. Findings suggest that participants noted changes in areas of health and well-being, lifestyle, psychosocial integration, and perceptions of self in relation to the world. Presented are practical suggestions for trauma-related programming. There is a need to consider alternative and potentially empowering approaches to trauma treatment. Yoga-related self-care or self-management strategies are widely accessible, are empowering, and may address the mind-body elements of PTSD.
Analysis to develop a program for energy-integrated farm systems
NASA Astrophysics Data System (ADS)
Eakin, D. E.; Clark, M. A.; Inaba, L. K.; Johnson, K. I.
1981-09-01
A program to use renewable energy resources and possibly develop decentralized energy systems for agriculture is discussed. The program's objective is determined by: (1) an analysis of the technologies that could be utilized to transform renewable farm resources to energy by the year 2000, (2) the quantity of renewable farm resources that are available, and (3) current energy-use patterns. Individual research, development, and demonstration projects are fit into a national program of energy-integrated farm systems on the basis of market need, conversion potential, technological opportunities, and acceptability. Quantification of these factors for the purpose of establishing program guidelines is conducted using the following precepts: (1) market need is identified by current use of energy for agricultural production; (2) conversion potential is determined by the availability of renewable resources; and (3) technological opportunities are determined by the state-of-the-art methods, techniques, and processes that can convert renewable resources into farm energy.
Li, Yongping; Huang, Guohe
2009-03-01
In this study, a dynamic analysis approach based on an inexact multistage integer programming (IMIP) model is developed for supporting municipal solid waste (MSW) management under uncertainty. Techniques of interval-parameter programming and multistage stochastic programming are incorporated within an integer-programming framework. The developed IMIP can deal with uncertainties expressed as probability distributions and interval numbers, and can reflect the dynamics in terms of decisions for waste-flow allocation and facility-capacity expansion over a multistage context. Moreover, the IMIP can be used for analyzing various policy scenarios that are associated with different levels of economic consequences. The developed method is applied to a case study of long-term waste-management planning. The results indicate that reasonable solutions have been generated for binary and continuous variables. They can help generate desired decisions of system-capacity expansion and waste-flow allocation with a minimized system cost and maximized system reliability.
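The interval-parameter idea described in the abstract can be sketched briefly: uncertain unit costs are carried through the objective as interval numbers, so a candidate waste-flow allocation yields an interval of total system cost. The class, figures, and facility names below are hypothetical illustrations, not taken from the IMIP model:

```python
class Interval:
    """An interval number [lo, hi], the uncertainty representation
    used in interval-parameter programming."""

    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def scale(self, x):
        """Scale by a nonnegative decision variable (e.g. tonnes of waste)."""
        return Interval(self.lo * x, self.hi * x)


# Hypothetical unit costs ($/tonne) for two disposal options:
landfill_cost = Interval(30, 45)
incineration_cost = Interval(55, 70)

# Interval of total cost for a candidate allocation of 100 and 40 tonnes:
total = landfill_cost.scale(100) + incineration_cost.scale(40)
# total spans [30*100 + 55*40, 45*100 + 70*40] = [5200, 7300]
```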
A computer program for multiple decrement life table analyses.
Poole, W K; Cooley, P C
1977-06-01
Life table analysis has traditionally been the tool of choice for analyzing the distribution of "survival" times when a parametric form for the survival curve cannot reasonably be assumed. Chiang, in two papers [1,2], formalized the theory of life table analysis in a Markov chain framework and derived maximum likelihood estimates of the relevant parameters. He also discussed how the techniques could be generalized to consider competing risks and follow-up studies. Although various computer programs exist for different types of life table analysis [3], to date there has not been a generally available, well-documented computer program to carry out multiple decrement analyses, by Chiang's or any other method. This paper describes such a program, developed by Research Triangle Institute. A user's manual, available at printing cost, supplements the contents of this paper with a discussion of the formulas used and a program listing.
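The actuarial core of a life table can be sketched in a few lines. This is a generic illustration, not the program described above: withdrawals are assumed exposed for half the interval (the classic actuarial adjustment), and the counts are made up.

```python
def life_table(n0, intervals):
    """Cumulative survival after each interval of a decrement life table.

    n0        -- initial cohort size
    intervals -- list of (deaths, withdrawals) counts per interval
    """
    alive = n0
    surv = 1.0
    out = []
    for deaths, withdrawals in intervals:
        exposed = alive - withdrawals / 2.0  # half-interval exposure adjustment
        q = deaths / exposed                 # conditional probability of death
        surv *= (1.0 - q)
        out.append(surv)
        alive -= deaths + withdrawals
    return out


# Hypothetical cohort of 100: 10 deaths, then 9 deaths, no withdrawals.
print(life_table(100, [(10, 0), (9, 0)]))  # survival 0.9, then 0.81
```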
NASA Technical Reports Server (NTRS)
Sen, Syamal K.; Shaykhian, Gholam Ali
2011-01-01
MatLab(TradeMark) (MATrix LABoratory) is a numerical computation and simulation tool used by thousands of scientists and engineers in many countries. MatLab performs purely numerical calculations and can be used as a glorified calculator or through its interpreted programming language; its real strength is in matrix manipulations. Computer algebra functionality is achieved within the MatLab environment using the "symbolic" toolbox. This feature is similar to computer algebra programs such as Maple or Mathematica, which calculate with mathematical equations using symbolic operations. In its interpreted programming language form (command interface), MatLab is similar to well-known programming languages such as C/C++; it supports data structures and cell arrays and allows classes to be defined for object-oriented programming. As such, MatLab is equipped with most of the essential constructs of a higher-level programming language. MatLab is packaged with an editor and debugging functionality useful for analyzing large MatLab programs and finding errors. We believe there are many ways to approach real-world problems; prescribed methods that ensure sound solutions are incorporated into the design and analysis of data processing and visualization can help engineers and scientists gain wider insight into the actual implementation of their experiments. This presentation focuses on the data processing and visualization aspects of engineering and scientific applications. Specifically, it discusses methods and techniques for performing intermediate-level data processing covering engineering and scientific problems. MatLab programming techniques will be discussed, including reading various data file formats to produce customized publication-quality graphics, importing engineering and/or scientific data, organizing data in tabular format, exporting data for use by other software programs such as Microsoft Excel, and data presentation and visualization.
WIC program participation--a marketing approach.
Buechner, J S; Scott, H D; Smith, J L; Humphrey, A B
1991-01-01
Recent evaluation studies have described the benefits accruing to low-income women and children who participate in the Special Supplemental Food Program for Women, Infants, and Children (WIC). However, participation is not uniform among all groups of eligible persons. This study examines the geographic variation in WIC participation rates of eligible pregnant women in Rhode Island to determine whether the program is effective in reaching the neediest segments of the population. Eight groups of small geographic areas in Rhode Island (census tracts) were formed on the basis of need for maternal and child health services, as determined from a statistical method employing factor and cluster analysis of existing health and sociodemographic data. Among these eight groups, participation rates in WIC during 1983-84 ranged from 46 percent to more than 100 percent of estimated eligible pregnant women. The rates were positively correlated with measures of need, strongly (r = 0.92) with an index of maternal risk, and less strongly (r = 0.79) with an index of birth outcomes. The results of this study have enabled the Rhode Island WIC Program to direct its outreach efforts more specifically to geographic areas where the need for the program's assistance is greatest. The procedures described in this report comprise a technique that can be generally applied to measure program effectiveness in marketing and outreach where relevant data are available by small geographic areas. The data requirements are (a) population-based estimates of program need and (b) program utilization measures. If these data can be aggregated to a common set of small geographic areas, the use of marketing analysis techniques becomes possible, and program benefits in the area of outreach and recruitment can be realized. PMID:1910189
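The correlations reported above (r = 0.92 with maternal risk, r = 0.79 with birth outcomes) are ordinary Pearson product-moment coefficients between area-level participation rates and need indices. A minimal sketch, using made-up values for the eight area groups:

```python
from math import sqrt


def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


# Hypothetical data: participation rate (%) vs. need index for eight areas.
rates = [46, 58, 63, 70, 78, 85, 94, 102]
need = [0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]
print(round(pearson_r(rates, need), 3))
```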
Model authoring system for fail safe analysis
NASA Technical Reports Server (NTRS)
Sikora, Scott E.
1990-01-01
The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which then are searched to generate either a graphical fault tree analysis or failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.
Telecommunications network optimization
NASA Technical Reports Server (NTRS)
Lee, J.
1979-01-01
The analysis discusses the STACOM (state criminal justice communication) network topology program used to design and evaluate digital telecommunications networks. STACOM employs the Esau-Williams technique to search for direct links between system terminations and regional switching centers. Inputs include traffic data, terminal locations, and functional requirements.
Computer Analysis of Eye Blood-Vessel Images
NASA Technical Reports Server (NTRS)
Wall, R. J.; White, B. S.
1984-01-01
Technique rapidly diagnoses diabetes mellitus. Photographs of the "whites" of patients' eyes are scanned by a computerized image analyzer programmed to quantify the density of small blood vessels in the conjunctiva. Comparison with a data base of known normal and diabetic patients facilitates rapid diagnosis.
A new national mosaic of state landcover data
Thomas, I.; Handley, Lawrence R.; D'Erchia, Frank J.; Charron, Tammy M.
2000-01-01
This presentation reviewed current landcover mapping efforts and presented a new preliminary, national mosaic of Gap Analysis Program (GAP) and Multi-Resolution Land Characteristics Consortium (MRLC) landcover data with a discussion of techniques, problems faced, and future refinements.
Pipe network flow analysis was among the first civil engineering applications programmed for solution on the early commercial mainframe computers in the 1960s. Since that time, advancements in analytical techniques and computing power have enabled us to solve systems with tens o...
ERIC Educational Resources Information Center
Blake, Anthony; Francis, David
1973-01-01
Approaches to developing management ability include systematic techniques, mental enlargement, self-analysis, and job-related counseling. A method is proposed to integrate them into a responsive program involving depth understanding, vision of the future, specialization, commitment to change, and self-monitoring control. (MS)
Probabilistic analysis algorithm for UA slope software program.
DOT National Transportation Integrated Search
2013-12-01
A reliability-based computational algorithm for using a single row and equally spaced drilled shafts to : stabilize an unstable slope has been developed in this research. The Monte-Carlo simulation (MCS) : technique was used in the previously develop...
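The Monte-Carlo simulation (MCS) step named in the abstract can be illustrated schematically: sample uncertain quantities, compute a factor of safety, and count the fraction of failing trials. The distributions and moments below are invented for illustration and are not taken from the UA Slope program:

```python
import random


def failure_probability(trials=100_000, seed=42):
    """Estimate probability of slope failure by Monte-Carlo simulation.

    The factor of safety is modeled (illustratively) as the ratio of a
    normally distributed resisting moment to a normally distributed
    driving moment; a trial fails when the factor of safety is below 1.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        resisting = rng.gauss(1500.0, 300.0)  # hypothetical mean/std, kN-m
        driving = rng.gauss(1000.0, 100.0)
        if resisting / driving < 1.0:
            failures += 1
    return failures / trials


print(failure_probability())  # roughly 0.06 for these invented moments
```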
Assessment and instruction to promote higher order thinking in nursing students.
Kantar, Lina D
2014-05-01
The dearth of data on the role of assessment in higher education framed the two purposes of this study: first, to explore assessment strategies commonly used in nursing education by analyzing the curriculum documents of three baccalaureate nursing programs in Beirut, Lebanon against Bloom's taxonomy of learning, and second, to unravel issues of instruction and assessment by categorizing data into teacher- and learner-centered strategies. After obtaining IRB approval and consent to access the curriculum documents of the programs, data were analyzed using the content analysis research technique, with data on assessments and instruction categorized as student-centered or teacher-centered. The data revealed a deficiency in learner-centered strategies in the assessment and instruction of the three programs. There was evidence that educators in these programs focus on teaching content and examining retention, supporting prior notions of teaching to the test and earnest accusations of adherence to traditional, behavioral curriculum perspectives. Such curricula leave little room for the development of higher order thinking in learners. Although assessments are believed to be indicators of program and teaching effectiveness, there is alarming evidence of incompatibility between current assessment practices and the demands of the workplace. There is an urgent need to transform educators' beliefs, knowledge, and skills regarding testing, since teaching to pass a test can impede knowledge transfer and deter the development of learners' higher order thinking skills.
Man-rated flight software for the F-8 DFBW program
NASA Technical Reports Server (NTRS)
Bairnsfather, R. R.
1976-01-01
The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program assembly control, simulator configuration control, erasable-memory load generation, change procedures and anomaly reporting are discussed. The primary verification tools are described, as well as the program test plans and their implementation on the various simulators. Failure effects analysis and the creation of special failure generating software for testing purposes are described.
Empirical transfer functions for stations in the Central California seismological network
Bakun, W.H.; Dratler, Jay
1976-01-01
A sequence of calibration signals, composed of a station identification code, a transient from the release of the seismometer mass at rest from a known displacement from the equilibrium position, and a transient from a known step in voltage to the amplifier input, is generated by the automatic daily calibration system (ADCS) now operational in the U.S. Geological Survey central California seismographic network. Documentation is presented for a sequence of interactive programs that compute, from the calibration data, the complex transfer functions of the seismographic system (ground motion through digitizer), the electronics (amplifier through digitizer), and the seismometer alone. The analysis utilizes the Fourier transform technique originally suggested by Espinosa et al. (1962). Section I is a general description of seismographic calibration. Section II contrasts the 'Fourier transform' and 'least-squares' techniques for analyzing transient calibration signals. Theoretical considerations for the Fourier transform technique used here are described in Section III. Section IV is a detailed description of the sequence of calibration signals generated by the ADCS. Section V is a brief 'cookbook' description of the calibration programs; Section VI contains a detailed sample program execution. Section VII suggests uses of the resultant empirical transfer functions; supplemental interactive programs that produce smooth response functions, suitable for reducing seismic data to ground motion, are also documented there. Appendices A and B contain complete listings of the Fortran source codes, while Appendix C is an update containing preliminary results obtained from an analysis of some of the calibration signals from stations in the seismographic network near Oroville, California.
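At its core, the Fourier transform technique is a spectral division: the empirical transfer function is the transform of the recorded response divided by the transform of the known calibration input. A minimal sketch (naive DFT, hypothetical signals; the original programs are in Fortran):

```python
import cmath


def dft(x):
    """Naive discrete Fourier transform (adequate for short calibration records)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]


def transfer_function(inp, out, eps=1e-12):
    """Empirical transfer function: spectrum of the recorded response
    divided by the spectrum of the known input transient."""
    X, Y = dft(inp), dft(out)
    return [y / x if abs(x) > eps else 0j for x, y in zip(X, Y)]


# A system that doubles its input has a flat transfer function of 2:
inp = [1.0, 2.0, 3.0, 4.0]
out = [2.0 * v for v in inp]
print(transfer_function(inp, out))
```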
Bayesian Techniques for Plasma Theory to Bridge the Gap Between Space and Lab Plasmas
NASA Astrophysics Data System (ADS)
Crabtree, Chris; Ganguli, Gurudas; Tejero, Erik
2017-10-01
We will show how Bayesian techniques provide a general data analysis methodology that is better suited to investigate phenomena that require a nonlinear theory for an explanation. We will provide short examples of how Bayesian techniques have been successfully used in the radiation belts to provide precise nonlinear spectral estimates of whistler mode chorus and how these techniques have been verified in laboratory plasmas. We will demonstrate how Bayesian techniques allow for the direct competition of different physical theories with data acting as the necessary arbitrator. This work is supported by the Naval Research Laboratory base program and by the National Aeronautics and Space Administration under Grant No. NNH15AZ90I.
A Framework for Assessment of Aviation Safety Technology Portfolios
NASA Technical Reports Server (NTRS)
Jones, Sharon M.; Reveley, Mary S.
2014-01-01
The programs within NASA's Aeronautics Research Mission Directorate (ARMD) conduct research and development to improve the national air transportation system so that Americans can travel as safely as possible. NASA aviation safety systems analysis personnel support various levels of ARMD management in their fulfillment of system analysis and technology prioritization as defined in the agency's program and project requirements. This paper provides a framework for the assessment of aviation safety research and technology portfolios that includes metrics such as projected impact on current and future safety, technical development risk and implementation risk. The paper also contains methods for presenting portfolio analysis and aviation safety Bayesian Belief Network (BBN) output results to management using bubble charts and quantitative decision analysis techniques.
Static Analysis of Mobile Programs
2017-02-01
Information flow analysis has the potential to significantly aid human auditors, but it is handicapped by high false positive rates. Instead, auditors ... presents these specifications to a human auditor for validation. We have implemented this framework for a taint analysis of Android apps that relies on ... of queries to a human auditor. 6.4 Inferring Library Information Flow Specifications Using Dynamic Analysis. In [15], we present a technique to mine
MSIX - A general and user-friendly platform for RAM analysis
NASA Astrophysics Data System (ADS)
Pan, Z. J.; Blemel, Peter
The authors present a CAD (computer-aided design) platform supporting RAM (reliability, availability, and maintainability) analysis with efficient system description and alternative evaluation. The design concepts, implementation techniques, and application results are described. The platform is user-friendly because of its graphic environment, drawing facilities, object orientation, self-tutoring, and access to the operating system. The programs' independence and portability make them generally applicable to various analysis tasks.
Flexible data-management system
NASA Technical Reports Server (NTRS)
Pelouch, J. J., Jr.
1977-01-01
The Combined ASRDI Data-Management and Analysis Technique (CADMAT) is a system of computer programs and procedures that can be used to conduct data-management tasks. The system was developed specifically for scientists and engineers who are confronted with the management and analysis of large quantities of data organized into records of events and parametric fields. CADMAT is particularly useful when data are continually accumulated and the need for retrieval and analysis is ongoing.
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis.
Bonham-Carter, Oliver; Steele, Joe; Bastola, Dhundy
2014-11-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base-base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel-Ziv techniques from data compression.
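The word-frequency idea at the heart of these methods is easy to sketch: represent each sequence by its k-mer counts and compare the profiles with a vector similarity. The cosine measure below is a simple stand-in for the D2-style statistics discussed above, with toy sequences:

```python
from collections import Counter
from math import sqrt


def kmer_profile(seq, k=3):
    """Word-frequency profile: counts of all overlapping k-mers in seq."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))


def cosine_similarity(p, q):
    """Alignment-free similarity between two k-mer profiles (1 = identical
    direction, 0 = no shared words)."""
    words = set(p) | set(q)
    dot = sum(p[w] * q[w] for w in words)
    norm_p = sqrt(sum(v * v for v in p.values()))
    norm_q = sqrt(sum(v * v for v in q.values()))
    return dot / (norm_p * norm_q)


a = kmer_profile("ACGTACGTACGT")
b = kmer_profile("ACGTACGAACGT")
print(round(cosine_similarity(a, b), 3))  # similar but not identical sequences
```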
Hall, Francis X
2006-12-01
The goal was to monitor the effectiveness of the Coast Guard Yard's lead program by comparing a shipyard period in 1991 to one in 2002-2003. Comparisons of airborne lead levels by paint removal techniques, airborne lead levels by welding techniques, and blood lead levels of workers were evaluated by chi2 analysis. Airborne lead levels in paint removal techniques decreased over time for all methods used. Airborne lead levels in welding techniques decreased over time for all methods used. Blood lead levels of the high-risk group revealed a 2-fold reduction (prevalence rate ratio = 8.3; 95% confidence interval, 3.7-18.6) and in the low-risk group revealed a 1.6-fold reduction (prevalence rate ratio = 6.2; 95% confidence interval, 0.86-44.7). The Coast Guard Yard runs an effective lead program that exceeds the national Healthy People 2010 goal for lead. The results validate the Coast Guard Yard's use of air-line respirators and lead-free paint on all vessels.
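Prevalence rate ratios with confidence intervals, as quoted above, follow the standard log-normal construction for a risk ratio. A sketch with hypothetical counts (not the Coast Guard Yard data):

```python
from math import exp, log, sqrt


def prevalence_rate_ratio(a, n1, b, n2, z=1.96):
    """Prevalence rate ratio with a 95% log-normal confidence interval.

    a/n1 -- cases and total in the first group (e.g. earlier period)
    b/n2 -- cases and total in the second group (e.g. later period)
    """
    pr = (a / n1) / (b / n2)
    se = sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)  # std error of log(pr)
    lo = pr * exp(-z * se)
    hi = pr * exp(z * se)
    return pr, lo, hi


# Hypothetical: elevated blood lead in 20/100 workers before vs. 10/100 after.
print(prevalence_rate_ratio(20, 100, 10, 100))  # ratio 2.0 with its interval
```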
Genetic programming based ensemble system for microarray data classification.
Liu, Kun-Hong; Tong, Muchenxuan; Xie, Shu-Tong; Yee Ng, Vincent To
2015-01-01
Recently, more and more machine learning techniques have been applied to microarray data analysis. The aim of this study is to propose a genetic programming (GP) based new ensemble system (named GPES), which can be used to effectively classify different types of cancers. Decision trees are deployed as base classifiers in this ensemble framework with three operators: Min, Max, and Average. Each individual of the GP is an ensemble system, and they become more and more accurate in the evolutionary process. The feature selection technique and balanced subsampling technique are applied to increase the diversity in each ensemble system. The final ensemble committee is selected by a forward search algorithm, which is shown to be capable of fitting data automatically. The performance of GPES is evaluated using five binary class and six multiclass microarray datasets, and results show that the algorithm can achieve better results in most cases compared with some other ensemble systems. By using elaborate base classifiers or applying other sampling techniques, the performance of GPES may be further improved.
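The three combination operators named in the abstract (Min, Max, Average) are simple to state. The sketch below shows how an ensemble decision could be formed from base-classifier scores; the function names and threshold are our illustration, not the GPES implementation:

```python
def combine(scores, op):
    """Combine base-classifier scores for one sample with one of the
    three GPES-style operators: 'min', 'max', or 'average'."""
    if op == "min":
        return min(scores)
    if op == "max":
        return max(scores)
    if op == "average":
        return sum(scores) / len(scores)
    raise ValueError("unknown operator: %s" % op)


def ensemble_predict(scores, op="average", threshold=0.5):
    """Binary decision from a list of base-classifier scores in [0, 1]."""
    return 1 if combine(scores, op) >= threshold else 0


# Three base classifiers disagree; the operator decides the outcome.
scores = [0.9, 0.8, 0.2]
print(ensemble_predict(scores, "average"))  # mean 0.633 -> class 1
print(ensemble_predict(scores, "min"))      # min 0.2 -> class 0
```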
Attachment techniques for high temperature strain
NASA Astrophysics Data System (ADS)
Wnuk, Steve P., Jr.
1993-01-01
Attachment methods for making resistive strain measurements to 2500 F were studied. A survey of available strain gages and attachment techniques was made, and the results are compiled for metal and carbon composite test materials. A theoretical analysis of strain transfer into a bonded strain gage was made, and the important physical parameters of the strain transfer medium, the ceramic matrix, were identified. Pull-out tests performed with a pull tester on commonly used strain gage cements indicated that all cements tested displayed adequate strength for good strain transfer. Rokide flame-sprayed coatings produced significantly stronger bonds than ceramic cements. An in-depth study of the flame spray process produced simplified installation procedures, which also resulted in greater reliability and durability. Application procedures incorporating improvements made during this program are appended to the report. Strain gages installed on carbon composites, Rene' 41, 316 stainless steel, and TZM using attachment techniques developed during this program were successfully tested to 2500 F. Photographs of installation techniques, test procedures, and graphs of the test data are included in this report.
Nonlinear relaxation algorithms for circuit simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saleh, R.A.
Circuit simulation is an important computer-aided design (CAD) tool in the design of integrated circuits (ICs). However, the standard techniques used in programs such as SPICE result in very long computer run times when applied to large problems. In order to reduce the overall run time, a number of new approaches to circuit simulation were developed and are described. These methods are based on nonlinear relaxation techniques and exploit the relative inactivity of large circuits. Simple waveform-processing techniques are described to determine the maximum possible speed improvement that can be obtained by exploiting this property of large circuits. Three simulation algorithms are described, two of which are based on the Iterated Timing Analysis (ITA) method and a third based on the Waveform-Relaxation Newton (WRN) method. New programs that incorporate these techniques were developed and used to simulate a variety of industrial circuits, and the results from these simulations are provided. The techniques are shown to be much faster than the standard approach. In addition, a number of parallel aspects of these algorithms are described, and a general space-time model of parallel-task scheduling is developed.
Reflector surface distortion analysis techniques (thermal distortion analysis of antennas in space)
NASA Technical Reports Server (NTRS)
Sharp, R.; Liao, M.; Giriunas, J.; Heighway, J.; Lagin, A.; Steinbach, R.
1989-01-01
A group of large computer programs are used to predict the farfield antenna pattern of reflector antennas in the thermal environment of space. Thermal Radiation Analysis Systems (TRASYS) is a thermal radiation analyzer that interfaces with Systems Improved Numerical Differencing Analyzer (SINDA), a finite difference thermal analysis program. The programs linked together for this analysis can now be used to predict antenna performance in the constantly changing space environment. They can be used for very complex spacecraft and antenna geometries. Performance degradation caused by methods of antenna reflector construction and materials selection are also taken into consideration. However, the principal advantage of using this program linkage is to account for distortions caused by the thermal environment of space and the hygroscopic effects of the dry-out of graphite/epoxy materials after the antenna is placed into orbit. The results of this type of analysis could ultimately be used to predict antenna reflector shape versus orbital position. A phased array antenna distortion compensation system could then use this data to make RF phase front corrections. That is, the phase front could be adjusted to account for the distortions in the antenna feed and reflector geometry for a particular orbital position.
On the Automation of the MarkIII Data Analysis System.
NASA Astrophysics Data System (ADS)
Schwegmann, W.; Schuh, H.
1999-03-01
A faster and semiautomatic data analysis is an important contribution to the acceleration of the VLBI procedure. A concept for the automation of one of the most widely used VLBI software packages, the MarkIII Data Analysis System, was developed. Then the program PWXCB, which extracts weather and cable calibration data from the station log files, was automated by supplementing the existing Fortran77 program code. The new program XLOG and its results will be presented. Most of the tasks in VLBI data analysis are very complex, and their automation requires typical knowledge-based techniques. Thus, a knowledge-based system (KBS) for support and guidance of the analyst is being developed using the AI workbench BABYLON, which is based on methods of artificial intelligence (AI). The advantages of a KBS for the MarkIII Data Analysis System and the steps required to build a KBS will be demonstrated. Examples of the current status of the project will also be given.
Safety assessment for EPS electron-proton spectrometer
NASA Technical Reports Server (NTRS)
Gleeson, P.
1971-01-01
A safety analysis was conducted to identify the efforts required to assure relatively hazard free operation of the EPS and to meet the safety requirements of the program. Safety engineering criteria, principles, and techniques in applicable disciplines are stressed in the performance of the system and subsystem studies; in test planning; in the design, development, test, evaluation, and checkout of the equipment; and in the operating procedures for the EPS program.
Early Formulation of Training Programs for Cost Effectiveness Analysis
1978-07-01
training approaches. Although the method and media variables aid training program selection decisions, a technique is also required to monitor...fact that personnel must still be taught certain prerequisite skills and knowledges before they can begin to use the actual equipment, this approach...often difficult to identify causal relations. Good summaries have been produced, e.g., Meister, 1976; however, and are a great aid in pulling
Evaluation of Time Domain EM Coupling Techniques. Volume II.
1980-08-01
tool for the analysis of electromagnetic coupling and shielding problems: the finite-difference, time-domain (FD-TD) solution of Maxwell's equations...The objective of the program was to evaluate the suitability of the FD-TD method to determine the amount of electromagnetic coupling through an...specific questions were addressed during this program: 1. Can the FD-TD method accurately model electromagnetic coupling into a conducting structure for
NASA Technical Reports Server (NTRS)
Stutzman, W. L.; Takamizawa, K.; Werntz, P.; Lapean, J.; Barts, R.
1991-01-01
The following subject areas are covered: General Reflector Antenna Systems Program version 7 (GRASP7); Multiple Reflector Analysis Program for Cylindrical Antennas (MRAPCA); Tri-Reflector 2D Synthesis Code (TRTDS); geometrical optics and physical optics synthesis techniques; beam scanning reflectors, the type 2 and 6 reflectors, spherical reflectors, and multiple reflector imaging systems; and radiometric array design.
1992-12-28
analysis. Marvin Minsky, carefully applying mathematical techniques, developed rigorous theorems regarding network operation. His research led to the...electrical circuits but was later converted to computer simulation, which is still commonly used today. Early success by Marvin Minsky, Frank...publication of the book Perceptrons (Minsky and Papert 1969), in which he and Seymour Papert proved that the single-layer networks then in use were
TRAC-Monterey FY16 Work Program Development and Report of Research Elicitation
2016-01-01
any changes to priorities or additional projects that require immediate research. Work Program; Research Elicitation. MAJ...conduct analysis for the Army. Marks, Chris; Nesbitt, Peter. TRAC FY14 Research Requirements Elicitation. Technical Report TRAC-M-TM-13-059. 700 Dyer...Requirements Elicitation Interviews. Interview Guide: 1. Describe a research requirement in the areas of topics, techniques, and methodologies. 2
PARENT Quick Blind Round-Robin Test Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braatz, Brett G.; Heasler, Patrick G.; Meyer, Ryan M.
The U.S. Nuclear Regulatory Commission has established the Program to Assess the Reliability of Emerging Nondestructive Techniques (PARENT) whose goal is to investigate the effectiveness of current and novel nondestructive examination procedures and techniques to find flaws in nickel-alloy welds and base materials. This is to be done by conducting a series of open and blind international round-robin tests on a set of piping components that include large-bore dissimilar metal welds, small-bore dissimilar metal welds, and bottom-mounted instrumentation penetration welds. The blind testing is being conducted in two segments: one is called Quick-Blind and the other is called Blind. The Quick-Blind testing and destructive analysis of the test blocks has been completed. This report describes the four Quick-Blind test blocks used, summarizes their destructive analysis, gives an overview of the nondestructive evaluation (NDE) techniques applied, provides an analysis of the inspection data, and presents the conclusions drawn.
Edge compression techniques for visualization of dense directed graphs.
Dwyer, Tim; Henry Riche, Nathalie; Marriott, Kim; Mears, Christopher
2013-12-01
We explore the effectiveness of visualizing dense directed graphs by replacing individual edges with edges connected to 'modules' (groups of nodes) such that the new edges imply aggregate connectivity. We only consider techniques that offer a lossless compression: that is, where the entire graph can still be read from the compressed version. The techniques considered are: a simple grouping of nodes with identical neighbor sets; Modular Decomposition, which permits internal structure in modules and allows them to be nested; and Power Graph Analysis, which further allows edges to cross module boundaries. These techniques all have the same goal, to compress the set of edges that need to be rendered to fully convey connectivity, but each successive relaxation of the module definition permits fewer edges to be drawn in the rendered graph. Each successive technique also, we hypothesize, requires a higher degree of mental effort to interpret. We test this hypothesized trade-off with two studies involving human participants. For Power Graph Analysis we propose a novel optimal technique based on constraint programming. This enables us to explore the parameter space for the technique more precisely than could be achieved with a heuristic. Although applicable to many domains, we are motivated by, and discuss in particular, the application to software dependency analysis.
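Of the three techniques above, the grouping of nodes with identical neighbor sets is simple enough to sketch directly. The following Python toy (the function names and data structures are ours, not the authors') merges nodes whose in- and out-neighbor sets coincide and draws one edge between modules:

```python
from collections import defaultdict

def group_identical_neighbors(edges):
    """Losslessly compress a directed graph by merging nodes whose
    (successor, predecessor) sets are identical. Returns a node-to-module
    labeling and the compressed edge set. A sketch of the simplest
    technique described above, not the authors' implementation."""
    succ, pred, nodes = defaultdict(set), defaultdict(set), set()
    for u, v in edges:
        succ[u].add(v)
        pred[v].add(u)
        nodes.update((u, v))
    # Signature = frozen in/out neighbor sets; identical signatures share a module.
    modules = defaultdict(set)
    for n in nodes:
        modules[(frozenset(succ[n]), frozenset(pred[n]))].add(n)
    label = {}  # node -> module id
    for i, members in enumerate(modules.values()):
        for n in members:
            label[n] = i
    # One module-to-module edge stands for all edges between their members.
    compressed = {(label[u], label[v]) for u, v in edges}
    return label, compressed
```

For example, two nodes that both point to the same two targets collapse into one module, and the four original edges render as a single aggregate edge.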
Top down, bottom up structured programming and program structuring
NASA Technical Reports Server (NTRS)
Hamilton, M.; Zeldin, S.
1972-01-01
New design and programming techniques for shuttle software are presented. Based on previous Apollo experience, recommendations are made to apply top-down structured programming techniques to shuttle software. New software verification techniques for large software systems are recommended. HAL, the higher order language selected for the shuttle flight code, is discussed and found to be adequate for implementing these techniques. Recommendations are made to apply the workable combination of top-down, bottom-up methods in the management of shuttle software. Program structuring is discussed relevant to both programming and management techniques.
NASA Technical Reports Server (NTRS)
Wohlen, R. L.
1976-01-01
Techniques are presented for the solution of structural dynamic systems on an electronic digital computer using FORMA (FORTRAN Matrix Analysis). FORMA is a library of subroutines coded in FORTRAN IV for the efficient solution of structural dynamics problems. These subroutines are in the form of building blocks that can be put together to solve a large variety of structural dynamics problems. The obvious advantage of the building block approach is that programming and checkout time are limited to that required for putting the blocks together in the proper order.
Lorenzo-Seva, Urbano; Ferrando, Pere J
2011-03-01
We provide an SPSS program that implements currently recommended techniques and recent developments for selecting variables in multiple linear regression analysis via the relative importance of predictors. The approach consists of: (1) optimally splitting the data for cross-validation, (2) selecting the final set of predictors to be retained in the regression equation, and (3) assessing the behavior of the chosen model using standard indices and procedures. The SPSS syntax, a short manual, and data files related to this article are available as supplemental materials from brm.psychonomic-journals.org/content/supplemental.
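The split/fit/assess workflow such a program automates can be illustrated with a deliberately minimal Python sketch: one predictor, a random half-split, and holdout R-squared. The SPSS program itself handles multiple predictors and relative importance weights; this is only the skeleton, with all names our own:

```python
import random
from statistics import mean

def fit_simple_ols(xs, ys):
    # Ordinary least squares for y = b0 + b1 * x (closed form).
    mx, my = mean(xs), mean(ys)
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    return my - b1 * mx, b1

def cross_validated_r2(xs, ys, seed=0):
    """Split the data in half, fit on one half, and report R-squared on
    the other: a toy stand-in for the split/fit/assess loop described
    in the abstract."""
    idx = list(range(len(xs)))
    random.Random(seed).shuffle(idx)
    half = len(idx) // 2
    train, test = idx[:half], idx[half:]
    b0, b1 = fit_simple_ols([xs[i] for i in train], [ys[i] for i in train])
    my = mean(ys[i] for i in test)
    ss_res = sum((ys[i] - (b0 + b1 * xs[i])) ** 2 for i in test)
    ss_tot = sum((ys[i] - my) ** 2 for i in test)
    return 1 - ss_res / ss_tot
```

On noiseless data the holdout R-squared is 1; with real data it drops, which is exactly the shrinkage the cross-validation step is meant to reveal.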
Programming Probabilistic Structural Analysis for Parallel Processing Computer
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Chamis, Christos C.; Murthy, Pappu L. N.
1991-01-01
The ultimate goal of this research program is to make Probabilistic Structural Analysis (PSA) computationally efficient and hence practical for the design environment by achieving large scale parallelism. The paper identifies the multiple levels of parallelism in PSA, identifies methodologies for exploiting this parallelism, describes the development of a parallel stochastic finite element code, and presents results of two example applications. It is demonstrated that speeds within five percent of those theoretically possible can be achieved. A special-purpose numerical technique, the stochastic preconditioned conjugate gradient method, is also presented and demonstrated to be extremely efficient for certain classes of PSA problems.
Some programming techniques for increasing program versatility and efficiency on CDC equipment
NASA Technical Reports Server (NTRS)
Tiffany, S. H.; Newsom, J. R.
1978-01-01
Five programming techniques used to decrease core and increase program versatility and efficiency are explained. The techniques are: (1) dynamic storage allocation, (2) automatic core-sizing and core-resizing, (3) matrix partitioning, (4) free field alphanumeric reads, and (5) incorporation of a data complex. The advantages of these techniques and the basic methods for employing them are explained and illustrated. Several actual program applications which utilize these techniques are described as examples.
Program Package for the Analysis of High Resolution High Signal-To-Noise Stellar Spectra
NASA Astrophysics Data System (ADS)
Piskunov, N.; Ryabchikova, T.; Pakhomov, Yu.; Sitnova, T.; Alekseeva, S.; Mashonkina, L.; Nordlander, T.
2017-06-01
The program package SME (Spectroscopy Made Easy), designed to perform analysis of stellar spectra using spectral fitting techniques, was updated by adding new functions (isotopic and hyperfine splittings) in VALD and including grids of NLTE calculations for the energy levels of a few chemical elements. SME can automatically derive stellar atmospheric parameters: effective temperature, surface gravity, chemical abundances, radial and rotational velocities, and turbulent velocities, taking into account all the effects defining spectral line formation. The SME package uses the best grids of stellar atmospheres, which allows spectral analysis to be performed with similar accuracy over a wide range of stellar parameters and metallicities, from dwarfs to giants of the B, A, F, G, and K spectral classes.
Comparative analysis of economic models in selected solar energy computer programs
NASA Astrophysics Data System (ADS)
Powell, J. W.; Barnes, K. A.
1982-01-01
The economic evaluation models in five computer programs widely used for analyzing solar energy systems (F-CHART 3.0, F-CHART 4.0, SOLCOST, BLAST, and DOE-2) are compared. Differences in analysis techniques and assumptions among the programs are assessed from the point of view of consistency with the Federal requirements for life cycle costing (10 CFR Part 436), effect on predicted economic performance and optimal system size, ease of use, and general applicability to diverse system types and building types. The FEDSOL program, developed by the National Bureau of Standards specifically to meet the Federal life cycle cost requirements, serves as a basis for the comparison. Results of the study are illustrated in test cases of two different types of Federally owned buildings: a single family residence and a low rise office building.
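The life-cycle-cost comparison at the heart of these programs reduces to discounting a stream of energy-cost savings against first cost. A toy Python version follows; the cash-flow shape and rate inputs are illustrative assumptions, not the 10 CFR Part 436 rules or any particular program's model:

```python
def life_cycle_savings(install_cost, annual_saving, years, discount, escalation):
    """Net present-value life-cycle saving of an energy system: the
    discounted stream of (escalating) annual energy-cost savings minus
    the installed first cost. A sketch of the kind of comparison the
    programs perform, not an implementation of any of them."""
    pv = sum(annual_saving * ((1 + escalation) ** t) / ((1 + discount) ** t)
             for t in range(1, years + 1))
    return pv - install_cost
```

Two programs that agree on the physics can still rank system sizes differently if they assume different discount or fuel-escalation rates, which is the kind of inconsistency the comparison above probes.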
Helping agencies improve their planning analysis techniques.
DOT National Transportation Integrated Search
2011-11-18
This report summarizes the results of a peer review of the AZTDM. The peer review was supported by the Travel Model Improvement Program (TMIP), which is sponsored by FHWA. The peer review of a travel model can serve multiple purposes, including i...
NASA Technical Reports Server (NTRS)
Rana, D. S.
1980-01-01
The data reduction capabilities of the current data reduction programs were assessed and a search for a more comprehensive system with higher data analytic capabilities was made. Results of the investigation are presented.
5 CFR 551.210 - Computer employees.
Code of Federal Regulations, 2012 CFR
2012-01-01
..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...
5 CFR 551.210 - Computer employees.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...
5 CFR 551.210 - Computer employees.
Code of Federal Regulations, 2013 CFR
2013-01-01
..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...
5 CFR 551.210 - Computer employees.
Code of Federal Regulations, 2014 CFR
2014-01-01
..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...
Children's TV Ad Content: 1974
ERIC Educational Resources Information Center
Doolittle, John; Pepper, Robert
1975-01-01
An analysis of commercials on children's television programs shows that, with the advent of guidelines, some changes have been made, particularly in relation to vitamins, minerals, foods, and toys. Presentation techniques, however, contain increasing sex stereotyping and a decreased presence of racial minorities. (LS)
Spacecraft Avionics Software Development Then and Now: Different but the Same
NASA Technical Reports Server (NTRS)
Mangieri, Mark L.; Garman, John (Jack); Vice, Jason
2012-01-01
NASA has always been in the business of balancing new technologies and techniques to achieve human space travel objectives. NASA's historic Software Production Facility (SPF) was developed to serve complex avionics software solutions during an era dominated by mainframes, tape drives, and lower-level programming languages. These systems have proven themselves resilient enough to serve the Shuttle Orbiter avionics life cycle for decades. The SPF and its predecessor, the Software Development Lab (SDL), at NASA's Johnson Space Center (JSC) hosted flight software (FSW) engineering, development, simulation, and test. It was active from the beginning of Shuttle Orbiter development in 1972 through the end of the shuttle program in the summer of 2011, almost 40 years. NASA's Kedalion engineering analysis lab is on the forefront of validating and using many contemporary avionics HW/SW development and integration techniques, which represent new paradigms to NASA's heritage culture in avionics software engineering. Kedalion has validated many of the Orion project's HW/SW engineering techniques borrowed from the adjacent commercial aircraft avionics environment, inserting new techniques and skills into the Multi-Purpose Crew Vehicle (MPCV) Orion program. Using contemporary agile techniques, COTS products, early rapid prototyping, in-house expertise and tools, and customer collaboration, NASA has adopted a cost-effective paradigm that is currently serving Orion effectively. This paper will explore and contrast differences in technology employed over the years of NASA's space program, due largely to technological advances in hardware and software systems, while acknowledging that the basic software engineering and integration paradigms share many similarities.
CIRCAL-2 - General-purpose on-line circuit design.
NASA Technical Reports Server (NTRS)
Dertouzos, M. L.; Jessel, G. P.; Stinger, J. R.
1972-01-01
CIRCAL-2 is a second-generation general-purpose on-line circuit-design program with the following main features: (1) multiple-analysis capability; (2) uniform and general data structures for handling text editing, network representations, and output results, regardless of analysis; (3) special techniques and structures for minimizing and controlling user-program interaction; (4) use of functionals for the description of hysteresis and heat effects; and (5) ability to define optimization procedures that 'replace' the user. The paper discusses the organization of CIRCAL-2, the aforementioned main features, and their consequences, such as a set of network elements and models general enough for most analyses and a set of functions tailored to circuit-design requirements. The presentation is descriptive, concentrating on conceptual rather than on program implementation details.
A uniform technique for flood frequency analysis.
Thomas, W.O.
1985-01-01
This uniform technique consisted of fitting the logarithms of annual peak discharges to a Pearson Type III distribution using the method of moments. The objective was to adopt a consistent approach for the estimation of floodflow frequencies that could be used in computing average annual flood losses for project evaluation. In addition, a consistent approach was needed for defining equitable flood-hazard zones as part of the National Flood Insurance Program. -from ASCE Publications Information
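The method-of-moments fit described above can be sketched in a few lines of Python: take base-10 logs of the annual peak discharges and compute their sample mean, standard deviation, and skew, which parameterize the log-Pearson Type III distribution. This is an illustration of the first step only; quantile estimation via frequency factors is omitted:

```python
from math import log10, sqrt

def lp3_moments(peaks):
    """Method-of-moments statistics for a log-Pearson Type III fit to
    annual peak discharges: mean, standard deviation, and skew of the
    base-10 logarithms (sample, bias-corrected forms). A sketch of the
    uniform technique described above, not the full guideline procedure."""
    logs = [log10(q) for q in peaks]
    n = len(logs)
    m = sum(logs) / n
    s = sqrt(sum((x - m) ** 2 for x in logs) / (n - 1))
    g = (n * sum((x - m) ** 3 for x in logs)) / ((n - 1) * (n - 2) * s ** 3)
    return m, s, g
```

With these three statistics in hand, a flood quantile is obtained by applying a frequency factor that depends on the skew and the chosen exceedance probability.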
Dynamic visualization techniques for high consequence software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pollock, G.M.
1998-02-01
This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.
ERIC Educational Resources Information Center
Santos-Delgado, M. J.; Larrea-Tarruella, L.
2004-01-01
The back-titration methods are compared statistically for the determination of glycine in a nonaqueous medium of acetic acid. Important variations in the mean values of glycine are observed due to the interaction effects between the analysis of variance (ANOVA) technique and a statistical study through computer software.
A Re-examination of the Black English Copula. Working Papers in Sociolinguistics, No. 66.
ERIC Educational Resources Information Center
Baugh, John
A corpus of Black English (BEV) data is re-examined with exclusive attention to the "is" form of the copula. This analysis differs from previous examinations in that more constraints have been introduced, and the Cedergren/Sankoff computer program for multivariant analysis has been employed. The analytic techniques that are used allow for a finer…
ERIC Educational Resources Information Center
Leow, Christine; Wen, Xiaoli; Korfmacher, Jon
2015-01-01
This article compares regression modeling and propensity score analysis as different types of statistical techniques used in addressing selection bias when estimating the impact of two-year versus one-year Head Start on children's school readiness. The analyses were based on the national Head Start secondary dataset. After controlling for…
MetaGenyo: a web tool for meta-analysis of genetic association studies.
Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro
2017-12-16
Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible due to experimental designs, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies, increase statistical power and resolve discrepancies in genetic association studies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool for conducting comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/.
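As an illustration of one step in such a workflow, the Hardy-Weinberg check can be computed directly from genotype counts. A minimal Python sketch (a textbook chi-square goodness-of-fit statistic, not MetaGenyo's implementation):

```python
def hardy_weinberg_chi2(aa, ab, bb):
    """Chi-square goodness-of-fit statistic for Hardy-Weinberg
    equilibrium from genotype counts (AA, AB, BB). Values above about
    3.84 (the 5% critical value with one degree of freedom) suggest
    departure from equilibrium, one of the per-study checks a GAS
    meta-analysis performs before pooling results."""
    n = aa + ab + bb
    p = (2 * aa + ab) / (2 * n)   # allele frequency of A
    q = 1 - p
    expected = (p * p * n, 2 * p * q * n, q * q * n)
    return sum((obs - exp) ** 2 / exp
               for obs, exp in zip((aa, ab, bb), expected))
```

A control group that fails this check in one study is a warning sign of genotyping error or population stratification, which is why the test sits at the front of the workflow.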
Zupanc, Christine M; Burgess-Limerick, Robin; Hill, Andrew; Riek, Stephan; Wallis, Guy M; Plooy, Annaliese M; Horswill, Mark S; Watson, Marcus O; Hewett, David G
2015-12-01
Colonoscopy is a difficult cognitive-perceptual-motor task. Designing an appropriate instructional program for such a task requires an understanding of the knowledge, skills and attitudes underpinning the competency required to perform the task. Cognitive task analysis techniques provide an empirical means of deriving this information. Video recording and a think-aloud protocol were conducted while 20 experienced endoscopists performed colonoscopy procedures. "Cued-recall" interviews were also carried out post-procedure with nine of the endoscopists. Analysis of the resulting transcripts employed the constant comparative coding method within a grounded theory framework. The resulting draft competency framework was modified after review during semi-structured interviews conducted with six expert endoscopists. The proposed colonoscopy competency framework consists of twenty-seven skill, knowledge and attitude components, grouped into six categories (clinical knowledge; colonoscope handling; situation awareness; heuristics and strategies; clinical reasoning; and intra- and inter-personal). The colonoscopy competency framework provides a principled basis for the design of a training program, and for the design of formative assessment to gauge progress towards attaining the knowledge, skills and attitudes underpinning the achievement of colonoscopy competence.
Loading tests of a wing structure for a hypersonic aircraft
NASA Technical Reports Server (NTRS)
Fields, R. A.; Reardon, L. F.; Siegel, W. H.
1980-01-01
Room-temperature loading tests were conducted on a wing structure designed with a beaded panel concept for a Mach 8 hypersonic research airplane. Strain, stress, and deflection data were compared with the results of three finite-element structural analysis computer programs and with design data. The test program data were used to evaluate the structural concept and the methods of analysis used in the design. A force-stiffness technique was utilized in conjunction with load conditions which produced various combinations of panel shear and compression loading to determine the failure envelope of the buckling-critical beaded panels. The force-stiffness data did not result in any predictions of buckling failure. It was, therefore, concluded that the panels were conservatively designed as a result of design constraints and assumptions of panel eccentricities. The analysis programs calculated strains and stresses accurately. Comparisons between calculated and measured structural deflections showed good agreement. The test program offered a positive demonstration of the beaded panel concept subjected to room-temperature load conditions.
Microgravity sciences application visiting scientist program
NASA Technical Reports Server (NTRS)
1994-01-01
Contract NAS8-38785, Microgravity Experimental and Theoretical Research, is a project involving a large number of individual research programs related to: determination of the structure of human serum albumin and other biomedically important proteins; analysis of thermodynamic properties of various proteins and models of protein nucleation; development of experimental techniques for the growth of protein crystals in space; study of the physics of electrical double layers in the mechanics of liquid interfaces; computational analysis of vapor crystal growth processes in microgravity; analysis of the influence of magnetic fields in damping residual flows in directional solidification processes; crystal growth and characterization of II-VI semiconductor alloys; and production of thin films for nonlinear optics. It is not intended that the programs will be necessarily limited to this set at any one time. The visiting scientists accomplishing these programs shall serve on-site at MSFC to take advantage of existing laboratory facilities and the daily opportunities for technical communications with various senior scientists.
NASA Technical Reports Server (NTRS)
Daly, J. K.
1974-01-01
The programming techniques used to implement the equations and mathematical techniques of the Houston Operations Predictor/Estimator (HOPE) orbit determination program on the UNIVAC 1108 computer are described. Detailed descriptions are given of the program structure, the internal program tables and program COMMON, modification and maintenance techniques, and individual subroutine documentation.
Time history solution program, L225 (TEV126). Volume 1: Engineering and usage
NASA Technical Reports Server (NTRS)
Kroll, R. I.; Tornallyay, A.; Clemmons, R. E.
1979-01-01
Volume 1 of a two volume document is presented. The usage of the convolution program L225 (TEV 126) is described. The program calculates the time response of a linear system by convoluting the impulsive response function with the time-dependent excitation function. The convolution is performed as a multiplication in the frequency domain. Fast Fourier transform techniques are used to transform the product back into the time domain to obtain response time histories. A brief description of the analysis used is presented.
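The frequency-domain route the abstract describes (transform both signals, multiply, and inverse transform the product back to the time domain) can be sketched in plain Python with a textbook radix-2 FFT. This is an illustration of the technique, not the L225 code:

```python
import cmath

def fft(x, inverse=False):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of two.
    No scaling is applied here; the caller divides by the length once
    after the inverse transform."""
    n = len(x)
    if n == 1:
        return list(x)
    sign = 1 if inverse else -1
    even = fft(x[0::2], inverse)
    odd = fft(x[1::2], inverse)
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + w
        out[k + n // 2] = even[k] - w
    return out

def convolve_via_fft(impulse, excitation):
    """Time response by convolution of an impulse response with an
    excitation, performed as a multiplication in the frequency domain."""
    m = len(impulse) + len(excitation) - 1
    size = 1
    while size < m:
        size *= 2  # zero-pad to a power of two so linear != circular wrap
    a = fft(list(map(complex, impulse)) + [0j] * (size - len(impulse)))
    b = fft(list(map(complex, excitation)) + [0j] * (size - len(excitation)))
    product = [x * y for x, y in zip(a, b)]
    time = fft(product, inverse=True)
    return [v.real / size for v in time[:m]]
```

The zero-padding step is essential: without it the frequency-domain product corresponds to circular convolution, and late response samples would wrap around into the start of the time history.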
NASA Technical Reports Server (NTRS)
Mohr, R. L.
1975-01-01
A set of four digital computer programs is presented which can be used to investigate the effects of instrumentation errors on the accuracy of aircraft and helicopter stability-and-control derivatives identified from flight test data. The programs assume that the differential equations of motion are linear and consist of small perturbations about a quasi-steady flight condition. It is also assumed that a Newton-Raphson optimization technique is used for identifying the estimates of the parameters. Flow charts and printouts are included.
NASA Astrophysics Data System (ADS)
Svirina, Anna; Shindor, Olga; Tatmyshevsky, Konstantin
2014-12-01
The paper deals with the main problems of Russian energy system development that prove it necessary to provide educational programs in the field of renewable and alternative energy. In the paper, the process of curricula development and of defining teaching techniques on the basis of expert opinion evaluation is described, and a competence model for renewable and alternative energy processing master students is suggested. On the basis of a distributed questionnaire and in-depth interviews, the data for statistical analysis were obtained. On the basis of these data, an optimization of curricula structure was performed, and three models of a structure for optimizing teaching techniques were developed. The suggested educational program structure, which was adopted by employers, is presented in the paper. The findings include the quantitatively estimated importance of systemic thinking and professional skills and knowledge as basic competences of a master's program graduate; the statistically estimated necessity of a practice-based learning approach; and optimization models for structuring curricula in renewable and alternative energy processing. These findings allow the establishment of a platform for the development of educational programs.
NASA Technical Reports Server (NTRS)
Linford, R. M. F.; Allen, T. H.; Dillow, C. F.
1975-01-01
A program is described to design, fabricate and install an experimental work chamber assembly (WCA) to provide a wide range of experimental capability. The WCA incorporates several techniques for studying the kinetics of contaminant films and their effect on optical surfaces. It incorporates the capability for depositing both optical and contaminant films on temperature-controlled samples, and for in-situ measurements of the vacuum ultraviolet reflectance. Ellipsometer optics are mounted on the chamber for film thickness determinations, and other features include access ports for radiation sources and instrumentation. Several supporting studies were conducted to define specific chamber requirements, to determine the sensitivity of the measurement techniques to be incorporated in the chamber, and to establish procedures for handling samples prior to their installation in the chamber. A bibliography and literature survey of contamination-related articles is included.
Noh, Wonjung; Lim, Ji Young
2015-06-01
The purpose of this study was to identify the financial management educational needs of nurses in order to develop an educational program to strengthen their financial management competencies. Data were collected from two focus groups using the nominal group technique. The study consisted of three steps: a literature review, focus group discussion using the nominal group technique, and data synthesis. After analyzing the results, nine key components were selected: corporate management and accounting, introduction to financial management in hospitals, basic structure of accounting, basics of hospital accounting, basics of financial statements, understanding the accounts of financial statements, advanced analysis of financial statements, application of financial management, and capital financing of hospitals. The present findings can be used to develop a financial management education program to strengthen the financial management competencies of nurses. Copyright © 2015. Published by Elsevier B.V.
French, Michael T; Salomé, Helena J; Sindelar, Jody L; McLellan, A Thomas
2002-04-01
To provide detailed methodological guidelines for using the Drug Abuse Treatment Cost Analysis Program (DATCAP) and Addiction Severity Index (ASI) in a benefit-cost analysis of addiction treatment. A representative benefit-cost analysis of three outpatient programs was conducted to demonstrate the feasibility and value of the methodological guidelines. Procedures are outlined for using resource use and cost data collected with the DATCAP. Techniques are described for converting outcome measures from the ASI to economic (dollar) benefits of treatment. Finally, principles are advanced for conducting a benefit-cost analysis and a sensitivity analysis of the estimates. The DATCAP was administered at three outpatient drug-free programs in Philadelphia, PA, for 2 consecutive fiscal years (1996 and 1997). The ASI was administered to a sample of 178 treatment clients at treatment entry and at 7 months postadmission. The DATCAP and ASI appear to have significant potential for contributing to an economic evaluation of addiction treatment. The benefit-cost analysis and subsequent sensitivity analysis all showed that total economic benefit was greater than total economic cost at the three outpatient programs, but this representative application is meant to stimulate future economic research rather than to justify treatment per se. This study used previously validated, research-proven instruments and methods to perform a practical benefit-cost analysis of real-world treatment programs. The study demonstrates one way to combine economic and clinical data and offers a methodological foundation for future economic evaluations of addiction treatment.
Computer codes for thermal analysis of a solid rocket motor nozzle
NASA Technical Reports Server (NTRS)
Chauhan, Rajinder Singh
1988-01-01
A number of computer codes are available for performing thermal analysis of solid rocket motor nozzles. The Aerotherm Chemical Equilibrium (ACE) computer program can be used to perform a one-dimensional gas expansion to determine the state of the gas at each location of a nozzle. The ACE outputs can be used as input to a computer program called the Momentum/Energy Integral Technique (MEIT) for predicting boundary layer development, shear, and heating on the surface of the nozzle. The output from MEIT can be used as input to another computer program called the Aerotherm Charring Material Thermal Response and Ablation Program (CMA). This program is used to calculate the ablation or decomposition response of the nozzle material. A code called the Failure Analysis Nonlinear Thermal and Structural Integrated Code (FANTASTIC) is also likely to be used for performing thermal analysis of solid rocket motor nozzles after the program is duly verified. Part of the verification work on FANTASTIC was done using one- and two-dimensional heat transfer examples with known answers. An attempt was made to prepare input for performing thermal analysis of the CCT nozzle using the FANTASTIC computer code. The CCT nozzle problem will first be solved using ACE, MEIT, and CMA. The same problem will then be solved using FANTASTIC, and the results will be compared for verification of FANTASTIC.
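The ACE-MEIT-CMA chain ultimately feeds a transient conduction solution through the nozzle wall. As a rough illustration of that final step only, the sketch below solves 1-D transient heat conduction through a liner slab with an explicit finite-difference scheme. The thickness, diffusivity, and boundary temperatures are invented placeholders, and the charring/ablation physics that CMA actually models is omitted entirely.

```python
# Illustrative sketch only: 1-D explicit finite-difference transient heat
# conduction through a slab, with a hot-gas-side wall held at fixed
# temperature and an insulated back face. NOT the CMA/FANTASTIC formulation
# (no charring, no ablation); all material values are made-up placeholders.

def heat_conduction_1d(n_nodes=11, thickness=0.02, alpha=1e-6,
                       t_hot=2000.0, t_init=300.0, t_end=10.0):
    """Return nodal temperatures after t_end seconds of hot-wall heating."""
    dx = thickness / (n_nodes - 1)
    dt = 0.4 * dx * dx / alpha            # stable: dt <= dx^2 / (2 * alpha)
    temps = [t_init] * n_nodes
    temps[0] = t_hot                      # hot-gas-side boundary condition
    for _ in range(int(t_end / dt)):
        new = temps[:]
        for i in range(1, n_nodes - 1):
            new[i] = temps[i] + (alpha * dt / dx**2) * (
                temps[i + 1] - 2 * temps[i] + temps[i - 1])
        new[-1] = new[-2]                 # insulated back face
        temps = new
    return temps
```

The resulting profile decreases monotonically from the hot face toward the insulated back face, which is the qualitative behavior a nozzle-liner conduction code must reproduce before ablation effects are added.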
Vilaro, M J; Barnett, T E; Watson, A M; Merten, J W; Mathews, A E
2017-01-01
In 2006, food industry self-regulatory efforts aimed to balance the mix of food advertisements to limit children's exposure to unhealthy food products. An update to these efforts proposed to eliminate all unhealthy advertisements during peak child viewing times and implement uniform nutrition criteria by December 2013. Marketing techniques are not currently addressed in self-regulatory efforts. The food industry's pledge prompted researchers to conduct a content analysis to assess nutritional quality and the presence of persuasive marketing techniques in child-directed food and beverage advertisements. Content analysis. 32 h of children's television programming were recorded in February 2013. Three independent coders assessed the nutritional content of food and beverage advertisements using the UK Nutrition Profiling System and assessed the presence of persuasive techniques (PTs) using a rating form developed for this study. Overall, 13.75% of advertisements promoted a food or beverage product. Most food advertisements (54.6%) represented unhealthy products, and 95.48% of food advertisements contained at least one PT. The number of PTs was not significantly different for healthy (M = 4.98, SD = 2.07) and unhealthy food advertisements (M = 4.66, SD = 1.82); however, food advertisements aimed at children used significantly more PTs (M = 5.5, SD = 1.43) than those targeting adults (M = 1.52, SD = 1.54), t(153) = 11.738, P < 0.0001. Saturday morning children's programming showed significantly fewer food advertisements than weekday morning children's programming. While a majority of food-related advertisements represented unhealthy items, advertisements airing during Saturday morning programming featured fewer food advertisements overall and were more frequently for healthier items compared to weekdays. Industry self-regulation may not be effective for reducing overall unhealthy ad exposure but may play a role in reduced exposure on weekends.
Despite policy efforts, additional changes are needed to improve ad exposure experienced by children with a focus on addressing the persistent use of persuasive marketing techniques in food advertising intended for children. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
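The reported comparison of persuasive-technique counts, t(153) = 11.738, is a standard pooled two-sample t statistic. A minimal sketch of that computation from summary statistics follows; the abstract gives df = 153 but not the per-group sample sizes, so any split used with this function is hypothetical.

```python
# Pooled (equal-variance) two-sample t statistic from summary statistics,
# as reported in the study for PT counts in child- vs adult-directed ads.
import math

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Student's two-sample t and degrees of freedom from summary stats."""
    df = n1 + n2 - 2
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df   # pooled variance
    t = (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, df
```

For example, `pooled_t(5.5, 1.43, 100, 1.52, 1.54, 55)` uses a hypothetical 100/55 split (which gives df = 153); since the true group sizes are not stated in the abstract, it is not expected to reproduce the reported t value exactly.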
Nonlinear Analysis of Squeeze Film Dampers Applied to Gas Turbine Helicopter Engines.
1980-11-01
calculate the stability (complex roots) of a multi-level gas turbine with aerodynamic excitation. This program has been applied to the space shuttle...such phenomena as oil film whirl. This paper develops an analysis technique incorporating modal analysis and fast Fourier transform techniques to...USING A SQUEEZE FILM BEARING. By M. A. Simpson, Research Engineer, and L. E. Barrett, Research Assistant Professor, Department of Mechanical and Aerospace
Dynamic analysis for shuttle design verification
NASA Technical Reports Server (NTRS)
Fralich, R. W.; Green, C. E.; Rheinfurth, M. H.
1972-01-01
Two approaches used for determining the modes and frequencies of space shuttle structures are discussed. The first method, direct numerical analysis, involves finite element mathematical modeling of the space shuttle structure so that computer programs for dynamic structural analysis can be used. The second method utilizes modal-coupling techniques: experimental verification is made by vibrating only spacecraft components and deducing the modes and frequencies of the complete vehicle from the results obtained in the component tests.
Building Structure Housing: Case Study of Community Housing in Kendari City
NASA Astrophysics Data System (ADS)
Umar, M. Z.; Faslih, A.; Arsyad, M.; Sjamsu, A. S.; Kadir, I.
2017-11-01
Housing development has been pioneered through a simple home construction program to reduce production costs. A simple housing program was developed in Kendari City. The purpose of this study is to show the principles of reducing production costs for type 36 houses in Kendari City. The selected architectural objects are the lower, middle, and upper structures of a type 36 house. The data were collected through observation and in-depth discussion with construction workers. The analysis technique used in this research was descriptive narrative analysis in the form of tabulated data. This study concluded that there are several principles of price reduction in the structure of public housing buildings. Principles of speed appear in construction techniques, such as using cigarette packs as foundation pads, using mortar so that walls stand rapidly, and gauging the spacing of mortar manually with two fingers of the hand. Principles of economy appear in material choices, such as eliminating gravel from the concrete, using sand to fill the soil, building the foundation without sand or empty stone, and forming the ring beam with triangular reinforcement.
Logic programming to predict cell fate patterns and retrodict genotypes in organogenesis.
Hall, Benjamin A; Jackson, Ethan; Hajnal, Alex; Fisher, Jasmin
2014-09-06
Caenorhabditis elegans vulval development is a paradigm system for understanding cell differentiation in the process of organogenesis. Through temporal and spatial controls, the fate pattern of six cells is determined by the competition of the LET-23 and the Notch signalling pathways. Modelling cell fate determination in vulval development using state-based models, coupled with formal analysis techniques, has been established as a powerful approach in predicting the outcome of combinations of mutations. However, computing the outcomes of complex and highly concurrent models can become prohibitive. Here, we show how logic programs derived from state machines describing the differentiation of C. elegans vulval precursor cells can increase the speed of prediction by four orders of magnitude relative to previous approaches. Moreover, this increase in speed allows us to infer, or 'retrodict', compatible genomes from cell fate patterns. We exploit this technique to predict highly variable cell fate patterns resulting from dig-1 reduced-function mutations and let-23 mosaics. In addition to the new insights offered, we propose our technique as a platform for aiding the design and analysis of experimental data. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis
Rochon, Kateryn; Duehl, Adrian J.; Anderson, John F.; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C.; Obenauer, Peter J.; Campbell, James F.; Lysyk, Tim J.; Allan, Sandra A.
2015-01-01
Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium “Advancements in arthropod monitoring technology, techniques, and analysis” presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones, and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae), respectively. After sampling, extrapolating real-world population numbers from trap capture data is possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles. PMID:26543242
Zhou, Xiao-Rong; Huang, Shui-Sheng; Gong, Xin-Guo; Cen, Li-Ping; Zhang, Cong; Zhu, Hong; Yang, Jun-Jing; Chen, Li
2012-04-01
To construct a performance evaluation and management system for advanced schistosomiasis medical treatment, and to analyze and evaluate the work of advanced schistosomiasis medical treatment over the years. By applying database management and C++ programming techniques, we entered the information of the advanced schistosomiasis cases into the system and comprehensively evaluated the work of advanced schistosomiasis medical treatment through cost-effect analysis, cost-effectiveness analysis, and cost-benefit analysis. We developed a set of software formulas for cost-effect analysis, cost-effectiveness analysis, and cost-benefit analysis. The system has a clear structure and a friendly interface, is easy to operate, and makes information entry and search convenient. It can benefit the performance evaluation of the province's advanced schistosomiasis medical treatment work. This system satisfies the current needs of advanced schistosomiasis medical treatment work and can easily be adopted widely.
Eigenproblem solution by a combined Sturm sequence and inverse iteration technique.
NASA Technical Reports Server (NTRS)
Gupta, K. K.
1973-01-01
Description of an efficient and numerically stable algorithm, along with a complete listing of the associated computer program, developed for the accurate computation of specified roots and associated vectors of the eigenvalue problem Aq = lambda Bq with band symmetric A and B, B being also positive-definite. The desired roots are first isolated by the Sturm sequence procedure; then a special variant of the inverse iteration technique is applied for the individual determination of each root along with its vector. The algorithm fully exploits the banded form of the relevant matrices, and the associated program, written in FORTRAN V for the JPL UNIVAC 1108 computer, proves to be significantly more economical than similar existing procedures. The program may be conveniently utilized for the efficient solution of practical engineering problems involving free vibration and buckling analysis of structures. Results of such analyses are presented for representative structures.
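The root-isolation phase described above can be sketched in miniature. The toy below treats only the standard symmetric tridiagonal problem A q = lambda q (the paper handles the banded generalized problem Aq = lambda Bq and follows isolation with inverse iteration for the vectors): a Sturm-sequence count of eigenvalues below a shift drives bisection until the desired root is pinned down.

```python
# Sturm-sequence eigenvalue isolation for a symmetric tridiagonal matrix,
# a simplified stand-in for the paper's banded generalized procedure.

def sturm_count(diag, off, sigma):
    """Number of eigenvalues of tridiag(off, diag, off) below sigma."""
    count, d = 0, 1.0
    for i, a in enumerate(diag):
        b2 = off[i - 1] ** 2 if i else 0.0
        d = (a - sigma) - b2 / d          # LDL^T pivot recurrence
        if d == 0.0:
            d = -1e-300                   # guard against exact zero pivots
        if d < 0.0:
            count += 1                    # negative pivots count eigenvalues
    return count

def isolate_root(diag, off, k, lo, hi, tol=1e-12):
    """Bisection for the k-th smallest eigenvalue (k = 1, 2, ...)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if sturm_count(diag, off, mid) >= k:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

For the 3x3 matrix with diagonal 2 and off-diagonal -1, the eigenvalues are 2 - sqrt(2), 2, and 2 + sqrt(2), and bisection recovers them to machine-level accuracy; a production code would then run inverse iteration at each isolated root to obtain the vector.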
NASA Technical Reports Server (NTRS)
Yang, Guowei; Pasareanu, Corina S.; Khurshid, Sarfraz
2012-01-01
This paper introduces memoized symbolic execution (Memoise), a novel approach for more efficient application of forward symbolic execution, which is a well-studied technique for systematic exploration of program behaviors based on bounded execution paths. Our key insight is that application of symbolic execution often requires several successive runs of the technique on largely similar underlying problems, e.g., running it once to check a program to find a bug, fixing the bug, and running it again to check the modified program. Memoise introduces a trie-based data structure that stores the key elements of a run of symbolic execution. Maintenance of the trie during successive runs allows re-use of previously computed results of symbolic execution without the need for re-computing them as is traditionally done. Experiments using our prototype embodiment of Memoise show the benefits it holds in various standard scenarios of using symbolic execution, e.g., with iterative deepening of exploration depth, to perform regression analysis, or to enhance coverage.
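The trie idea at the heart of Memoise can be sketched with a toy. The code below is not the Memoise implementation: it simply caches a "solver verdict" per explored branch-decision prefix, so a second run over largely the same paths issues no new solver queries, which is the re-use pattern the abstract describes.

```python
# Toy sketch of trie-based memoization of symbolic-execution results.
# Branch outcomes (True/False) index trie edges; each node caches the
# feasibility verdict for its path prefix. NOT the Memoise code.

class TrieNode:
    def __init__(self):
        self.children = {}      # branch outcome -> TrieNode
        self.feasible = None    # cached solver verdict for this prefix

def explore(path, solver_calls, trie):
    """Check a path (tuple of branch outcomes), reusing cached verdicts."""
    node = trie
    for outcome in path:
        node = node.children.setdefault(outcome, TrieNode())
    if node.feasible is None:
        solver_calls.append(path)   # stand-in for an expensive SMT query
        node.feasible = True        # this toy assumes every path is feasible
    return node.feasible
```

Running the same set of paths twice against one trie triggers the "solver" only on the first pass; in the real tool, maintaining this structure across program versions is what lets iterative-deepening and regression-analysis runs avoid recomputation.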
Using Meta Analysis Techniques to Assess the Safety Effect of Red Light Running Cameras
DOT National Transportation Integrated Search
2002-02-01
Automated enforcement programs, including automated systems that are used to enforce red light running violations, have recently come under scrutiny regarding their value in terms of improving safety, their primary purpose. One of the major hurdles t...
Standards guide for space and earth sciences computer software
NASA Technical Reports Server (NTRS)
Mason, G.; Chapman, R.; Klinglesmith, D.; Linnekin, J.; Putney, W.; Shaffer, F.; Dapice, R.
1972-01-01
Guidelines for the preparation of systems analysis and programming work statements are presented. The data is geared toward the efficient administration of available monetary and equipment resources. Language standards and the application of good management techniques to software development are emphasized.
ERIC Educational Resources Information Center
STONE, PHILIP J.
Automated language processing (content analysis) is engaged in new ventures in computer dialog as a result of new techniques in categorizing responses. A computer "need-achievement" scoring system has been developed. A set of computer programs, labeled "The General Inquirer," will score computer inputs with responses fed from…
Logic Design Pathology and Space Flight Electronics
NASA Technical Reports Server (NTRS)
Katz, Richard B.; Barto, Rod L.; Erickson, Ken
1999-01-01
This paper presents a look at logic design from early in the US Space Program and examines faults in recent logic designs. Most examples are based on flight hardware failures and analysis of new tools and techniques. The paper is presented in viewgraph form.
High accuracy-nationwide differential global positioning system test and analysis : phase II report
DOT National Transportation Integrated Search
2005-07-01
The High Accuracy-Nationwide Differential Global Positioning System (HA-NDGPS) program focused on the development of compression and broadcast techniques to provide users over a large area with very accurate radio navigation solutions. The goal was ac...
Land Use Management for Solid Waste Programs
ERIC Educational Resources Information Center
Brown, Sanford M., Jr.
1974-01-01
The author discusses the problems of solid waste disposal and examines various land use management techniques. These include the land use plan, zoning, regionalization, land utilities, and interim use. Information concerning solid waste processing site zoning and analysis is given. Bibliography included. (MA)
Extensions and Adjuncts to the BRL-COMGEOM Program
1974-08-01
MAGIC Code, GIFT Code, Computer Simulation, Target Description, Geometric Modeling Techniques, Vulnerability Analysis ... Arbitrary Quadric Surface ... III. BRITL: A Geometry Preprocessor Program for Input to the GIFT System ... Introduction ... the BRL-GIFT code. The tasks completed under this contract and described in the report are: A. The addition to the list of available body types
United States Air Force Graduate Student Summer Support Program 1986. Program Management Report
1986-12-01
… a reliable technique for obtaining confidence intervals for the population correlation under most selection situations. … Six analyses were evaluated and some were tested for effective use … Recommendations …
Role of U.S. Security Assistance in Modernizing the Portuguese Armed Forces: A Historical Analysis.
1986-09-01
Portuguese Air Force Fiscal Year 1986 IMET/FMS Training Program; Security Assistance Management Manual; "Portuguese Navy: A Naval Fleet that is...of the techniques of fiscal management and, within the limits that he had set for the regime, his program of economic recovery succeeded.... What...(1984). Currency: Escudo. Agriculture: generally developed; 8.8% of GDP; main crops: grains, potatoes, olives, grapes (wine); deficit foods: sugar
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paul, I.R.
A pilot study in two states led to the establishment of the Dental Exposure Normalization Technique (DENT) program. This, in brief, is an exposure reduction and quality assurance program for radiological health agencies. The health agency sends X-ray exposure cards to dental X-ray facilities. These are exposed by the dentist and returned for analysis. Facilities which show excessive exposure are then visited to demonstrate the changes in exposure and processing necessary to produce diagnostic quality radiographs with minimum patient exposure.
A technique for integrating engine cycle and aircraft configuration optimization
NASA Technical Reports Server (NTRS)
Geiselhart, Karl A.
1994-01-01
A method for conceptual aircraft design that incorporates the optimization of major engine design variables for a variety of cycle types was developed. The methodology should improve the lengthy screening process currently involved in selecting an appropriate engine cycle for a given application or mission. The new capability will allow environmental concerns such as airport noise and emissions to be addressed early in the design process. The ability to rapidly perform optimization and parametric variations using both engine cycle and aircraft design variables, and to see the impact on the aircraft, should provide insight and guidance for more detailed studies. A brief description of the aircraft performance and mission analysis program and the engine cycle analysis program that were used is given. A new method of predicting propulsion system weight and dimensions using thermodynamic cycle data, preliminary design, and semi-empirical techniques is introduced. Propulsion system performance and weights data generated by the program are compared with industry data and data generated using well established codes. The ability of the optimization techniques to locate an optimum is demonstrated and some of the problems that had to be solved to accomplish this are illustrated. Results from the application of the program to the analysis of three supersonic transport concepts installed with mixed flow turbofans are presented. The results from the application to a Mach 2.4, 5000 n.mi. transport indicate that the optimum bypass ratio is near 0.45 with less than 1 percent variation in minimum gross weight for bypass ratios ranging from 0.3 to 0.6. In the final application of the program, a low-sonic-boom concept of fixed takeoff gross weight that would fly at Mach 2.0 overwater and at Mach 1.6 overland is compared with a baseline concept of the same takeoff gross weight that would fly at Mach 2.4 overwater and subsonically overland.
The results indicate that for the design mission, the low boom concept has a 5 percent total range penalty relative to the baseline. Additional cycles were optimized for various design overland distances and the effect of flying off-design overland distances is illustrated.
Specification and Error Pattern Based Program Monitoring
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Johnson, Scott; Rosu, Grigore; Clancy, Daniel (Technical Monitor)
2001-01-01
We briefly present Java PathExplorer (JPaX), a tool developed at NASA Ames for monitoring the execution of Java programs. JPaX can be used not only during program testing to reveal subtle errors, but can also be applied during operation to survey safety-critical systems. The tool facilitates automated instrumentation of a program in order to properly observe its execution. The instrumentation can be done either at the bytecode level or at the source level when the source code is available. JPaX is an instance of a more general project, called PathExplorer (PAX), which is a basis for experiments rather than a fixed system, capable of monitoring various programming languages and experimenting with other logics and analysis techniques.
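The monitoring idea can be sketched without any of JPaX's machinery: an instrumented program emits a stream of events, and a small state-machine monitor checks a safety property over that stream. The property below ("no write after close") and the event names are invented for illustration; JPaX's actual logic engine is far more general.

```python
# Hedged sketch of runtime safety monitoring (not JPaX's logic engine):
# a finite-state monitor consumes an instrumented event stream and flags
# events that violate the invented property "never write after close".

def monitor(events):
    """Return the indices of events that violate the property."""
    closed, violations = False, []
    for i, ev in enumerate(events):
        if ev == "close":
            closed = True
        elif ev == "open":
            closed = False
        elif ev == "write" and closed:
            violations.append(i)        # write observed in the closed state
    return violations
```

A trace such as `["open", "write", "close", "write"]` yields a violation at index 3, which is the kind of verdict a monitor delivers both during testing and in deployed operation.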
How do particle physicists learn the programming concepts they need?
NASA Astrophysics Data System (ADS)
Kluth, S.; Pia, M. G.; Schoerner-Sadenius, T.; Steinbach, P.
2015-12-01
The ability to read, use, and develop code efficiently and successfully is a key ingredient in modern particle physics. We report the experience of a training program, identified as “Advanced Programming Concepts”, that introduces software concepts, methods, and techniques for working effectively on a daily basis in a HEP experiment or other programming-intensive fields. This paper illustrates the principles, motivations, and methods that shape the “Advanced Programming Concepts” training program, the knowledge base that it conveys, an analysis of the feedback received so far, and the integration of these concepts into the software development process of the experiments, as well as its applicability to a wider audience.
Analysis and testing of numerical formulas for the initial value problem
NASA Technical Reports Server (NTRS)
Brown, R. L.; Kovach, K. R.; Popyack, J. L.
1980-01-01
Three computer programs for evaluating and testing numerical integration formulas used with fixed-stepsize programs to solve initial value systems of ordinary differential equations are described. SERIES, a program written in PASCAL, takes as input the differential equations and produces a FORTRAN subroutine for the derivatives of the system and for computing the actual solution through recursive power series techniques. Both of these are used by STAN, a FORTRAN program that interactively displays a discrete analog of the Liapunov stability region of any two-dimensional subspace of the system. The derivatives may be used by CLMP, a FORTRAN program, to test the fixed-stepsize formula against a good numerical result and interactively display the solutions.
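The stability-region display that STAN provides can be illustrated for the simplest fixed-stepsize formula: forward Euler applied to the test equation y' = λy is stable exactly where |1 + hλ| ≤ 1, a unit disc centred at -1 in the hλ plane. The sketch below (not STAN's algorithm) grid-samples that region the way a display program might.

```python
# Sketch of a discrete stability-region computation for forward Euler.
# The amplification factor for y' = lambda*y is R(z) = 1 + z with
# z = h*lambda; stability means |R(z)| <= 1.

def euler_stable(z):
    """True if forward Euler damps y' = lambda*y for h*lambda = z."""
    return abs(1 + z) <= 1

def region_points(re_range, im_range, n=81):
    """Grid-sample the stability region, as a display program might."""
    pts = []
    for i in range(n):
        for j in range(n):
            re = re_range[0] + i * (re_range[1] - re_range[0]) / (n - 1)
            im = im_range[0] + j * (im_range[1] - im_range[0]) / (n - 1)
            if euler_stable(complex(re, im)):
                pts.append((re, im))
    return pts
```

Higher-order formulas substitute their own amplification polynomial for `1 + z`; comparing the resulting regions is one way to evaluate candidate integration formulas before committing to a fixed stepsize.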
NASA Technical Reports Server (NTRS)
Crook, Andrew J.; Delaney, Robert A.
1992-01-01
The computer program user's manual for the ADPACAPES (Advanced Ducted Propfan Analysis Code-Average Passage Engine Simulation) program is included. The objective of the computer program is the development of a three-dimensional Euler/Navier-Stokes flow analysis for fan section/engine geometries containing multiple blade rows and multiple spanwise flow splitters. An existing procedure developed by Dr. J. J. Adamczyk and associates at the NASA Lewis Research Center was modified to accept multiple spanwise splitter geometries and simulate engine core conditions. The numerical solution is based upon a finite volume technique with a four-stage Runge-Kutta time marching procedure. Multiple blade row solutions are based upon the average-passage system of equations. The numerical solutions are performed on an H-type grid system, with meshes meeting the requirement of maintaining a common axisymmetric mesh for each blade row grid. The analysis was run on several geometry configurations ranging from one to five blade rows and from one to four radial flow splitters. The efficiency of the solution procedure was shown to be the same as the original analysis.
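The four-stage Runge-Kutta time marching named above can be illustrated on a scalar model problem dy/dt = R(y). The real solver applies a multistage loop of this shape to the finite-volume residual of the average-passage equations on the H-type grid, and its stage coefficients may differ from the classical ones used in this sketch.

```python
# Illustrative only: one classical four-stage Runge-Kutta step for a scalar
# "residual" dy/dt = R(y), standing in for the finite-volume flow residual.

def rk4_step(residual, y, dt):
    """Advance y by one classical four-stage Runge-Kutta step."""
    k1 = residual(y)
    k2 = residual(y + 0.5 * dt * k1)
    k3 = residual(y + 0.5 * dt * k2)
    k4 = residual(y + dt * k3)
    return y + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
```

On dy/dt = y with y(0) = 1 and dt = 0.1, one step reproduces exp(0.1) to about fourth-order accuracy, the property that makes multistage marching attractive for driving a flow solution to steady state.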
Retinal Imaging Techniques for Diabetic Retinopathy Screening
Goh, James Kang Hao; Cheung, Carol Y.; Sim, Shaun Sebastian; Tan, Pok Chien; Tan, Gavin Siew Wei; Wong, Tien Yin
2016-01-01
Due to the increasing prevalence of diabetes mellitus, demand for diabetic retinopathy (DR) screening platforms is steeply increasing. Early detection and treatment of DR are key public health interventions that can greatly reduce the likelihood of vision loss. Current DR screening programs typically employ retinal fundus photography, which relies on skilled readers for manual DR assessment. However, this is labor-intensive and suffers from inconsistency across sites. Hence, there has been a recent proliferation of automated retinal image analysis software that may potentially alleviate this burden cost-effectively. Furthermore, current screening programs based on 2-dimensional fundus photography do not effectively screen for diabetic macular edema (DME). Optical coherence tomography is becoming increasingly recognized as the reference standard for DME assessment and can potentially provide a cost-effective solution for improving DME detection in large-scale DR screening programs. Current screening techniques are also unable to image the peripheral retina and require pharmacological pupil dilation; ultra-widefield imaging and confocal scanning laser ophthalmoscopy, which address these drawbacks, possess great potential. In this review, we summarize the current DR screening methods using various retinal imaging techniques, and also outline future possibilities. Advances in retinal imaging techniques can potentially transform the management of patients with diabetes, providing savings in health care costs and resources. PMID:26830491
Analysis and synthesis of abstract data types through generalization from examples
NASA Technical Reports Server (NTRS)
Wild, Christian
1987-01-01
The discovery of general patterns of behavior from a set of input/output examples can be a useful technique in the automated analysis and synthesis of software systems. These generalized descriptions of the behavior form a set of assertions which can be used for validation, program synthesis, program testing, and run-time monitoring. Describing the behavior is characterized as a learning process in which the set of inputs is mapped into an appropriate transform space such that general patterns can be easily characterized. The learning algorithm must chose a transform function and define a subset of the transform space which is related to equivalence classes of behavior in the original domain. An algorithm for analyzing the behavior of abstract data types is presented and several examples are given. The use of the analysis for purposes of program synthesis is also discussed.
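A toy illustration of the generalization idea (not the paper's algorithm): propose candidate behavioural assertions and retain only those consistent with every input/output example of an operation; the survivors then serve as assertions for validation, testing, or run-time monitoring. The candidate properties below are invented for a list-returning operation.

```python
# Toy sketch of generalizing behavior from input/output examples.
# Candidate assertions are checked against all examples; those that hold
# everywhere are kept as inferred behavioral assertions.

CANDIDATES = {
    "len_preserved": lambda x, y: len(y) == len(x),
    "is_sorted":     lambda x, y: list(y) == sorted(y),
    "same_elements": lambda x, y: sorted(y) == sorted(x),
}

def generalize(examples):
    """Return names of candidate assertions holding for all (x, y) pairs."""
    return {name for name, prop in CANDIDATES.items()
            if all(prop(x, y) for x, y in examples)}
```

Given examples of a sorting operation, all three candidates survive; given examples of a reversal, `is_sorted` is eliminated, leaving a description that distinguishes the two behaviors, which is the flavor of equivalence-class characterization the abstract describes.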
NASA Technical Reports Server (NTRS)
Gallardo, V. C.; Storace, A. S.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.
1981-01-01
The component element method was used to develop a transient dynamic analysis computer program, essentially based on modal synthesis combined with a central finite difference numerical integration scheme. The methodology leads to a modular, or building-block, technique that is amenable to computer programming. To verify the analytical method, the turbine engine transient response analysis (TETRA) program was applied to two blade-out test vehicles that had been previously instrumented and tested. Comparison of the time-dependent test data with those predicted by TETRA led to recommendations for refinement or extension of the analytical method to improve its accuracy and overcome its shortcomings. The development of the working equations, their discretization, the numerical solution scheme, the modular concept of engine modeling, the program's logical structure, and some illustrative results are discussed. The blade-loss test vehicles (rig and full engine), the type of measured data, and the engine structural model are described.
Control of flexible structures
NASA Technical Reports Server (NTRS)
Russell, R. A.
1985-01-01
The requirements for future space missions indicate that many of these spacecraft will be large, flexible, and in some applications, require precision geometries. A technology program that addresses the issues associated with the structure/control interactions for these classes of spacecraft is discussed. The goal of the NASA control of flexible structures technology program is to generate a technology data base that will provide the designer with options and approaches to achieve spacecraft performance such as maintaining geometry and/or suppressing undesired spacecraft dynamics. This technology program will define the appropriate combination of analysis, ground testing, and flight testing required to validate the structural/controls analysis and design tools. This work was motivated by a recognition that large minimum weight space structures will be required for many future missions. The tools necessary to support such design included: (1) improved structural analysis; (2) modern control theory; (3) advanced modeling techniques; (4) system identification; and (5) the integration of structures and controls.
Microbial community analysis using MEGAN.
Huson, Daniel H; Weber, Nico
2013-01-01
Metagenomics, the study of microbes in the environment using DNA sequencing, depends upon dedicated software tools for processing and analyzing very large sequencing datasets. One such tool is MEGAN (MEtaGenome ANalyzer), which can be used to interactively analyze and compare metagenomic and metatranscriptomic data, both taxonomically and functionally. To perform a taxonomic analysis, the program places the reads onto the NCBI taxonomy, while functional analysis is performed by mapping reads to the SEED, COG, and KEGG classifications. Samples can be compared taxonomically and functionally using a wide range of charting and visualization techniques. PCoA analysis and clustering methods allow high-level comparison of large numbers of samples. Different attributes of the samples can be captured and used within the analysis. The program supports various input formats for loading data and can export analysis results in different text-based and graphical formats. It is designed to work with very large samples containing many millions of reads. MEGAN is written in Java, and installers for the three major computer operating systems are available from http://www-ab.informatik.uni-tuebingen.de. © 2013 Elsevier Inc. All rights reserved.
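MEGAN's placement of reads onto the NCBI taxonomy is based on the lowest-common-ancestor (LCA) idea: a read with hits to several taxa is assigned to the lowest node ancestral to all of them. A minimal sketch of the naive LCA assignment follows; the actual program adds bit-score thresholds and other refinements, and the toy taxonomy here is invented for illustration:

```python
def ancestors(taxon, parent):
    """Path from a taxon up to the root, inclusive, in a child -> parent map."""
    path = [taxon]
    while taxon in parent:
        taxon = parent[taxon]
        path.append(taxon)
    return path

def lca_assign(hit_taxa, parent):
    """Assign a read to the lowest taxon ancestral to all of its hits."""
    common = set.intersection(*(set(ancestors(t, parent)) for t in hit_taxa))
    # The LCA is the deepest node among the shared ancestors.
    return max(common, key=lambda t: len(ancestors(t, parent)))

# Toy taxonomy (invented): child -> parent edges.
parent = {"Bacteria": "root", "E. coli": "Bacteria", "Salmonella": "Bacteria"}
assigned = lca_assign(["E. coli", "Salmonella"], parent)  # conflicting hits rise to "Bacteria"
```

Reads with unambiguous hits stay at the species level, while conflicting hits are pushed up the tree, which is what makes the placement conservative.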
Omorczyk, Jarosław; Nosiadek, Leszek; Ambroży, Tadeusz; Nosiadek, Andrzej
2015-01-01
The main aim of this study was to verify the usefulness of selected simple methods of recording and fast biomechanical analysis performed by judges of artistic gymnastics in assessing a gymnast's movement technique. The participants comprised six artistic gymnastics judges, who assessed back handsprings using two methods: real-time observation and frame-by-frame video analysis. They also determined the flexion angles of the knee and hip joints using a computer program. With the real-time observation method, the judges gave a total of 5.8 error points, with an arithmetic mean of 0.16 points, for flexion of the knee joints; with the frame-by-frame video analysis method, the total was 8.6 error points and the mean 0.24 error points. For excessive flexion of the hip joints, the sum of the error values was 2.2 error points and the arithmetic mean 0.06 error points during real-time observation, whereas the sum obtained with the frame-by-frame analysis method was 10.8 and the mean 0.30 error points. Error values obtained through frame-by-frame video analysis of movement technique were thus higher than those obtained through real-time observation. The judges were able to indicate with good accuracy the number of the frame in which maximal joint flexion occurred. Both real-time observation and high-speed video analysis performed without determining the exact angles were found to be insufficient tools for improving the quality of judging.
NASA Technical Reports Server (NTRS)
Wing, L. D.
1979-01-01
Simplified analytical techniques adapted from sounding rocket programs are suggested as a means of bringing the cost of thermal analysis of Get Away Special (GAS) payloads within acceptable bounds. Particular attention is given to two methods drawn from sounding rocket technology: one in which the container and payload are assumed to be divided in half vertically by a thermal plane of symmetry, and one which considers the container and its payload to be an analogous one-dimensional unit having the real container top surface area for radiative heat transfer and a fictitious mass and geometry that model the average thermal effects.
NASA Technical Reports Server (NTRS)
Casas, J. C.; Koziana, J. V.; Saylor, M. S.; Kindle, E. C.
1982-01-01
Problems associated with the development of the measurement of air pollution from satellites (MAPS) experiment program are addressed. The primary thrust of this research was the utilization of the MAPS experiment data in three application areas: low-altitude aircraft flights (1 to 6 km); mid-altitude aircraft flights (8 to 12 km); and orbiting space platforms. Extensive research work in four major areas of data management formed the framework for implementation of the MAPS experiment technique. These areas are: (1) data acquisition; (2) data processing, analysis, and interpretation algorithms; (3) data display techniques; and (4) information production.
Automatic Generation of English-Japanese Translation Pattern Utilizing Genetic Programming Technique
NASA Astrophysics Data System (ADS)
Matsumura, Koki; Tamekuni, Yuji; Kimura, Shuhei
There are many structural differences between English and Japanese phrase templates, which often makes translation difficult. Moreover, the phrase templates and sentences to be referred to are numerous and varied, and it is not easy to prepare a corpus that covers them all. Automatically generating translation patterns from sentence patterns is therefore highly significant, both for the translation success rate and for the capacity of the pattern dictionary. To realize such automatic generation, this paper proposes a new method for generating translation patterns using the genetic programming (GP) technique. The technique attempts to automatically generate translation patterns for sentences that are not registered in the phrase template dictionary by applying genetic operations to the parse trees of basic patterns. These trees, built from English-Japanese sentence pairs, form the first-stage population. Parse-tree databases with 50, 100, 150, and 200 pairs were prepared as first-stage populations, and the system was applied to an English input of 1,555 sentences. As a result, the number of parse trees increased from 200 to 517, and the accuracy rate of the translation patterns improved from 42.57% to 70.10%. Furthermore, 86.71% of the generated translations were successful, with meanings that were acceptable and understandable. The proposed technique appears to be a promising way to raise the translation success rate and to reduce the size of the parse-tree database.
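The genetic operation applied to parse trees can be illustrated with standard GP subtree crossover. The sketch below assumes a tree is simply a (label, children) pair; it shows the general operator, not the authors' implementation, and the toy parse trees are invented:

```python
import random

def nodes(tree, path=()):
    """Enumerate (path, subtree) pairs for every node of a (label, children) tree."""
    yield path, tree
    for i, child in enumerate(tree[1]):
        yield from nodes(child, path + (i,))

def graft(tree, path, subtree):
    """Return a copy of tree with the node at path replaced by subtree."""
    if not path:
        return subtree
    label, children = tree
    children = list(children)
    children[path[0]] = graft(children[path[0]], path[1:], subtree)
    return (label, children)

def crossover(a, b, rng):
    """GP subtree crossover: graft a random subtree of b onto a random site of a."""
    site, _ = rng.choice(list(nodes(a)))
    _, donor = rng.choice(list(nodes(b)))
    return graft(a, site, donor)

rng = random.Random(0)
t1 = ("S", [("NP", []), ("VP", [])])                       # invented toy parse trees
t2 = ("S", [("NP", []), ("VP", [("V", []), ("NP", [])])])
child = crossover(t1, t2, rng)
```

Repeated crossover (plus a fitness measure such as translation accuracy against held-out pairs) is what lets the population of parse trees grow beyond the initially registered patterns.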
Computational Aeroelastic Analysis of the Semi-Span Super-Sonic Transport (S4T) Wind-Tunnel Model
NASA Technical Reports Server (NTRS)
Sanetrik, Mark D.; Silva, Walter A.; Hur, Jiyoung
2012-01-01
A summary of the computational aeroelastic analysis for the Semi-Span Super-Sonic Transport (S4T) wind-tunnel model is presented. A broad range of analysis techniques, including linear, nonlinear and Reduced Order Models (ROMs) were employed in support of a series of aeroelastic (AE) and aeroservoelastic (ASE) wind-tunnel tests conducted in the Transonic Dynamics Tunnel (TDT) at NASA Langley Research Center. This research was performed in support of the ASE element in the Supersonics Program, part of NASA's Fundamental Aeronautics Program. The analysis concentrated on open-loop flutter predictions, which were in good agreement with experimental results. This paper is one in a series that comprise a special S4T technical session, which summarizes the S4T project.
Background Characterization Techniques For Pattern Recognition Applications
NASA Astrophysics Data System (ADS)
Noah, Meg A.; Noah, Paul V.; Schroeder, John W.; Kessler, Bernard V.; Chernick, Julian A.
1989-08-01
The Department of Defense has a requirement to investigate technologies for the detection of air and ground vehicles in a clutter environment. The use of autonomous systems with infrared, visible, and millimeter-wave detectors has the potential to meet DOD's needs. In general, however, the hardware technology (large detector arrays with high sensitivity) has outpaced the development of processing techniques and software. In a complex background scene the "problem" is as much one of clutter rejection as it is target detection. The work described in this paper has investigated a new and innovative methodology for background clutter characterization, target detection, and target identification. The approach uses multivariate statistical analysis to evaluate a set of image metrics applied to infrared cloud imagery and terrain clutter scenes. The techniques are applied to two distinct problems: the characterization of atmospheric water vapor cloud scenes for the Navy's Infrared Search and Track (IRST) applications to support the Infrared Modeling Measurement and Analysis Program (IRAMMP), and the detection of ground vehicles for the Army's Autonomous Homing Munitions (AHM) problem. This work was sponsored under two separate Small Business Innovative Research (SBIR) programs by the Naval Surface Warfare Center (NSWC), White Oak, MD, and the Army Materiel Systems Analysis Activity at Aberdeen Proving Ground, MD. The software described in this paper will be available from the respective contract technical representatives.
NASA Astrophysics Data System (ADS)
Lee, Eun Young; Novotny, Johannes; Wagreich, Michael
2015-04-01
In recent years, 3D visualization of sedimentary basins has become increasingly popular. Stratigraphic and structural mapping is highly important for understanding the internal setting of sedimentary basins, and subsequent subsidence analysis provides significant insights into basin evolution. This study focused on developing a simple and user-friendly program that allows geologists to analyze and model sedimentary basin data. The program is aimed at stratigraphic and subsidence modelling of sedimentary basins from well or stratigraphic profile data, and is based on two numerical methods: surface interpolation and subsidence analysis. For surface visualization, four different interpolation techniques (linear, natural, cubic spline, and thin-plate spline) are provided. The subsidence analysis consists of decompaction and backstripping techniques. The numerical methods are computed in MATLAB®, a multi-paradigm numerical computing environment used extensively in academic, research, and industrial fields. The program consists of five main processing steps: 1) setup (study area and stratigraphic units), 2) loading of well data, 3) stratigraphic modelling (depth distribution and isopach plots), 4) subsidence parameter input, and 5) subsidence modelling (subsided depth and subsidence rate plots). The graphical user interface intuitively guides users through all process stages and provides tools to analyze and export the results. Interpolation and subsidence results are cached to minimize redundant computations and improve the interactivity of the program. All 2D and 3D visualizations are created using MATLAB plotting functions, which enables users to fine-tune the results using the full range of available plot options.
All functions of the program are illustrated with a case study of Miocene sediments in the Vienna Basin. The basin is an ideal place to test the program, because sufficient data are available to analyze and model its stratigraphic setting and subsidence evolution. The study area covers approximately 1200 km² and includes 110 data points in the central part of the Vienna Basin.
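The decompaction and backstripping techniques mentioned above rest on two standard relations: an Athy-type exponential porosity-depth law and the Airy-isostatic backstripping formula for tectonic subsidence. A minimal sketch follows, in Python rather than the program's MATLAB, with typical density and compaction values assumed purely for illustration:

```python
import math

def porosity(z, phi0, c):
    """Athy-type exponential porosity-depth law: phi(z) = phi0 * exp(-c z)."""
    return phi0 * math.exp(-c * z)

def mean_bulk_density(top, base, phi0, c, rho_grain=2650.0, rho_w=1030.0):
    """Depth-averaged bulk density (kg/m^3) of a layer between depths top and base (m),
    using the analytic integral of the exponential porosity law."""
    mean_phi = phi0 / (c * (base - top)) * (math.exp(-c * top) - math.exp(-c * base))
    return mean_phi * rho_w + (1.0 - mean_phi) * rho_grain

def airy_tectonic_subsidence(s, rho_s, rho_m=3300.0, rho_w=1030.0):
    """Airy-isostatic backstripping: the tectonic share of subsidence for a
    water-filled basin with total sediment thickness s and mean density rho_s."""
    return s * (rho_m - rho_s) / (rho_m - rho_w)

# A 1000 m sediment column with phi0 = 0.5 and c = 0.0005 1/m (illustrative values).
rho_s = mean_bulk_density(0.0, 1000.0, 0.5, 0.0005)
y_tect = airy_tectonic_subsidence(1000.0, rho_s)
```

Because sediment is denser than the water it displaces, the tectonic component is always less than the observed sediment thickness; water-depth and sea-level corrections are omitted here.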
NERVA dynamic analysis methodology, SPRVIB
NASA Technical Reports Server (NTRS)
Vronay, D. F.
1972-01-01
The general dynamic computer code called SPRVIB (Spring Vib), developed in support of the NERVA (nuclear engine for rocket vehicle application) program, is described. Using normal mode techniques, the program computes the kinematic responses of a structure caused by various combinations of harmonic and elliptic forcing functions or base excitations. Provision is made for a graphical type of force or base excitation input to the structure. A description of the required input format and a listing of the program are presented, along with several examples illustrating the use of the program. SPRVIB is written in FORTRAN IV for use on the CDC 6600 or IBM 360/75 computers.
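The normal mode technique computes the steady-state response mode by mode and superposes the results through the mode shapes. A minimal sketch of harmonic response by modal superposition follows; it illustrates the general method, not the SPRVIB code, and the two-mode system is invented:

```python
import math

def modal_amplitude(f_modal, omega_n, zeta, omega):
    """Steady-state amplitude of one normal mode driven by f*sin(omega t):
    |q| = f / sqrt((omega_n^2 - omega^2)^2 + (2 zeta omega_n omega)^2)."""
    return f_modal / math.sqrt((omega_n**2 - omega**2) ** 2
                               + (2.0 * zeta * omega_n * omega) ** 2)

def physical_response(phi, f_modal, omega_n, zeta, omega):
    """Superpose modal amplitudes back to physical DOFs; phi[mode][dof] are mode shapes."""
    amps = [modal_amplitude(f, wn, z, omega)
            for f, wn, z in zip(f_modal, omega_n, zeta)]
    ndof = len(phi[0])
    return [sum(a * phi[m][d] for m, a in enumerate(amps)) for d in range(ndof)]

# Invented two-mode system: at omega = 0 each mode contributes its static value f/omega_n^2.
resp = physical_response(phi=[[1.0, 0.5], [0.5, -1.0]],
                         f_modal=[1.0, 2.0], omega_n=[2.0, 5.0],
                         zeta=[0.02, 0.02], omega=0.0)
```

Each mode behaves as an independent single-degree-of-freedom oscillator in modal coordinates, which is what makes the normal mode approach so economical for repeated load cases.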
A study of mapping exogenous knowledge representations into CONFIG
NASA Technical Reports Server (NTRS)
Mayfield, Blayne E.
1992-01-01
Qualitative reasoning is reasoning with a small set of qualitative values that is an abstraction of a larger, perhaps infinite, set of quantitative values. The use of qualitative and quantitative reasoning together holds great promise for performance improvement in applications that suffer from large and/or imprecise knowledge domains. Included among these applications are the modeling, simulation, analysis, and fault diagnosis of physical systems. Several research groups continue to discover and experiment with new qualitative representations and reasoning techniques. However, due to the diversity of these techniques, it is difficult for the programs produced to exchange system models easily. The availability of mappings to transform knowledge from the form used by one of these programs to that used by another would open the doors for comparative analysis of these programs in areas such as completeness, correctness, and performance. A group at the Johnson Space Center (JSC) is working to develop CONFIG, a prototype qualitative modeling, simulation, and analysis tool for fault diagnosis applications in the U.S. space program. The availability of knowledge mappings from the programs produced by other research groups to CONFIG may reduce CONFIG's development costs and time, and may improve its performance. The study of such mappings is the purpose of the research described in this paper. Two other research groups that have worked with the JSC group in the past are a group at Northwestern University and a group at the University of Texas at Austin. The former has produced a qualitative reasoning tool named SIMGEN, and the latter has produced one named QSIM. Another program produced by the Austin group is CC, a preprocessor that permits users to develop input for eventual use by QSIM, but in a more natural format.
CONFIG and CC are both based on a component-connection ontology, so a mapping from CC's knowledge representation to CONFIG's knowledge representation was chosen as the focus of this study. A mapping from CC to CONFIG was developed. Due to differences between the two programs, however, the mapping transforms some of the CC knowledge to CONFIG as documentation rather than as knowledge in a form useful to computation. The study suggests that it may be worthwhile to pursue the mappings further. By implementing the mapping as a program, actual comparisons of computational efficiency and quality of results can be made between the QSIM and CONFIG programs. A secondary study may reveal that the results of the two programs augment one another, contradict one another, or differ only slightly. If the latter, the qualitative reasoning techniques may be compared in other areas, such as computational efficiency.
Progress in multidisciplinary design optimization at NASA Langley
NASA Technical Reports Server (NTRS)
Padula, Sharon L.
1993-01-01
Multidisciplinary Design Optimization refers to some combination of disciplinary analyses, sensitivity analysis, and optimization techniques used to design complex engineering systems. The ultimate objective of this research at NASA Langley Research Center is to help the US industry reduce the costs associated with development, manufacturing, and maintenance of aerospace vehicles while improving system performance. This report reviews progress towards this objective and highlights topics for future research. Aerospace design problems selected from the author's research illustrate strengths and weaknesses in existing multidisciplinary optimization techniques. The techniques discussed include multiobjective optimization, global sensitivity equations and sequential linear programming.
Photographic and photometric enhancement of Lunar Orbiter products, projects A, B and C
NASA Technical Reports Server (NTRS)
1972-01-01
A detailed discussion is presented of the framelet joining, photometric data improvement, and statistical error analysis. The Lunar Orbiter film handling system, readout system, and digitization are described, along with the technique of joining adjacent framelets by using a digital computer. Time and cost estimates are given. The problems and techniques involved in improving the digitized data are discussed; it was found that spectacular improvements are possible. Program documentation is included.
NASA Technical Reports Server (NTRS)
1996-01-01
Solving for the displacements of free-free coupled systems acted upon by static loads is a common task in the aerospace industry. Often, these problems are solved by static analysis with inertia relief. This technique allows for a free-free static analysis by balancing the applied loads with the inertia loads generated by the applied loads. For some engineering applications, the displacements of the free-free coupled system induce additional static loads. Hence, the applied loads are equal to the original loads plus the displacement-dependent loads. A launch vehicle being acted upon by an aerodynamic loading can have such applied loads. The final displacements of such systems are commonly determined with iterative solution techniques. Unfortunately, these techniques can be time consuming and labor intensive. Because the coupled system equations for free-free systems with displacement-dependent loads can be written in closed form, it is advantageous to solve for the displacements in this manner. Implementing closed-form equations in static analysis with inertia relief is analogous to implementing transfer functions in dynamic analysis. An MSC/NASTRAN (MacNeal-Schwendler Corporation/NASA Structural Analysis) DMAP (Direct Matrix Abstraction Program) Alter was used to include displacement-dependent loads in static analysis with inertia relief. It efficiently solved a common aerospace problem that typically has been solved with an iterative technique.
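The closed-form idea described above is that displacement-dependent loads A·u can be moved to the left-hand side, so that (K - A)u = F is solved once instead of iterating u <- inv(K)(F + A·u). A minimal two-degree-of-freedom sketch with invented numbers follows; rigid-body projection for the free-free case and the actual DMAP Alter are well beyond this illustration:

```python
def solve2(A, b):
    """Solve a 2x2 linear system A x = b by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

def closed_form_displacement(K, A, F):
    """Fold the displacement-dependent load matrix A into the stiffness: (K - A) u = F."""
    KA = [[K[i][j] - A[i][j] for j in range(2)] for i in range(2)]
    return solve2(KA, F)

def iterative_displacement(K, A, F, iters=60):
    """Reference fixed-point iteration: u_{n+1} = inv(K) (F + A u_n)."""
    u = [0.0, 0.0]
    for _ in range(iters):
        rhs = [F[i] + sum(A[i][j] * u[j] for j in range(2)) for i in range(2)]
        u = solve2(K, rhs)
    return u

K = [[4.0, -1.0], [-1.0, 3.0]]    # invented stiffness
A = [[0.5, 0.0], [0.0, 0.2]]      # hypothetical displacement-dependent (aerodynamic) load coefficients
F = [1.0, 2.0]
u_closed = closed_form_displacement(K, A, F)
u_iter = iterative_displacement(K, A, F)
```

Both routes agree when the iteration converges, but the closed form needs one factorization, which is the time-saving the abstract claims over iterative solution techniques.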
Geochemical Exploration Techniques Applicable in the Search for Copper Deposits
Chaffee, Maurice A.
1975-01-01
Geochemical exploration is an important part of copper-resource evaluation. A large number of geochemical exploration techniques, both proved and untried, are available to the geochemist to use in the search for new copper deposits. Analyses of whole-rock samples have been used in both regional and local geochemical exploration surveys in the search for copper. Analyses of mineral separates, such as biotite, magnetite, and sulfides, have also been used. Analyses of soil samples are widely used in geochemical exploration, especially for localized surveys. It is important to distinguish between residual and transported soil types. Orientation studies should always be conducted prior to a geochemical investigation in a given area in order to determine the best soil horizon and the best size of soil material for sampling in that area. Silty frost boils, caliche, and desert varnish are specialized types of soil samples that might be useful sampling media. Soil gas is a new and potentially valuable geochemical sampling medium, especially in exploring for buried mineral deposits in arid regions. Gaseous products in samples of soil may be related to base-metal deposits and include mercury vapor, sulfur dioxide, hydrogen sulfide, carbon oxysulfide, carbon dioxide, hydrogen, oxygen, nitrogen, the noble gases, the halogens, and many hydrocarbon compounds. Transported materials that have been used in geochemical sampling programs include glacial float boulders, glacial till, esker gravels, stream sediments, stream-sediment concentrates, and lake sediments. Stream-sediment sampling is probably the most widely used and most successful geochemical exploration technique. Hydrogeochemical exploration programs have utilized hot- and cold-spring waters and their precipitates as well as waters from lakes, streams, and wells. Organic gel found in lakes and at stream mouths is an unproved sampling medium. 
Suspended material and dissolved gases in any type of water may also be useful media. Samples of ice and snow have been used for limited geochemical surveys. Both geobotanical and biogeochemical surveys have been successful in locating copper deposits in many parts of the world. Micro-organisms, including bacteria and algae, are other unproved media that should be studied. Animals can be used in geochemical-prospecting programs; dogs have been used quite successfully to sniff out hidden and exposed sulfide minerals. Termite mounds are commonly composed of subsurface material, but have not as yet proved useful in locating buried mineral deposits. Animal tissue and waste products are essentially unproved but potentially valuable sampling media. Knowledge of the location of areas where trace-element-associated diseases in animals and man are endemic, as well as a better understanding of these diseases, may aid in identifying regions that are enriched in or depleted of various elements, including copper. Results of analyses of gases in the atmosphere are proving valuable in mineral-exploration surveys. Studies of metallic compounds exhaled by plants into the atmosphere and of particulate matter suspended in the atmosphere are reviewed; these methods may become important in the future. Remote-sensing techniques are useful for making indirect measurements of geochemical responses. Two techniques applicable to geochemical exploration are neutron-activation analysis and gamma-ray spectrometry. Aerial photography is especially useful in vegetation surveys. Radar imagery is an unproved but potentially valuable method for studies of vegetation in perpetually clouded regions. With the advent of modern computers, many new techniques, such as correlation analysis, regression analysis, discriminant analysis, factor analysis, cluster analysis, trend-surface analysis, and moving-average analysis, can be applied to geochemical data sets.
Selective use of these techniques can provide new insights into the interpretation of such data.
Multiobjective Decision Analysis With Engineering and Business Applications
NASA Astrophysics Data System (ADS)
Wood, Eric
The last 15 years have witnessed the development of a large number of multiobjective decision techniques, and applying these techniques to environmental, engineering, and business problems has become well accepted. Multiobjective Decision Analysis With Engineering and Business Applications attempts to cover the main multiobjective techniques both in their mathematical treatment and in their application to real-world problems. The book is divided into 12 chapters plus three appendices. The main portion of the book is represented by chapters 3-6, where the various approaches are identified, classified, and reviewed. Chapter 3 covers methods for generating nondominated solutions; chapter 4, continuous methods with prior preference articulation; chapter 5, discrete methods with prior preference articulation; and chapter 6, methods of progressive articulation of preferences. In these four chapters, close to 20 techniques are discussed with over 20 illustrative examples. This is both a strength and a weakness; the breadth of techniques and examples provides comprehensive coverage, but in a style too mathematically compact for most readers. By my count, the presentation of the 20 techniques in chapters 3-6 covers 85 pages, an average of about 4.5 pages each; a sound basis in linear algebra and linear programming is therefore required if the reader hopes to follow the material. Chapter 2, “Concepts in Multiobjective Analysis,” also assumes such a background.
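The nondominated solutions of chapter 3 are those candidates not dominated by any other: no alternative is at least as good in every objective and strictly better in one. A minimal sketch of a dominance filter for minimization problems, with invented design points:

```python
def dominates(a, b):
    """For minimization: a dominates b if it is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(points):
    """Filter a candidate set down to its Pareto-optimal (nondominated) members."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Invented two-objective designs (say, cost vs. environmental impact), both minimized.
designs = [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0), (3.0, 4.0)]
front = nondominated(designs)
```

The surviving points form the Pareto front; the preference-articulation methods of chapters 4-6 then address how a decision maker chooses among them.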
Estimates of ground-water recharge based on streamflow-hydrograph methods: Pennsylvania
Risser, Dennis W.; Conger, Randall W.; Ulrich, James E.; Asmussen, Michael P.
2005-01-01
This study, completed by the U.S. Geological Survey (USGS) in cooperation with the Pennsylvania Department of Conservation and Natural Resources, Bureau of Topographic and Geologic Survey (T&GS), provides estimates of ground-water recharge for watersheds throughout Pennsylvania computed by use of two automated streamflow-hydrograph-analysis methods--PART and RORA. The PART computer program uses a hydrograph-separation technique to divide the streamflow hydrograph into components of direct runoff and base flow. Base flow can be a useful approximation of recharge if losses and interbasin transfers of ground water are minimal. The RORA computer program uses a recession-curve displacement technique to estimate ground-water recharge from each storm period indicated on the streamflow hydrograph. Recharge estimates were made using streamflow records collected during 1885-2001 from 197 active and inactive streamflow-gaging stations in Pennsylvania where streamflow is relatively unaffected by regulation. Estimates of mean-annual recharge in Pennsylvania computed by the use of PART ranged from 5.8 to 26.6 inches; estimates from RORA ranged from 7.7 to 29.3 inches. Estimates from the RORA program were about 2 inches greater than those derived from the PART program. Mean-monthly recharge was computed from the RORA program and was reported as a percentage of mean-annual recharge. On the basis of this analysis, the major ground-water recharge period in Pennsylvania typically is November through May; the greatest monthly recharge typically occurs in March.
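Hydrograph-separation programs such as PART partition the daily streamflow record into direct runoff and base flow. As a crude stand-in for the idea (a sliding-window minimum, not PART's actual antecedent-recession test), with an invented discharge record:

```python
def base_flow(q, window=5):
    """Sliding-window minimum as a rough base-flow estimate from daily discharge q.
    By construction base flow never exceeds total streamflow."""
    half = window // 2
    return [min(q[max(0, i - half):i + half + 1]) for i in range(len(q))]

daily_q = [3.0, 2.8, 9.5, 6.1, 3.2, 2.9, 2.7, 8.0, 4.0, 3.0]  # invented record
bf = base_flow(daily_q)
direct_runoff = [q - b for q, b in zip(daily_q, bf)]
```

Summing the base-flow series over a year and converting to a depth over the drainage area gives the annual estimate that, under the study's assumption of minimal underflow, approximates recharge.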
Program and Project Management Framework
NASA Technical Reports Server (NTRS)
Butler, Cassandra D.
2002-01-01
The primary objective of this project was to develop a framework and system architecture for integrating program and project management tools that may be applied consistently throughout Kennedy Space Center (KSC) to optimize planning, cost estimating, risk management, and project control. Project management methodology used in building interactive systems to accommodate the needs of the project managers is applied as a key component in assessing the usefulness and applicability of the framework and tools developed. Research for the project included investigation and analysis of industrial practices, KSC standards, policies, and techniques, Systems Management Office (SMO) personnel, and other documented experiences of project management experts. In addition, this project documents best practices derived from the literature as well as new or developing project management models, practices, and techniques.
NASA Technical Reports Server (NTRS)
Gunness, R. C., Jr.; Knight, C. J.; Dsylva, E.
1972-01-01
The unified small disturbance equations are numerically solved using the well-known Lax-Wendroff finite difference technique. The method allows complete determination of the inviscid flow field and surface properties as long as the flow remains supersonic. Shock waves and other discontinuities are accounted for implicitly in the numerical method. The technique was programmed for general application to the three-dimensional case. The validity of the method is demonstrated by calculations on cones, axisymmetric bodies, lifting bodies, delta wings, and a conical wing/body combination. Part 1 contains the discussion of problem development and the results of the study. Part 2 contains flow charts, subroutine descriptions, and a listing of the computer program.
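For the model problem of linear advection u_t + a u_x = 0, the Lax-Wendroff update combines a centered first difference with a second-difference correction, giving second-order accuracy and stability for Courant numbers |c| <= 1. A minimal sketch on a periodic grid (the paper applies the scheme to the far more involved three-dimensional small disturbance equations):

```python
def lax_wendroff_step(u, c):
    """One Lax-Wendroff step for u_t + a u_x = 0 on a periodic grid;
    c = a*dt/dx is the Courant number."""
    n = len(u)
    return [u[j]
            - 0.5 * c * (u[(j + 1) % n] - u[j - 1])                    # centered advection term
            + 0.5 * c * c * (u[(j + 1) % n] - 2.0 * u[j] + u[j - 1])   # second-order correction
            for j in range(n)]

# At c = 1 the scheme propagates a pulse exactly one cell per step.
shifted = lax_wendroff_step([0.0, 1.0, 0.0, 0.0], 1.0)
```

The built-in dissipation of the second-difference term is what lets shocks be captured implicitly, at the cost of mild oscillations near steep gradients.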
Measuring energy-saving retrofits: Experiences from the Texas LoanSTAR program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haberl, J.S.; Reddy, T.A.; Claridge, D.E.
1996-02-01
In 1988 the Governor's Energy Management Center of Texas received approval from the US Department of Energy to establish a $98.6 million state-wide retrofit demonstration revolving loan program to fund energy-conserving retrofits in state, public school, and local government buildings. As part of this program, a first-of-its-kind, statewide Monitoring and Analysis Program (MAP) was established to verify energy and dollar savings of the retrofits, reduce energy costs by identifying operational and maintenance improvements, improve retrofit selection in future rounds of the LoanSTAR program, and initiate a data base of energy use in institutional and commercial buildings located in Texas. This report discusses the LoanSTAR MAP with an emphasis on the process of acquiring and analyzing data to measure savings from energy conservation retrofits when budgets are a constraint. It includes a discussion of the program structure, basic measurement techniques, data archiving and handling, data reporting and analysis, and selected examples from LoanSTAR agencies. A summary of the program results for the first two years of monitoring is also included.
Amariti, M L; Restori, M; De Ferrari, F; Paganelli, C; Faglia, R; Legnani, G
1999-06-01
Age determination by teeth examination is one of the main means of establishing personal identification. Current studies have suggested different techniques for determining the age of a subject by means of the analysis of microscopic and macroscopic structural modifications of the tooth with ageing. The histological approach is useful among the various methodologies utilized for this purpose. It is still unclear which is the best technique, as almost all authors suggest the use of the approach they themselves have tested. In the present study, age determination by means of microscopic techniques has been based on the quantitative analysis of three parameters, all well recognized in the specialized literature: 1) dentinal tubule density/sclerosis, 2) tooth translucency, and 3) cementum thickness. After a description of the three methodologies (with automatic image processing of the dentinal sclerosis utilizing an appropriate computer program developed by the authors), the results obtained on cases using the three different approaches are presented, and the merits and failings of each technique are identified, with the intention of identifying the one offering the least degree of error in age determination.
A computer program for cyclic plasticity and structural fatigue analysis
NASA Technical Reports Server (NTRS)
Kalev, I.
1980-01-01
A computerized tool for the analysis of time independent cyclic plasticity structural response, life to crack initiation prediction, and crack growth rate prediction for metallic materials is described. Three analytical items are combined: the finite element method with its associated numerical techniques for idealization of the structural component, cyclic plasticity models for idealization of the material behavior, and damage accumulation criteria for the fatigue failure.
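A common damage accumulation criterion of the kind mentioned is the Palmgren-Miner linear rule, under which crack initiation is predicted when the summed cycle ratios reach unity. This sketch is an assumed illustration, since the abstract does not state which criteria the program implements, and the loading history is invented:

```python
def miner_damage(blocks):
    """Palmgren-Miner linear damage accumulation: D = sum(n_i / N_i) over load blocks,
    where n_i cycles are applied at a level whose fatigue life is N_i cycles."""
    return sum(n / N for n, N in blocks)

# Invented loading history: (applied cycles, cycles to failure at that amplitude).
blocks = [(1000, 1.0e5), (500, 2.0e4), (100, 5.0e3)]
damage = miner_damage(blocks)  # crack initiation predicted when damage reaches 1.0
```

In a finite-element setting, the stress or strain range at each critical location (from the cyclic plasticity model) would select N_i from a material fatigue curve before the ratios are summed.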
Better Finite-Element Analysis of Composite Shell Structures
NASA Technical Reports Server (NTRS)
Clarke, Gregory
2007-01-01
A computer program implements a finite-element-based method of predicting the deformations of thin aerospace structures made of isotropic materials or anisotropic fiber-reinforced composite materials. The technique and corresponding software are applicable to thin shell structures in general and are particularly useful for analysis of thin beamlike members having open cross-sections (e.g. I-beams and C-channels) in which significant warping can occur.
Programmable calculator software for computation of the plasma binding of ligands.
Conner, D P; Rocci, M L; Larijani, G E
1986-01-01
The computation of the extent of plasma binding of a ligand to plasma constituents using radiolabeled ligand and equilibrium dialysis is complex and tedious. A computer program for the HP-41C Handheld Computer Series (Hewlett-Packard) was developed to perform these calculations. The first segment of the program constructs a standard curve for quench correction of post-dialysis plasma and buffer samples, using either external standard ratio (ESR) or sample channels ratio (SCR) techniques. The remainder of the program uses the counts per minute, SCR or ESR, and post-dialysis volume of paired plasma and buffer samples generated from the dialysis procedure to compute the extent of binding after correction for background radiation, counting efficiency, and intradialytic shifts of fluid between plasma and buffer compartments during dialysis. This program greatly simplifies the analysis of equilibrium dialysis data and has been employed in the analysis of dexamethasone binding in normal and uremic sera.
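The core binding computation can be sketched as follows: after background and counting-efficiency corrections, the buffer-side concentration estimates the free ligand concentration, so the bound fraction follows directly. This is a simplified illustration with invented numbers; the HP-41C program additionally corrects for intradialytic fluid shifts between compartments:

```python
def dpm_per_ml(cpm, background_cpm, efficiency, volume_ml):
    """Convert raw counts per minute to disintegrations per minute per mL,
    correcting for background radiation and counting efficiency."""
    return (cpm - background_cpm) / efficiency / volume_ml

def fraction_bound(plasma_cpm, buffer_cpm, background_cpm,
                   plasma_eff, buffer_eff, plasma_vol, buffer_vol):
    """Fraction of ligand bound to plasma constituents at equilibrium:
    free ligand equilibrates across the membrane, so the buffer-side
    concentration estimates the free concentration on the plasma side."""
    total = dpm_per_ml(plasma_cpm, background_cpm, plasma_eff, plasma_vol)
    free = dpm_per_ml(buffer_cpm, background_cpm, buffer_eff, buffer_vol)
    return 1.0 - free / total

# Invented post-dialysis sample pair: total 20000 dpm/mL, free 2000 dpm/mL.
fb = fraction_bound(plasma_cpm=10050.0, buffer_cpm=1050.0, background_cpm=50.0,
                    plasma_eff=0.5, buffer_eff=0.5, plasma_vol=1.0, buffer_vol=1.0)
```

In practice the efficiency for each sample would come from the quench-correction standard curve (ESR or SCR) rather than being supplied as a constant.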