Sample records for systems analysis techniques

  1. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  2. Systems design analysis applied to launch vehicle configuration

    NASA Technical Reports Server (NTRS)

    Ryan, R.; Verderaime, V.

    1993-01-01

    As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost-improvement techniques in design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.

  3. Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.; Rapp, Douglas C.

    1994-01-01

    The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
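    The contrast the abstract draws, point probabilities versus vague quantities propagated through the same fault tree, can be sketched on a toy example. This is a minimal illustration, not the IRRAS or FUZZYFTA codes; the tree structure and every basic-event probability below are invented, and each fuzzy number is reduced to a single alpha-cut interval.

```python
# Toy fault tree: TOP = (A AND B) OR (C AND D), independent basic events.
# Illustrative only -- not the IRRAS or FUZZYFTA analyses from the paper.

def p_and(p, q):
    """Probability both independent events occur."""
    return p * q

def p_or(p, q):
    """Probability at least one of two independent events occurs."""
    return p + q - p * q

# Probabilistic analysis: point estimates (uncertainty from randomness).
pa, pb, pc, pd = 1e-3, 2e-3, 5e-4, 1e-3
p_top = p_or(p_and(pa, pb), p_and(pc, pd))

# Fuzzy analysis: vagueness carried as intervals (one alpha-cut of a
# fuzzy number), propagated with interval arithmetic.
def f_and(p, q):
    return (p[0] * q[0], p[1] * q[1])

def f_or(p, q):
    return (p_or(p[0], q[0]), p_or(p[1], q[1]))

fa, fb = (5e-4, 2e-3), (1e-3, 4e-3)
fc, fd = (2e-4, 1e-3), (5e-4, 2e-3)
f_top = f_or(f_and(fa, fb), f_and(fc, fd))  # bounds on top-event probability
```

    Both routes quantify the same top event; the point estimate lands inside the fuzzy bounds, mirroring the abstract's point that the two techniques capture random and vague uncertainty respectively.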

  4. Rewriting Modulo SMT

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Meseguer, Jose; Munoz, Cesar A.

    2013-01-01

    Combining symbolic techniques such as (i) SMT solving, (ii) rewriting modulo theories, and (iii) model checking can enable the analysis of infinite-state systems outside the scope of each such technique. This paper proposes rewriting modulo SMT as a new technique combining the powers of (i)-(iii) and ideally suited to model and analyze infinite-state open systems; that is, systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism due to the system, and external non-determinism due to the environment. They are not amenable to finite-state model checking analysis because they typically are infinite-state. By being reducible to standard rewriting using reflective techniques, rewriting modulo SMT can both naturally model and analyze open systems without requiring any changes to rewriting-based reachability analysis techniques for closed systems. This is illustrated by the analysis of a real-time system beyond the scope of timed automata methods.

  5. Development of a sensitivity analysis technique for multiloop flight control systems

    NASA Technical Reports Server (NTRS)

    Vaillard, A. H.; Paduano, J.; Downing, D. R.

    1985-01-01

    This report presents the development and application of a sensitivity analysis technique for multiloop flight control systems. This analysis yields very useful information on the sensitivity of the relative-stability criteria of the control system to variations or uncertainties in the system and controller elements. The sensitivity analysis technique developed is based on the computation of the singular values and singular-value gradients of a feedback-control system. The method is applicable to single-input/single-output as well as multiloop continuous-control systems. Application to sampled-data systems is also explored. The sensitivity analysis technique was applied to a continuous yaw/roll damper stability augmentation system of a typical business jet, and the results show that the analysis is very useful in determining the system elements which have the largest effect on the relative stability of the closed-loop system. As a secondary product of the research reported here, relative-stability criteria based on the concept of singular values were explored.
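    The central quantity here, the minimum singular value of the return-difference matrix as a relative-stability measure, can be sketched in a few lines. The 2x2 loop-gain matrix below is hypothetical and merely stands in for something like a yaw/roll damper loop; this shows only the basic computation the report's gradient analysis builds on, not its method.

```python
import numpy as np

# Hypothetical 2x2 loop-gain matrix L(jw); the entries are invented.
def loop_gain(w):
    s = 1j * w
    return np.array([[10.0 / (s + 1.0), 1.0 / (s + 5.0)],
                     [0.5 / (s + 2.0), 8.0 / (s + 1.5)]])

# Relative-stability measure: sigma_min(I + L(jw)) swept over frequency.
freqs = np.logspace(-2, 2, 200)  # rad/s
sigma_min = [np.linalg.svd(np.eye(2) + loop_gain(w), compute_uv=False)[-1]
             for w in freqs]
margin = min(sigma_min)  # worst case over the sweep
```

    The report's contribution is then the gradient of this measure with respect to system and controller parameters, which identifies the elements that most affect relative stability.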

  6. Analysis technique for controlling system wavefront error with active/adaptive optics

    NASA Astrophysics Data System (ADS)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.

  7. Application of a sensitivity analysis technique to high-order digital flight control systems

    NASA Technical Reports Server (NTRS)

    Paduano, James D.; Downing, David R.

    1987-01-01

    A sensitivity analysis technique for multiloop flight control systems is studied. This technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system. It then uses the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed; then it is extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example: the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. Results show that linear models of real systems can be analyzed by this sensitivity technique if it is applied with care. A computer program called SVA was written to accomplish the singular-value sensitivity analysis techniques. Thus computational methods and considerations form an integral part of many of the discussions. A user's guide to the program is included. SVA is a fully public-domain program, running on the NASA/Dryden Elxsi computer.

  8. Phased-mission system analysis using Boolean algebraic methods

    NASA Technical Reports Server (NTRS)

    Somani, Arun K.; Trivedi, Kishor S.

    1993-01-01

    Most reliability analysis techniques and tools assume that a system is used for a mission consisting of a single phase. However, multiple phases are natural in many missions. The failure rates of components, system configuration, and success criteria may vary from phase to phase. In addition, the duration of a phase may be deterministic or random. Recently, several researchers have addressed the problem of reliability analysis of such systems using a variety of methods. A new technique for phased-mission system reliability analysis based on Boolean algebraic methods is described. Our technique is computationally efficient and is applicable to a large class of systems for which the failure criterion in each phase can be expressed as a fault tree (or an equivalent representation). Our technique avoids the state-space explosion that commonly plagues Markov chain-based analysis. A phase algebra was developed to account for the effects of variable configurations and success criteria from phase to phase. Our technique yields exact (as opposed to approximate) results. The use of the technique is demonstrated by means of an example, and numerical results are presented to show the effects of mission phases on the system reliability.
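    A toy phased mission makes the phase-dependent success criteria concrete. The sketch below gets exact results by brute-force enumeration of component failure phases rather than by the paper's Boolean phase algebra; the two-phase configuration and all failure probabilities are invented.

```python
from itertools import product

# Two phases, three components (A, B, C), independent failures.
# Phase 1 succeeds if A and (B or C) work; phase 2 succeeds if B and C work.
# q[i][k] = probability component i fails during phase k.
q = [(0.01, 0.02),   # A
     (0.05, 0.01),   # B
     (0.02, 0.03)]   # C

criteria = [lambda a, b, c: a and (b or c),
            lambda a, b, c: b and c]

def mission_reliability():
    total = 0.0
    # fp[i] = phase in which component i fails; 2 means it never fails.
    for fp in product(range(3), repeat=3):
        prob = 1.0
        for i, k in enumerate(fp):
            if k < 2:
                surv = 1.0
                for j in range(k):          # survives the phases before k
                    surv *= 1.0 - q[i][j]
                prob *= surv * q[i][k]      # then fails during phase k
            else:
                for j in range(2):          # survives the whole mission
                    prob *= 1.0 - q[i][j]
        # component i is up throughout phase k iff fp[i] > k
        if all(crit(*(fp[i] > k for i in range(3)))
               for k, crit in enumerate(criteria)):
            total += prob
    return total
```

    Enumeration like this is exponential in the number of components, which is exactly why an efficient phase algebra matters for realistic systems; the toy only shows what quantity is being computed.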

  9. Large space antennas: A systems analysis case history

    NASA Technical Reports Server (NTRS)

    Keafer, Lloyd S. (Compiler); Lovelace, U. M. (Compiler)

    1987-01-01

    The value of systems analysis and engineering is aptly demonstrated by the work on Large Space Antennas (LSA) by the NASA Langley Spacecraft Analysis Branch. This work was accomplished over the last half-decade by augmenting traditional system engineering, analysis, and design techniques with computer-aided engineering (CAE) techniques using the Langley-developed Interactive Design and Evaluation of Advanced Spacecraft (IDEAS) system. This report chronicles the research highlights and special systems analyses that focused the LSA work on deployable truss antennas. It notes developmental trends toward greater use of CAE techniques in their design and analysis. A look to the future envisions the application of improved systems analysis capabilities to advanced space systems such as an advanced space station or to lunar and Martian missions and human habitats.

  10. A review of intelligent systems for heart sound signal analysis.

    PubMed

    Nabih-Ali, Mohammed; El-Dahshan, El-Sayed A; Yahia, Ashraf S

    2017-10-01

    Intelligent computer-aided diagnosis (CAD) systems can enhance the diagnostic capabilities of physicians and reduce the time required for accurate diagnosis. CAD systems could provide physicians with a suggestion about the diagnosis of heart diseases. The objective of this paper is to review recently published preprocessing, feature extraction, and classification techniques and the state of the art in phonocardiogram (PCG) signal analysis. The published literature reviewed in this paper shows the potential of machine learning techniques as a design tool in PCG CAD systems and reveals that CAD systems for PCG signal analysis remain an open problem. Related studies are compared in terms of their datasets, feature extraction techniques, and classifiers. Current achievements and limitations in developing CAD systems for PCG signal analysis using machine learning techniques are presented and discussed. In light of this review, a number of future research directions for PCG signal analysis are provided.

  11. Reachability analysis of real-time systems using time Petri nets.

    PubMed

    Wang, J; Deng, Y; Xu, G

    2000-01-01

    Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.

  12. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.

  13. A novel pulse height analysis technique for nuclear spectroscopic and imaging systems

    NASA Astrophysics Data System (ADS)

    Tseng, H. H.; Wang, C. Y.; Chou, H. P.

    2005-08-01

    The proposed pulse height analysis technique is based on the constant and linear relationship between pulse width and pulse height generated by the front-end electronics of nuclear spectroscopic and imaging systems. The present technique has been successfully implemented in the sump water radiation monitoring system of a nuclear power plant. The radiation monitoring system uses a NaI(Tl) scintillator to detect radioactive nuclides of radon daughters brought down by rain. The technique is also used for a nuclear medical imaging system. The system uses a position-sensitive photomultiplier tube coupled with a scintillator. The proposed technique greatly simplifies the electronic design and makes the system feasible for portable applications.
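    The stated linear width-height relationship reduces to a one-line calibration: measure pulse widths for pulses of known height, fit a line, then recover heights from timing measurements alone. The sketch below is an illustration under that assumption; the calibration numbers and helper name are invented, not taken from the paper.

```python
import numpy as np

# Calibration pairs: pulse width (us, at a fixed discriminator threshold)
# versus known pulse height (V). Values are invented for illustration.
widths  = np.array([0.8, 1.2, 1.6, 2.0, 2.4])
heights = np.array([0.41, 0.62, 0.83, 1.04, 1.25])

a, b = np.polyfit(widths, heights, 1)   # height ~= a * width + b

def pulse_height(width_us):
    """Infer pulse height from a width (timing) measurement alone."""
    return a * width_us + b
```

    Replacing amplitude digitization with a timing measurement is what lets the front-end electronics stay simple enough for portable use.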

  14. Structured Analysis and the Data Flow Diagram: Tools for Library Analysis.

    ERIC Educational Resources Information Center

    Carlson, David H.

    1986-01-01

    This article discusses tools developed to aid the systems analysis process (program evaluation and review technique, Gantt charts, organizational charts, decision tables, flowcharts, hierarchy plus input-process-output). Similarities and differences among techniques, library applications of analysis, structured systems analysis, and the data flow…

  15. Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report

    NASA Technical Reports Server (NTRS)

    Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.

    1980-01-01

    Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.

  16. A Reference Model for Software and System Inspections. White Paper

    NASA Technical Reports Server (NTRS)

    He, Lulu; Shull, Forrest

    2009-01-01

    Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) testing; (2) simulation; (3) model checking; (4) symbolic execution; (5) management reviews; (6) technical reviews; (7) inspections; (8) walk-throughs; (9) audits; (10) analysis (complexity analysis, control flow analysis, algorithmic analysis); and (11) formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If so, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) the system and software development processes interact with each other at different phases of the development life cycle; (2) reviews are emphasized in both system and software development (Fig. 1.3), and for some reviews (e.g., SRR, PDR, CDR) there are both system versions and software versions; (3) analysis techniques are emphasized (e.g., Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them; and (4) reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.

  17. A Charting Technique for the Analysis of Business Systems

    DTIC Science & Technology

    This paper describes a charting technique useful in the analysis of business systems and in studies of the information economics of the firm. The...planning advanced systems. It is not restricted to any particular kind of business or information system. (Author)

  18. Evidential Reasoning in Expert Systems for Image Analysis.

    DTIC Science & Technology

    1985-02-01

    techniques to image analysis (IA). There is growing evidence that these techniques offer significant improvements in image analysis , particularly in the...2) to provide a common framework for analysis, (3) to structure the ER process for major expert-system tasks in image analysis , and (4) to identify...approaches to three important tasks for expert systems in the domain of image analysis . This segment concluded with an assessment of the strengths

  19. Simplified Phased-Mission System Analysis for Systems with Independent Component Repairs

    NASA Technical Reports Server (NTRS)

    Somani, Arun K.

    1996-01-01

    Accurate analysis of the reliability of a system requires accounting for all major variations in the system's operation. Most reliability analyses assume that the system configuration, success criteria, and component behavior remain the same throughout the mission. However, multiple phases are natural. We present a new computationally efficient technique for the analysis of phased-mission systems where the operational states of a system can be described by combinations of component states (such as fault trees or assertions). Moreover, individual components may be repaired, if failed, as part of system operation, but repairs are independent of the system state. For repairable systems, Markov analysis techniques are typically used, but they suffer from state-space explosion, which limits the size of system that can be analyzed and makes the computation expensive. We avoid the state-space explosion. A phase algebra is used to account for the effects of variable configurations, repairs, and success criteria from phase to phase. Our technique yields exact (as opposed to approximate) results. We demonstrate the technique by means of several examples and present numerical results to show the effects of phases and repairs on system reliability/availability.

  20. Teaching Tip: Using Activity Diagrams to Model Systems Analysis Techniques: Teaching What We Preach

    ERIC Educational Resources Information Center

    Lending, Diane; May, Jeffrey

    2013-01-01

    Activity diagrams are used in Systems Analysis and Design classes as a visual tool to model the business processes of "as-is" and "to-be" systems. This paper presents the idea of using these same activity diagrams in the classroom to model the actual processes (practices and techniques) of Systems Analysis and Design. This tip…

  1. Evaluation Applied to Reliability Analysis of Reconfigurable, Highly Reliable, Fault-Tolerant, Computing Systems for Avionics

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.

  2. Hybrid soft computing systems for electromyographic signals analysis: a review.

    PubMed

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), which integrate these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.

  3. Hybrid soft computing systems for electromyographic signals analysis: a review

    PubMed Central

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), which integrate these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  4. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  5. Low-thrust chemical propulsion system propellant expulsion and thermal conditioning study. Executive summary

    NASA Technical Reports Server (NTRS)

    Merino, F.; Wakabayashi, I.; Pleasant, R. L.; Hill, M.

    1982-01-01

    Preferred techniques for providing abort pressurization and engine feed system net positive suction pressure (NPSP) for low thrust chemical propulsion systems (LTPS) were determined. A representative LTPS vehicle configuration is presented. Analysis tasks include: propellant heating analysis; pressurant requirements for abort propellant dump; and comparative analysis of pressurization techniques and thermal subcoolers.

  6. Occupational Analysis Technology: Expanded Role in Development of Cost-Effective Maintenance Systems. Final Report.

    ERIC Educational Resources Information Center

    Foley, John P., Jr.

    A study was conducted to refine and coordinate occupational analysis, job performance aids, and elements of the instructional systems development process for task specific Air Force maintenance training. Techniques for task identification and analysis (TI & A) and data gathering techniques for occupational analysis were related. While TI &…

  7. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  8. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.
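    The two-color thermographic-phosphor idea mentioned here can be sketched as an intensity-ratio lookup: the ratio of the two wavelength-filtered images depends on temperature but not on overall illumination, so a calibrated ratio-to-temperature curve suffices. The calibration table and helper name below are invented for illustration.

```python
import numpy as np

# Invented calibration: band-intensity ratio I1/I2 versus temperature (K).
cal_temp  = np.array([300.0, 350.0, 400.0, 450.0, 500.0])
cal_ratio = np.array([0.20, 0.45, 0.80, 1.30, 2.00])   # monotonic in T

def surface_temperature(i_band1, i_band2):
    """Map a pixel's two filtered intensities to temperature by ratio lookup."""
    return float(np.interp(i_band1 / i_band2, cal_ratio, cal_temp))
```

    Because only the ratio enters the lookup, uniform changes in lighting or viewing geometry cancel out, which is what makes the two-color approach practical for mapping whole surfaces.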

  9. Embedded expert system for space shuttle main engine maintenance

    NASA Technical Reports Server (NTRS)

    Pooley, J.; Thompson, W.; Homsley, T.; Teoh, W.; Jones, J.; Lewallen, P.

    1987-01-01

    The SPARTA Embedded Expert System (SEES) is an intelligent health monitoring system that directs analysis by placing confidence factors on possible engine status and then recommends a course of action to an engineer or engine controller. The technique can prevent catastrophic failures or costly rocket-engine downtime due to false alarms. Further, the SEES has potential as an on-board flight monitor for reusable rocket engine systems. The SEES methodology synergistically integrates vibration analysis, pattern recognition, and communications theory techniques with an artificial intelligence technique: the Embedded Expert System (EES).

  10. An Introduction to Markov Modeling: Concepts and Uses

    NASA Technical Reports Server (NTRS)

    Boyd, Mark A.; Lau, Sonie (Technical Monitor)

    1998-01-01

    Markov modeling is a modeling technique that is widely useful for dependability analysis of complex fault tolerant systems. It is very flexible in the type of systems and system behavior it can model. It is not, however, the most appropriate modeling technique for every modeling situation. The first task in obtaining a reliability or availability estimate for a system is selecting which modeling technique is most appropriate to the situation at hand. A person performing a dependability analysis must confront the question: is Markov modeling most appropriate to the system under consideration, or should another technique be used instead? The need to answer this gives rise to other more basic questions regarding Markov modeling: what are the capabilities and limitations of Markov modeling as a modeling technique? How does it relate to other modeling techniques? What kind of system behavior can it model? What kinds of software tools are available for performing dependability analyses with Markov modeling techniques? These questions and others will be addressed in this tutorial.
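    As a minimal concrete instance of the kind of model this tutorial covers, the sketch below integrates a two-state (up/down) Markov availability model and checks it against the closed-form solution. The failure and repair rates are illustrative, not drawn from the tutorial.

```python
import numpy as np

# Two-state Markov availability model: state 0 = up, state 1 = down.
lam, mu = 1e-3, 1e-1          # failure and repair rates (per hour), invented

Q = np.array([[-lam,  lam],   # generator matrix: each row sums to zero
              [  mu,  -mu]])

def availability(t, steps=20000):
    """P(system up at time t), starting up, by Euler integration of p' = pQ."""
    p = np.array([1.0, 0.0])
    dt = t / steps
    for _ in range(steps):
        p = p + dt * (p @ Q)
    return p[0]

def availability_exact(t):
    """Closed form for the two-state model, for comparison."""
    s = lam + mu
    return mu / s + (lam / s) * np.exp(-s * t)
```

    Real systems multiply states combinatorially as components and failure modes are added, which is precisely where the tool-support questions raised in the tutorial come in.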

  11. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

48 Federal Acquisition Regulations System, Section 215.404-1, Proposal analysis techniques (Defense Acquisition Regulations System, 2010-10-01 edition). The section addresses, among other factors, the reliability of the contractor's estimating and accounting systems. [63 FR 55040, Oct. 14, 1998, as amended at 71 FR 69494...]

  12. Emulation applied to reliability analysis of reconfigurable, highly reliable, fault-tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.

  13. A Passive System Reliability Analysis for a Station Blackout

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunett, Acacia; Bucknor, Matthew; Grabaskas, David

    2015-05-03

The latest iterations of advanced reactor designs have included increased reliance on passive safety systems to maintain plant integrity during unplanned sequences. While these systems are advantageous in reducing the reliance on human intervention and availability of power, the phenomenological foundations on which these systems are built require a novel approach to reliability assessment. Passive systems possess the unique ability to fail functionally without failing physically, a result of their explicit dependency on existing boundary conditions that drive their operating mode and capacity. Argonne National Laboratory is performing ongoing analyses that demonstrate various methodologies for the characterization of passive system reliability within a probabilistic framework. Two reliability analysis techniques are utilized in this work. The first approach, the Reliability Method for Passive Systems, provides a mechanistic technique employing deterministic models and conventional static event trees. The second approach, a simulation-based technique, utilizes discrete dynamic event trees to treat time-dependent phenomena during scenario evolution. For this demonstration analysis, both reliability assessment techniques are used to analyze an extended station blackout in a pool-type sodium fast reactor (SFR) coupled with a reactor cavity cooling system (RCCS). This work demonstrates the entire process of a passive system reliability analysis, including identification of important parameters and failure metrics, treatment of uncertainties and analysis of results.

  14. Digital communications: Microwave applications

    NASA Astrophysics Data System (ADS)

    Feher, K.

Transmission concepts and techniques of digital systems are presented, and practical state-of-the-art implementation of digital communications systems by line-of-sight microwaves is described. Particular consideration is given to statistical methods in digital transmission systems analysis, digital modulation methods, microwave amplifiers, system gain, m-ary and QAM microwave systems, correlative techniques and applications to digital radio systems, hybrid systems, digital microwave systems design, diversity and protection switching techniques, measurement techniques, and research and development trends and unsolved problems.

  15. Techniques for the analysis of data from coded-mask X-ray telescopes

    NASA Technical Reports Server (NTRS)

    Skinner, G. K.; Ponman, T. J.; Hammersley, A. P.; Eyles, C. J.

    1987-01-01

    Several techniques useful in the analysis of data from coded-mask telescopes are presented. Methods of handling changes in the instrument pointing direction are reviewed and ways of using FFT techniques to do the deconvolution considered. Emphasis is on techniques for optimally-coded systems, but it is shown that the range of systems included in this class can be extended through the new concept of 'partial cycle averaging'.
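
The FFT-based deconvolution mentioned above amounts to cyclic cross-correlation carried out in the Fourier domain. A minimal sketch, with a made-up random open/closed mask and two artificial point sources (not an optimally-coded pattern, so reconstruction sidelobes remain):

```python
import numpy as np

# Toy coded-mask sketch (not from the paper): a made-up random open/closed
# mask, a 1-D "sky" with two point sources, and FFT-based correlation
# decoding. An optimally-coded mask would make the sidelobes vanish.
rng = np.random.default_rng(0)
mask = rng.integers(0, 2, 64).astype(float)
sky = np.zeros(64)
sky[10], sky[40] = 5.0, 2.0

# Idealized, noise-free detector: cyclic convolution of sky with mask.
detector = np.real(np.fft.ifft(np.fft.fft(sky) * np.fft.fft(mask)))

# Balanced cross-correlation decoding, done as products in Fourier space.
decode = mask - mask.mean()
recon = np.real(np.fft.ifft(np.fft.fft(detector) * np.conj(np.fft.fft(decode))))
print(recon[10], recon[40])
```

For an optimally-coded mask the balanced mask autocorrelation approaches a delta function, so the correlation step recovers the source positions with flat sidelobes; the FFT makes the whole decode O(N log N).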

  16. Authentication techniques for smart cards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, R.A.

    1994-02-01

    Smart card systems are most cost efficient when implemented as a distributed system, which is a system without central host interaction or a local database of card numbers for verifying transaction approval. A distributed system, as such, presents special card and user authentication problems. Fortunately, smart cards offer processing capabilities that provide solutions to authentication problems, provided the system is designed with proper data integrity measures. Smart card systems maintain data integrity through a security design that controls data sources and limits data changes. A good security design is usually a result of a system analysis that provides a thoroughmore » understanding of the application needs. Once designers understand the application, they may specify authentication techniques that mitigate the risk of system compromise or failure. Current authentication techniques include cryptography, passwords, challenge/response protocols, and biometrics. The security design includes these techniques to help prevent counterfeit cards, unauthorized use, or information compromise. This paper discusses card authentication and user identity techniques that enhance security for microprocessor card systems. It also describes the analysis process used for determining proper authentication techniques for a system.« less

  17. Closed-form Static Analysis with Inertia Relief and Displacement-Dependent Loads Using a MSC/NASTRAN DMAP Alter

    NASA Technical Reports Server (NTRS)

    Barnett, Alan R.; Widrick, Timothy W.; Ludwiczak, Damian R.

    1995-01-01

    Solving for the displacements of free-free coupled systems acted upon by static loads is commonly performed throughout the aerospace industry. Many times, these problems are solved using static analysis with inertia relief. This solution technique allows for a free-free static analysis by balancing the applied loads with inertia loads generated by the applied loads. For some engineering applications, the displacements of the free-free coupled system induce additional static loads. Hence, the applied loads are equal to the original loads plus displacement-dependent loads. Solving for the final displacements of such systems is commonly performed using iterative solution techniques. Unfortunately, these techniques can be time-consuming and labor-intensive. Since the coupled system equations for free-free systems with displacement-dependent loads can be written in closed-form, it is advantageous to solve for the displacements in this manner. Implementing closed-form equations in static analysis with inertia relief is analogous to implementing transfer functions in dynamic analysis. Using a MSC/NASTRAN DMAP Alter, displacement-dependent loads have been included in static analysis with inertia relief. Such an Alter has been used successfully to solve efficiently a common aerospace problem typically solved using an iterative technique.
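
The closed-form idea above can be illustrated with a toy linear system (the matrices are invented for the sketch and have nothing to do with the MSC/NASTRAN DMAP itself): when the applied load is a base load F0 plus a displacement-dependent part C x, equilibrium K x = F0 + C x can be solved directly rather than by iteration.

```python
import numpy as np

# Toy sketch (invented matrices, not the NASTRAN DMAP itself): with applied
# load F0 plus a displacement-dependent part C @ x, equilibrium
# K x = F0 + C x can be solved in closed form as (K - C) x = F0.
K = np.array([[4.0, -1.0], [-1.0, 3.0]])
C = np.array([[0.5, 0.0], [0.0, 0.2]])
F0 = np.array([1.0, 2.0])

x_closed = np.linalg.solve(K - C, F0)   # closed-form solution

# The iterative alternative the abstract mentions; it needs repeated solves.
x = np.zeros(2)
for _ in range(200):
    x = np.linalg.solve(K, F0 + C @ x)
print(x_closed, x)
```

Both routes agree here, but the iterative one converges only when the displacement-dependent loading is a contraction, and it repeats the factorization work the closed form does once.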

  18. Efficient techniques for forced response involving linear modal components interconnected by discrete nonlinear connection elements

    NASA Astrophysics Data System (ADS)

    Avitabile, Peter; O'Callahan, John

    2009-01-01

Generally, response analysis of systems containing discrete nonlinear connection elements, such as typical mounting connections, requires the physical finite element system matrices to be used in a direct integration algorithm to compute the nonlinear response analysis solution. Due to the large size of these physical matrices, forced nonlinear response analysis requires significant computational resources. Usually, the individual components of the system are analyzed and tested as separate components, and their individual behavior may essentially be linear when compared to the total assembled system. However, the joining of these linear subsystems using highly nonlinear connection elements causes the entire system to become nonlinear. It would be advantageous if these linear modal subsystems could be utilized in the forced nonlinear response analysis, since much effort has usually been expended in fine tuning and adjusting the analytical models to reflect the tested subsystem configuration. Several more efficient techniques have been developed to address this class of problem. Three of these techniques, the equivalent reduced model technique (ERMT), the modal modification response technique (MMRT), and the component element method (CEM), are presented in this paper and compared to traditional methods.

  19. Connection between optimal control theory and adiabatic-passage techniques in quantum systems

    NASA Astrophysics Data System (ADS)

    Assémat, E.; Sugny, D.

    2012-08-01

    This work explores the relationship between optimal control theory and adiabatic passage techniques in quantum systems. The study is based on a geometric analysis of the Hamiltonian dynamics constructed from Pontryagin's maximum principle. In a three-level quantum system, we show that the stimulated Raman adiabatic passage technique can be associated to a peculiar Hamiltonian singularity. One deduces that the adiabatic pulse is solution of the optimal control problem only for a specific cost functional. This analysis is extended to the case of a four-level quantum system.

  20. Reliability analysis of a robotic system using hybridized technique

    NASA Astrophysics Data System (ADS)

    Kumar, Naveen; Komal; Lather, J. S.

    2017-09-01

In this manuscript, the reliability of a robotic system has been analyzed using the available data (containing vagueness, uncertainty, etc.). Quantification of the involved uncertainties is done through data fuzzification using triangular fuzzy numbers with known spreads as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, the computed reliability parameters have a wide range of predictions. Therefore, the decision-maker cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently to improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique: fuzzy set theory is utilized to quantify uncertainties, a fault tree is utilized for system modeling, the lambda-tau method is utilized to formulate mathematical expressions for failure/repair rates of the system, and a genetic algorithm is utilized to solve the established nonlinear programming problem. Different reliability parameters of the robotic system are computed and the results are compared with the existing technique. The components of the robotic system follow the exponential distribution, i.e., their failure rates are constant. Sensitivity analysis is also performed and the impact on system mean time between failures (MTBF) is addressed by varying other reliability parameters. Based on the analysis, some influential suggestions are given to improve the system performance.
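
The data fuzzification step described above can be sketched with alpha-cut interval arithmetic on triangular fuzzy numbers. The component rates below are invented, and the top event is taken as a simple OR gate, for which the lambda-tau expressions reduce to summing the component rates:

```python
import numpy as np

# Sketch of fuzzified failure-rate arithmetic: component rates are triangular
# fuzzy numbers (low, mode, high), values invented; the top event is a simple
# OR gate, for which the lambda-tau expressions reduce to summing the rates.
components = [(0.8e-3, 1.0e-3, 1.2e-3), (1.5e-3, 2.0e-3, 2.5e-3)]

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    lo, m, hi = tri
    return lo + alpha * (m - lo), hi - alpha * (hi - m)

for a in np.linspace(0.0, 1.0, 5):
    cuts = [alpha_cut(c, a) for c in components]
    lam_lo = sum(lo for lo, _ in cuts)
    lam_hi = sum(hi for _, hi in cuts)
    print(f"alpha={a:.2f}: system lambda in [{lam_lo:.2e}, {lam_hi:.2e}]")
```

The width of the interval at each membership level is exactly the "wide range of predictions" the abstract refers to; the hybridized technique then optimizes within these bounds rather than accepting them as-is.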

  1. Optical Design And Analysis Of Carbon Dioxide Laser Fusion Systems Using Interferometry And Fast Fourier Transform Techniques

    NASA Astrophysics Data System (ADS)

    Viswanathan, V. K.

    1980-11-01

The optical design and analysis of the LASL carbon dioxide laser fusion systems required the use of techniques that are quite different from the methods currently used in conventional optical design problems. The necessity for this is explored, and the method that has been successfully used at Los Alamos to understand these systems is discussed with examples. This method involves characterizing the various optical components in their mounts by a Zernike polynomial set and using fast Fourier transform techniques to propagate the beam, taking into account diffraction and other nonlinear effects that occur in these types of systems. The various programs used for analysis are briefly discussed.

  2. Real-Time Condition Monitoring and Fault Diagnosis of Gear Train Systems Using Instantaneous Angular Speed (IAS) Analysis

    NASA Astrophysics Data System (ADS)

    Sait, Abdulrahman S.

This dissertation presents a reliable technique for monitoring the condition of rotating machinery by applying instantaneous angular speed (IAS) analysis. A new analysis of the effects of changes in the orientation of the line of action and the pressure angle of the resultant force acting on the gear tooth profile of a spur gear under different levels of tooth damage is utilized. The analysis and experimental work discussed in this dissertation provide a clear understanding of the effects of damage on the IAS by analyzing the digital signal output of a rotary incremental optical encoder. A comprehensive literature review of the state of knowledge in condition monitoring and fault diagnostics of rotating machinery, including gearbox systems, is presented. Progress and new developments over the past 30 years in failure detection techniques for rotating machinery, including engines, bearings and gearboxes, are thoroughly reviewed. This work is limited to the analysis of a gear train system with gear tooth surface faults utilizing an angular motion analysis technique. Angular motion data were acquired using an incremental optical encoder. Results are compared to a vibration-based technique. The vibration data were acquired using an accelerometer. The signals were obtained and analyzed in the phase domains using signal averaging to determine the existence and position of faults on the gear train system. Forces between the mating teeth surfaces are analyzed and simulated to validate the influence of the presence of damage on the pressure angle and the IAS. National Instruments hardware is used and NI LabVIEW software code is developed for real-time, online condition monitoring systems and fault detection techniques. The sensitivity of optical encoders to gear fault detection techniques is experimentally investigated by applying IAS analysis under different gear damage levels and different operating conditions.
A reliable methodology is developed for selecting appropriate testing/operating conditions of a rotating system to generate an alarm system for damage detection.
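
The core IAS computation is simple: a fixed angular increment (one encoder pitch) divided by the measured time between pulse edges. The sketch below simulates a hypothetical 360-pulse/rev encoder on a shaft with a small speed ripple and recovers the mean angular speed; all parameters are invented for illustration.

```python
import numpy as np

# Hypothetical incremental encoder: 360 pulses/rev on a 10 Hz shaft with a
# 1% periodic speed fluctuation (as a gear tooth fault might produce).
ppr = 360

def true_speed(t):
    return 2 * np.pi * 10 * (1 + 0.01 * np.sin(2 * np.pi * 10 * t))

# Simulate pulse arrival times by integrating the shaft angle and logging
# a timestamp each time it crosses a 2*pi/ppr increment.
times = [0.0]
t, theta, dt = 0.0, 0.0, 1e-6
next_edge = 2 * np.pi / ppr
while len(times) < 2 * ppr:
    t += dt
    theta += true_speed(t) * dt
    if theta >= next_edge:
        times.append(t)
        next_edge += 2 * np.pi / ppr

# IAS estimate: fixed angular step divided by measured pulse interval.
times = np.array(times)
ias = (2 * np.pi / ppr) / np.diff(times)
print(ias.mean())
```

In a fault-detection setting one would examine the ripple in `ias` (here the injected 10 Hz fluctuation) rather than its mean, typically after angular-domain averaging over many revolutions.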

  3. String Stability of a Linear Formation Flight Control System

    NASA Technical Reports Server (NTRS)

    Allen, Michael J.; Ryan, Jack; Hanson, Curtis E.; Parle, James F.

    2002-01-01

String stability analysis of an autonomous formation flight system was performed using linear and nonlinear simulations. String stability is a measure of how position errors propagate from one vehicle to another in a cascaded system. In the formation flight system considered here, each i-th aircraft uses information from itself and the preceding (i-1)-th aircraft to track a commanded relative position. A possible solution for meeting performance requirements with such a system is to allow string instability. This paper explores two results of string instability and outlines analysis techniques for string unstable systems. The three analysis techniques presented here are: linear, nonlinear formation performance, and ride quality. The linear technique was developed from a worst-case scenario and could be applied to the design of a string unstable controller. The nonlinear formation performance and ride quality analysis techniques both use nonlinear formation simulation. Three of the four formation-controller gain-sets analyzed in this paper were limited more by ride quality than by performance. Formations of up to seven aircraft in a cascaded formation could be used in the presence of light gusts with this string unstable system.
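
Error propagation in such a cascade can be sketched with a hypothetical one-pole follower model whose DC gain g exceeds one. The model is invented for illustration, not taken from the paper, but it shows the defining symptom of string instability: the same disturbance produces a larger error at each position down the string.

```python
import numpy as np

# Hypothetical string of followers: each vehicle's error signal is its
# predecessor's error passed through a one-pole lag with DC gain g. With
# g > 1 the cascade is string unstable: errors grow down the string.
def propagate(e_lead, g=1.2, a=0.5, n_vehicles=7):
    errors = [e_lead]
    for _ in range(n_vehicles - 1):
        e_prev = errors[-1]
        e = np.zeros_like(e_prev)
        for k in range(1, len(e_prev)):
            e[k] = a * e[k - 1] + g * (1 - a) * e_prev[k]
        errors.append(e)
    return errors

e_lead = np.ones(200)   # step disturbance on the lead aircraft
errors = propagate(e_lead)
peaks = [float(np.max(np.abs(e))) for e in errors]
print(peaks)            # grows vehicle by vehicle
```

With seven vehicles and a per-link gain of 1.2, the last vehicle sees roughly 1.2^6 times the lead error, which is why the paper bounds usable formation size for a string unstable controller.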

  4. Towards Effective Clustering Techniques for the Analysis of Electric Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Emilie A.; Cotilla Sanchez, Jose E.; Halappanavar, Mahantesh

    2013-11-30

Clustering is an important data analysis technique with numerous applications in the analysis of electric power grids. Standard clustering techniques are oblivious to the rich structural and dynamic information available for power grids. Therefore, by exploiting the inherent topological and electrical structure in the power grid data, we propose new methods for clustering with applications to model reduction, locational marginal pricing, phasor measurement unit (PMU or synchrophasor) placement, and power system protection. We focus our attention on model reduction for analysis based on time-series information from synchrophasor measurement devices, and spectral techniques for clustering. By comparing different clustering techniques on two instances of realistic power grids we show that the solutions are related and therefore one could leverage that relationship for a computational advantage. Thus, by contrasting different clustering techniques we make a case for exploiting structure inherent in the data with implications for several domains including power systems.
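
A minimal instance of the spectral approach: build the graph Laplacian of a toy eight-bus network with two dense areas joined by a single tie line, and split it with the sign pattern of the Fiedler vector. The topology is invented for illustration.

```python
import numpy as np

# Invented eight-bus toy network: two dense areas (nodes 0-3 and 4-7) joined
# by a single tie line, split by the sign pattern of the Fiedler vector.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3),
         (4, 5), (4, 6), (5, 6), (5, 7), (6, 7),
         (3, 4)]                         # the weak tie between areas
n = 8
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

L = np.diag(A.sum(axis=1)) - A           # graph Laplacian
w, v = np.linalg.eigh(L)
fiedler = v[:, 1]                        # eigenvector of 2nd-smallest eigenvalue
clusters = (fiedler > 0).astype(int)     # sign pattern splits the two areas
print(clusters)
```

For a real grid one would weight the adjacency matrix by electrical distance (e.g. line admittances) rather than 0/1 connectivity, which is one form of the "inherent electrical structure" the abstract argues for exploiting.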

  5. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  6. Time-dependent inertia analysis of vehicle mechanisms

    NASA Astrophysics Data System (ADS)

    Salmon, James Lee

Two methods for performing transient inertia analysis of vehicle hardware systems are developed in this dissertation. The analysis techniques can be used to predict the response of vehicle mechanism systems to the accelerations associated with vehicle impacts. General analytical methods for evaluating translational or rotational system dynamics are generated and evaluated for various system characteristics. The utility of the derived techniques is demonstrated by applying the generalized methods to two vehicle systems. Time-dependent accelerations measured during a vehicle-to-vehicle impact are used as input to perform a dynamic analysis of an automobile liftgate latch and outside door handle. Generalized Lagrange equations for a non-conservative system are used to formulate a second-order nonlinear differential equation defining the response of the components to the transient input. The differential equation is solved by employing the fourth-order Runge-Kutta method. The events are then analyzed using commercially available two-dimensional rigid body dynamic analysis software. The results of the two analytical techniques are compared to experimental data generated by high-speed film analysis of tests of the two components performed on a high-G acceleration sled at Ford Motor Company.
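
The numerical core of the approach, a second-order nonlinear ODE stepped with classical fourth-order Runge-Kutta, can be sketched as follows. A damped pendulum stands in for the latch/handle dynamics; the coefficients are invented.

```python
import numpy as np

# Sketch: a damped pendulum (a stand-in for the latch/handle dynamics; the
# coefficients are invented) integrated with classical 4th-order Runge-Kutta.
def f(t, y, c=0.5, g_over_l=9.81):
    theta, omega = y
    return np.array([omega, -c * omega - g_over_l * np.sin(theta)])

def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

h = 0.001
y = np.array([0.5, 0.0])      # released from 0.5 rad, at rest
for i in range(10000):        # integrate 10 seconds
    y = rk4_step(f, i * h, y, h)
print(y)
```

In the dissertation's setting, the forcing term would be the measured sled acceleration sampled at each RK4 stage time instead of the autonomous dynamics used here.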

  7. Rewriting Modulo SMT and Open System Analysis

    NASA Technical Reports Server (NTRS)

    Rocha, Camilo; Meseguer, Jose; Munoz, Cesar

    2014-01-01

    This paper proposes rewriting modulo SMT, a new technique that combines the power of SMT solving, rewriting modulo theories, and model checking. Rewriting modulo SMT is ideally suited to model and analyze infinite-state open systems, i.e., systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism, which is proper to the system, and external non-determinism, which is due to the environment. In a reflective formalism, such as rewriting logic, rewriting modulo SMT can be reduced to standard rewriting. Hence, rewriting modulo SMT naturally extends rewriting-based reachability analysis techniques, which are available for closed systems, to open systems. The proposed technique is illustrated with the formal analysis of: (i) a real-time system that is beyond the scope of timed-automata methods and (ii) automatic detection of reachability violations in a synchronous language developed to support autonomous spacecraft operations.

  8. Flow-based analysis using microfluidics-chemiluminescence systems.

    PubMed

    Al Lawati, Haider A J

    2013-01-01

    This review will discuss various approaches and techniques in which analysis using microfluidics-chemiluminescence systems (MF-CL) has been reported. A variety of applications is examined, including environmental, pharmaceutical, biological, food and herbal analysis. Reported uses of CL reagents, sample introduction techniques, sample pretreatment methods, CL signal enhancement and detection systems are discussed. A hydrodynamic pumping system is predominately used for these applications. However, several reports are available in which electro-osmotic (EO) pumping has been implemented. Various sample pretreatment methods have been used, including liquid-liquid extraction, solid-phase extraction and molecularly imprinted polymers. A wide range of innovative techniques has been reported for CL signal enhancement. Most of these techniques are based on enhancement of the mixing process in the microfluidics channels, which leads to enhancement of the CL signal. However, other techniques are also reported, such as mirror reaction, liquid core waveguide, on-line pre-derivatization and the use of an opaque white chip with a thin transparent seal. Photodetectors are the most commonly used detectors; however, other detection systems have also been used, including integrated electrochemiluminescence (ECL) and organic photodiodes (OPDs). Copyright © 2012 John Wiley & Sons, Ltd.

  9. Discrete ordinates-Monte Carlo coupling: A comparison of techniques in NERVA radiation analysis

    NASA Technical Reports Server (NTRS)

    Lindstrom, D. G.; Normand, E.; Wilcox, A. D.

    1972-01-01

    In the radiation analysis of the NERVA nuclear rocket system, two-dimensional discrete ordinates calculations are sufficient to provide detail in the pressure vessel and reactor assembly. Other parts of the system, however, require three-dimensional Monte Carlo analyses. To use these two methods in a single analysis, a means of coupling was developed whereby the results of a discrete ordinates calculation can be used to produce source data for a Monte Carlo calculation. Several techniques for producing source detail were investigated. Results of calculations on the NERVA system are compared and limitations and advantages of the coupling techniques discussed.

  10. Viewpoint on ISA TR84.0.02--simplified methods and fault tree analysis.

    PubMed

    Summers, A E

    2000-01-01

    ANSI/ISA-S84.01-1996 and IEC 61508 require the establishment of a safety integrity level for any safety instrumented system or safety related system used to mitigate risk. Each stage of design, operation, maintenance, and testing is judged against this safety integrity level. Quantitative techniques can be used to verify whether the safety integrity level is met. ISA-dTR84.0.02 is a technical report under development by ISA, which discusses how to apply quantitative analysis techniques to safety instrumented systems. This paper discusses two of those techniques: (1) Simplified equations and (2) Fault tree analysis.
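
For a single-channel (1oo1) function, the simplified-equation route reduces to one line, PFDavg ≈ λDU·TI/2, which is then mapped to a SIL band. The rates below are assumed values for illustration only.

```python
# Hedged sketch of the "simplified equations" route for a single-channel
# (1oo1) safety instrumented function; the rates are assumed values.
lam_du = 2.0e-6          # dangerous undetected failure rate, per hour
test_interval = 8760.0   # proof-test interval: one year, in hours

# Average probability of failure on demand, simplified equation.
pfd_avg = lam_du * test_interval / 2.0

# SIL band per IEC 61508, low-demand mode.
if 1e-2 <= pfd_avg < 1e-1:
    sil = 1
elif 1e-3 <= pfd_avg < 1e-2:
    sil = 2
elif 1e-4 <= pfd_avg < 1e-3:
    sil = 3
else:
    sil = None
print(pfd_avg, sil)
```

Fault tree analysis covers the cases this shortcut cannot: redundant voting architectures, common-cause failures, and diagnostics, at the cost of building and quantifying the full tree.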

  11. Analytical transmissibility based transfer path analysis for multi-energy-domain systems using four-pole parameter theory

    NASA Astrophysics Data System (ADS)

    Mashayekhi, Mohammad Jalali; Behdinan, Kamran

    2017-10-01

    The increasing demand to minimize undesired vibration and noise levels in several high-tech industries has generated a renewed interest in vibration transfer path analysis. Analyzing vibration transfer paths within a system is of crucial importance in designing an effective vibration isolation strategy. Most of the existing vibration transfer path analysis techniques are empirical which are suitable for diagnosis and troubleshooting purpose. The lack of an analytical transfer path analysis to be used in the design stage is the main motivation behind this research. In this paper an analytical transfer path analysis based on the four-pole theory is proposed for multi-energy-domain systems. Bond graph modeling technique which is an effective approach to model multi-energy-domain systems is used to develop the system model. In this paper an electro-mechanical system is used as a benchmark example to elucidate the effectiveness of the proposed technique. An algorithm to obtain the equivalent four-pole representation of a dynamical systems based on the corresponding bond graph model is also presented in this paper.
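
The algebra behind the four-pole formulation is compact: each element is a 2x2 transmission matrix relating force and velocity at its two ports, and a series connection is a matrix product. A sketch with invented lumped elements (the matrix forms are the standard textbook ones, not taken from this paper):

```python
import numpy as np

# Sketch of four-pole (transmission-matrix) algebra: each mechanical element
# is a 2x2 matrix relating (force, velocity) at its input port to those at
# its output port, and a series cascade is a matrix product. Element values
# are invented; the matrix forms are the standard lumped-element ones.
def mass_pole(m, w):
    return np.array([[1.0, 1j * w * m], [0.0, 1.0]])

def spring_pole(k, w):
    return np.array([[1.0, 0.0], [1j * w / k, 1.0]])

w = 2 * np.pi * 50.0                          # analysis frequency, rad/s
T = mass_pole(2.0, w) @ spring_pole(1e5, w)   # cascade: mass then spring
print(T)
```

Reciprocity of passive elements shows up as det(T) = 1, which survives any cascade since determinants multiply; the paper's contribution is extracting such equivalent four-pole blocks from a bond graph model that may span several energy domains.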

  12. Theoretical and software considerations for nonlinear dynamic analysis

    NASA Technical Reports Server (NTRS)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1983-01-01

In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.

  13. A controls engineering approach for analyzing airplane input-output characteristics

    NASA Technical Reports Server (NTRS)

    Arbuckle, P. Douglas

    1991-01-01

    An engineering approach for analyzing airplane control and output characteristics is presented. State-space matrix equations describing the linear perturbation dynamics are transformed from physical coordinates into scaled coordinates. The scaling is accomplished by applying various transformations to the system to employ prior engineering knowledge of the airplane physics. Two different analysis techniques are then explained. Modal analysis techniques calculate the influence of each system input on each fundamental mode of motion and the distribution of each mode among the system outputs. The optimal steady state response technique computes the blending of steady state control inputs that optimize the steady state response of selected system outputs. Analysis of an example airplane model is presented to demonstrate the described engineering approach.
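
The modal-analysis step described here can be sketched directly: diagonalize the state matrix, then inv(V) @ B gives each input's influence on each mode and C @ V gives each mode's distribution among the outputs. The 2-state system below is an invented lightly damped oscillator, not an airplane model.

```python
import numpy as np

# Sketch of the modal-analysis step: diagonalize the state matrix A, then
# inv(V) @ B gives each input's influence on each mode and C @ V gives each
# mode's distribution among the outputs. This 2-state lightly damped
# oscillator is invented for illustration, not an airplane model.
A = np.array([[0.0, 1.0], [-4.0, -0.4]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

eigvals, V = np.linalg.eig(A)
mode_controllability = np.linalg.inv(V) @ B   # input influence per mode
mode_observability = C @ V                    # mode content per output
print(eigvals)
```

The scaling transformations the paper describes would be applied to A, B, and C before this decomposition, so that the magnitudes of the modal influence terms are physically comparable across inputs and outputs.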

  14. Parametric Robust Control and System Identification: Unified Approach

    NASA Technical Reports Server (NTRS)

    Keel, L. H.

    1996-01-01

    During the period of this support, a new control system design and analysis method has been studied. This approach deals with control systems containing uncertainties that are represented in terms of its transfer function parameters. Such a representation of the control system is common and many physical parameter variations fall into this type of uncertainty. Techniques developed here are capable of providing nonconservative analysis of such control systems with parameter variations. We have also developed techniques to deal with control systems when their state space representations are given rather than transfer functions. In this case, the plant parameters will appear as entries of state space matrices. Finally, a system modeling technique to construct such systems from the raw input - output frequency domain data has been developed.

  15. Thunderstorm monitoring and lightning warning, operational applications of the Safir system

    NASA Technical Reports Server (NTRS)

    Richard, Philippe

    1991-01-01

During the past years, a new range of studies has been opened by the application of electromagnetic localization techniques to the field of thunderstorm remote sensing. VHF localization techniques were used in particular for the analysis of lightning discharges and gave access to time-resolved 3-D images of lightning discharges within thunderclouds. The detection and localization techniques developed have been applied to the design of the SAFIR system. This development's main objective was an operational system capable of assessing, and warning in real time of, lightning hazards and potential thunderstorm hazards. The SAFIR system's main detection technique is long-range interferometric localization of thunderstorm electromagnetic activity; the system performs the localization of intracloud and cloud-to-ground lightning discharges and the analysis of the characteristics of the activity.

  16. Why bundled payments could drive innovation: an example from interventional oncology.

    PubMed

    Steele, Joseph R; Jones, A Kyle; Ninan, Elizabeth P; Clarke, Ryan K; Odisio, Bruno C; Avritscher, Rony; Murthy, Ravi; Mahvash, Armeen

    2015-03-01

    Some have suggested that the current fee-for-service health care payment system in the United States stifles innovation. However, there are few published examples supporting this concept. We implemented an innovative temporary balloon occlusion technique for yttrium 90 radioembolization of nonresectable liver cancer. Although our balloon occlusion technique was associated with similar patient outcomes, lower cost, and faster procedure times compared with the standard-of-care coil embolization technique, our technique failed to gain widespread acceptance. Financial analysis revealed that because the balloon occlusion technique avoided a procedural step associated with a lucrative Current Procedural Terminology billing code, this new technique resulted in a significant decrease in hospital and physician revenue in the current fee-for-service payment system, even though the new technique would provide a revenue enhancement through cost savings in a bundled payment system. Our analysis illustrates how in a fee-for-service payment system, financial disincentives can stifle innovation and advancement of health care delivery. Copyright © 2015 by American Society of Clinical Oncology.

  17. MSC/NASTRAN DMAP Alter Used for Closed-Form Static Analysis With Inertia Relief and Displacement-Dependent Loads

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Solving for the displacements of free-free coupled systems acted upon by static loads is a common task in the aerospace industry. Often, these problems are solved by static analysis with inertia relief. This technique allows for a free-free static analysis by balancing the applied loads with the inertia loads generated by the applied loads. For some engineering applications, the displacements of the free-free coupled system induce additional static loads. Hence, the applied loads are equal to the original loads plus the displacement-dependent loads. A launch vehicle being acted upon by an aerodynamic loading can have such applied loads. The final displacements of such systems are commonly determined with iterative solution techniques. Unfortunately, these techniques can be time consuming and labor intensive. Because the coupled system equations for free-free systems with displacement-dependent loads can be written in closed form, it is advantageous to solve for the displacements in this manner. Implementing closed-form equations in static analysis with inertia relief is analogous to implementing transfer functions in dynamic analysis. An MSC/NASTRAN (MacNeal-Schwendler Corporation/NASA Structural Analysis) DMAP (Direct Matrix Abstraction Program) Alter was used to include displacement-dependent loads in static analysis with inertia relief. It efficiently solved a common aerospace problem that typically has been solved with an iterative technique.

  18. Photomorphic analysis techniques: An interim spatial analysis using satellite remote sensor imagery and historical data

    NASA Technical Reports Server (NTRS)

    Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.

    1977-01-01

    The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.

  19. SAINT: A combined simulation language for modeling man-machine systems

    NASA Technical Reports Server (NTRS)

    Seifert, D. J.

    1979-01-01

    SAINT (Systems Analysis of Integrated Networks of Tasks) is a network modeling and simulation technique for the design and analysis of complex man-machine systems. SAINT provides the conceptual framework for representing systems that consist of discrete task elements, continuous state variables, and interactions between them. It also provides a mechanism for combining human performance models and dynamic system behaviors in a single modeling structure. The SAINT technique is described, and applications of SAINT are discussed.

  20. Mixed strategies for energy conservation and alternative energy utilization (solar) in buildings. Final report. Volume II. Detailed results. [New York, Atlanta, Omaha, and Albuquerque

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1977-06-01

    The mixed-strategy analysis was a tradeoff analysis between energy-conservation methods and an alternative energy source (solar), considering technical and economic benefits. The objective of the analysis was to develop guidelines for reducing energy requirements, reducing conventional fuel use, and identifying economic alternatives for building owners. The analysis was done with a solar system in place, which makes the study unique in that it determines the interaction of energy conservation with a solar system. The study therefore established guidelines for minimizing capital investment while reducing conventional fuel consumption through either a larger solar system or an energy-conserving technique. To focus the scope of energy-conservation techniques and alternative energy sources considered, five building types (houses, apartment buildings, commercial buildings, schools, and office buildings) were selected. The lists of energy-conservation techniques and alternative energy sources were then reduced to manageable size by using technical attributes to select the best candidates for further study. The resultant energy-conservation techniques were described in detail and their installed costs determined. The list of alternative energy sources reduced to solar alone. Building construction characteristics were defined for each building type for each of four geographic regions of the country. A mixed strategy consisting of an energy-conservation technique and a solar heating/hot-water/cooling system was analyzed, using computer simulation to determine the interaction between energy conservation and the solar system. Finally, using FEA fuel-price scenarios and installed costs for the solar system and energy-conservation techniques, an economic analysis was performed to determine the cost effectiveness of the combination. (MCW)

  1. Preparation and analysis of standardized waste samples for Controlled Ecological Life Support Systems (CELSS)

    NASA Technical Reports Server (NTRS)

    Carden, J. L.; Browner, R.

    1982-01-01

    The preparation and analysis of standardized waste samples for controlled ecological life support systems (CELSS) are considered. Analysis of samples from wet oxidation experiments, the development of ion chromatographic techniques utilizing conventional high pressure liquid chromatography (HPLC) equipment, and an investigation of techniques for interfacing an ion chromatograph (IC) with an inductively coupled plasma optical emission spectrometer (ICPOES) are discussed.

  2. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. A technique is used that combines simulation and analysis to closely illustrate the impact of deadlocks and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.
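    The flavor of mixing analysis with simulation can be illustrated with a small lock-conflict model (an illustrative stand-in, not the paper's actual model): the probability that two transactions, each locking k of N items, request at least one common item has a closed form that a quick simulation can confirm. N and k below are arbitrary.

```python
import random
from math import comb

N, k = 50, 4  # database items and locks per transaction (hypothetical values)

# Analysis: P(two transactions share at least one locked item)
# = 1 - C(N-k, k) / C(N, k)
p_analytic = 1 - comb(N - k, k) / comb(N, k)

# Simulation of the same quantity, for cross-checking the analytic result
random.seed(1)
trials = 20000
hits = sum(
    bool(set(random.sample(range(N), k)) & set(random.sample(range(N), k)))
    for _ in range(trials)
)
p_sim = hits / trials
```

    In a hybrid evaluation, cheap analytic results like this anchor the model while simulation covers the parts (e.g., deadlock resolution timing) that resist closed-form treatment.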

  3. Transportation Systems Evaluation

    NASA Technical Reports Server (NTRS)

    Fanning, M. L.; Michelson, R. A.

    1972-01-01

    A methodology for the analysis of transportation systems consisting of five major interacting elements is reported. The analysis begins with the causes of travel demand: geographic, economic, and demographic characteristics as well as attitudes toward travel. Through the analysis, the interaction of these factors with the physical and economic characteristics of the transportation system is determined. The result is an evaluation of the system from the point of view of both passenger and operator. The methodology is applicable to intraurban transit systems as well as to major airlines. Applications of the technique to the analysis of a PRT system and to a study of intraurban air travel are given. In the discussion, several unique models or techniques are mentioned, namely passenger preference modeling, an integrated intraurban transit model, and a series of models to perform airline analysis.

  4. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  5. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  6. Creep-Rupture Data Analysis - Engineering Application of Regression Techniques. Ph.D. Thesis - North Carolina State Univ.

    NASA Technical Reports Server (NTRS)

    Rummler, D. R.

    1976-01-01

    The results are presented of investigations to apply regression techniques to the development of methodology for creep-rupture data analysis. Regression analysis techniques are applied to the explicit description of the creep behavior of materials for space shuttle thermal protection systems. A regression analysis technique is compared with five parametric methods for analyzing three simulated and twenty real data sets, and a computer program for the evaluation of creep-rupture data is presented.
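    As a minimal illustration of regression applied to creep-rupture data (an illustrative linear model, not the specific formulations compared in the thesis), the sketch below fits log rupture time to reciprocal temperature and stress by least squares on synthetic data and recovers the generating coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic creep-rupture data: log10(rupture time) modeled as a linear
# function of reciprocal temperature and stress (hypothetical coefficients).
T = rng.uniform(800.0, 1200.0, 60)     # temperature, K
S = rng.uniform(50.0, 300.0, 60)       # stress, MPa
a_true, b_true, c_true = -8.0, 1.2e4, -0.01
log_t = a_true + b_true / T + c_true * S + rng.normal(0, 0.05, 60)

# Least-squares regression with design matrix [1, 1/T, S]
X = np.column_stack([np.ones_like(T), 1.0 / T, S])
coef, *_ = np.linalg.lstsq(X, log_t, rcond=None)
```

    The same machinery extends to the richer basis functions (higher-order terms, interaction terms) that an explicit description of creep behavior typically requires.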

  7. Reliability/safety analysis of a fly-by-wire system

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Goddman, H. A.

    1980-01-01

    An analysis technique has been developed to estimate the reliability of a very complex, safety-critical system by constructing a diagram of the reliability equations for the total system. This diagram has many of the characteristics of a fault-tree or success-path diagram, but is much easier to construct for complex redundant systems. The diagram provides insight into system failure characteristics and identifies the most likely failure modes. A computer program aids in the construction of the diagram and the computation of reliability. Analysis of the NASA F-8 Digital Fly-by-Wire Flight Control System is used to illustrate the technique.
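    For simple redundancy patterns, the success-path idea reduces to summing the probabilities of the channel-state combinations that keep the system working. A minimal sketch for a 2-out-of-3 voting arrangement (an illustrative topology, not the F-8 system itself):

```python
from itertools import product

R = 0.999  # hypothetical single-channel reliability

# Brute-force enumeration over all channel up/down states: the system
# succeeds when at least 2 of the 3 redundant channels work.
R_sys = 0.0
for states in product([True, False], repeat=3):
    if sum(states) >= 2:
        p = 1.0
        for up in states:
            p *= R if up else (1.0 - R)
        R_sys += p

# Closed-form check for 2-out-of-3: R_sys = 3R^2 - 2R^3
R_formula = 3 * R**2 - 2 * R**3
```

    Enumeration scales poorly, which is why a diagram of reliability equations plus a supporting computer program, as in this record, becomes attractive for genuinely complex redundant systems.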

  8. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Proposal analysis... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line items...

  9. Systemic Analysis Approaches for Air Transportation

    NASA Technical Reports Server (NTRS)

    Conway, Sheila

    2005-01-01

    Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so that issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, much less predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.

  10. Signal Detection Techniques for Diagnostic Monitoring of Space Shuttle Main Engine Turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, Thomas; Jong, Jen-Yi

    1986-01-01

    An investigation to develop, implement, and evaluate signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery is reviewed. A brief description of the Space Shuttle Main Engine (SSME) test/measurement program is presented. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques have been implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. A unique coherence function (the hyper-coherence) was developed over the course of this investigation, which appears promising as a diagnostic tool. This technique and several other nonlinear methods of signal analysis are presented and illustrated by application. Software for application of these techniques has been installed on the signal processing system at the NASA/MSFC Systems Dynamics Laboratory.
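    The ordinary (linear) coherence function that the hyper-coherence generalizes can be computed with segment-averaged FFTs. A NumPy sketch on synthetic data follows; the hyper-coherence itself, a nonlinear extension, is not reproduced here, and the segment count and window length are arbitrary illustrative choices.

```python
import numpy as np

def coherence(x, y, nseg=64, nwin=256):
    """Magnitude-squared coherence via segment-averaged FFTs (Welch-style).
    Ordinary linear coherence only; the report's hyper-coherence is a
    nonlinear extension not shown here."""
    Pxx = Pyy = Pxy = 0
    win = np.hanning(nwin)
    for k in range(nseg):
        xs = x[k * nwin:(k + 1) * nwin] * win
        ys = y[k * nwin:(k + 1) * nwin] * win
        X, Y = np.fft.rfft(xs), np.fft.rfft(ys)
        Pxx = Pxx + np.abs(X) ** 2
        Pyy = Pyy + np.abs(Y) ** 2
        Pxy = Pxy + X * np.conj(Y)
    return np.abs(Pxy) ** 2 / (Pxx * Pyy)

rng = np.random.default_rng(3)
n = 64 * 256
t = np.arange(n)
tone = np.sin(2 * np.pi * 0.125 * t)       # shared component at FFT bin 32
x = tone + 0.1 * rng.normal(size=n)        # two "sensors" sharing the tone
y = tone + 0.1 * rng.normal(size=n)        # but with independent noise
gamma2 = coherence(x, y)                   # near 1 at bin 32, small elsewhere
```

    High coherence at a machinery-related frequency across two sensors, against a low background, is the kind of signature such functions flag for diagnosis.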

  11. WAATS: A computer program for Weights Analysis of Advanced Transportation Systems

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.

    1974-01-01

    A historical weight estimating technique for advanced transportation systems is presented. The classical approach to weight estimation is discussed and sufficient data is presented to estimate weights for a large spectrum of flight vehicles including horizontal and vertical takeoff aircraft, boosters and reentry vehicles. A computer program, WAATS (Weights Analysis for Advanced Transportation Systems) embracing the techniques discussed has been written and user instructions are presented. The program was developed for use in the ODIN (Optimal Design Integration System) system.

  12. A Flipped Classroom Approach to Teaching Systems Analysis, Design and Implementation

    ERIC Educational Resources Information Center

    Tanner, Maureen; Scott, Elsje

    2015-01-01

    This paper describes a flipped classroom approach followed to teach systems analysis, design and implementation at university level. The techniques employed are described. These techniques were underpinned by a theory of coherent practice: a pedagogy that provides a framework for the design of highly structured interventions to guide students in…

  13. Design approach of an aquaculture cage system for deployment in the constructed channel flow environments of a power plant

    PubMed Central

    Lee, Jihoon; Fredriksson, David W.; DeCew, Judson; Drach, Andrew; Yim, Solomon C.

    2018-01-01

    This study provides an engineering approach for designing an aquaculture cage system for use in constructed channel flow environments. As sustainable aquaculture has grown globally, many novel techniques have been introduced such as those implemented in the global Atlantic salmon industry. The advent of several highly sophisticated analysis software systems enables the development of such novel engineering techniques. These software systems commonly include three-dimensional (3D) drafting, computational fluid dynamics, and finite element analysis. In this study, a combination of these analysis tools is applied to evaluate a conceptual aquaculture system for potential deployment in a power plant effluent channel. The channel is supposedly clean; however, it includes elevated water temperatures and strong currents. The first portion of the analysis includes the design of a fish cage system with specific net solidities using 3D drafting techniques. Computational fluid dynamics is then applied to evaluate the flow reduction through the system from the previously generated solid models. Implementing the same solid models, a finite element analysis is performed on the critical components to assess the material stresses produced by the drag force loads that are calculated from the fluid velocities. PMID:29897954

  14. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    NASA Technical Reports Server (NTRS)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described, and the EPS analysis tools are surveyed.

  15. Performance analysis of static locking in replicated distributed database systems

    NASA Technical Reports Server (NTRS)

    Kuang, Yinghong; Mukkamala, Ravi

    1991-01-01

    Data replication and transaction deadlocks can severely affect the performance of distributed database systems. Many current evaluation techniques ignore these aspects because they are difficult to evaluate through analysis and time consuming to evaluate through simulation. Here, a technique is discussed that combines simulation and analysis to closely illustrate the impact of deadlocks and to evaluate the performance of replicated distributed databases with both shared and exclusive locks.

  16. Automated Sneak Circuit Analysis Technique

    DTIC Science & Technology

    1990-06-01

    … the OrCAD/SDT module port facility. 2. The terminals of all in-circuit voltage sources (e.g., batteries) must be labeled using the OrCAD/SDT module port facility. … Automated Sneak Circuit Analysis Technique, Systems Reliability & Engineering Division, Rome Air Development Center (RADC), June 1990.

  17. Reduced-Smoke Solid Propellant Combustion Products Analysis. Development of a Micromotor Combustor Technique.

    DTIC Science & Technology

    1976-10-01

    A low-cost micromotor combustor technique has been devised to support the development of reduced-smoke solid propellant formulations. The technique includes a simple, reusable micromotor capable of high chamber pressures, a combustion products collection system, and procedures for analysis of …

  18. Sensor failure and multivariable control for airbreathing propulsion systems. Ph.D. Thesis - Dec. 1979 Final Report

    NASA Technical Reports Server (NTRS)

    Behbehani, K.

    1980-01-01

    A new sensor/actuator failure analysis technique for turbofan jet engines was developed. Three phases of failure analysis, namely detection, isolation, and accommodation are considered. Failure detection and isolation techniques are developed by utilizing the concept of Generalized Likelihood Ratio (GLR) tests. These techniques are applicable to both time varying and time invariant systems. Three GLR detectors are developed for: (1) hard-over sensor failure; (2) hard-over actuator failure; and (3) brief disturbances in the actuators. The probability distribution of the GLR detectors and the detectability of sensor/actuator failures are established. Failure type is determined by the maximum of the GLR detectors. Failure accommodation is accomplished by extending the Multivariable Nyquest Array (MNA) control design techniques to nonsquare system designs. The performance and effectiveness of the failure analysis technique are studied by applying the technique to a turbofan jet engine, namely the Quiet Clean Short Haul Experimental Engine (QCSEE). Single and multiple sensor/actuator failures in the QCSEE are simulated and analyzed and the effects of model degradation are studied.
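    For the known-variance, mean-jump case, the GLR statistic over a residual window takes the simple form 2 ln Λ = n·x̄²/σ². The sketch below is illustrative only (not the QCSEE implementation): it shows the statistic separating a healthy residual window from one carrying a hard-over-style sensor bias.

```python
import random

def glr_mean_shift(residuals, sigma):
    """GLR statistic for a mean jump in a window of residuals with known
    noise std: 2*ln(likelihood ratio) = n * mean^2 / sigma^2.
    Illustrative detector only, not the engine-specific formulation."""
    n = len(residuals)
    m = sum(residuals) / n
    return n * m * m / (sigma * sigma)

random.seed(7)
sigma = 1.0
healthy = [random.gauss(0.0, sigma) for _ in range(50)]   # nominal sensor
failed = [random.gauss(4.0, sigma) for _ in range(50)]    # hard-over bias

g_ok = glr_mean_shift(healthy, sigma)    # small: near chi-square(1) scale
g_bad = glr_mean_shift(failed, sigma)    # large: far above any threshold
```

    Under the no-failure hypothesis the statistic is approximately chi-square distributed, which is what lets a detection threshold be set for a chosen false-alarm rate, as the record's distributional analysis does.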

  19. Flow Injection Technique for Biochemical Analysis with Chemiluminescence Detection in Acidic Media

    PubMed Central

    Chen, Jing; Fang, Yanjun

    2007-01-01

    A review with 90 references is presented to show the development of acidic chemiluminescence methods for biochemical analysis using the flow injection technique over the last 10 years. A brief discussion of both the chemiluminescence and flow injection techniques is given. The proposed methods for biochemical analysis are described and compared according to the chemiluminescence system used.

  20. Expert systems in civil engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostem, C.N.; Maher, M.L.

    1986-01-01

    This book presents the papers given at a symposium on expert systems in civil engineering. Topics considered at the symposium included problem solving using expert system techniques, construction schedule analysis, decision making and risk analysis, seismic risk analysis systems, an expert system for inactive hazardous waste site characterization, an expert system for site selection, knowledge engineering, and knowledge-based expert systems in seismic analysis.

  1. Miniaturized flow injection analysis system

    DOEpatents

    Folta, James A.

    1997-01-01

    Flow injection analysis is a chemical analysis technique wherein small quantities of chemical reagents and sample are intermixed and reacted within a capillary flow system and the reaction products are detected optically, electrochemically, or by other means. A highly miniaturized version of a flow injection analysis system has been fabricated utilizing microfabrication techniques common to the microelectronics industry. The microflow system uses flow capillaries formed by etching microchannels in a silicon or glass wafer followed by bonding to another wafer, commercially available microvalves bonded directly to the microflow channels, and an optical absorption detector cell formed near the capillary outlet, with light being both delivered and collected with fiber optics. The microflow system is designed mainly for analysis of liquids and currently measures 38 × 25 × 3 mm, but can be designed for gas analysis and be substantially smaller in construction.

  2. Do not blame the driver: a systems analysis of the causes of road freight crashes.

    PubMed

    Newnam, Sharon; Goode, Natassia

    2015-03-01

    Although many have advocated a systems approach in road transportation, this view has not meaningfully penetrated road safety research, practice, or policy. In this study, a systems theory-based approach, Rasmussen's (1997) risk management framework and the associated Accimap technique, is applied to the analysis of road freight transportation crashes. Twenty-seven highway crash investigation reports were downloaded from the National Transport Safety Bureau website. Thematic analysis was used to identify the complex system of contributory factors, and relationships, identified within the reports. The Accimap technique was then used to represent the linkages and dependencies within and across system levels in the road freight transportation industry and to identify common factors and interactions across multiple crashes. The results demonstrate how a systems approach can increase knowledge in this safety-critical domain, while the findings can be used to guide prevention efforts and the development of system-based investigation processes for the heavy vehicle industry. A research agenda for developing an investigation technique to better support the application of the Accimap technique by practitioners in the road freight transportation industry is proposed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution intends to achieve the generation of high-quality event logs in the system. The performed analyses allowed for redefining functions in the system and proposing a proper flow of information. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.
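    The first step of most process-mining discovery algorithms is extracting the directly-follows relation from the event log. A tiny pure-Python sketch with a hypothetical hospital log (activity names are invented for illustration):

```python
from collections import Counter

# Hypothetical hospital event log: (case id, activity), ordered per case.
log = [
    (1, "admit"), (1, "triage"), (1, "treat"), (1, "discharge"),
    (2, "admit"), (2, "triage"), (2, "lab"), (2, "treat"), (2, "discharge"),
    (3, "admit"), (3, "triage"), (3, "treat"), (3, "discharge"),
]

# Group events into traces, one activity sequence per case
traces = {}
for case, act in log:
    traces.setdefault(case, []).append(act)

# Directly-follows graph: count how often activity b directly follows a
dfg = Counter()
for acts in traces.values():
    for a, b in zip(acts, acts[1:]):
        dfg[(a, b)] += 1
```

    From counts like these, discovery algorithms reconstruct the control flow of the hospital process, which is what the analyses described in the record build on.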

  4. A performance evaluation postprocessor for computer-aided design and analysis of communication systems

    NASA Technical Reports Server (NTRS)

    Tranter, W. H.

    1979-01-01

    A technique for estimating the signal-to-noise ratio at a point in a digital simulation of a communication system is described; the technique is essentially a digital realization of a technique proposed by Shepertycki (1964) for the evaluation of analog communication systems. Signals having lowpass or bandpass spectra may be used. Simulation results show the technique to be accurate over a wide range of signal-to-noise ratios.
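    A correlation-based SNR estimate in the spirit described here can be sketched briefly (a sketch under stated assumptions, not Shepertycki's exact formulation): project the simulated output onto the clean reference signal to split it into signal and noise components, then ratio their powers.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000
x = np.sin(2 * np.pi * 0.01 * np.arange(n))   # clean reference signal
true_snr = 10.0                               # linear SNR to be recovered
noise = rng.normal(0, np.sqrt(np.mean(x**2) / true_snr), n)
y = x + noise                                 # simulated system output

# Correlation-based separation: project the output onto the reference
a = np.dot(x, y) / np.dot(x, x)       # estimated gain through the system
p_signal = a**2 * np.mean(x**2)       # power of the correlated part
p_noise = np.mean((y - a * x) ** 2)   # power of the residual
snr_est = p_signal / p_noise          # recovers ~10 on this synthetic case
```

    In an actual simulation postprocessor the "reference" is the signal tapped upstream of the noisy subsystem, which is what makes the measurement possible at an arbitrary internal point.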

  5. The Generation of Novel MR Imaging Techniques to Visualize Inflammatory/Degenerative Mechanisms and the Correlation of MR Data with 3D Microscopic Changes

    DTIC Science & Technology

    2013-09-01

    … existing MR scanning systems, providing the ability to visualize structures that are impossible with current methods. Using techniques to concurrently stain … and a unique system for analysis of affected brain regions, coupled with other imaging techniques and molecular measurements, holds significant …

  6. Reliability techniques for computer executive programs

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Computer techniques for increasing the stability and reliability of executive and supervisory systems were studied. Program segmentation characteristics are discussed along with a validation system which is designed to retain the natural top down outlook in coding. An analysis of redundancy techniques and roll back procedures is included.

  7. Coarse-to-fine markerless gait analysis based on PCA and Gauss-Laguerre decomposition

    NASA Astrophysics Data System (ADS)

    Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Carli, Marco; Neri, Alessandro; D'Alessio, Tommaso

    2005-04-01

    Human movement analysis is generally performed with marker-based systems, which allow reconstructing, with high levels of accuracy, the trajectories of markers placed on specific points of the human body. Marker-based systems, however, have some drawbacks that can be overcome by the use of video systems applying markerless techniques. In this paper, a specifically designed computer vision technique for the detection and tracking of relevant body points is presented. It is based on the Gauss-Laguerre decomposition, and a principal component analysis (PCA) technique is used to circumscribe the region of interest. Results obtained on both synthetic and experimental tests show a significant reduction of the computational costs, with no significant reduction of the tracking accuracy.
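    The PCA step used to circumscribe the region of interest amounts to taking the dominant eigenvector of the point-cloud covariance. A NumPy sketch on a synthetic elongated cloud standing in for the segmented body region (the paper's actual input is video frames):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
# Synthetic "silhouette" points: an anisotropic cloud tilted 30 degrees
major = rng.normal(0, 5.0, n)     # spread along the long axis
minor = rng.normal(0, 0.5, n)     # spread across it
theta = np.deg2rad(30)
pts = np.column_stack([
    major * np.cos(theta) - minor * np.sin(theta),
    major * np.sin(theta) + minor * np.cos(theta),
])

# PCA: eigenvectors of the covariance matrix give the dominant axis,
# which circumscribes the region of interest for later processing
centered = pts - pts.mean(axis=0)
cov = centered.T @ centered / (n - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
axis = eigvecs[:, np.argmax(eigvals)]        # unit vector of the long axis
angle = np.degrees(np.arctan2(axis[1], axis[0])) % 180   # recovers ~30 deg
```

    Restricting the more expensive Gauss-Laguerre point detection to the oriented bounding region found this way is what yields the computational savings the abstract reports.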

  8. Advances in the analysis and design of constant-torque springs

    NASA Technical Reports Server (NTRS)

    McGuire, John R.; Yura, Joseph A.

    1996-01-01

    In order to improve the design procedure of constant-torque springs used in aerospace applications, several new analysis techniques have been developed. These techniques make it possible to accurately construct a torque-rotation curve for any general constant-torque spring configuration. These new techniques allow for friction in the system to be included in the analysis, an area of analysis that has heretofore been unexplored. The new analysis techniques also include solutions for the deflected shape of the spring as well as solutions for drum and roller support reaction forces. A design procedure incorporating these new capabilities is presented.

  9. The integrated manual and automatic control of complex flight systems

    NASA Technical Reports Server (NTRS)

    Schmidt, D. K.

    1985-01-01

    Pilot/vehicle analysis techniques for optimizing aircraft handling qualities are presented. The analysis approach considered is based on optimal control frequency-domain techniques. These techniques stem from an optimal control approach of a Neal-Smith-like analysis of aircraft attitude dynamics, extended to analyze the flared landing task. Some modifications to the technique are suggested and discussed. An in-depth analysis of the effect of the experimental variables, such as the prefilter, is conducted to gain further insight into the flared landing task for this class of vehicle dynamics.

  10. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Technical Reports Server (NTRS)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  11. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Astrophysics Data System (ADS)

    Cull, R. C.; Eltimsahy, A. H.

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  12. Thermal radiation analysis system TRASYS 2: User's manual

    NASA Technical Reports Server (NTRS)

    Goble, R. G.; Jensen, C. L.

    1980-01-01

    The Thermal Radiation Analyzer System (TRASYS) program puts thermal radiation analysis on the same basis as thermal analysis using program systems such as MITAS and SINDA. The user is provided the powerful options of writing his own executive, or driver, logic and of choosing, among several available options, the most desirable solution technique(s) for the problem at hand. This user's manual serves the twofold purpose of instructing the user in all applications and providing a convenient reference book that presents the features and capabilities in a concise, easy-to-find manner.

  13. Analysis of the Apollo spacecraft operational data management system. Executive summary

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A study was made of Apollo, Skylab, and several other data management systems to determine those techniques which could be applied to the management of operational data for future manned spacecraft programs. The results of the study are presented and include: (1) an analysis of present data management systems, (2) a list of requirements for future operational data management systems, (3) an evaluation of automated data management techniques, and (4) a plan for data management applicable to future space programs.

  14. Systems Analysis for Librarians and Information Professionals. Second Edition. Library and Information Science Text Series.

    ERIC Educational Resources Information Center

    Osborne, Larry N.; Nakamura, Margaret

    This textbook covering the basic techniques of systems analysis targeted to the librarian or information professional contains the following chapters: (1) Understanding Systems Analysis; (2) The Historical Development of Systems Analysis; (3) The Human Element; (4) Identifying and Defining Problems; (5) Collecting Data; (6) Analyzing and…

  15. A combination of selected mapping and clipping to increase energy efficiency of OFDM systems

    PubMed Central

    Lee, Byung Moo; Rim, You Seung

    2017-01-01

    We propose an energy-efficient combination design for OFDM systems based on the selected mapping (SLM) and clipping peak-to-average power ratio (PAPR) reduction techniques, and present the related energy efficiency (EE) performance analysis. Combining two different PAPR reduction techniques can provide a significant benefit in increasing EE, because it takes advantage of both techniques. For the combination, we choose the clipping and SLM techniques, since the former is quite simple and effective, and the latter does not cause any signal distortion. We provide the structure and the systematic operating method, and present the analyses used to derive the EE gain of the combined technique. Our analysis shows that the combined technique increases the EE by 69% compared to no PAPR reduction, and by 19.34% compared to using the SLM technique alone. PMID:29023591
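The clipping half of such a combination can be sketched as follows. The subcarrier count, oversampling factor, and clipping ratio are assumed for illustration, and SLM (which searches over phase-rotated candidate symbols) is omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# One OFDM symbol: N QPSK subcarriers, oversampled via a zero-padded IFFT.
N, L = 64, 4                                   # subcarriers, oversampling factor
qpsk = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
x = np.fft.ifft(np.concatenate([qpsk, np.zeros((L - 1) * N)]))

# Amplitude clipping: limit the envelope to `ratio` times the RMS level,
# preserving the phase of each sample.
ratio = 1.5
a_max = ratio * np.sqrt(np.mean(np.abs(x) ** 2))
scale = np.minimum(1.0, a_max / np.maximum(np.abs(x), 1e-12))
clipped = x * scale

print(papr_db(x), papr_db(clipped))   # clipping lowers the PAPR
```

Lower PAPR lets the power amplifier operate closer to saturation, which is the mechanism behind the EE gain the abstract quantifies; the distortion that clipping introduces is what SLM helps keep small in the combined scheme.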

  16. Application of Mathematical Signal Processing Techniques to Mission Systems. (l’Application des techniques mathematiques du traitement du signal aux systemes de conduite des missions)

    DTIC Science & Technology

    1999-11-01

    represents the linear time-invariant (LTI) response of the combined analysis/synthesis system while the second represents the aliasing introduced into…effectively to implement voice scrambling systems based on time-frequency permutation. The most general form of such a system is shown in Fig. 22 where…RTO Lecture Series 216, Neuilly-sur-Seine, France.

  17. Statistical evaluation of vibration analysis techniques

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
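The detection-performance quantification mentioned above amounts to tracing probability of detection against probability of false alarm as the decision threshold varies. A minimal sketch, with invented Gaussian feature distributions standing in for real vibration statistics:

```python
import random

rng = random.Random(1)
# Hypothetical detection statistic: an RMS-like feature, larger under a fault.
healthy = [rng.gauss(1.0, 0.3) for _ in range(500)]
faulty = [rng.gauss(1.8, 0.3) for _ in range(500)]

def roc_point(threshold):
    """Empirical probability of detection and of false alarm at one threshold."""
    pd = sum(x > threshold for x in faulty) / len(faulty)
    pfa = sum(x > threshold for x in healthy) / len(healthy)
    return pd, pfa

for t in (1.0, 1.4, 1.8):
    pd, pfa = roc_point(t)
    print(f"threshold={t:.1f}  Pd={pd:.2f}  Pfa={pfa:.2f}")
```

Sweeping the threshold traces out the ROC curve; comparing curves from different analysis techniques on the same data is the statistical evaluation the abstract describes.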

  18. A simple white noise analysis of neuronal light responses.

    PubMed

    Chichilnisky, E J

    2001-05-01

    A white noise technique is presented for estimating the response properties of spiking visual system neurons. The technique is simple, robust, efficient and well suited to simultaneous recordings from multiple neurons. It provides a complete and easily interpretable model of light responses even for neurons that display a common form of response nonlinearity that precludes classical linear systems analysis. A theoretical justification of the technique is presented that relies only on elementary linear algebra and statistics. Implementation is described with examples. The technique and the underlying model of neural responses are validated using recordings from retinal ganglion cells, and in principle are applicable to other neurons. Advantages and disadvantages of the technique relative to classical approaches are discussed.
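The technique is essentially the spike-triggered average: under white-noise stimulation, averaging the stimulus segments that precede spikes recovers the neuron's linear filter even through an output nonlinearity. A self-contained sketch with an assumed, illustrative filter, rectifying nonlinearity, and rate scaling (not parameters from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an LNP-style neuron: white-noise stimulus -> linear filter ->
# rectifying nonlinearity -> probabilistic spiking (illustrative parameters).
T, K = 50_000, 15
stimulus = rng.normal(0, 1, T)
true_filter = np.exp(-np.arange(K) / 4.0) * np.sin(np.arange(K) / 2.0)
drive = np.convolve(stimulus, true_filter)[:T]
rate = np.maximum(drive, 0.0) * 0.1            # rectified, scaled firing probability
spikes = rng.random(T) < rate

# Spike-triggered average: mean stimulus segment preceding each spike.
spike_times = np.flatnonzero(spikes)
spike_times = spike_times[spike_times >= K]
sta = np.zeros(K)
for t in spike_times:
    sta += stimulus[t - K + 1 : t + 1]
sta /= len(spike_times)

# For white-noise input the STA is proportional to the time-reversed filter.
corr = np.corrcoef(sta[::-1], true_filter)[0, 1]
print(len(spike_times), corr)
```

The high correlation between the (time-reversed) STA and the true filter illustrates why the method works without classical linear-systems assumptions about the spike generator.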

  19. The Application of Operations Research Techniques to the Evaluation of Military Management Information Systems.

    DTIC Science & Technology

    systems such as management information systems. To provide a methodology yielding quantitative results which may assist a commander and his staff in…this analysis, it is proposed that management information systems be evaluated as a whole by a technique defined as the semantic differential. Each…

  20. [Development of performance evaluation and management system on advanced schistosomiasis medical treatment].

    PubMed

    Zhou, Xiao-Rong; Huang, Shui-Sheng; Gong, Xin-Guo; Cen, Li-Ping; Zhang, Cong; Zhu, Hong; Yang, Jun-Jing; Chen, Li

    2012-04-01

    To construct a performance evaluation and management system for advanced schistosomiasis medical treatment, and to analyze and evaluate the advanced schistosomiasis medical treatment work over the years. By applying database management and C++ programming techniques, we entered the information on advanced schistosomiasis cases into the system and comprehensively evaluated the medical treatment work through cost-effect, cost-effectiveness, and cost-benefit analyses. We produced a set of software routines for cost-effect, cost-effectiveness, and cost-benefit analysis. The system has a clear structure, is easy to operate, presents a friendly interface, and makes information entry and retrieval convenient. It can benefit the performance evaluation of the province's advanced schistosomiasis medical treatment work. The system satisfies the current needs of advanced schistosomiasis medical treatment work and can easily be put into wide use.

  1. Miniaturized flow injection analysis system

    DOEpatents

    Folta, J.A.

    1997-07-01

    A chemical analysis technique known as flow injection analysis is described, wherein small quantities of chemical reagents and sample are intermixed and reacted within a capillary flow system and the reaction products are detected optically, electrochemically, or by other means. A highly miniaturized version of a flow injection analysis system has been fabricated utilizing microfabrication techniques common to the microelectronics industry. The microflow system uses flow capillaries formed by etching microchannels in a silicon or glass wafer followed by bonding to another wafer, commercially available microvalves bonded directly to the microflow channels, and an optical absorption detector cell formed near the capillary outlet, with light being both delivered and collected with fiber optics. The microflow system is designed mainly for analysis of liquids and currently measures 38×25×3 mm, but can be designed for gas analysis and be substantially smaller in construction. 9 figs.

  2. Modular techniques for dynamic fault-tree analysis

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
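The combinatorial side of such an analysis reduces to gate-by-gate probability arithmetic over independent basic events. A minimal sketch with invented failure probabilities (the Markov half of the hybrid method, needed for sequence-dependent failures, is not shown):

```python
def and_gate(*probs):
    """Failure probability of an AND gate over independent basic events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """OR gate via the complement rule, assuming independent inputs."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

# Hypothetical fault tree: the system fails if the power supply fails
# OR both redundant processors fail.
p_power = 1e-4
p_cpu = 1e-3
p_top = or_gate(p_power, and_gate(p_cpu, p_cpu))

print(p_top)
```

Modularization, as the abstract explains, keeps subtrees like the redundant-processor pair solvable by whichever method fits (a Markov model if its failures are sequence-dependent, simple arithmetic like this if not), with only the module's result propagated upward.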

  3. Remote sensing science for the Nineties; Proceedings of IGARSS '90 - 10th Annual International Geoscience and Remote Sensing Symposium, University of Maryland, College Park, May 20-24, 1990. Vols. 1, 2, & 3

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Various papers on remote sensing (RS) for the nineties are presented. The general topics addressed include: subsurface methods, radar scattering, oceanography, microwave models, atmospheric correction, passive microwave systems, RS in tropical forests, moderate resolution land analysis, SAR geometry and SNR improvement, image analysis, inversion and signal processing for geoscience, surface scattering, rain measurements, sensor calibration, wind measurements, terrestrial ecology, agriculture, geometric registration, subsurface sediment geology, radar modulation mechanisms, radar ocean scattering, SAR calibration, airborne radar systems, water vapor retrieval, forest ecosystem dynamics, land analysis, multisensor data fusion. Also considered are: geologic RS, RS sensor optical measurements, RS of snow, temperature retrieval, vegetation structure, global change, artificial intelligence, SAR processing techniques, geologic RS field experiment, stochastic modeling, topography and Digital Elevation model, SAR ocean waves, spaceborne lidar and optical, sea ice field measurements, millimeter waves, advanced spectroscopy, spatial analysis and data compression, SAR polarimetry techniques. Also discussed are: plant canopy modeling, optical RS techniques, optical and IR oceanography, soil moisture, sea ice back scattering, lightning cloud measurements, spatial textural analysis, SAR systems and techniques, active microwave sensing, lidar and optical, radar scatterometry, RS of estuaries, vegetation modeling, RS systems, EOS/SAR Alaska, applications for developing countries, SAR speckle and texture.

  4. Application of small-signal modeling and measurement techniques to the stability analysis of an integrated switching-mode power system. [onboard Dynamics Explorer Satellite

    NASA Technical Reports Server (NTRS)

    Wong, R. C.; Owen, H. A., Jr.; Wilson, T. G.; Rodriguez, G. E.

    1980-01-01

    Small-signal modeling techniques are used in a system stability analysis of a breadboard version of a complete functional electrical power system. The system consists of a regulated switching dc-to-dc converter, a solar-cell-array simulator, a solar-array EMI filter, battery chargers and linear shunt regulators. Loss mechanisms in the converter power stage, including switching-time effects in the semiconductor elements, are incorporated into the modeling procedure to provide an accurate representation of the system without requiring frequency-domain measurements to determine the damping factor. The small-signal system model is validated by the use of special measurement techniques which are adapted to the poor signal-to-noise ratio encountered in switching-mode systems. The complete electrical power system with the solar-array EMI filter is shown to be stable over the intended range of operation.

  5. Mathematical and Statistical Techniques for Systems Medicine: The Wnt Signaling Pathway as a Case Study.

    PubMed

    MacLean, Adam L; Harrington, Heather A; Stumpf, Michael P H; Byrne, Helen M

    2016-01-01

    The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
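Steady-state and sensitivity analysis of an ODE model can be sketched as below, using a deliberately trivial one-state model (not the Wnt pathway) and finite-difference derivatives:

```python
def simulate(k1, k2, x0=0.0, dt=0.01, t_end=20.0):
    """Forward-Euler solution of the toy model dx/dt = k1 - k2*x."""
    x = x0
    for _ in range(int(t_end / dt)):
        x += dt * (k1 - k2 * x)
    return x

# The analytic steady state is x* = k1/k2; check the simulation approaches it.
k1, k2 = 2.0, 0.5
x_final = simulate(k1, k2)

def sensitivity(param_index, h=1e-4):
    """Local sensitivity of the output to one parameter, by central differences."""
    up, down = [k1, k2], [k1, k2]
    up[param_index] += h
    down[param_index] -= h
    return (simulate(*up) - simulate(*down)) / (2 * h)

print(x_final, sensitivity(0), sensitivity(1))
```

At steady state the sensitivities approach the analytic values dx*/dk1 = 1/k2 = 2 and dx*/dk2 = -k1/k2² = -8, illustrating how sensitivity analysis ranks which parameters most influence a model output before any data fitting is attempted.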

  6. E-Learning System Using Segmentation-Based MR Technique for Learning Circuit Construction

    ERIC Educational Resources Information Center

    Takemura, Atsushi

    2016-01-01

    This paper proposes a novel e-Learning system using the mixed reality (MR) technique for technical experiments involving the construction of electronic circuits. The proposed system comprises experimenters' mobile computers and a remote analysis system. When constructing circuits, each learner uses a mobile computer to transmit image data from the…

  7. Formal methods for modeling and analysis of hybrid systems

    NASA Technical Reports Server (NTRS)

    Tiwari, Ashish (Inventor); Lincoln, Patrick D. (Inventor)

    2009-01-01

    A technique based on the use of a quantifier elimination decision procedure for real closed fields and simple theorem proving to construct a series of successively finer qualitative abstractions of hybrid automata is taught. The resulting abstractions are always discrete transition systems which can then be used by any traditional analysis tool. The constructed abstractions are conservative and can be used to establish safety properties of the original system. The technique works on linear and non-linear polynomial hybrid systems: the guards on discrete transitions and the continuous flows in all modes can be specified using arbitrary polynomial expressions over the continuous variables. An exemplar tool in the SAL environment built over the theorem prover PVS is detailed. The technique scales well to large and complex hybrid systems.

  8. Evaluation of radioisotope tracer and activation analysis techniques for contamination monitoring in space environment simulation chambers

    NASA Technical Reports Server (NTRS)

    Smathers, J. B.; Kuykendall, W. E., Jr.; Wright, R. E., Jr.; Marshall, J. R.

    1973-01-01

    Radioisotope measurement techniques and neutron activation analysis are evaluated for use in identifying and locating contamination sources in space environment simulation chambers. The alpha range method allows the determination of total contaminant concentration in vapor state and condensate state. A Cf-252 neutron activation analysis system for detecting oils and greases tagged with stable elements is described. While neutron activation analysis of tagged contaminants offers specificity, an on-site system is extremely costly to implement and provides only marginal detection sensitivity under even the most favorable conditions.

  9. Validating an Air Traffic Management Concept of Operation Using Statistical Modeling

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2013-01-01

    Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They can also be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical measure of the worth of each prediction: a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and, ultimately, reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and therefore is valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques to an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and their parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as radar track data, into the analysis.
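The core of such a robustness estimate is sampling with a quantified uncertainty. A toy sketch, with an invented separation criterion and input distribution rather than T-TSAFE itself:

```python
import math
import random

rng = random.Random(0)

def conflict(miss_distance_nm):
    """Hypothetical safety criterion: loss of separation below 3 nm."""
    return miss_distance_nm < 3.0

# Monte Carlo over an uncertain input parameter (assumed distribution),
# with a binomial standard error attached to the robustness estimate.
n = 20_000
hits = sum(conflict(abs(rng.gauss(5.0, 1.5))) for _ in range(n))
p_hat = hits / n
std_err = math.sqrt(p_hat * (1 - p_hat) / n)

print(f"P(conflict) ~ {p_hat:.4f} +/- {std_err:.4f}")
```

The attached standard error is the "mathematical measure of the worth of each prediction" in miniature; design-of-computer-experiments methods like those in the paper aim to get comparable confidence from far fewer simulation runs.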

  10. Systems Analysis in Small Educational Systems: A Case Study.

    ERIC Educational Resources Information Center

    Vazquez-Abad, Jesus; And Others

    1982-01-01

    The use of systems analysis in transforming a graduate program in educational technology from a lecture-based system to a self-instructional one is described. Several operational research techniques are illustrated. A bibliography of 10 items is included. (CHC)

  11. Interactive Image Analysis System Design,

    DTIC Science & Technology

    1982-12-01

    This report describes a design for an interactive image analysis system (IIAS), which implements terrain data extraction techniques. The design employs commercially available, state-of-the-art minicomputers and image display devices with proven software to achieve a cost-effective, reliable image analysis system. Additionally, the system is fully capable of supporting many generic types of image analysis and data processing, and is modularly…

  12. Application of mobile digital communications in law enforcement, an introductory planning guide

    NASA Technical Reports Server (NTRS)

    Sohn, R. L.; Abraham, J. E.; Leflang, W. G.; Kennedy, R. D.; Wilson, J. H.; Gurfield, R. M.

    1975-01-01

    A set of planning guidelines for the application of digital communications techniques to law enforcement use is presented. Some essential characteristics of digital techniques and their applications are outlined, as are some principles of system analysis, evaluation, and planning. Requirements analysis, system concept design, implementation planning, and performance and cost modeling are described and demonstrated with respect to this application problem. Information on law enforcement digital communications systems and equipment and a list of vendor sources are given in appendices.

  13. Analysis of Proportional Integral and Optimized Proportional Integral Controllers for Resistance Spot Welding System (RSWS) - A Performance Perspective

    NASA Astrophysics Data System (ADS)

    Rama Subbanna, S.; Suryakalavathi, M., Dr.

    2017-08-01

    This paper attempts a performance analysis of different control techniques for spike reduction applied to a medium-frequency transformer-based DC (MFDC) spot welding system. Spike reduction is an important factor to be considered where spot welding systems are concerned. During normal RSWS operation, the welding transformer's magnetic core can become saturated due to the unbalanced resistances of the two transformer secondary windings and the differing characteristics of the output rectifier diodes, which causes current spikes and over-current protection switch-off of the entire system. The current control technique is a piecewise-linear control technique, inspired by DC-DC converter control algorithms, that registers a novel spike reduction method for MFDC spot welding applications. The two controllers used for the spike reduction portion of the application are the traditional PI controller and an optimized PI controller. Care is taken that the current control technique maintains reduced spikes in the transformer primary current while reducing the total harmonic distortion (THD). The performance parameters involved in the spike reduction technique are the THD and the percentage of current spike reduction for both techniques. Matlab/Simulink-based simulation is carried out for the MFDC RSWS with KW, and results are tabulated for the PI and optimized PI controllers; a tradeoff analysis is then carried out.

  14. A VLF-based technique in applications to digital control of nonlinear hybrid multirate systems

    NASA Astrophysics Data System (ADS)

    Vassilyev, Stanislav; Ulyanov, Sergey; Maksimkin, Nikolay

    2017-01-01

    In this paper, a technique for rigorous analysis and design of nonlinear multirate digital control systems on the basis of the reduction method and sublinear vector Lyapunov functions is proposed. The control system model under consideration incorporates continuous-time dynamics of the plant and discrete-time dynamics of the controller and takes into account uncertainties of the plant, bounded disturbances, nonlinear characteristics of sensors and actuators. We consider a class of multirate systems where the control update rate is slower than the measurement sampling rates and periodic non-uniform sampling is admitted. The proposed technique does not use the preliminary discretization of the system, and, hence, allows one to eliminate the errors associated with the discretization and improve the accuracy of analysis. The technique is applied to synthesis of digital controller for a flexible spacecraft in the fine stabilization mode and decentralized controller for a formation of autonomous underwater vehicles. Simulation results are provided to validate the good performance of the designed controllers.

  15. Space Construction System Analysis. Part 2: Executive summary

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A detailed, end-to-end analysis of the activities, techniques, equipment and Shuttle provisions required to construct a reference project system is described. Included are: platform definition; construction analysis; cost and programmatics; and space construction experiments concepts.

  16. Systems Theoretic Process Analysis Applied to an Offshore Supply Vessel Dynamic Positioning System

    DTIC Science & Technology

    2016-06-01

    additional safety issues that were either not identified or inadequately mitigated through the use of Fault Tree Analysis and Failure Modes and…Techniques…Fault Tree Analysis…Fault Tree Analysis Comparison…

  17. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.
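Hand execution of a state-transition model can be mimicked mechanically: each event in a trace is looked up against the allowed transitions and the visited states are recorded. A minimal sketch with an invented four-state model, not the actual control system analyzed in the report:

```python
# Minimal state-transition model of the kind used to hand-execute a
# behavioral model; states, events, and transitions are illustrative.
TRANSITIONS = {
    ("idle", "start"): "running",
    ("running", "fault"): "safed",
    ("running", "stop"): "idle",
    ("safed", "reset"): "idle",
}

def execute(events, state="idle"):
    """Walk the state model over an event trace, recording each state visited."""
    trace = [state]
    for event in events:
        if (state, event) not in TRANSITIONS:
            raise ValueError(f"no transition for {event!r} in state {state!r}")
        state = TRANSITIONS[(state, event)]
        trace.append(state)
    return trace

print(execute(["start", "fault", "reset", "start", "stop"]))
```

A test trace that raises `ValueError` corresponds to a scenario the behavioral model disallows, which is exactly the kind of discrepancy such model execution surfaces during system testing.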

  18. The feasibility of implementing the data analysis and reporting techniques (DART) package in Virginia.

    DOT National Transportation Integrated Search

    1980-01-01

    This project was undertaken for the Virginia Department of Transportation Safety to assess the feasibility of implementing the Data Analysis and Reporting Techniques (DART) computer software system in Virginia. Following a review of available literat...

  19. Model authoring system for fail safe analysis

    NASA Technical Reports Server (NTRS)

    Sikora, Scott E.

    1990-01-01

    The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which then are searched to generate either a graphical fault tree analysis or failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.

  20. Visualization of system dynamics using phasegrams

    PubMed Central

    Herbst, Christian T.; Herzel, Hanspeter; Švec, Jan G.; Wyman, Megan T.; Fitch, W. Tecumseh

    2013-01-01

    A new tool for visualization and analysis of system dynamics is introduced: the phasegram. Its application is illustrated with both classical nonlinear systems (logistic map and Lorenz system) and with biological voice signals. Phasegrams combine the advantages of sliding-window analysis (such as the spectrogram) with well-established visualization techniques from the domain of nonlinear dynamics. In a phasegram, time is mapped onto the x-axis, and various vibratory regimes, such as periodic oscillation, subharmonics or chaos, are identified within the generated graph by the number and stability of horizontal lines. A phasegram can be interpreted as a bifurcation diagram in time. In contrast to other analysis techniques, it can be automatically constructed from time-series data alone: no additional system parameter needs to be known. Phasegrams show great potential for signal classification and can act as the quantitative basis for further analysis of oscillating systems in many scientific fields, such as physics (particularly acoustics), biology or medicine. PMID:23697715
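The counting of "horizontal lines" per window can be sketched for the logistic map as follows; the tolerance and iteration counts are arbitrary choices for the example:

```python
def logistic_orbit(r, n=600, x0=0.5, discard=300):
    """Iterate the logistic map x -> r*x*(1-x), discarding the transient."""
    x = x0
    for _ in range(discard):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(n):
        x = r * x * (1 - x)
        orbit.append(x)
    return orbit

def n_lines(orbit, tol=1e-3):
    """Number of distinct values the orbit visits: the 'horizontal lines'
    one window of a phasegram would show; a large count indicates chaos."""
    distinct = []
    for x in orbit:
        if all(abs(x - d) > tol for d in distinct):
            distinct.append(x)
    return len(distinct)

# Sweep the control parameter, as a phasegram sweeps its time axis:
# fixed point, period 2, period 4, then chaos.
for r in (2.8, 3.2, 3.5, 3.9):
    print(r, n_lines(logistic_orbit(r)))
```

Plotting such counts (or the distinct values themselves) for successive sliding windows of a signal, rather than for a swept parameter, yields the bifurcation-diagram-in-time that the abstract describes.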

  1. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
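Importance sampling, as introduced above, replaces the nominal sampling distribution with one concentrated on the rare event and reweights each sample by the likelihood ratio. A minimal sketch estimating a Gaussian tail probability (a textbook example, not one of the tools cited in the paper):

```python
import math
import random

rng = random.Random(0)

def tail_prob_plain(n, a=4.0):
    """Crude Monte Carlo estimate of P(X > a) for X ~ N(0, 1)."""
    return sum(rng.gauss(0, 1) > a for _ in range(n)) / n

def tail_prob_is(n, a=4.0):
    """Importance sampling: draw from N(a, 1) and reweight each hit by the
    likelihood ratio f(y)/g(y), concentrating samples in the rare region."""
    total = 0.0
    for _ in range(n):
        y = rng.gauss(a, 1)          # proposal sample near the rare region
        if y > a:                     # normalizing constants of f and g cancel
            total += math.exp(-y * y / 2) / math.exp(-((y - a) ** 2) / 2)
    return total / n

exact = 0.5 * math.erfc(4.0 / math.sqrt(2))   # true tail probability
print(tail_prob_plain(10_000), tail_prob_is(10_000), exact)
```

With 10,000 samples the plain estimator typically sees no hits at all, while the importance-sampled estimator lands within a few percent of the exact value, which is why the technique matters for simulating highly dependable (rarely failing) systems.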

  2. Simple methods of exploiting the underlying structure of rule-based systems

    NASA Technical Reports Server (NTRS)

    Hendler, James

    1986-01-01

    Much recent work in the field of expert systems research has aimed at exploiting the underlying structures of the rule base for reasons of analysis. Such techniques as Petri nets and DAGs have been proposed as representational structures that will allow complete analysis. Much has been made of proving isomorphisms between the rule bases and the mechanisms, and in examining the theoretical power of this analysis. In this paper we describe some early work in a new system which has much simpler (and thus, one hopes, more easily achieved) aims and less formality. The technique being examined is a very simple one: OPS5 programs are analyzed in a purely syntactic way and a finite-state automaton (FSA) description is generated. In this paper we describe the technique and some user interface tools which exploit this structure.

  3. Photovoltaic power system reliability considerations

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.

    1980-01-01

    An example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems is presented. This particular application is for a solar cell power system demonstration project designed to provide electric power requirements for remote villages. The techniques utilized involve a definition of the power system natural and operating environment, use of design criteria and analysis techniques, an awareness of potential problems via the inherent reliability and FMEA methods, and use of fail-safe and planned spare parts engineering philosophy.

  4. Ozone measurement systems improvements studies

    NASA Technical Reports Server (NTRS)

    Thomas, R. W.; Guard, K.; Holland, A. C.; Spurling, J. F.

    1974-01-01

    Results are summarized of an initial study of techniques for measuring atmospheric ozone, carried out as the first phase of a program to improve ozone measurement techniques. The study concentrated on two measurement systems, the electrochemical cell (ECC) ozonesonde and the Dobson ozone spectrophotometer, and consisted of two tasks. The first task consisted of error modeling and system error analysis of the two measurement systems. Under the second task a Monte Carlo model of the Dobson ozone measurement technique was developed and programmed for computer operation.

  5. Photovoltaic power system reliability considerations

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.

    1980-01-01

    This paper describes an example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems. This particular application was for a solar cell power system demonstration project in Tangaye, Upper Volta, Africa. The techniques involve a definition of the power system natural and operating environment, use of design criteria and analysis techniques, an awareness of potential problems via the inherent reliability and FMEA methods, and use of a fail-safe and planned spare parts engineering philosophy.

  6. Paper simulation techniques in user requirements analysis for interactive computer systems

    NASA Technical Reports Server (NTRS)

    Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.

    1979-01-01

    This paper describes the use of a technique called 'paper simulation' in the analysis of user requirements for interactive computer systems. In a paper simulation, the user solves problems with the aid of a 'computer,' as in normal man-in-the-loop simulation. In this procedure, though, the computer does not exist but is simulated by the experimenters. This allows simulated problem solving early in the design effort, and allows the properties and degree of structure of the system and its dialogue to be varied. The technique, and a method of analyzing the results, are illustrated with examples from a recent paper simulation exercise involving a Space Shuttle flight design task.

  7. Investigation of advanced phase-shifting projected fringe profilometry techniques

    NASA Astrophysics Data System (ADS)

    Liu, Hongyu

    1999-11-01

    The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurement of rough engineering surfaces. Compared with other competing techniques, this technique is notable for its full-field measurement capacity, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle, with some new approaches, three important problems that severely limit the capability and accuracy of the PSPFP technique. Chapter 1 briefly introduces background information on the PSPFP technique, including its measurement principles, basic features, and related techniques; the objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment of absolute PSPFP measurement. The mathematical formulations and basic requirements of the absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the previous theoretical analysis. Various fundamental experiments performed for concept verification and accuracy evaluation are introduced, together with some brief comments. Chapter 4 presents a theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and the system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, some suggestions on system design are given to improve measurement accuracy. Chapter 5 discusses a new technique for combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process. The techniques coping with two major effects of surface reflectivity variations are then introduced. Some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and provides some suggestions for future research.

  8. Mathematical analysis techniques for modeling the space network activities

    NASA Technical Reports Server (NTRS)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, in particular the use of linear programming. These techniques were then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to better understand the space network. Finally, a small-scale version of the system was modeled, variables were identified, data were gathered, and comparisons were made between actual and theoretical data.
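
    As a sketch of the linear programming approach mentioned above, the toy model below allocates relay-antenna contact time between two hypothetical users by enumerating the vertices of a two-variable feasible region; all constraints, coefficients, and names are invented for illustration and are not taken from the TDRSS model:

```python
from itertools import combinations

# Hypothetical toy schedule: split relay-antenna contact time (hours)
# between two user spacecraft.
#   maximize  x1 + 1.5*x2
#   s.t.      x1 + x2 <= 10   (total antenna time)
#             x1      <= 7    (user 1 visibility)
#             x2      <= 6    (user 2 visibility)
#             x1, x2  >= 0
A = [(1, 1), (1, 0), (0, 1), (-1, 0), (0, -1)]
b = [10, 7, 6, 0, 0]
c = (1.0, 1.5)

def intersect(a1, a2, b1, b2):
    """Solve the 2x2 system a1.x = b1, a2.x = b2 (None if parallel)."""
    det = a1[0] * a2[1] - a1[1] * a2[0]
    if abs(det) < 1e-12:
        return None
    return ((b1 * a2[1] - a1[1] * b2) / det,
            (a1[0] * b2 - b1 * a2[0]) / det)

best = None
for i, j in combinations(range(len(A)), 2):
    pt = intersect(A[i], A[j], b[i], b[j])
    if pt is None:
        continue
    # an optimum of a bounded LP lies at a feasible vertex
    if all(A[k][0] * pt[0] + A[k][1] * pt[1] <= b[k] + 1e-9 for k in range(len(A))):
        val = c[0] * pt[0] + c[1] * pt[1]
        if best is None or val > best[0]:
            best = (val, pt)

print(best)
```

    Vertex enumeration is only practical at this toy scale; a real scheduling model would hand the same constraint matrix to a simplex or interior-point solver.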

  9. Systemic family psychotherapy in China: a qualitative analysis of therapy process.

    PubMed

    Liu, Liang; Miller, John K; Zhao, Xudong; Ma, Xiquan; Wang, Jikun; Li, Wentian

    2013-12-01

    Although the process of systemic family therapy has been extensively researched in Western cultures, few studies on this subject have been conducted in China. The aim of this study was to specify the therapist-delivered interventions in Chinese systemic family therapy and to explore how the Western-imported systemic therapy model is delivered in Chinese culture. A qualitative and exploratory research approach was taken in which thematic analysis was used to analyse transcribed psychotherapy videotapes with Chinese families. Twenty-six hours of video-recorded systemic family therapy sessions from 14 Chinese family cases were sampled. Thematic analysis was used to analyse the transcriptions of therapy sessions and identify therapist-generated interventions in therapy. Frequencies of the different interventions were counted, and correspondence analysis (CA) was used to reveal the corresponding relationships between them. Analysis led to two main themes related to therapist-delivered interventions: the therapist's intention and the therapy technique. Results revealed 15 types of therapist intentions and 16 categories of therapy techniques. CA indicated that therapists' intentions changed across different therapy stages and that specific techniques were used to achieve corresponding intentions. Interventions delivered in Chinese systemic therapy mainly adhere to the Milan and post-Milan systemic models. Owing to the shortage of systemic therapy services in China and a Chinese culture that advocates reverence for authority, components of psycho-education, guidance, and metaphor are adopted in Chinese systemic practice. Some directions for future research are suggested. © 2012 The British Psychological Society.

  10. SIR/CAR Analysis Technique for Voluntary School Sport or Amateur Athletic Organizations. A SIR/CAR Application.

    ERIC Educational Resources Information Center

    Moriarty, Dick; Zarebski, John

    This paper delineates the exact methodology developed by the Sports Institute for Research/Change Agent Research (SIR/CAR) for applying a systems analysis technique to a voluntary mutual benefit organization, such as a school or amateur athletic group. The functions of the technique are to compare avowed and actual behavior, to utilize group…

  11. Computer-assisted techniques to evaluate fringe patterns

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Bhat, Gopalakrishna K.

    1992-01-01

    Strain measurement using interferometry requires an efficient way to extract the desired information from interferometric fringes. Availability of digital image processing systems makes it possible to use digital techniques for the analysis of fringes. In the past, there have been several developments in the area of one dimensional and two dimensional fringe analysis techniques, including the carrier fringe method (spatial heterodyning) and the phase stepping (quasi-heterodyning) technique. This paper presents some new developments in the area of two dimensional fringe analysis, including a phase stepping technique supplemented by the carrier fringe method and a two dimensional Fourier transform method to obtain the strain directly from the discontinuous phase contour map.
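
    A minimal sketch of the phase-stepping idea referred to above: with four frames shifted by 90 degrees each, the wrapped phase follows from a single arctangent. The intensity model and numerical values are illustrative only:

```python
import math

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four frames shifted by 90 degrees each:
    I_k = a + b*cos(phi + k*pi/2), k = 0..3."""
    return math.atan2(i4 - i2, i1 - i3)

# Synthetic fringe intensities with a known phase (illustrative values)
a, b, phi = 5.0, 2.0, 0.7
frames = [a + b * math.cos(phi + k * math.pi / 2) for k in range(4)]
phi_rec = four_step_phase(*frames)
print(round(phi_rec, 6))  # recovers 0.7
```

    In a full system this arctangent is evaluated per pixel, and the wrapped result still has to be unwrapped, which is where carrier-fringe and Fourier-transform methods such as those in the paper come in.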

  12. Fast Computation and Assessment Methods in Power System Analysis

    NASA Astrophysics Data System (ADS)

    Nagata, Masaki

    Power system analysis is essential for efficient and reliable power system operation and control. Recently, online security assessment systems have become important, as more efficient use of power networks is eagerly required. In this article, fast power system analysis techniques such as contingency screening, parallel processing, and intelligent systems applications are briefly surveyed from the viewpoint of their application to online dynamic security assessment.

  13. Techniques for forced response involving discrete nonlinearities. I - Theory. II - Applications

    NASA Astrophysics Data System (ADS)

    Avitabile, Peter; Callahan, John O.

    Several new techniques developed for the forced response analysis of systems containing discrete nonlinear connection elements are presented and compared to the traditional methods. In particular, the techniques examined are the Equivalent Reduced Model Technique (ERMT), Modal Modification Response Technique (MMRT), and Component Element Method (CEM). The general theory of the techniques is presented, and applications are discussed with particular reference to the beam nonlinear system model using ERMT, MMRT, and CEM; frame nonlinear response using the three techniques; and comparison of the results obtained by using the ERMT, MMRT, and CEM models.

  14. An analysis technique for testing log grades

    Treesearch

    Carl A. Newport; William G. O'Regan

    1963-01-01

    An analytical technique that may be used in evaluating log-grading systems is described. It also provides a means of comparing two or more grading systems, or a proposed change with the system from which it was developed. The total volume and computed value of lumber from each sample log are the basic data used.

  15. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence; a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.

  16. Effects of implant system, impression technique, and impression material on accuracy of the working cast.

    PubMed

    Wegner, Kerstin; Weskott, Katharina; Zenginel, Martha; Rehmann, Peter; Wöstmann, Bernd

    2013-01-01

    This in vitro study aimed to identify the effects of the implant system, impression technique, and impression material on the transfer accuracy of implant impressions. The null hypothesis tested was that, in vitro and within the parameters of the experiment, the spatial relationship of a working cast to the placement of implants is not related to (1) the implant system, (2) the impression technique, or (3) the impression material. A steel maxilla was used as a reference model. Six implants of two different implant systems (Standard Plus, Straumann; Semados, Bego) were fixed in the reference model. The target variables were: three-dimensional (3D) shift in all directions, implant axis direction, and rotation. The target variables were assessed using a 3D coordinate measuring machine, and the respective deviations of the plaster models from the nominal values of the reference model were calculated. Two different impression techniques (reposition/pickup) and four impression materials (Aquasil Ultra, Flexitime, Impregum Penta, P2 Magnum 360) were investigated. In all, 80 implant impressions for each implant system were taken. Statistical analysis was performed using multivariate analysis of variance. The implant system significantly influenced the transfer accuracy for most spatial dimensions, including the overall 3D shift and implant axis direction. There was no significant difference between the two implant systems with regard to rotation. Multivariate analysis of variance showed a significant effect on transfer accuracy only for the implant system. Within the limits of the present study, it can be concluded that the transfer accuracy of the intraoral implant position on the working cast is far more dependent on the implant system than on the selection of a specific impression technique or material.

  17. Adaptive neuro-fuzzy and expert systems for power quality analysis and prediction of abnormal operation

    NASA Astrophysics Data System (ADS)

    Ibrahim, Wael Refaat Anis

    The present research involves the development of several fuzzy expert systems for power quality analysis and diagnosis. Intelligent systems for the prediction of abnormal system operation were also developed. The performance of all intelligent modules developed was either enhanced or completely produced through adaptive fuzzy learning techniques. Neuro-fuzzy learning is the main adaptive technique utilized. The work presents a novel approach to the interpretation of power quality from the perspective of the continuous operation of a single system. The research includes an extensive literature review pertaining to the applications of intelligent systems to power quality analysis. Basic definitions and signature events related to power quality are introduced. In addition, detailed discussions of various artificial intelligence paradigms as well as wavelet theory are included. A fuzzy-based intelligent system capable of distinguishing normal from abnormal operation for a given system was developed. Adaptive neuro-fuzzy learning was applied to enhance its performance. A group of fuzzy expert systems that could perform full operational diagnosis were also developed successfully. The developed systems were applied to the operational diagnosis of 3-phase induction motors and rectifier bridges. A novel approach for learning power quality waveforms and trends was developed. The technique, which is adaptive neuro-fuzzy based, learned, compressed, and stored the waveform data. The new technique was successfully tested using a wide variety of power quality signature waveforms and real site data. The trend-learning technique was incorporated into a fuzzy expert system designed to predict abnormal operation of a monitored system. The intelligent system learns and stores, in compressed format, trends leading to abnormal operation, and then continuously compares incoming data to the retained trends. If the incoming data match any of the learned trends, an alarm is triggered, predicting the onset of abnormal system operation. The incoming data can be compared to previous trends as well as matched to trends developed through computer simulations and stored using fuzzy learning.

  18. Finite element modeling of truss structures with frequency-dependent material damping

    NASA Technical Reports Server (NTRS)

    Lesieutre, George A.

    1991-01-01

    A physically motivated modeling technique for structural dynamic analysis that accommodates frequency-dependent material damping was developed. Key features of the technique are the introduction of augmenting thermodynamic fields (ATF) to interact with the usual mechanical displacement field, and the treatment of the resulting coupled governing equations using finite element analysis methods. The ATF method is fully compatible with current structural finite element analysis techniques. The method is demonstrated in the dynamic analysis of a 10-bay planar truss structure, a structure representative of those contemplated for use in future space systems.

  19. Computational techniques for design optimization of thermal protection systems for the space shuttle vehicle. Volume 1: Final report

    NASA Technical Reports Server (NTRS)

    1971-01-01

    Computational techniques were developed and assimilated for the design optimization. The resulting computer program was then used to perform initial optimization and sensitivity studies on a typical thermal protection system (TPS) to demonstrate its application to the space shuttle TPS design. The program was developed in Fortran IV for the CDC 6400 but was subsequently converted to the Fortran V language to be used on the Univac 1108. The program allows for improvement and update of the performance prediction techniques. The program logic involves subroutines which handle the following basic functions: (1) a driver which calls for input, output, and communication between program and user and between the subroutines themselves; (2) thermodynamic analysis; (3) thermal stress analysis; (4) acoustic fatigue analysis; and (5) weights/cost analysis. In addition, a system total cost is predicted based on system weight and historical cost data of similar systems. Two basic types of input are provided, both of which are based on trajectory data. These are vehicle attitude (altitude, velocity, and angles of attack and sideslip), for external heat and pressure loads calculation, and heating rates and pressure loads as a function of time.

  20. Digital techniques for ULF wave polarization analysis

    NASA Technical Reports Server (NTRS)

    Arthur, C. W.

    1979-01-01

    Digital power spectral and wave polarization analyses are powerful techniques for studying ULF waves in the earth's magnetosphere. Four different techniques that use the spectral matrix to perform such analyses have been presented in the literature. Three of these techniques are similar in that they require transformation of the spectral matrix to the principal-axis system prior to performing the polarization analysis; the differences among the three lie in the manner in which they determine this transformation. A comparative study of these three techniques using both simulated and real data has shown them to be approximately equal in quality of performance. The fourth technique does not require transformation of the spectral matrix. Rather, it uses the measured spectral matrix and state vectors for a desired wave type to design a polarization detector function in the frequency domain. The design of various detector functions and their application to both simulated and real data are presented.
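
    As an illustration of polarization analysis from a spectral matrix, the sketch below computes the standard degree of polarization for a two-component field. This is a generic textbook quantity, not the specific detector functions described in the paper:

```python
import math

def degree_of_polarization(sxx, syy, sxy):
    """DOP = sqrt(1 - 4*det(S)/trace(S)^2) for a 2x2 Hermitian
    spectral matrix S; sxx, syy are real autospectra and sxy the
    complex cross-spectrum."""
    trace = sxx + syy
    det = sxx * syy - abs(sxy) ** 2
    return math.sqrt(max(0.0, 1.0 - 4.0 * det / trace ** 2))

# Fully (circularly) polarized signal: Sxy = i*sqrt(Sxx*Syy)
p_full = degree_of_polarization(1.0, 1.0, 1j)
# Unpolarized noise: equal autospectra, zero cross-spectrum
p_none = degree_of_polarization(1.0, 1.0, 0.0)
print(p_full, p_none)  # 1.0 and 0.0
```

    In practice the spectral matrix is estimated per frequency bin from the field components, and quantities like this are evaluated across the whole spectrum.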

  1. Predicting Gilthead Sea Bream (Sparus aurata) Freshness by a Novel Combined Technique of 3D Imaging and SW-NIR Spectral Analysis.

    PubMed

    Ivorra, Eugenio; Verdu, Samuel; Sánchez, Antonio J; Grau, Raúl; Barat, José M

    2016-10-19

    A technique that combines the spatial resolution of a 3D structured-light (SL) imaging system with the spectral analysis of a hyperspectral short-wave near infrared system was developed for freshness predictions of gilthead sea bream on the first storage days (Days 0-6). This novel approach allows the hyperspectral analysis of very specific fish areas, which provides more information for freshness estimations. The SL system obtains a 3D reconstruction of the fish, and an automatic method locates the gilthead's pupils and irises. Once these regions are positioned, the hyperspectral camera acquires spectral information and a multivariate statistical study is performed. The best region is the pupil, with an R² of 0.92 and an RMSE of 0.651 for predictions. We conclude that the combination of 3D technology with hyperspectral analysis offers plenty of potential and is a very promising technique to non-destructively predict gilthead freshness.
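
    The figures of merit quoted above (R² and RMSE) can be computed for any prediction set with a few lines; the storage-day values and predictions below are hypothetical, not the study's measurements:

```python
import math

def r2_rmse(y_true, y_pred):
    """Coefficient of determination and root-mean-square error."""
    n = len(y_true)
    mean = sum(y_true) / n
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot, math.sqrt(ss_res / n)

# Hypothetical storage-day values vs. model predictions
y_true = [0, 1, 2, 3, 4, 5, 6]
y_pred = [0.3, 0.8, 2.4, 2.9, 4.2, 4.8, 6.1]
r2, rmse = r2_rmse(y_true, y_pred)
print(round(r2, 3), round(rmse, 3))
```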

  2. Predicting Gilthead Sea Bream (Sparus aurata) Freshness by a Novel Combined Technique of 3D Imaging and SW-NIR Spectral Analysis

    PubMed Central

    Ivorra, Eugenio; Verdu, Samuel; Sánchez, Antonio J.; Grau, Raúl; Barat, José M.

    2016-01-01

    A technique that combines the spatial resolution of a 3D structured-light (SL) imaging system with the spectral analysis of a hyperspectral short-wave near infrared system was developed for freshness predictions of gilthead sea bream on the first storage days (Days 0–6). This novel approach allows the hyperspectral analysis of very specific fish areas, which provides more information for freshness estimations. The SL system obtains a 3D reconstruction of the fish, and an automatic method locates the gilthead’s pupils and irises. Once these regions are positioned, the hyperspectral camera acquires spectral information and a multivariate statistical study is performed. The best region is the pupil, with an R² of 0.92 and an RMSE of 0.651 for predictions. We conclude that the combination of 3D technology with hyperspectral analysis offers plenty of potential and is a very promising technique to non-destructively predict gilthead freshness. PMID:27775556

  3. Software Safety Analysis of a Flight Guidance System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified, and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model-based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of model checking, the formal methods analysis technique used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.
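
    The essence of model checking on a discrete model with a reasonable state space can be sketched as an explicit-state reachability search: explore every reachable state and confirm that none violates the safety property. The mode-logic states and transitions below are invented for illustration and are not the FGS model:

```python
from collections import deque

# Hypothetical mode logic: states are (autopilot, go_around) flags.
# Safety property: autopilot and go-around are never engaged together.
transitions = {
    ("off", "off"): [("ap_on", "off"), ("off", "ga_on")],
    ("ap_on", "off"): [("off", "off"), ("off", "ga_on")],  # GA engagement drops AP
    ("off", "ga_on"): [("off", "off")],
}

def check_safety(initial, safe):
    """Breadth-first search over the reachable state space; returns
    (True, None) if the property holds, else (False, counterexample)."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if not safe(state):
            return False, state
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True, None

ok, bad = check_safety(("off", "off"), lambda s: s != ("ap_on", "ga_on"))
print(ok)  # True: the property holds in every reachable state
```

    Production model checkers add symbolic state representations and counterexample traces, but the underlying question, "is a bad state reachable?", is the same.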

  4. A method for the measurement and analysis of ride vibrations of transportation systems

    NASA Technical Reports Server (NTRS)

    Catherines, J. J.; Clevenson, S. A.; Scholl, H. F.

    1972-01-01

    The measurement and recording of ride vibrations which affect passenger comfort in transportation systems and the subsequent data-reduction methods necessary for interpreting the data present exceptional instrumentation requirements and necessitate the use of computers for specialized analysis techniques. A method is presented for both measuring and analyzing ride vibrations of the type encountered in ground and air transportation systems. A portable system for measuring and recording low-frequency, low-amplitude accelerations and specialized data-reduction procedures are described. Sample vibration measurements in the form of statistical parameters representative of typical transportation systems are also presented to demonstrate the utility of the techniques.

  5. Data management system performance modeling

    NASA Technical Reports Server (NTRS)

    Kiser, Larry M.

    1993-01-01

    This paper discusses analytical techniques that have been used to gain a better understanding of the Space Station Freedom's (SSF's) Data Management System (DMS). The DMS is a complex, distributed, real-time computer system that has been redesigned numerous times. The implications of these redesigns have not been fully analyzed. This paper discusses the advantages and disadvantages of static analytical techniques such as Rate Monotonic Analysis (RMA) and also provides a rationale for dynamic modeling. Factors such as system architecture, processor utilization, bus architecture, queuing, etc., are well suited for analysis with a dynamic model. The significance of performance measures for a real-time system is discussed.
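
    Rate Monotonic Analysis in its simplest form applies the Liu and Layland utilization bound; the task set below is hypothetical. Note that the bound is sufficient but not necessary, so exceeding it does not by itself prove a task set unschedulable:

```python
# Rate Monotonic Analysis sketch: the Liu & Layland utilization bound.
# A set of n periodic tasks is schedulable under fixed-priority RMA if
# sum(C_i / T_i) <= n * (2**(1/n) - 1). Task parameters are hypothetical.
def rma_schedulable(tasks):
    """tasks: list of (compute_time, period) pairs in the same time unit."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1.0 / n) - 1)
    return utilization, bound, utilization <= bound

u, bound, ok = rma_schedulable([(1, 4), (1, 5), (2, 10)])
print(round(u, 3), round(bound, 3), ok)
```

    When utilization falls between the bound and 1.0, an exact response-time analysis is needed, which is one reason static RMA is often paired with the dynamic modeling the paper argues for.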

  6. Model-Based Safety Analysis

    NASA Technical Reports Server (NTRS)

    Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.

    2006-01-01

    System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.
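
    Part of the automation described above is quantitative evaluation over safety artifacts such as fault trees. A minimal sketch, assuming independent basic events with hypothetical probabilities (this shows only the gate arithmetic, not the model-based tooling the report describes):

```python
def evaluate(node, probs):
    """Recursively evaluate a fault tree given as nested tuples:
    ("event", name), ("and", [children]), ("or", [children])."""
    kind = node[0]
    if kind == "event":
        return probs[node[1]]
    children = [evaluate(c, probs) for c in node[1]]
    if kind == "and":                 # all children must fail
        p = 1.0
        for c in children:
            p *= c
        return p
    if kind == "or":                  # any child failing suffices
        q = 1.0
        for c in children:
            q *= 1.0 - c
        return 1.0 - q
    raise ValueError(kind)

# Hypothetical top event: valve stuck AND (sensor fail OR controller fail)
tree = ("and", [("event", "valve"),
                ("or", [("event", "sensor"), ("event", "controller")])])
probs = {"valve": 1e-3, "sensor": 1e-2, "controller": 5e-3}
p_top = evaluate(tree, probs)
print(p_top)
```

    Model-based safety analysis aims to derive trees like this automatically from the shared system model rather than having analysts assemble them by hand.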

  7. An image analysis system for near-infrared (NIR) fluorescence lymph imaging

    NASA Astrophysics Data System (ADS)

    Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.

    2011-03-01

    Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion by intradermal injection of a microdose of an NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software is underdeveloped, requiring extensive time and effort to analyze a NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove motion artifacts, an image representation named the flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all these techniques into a system, the analysis workflow significantly reduces the amount of required user interaction and improves the reliability of the measurement.

  8. Advanced Automation for Ion Trap Mass Spectrometry: New Opportunities for Real-Time Autonomous Analysis

    NASA Technical Reports Server (NTRS)

    Palmer, Peter T.; Wong, C. M.; Salmonson, J. D.; Yost, R. A.; Griffin, T. P.; Yates, N. A.; Lawless, James G. (Technical Monitor)

    1994-01-01

    The utility of MS/MS for both target compound analysis and the structure elucidation of unknowns has been described in a number of references. A broader acceptance of this technique has not yet been realized, as it requires large, complex, and costly instrumentation that has not been competitive with more conventional techniques. Recent advancements in ion trap mass spectrometry promise to change this situation. Although the ion trap's small size, sensitivity, and ability to perform multiple stages of mass spectrometry have made it eminently suitable for on-line, real-time monitoring applications, advanced automation techniques are required to make these capabilities more accessible to non-experts. Towards this end we have developed custom software for the design and implementation of MS/MS experiments. This software allows the user to take full advantage of the ion trap's versatility with respect to ionization techniques, scan proxies, and ion accumulation/ejection methods. Additionally, expert system software has been developed for autonomous target compound analysis. This software has been linked to ion trap control software and a commercial data system to bring all of the steps in the analysis cycle under control of the expert system. These software development efforts and their utilization for a number of trace analysis applications are described.

  9. Laser power conversion system analysis, volume 1

    NASA Technical Reports Server (NTRS)

    Jones, W. S.; Morgan, L. L.; Forsyth, J. B.; Skratt, J. P.

    1979-01-01

    The orbit-to-orbit laser energy conversion system analysis established a mission model of satellites with various orbital parameters and average electrical power requirements ranging from 1 to 300 kW. The system analysis evaluated various conversion techniques, power system deployment parameters, power system electrical supplies, and other critical subsystems relative to various combinations of the mission model. The analysis showed that the laser power system would not be competitive with current satellite power systems from weight, cost, and development risk standpoints.

  10. A Cost-Effectiveness/Benefit Analysis Model for Postsecondary Vocational Programs. Technical Report.

    ERIC Educational Resources Information Center

    Kim, Jin Eun

    A cost-effectiveness/benefit analysis is defined as a technique for measuring the outputs of existing and new programs in relation to their specified program objectives, against the costs of those programs. In terms of its specific use, the technique is conceptualized as a systems analysis method, an evaluation method, and a planning tool for…

  11. Teaching Earth Signals Analysis Using the Java-DSP Earth Systems Edition: Modern and Past Climate Change

    ERIC Educational Resources Information Center

    Ramamurthy, Karthikeyan Natesan; Hinnov, Linda A.; Spanias, Andreas S.

    2014-01-01

    Modern data collection in the Earth Sciences has propelled the need for understanding signal processing and time-series analysis techniques. However, there is an educational disconnect in the lack of instruction of time-series analysis techniques in many Earth Science academic departments. Furthermore, there are no platform-independent freeware…

  12. Utilization of a CRT display light pen in the design of feedback control systems

    NASA Technical Reports Server (NTRS)

    Thompson, J. G.; Young, K. R.

    1972-01-01

    A hierarchical structure of interlinked programs was developed to provide a flexible computer-aided design tool. A graphical input technique and a data structure are considered which provide the capability of entering the control system model description into the computer in block diagram form. An information storage and retrieval system was developed to keep track of the system description and of the analysis and simulation results, and to provide them to the correct routines for further manipulation or display. Error analysis and diagnostic capabilities are discussed, and a technique was developed to reduce a transfer function to a set of nested integrals suitable for digital simulation. A general, automated block diagram reduction procedure was set up to prepare the system description for the analysis routines.

  13. Using object-oriented analysis to design a multi-mission ground data system

    NASA Technical Reports Server (NTRS)

    Shames, Peter

    1995-01-01

    This paper describes an analytical approach and descriptive methodology that is adapted from Object-Oriented Analysis (OOA) techniques. The technique is described and then used to communicate key issues of system logical architecture. The essence of the approach is to limit the analysis to only service objects, with the idea of providing a direct mapping from the design to a client-server implementation. Key perspectives on the system, such as user interaction, data flow and management, service interfaces, hardware configuration, and system and data integrity, are covered. A significant advantage of this service-oriented approach is that it permits mapping all of these different perspectives on the system onto a single common substrate. This services substrate is readily represented diagrammatically, thus making details of the overall design much more accessible.

  14. Independent component analysis based digital signal processing in coherent optical fiber communication systems

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Luo, Ming; Qiu, Ying; Alphones, Arokiaswami; Zhong, Wen-De; Yu, Changyuan; Yang, Qi

    2018-02-01

In this paper, channel equalization techniques for coherent optical fiber transmission systems based on independent component analysis (ICA) are reviewed. The principle of ICA for blind source separation is introduced. ICA-based channel equalization after both single-mode and few-mode fiber transmission is investigated for single-carrier and orthogonal frequency division multiplexing (OFDM) modulation formats. Performance comparisons with conventional channel equalization techniques are discussed.
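The blind-source-separation principle behind these equalizers can be illustrated with a toy example (a sketch, not the paper's equalizer): two independent, non-Gaussian, unit-variance sources are mixed by an unknown rotation, and a demixing angle is recovered by grid-searching for maximum total non-Gaussianity. The signals, the rotation-only mixing model, and the kurtosis grid search are all illustrative assumptions.

```python
import math
import random

random.seed(0)
N = 2000

# Two independent, zero-mean, unit-variance, non-Gaussian sources
s1 = [random.choice((-1.0, 1.0)) for _ in range(N)]                       # binary
s2 = [random.uniform(-math.sqrt(3.0), math.sqrt(3.0)) for _ in range(N)]  # uniform

# Unknown mixing: a pure rotation, so the mixtures stay white (unit variance,
# uncorrelated) and only the rotation angle needs to be recovered.
theta0 = 0.6
c0, s0 = math.cos(theta0), math.sin(theta0)
x1 = [c0 * a - s0 * b for a, b in zip(s1, s2)]
x2 = [s0 * a + c0 * b for a, b in zip(s1, s2)]

def kurtosis(v):
    m2 = sum(t * t for t in v) / len(v)
    m4 = sum(t ** 4 for t in v) / len(v)
    return m4 / m2 ** 2 - 3.0   # zero for Gaussian data

def demix_angle(x1, x2, steps=360):
    """Grid-search the rotation that maximizes total |kurtosis| of the outputs."""
    best_score, best_theta = float("-inf"), 0.0
    for k in range(steps):
        th = (math.pi / 2) * k / steps   # solutions repeat every 90 degrees
        c, s = math.cos(th), math.sin(th)
        y1 = [c * a + s * b for a, b in zip(x1, x2)]
        y2 = [-s * a + c * b for a, b in zip(x1, x2)]
        score = abs(kurtosis(y1)) + abs(kurtosis(y2))
        if score > best_score:
            best_score, best_theta = score, th
    return best_theta

theta_hat = demix_angle(x1, x2)
```

With enough samples the recovered angle approaches the true mixing angle (up to source permutation and sign, which is why the search only spans 90 degrees).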

  15. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.

  16. ADP of multispectral scanner data for land use mapping

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1971-01-01

    The advantages and disadvantages of various remote sensing instrumentation and analysis techniques are reviewed. The use of multispectral scanner data and the automatic data processing techniques are considered. A computer-aided analysis system for remote sensor data is described with emphasis on the image display, statistics processor, wavelength band selection, classification processor, and results display. Advanced techniques in using spectral and temporal data are also considered.

  17. Solid State Audio/Speech Processor Analysis.

    DTIC Science & Technology

    1980-03-01

techniques. The techniques were demonstrated to be worthwhile in an efficient real-time AWR system. Finally, microprocessor architectures were designed to...do not include custom chip development, detailed hardware design, construction or testing. ITTDCD is very encouraged by the results obtained in this...California, Berkeley, was responsible for furnishing the simulation data of OD speech analysis techniques and for the design and development of the hardware OD

  18. NASA/ASEE Summer Faculty Fellowship Program, 1990, Volume 1

    NASA Technical Reports Server (NTRS)

    Bannerot, Richard B. (Editor); Goldstein, Stanley H. (Editor)

    1990-01-01

    The 1990 Johnson Space Center (JSC) NASA/American Society for Engineering Education (ASEE) Summer Faculty Fellowship Program was conducted by the University of Houston-University Park and JSC. A compilation of the final reports on the research projects are presented. The topics covered include: the Space Station; the Space Shuttle; exobiology; cell biology; culture techniques; control systems design; laser induced fluorescence; spacecraft reliability analysis; reduced gravity; biotechnology; microgravity applications; regenerative life support systems; imaging techniques; cardiovascular system; physiological effects; extravehicular mobility units; mathematical models; bioreactors; computerized simulation; microgravity simulation; and dynamic structural analysis.

  19. A framework for graph-based synthesis, analysis, and visualization of HPC cluster job data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, Jackson R.; Kegelmeyer, W. Philip, Jr.; Wong, Matthew H.

The monitoring and system analysis of high performance computing (HPC) clusters is of increasing importance to the HPC community. Analysis of HPC job data can be used to characterize system usage and diagnose and examine failure modes and their effects. This analysis is not straightforward, however, due to the complex relationships that exist between jobs. These relationships are based on a number of factors, including shared compute nodes between jobs, proximity of jobs in time, etc. Graph-based techniques represent an approach that is particularly well suited to this problem, and provide an effective technique for discovering important relationships in job queuing and execution data. The efficacy of these techniques is rooted in the use of a semantic graph as a knowledge representation tool. In a semantic graph, job data, represented in a combination of numerical and textual forms, can be flexibly processed into edges, with corresponding weights, expressing relationships between jobs, nodes, users, and other relevant entities. This graph-based representation permits formal manipulation by a number of analysis algorithms. This report presents a methodology and software implementation that leverages semantic graph-based techniques for the system-level monitoring and analysis of HPC clusters based on job queuing and execution data. Ontology development and graph synthesis are discussed with respect to the domain of HPC job data. The framework developed automates the synthesis of graphs from a database of job information. It also provides a front end, enabling visualization of the synthesized graphs. Additionally, an analysis engine is incorporated that provides performance analysis, graph-based clustering, and failure prediction capabilities for HPC systems.
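The edge-synthesis idea can be sketched in a few lines: each pair of jobs gets a weighted edge combining shared compute nodes, a common submitting user, and proximity in time. The job records and the weighting scheme below are illustrative assumptions, not the report's actual ontology.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical job records: (job_id, user, compute nodes used, start hour)
jobs = [
    ("j1", "alice", {"n1", "n2"}, 0),
    ("j2", "bob",   {"n2", "n3"}, 1),
    ("j3", "alice", {"n4"},       2),
]

def build_edges(jobs, time_window=2):
    """Synthesize weighted job-job edges from shared nodes, user, and time."""
    edges = defaultdict(float)
    for (a_id, a_user, a_nodes, a_t), (b_id, b_user, b_nodes, b_t) in combinations(jobs, 2):
        w = float(len(a_nodes & b_nodes))     # shared compute nodes
        if a_user == b_user:                  # submitted by the same user
            w += 1.0
        if abs(a_t - b_t) <= time_window:     # close together in time
            w += 0.5
        if w > 0:
            edges[a_id, b_id] = w
    return dict(edges)

edges = build_edges(jobs)
```

The resulting weighted graph can then be handed to clustering or failure-prediction algorithms, as the report describes.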

  20. Towards generating ECSS-compliant fault tree analysis results via ConcertoFLA

    NASA Astrophysics Data System (ADS)

    Gallina, B.; Haider, Z.; Carlsson, A.

    2018-05-01

Attitude Control Systems (ACSs) maintain the orientation of the satellite in three-dimensional space. ACSs need to be engineered in compliance with ECSS standards and need to ensure a certain degree of dependability. Thus, dependability analysis is conducted at various levels and by using ECSS-compliant techniques. Fault Tree Analysis (FTA) is one of these techniques. FTA is being automated within various Model Driven Engineering (MDE)-based methodologies. The tool-supported CHESS methodology is one of them. This methodology incorporates ConcertoFLA, a dependability analysis technique enabling failure behavior analysis and thus FTA-results generation. ConcertoFLA, however, like other such techniques, still belongs to the academic research niche. To promote this technique within the space industry, we apply it to an ACS and discuss its multi-faceted potential in the context of ECSS-compliant engineering.

  1. Computer-aided system design

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  2. Self-conscious robotic system design process--from analysis to implementation.

    PubMed

    Chella, Antonio; Cossentino, Massimo; Seidita, Valeria

    2011-01-01

Developing robotic systems endowed with self-conscious capabilities means realizing complex sub-systems that require ad-hoc software engineering techniques for their modelling, analysis and implementation. In this chapter the whole process (from analysis to implementation) used to model the development of self-conscious robotic systems is presented, and the newly created design process, PASSIC, which supports each part of it, is fully illustrated.

  3. An example of requirements for Advanced Subsonic Civil Transport (ASCT) flight control system using structured techniques

    NASA Technical Reports Server (NTRS)

    Mclees, Robert E.; Cohen, Gerald C.

    1991-01-01

The requirements are presented for an Advanced Subsonic Civil Transport (ASCT) flight control system, generated using structured techniques. The requirements definition starts by performing a mission analysis to identify the high-level control system requirements and functions necessary to satisfy the mission. The result of the study is an example set of control system requirements partially represented using a derivative of Yourdon's structured techniques. Also provided is a research focus for studying structured design methodologies and in particular design-for-validation philosophies.

  4. A comparative analysis of frequency modulation threshold extension techniques

    NASA Technical Reports Server (NTRS)

    Arndt, G. D.; Loch, F. J.

    1970-01-01

FM threshold extension for system performance improvement is analyzed, comparing impulse noise elimination, correlation detection, and delta modulation signal processing techniques implemented at the demodulator output.

  5. A Structural and Content-Based Analysis for Web Filtering.

    ERIC Educational Resources Information Center

    Lee, P. Y.; Hui, S. C.; Fong, A. C. M.

    2003-01-01

    Presents an analysis of the distinguishing features of pornographic Web pages so that effective filtering techniques can be developed. Surveys the existing techniques for Web content filtering and describes the implementation of a Web content filtering system that uses an artificial neural network. (Author/LRW)

  6. A rapid high-resolution method for resolving DNA topoisomers.

    PubMed

    Mitchenall, Lesley A; Hipkin, Rachel E; Piperakis, Michael M; Burton, Nicolas P; Maxwell, Anthony

    2018-01-16

    Agarose gel electrophoresis has been the mainstay technique for the analysis of DNA samples of moderate size. In addition to separating linear DNA molecules, it can also resolve different topological forms of plasmid DNAs, an application useful for the analysis of the reactions of DNA topoisomerases. However, gel electrophoresis is an intrinsically low-throughput technique and suffers from other potential disadvantages. We describe the application of the QIAxcel Advanced System, a high-throughput capillary electrophoresis system, to separate DNA topoisomers, and compare this technique with gel electrophoresis. We prepared a range of topoisomers of plasmids pBR322 and pUC19, and a 339 bp DNA minicircle, and compared their separation by gel electrophoresis and the QIAxcel System. We found superior resolution with the QIAxcel System, and that quantitative analysis of topoisomer distributions was straightforward. We show that the QIAxcel system has advantages in terms of speed, resolution and cost, and can be applied to DNA circles of various sizes. It can readily be adapted for use in compound screening against topoisomerase targets.

  7. An improved large-field focusing schlieren system

    NASA Technical Reports Server (NTRS)

    Weinstein, Leonard M.

    1991-01-01

    The analysis and performance of a high-brightness large-field focusing schlieren system is described. The system can be used to examine complex two- and three-dimensional flows. Techniques are described to obtain focusing schlieren through distorting optical elements, to use multiple colors in a time multiplexing technique, and to use diffuse screen holography for three-dimensional photographs.

  8. Shock and vibration technology with applications to electrical systems

    NASA Technical Reports Server (NTRS)

    Eshleman, R. L.

    1972-01-01

    A survey is presented of shock and vibration technology for electrical systems developed by the aerospace programs. The shock environment is surveyed along with new techniques for modeling, computer simulation, damping, and response analysis. Design techniques based on the use of analog computers, shock spectra, optimization, and nonlinear isolation are discussed. Shock mounting of rotors for performance and survival, and vibration isolation techniques are reviewed.

  9. Computer assisted analysis of auroral images obtained from high altitude polar satellites

    NASA Technical Reports Server (NTRS)

    Samadani, Ramin; Flynn, Michael

    1993-01-01

    Automatic techniques that allow the extraction of physically significant parameters from auroral images were developed. This allows the processing of a much larger number of images than is currently possible with manual techniques. Our techniques were applied to diverse auroral image datasets. These results were made available to geophysicists at NASA and at universities in the form of a software system that performs the analysis. After some feedback from users, an upgraded system was transferred to NASA and to two universities. The feasibility of user-trained search and retrieval of large amounts of data using our automatically derived parameter indices was demonstrated. Techniques based on classification and regression trees (CART) were developed and applied to broaden the types of images to which the automated search and retrieval may be applied. Our techniques were tested with DE-1 auroral images.

  10. Cognitive task analysis: Techniques applied to airborne weapons training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.

    1989-01-01

This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis plays in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  11. Integrated Application of Active Controls (IAAC) technology to an advanced subsonic transport project: Current and advanced act control system definition study. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Hanks, G. W.; Shomber, H. A.; Dethman, H. A.; Gratzer, L. B.; Maeshiro, A.; Gangsaas, D.; Blight, J. D.; Buchan, S. M.; Crumb, C. B.; Dorwart, R. J.

    1981-01-01

    The current status of the Active Controls Technology (ACT) for the advanced subsonic transport project is investigated through analysis of the systems technical data. Control systems technologies under examination include computerized reliability analysis, pitch axis fly by wire actuator, flaperon actuation system design trade study, control law synthesis and analysis, flutter mode control and gust load alleviation analysis, and implementation of alternative ACT systems. Extensive analysis of the computer techniques involved in each system is included.

  12. Develop advanced nonlinear signal analysis topographical mapping system

    NASA Technical Reports Server (NTRS)

    1994-01-01

The Space Shuttle Main Engine (SSME) has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, and thereby reduce launch turn-around time. The basic objectives of this contract are threefold: (1) develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis. Performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data. These techniques will be incorporated into a fully automated system; (2) develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB). This ATMS system will convert a tremendous amount of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information. A high compression ratio can be achieved to allow a minimal storage requirement, while providing fast signature retrieval, pattern comparison, and identification capabilities; and (3) integrate the nonlinear correlation techniques into the CSTDB data base with compatible TOPO input data format. Such an integrated ATMS system will provide the large test archives necessary for quick signature comparison.
This study will provide timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. The final result of this program will yield an ATMS system of nonlinear and nonstationary spectral analysis software package integrated with the Compressed SSME TOPO Data Base (CSTDB) on the same platform. This system will allow NASA engineers to retrieve any unique defect signatures and trends associated with different failure modes and anomalous phenomena over the entire SSME test history across turbo pump families.

  13. Develop advanced nonlinear signal analysis topographical mapping system

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1993-01-01

The SSME has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, and thereby reduce launch turn-around time. The basic objectives of this contract are threefold: (1) Develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis. Performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data. These techniques will be incorporated into a fully automated system. (2) Develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB). This ATMS system will convert tremendous amounts of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information. A high compression ratio can be achieved to allow a minimal storage requirement, while providing fast signature retrieval, pattern comparison, and identification capabilities. (3) Integrate the nonlinear correlation techniques into the CSTDB data base with compatible TOPO input data format. Such an integrated ATMS system will provide the large test archives necessary for a quick signature comparison.
This study will provide timely assessment of SSME component operational status, identify probable causes of malfunction, and indicate feasible engineering solutions. The final result of this program will yield an ATMS system of nonlinear and nonstationary spectral analysis software package integrated with the Compressed SSME TOPO Data Base (CSTDB) on the same platform. This system will allow NASA engineers to retrieve any unique defect signatures and trends associated with different failure modes and anomalous phenomena over the entire SSME test history across turbo pump families.

  14. Computer program uses Monte Carlo techniques for statistical system performance analysis

    NASA Technical Reports Server (NTRS)

    Wohl, D. P.

    1967-01-01

    Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
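The idea is straightforward to sketch: draw each component's disturbance from its full distribution, propagate it through the system model, and read off unbiased output statistics from the simulated random samples. The voltage-divider "system" and the tolerance values below are illustrative assumptions, not from the original program.

```python
import random
import statistics

random.seed(42)

def system_gain(r1, r2):
    """Toy 'system': the gain of a resistive voltage divider."""
    return r2 / (r1 + r2)

def sample_run():
    # Draw each component from its full tolerance distribution
    # (nominal 1 kOhm, 20 Ohm 1-sigma), rather than a worst-case value.
    r1 = random.gauss(1000.0, 20.0)
    r2 = random.gauss(1000.0, 20.0)
    return system_gain(r1, r2)

runs = [sample_run() for _ in range(20_000)]
mean = statistics.fmean(runs)     # unbiased estimate of mean system performance
spread = statistics.stdev(runs)   # effect of component scatter on the output
```

Because every run uses the full statistics of each part, the output spread captures the combined effect of all component tolerances on system performance.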

  15. Fault Tree Analysis: An Emerging Methodology for Instructional Science.

    ERIC Educational Resources Information Center

    Wood, R. Kent; And Others

    1979-01-01

    Describes Fault Tree Analysis, a tool for systems analysis which attempts to identify possible modes of failure in systems to increase the probability of success. The article defines the technique and presents the steps of FTA construction, focusing on its application to education. (RAO)
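The core computation an FTA automates can be sketched in a few lines, assuming independent basic events; the gate functions and the example tree below are illustrative, not from the article.

```python
def gate_and(*inputs):
    """AND gate: the output event occurs only if every input event occurs."""
    p = 1.0
    for x in inputs:
        p *= x
    return p

def gate_or(*inputs):
    """OR gate: the output event occurs if any input event occurs."""
    p_none = 1.0
    for x in inputs:
        p_none *= 1.0 - x
    return 1.0 - p_none

# Hypothetical tree: top event = (sensor fails OR wiring fails) AND backup fails
p_sensor, p_wiring, p_backup = 0.01, 0.02, 0.05
p_top = gate_and(gate_or(p_sensor, p_wiring), p_backup)
```

Working the tree bottom-up this way quantifies how much each failure mode contributes to the top event, which is what makes FTA useful for prioritizing fixes.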

  16. Post-test navigation data analysis techniques for the shuttle ALT

    NASA Technical Reports Server (NTRS)

    1975-01-01

Data processing techniques for postflight analysis of shuttle approach and landing test (ALT) navigation data are defined. Postflight test processor requirements are described along with operational and design requirements, data input requirements, and software test requirements. The postflight test data processing is described based on the natural test sequence: quick-look analysis, postflight navigation processing, and error isolation processing. Emphasis is placed on the tradeoffs that must remain open and subject to analysis until final definition is achieved in the shuttle data processing system and the overall ALT plan. A development plan for the implementation of the ALT postflight test navigation data processing system is presented, and conclusions are drawn.

  17. The Systems Approach to Functional Job Analysis. Task Analysis of the Physician's Assistant: Volume I--Task Analysis Methodology and Techniques.

    ERIC Educational Resources Information Center

    Wake Forest Univ., Winston Salem, NC. Bowman Gray School of Medicine.

    Utilizing a systematic sampling technique, the professional activities of small groups of pediatricians, family practitioners, surgeons, obstetricians, and internists were observed for 4 or 5 days by a medical student who checked a prearranged activity sheet every 30 seconds to: (1) identify those tasks and activities an assistant could be trained…

  18. Safety Guided Design of Crew Return Vehicle in Concept Design Phase Using STAMP/STPA

    NASA Astrophysics Data System (ADS)

    Nakao, H.; Katahira, M.; Miyamoto, Y.; Leveson, N.

    2012-01-01

In the concept development and design phase of a new space system, such as a crew return vehicle, designers tend to focus on how to implement new technology. Designers also consider the difficulty of using the new technology, trade off several system design candidates, and then choose an optimal design from among them. Safety should be a key aspect driving optimal concept design. However, in past concept design activities, safety analysis such as FTA has not been used to drive the design, because such analysis techniques focus on component failure, and component failure cannot be considered in the concept design phase. The solution to these problems is to apply a new hazard analysis technique called STAMP/STPA. STAMP/STPA defines safety as a control problem rather than a failure problem and identifies hazardous scenarios and their causes. Defining control flow is essential in the concept design phase; therefore, STAMP/STPA can be a useful tool to assess the safety of system candidates and to form part of the rationale for choosing a design as the baseline of the system. In this paper, we explain our case study of safety-guided concept design using STPA, the new hazard analysis technique, and a model-based specification technique on a crew return vehicle design, and we evaluate the benefits of using STAMP/STPA in the concept development phase.

  19. Analysis techniques for multivariate root loci. [a tool in linear control systems

    NASA Technical Reports Server (NTRS)

    Thompson, P. M.; Stein, G.; Laub, A. J.

    1980-01-01

Analysis techniques are developed for the multivariable root locus and the multivariable optimal root locus. The generalized eigenvalue problem is used to compute angles and sensitivities for both types of loci, and an algorithm is presented that determines the asymptotic properties of the optimal root locus.
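For a single-input, single-output special case the locus can be computed directly from the closed-loop characteristic polynomial; the toy plant G(s) = 1/(s(s+2)) below is an illustrative assumption, not one of the paper's multivariable examples (those require the generalized eigenvalue machinery).

```python
import cmath

def closed_loop_poles(k):
    """Closed-loop poles of unity feedback around k*G(s), G(s) = 1/(s*(s+2)).
    The characteristic equation is s**2 + 2*s + k = 0 (quadratic formula)."""
    disc = cmath.sqrt(1 - k)
    return (-1 + disc, -1 - disc)

p_open = closed_loop_poles(0.0)   # k -> 0: locus starts at the open-loop poles
p_gain = closed_loop_poles(5.0)   # larger k: poles break away into a complex pair
```

Sweeping k and plotting the returned poles traces the root locus; the multivariable case generalizes this by tracking eigenvalues of a matrix pencil instead of polynomial roots.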

  20. System analysis in rotorcraft design: The past decade

    NASA Technical Reports Server (NTRS)

    Galloway, Thomas L.

    1988-01-01

Rapid advances in the technology of electronic digital computers and the need for an integrated synthesis approach in developing future rotorcraft programs have led to increased emphasis on system analysis techniques in rotorcraft design. The task in systems analysis is to deal with complex, interdependent, and conflicting requirements in a structured manner so that rational and objective decisions can be made. Whether the results are wisdom or rubbish depends upon the validity and, sometimes more importantly, the consistency of the inputs, the correctness of the analysis, and a sensible choice of measures of effectiveness to draw conclusions. In rotorcraft design this means combining design requirements, technology assessment, and sensitivity analysis; techniques currently in use by NASA and Army organizations in developing research programs and vehicle specifications for rotorcraft are reviewed. These procedures span simple graphical approaches to comprehensive analysis on large mainframe computers. Examples of recent applications to military and civil missions are highlighted.

  1. Vibration Transmission through Rolling Element Bearings in Geared Rotor Systems

    DTIC Science & Technology

    1990-11-01

...and dynamic finite element techniques are used to develop the discrete vibration models while the statistical energy analysis method is used for the broad...bearing system studies, geared rotor system studies, and statistical energy analysis. Each chapter is self-sufficient since it is written in a

  2. Performance analysis of distributed applications using automatic classification of communication inefficiencies

    DOEpatents

    Vetter, Jeffrey S.

    2005-02-01

The method and system described herein present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. The method and system described herein may automatically classify individual communication operations and reveal the cause of communication inefficiencies in the application. This classification allows the developer to quickly focus on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, the method and system described herein trace the message operations of Message Passing Interface (MPI) applications and then classify each individual communication event using a supervised learning technique: decision tree classification. The decision tree may be trained using microbenchmarks that demonstrate both efficient and inefficient communication. Since the method and system described herein adapt to the target system's configuration through these microbenchmarks, they simultaneously automate the performance analysis process and improve classification accuracy. The method and system described herein may improve the accuracy of performance analysis and dramatically reduce the amount of data that users must encounter.
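The classification step can be sketched with a one-level decision tree (a stump) trained on hypothetical microbenchmark timings; the patented system trains full decision trees over MPI event features, so the single feature, the labels, and the threshold search below are illustrative assumptions.

```python
# Hypothetical microbenchmark training data: (receive wait time in us, label)
# label 1 = inefficient ("late sender"), label 0 = efficient communication
train = [(2, 0), (5, 0), (8, 0), (40, 1), (55, 1), (90, 1)]

def train_stump(data):
    """Choose the wait-time threshold that classifies the most examples."""
    best_t, best_correct = None, -1
    xs = sorted(x for x, _ in data)
    for lo, hi in zip(xs, xs[1:]):
        t = (lo + hi) / 2                            # candidate split midpoint
        correct = sum((x > t) == bool(y) for x, y in data)
        if correct > best_correct:
            best_t, best_correct = t, correct
    return best_t

threshold = train_stump(train)

def classify(wait_us):
    return "inefficient" if wait_us > threshold else "efficient"
```

Because the threshold is learned from microbenchmarks run on the target system, the classifier adapts to that system's baseline communication cost, which is the adaptivity the patent describes.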

  3. Artificial Intelligence Techniques: Applications for Courseware Development.

    ERIC Educational Resources Information Center

    Dear, Brian L.

    1986-01-01

    Introduces some general concepts and techniques of artificial intelligence (natural language interfaces, expert systems, knowledge bases and knowledge representation, heuristics, user-interface metaphors, and object-based environments) and investigates ways these techniques might be applied to analysis, design, development, implementation, and…

  4. Discriminant forest classification method and system

    DOEpatents

    Chen, Barry Y.; Hanley, William G.; Lemmond, Tracy D.; Hiller, Lawrence J.; Knapp, David A.; Mugge, Marshall J.

    2012-11-06

    A hybrid machine learning methodology and system for classification that combines classical random forest (RF) methodology with discriminant analysis (DA) techniques to provide enhanced classification capability. A DA technique which uses feature measurements of an object to predict its class membership, such as linear discriminant analysis (LDA) or Andersen-Bahadur linear discriminant technique (AB), is used to split the data at each node in each of its classification trees to train and grow the trees and the forest. When training is finished, a set of n DA-based decision trees of a discriminant forest is produced for use in predicting the classification of new samples of unknown class.

  5. Systems thinking, the Swiss Cheese Model and accident analysis: a comparative systemic analysis of the Grayrigg train derailment using the ATSB, AcciMap and STAMP models.

    PubMed

    Underwood, Peter; Waterson, Patrick

    2014-07-01

    The Swiss Cheese Model (SCM) is the most popular accident causation model and is widely used throughout various industries. A debate exists in the research literature over whether the SCM remains a viable tool for accident analysis. Critics of the model suggest that it provides a sequential, oversimplified view of accidents. Conversely, proponents suggest that it embodies the concepts of systems theory, as per the contemporary systemic analysis techniques. The aim of this paper was to consider whether the SCM can provide a systems thinking approach and remain a viable option for accident analysis. To achieve this, the train derailment at Grayrigg was analysed with an SCM-based model (the ATSB accident investigation model) and two systemic accident analysis methods (AcciMap and STAMP). The analysis outputs and usage of the techniques were compared. The findings of the study showed that each model applied the systems thinking approach. However, the ATSB model and AcciMap graphically presented their findings in a more succinct manner, whereas STAMP more clearly embodied the concepts of systems theory. The study suggests that, whilst the selection of an analysis method is subject to trade-offs that practitioners and researchers must make, the SCM remains a viable model for accident analysis. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Analysis of randomly time-varying systems by the Gaussian closure technique

    NASA Astrophysics Data System (ADS)

    Dash, P. K.; Iyengar, R. N.

    1982-07-01

    The Gaussian probability closure technique is applied to study the random response of multidegree of freedom stochastically time varying systems under non-Gaussian excitations. Under the assumption that the response, the coefficient and the excitation processes are jointly Gaussian, deterministic equations are derived for the first two response moments. It is further shown that this technique leads to the best Gaussian estimate in a minimum mean square error sense. An example problem is solved which demonstrates the capability of this technique for handling non-linearity, stochastic system parameters and amplitude limited responses in a unified manner. Numerical results obtained through the Gaussian closure technique compare well with the exact solutions.
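
    As a one-dimensional illustration of the closure idea (a sketch only; the paper treats multi-degree-of-freedom systems with stochastic coefficients), consider a scalar system with a cubic nonlinearity driven by zero-mean white noise w(t) of intensity D. Assuming the zero-mean response is Gaussian lets the fourth moment be written in terms of the variance, closing the moment equation:

```latex
\dot{x} = -\alpha x - \varepsilon x^{3} + w(t),
\qquad \mathrm{E}[w(t)\,w(t+\tau)] = D\,\delta(\tau).
% Gaussian closure for the zero-mean case: E[x^4] = 3 \sigma^4, with \sigma^2 = E[x^2],
% so the second-moment equation closes as
\frac{d\sigma^{2}}{dt} = -2\alpha\sigma^{2} - 6\varepsilon\sigma^{4} + D.
% The stationary variance is the positive root of
% 6\varepsilon\sigma^{4} + 2\alpha\sigma^{2} - D = 0.
```

    The multi-degree-of-freedom case of the paper follows the same pattern, with vector moment equations in place of the scalar one.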

  7. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1993-01-01

    In the study of the dynamics and kinematics of the human body a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT compatible computer, and the proprietary software.

  8. Develop Advanced Nonlinear Signal Analysis Topographical Mapping System

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1997-01-01

    During the development of the SSME, a hierarchy of advanced signal analysis techniques for mechanical signature analysis was developed by NASA and AI Signal Research Inc. (ASRI) to improve the safety and reliability of Space Shuttle operations. These techniques can process and identify intelligent information hidden in a measured signal that is often unidentifiable using conventional signal analysis methods. Due to the highly interactive processing requirements and the volume of dynamic data involved, detailed diagnostic analysis had been performed manually, requiring immense man-hours and extensive human interaction. To overcome this manual process, NASA implemented this program to develop an Advanced nonlinear signal analysis Topographical Mapping System (ATMS) to provide automatic, unsupervised engine diagnostic capabilities. The ATMS utilizes a rule-based CLIPS expert system to supervise a hierarchy of diagnostic signature analysis techniques in the Advanced Signal Analysis Library (ASAL). ASAL performs automatic signal processing, archiving, and anomaly detection/identification tasks in order to provide an intelligent and fully automated engine diagnostic capability. The ATMS was successfully developed under this contract. In summary, the program objectives to design, develop, test, and conduct performance evaluation of an automated engine diagnostic system were successfully achieved. Software implementation of the entire ATMS on MSFC's OISPS computer has been completed. The significance of the ATMS developed under this program lies in its fully automated coherence analysis capability for anomaly detection and identification, which can greatly enhance the power and reliability of engine diagnostic evaluation. The results have demonstrated that the ATMS can significantly reduce the time and man-hours required for engine test/flight data analysis and for performance evaluation of large volumes of dynamic test data.

  9. Multiple Vehicle Detection and Segmentation in Malaysia Traffic Flow

    NASA Astrophysics Data System (ADS)

    Fariz Hasan, Ahmad; Fikri Che Husin, Mohd; Affendi Rosli, Khairul; Norhafiz Hashim, Mohd; Faiz Zainal Abidin, Amar

    2018-03-01

    Vision-based systems are widely used in the field of Intelligent Transportation Systems (ITS) to extract large amounts of information for analyzing traffic scenes. The rapid growth in the number of vehicles on the road, together with a significant increase in the number of cameras, has dictated the need for traffic surveillance systems. Such systems can take over the burdensome tasks performed by human operators in traffic monitoring centres. The main technique proposed in this paper concentrates on developing multiple vehicle detection and segmentation, focusing on monitoring through Closed Circuit Television (CCTV) video. The system is able to automatically segment vehicles extracted from a heavy traffic scene by optical flow estimation alongside a blob analysis technique in order to detect the moving vehicles. Prior to segmentation, the blob analysis technique computes the area of the interest region corresponding to a moving vehicle, which is used to create a bounding box around that particular vehicle. Experimental validation of the proposed system was performed, and the algorithm is demonstrated on various sets of traffic scenes.
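
    A hedged sketch of the blob-analysis stage alone (the optical-flow estimation that produces the motion mask is assumed and omitted here): connected pixels in a binary motion mask are grouped into blobs, and each blob whose area exceeds a threshold is given a bounding box.

```python
def blob_bounding_boxes(mask, min_area=2):
    """Label 4-connected blobs in a binary mask (list of 0/1 rows) and return
    an (x0, y0, x1, y1) bounding box for each blob with at least min_area pixels."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    boxes = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                # Flood fill to collect the pixels of one blob.
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(pixels) >= min_area:
                    ys = [p[0] for p in pixels]
                    xs = [p[1] for p in pixels]
                    boxes.append((min(xs), min(ys), max(xs), max(ys)))
    return boxes
```

    Applied to a thresholded flow-magnitude image, this would yield one box per detected moving vehicle.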

  10. Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.

    1981-01-01

    Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.
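
    For the fault tree side of such a comparison, with independent basic events, the two basic gates reduce to simple probability formulas: an AND gate multiplies failure probabilities, and an OR gate complements the product of survival probabilities. A minimal sketch with hypothetical probabilities (not values from the study):

```python
def ft_and(*probs):
    """AND gate: the output event occurs only if every input event occurs."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def ft_or(*probs):
    """OR gate: the output event occurs if any input event occurs."""
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical example: two redundant channels (each failing with p = 1e-3)
# OR'd with a shared power supply failing with p = 1e-4.
p_top = ft_or(ft_and(1e-3, 1e-3), 1e-4)
```

    Performability analysis generalizes this single failure probability to a distribution over multiple mission outcomes.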

  11. Comparison of Spares Logistics Analysis Techniques for Long Duration Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Owens, Andrew; de Weck, Olivier; Mattfeld, Bryan; Stromgren, Chel; Cirillo, William

    2015-01-01

    As the durations and distances involved in human exploration missions increase, the logistics associated with repair and maintenance become more challenging. Whereas the operation of the International Space Station (ISS) depends upon regular resupply from the Earth, this paradigm may not be feasible for future missions. Longer mission durations result in higher probabilities of component failures as well as higher uncertainty regarding which components may fail, and longer distances from Earth increase the cost of resupply as well as the time required for the crew to abort to Earth in the event of an emergency. As such, mission development efforts must take into account the logistics requirements associated with maintenance and spares. Accurate prediction of the spare parts demand for a given mission plan, and of how that demand changes as a result of changes to the system architecture, enables full consideration of the lifecycle cost associated with different options. In this paper, we utilize a range of analysis techniques - Monte Carlo, semi-Markov, binomial, and heuristic - to examine the relationship between the mass of spares and the probability of loss of function of the Carbon Dioxide Removal System (CRS) for a notional, simplified mission profile. The Exploration Maintainability Analysis Tool (EMAT), developed at NASA Langley Research Center, is utilized for the Monte Carlo analysis. We discuss the implications of these results and the features and drawbacks of each method. In particular, we identify the limitations of heuristic methods for logistics analysis, and the additional insights provided by more in-depth techniques. We discuss the potential impact of system complexity on each technique, as well as their respective abilities to examine dynamic events. This work is the first step in an effort that will quantitatively examine how well these techniques handle increasingly complex systems by gradually expanding the system boundary.
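
    As an example of the simpler end of that spectrum, one common Poisson-style spares heuristic (a sketch of the general approach, not of EMAT or the paper's exact models) estimates the probability that a given number of spares covers all failures of n identical units with constant failure rate over the mission:

```python
import math

def p_sufficient_spares(n_units, failure_rate, mission_time, spares):
    """Probability that `spares` spares cover all failures, assuming failures
    arrive as a Poisson process with rate n_units * failure_rate."""
    lam = n_units * failure_rate * mission_time  # expected number of failures
    return sum(math.exp(-lam) * lam**k / math.factorial(k)
               for k in range(spares + 1))

# Hypothetical example: 2 units, failure rate 1e-4 per hour, 1000-hour mission.
p3 = p_sufficient_spares(2, 1e-4, 1000.0, 3)
```

    Sweeping the `spares` argument yields a mass-versus-risk trade curve of the kind that the Monte Carlo and semi-Markov methods refine.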

  12. Model-Driven Test Generation of Distributed Systems

    NASA Technical Reports Server (NTRS)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

    This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  13. Analysis of transport eco-efficiency scenarios to support sustainability assessment: a study on Dhaka City, Bangladesh.

    PubMed

    Iqbal, Asif; Allan, Andrew; Afroze, Shirina

    2017-08-01

    The study focused on assessing the level of efficiency (of both emissions and service quality) that can be achieved for the transport system in Dhaka City, Bangladesh. The assessment technique attempted to quantify the extent of eco-efficiency achievable through system modifications due to planning or strategy. The eco-efficiency analysis was supported by detailed survey data on the Dhaka City transport system, collected over 9 months in 2012-2013. Line source modelling (CALINE4) was incorporated to estimate the on-road emission concentration. The eco-efficiency of the transport systems was assessed with the multi-criteria analysis (MCA) technique, which enabled the valuation of the systems' qualitative and quantitative parameters. As per the analysis, eliminating driving indiscipline on the road could alone yield about a 47% reduction in emissions; such indiscipline, along with the number of private vehicles, was among the important stressors restricting eco-efficiency in Dhaka City. Detailed analysis of the transport system, together with the potential transport system scenarios, can offer policy makers a checklist for identifying the actions needed to deliver greater service to residents with lower emissions, which in turn can improve the sustainability of the system.

  14. Flexible body stability analysis of Space Shuttle ascent flight control system by using lambda matrix solution techniques

    NASA Technical Reports Server (NTRS)

    Bown, R. L.; Christofferson, A.; Lardas, M.; Flanders, H.

    1980-01-01

    A lambda matrix solution technique is being developed to perform an open loop frequency analysis of a high order dynamic system. The procedure evaluates the right and left latent vectors corresponding to the respective latent roots. The latent vectors are used to evaluate the partial fraction expansion formulation required to compute the flexible body open loop feedback gains for the Space Shuttle Digital Ascent Flight Control System. The algorithm is in the final stages of development and will be used to ensure that the feedback gains meet the design specification.

  15. An empirical comparison of a dynamic software testability metric to static cyclomatic complexity

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffrey E.

    1993-01-01

    This paper compares the dynamic testability prediction technique termed 'sensitivity analysis' to the static testability technique termed cyclomatic complexity. The application that we chose in this empirical study is a CASE generated version of a B-737 autoland system. For the B-737 system we analyzed, we isolated those functions that we predict are more prone to hide errors during system/reliability testing. We also analyzed the code with several other well-known static metrics. This paper compares and contrasts the results of sensitivity analysis to the results of the static metrics.

  16. Recommended techniques for effective maintainability. A continuous improvement initiative of the NASA Reliability and Maintainability Steering Committee

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This manual presents a series of recommended techniques that can increase the overall operational effectiveness of both flight- and ground-based NASA systems. It provides a set of tools that minimizes the risk associated with: (1) restoring failed functions (both ground and flight based); (2) conducting complex and highly visible maintenance operations; and (3) sustaining a technical capability to support the NASA mission using aging equipment or facilities. It considers: (1) program management - key elements of an effective maintainability effort; (2) design and development - techniques that have benefited previous programs; (3) analysis and test - quantitative and qualitative analysis processes and testing techniques; and (4) operations - operational design techniques that address NASA field experience. This document is a valuable resource for continuous improvement ideas in executing the systems development process in accordance with the NASA 'better, faster, smaller, and cheaper' goal without compromising safety.

  17. Ultrasonic non-invasive techniques for microbiological instrumentation

    NASA Astrophysics Data System (ADS)

    Elvira, L.; Sierra, C.; Galán, B.; Resa, P.

    2010-01-01

    Non-invasive techniques based on ultrasound have advantageous features for studying, characterizing, and monitoring microbiological and enzymatic reactions. These processes may change the sound speed, viscosity, or particle size distribution of the medium in which they take place, which makes their analysis possible using ultrasonic techniques. In this work, two different ultrasound-based systems for the analysis of microbiological liquid media are presented. First, an industrial application based on an ultrasonic monitoring technique for detecting microbiological growth in milk is shown. Such a system may improve quality control strategies in food production factories by decreasing the time required to detect possible contamination in packed products. Second, a study of the growth of Escherichia coli DH5α under different conditions is presented. It is shown that the use of non-invasive ultrasonic characterization techniques in combination with other conventional measurements, such as optical density, provides complementary information about the metabolism of these bacteria.

  18. Text mining and its potential applications in systems biology.

    PubMed

    Ananiadou, Sophia; Kell, Douglas B; Tsujii, Jun-ichi

    2006-12-01

    With biomedical literature increasing at a rate of several thousand papers per week, it is impossible to keep abreast of all developments; therefore, automated means to manage the information overload are required. Text mining techniques, which involve the processes of information retrieval, information extraction and data mining, provide a means of solving this. By adding meaning to text, these techniques produce a more structured analysis of textual knowledge than simple word searches, and can provide powerful tools for the production and analysis of systems biology models.

  19. Battery Test Manual For 48 Volt Mild Hybrid Electric Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Lee Kenneth

    2017-03-01

    This manual details the U.S. Advanced Battery Consortium and U.S. Department of Energy Vehicle Technologies Program goals, test methods, and analysis techniques for a 48 Volt Mild Hybrid Electric Vehicle system. The test methods are outlined starting with characterization tests, followed by life tests. The final section details standardized analysis techniques for 48 V systems that allow for the comparison of different programs that use this manual. An example test plan is included, along with guidance for filling in gap table numbers.

  20. A tandem regression-outlier analysis of a ligand cellular system for key structural modifications around ligand binding.

    PubMed

    Lin, Ying-Ting

    2013-04-30

    A tandem technique using hardware is often employed for the chemical analysis of a single cell, to first isolate and then detect the wanted identities. The first part is the separation of the wanted chemicals from the bulk of the cell; the second part is the actual detection of the important identities. To identify the key structural modifications around ligand binding, the present study aims to develop a cheminformatics counterpart of this tandem technique. A statistical regression and its outliers act as a computational means of separation. A PPARγ (peroxisome proliferator-activated receptor gamma) agonist cellular system was subjected to such an investigation. Results show that this tandem regression-outlier analysis, or the prioritization of the context equations tagged with features of the outliers, is an effective cheminformatics regression technique for detecting key structural modifications, as well as their likely impact on ligand binding. The key structural modifications around ligand binding are effectively extracted or characterized from the cellular reactions. This is because molecular binding is the paramount factor in such a ligand cellular system, and key structural modifications around ligand binding are expected to create outliers. Therefore, such outliers can be captured by this tandem regression-outlier analysis.
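
    A minimal sketch of the regression-then-outlier idea (illustrative only; the paper's descriptors and regression models are more elaborate): fit a least-squares line to activity data and flag the points whose residuals exceed k standard deviations, the outliers that may mark key structural modifications.

```python
def regression_outliers(x, y, k=2.0):
    """Fit y = slope*x + intercept by least squares, then return the indices
    of points whose absolute residual exceeds k standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [b - (slope * a + intercept) for a, b in zip(x, y)]
    sd = (sum(r * r for r in resid) / n) ** 0.5
    return [i for i, r in enumerate(resid) if abs(r) > k * sd]
```

    In the paper's setting the regressors are molecular features rather than a single variable, but the separation step is the same: the outliers are what get "detected".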

  1. Techniques for Fault Detection and Visualization of Telemetry Dependence Relationships for Root Cause Fault Analysis in Complex Systems

    NASA Astrophysics Data System (ADS)

    Guy, Nathaniel

    This thesis explores new ways of looking at telemetry data, from a time-correlative perspective, in order to see patterns within the data that may suggest root causes of system faults. It was thought initially that visualizing an animated Pearson Correlation Coefficient (PCC) matrix for telemetry channels would be sufficient to give new understanding; however, testing showed that the high dimensionality of this approach, and the inability to easily look at change over time, impeded understanding. Different correlative techniques, combined with the time curve visualization proposed by Bach et al. (2015), were adapted to visualize both raw telemetry and telemetry data correlations. Review revealed that these new techniques give insights into the data, and an intuitive grasp of data families, which shows the effectiveness of this approach for enhancing system understanding and assisting with root cause analysis for complex aerospace systems.
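
    The PCC matrix underlying the initial approach is straightforward to compute; a minimal pure-Python sketch (the thesis animates such matrices over time windows, which is where the dimensionality problem arises):

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def pcc_matrix(channels):
    """Pairwise PCC matrix for a list of equal-length telemetry channels."""
    return [[pearson(a, b) for b in channels] for a in channels]
```

    Sliding a window over the channels and recomputing `pcc_matrix` per window gives the animated matrix described above.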

  2. Autologous Fat Grafting to the Breast Using REVOLVE System to Reduce Clinical Costs.

    PubMed

    Brzezienski, Mark A; Jarrell, John A

    2016-09-01

    With the increasing popularity of fat grafting over the past decade, the techniques for harvest, processing and preparation, and transfer of the fat cells have evolved to improve efficiency and consistency. The REVOLVE System is a fat processing device used in autologous fat grafting which eliminates much of the specialized equipment as well as the labor-intensive and time-consuming efforts of the original Coleman technique of fat processing. This retrospective study evaluates the economics of fat grafting, comparing traditional Coleman processing to the REVOLVE System. From June 2013 through December 2013, 88 fat grafting cases by a single surgeon were reviewed. Timed procedures using either the REVOLVE System or the Coleman technique were extracted from the group. Data including fat grafting procedure time, harvested volume, harvest and recipient sites, and concurrent procedures were gathered. Cost and utilization assessments were performed comparing the economics between the groups using standard values of operating room costs provided by the study hospital. Thirty-seven patients with timed procedures were identified, 13 treated with the Coleman technique and 24 with the REVOLVE System. The average rate of fat transfer was 1.77 mL/minute for the Coleman technique and 4.69 mL/minute for the REVOLVE System, a statistically significant difference (P < 0.0001) between the 2 groups. Cost analysis comparing the REVOLVE System and Coleman techniques demonstrates a dramatic divergence in the price per mL of transferred fat at 75 mL when using the previously calculated rates for each group. This single surgeon's experience with the REVOLVE System for fat processing establishes economic support for its use in specific high-volume fat grafting cases: in cases of planned fat transfer of 75 mL or more, using the REVOLVE System is more economically beneficial. This study may serve as a guide to plastic surgeons in deciding which cases might be appropriate for the use of the REVOLVE System and is the first report comparing the economics of fat grafting with the traditional Coleman technique and the REVOLVE System.

  3. An Application of Instructional System Development to Determine Financial Management Education Needs for Logistics Management Positions.

    DTIC Science & Technology

    1976-09-01

    The purpose of this research effort was to determine the financial management educational needs of USAF graduate logistics positions. Goal analysis...was used to identify financial management techniques and task analysis was used to develop a method to identify the use of financial management techniques...positions. The survey identified financial management techniques in five areas: cost accounting, capital budgeting, working capital, financial forecasting, and programming. (Author)

  4. New approaches to the analysis of complex samples using fluorescence lifetime techniques and organized media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertz, P.R.

    Fluorescence spectroscopy is a highly sensitive and selective tool for the analysis of complex systems. In order to investigate the efficacy of several steady state and dynamic techniques for the analysis of complex systems, this work focuses on two types of complex, multicomponent samples: petrolatums and coal liquids. It is shown in these studies that dynamic, fluorescence lifetime-based measurements provide enhanced discrimination between complex petrolatum samples. Additionally, improved quantitative analysis of multicomponent systems is demonstrated via the incorporation of organized media in coal liquid samples. This research provides the first systematic studies of (1) multifrequency phase-resolved fluorescence spectroscopy for dynamic fluorescence spectral fingerprinting of complex samples, and (2) the incorporation of bile salt micellar media to improve accuracy and sensitivity in the characterization of complex systems. In the petroleum studies, phase-resolved fluorescence spectroscopy is used to combine spectral and lifetime information through the measurement of phase-resolved fluorescence intensity. The intensity is collected as a function of excitation and emission wavelengths, angular modulation frequency, and detector phase angle. This multidimensional information enhances the ability to distinguish between complex samples with similar spectral characteristics. Examination of the eigenvalues and eigenvectors from factor analysis of phase-resolved and steady state excitation-emission matrices, using chemometric methods of data analysis, confirms that phase-resolved fluorescence techniques offer improved discrimination between complex samples as compared with conventional steady state methods.

  5. Hierarchical modeling for reliability analysis using Markov models. B.S./M.S. Thesis - MIT

    NASA Technical Reports Server (NTRS)

    Fagundo, Arturo

    1994-01-01

    Markov models represent an extremely attractive tool for the reliability analysis of many systems. However, Markov model state space grows exponentially with the number of components in a given system. Thus, for very large systems Markov modeling techniques alone become intractable in both memory and CPU time. Often a particular subsystem can be found within some larger system where the dependence of the larger system on the subsystem is of a particularly simple form. This simple dependence can be used to decompose such a system into one or more subsystems. A hierarchical technique is presented which can be used to evaluate these subsystems in such a way that their reliabilities can be combined to obtain the reliability for the full system. This hierarchical approach is unique in that it allows the subsystem model to pass multiple aggregate state information to the higher level model, allowing more general systems to be evaluated. Guidelines are developed to assist in the system decomposition. An appropriate method for determining subsystem reliability is also developed. This method gives rise to some interesting numerical issues. Numerical error due to roundoff and integration are discussed at length. Once a decomposition is chosen, the remaining analysis is straightforward but tedious. However, an approach is developed for simplifying the recombination of subsystem reliabilities. Finally, a real world system is used to illustrate the use of this technique in a more practical context.
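
    A toy illustration of the decomposition idea (deliberately simple; the thesis handles general subsystems passing aggregate state information): the reliability of a two-component parallel subsystem follows in closed form from its three-state Markov chain, and a higher-level model then combines subsystem reliabilities, here in series.

```python
import math

def r_parallel_pair(lam, t):
    """Reliability of a 2-component parallel subsystem with constant failure
    rate lam and no repair, from its 3-state Markov chain (2 up -> 1 up -> 0 up);
    the subsystem fails only when both components are down."""
    return 2.0 * math.exp(-lam * t) - math.exp(-2.0 * lam * t)

def r_series(*subsystem_reliabilities):
    """Higher-level combination: the full system needs every subsystem up."""
    out = 1.0
    for r in subsystem_reliabilities:
        out *= r
    return out
```

    Evaluating each subsystem's small Markov model separately and recombining the results is what keeps the full-system state space tractable.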

  6. Modeling and prototyping of biometric systems using dataflow programming

    NASA Astrophysics Data System (ADS)

    Minakova, N.; Petrov, I.

    2018-01-01

    The development of biometric systems is a labor-intensive process; therefore, the creation and analysis of supporting approaches and techniques is an urgent task. This article presents a technique for the modeling and prototyping of biometric systems based on dataflow programming. The technique includes three main stages: the development of functional blocks, the creation of a dataflow graph, and the generation of a prototype. A specially developed software modeling environment that implements this technique is described. As an example of the use of this technique, the implementation of an iris localization subsystem is demonstrated. A modification of dataflow programming is suggested to solve the problem of the undefined order of block activation. The main advantages of the presented technique are the ability to visually display and design the model of the biometric system, the rapid creation of a working prototype, and the reuse of previously developed functional blocks.

  7. Capillary electrophoresis in two-dimensional separation systems: Techniques and applications.

    PubMed

    Kohl, Felix J; Sánchez-Hernández, Laura; Neusüß, Christian

    2015-01-01

    The analysis of complex samples requires powerful separation techniques. Here, 2D chromatographic separation techniques (e.g. LC-LC, GC-GC) are increasingly applied in many fields. Electrophoretic separation techniques show a different selectivity in comparison to LC and GC and very high separation efficiency. Thus, 2D separation systems containing at least one CE-based separation technique are an interesting alternative featuring potentially a high degree of orthogonality. However, the generally small volumes and strong electrical fields in CE require special coupling techniques. These technical developments are reviewed in this work, discussing benefits and drawbacks of offline and online systems. Emphasis is placed on the design of the systems, their coupling, and the detector used. Moreover, the employment of strategies to improve peak capacity, resolution, or sensitivity is highlighted. Various applications of 2D separations with CE are summarized. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Integrated Formulation of Beacon-Based Exception Analysis for Multimissions

    NASA Technical Reports Server (NTRS)

    Mackey, Ryan; James, Mark; Park, Han; Zak, Mickail

    2003-01-01

    Further work on beacon-based exception analysis for multimissions (BEAM), a method of real-time, automated diagnosis of complex electromechanical systems, has greatly expanded its capability and suitability for application. This expanded formulation, which fully integrates physical models and symbolic analysis, is described. The new formulation of BEAM expands upon previous advanced techniques for the analysis of signal data, utilizing mathematical modeling of the system physics and expert-system reasoning.

  9. A Bio Medical Waste Identification and Classification Algorithm Using Mltrp and Rvm.

    PubMed

    Achuthan, Aravindan; Ayyallu Madangopal, Vasumathi

    2016-10-01

    We aimed to extract histogram features for texture analysis and to classify the types of Bio Medical Waste (BMW) for garbage disposal and management. The given BMW image was preprocessed by using the median filtering technique, which efficiently reduced the noise in the image. After that, the histogram features of the filtered image were extracted with the help of the proposed Modified Local Tetra Pattern (MLTrP) technique. Finally, the Relevance Vector Machine (RVM) was used to classify the BMW into human body parts, plastics, cotton, and liquids. The BMW images were collected from a garbage image dataset for analysis. The performance of the proposed BMW identification and classification system was evaluated in terms of sensitivity, specificity, classification rate, and accuracy with the help of MATLAB. When compared to existing techniques, the proposed techniques provided better results. This work proposes a new texture analysis and classification technique for BMW management and disposal. It can be used in many real-time applications such as hospital and healthcare management systems for proper BMW disposal.

  10. Designing to Support Command and Control in Urban Firefighting

    DTIC Science & Technology

    2008-06-01

    complex human-machine systems. Keywords: Command and control, firefighting, cognitive systems engineering, cognitive task analysis 1...Elm, W. (2000). Bootstrapping multiple converging cognitive task analysis techniques for system design. In J.M.C. Schraagen, S.F. Chipman, & V.L...Shalin (Eds.), Cognitive Task Analysis (pp. 317-340). Mahwah, NJ: Lawrence Erlbaum. Rasmussen, J., Pejtersen, A., Goodman, L. (1994). Cognitive

  11. Analysis of Compounds Dissolved in Nonpolar Solvents by Electrospray Ionization on Conductive Nanomaterials

    NASA Astrophysics Data System (ADS)

    Xia, Bing; Gao, Yuanji; Ji, Baocheng; Ma, Fengwei; Ding, Lisheng; Zhou, Yan

    2018-03-01

    The electrospray ionization mass spectrometry (ESI-MS) technique has limitations in the analysis of compounds dissolved in nonpolar solvents. In this study, ambient ionization of compounds in solvents that are not "friendly" to electrospray ionization, such as n-hexane, is achieved by conductive nanomaterials spray ionization (CNMSI) on nanomaterial emitters, including carbon nanotube paper and mesodendritic silver-covered metal, by applying high voltages to emitters made of these materials without the assistance of polar solvents. Although the time intensity curves (TIC) commonly vary from 4.5% to 23.7% over analyses, protonated molecular ions were found to be the most abundant species, demonstrating good reproducibility of the technique in terms of ionized species. Higher mass spectrometric responses are observed in analyzing nonpolar systems than polar systems. 2-Methoxyacetophenone, 4-methylacetophenone, benzothiazole, quinolone, and cycloheptanone at levels as low as 2 pg in n-hexane can be directly detected using the developed method. The developed technique expands the analysis capability of ESI-MS for direct, online analysis of nonpolar systems, such as low-polarity extracts, normal phase liquid chromatography eluates, and synthetic mixtures.

  12. Analysis of a document/reporting system

    NASA Technical Reports Server (NTRS)

    Narrow, B.

    1971-01-01

    An in-depth analysis of the information system within the Data Processing Branch is presented. Quantitative measures are used to evaluate the efficiency and effectiveness of the information system. This is believed to be the first documented study that uses quantitative measures for full-scale system analysis. The quantitative measures and the techniques for collecting and qualifying the basic data, as described, are applicable to any information system. This report is therefore of interest to anyone concerned with the management, design, analysis, or evaluation of information systems.

  13. Reliability analysis of the F-8 digital fly-by-wire system

    NASA Technical Reports Server (NTRS)

    Brock, L. D.; Goodman, H. A.

    1981-01-01

    The F-8 Digital Fly-by-Wire (DFBW) flight test program, intended to provide the technology for advanced control systems that give aircraft enhanced performance and operational capability, is addressed. A detailed analysis of the experimental system was performed to estimate the probabilities of two significant safety-critical events: (1) loss of the primary flight control function, causing reversion to the analog bypass system; and (2) loss of the aircraft due to failure of the electronic flight control system. The analysis covers appraisal of risks due to random equipment failure, generic faults in the design of the system or its software, and induced failure due to external events. A unique diagrammatic technique was developed which details the combinatorial reliability equations for the entire system, promotes understanding of system failure characteristics, and identifies the most likely failure modes. The technique provides a systematic method of applying basic probability equations and is augmented by a computer program, written in a modular fashion, that duplicates the structure of these equations.
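    At their core, combinatorial reliability equations of this kind reduce to series and parallel (redundant) combinations of unit reliabilities. A minimal sketch with illustrative values (not the F-8 figures):

```python
# Hypothetical illustration of combinatorial reliability equations:
# series and parallel subsystems combine multiplicatively.

def series_reliability(rels):
    """All units must work: R = product of unit reliabilities."""
    r = 1.0
    for x in rels:
        r *= x
    return r

def parallel_reliability(rels):
    """System works if any unit works: R = 1 - product of unit failure probs."""
    q = 1.0
    for x in rels:
        q *= (1.0 - x)
    return 1.0 - q

# A triplex digital channel (any 1 of 3 computers suffices) in series with
# a sensor suite assumed to have reliability 0.9995.
triplex = parallel_reliability([0.999, 0.999, 0.999])
system = series_reliability([triplex, 0.9995])
```

    The modular structure of such functions mirrors the structure of the equations themselves, which is the point of the diagrammatic technique described above.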

  14. Development of neural network techniques for finger-vein pattern classification

    NASA Astrophysics Data System (ADS)

    Wu, Jian-Da; Liu, Chiung-Tsiung; Tsai, Yi-Jang; Liu, Jun-Ching; Chang, Ya-Wen

    2010-02-01

    A personal identification system using finger-vein patterns and neural network techniques is proposed in the present study. In the proposed system, the finger-vein patterns are captured by a device that transmits near infrared light through the finger and records the patterns for signal analysis and classification. The biometric verification system consists of a combination of feature extraction using principal component analysis and pattern classification using both a back-propagation network and an adaptive neuro-fuzzy inference system. Finger-vein features are first extracted by the principal component analysis method to reduce the computational burden and to remove noise residing in the discarded dimensions. The features are then used in pattern classification and identification. To verify the effect of the proposed adaptive neuro-fuzzy inference system in pattern classification, it is compared with the back-propagation network. The experimental results indicated that the proposed adaptive neuro-fuzzy inference system demonstrated better performance than the back-propagation network for personal identification using finger-vein patterns.
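    The principal component analysis step can be sketched as follows (assumed data shapes; the actual finger-vein images and classifier stages are not reproduced here):

```python
import numpy as np

# Minimal PCA feature-extraction sketch: each row of X stands in for a
# flattened finger-vein image; projecting onto the top-k principal
# components reduces dimensionality and discards low-variance (noisy) axes.

def pca_features(X, k):
    Xc = X - X.mean(axis=0)                    # center the data
    cov = np.cov(Xc, rowvar=False)             # sample covariance matrix
    vals, vecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
    top = vecs[:, np.argsort(vals)[::-1][:k]]  # top-k principal directions
    return Xc @ top                            # projected features

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))   # 50 samples, 8 raw dimensions (illustrative)
F = pca_features(X, 3)         # reduced to 3 features per sample
```

    The reduced features would then feed the back-propagation network or the neuro-fuzzy classifier.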

  15. Development of a versatile user-friendly IBA experimental chamber

    NASA Astrophysics Data System (ADS)

    Kakuee, Omidreza; Fathollahi, Vahid; Lamehi-Rachti, Mohammad

    2016-03-01

    Reliable performance of Ion Beam Analysis (IBA) techniques is based on an accurate geometry of the experimental setup, employment of reliable nuclear data, and implementation of dedicated analysis software for each of the IBA techniques. It has already been shown that geometrical imperfections lead to significant uncertainties in the quantification of IBA measurements. To minimize these uncertainties, a user-friendly experimental chamber with a heuristic sample positioning system for IBA was recently developed in the Van de Graaff laboratory in Tehran. This system enhances IBA capabilities, in particular the Nuclear Reaction Analysis (NRA) and Elastic Recoil Detection Analysis (ERDA) techniques. The newly developed sample manipulator provides the possibility of both controlling the tilt angle of the sample and analyzing samples of different thicknesses. Moreover, a reasonable number of samples can be loaded in the sample wheel. A comparison of the measured cross section data of the 16O(d,p1)17O reaction with the data reported in the literature confirms the performance and capability of the newly developed experimental chamber.

  16. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.
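    At the simpler end of the technique range described above, an expert's subjective low/most-likely/high estimates can be encoded as triangular distributions and rolled up by Monte Carlo. A hedged sketch with illustrative numbers (not taken from the handbook):

```python
import random

# Encode each task's subjective cost estimate (low, most likely, high) as a
# triangular distribution and estimate program-level cost risk by sampling.

def program_cost_risk(tasks, n=20000, seed=1):
    """tasks: list of (low, mode, high) subjective cost estimates."""
    rnd = random.Random(seed)
    totals = sorted(
        sum(rnd.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        for _ in range(n)
    )
    # Median and 90th-percentile total cost across the sampled programs.
    return totals[n // 2], totals[int(0.9 * n)]

median, p90 = program_cost_risk([(8, 10, 15), (4, 5, 9), (1, 2, 4)])
```

    The spread between the median and the 90th percentile is one simple quantitative expression of program risk.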

  17. Matrix Perturbation Techniques in Structural Dynamics

    NASA Technical Reports Server (NTRS)

    Caughey, T. K.

    1973-01-01

    Matrix perturbation techniques are developed which can be used in the dynamical analysis of structures where the range of numerical values in the matrices is extreme, or where the nature of the damping matrix requires that complex-valued eigenvalues and eigenvectors be used. The techniques can be advantageously used in a variety of fields such as earthquake engineering, ocean engineering, aerospace engineering, and other fields concerned with the dynamical analysis of large complex structures or systems of second-order differential equations. A number of simple examples are included to illustrate the techniques.
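    The flavor of such techniques can be conveyed by the standard first-order eigenvalue perturbation result (textbook notation, not the report's): for a symmetric matrix A with eigenpair (lambda, v), a small perturbation dA shifts the eigenvalue by approximately v'·dA·v.

```python
import numpy as np

# First-order eigenvalue perturbation check on a small symmetric system.
A = np.array([[2.0, 0.3], [0.3, 1.0]])
dA = np.array([[0.01, 0.0], [0.0, -0.02]])   # small perturbation

vals, vecs = np.linalg.eigh(A)
lam, v = vals[1], vecs[:, 1]                 # largest eigenpair of A
approx = lam + v @ dA @ v                    # first-order estimate
exact = np.linalg.eigh(A + dA)[0][1]         # eigenvalue of perturbed matrix
# The discrepancy is second order in ||dA||.
```

    In structural dynamics this avoids re-solving the full eigenproblem each time a mass or stiffness entry is adjusted.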

  18. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1990-01-01

    In the study of the dynamics and kinematics of the human body, a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video-based motion analysis systems to emerge as a cost-effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video-based Ariel Performance Analysis System to develop data on shirt-sleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. The system is described.

  19. Stability analysis of nonlinear Roesser-type two-dimensional systems via a homogenous polynomial technique

    NASA Astrophysics Data System (ADS)

    Zhang, Tie-Yan; Zhao, Yan; Xie, Xiang-Peng

    2012-12-01

    This paper is concerned with the problem of stability analysis of nonlinear Roesser-type two-dimensional (2D) systems. Firstly, the fuzzy modeling method for the usual one-dimensional (1D) systems is extended to the 2D case so that the underlying nonlinear 2D system can be represented by a 2D Takagi-Sugeno (TS) fuzzy model, which is convenient for implementing the stability analysis. Secondly, a new kind of fuzzy Lyapunov function, which is homogeneously polynomially parameter-dependent on the fuzzy membership functions, is developed to obtain less conservative stability conditions for the TS Roesser-type 2D system. In the process of stability analysis, the obtained stability conditions approach exactness in the sense of convergence by applying some novel relaxation techniques. Moreover, the obtained result is formulated in the form of linear matrix inequalities, which can be easily solved via standard numerical software. Finally, a numerical example is given to demonstrate the effectiveness of the proposed approach.

  20. Development of parallel algorithms for electrical power management in space applications

    NASA Technical Reports Server (NTRS)

    Berry, Frederick C.

    1989-01-01

    The application of parallel techniques for electrical power system analysis is discussed. The Newton-Raphson method of load flow analysis was used along with the decomposition-coordination technique to perform load flow analysis. The decomposition-coordination technique enables tasks to be performed in parallel by partitioning the electrical power system into independent local problems. Each independent local problem represents a portion of the total electrical power system on which a load flow analysis can be performed. The load flow analysis is performed on these partitioned elements by using the Newton-Raphson load flow method. These independent local problems produce results for voltage and power which can then be passed to the coordinator portion of the solution procedure. The coordinator problem uses the results of the local problems to determine whether any correction is needed on the local problems. The coordinator problem is also solved by an iterative method much like the local problems; the iterative method for the coordination problem is also the Newton-Raphson method. Therefore, each iteration at the coordination level results in new values for the local problems, which must then be solved again along with the coordinator problem until convergence conditions are met.
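    A minimal Newton-Raphson load flow iteration can be sketched for a two-bus system (illustrative values; each partitioned local problem in the decomposition would run an analogous iteration on its own portion of the network):

```python
import math

# Two-bus sketch: the slack bus holds V1 = 1.0 pu; solve for the angle d at
# bus 2 so that the transferred power P = (V1*V2/X)*sin(d) matches the
# scheduled load, using Newton-Raphson on the power mismatch.

def solve_angle(P_sched, V1=1.0, V2=1.0, X=0.5, tol=1e-10):
    d = 0.0
    for _ in range(50):
        mismatch = P_sched - (V1 * V2 / X) * math.sin(d)
        if abs(mismatch) < tol:
            break
        jac = (V1 * V2 / X) * math.cos(d)   # dP/dd, the 1x1 Jacobian
        d += mismatch / jac                  # Newton-Raphson update
    return d

delta = solve_angle(0.8)   # angle (radians) delivering 0.8 pu over X = 0.5
```

    In the full method the scalar mismatch and Jacobian become vectors and matrices over all bus voltages and angles, but the update step has the same form.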

  1. Impact during equine locomotion: techniques for measurement and analysis.

    PubMed

    Burn, J F; Wilson, A; Nason, G P

    1997-05-01

    Impact is implicated in the development of several types of musculoskeletal injury in the horse. Characterisation of the impact experienced during strenuous exercise is an important first step towards understanding the mechanism of injury. Measurement and analysis of large, short-duration impacts is difficult. The measurement system must be able to record transient peaks and high frequencies accurately, and the analysis technique must be able to characterise the impact signal in both time and frequency. This paper presents a measurement system and analysis technique for the characterisation of large impacts. A piezoelectric accelerometer was securely mounted on the dorsal surface of the horse's hoof. Saddle-mounted charge amplifiers and a 20 m coaxial cable transferred these data to a PC-based logging system. Data were downloaded onto a UNIX workstation and analysed using a proprietary statistics package. The values of parameters calculated from the time series data were comparable to those of other authors. A wavelet decomposition showed that the frequency profile of the signal changed with time. While most spectral energy was seen at impact, a significant amount of energy was contained in the signal immediately following impact. Over 99% of this energy was contained in frequencies below 1250 Hz. The sampling rate and the frequency response of a measurement system for recording impact should be chosen carefully to prevent loss or corruption of data. Time-scale analysis using a wavelet decomposition is a powerful technique which can be used to characterise impact data. The use of contour plots provides a highly visual representation of the time and frequency localisation of power during impact.
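    The wavelet idea can be sketched with a single-level Haar transform (a simple stand-in for the proprietary wavelet analysis used in the study): it splits a signal into a low-frequency approximation and a high-frequency detail band, the basic step behind localizing impact energy in both time and frequency.

```python
import math

def haar_step(signal):
    """One level of the Haar wavelet transform; len(signal) must be even."""
    s = 1 / math.sqrt(2)
    approx = [(a + b) * s for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) * s for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

# An idealized 'impact': one sharp transient in an otherwise quiet signal.
x = [0.0, 0.0, 1.0, -1.0, 0.0, 0.0, 0.0, 0.0]
approx, detail = haar_step(x)
# The transient appears almost entirely in the detail (high-frequency) band,
# and its position in the detail array localizes it in time.
```

    Repeating the step on the approximation band yields the multi-level decomposition used in time-scale analysis.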

  2. Radiograph and passive data analysis using mixed variable optimization

    DOEpatents

    Temple, Brian A.; Armstrong, Jerawan C.; Buescher, Kevin L.; Favorite, Jeffrey A.

    2015-06-02

    Disclosed herein are representative embodiments of methods, apparatus, and systems for performing radiography analysis. For example, certain embodiments perform radiographic analysis using mixed variable computation techniques. One exemplary system comprises a radiation source, a two-dimensional detector for detecting radiation transmitted through an object between the radiation source and detector, and a computer. In this embodiment, the computer is configured to input the radiographic image data from the two-dimensional detector and to determine one or more materials that form the object by using an iterative analysis technique that selects the one or more materials from hierarchically arranged solution spaces of discrete material possibilities and selects the layer interfaces from the optimization of the continuous interface data.

  3. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. To achieve this objective, the mechanical equivalence between system behavior models in different disciplines is investigated. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer, and fluid flow disciplines.
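    As a crude stand-in for the fast probability integration itself, plain Monte Carlo conveys the idea of a series system whose limit states come from several disciplines (illustrative limit states and distributions, not the heat exchanger example):

```python
import random

# Sample random inputs and count samples in which ANY disciplinary limit
# state (structural, thermal, flow) is violated -- a series-system failure.

def system_failure_prob(n=50000, seed=2):
    rnd = random.Random(seed)
    failures = 0
    for _ in range(n):
        load = rnd.gauss(1.0, 0.1)   # structural load factor
        temp = rnd.gauss(0.8, 0.1)   # normalized wall temperature
        flow = rnd.gauss(1.2, 0.1)   # normalized coolant flow
        if load > 1.3 or temp > 1.1 or flow < 0.9:
            failures += 1
    return failures / n

pf = system_failure_prob()   # estimated system failure probability
```

    Fast probability integration methods reach the same quantity far more efficiently for small failure probabilities, which is why they are preferred in codes like NESSUS.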

  4. Multi-Disciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song

    1997-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. To achieve this objective, the mechanical equivalence between system behavior models in different disciplines is investigated. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer, and fluid flow disciplines.

  5. Some aspects of optical feedback with cadmium sulfide and related photoconductors. [for extended frequency response

    NASA Technical Reports Server (NTRS)

    Katzberg, S. J.

    1974-01-01

    A primary limitation of many solid state photoconductors used in electro-optical systems is their slow response in converting varying light intensities into electrical signals. An optical feedback technique is presented which can extend the frequency response of systems that use these detectors by orders of magnitude without adversely affecting overall signal-to-noise ratio performance. The technique is analyzed to predict the improvement possible and a system is implemented using cadmium sulfide to demonstrate the effectiveness of the technique and the validity of the analysis.
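    The bandwidth-extension idea can be sketched with the textbook first-order feedback model (not the paper's cadmium sulfide analysis): a slow detector with DC gain K and time constant tau, closed in a feedback loop of gain H, has its time constant divided by the loop-gain factor (1 + K*H).

```python
# First-order feedback bandwidth extension: G(s) = K/(1 + s*tau) in a loop
# with feedback gain H gives closed-loop time constant tau/(1 + K*H).

def closed_loop_bandwidth(k, tau, h):
    """Return open- and closed-loop -3 dB bandwidths in rad/s."""
    open_bw = 1.0 / tau
    closed_bw = (1.0 + k * h) / tau
    return open_bw, closed_bw

open_bw, closed_bw = closed_loop_bandwidth(k=100.0, tau=0.1, h=1.0)
# Bandwidth improves by the loop-gain factor (1 + K*H) = 101, at the cost of
# the same factor in closed-loop DC gain.
```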

  6. The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Latimer, John A.

    2009-01-01

    This presentation provides a Residual Risk Evaluation Technique (RRET) developed by the Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodologies of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.

  7. System Identification of Mistuned Bladed Disks from Traveling Wave Response Measurements

    NASA Technical Reports Server (NTRS)

    Feiner, D. M.; Griffin, J. H.; Jones, K. W.; Kenyon, J. A.; Mehmed, O.; Kurkov, A. P.

    2003-01-01

    A new approach to modal analysis is presented. By applying this technique to bladed disk system identification methods, one can determine the mistuning in a rotor based on its response to a traveling wave excitation. This allows system identification to be performed under rotating conditions, and thus expands the applicability of existing mistuning identification techniques from integrally bladed rotors to conventional bladed disks.

  8. Atomic characterization of Si nanoclusters embedded in SiO2 by atom probe tomography

    PubMed Central

    2011-01-01

    Silicon nanoclusters are of prime interest for new generations of optoelectronic and microelectronic components. The physical properties (light emission, carrier storage...) of systems using such nanoclusters depend strongly on their nanostructural characteristics. These characteristics (size, composition, distribution, and interface nature) have until now been obtained using conventional high-resolution analytical methods, such as high-resolution transmission electron microscopy, EFTEM, or EELS. In this article, a complementary technique, atom probe tomography, was used to study a multilayer (ML) system containing silicon clusters. This technique and its analysis give information on the structure at the atomic level and yield information complementary to that obtained by other techniques. The different steps of such an analysis (sample preparation, atom probe analysis, and data treatment) are detailed. An atomic-scale description of the Si nanoclusters/SiO2 ML is given. The system is composed of 3.8-nm-thick SiO layers and 4-nm-thick SiO2 layers annealed for 1 h at 900°C. PMID:21711666

  9. Failure detection and fault management techniques for flush airdata sensing systems

    NASA Technical Reports Server (NTRS)

    Whitmore, Stephen A.; Moes, Timothy R.; Leondes, Cornelius T.

    1992-01-01

    Methods based on chi-squared analysis are presented for detecting system and individual-port failures in the high-angle-of-attack flush airdata sensing (HI-FADS) system on the NASA F-18 High Alpha Research Vehicle. The HI-FADS hardware is introduced, and the aerodynamic model describes measured pressure in terms of dynamic pressure, angle of attack, angle of sideslip, and static pressure. Chi-squared analysis is described in the presentation of the concept for failure detection and fault management, which includes nominal, iteration, and fault-management modes. A matrix of pressure orifices arranged in concentric circles on the nose of the aircraft provides the measurements applied to the regression algorithms. The sensing techniques are applied to F-18 flight data, and two examples are given of the computed angle-of-attack time histories. The failure-detection and fault-management techniques permit the matrix to be multiply redundant, and the chi-squared analysis is shown to be useful in the detection of failures.
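    The chi-squared failure-detection concept can be sketched as follows (illustrative noise level and threshold, not the HI-FADS values): fit a model to all ports, form normalized residuals, and declare a failure when the chi-squared statistic exceeds a threshold, after which the most suspect port is dropped.

```python
# Chi-squared test on redundant sensor residuals, with identification of the
# worst port when the statistic exceeds the threshold.

def chi_squared(measured, predicted, sigma):
    return sum(((m - p) / sigma) ** 2 for m, p in zip(measured, predicted))

def detect_failed_port(measured, predicted, sigma, threshold):
    stat = chi_squared(measured, predicted, sigma)
    if stat <= threshold:
        return None                             # nominal mode: no failure
    residuals = [abs(m - p) for m, p in zip(measured, predicted)]
    return residuals.index(max(residuals))      # index of most suspect port

pred = [1.00, 1.02, 0.98, 1.01]                 # model-predicted pressures
meas = [1.01, 1.03, 1.55, 1.00]                 # port 2 reads far off model
port = detect_failed_port(meas, pred, sigma=0.02, threshold=9.49)
```

    Because the orifice matrix is multiply redundant, the regression can be repeated with the flagged port excluded, which is the essence of the iteration and fault-management modes.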

  10. Investigation of Models and Estimation Techniques for GPS Attitude Determination

    NASA Technical Reports Server (NTRS)

    Garrick, J.

    1996-01-01

    Much work has been done in the Flight Dynamics Analysis Branch (FDAB) in developing algorithms to meet the new and growing field of attitude determination using the Global Positioning System (GPS) constellation of satellites. Flight Dynamics has the responsibility to investigate any new technology and incorporate the innovations in the attitude ground support systems developed to support future missions. The work presented here is an investigative analysis that will produce the adaptations needed to allow the Flight Dynamics Support System (FDSS) to incorporate GPS phase measurements and produce observation measurements compatible with the FDSS. A simulator was developed to produce the measurement data necessary to test the models developed for the different estimation techniques used by FDAB. This paper gives an overview of the current modeling capabilities of the simulator, the models and algorithms for the adaptation of GPS measurement data, and results from each of the estimation techniques. Future analysis efforts to evaluate the simulator and models against in-flight GPS measurement data are also outlined.

  11. Estimation for bilinear stochastic systems

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.; Marcus, S. I.

    1974-01-01

    Three techniques for the solution of bilinear estimation problems are presented. First, finite-dimensional optimal nonlinear estimators are presented for certain bilinear systems evolving on solvable and nilpotent Lie groups. Then the use of harmonic analysis for estimation problems evolving on spheres and other compact manifolds is investigated. Finally, an approximate estimation technique utilizing cumulants is discussed.

  12. A Model-Based Systems Engineering Methodology for Employing Architecture In System Analysis: Developing Simulation Models Using Systems Modeling Language Products to Link Architecture and Analysis

    DTIC Science & Technology

    2016-06-01

    characteristics, experimental design techniques, and analysis methodologies that distinguish each phase of the MBSE MEASA. To ensure consistency... methodology. Experimental design selection, simulation analysis, and trade space analysis support the final two stages. Figure 27 segments the MBSE MEASA...rounding has the potential to increase the correlation between columns of the experimental design matrix. The design methodology presented in Vieira

  13. Toward quantitative estimation of material properties with dynamic mode atomic force microscopy: a comparative study.

    PubMed

    Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti

    2017-08-11

    In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system, and compares a steady-state analysis based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement, though slower than the recursive technique. The parameters of the equivalent system are utilized to interpret storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided for quantitative estimation of material properties.

  14. Usability engineering: domain analysis activities for augmented-reality systems

    NASA Astrophysics Data System (ADS)

    Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.

    2002-05-01

    This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for, and our progress to date on, our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.

  15. Get to Understand More from Single-Cells: Current Studies of Microfluidic-Based Techniques for Single-Cell Analysis.

    PubMed

    Lo, Shih-Jie; Yao, Da-Jeng

    2015-07-23

    This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidics (e.g., the small sample amount required and high-throughput performance) make this tool suitable for answering and solving biological questions of interest about a single cell. This review aims to introduce microfluidic-related techniques for the isolation, trapping, and manipulation of a single cell. The major approaches for detection in single-cell analysis are introduced, and the applications of single-cell analysis are then summarized. The review concludes with discussions of the future directions and opportunities of microfluidic systems applied in the analysis of a single cell.

  16. Get to Understand More from Single-Cells: Current Studies of Microfluidic-Based Techniques for Single-Cell Analysis

    PubMed Central

    Lo, Shih-Jie; Yao, Da-Jeng

    2015-01-01

    This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidics (e.g., the small sample amount required and high-throughput performance) make this tool suitable for answering and solving biological questions of interest about a single cell. This review aims to introduce microfluidic-related techniques for the isolation, trapping, and manipulation of a single cell. The major approaches for detection in single-cell analysis are introduced, and the applications of single-cell analysis are then summarized. The review concludes with discussions of the future directions and opportunities of microfluidic systems applied in the analysis of a single cell. PMID:26213918

  17. Components of spatial information management in wildlife ecology: Software for statistical and modeling analysis [Chapter 14

    Treesearch

    Hawthorne L. Beyer; Jeff Jenness; Samuel A. Cushman

    2010-01-01

    Spatial information systems (SIS) is a term that describes a wide diversity of concepts, techniques, and technologies related to the capture, management, display and analysis of spatial information. It encompasses technologies such as geographic information systems (GIS), global positioning systems (GPS), remote sensing, and relational database management systems (...

  18. Prerequisites for Systems Analysts: Analytic and Management Demands of a New Approach to Educational Administration.

    ERIC Educational Resources Information Center

    Ammentorp, William

    There is much to be gained by using systems analysis in educational administration. Most administrators, presently relying on classical statistical techniques restricted to problems having few variables, should be trained to use more sophisticated tools such as systems analysis. The systems analyst, interested in the basic processes of a group or…

  19. Noncontact techniques for diesel engine diagnostics using exhaust waveform analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gore, D.A.; Cooke, G.J.

    1987-01-01

    RCA Corporation's continuing efforts to develop noncontact test techniques for diesel engines have led to recent advancements in deep engine diagnostics. The U.S. Army Tank-Automotive Command (TACOM) has been working with RCA on the development of new noncontact sensors and test techniques which use these sensors in conjunction with their family of Simplified Test Equipment (STE) to perform vehicle diagnostics. The STE systems are microprocessor-based maintenance tools that assist the Army mechanic in diagnosing malfunctions in both tactical and combat vehicles. The test systems support the mechanic by providing the sophisticated signal processing capabilities necessary for a wide range of diagnostic testing including exhaust waveform analysis.

  20. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    NASA Astrophysics Data System (ADS)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of a spectral amplitude coding-optical code division multiple access (SAC-OCDMA) system. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase induced intensity noise (PIIN). The performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytical and simulation analysis, by referring to the bit error rate (BER), signal-to-noise ratio (SNR), and eye patterns at the receiving end. It is shown that the EMD code with the SDD technique provides high transmission capacity, reduces receiver complexity, and provides better performance than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both up- and downlink transmission.
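    A Gaussian approximation commonly used in SAC-OCDMA analyses relates BER to SNR as BER = (1/2)erfc(sqrt(SNR/8)). A quick numeric check of this general relation (not the paper's full PIIN noise model):

```python
import math

# Gaussian-approximation BER for intensity-modulated direct detection,
# as commonly used in SAC-OCDMA performance analysis.

def ber_from_snr(snr_linear):
    return 0.5 * math.erfc(math.sqrt(snr_linear / 8.0))

# A minimum acceptable BER of 10^-9 corresponds to an SNR of roughly 144
# (about 21.6 dB) under this approximation.
ber = ber_from_snr(144.0)
```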

  1. A special protection scheme utilizing trajectory sensitivity analysis in power transmission

    NASA Astrophysics Data System (ADS)

    Suriyamongkol, Dan

    In recent years, new measurement techniques have provided opportunities to improve the observability, control, and protection of the North American power system. This dissertation discusses the formulation and design of a special protection scheme based on a novel application of trajectory sensitivity techniques, with inputs consisting of system state variables and parameters. Trajectory sensitivity analysis (TSA) has been used in previous publications as a method for power system security and stability assessment, and its mathematical formulation lends itself well to some of the time-domain power system simulation techniques. Existing special protection schemes often have limited sets of goals and control actions. The proposed scheme aims to maintain stability while using as many control actions as possible. It applies TSA in a novel way, using the sensitivities of system state variables with respect to state parameter variations to determine the parameter controls required to achieve the desired state variable movements. The initial application will operate under the assumption that the modeled power system has full system observability, and practical considerations will be discussed.
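As a rough illustration of the idea (not the dissertation's formulation), trajectory sensitivities can be approximated by finite differences: perturb a parameter, re-simulate the system, and difference the trajectories. The oscillator model, parameter, and step sizes below are hypothetical stand-ins for a power system model.

```python
def simulate(damping, x0=1.0, v0=0.0, dt=0.001, steps=5000):
    """Forward-Euler trajectory of a damped oscillator x'' = -x - damping * x'."""
    x, v, traj = x0, v0, []
    for _ in range(steps):
        x, v = x + dt * v, v + dt * (-x - damping * v)
        traj.append(x)
    return traj

def trajectory_sensitivity(damping, eps=1e-6):
    """Approximate dx(t)/d(damping) along the whole trajectory by
    central finite differences on the parameter."""
    up, down = simulate(damping + eps), simulate(damping - eps)
    return [(a - b) / (2.0 * eps) for a, b in zip(up, down)]

sens = trajectory_sensitivity(0.2)  # sensitivity grows as damping acts over time
```

In a TSA-based protection scheme, sensitivities like these indicate which parameter controls move the state variables most effectively toward a stable trajectory.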

  2. Construction of dynamic stochastic simulation models using knowledge-based techniques

    NASA Technical Reports Server (NTRS)

    Williams, M. Douglas; Shiva, Sajjan G.

    1990-01-01

    Over the past three decades, computer-based simulation models have proven themselves to be cost-effective alternatives to the more structured deterministic methods of systems analysis. During this time, many techniques, tools and languages for constructing computer-based simulation models have been developed. More recently, advances in knowledge-based system technology have led many researchers to note the similarities between knowledge-based programming and simulation technologies and to investigate the potential application of knowledge-based programming techniques to simulation modeling. The integration of conventional simulation techniques with knowledge-based programming techniques is discussed to provide a development environment for constructing knowledge-based simulation models. A comparison of the techniques used in the construction of dynamic stochastic simulation models and those used in the construction of knowledge-based systems provides the requirements for the environment. This leads to the design and implementation of a knowledge-based simulation development environment. These techniques were used in the construction of several knowledge-based simulation models including the Advanced Launch System Model (ALSYM).

  3. Using pattern recognition as a method for predicting extreme events in natural and socio-economic systems

    NASA Astrophysics Data System (ADS)

    Intriligator, M.

    2011-12-01

    Vladimir (Volodya) Keilis-Borok has pioneered the use of pattern recognition as a technique for analyzing and forecasting developments in natural as well as socio-economic systems. Keilis-Borok's work on predicting earthquakes and landslides using this technique as a leading geophysicist has been recognized around the world. Keilis-Borok has also been a world leader in the application of pattern recognition techniques to the analysis and prediction of socio-economic systems. He worked with Allan Lichtman of American University in using such techniques to predict presidential elections in the U.S. Keilis-Borok and I have worked together with others on the use of pattern recognition techniques to analyze and to predict socio-economic systems. We have used this technique to study the pattern of macroeconomic indicators that would predict the end of an economic recession in the U.S. We have also worked with officers in the Los Angeles Police Department to use this technique to predict surges of homicides in Los Angeles.

  4. Computer-aided-engineering system for modeling and analysis of ECLSS integration testing

    NASA Technical Reports Server (NTRS)

    Sepahban, Sonbol

    1987-01-01

    The accurate modeling and analysis of two-phase fluid networks found in environmental control and life support systems is presently undertaken by computer-aided engineering (CAE) techniques whose generalized fluid dynamics package can solve arbitrary flow networks. The CAE system for integrated test bed modeling and analysis will also furnish interfaces and subsystem/test-article mathematical models. Three-dimensional diagrams of the test bed are generated by the system after performing the requisite simulation and analysis.

  5. STS-1 environmental control and life support system. Consumables and thermal analysis

    NASA Technical Reports Server (NTRS)

    Steines, G.

    1980-01-01

    The Environmental Control and Life Support Systems (ECLSS)/thermal systems analysis for the Space Transportation System 1 Flight (STS-1) was performed using the shuttle environmental consumables usage requirements evaluation (SECURE) computer program. This program employs a nodal technique utilizing the Fortran Environmental Analysis Routines (FEAR). The output parameters evaluated were consumable quantities, fluid temperatures, heat transfer and rejection, and cabin atmospheric pressure. Analysis of these indicated that adequate margins exist for the nonpropulsive consumables and related thermal environment.

  6. Reliability analysis of a phasor measurement unit using a generalized fuzzy lambda-tau (GFLT) technique.

    PubMed

    Komal

    2018-05-01

    Power consumption is increasing day by day. To meet the requirement for failure-free power, the planning and implementation of an effective and reliable power management system are essential. The phasor measurement unit (PMU) is one of the key devices in wide area measurement and control systems. The reliable performance of the PMU assures a failure-free power supply for any power system. The purpose of the present study is therefore to analyse the reliability of a PMU used for controllability and observability of power systems, utilizing available uncertain data. In this paper, a generalized fuzzy lambda-tau (GFLT) technique has been proposed for this purpose. In the GFLT, the uncertain failure and repair rates of system components are fuzzified using fuzzy numbers of different shapes, such as triangular, normal, Cauchy, sharp gamma, and trapezoidal. To select a suitable fuzzy number for quantifying data uncertainty, system experts' opinions have been considered. The GFLT technique applies fault trees, the lambda-tau method, data fuzzified using different membership functions, and alpha-cut based fuzzy arithmetic operations to compute several important reliability indices. Furthermore, a ranking of critical components of the system using the RAM-Index and a sensitivity analysis have also been performed. The developed technique may help to improve system performance significantly and can be applied to analyse the fuzzy reliability of other engineering systems. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
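To make the alpha-cut machinery concrete, here is a minimal sketch (with invented failure-rate numbers, not the paper's PMU data) of how triangular fuzzy failure rates combine through an OR gate in a lambda-tau style analysis: interval arithmetic is applied to each alpha-cut, and for an OR gate the component failure rates simply add.

```python
def alpha_cut(tfn, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tfn
    return (a + alpha * (m - a), b - alpha * (b - m))

def or_gate(cuts):
    """Lambda-tau OR gate: component failure rates add, so the interval
    endpoints add under standard fuzzy interval arithmetic."""
    return (sum(c[0] for c in cuts), sum(c[1] for c in cuts))

# Hypothetical fuzzy failure rates (per hour) for two components:
lam1 = (0.8e-4, 1.0e-4, 1.2e-4)
lam2 = (1.5e-4, 2.0e-4, 2.5e-4)
system_cut = or_gate([alpha_cut(lam1, 0.5), alpha_cut(lam2, 0.5)])
```

Sweeping alpha from 0 to 1 rebuilds the fuzzy membership function of the system failure rate; at alpha = 1 the interval collapses to the crisp sum of the modal values.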

  7. Study of photon correlation techniques for processing of laser velocimeter signals

    NASA Technical Reports Server (NTRS)

    Mayo, W. T., Jr.

    1977-01-01

    The objective was to provide the theory and a system design for a new type of photon-counting processor for low-level dual-scatter laser velocimeter (LV) signals, capable of both first-order measurements of mean flow and turbulence intensity and second-order time statistics: cross correlation, auto correlation, and related spectra. A general Poisson process model for low-level LV signals and noise, valid from the photon-resolved regime all the way to the limiting case of nonstationary Gaussian noise, was used. Computer simulation algorithms and higher-order statistical moment analysis of Poisson processes were derived and applied to the analysis of photon correlation techniques. A system design using a unique dual correlate-and-subtract frequency discriminator technique is postulated and analyzed. Expectation analysis indicates that the objective measurements are feasible.

  8. Net energy analysis: Powerful tool for selecting electric power options

    NASA Astrophysics Data System (ADS)

    Baron, S.

    A number of net energy analysis studies have been conducted in recent years for electric power production from coal, oil and uranium fuels; synthetic fuels from coal and oil shale; and heat and electric power from solar energy. This technique is an excellent indicator of investment costs, environmental impact and potential economic competitiveness of alternative electric power systems for energy planners from the Eastern European countries considering future options. Energy conservation is also important to energy planners and the net energy analysis technique is an excellent accounting system on the extent of energy resource conservation. The author proposes to discuss the technique and to present the results of his studies and others in the field. The information supplied to the attendees will serve as a powerful tool to the energy planners considering their electric power options in the future.

  9. Artificial intelligence in sports biomechanics: new dawn or false hope?

    PubMed

    Bartlett, Roger

    2006-12-15

    This article reviews developments in the use of Artificial Intelligence (AI) in sports biomechanics over the last decade. It outlines possible uses of Expert Systems as diagnostic tools for evaluating faults in sports movements ('techniques') and presents some example knowledge rules for such an expert system. It then compares the analysis of sports techniques, in which Expert Systems have found little place to date, with gait analysis, in which they are routinely used. Consideration is then given to the use of Artificial Neural Networks (ANNs) in sports biomechanics, focusing on Kohonen self-organizing maps, which have been the most widely used in technique analysis, and multi-layer networks, which have been far more widely used in biomechanics in general. Examples of the use of ANNs in sports biomechanics are presented for javelin and discus throwing, shot putting and football kicking. I also present an example of the use of Evolutionary Computation in movement optimization in the soccer throw-in, which predicted an optimal technique close to that in the coaching literature. After briefly overviewing the use of AI in both sports science and biomechanics in general, the article concludes with some speculations about future uses of AI in sports biomechanics. Key points: Expert Systems remain almost unused in sports biomechanics, unlike in the similar discipline of gait analysis. Artificial Neural Networks, particularly Kohonen maps, have been used, although their full value remains unclear. Other AI applications, including Evolutionary Computation, have received little attention.

  10. A linear circuit analysis program with stiff systems capability

    NASA Technical Reports Server (NTRS)

    Cook, C. H.; Bavuso, S. J.

    1973-01-01

    Several existing network analysis programs have been modified and combined to employ a variable topological approach to circuit translation. Efficient numerical integration techniques are used for transient analysis.

  11. Presentation on systems cluster research

    NASA Technical Reports Server (NTRS)

    Morgenthaler, George W.

    1989-01-01

    This viewgraph presentation presents an overview of systems cluster research performed by the Center for Space Construction. The goals of the research are to develop concepts, insights, and models for space construction and to develop systems engineering/analysis curricula for training future aerospace engineers. The following topics are covered: CSC systems analysis/systems engineering (SIMCON) model, CSC systems cluster schedule, system life-cycle, model optimization techniques, publications, cooperative efforts, and sponsored research.

  12. A survey of visualization systems for network security.

    PubMed

    Shiravi, Hadi; Shiravi, Ali; Ghorbani, Ali A

    2012-08-01

    Security visualization is a very young term. It expresses the idea that common visualization techniques were designed for use cases that do not accommodate security-related data, demanding novel techniques fine-tuned for thorough analysis. A significant amount of work has been published in this area, but little work has been done to study this emerging visualization discipline. We offer a comprehensive review of network security visualization and provide a taxonomy in the form of five use-case classes encompassing nearly all recent work in this area. We outline the incorporated visualization techniques and data sources and provide an informative table to display our findings. From the analysis of these systems, we examine issues and concerns regarding network security visualization and provide guidelines and directions for future researchers and visual system developers.

  13. Modeling and Hazard Analysis Using STPA

    NASA Astrophysics Data System (ADS)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as that of FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis: functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process, in a safety-driven design process where hazard analysis drives the design decisions, rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique.
Because the HTV was originally developed using fault tree analysis and following the NASA standards for safety-critical systems, the results of our experimental application of STPA can be compared with these more traditional safety engineering approaches in terms of the problems identified and the resources required to use it.

  14. Verus: A Tool for Quantitative Analysis of Finite-State Real-Time Systems.

    DTIC Science & Technology

    1996-08-12

    Symbolic model checking is a technique for verifying finite-state concurrent systems that has been extended to handle real-time systems. Models with up to 10(exp 30) states can often be verified in minutes. In this paper, we present a new tool to analyze real-time systems, based on this technique. We have designed a language, called Verus, for the description of real-time systems. Such a description is compiled into a state-transition graph and...

  15. Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.

    PubMed

    Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam

    2018-01-01

    During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, support explicit communication of progress and uncertainty with annotation, and implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote the use of more effective handoff strategies, help increase an awareness of prior investigative process and insights, as well as improve final investigative outcomes.

  16. Non-contact ulcer area calculation system for neuropathic foot ulcer.

    PubMed

    Shah, Parth; Mahajan, Siddaram; Nageswaran, Sharmila; Paul, Sathish Kumar; Ebenzer, Mannam

    2017-08-11

    Around 125,785 new cases of leprosy were detected in India in 2013-14, as per the WHO report on leprosy of September 2015, accounting for approximately 62% of the total new cases. An anaesthetic foot caused by leprosy leads to uneven loading of the foot, which leads to ulcers in approximately 20% of cases. Much effort has gone into identifying newer techniques to efficiently monitor the progress of ulcer healing. Current techniques for measuring the size of ulcers have not been found to be very accurate but are still followed by clinicians across the globe. Quantification of the prognosis of the condition is required to understand the efficacy of current treatment methods and to plan further treatment. This study aims at developing a non-contact technique to precisely measure the size of ulcers in patients affected by leprosy. Using MATLAB software, a GUI was designed to process the acquired ulcer image by segmenting it and calculating the pixel area of the region. The pixel area was then converted to a standard measurement using a reference object. The developed technique was tested on 16 ulcer images acquired from 10 leprosy patients with plantar ulcers. Statistical analysis was done using MedCalc analysis software to find the reliability of the system. The analysis showed a very high correlation coefficient (r=0.9882) between ulcer area measurements made using the traditional technique and the newly developed technique, and the reliability of the newly developed technique was significant at the 99.9% level. The designed non-contact ulcer area calculation system using MATLAB is found to be a reliable system for calculating the size of ulcers. The technique would give clinicians a reliable tool to monitor the progress of ulcer healing and help modify the treatment protocol if needed. Copyright © 2017 European Foot and Ankle Society. Published by Elsevier Ltd. All rights reserved.
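The paper's tool is MATLAB-based; purely as an illustration of the underlying idea (a segmented region's pixel count scaled by a reference object of known physical area in the same image), here is a Python sketch with hypothetical pixel counts:

```python
def area_cm2(ulcer_pixels, ref_pixels, ref_area_cm2):
    """Convert a segmented region's pixel count to cm^2 using a reference
    object of known physical area captured in the same image plane."""
    return ulcer_pixels * (ref_area_cm2 / ref_pixels)

# Hypothetical counts: a 1600-pixel ulcer region, a 100-pixel 1 cm^2 marker.
print(area_cm2(1600, 100, 1.0))  # 16.0
```

The accuracy of this scaling depends on the reference object lying in the same plane as the wound, which is why such systems typically place a marker sticker adjacent to the ulcer.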

  17. Analysis of filter tuning techniques for sequential orbit determination

    NASA Technical Reports Server (NTRS)

    Lee, T.; Yee, C.; Oza, D.

    1995-01-01

    This paper examines filter tuning techniques for a sequential orbit determination (OD) covariance analysis. Recently, there has been a renewed interest in sequential OD, primarily due to the successful flight qualification of the Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) using Doppler data extracted onboard the Extreme Ultraviolet Explorer (EUVE) spacecraft. TONS computes highly accurate orbit solutions onboard the spacecraft in real time using a sequential filter. As a result of the successful TONS-EUVE flight qualification experiment, the Earth Observing System (EOS) AM-1 Project has selected TONS as the prime navigation system. In addition, sequential OD methods can be used successfully for ground OD. Whether data are processed onboard or on the ground, a sequential OD procedure is generally favored over a batch technique when a real-time automated OD system is desired. Recently, OD covariance analyses were performed for the TONS-EUVE and TONS-EOS missions using the sequential processing options of the Orbit Determination Error Analysis System (ODEAS). ODEAS is the primary covariance analysis system used by the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD). The results of these analyses revealed a high sensitivity of the OD solutions to the state process noise filter tuning parameters. The covariance analysis results show that the state estimate error contributions from measurement-related error sources, especially those due to the random noise and satellite-to-satellite ionospheric refraction correction errors, increase rapidly as the state process noise increases. These results prompted an in-depth investigation of the role of the filter tuning parameters in sequential OD covariance analysis. This paper analyzes how the spacecraft state estimate errors due to dynamic and measurement-related error sources are affected by the process noise level used.
This information is then used to establish guidelines for determining optimal filter tuning parameters in a given sequential OD scenario for both covariance analysis and actual OD. Comparisons are also made with corresponding definitive OD results available from the TONS-EUVE analysis.

  18. Web-Based Trainer for Electrical Circuit Analysis

    ERIC Educational Resources Information Center

    Weyten, L.; Rombouts, P.; De Maeyer, J.

    2009-01-01

    A Web-based system for training electric circuit analysis is presented in this paper. It is centered on symbolic analysis techniques and it not only verifies the student's final answer, but it also tracks and coaches him/her through all steps of his/her reasoning path. The system mimics homework assignments, enhanced by immediate personalized…

  19. Comparing the reliability of a trigonometric technique to goniometry and inclinometry in measuring ankle dorsiflexion.

    PubMed

    Sidaway, Ben; Euloth, Tracey; Caron, Heather; Piskura, Matthew; Clancy, Jessica; Aide, Alyson

    2012-07-01

    The purpose of this study was to compare the reliability of three previously used techniques for measuring ankle dorsiflexion ROM (open-chained goniometry, closed-chained goniometry, and inclinometry) to that of a novel trigonometric technique. Twenty-one physiotherapy students used the four techniques to assess dorsiflexion range of motion in 24 healthy volunteers. All student raters underwent training to establish competence in the four techniques. Raters then measured dorsiflexion with a randomly assigned measuring technique four times over two sessions, one week apart. Data were analyzed using a technique-by-session analysis of variance, with technique measurement variability as the primary index of reliability. Comparisons were also made between the measurements derived from the four techniques and those obtained from a computerized video analysis system. Analysis of the rater measurement variability around the technique means revealed significant differences between techniques, with the least variation found in the trigonometric technique. Significant differences were also found between the technique means, but no differences between sessions were evident. The trigonometric technique produced mean ROMs closest in value to those derived from computer analysis. Application of the trigonometric technique resulted in the least variability in measurement across raters and consequently should be considered when changes in dorsiflexion ROM need to be reliably assessed. Copyright © 2012 Elsevier B.V. All rights reserved.
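The appeal of a trigonometric technique is that the angle is computed from two linear distance measurements rather than read from a protractor scale. As a generic illustration only (the specific landmarks and distances used in the study are not reproduced here), an angle can be recovered from a vertical and a horizontal distance:

```python
import math

def angle_from_distances_deg(vertical_cm, horizontal_cm):
    """Recover a joint angle from two linear measurements, the core idea of a
    trigonometric ROM technique: theta = atan(vertical / horizontal).
    The landmark choice and distances here are hypothetical."""
    return math.degrees(math.atan2(vertical_cm, horizontal_cm))

print(angle_from_distances_deg(5.0, 20.0))  # ~14.04 degrees
```

Because tape-measured distances can be read more repeatably than goniometer alignments, the derived angle tends to vary less across raters, consistent with the study's findings.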

  20. Applications of data compression techniques in modal analysis for on-orbit system identification

    NASA Technical Reports Server (NTRS)

    Carlin, Robert A.; Saggio, Frank; Garcia, Ephrahim

    1992-01-01

    Data compression techniques have been investigated for use with modal analysis applications. A redundancy-reduction algorithm was used to compress frequency response functions (FRFs) in order to reduce the amount of disk space necessary to store the data and/or save time in processing it. Tests were performed for both single- and multiple-degree-of-freedom (SDOF and MDOF, respectively) systems, with varying amounts of noise. Analysis was done on both the compressed and uncompressed FRFs using an SDOF Nyquist curve fit as well as the Eigensystem Realization Algorithm. Significant savings were realized with minimal errors incurred by the compression process.
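The specific redundancy-reduction algorithm used in the study is not detailed in the abstract; a minimal sketch of the general idea, assuming a simple keep-on-change rule, is to store a frequency response sample only when it differs from the last stored sample by more than a tolerance:

```python
def redundancy_reduce(samples, tol):
    """Keep a sample (as an (index, value) pair) only when it differs from the
    last kept sample by more than tol -- a simple redundancy-reduction scheme
    for compressing slowly varying data such as FRF magnitudes."""
    kept = [(0, samples[0])]
    for i, s in enumerate(samples[1:], start=1):
        if abs(s - kept[-1][1]) > tol:
            kept.append((i, s))
    return kept

flat = [1.0, 1.001, 1.002, 1.5, 1.501, 2.0]
print(redundancy_reduce(flat, 0.01))  # [(0, 1.0), (3, 1.5), (5, 2.0)]
```

Flat regions between modal peaks compress heavily under such a rule, while the rapidly varying samples near resonances, which carry the modal information, are all retained, which is consistent with the small curve-fit errors the study reports.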

  1. Blade loss transient dynamics analysis, volume 1. Task 1: Survey and perspective. [aircraft gas turbine engines

    NASA Technical Reports Server (NTRS)

    Gallardo, V. C.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.

    1981-01-01

    An analytical technique was developed to predict the behavior of a rotor system subjected to sudden unbalance. The technique is implemented in the Turbine Engine Transient Rotor Analysis (TETRA) computer program using the component element method. The analysis was particularly aimed toward blade-loss phenomena in gas turbine engines. A dual-rotor, casing, and pylon structure can be modeled by the computer program. Blade tip rubs, Coriolis forces, and mechanical clearances are included. The analytical system was verified by modeling and simulating actual test conditions for a rig test as well as a full-engine, blade-release demonstration.

  2. Market basket analysis visualization on a spherical surface

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Hsu, Meichun; Dayal, Umeshwar; Wei, Shu F.; Sprenger, Thomas; Holenstein, Thomas

    2001-05-01

    This paper discusses the visualization of relationships in e-commerce transactions. To date, many practical research projects have shown the usefulness of a physics-based mass-spring technique for laying out data items with close relationships on a graph. We describe a market basket analysis visualization system using this technique. The system: (1) integrates a physics-based engine into a visual data mining platform; (2) uses a 3D spherical surface to visualize clusters of related data items; and (3) for large volumes of transactions, uses hidden structures to unclutter the display. Several examples of market basket analysis are also provided.

  3. Robust Stability Analysis of the Space Launch System Control Design: A Singular Value Approach

    NASA Technical Reports Server (NTRS)

    Pei, Jing; Newsome, Jerry R.

    2015-01-01

    Classical stability analysis consists of breaking the feedback loops one at a time and determining separately how much gain or phase variations would destabilize the stable nominal feedback system. For typical launch vehicle control design, classical control techniques are generally employed. In addition to stability margins, frequency domain Monte Carlo methods are used to evaluate the robustness of the design. However, such techniques were developed for Single-Input-Single-Output (SISO) systems and do not take into consideration the off-diagonal terms in the transfer function matrix of Multi-Input-Multi-Output (MIMO) systems. Robust stability analysis techniques such as H(sub infinity) and mu are applicable to MIMO systems but have not been adopted as standard practices within the launch vehicle controls community. This paper took advantage of a simple singular-value-based MIMO stability margin evaluation method based on work done by Mukhopadhyay and Newsom and applied it to the SLS high-fidelity dynamics model. The method computes a simultaneous multi-loop gain and phase margin that could be related back to classical margins. The results presented in this paper suggest that for the SLS system, traditional SISO stability margins are similar to the MIMO margins. This additional level of verification provides confidence in the robustness of the control design.
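To illustrate the singular-value margin idea with a hypothetical loop (not the SLS dynamics): compute the minimum singular value of I + L(jw) over a frequency grid, then map it to guaranteed simultaneous multi-loop margins using the standard relations GM within [1/(1+sigma), 1/(1-sigma)] and PM = ±2·asin(sigma/2).

```python
import numpy as np

def singular_value_margins(L, freqs):
    """Minimum singular value of I + L(jw) over a frequency grid, and the
    guaranteed simultaneous multi-loop gain/phase margins it implies."""
    n = L(1j * freqs[0]).shape[0]
    sigma = min(np.linalg.svd(np.eye(n) + L(1j * w), compute_uv=False)[-1]
                for w in freqs)
    gain_margin = (1.0 / (1.0 + sigma), 1.0 / (1.0 - sigma))
    phase_margin_deg = 2.0 * np.degrees(np.arcsin(sigma / 2.0))
    return sigma, gain_margin, phase_margin_deg

# Hypothetical decoupled 2x2 loop: two identical channels of 2 / (s(s+1)).
L = lambda s: np.diag([2.0 / (s * (s + 1.0))] * 2)
sigma, gm, pm = singular_value_margins(L, np.logspace(-1.0, 1.0, 2000))
```

For a diagonal loop like this one the singular values reduce to the per-channel distances |1 + g(jw)| from the critical point, so the MIMO margin coincides with the classical Nyquist-distance picture; the off-diagonal coupling in a real MIMO model is exactly what this method captures and classical SISO margins miss.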

  4. Characterization of Ultra-fine Grained and Nanocrystalline Materials Using Transmission Kikuchi Diffraction

    PubMed Central

    Proust, Gwénaëlle; Trimby, Patrick; Piazolo, Sandra; Retraint, Delphine

    2017-01-01

    One of the challenges in microstructure analysis nowadays resides in the reliable and accurate characterization of ultra-fine grained (UFG) and nanocrystalline materials. The traditional techniques associated with scanning electron microscopy (SEM), such as electron backscatter diffraction (EBSD), do not possess the required spatial resolution due to the large interaction volume between the electrons from the beam and the atoms of the material. Transmission electron microscopy (TEM) has the required spatial resolution. However, due to a lack of automation in the analysis system, the rate of data acquisition is slow, which limits the area of the specimen that can be characterized. This paper presents a new characterization technique, Transmission Kikuchi Diffraction (TKD), which enables the analysis of the microstructure of UFG and nanocrystalline materials using an SEM equipped with a standard EBSD system. The spatial resolution of this technique can reach 2 nm. This technique can be applied to a large range of materials that would be difficult to analyze using traditional EBSD. After presenting the experimental setup and describing the different steps necessary to perform a TKD analysis, examples of its use on metal alloys and minerals are shown to illustrate the resolution of the technique and its flexibility in terms of the materials that can be characterized. PMID:28447998

  6. The Coplane Analysis Technique for Three-Dimensional Wind Retrieval Using the HIWRAP Airborne Doppler Radar

    NASA Technical Reports Server (NTRS)

    Didlake, Anthony C., Jr.; Heymsfield, Gerald M.; Tian, Lin; Guimond, Stephen R.

    2015-01-01

    The coplane analysis technique for mapping the three-dimensional wind field of precipitating systems is applied to the NASA High Altitude Wind and Rain Airborne Profiler (HIWRAP). HIWRAP is a dual-frequency Doppler radar system with two downward pointing and conically scanning beams. The coplane technique interpolates radar measurements to a natural coordinate frame, directly solves for two wind components, and integrates the mass continuity equation to retrieve the unobserved third wind component. This technique is tested using a model simulation of a hurricane and compared to a global optimization retrieval. The coplane method produced lower errors for the cross-track and vertical wind components, while the global optimization method produced lower errors for the along-track wind component. Cross-track and vertical wind errors were dependent upon the accuracy of the estimated boundary condition winds near the surface and at nadir, which were derived by making certain assumptions about the vertical velocity field. The coplane technique was then applied successfully to HIWRAP observations of Hurricane Ingrid (2013). Unlike the global optimization method, the coplane analysis allows for a transparent connection between the radar observations and specific analysis results. With this ability, small-scale features can be analyzed more adequately and erroneous radar measurements can be identified more easily.
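As a loose illustration of the final retrieval step only (the divergence profile, grid spacing, and surface boundary condition below are invented, not HIWRAP data), the upward integration of incompressible mass continuity for the unobserved vertical wind component might look like:

```python
# Sketch of the unobserved-component step: given the horizontal
# divergence (du/dx + dv/dy) on vertical levels, integrate
# incompressible mass continuity dw/dz = -divergence upward from
# an assumed w = 0 boundary condition at the surface.
def integrate_w(divergence, dz, w_surface=0.0):
    """Trapezoidal upward integration of continuity for vertical wind (m/s)."""
    w = [w_surface]
    for k in range(1, len(divergence)):
        w.append(w[-1] - 0.5 * (divergence[k - 1] + divergence[k]) * dz)
    return w

div_profile = [2e-4, 1e-4, 0.0, -1e-4]   # s^-1, made-up profile
w = integrate_w(div_profile, dz=500.0)   # 500 m level spacing
print([round(x, 3) for x in w])
```

In practice the boundary condition itself must be estimated, which is exactly the error source the abstract identifies.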

  7. A program to form a multidisciplinary data base and analysis for dynamic systems

    NASA Technical Reports Server (NTRS)

    Taylor, L. W.; Suit, W. T.; Mayo, M. H.

    1984-01-01

    Diverse sets of experimental data and analysis programs have been assembled for the purpose of facilitating research in systems identification, parameter estimation and state estimation techniques. The data base analysis programs are organized to make it easy to compare alternative approaches. Additional data and alternative forms of analysis will be included as they become available.

  8. Cyclic Symmetry Finite Element Forced Response Analysis of a Distortion-Tolerant Fan with Boundary Layer Ingestion

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Reddy, T. S. R.; Bakhle, M. A.; Coroneos, R. M.; Stefko, G. L.; Provenza, A. J.; Duffy, K. P.

    2018-01-01

    Accurate prediction of blade vibration stress is required to determine the overall durability of a fan blade design under Boundary Layer Ingestion (BLI) distorted flow environments. The traditional single-blade modeling technique cannot accurately represent the entire rotor blade system, which is subject to complex dynamic loading behaviors and vibrations in distorted flow conditions. A particular objective of our work was to develop a high-fidelity full-rotor aeromechanics analysis capability for a system subjected to a distorted inlet flow by applying a cyclic symmetry finite element modeling methodology. This reduction method allows computationally efficient analysis using a small periodic section of the full rotor blade system. Experimental testing in the 8-foot by 6-foot Supersonic Wind Tunnel at NASA Glenn Research Center was also carried out for the system designated the Boundary Layer Ingesting Inlet/Distortion-Tolerant Fan (BLI2DTF) technology development. The results obtained from the present numerical modeling technique were compared with those of the wind tunnel test, toward establishing a computationally efficient aeromechanics analysis tool for full rotor blade systems subjected to distorted inlet flow conditions. Fairly good correlations were achieved, demonstrating the computational modeling technique. The analysis showed that the safety margin requirement set in the BLI2DTF fan blade design provided a sufficient margin with respect to the operating speed range.

  9. Fault Tree Analysis as a Planning and Management Tool: A Case Study

    ERIC Educational Resources Information Center

    Witkin, Belle Ruth

    1977-01-01

    Fault Tree Analysis is an operations research technique used to analyze the most probable modes of failure in a system, so that the system can be redesigned or monitored more closely to increase its likelihood of success. (Author)
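To make the idea concrete, here is a minimal sketch with an invented three-event tree and made-up failure probabilities, assuming independent basic events, of how a top-event probability is combined from AND and OR gates:

```python
from functools import reduce

# Hypothetical fault tree: the top event occurs if (A AND B) OR C.
# Assuming independent basic events, an AND gate multiplies
# probabilities and an OR gate combines them as 1 - prod(1 - p).
def and_gate(probs):
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def or_gate(probs):
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

p_A, p_B, p_C = 0.01, 0.02, 0.001        # made-up failure probabilities
p_top = or_gate([and_gate([p_A, p_B]), p_C])
print(round(p_top, 6))  # 0.0012
```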

  10. Differential die-away analysis system response modeling and detector design

    NASA Astrophysics Data System (ADS)

    Jordan, K. A.; Gozani, T.; Vujic, J.

    2008-05-01

    Differential die-away analysis (DDAA) is a sensitive technique for detecting the presence of fissile materials such as 235U and 239Pu. DDAA uses a high-energy (14 MeV) pulsed neutron generator to interrogate a shipping container. The signature is a fast neutron signal persisting hundreds of microseconds after the cessation of the neutron pulse; this signal has a decay time identical to the thermal neutron diffusion decay time of the inspected cargo. The theoretical aspects of a cargo inspection system based on the differential die-away technique are explored. A detailed mathematical model of the system is developed, and experimental results validating this model are presented.
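A hedged sketch of how the die-away decay time might be extracted from post-pulse fast-neutron counts (synthetic single-exponential data and a simple log-linear least-squares fit, not the authors' detailed model):

```python
import math

# Synthetic post-pulse counts following a single-exponential die-away;
# a least-squares line fit to log(counts) vs. time gives slope = -1/tau.
tau_true = 350e-6                                  # s, assumed decay time
times = [100e-6 * i for i in range(1, 11)]         # s after the pulse
counts = [5000.0 * math.exp(-t / tau_true) for t in times]

n = len(times)
xbar = sum(times) / n
ybar = sum(math.log(c) for c in counts) / n
slope = sum((t - xbar) * (math.log(c) - ybar) for t, c in zip(times, counts)) \
        / sum((t - xbar) ** 2 for t in times)
tau_est = -1.0 / slope
print(round(tau_est * 1e6, 1))  # estimated decay time in microseconds
```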

  11. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by the need for a study assessing the process improvement, quality management, and analytical techniques taught during academic training in undergraduate and graduate systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, together with process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
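An illustrative Monte Carlo sketch in the spirit of the simulation dashboard described above; the driver weights and input distributions are invented and are not the ACSI model's actual structural coefficients:

```python
import random
import statistics

# Propagate uncertainty in three satisfaction drivers through a
# weighted linear score to get a distribution of predicted index
# scores; weights and distributions are assumptions for illustration.
random.seed(42)

def simulate(n=10000):
    scores = []
    for _ in range(n):
        quality = random.gauss(80, 4)    # perceived quality, 0-100 scale
        expect = random.gauss(75, 5)     # customer expectations
        value = random.gauss(70, 6)      # perceived value
        scores.append(0.5 * quality + 0.2 * expect + 0.3 * value)
    return scores

scores = simulate()
print(round(statistics.mean(scores), 1), round(statistics.stdev(scores), 1))
```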

  12. Classifying Facial Actions

    PubMed Central

    Donato, Gianluca; Bartlett, Marian Stewart; Hager, Joseph C.; Ekman, Paul; Sejnowski, Terrence J.

    2010-01-01

    The Facial Action Coding System (FACS) [23] is an objective method for quantifying facial movement in terms of component actions. This system is widely used in behavioral investigations of emotion, cognitive processes, and social interaction. The coding is presently performed by highly trained human experts. This paper explores and compares techniques for automatically recognizing facial actions in sequences of images. These techniques include analysis of facial motion through estimation of optical flow; holistic spatial analysis, such as principal component analysis, independent component analysis, local feature analysis, and linear discriminant analysis; and methods based on the outputs of local filters, such as Gabor wavelet representations and local principal components. Performance of these systems is compared to naive and expert human subjects. Best performances were obtained using the Gabor wavelet representation and the independent component representation, both of which achieved 96 percent accuracy for classifying 12 facial actions of the upper and lower face. The results provide converging evidence for the importance of using local filters, high spatial frequencies, and statistical independence for classifying facial actions. PMID:21188284
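The simplest member of the holistic family described above can be sketched as nearest-template classification under cosine similarity; the tiny vectors and action labels below are made-up stand-ins, not FACS data or the paper's learned representations:

```python
import math

# Nearest-template classification under cosine similarity; the
# 3-element "image vectors" and action labels are invented.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

templates = {
    "brow raise": [0.9, 0.1, 0.2],
    "lip corner pull": [0.1, 0.8, 0.3],
}

def classify(vec):
    return max(templates, key=lambda name: cosine(vec, templates[name]))

print(classify([0.8, 0.2, 0.1]))  # brow raise
```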

  13. Laser induced breakdown spectroscopy (LIBS) as a rapid tool for material analysis

    NASA Astrophysics Data System (ADS)

    Hussain, T.; Gondal, M. A.

    2013-06-01

    Laser induced breakdown spectroscopy (LIBS) is a novel technique for elemental analysis based on laser-generated plasma. In this technique, laser pulses ablate the sample, vaporizing and ionizing it into a hot plasma that is then analyzed by a spectrometer. The elements are identified by their unique spectral signatures. A LIBS system was developed for elemental analysis of solid and liquid samples and applied to qualitative as well as quantitative measurement of the elemental concentrations present in iron slag and open pit ore samples. The plasma was generated by focusing a pulsed Nd:YAG laser at 1064 nm on test samples to study the capabilities of LIBS as a rapid tool for material analysis. The concentrations of various elements of environmental significance, such as cadmium, calcium, magnesium, chromium, manganese, titanium, barium, phosphorus, copper, iron, and zinc, in these samples were determined. Optimal experimental conditions for improving the sensitivity of the developed LIBS system were evaluated through a parametric dependence study. The LIBS results were compared with those obtained using a standard analytical technique, inductively coupled plasma emission spectroscopy (ICP). Limits of detection (LOD) of the LIBS system were also estimated for the above-mentioned elements. This study demonstrates that LIBS could be highly appropriate for rapid online analysis of iron slag and open pit waste.
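As a hedged sketch of one step mentioned above: limits of detection from a linear calibration curve are commonly estimated as three times the blank's standard deviation divided by the calibration slope. The standards, intensities, and blank noise below are invented:

```python
# Linear calibration fit (least squares through invented standards),
# then LOD = 3 * sigma_blank / slope; sigma_blank is an assumed
# standard deviation of the blank signal.
conc = [0.0, 10.0, 20.0, 40.0]            # ppm reference standards
intensity = [5.0, 105.0, 205.0, 405.0]    # background-corrected intensity

n = len(conc)
xbar, ybar = sum(conc) / n, sum(intensity) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(conc, intensity)) \
        / sum((x - xbar) ** 2 for x in conc)
sigma_blank = 1.5
lod = 3.0 * sigma_blank / slope
print(round(slope, 2), round(lod, 3))  # 10.0 0.45
```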

  14. Earth Observatory Satellite system definition study. Report no. 7: EOS system definition report. Appendixes A through D

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An analysis of the systems involved in the operation and support of the Earth Observatory Satellite (EOS) is presented. Among the systems considered are the following: (1) the data management system, (2) observatory to primary ground station communications links, (3) local user system, (4) techniques for recognizing ground control points, (5) the central data processing-implementation concept, and (6) program effectiveness analysis.

  15. Kernel analysis in TeV gamma-ray selection

    NASA Astrophysics Data System (ADS)

    Moriarty, P.; Samuelson, F. W.

    2000-06-01

    We discuss the use of kernel analysis as a technique for selecting gamma-ray candidates in Atmospheric Cherenkov astronomy. The method is applied to observations of the Crab Nebula and Markarian 501 recorded with the Whipple 10 m Atmospheric Cherenkov imaging system, and the results are compared with the standard Supercuts analysis. Since kernel analysis is computationally intensive, we examine approaches to reducing the computational load. Extension of the technique to estimate the energy of the gamma-ray primary is considered.
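A minimal sketch of kernel (Parzen-window) selection on a single image parameter, with made-up training values rather than Whipple data: score an event by comparing Gaussian-kernel density estimates built from labeled gamma-ray and hadron samples.

```python
import math

# Parzen-window (Gaussian kernel) density estimate over one image
# parameter; an event is kept as a gamma-ray candidate when the
# gamma-class density exceeds the hadron-class density.
def kde(x, samples, h):
    return sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples) \
           / (len(samples) * h * math.sqrt(2.0 * math.pi))

gamma_train = [0.20, 0.22, 0.25, 0.27, 0.30]   # made-up image widths
hadron_train = [0.35, 0.40, 0.45, 0.50, 0.55]

def is_gamma_like(x, h=0.05):
    return kde(x, gamma_train, h) > kde(x, hadron_train, h)

print(is_gamma_like(0.24), is_gamma_like(0.48))  # True False
```

The cost the abstract mentions comes from evaluating every training sample per event, which is why reducing the computational load matters.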

  16. Artificial intelligence within the chemical laboratory.

    PubMed

    Winkel, P

    1994-01-01

    Various techniques within the area of artificial intelligence, such as expert systems and neural networks, may play a role in the problem-solving processes of the clinical biochemical laboratory. Neural network analysis provides a non-algorithmic approach to information processing, which gives the computer the ability to form associations and to recognize patterns or classes among data. It belongs to the machine learning techniques, which also include probabilistic techniques such as discriminant function analysis and logistic regression, as well as information-theoretic techniques. These techniques may be used to extract knowledge from example patients to optimize decision limits and identify clinically important laboratory quantities. An expert system may be defined as a computer program that can give advice in a well-defined area of expertise and is able to explain its reasoning. Declarative knowledge consists of statements about logical or empirical relationships between things. Expert systems typically separate declarative knowledge, residing in a knowledge base, from the inference engine: an algorithm that dynamically directs and controls the system when it searches its knowledge base. A tool is an expert system without a knowledge base; the developer of an expert system uses a tool by entering knowledge into the system. Many, if not the majority, of problems encountered at the laboratory level are procedural. A problem is procedural if it is possible to write up a step-by-step description of the expert's work or if it can be represented by a decision tree. To solve problems of this type, only small expert system tools and/or conventional programming are required. (ABSTRACT TRUNCATED AT 250 WORDS)

  17. Random safety auditing, root cause analysis, failure mode and effects analysis.

    PubMed

    Ursprung, Robert; Gray, James

    2010-03-01

    Improving quality and safety in health care is a major concern for health care providers, the general public, and policy makers. Errors and quality issues are leading causes of morbidity and mortality across the health care industry. There is evidence that patients in the neonatal intensive care unit (NICU) are at high risk for serious medical errors. To facilitate compliance with safe practices, many institutions have established quality-assurance monitoring procedures. Three techniques that have been found useful in the health care setting are failure mode and effects analysis, root cause analysis, and random safety auditing. When used together, these techniques are effective tools for system analysis and redesign focused on providing safe delivery of care in the complex NICU system. Copyright 2010 Elsevier Inc. All rights reserved.
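A failure mode and effects analysis can be sketched as ranking failure modes by risk priority number (RPN = severity × occurrence × detection); the NICU-flavored failure modes and scores below are purely illustrative, not from the article:

```python
# Rank hypothetical failure modes by risk priority number,
# RPN = severity * occurrence * detection, each scored 1-10.
failure_modes = [
    # (failure mode, severity, occurrence, detection) -- invented scores
    ("wrong medication dose", 9, 3, 4),
    ("alarm not heard", 7, 5, 2),
    ("mislabeled specimen", 6, 2, 6),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, sev, occ, det in ranked:
    print(f"{name}: RPN={sev * occ * det}")
```

The highest-RPN modes are the candidates for redesign, while root cause analysis and random safety auditing address failures after and during operation, respectively.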

  18. Cooperative analysis expert situation assessment research

    NASA Technical Reports Server (NTRS)

    Mccown, Michael G.

    1987-01-01

    For the past few decades, Rome Air Development Center (RADC) has been conducting research in Artificial Intelligence (AI). When the recent advances in hardware technology made many AI techniques practical, the Intelligence and Reconnaissance Directorate of RADC initiated an applications program entitled Knowledge Based Intelligence Systems (KBIS). The goal of the program is the development of a generic Intelligent Analyst System, an open machine with the framework for intelligence analysis, natural language processing, and man-machine interface techniques, needing only the specific problem domain knowledge to be operationally useful. The development of KBIS is described.

  19. The workload book: Assessment of operator workload to engineering systems

    NASA Technical Reports Server (NTRS)

    Gopher, D.

    1983-01-01

    The structure and initial work performed toward the creation of a handbook for workload analysis directed at the operational community of engineers and human factors psychologists are described. The goal, when complete, will be to make accessible to such individuals the results of theoretically-based research that are of practical interest and utility in the analysis and prediction of operator workload in advanced and existing systems. In addition, the results of laboratory study focused on the development of a subjective rating technique for workload that is based on psychophysical scaling techniques are described.

  20. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  1. Techniques for optimal crop selection in a controlled ecological life support system

    NASA Technical Reports Server (NTRS)

    Mccormack, Ann; Finn, Cory; Dunsky, Betsy

    1993-01-01

    A Controlled Ecological Life Support System (CELSS) utilizes a plant's natural ability to regenerate air and water while being grown as a food source in a closed life support system. Current plant research is directed toward obtaining quantitative empirical data on the regenerative ability of each species of plant and the system volume and power requirements. Two techniques were adapted to optimize crop species selection while at the same time minimizing the system volume and power requirements. Each allows the level of life support supplied by the plants to be selected, as well as other system parameters. The first technique uses decision analysis in the form of a spreadsheet. The second method, which is used as a comparison with and validation of the first, utilizes standard design optimization techniques. Simple models of plant processes are used in the development of these methods.
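The first, spreadsheet-style decision-analysis technique can be sketched as weighted scoring of candidate crops; the criteria weights and ratings below are invented for illustration, not CELSS data:

```python
# Weighted-scoring decision analysis: each crop gets a score that is
# the weighted sum of criteria ratings (higher is better; volume and
# power ratings are already inverted so a small requirement rates high).
criteria_weights = {"O2 production": 0.4, "edible biomass": 0.3,
                    "volume": 0.2, "power": 0.1}
crops = {
    "wheat":   {"O2 production": 8, "edible biomass": 7, "volume": 6, "power": 5},
    "lettuce": {"O2 production": 5, "edible biomass": 4, "volume": 9, "power": 8},
    "potato":  {"O2 production": 7, "edible biomass": 9, "volume": 5, "power": 6},
}

def score(crop):
    return sum(w * crops[crop][c] for c, w in criteria_weights.items())

best = max(crops, key=score)
print(best, round(score(best), 2))  # potato 7.1
```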

  2. Techniques for optimal crop selection in a controlled ecological life support system

    NASA Technical Reports Server (NTRS)

    Mccormack, Ann; Finn, Cory; Dunsky, Betsy

    1992-01-01

    A Controlled Ecological Life Support System (CELSS) utilizes a plant's natural ability to regenerate air and water while being grown as a food source in a closed life support system. Current plant research is directed toward obtaining quantitative empirical data on the regenerative ability of each species of plant and the system volume and power requirements. Two techniques were adapted to optimize crop species selection while at the same time minimizing the system volume and power requirements. Each allows the level of life support supplied by the plants to be selected, as well as other system parameters. The first technique uses decision analysis in the form of a spreadsheet. The second method, which is used as a comparison with and validation of the first, utilizes standard design optimization techniques. Simple models of plant processes are used in the development of these methods.

  3. Symbolic Execution Enhanced System Testing

    NASA Technical Reports Server (NTRS)

    Davies, Misty D.; Pasareanu, Corina S.; Raman, Vishwanath

    2012-01-01

    We describe a testing technique that uses information computed by symbolic execution of a program unit to guide the generation of inputs to the system containing the unit, in such a way that the unit's, and hence the system's, coverage is increased. The symbolic execution computes unit constraints at run-time, along program paths obtained by system simulations. We use machine learning techniques (treatment learning and function fitting) to approximate the system input constraints that will lead to the satisfaction of the unit constraints. Execution of system input predictions either uncovers new code regions in the unit under analysis or provides information that can be used to improve the approximation. We have implemented the technique and demonstrated its effectiveness on several examples, including one from the aerospace domain.

  4. CPM and PERT in Library Management.

    ERIC Educational Resources Information Center

    Main, Linda

    1989-01-01

    Discusses two techniques of systems analysis--Critical Path Method (CPM) and Program Evaluation Review Techniques (PERT)--and their place in library management. An overview of CPM and PERT charting procedures is provided. (11 references) (Author/MES)

  5. A Study on Predictive Analytics Application to Ship Machinery Maintenance

    DTIC Science & Technology

    2013-09-01

    Looking at the nature of the time series forecasting method, it would be better applied to offline analysis. The application for real-time online...other system attributes in future. Two techniques of statistical analysis, mainly time series models and cumulative sum control charts, are discussed in...statistical tool employed for the two techniques of statistical analysis. Both time series forecasting as well as CUSUM control charts are shown to be
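One of the two techniques named above, the cumulative sum control chart, can be sketched as follows (the target, slack value k, decision threshold h, and machinery readings are illustrative choices, not from the report):

```python
# One-sided upper CUSUM: accumulate deviations above (target + k)
# and raise an alarm when the statistic exceeds threshold h.
def cusum_upper(xs, target, k, h):
    s, alarms = 0.0, []
    for i, x in enumerate(xs):
        s = max(0.0, s + (x - target - k))
        if s > h:
            alarms.append(i)
    return alarms

readings = [10.1, 9.9, 10.0, 10.2, 11.0, 11.3, 11.2, 11.4]  # drift at the end
print(cusum_upper(readings, target=10.0, k=0.25, h=1.5))  # [5, 6, 7]
```

The chart stays quiet through ordinary noise and flags the sustained upward drift, which is what makes CUSUM attractive for condition monitoring.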

  6. Image processing and analysis using neural networks for optometry area

    NASA Astrophysics Data System (ADS)

    Netto, Antonio V.; Ferreira de Oliveira, Maria C.

    2002-11-01

    In this work we describe the framework of a functional system for processing and analyzing images of the human eye acquired by the Hartmann-Shack (HS) technique, in order to extract information for formulating a diagnosis of eye refractive errors (astigmatism, hypermetropia and myopia). The analysis is to be carried out using an Artificial Intelligence system based on Neural Nets, Fuzzy Logic and Classifier Combination. The major goal is to establish the basis of a new technology to effectively measure ocular refractive errors, based on methods alternative to those adopted in current patented systems. Moreover, analysis of images acquired with the Hartmann-Shack technique may enable the extraction of additional information on the health of an eye under exam from the same image used to detect refraction errors.

  7. Application of pattern recognition techniques to crime analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  8. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of the paper focuses on the modeling method developed and utilized during this analysis effort.

  9. NEW APPROACHES IN RISK ANALYSIS OF ENVIRONMENTAL STRESSORS TO HUMAN AND ECOLOGICAL SYSTEMS

    EPA Science Inventory

    We explore the application of novel techniques for improving and integrating risk analysis of environmental stressors to human and ecological systems. Environmental protection decisions are guided by risk assessments serving as tools to develop regulatory policy and other relate...

  10. The principles of the Brazilian Unified Health System, studied based on similitude analysis

    PubMed Central

    de Pontes, Ana Paula Munhen; de Oliveira, Denize Cristina; Gomes, Antonio Marcos Tosoli

    2014-01-01

    Objectives: to analyze and compare the incorporation of the ethical-doctrinal and organizational principles into the social representations of the Unified Health System (SUS) among health professionals. Method: a study grounded in Social Representations Theory, undertaken with 125 subjects in eight health institutions in Rio de Janeiro. The free word association technique was applied to the induction term "SUS", and the words evoked were analyzed using the Vergès matrix and similitude analysis techniques. Results: the professionals' social representations vary depending on their level of education, and those with higher education represent a subgroup responsible for the process of representational change identified. This result was confirmed through similitude analysis. Conclusion: a process of representational change is ongoing, in which the professionals have incorporated the principles of the SUS into their symbolic constructions. Similitude analysis was shown to be a fruitful technique for research in nursing. PMID:24553704

  11. Space shuttle/food system. Volume 2, Appendix C: Food cooling techniques analysis. Appendix D: Package and stowage: Alternate concepts analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The relative penalties associated with various techniques for providing an onboard cold environment for storage of perishable food items, and for the development of packaging and vehicle stowage parameters were investigated in terms of the overall food system design analysis of space shuttle. The degrees of capability for maintaining both a 40 F to 45 F refrigerated temperature and a 0 F and 20 F frozen environment were assessed for the following cooling techniques: (1) phase change (heat sink) concept; (2) thermoelectric concept; (3) vapor cycle concept; and (4) expendable ammonia concept. The parameters considered in the analysis were weight, volume, and spacecraft power restrictions. Data were also produced for packaging and vehicle stowage parameters which are compatible with vehicle weight and volume specifications. Certain assumptions were made for food packaging sizes based on previously generated space shuttle menus. The results of the study are shown, along with the range of meal choices considered.

  12. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2011-12-01

    The effect of induced seismicity in geothermal systems during stimulation and fluid circulation can cover a wide range, from light and unfelt to severe and damaging. If a modern geothermal system is to achieve the greatest efficiency while remaining acceptable from the social point of view, it must be possible to manage the system so that its impact is reduced in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the concept of the traffic light. This system provides a tool to decide the level of stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground motion values. However, in some cases the induced effect can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground motion levels over different time scales can help to better control the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate and the propagation medium properties are not constant with time. We use a non-homogeneous seismicity model, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving time window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. For the same set of magnitude values recorded at the same stations, we expect that on average peak ground motion values attenuate in the same way; consequently, the residual differences can reasonably be ascribed to changes in medium properties. These changes can be modeled and directly introduced into the hazard integral.
We applied the proposed technique to a training dataset of induced earthquakes recorded by the Berkeley-Geysers network, installed in The Geysers geothermal area in Northern California. The reliability of the technique is then tested on a different dataset by performing seismic hazard analysis in a time-evolving approach, which provides ground-motion values with fixed probabilities of exceedance. Those values can finally be compared with observations by using appropriate statistical tests.
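A hedged sketch of the hazard computation's final step under a time-varying rate (the piecewise-constant rates and window length are invented): the Poisson probability of at least one exceedance of a ground-motion level is 1 − exp(−∫λ dt).

```python
import math

# With a piecewise-constant exceedance rate lambda(t) for ground motion
# above some level, the Poisson probability of at least one exceedance
# over the whole window is P = 1 - exp(-sum(lambda_i * dt)).
def prob_exceedance(rates_per_day, dt_days):
    return 1.0 - math.exp(-sum(r * dt_days for r in rates_per_day))

# Rate rising across four one-week windows during stimulation (made up):
weekly_rates = [0.001, 0.002, 0.004, 0.008]
p = prob_exceedance(weekly_rates, dt_days=7.0)
print(round(p, 4))
```

Allowing the rate (and, through the attenuation model, the medium properties) to change per window is what distinguishes this from the classical homogeneous-Poisson hazard integral.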

  13. An analytical technique for predicting the characteristics of a flexible wing equipped with an active flutter-suppression system and comparison with wind-tunnel data

    NASA Technical Reports Server (NTRS)

    Abel, I.

    1979-01-01

    An analytical technique for predicting the performance of an active flutter-suppression system is presented. This technique is based on the use of an interpolating function to approximate the unsteady aerodynamics. The resulting equations are formulated in terms of linear, ordinary differential equations with constant coefficients. This technique is then applied to an aeroelastic model wing equipped with an active flutter-suppression system. Comparisons between wind-tunnel data and analysis are presented for the wing both with and without active flutter suppression. Results indicate that the wing flutter characteristics without flutter suppression can be predicted very well but that a more adequate model of wind-tunnel turbulence is required when the active flutter-suppression system is used.

  14. Flexible data-management system

    NASA Technical Reports Server (NTRS)

    Pelouch, J. J., Jr.

    1977-01-01

    The Combined ASRDI Data-Management and Analysis Technique (CADMAT) is a system of computer programs and procedures that can be used to conduct data-management tasks. The system was developed specifically for scientists and engineers who are confronted with the management and analysis of large quantities of data organized into records of events and parametric fields. CADMAT is particularly useful when data are continually accumulated and the need for retrieval and analysis is ongoing.

  15. Managing Variation in Services in a Software Product Line Context

    DTIC Science & Technology

    2010-05-01

    Oriented Domain Analysis ( FODA ) Feasibility Study (CMU/SEI-90-TR-021, ADA235785). Software Engineering Institute, Carnegie Mellon University, 1990...the systems in the product line, and a plan for building the systems. Product line scope and product line analysis define the boundaries and...systems, as well as expected ways in which they may vary. Product line analysis applies established modeling techniques to engineer the common and

  16. Interfacial and Thin Film Chemistry in Electron Device Fabrication

    DTIC Science & Technology

    1990-11-20

    mounted on our UHV surface analysis system, and J. Forster and D. V. Podlesnik of the IBM General Technology Division. The following discussion...IBM (East Fishkill). The system being built consists of three chambers: one UHV analysis chamber, one UHV reaction chamber and one high pressure...system will be equipped with several surface analysis and cleaning techniques. Fourier transform infrared spectrometry will be used to do attenuated

  17. Data analysis of the COMPTEL instrument on the NASA gamma ray observatory

    NASA Technical Reports Server (NTRS)

    Diehl, R.; Bennett, K.; Collmar, W.; Connors, A.; Denherder, J. W.; Hermsen, W.; Lichti, G. G.; Lockwood, J. A.; Macri, J.; Mcconnell, M.

    1992-01-01

    The Compton imaging telescope (COMPTEL) on the Gamma Ray Observatory (GRO) is a wide field of view instrument. The coincidence measurement technique in two scintillation detector layers requires specific analysis methods. Straightforward event projection into the sky is impossible. Therefore, detector events are analyzed in a multi-dimensional dataspace using a gamma ray sky hypothesis convolved with the point spread function of the instrument in this dataspace. Background suppression and analysis techniques have important implications on the gamma ray source results for this background limited telescope. The COMPTEL collaboration applies a software system of analysis utilities, organized around a database management system. The use of this system for the assistance of guest investigators at the various collaboration sites and external sites is foreseen and allows different detail levels of cooperation with the COMPTEL institutes, dependent on the type of data to be studied.

  18. Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.

    PubMed

    Stein, Manuel; Janetzko, Halldor; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlucke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A

    2018-01-01

Analysts in professional team sport regularly perform analysis to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis regularly include identification of weaknesses of opposing teams, or assessing performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Analysts can also rely on techniques from Information Visualization to depict, e.g., player or ball trajectories. However, video analysis is typically a time-consuming process in which the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is no longer directly linked to the observed movement context. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of the underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event, and player analysis in the case of soccer. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.

  19. Performance analysis of ‘Perturb and Observe’ and ‘Incremental Conductance’ MPPT algorithms for PV system

    NASA Astrophysics Data System (ADS)

    Lodhi, Ehtisham; Lodhi, Zeeshan; Noman Shafqat, Rana; Chen, Fieda

    2017-07-01

Photovoltaic (PV) systems usually employ maximum power point tracking (MPPT) techniques to increase their efficiency. The performance of a PV system can be boosted by operating it at its peak power point, so that maximal power is delivered to the load. The efficiency of a PV system depends on irradiance, temperature, and array architecture. A PV array exhibits a non-linear V-I curve, and the maximum power point on the V-P curve varies with changing environmental conditions. MPPT methods ensure that a PV module is regulated at the reference voltage and makes full use of the maximum output power. This paper compares the two most widely employed MPPT techniques, Perturb and Observe (P&O) and Incremental Conductance (INC). Their performance is evaluated and compared through theoretical analysis and digital simulation in Matlab/Simulink, on the basis of response time and efficiency under varying irradiance and temperature conditions.
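    The P&O algorithm compared in this record is a simple hill-climber: perturb the operating voltage and keep moving in whichever direction increases power. A minimal sketch (with a hypothetical P-V curve, not the paper's Simulink model):

```python
def perturb_and_observe(measure, v_step=0.1, steps=100, v_start=10.0):
    """Minimal P&O loop: `measure(v)` is assumed to return PV power at
    operating voltage v; the perturbation direction reverses whenever
    the observed power drops."""
    v, direction = v_start, 1.0
    p_prev = measure(v)
    for _ in range(steps):
        v += direction * v_step
        p = measure(v)
        if p < p_prev:          # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

# Hypothetical P-V curve with its maximum power point at 17 V.
pv_power = lambda v: max(0.0, 100.0 - (v - 17.0) ** 2)
v_mpp = perturb_and_observe(pv_power)
print(round(v_mpp, 1))  # → 17.0
```

    Note the characteristic P&O behavior: once the peak is reached, the operating point oscillates around it by one perturbation step, which is the steady-state loss that INC aims to reduce.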

  20. Classification of damage in structural systems using time series analysis and supervised and unsupervised pattern recognition techniques

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; de Lautour, Oliver R.

    2010-04-01

Developed for studying long, periodic records of various measured quantities, time series analysis methods are inherently suited and offer interesting possibilities for Structural Health Monitoring (SHM) applications. However, their use in SHM can still be regarded as an emerging application and deserves more studies. In this research, Autoregressive (AR) models were used to fit experimental acceleration time histories from two experimental structural systems, a 3-storey bookshelf-type laboratory structure and the ASCE Phase II SHM Benchmark Structure, in healthy and several damaged states. The coefficients of the AR models were chosen as damage sensitive features. Preliminary visual inspection of the large, multidimensional sets of AR coefficients to check the presence of clusters corresponding to different damage severities was achieved using Sammon mapping - an efficient nonlinear data compression technique. Systematic classification of damage into states based on the analysis of the AR coefficients was achieved using two supervised classification techniques: Nearest Neighbor Classification (NNC) and Learning Vector Quantization (LVQ), and one unsupervised technique: Self-organizing Maps (SOM). This paper discusses the performance of AR coefficients as damage sensitive features and compares the efficiency of the three classification techniques using experimental data.
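    The AR-coefficient feature plus nearest-neighbor classification described above can be sketched on synthetic data (illustrative only; the signals below are not the benchmark structure's records, and "damage" is simulated as a shift in one AR coefficient):

```python
import numpy as np

rng = np.random.default_rng(0)

def ar_coefficients(x, order=4):
    """Least-squares AR(order) fit; the coefficient vector serves as the
    damage-sensitive feature."""
    X = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
    coeffs, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
    return coeffs

def nearest_neighbor_label(feature, train_features, train_labels):
    """1-NN classification (NNC) in AR-coefficient space."""
    d = np.linalg.norm(train_features - feature, axis=1)
    return train_labels[int(np.argmin(d))]

def simulate(a1, n=2000):
    """Synthetic AR(2) acceleration record; 'damage' shifts coefficient a1."""
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = a1 * x[t - 1] - 0.5 * x[t - 2] + rng.standard_normal()
    return x

train = np.array([ar_coefficients(simulate(a)) for a in (1.2, 1.2, 0.8, 0.8)])
labels = ["healthy", "healthy", "damaged", "damaged"]
label = nearest_neighbor_label(ar_coefficients(simulate(0.8)), train, labels)
print(label)  # → damaged
```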

  1. Fault Tree Analysis: Its Implications for Use in Education.

    ERIC Educational Resources Information Center

    Barker, Bruce O.

This study introduces the concept of Fault Tree Analysis as a systems tool and examines the implications of Fault Tree Analysis (FTA) as a technique for isolating failure modes in educational systems. A definition of FTA and discussion of its history, as it relates to education, are provided. The step-by-step process for implementation and use of…
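    The quantitative core of FTA is computing the top-event probability from AND/OR gates over basic events. A minimal sketch, assuming independent basic events and a hypothetical tree (not from this study):

```python
def fault_tree(gate):
    """Evaluate the top-event probability of a fault tree given as nested
    tuples ('AND'|'OR', children...), with leaves being basic-event
    probabilities; assumes independent basic events."""
    if isinstance(gate, float):
        return gate
    op, *children = gate
    probs = [fault_tree(c) for c in children]
    p = 1.0
    if op == "AND":
        for q in probs:
            p *= q              # all children must fail
        return p
    for q in probs:
        p *= 1.0 - q            # OR: 1 - P(no child fails)
    return 1.0 - p

# Hypothetical tree: top event requires a primary failure AND at least
# one of two backup-path failures.
tree = ("AND", 0.01, ("OR", 0.1, 0.05))
print(round(fault_tree(tree), 5))  # → 0.00145
```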

  2. THE EDUCATIONAL INSTITUTION AS A SYSTEM--A PROPOSED GENERALIZED PROCEDURE FOR ANALYSIS.

    ERIC Educational Resources Information Center

    REISMAN, ARNOLD; TAFT, MARTIN I.

A unified approach to the analysis and synthesis of the functions and operations in educational institutions is presented. Systems analysis techniques used in other areas, such as CRAFT, PERT, CERBS, and operations research, are suggested as potentially adaptable for use in higher education. The major objective of a school is to allocate available…

  3. Social Learning Network Analysis Model to Identify Learning Patterns Using Ontology Clustering Techniques and Meaningful Learning

    ERIC Educational Resources Information Center

    Firdausiah Mansur, Andi Besse; Yusof, Norazah

    2013-01-01

Clustering on Social Learning Networks is still not widely explored, especially when the network focuses on an e-learning system. Conventional methods are not well suited to e-learning data. SNA requires content analysis, which involves human intervention and must be carried out manually. Some of the previous clustering techniques need…

  4. The combined use of order tracking techniques for enhanced Fourier analysis of order components

    NASA Astrophysics Data System (ADS)

    Wang, K. S.; Heyns, P. S.

    2011-04-01

    Order tracking is one of the most important vibration analysis techniques for diagnosing faults in rotating machinery. It can be performed in many different ways, each of these with distinct advantages and disadvantages. However, in the end the analyst will often use Fourier analysis to transform the data from a time series to frequency or order spectra. It is therefore surprising that the study of the Fourier analysis of order-tracked systems seems to have been largely ignored in the literature. This paper considers the frequently used Vold-Kalman filter-based order tracking and computed order tracking techniques. The main pros and cons of each technique for Fourier analysis are discussed and the sequential use of Vold-Kalman filtering and computed order tracking is proposed as a novel idea to enhance the results of Fourier analysis for determining the order components. The advantages of the combined use of these order tracking techniques are demonstrated numerically on an SDOF rotor simulation model. Finally, the approach is also demonstrated on experimental data from a real rotating machine.
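    Computed order tracking, one of the two techniques combined in this paper, resamples the time-domain vibration signal at uniform shaft-angle increments so that a subsequent FFT yields an order (rather than frequency) spectrum. A minimal sketch on a synthetic run-up (illustrative; not the paper's SDOF rotor model):

```python
import numpy as np

def computed_order_tracking(t, x, shaft_angle, samples_per_rev=32):
    """Resample x(t) at uniform shaft-angle increments; an FFT of the
    result then yields an order spectrum."""
    n_rev = int(shaft_angle[-1] / (2 * np.pi))
    theta_uniform = np.arange(n_rev * samples_per_rev) * (2 * np.pi / samples_per_rev)
    t_uniform = np.interp(theta_uniform, shaft_angle, t)  # invert theta(t)
    return np.interp(t_uniform, t, x)

# Illustrative run-up: shaft speed ramps linearly; response locked to order 2.
t = np.linspace(0, 10, 20000)
theta = 2 * np.pi * (5 * t + 0.5 * t**2)        # shaft angle in radians
x = np.sin(2 * theta)                           # pure order-2 vibration
xr = computed_order_tracking(t, x, theta)
spectrum = np.abs(np.fft.rfft(xr))
orders = np.fft.rfftfreq(len(xr), d=1 / 32)     # axis in shaft orders
dominant = orders[int(np.argmax(spectrum))]
print(round(float(dominant), 1))                # → 2.0
```

    In the time-domain spectrum this chirp would smear across frequencies; in the angle domain it collapses to a single line at order 2, which is the point of order tracking before Fourier analysis.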

  5. Survey of aircraft electrical power systems

    NASA Technical Reports Server (NTRS)

    Lee, C. H.; Brandner, J. J.

    1972-01-01

    Areas investigated include: (1) load analysis; (2) power distribution, conversion techniques and generation; (3) design criteria and performance capabilities of hydraulic and pneumatic systems; (4) system control and protection methods; (5) component and heat transfer systems cooling; and (6) electrical system reliability.

  6. Component Composition for Embedded Systems Using Semantic Aspect-Oriented Programming

    DTIC Science & Technology

    2004-10-01

real-time systems for the defense community. Our research focused on Real-Time Java implementation and analysis techniques. Real-Time Java is important for the defense community because it holds out the promise of enabling developers to apply COTS Java technology to specialized military embedded systems. It also promises to allow the defense community to utilize a large Java-literate workforce for building defense systems. Our research has delivered several techniques that may make Real-Time Java a better platform for developing embedded

  7. Systems for animal exposure in full-scale fire tests

    NASA Technical Reports Server (NTRS)

    Hilado, C. J.; Cumming, H. J.; Kourtides, D. A.; Parker, J. A.

    1977-01-01

    Two systems for exposing animals in full-scale fire tests are described. Both systems involve the simultaneous exposure of two animal species, mice and rats, in modular units; determination of mortality, morbidity, and behavioral response; and analysis of the blood for carboxyhemoglobin. The systems described represent two of many possible options for obtaining bioassay data from full-scale fire tests. In situations where the temperatures to which the test animals are exposed can not be controlled, analytical techniques may be more appropriate than bioassay techniques.

  8. Automatic Target Recognition Classification System Evaluation Methodology

    DTIC Science & Technology

    2002-09-01

Testing Set of Two-Class XOR Data (250 Samples)......................................... 2-59 2.27 Decision Analysis Process Flow Chart...ROC curve meta-analysis, which is the estimation of the true ROC curve of a given diagnostic system through ROC analysis across many studies or...technique can be very effective in sensitivity analysis; trying to determine which data points have the most effect on the solution, and in

  9. Design Criteria For Networked Image Analysis System

    NASA Astrophysics Data System (ADS)

    Reader, Cliff; Nitteberg, Alan

    1982-01-01

Image systems design is currently undergoing a metamorphosis from the conventional computing systems of the past into a new generation of special purpose designs. This change is motivated by several factors, notably among which is the increased opportunity for high performance with low cost offered by advances in semiconductor technology. Another key issue is a maturing in understanding of problems and the applicability of digital processing techniques. These factors allow the design of cost-effective systems that are functionally dedicated to specific applications and used in a utilitarian fashion. Following an overview of the above stated issues, the paper presents a top-down approach to the design of networked image analysis systems. The requirements for such a system are presented, with orientation toward the hospital environment. The three main areas are image data base management, viewing of image data and image data processing. This is followed by a survey of the current state of the art, covering image display systems, data base techniques, communications networks and software systems control. The paper concludes with a description of the functional subsystems and architectural framework for networked image analysis in a production environment.

  10. A Fault Tree Approach to Analysis of Behavioral Systems: An Overview.

    ERIC Educational Resources Information Center

    Stephens, Kent G.

    Developed at Brigham Young University, Fault Tree Analysis (FTA) is a technique for enhancing the probability of success in any system by analyzing the most likely modes of failure that could occur. It provides a logical, step-by-step description of possible failure events within a system and their interaction--the combinations of potential…

  11. SNAP/SHOT Your Ability to Support That Next Application.

    ERIC Educational Resources Information Center

    Jones, Ernest L.

    SNAP/SHOT (System Network Analysis Program-Simulated Host Overview Technique) is a discrete simulation of a network and/or host model available through IBM at the Raleigh System Center. The simulator provides an analysis of a total IBM Communications System. Input data must be obtained from RMF, SMF, and the CICS Analyzer to determine the existing…

  12. Study of fault tolerant software technology for dynamic systems

    NASA Technical Reports Server (NTRS)

    Caglayan, A. K.; Zacharias, G. L.

    1985-01-01

The major aim of this study is to investigate the feasibility of using systems-based failure detection, isolation, and compensation (FDIC) techniques in building fault-tolerant software and extending them, whenever possible, to the domain of software fault tolerance. First, it is shown that systems-based FDIC methods can be extended to develop software error detection techniques by using system models for software modules. In particular, it is demonstrated that systems-based FDIC techniques can yield consistency checks that are easier to implement than acceptance tests based on software specifications. Next, it is shown that systems-based failure compensation techniques can be generalized to the domain of software fault tolerance in developing software error recovery procedures. Finally, the feasibility of using fault-tolerant software in flight software is investigated: possible system and version instabilities and functional performance degradation that may occur in N-Version programming applications to flight software are illustrated, and a comparative analysis of N-Version and recovery block techniques in the context of generic blocks in flight software is presented.
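    The N-Version programming scheme discussed in this record runs independently developed versions of a module and votes on their results. A minimal sketch (the versions and the tolerance-based agreement check are hypothetical, not the study's flight-software implementation):

```python
def n_version_vote(versions, inputs, agree=lambda a, b: a == b):
    """Run each version on the same input and return a result that a
    strict majority of versions agrees with; raising on no majority
    models the hand-off to a recovery block."""
    results = [v(inputs) for v in versions]
    for candidate in results:
        if sum(agree(candidate, r) for r in results) * 2 > len(results):
            return candidate
    raise RuntimeError("no majority: fall back to recovery block")

# Three hypothetical versions of a vector-norm computation; one is faulty.
v1 = lambda xs: sum(x * x for x in xs) ** 0.5
v2 = lambda xs: max(abs(x) for x in xs) * sum(
    (x / max(abs(x) for x in xs)) ** 2 for x in xs) ** 0.5
v3 = lambda xs: sum(abs(x) for x in xs)          # faulty: computes the 1-norm
result = n_version_vote([v1, v2, v3], [3.0, 4.0],
                        agree=lambda a, b: abs(a - b) < 1e-9)
print(result)  # → 5.0
```

    The tolerance-based `agree` predicate matters in practice: correct versions of numerical software rarely agree bit-for-bit, which is one source of the version instabilities the study illustrates.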

  13. A knowledge based system for scientific data visualization

    NASA Technical Reports Server (NTRS)

    Senay, Hikmet; Ignatius, Eve

    1992-01-01

A knowledge-based system, called visualization tool assistant (VISTA), which was developed to assist scientists in the design of scientific data visualization techniques, is described. The system derives its knowledge from several sources which provide information about data characteristics, visualization primitives, and effective visual perception. The design methodology employed by the system is based on a sequence of transformations which decomposes a data set into a set of data partitions, maps this set of partitions to visualization primitives, and combines these primitives into a composite visualization technique design. Although the primary function of the system is to generate an effective visualization technique design for a given data set by using principles of visual perception, the system also allows users to interactively modify the design, and renders the resulting image using a variety of rendering algorithms. The current version of the system primarily supports visualization techniques having applicability in earth and space sciences, although it may easily be extended to include other techniques useful in other disciplines such as computational fluid dynamics, finite-element analysis and medical imaging.

  14. Improving the limits of detection of low background alpha emission measurements

    NASA Astrophysics Data System (ADS)

    McNally, Brendan D.; Coleman, Stuart; Harris, Jack T.; Warburton, William K.

    2018-01-01

    Alpha particle emission - even at extremely low levels - is a significant issue in the search for rare events (e.g., double beta decay, dark matter detection). Traditional measurement techniques require long counting times to measure low sample rates in the presence of much larger instrumental backgrounds. To address this, a commercially available instrument developed by XIA uses pulse shape analysis to discriminate alpha emissions produced by the sample from those produced by other surfaces of the instrument itself. Experience with this system has uncovered two residual sources of background: cosmogenics and radon emanation from internal components. An R&D program is underway to enhance the system and extend the pulse shape analysis technique further, so that these residual sources can be identified and rejected as well. In this paper, we review the theory of operation and pulse shape analysis techniques used in XIA's alpha counter, and briefly explore data suggesting the origin of the residual background terms. We will then present our approach to enhance the system's ability to identify and reject these terms. Finally, we will describe a prototype system that incorporates our concepts and demonstrates their feasibility.
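    One common pulse-shape feature for discriminating sample-surface alphas from events originating elsewhere is the fractional-amplitude rise time. A minimal sketch on synthetic pulses (illustrative only, not XIA's proprietary discrimination algorithm):

```python
import numpy as np

def rise_time(pulse, lo=0.1, hi=0.9):
    """10%-90% rise time in samples: the index gap between the first
    crossings of the low and high fractional-amplitude thresholds."""
    peak = pulse.max()
    t_lo = np.argmax(pulse >= lo * peak)   # first sample above 10% of peak
    t_hi = np.argmax(pulse >= hi * peak)   # first sample above 90% of peak
    return t_hi - t_lo

# Two synthetic pulses: a fast 'sample' pulse and a slower background one.
t = np.arange(200)
fast = 1 - np.exp(-t / 5.0)
slow = 1 - np.exp(-t / 25.0)
print(rise_time(fast), rise_time(slow))  # → 11 55
```

    A threshold on this feature then sorts events into sample and background populations; the R&D program described above extends this idea to identify the residual cosmogenic and radon-emanation terms as well.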

  15. Simulation of Neural Firing Dynamics: A Student Project.

    ERIC Educational Resources Information Center

    Kletsky, E. J.

    This paper describes a student project in digital simulation techniques that is part of a graduate systems analysis course entitled Biosimulation. The students chose different simulation techniques to solve a problem related to the neuron model. (MLH)

  16. Multidimensional Processing and Visual Rendering of Complex 3D Biomedical Images

    NASA Technical Reports Server (NTRS)

    Sams, Clarence F.

    2016-01-01

    The proposed technology uses advanced image analysis techniques to maximize the resolution and utility of medical imaging methods being used during spaceflight. We utilize COTS technology for medical imaging, but our applications require higher resolution assessment of the medical images than is routinely applied with nominal system software. By leveraging advanced data reduction and multidimensional imaging techniques utilized in analysis of Planetary Sciences and Cell Biology imaging, it is possible to significantly increase the information extracted from the onboard biomedical imaging systems. Year 1 focused on application of these techniques to the ocular images collected on ground test subjects and ISS crewmembers. Focus was on the choroidal vasculature and the structure of the optic disc. Methods allowed for increased resolution and quantitation of structural changes enabling detailed assessment of progression over time. These techniques enhance the monitoring and evaluation of crew vision issues during space flight.

  17. WE-G-BRA-07: Analyzing the Safety Implications of a Brachytherapy Process Improvement Project Utilizing a Novel System-Theory-Based Hazard-Analysis Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, A; Samost, A; Viswanathan, A

Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefit of STPA to the field of radiation oncology is demonstrated. Methods: We analyzed the tandem and ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiating treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning) and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. These results suggest that STPA can be successfully used to analyze safety in brachytherapy and may prove to be an alternative to other hazard-analysis techniques.

  18. A novel method for tracing the movement of multiple individual soil particles under rainfall conditions using fluorescent videography.

    NASA Astrophysics Data System (ADS)

    Hardy, Robert; Pates, Jackie; Quinton, John

    2016-04-01

The importance of developing new techniques to study soil movement, especially those that integrate new technology, cannot be overstated. Currently there are limited empirical data available about the movement of individual soil particles, particularly high-quality time-resolved data. Here we present a new technique which allows multiple individual soil particles to be traced in real time under simulated rainfall conditions. The technique utilises fluorescent videography in combination with a fluorescent soil tracer based on natural particles. The system has been successfully used on particles greater than ~130 micrometres in diameter. The technique uses HD video shot at 50 frames per second, providing extremely high temporal (0.02 s) and spatial (sub-millimetre) resolution of a particle's location without the need to perturb the system. Once the tracer has been filmed, the images are processed and analysed using a particle analysis and visualisation toolkit written in Python. The toolkit enables the creation of 2- and 3-D time-resolved graphs showing the location of one or more particles. Quantitative numerical analysis of a pathway (or collection of pathways) is also possible, allowing parameters such as particle speed and displacement to be assessed. Filming the particles removes the need to destructively sample material and has many side-benefits, reducing the time, money and effort expended in the collection, transport and laboratory analysis of soils, while delivering data in a digital form well suited to modern computer-driven analysis techniques. There are many potential applications for the technique. High-resolution empirical data on how soil particles move could be used to create, parameterise and evaluate soil movement models, particularly those that use the movement of individual particles. As data can be collected while rainfall is occurring, the technique may offer the ability to study systems under dynamic conditions (rather than rainfall of constant intensity), which are more realistic; this was one of the motivations behind its development.
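    The per-frame analysis in such a toolkit reduces to locating the bright tracer in each video frame. A minimal sketch using intensity thresholding and centroid computation on synthetic frames (illustrative; the authors' actual toolkit is not reproduced here):

```python
import numpy as np

def track_particle(frames, threshold=0.5):
    """Locate a single fluorescent tracer per frame: threshold the image
    and take the centroid of the bright pixels, giving an (x, y) path."""
    path = []
    for frame in frames:
        ys, xs = np.nonzero(frame > threshold)
        path.append((xs.mean(), ys.mean()))
    return path

def frame_at(cx, cy, size=32):
    """Synthetic frame: a Gaussian blob at (cx, cy)."""
    y, x = np.mgrid[0:size, 0:size]
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 4.0)

# A blob moving one pixel to the right per frame.
frames = [frame_at(5 + i, 10) for i in range(5)]
path = track_particle(frames)
print([round(p[0], 1) for p in path])  # → [5.0, 6.0, 7.0, 8.0, 9.0]
```

    From such a path, particle speed and displacement follow directly from frame-to-frame differences and the 0.02 s frame interval.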

  19. Knowledge Management for the Analysis of Complex Experimentation.

    ERIC Educational Resources Information Center

    Maule, R.; Schacher, G.; Gallup, S.

    2002-01-01

    Describes a knowledge management system that was developed to help provide structure for dynamic and static data and to aid in the analysis of complex experimentation. Topics include quantitative and qualitative data; mining operations using artificial intelligence techniques; information architecture of the system; and transforming data into…

  20. A computer-aided movement analysis system.

    PubMed

    Fioretti, S; Leo, T; Pisani, E; Corradini, M L

    1990-08-01

Interaction with biomechanical data concerning human movement analysis implies the adoption of various experimental equipment and the choice of suitable models, data processing, and graphical data restitution techniques. The integration of measurement setups with the associated experimental protocols and the relative software procedures constitutes a computer-aided movement analysis (CAMA) system. In the present paper such integration is mapped onto the causes that limit the clinical acceptance of movement analysis methods. The structure of the system is presented. A specific CAMA system devoted to posture analysis is described in order to show the attainable features. Scientific results obtained with the support of the described system are also reported.

  1. Comparison of existing digital image analysis systems for the analysis of Thematic Mapper data

    NASA Technical Reports Server (NTRS)

    Likens, W. C.; Wrigley, R. C.

    1984-01-01

    Most existing image analysis systems were designed with the Landsat Multi-Spectral Scanner in mind, leaving open the question of whether or not these systems could adequately process Thematic Mapper data. In this report, both hardware and software systems have been evaluated for compatibility with TM data. Lack of spectral analysis capability was not found to be a problem, though techniques for spatial filtering and texture varied. Computer processing speed and data storage of currently existing mini-computer based systems may be less than adequate. Upgrading to more powerful hardware may be required for many TM applications.

  2. Application of mass spectrometry to process control for polymer material in autoclave curing

    NASA Technical Reports Server (NTRS)

    Smith, A. C.

    1983-01-01

    Mass spectrometer analysis of gas samples collected during a cure cycle of polymer materials can be used as a process control technique. This technique is particularly helpful in studying the various types of solvents and resin systems used in the preparation of polymer materials and characterizing the chemical composition of different resin systems and their mechanism of polymerization.

  3. Recent trends in particle size analysis techniques

    NASA Technical Reports Server (NTRS)

    Kang, S. H.

    1984-01-01

Recent advances and developments in particle-sizing technologies are briefly reviewed according to three operating principles, including particle size and shape descriptions. Significant trends in recently developed particle size analysis equipment show that compact electronic circuitry and rapid data-processing systems have been widely adopted in instrument design. Some newly developed techniques for characterizing particulate systems are also introduced.

  4. GLO-STIX: Graph-Level Operations for Specifying Techniques and Interactive eXploration

    PubMed Central

    Stolper, Charles D.; Kahng, Minsuk; Lin, Zhiyuan; Foerster, Florian; Goel, Aakash; Stasko, John; Chau, Duen Horng

    2015-01-01

    The field of graph visualization has produced a wealth of visualization techniques for accomplishing a variety of analysis tasks. Therefore analysts often rely on a suite of different techniques, and visual graph analysis application builders strive to provide this breadth of techniques. To provide a holistic model for specifying network visualization techniques (as opposed to considering each technique in isolation) we present the Graph-Level Operations (GLO) model. We describe a method for identifying GLOs and apply it to identify five classes of GLOs, which can be flexibly combined to re-create six canonical graph visualization techniques. We discuss advantages of the GLO model, including potentially discovering new, effective network visualization techniques and easing the engineering challenges of building multi-technique graph visualization applications. Finally, we implement the GLOs that we identified into the GLO-STIX prototype system that enables an analyst to interactively explore a graph by applying GLOs. PMID:26005315

  5. Techniques Analysis of the Interference Suppression Algorithm in Broadband Aeronautical Multi-carrier Communication System

    NASA Astrophysics Data System (ADS)

    Li, Dong-xia; Ye, Qian-wen

Out-of-band radiation suppression algorithms must be employed for the broadband aeronautical communication system so as not to interfere with the operation of existing systems in the aviation L-band. Following a brief introduction of the broadband aeronautical multi-carrier communication (B-AMC) system model, several sidelobe suppression techniques for orthogonal frequency-division multiplexing (OFDM) systems are presented and analyzed in order to find a suitable algorithm for the B-AMC system. Simulation results show that raised-cosine windowing can effectively suppress the out-of-band radiation of the B-AMC system.
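    Raised-cosine windowing tapers the edges of each OFDM symbol so its spectrum falls off faster than the rectangular window's sinc sidelobes. A minimal single-subcarrier sketch (illustrative; real systems taper overlapping cyclic-prefix extensions, and the parameters below are not from the B-AMC design):

```python
import numpy as np

def raised_cosine_edges(symbol, roll=16):
    """Taper `roll` samples at each edge of an OFDM symbol with a
    raised-cosine profile to reduce out-of-band sidelobes."""
    w = np.ones(len(symbol))
    ramp = 0.5 * (1 - np.cos(np.pi * np.arange(roll) / roll))
    w[:roll] = ramp
    w[-roll:] = ramp[::-1]
    return symbol * w

n = 256
sym = np.exp(2j * np.pi * 32 * np.arange(n) / n)   # one active subcarrier

def sidelobe_db(x, pad=8):
    """Peak spectral magnitude (dB rel. carrier) far from the carrier."""
    s = np.abs(np.fft.fft(x, pad * n))
    s /= s.max()
    return 20 * np.log10(s[pad * 100 : pad * 120].max())

db_rect = sidelobe_db(sym)
db_win = sidelobe_db(raised_cosine_edges(sym))
print(db_win < db_rect)  # → True
```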

  6. Stability, performance and sensitivity analysis of I.I.D. jump linear systems

    NASA Astrophysics Data System (ADS)

    Chávez Fuentes, Jorge R.; González, Oscar R.; Gray, W. Steven

    2018-06-01

    This paper presents a symmetric Kronecker product analysis of independent and identically distributed jump linear systems to develop new, lower dimensional equations for the stability and performance analysis of this type of systems than what is currently available. In addition, new closed form expressions characterising multi-parameter relative sensitivity functions for performance metrics are introduced. The analysis technique is illustrated with a distributed fault-tolerant flight control example where the communication links are allowed to fail randomly.
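    The standard second-moment test behind such Kronecker-product analyses: an i.i.d. jump linear system x_{k+1} = A_{θ_k} x_k is mean-square stable iff the spectral radius of Σ_i p_i (A_i ⊗ A_i) is below one. A sketch with hypothetical mode matrices (the paper's lower-dimensional symmetric-Kronecker reduction is not reproduced here):

```python
import numpy as np

def mean_square_stable(modes, probs):
    """Mean-square stability test for an i.i.d. jump linear system:
    spectral radius of sum_i p_i (A_i kron A_i) must be < 1."""
    M = sum(p * np.kron(A, A) for A, p in zip(modes, probs))
    return bool(max(abs(np.linalg.eigvals(M))) < 1.0)

# Hypothetical modes: a stable nominal dynamic and an unstable failure mode.
A_ok = np.array([[0.5, 0.1],
                 [0.0, 0.4]])
A_fail = np.array([[1.6, 0.0],
                   [0.0, 1.5]])
s_rare = mean_square_stable([A_ok, A_fail], [0.9, 0.1])   # rare failures
s_often = mean_square_stable([A_ok, A_fail], [0.5, 0.5])  # frequent failures
print(s_rare, s_often)  # → True False
```

    This is the kind of result the paper's distributed flight-control example turns on: the same failure dynamics can be tolerable or destabilizing depending on how often the communication links fail.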

  7. An operational global-scale ocean thermal analysis system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clancy, R. M.; Pollak, K.D.; Phoebus, P.A.

    1990-04-01

    The Optimum Thermal Interpolation System (OTIS) is an ocean thermal analysis system designed for operational use at FNOC. It is based on the optimum interpolation data assimilation technique and functions in an analysis-prediction-analysis data assimilation cycle with the TOPS mixed-layer model. OTIS provides a rigorous framework for combining real-time data, climatology, and predictions from numerical ocean prediction models to produce a large-scale synoptic representation of ocean thermal structure. The techniques and assumptions used in OTIS are documented, and results of operational tests of the global-scale OTIS at FNOC are presented. The tests involved comparisons of OTIS against an existing operational ocean thermal structure model and were conducted during February, March, and April 1988. Qualitative comparison of the two products suggests that OTIS gives a more realistic representation of subsurface anomalies and horizontal gradients and that it also gives a more accurate analysis of the thermal structure, with improvements largest below the mixed layer. 37 refs.
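
    The optimum interpolation update at the core of such a system has the standard gain form x_a = x_b + B Hᵀ (H B Hᵀ + R)⁻¹ (y − H x_b). A toy 1-D sketch follows; the grid, covariances, and observations are invented for illustration and are not OTIS's actual configuration.

    ```python
    import numpy as np

    def oi_analysis(xb, B, H, R, y):
        """Optimum-interpolation update: blend the background field xb with
        observations y, weighted by the background (B) and observation (R)
        error covariances."""
        K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
        return xb + K @ (y - H @ xb)

    n = 10
    grid = np.arange(n)
    xb = np.full(n, 15.0)                  # background: 15 deg C everywhere
    # Gaussian background error correlations, length scale 2 grid points
    B = np.exp(-0.5 * ((grid[:, None] - grid[None, :]) / 2.0) ** 2)
    # Two temperature observations, at grid points 2 and 7
    H = np.zeros((2, n)); H[0, 2] = 1.0; H[1, 7] = 1.0
    R = 0.1 * np.eye(2)                    # observation error covariance
    y = np.array([17.0, 14.0])

    xa = oi_analysis(xb, B, H, R, y)
    print(xa.round(2))
    ```

    Because the observation error is small relative to the background error, the analysis is drawn close to each observation at the observed points, and the Gaussian correlations spread the increments to neighbouring grid points.
    
    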

  8. Ku-band signal design study. [space shuttle orbiter data processing network

    NASA Technical Reports Server (NTRS)

    Rubin, I.

    1978-01-01

    Analytical tools, methods and techniques for assessing the design and performance of the space shuttle orbiter data processing system (DPS) are provided. The computer data processing network is evaluated in the key areas of queueing behavior, synchronization, and network reliability. The structure of the data processing network is described as well as the system operation principles and the network configuration. The characteristics of the computer systems are indicated. System reliability measures are defined and studied. System and network invulnerability measures are computed. Communication path and network failure analysis techniques are included.

  9. Risk analysis with a fuzzy-logic approach of a complex installation

    NASA Astrophysics Data System (ADS)

    Peikert, Tim; Garbe, Heyno; Potthast, Stefan

    2016-09-01

    This paper introduces a procedural method based on fuzzy logic to systematically analyze the risk of an electronic system in an intentional electromagnetic environment (IEME). The method analyzes the susceptibility of a complex electronic installation with respect to intentional electromagnetic interference (IEMI). It combines the advantages of well-known techniques such as fault tree analysis (FTA), electromagnetic topology (EMT) and Bayesian networks (BN) and extends them with an approach to handle uncertainty. This approach uses fuzzy sets, membership functions and fuzzy logic to handle uncertainty with probability functions and linguistic terms. The linguistic terms add the knowledge of experts on the investigated system or environment to the risk analysis.
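
    A minimal sketch of the fuzzy-logic ingredients mentioned above (triangular membership functions, max-min rule inference, and a simple defuzzification step); the linguistic break points and rules are invented for illustration.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function: peaks at b, zero outside [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Linguistic terms on a normalised 0..1 scale (illustrative break points)
    low = lambda x: tri(x, -0.5, 0.0, 0.5)
    high = lambda x: tri(x, 0.5, 1.0, 1.5)

    def risk(susceptibility, threat):
        """Max-min (Mamdani-style) inference with two example rules:
        IF susceptibility is high AND threat is high THEN risk is high
        IF susceptibility is low  OR  threat is low  THEN risk is low"""
        r_high = min(high(susceptibility), high(threat))
        r_low = max(low(susceptibility), low(threat))
        if r_high + r_low == 0:
            return 0.5
        # Defuzzify by a weighted average of rule strengths (0 = low, 1 = high)
        return r_high / (r_high + r_low)

    print(risk(0.9, 0.8))   # both terms firmly "high" -> risk 1.0
    print(risk(0.1, 0.9))   # low susceptibility dominates -> risk 0.0
    ```

    In the paper's setting the rule strengths would come from expert statements about the installation rather than from invented break points.
    
    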

  10. The value of job analysis, job description and performance.

    PubMed

    Wolfe, M N; Coggins, S

    1997-01-01

    All companies, regardless of size, are faced with the same employment concerns. Efficient personnel management requires the use of three human resource techniques--job analysis, job description and performance appraisal. These techniques and tools are not for large practices only. Small groups can obtain the same benefits by employing these performance control measures. Job analysis allows for the development of a compensation system. Job descriptions summarize the most important duties. Performance appraisals help reward outstanding work.

  11. The application of digital techniques to the analysis of metallurgical experiments

    NASA Technical Reports Server (NTRS)

    Rathz, T. J.

    1977-01-01

    The application of a specific digital computer system (known as the Image Data Processing System) to the analysis of three NASA-sponsored metallurgical experiments is discussed in some detail. The basic hardware and software components of the Image Data Processing System are presented. Many figures are presented in the discussion of each experimental analysis in an attempt to show the accuracy and speed that the Image Data Processing System affords in analyzing photographic images dealing with metallurgy, and in particular with material processing.

  12. Combination of process and vibration data for improved condition monitoring of industrial systems working under variable operating conditions

    NASA Astrophysics Data System (ADS)

    Ruiz-Cárcel, C.; Jaramillo, V. H.; Mba, D.; Ottewill, J. R.; Cao, Y.

    2016-01-01

    The detection and diagnosis of faults in industrial processes is a very active field of research due to the reduction in maintenance costs achieved by the implementation of process monitoring algorithms such as Principal Component Analysis, Partial Least Squares or more recently Canonical Variate Analysis (CVA). Typically the condition of rotating machinery is monitored separately using vibration analysis or other specific techniques. Conventional vibration-based condition monitoring techniques are based on the tracking of key features observed in the measured signal. Typically steady-state loading conditions are required to ensure consistency between measurements. In this paper, a technique based on merging process and vibration data is proposed with the objective of improving the detection of mechanical faults in industrial systems working under variable operating conditions. The capabilities of CVA for detection and diagnosis of faults were tested using experimental data acquired from a compressor test rig where different process faults were introduced. Results suggest that the combination of process and vibration data can effectively improve the detectability of mechanical faults in systems working under variable operating conditions.

  13. Analysis of a crossed Bragg cell acousto-optical spectrometer for SETI

    NASA Technical Reports Server (NTRS)

    Gulkis, S.

    1989-01-01

    The search for radio signals from extraterrestrial intelligent beings (SETI) requires the use of large instantaneous bandwidth (500 MHz) and high resolution (20 Hz) spectrometers. Digital systems with a high degree of modularity can be used to provide this capability, and this method has been widely discussed. Another technique for meeting the SETI requirement is to use a crossed Bragg cell spectrometer as described by Psaltis and Casasent. This technique makes use of the Folded Spectrum concept, introduced by Thomas. The Folded Spectrum is a 2-D Fourier Transform of a raster scanned 1-D signal. It is directly related to the long 1-D spectrum of the original signal and is ideally suited for optical signal processing. The folded spectrum technique has received little attention to date, primarily because early systems made use of photographic film which are unsuitable for the real time data analysis and voluminous data requirements of SETI. An analysis of the crossed Bragg cell spectrometer is presented as a method to achieve the spectral processing requirements for SETI. Systematic noise contributions unique to the Bragg cell system will be discussed.
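
    The folded-spectrum relationship described above (a 2-D Fourier transform of a raster-scanned 1-D signal recovering the long 1-D spectrum) can be checked numerically; the raster size and tone frequency below are arbitrary choices.

    ```python
    import numpy as np

    # Folded-spectrum demo: a long 1-D signal is raster-scanned into a 2-D
    # array; the 2-D FFT then gives a coarse frequency index along one axis
    # and a fine index along the other, recovering the full 1-D bin index.
    R, C = 64, 64              # raster: R rows of C samples each
    N = R * C                  # total 1-D signal length
    k = 1234                   # true tone frequency, in 1-D FFT bins

    n = np.arange(N)
    signal = np.exp(2j * np.pi * k * n / N)   # noiseless complex tone
    raster = signal.reshape(R, C)             # row-major raster scan

    spectrum = np.abs(np.fft.fft2(raster))
    p, q = np.unravel_index(np.argmax(spectrum), spectrum.shape)

    # Coarse channel q (width R bins) plus fine offset p reconstructs k
    # (exact here because the residual offset along the coarse axis is
    # below half a bin)
    k_est = int(q) * R + int(p)
    print(k_est)
    ```

    This is the property that lets a crossed Bragg cell pair produce a fine-resolution spectrum across a wide instantaneous bandwidth in a single 2-D transform.
    
    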

  14. Analysis of a crossed Bragg cell acousto-optical spectrometer for SETI.

    PubMed

    Gulkis, S

    1989-01-01

    The search for radio signals from extraterrestrial intelligent beings (SETI) requires the use of large instantaneous bandwidth (500 MHz) and high resolution (20 Hz) spectrometers. Digital systems with a high degree of modularity can be used to provide this capability, and this method has been widely discussed. Another technique for meeting the SETI requirement is to use a crossed Bragg cell spectrometer as described by Psaltis and Casasent. This technique makes use of the Folded Spectrum concept, introduced by Thomas. The Folded Spectrum is a 2-D Fourier Transform of a raster scanned 1-D signal. It is directly related to the long 1-D spectrum of the original signal and is ideally suited for optical signal processing. The folded spectrum technique has received little attention to date, primarily because early systems made use of photographic film which are unsuitable for the real time data analysis and voluminous data requirements of SETI. An analysis of the crossed Bragg cell spectrometer is presented as a method to achieve the spectral processing requirements for SETI. Systematic noise contributions unique to the Bragg cell system will be discussed.

  15. Analysis of a crossed Bragg cell acousto-optical spectrometer for SETI

    NASA Astrophysics Data System (ADS)

    Gulkis, Samuel

    The search for radio signals from extraterrestrial intelligent beings (SETI) requires the use of large instantaneous bandwidth (500 MHz) and high resolution (20 Hz) spectrometers. Digital systems with a high degree of modularity can be used to provide this capability, and this method has been widely discussed. Another technique for meeting the SETI requirement is to use a crossed Bragg cell spectrometer as described by Psaltis and Casasent. This technique makes use of the Folded Spectrum concept, introduced by Thomas. The Folded Spectrum is a 2-D Fourier Transform of a raster scanned 1-D signal. It is directly related to the long 1-D spectrum of the original signal and is ideally suited for optical signal processing. The folded spectrum technique has received little attention to date, primarily because early systems made use of photographic film which are unsuitable for the real time data analysis and voluminous data requirements of SETI. An analysis of the crossed Bragg cell spectrometer is presented as a method to achieve the spectral processing requirements for SETI. Systematic noise contributions unique to the Bragg cell system will be discussed.

  16. The integration of system specifications and program coding

    NASA Technical Reports Server (NTRS)

    Luebke, W. R.

    1970-01-01

    Experience in maintaining up-to-date documentation for one module of the large-scale Medical Literature Analysis and Retrieval System 2 (MEDLARS 2) is described. Several innovative techniques were explored in the development of this system's data management environment, particularly those that use PL/I as an automatic documenter. The PL/I data description section can provide automatic documentation by means of a master description of data elements that has long and highly meaningful mnemonic names and a formalized technique for the production of descriptive commentary. The techniques discussed are practical methods that employ the computer during system development in a manner that assists system implementation, provides interim documentation for customer review, and satisfies some of the deliverable documentation requirements.

  17. A hybrid nonlinear programming method for design optimization

    NASA Technical Reports Server (NTRS)

    Rajan, S. D.

    1986-01-01

    Solutions to engineering design problems formulated as nonlinear programming (NLP) problems usually require the use of more than one optimization technique. Moreover, the interaction between the user (analysis/synthesis) program and the NLP system can lead to interface, scaling, or convergence problems. An NLP solution system is presented that seeks to solve these problems by providing a programming system to ease the user-system interface. A simple set of rules is used to select an optimization technique or to switch from one technique to another in an attempt to detect, diagnose, and solve some potential problems. Numerical examples involving finite element based optimal design of space trusses and rotor bearing systems are used to illustrate the applicability of the proposed methodology.
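
    The switching idea can be sketched for a scalar problem: start with one technique and apply a simple rule to detect stalled progress and switch to a more robust one. The rule and both methods here are illustrative stand-ins, not the paper's actual selection logic.

    ```python
    def minimize_hybrid(f, grad, x0, max_iter=200):
        """Hybrid strategy sketch: plain gradient descent with a fixed step;
        if the objective fails to decrease (a simple diagnostic rule),
        switch to backtracking line search (Armijo condition)."""
        x, step, switched = x0, 0.5, False
        fx = f(x)
        for _ in range(max_iter):
            g = grad(x)
            if not switched:
                x_new = x - step * g
                if f(x_new) >= fx:        # rule: progress stalled -> switch
                    switched = True
                    continue
            if switched:
                t = 1.0
                # Halve the step until sufficient decrease is achieved
                while t > 1e-12 and f(x - t * g) > fx - 0.5 * t * g * g:
                    t *= 0.5
                x_new = x - t * g
            x, fx = x_new, f(x_new)
        return x, switched

    # Quartic bowl: the fixed-step method overshoots immediately and the
    # rule hands control to the line-search method
    x_min, switched = minimize_hybrid(lambda x: x ** 4, lambda x: 4 * x ** 3, x0=2.0)
    print(switched, abs(x_min) < 0.1)
    ```

    The paper's system applies the same principle at a larger scale: diagnostics on the optimization trajectory drive the choice among NLP techniques.
    
    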

  18. Constraint treatment techniques and parallel algorithms for multibody dynamic analysis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chiou, Jin-Chern

    1990-01-01

    Computational procedures for kinematic and dynamic analysis of three-dimensional multibody dynamic (MBD) systems are developed from the differential-algebraic equations (DAE's) viewpoint. Penalty constraint stabilization techniques and partitioning schemes are developed to minimize constraint violations during the time integration process. The governing equations of motion are treated with a two-stage staggered explicit-implicit numerical algorithm that takes advantage of a partitioned solution procedure. A robust and parallelizable integration algorithm is developed. This algorithm uses a two-stage staggered central difference algorithm to integrate the translational coordinates and the angular velocities. The angular orientations of bodies in MBD systems are then obtained by using an implicit algorithm via the kinematic relationship between Euler parameters and angular velocities. It is shown that the combination of the present solution procedures yields a computationally more accurate solution. To speed up the computational procedures, a parallel implementation of the constraint treatment techniques and the two-stage staggered explicit-implicit numerical algorithm was carried out. The DAE's and the constraint treatment techniques were transformed into arrowhead matrices from which a Schur complement form was derived. By fully exploiting sparse matrix structural analysis techniques, a parallel preconditioned conjugate gradient numerical algorithm is used to solve the system equations written in Schur complement form. A software testbed was designed and implemented on both sequential and parallel computers. This testbed was used to demonstrate the robustness and efficiency of the constraint treatment techniques, the accuracy of the two-stage staggered explicit-implicit numerical algorithm, and the speedup of the Schur-complement-based parallel preconditioned conjugate gradient algorithm on a parallel computer.
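
    The kinematic relationship between Euler parameters and angular velocities used in the orientation update is q̇ = ½ q ⊗ (0, ω). A self-contained sketch with a simple explicit midpoint integrator and renormalisation follows; the thesis uses an implicit algorithm, so this only illustrates the kinematics.

    ```python
    import math

    def quat_derivative(q, w):
        """Euler-parameter kinematics qdot = 0.5 * q (x) (0, w), with
        q = (q0, q1, q2, q3) scalar-first and body rates w = (wx, wy, wz)."""
        q0, q1, q2, q3 = q
        wx, wy, wz = w
        return (0.5 * (-q1 * wx - q2 * wy - q3 * wz),
                0.5 * ( q0 * wx + q2 * wz - q3 * wy),
                0.5 * ( q0 * wy - q1 * wz + q3 * wx),
                0.5 * ( q0 * wz + q1 * wy - q2 * wx))

    def step(q, w, dt):
        """Explicit midpoint step followed by renormalisation, which keeps
        the Euler parameters on the unit sphere (the algebraic constraint)."""
        k1 = quat_derivative(q, w)
        qm = tuple(qi + 0.5 * dt * ki for qi, ki in zip(q, k1))
        k2 = quat_derivative(qm, w)
        q = tuple(qi + dt * ki for qi, ki in zip(q, k2))
        norm = math.sqrt(sum(qi * qi for qi in q))
        return tuple(qi / norm for qi in q)

    q = (1.0, 0.0, 0.0, 0.0)
    w = (0.0, 0.0, 0.5)            # constant 0.5 rad/s yaw rate
    dt, steps = 0.001, 2000        # integrate for 2 s
    for _ in range(steps):
        q = step(q, w, dt)

    # After t = 2 s the yaw angle is w*t = 1 rad: q0 = cos(0.5), q3 = sin(0.5)
    print(q[0], q[3])
    ```

    The renormalisation step mirrors, in miniature, the constraint stabilization concern the thesis addresses for the full DAE system.
    
    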

  19. High-speed railway signal trackside equipment patrol inspection system

    NASA Astrophysics Data System (ADS)

    Wu, Nan

    2018-03-01

    The high-speed railway signal trackside equipment patrol inspection system comprehensively applies TDI (time delay integration), a high-speed, highly responsive CMOS architecture, low-illumination photosensitive techniques, image data compression, machine vision and related techniques. Installed on a high-speed railway inspection train, it collects, manages and analyzes images of the appearance of signal trackside equipment while the train is running. The system automatically filters the signal trackside equipment images out of a large volume of background imagery and identifies equipment changes by comparison with the original image data. Combining ledger data with train location information, the system accurately locates the trackside equipment, thereby guiding maintenance.

  20. Ambient ionization and miniature mass spectrometry system for chemical and biological analysis

    PubMed Central

    Ma, Xiaoxiao; Ouyang, Zheng

    2016-01-01

    Ambient ionization and miniaturization of mass spectrometers are two fields in mass spectrometry that have advanced significantly in the last decade. The integration of the techniques developed in these two fields is leading to the development of complete miniature analytical systems that can be used for on-site or point-of-care analysis by non-expert users. In this review, we report the current status of development in ambient ionization and miniature mass spectrometers, with an emphasis on those techniques with potential impact on point-of-care (POC) diagnostics. The challenges in the future development of the integrated systems are discussed with possible solutions presented. PMID:28042191

  1. Exploring Techniques for Vision Based Human Activity Recognition: Methods, Systems, and Evaluation

    PubMed Central

    Xu, Xin; Tang, Jinshan; Zhang, Xiaolong; Liu, Xiaoming; Zhang, Hong; Qiu, Yimin

    2013-01-01

    With the wide applications of vision based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activities, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of the recent development of the techniques, including methods, systems, and quantitative evaluation of the performance of human activity recognition. PMID:23353144

  2. Technique for Early Reliability Prediction of Software Components Using Behaviour Models

    PubMed Central

    Ali, Awad; N. A. Jawawi, Dayang; Adham Isa, Mohd; Imran Babar, Muhammad

    2016-01-01

    Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction. PMID:27668748
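
    The final reliability computation over a component probabilistic dependency graph can be sketched as a stack-based path enumeration. The graph, transition probabilities, and component reliabilities below are hypothetical, and the CPDG construction from the synthesized state machine is not reproduced.

    ```python
    def system_reliability(graph, rel, start, end):
        """Reliability over a component probabilistic dependency graph:
        sum, over all start->end execution paths, of the product of
        component reliabilities and transition probabilities along the
        path. Iterative stack-based traversal; assumes the graph is a DAG."""
        total = 0.0
        stack = [(start, rel[start])]
        while stack:
            node, prob = stack.pop()
            if node == end:
                total += prob
                continue
            for succ, p_trans in graph.get(node, []):
                stack.append((succ, prob * p_trans * rel[succ]))
        return total

    # Hypothetical four-component system with a branch: A calls B 70% of
    # the time and C 30% of the time; both hand off to D
    graph = {"A": [("B", 0.7), ("C", 0.3)], "B": [("D", 1.0)], "C": [("D", 1.0)]}
    rel = {"A": 0.99, "B": 0.95, "C": 0.90, "D": 0.99}
    print(round(system_reliability(graph, rel, "A", "D"), 4))
    ```

    With these numbers the two execution paths contribute 0.6518 and 0.2646, for a system reliability of about 0.9164; loops would require the loop entry/exit handling the paper emphasises.
    
    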

  3. An introduction to autonomous control systems

    NASA Technical Reports Server (NTRS)

    Antsaklis, Panos J.; Passino, Kevin M.; Wang, S. J.

    1991-01-01

    The functions, characteristics, and benefits of autonomous control are outlined. An autonomous control functional architecture for future space vehicles that incorporates the concepts and characteristics described is presented. The controller is hierarchical, with an execution level (the lowest level), coordination level (middle level), and management and organization level (highest level). The general characteristics of the overall architecture, including those of the three levels, are explained, and an example to illustrate their functions is given. Mathematical models for autonomous systems, including 'logical' discrete event system models, are discussed. An approach to the quantitative, systematic modeling, analysis, and design of autonomous controllers is also discussed. It is a hybrid approach since it uses conventional analysis techniques based on difference and differential equations and new techniques for the analysis of systems described with a symbolic formalism such as finite automata. Some recent results from the areas of planning and expert systems, machine learning, artificial neural networks, and restructurable controls are briefly outlined.

  4. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  5. Respiratory monitoring system based on the nasal pressure technique for the analysis of sleep breathing disorders: Reduction of static and dynamic errors, and comparisons with thermistors and pneumotachographs

    NASA Astrophysics Data System (ADS)

    Alves de Mesquita, Jayme; Lopes de Melo, Pedro

    2004-03-01

    Thermally sensitive devices—thermistors—have usually been used to monitor sleep-breathing disorders. However, because of their long time constant, these devices are not able to provide a good characterization of fast events, like hypopneas. The nasal pressure recording technique (NPR) has recently been suggested as a way to quantify airflow during sleep. It is claimed that the short time constants of the devices used to implement this technique allow an accurate analysis of fast abnormal respiratory events. However, these devices present errors associated with nonlinearities and acoustic resonance that could reduce the diagnostic value of the NPR. Moreover, in spite of the high scientific and clinical potential, there is no detailed description of a complete instrumentation system to implement this promising technique in sleep studies. In this context, the purpose of this work was twofold: (1) to describe the development of a flexible NPR device and (2) to evaluate the performance of this device when compared to pneumotachographs (PNTs) and thermistors. After the design details are described, the system's static accuracy is evaluated by a comparative analysis with a PNT. This analysis revealed a significant reduction (p<0.001) of the static error when system nonlinearities were reduced. The dynamic performance of the NPR system was investigated by frequency response analysis and time constant evaluations, and the results showed that the developed device's response was as good as that of a PNT and around 100 times faster (τ = 5.3 ms) than that of thermistors (τ = 512 ms). Experimental results obtained in simulated clinical conditions and in a patient are presented as examples and confirm the good features achieved in engineering tests. These results are in close agreement with physiological fundamentals, supplying substantial evidence that the improved dynamic and static characteristics of this device can contribute to a more accurate implementation of medical research projects and to improved diagnosis of sleep-breathing disorders.
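
    The effect of the two time constants reported above can be reproduced with a first-order sensor model: the fraction of a square-edged event that the sensor registers after a duration d is 1 − e^(−d/τ). The 0.5 s event duration below is an illustrative choice.

    ```python
    import math

    def registered_fraction(tau, duration):
        """Fraction of a square-edged event of the given duration (s) that a
        first-order sensor with time constant tau (s) registers at its end."""
        return 1 - math.exp(-duration / tau)

    event = 0.5   # a brief 0.5 s reduction in airflow (illustrative)
    print(round(registered_fraction(0.0053, event), 3))  # NPR device
    print(round(registered_fraction(0.512, event), 3))   # thermistor
    ```

    The NPR device (τ = 5.3 ms) registers essentially the full event, while the thermistor (τ = 512 ms) captures only about 62% of it, which is why fast events such as short hypopneas are underestimated by thermal sensors.
    
    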

  6. Proceedings of the National Conference on Energy Resource Management. Volume 1: Techniques, Procedures and Data Bases

    NASA Technical Reports Server (NTRS)

    Brumfield, J. O. (Editor); Schiffman, Y. M. (Editor)

    1982-01-01

    Topics dealing with the integration of remotely sensed data with geographic information system for application in energy resources management are discussed. Associated remote sensing and image analysis techniques are also addressed.

  7. Sensitivity of control-augmented structure obtained by a system decomposition method

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Bloebaum, Christina L.; Hajela, Prabhat

    1988-01-01

    The verification of a method for computing sensitivity derivatives of a coupled system is presented. The method deals with a system whose analysis can be partitioned into subsets that correspond to disciplines and/or physical subsystems that exchange input-output data with each other. The method uses the partial sensitivity derivatives of the output with respect to input obtained for each subset separately to assemble a set of linear, simultaneous, algebraic equations that are solved for the derivatives of the coupled system response. This sensitivity analysis is verified using an example of a cantilever beam augmented with an active control system to limit the beam's dynamic displacements under an excitation force. The verification shows good agreement of the method with reference data obtained by a finite difference technique involving entire system analysis. The usefulness of a system sensitivity method in optimization applications by employing a piecewise-linear approach to the same numerical example is demonstrated. The method's principal merits are its intrinsically superior accuracy in comparison with the finite difference technique, and its compatibility with the traditional division of work in complex engineering tasks among specialty groups.
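
    The method's core step, assembling subsystem partial derivatives into one linear system for the coupled total derivatives, can be sketched on a linear two-subsystem toy problem (coefficients invented for illustration).

    ```python
    import numpy as np

    # Coupled two-discipline system (linear toy example):
    #   y_A = f_A(x, y_B) = 2*x + 0.3*y_B
    #   y_B = f_B(x, y_A) = -x  + 0.5*y_A
    # The partial derivatives of each subsystem, computed in isolation,
    # are assembled into one linear system for the total derivatives:
    #   (I - [[0, dfA/dyB], [dfB/dyA, 0]]) @ dy/dx = [dfA/dx, dfB/dx]

    J = np.array([[0.0, 0.3],
                  [0.5, 0.0]])          # cross-coupling partials
    rhs = np.array([2.0, -1.0])         # direct partials w.r.t. x

    dy_dx = np.linalg.solve(np.eye(2) - J, rhs)
    print(dy_dx)                        # total derivatives [dyA/dx, dyB/dx]

    # Cross-check against the coupled fixed point: substituting one equation
    # into the other gives yA = 2x and yB = 0, so dyA/dx = 2 and dyB/dx = 0
    ```

    No finite differencing of the full coupled analysis is needed, which is the accuracy advantage over the reference technique noted in the abstract.
    
    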

  8. Analysis of Dual Mode Systems in an Urban Area : Volume 3. Description of the Analysis Techniques and Data Sources.

    DOT National Transportation Integrated Search

    1973-12-01

    Various forms of Dual Mode transportation were analyzed in order to assess the economic viability of the dual mode concept. Specially designed new small Dual Mode vehicles, modifications of existing automobiles, and pallet systems, all operating in c...

  9. Analysis of Dual Mode Systems in an Urban Area : Volume 3. Description of the Analysis Techniques and Data Sources

    DOT National Transportation Integrated Search

    1973-12-01

    Various forms of Dual Mode transportation were analyzed in order to assess the economic viability of the dual mode concept. Specially designed new small Dual Mode vehicles, modifications of existing automobiles, and pallet systems, all operating in c...

  10. Implementation of a microprocessor-based visual-evoked cortical potential recording and analysis system.

    PubMed

    Wilson, A; Fram, D; Sistar, J

    1981-06-01

    An Imsai 8080 microcomputer is being used to simultaneously generate a color graphics stimulus display and to record visual-evoked cortical potentials. A brief description of the hardware and software developed for this system is presented. Data storage and analysis techniques are also discussed.

  11. Dynamic malware analysis using IntroVirt: a modified hypervisor-based system

    NASA Astrophysics Data System (ADS)

    White, Joshua S.; Pape, Stephen R.; Meily, Adam T.; Gloo, Richard M.

    2013-05-01

    In this paper, we present a system for Dynamic Malware Analysis which incorporates the use of IntroVirt™. IntroVirt is an introspective hypervisor architecture and infrastructure that supports advanced techniques for stealth-malware analysis. This system allows for complete guest monitoring and interaction, including the manipulation and blocking of system calls. IntroVirt is capable of bypassing the virtual machine detection capabilities of even the most sophisticated malware by spoofing returns to system call responses. Additional fuzzing capabilities can be employed to detect both malware vulnerabilities and polymorphism.

  12. Automatic simplification of systems of reaction-diffusion equations by a posteriori analysis.

    PubMed

    Maybank, Philip J; Whiteley, Jonathan P

    2014-02-01

    Many mathematical models in biology and physiology are represented by systems of nonlinear differential equations. In recent years these models have become increasingly complex in order to explain the enormous volume of data now available. A key role of modellers is to determine which components of the model have the greatest effect on a given observed behaviour. An approach for automatically fulfilling this role, based on a posteriori analysis, has recently been developed for nonlinear initial value ordinary differential equations [J.P. Whiteley, Model reduction using a posteriori analysis, Math. Biosci. 225 (2010) 44-52]. In this paper we extend this model reduction technique for application to both steady-state and time-dependent nonlinear reaction-diffusion systems. Exemplar problems drawn from biology are used to demonstrate the applicability of the technique. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. Development of Pattern Recognition Techniques for the Evaluation of Toxicant Impacts to Multispecies Systems

    DTIC Science & Technology

    1993-06-18

    ...the rule rather than the exception. In the Standardized Aquatic Microcosm and the Mixed Flask Culture (MFC) microcosms, multivariate analysis and clustering methods... experiments using two microcosm protocols. We use nonmetric clustering, a multivariate pattern recognition technique developed by Matthews and Hearne (1991

  14. Analysis and design of three dimensional supersonic nozzles. Volume 2: Numerical program for analysis of nozzle-exhaust flow fields

    NASA Technical Reports Server (NTRS)

    Kalben, P.

    1972-01-01

    The FORTRAN IV Program developed to analyze the flow field associated with scramjet exhaust systems is presented. The instructions for preparing input and interpreting output are described. The program analyzes steady three dimensional supersonic flow by the reference plane characteristic technique. The governing equations and numerical techniques employed are presented in Volume 1 of this report.

  15. Real-time emergency forecasting technique for situation management systems

    NASA Astrophysics Data System (ADS)

    Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.

    2018-05-01

    The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision making in situation management systems. The computational models are improved by an improved Brown's method that applies fractal dimension to forecast the short time series data received from sensors and control systems. The reliability of the emergency forecasting results is ensured by filtering out invalid sensed data using correlation analysis methods.
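
    Brown's double exponential smoothing, the base method the article builds on, can be sketched as follows; the fractal-dimension refinement is not reproduced, and the data and smoothing constant are illustrative.

    ```python
    def brown_forecast(series, alpha, horizon=1):
        """Brown's double (linear) exponential smoothing: two cascaded
        single-exponential smoothers yield level and trend estimates,
        extrapolated `horizon` steps ahead."""
        s1 = s2 = series[0]
        for x in series[1:]:
            s1 = alpha * x + (1 - alpha) * s1      # first smoother
            s2 = alpha * s1 + (1 - alpha) * s2     # second smoother
        level = 2 * s1 - s2
        trend = alpha / (1 - alpha) * (s1 - s2)
        return level + horizon * trend

    # A short sensor series with a steady upward trend
    data = [10.0, 11.0, 12.0, 13.0, 14.0, 15.0]
    print(round(brown_forecast(data, alpha=0.6, horizon=1), 2))  # → 15.94
    ```

    The one-step forecast tracks the linear trend with a small lag from the short warm-up; the article's modification adapts this scheme to the short, noisy series typical of sensor streams.
    
    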

  16. Symbolic manipulation techniques for vibration analysis of laminated elliptic plates

    NASA Technical Reports Server (NTRS)

    Andersen, C. M.; Noor, A. K.

    1977-01-01

    A computational scheme is presented for the free vibration analysis of laminated composite elliptic plates. The scheme is based on Hamilton's principle, the Rayleigh-Ritz technique and symmetry considerations, and is implemented with the aid of the MACSYMA symbolic manipulation system. The MACSYMA system, through differentiation, integration, and simplification of analytic expressions, produces highly efficient FORTRAN code for the evaluation of the stiffness and mass coefficients. Multiple use is made of this code to obtain not only the frequencies and mode shapes of the plate, but also the derivatives of the frequencies with respect to various material and geometric parameters.

  17. Development of spectral analysis math models and software program and spectral analyzer, digital converter interface equipment design

    NASA Technical Reports Server (NTRS)

    Hayden, W. L.; Robinson, L. H.

    1972-01-01

    Spectral analysis of angle-modulated communication systems is studied by: (1) performing a literature survey of candidate power-spectrum computational techniques, determining the computational requirements, and formulating a mathematical model satisfying these requirements; (2) implementing the model on a UNIVAC 1230 digital computer as the Spectral Analysis Program (SAP); and (3) developing the hardware specifications for a data acquisition system that will acquire an input modulating signal for SAP. The SAP computational technique uses an extended fast Fourier transform and represents a generalized approach for simple and complex modulating signals.
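    The FFT approach can be illustrated by estimating the power spectrum of a tone-modulated FM carrier. The sample rate, carrier frequency, and modulation index below are invented, not values from the SAP report.

```python
import numpy as np

# Invented parameters: 8192 Hz sampling, 1 kHz carrier, 50 Hz tone
# modulation with modulation index 2 (not values from the SAP report).
fs, n = 8192.0, 8192
t = np.arange(n) / fs
fc, fm, beta = 1000.0, 50.0, 2.0
x = np.cos(2 * np.pi * fc * t + beta * np.sin(2 * np.pi * fm * t))

spec = np.abs(np.fft.rfft(x)) ** 2 / n        # one-sided power spectrum
freqs = np.fft.rfftfreq(n, d=1 / fs)
peak = freqs[np.argmax(spec)]
print(peak)  # for beta = 2 the strongest line is a first sideband, fc +/- fm
```

    The discrete lines at fc + k*fm with Bessel-function amplitudes are the textbook angle-modulation spectrum that a program like SAP must resolve.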

  18. Imaging Analysis of Near-Field Recording Technique for Observation of Biological Specimens

    NASA Astrophysics Data System (ADS)

    Moriguchi, Chihiro; Ohta, Akihiro; Egami, Chikara; Kawata, Yoshimasa; Terakawa, Susumu; Tsuchimori, Masaaki; Watanabe, Osamu

    2006-07-01

    We present an analysis of the imaging properties of a near-field recording technique, in comparison with simulation results. In the system, the optical field distributions localized near the specimens are recorded as surface topographic distributions on a photosensitive film. It is possible to observe both soft and moving specimens, because the system does not require a scanning probe to obtain the observed image. The imaging properties are evaluated using the fine structures of a paramecium, and we demonstrate that it is possible to observe minute differences in refractive index.

  19. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bihn T. Pham; Jeffrey J. Einerson

    2010-06-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks, and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show that the three statistical analysis techniques provide a complementary capability to warn of thermocouple failures. They also suggest that regression models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e., gas mixture content) to effectively maintain the target quantity (fuel temperature) within a given range.
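    Two of the three techniques, control charting and regression, can be sketched in a few lines. The thermocouple readings and calculated temperatures below are fabricated for illustration; they are not AGR data.

```python
import statistics

# Fabricated thermocouple (TC) readings and code-calculated fuel
# temperatures, deg C; reading 7 simulates a failing thermocouple.
readings = [612.1, 611.8, 612.4, 612.0, 611.9, 612.3, 604.7, 612.2]
calc     = [610.0, 609.9, 610.3, 610.1, 609.8, 610.2, 610.0, 610.1]

# (1) Control charting: set 3-sigma limits from an in-control baseline,
# then flag excursions such as a failing thermocouple.
mean = statistics.mean(readings[:6])
sigma = statistics.stdev(readings[:6])
flags = [abs(x - mean) > 3 * sigma for x in readings]

# (2) Regression: fit TC readings against calculated temperatures so the
# fit can substitute for a failed sensor (ordinary least squares).
mx, my = statistics.mean(calc), statistics.mean(readings)
slope = (sum((x - mx) * (y - my) for x, y in zip(calc, readings))
         / sum((x - mx) ** 2 for x in calc))
intercept = my - slope * mx
print(flags.index(True))  # index of the flagged (suspect) reading
```

    In the NDMAS setting the complementary value is exactly this pairing: the chart warns of the failure, and the regression against calculated temperatures keeps an estimate of the target quantity available.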

  20. Atmospheric statistics for aerospace vehicle operations

    NASA Technical Reports Server (NTRS)

    Smith, O. E.; Batts, G. W.

    1993-01-01

    Statistical analysis of atmospheric variables was performed for the Shuttle Transportation System (STS) design trade studies and the establishment of launch commit criteria. Atmospheric constraint statistics have been developed for the NASP test flight, the Advanced Launch System, and the National Launch System. The concepts and analysis techniques discussed in the paper are applicable to the design and operations of any future aerospace vehicle.

  1. A quantitative image cytometry technique for time series or population analyses of signaling networks.

    PubMed

    Ozaki, Yu-ichi; Uda, Shinsuke; Saito, Takeshi H; Chung, Jaehoon; Kubota, Hiroyuki; Kuroda, Shinya

    2010-04-01

    Modeling of cellular functions on the basis of experimental observation is increasingly common in the field of cellular signaling. However, such modeling requires a large amount of quantitative data on signaling events with high spatio-temporal resolution, and a novel technique for obtaining such data is needed for the systems biology of cellular signaling. We developed a fully automatable assay technique, termed quantitative image cytometry (QIC), which integrates a quantitative immunostaining technique and a high-precision image-processing algorithm for cell identification. With the aid of an automated sample preparation system, this device can quantify protein expression, phosphorylation and localization with subcellular resolution at one-minute intervals. The signaling activities quantified by the assay system showed good correlation with, as well as reproducibility comparable to, western blot analysis. Taking advantage of the high spatio-temporal resolution, we investigated the signaling dynamics of the ERK pathway in PC12 cells. The QIC technique is highly quantitative and versatile, and can be a convenient replacement for conventional techniques including western blot, flow cytometry and live cell imaging. Thus, the QIC technique can be a powerful tool for investigating the systems biology of cellular signaling.

  2. A Study of Synchronization Techniques for Optical Communication Systems

    NASA Technical Reports Server (NTRS)

    Gagliardi, R. M.

    1975-01-01

    The study of synchronization techniques and related topics in the design of high data rate, deep space, optical communication systems was reported. Data cover: (1) effects of timing errors in narrow pulsed digital optical systems, (2) accuracy of microwave timing systems operating in low powered optical systems, (3) development of improved tracking systems for the optical channel and determination of their tracking performance, (4) development of usable photodetector mathematical models for application to analysis and performance design in communication receivers, and (5) study application of multi-level block encoding to optical transmission of digital data.

  3. iLift: A health behavior change support system for lifting and transfer techniques to prevent lower-back injuries in healthcare.

    PubMed

    Kuipers, Derek A; Wartena, Bard O; Dijkstra, Boudewijn H; Terlouw, Gijs; van T Veer, Job T B; van Dijk, Hylke W; Prins, Jelle T; Pierie, Jean Pierre E N

    2016-12-01

    Lower back problems are a common cause of sick leave among employees in Dutch care homes and hospitals. In the Netherlands over 40% of reported sick leave is due to back problems, mainly caused by carrying out heavy work. The goal of the iLift project was to develop a game for nursing personnel to train them in lifting and transfer techniques. The main focus was not on testing the effectiveness of the game itself, but rather on the design of the game as an autogenous trigger and its place in a behavioral change support system. In this article, the design and development of such a health behavior change support system is addressed, describing cycles of design and evaluation. (a) To define the problem space, use context and user context, focus group interviews were conducted with Occupational Therapists (n=4), Nurses (n=10) and Caregivers (n=12), and a thematic analysis was performed. We interviewed experts (n=5) on the subject of lifting and transferring techniques. (b) A design science research approach resulted in a playable prototype. An expert panel conducted analysis of video-recorded playing activities. (c) Field experiment: We performed a dynamic analysis in order to investigate the feasibility of the prototype through biometric data from player sessions (n=620) by healthcare professionals (n=37). (a) Occupational Therapists, Nurses and Caregivers did not attribute the problem to a lack of knowledge of lifting and transferring techniques. All groups considered their workload, time pressure and a culturally determined habit of placing the patient's well-being above their own as the main reasons not to apply appropriate lifting and transferring techniques. This led to a shift in focus from a serious game teaching lifting and transferring techniques to a health behavior change support system containing a game intended to influence behavior.
(b) Building and testing (subcomponents of) the prototype resulted in design choices regarding the player's perspective, auditory and visual feedback, overall playability and perceived immersiveness. This design process also addressed the behavior-shaping capacities of the game and its place within the health behavior change support system. An expert panel on lifting and transferring techniques validated the provoked in-game activities as authentic. (c) Regression analysis showed an increase in the game score and dashboard score as more sessions were played, indicating an in-game training effect. A post-hoc test revealed that after an average of 10 or more playing sessions the dashboard score and the game score align, which indicates behavioral change towards executing appropriate static lifting and transferring techniques. Data gathered in the final field test show an in-game training effect, with players exhibiting correct static lifting and transferring techniques, but also revealed the necessity of further social system development, especially regarding intervention acceptance. Social system factors showed a strong impact on the game's persuasive capacities and its autogenous intent. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  4. Development and applications of single particle orientation and rotational tracking in dynamic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Kuangcai

    The goal of this study is to inform future data analysis and experiment design in rotational dynamics research using the DIC-based SPORT technique. Most current studies using DIC-based SPORT techniques are technical demonstrations. Understanding the mechanisms behind the observed rotational behaviors of the imaging probes should be the focus of future SPORT studies. More effort is still needed in the development of new imaging probes, particle tracking methods, instrumentation, and advanced data analysis methods to further extend the potential of the DIC-based SPORT technique.

  5. Diagnostic emulation: Implementation and user's guide

    NASA Technical Reports Server (NTRS)

    Becher, Bernice

    1987-01-01

    The Diagnostic Emulation Technique was developed within the System Validation Methods Branch as a part of the development of methods for the analysis of the reliability of highly reliable, fault tolerant digital avionics systems. This is a general technique which allows for the emulation of a digital hardware system. The technique is general in the sense that it is completely independent of the particular target hardware which is being emulated. Parts of the system are described and emulated at the logic or gate level, while other parts of the system are described and emulated at the functional level. This algorithm allows for the insertion of faults into the system, and for the observation of the response of the system to these faults. This allows for controlled and accelerated testing of system reaction to hardware failures in the target machine. This document describes in detail how the algorithm was implemented at NASA Langley Research Center and gives instructions for using the system.

  6. Near-Field Magnetic Dipole Moment Analysis

    NASA Technical Reports Server (NTRS)

    Harris, Patrick K.

    2003-01-01

    This paper describes the data analysis technique used for magnetic testing at the NASA Goddard Space Flight Center (GSFC). Excellent results have been obtained using this technique to convert a spacecraft's measured magnetic field data into its respective magnetic dipole moment model. The model is most accurate with the earth's geomagnetic field cancelled in a spherical region bounded by the measurement magnetometers, with a minimum radius large enough to enclose the magnetic source. Considerably enhanced spacecraft magnetic testing is offered by using this technique in conjunction with a computer-controlled magnetic field measurement system. Such a system, with real-time magnetic field display capabilities, has been incorporated into other existing magnetic measurement facilities and is also used at remote locations where transport to a magnetics test facility is impractical.

  7. Activity-Based Costing in a Service Organization

    DTIC Science & Technology

    1993-06-01

    environments, very little research has been done in service organizations. This thesis is a comparative analysis of a traditional cost accounting system with an ... activity-based cost accounting system in a medium-sized mass transit system. The purpose of the analysis was to determine whether activity-based ... accounting techniques can effectively be applied in a service industry. In addition, a goal for the thesis was to determine which costing system reports a

  8. A guide to onboard checkout. Volume 4: Propulsion

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The propulsion system for a space station is considered with respect to onboard checkout requirements. Failure analysis, reliability, and maintenance features are presented. Computer analysis techniques are also discussed.

  9. Estimating Mass of Inflatable Aerodynamic Decelerators Using Dimensionless Parameters

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2011-01-01

    This paper describes a technique for estimating mass for inflatable aerodynamic decelerators. The technique uses dimensional analysis to identify a set of dimensionless parameters for inflation pressure, mass of inflation gas, and mass of flexible material. The dimensionless parameters enable scaling of an inflatable concept with geometry parameters (e.g., diameter), environmental conditions (e.g., dynamic pressure), inflation gas properties (e.g., molecular mass), and mass growth allowance. This technique is applicable for attached (e.g., tension cone, hypercone, and stacked toroid) and trailing inflatable aerodynamic decelerators. The technique uses simple engineering approximations that were developed by NASA in the 1960s and 1970s, as well as some recent important developments. The NASA Mars Entry and Descent Landing System Analysis (EDL-SA) project used this technique to estimate the masses of the inflatable concepts that were used in the analysis. The EDL-SA results compared well with two independent sets of high-fidelity finite element analyses.
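    The scaling idea can be illustrated with a hedged sketch: treat inflation pressure as a fixed multiple of dynamic pressure and inflated volume as a fixed multiple of diameter cubed, then obtain the gas mass from the ideal gas law. The coefficient values are assumptions for illustration, not EDL-SA numbers.

```python
# Illustrative dimensional-analysis sketch; pressure_ratio, vol_coeff, and
# the temperature are assumed placeholder values, not EDL-SA constants.
R_UNIV = 8.314  # universal gas constant, J/(mol K)

def inflation_gas_mass(diameter_m, dyn_pressure_pa, molar_kg_per_mol,
                       temp_k=250.0, pressure_ratio=4.0, vol_coeff=0.1):
    """Inflation gas mass from ideal-gas scaling.

    pressure_ratio: inflation pressure / dynamic pressure (dimensionless).
    vol_coeff: inflated volume / diameter**3 (dimensionless, shape-dependent).
    """
    p = pressure_ratio * dyn_pressure_pa   # inflation pressure, Pa
    v = vol_coeff * diameter_m ** 3        # inflated volume, m^3
    return p * v * molar_kg_per_mol / (R_UNIV * temp_k)

# At fixed dynamic pressure, gas mass scales with the cube of diameter.
m_6 = inflation_gas_mass(6.0, 500.0, 0.004)   # 6 m decelerator, helium-like gas
m_12 = inflation_gas_mass(12.0, 500.0, 0.004)
print(m_12 / m_6)  # ratio of 8: the cube of the diameter ratio
```

    Because the ratios are dimensionless, the same relation scales a concept across diameters, dynamic pressures, and inflation gases without re-deriving the design.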

  10. Methods for trend analysis: Examples with problem/failure data

    NASA Technical Reports Server (NTRS)

    Church, Curtis K.

    1989-01-01

    Statistics play an important role in quality control and reliability. Accordingly, the NASA standard on trend analysis techniques recommends a variety of statistical methodologies that can be applied to time series data. The major goal of the working handbook, using data from the MSFC Problem Assessment System, is to illustrate some of the techniques in the NASA standard and some alternative techniques, and to identify patterns in the data. The techniques used for trend estimation are regression (exponential, power, reciprocal, straight line) and Kendall's rank correlation coefficient. The important details of a statistical strategy for estimating a trend component are covered in the examples. However, careful analysis and interpretation are necessary because of the small samples and frequent zero problem reports in a given time period. Further investigations to deal with these issues are being conducted.
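    Kendall's rank correlation coefficient, one of the techniques named above, is simple to compute directly. The monthly problem-report counts below are invented for illustration.

```python
from itertools import combinations

def kendall_tau(xs, ys):
    """Kendall's rank correlation (tau-a; tied pairs count as neither)."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(list(zip(xs, ys)), 2):
        s = (x1 - x2) * (y1 - y2)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n_pairs = len(xs) * (len(xs) - 1) / 2
    return (concordant - discordant) / n_pairs

# Invented monthly problem-report counts; a mostly decreasing series gives
# tau close to -1, consistent with a downward (improving) trend.
months = [1, 2, 3, 4, 5, 6]
reports = [9, 7, 7, 5, 3, 2]
print(kendall_tau(months, reports))
```

    Being rank-based, tau is less disturbed by the small samples and occasional zero-report months mentioned above than a fitted slope would be.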

  11. Visually enhanced CCTV digital surveillance utilizing Intranet and Internet.

    PubMed

    Ozaki, Nobuyuki

    2002-07-01

    This paper describes a solution for integrated plant supervision utilizing closed-circuit television (CCTV) digital surveillance. Three basic requirements are first addressed as the platform of the system, with a discussion of the suitable video compression. The system configuration is described in blocks. The system provides surveillance functionality (real-time monitoring) and process analysis functionality (a troubleshooting tool). This paper describes the formulation of a practical performance design for determining the various encoder parameters. It also introduces image processing techniques for enhancing the original CCTV digital image to lessen the burden on operators. Some screenshots are shown for the surveillance functionality. For the process analysis, an image searching filter supported by image processing techniques is explained with screenshots. Multimedia surveillance, which is the merger with process data surveillance, or the SCADA system, is also explained.

  12. Methods of determination of periods in the motion of asteroids

    NASA Astrophysics Data System (ADS)

    Bien, R.; Schubart, J.

    Numerical techniques for the analysis of fundamental periods in asteroidal motion are evaluated. The specific techniques evaluated were: the periodogram analysis procedure of Wundt (1980); Stumpff's (1937) system of algebraic transformations; and Labrouste's procedure. It is shown that the Labrouste procedure permitted sufficient isolation of single oscillations from the quasi-periodic process of asteroidal motion. The procedure was applied to the analysis of resonance in the motion of Trojan-type and Hilda-type asteroids, and some preliminary results are discussed.
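    A minimal periodogram-style scan, in the spirit of these procedures (not Wundt's or Labrouste's actual algorithms), recovers the dominant period of a synthetic oscillation:

```python
import math

def period_power(series, period, dt=1.0):
    """Spectral power at one trial period (a single periodogram ordinate)."""
    w = 2 * math.pi * dt / period
    c = sum(x * math.cos(w * i) for i, x in enumerate(series))
    s = sum(x * math.sin(w * i) for i, x in enumerate(series))
    return (c * c + s * s) / len(series)

# Synthetic quasi-periodic coordinate with a 17-sample period; scanning a
# grid of trial periods isolates the oscillation.
signal = [math.sin(2 * math.pi * i / 17.0) for i in range(340)]
trial_periods = [p / 2 for p in range(8, 81)]          # 4.0 ... 40.0
best = max(trial_periods, key=lambda p: period_power(signal, p))
print(best)  # -> 17.0
```

    The isolation problem the paper addresses is harder than this toy case: asteroidal motion superposes several close periods, which is where a filtering scheme like Labrouste's earns its keep.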

  13. Spectral-Timing Analysis of Kilohertz Quasi-Periodic Oscillations in Neutron Star Low-Mass X-ray Binaries

    NASA Astrophysics Data System (ADS)

    Cackett, Edward; Troyer, Jon; Peille, Philippe; Barret, Didier

    2018-01-01

    Kilohertz quasi-periodic oscillations, or kHz QPOs, are intensity variations in the X-ray band observed in neutron star low-mass X-ray binary (LMXB) systems. In such systems, matter is transferred from a secondary low-mass star to a neutron star via the process of accretion. kHz QPOs occur on the timescale of the inner accretion flow and may carry signatures of the physics of strong gravity (GM/R ~ c^2) and possibly clues to constraining the neutron star equation of state (EOS). Both the timing behavior of kHz QPOs and the time-averaged spectra of these systems have been studied extensively, but no model derived from these techniques has been able to illuminate the origin of kHz QPOs. Spectral-timing is an analysis technique that can be used to derive information about the physical processes occurring within the accretion flow on the timescale of the kHz QPO. To date, the kHz QPOs of four neutron star LMXB systems have been studied with spectral-timing techniques. We present a comprehensive study of spectral-timing products of kHz QPOs from systems where data are available in the RXTE archive to demonstrate the promise of this technique for gaining insight into the origin of kHz QPOs. Using data averaged over the entire RXTE archive, we show correlated time lags as a function of QPO frequency and energy, as well as energy-dependent covariance spectra, for the various LMXB systems where spectral-timing analysis is possible. We find similar trends in all average spectral-timing products for the objects studied, suggesting a common origin of kHz QPOs.
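    The core spectral-timing computation, a time lag read from the phase of the cross spectrum at the QPO frequency, can be sketched on synthetic light curves. The QPO frequency, lag, and noise level below are invented for illustration.

```python
import numpy as np

# Synthetic "soft" and "hard" light curves with a known 5 ms lag at an
# 8 Hz oscillation; frequency, lag, and noise level are invented.
rng = np.random.default_rng(0)
fs, n = 1024.0, 16384
t = np.arange(n) / fs
f_qpo, lag_true = 8.0, 0.005
soft = np.sin(2 * np.pi * f_qpo * t) + 0.1 * rng.standard_normal(n)
hard = np.sin(2 * np.pi * f_qpo * (t - lag_true)) + 0.1 * rng.standard_normal(n)

# Cross spectrum: its phase at the QPO frequency encodes the time lag
# (positive = hard band trails soft band with this sign convention).
cross = np.fft.rfft(soft) * np.fft.rfft(hard).conj()
freqs = np.fft.rfftfreq(n, d=1 / fs)
k = np.argmin(np.abs(freqs - f_qpo))
lag = np.angle(cross[k]) / (2 * np.pi * freqs[k])
print(lag)
```

    Real analyses average the cross spectrum over many segments and energy bands to beat down noise; this single-segment version shows only the phase-to-lag conversion.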

  14. Performance analysis of the ascent propulsion system of the Apollo spacecraft

    NASA Technical Reports Server (NTRS)

    Hooper, J. C., III

    1973-01-01

    Activities involved in the performance analysis of the Apollo lunar module ascent propulsion system are discussed. A description of the ascent propulsion system, including hardware, instrumentation, and system characteristics, is included. The methods used to predict the inflight performance and to establish performance uncertainties of the ascent propulsion system are discussed. The techniques of processing the telemetered flight data and performing postflight performance reconstruction to determine actual inflight performance are discussed. Problems that have been encountered and results from the analysis of the ascent propulsion system performance during the Apollo 9, 10, and 11 missions are presented.

  15. Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Mcmanus, John William

    1992-01-01

    Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.
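    The blackboard pattern itself (not McManus's formal model or its concurrent implementation) can be sketched as a shared data store, independent knowledge sources, and an opportunistic control loop:

```python
# Minimal blackboard sketch: each knowledge source fires only when its
# inputs appear on the blackboard, so the sources stay independent and
# the control loop needs no fixed ordering.
def ks_tokenize(bb):
    if "text" in bb and "tokens" not in bb:
        bb["tokens"] = bb["text"].split()
        return True
    return False

def ks_count(bb):
    if "tokens" in bb and "count" not in bb:
        bb["count"] = len(bb["tokens"])
        return True
    return False

def control_loop(bb, sources):
    fired = True
    while fired:  # keep cycling while any knowledge source can contribute
        fired = any(ks(bb) for ks in sources)
    return bb

bb = control_loop({"text": "blackboard systems cooperate to solve problems"},
                  [ks_count, ks_tokenize])  # source order does not matter
print(bb["count"])  # -> 6
```

    The independence property the thesis measures is visible even here: ks_count is listed first but cannot fire until ks_tokenize has posted its result.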

  16. Planning representation for automated exploratory data analysis

    NASA Astrophysics Data System (ADS)

    St. Amant, Robert; Cohen, Paul R.

    1994-03-01

    Igor is a knowledge-based system for exploratory statistical analysis of complex systems and environments. Igor has two related goals: to help automate the search for interesting patterns in data sets, and to help develop models that capture significant relationships in the data. We outline a language for Igor, based on techniques of opportunistic planning, which balances control and opportunism. We describe the application of Igor to the analysis of the behavior of Phoenix, an artificial intelligence planning system.

  17. SEP thrust subsystem performance sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.

    1973-01-01

    This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.

  18. Space crew radiation exposure analysis system based on a commercial stand-alone CAD system

    NASA Technical Reports Server (NTRS)

    Appleby, Matthew H.; Golightly, Michael J.; Hardy, Alva C.

    1992-01-01

    Major improvements have recently been completed in the approach to spacecraft shielding analysis. A Computer-Aided Design (CAD)-based system has been developed for determining the shielding provided to any point within or external to the spacecraft. Shielding analysis is performed using a commercially available stand-alone CAD system and a customized ray-tracing subroutine contained within a standard engineering modeling software package. This improved shielding analysis technique has been used in several vehicle design projects such as a Mars transfer habitat, pressurized lunar rover, and the redesigned Space Station. Results of these analyses are provided to demonstrate the applicability and versatility of the system.

  19. Spacecraft Multiple Array Communication System Performance Analysis

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Desilva, Kanishka; Sham, Catherine C.

    2010-01-01

    The Communication Systems Simulation Laboratory (CSSL) at the NASA Johnson Space Center is tasked to perform spacecraft and ground network communication system simulations, design validation, and performance verification. The CSSL has developed simulation tools that model spacecraft communication systems and the space and ground environment in which they operate. In this paper, a spacecraft communication system with multiple arrays is simulated. A multiple-array combining technique is used to increase the radio frequency coverage and data rate performance: phase coherence is achieved among the phased arrays so that the signals combine constructively at the target receiver. There are many technical challenges in integrating a high-transmit-power communication system on a spacecraft. The array combining technique can improve the communication system's data rate and coverage performance without increasing its transmit power requirements. Example simulation results indicate that significant performance improvement can be achieved with a phase-coherence implementation.
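    The premise of the combining technique, that phase-aligned summation of N unit signals yields an N-fold gain over incoherent power addition, can be shown in a few lines (a toy model, not the CSSL simulation):

```python
import numpy as np

# Toy model: N unit-amplitude signals with random path phases. Aligning the
# phases before summing gives an N^2 power sum (coherent) versus N
# (incoherent), i.e. an N-fold combining gain.
n_arrays = 4
rng = np.random.default_rng(1)
phases = rng.uniform(0.0, 2.0 * np.pi, n_arrays)   # unknown path phases
signals = np.exp(1j * phases)                      # unit-amplitude carriers

incoherent_power = np.sum(np.abs(signals) ** 2)            # = n_arrays
coherent_power = np.abs(np.sum(signals * np.exp(-1j * phases))) ** 2
print(coherent_power / incoherent_power)  # combining gain = n_arrays
```

    The engineering difficulty the paper addresses is estimating and maintaining those phase corrections in flight; the arithmetic of the gain itself is this simple.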

  20. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture in which the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is more resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.

  1. Pattern-recognition techniques applied to performance monitoring of the DSS 13 34-meter antenna control assembly

    NASA Technical Reports Server (NTRS)

    Mellstrom, J. A.; Smyth, P.

    1991-01-01

    The results of applying pattern recognition techniques to diagnose fault conditions in the pointing system of one of the Deep Space network's large antennas, the DSS 13 34-meter structure, are discussed. A previous article described an experiment whereby a neural network technique was used to identify fault classes by using data obtained from a simulation model of the Deep Space Network (DSN) 70-meter antenna system. Described here is the extension of these classification techniques to the analysis of real data from the field. The general architecture and philosophy of an autonomous monitoring paradigm is described and classification results are discussed and analyzed in this context. Key features of this approach include a probabilistic time-varying context model, the effective integration of signal processing and system identification techniques with pattern recognition algorithms, and the ability to calibrate the system given limited amounts of training data. Reported here are recognition accuracies in the 97 to 98 percent range for the particular fault classes included in the experiments.

  2. Intelligent Techniques Using Molecular Data Analysis in Leukaemia: An Opportunity for Personalized Medicine Support System

    PubMed Central

    Banjar, Haneen; Adelson, David; Brown, Fred; Chaudhri, Naeem

    2017-01-01

    The use of intelligent techniques in medicine has brought a ray of hope in terms of treating leukaemia patients. Personalized treatment uses a patient's genetic profile to select a mode of treatment. This process makes use of molecular technology and machine learning to determine the most suitable approach to treating a leukaemia patient. Until now, no reviews have been published from a computational perspective concerning the development of personalized medicine intelligent techniques for leukaemia patients using molecular data analysis. This review studies the published empirical research on personalized medicine in leukaemia and synthesizes findings across studies related to intelligent techniques in leukaemia, with specific attention to particular categories of these studies, to help identify opportunities for further research into personalized medicine support systems in chronic myeloid leukaemia. A systematic search was carried out to identify studies using intelligent techniques in leukaemia and to categorize these studies based on leukaemia type as well as the task, data source, and purpose of the studies. Most studies used molecular data analysis for personalized medicine, but future advancement for leukaemia patients requires molecular models that use advanced machine-learning methods to automate decision-making in treatment management and deliver supportive medical information to the patient in clinical practice. PMID:28812013

  3. Intelligent Techniques Using Molecular Data Analysis in Leukaemia: An Opportunity for Personalized Medicine Support System.

    PubMed

    Banjar, Haneen; Adelson, David; Brown, Fred; Chaudhri, Naeem

    2017-01-01

    The use of intelligent techniques in medicine has brought a ray of hope in terms of treating leukaemia patients. Personalized treatment uses a patient's genetic profile to select a mode of treatment. This process makes use of molecular technology and machine learning to determine the most suitable approach to treating a leukaemia patient. Until now, no reviews have been published from a computational perspective concerning the development of personalized medicine intelligent techniques for leukaemia patients using molecular data analysis. This review studies the published empirical research on personalized medicine in leukaemia and synthesizes findings across studies related to intelligent techniques in leukaemia, with specific attention to particular categories of these studies, to help identify opportunities for further research into personalized medicine support systems in chronic myeloid leukaemia. A systematic search was carried out to identify studies using intelligent techniques in leukaemia and to categorize these studies based on leukaemia type as well as the task, data source, and purpose of the studies. Most studies used molecular data analysis for personalized medicine, but future advancement for leukaemia patients requires molecular models that use advanced machine-learning methods to automate decision-making in treatment management and deliver supportive medical information to the patient in clinical practice.

  4. Quantization error of CCD cameras and their influence on phase calculation in fringe pattern analysis.

    PubMed

    Skydan, Oleksandr A; Lilley, Francis; Lalor, Michael J; Burton, David R

    2003-09-10

    We present an investigation into the phase errors that occur in fringe pattern analysis that are caused by quantization effects. When acquisition devices with a limited value of camera bit depth are used, there are a limited number of quantization levels available to record the signal. This may adversely affect the recorded signal and adds a potential source of instrumental error to the measurement system. Quantization effects also determine the accuracy that may be achieved by acquisition devices in a measurement system. We used the Fourier fringe analysis measurement technique. However, the principles can be applied equally well for other phase measuring techniques to yield a phase error distribution that is caused by the camera bit depth.
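    The bit-depth effect described above can be sketched numerically: a one-dimensional sinusoidal fringe pattern is rounded to a given number of ADC levels, phase is recovered by a minimal Fourier fringe analysis (keeping only the positive carrier lobe), and the RMS phase error is compared across bit depths. This is an illustrative reconstruction, not the authors' code; the fringe count, line resolution, and bit depths are arbitrary choices.

    ```python
    import numpy as np

    def fringe_phase(intensity):
        """Minimal Fourier fringe analysis on one scan line: keep only the
        positive-frequency carrier lobe of the spectrum and take the
        unwrapped argument of the resulting complex signal."""
        spec = np.fft.fft(intensity - intensity.mean())
        half = np.zeros_like(spec)
        half[1:len(spec) // 2] = spec[1:len(spec) // 2]
        return np.unwrap(np.angle(np.fft.ifft(half)))

    x = np.linspace(0, 1, 1024, endpoint=False)
    ideal = 0.5 + 0.5 * np.cos(2 * np.pi * 8 * x)   # 8 fringes spanning full range

    rms = {}
    for bits in (4, 8, 12):
        levels = 2 ** bits - 1
        quantized = np.round(ideal * levels) / levels      # camera ADC model
        err = fringe_phase(quantized) - fringe_phase(ideal)
        rms[bits] = float(np.sqrt(np.mean(err ** 2)))
        print(bits, "bits -> RMS phase error", rms[bits], "rad")
    ```

    As the abstract argues, the recovered phase error shrinks as the camera bit depth grows.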

  5. Polarimetric Thomson scattering for high Te fusion plasmas

    NASA Astrophysics Data System (ADS)

    Giudicotti, L.

    2017-11-01

    Polarimetric Thomson scattering (TS) is a technique for the analysis of TS spectra in which the electron temperature Te is determined from the depolarization of the scattered radiation, a relativistic effect noticeable only in very hot (Te >= 10 keV) fusion plasmas. It has been proposed as a complementary technique to supplement conventional spectral analysis in the ITER CPTS (Core Plasma Thomson Scattering) system for measurements in high-Te, low-ne plasma conditions. In this paper we review the characteristics of the depolarized TS radiation, with special emphasis on the conditions of the ITER CPTS system, and we describe a possible implementation of this diagnostic method that could significantly improve the performance of conventional TS spectral analysis in the high-Te range.

  6. A Hardware Model Validation Tool for Use in Complex Space Systems

    NASA Technical Reports Server (NTRS)

    Davies, Misty Dawn; Gundy-Burlet, Karen L.; Limes, Gregory L.

    2010-01-01

    One of the many technological hurdles that must be overcome in future missions is the challenge of validating as-built systems against the models used for design. We propose a technique composed of intelligent parameter exploration in concert with automated failure analysis as a scalable method for the validation of complex space systems. The technique is impervious to discontinuities and linear dependencies in the data, and can handle dimensionalities consisting of hundreds of variables over tens of thousands of experiments.

  7. Surface Modeling, Solid Modeling and Finite Element Modeling. Analysis Capabilities of Computer-Assisted Design and Manufacturing Systems.

    ERIC Educational Resources Information Center

    Nee, John G.; Kare, Audhut P.

    1987-01-01

    Explores several concepts in computer-assisted design/computer-assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer-based systems with the CAD/CAM packages evaluated. (CW)

  8. A Selected Annotated Bibliography on the Analysis of Water Resources System, Volume 2.

    ERIC Educational Resources Information Center

    Kriss, Carol; And Others

    Presented is an annotated bibliography of some recent selected publications pertaining to the application of systems analysis techniques for defining and evaluating alternative solutions to water resource problems. Both subject and author indices are provided. Keywords are listed at the end of each abstract. The abstracted material emphasizes the…

  9. A Selected Annotated Bibliography on the Analysis of Water Resource Systems.

    ERIC Educational Resources Information Center

    Gysi, Marshall; And Others

    Presented is an annotated bibliography of some selected publications pertaining to the application of systems analysis techniques to water resource problems. The majority of the references included in this bibliography have been published within the last five years. About half of the entries have informative abstracts and keywords following the…

  10. AUTOMATED LITERATURE PROCESSING HANDLING AND ANALYSIS SYSTEM--FIRST GENERATION.

    ERIC Educational Resources Information Center

    Redstone Scientific Information Center, Redstone Arsenal, AL.

    THE REPORT PRESENTS A SUMMARY OF THE DEVELOPMENT AND THE CHARACTERISTICS OF THE FIRST GENERATION OF THE AUTOMATED LITERATURE PROCESSING, HANDLING AND ANALYSIS (ALPHA-1) SYSTEM. DESCRIPTIONS OF THE COMPUTER TECHNOLOGY OF ALPHA-1 AND THE USE OF THIS AUTOMATED LIBRARY TECHNIQUE ARE PRESENTED. EACH OF THE SUBSYSTEMS AND MODULES NOW IN OPERATION ARE…

  11. Trace analysis in the food and beverage industry by capillary gas chromatography: system performance and maintenance.

    PubMed

    Hayes, M A

    1988-04-01

    Gas chromatography (GC) is the most widely used analytical technique in the food and beverage industry. This paper addresses the problems of sample preparation and system maintenance to ensure the most sensitive, durable, and efficient results for trace analysis by GC in this industry.

  12. Techniques for the inorganic analysis in a controlled ecological life support system

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Sample preparation, instrumental analysis, a direct injection device for IC/ICP interfacing, a hydraulic jet bend impaction device, monodisperse aerosol generator, and low gravity environments are discussed.

  13. Stochastic subspace identification for operational modal analysis of an arch bridge

    NASA Astrophysics Data System (ADS)

    Loh, Chin-Hsiung; Chen, Ming-Che; Chao, Shu-Hsien

    2012-04-01

    In this paper the application of an output-only system identification technique, known as Stochastic Subspace Identification (SSI) algorithms, to civil infrastructure is carried out. The ability of covariance-driven stochastic subspace identification (SSI-COV) was proved through the analysis of ambient data from an arch bridge under operational conditions. A newly developed signal processing technique, Singular Spectrum Analysis (SSA), capable of smoothing noisy signals, is adopted for pre-processing the recorded data before the SSI. The conjunction of SSA and SSI-COV provides a useful criterion for determining the system order. With the aim of estimating accurate modal parameters of the structure in off-line analysis, a stabilization diagram is constructed by plotting the identified poles of the system while increasing the size of the data Hankel matrix. An identification task on a real structure, the Guandu Bridge, is carried out to identify the system natural frequencies and mode shapes. The uncertainty of the identified modal parameters from output-only measurement of the bridge under operational conditions, such as temperature and traffic loading, is discussed.
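    The SSA pre-processing step mentioned above can be sketched as follows: embed the series in a Hankel (trajectory) matrix, truncate its SVD, and reconstruct by averaging along anti-diagonals. This is a generic SSA sketch on a synthetic noisy sinusoid; the window length and rank are arbitrary choices, not the authors' settings.

    ```python
    import numpy as np

    def ssa_smooth(x, window, rank):
        """Singular Spectrum Analysis smoothing: Hankel embedding,
        rank-truncated SVD, then anti-diagonal (Hankel) averaging."""
        n = len(x)
        k = n - window + 1
        X = np.column_stack([x[i:i + window] for i in range(k)])  # window x k
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        out = np.zeros(n)
        counts = np.zeros(n)
        for j in range(k):            # spread columns back onto the series
            out[j:j + window] += Xr[:, j]
            counts[j:j + window] += 1
        return out / counts

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 500)
    clean = np.sin(2 * np.pi * 5 * t)             # one structural "mode"
    noisy = clean + 0.5 * rng.standard_normal(500)
    smooth = ssa_smooth(noisy, window=50, rank=2)  # a sinusoid has rank 2
    print(np.std(noisy - clean), np.std(smooth - clean))
    ```

    Rank 2 suffices here because the trajectory matrix of a single sinusoid spans exactly two components, which is also why SSA pairs well with modal identification.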

  14. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
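    Once such a tool has generated a Markov reliability model, evaluating it is routine. A minimal sketch, assuming a hypothetical two-unit redundant system with an arbitrary failure rate (not any system from the paper): the transient state distribution of the continuous-time Markov chain is computed by uniformization and checked against the closed-form reliability.

    ```python
    import numpy as np

    def ctmc_transient(Q, p0, t, terms=60):
        """Transient distribution of a continuous-time Markov chain by
        uniformization: p(t) = sum_k Poisson(k; q*t) * p0 @ P^k,
        with P = I + Q/q and q >= max exit rate."""
        q = float(max(-np.diag(Q)))
        P = np.eye(len(Q)) + Q / q
        weight = np.exp(-q * t)          # Poisson term k = 0
        term = p0.astype(float).copy()
        out = weight * term
        for k in range(1, terms):
            term = term @ P
            weight *= q * t / k
            out = out + weight * term
        return out

    lam = 1e-3    # per-hour failure rate of one unit (illustrative number)
    # States: 0 = both units up, 1 = one up, 2 = system failed (absorbing)
    Q = np.array([[-2 * lam, 2 * lam, 0.0],
                  [0.0, -lam, lam],
                  [0.0, 0.0, 0.0]])

    p = ctmc_transient(Q, np.array([1.0, 0.0, 0.0]), t=100.0)
    reliability = 1.0 - p[2]
    analytic = 2 * np.exp(-lam * 100.0) - np.exp(-2 * lam * 100.0)
    print(reliability, analytic)
    ```

    The point of the abstract is that building the matrix Q from a design description, not solving it, is the bottleneck such tools should automate.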

  15. Accelerometer Data Analysis and Presentation Techniques

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
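    One of the time-domain reductions listed above, interval RMS acceleration, is just a windowed root-mean-square. A minimal sketch; the sample rate, interval length, and the synthetic 1 mg vibration at 17 Hz are illustrative stand-ins, not actual SAMS parameters or data.

    ```python
    import numpy as np

    def interval_rms(accel, fs, interval):
        """Interval RMS: split the acceleration series into windows of
        `interval` seconds and report one RMS value per window."""
        n = int(fs * interval)
        trimmed = accel[:len(accel) // n * n].reshape(-1, n)
        return np.sqrt(np.mean(trimmed ** 2, axis=1))

    fs = 250.0                                   # Hz (assumed sample rate)
    t = np.arange(0, 8.0, 1.0 / fs)
    accel = 1e-3 * np.sin(2 * np.pi * 17 * t)    # 1 mg vibration at 17 Hz
    rms = interval_rms(accel, fs, interval=2.0)  # four 2-second windows
    print(rms)
    ```

    Each window spans an integer number of vibration cycles here, so every interval RMS equals the textbook sinusoid value of amplitude divided by the square root of two.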

  16. System reliability and recovery.

    DOT National Transportation Integrated Search

    1971-06-01

    The paper exhibits a variety of reliability techniques applicable to future ATC data processing systems. Presently envisioned schemes for error detection, error interrupt and error analysis are considered, along with methods of retry, reconfiguration...

  17. Histology image analysis for carcinoma detection and grading

    PubMed Central

    He, Lei; Long, L. Rodney; Antani, Sameer; Thoma, George R.

    2012-01-01

    This paper presents an overview of image analysis techniques in the domain of histopathology, specifically for the objective of automated carcinoma detection and classification. As in other biomedical imaging areas such as radiology, many computer-assisted diagnosis (CAD) systems have been implemented to aid histopathologists and clinicians in cancer diagnosis and research, and to significantly reduce the labor and subjectivity of traditional manual intervention with histology images. The task of automated histology image analysis is usually not simple, due to the unique characteristics of histology imaging, including the variability in image preparation techniques, clinical interpretation protocols, and the complex structures and very large size of the images themselves. In this paper we discuss those characteristics, provide relevant background information about slide preparation and interpretation, and review the application of digital image processing techniques to the field of histology image analysis. In particular, emphasis is given to state-of-the-art image segmentation methods for feature extraction and disease classification. Four major carcinomas of cervix, prostate, breast, and lung are selected to illustrate the functions and capabilities of existing CAD systems. PMID:22436890

  18. DIMENSIONS OF SIMULATION.

    ERIC Educational Resources Information Center

    CRAWFORD, MEREDITH P.

    OPEN AND CLOSED LOOP SIMULATION IS DISCUSSED FROM THE VIEWPOINT OF RESEARCH AND DEVELOPMENT IN TRAINING TECHNIQUES. AREAS DISCUSSED INCLUDE--(1) OPEN-LOOP ENVIRONMENTAL SIMULATION, (2) SIMULATION NOT INVOLVING PEOPLE, (3) ANALYSIS OF OCCUPATIONS, (4) SIMULATION FOR TRAINING, (5) REAL-SIZE SYSTEM SIMULATION, (6) TECHNIQUES OF MINIATURIZATION, AND…

  19. Evaluation of Ares-I Control System Robustness to Uncertain Aerodynamics and Flex Dynamics

    NASA Technical Reports Server (NTRS)

    Jang, Jiann-Woei; VanTassel, Chris; Bedrossian, Nazareth; Hall, Charles; Spanos, Pol

    2008-01-01

    This paper discusses the application of robust control theory to evaluate robustness of the Ares-I control systems. Three techniques for estimating upper and lower bounds of uncertain parameters which yield stable closed-loop response are used here: (1) Monte Carlo analysis, (2) mu analysis, and (3) characteristic frequency response analysis. All three methods are used to evaluate stability envelopes of the Ares-I control systems with uncertain aerodynamics and flex dynamics. The results show that characteristic frequency response analysis is the most effective of these methods for assessing robustness.
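    The Monte Carlo approach named above can be illustrated on a toy model: sample uncertain parameters, form the closed-loop characteristic polynomial, and count the fraction of stable draws. The second-order plant and the parameter ranges below are hypothetical stand-ins, not the Ares-I dynamics.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def closed_loop_stable(wn, zeta, k):
        """Stability check for a toy plant wn^2/(s^2 + 2*zeta*wn*s + wn^2)
        under proportional feedback gain k: the closed-loop characteristic
        polynomial is s^2 + 2*zeta*wn*s + wn^2*(1 + k), and stability
        requires every root in the open left half-plane."""
        poles = np.roots([1.0, 2 * zeta * wn, wn ** 2 * (1 + k)])
        return np.all(poles.real < 0)

    # Monte Carlo over uncertain frequency, damping, and loop gain
    trials = 2000
    stable = sum(
        closed_loop_stable(rng.uniform(1.0, 3.0),    # wn, rad/s
                           rng.uniform(0.05, 0.4),   # zeta
                           rng.uniform(-2.0, 1.0))   # gain uncertainty
        for _ in range(trials))
    frac = stable / trials
    print(f"{frac:.3f} of sampled parameter sets are stable")
    ```

    For this toy plant the true stable fraction is the probability that 1 + k stays positive, about two thirds, which the sampled estimate approaches.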

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucia, M., E-mail: mlucia@pppl.gov; Kaita, R.; Majeski, R.

    The Materials Analysis and Particle Probe (MAPP) is a compact in vacuo surface science diagnostic, designed to provide in situ surface characterization of plasma facing components in a tokamak environment. MAPP has been implemented for operation on the Lithium Tokamak Experiment at Princeton Plasma Physics Laboratory (PPPL), where all control and analysis systems are currently under development for full remote operation. Control systems include vacuum management, instrument power, and translational/rotational probe drive. Analysis systems include onboard Langmuir probes and all components required for x-ray photoelectron spectroscopy, low-energy ion scattering spectroscopy, direct recoil spectroscopy, and thermal desorption spectroscopy surface analysis techniques.

  1. Contingency Analysis Post-Processing With Advanced Computing and Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin

    Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation for improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy and fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Currently, many researchers have proposed different techniques to accelerate the computational speed of contingency analysis, but not much work has been published on how to post-process the large amount of contingency outputs quickly. This paper proposes a parallel post-processing function that can analyze contingency analysis outputs faster and display them in a web-based visualization tool to help power engineers improve their work efficiency through faster information digestion. Case studies using an ESCA-60 bus system and a WECC planning system are presented to demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.
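    Stripped to its core, the parallel post-processing idea is an embarrassingly parallel map over per-contingency outputs. A sketch with synthetic branch-flow results and an assumed 1.0 p.u. thermal limit; a thread pool stands in for whatever parallel backend the paper actually used.

    ```python
    import random
    from concurrent.futures import ThreadPoolExecutor

    def post_process(contingency):
        """Screen one contingency's post-fault branch flows for limit
        violations (a stand-in for the per-contingency screening step)."""
        cid, flows = contingency
        violations = [(br, f) for br, f in flows.items() if abs(f) > 1.0]
        return cid, violations

    # Synthetic outputs: 200 contingencies x 50 branch flows in p.u.
    random.seed(7)
    cases = [(f"ctg-{i}",
              {f"branch-{b}": random.uniform(0.0, 1.3) for b in range(50)})
             for i in range(200)]

    with ThreadPoolExecutor(max_workers=8) as pool:
        results = dict(pool.map(post_process, cases))

    flagged = [cid for cid, v in results.items() if v]
    print(f"{len(flagged)} of {len(cases)} contingencies show limit violations")
    ```

    Because each contingency is screened independently, the map scales with worker count, which is the property the paper exploits before handing summaries to the visualization tool.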

  2. Differences in head impulse test results due to analysis techniques.

    PubMed

    Cleworth, Taylor W; Carpenter, Mark G; Honegger, Flurin; Allum, John H J

    2017-01-01

    Different analysis techniques are used to define vestibulo-ocular reflex (VOR) gain between eye and head angular velocity during the video head impulse test (vHIT). Comparisons would aid selection of the gain techniques best related to head impulse characteristics and promote standardisation. Our objective was to compare and contrast known methods of calculating vHIT VOR gain. We examined lateral canal vHIT responses recorded from 20 patients twice within 13 weeks of acute unilateral peripheral vestibular deficit onset. Ten patients were tested with an ICS Impulse system (GN Otometrics) and 10 with an EyeSeeCam (ESC) system (Interacoustics). Mean gain and variance were computed with area, average-sample, and regression techniques over specific head angular velocity (HV) and acceleration (HA) intervals. Results for the same gain technique were not different between measurement systems. Area and average-sample gains yielded equally lower variances than regression techniques. Gains computed over the whole impulse duration were larger than those computed for increasing HV. Gain over decreasing HV was associated with larger variances. Gains computed around peak HV were smaller than those computed around peak HA. The median gain over 50-70 ms was not different from gain around peak HV. However, depending on the technique used, the gain over increasing HV was different from gain around peak HA. Conversion equations between gains obtained with standard ICS and ESC methods were computed. For low gains, the conversion was dominated by a constant that needed to be added to ESC gains to equal ICS gains. We recommend manufacturers standardize vHIT gain calculations using 2 techniques: area gain around peak HA and peak HV.
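    Two of the gain definitions compared above, an area gain over a window around peak head velocity and a regression (slope) gain, can be sketched on a synthetic impulse. The bell-shaped impulse, sample rate, true gain, and noise level are all illustrative assumptions, not recorded patient data.

    ```python
    import numpy as np

    fs = 250.0                          # Hz, assumed video sample rate
    t = np.arange(0, 0.2, 1.0 / fs)     # one 200 ms head impulse
    head = 200.0 * np.exp(-((t - 0.08) / 0.03) ** 2)   # head velocity, deg/s
    rng = np.random.default_rng(3)
    eye = 0.6 * head + 2.0 * rng.standard_normal(len(t))  # deficient VOR, gain 0.6

    window = head > 0.5 * head.max()    # samples around peak head velocity

    # Area gain: ratio of summed eye to summed head velocity in the window
    area_gain = eye[window].sum() / head[window].sum()
    # Regression gain: slope of eye velocity against head velocity
    slope, intercept = np.polyfit(head[window], eye[window], 1)
    print(area_gain, slope)
    ```

    Both estimators recover the simulated gain here; the paper's point is that on real impulses the choice of window and estimator shifts the result, hence the call for standardisation.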

  3. Towards the use of Structural Loop Analysis to Study System Behaviour of Socio-Ecological Systems.

    NASA Astrophysics Data System (ADS)

    Abram, Joseph; Dyke, James

    2016-04-01

    Maintaining socio-ecological systems in desirable states is key to developing a growing economy, alleviating poverty and achieving a sustainable future. While the driving forces of an environmental system are often well known, the dynamics affecting these drivers can be hidden within a tangled structure of causal chains and feedback loops. A lack of understanding of a system's dynamic structure and its influence on the system's behaviour can cause unforeseen side-effects during model scenario testing and policy implementation. Structural loop analysis of socio-ecological system models identifies dominant feedback structures during times of behavioural shift, allowing the user to monitor key influential drivers during model simulation. This work carries out Loop Eigenvalue Elasticity Analysis (LEEA) on three system dynamics models, exploring tipping points in lake systems undergoing eutrophication. The purpose is to explore the potential benefits and limitations of the technique in the field of socio-ecology. The LEEA technique shows promise for socio-ecological systems which undergo regime shifts or express oscillatory trends, but shows limited usefulness with large models. The results of this work highlight changes in feedback loop dominance years prior to eutrophic tipping events in lake systems. LEEA could be used as an early warning signal of impending system changes, complementary to other known early warning signals. This approach could improve our understanding during critical times of a system's behaviour, changing how we approach model analysis and the way scenario testing and policy implementation are addressed in socio-ecological system models.
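    The computation at the heart of LEEA, elasticities of a dominant eigenvalue with respect to the entries (links) of a linearized system matrix, follows from standard eigenvalue perturbation theory. The 2x2 matrix below is a hypothetical linearization, not one of the paper's lake models; a useful check is that the elasticities of any matrix sum to one.

    ```python
    import numpy as np

    def eigenvalue_elasticities(A):
        """Link elasticities of the dominant eigenvalue of A:
        e_ij = (a_ij / lambda) * d(lambda)/d(a_ij), where perturbation
        theory gives d(lambda)/d(a_ij) = l_i * r_j / (l . r) for the
        matching left (l) and right (r) eigenvectors."""
        w, R = np.linalg.eig(A)
        wl, L = np.linalg.eig(A.T)                # left eigenvectors of A
        k = int(np.argmax(w.real))                # dominant eigenvalue
        kl = int(np.argmin(np.abs(wl - w[k])))    # matching left vector
        r, l = R[:, k], L[:, kl]
        sens = np.outer(l, r) / (l @ r)           # d(lambda)/d(a_ij)
        return w[k], A * sens / w[k]              # elasticity matrix

    # Toy 2-stock linearization with a positive dominant eigenvalue,
    # i.e. a system drifting toward a regime shift
    A = np.array([[-0.5, 0.3],
                  [0.8, -0.2]])
    dom, E = eigenvalue_elasticities(A)
    print(dom)
    print(E)
    ```

    Tracking how the entries of E shift during simulation is what lets LEEA attribute a behavioural change to particular feedback links.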

  4. The use of interactive graphic displays for interpretation of surface design parameters

    NASA Technical Reports Server (NTRS)

    Talcott, N. A., Jr.

    1981-01-01

    An interactive computer graphics technique known as the Graphic Display Data method has been developed to provide a convenient means for rapidly interpreting large amounts of surface design data. The display technique should prove valuable in such disciplines as aerodynamic analysis, structural analysis, and experimental data analysis. To demonstrate the system's features, an example is presented of the Graphic Display Data method used as an interpretive tool for radiation equilibrium temperature distributions over the surface of an aerodynamic vehicle. Color graphic displays were also examined as a logical extension of the technique to improve its clarity and to allow the presentation of greater detail in a single display.

  5. Dynamics of aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Schmidt, David K.

    1991-01-01

    The focus of this research was to address the modeling, including model reduction, of flexible aerospace vehicles, with special emphasis on models used in dynamic analysis and/or guidance and control system design. In the modeling, it is critical that the key aspects of the system being modeled be captured in the model. In this work, therefore, aspects of the vehicle dynamics critical to control design were important. In this regard, fundamental contributions were made in the areas of stability robustness analysis techniques, model reduction techniques, and literal approximations for key dynamic characteristics of flexible vehicles. All these areas are related. In the development of a model, approximations are always involved, so control systems designed using these models must be robust against uncertainties in these models.

  6. Study on development system of increasing gearbox for high-performance wind-power generator

    NASA Astrophysics Data System (ADS)

    Xu, Hongbin; Yan, Kejun; Zhao, Junyu

    2005-12-01

    Based on an analysis of the development potential of wind-power generators and the domestic manufacture of their key parts in China, an independent development system for the Increasing Gearbox for High-performance Wind-power Generators (IGHPWG) was introduced. The main elements of the system were studied, including procedure design, the design analysis system, manufacturing technology and the detecting system, and the relevant key technologies were analyzed, such as the mixed optimal joint transmission structure combining a first-stage planetary drive with a two-stage parallel-axle drive based on equal strength, tooth-root round cutting before milling of hard tooth surfaces, high-precision tooth grinding, heat-treatment optimization, complex surface techniques, and the rig test and detection technique for the IGHPWG. The development concept advanced data sharing and a quality assurance system through all elements of the development system. Increasing gearboxes for 600 kW and 1 MW wind-power generators have been successfully developed through application of this development system.

  7. Optical system design, analysis, and production for advanced technology systems; Proceedings of the Meeting, Innsbruck, Austria, Apr. 15-17, 1986

    NASA Technical Reports Server (NTRS)

    Fischer, Robert E. (Editor); Rogers, Philip J. (Editor)

    1986-01-01

    The present conference considers topics in the fields of optical systems design software, the design and analysis of optical systems, illustrative cases of advanced optical system design, the integration of optical designs into greater systems, and optical fabrication and testing techniques. Attention is given to an extended range diffraction-based merit function for lens design optimization, an assessment of technologies for stray light control and evaluation, the automated characterization of IR systems' spatial resolution, a spectrum of design techniques based on aberration theory, a three-field IR telescope, a large aperture zoom lens for 16-mm motion picture cameras, and the use of concave holographic gratings as monochromators. Also discussed are the use of aspherics in optical systems, glass choice procedures for periscope design, the fabrication and testing of unconventional optics, low mass mirrors for large optics, and the diamond grinding of optical surfaces on aspheric lens molds.

  8. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2002-07

    USGS Publications Warehouse

    Pearson, D.K.; Gary, R.H.; Wilson, Z.D.

    2007-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.

  9. Cocrystal screening of hydroxybenzamides with benzoic acid derivatives: a comparative study of thermal and solution-based methods.

    PubMed

    Manin, Alex N; Voronin, Alexander P; Drozd, Ksenia V; Manin, Nikolay G; Bauer-Brandl, Annette; Perlovich, German L

    2014-12-18

    The main problem occurring at the early stages of cocrystal search is the choice of an effective screening technique. Among the most popular techniques for obtaining cocrystals are crystallization from solution, crystallization from melt and solvent-drop grinding. This paper presents a comparative analysis of the following screening techniques: the DSC cocrystal screening method, thermal microscopy and the saturation temperature method. The efficiency of the different cocrystal screening techniques was checked in 18 systems. Benzamide and benzoic acid derivatives were chosen as model systems due to their ability to form the acid-amide supramolecular heterosynthon. The screening confirmed the formation of 6 new cocrystals. Screening by the saturation temperature method has the highest screen-out rate but the smallest range of application. DSC screening has satisfactory accuracy and allows screening over a short time. Thermal microscopy is most efficient as an additional technique used to interpret ambiguous DSC screening results. The study also included an analysis of the influence of solvent type and component solubility on cocrystal formation. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Solar energy system performance evaluation report for IBM System 3, Glendo, Wyoming

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The analysis used was based on instrumented system data monitored and collected for at least one full season of operation. The long-term field performance of the installed system is described. Technical contributions to the definition of techniques and requirements for solar energy system design are given.

  11. Computer-aided analysis of star shot films for high-accuracy radiation therapy treatment units

    NASA Astrophysics Data System (ADS)

    Depuydt, Tom; Penne, Rudi; Verellen, Dirk; Hrbacek, Jan; Lang, Stephanie; Leysen, Katrien; Vandevondel, Iwein; Poels, Kenneth; Reynders, Truus; Gevaert, Thierry; Duchateau, Michael; Tournel, Koen; Boussaer, Marlies; Cosentino, Dorian; Garibaldi, Cristina; Solberg, Timothy; De Ridder, Mark

    2012-05-01

    As mechanical stability of radiation therapy treatment devices has gone beyond sub-millimeter levels, there is a rising demand for simple yet highly accurate measurement techniques to support the routine quality control of these devices. A combination of using high-resolution radiosensitive film and computer-aided analysis could provide an answer. One generally known technique is the acquisition of star shot films to determine the mechanical stability of rotations of gantries and the therapeutic beam. With computer-aided analysis, mechanical performance can be quantified as a radiation isocenter radius size. In this work, computer-aided analysis of star shot film is further refined by applying an analytical solution for the smallest intersecting circle problem, in contrast to the gradient optimization approaches used until today. An algorithm is presented and subjected to a performance test using two different types of radiosensitive film, the Kodak EDR2 radiographic film and the ISP EBT2 radiochromic film. Artificial star shots with a priori known radiation isocenter size are used to determine the systematic errors introduced by the digitization of the film and the computer analysis. The estimated uncertainty on the isocenter size measurement with the presented technique was 0.04 mm (2σ) and 0.06 mm (2σ) for radiographic and radiochromic films, respectively. As an application of the technique, a study was conducted to compare the mechanical stability of O-ring gantry systems with C-arm-based gantries. In total ten systems of five different institutions were included in this study and star shots were acquired for gantry, collimator, ring, couch rotations and gantry wobble. It was not possible to draw general conclusions about differences in mechanical performance between O-ring and C-arm gantry systems, mainly due to differences in the beam-MLC alignment procedure accuracy. Nevertheless, the best performing O-ring system in this study, a BrainLab/MHI Vero system, and the best performing C-arm system, a Varian Truebeam system, showed comparable mechanical performance: gantry isocenter radius of 0.12 and 0.09 mm, respectively, ring/couch rotation of below 0.10 mm for both systems and a wobble of 0.06 and 0.18 mm, respectively. The methodology described in this work can be used to monitor mechanical performance constancy of high-accuracy treatment devices, with means available in a clinical radiation therapy environment.
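    The smallest-intersecting-circle computation behind the isocenter radius can be sketched numerically: each star shot spoke is a line, and the radius is the minimax distance from a candidate centre to the lines. A refined grid search stands in for the paper's analytic solution, and the three synthetic spokes are constructed so that the optimum is known exactly: radius 0.05 mm centred at the origin.

    ```python
    import numpy as np

    def line_distance(c, p, d):
        """Perpendicular distance from point c to the line through p
        with unit direction d (one star shot spoke)."""
        v = c - p
        return abs(v[0] * d[1] - v[1] * d[0])

    def isocenter_radius(lines, start, span=1.0, iters=25):
        """Radius of the smallest circle intersecting every spoke:
        minimize the largest centre-to-line distance over candidate
        centres by an iteratively refined grid search (a numerical
        stand-in for the paper's analytic solution)."""
        cost = lambda c: max(line_distance(c, p, d) for p, d in lines)
        best = np.asarray(start, dtype=float)
        for _ in range(iters):
            g = np.linspace(-span, span, 21)
            best = min((best + np.array([dx, dy]) for dx in g for dy in g),
                       key=cost)
            span *= 0.5
        return best, cost(best)

    # Three spokes, each missing the origin by 0.05 mm, with offsets
    # arranged so the optimum is exactly radius 0.05 mm at the origin
    spokes = [(30.0, +0.05), (90.0, -0.05), (150.0, +0.05)]  # (deg, offset mm)
    lines = []
    for deg, offset in spokes:
        ang = np.radians(deg)
        d = np.array([np.cos(ang), np.sin(ang)])   # spoke direction
        normal = np.array([-d[1], d[0]])           # unit normal
        lines.append((offset * normal, d))         # point on line, direction

    center, radius = isocenter_radius(lines, start=[0.4, 0.3])
    print(center, radius)
    ```

    Because the minimax cost is convex, the shrinking grid converges to the unique optimum regardless of the starting guess.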

  12. Interactive visual optimization and analysis for RFID benchmarking.

    PubMed

    Wu, Yingcai; Chung, Ka-Kei; Qu, Huamin; Yuan, Xiaoru; Cheung, S C

    2009-01-01

    Radio frequency identification (RFID) is a powerful automatic remote identification technique that has wide applications. To facilitate RFID deployment, an RFID benchmarking instrument called aGate has been invented to identify the strengths and weaknesses of different RFID technologies in various environments. However, the data acquired by aGate are usually complex time varying multidimensional 3D volumetric data, which are extremely challenging for engineers to analyze. In this paper, we introduce a set of visualization techniques, namely, parallel coordinate plots, orientation plots, a visual history mechanism, and a 3D spatial viewer, to help RFID engineers analyze benchmark data visually and intuitively. With the techniques, we further introduce two workflow procedures (a visual optimization procedure for finding the optimum reader antenna configuration and a visual analysis procedure for comparing the performance and identifying the flaws of RFID devices) for the RFID benchmarking, with focus on the performance analysis of the aGate system. The usefulness and usability of the system are demonstrated in the user evaluation.

  13. Detecting and classifying method based on similarity matching of Android malware behavior with profile.

    PubMed

    Jang, Jae-Wook; Yun, Jaesung; Mohaisen, Aziz; Woo, Jiyoung; Kim, Huy Kang

    2016-01-01

    Mass-market mobile security threats have increased recently due to the growth of mobile technologies and the popularity of mobile devices. Accordingly, techniques have been introduced for identifying, classifying, and defending against mobile threats utilizing static, dynamic, on-device, and off-device techniques. Static techniques are easy to evade, while dynamic techniques are expensive. On-device techniques are resource-constrained, while off-device techniques require constant connectivity. To address some of those shortcomings, we introduce Andro-profiler, a hybrid behavior-based analysis and classification system for mobile malware. Andro-profiler's main goals are efficiency, scalability, and accuracy. To that end, Andro-profiler classifies malware by exploiting behavior profiles extracted from integrated system logs, including system calls. Andro-profiler executes a malicious application on an emulator in order to generate the integrated system logs, and creates human-readable behavior profiles by analyzing those logs. By comparing the behavior profile of a malicious application with the representative behavior profile of each malware family using a weighted similarity matching technique, Andro-profiler detects the malware and classifies it into a family. The experimental results demonstrate that Andro-profiler is scalable, performs well in detecting and classifying malware with accuracy greater than 98%, outperforms the existing state-of-the-art work, and is capable of identifying 0-day mobile malware samples.
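
    The weighted similarity matching step can be sketched as below. The profile representation (feature-to-count maps), the weighted cosine scoring, and the 0.8 decision threshold are illustrative assumptions, not Andro-profiler's actual scoring function.

```python
import math

def weighted_similarity(profile, reference, weights):
    """Weighted cosine similarity between two behavior profiles,
    each a dict mapping behavior feature -> observed count."""
    feats = set(profile) | set(reference)
    num = sum(weights.get(f, 1.0) * profile.get(f, 0) * reference.get(f, 0) for f in feats)
    a = math.sqrt(sum(weights.get(f, 1.0) * profile.get(f, 0) ** 2 for f in feats))
    b = math.sqrt(sum(weights.get(f, 1.0) * reference.get(f, 0) ** 2 for f in feats))
    return num / (a * b) if a and b else 0.0

def classify(profile, family_profiles, weights, threshold=0.8):
    """Assign the profile to the most similar family's representative
    profile, or 'unknown' if no family scores above the threshold."""
    family, score = max(((fam, weighted_similarity(profile, ref, weights))
                         for fam, ref in family_profiles.items()),
                        key=lambda t: t[1])
    return family if score >= threshold else "unknown"
```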

  14. Real-time color image processing for forensic fiber investigations

    NASA Astrophysics Data System (ADS)

    Paulsson, Nils

    1995-09-01

    This paper describes a system for automatic fiber debris detection based on color identification. The properties of the system are fast analysis and high selectivity, a necessity when analyzing forensic fiber samples: an ordinary investigation divides the material into well above 100,000 video images to analyze. The system is based on standard techniques, with a CCD camera, a motorized sample table, and an IBM-compatible PC/AT with add-on boards for video frame digitization and stepping motor control as the main parts. It is possible to operate the instrument at full video rate (25 images/s) with the aid of the HSI (hue-saturation-intensity) color system and software optimization. High selectivity is achieved by separating the analysis into several steps. The first step is fast, direct color identification of objects in the analyzed video images; the second, more complex and time-consuming step analyzes the detected objects to identify single fiber fragments for subsequent analysis with more selective techniques.
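
    The HSI-based color identification step can be illustrated with a standard RGB-to-HSI conversion followed by a hue threshold. The tolerance values and function names below are assumptions for illustration, not the paper's calibrated settings.

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert RGB components in [0, 1] to (hue degrees, saturation, intensity)."""
    i = (r + g + b) / 3.0
    mn = min(r, g, b)
    s = 0.0 if i == 0 else 1.0 - mn / i
    denom = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    if denom == 0:
        h = 0.0  # achromatic pixel: hue undefined, report 0
    else:
        arg = max(-1.0, min(1.0, 0.5 * ((r - g) + (r - b)) / denom))
        theta = math.degrees(math.acos(arg))
        h = theta if b <= g else 360.0 - theta
    return h, s, i

def matches_target(pixel, target_hue, hue_tol=15.0, min_sat=0.2):
    """Flag a pixel whose hue lies within hue_tol degrees of the target fiber color."""
    h, s, _ = rgb_to_hsi(*pixel)
    if s < min_sat:
        return False  # near-grey pixels have unstable hue; skip them
    diff = abs(h - target_hue)
    return min(diff, 360.0 - diff) <= hue_tol
```

    Thresholding on hue rather than raw RGB makes the match largely insensitive to intensity variations, which is what allows a fast first-pass screen at video rate.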

  15. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.
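
    One way to sketch the function-oriented view described above is reverse reachability over a dependency graph: given failed components, find every function transitively impacted, including functions that depend on other failed functions. The graph representation below is a hypothetical simplification for illustration, not the paper's IHA procedure.

```python
def affected_functions(depends_on, failed):
    """Return functions transitively impacted by the failed items.
    depends_on maps function -> list of components/functions it relies on."""
    # invert the graph: item -> functions that directly rely on it
    rev = {}
    for fn, deps in depends_on.items():
        for d in deps:
            rev.setdefault(d, set()).add(fn)
    impacted, frontier = set(), list(failed)
    while frontier:
        node = frontier.pop()
        for fn in rev.get(node, ()):
            if fn not in impacted:
                impacted.add(fn)
                frontier.append(fn)  # a lost function may itself be depended on
    return impacted
```

    For example, if thermal control depends on a power bus and guidance depends on thermal control, failing the power bus impacts both functions, which is the kind of second-order effect the paper highlights.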

  16. Application and Validation of Workload Assessment Techniques

    DTIC Science & Technology

    1993-03-01

    technical report documents the process and outcome of meeting this objective. Procedure: A series of eight separate studies was conducted using three...development process. The task analysis and simulation technique was shown to have the capability to track empirical workload ratings. More research is...operator workload during the systems acquisition process, and (b) a pamphlet for the managers of Army systems that describes the need and some procedures

  17. Real-time processing of interferograms for monitoring protein crystal growth on the Space Station

    NASA Technical Reports Server (NTRS)

    Choudry, A.; Dupuis, N.

    1988-01-01

    The possibility of using microscopic interferometric techniques to monitor the growth of protein crystals on the Space Station is studied. Digital image processing techniques are used to develop a system for the real-time analysis of microscopic interferograms of nucleation sites during protein crystal growth. Features of the optical setup and the image processing system are discussed and experimental results are presented.

  18. High order statistical signatures from source-driven measurements of subcritical fissile systems

    NASA Astrophysics Data System (ADS)

    Mattingly, John Kelly

    1998-11-01

    This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements.
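
    The idea of successively higher order counting statistics can be illustrated with gate-count factorial moments and the Feynman-Y (excess variance-to-mean) statistic, which is zero for an uncorrelated Poisson source and grows with fission-chain correlations. This is a generic illustration of the statistics involved, not the specific signatures developed in the thesis.

```python
def factorial_moment(counts, k):
    """k-th factorial moment E[C(C-1)...(C-k+1)] of the gate counts;
    successively higher k probes higher-order correlations."""
    total = 0.0
    for c in counts:
        term = 1.0
        for j in range(k):
            term *= (c - j)
        total += term
    return total / len(counts)

def feynman_y(counts):
    """Feynman-Y statistic: Y = var/mean - 1 over repeated counting gates.
    Y = 0 for Poisson counts; Y > 0 indicates correlated (chain) events."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return var / mean - 1.0
```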

  19. Recombinant drugs-on-a-chip: The usage of capillary electrophoresis and trends in miniaturized systems - A review.

    PubMed

    Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Aquino, Adriano; Cervantes, Cesar; Carrilho, Emanuel

    2016-09-07

    We present here a critical review covering conventional analytical tools of recombinant drug analysis and discuss their evolution towards miniaturized systems foreseeing a possible unique recombinant drug-on-a-chip device. Recombinant protein drugs and/or pro-drug analysis require sensitive and reproducible analytical techniques for quality control to ensure safety and efficacy of drugs according to regulatory agencies. The versatility of miniaturized systems combined with their low-cost could become a major trend in recombinant drugs and bioprocess analysis. Miniaturized systems are capable of performing conventional analytical and proteomic tasks, allowing for interfaces with other powerful techniques, such as mass spectrometry. Microdevices can be applied during the different stages of recombinant drug processing, such as gene isolation, DNA amplification, cell culture, protein expression, protein separation, and analysis. In addition, organs-on-chips have appeared as a viable alternative to testing biodrug pharmacokinetics and pharmacodynamics, demonstrating the capabilities of the miniaturized systems. The integration of individual established microfluidic operations and analytical tools in a single device is a challenge to be overcome to achieve a unique recombinant drug-on-a-chip device.

  20. Autonomous facial recognition system inspired by human visual system based logarithmical image visualization technique

    NASA Astrophysics Data System (ADS)

    Wan, Qianwen; Panetta, Karen; Agaian, Sos

    2017-05-01

    Autonomous facial recognition systems are widely used in real-life applications, such as homeland border security, law enforcement identification and authentication, and video-based surveillance analysis. Issues like low image quality, non-uniform illumination, and variations in pose and facial expression can impair the performance of recognition systems. To address the non-uniform illumination challenge, we present a novel robust autonomous facial recognition system inspired by the human visual system, based on the so-called logarithmical image visualization technique. In this paper, the proposed method, for the first time, couples the logarithmical image visualization technique with the local binary pattern to perform discriminative feature extraction for facial recognition. The Yale database, the Yale-B database, and the ATT database are used for computer simulation of accuracy and efficiency. The extensive computer simulation demonstrates the method's efficiency, accuracy, and robustness to illumination variation.
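
    The pipeline described above, a logarithmic intensity mapping followed by local binary pattern (LBP) feature extraction, can be sketched as below. The exact logarithmic mapping is an assumption standing in for the paper's visualization technique; the LBP code and histogram are the standard formulation. The sketch assumes a grayscale image with a positive maximum value.

```python
import math

def log_visualize(img):
    """Hypothetical logarithmic mapping mimicking the HVS response:
    rescale intensities to [0, 255] on a log1p curve."""
    mx = max(max(row) for row in img)  # assumed > 0
    return [[255.0 * math.log1p(p) / math.log1p(mx) for p in row] for row in img]

def lbp(img, y, x):
    """Standard 8-neighbor local binary pattern code at interior pixel (y, x)."""
    c = img[y][x]
    nbrs = [img[y-1][x-1], img[y-1][x], img[y-1][x+1], img[y][x+1],
            img[y+1][x+1], img[y+1][x], img[y+1][x-1], img[y][x-1]]
    return sum((1 << i) for i, v in enumerate(nbrs) if v >= c)

def lbp_histogram(img):
    """256-bin LBP histogram over interior pixels: the feature vector
    fed to the face matcher."""
    hist = [0] * 256
    h, w = len(img), len(img[0])
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            hist[lbp(img, y, x)] += 1
    return hist
```

    Because LBP compares each pixel only to its neighbors, the codes are largely invariant to monotonic illumination changes, which is why pairing it with a logarithmic mapping targets the non-uniform illumination problem.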

  1. Structured Analysis of the Logistic Support Analysis (LSA) Task, ’Integrated Logistic Support (ILS) Assessment Maintenance Planning E-1 Element’ (APJ 966-204)

    DTIC Science & Technology

    1988-10-01

    Structured Analysis involves building a logical (non-physical) model of a system, using graphic techniques which enable users, analysts, and designers to... Design uses tools, especially graphic ones, to render systems readily understandable. Structured Design offers a set of strategies for...in the overall systems design process, and an overview of the assessment procedures, as well as a guide to the overall assessment.

  2. Development of Measurement Methods for Detection of Special Nuclear Materials using D-D Pulsed Neutron Source

    NASA Astrophysics Data System (ADS)

    Misawa, Tsuyoshi; Takahashi, Yoshiyuki; Yagi, Takahiro; Pyeon, Cheol Ho; Kimura, Masaharu; Masuda, Kai; Ohgaki, Hideaki

    2015-10-01

    For detection of hidden special nuclear materials (SNMs), we have developed an active neutron-based interrogation system combining a D-D fusion pulsed neutron source and a neutron detection system. In the detection scheme, we have adopted two new measurement techniques simultaneously: neutron noise analysis and neutron energy spectrum analysis. The validity of the neutron noise analysis method has been experimentally studied in the Kyoto University Critical Assembly (KUCA), and it was applied to a cargo container inspection system by simulation.

  3. Procedures for woody vegetation surveys in the Kazgail rural council area, Kordofan, Sudan

    USGS Publications Warehouse

    Falconer, Allan; Cross, Matthew D.; Orr, Donald G.

    1990-01-01

    Efforts to reforest parts of the Kordofan Province of Sudan are receiving support from international development agencies. These efforts include planning and implementing reforestation activities that require the collection of natural resources and socioeconomic data, and the preparation of base maps. A combination of remote sensing, geographic information system, and global positioning system procedures was used in this study to meet these requirements. Remote sensing techniques were used to provide base maps and to guide the compilation of vegetation resources maps. These techniques provided a rapid and efficient method for documenting available resources. Pocket-sized global positioning system units were used to establish the location of field data collected for mapping and resource analysis. A microcomputer data management system tabulated and displayed the field data. The resulting system for data analysis, management, and planning has been adopted for the mapping and inventory of the Gum Belt of Sudan.

  4. P2P-Based Data System for the EAST Experiment

    NASA Astrophysics Data System (ADS)

    Shu, Yantai; Zhang, Liang; Zhao, Weifeng; Chen, Haiming; Luo, Jiarong

    2006-06-01

    A peer-to-peer (P2P)-based EAST Data System is being designed to provide data acquisition and analysis support for the EAST superconducting tokamak. Instead of transferring data to the servers, all collected data are stored in the data acquisition subsystems locally and the PC clients can access the raw data directly using the P2P architecture. Both online and offline systems are based on Napster-like P2P architecture. This allows the peer (PC) to act both as a client and as a server. A simulation-based method and a steady-state operational analysis technique are used for performance evaluation. These analyses show that the P2P technique can significantly reduce the completion time of raw data display and real-time processing on the online system, and raise the workload capacity and reduce the delay on the offline system.

  5. A data analysis expert system for large established distributed databases

    NASA Technical Reports Server (NTRS)

    Gnacek, Anne-Marie; An, Y. Kim; Ryan, J. Patrick

    1987-01-01

    A design for a natural language database interface system, called the Deductively Augmented NASA Management Decision support System (DANMDS), is presented. The DANMDS system components have been chosen on the basis of the following considerations: maximal employment of the existing NASA IBM-PC computers and supporting software; local structuring and storing of external data via the entity-relationship model; a natural, easy-to-use, error-free database query language; user ability to alter the query language vocabulary and data analysis heuristics; and significant artificial intelligence data analysis heuristic techniques that allow the system to become progressively and automatically more useful.

  6. LOCAD-PTS: Operation of a New System for Microbial Monitoring Aboard the International Space Station (ISS)

    NASA Technical Reports Server (NTRS)

    Maule, J.; Wainwright, N.; Steele, A.; Gunter, D.; Flores, G.; Effinger, M.; Danibm N,; Wells, M.; Williams, S.; Morris, H.

    2008-01-01

    Microorganisms within the space stations Salyut and Mir and the International Space Station (ISS) have traditionally been monitored with culture-based techniques. These techniques involve growing environmental samples (cabin water, air or surfaces) on agar-type media for several days, followed by visualization of the resulting colonies and return of samples to Earth for ground-based analysis. This approach has provided a wealth of useful data and enhanced our understanding of the microbial ecology within space stations. However, the approach is also limited by the following: i) More than 95% of microorganisms in the environment cannot grow on conventional growth media; ii) Significant time lags occur between onboard sampling and colony visualization (3-5 days) and ground-based analysis (as long as several months); iii) Colonies are often difficult to visualize due to condensation within contact slide media plates; and iv) Techniques involve growth of potentially harmful microorganisms, which must then be disposed of safely. This report describes the operation of a new culture-independent technique onboard the ISS for rapid analysis (within minutes) of endotoxin and β-1,3-glucan, found in the cell walls of gram-negative bacteria and fungi, respectively. This technique involves analysis of environmental samples with the Limulus Amebocyte Lysate (LAL) assay in a handheld device. This handheld device and sampling system is known as the Lab-On-a-Chip Application Development Portable Test System (LOCAD-PTS). A poster will be presented that describes a comparative study between LOCAD-PTS analysis and existing culture-based methods onboard the ISS; together with an exploratory survey of surface endotoxin throughout the ISS.
It is concluded that while a general correlation between LOCAD-PTS and traditional culture-based methods should not necessarily be expected, a combinatorial approach can be adopted where both sets of data are used together to generate a more complete story of the microbial ecology on the ISS.

  7. DataView: a computational visualisation system for multidisciplinary design and analysis

    NASA Astrophysics Data System (ADS)

    Wang, Chengen

    2016-01-01

    Rapidly processing raw data and effectively extracting underlying information from huge volumes of multivariate data have become essential to all decision-making processes in sectors like finance, government, medical care, climate analysis, industry, and science. Remarkably, visualisation is recognised as a fundamental technology that underpins human comprehension, cognition and utilisation of burgeoning amounts of heterogeneous data. This paper presents a computational visualisation system, named DataView, which has been developed for graphically displaying and capturing outcomes of multiphysics problem-solvers widely used in engineering fields. DataView is functionally composed of techniques for table/diagram representation and graphical illustration of scalar, vector and tensor fields. The field visualisation techniques are implemented on the basis of a range of linear and non-linear meshes, which flexibly adapt to the disparate data representation schemas adopted by a variety of disciplinary problem-solvers. The visualisation system has been successfully applied to a number of engineering problems, of which some illustrations are presented to demonstrate the effectiveness of the visualisation techniques.

  8. Rapid prototyping of nanofluidic systems using size-reduced electrospun nanofibers for biomolecular analysis.

    PubMed

    Park, Seung-Min; Huh, Yun Suk; Szeto, Kylan; Joe, Daniel J; Kameoka, Jun; Coates, Geoffrey W; Edel, Joshua B; Erickson, David; Craighead, Harold G

    2010-11-05

    Biomolecular transport in nanofluidic confinement offers various means to investigate the behavior of biomolecules in their native aqueous environments, and to develop tools for diverse single-molecule manipulations. Recently, a number of simple nanofluidic fabrication techniques have been demonstrated that utilize electrospun nanofibers as a backbone structure. These techniques are limited by the arbitrary dimensions of the resulting nanochannels due to the random nature of electrospinning. Here, a new method for fabricating nanofluidic systems from size-reduced electrospun nanofibers is reported and demonstrated. This method uses the scanned electrospinning technique to generate oriented sacrificial nanofibers and exposes these nanofibers to harsh but isotropic etching/heating environments to reduce their cross-sectional dimension. The creation of various nanofluidic systems as small as 20 nm is demonstrated, and practical examples of single-biomolecule handling, such as DNA elongation in nanochannels and fluorescence correlation spectroscopic analysis of biomolecules passing through nanochannels, are provided.

  9. 5 CFR 551.210 - Computer employees.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...

  10. 5 CFR 551.210 - Computer employees.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...

  11. 5 CFR 551.210 - Computer employees.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...

  12. 5 CFR 551.210 - Computer employees.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., creation or modification of computer programs related to machine operating systems; or (4) A combination of...) Computer systems analysts, computer programmers, software engineers, or other similarly skilled workers in... consist of: (1) The application of systems analysis techniques and procedures, including consulting with...

  13. An intercomparison of five ammonia measurement techniques

    NASA Technical Reports Server (NTRS)

    Williams, E. J.; Sandholm, S. T.; Bradshaw, J. D.; Schendel, J. S.; Langford, A. O.; Quinn, P. K.; Lebel, P. J.; Vay, S. A.; Roberts, P. D.; Norton, R. B.

    1992-01-01

    Results obtained from five techniques for measuring gas-phase ammonia at low concentration in the atmosphere are compared. These methods are: (1) a photofragmentation/laser-induced fluorescence (PF/LIF) instrument; (2) a molybdenum oxide annular denuder sampling/chemiluminescence detection technique; (3) a tungsten oxide denuder sampling/chemiluminescence detection system; (4) a citric-acid-coated denuder sampling/ion chromatographic analysis (CAD/IC) method; and (5) an oxalic-acid-coated filter pack sampling/colorimetric analysis method. It was found that two of the techniques, the PF/LIF and the CAD/IC methods, measured approximately 90 percent of the calculated ammonia added in the spiking tests and agreed very well with each other in the ambient measurements.

  14. Analytical techniques of pilot scanning behavior and their application

    NASA Technical Reports Server (NTRS)

    Harris, R. L., Sr.; Glover, B. J.; Spady, A. A., Jr.

    1986-01-01

    The state of the art of oculometric data analysis techniques and their applications in certain research areas such as pilot workload, information transfer provided by various display formats, crew role in automated systems, and pilot training are documented. These analytical techniques produce the following data: real-time viewing of the pilot's scanning behavior, average dwell times, dwell percentages, instrument transition paths, dwell histograms, and entropy rate measures. These types of data are discussed, and overviews of the experimental setup, data analysis techniques, and software are presented. A glossary of terms frequently used in pilot scanning behavior and a bibliography of reports on related research sponsored by NASA Langley Research Center are also presented.
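
    Of the measures listed, the entropy rate can be sketched from first-order transition frequencies between instruments; lower values indicate more stereotyped scan patterns. The plug-in estimator below is a standard formulation for illustration, not necessarily the one used in the Langley analyses.

```python
import math
from collections import Counter

def transition_entropy_rate(sequence):
    """Entropy rate (bits per transition) of an instrument-scanning
    sequence, estimated from first-order transition frequencies."""
    pairs = list(zip(sequence, sequence[1:]))
    pair_counts = Counter(pairs)
    from_counts = Counter(a for a, _ in pairs)
    total = len(pairs)
    h = 0.0
    for (a, _b), n in pair_counts.items():
        p_pair = n / total          # probability of this transition overall
        p_cond = n / from_counts[a]  # probability of it given the source instrument
        h -= p_pair * math.log2(p_cond)
    return h
```

    A strictly alternating scan (e.g. attitude indicator to airspeed and back) has zero entropy rate, while more exploratory scanning yields higher values.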

  15. Proceedings of the second annual Forest Inventory and Analysis symposium; Salt Lake City, UT. October 17-18, 2000

    Treesearch

    Gregory A. Reams; Ronald E. McRoberts; Paul C. van Deusen; [Editors

    2001-01-01

    Documents progress in developing techniques in remote sensing, statistics, information management, and analysis required for full implementation of the national Forest Inventory and Analysis program’s annual forest inventory system.

  16. Safety Analysis and Protection Measures of the Control System of the Pulsed High Magnetic Field Facility in WHMFC

    NASA Astrophysics Data System (ADS)

    Shi, J. T.; Han, X. T.; Xie, J. F.; Yao, L.; Huang, L. T.; Li, L.

    2013-03-01

    A Pulsed High Magnetic Field Facility (PHMFF) has been established in the Wuhan National High Magnetic Field Center (WHMFC), and various protection measures are applied in its control system. In order to improve the reliability and robustness of the control system, a safety analysis of the PHMFF is carried out based on the Fault Tree Analysis (FTA) technique. The functions and realization of five protection systems, which include a sequence experiment operation system, a safety assistant system, an emergency stop system, a fault detecting and processing system, and an accident isolating protection system, are given. Tests and operation indicate that these measures improve the safety of the facility and ensure the safety of personnel.
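
    The FTA step can be illustrated by computing a top-event probability from minimal cut sets of independent basic events via inclusion-exclusion. The cut sets and probabilities in the usage example are hypothetical, not those of the PHMFF fault tree.

```python
from itertools import combinations

def cut_set_prob(events, p):
    """Probability that every basic event in the set occurs
    (basic events assumed independent)."""
    prod = 1.0
    for e in events:
        prod *= p[e]
    return prod

def top_event_probability(cut_sets, p):
    """Exact top-event probability via inclusion-exclusion over the
    minimal cut sets (each a set of basic-event names)."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            union = set().union(*combo)
            total += (-1) ** (k + 1) * cut_set_prob(union, p)
    return total
```

    For example, with two single-event cut sets {A} and {B} and P(A)=0.1, P(B)=0.2, the top event probability is 0.1 + 0.2 - 0.02 = 0.28. The rare-event approximation (simply summing cut-set probabilities) is common in practice when probabilities are small.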

  17. An interdisciplinary analysis of ERTS data for Colorado mountain environments using ADP techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1972-01-01

    There are no author-identified significant results in this report. Research efforts have been placed on: (1) location, acquisition, and preparation of baseline information necessary for the computer analysis, and (2) refinement of techniques for analysis of MSS data obtained from ERTS-1. Analysis of the first frame of data collected by the ERTS-1 multispectral scanner system over the Lake Texoma area has proven very valuable for determining the best procedures to follow in working with and analyzing ERTS data. Progress on the following projects is described: (1) cover type mapping; (2) geomorphology; and (3) hydrologic feature surveys.

  18. Heterodyne laser spectroscopy system

    DOEpatents

    Wyeth, Richard W.; Paisner, Jeffrey A.; Story, Thomas

    1990-01-01

    A heterodyne laser spectroscopy system utilizes laser heterodyne techniques for purposes of laser isotope separation spectroscopy, vapor diagnostics, processing of precise laser frequency offsets from a reference frequency, and provides spectral analysis of a laser beam.

  19. General Analytical Schemes for the Characterization of Pectin-Based Edible Gelled Systems

    PubMed Central

    Haghighi, Maryam; Rezaei, Karamatollah

    2012-01-01

    Pectin-based gelled systems have gained increasing attention for the design of newly developed food products. For this reason, the characterization of such formulas is a necessity in order to present scientific data and to introduce an appropriate finished product to the industry. Various analytical techniques are available for the evaluation of the systems formulated on the basis of pectin and the designed gel. In this paper, general analytical approaches for the characterization of pectin-based gelled systems were categorized into several subsections including physicochemical analysis, visual observation, textural/rheological measurement, microstructural image characterization, and psychorheological evaluation. Three-dimensional trials to assess correlations among microstructure, texture, and taste were also discussed. Practical examples of advanced objective techniques, including experimental setups for small and large deformation rheological measurements and microstructural image analysis, were presented in more detail. PMID:22645484

  20. Diagnostics of wear in aeronautical systems

    NASA Technical Reports Server (NTRS)

    Wedeven, L. D.

    1979-01-01

    The use of appropriate diagnostic tools for aircraft oil-wetted components is reviewed, noting that it can reduce direct operating costs through reduced unscheduled maintenance, particularly in helicopter engine and transmission systems where bearing failures are a significant cost factor. Engine and transmission wear modes are described, and diagnostic methods for oil and wear particle analysis, the spectrometric oil analysis program, chip detectors, ferrography, in-line oil monitoring, and radioactive isotope tagging are discussed, noting that they are effective over a limited range of particle sizes but complement each other if used in parallel. Fine filtration can potentially increase time between overhauls, but reduces the effectiveness of conventional oil monitoring techniques so that alternative diagnostic techniques must be used. It is concluded that the development of a diagnostic system should be parallel and integral with the development of a mechanical system.

  1. Apollo management: A key to the solution of the social-economical dilemma - The transferability of space-travel managerial techniques to the civil sector

    NASA Technical Reports Server (NTRS)

    Puttkamer, J. V.

    1973-01-01

    An analysis has been conducted to find out whether the management techniques developed in connection with the Apollo project could be used for dealing with such urgent problems of modern society as the crisis of the cities, the increasing environmental pollution, and the steadily growing traffic. Basic concepts and definitions of program and system management are discussed together with details regarding the employment of these concepts in connection with the solution of the problems of the Apollo program. Principles and significance of a systems approach are considered, giving attention to planning, system analysis, system integration, and project management. An application of the methods of project management to the problems of the civil sector is possible if the special characteristics of each particular case are taken into account.

  2. Interpreting Popov criteria in Lur'e systems with complex scaling stability analysis

    NASA Astrophysics Data System (ADS)

    Zhou, J.

    2018-06-01

    The paper presents a novel frequency-domain interpretation of Popov criteria for absolute stability in Lur'e systems by means of what we call complex scaling stability analysis. The complex scaling technique is developed for exponential/asymptotic stability in LTI feedback systems, and it dispenses with open-loop pole distribution analysis, contour/locus orientation, and prior frequency sweeping. Exploiting the technique to alternatively reveal positive realness of transfer functions, a re-interpretation of the Popov criteria is explicated. More specifically, the suggested frequency-domain stability conditions are conformable in both the scalar and multivariable cases, and can be implemented either graphically with locus plotting or numerically without it; in particular, the latter is suitable as a design tool with auxiliary parameter freedom. The interpretation also reveals further frequency-domain facts about Lur'e systems. Numerical examples are included to illustrate the main results.
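
    For reference, the classical Popov inequality that the paper re-interprets, Re[(1 + jωq)G(jω)] + 1/k > 0 for a sector [0, k] nonlinearity and some q ≥ 0, can be checked numerically on a frequency grid. This is a sketch of the textbook criterion, not the paper's complex scaling test, and a grid check is evidence rather than a proof.

```python
def popov_satisfied(G, k, q, omegas):
    """Check the Popov inequality Re[(1 + j*w*q) * G(j*w)] + 1/k > 0
    on a grid of positive frequencies w, for transfer function G(s)."""
    return all(((1 + 1j * w * q) * G(1j * w)).real + 1.0 / k > 0 for w in omegas)
```

    For example, G(s) = 1/(s + 1) with q = 1 gives Re[(1 + jω)G(jω)] = 1 at every frequency, so the inequality holds for any finite sector bound k.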

  3. Phase-amplitude imaging: its application to fully automated analysis of magnetic field measurements in laser-produced plasmas.

    PubMed

    Kalal, M; Nugent, K A; Luther-Davies, B

    1987-05-01

    An interferometric technique which enables simultaneous phase and amplitude imaging of optically transparent objects is discussed with respect to its application for the measurement of spontaneous toroidal magnetic fields generated in laser-produced plasmas. It is shown that this technique can replace the normal independent pair of optical systems (interferometry and shadowgraphy) by one system and use computer image processing to recover both the plasma density and magnetic field information with high accuracy. A fully automatic algorithm for the numerical analysis of the data has been developed and its performance demonstrated for the case of simulated as well as experimental data.

  4. Phase-amplitude imaging: its application to fully automated analysis of magnetic field measurements in laser-produced plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalal, M.; Nugent, K.A.; Luther-Davies, B.

    1987-05-01

    An interferometric technique which enables simultaneous phase and amplitude imaging of optically transparent objects is discussed with respect to its application for the measurement of spontaneous toroidal magnetic fields generated in laser-produced plasmas. It is shown that this technique can replace the normal independent pair of optical systems (interferometry and shadowgraphy) by one system and use computer image processing to recover both the plasma density and magnetic field information with high accuracy. A fully automatic algorithm for the numerical analysis of the data has been developed and its performance demonstrated for the case of simulated as well as experimental data.

  5. Optimization of binary thermodynamic and phase diagram data

    NASA Astrophysics Data System (ADS)

    Bale, Christopher W.; Pelton, A. D.

    1983-03-01

    An optimization technique based upon least squares regression is presented to permit the simultaneous analysis of diverse experimental binary thermodynamic and phase diagram data. Coefficients of polynomial expansions for the enthalpy and excess entropy of binary solutions are obtained which can subsequently be used to calculate the thermodynamic properties or the phase diagram. In an interactive computer-assisted analysis employing this technique, one can critically analyze a large number of diverse data in a binary system rapidly, in a manner which is fully self-consistent thermodynamically. Examples of applications to the Bi-Zn, Cd-Pb, PbCl2-KCl, LiCl-FeCl2, and Au-Ni binary systems are given.
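
    The simultaneous least-squares idea can be illustrated with a small sketch (not the authors' code): fitting two Redlich-Kister-style coefficients of a binary excess enthalpy polynomial to synthetic data. The coefficient values and compositions are invented for the example.

```python
import numpy as np

# Sketch of the simultaneous least-squares idea (not the authors' code):
# fit two Redlich-Kister-style coefficients L0, L1 of a binary excess
# enthalpy polynomial, H_ex = x(1-x)[L0 + L1(1-2x)], to synthetic data.
# Coefficient values and compositions are invented for the example.
x = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
L0_true, L1_true = -8000.0, 1500.0          # J/mol, illustrative only
h_ex = x * (1 - x) * (L0_true + L1_true * (1 - 2 * x))

# Design matrix: one column per unknown coefficient.
A = np.column_stack([x * (1 - x), x * (1 - x) * (1 - 2 * x)])
coeffs, *_ = np.linalg.lstsq(A, h_ex, rcond=None)
L0_fit, L1_fit = coeffs
```

    In the interactive analysis the abstract describes, phase diagram data would add further equations to the same least-squares system; here only "thermodynamic" points are fitted.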

  6. Particle Streak Anemometry: A New Method for Proximal Flow Sensing from Aircraft

    NASA Astrophysics Data System (ADS)

    Nichols, T. W.

    Accurate sensing of relative air flow direction from fixed-wing small unmanned aircraft (sUAS) is challenging with existing multi-hole pitot-static and vane systems. Sub-degree direction accuracy is generally not available on such systems, and disturbances to the local flow field, induced by the airframe, introduce an additional error source. An optical imaging approach to making a relative air velocity measurement with high directional accuracy is presented. Optical methods offer the capability to make a proximal measurement in undisturbed air outside of the local flow field without the need to place sensors on vulnerable probes extended ahead of the aircraft. Current imaging flow analysis techniques for laboratory use rely on relatively thin imaged volumes, sophisticated hardware, and intensity thresholding in low-background conditions. A new method is derived and assessed using a particle streak imaging technique that can be implemented with low-cost commercial cameras and illumination systems, and can function in imaged volumes of arbitrary depth with complex background signal. The new technique, referred to as particle streak anemometry (PSA) to differentiate it from particle streak velocimetry, which makes a field measurement rather than a single bulk flow measurement, utilizes a modified Canny edge detection algorithm with connected component analysis and principal component analysis to detect streak ends in complex imaging conditions. A linear solution for the air velocity direction is then implemented with a random sample consensus (RANSAC) approach. A single-DOF non-linear, non-convex optimization problem is then solved for the air speed through an iterative approach. The technique was tested through simulation and wind tunnel tests, yielding angular accuracies under 0.2 degrees, superior to the performance of existing commercial systems. Air speed error standard deviations varied from 1.6 to 2.2 m/s depending on the implementation technique. While air speed sensing is secondary to accurate flow direction measurement, the air speed results were in line with commercial pitot-static systems at low speeds.
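
    The RANSAC direction step can be sketched minimally as follows. The function name, tolerance, iteration count, and data are all hypothetical, and real streak data would be noisier than this toy example.

```python
import math
import random

# Minimal sketch of a RANSAC direction estimate: find a common 2-D unit
# direction from displacement vectors (e.g. streak endpoint differences)
# while tolerating outliers. All names and numbers are hypothetical.
def ransac_direction(vectors, iters=200, tol=0.05, seed=0):
    rng = random.Random(seed)
    best_dir, best_count = None, -1
    for _ in range(iters):
        vx, vy = rng.choice(vectors)          # minimal sample: one vector
        norm = math.hypot(vx, vy)
        if norm == 0.0:
            continue
        d = (vx / norm, vy / norm)
        # Inliers: vectors nearly collinear with the candidate direction
        # (normalized cross product below the tolerance).
        count = sum(
            1 for v in vectors
            if abs(v[0] * d[1] - v[1] * d[0]) / max(math.hypot(*v), 1e-12) < tol
        )
        if count > best_count:
            best_count, best_dir = count, d
    return best_dir

vecs = [(s * 3.0, s * 4.0) for s in (0.5, 1.0, 1.5, 2.0)]  # along (0.6, 0.8)
vecs += [(5.0, -1.0), (-2.0, 0.5)]                          # outliers
d = ransac_direction(vecs)
```

    The consensus set (four collinear vectors) outvotes the two outliers, so the recovered direction is (0.6, 0.8).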

  7. A Pragmatic Cognitive System Engineering Approach to Model Dynamic Human Decision-Making Activities in Intelligent and Automated Systems

    DTIC Science & Technology

    2003-10-01

    Among the procedures developed to identify cognitive processes, there are the Cognitive Task Analysis (CTA) and the Cognitive Work Analysis (CWA...of Cognitive Task Design. [11] Potter, S.S., Roth, E.M., Woods, D.D., and Elm, W.C. (2000). Cognitive Task Analysis as Bootstrapping Multiple...Converging Techniques, In Schraagen, Chipman, and Shalin (Eds.). Cognitive Task Analysis . Mahwah, NJ: Lawrence Erlbaum Associates. [12] Roth, E.M

  8. Data handling and analysis for the 1971 corn blight watch experiment.

    NASA Technical Reports Server (NTRS)

    Anuta, P. E.; Phillips, T. L.; Landgrebe, D. A.

    1972-01-01

    Review of the data handling and analysis methods used in the near-operational test of remote sensing systems provided by the 1971 corn blight watch experiment. The general data analysis techniques and, particularly, the statistical multispectral pattern recognition methods for automatic computer analysis of aircraft scanner data are described. Some of the results obtained are examined, and the implications of the experiment for future data communication requirements of earth resource survey systems are discussed.

  9. Genetic programming based ensemble system for microarray data classification.

    PubMed

    Liu, Kun-Hong; Tong, Muchenxuan; Xie, Shu-Tong; Yee Ng, Vincent To

    2015-01-01

    Recently, more and more machine learning techniques have been applied to microarray data analysis. The aim of this study is to propose a genetic programming (GP) based new ensemble system (named GPES), which can be used to effectively classify different types of cancers. Decision trees are deployed as base classifiers in this ensemble framework with three operators: Min, Max, and Average. Each individual of the GP is an ensemble system, and they become more and more accurate in the evolutionary process. The feature selection technique and balanced subsampling technique are applied to increase the diversity in each ensemble system. The final ensemble committee is selected by a forward search algorithm, which is shown to be capable of fitting data automatically. The performance of GPES is evaluated using five binary class and six multiclass microarray datasets, and results show that the algorithm can achieve better results in most cases compared with some other ensemble systems. By using elaborate base classifiers or applying other sampling techniques, the performance of GPES may be further improved.
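
    The three combination operators can be sketched over per-class probability vectors from base classifiers. The probabilities and class count below are invented for illustration; in GPES the base classifiers are decision trees evolved by GP, which this sketch does not reproduce.

```python
# Hypothetical sketch of the Min, Max, and Average combination operators
# applied column-wise to base-classifier probability vectors.
def combine(prob_lists, op):
    """Fuse probability vectors and return the winning class index."""
    n_classes = len(prob_lists[0])
    if op == "min":
        fused = [min(p[c] for p in prob_lists) for c in range(n_classes)]
    elif op == "max":
        fused = [max(p[c] for p in prob_lists) for c in range(n_classes)]
    else:  # "avg"
        fused = [sum(p[c] for p in prob_lists) / len(prob_lists)
                 for c in range(n_classes)]
    return fused.index(max(fused))

probs = [[0.7, 0.3], [0.4, 0.6], [0.9, 0.1]]  # three classifiers, two classes
pred_min = combine(probs, "min")
pred_max = combine(probs, "max")
pred_avg = combine(probs, "avg")
```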

  10. Genetic Programming Based Ensemble System for Microarray Data Classification

    PubMed Central

    Liu, Kun-Hong; Tong, Muchenxuan; Xie, Shu-Tong; Yee Ng, Vincent To

    2015-01-01

    Recently, more and more machine learning techniques have been applied to microarray data analysis. The aim of this study is to propose a genetic programming (GP) based new ensemble system (named GPES), which can be used to effectively classify different types of cancers. Decision trees are deployed as base classifiers in this ensemble framework with three operators: Min, Max, and Average. Each individual of the GP is an ensemble system, and they become more and more accurate in the evolutionary process. The feature selection technique and balanced subsampling technique are applied to increase the diversity in each ensemble system. The final ensemble committee is selected by a forward search algorithm, which is shown to be capable of fitting data automatically. The performance of GPES is evaluated using five binary class and six multiclass microarray datasets, and results show that the algorithm can achieve better results in most cases compared with some other ensemble systems. By using elaborate base classifiers or applying other sampling techniques, the performance of GPES may be further improved. PMID:25810748

  11. An Analog Macroscopic Technique for Studying Molecular Hydrodynamic Processes in Dense Gases and Liquids.

    PubMed

    Dahlberg, Jerry; Tkacik, Peter T; Mullany, Brigid; Fleischhauer, Eric; Shahinian, Hossein; Azimi, Farzad; Navare, Jayesh; Owen, Spencer; Bisel, Tucker; Martin, Tony; Sholar, Jodie; Keanini, Russell G

    2017-12-04

    An analog, macroscopic method for studying molecular-scale hydrodynamic processes in dense gases and liquids is described. The technique applies a standard fluid dynamic diagnostic, particle image velocimetry (PIV), to measure: i) velocities of individual particles (grains), extant on short, grain-collision time-scales, ii) velocities of systems of particles, on both short collision-time- and long, continuum-flow-time-scales, iii) collective hydrodynamic modes known to exist in dense molecular fluids, and iv) short- and long-time-scale velocity autocorrelation functions, central to understanding particle-scale dynamics in strongly interacting, dense fluid systems. The basic system is composed of an imaging system, light source, vibrational sensors, a vibrational system with a known medium, and PIV and analysis software. Required experimental measurements and an outline of the theoretical tools needed when using the analog technique to study molecular-scale hydrodynamic processes are highlighted. The proposed technique provides a relatively straightforward alternative to photonic and neutron beam scattering methods traditionally used in molecular hydrodynamic studies.

  12. Basic research for the geodynamics program

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Some objectives of this geodynamic program are: (1) optimal utilization of laser and VLBI observations as reference frames for geodynamics, (2) utilization of range difference observations in geodynamics, and (3) estimation techniques in crustal deformation analysis. The determination of Earth rotation parameters from different space geodetic systems is studied. Also reported on is the utilization of simultaneous laser range differences for the determination of baseline variation. An algorithm for the analysis of regional or local crustal deformation measurements is proposed along with other techniques and testing procedures. Some results of the reference frame comparisons, in terms of the pole coordinates from different techniques, are presented.

  13. A Review of Diagnostic Techniques for ISHM Applications

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, Ann; Biswas, Gautam; Aaseng, Gordon; Narasimhan, Sriam; Pattipati, Krishna

    2005-01-01

    System diagnosis is an integral part of any Integrated System Health Management application. Diagnostic applications make use of system information from the design phase, such as safety and mission assurance analysis, failure modes and effects analysis, hazards analysis, functional models, fault propagation models, and testability analysis. In modern process control and equipment monitoring systems, topological and analytic models of the nominal system, derived from design documents, are also employed for fault isolation and identification. Depending on the complexity of the monitored signals from the physical system, diagnostic applications may involve straightforward trending and feature extraction techniques to retrieve the parameters of importance from the sensor streams. They also may involve very complex analysis routines, such as signal processing, learning, or classification methods, to derive the parameters of importance to diagnosis. The process that is used to diagnose anomalous conditions from monitored system signals varies widely across the different approaches to system diagnosis. Rule-based expert systems, case-based reasoning systems, model-based reasoning systems, learning systems, and probabilistic reasoning systems are examples of the many diverse approaches to diagnostic reasoning. Many engineering disciplines have specific approaches to modeling, monitoring and diagnosing anomalous conditions. Therefore, there is no "one-size-fits-all" approach to building diagnostic and health monitoring capabilities for a system. For instance, the conventional approaches to diagnosing failures in rotorcraft applications are very different from those used in communications systems. Further, online and offline automated diagnostic applications are integrated into an operations framework with flight crews, flight controllers and maintenance teams.
While the emphasis of this paper is automation of health management functions, striking the correct balance between automated and human-performed tasks is a vital concern.

  14. MSIX - A general and user-friendly platform for RAM analysis

    NASA Astrophysics Data System (ADS)

    Pan, Z. J.; Blemel, Peter

    The authors present a CAD (computer-aided design) platform supporting RAM (reliability, availability, and maintainability) analysis with efficient system description and alternative evaluation. The design concepts, implementation techniques, and application results are described. This platform is user-friendly because of its graphic environment, drawing facilities, object orientation, self-tutoring, and access to the operating system. The programs' independence and portability make them generally applicable to various analysis tasks.

  15. Acoustic prediction methods for the NASA generalized advanced propeller analysis system (GAPAS)

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Block, P. J. W.

    1984-01-01

    Classical methods of propeller performance analysis are coupled with state-of-the-art Aircraft Noise Prediction Program (ANOPP) techniques to yield a versatile design tool, the NASA Generalized Advanced Propeller Analysis System (GAPAS), for novel, quiet, and efficient propellers. ANOPP is a collection of modular specialized programs. GAPAS as a whole addresses blade geometry and aerodynamics, rotor performance and loading, and subsonic propeller noise.

  16. Shuttle user analysis (study 2.2). Volume 3: Business risk and value of operations in space (BRAVO). Part 5: Analysis of GSFC Earth Observation Satellite (EOS) system mission model using BRAVO techniques

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Cost comparisons were made between three modes of operation (expend, ground refurbish, and space resupply) for the Earth Observation System (EOS-B) to furnish data to NASA on alternative ways to use the shuttle/EOS. Results of the analysis are presented in tabular form.

  17. GESA--a two-dimensional processing system using knowledge base techniques.

    PubMed

    Rowlands, D G; Flook, A; Payne, P I; van Hoff, A; Niblett, T; McKee, S

    1988-12-01

    The successful analysis of two-dimensional (2-D) polyacrylamide electrophoresis gels demands considerable experience and understanding of the protein system under investigation as well as knowledge of the separation technique itself. The present work concerns the development of a computer system for analysing 2-D electrophoretic separations which incorporates concepts derived from artificial intelligence research such that non-experts can use the technique as a diagnostic or identification tool. Automatic analysis of 2-D gel separations has proved to be extremely difficult using statistical methods. Non-reproducibility of gel separations is also difficult to overcome using automatic systems. However, the human eye is extremely good at recognising patterns in images, and human intervention in semi-automatic computer systems can reduce the computational complexities of fully automatic systems. Moreover, the expertise and understanding of an "expert" are invaluable in reducing system complexity if they can be encapsulated satisfactorily in an expert system. The combination of user-intervention in the computer system together with the encapsulation of expert knowledge characterises the present system. The domain within which the system has been developed is that of wheat grain storage proteins (gliadins) which exhibit polymorphism to such an extent that cultivars can be uniquely identified by their gliadin patterns. The system can be adapted to other domains where a range of polymorphic protein sub-units exist. In its generalised form, the system can also be used for comparing more complex 2-D gel electrophoretic separations.

  18. Investigating cardiorespiratory interaction by cross-spectral analysis of event series

    NASA Astrophysics Data System (ADS)

    Schäfer, Carsten; Rosenblum, Michael G.; Pikovsky, Arkady S.; Kurths, Jürgen

    2000-02-01

    The human cardiovascular and respiratory systems interact with each other and show effects of modulation and synchronization. Here we present a cross-spectral technique that specifically considers the event-like character of the heartbeat and avoids typical restrictions of other spectral methods. Using models as well as experimental data, we demonstrate how modulation and synchronization can be distinguished. Finally, we compare the method to traditional techniques and to the analysis of instantaneous phases.
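
    The core cross-spectral computation can be sketched as follows. The sampling rate, shared frequency, and phase offset are illustrative, and the authors' method additionally treats the heartbeat as an event series, which this plain-FFT sketch does not.

```python
import numpy as np

# Sketch of a cross-spectrum estimate between two rhythms sharing a 5 Hz
# component with a 0.8 rad phase offset. All numbers are illustrative.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 5 * t)            # e.g. cardiac-related signal
y = np.sin(2 * np.pi * 5 * t + 0.8)      # e.g. respiration-related signal

X, Y = np.fft.rfft(x), np.fft.rfft(y)
cross = X * np.conj(Y)                   # cross-spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)
idx = int(np.argmax(np.abs(cross)))
peak_freq = float(freqs[idx])            # shared frequency
phase = float(np.angle(cross[idx]))      # relative phase, about -0.8 rad
```

    The peak of the cross-spectrum magnitude locates the common frequency, and its argument recovers the relative phase between the two signals.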

  19. The integrated manual and automatic control of complex flight systems

    NASA Technical Reports Server (NTRS)

    Schmidt, D. K.

    1986-01-01

    The topics of research in this program include pilot/vehicle analysis techniques, identification of pilot dynamics, and control and display synthesis techniques for optimizing aircraft handling qualities. The project activities are discussed. The current technical activity is directed at extending and validating the active display synthesis procedure, and the pilot/vehicle analysis of the NLR rate-command flight configurations in the landing task. Two papers published by the researchers are attached as appendices.

  20. Identification of piecewise affine systems based on fuzzy PCA-guided robust clustering technique

    NASA Astrophysics Data System (ADS)

    Khanmirza, Esmaeel; Nazarahari, Milad; Mousavi, Alireza

    2016-12-01

    Hybrid systems are a class of dynamical systems whose behavior arises from the interaction between discrete and continuous dynamics. Since a general method for the analysis of hybrid systems is not available, some researchers have focused on specific types of hybrid systems. Piecewise affine (PWA) systems are one such subset of hybrid systems. The identification of PWA systems includes the estimation of the parameters of the affine subsystems and the coefficients of the hyperplanes defining the partition of the state-input domain. In this paper, we have proposed a PWA identification approach based on a modified clustering technique. By using a fuzzy PCA-guided robust k-means clustering algorithm along with neighborhood outlier detection, the two main drawbacks of the well-known clustering algorithms, i.e., poor initialization and the presence of outliers, are eliminated. Furthermore, this modified clustering technique enables us to determine the number of subsystems without any prior knowledge about the system. In addition, exploiting the structure of the state-input domain, that is, considering the time sequence of input-output pairs, provides a more efficient clustering algorithm, which is the other novelty of this work. Finally, the proposed algorithm has been evaluated by parameter identification of an IGV servo actuator. Simulation together with experimental analysis has proven the effectiveness of the proposed method.
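
    For orientation, plain k-means, the algorithm the paper augments with fuzzy PCA guidance and neighborhood outlier detection, can be sketched as follows; the points and initial centers are invented, and fixed initial centers keep the run deterministic.

```python
# Plain k-means sketch (the foundation the paper modifies; its fuzzy PCA
# guidance and robust outlier handling are not reproduced here).
def kmeans(points, centers, iters=20):
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in centers]
        for p in points:
            d2 = [(p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centers]
            clusters[d2.index(min(d2))].append(p)
        # Update step: move each center to the mean of its cluster.
        centers = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else c
            for cl, c in zip(clusters, centers)
        ]
    return centers

pts = [(0.0, 0.1), (0.1, 0.0), (-0.1, 0.0),
       (5.0, 5.1), (5.1, 5.0), (4.9, 5.0)]
cs = kmeans(pts, [(0.0, 0.0), (1.0, 1.0)])
```

    In PWA identification, each recovered cluster of regressor points would then be fitted with its own affine submodel.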

  1. DATMAN: A reliability data analysis program using Bayesian updating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, M.; Feltus, M.A.

    1996-12-31

    Preventive maintenance (PM) techniques focus on the prevention of failures, in particular, system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. Systems reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits a distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately.
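
    The Bayesian updating at the heart of such a tool can be illustrated with the conjugate Beta-binomial case. The prior (mean failure probability 0.05) and the demand counts are invented, and DATMAN's actual distribution menu and fitting logic are not reproduced here.

```python
# Hedged sketch of a conjugate Beta-binomial reliability update.
def update_beta(alpha, beta, failures, successes):
    """Posterior Beta parameters after observing new demand data."""
    return alpha + failures, beta + successes

def mean_failure_prob(alpha, beta):
    """Posterior mean of the failure probability."""
    return alpha / (alpha + beta)

# Prior Beta(1, 19) has mean 0.05; observe 2 failures in 100 demands.
a, b = update_beta(1.0, 19.0, failures=2, successes=98)
p = mean_failure_prob(a, b)   # updated failure-probability estimate
```

    As more operating experience accumulates, repeated updates of this form shift the posterior toward the observed failure rate, which is the mechanism the PM/RCM workflow relies on.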

  2. Planning for Downtown Circulation Systems. Volume 2. Analysis Techniques.

    DOT National Transportation Integrated Search

    1983-10-01

    This volume contains the analysis and refinement stages of downtown circulator planning. Included are sections on methods for estimating patronage, costs, revenues, and impacts, and a section on methods for performing micro-level analyses.

  3. Measurement of labile copper in wine by medium exchange stripping potentiometry utilising screen printed carbon electrodes.

    PubMed

    Clark, Andrew C; Kontoudakis, Nikolaos; Barril, Celia; Schmidtke, Leigh M; Scollary, Geoffrey R

    2016-07-01

    The presence of copper in wine is known to impact the reductive, oxidative and colloidal stability of wine, and techniques enabling measurement of different forms of copper in wine are of particular interest in understanding these spoilage processes. Electrochemical stripping techniques developed to date require significant pretreatment of wine, potentially disturbing the copper binding equilibria. A thin mercury film on a screen printed carbon electrode was utilised in a flow system for the direct analysis of labile copper in red and white wine by constant current stripping potentiometry with medium exchange. Under the optimised conditions, including an enrichment time of 500 s and a constant current of 1.0 μA, the response range was linear from 0.015 to 0.200 mg/L. The analysis of 52 red and white wines showed that this technique generally provided lower labile copper concentrations than reported for batch measurement by related techniques. Studies in a model system and in finished wines showed that copper sulfide was not measured as labile copper, and that loss of hydrogen sulfide via volatilisation induced an increase in labile copper within the model wine system.

  4. Noise analysis of nucleate boiling

    NASA Technical Reports Server (NTRS)

    Mcknight, R. D.; Ram, K. S.

    1971-01-01

    The techniques of noise analysis have been utilized to investigate nucleate pool boiling. A simple experimental setup has been developed for obtaining the power spectrum of a nucleate boiling system. These techniques were first used to study single bubbles, and a method of relating the two-dimensional projected size and the local velocity of the bubbles to the auto-correlation functions is presented. This method is much less time consuming than conventional methods of measurement and has no probes to disturb the system. These techniques can be used to determine the contribution of evaporation to total heat flux in nucleate boiling. Also, these techniques can be used to investigate the effect of various parameters upon the frequency response of nucleate boiling. The predominant frequencies of the power spectrum correspond to the frequencies of bubble generation. The effects of heat input, degree of subcooling, and liquid surface tension upon the power spectra of a boiling system are presented. It was found that the degree of subcooling has a more pronounced effect upon bubble size than does heat flux. Also the effect of lowering surface tension can be sufficient to reduce the effect of the degree of subcooling upon the size of the bubbles.
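
    The power-spectrum step of such a noise analysis can be sketched as follows. The 12 Hz component, noise level, and record length are invented for illustration; the experiment used measured boiling noise rather than a synthetic sine.

```python
import numpy as np

# Sketch of power-spectrum noise analysis: recover a dominant
# "bubble-generation" frequency from a noisy signal.
rng = np.random.default_rng(1)
fs = 200.0
t = np.arange(0, 8, 1 / fs)
signal = np.sin(2 * np.pi * 12 * t) + 0.3 * rng.standard_normal(t.size)

# Autocorrelation (non-negative lags) and power spectrum.
autocorr = np.correlate(signal, signal, mode="full")[signal.size - 1:]
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(signal.size, 1 / fs)
dominant = float(freqs[np.argmax(spectrum)])
```

    The spectral peak identifies the dominant frequency, mirroring the observation in the abstract that the predominant frequencies correspond to bubble generation.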

  5. A FORTRAN program for the analysis of linear continuous and sample-data systems

    NASA Technical Reports Server (NTRS)

    Edwards, J. W.

    1976-01-01

    A FORTRAN digital computer program which performs the general analysis of linearized control systems is described. State variable techniques are used to analyze continuous, discrete, and sampled data systems. Analysis options include the calculation of system eigenvalues, transfer functions, root loci, root contours, frequency responses, power spectra, and transient responses for open- and closed-loop systems. A flexible data input format allows the user to define systems in a variety of representations. Data may be entered by inputting explicit data matrices or matrices constructed in user written subroutines, by specifying transfer function block diagrams, or by using a combination of these methods.
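
    The eigenvalue option can be illustrated with a small sketch; the state matrix below is an invented damped-oscillator example, not one from the program.

```python
import numpy as np

# Illustrative sketch of state-variable eigenvalue analysis: the system
# x' = A x is stable iff every eigenvalue of A has a negative real part.
A = np.array([[0.0, 1.0],
              [-4.0, -2.0]])  # natural frequency 2 rad/s, damping ratio 0.5
eigvals = np.linalg.eigvals(A)          # expect -1 +/- j*sqrt(3)
stable = bool(np.all(eigvals.real < 0.0))
```

    Transfer functions, frequency responses, and transient responses in such programs are built from the same state matrices, so the eigenvalues are the natural first check.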

  6. Tools and techniques for developing policies for complex and uncertain systems.

    PubMed

    Bankes, Steven C

    2002-05-14

    Agent-based models (ABM) are examples of complex adaptive systems, which can be characterized as those systems for which no model less complex than the system itself can accurately predict in detail how the system will behave at future times. Consequently, the standard tools of policy analysis, based as they are on devising policies that perform well on some best estimate model of the system, cannot be reliably used for ABM. This paper argues that policy analysis by using ABM requires an alternative approach to decision theory. The general characteristics of such an approach are described, and examples are provided of its application to policy analysis.

  7. Investigation of energy management strategies for photovoltaic systems - A predictive control algorithm

    NASA Technical Reports Server (NTRS)

    Cull, R. C.; Eltimsahy, A. H.

    1983-01-01

    The present investigation is concerned with the formulation of energy management strategies for stand-alone photovoltaic (PV) systems, taking into account a basic control algorithm for a possible predictive (and adaptive) controller. The control system controls the flow of energy in the system according to the amount of energy available, and predicts the appropriate control set-points based on the energy (insolation) available by using an appropriate system model. Aspects of adaptation to the conditions of the system are also considered. Attention is given to a statistical analysis technique, the analysis inputs, the analysis procedure, and details regarding the basic control algorithm.

  8. Collaborative real-time motion video analysis by human observer and image exploitation algorithms

    NASA Astrophysics Data System (ADS)

    Hild, Jutta; Krüger, Wolfgang; Brüstle, Stefan; Trantelle, Patrick; Unmüßig, Gabriel; Heinze, Norbert; Peinsipp-Byma, Elisabeth; Beyerer, Jürgen

    2015-05-01

    Motion video analysis is a challenging task, especially in real-time applications. In most safety and security critical applications, a human observer is an obligatory part of the overall analysis system. Over the last years, substantial progress has been made in the development of automated image exploitation algorithms. Hence, we investigate how the benefits of automated video analysis can be integrated suitably into the current video exploitation systems. In this paper, a system design is introduced which strives to combine both the qualities of the human observer's perception and the automated algorithms, thus aiming to improve the overall performance of a real-time video analysis system. The system design builds on prior work where we showed the benefits for the human observer by means of a user interface which utilizes the human visual focus of attention revealed by the eye gaze direction for interaction with the image exploitation system; eye tracker-based interaction allows much faster, more convenient, and equally precise moving target acquisition in video images than traditional computer mouse selection. The system design also builds on prior work we did on automated target detection, segmentation, and tracking algorithms. Beside the system design, a first pilot study is presented, where we investigated how the participants (all non-experts in video analysis) performed in initializing an object tracking subsystem by selecting a target for tracking. Preliminary results show that the gaze + key press technique is an effective, efficient, and easy to use interaction technique when performing selection operations on moving targets in videos in order to initialize an object tracking function.

  9. On algorithmic optimization of histogramming functions for GEM systems

    NASA Astrophysics Data System (ADS)

    Krawczyk, Rafał D.; Czarski, Tomasz; Kolasinski, Piotr; Poźniak, Krzysztof T.; Linczuk, Maciej; Byszuk, Adrian; Chernyshova, Maryna; Juszczyk, Bartlomiej; Kasprowicz, Grzegorz; Wojenski, Andrzej; Zabolotny, Wojciech

    2015-09-01

    This article concerns optimization methods for data analysis for the X-ray GEM detector system. The offline analysis of collected samples was optimized for MATLAB computations. Compiled functions in the C language were used with the MEX library. Significant speedup was achieved both for ordering-preprocessing and for histogramming of samples. The techniques utilized and the results obtained are presented.
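
    The flavor of such a histogramming optimization can be sketched in NumPy rather than MATLAB/MEX (the paper's actual route was compiled C via the MEX interface; NumPy stands in here purely for illustration, and the sample data are invented).

```python
import numpy as np

# Per-sample Python loop versus a vectorized bincount over integer
# channel codes; both produce the same histogram.
samples = np.array([3, 1, 3, 0, 2, 3, 1, 2, 2, 3])

# Naive per-sample loop.
hist_loop = [0, 0, 0, 0]
for s in samples:
    hist_loop[s] += 1

# Vectorized equivalent of the same histogram.
hist_vec = np.bincount(samples, minlength=4)
```

    Replacing interpreted per-sample loops with a single compiled pass over the data is the same design move as the MEX optimization the abstract reports.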

  10. Web-Based Essay Critiquing System and EFL Students' Writing: A Quantitative and Qualitative Investigation

    ERIC Educational Resources Information Center

    Lee, Cynthia; Wong, Kelvin C. K.; Cheung, William K.; Lee, Fion S. L.

    2009-01-01

    The paper first describes a web-based essay critiquing system developed by the authors using latent semantic analysis (LSA), an automatic text analysis technique, to provide students with immediate feedback on content and organisation for revision whenever there is an internet connection. It reports on its effectiveness in enhancing adult EFL…
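
    The LSA step can be sketched with a toy term-document matrix and truncated SVD. The vocabulary, counts, and rank are invented, and the actual system's corpus and feedback scoring are not shown.

```python
import numpy as np

# Toy latent semantic analysis: reduce a term-document count matrix by
# truncated SVD, then compare documents by cosine similarity in the
# latent space.
#              doc1 doc2 doc3
X = np.array([[2.0, 1.0, 0.0],   # "essay"
              [1.0, 2.0, 0.0],   # "feedback"
              [0.0, 0.0, 3.0]])  # "algebra"
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
docs = (np.diag(s[:k]) @ Vt[:k]).T   # document vectors in latent space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

sim_12 = cosine(docs[0], docs[1])    # topically related: near 1
sim_13 = cosine(docs[0], docs[2])    # topically unrelated: near 0
```

    A critiquing system built this way can compare a student essay's latent vector against reference material to produce content feedback.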

  11. Dynamic Analysis of Sounding Rocket Pneumatic System Revision

    NASA Technical Reports Server (NTRS)

    Armen, Jerald

    2010-01-01

    The recent fusion of decades of advancements in mathematical models, numerical algorithms, and curve fitting techniques marked the beginning of a new era in the science of simulation. It is becoming indispensable to the study of rockets and aerospace analysis. In the pneumatic system, which is the main focus of this paper, particular emphasis will be placed on the effects of compressible flow in the Attitude Control System of a sounding rocket.

  12. Using integrated models to minimize environmentally induced wavefront error in optomechanical design and analysis

    NASA Astrophysics Data System (ADS)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate design goal of an optical system subjected to dynamic loads is to minimize system level wavefront error (WFE). In random response analysis, system WFE is difficult to predict from finite element results due to the loss of phase information. In the past, the use of system WFE was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for determining system level WFE using a linear optics model is presented. An error estimate is included in the analysis output based on fitting errors of mode shapes. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.

  13. Artificial Intelligence in Sports Biomechanics: New Dawn or False Hope?

    PubMed Central

    Bartlett, Roger

    2006-01-01

    This article reviews developments in the use of Artificial Intelligence (AI) in sports biomechanics over the last decade. It outlines possible uses of Expert Systems as diagnostic tools for evaluating faults in sports movements (‘techniques’) and presents some example knowledge rules for such an expert system. It then compares the analysis of sports techniques, in which Expert Systems have found little place to date, with gait analysis, in which they are routinely used. Consideration is then given to the use of Artificial Neural Networks (ANNs) in sports biomechanics, focusing on Kohonen self-organizing maps, which have been the most widely used in technique analysis, and multi-layer networks, which have been far more widely used in biomechanics in general. Examples of the use of ANNs in sports biomechanics are presented for javelin and discus throwing, shot putting and football kicking. I also present an example of the use of Evolutionary Computation in movement optimization in the soccer throw-in, which predicted an optimal technique close to that in the coaching literature. After briefly overviewing the use of AI in both sports science and biomechanics in general, the article concludes with some speculations about future uses of AI in sports biomechanics. Key Points: Expert Systems remain almost unused in sports biomechanics, unlike in the similar discipline of gait analysis. Artificial Neural Networks, particularly Kohonen Maps, have been used, although their full value remains unclear. Other AI applications, including Evolutionary Computation, have received little attention. PMID:24357939

  14. Applications Of Binary Image Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions where binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head is freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.
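
    The wheel-balancing application above reduces to a simple vector-matching problem: rim and tyre runout add as vectors, so the combined eccentricity is smallest when the tyre is rotated to oppose the rim. A minimal sketch of that idea (hypothetical runout values; not the S.A.M. system itself):

```python
import cmath

def best_mount_angle(e_rim, e_tire_mag, steps=360):
    """Grid-search the mounting angle: the combined runout is
    |e_rim + e_tire * e^(i*theta)|, smallest when the tyre's
    eccentricity vector opposes the rim's."""
    best_k = min(range(steps),
                 key=lambda k: abs(e_rim + e_tire_mag *
                                   cmath.exp(2j * cmath.pi * k / steps)))
    return 2 * cmath.pi * best_k / steps

# Rim runout of 0.4 (along the real axis) against tyre runout magnitude 0.3:
# mounting the tyre half a turn out of phase leaves a 0.1 residual.
angle = best_mount_angle(e_rim=0.4 + 0.0j, e_tire_mag=0.3)
```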

  15. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  16. An analysis of the circuitry of the visual pathway of the lateral eye of Limulus

    NASA Technical Reports Server (NTRS)

    Sjoestrand, F. S.

    1970-01-01

    The methodology is discussed for three-dimensional analysis of the nervous system on the basis of electron micrographs of serial sections. An analysis is presented of a part of the circuitry of the rabbit retina. In addition, some exploratory work is reported with respect to the visual cortex of the cat brain. A proper technique for preservation of the visual cortex was worked out and a technique to localize microelectrode tips in the tissue in connection with electron microscopy was partially worked out.

  17. (abstract) Generic Modeling of a Life Support System for Process Technology Comparisons

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.

  18. Enhancements to the Network Repair Level Analysis (NRLA) Model Using Marginal Analysis Techniques and Centralized Intermediate Repair Facility (CIRF) Maintenance Concepts.

    DTIC Science & Technology

    1983-12-01

    while at the same time improving its operational efficiency. Through their integration and use, System Program Managers have a comprehensive analytical... systems. The NRLA program is hosted on the CREATE Operating System and contains approximately 5500 lines of computer code. It consists of a main...associated with C alternative maintenance plans. As the technological complexity of weapons systems has increased, new and innovative logistical support

  19. Human Resources Data in Weapon System Design: An Initial Plan for Development of a Unified Data Base.

    DTIC Science & Technology

    1980-11-01

    system design, design handbooks, maintenance manpower simulation, decision options, cost estimating relationships, prediction...determine the extent to which human resources data (HRD) are used in early system design. The third was to assess the availability and adequacy of...relationships, regression analysis, comparability analysis, expected value techniques) to provide initial data values in the very early stages of weapon system

  20. Exploring relation types for literature-based discovery.

    PubMed

    Preiss, Judita; Stevenson, Mark; Gaizauskas, Robert

    2015-09-01

    Literature-based discovery (LBD) aims to identify "hidden knowledge" in the medical literature by: (1) analyzing documents to identify pairs of explicitly related concepts (terms), then (2) hypothesizing novel relations between pairs of unrelated concepts that are implicitly related via a shared concept to which both are explicitly related. Many LBD approaches use simple techniques to identify semantically weak relations between concepts, for example, document co-occurrence. These generate huge numbers of hypotheses, difficult for humans to assess. More complex techniques rely on linguistic analysis, for example, shallow parsing, to identify semantically stronger relations. Such approaches generate fewer hypotheses, but may miss hidden knowledge. The authors investigate this trade-off in detail, comparing techniques for identifying related concepts to discover which are most suitable for LBD. A generic LBD system that can utilize a range of relation types was developed. Experiments were carried out comparing a number of techniques for identifying relations. Two approaches were used for evaluation: replication of existing discoveries and the "time slicing" approach. Results: Previous LBD discoveries could be replicated using relations based either on document co-occurrence or linguistic analysis. Using relations based on linguistic analysis generated many fewer hypotheses, but a significantly greater proportion of them were candidates for hidden knowledge. The use of linguistic analysis-based relations improves accuracy of LBD without overly damaging coverage. LBD systems often generate huge numbers of hypotheses, which are infeasible to manually review. Improving their accuracy has the potential to make these systems significantly more usable. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
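
    The co-occurrence-based "ABC" discovery pattern described above can be sketched directly: concepts A and C become a hypothesis when each explicitly co-occurs with a shared concept B but never with each other. A toy illustration with hand-labelled concepts (echoing Swanson's fish-oil example; not the authors' generic LBD system):

```python
from itertools import combinations

def cooccurrence_pairs(documents):
    """Concept pairs that explicitly co-occur in at least one document."""
    pairs = set()
    for concepts in documents:
        for a, b in combinations(sorted(set(concepts)), 2):
            pairs.add((a, b))
    return pairs

def hypothesize(documents):
    """ABC model: propose A-C where A-B and B-C co-occur but A-C never does."""
    explicit = cooccurrence_pairs(documents)
    related = {}
    for a, b in explicit:
        related.setdefault(a, set()).add(b)
        related.setdefault(b, set()).add(a)
    hypotheses = set()
    for b, neighbors in related.items():
        for a, c in combinations(sorted(neighbors), 2):
            if (a, c) not in explicit and (c, a) not in explicit:
                hypotheses.add((a, c, b))  # (A, C, linking concept B)
    return hypotheses

docs = [{"fish oil", "blood viscosity"},
        {"blood viscosity", "Raynaud's disease"},
        {"fish oil", "platelet aggregation"}]
hyps = hypothesize(docs)
```

    Swapping bare co-occurrence for a linguistic relation extractor, as the paper investigates, simply shrinks the `explicit` set before the same closure step, trading hypothesis volume for precision.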

  1. Integrating computer programs for engineering analysis and design

    NASA Technical Reports Server (NTRS)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  2. Adapting risk management and computational intelligence network optimization techniques to improve traffic throughput and tail risk analysis.

    DOT National Transportation Integrated Search

    2014-04-01

    Risk management techniques are used to analyze fluctuations in uncontrollable variables and keep those fluctuations from impeding : the core function of a system or business. Examples of this are making sure that volatility in copper and aluminum pri...

  3. Multivariate Analysis of Schools and Educational Policy.

    ERIC Educational Resources Information Center

    Kiesling, Herbert J.

    This report describes a multivariate analysis technique that approaches the problems of educational production function analysis by (1) using comparable measures of output across large experiments, (2) accounting systematically for differences in socioeconomic background, and (3) treating the school as a complete system in which different…

  4. Neutron scattering for the analysis of biological structures. Brookhaven symposia in biology. Number 27

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schoenborn, B P

    1976-01-01

    Sessions were included on neutron scattering and biological structure analysis, protein crystallography, neutron scattering from oriented systems, solution scattering, preparation of deuterated specimens, inelastic scattering, data analysis, experimental techniques, and instrumentation. Separate entries were made for the individual papers.

  5. Auxiliary principle technique and iterative algorithm for a perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems.

    PubMed

    Rahaman, Mijanur; Pang, Chin-Tzong; Ishtyak, Mohd; Ahmad, Rais

    2017-01-01

    In this article, we introduce a perturbed system of generalized mixed quasi-equilibrium-like problems involving multi-valued mappings in Hilbert spaces. To calculate the approximate solutions of the perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems, firstly we develop a perturbed system of auxiliary generalized multi-valued mixed quasi-equilibrium-like problems, and then by using the celebrated Fan-KKM technique, we establish the existence and uniqueness of solutions of the perturbed system of auxiliary generalized multi-valued mixed quasi-equilibrium-like problems. By deploying an auxiliary principle technique and an existence result, we formulate an iterative algorithm for solving the perturbed system of generalized multi-valued mixed quasi-equilibrium-like problems. Lastly, we study the strong convergence analysis of the proposed iterative sequences under monotonicity and some mild conditions. These results are new and generalize some known results in this field.

  6. Moving Target Techniques: Leveraging Uncertainty for CyberDefense

    DTIC Science & Technology

    2015-12-15

    cyberattacks is a continual struggle for system managers. Attackers often need only find one vulnerability (a flaw or bug that an attacker can exploit...additional parsing code itself could have security-relevant software bugs. Dynamic Network: techniques in the dynamic network domain change the...evaluation of MT techniques can benefit from a variety of evaluation approaches, including abstract analysis, modeling and simulation, test bed

  7. AGARD Flight Test Techniques Series. Volume 9. Aircraft Exterior Noise Measurement and Analysis Techniques. (Le Bruit a l’Exterieur des Aeronefs: Techniques de Mesure et d’Analyse)

    DTIC Science & Technology

    1991-04-01

    aircraft Fig. 4.6 Airborne test set-up to compare several microphone/nose-cone arrangements for self-noise generation on a glider plane Fig. 4.7 Comparison...of normalized self-noise spectra of ogive-nose-cone equipped condenser-microphones of different diameters Fig. 4.8 Frequency splitting in the noise...output is obtained at the last component of the sub-system. The electrical response of the entire system is then the arithmetic sum of the individual responses

  8. Progress in multidisciplinary design optimization at NASA Langley

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.

    1993-01-01

    Multidisciplinary Design Optimization refers to some combination of disciplinary analyses, sensitivity analysis, and optimization techniques used to design complex engineering systems. The ultimate objective of this research at NASA Langley Research Center is to help the US industry reduce the costs associated with development, manufacturing, and maintenance of aerospace vehicles while improving system performance. This report reviews progress towards this objective and highlights topics for future research. Aerospace design problems selected from the author's research illustrate strengths and weaknesses in existing multidisciplinary optimization techniques. The techniques discussed include multiobjective optimization, global sensitivity equations and sequential linear programming.

  9. Parallel object-oriented, denoising system using wavelet multiresolution analysis

    DOEpatents

    Kamath, Chandrika; Baldwin, Chuck H.; Fodor, Imola K.; Tang, Nu A.

    2005-04-12

    The present invention provides a data de-noising system utilizing processors and wavelet denoising techniques. Data is read and displayed in different formats. The data is partitioned into regions and the regions are distributed onto the processors. Communication requirements are determined among the processors according to the wavelet denoising technique and the partitioning of the data. The data is transformed onto different multiresolution levels with the wavelet transform according to the wavelet denoising technique and the communication requirements, with the transformed data containing wavelet coefficients. The denoised data is then transformed back into its original format for reading and display.
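
    The core pipeline the patent describes (transform, shrink coefficients, invert) can be illustrated without the parallel partitioning. A minimal single-level Haar sketch in numpy (illustrative signal and threshold; the patented system's multiresolution levels and inter-processor communication are omitted):

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def haar_idwt(approx, detail):
    """Invert one Haar level by re-interleaving even/odd samples."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2.0)
    x[1::2] = (approx - detail) / np.sqrt(2.0)
    return x

def soft_threshold(c, t):
    """Shrink coefficients toward zero; noise-dominated ones vanish."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

rng = np.random.default_rng(0)
clean = np.repeat([0.0, 4.0, 0.0, 4.0], 64)        # piecewise-constant signal
noisy = clean + 0.5 * rng.standard_normal(clean.size)

approx, detail = haar_dwt(noisy)
denoised = haar_idwt(approx, soft_threshold(detail, 0.7))
```

    The piecewise-constant signal has (almost) no Haar detail energy of its own, so shrinking the detail band removes mostly noise.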

  10. Safety considerations in the design and operation of large wind turbines

    NASA Technical Reports Server (NTRS)

    Reilly, D. H.

    1979-01-01

    The engineering and safety techniques used to assure the reliable and safe operation of large wind turbine generators are described, using the Mod 2 Wind Turbine System Program as an example. The techniques involve a careful definition of the wind turbine's natural and operating environments, use of proven structural design criteria and analysis techniques, an evaluation of potential failure modes and hazards, and use of a fail-safe and redundant component engineering philosophy. The role of an effective quality assurance program, tailored to specific hardware criticality, and the checkout and validation program developed to assure system integrity are described.

  11. Maintenance Audit through Value Analysis Technique: A Case Study

    NASA Astrophysics Data System (ADS)

    Carnero, M. C.; Delgado, S.

    2008-11-01

    The increase in competitiveness, technological changes and the increase in the requirements of quality and service have forced a change in the design and application of maintenance, as well as the way in which it is considered within the managerial strategy. There are numerous maintenance activities that must be developed in a service company. As a result the maintenance functions as a whole have to be outsourced. Nevertheless, delegating this subject to specialized personnel does not exempt the company from responsibilities, but rather leads to the need for control of each maintenance activity. In order to achieve this control and to evaluate the efficiency and effectiveness of the company it is essential to carry out an audit that diagnoses the problems that could develop. In this paper a maintenance audit applied to a service company is developed. The methodology applied is based on expert systems. The expert system uses rules with the SMART weighting technique and value analysis to obtain the weightings between the decision functions and between the alternatives. The expert system applies numerous rules and relations between different variables associated with the specific maintenance functions, to obtain the maintenance state by sections and the general maintenance state of the enterprise. The contributions of this paper are related to the development of a maintenance audit in a service enterprise, in which maintenance is not generally considered a strategic subject, and to the integration of decision-making tools such as the SMART weighting technique with value analysis techniques, typical in the design of new products, into a rule-based expert system.

  12. Automation tools for demonstration of goal directed and self-repairing flight control systems

    NASA Technical Reports Server (NTRS)

    Agarwal, A. K.

    1988-01-01

    The coupling of expert systems and control design and analysis techniques are documented to provide a realizable self repairing flight control system. Key features of such a flight control system are identified and a limited set of rules for a simple aircraft model are presented.

  13. Non Contacting Evaluation of Strains and Cracking Using Optical and Infrared Imaging Techniques

    DTIC Science & Technology

    1988-08-22

    Compatible Zenith Z-386 microcomputer with plotter II. 3-D Motion Measuring System 1. Complete OPTOTRAK three dimensional digitizing system. System includes...acquisition unit - 16 single ended analog input channels 3. Data Analysis Package software (KINEPLOT) 4. Extra OPTOTRAK Camera (max 224 per system

  14. Heterodyne laser spectroscopy system

    DOEpatents

    Wyeth, Richard W.; Paisner, Jeffrey A.; Story, Thomas

    1989-01-01

    A heterodyne laser spectroscopy system utilizes laser heterodyne techniques for purposes of laser isotope separation spectroscopy, vapor diagnostics, processing of precise laser frequency offsets from a reference frequency and the like, and provides spectral analysis of a laser beam.

  15. Analysis and Comparison of Some Automatic Vehicle Monitoring Systems

    DOT National Transportation Integrated Search

    1973-07-01

    In 1970 UMTA solicited proposals and selected four companies to develop systems to demonstrate the feasibility of different automatic vehicle monitoring techniques. The demonstrations culminated in experiments in Philadelphia to assess the performanc...

  16. Analysis of Costs and Performance

    ERIC Educational Resources Information Center

    Duchesne, Roderick M.

    1973-01-01

    This article outlines a library management information system concerned with total library costs and performance. The system is essentially an adaptation of well-proven industrial and commercial management accounting techniques to the library context. (24 references) (Author)

  17. Non-destructive evaluation of laboratory scale hydraulic fracturing using acoustic emission

    NASA Astrophysics Data System (ADS)

    Hampton, Jesse Clay

    The primary objective of this research is to develop techniques to characterize hydraulic fractures and fracturing processes using acoustic emission monitoring based on laboratory scale hydraulic fracturing experiments. Individual microcrack AE source characterization is performed to understand the failure mechanisms associated with small failures along pre-existing discontinuities and grain boundaries. Individual microcrack analysis methods include moment tensor inversion techniques to elucidate the mode of failure, crack slip and crack normal direction vectors, and relative volumetric deformation of an individual microcrack. Differentiation between individual microcrack analysis and AE cloud based techniques is studied in efforts to refine discrete fracture network (DFN) creation and regional damage quantification of densely fractured media. Regional damage estimations from combinations of individual microcrack analyses and AE cloud density plotting are used to investigate the usefulness of weighting cloud based AE analysis techniques with microcrack source data. Two granite types were used in several sample configurations including multi-block systems. Laboratory hydraulic fracturing was performed with sample sizes ranging from 15 x 15 x 25 cm3 to 30 x 30 x 25 cm3 in both unconfined and true-triaxially confined stress states using different types of materials. Hydraulic fracture testing in rock block systems containing a large natural fracture was investigated in terms of AE response throughout fracture interactions. Investigations of differing scale analyses showed the usefulness of individual microcrack characterization as well as DFN and cloud based techniques. Weighting cloud based techniques with individual microcrack characterization correlated well with post-test damage evaluations.

  18. Investigation of safety analysis methods using computer vision techniques

    NASA Astrophysics Data System (ADS)

    Shirazi, Mohammad Shokrolah; Morris, Brendan Tran

    2017-09-01

    This work investigates safety analysis methods using computer vision techniques. The vision-based tracking system is developed to provide the trajectories of road users, including vehicles and pedestrians. Safety analysis methods are developed to estimate time to collision (TTC) and post-encroachment time (PET), two important safety measurements. Corresponding algorithms are presented and their advantages and drawbacks are shown through their success in capturing conflict events in real time. The performance of the tracking system is evaluated first, and probability density estimates of TTC and PET are shown for 1-h monitoring of a Las Vegas intersection. Finally, the idea of an intersection safety map is introduced, and TTC values of two different intersections are estimated for 1 day from 8:00 a.m. to 6:00 p.m.
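
    The two safety measures named above are simple to state for idealized trajectories. A hedged sketch (constant-speed assumption and made-up numbers; the paper's estimates come from tracked trajectories, not closed forms):

```python
def time_to_collision(gap, closing_speed):
    """TTC: time until two road users meet if both hold their current
    speed; gap in metres, closing_speed in m/s (rate the gap shrinks)."""
    if closing_speed <= 0:          # not converging: no collision course
        return float("inf")
    return gap / closing_speed

def post_encroachment_time(t_first_leaves, t_second_arrives):
    """PET: delay between the first user leaving the conflict area and
    the second arriving at it; smaller values mean a nearer miss."""
    return t_second_arrives - t_first_leaves

ttc = time_to_collision(30.0, 12.0)        # 30 m gap closing at 12 m/s
pet = post_encroachment_time(10.0, 11.5)   # conflict-area times in seconds
```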

  19. FAST: A multi-processed environment for visualization of computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Merritt, Fergus J.; Plessel, Todd C.; Kelaita, Paul G.; Mccabe, R. Kevin

    1991-01-01

    Three-dimensional, unsteady, multi-zoned fluid dynamics simulations over full scale aircraft are typical of the problems being investigated at NASA Ames' Numerical Aerodynamic Simulation (NAS) facility on CRAY2 and CRAY-YMP supercomputers. With multiple processor workstations available in the 10-30 Mflop range, we feel that these new developments in scientific computing warrant a new approach to the design and implementation of analysis tools. These larger, more complex problems create a need for new visualization techniques not possible with the existing software or systems available as of this writing. The visualization techniques will change as the supercomputing environment, and hence the scientific methods employed, evolves even further. The Flow Analysis Software Toolkit (FAST), an implementation of a software system for fluid mechanics analysis, is discussed.

  20. Visualization of Global Sensitivity Analysis Results Based on a Combination of Linearly Dependent and Independent Directions

    NASA Technical Reports Server (NTRS)

    Davies, Misty D.; Gundy-Burlet, Karen

    2010-01-01

    A useful technique for the validation and verification of complex flight systems is Monte Carlo Filtering -- a global sensitivity analysis that tries to find the inputs and ranges that are most likely to lead to a subset of the outputs. A thorough exploration of the parameter space for complex integrated systems may require thousands of experiments and hundreds of controlled and measured variables. Tools for analyzing this space often have limitations caused by the numerical problems associated with high dimensionality and caused by the assumption of independence of all of the dimensions. To combat both of these limitations, we propose a technique that uses a combination of the original variables with the derived variables obtained during a principal component analysis.
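
    The Monte Carlo Filtering idea above, together with the proposed augmentation by derived principal-component coordinates, can be sketched as follows (an illustrative numpy toy, not the authors' tool: the model, threshold, and KS-style score are all assumptions):

```python
import numpy as np

def monte_carlo_filter(inputs, outputs, threshold):
    """Split samples into 'behavioral' (output below threshold) and the
    rest, then score each input dimension by the largest gap between the
    two empirical CDFs (a Kolmogorov-Smirnov-style statistic)."""
    behavioral = np.sort(inputs[outputs < threshold], axis=0)
    rest = np.sort(inputs[outputs >= threshold], axis=0)
    scores = []
    for j in range(inputs.shape[1]):
        grid = np.sort(inputs[:, j])
        cdf_b = np.searchsorted(behavioral[:, j], grid, side="right") / behavioral.shape[0]
        cdf_r = np.searchsorted(rest[:, j], grid, side="right") / rest.shape[0]
        scores.append(np.max(np.abs(cdf_b - cdf_r)))
    return np.array(scores)

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, size=(4000, 3))
y = 5.0 * x[:, 0] ** 2 + 0.1 * x[:, 2]   # output driven almost entirely by x0

scores = monte_carlo_filter(x, y, threshold=1.0)

# Augment the original variables with derived principal-component
# coordinates, the combination the paper proposes:
_, _, vt = np.linalg.svd(x - x.mean(axis=0), full_matrices=False)
scores_pc = monte_carlo_filter(np.hstack([x, x @ vt.T]), y, threshold=1.0)
```

    A large score flags an input (or derived direction) whose distribution differs sharply between the behavioral and non-behavioral sets, i.e. one that likely drives the output of interest.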
