Active controls: A look at analytical methods and associated tools
NASA Technical Reports Server (NTRS)
Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.
1984-01-01
A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant effort has been expended both on tools that generate such models to meet control system designers' needs and on the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.
Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) Users' Guide
NASA Technical Reports Server (NTRS)
Csank, Jeffrey T.; Zinnecker, Alicia M.
2014-01-01
The tool for turbine engine closed-loop transient analysis (TTECTrA) is a semi-automated control design tool for subsonic aircraft engine simulations. At a specific flight condition, TTECTrA produces a basic controller designed to meet user-defined goals and containing only the fundamental limiters that affect the transient performance of the engine. The purpose of this tool is to provide the user a preliminary estimate of the transient performance of an engine model without the need to design a full nonlinear controller.
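To make the limiter concept concrete, the following is a minimal sketch, in Python rather than TTECTrA's MATLAB/Simulink setting, of the min/max arbitration that fundamental limiters commonly perform around a setpoint controller. The function name, limiter set, and numbers are illustrative assumptions, not TTECTrA's actual interface.

```python
# Minimal sketch of min/max limiter arbitration of the kind a transient
# control design tool might wrap around a setpoint controller. All names
# and numbers are illustrative, not TTECTrA's actual interface.

def arbitrate_fuel_command(setpoint_cmd, limiter_cmds_max, limiter_cmds_min):
    """Select the active fuel-flow command.

    setpoint_cmd     -- command from the power-setpoint controller
    limiter_cmds_max -- commands from limiters that cap fuel flow (e.g. surge margin)
    limiter_cmds_min -- commands from limiters that floor fuel flow (e.g. blowout)
    """
    cmd = setpoint_cmd
    for lim in limiter_cmds_max:   # protective ceilings win over the setpoint
        cmd = min(cmd, lim)
    for lim in limiter_cmds_min:   # protective floors win over both
        cmd = max(cmd, lim)
    return cmd

# Example: setpoint asks for 1.20, surge-margin limiter allows at most 1.05
print(arbitrate_fuel_command(1.20, limiter_cmds_max=[1.05], limiter_cmds_min=[0.15]))
```

Min-wins/max-wins arbitration of this kind is a common pattern in engine control; the actual limiter logic produced by the tool may differ.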
Control/structure interaction conceptual design tool
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.
1990-01-01
The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.
Air Traffic Complexity Measurement Environment (ACME): Software User's Guide
NASA Technical Reports Server (NTRS)
1996-01-01
A user's guide for the Air Traffic Complexity Measurement Environment (ACME) software is presented. The ACME consists of two major components, a complexity analysis tool and a user interface. The Complexity Analysis Tool (CAT) analyzes complexity off-line, producing data files which may be examined interactively via the Complexity Data Analysis Tool (CDAT). The Complexity Analysis Tool is composed of three independently executing processes that communicate via PVM (Parallel Virtual Machine) and Unix sockets. The Runtime Data Management and Control process (RUNDMC) extracts flight plan and track information from a SAR (System Analysis Recording) input file and sends the information to GARP (Generate Aircraft Routes Process) and CAT (Complexity Analysis Task). GARP in turn generates aircraft trajectories, which are utilized by CAT to calculate sector complexity. CAT writes flight plan, track, and complexity data to an output file, which can be examined interactively. CDAT provides an interactive graphic environment for examining the complexity data produced by CAT, and can also play back track data extracted from SAR tapes. The CDAT user interface consists of a primary window, a controls window, and miscellaneous pop-ups. Aircraft track and position data are displayed in the main viewing area of the primary window. The controls window contains miscellaneous control and display items. Complexity data are displayed in pop-up windows. CDAT plays back sector complexity and aircraft track and position data as a function of time. Controls are provided to start and stop playback, adjust the playback rate, and reposition the display to a specified time.
NASA Astrophysics Data System (ADS)
He, Qiuwei; Lv, Xingming; Wang, Xin; Qu, Xingtian; Zhao, Ji
2017-01-01
Blades are key components in energy and power equipment such as turbines and aircraft engines, and research on processes and equipment for blade finishing is both important and difficult. To precisely control the tool system of a hybrid grinding and polishing machine tool developed for blade finishing, this paper analyzes the tool system with a changeable wheel for belt polishing. First, the belt length and the wrap angle of each wheel are analyzed at different tension-wheel swing angles during the wheel-changing process. A reasonable belt length is calculated using MATLAB, and the relationships between the wrap angle of each wheel and the cylinder extension of the contact wheel are obtained. A control system for the changeable-wheel tool structure is then developed. Finally, the surface roughness achieved in blade finishing is verified by experiments. Theoretical analysis and experimental results show that the proposed analysis method yields a reasonable belt length and wheel wrap angles, that the changeable-wheel tool system can be controlled precisely, and that the surface roughness of blades after grinding meets the design requirements.
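As an illustration of the belt-geometry analysis described above (performed in MATLAB in the paper), the Python sketch below computes belt length and wrap angles for a single pair of pulleys from standard open-belt tangent geometry; the paper's multi-wheel system with a swinging tension wheel generalizes this. The dimensions are invented.

```python
import math

# Open-belt length around two pulleys -- a simplified stand-in for the
# multi-wheel belt-length analysis described above. Dimensions are made up.

def open_belt_length(d, r1, r2):
    """d: center distance, r1/r2: pulley radii (same units, r2 >= r1)."""
    phi = math.asin((r2 - r1) / d)              # tangent-line inclination
    straight = math.sqrt(d**2 - (r2 - r1)**2)   # length of one straight span
    wrap1 = math.pi - 2.0 * phi                 # wrap angle, smaller pulley (rad)
    wrap2 = math.pi + 2.0 * phi                 # wrap angle, larger pulley (rad)
    return 2.0 * straight + r1 * wrap1 + r2 * wrap2, wrap1, wrap2

L, w1, w2 = open_belt_length(d=300.0, r1=40.0, r2=60.0)
print(f"belt length {L:.1f} mm, wrap angles {math.degrees(w1):.1f}/{math.degrees(w2):.1f} deg")
```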
ASTEC and MODEL: Controls software development at Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Downing, John P.; Bauer, Frank H.; Surber, Jeffrey L.
1993-01-01
The ASTEC (Analysis and Simulation Tools for Engineering Controls) software has been under development at the Goddard Space Flight Center (GSFC) for the last three years. The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. ASTEC is meant to be an integrated collection of controls analysis tools for use at the desktop level. MODEL (Multi-Optimal Differential Equation Language) is a translator that converts programs written in the MODEL language to FORTRAN, and an upgraded version of the MODEL program will be merged into ASTEC. MODEL has not been modified since 1981 and has not kept pace with changes in computers or user interface techniques. This paper describes the changes made to MODEL in order to make it useful in the 1990s and how it relates to ASTEC.
On-line analysis capabilities developed to support the AFW wind-tunnel tests
NASA Technical Reports Server (NTRS)
Wieseman, Carol D.; Hoadley, Sherwood T.; Mcgraw, Sandra M.
1992-01-01
A variety of on-line analysis tools were developed to support two active flexible wing (AFW) wind-tunnel tests. These tools were developed to verify control law execution, to satisfy analysis requirements of the control law designers, to provide measures of system stability in a real-time environment, and to provide project managers with a quantitative measure of controller performance. Descriptions and purposes of the developed capabilities are presented along with examples. Procedures for saving and transferring data for near real-time analysis, and descriptions of the corresponding data interface programs are also presented. The on-line analysis tools worked well before, during, and after the wind tunnel test and proved to be a vital and important part of the entire test effort.
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assessing and managing method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis.
Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies
NASA Technical Reports Server (NTRS)
Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.
2014-01-01
This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. A doublet-lattice approach is taken to compute generalized forces, and a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer, although all parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. This process is carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool; therefore, the flutter speed and frequency for a clamped plate are computed using V-g and V-f analysis. The computational results are compared to a previously published computational analysis and wind tunnel results for the same structure. Finally, a case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to V-g and V-f analysis, including the analysis of the model in response to a 1-cos gust.
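The rational function approximation step can be illustrated with a small sketch. Assuming a Roger-type form with fixed lag roots (Roger's approximation is named in the 2015 records of this work below), each element of the generalized force matrix Q(ik) can be fit by linear least squares. The lag roots and synthetic data here are illustrative, not the paper's.

```python
import numpy as np

# Least-squares fit of a Roger-form rational function approximation (RFA)
# to tabulated generalized aerodynamic forces Q(ik), for one matrix element:
#   Q(s) ~ A0 + A1*s + A2*s^2 + sum_j A_{2+j} * s/(s + b_j),  s = ik.
# Lag roots and sample data are illustrative, not from the paper.

def roger_fit(k, Q, lags):
    """k: reduced frequencies; Q: complex samples of one GAF element;
    lags: fixed positive lag roots b_j. Returns real coefficients [A0, A1, A2, ...]."""
    s = 1j * k
    cols = [np.ones_like(s), s, s**2] + [s / (s + b) for b in lags]
    M = np.column_stack(cols)
    # Stack real and imaginary parts so the unknown coefficients stay real.
    A = np.vstack([M.real, M.imag])
    rhs = np.concatenate([Q.real, Q.imag])
    coef, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return coef

k = np.linspace(0.01, 1.0, 20)
Q_true = 2.0 + 0.5j * k + (1j * k) / (1j * k + 0.3)   # synthetic "data"
print(roger_fit(k, Q_true, lags=[0.2, 0.4]))          # recovers constant/damping/lag terms
```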
Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies
NASA Technical Reports Server (NTRS)
Suh, Peter M.; Conyers, Howard J.; Mavris, Dimitri N.
2015-01-01
This paper introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this paper is on tool presentation, verification, and validation. These processes are carried out in stages throughout the paper. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.
Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies
NASA Technical Reports Server (NTRS)
Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.
2015-01-01
This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.
Integrated CFD and Controls Analysis Interface for High Accuracy Liquid Propellant Slosh Predictions
NASA Technical Reports Server (NTRS)
Marsell, Brandon; Griffin, David; Schallhorn, Dr. Paul; Roth, Jacob
2012-01-01
Coupling computational fluid dynamics (CFD) with a controls analysis tool elegantly allows for high accuracy predictions of the interaction between sloshing liquid propellants and the control system of a launch vehicle. Instead of relying on mechanical analogs which are not valid during all stages of flight, this method allows for a direct link between the vehicle dynamic environments calculated by the solver in the controls analysis tool to the fluid flow equations solved by the CFD code. This paper describes such a coupling methodology, presents the results of a series of test cases, and compares said results against equivalent results from extensively validated tools. The coupling methodology, described herein, has proven to be highly accurate in a variety of different cases.
Integrated CFD and Controls Analysis Interface for High Accuracy Liquid Propellant Slosh Predictions
NASA Technical Reports Server (NTRS)
Marsell, Brandon; Griffin, David; Schallhorn, Paul; Roth, Jacob
2012-01-01
Coupling computational fluid dynamics (CFD) with a controls analysis tool elegantly allows for high accuracy predictions of the interaction between sloshing liquid propellants and the control system of a launch vehicle. Instead of relying on mechanical analogs which are not valid during all stages of flight, this method allows for a direct link between the vehicle dynamic environments calculated by the solver in the controls analysis tool to the fluid flow equations solved by the CFD code. This paper describes such a coupling methodology, presents the results of a series of test cases, and compares said results against equivalent results from extensively validated tools. The coupling methodology, described herein, has proven to be highly accurate in a variety of different cases.
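The coupling loop itself can be sketched abstractly. The Python below is a toy one-dimensional stand-in for the exchange the abstract describes: vehicle accelerations flow from the controls side to the fluid solver, and integrated slosh forces flow back. Both "solvers" are trivial surrogates written for this sketch; this is not the interface of either actual tool.

```python
# Toy 1-D co-simulation loop: controls side supplies vehicle acceleration,
# a spring-mass slosh surrogate (standing in for the CFD solver) returns a
# reaction force. Purely illustrative; not the actual tools' APIs.

class StubControls:
    def __init__(self):
        self.v = 0.0                          # vehicle velocity (1-D)
    def accel(self, slosh_force, thrust=10.0, mass=100.0):
        return (thrust + slosh_force) / mass  # vehicle dynamic environment
    def advance(self, slosh_force, dt):
        self.v += self.accel(slosh_force) * dt
        return self.v

class StubCFD:
    def __init__(self):
        self.x, self.xdot = 0.0, 0.0          # slosh mass offset and rate
    def advance(self, vehicle_accel, dt, m=5.0, k=2.0):
        xddot = -(k / m) * self.x - vehicle_accel
        self.xdot += xddot * dt
        self.x += self.xdot * dt
        return -k * self.x                    # reaction force on the vehicle

controls, cfd = StubControls(), StubCFD()
force = 0.0
for _ in range(1000):                         # coupled time-march, dt = 1 ms
    v = controls.advance(force, dt=1e-3)
    force = cfd.advance(controls.accel(force), dt=1e-3)
print(f"final velocity {v:.3f} m/s, slosh force {force:.4f} N")
```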
NASA Astrophysics Data System (ADS)
Landrock, Clinton K.
Falls are the leading cause of all external injuries; outcomes of falls include traumatic brain injury (of which falls are the leading cause), bone fractures, and direct medical costs in the billions of dollars. This work focused on developing three areas of enabling component technology to be used in postural control monitoring tools targeting the mitigation of falls. The first was an analysis tool based on stochastic fractal analysis to reliably measure levels of motor control. The second was thin film wearable pressure sensors capable of relaying data for the first tool. The third was new thin film advanced optics for improving phototherapy devices targeting postural control disorders. Two populations, athletes and elderly, were studied against control groups. The results of these studies clearly show that monitoring postural stability in at-risk groups can be achieved reliably, and an integrated wearable system can be envisioned for both monitoring and treatment purposes. Keywords: electro-active polymer, ionic polymer-metal composite, postural control, motor control, fall prevention, sports medicine, fractal analysis, physiological signals, wearable sensors, phototherapy, photobiomodulation, nano-optics.
Standardisation of DNA quantitation by image analysis: quality control of instrumentation.
Puech, M; Giroud, F
1999-05-01
DNA image analysis is frequently performed in clinical practice as a prognostic tool and to improve diagnosis. The precision of prognosis and diagnosis depends on the accuracy of analysis and particularly on the quality of image analysis systems. It has been reported that image analysis systems used for DNA quantification differ widely in their characteristics (Thunissen et al.: Cytometry 27: 21-25, 1997). This induces inter-laboratory variations when the same sample is analysed in different laboratories. In microscopic image analysis, the principal instrumentation errors arise from the optical and electronic parts of systems. They bring about problems of instability, non-linearity, and shading and glare phenomena. The aim of this study is to establish tools and standardised quality control procedures for microscopic image analysis systems. Specific reference standard slides have been developed to control instability, non-linearity, shading and glare phenomena and segmentation efficiency. Some systems have been controlled with these tools and these quality control procedures. Interpretation criteria and accuracy limits of these quality control procedures are proposed according to the conclusions of a European project called PRESS project (Prototype Reference Standard Slide). Beyond these limits, tested image analysis systems are not qualified to realise precise DNA analysis. The different procedures presented in this work determine if an image analysis system is qualified to deliver sufficiently precise DNA measurements for cancer case analysis. If the controlled systems are beyond the defined limits, some recommendations are given to find a solution to the problem.
A digital flight control system verification laboratory
NASA Technical Reports Server (NTRS)
De Feo, P.; Saib, S.
1982-01-01
A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. Software verification tools can be applied to enhance the capabilities, effectiveness, and ease of use of the test environment. The tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus in particular on increasing the number of software test tools and on assessing cost effectiveness.
Design of Launch Vehicle Flight Control Systems Using Ascent Vehicle Stability Analysis Tool
NASA Technical Reports Server (NTRS)
Jang, Jiann-Woei; Alaniz, Abran; Hall, Robert; Bedossian, Nazareth; Hall, Charles; Jackson, Mark
2011-01-01
A launch vehicle represents a complicated flex-body structural environment for flight control system design. The Ascent-vehicle Stability Analysis Tool (ASAT) was developed to address this complexity in the design and analysis of a launch vehicle. The design objective for the flight control system of a launch vehicle is to follow guidance commands as closely as possible while robustly maintaining system stability. A constrained optimization approach takes advantage of modern computational control techniques to simultaneously design multiple control systems in compliance with the required design specifications. "Tower Clearance" and "Load Relief" designs have been achieved for the liftoff and maximum dynamic pressure flight regions, respectively, in the presence of large wind disturbances. The robustness of the flight control system designs has been verified in frequency-domain Monte Carlo analysis using ASAT.
ASTEC: Controls analysis for personal computers
NASA Technical Reports Server (NTRS)
Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.
1989-01-01
The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.
Influence of export control policy on the competitiveness of machine tool producing organizations
NASA Astrophysics Data System (ADS)
Ahrstrom, Jeffrey D.
The possible influence of export control policies on producers of export-controlled machine tools is examined in this quantitative study. International market competitiveness theories hold that market-controlling policies such as export control regulations may influence an organization's ability to compete (Burris, 2010). Differences in the domestic application of export control policy to machine tool exports may impose throttling effects on the competitiveness of participating firms (Freedenberg, 2010). Commodity shipments from Japan, Germany, and the United States to the Russian market will be examined using descriptive statistics; gravity modeling of these specific markets will provide a foundation for comparison to actual shipment data; and industry participant responses to a user-developed survey will provide additional data for analysis using a Kruskal-Wallis one-way analysis of variance. Academic research data on the effects of export controls within the machine tool industry are scarce. Research results may be of interest to industry leadership in market participation decisions, advocacy arguments, and strategic planning. Industry advocates and export policy decision makers could find the data useful in supporting positions for or against modifications of export control policies.
Integrated multidisciplinary analysis tool IMAT users' guide
NASA Technical Reports Server (NTRS)
Meissner, Frances T. (Editor)
1988-01-01
The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system developed at Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite controls systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.
Distributed Engine Control Empirical/Analytical Verification Tools
NASA Technical Reports Server (NTRS)
DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan
2013-01-01
NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of surviving the harsh engine operating environment while providing decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration of all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and the hardware simulator provide the capability to simulate virtual subcomponents, as well as to swap in actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique capabilities to study the effects of a given change to the control system in the context of the distributed paradigm. The simulation tool can support treatment of all components within the control system, both virtual and real; these include the communication data network, smart sensor and actuator nodes, the centralized control system (FADEC, full-authority digital engine control), and the aircraft engine itself. The DECsim tool allows simulation-based prototyping of control laws, control architectures, and decentralization strategies before hardware is integrated into the system. With the configuration specified, the simulator allows a variety of key factors to be systematically assessed, including control system performance, reliability, weight, and bandwidth utilization.
A dynamical framework for integrated corridor management.
DOT National Transportation Integrated Search
2016-01-11
We develop analysis and control synthesis tools for dynamic traffic flow over networks. Our analysis relies on exploiting monotonicity properties of the dynamics and on adapting relevant tools from stochastic queuing networks. We develop proport...
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
An integrated modeling and design tool for advanced optical spacecraft
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.
1992-01-01
Consideration is given to the design and status of the Integrated Modeling of Optical Systems (IMOS) tool and to critical design issues. A multidisciplinary spacecraft design and analysis tool with support for structural dynamics, controls, thermal analysis, and optics, IMOS provides rapid and accurate end-to-end performance analysis, simulations, and optimization of advanced space-based optical systems. The requirements for IMOS-supported numerical arrays, user defined data structures, and a hierarchical data base are outlined, and initial experience with the tool is summarized. A simulation of a flexible telescope illustrates the integrated nature of the tools.
Modern CACSD using the Robust-Control Toolbox
NASA Technical Reports Server (NTRS)
Chiang, Richard Y.; Safonov, Michael G.
1989-01-01
The Robust-Control Toolbox is a collection of 40 M-files which extend the capability of PC/PRO-MATLAB to do modern multivariable robust control system design. Included are robust analysis tools such as singular values and structured singular values, robust synthesis tools such as continuous/discrete H2/H-infinity synthesis and Linear Quadratic Gaussian Loop Transfer Recovery methods, and a variety of robust model reduction tools such as Hankel approximation, balanced truncation, and balanced stochastic truncation. The capabilities of the toolbox are described and illustrated with examples to show how easily they can be used in practice. Examples include structured singular value analysis, H-infinity loop-shaping, and large space structure model reduction.
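As a flavor of the singular value analysis the toolbox provides (there in PC/PRO-MATLAB), here is a Python/NumPy sketch of the basic computation: singular values of the frequency response G(jw) = C(jwI - A)^-1 B + D across frequency. The state-space matrices are an arbitrary two-input, two-output example, not from the toolbox documentation.

```python
import numpy as np

# Singular values of a MIMO frequency response -- the basic robust-analysis
# computation. The 2x2 state-space example below is arbitrary.

A = np.array([[-1.0, 2.0], [0.0, -3.0]])
B = np.eye(2)
C = np.eye(2)
D = np.zeros((2, 2))

freqs = np.logspace(-2, 2, 200)                     # rad/s
sv = []
for w in freqs:
    G = C @ np.linalg.solve(1j * w * np.eye(2) - A, B) + D
    sv.append(np.linalg.svd(G, compute_uv=False))   # descending singular values
sv = np.array(sv)                                   # columns: sigma_max .. sigma_min
print("peak sigma_max = %.3f" % sv[:, 0].max())     # an H-infinity norm estimate
```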
IMAT (Integrated Multidisciplinary Analysis Tool) user's guide for the VAX/VMS computer
NASA Technical Reports Server (NTRS)
Meissner, Frances T. (Editor)
1988-01-01
The Integrated Multidisciplinary Analysis Tool (IMAT) is a computer software system for the VAX/VMS computer developed at the Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven executive system, IMAT leads the user through the program options. IMAT links a relational database manager to commercial and in-house structural and controls analysis codes. This paper describes the IMAT software system and how to use it.
TU-FG-201-05: Varian MPC as a Statistical Process Control Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carver, A; Rowbottom, C
Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian MPC as a tool for Statistical Process Control (SPC). Methods: Varian's MPC tool was used on three Truebeam linacs and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed using Shewhart control charts, with Matlab used for the analysis. Principal component analysis was used to determine whether a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful to detect beam output changes, worn T-nuts, and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using principal component analysis. We found little evidence of clustering beyond that which might be naively expected, such as between beam uniformity and beam output. While this makes multivariate analysis of little use, it suggests that each test gives independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements before they become out of specification. A. Carver has received a speaker's honorarium from Varian.
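A minimal sketch of the individuals (Shewhart) control chart used here, with limits at the 95% level as in the abstract (mean +/- 1.96 sigma), on synthetic daily output readings rather than real MPC data:

```python
import numpy as np

# Individuals control chart with 95% control limits on synthetic daily
# beam-output readings (percent of baseline). Data are invented.

rng = np.random.default_rng(1)
output = 100.0 + 0.3 * rng.standard_normal(60)   # 60 daily readings

center = output.mean()
sigma = output.std(ddof=1)
ucl, lcl = center + 1.96 * sigma, center - 1.96 * sigma

flags = np.where((output > ucl) | (output < lcl))[0]
print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}  out-of-control days: {flags}")
```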
2010-09-01
Application of existing assessment tools that may be applicable to Marine Air Ground Task Force (MAGTF) Command, Control, Communications and Computers (C4) ... assessment tools and analysis concepts that may be extended to the Marine Corps' C4 System of Systems assessment methodology as a means to obtain a ...
A computer-aided approach to nonlinear control systhesis
NASA Technical Reports Server (NTRS)
Wie, Bong; Anthony, Tobin
1988-01-01
The major objective of this project is to develop a computer-aided approach to nonlinear stability analysis and nonlinear control system design. This goal is to be obtained by refining the describing function method as a synthesis tool for nonlinear control design. The interim report outlines the approach by this study to meet these goals including an introduction to the INteractive Controls Analysis (INCA) program which was instrumental in meeting these study objectives. A single-input describing function (SIDF) design methodology was developed in this study; coupled with the software constructed in this study, the results of this project provide a comprehensive tool for design and integration of nonlinear control systems.
2007-08-01
ERDC/TN ANSRP-07-2, August 2007. Detection of Apoptosis in Early Life Stages as a Tool to Evaluate Chemical Control of Invasive Species, by J... ... heralding apoptosis. Data analysis: an apoptotic index (API) was established by calculating the percentage of embryos in each life stage with ...
The effect of introducing computers into an introductory physics problem-solving laboratory
NASA Astrophysics Data System (ADS)
McCullough, Laura Ellen
2000-10-01
Computers are appearing in every type of classroom across the country. Yet they often appear without benefit of studying their effects. The research that is available on computer use in classrooms has found mixed results, and often ignores the theoretical and instructional contexts of the computer in the classroom. The University of Minnesota's physics department employs a cooperative-group problem solving pedagogy, based on a cognitive apprenticeship instructional model, in its calculus-based introductory physics course. This study was designed to determine possible negative effects of introducing a computerized data-acquisition and analysis tool into this pedagogy as a problem-solving tool for students to use in laboratory. To determine the effects of the computer tool, two quasi-experimental treatment groups were selected. The computer-tool group (N = 170) used a tool, designed for this study (VideoTool), to collect and analyze motion data in the laboratory. The control group (N = 170) used traditional non-computer equipment (spark tapes and Polaroid(TM) film). The curriculum was kept as similar as possible for the two groups. During the ten week academic quarter, groups were examined for effects on performance on conceptual tests and grades, attitudes towards the laboratory and the laboratory tools, and behaviors within cooperative groups. Possible interactions with gender were also examined. Few differences were found between the control and computer-tool groups. The control group received slightly higher scores on one conceptual test, but this difference was not educationally significant. The computer-tool group had slightly more positive attitudes towards using the computer tool than their counterparts had towards the traditional tools. The computer-tool group also perceived that they spoke more frequently about physics misunderstandings, while the control group felt that they discussed equipment difficulties more often. This perceptual difference interacted with gender, with the men in the control group more likely to discuss equipment difficulties than any other group. Overall, the differences between the control and quasi-experimental groups were minimal. It was concluded that carefully replacing traditional data collection and analysis tools with a computer tool had no negative effects on achievement, attitude, group behavior, and did not interact with gender.
Ares I-X Flight Test Validation of Control Design Tools in the Frequency-Domain
NASA Technical Reports Server (NTRS)
Johnson, Matthew; Hannan, Mike; Brandon, Jay; Derry, Stephen
2011-01-01
A major motivation of the Ares I-X flight test program was to Design for Data, in order to maximize the usefulness of the data recorded in support of Ares I modeling and validation of design and analysis tools. The Design for Data effort was intended to enable good post-flight characterizations of the flight control system, the vehicle structural dynamics, and the aerodynamic characteristics of the vehicle. To extract the necessary data from the system during flight, a set of small predetermined Programmed Test Inputs (PTIs) was injected directly into the TVC signal. These PTIs were designed to excite the necessary vehicle dynamics while exhibiting a minimal impact on loads. The method is similar to common approaches in aircraft flight test programs, but with unique launch vehicle challenges due to rapidly changing states, the short duration of flight, a tight flight envelope, and an inability to repeat any test. This paper documents the effort to validate the stability analysis tools against the flight data, performed by comparing the post-flight calculated frequency response of the vehicle to the frequency response calculated by the stability analysis tools used to design and analyze the preflight models during the control design effort. The comparison between the flight-day frequency response and the stability-tool analysis of the simulated vehicle shows good agreement and provides a high level of confidence in the stability analysis tools for use in any future program. This is true both for a nominal model and for dispersed analysis, which shows that the flight-day frequency response is enveloped by the vehicle's preflight uncertainty models.
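The post-flight frequency-response extraction can be illustrated with a standard spectral estimate. The Python sketch below uses a chirp as a stand-in for a PTI and the H1 estimator (cross-spectrum over input auto-spectrum); the sample rate, plant, and noise level are invented, and this is not the Ares I-X toolchain.

```python
import numpy as np
from scipy import signal

# Estimate a frequency response from input u(t) and response y(t) using the
# H1 estimator. A chirp stands in for a PTI; the "vehicle" is a synthetic
# second-order plant, not flight telemetry.

fs = 200.0                                         # sample rate (Hz), illustrative
t = np.arange(0, 30, 1 / fs)
u = signal.chirp(t, f0=0.1, t1=30, f1=10)          # PTI-like sweep

# Discretize a toy continuous plant 16/(s^2 + 1.2 s + 16) and simulate it.
num_d, den_d, _ = signal.cont2discrete(([16.0], [1.0, 1.2, 16.0]), 1 / fs,
                                        method='bilinear')
y = signal.lfilter(num_d[0], den_d, u)
y += 0.01 * np.random.default_rng(0).standard_normal(t.size)   # sensor noise

f, Pxy = signal.csd(u, y, fs=fs, nperseg=1024)     # cross-spectrum
_, Pxx = signal.welch(u, fs=fs, nperseg=1024)      # input auto-spectrum
H = Pxy / Pxx                                      # empirical frequency response

print(f"peak gain {np.abs(H).max():.2f} near {f[np.argmax(np.abs(H))]:.2f} Hz")
```

The same estimate computed from flight telemetry would then be overlaid on the model-predicted frequency response to judge agreement, as the paper describes.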
Cost/Schedule Control Systems Criteria: A Reference Guide to C/SCSC information
1992-09-01
Smith, Larry A. "Mainframe ARTEMIS: More than a Project Management Tool -- Earned Value Analysis (PEVA)," Project Management Journal, 19:23-28 (April 1988). ... Trufant, Thomas M. and Robert ...
A UML Profile for State Analysis
NASA Technical Reports Server (NTRS)
Murray, Alex; Rasmussen, Robert
2010-01-01
State Analysis is a systems engineering methodology for the specification and design of control systems, developed at the Jet Propulsion Laboratory. The methodology emphasizes an analysis of the system under control in terms of States and their properties and behaviors and their effects on each other, a clear separation of the control system from the controlled system, cognizance in the control system of the controlled system's State, goal-based control built on constraining the controlled system's States, and disciplined techniques for State discovery and characterization. State Analysis (SA) introduces two key diagram types: State Effects and Goal Network diagrams. The team at JPL developed a tool for performing State Analysis. The tool includes a drawing capability, backed by a database that supports the diagram types and the organization of the elements of the SA models. But the tool does not support the usual activities of software engineering and design - a disadvantage, since systems to which State Analysis can be applied tend to be very software-intensive. This motivated the work described in this paper: the development of a preliminary Unified Modeling Language (UML) profile for State Analysis. Having this profile would enable systems engineers to specify a system using the methods and graphical language of State Analysis, which is easily linked with a larger system model in SysML (Systems Modeling Language), while also giving software engineers engaged in implementing the specified control system immediate access to and use of the SA model, in the same language, UML, used for other software design. That is, a State Analysis profile would serve as a shared modeling bridge between system and software models for the behavior aspects of the system. This paper begins with an overview of State Analysis and its underpinnings, followed by an overview of the mapping of SA constructs to the UML metamodel. It then delves into the details of these mappings and the constraints associated with them. Finally, we give an example of the use of the profile for expressing an example SA model.
PySCeSToolbox: a collection of metabolic pathway analysis tools.
Christensen, Carl D; Hofmeyr, Jan-Hendrik S; Rohwer, Johann M
2018-01-01
PySCeSToolbox is an extension to the Python Simulator for Cellular Systems (PySCeS) that includes tools for performing generalized supply-demand analysis, symbolic metabolic control analysis, and a framework for investigating the kinetic and thermodynamic aspects of enzyme-catalyzed reactions. Each tool addresses a different aspect of metabolic behaviour, control, and regulation; the tools complement each other and can be used in conjunction to better understand higher level system behaviour. PySCeSToolbox is available on Linux, Mac OS X and Windows. It is licensed under the BSD 3-clause licence. Code, setup instructions and a link to documentation can be found at https://github.com/PySCeS/PyscesToolbox. jr@sun.ac.za. Supplementary data are available at Bioinformatics online.
NASA Technical Reports Server (NTRS)
Chin, Jeffrey C.; Csank, Jeffrey T.
2016-01-01
The Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA ver2) is a control design tool that enables preliminary estimation of transient performance for models without requiring a full nonlinear controller to be designed. The program is compatible with subsonic engine models implemented in the MATLAB/Simulink (The MathWorks, Inc.) environment and the Numerical Propulsion System Simulation (NPSS) framework. At a specified flight condition, TTECTrA will design a closed-loop controller meeting user-defined requirements in a semi- or fully automated fashion. Multiple specifications may be provided, in which case TTECTrA will design one controller for each, producing a collection of controllers in a single run. Each resulting controller contains a setpoint map, a schedule of setpoint controller gains, and limiters, all contributing to transient characteristics. The goal of the program is to provide steady-state engine designers with more immediate feedback on transient engine performance earlier in the design cycle.
Tool Efficiency Analysis model research in SEMI industry
NASA Astrophysics Data System (ADS)
Lei, Ma; Nana, Zhang; Zhongqiu, Zhang
2018-06-01
One of the key goals in the SEMI industry is to improve equipment throughput and to maximize equipment production efficiency. Based on SEMI standards for semiconductor equipment control, this paper defines the transition rules between different tool states and presents a TEA (Tool Efficiency Analysis) system model that analyzes tool performance automatically using a finite state machine. The system was applied to fab tools and its effectiveness was verified; the parameter values used to measure equipment performance were obtained, along with recommendations for improvement.
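A minimal sketch of the finite-state-machine idea: accumulate time in each tool state from an event log, enforcing a transition-rule table, and report the fraction of time per state. The state names and rules below are illustrative SEMI E10-style categories, not the paper's exact definitions.

```python
from collections import defaultdict

# Tool-state tracker: validate transitions against a rule table and
# accumulate time per state. States, rules, and the log are illustrative.

ALLOWED = {
    "PRODUCTIVE": {"STANDBY", "UNSCHEDULED_DOWN"},
    "STANDBY": {"PRODUCTIVE", "SCHEDULED_DOWN", "UNSCHEDULED_DOWN"},
    "SCHEDULED_DOWN": {"STANDBY"},
    "UNSCHEDULED_DOWN": {"STANDBY"},
}

def analyze(events):
    """events: list of (timestamp_seconds, state), sorted by time.
    Returns fraction of tracked time spent in each state."""
    time_in = defaultdict(float)
    for (t0, s0), (t1, s1) in zip(events, events[1:]):
        if s1 != s0 and s1 not in ALLOWED[s0]:
            raise ValueError(f"illegal transition {s0} -> {s1} at t={t1}")
        time_in[s0] += t1 - t0          # final state's open interval is not counted
    total = sum(time_in.values())
    return {s: dt / total for s, dt in time_in.items()}

log = [(0, "STANDBY"), (600, "PRODUCTIVE"), (7800, "UNSCHEDULED_DOWN"),
       (9000, "STANDBY"), (9600, "PRODUCTIVE"), (18000, "STANDBY")]
print(analyze(log))
```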
The Precision Formation Flying Integrated Analysis Tool (PFFIAT)
NASA Technical Reports Server (NTRS)
Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor
2004-01-01
Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.
The Precision Formation Flying Integrated Analysis Tool (PFFIAT)
NASA Technical Reports Server (NTRS)
Stoneking, Eric; Lyon, Richard G.; Sears, Edie; Lu, Victor
2004-01-01
Several space missions presently in the concept phase (e.g. Stellar Imager, Submillimeter Probe of Evolutionary Cosmic Structure, Terrestrial Planet Finder) plan to use multiple spacecraft flying in precise formation to synthesize unprecedentedly large aperture optical systems. These architectures present challenges to the attitude and position determination and control system; optical performance is directly coupled to spacecraft pointing with typical control requirements being on the scale of milliarcseconds and nanometers. To investigate control strategies, rejection of environmental disturbances, and sensor and actuator requirements, a capability is needed to model both the dynamical and optical behavior of such a distributed telescope system. This paper describes work ongoing at NASA Goddard Space Flight Center toward the integration of a set of optical analysis tools (Optical System Characterization and Analysis Research software, or OSCAR) with the Formation Flying Test Bed (FFTB). The resulting system is called the Precision Formation Flying Integrated Analysis Tool (PFFIAT), and it provides the capability to simulate closed-loop control of optical systems composed of elements mounted on multiple spacecraft. The attitude and translation spacecraft dynamics are simulated in the FFTB, including effects of the space environment (e.g. solar radiation pressure, differential orbital motion). The resulting optical configuration is then processed by OSCAR to determine an optical image. From this image, wavefront sensing (e.g. phase retrieval) techniques are being developed to derive attitude and position errors. These error signals will be fed back to the spacecraft control systems, completing the control loop. A simple case study is presented to demonstrate the present capabilities of the tool.
Modeling Tools for Propulsion Analysis and Computational Fluid Dynamics on the Internet
NASA Technical Reports Server (NTRS)
Muss, J. A.; Johnson, C. W.; Gotchy, M. B.
2000-01-01
The existing RocketWeb(TradeMark) Internet Analysis System (http://www.johnsonrockets.com/rocketweb) provides an integrated set of advanced analysis tools that can be securely accessed over the Internet. Since these tools consist of both batch and interactive analysis codes, the system includes convenient methods for creating input files and evaluating the resulting data. The RocketWeb(TradeMark) system also contains many features that permit data sharing which, when further developed, will facilitate real-time, geographically diverse, collaborative engineering within a designated work group. Adding work group management functionality while simultaneously extending and integrating the system's set of design and analysis tools will create a system providing rigorous, controlled design development, reducing design cycle time and cost.
Control of flexible structures
NASA Technical Reports Server (NTRS)
Russell, R. A.
1985-01-01
The requirements for future space missions indicate that many of these spacecraft will be large, flexible, and in some applications, require precision geometries. A technology program that addresses the issues associated with the structure/control interactions for these classes of spacecraft is discussed. The goal of the NASA control of flexible structures technology program is to generate a technology data base that will provide the designer with options and approaches to achieve spacecraft performance such as maintaining geometry and/or suppressing undesired spacecraft dynamics. This technology program will define the appropriate combination of analysis, ground testing, and flight testing required to validate the structural/controls analysis and design tools. This work was motivated by a recognition that large minimum weight space structures will be required for many future missions. The tools necessary to support such design included: (1) improved structural analysis; (2) modern control theory; (3) advanced modeling techniques; (4) system identification; and (5) the integration of structures and controls.
Bhattacharyya, Rahul; Davidson, Donald J; Sugand, Kapil; Bartlett, Matthew J; Bhattacharya, Rajarshi; Gupte, Chinmay M
2017-10-04
Virtual-reality and cadaveric simulations are expensive and not readily accessible. Innovative and accessible training adjuncts are required to help to meet training needs. Cognitive task analysis has been used extensively to train pilots and in other surgical specialties. However, the use of cognitive task analyses within orthopaedics is in its infancy. The purpose of this study was to evaluate the effectiveness of a novel cognitive task analysis tool to train novice surgeons in diagnostic knee arthroscopy in high-fidelity, phantom-limb simulation. Three expert knee surgeons were interviewed independently to generate a list of technical steps, decision points, and errors for diagnostic knee arthroscopy. A modified Delphi technique was used to generate the final cognitive task analysis. A video and a voiceover were recorded for each phase of this procedure. These were combined to produce the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) tool that utilizes written and audiovisual stimuli to describe each phase of a diagnostic knee arthroscopy. In this double-blinded, randomized controlled trial, a power calculation was performed prior to recruitment. Sixteen novice orthopaedic trainees who performed ≤10 diagnostic knee arthroscopies were randomized into 2 equal groups. The intervention group (IKACTA group) was given the IKACTA tool and the control group had no additional learning material. They were assessed objectively (validated Arthroscopic Surgical Skill Evaluation Tool [ASSET] global rating scale) on a high-fidelity, phantom-knee simulator. All participants, using the Likert rating scale, subjectively rated the tool. The mean ASSET score (and standard deviation) was 19.5 ± 3.7 points in the IKACTA group and 10.6 ± 2.3 points in the control group, resulting in an improvement of 8.9 points (95% confidence interval, 7.6 to 10.1 points; p = 0.002); the score was determined as 51.3% (19.5 of 38) for the IKACTA group, 27.9% (10.6 of 38) for the control group, and 23.4% (8.9 of 38) for the improvement. All participants agreed that the cognitive task analysis learning tool was a useful training adjunct to learning in the operating room. To our knowledge, this is the first cognitive task analysis in diagnostic knee arthroscopy that is user-friendly and inexpensive and has demonstrated significant benefits in training. The IKACTA will provide trainees with a demonstrably strong foundation in diagnostic knee arthroscopy that will flatten learning curves in both technical skills and decision-making.
Periodically-Scheduled Controller Analysis using Hybrid Systems Reachability and Continuization
2015-12-01
… modeled formally as a hybrid automaton. However, reachability tools to verify specifications for hybrid automata do not perform well on such periodically scheduled models. This is due to a combination of the large … an additive nondeterministic input. Reachability tools for hybrid automata can better handle such systems. We further improve the analysis by …
NASA Astrophysics Data System (ADS)
Rose, Michael Benjamin
A novel trajectory and attitude control and navigation analysis tool for powered ascent is developed. The tool is capable of rapid trade-space analysis and is designed to ultimately reduce turnaround time for launch vehicle design, mission planning, and redesign work. It is streamlined to quickly determine trajectory and attitude control dispersions, propellant dispersions, orbit insertion dispersions, and navigation errors and their sensitivities to sensor errors, actuator execution uncertainties, and random disturbances. The tool is developed by applying both Monte Carlo and linear covariance analysis techniques to a closed-loop, launch vehicle guidance, navigation, and control (GN&C) system. The nonlinear dynamics and flight GN&C software models of a closed-loop, six-degree-of-freedom (6-DOF), Monte Carlo simulation are formulated and developed. The nominal reference trajectory (NRT) for the proposed lunar ascent trajectory is defined and generated. The Monte Carlo truth models and GN&C algorithms are linearized about the NRT, the linear covariance equations are formulated, and the linear covariance simulation is developed. The performance of the launch vehicle GN&C system is evaluated using both Monte Carlo and linear covariance techniques and their trajectory and attitude control dispersion, propellant dispersion, orbit insertion dispersion, and navigation error results are validated and compared. Statistical results from linear covariance analysis are generally within 10% of Monte Carlo results, and in most cases the differences are less than 5%. This is an excellent result given the many complex nonlinearities that are embedded in the ascent GN&C problem. Moreover, the real value of this tool lies in its speed, where the linear covariance simulation is 1036.62 times faster than the Monte Carlo simulation. Although the application and results presented are for a lunar, single-stage-to-orbit (SSTO), ascent vehicle, the tools, techniques, and mathematical formulations that are discussed are applicable to ascent on Earth or other planets as well as other rocket-powered systems such as sounding rockets and ballistic missiles.
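The central trade the tool quantifies, Monte Carlo dispersion analysis versus linear covariance propagation, can be sketched on a toy discrete linear system; the model, noise levels, and step counts below are hypothetical stand-ins for the 6-DOF GN&C formulation.

```python
# Toy comparison of Monte Carlo dispersion analysis vs. linear covariance
# propagation on a discrete double-integrator (position/velocity) with
# random process noise. Illustrative only; the actual tool linearizes a
# full 6-DOF closed-loop GN&C simulation about a reference trajectory.
import numpy as np

dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])       # state transition
Q = np.diag([1e-12, 1e-4])                  # process noise covariance
P = np.diag([1e-2, 1e-3])                   # initial dispersion covariance
steps, n_mc = 500, 2000

# Linear covariance: one matrix recursion per step
P_lin = P.copy()
for _ in range(steps):
    P_lin = A @ P_lin @ A.T + Q

# Monte Carlo: thousands of sampled trajectories
rng = np.random.default_rng(1)
x = rng.multivariate_normal(np.zeros(2), P, size=n_mc)
for _ in range(steps):
    w = rng.multivariate_normal(np.zeros(2), Q, size=n_mc)
    x = x @ A.T + w
P_mc = np.cov(x.T)

print("position sigma: lincov", np.sqrt(P_lin[0, 0]), "MC", np.sqrt(P_mc[0, 0]))
```

The covariance recursion gives essentially the same dispersion statistics for one matrix product per step instead of thousands of sampled trajectories, which is the source of the speed advantage reported above.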
Integrated tools for control-system analysis
NASA Technical Reports Server (NTRS)
Ostroff, Aaron J.; Proffitt, Melissa S.; Clark, David R.
1989-01-01
The basic functions embedded within the user-friendly software package MATRIXx are used to provide a high-level systems approach to the analysis of linear control systems. Various control-system analysis configurations are assembled automatically to minimize the amount of work by the user. Interactive decision making is incorporated via menu options and, at selected points such as the plotting section, by inputting data. Five evaluations are provided: the singular value robustness test, singular value loop transfer frequency response, Bode frequency response, steady-state covariance analysis, and closed-loop eigenvalues. Another section describes time response simulations, including the time response to a random white noise disturbance. The configurations and key equations used for each type of analysis, the restrictions that apply, the type of data required, and an example problem are described. One approach for integrating the design and analysis tools is also presented.
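As a rough illustration of the kinds of evaluations listed above, the following sketch computes closed-loop eigenvalues, a Bode magnitude, and frequency-response singular values for a small hypothetical state-space model in NumPy; MATRIXx assembled such configurations automatically, and this is not its interface.

```python
# Sketch of evaluations like those the MATRIXx package automates, applied
# to a small hypothetical state-space model: closed-loop eigenvalues, Bode
# magnitude, and singular values of the frequency response. Illustrative only.
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -0.5]])   # plant dynamics
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
K = np.array([[4.0, 2.0]])                  # state-feedback gain

# Closed-loop eigenvalue analysis
eig_cl = np.linalg.eigvals(A - B @ K)
print("closed-loop eigenvalues:", eig_cl)

# Frequency response G(jw) = C (jwI - A)^{-1} B and its singular values
I = np.eye(2)
for w in (0.1, 1.0, 10.0):
    G = C @ np.linalg.solve(1j * w * I - A, B)
    sv = np.linalg.svd(G, compute_uv=False)
    print(f"w={w:5.1f}  Bode mag={20 * np.log10(abs(G[0, 0])):7.2f} dB  "
          f"sigma_max={sv[0]:.4f}")
```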
Brunner, C; Hoffmann, K; Thiele, T; Schedler, U; Jehle, H; Resch-Genger, U
2015-04-01
Commercial platforms consisting of ready-to-use microarrays printed with target-specific DNA probes, a microarray scanner, and software for data analysis are available for different applications in medical diagnostics and food analysis, detecting, e.g., viral and bacteriological DNA sequences. The transfer of these tools from basic research to routine analysis, their broad acceptance in regulated areas, and their use in medical practice require suitable calibration tools for regular control of instrument performance, in addition to internal assay controls. Here, we present the development of a novel assay-adapted calibration slide for a commercialized DNA-based assay platform, consisting of precisely arranged fluorescent areas of various intensities obtained by incorporating different concentrations of a "green" dye and a "red" dye in a polymer matrix. These dyes are "Cy3" and "Cy5" analogues with improved photostability, chosen because their spectroscopic properties closely match those of common labels for the green and red channels of microarray scanners. This simple tool allows users to efficiently and regularly assess and control the performance of the microarray scanner provided with the biochip platform and to compare different scanners. It will eventually be used as a fluorescence intensity scale for referencing assay results and to enhance the overall comparability of diagnostic tests.
Zeng, Xiantao; Zhang, Yonggang; Kwong, Joey S W; Zhang, Chao; Li, Sheng; Sun, Feng; Niu, Yuming; Du, Liang
2015-02-01
To systematically review the methodological assessment tools for pre-clinical and clinical studies, systematic reviews and meta-analyses, and clinical practice guidelines. We searched PubMed, the Cochrane Handbook for Systematic Reviews of Interventions, the Joanna Briggs Institute (JBI) Reviewers Manual, the Centre for Reviews and Dissemination, the Critical Appraisal Skills Programme (CASP), the Scottish Intercollegiate Guidelines Network (SIGN), and the National Institute for Clinical Excellence (NICE) up to May 20th, 2014. Two authors selected studies and extracted data; quantitative analysis was performed to summarize the characteristics of included tools. We included a total of 21 assessment tools for analysis. A number of tools were developed by academic organizations, and some were developed by only a small group of researchers. The JBI developed the highest number of methodological assessment tools, with CASP coming second. Tools for assessing the methodological quality of randomized controlled studies were most abundant. The Cochrane Collaboration's tool for assessing risk of bias is the best available tool for assessing RCTs. For cohort and case-control studies, we recommend the use of the Newcastle-Ottawa Scale. The Methodological Index for Non-Randomized Studies (MINORS) is an excellent tool for assessing non-randomized interventional studies, and the Agency for Healthcare Research and Quality (AHRQ) methodology checklist is applicable for cross-sectional studies. For diagnostic accuracy test studies, the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool is recommended; the SYstematic Review Centre for Laboratory animal Experimentation (SYRCLE) risk of bias tool is available for assessing animal studies; the Assessment of Multiple Systematic Reviews (AMSTAR) is a measurement tool for systematic reviews/meta-analyses; an 18-item tool has been developed for appraising case series studies; and the Appraisal of Guidelines, Research and Evaluation (AGREE)-II instrument is widely used to evaluate clinical practice guidelines. We have successfully identified a variety of methodological assessment tools for different types of study design. However, further efforts in the development of critical appraisal tools are warranted, since there is currently a lack of such tools for other fields, e.g. genetic studies, and some existing tools (those for nested case-control studies and case reports, for example) need updating to be in line with current research practice and rigor. In addition, it is very important that all critical appraisal tools remain objective and that performance bias is effectively avoided. © 2015 Chinese Cochrane Center, West China Hospital of Sichuan University and Wiley Publishing Asia Pty Ltd.
The dynamic analysis of drum roll lathe for machining of rollers
NASA Astrophysics Data System (ADS)
Qiao, Zheng; Wu, Dongxu; Wang, Bo; Li, Guo; Wang, Huiming; Ding, Fei
2014-08-01
An ultra-precision machine tool for machining rollers has been designed and assembled. Because the dynamic characteristics of the machine tool have an obvious impact on the quality of microstructures on the roller surface, this paper analyzes the dynamic characteristics of the existing machine tool, including the influence of fixturing a large-scale, slender roller in the machine. First, a finite element model of the machine tool is built and simplified; based on this model, a finite element modal analysis is carried out to obtain the natural frequencies and mode shapes of the first four modes of the machine tool. According to the modal analysis results, the weak-stiffness subsystems of the machine tool can be further improved and a reasonable bandwidth for the machine tool's control system can be designed. Finally, considering the shock imparted to the feeding system and cutting tool by frequent fast positioning of the Z axis, a transient analysis is conducted in ANSYS. Based on the transient analysis results, the vibration behavior of key machine tool components and its impact on the cutting process are explored.
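The modal-analysis step reduces to the generalized eigenproblem K φ = ω² M φ; a minimal sketch on a lumped two-degree-of-freedom stand-in (hypothetical stiffness and mass values, not the lathe's finite element model) is:

```python
# Minimal sketch of the modal-analysis step: natural frequencies of a
# lumped 2-DOF structure from K x = w^2 M x. Values are hypothetical;
# the paper's model is a full ANSYS finite element model of the lathe.
import numpy as np
from scipy.linalg import eigh

M = np.diag([20.0, 5.0])                 # kg, lumped masses
K = np.array([[3.0e6, -1.0e6],
              [-1.0e6, 1.0e6]])          # N/m, stiffness matrix

w2, phi = eigh(K, M)                     # generalized symmetric eigenproblem
freqs_hz = np.sqrt(w2) / (2 * np.pi)
print("natural frequencies [Hz]:", freqs_hz)
print("mode shapes (columns):\n", phi)
```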
A computer aided engineering tool for ECLS systems
NASA Technical Reports Server (NTRS)
Bangham, Michal E.; Reuter, James L.
1987-01-01
The Computer-Aided Systems Engineering and Analysis tool used by NASA for environmental control and life support system design studies is capable of simulating atmospheric revitalization systems, water recovery and management systems, and single-phase active thermal control systems. The designer/analyst interface is graphics-based and allows the designer to build a model by constructing a schematic of the system under consideration. Data management functions are performed, and the program is translated into a format that is compatible with the solution routines.
Reliability Analysis for AFTI-F16 SRFCS Using ASSIST and SURE
NASA Technical Reports Server (NTRS)
Wu, N. Eva
2001-01-01
This paper reports the results of a study on reliability analysis of an AFTI-F16 Self-Repairing Flight Control System (SRFCS) using the software tools SURE (Semi-Markov Unreliability Range Evaluator) and ASSIST (Abstract Semi-Markov Specification Interface to the SURE Tool). The purpose of the study is to investigate the potential utility of the software tools in the ongoing effort of the NASA Aviation Safety Program, where the class of systems considered must be extended beyond the originally intended class of electronic digital processors. The study concludes that SURE and ASSIST are applicable to reliability analysis of flight control systems. They are especially efficient for sensitivity analysis that quantifies the dependence of system reliability on model parameters. The study also confirms an earlier finding on the dominant role of a parameter called failure coverage. The paper also remarks on issues related to the improvement of coverage and the optimization of redundancy level.
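The flavor of such a reliability model, and the dominant role of failure coverage, can be sketched with a small continuous-time Markov chain; the structure and rates below are hypothetical, far simpler than the SRFCS model SURE evaluates.

```python
# Hedged sketch of the kind of Markov reliability model SURE evaluates:
# a duplex system with fault rate lam and coverage c (probability a fault
# is detected and handled). State 0 = healthy, 1 = degraded, 2 = failed.
# Values are illustrative, not the AFTI-F16 SRFCS model.
import numpy as np
from scipy.linalg import expm

def unreliability(c, lam=1e-4, T=10.0):
    # Continuous-time Markov chain generator (row i -> rates out of state i)
    Q = np.array([
        [-lam,  c * lam, (1 - c) * lam],
        [0.0,  -lam,     lam          ],
        [0.0,   0.0,     0.0          ],
    ])
    p0 = np.array([1.0, 0.0, 0.0])
    pT = p0 @ expm(Q * T)
    return pT[2]                    # probability of having failed by time T

# Sensitivity of system unreliability to the coverage parameter
for c in (0.90, 0.99, 0.999):
    print(f"coverage={c:.3f}  P(fail by T)={unreliability(c):.3e}")
```

Even in this toy model the uncovered-fault term (1 - c) dominates the failure probability, which is consistent with the finding quoted above on the dominant role of failure coverage.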
Marketing--A Controllable Tool for Education Administrators.
ERIC Educational Resources Information Center
Smith, Wendell C.
1980-01-01
Educational marketing is now becoming legitimized. Marketing techniques such as cost benefit analysis and the selection of a mix of promotional methods are tools that educational administrators should understand and use. (SK)
Economics of infection control surveillance technology: cost-effective or just cost?
Furuno, Jon P; Schweizer, Marin L; McGregor, Jessina C; Perencevich, Eli N
2008-04-01
Previous studies have suggested that informatics tools, such as automated alert and decision support systems, may increase the efficiency and quality of infection control surveillance. However, little is known about the cost-effectiveness of these tools. We focus on 2 types of economic analyses that have utility in assessing infection control interventions (cost-effectiveness analysis and business-case analysis) and review the available literature on the economics of computerized infection control surveillance systems. Previous studies on the effectiveness of computerized infection control surveillance have been limited to assessments of whether these tools increase the sensitivity and specificity of surveillance over traditional methods. Furthermore, we identified only 2 studies that assessed the costs associated with computerized infection control surveillance. Thus, it remains unknown whether computerized infection control surveillance systems are cost-effective and whether use of these systems improves patient outcomes. The existing data are insufficient to allow for a summary conclusion on the cost-effectiveness of infection control surveillance technology. All future studies of computerized infection control surveillance systems should aim to collect outcomes and economic data to inform decision making and assist hospitals with completing business-case analyses.
Statistical quality control through overall vibration analysis
NASA Astrophysics Data System (ADS)
Carnero, M. a. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos
2010-05-01
The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects in the products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes, but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, an evaluation of the quality results of the finished parts under different combinations of process variables is assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follow a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley, and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control. This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
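A hedged sketch of the statistical workflow described (normality checks followed by ANOVA on overall vibration readings per process setup) might look as follows; the readings are simulated stand-ins, not the bearing-grinding data.

```python
# Hedged sketch of the paper's statistical workflow: check normality of
# overall vibration readings per process setup (Shapiro-Wilk), check
# homoscedasticity (Levene), then test for a setup effect with one-way
# ANOVA. Data are simulated stand-ins, not the grinding measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Overall vibration displacement readings under three grinding setups
setup_a = rng.normal(5.0, 0.4, 30)
setup_b = rng.normal(5.1, 0.4, 30)
setup_c = rng.normal(6.2, 0.4, 30)   # e.g. a worn wheel, higher vibration

for name, grp in [("A", setup_a), ("B", setup_b), ("C", setup_c)]:
    w, p = stats.shapiro(grp)
    print(f"setup {name}: Shapiro-Wilk p = {p:.3f}")

lv, p_lv = stats.levene(setup_a, setup_b, setup_c)
f, p = stats.f_oneway(setup_a, setup_b, setup_c)
print(f"Levene p = {p_lv:.3f}; one-way ANOVA: F = {f:.2f}, p = {p:.2e}")
```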
New multivariable capabilities of the INCA program
NASA Technical Reports Server (NTRS)
Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.
1989-01-01
The INteractive Controls Analysis (INCA) program was developed at NASA's Goddard Space Flight Center to provide a user-friendly, efficient environment for the design and analysis of control systems, specifically spacecraft control systems. Since its inception, INCA has found extensive use in the design, development, and analysis of control systems for spacecraft, instruments, robotics, and pointing systems. The INCA program was initially developed as a comprehensive classical design and analysis tool for small and large order control systems. The latest version of INCA, expected to be released in February of 1990, has been expanded to include the capability to perform multivariable controls analysis and design.
Configuration Analysis Tool (CAT). System Description and users guide (revision 1)
NASA Technical Reports Server (NTRS)
Decker, W.; Taylor, W.; Mcgarry, F. E.; Merwarth, P.
1982-01-01
A system description of, and user's guide for, the Configuration Analysis Tool (CAT) are presented. As a configuration management tool, CAT enhances the control of large software systems by providing a repository for information describing the current status of a project. CAT provides an editing capability to update the information and a reporting capability to present the information. CAT is an interactive program available in versions for the PDP-11/70 and VAX-11/780 computers.
Integrated piezoelectric actuators in deep drawing tools
NASA Astrophysics Data System (ADS)
Neugebauer, R.; Mainda, P.; Drossel, W.-G.; Kerschner, M.; Wolf, K.
2011-04-01
Car body panel production suffers from defects caused by process fluctuations; a given panel may therefore be accurate or damaged. To reduce the error rate, an intelligent deep drawing tool was developed at the Fraunhofer Institute for Machine Tools and Forming Technology IWU in cooperation with Audi and Volkswagen. Mechatronic components in a closed-loop control system are the main differentiating factor between an intelligent and a conventional deep drawing tool. Together with sensors for process monitoring, the intelligent tool incorporates piezoelectric actuators to actuate the deep drawing process. By enabling the use of sensors and actuators at the die, the forming tool becomes a smart structure. The interface between sensors and actuators is realized by a closed-loop controller. This paper presents experimental results with the piezoelectric actuators. For the analysis, a production-oriented forming tool meeting all automotive requirements was used. The actuators deployed are monolithic multilayer actuators from the piezo injector system. In order to achieve the required force, the actuators are combined in a cluster. The cluster is redundant and economical. In addition to the detailed assembly structures, this paper presents a detailed analysis of the intelligent deep drawing tool.
Evaluation of the efficiency and reliability of software generated by code generators
NASA Technical Reports Server (NTRS)
Schreur, Barbara
1994-01-01
There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.
Distributed Sensing with Fault-Tolerant Resource Reallocation for Disaster Area Assessment
2010-05-01
… Abrasion Tool … R/C: Radio Control … RISC … spectrometer onboard the rovers in order to determine their composition. Prior to soil and rock analysis, the robots might utilize their Rock Abrasion … one robot might be responsible for using its rock abrasion tool and perhaps a manipulator, while another performs the analysis with the Mössbauer …
USDA-ARS?s Scientific Manuscript database
Background: Dietary intake assessment with diet records (DR) is a standard research and practice tool in nutrition. Manual entry and analysis of DR is time-consuming and expensive. New electronic tools for diet entry by clients and research participants may reduce the cost and effort of nutrient int...
Navigating freely-available software tools for metabolomics analysis.
Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph
2017-01-01
The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas and with regard to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. To compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools, which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks, e.g. peak picking.
Control system design and analysis using the INteractive Controls Analysis (INCA) program
NASA Technical Reports Server (NTRS)
Bauer, Frank H.; Downing, John P.
1987-01-01
The INteractive Controls Analysis (INCA) program was developed at the Goddard Space Flight Center to provide a user-friendly, efficient environment for the design and analysis of linear control systems. Since its inception, INCA has found extensive use in the design, development, and analysis of control systems for spacecraft, instruments, robotics, and pointing systems. Moreover, the results of the analytic tools embedded in INCA have been flight proven with at least three currently orbiting spacecraft. This paper describes the INCA program and illustrates, using a flight-proven example, how the package can perform complex design analyses with relative ease.
U.K. Foot and Mouth Disease: A Systemic Risk Assessment of Existing Controls.
Delgado, João; Pollard, Simon; Pearn, Kerry; Snary, Emma L; Black, Edgar; Prpich, George; Longhurst, Phil
2017-09-01
This article details a systemic analysis of the controls in place and possible interventions available to further reduce the risk of a foot and mouth disease (FMD) outbreak in the United Kingdom. Using a research-based network analysis tool, we identify vulnerabilities within the multibarrier control system and their corresponding critical control points (CCPs). CCPs represent opportunities for active intervention that produce the greatest improvement to United Kingdom's resilience to future FMD outbreaks. Using an adapted 'features, events, and processes' (FEPs) methodology and network analysis, our results suggest that movements of animals and goods associated with legal activities significantly influence the system's behavior due to their higher frequency and ability to combine and create scenarios of exposure similar in origin to the U.K. FMD outbreaks of 1967/8 and 2001. The systemic risk assessment highlights areas outside of disease control that are relevant to disease spread. Further, it proves to be a powerful tool for demonstrating the need for implementing disease controls that have not previously been part of the system. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
Armijo-Olivo, Susan; Fuentes, Jorge; Ospina, Maria; Saltaji, Humam; Hartling, Lisa
2013-09-17
Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. Further empirical evidence on the use of individual items and the psychometric properties of risk of bias tools is needed.
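As a hedged illustration, the kind of between-field comparison reported above can be reproduced with a chi-squared test on a 2x2 table; the counts are taken from the abstract (allocation concealment: 5 of 7 PT tools vs. 7 of 19 general health tools), though with cells this small an exact test might be preferred.

```python
# Sketch of the between-field comparison: does a quality item (here,
# allocation concealment) appear more often in PT tools (5 of 7) than in
# general health tools (7 of 19)? Counts are taken from the abstract.
from scipy.stats import chi2_contingency

#                 includes item   omits item
table = [[5, 2],    # PT tools (5/7 = 71%)
         [7, 12]]   # general health tools (7/19 = 37%)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```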
NASA Technical Reports Server (NTRS)
Hale, Joseph P., II
1994-01-01
Human Factors Engineering support was provided for the 30% design review of the late Space Station Freedom Payload Control Area (PCA). The PCA was to be the payload operations control room, analogous to the Spacelab Payload Operations Control Center (POCC). This effort began with a systematic collection and refinement of the relevant requirements driving the spatial layout of the consoles and PCA. This information was used as input for specialized human factors analytical tools and techniques in the design and design analysis activities. Design concepts and configuration options were developed and reviewed using sketches, 2-D Computer-Aided Design (CAD) drawings, and immersive Virtual Reality (VR) mockups.
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.
HPTLC Fingerprint Analysis: A Quality Control for Authentication of Herbal Phytochemicals
NASA Astrophysics Data System (ADS)
Ram, Mauji; Abdin, M. Z.; Khan, M. A.; Jha, Prabhakar
Authentication and consistent quality are the basic requirements for Indian traditional medicine (TIM), Chinese traditional herbal medicine (TCHM), and their commercial products, regardless of the kind of research conducted to modernize TIM and TCHM. The complexities of TIM and TCHM challenge the current official quality control mode, in which only a few biochemical markers are selected for identification and quantitative assay. Given the many unknown factors in TIM and TCHM, it is impossible and unnecessary to pinpoint qualitatively and quantitatively every single component contained in the herbal drug. A chromatographic fingerprint is a rational option to meet the need for more effective and powerful quality assessment of TIM and TCHM. The optimized chromatographic fingerprint is not only an alternative analytical tool for authentication, but also an approach to express the various patterns of chemical ingredient distribution in herbal drugs and to preserve such a "database" for further multifaceted sustainable studies. Analytical separation techniques, for example, high-performance liquid chromatography (HPLC), gas chromatography (GC), and mass spectrometry (MS), are among the most popular methods of choice for quality control of raw materials and finished herbal products. Fingerprint analysis using high-performance thin-layer chromatography (HPTLC) has become a most potent tool for quality control of herbal medicines because of its simplicity and reliability. It can serve as a tool for identification, authentication, and quality control of herbal drugs. In this chapter, attempts are made to expand the use of HPTLC and, at the same time, create interest among prospective researchers in herbal analysis. The developed method can be used as a quality control tool for rapid authentication of a wide variety of herbal samples. Some examples demonstrate the role of fingerprinting in quality control and assessment.
RSAT: regulatory sequence analysis tools.
Thomas-Chollier, Morgane; Sand, Olivier; Turatsinze, Jean-Valéry; Janky, Rekin's; Defrance, Matthieu; Vervisch, Eric; Brohée, Sylvain; van Helden, Jacques
2008-07-01
The regulatory sequence analysis tools (RSAT, http://rsat.ulb.ac.be/rsat/) is a software suite that integrates a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. The suite includes programs for sequence retrieval, pattern discovery, phylogenetic footprint detection, pattern matching, genome scanning, and feature map drawing. Random controls can be performed with random gene selections or by generating random sequences according to a variety of background models (Bernoulli, Markov). Beyond the original word-based pattern-discovery tools (oligo-analysis and dyad-analysis), we recently added a battery of tools for matrix-based detection of cis-acting elements, with some original features (adaptive background models, Markov-chain estimation of P-values) that do not exist in other matrix-based scanning tools. The web server offers an intuitive interface, where each program can be accessed either separately or connected to the other tools. In addition, the tools are now available as web services, enabling their integration in programmatic workflows. Genomes are regularly updated from various genome repositories (NCBI and EnsEMBL) and 682 organisms are currently supported. Since 1998, the tools have been used by several hundred researchers from all over the world. Several predictions made with RSAT were validated experimentally and published.
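The background-model idea can be sketched in a few lines: generate random control sequences from a Bernoulli model (independent letters) or from a first-order Markov model estimated from an input sequence. This is a toy illustration of the concept, not RSAT's implementation.

```python
# Hedged sketch of one RSAT concept: random control sequences under a
# Bernoulli background model (independent letters with given frequencies)
# or a first-order Markov model estimated from an input sequence.
import numpy as np

rng = np.random.default_rng(3)
ALPHABET = list("ACGT")

def random_bernoulli(n, freqs):
    # freqs: background probabilities for A, C, G, T (must sum to 1)
    return "".join(rng.choice(ALPHABET, size=n, p=freqs))

def random_markov1(n, seq):
    # Estimate first-order transition probabilities from seq, then sample
    idx = {b: i for i, b in enumerate(ALPHABET)}
    counts = np.ones((4, 4))                 # +1 pseudocount per transition
    for a, b in zip(seq, seq[1:]):
        counts[idx[a], idx[b]] += 1
    trans = counts / counts.sum(axis=1, keepdims=True)
    out = [rng.choice(ALPHABET)]
    for _ in range(n - 1):
        out.append(rng.choice(ALPHABET, p=trans[idx[out[-1]]]))
    return "".join(out)

print(random_bernoulli(30, [0.3, 0.2, 0.2, 0.3]))
print(random_markov1(30, "ACGTGCAACGTTGCACGTAACGTGT"))
```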
Dataflow Design Tool: User's Manual
NASA Technical Reports Server (NTRS)
Jones, Robert L., III
1996-01-01
The Dataflow Design Tool is a software tool for selecting a multiprocessor scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. The software tool implements graph-search algorithms and analysis techniques based on the dataflow paradigm. Dataflow analyses provided by the software are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool provides performance optimization through the inclusion of artificial precedence constraints among the schedulable tasks. The user interface and tool capabilities are described. Examples are provided to demonstrate the analysis, scheduling, and optimization functions facilitated by the tool.
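As a hedged illustration of the dataflow analyses described, the sketch below computes a critical-path lower bound on schedule length for a small hypothetical dataflow graph; the node names and task times are invented, and the actual tool implements considerably richer bounds and optimizations.

```python
# Hedged sketch of one dataflow analysis the tool performs: a lower bound
# on schedule length from the critical (longest) path of a dataflow graph,
# computed in topological order. Graph and task times are hypothetical.
from graphlib import TopologicalSorter

task_time = {"read": 1, "filter": 4, "gain": 2, "sum": 1, "write": 1}
preds = {                     # node -> set of predecessor nodes
    "read": set(),
    "filter": {"read"},
    "gain": {"read"},
    "sum": {"filter", "gain"},
    "write": {"sum"},
}

finish = {}
for node in TopologicalSorter(preds).static_order():
    start = max((finish[p] for p in preds[node]), default=0)
    finish[node] = start + task_time[node]

print("critical-path bound on schedule length:", max(finish.values()))
```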
National Fusion Collaboratory: Grid Computing for Simulations and Experiments
NASA Astrophysics Data System (ADS)
Greenwald, Martin
2004-05-01
The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.
Boxwala, A A; Chaney, E L; Fritsch, D S; Friedman, C P; Rosenman, J G
1998-09-01
The purpose of this investigation was to design and implement a prototype physician workstation, called PortFolio, as a platform for developing and evaluating, by means of controlled observer studies, user interfaces and interactive tools for analyzing and managing digital portal images. The first observer study was designed to measure physician acceptance of workstation technology, as an alternative to a view box, for inspection and analysis of portal images for detection of treatment setup errors. The observer study was conducted in a controlled experimental setting to evaluate physician acceptance of the prototype workstation technology exemplified by PortFolio. PortFolio incorporates a windows user interface, a compact kit of carefully selected image analysis tools, and an object-oriented database infrastructure. The kit evaluated in the observer study included tools for contrast enhancement, registration, and multimodal image visualization. Acceptance was measured in the context of performing portal image analysis in a structured protocol designed to simulate clinical practice. Acceptability and usage patterns were measured from semistructured questionnaires and logs of user interactions. Radiation oncologists, the subjects for this study, perceived the tools in PortFolio to be acceptable clinical aids. Concerns were expressed regarding user efficiency, particularly with respect to the image registration tools. The results of our observer study indicate that workstation technology is acceptable to radiation oncologists as an alternative to a view box for clinical detection of setup errors from digital portal images. Improvements in implementation, including more tools and a greater degree of automation in the image analysis tasks, are needed to make PortFolio more clinically practical.
Numerical continuation and bifurcation analysis in aircraft design: an industrial perspective.
Sharma, Sanjiv; Coetzee, Etienne B; Lowenberg, Mark H; Neild, Simon A; Krauskopf, Bernd
2015-09-28
Bifurcation analysis is a powerful method for studying the steady-state nonlinear dynamics of systems. Software tools exist for the numerical continuation of steady-state solutions as parameters of the system are varied. These tools make it possible to generate 'maps of solutions' in an efficient way that provide valuable insight into the overall dynamic behaviour of a system and potentially influence the design process. While this approach has been employed in the military aircraft control community to understand the effectiveness of controllers, the use of bifurcation analysis in the wider aircraft industry is as yet limited. This paper reports progress on how bifurcation analysis can play a role as part of the design process for passenger aircraft. © 2015 The Author(s).
NASA Technical Reports Server (NTRS)
Lee, Katharine K.; Kerns, Karol; Bone, Randall
2001-01-01
The measurement of operational acceptability is important for the development, implementation, and evolution of air traffic management decision support tools. The Controller Acceptance Rating Scale (CARS) was developed at NASA Ames Research Center for the development and evaluation of the Passive Final Approach Spacing Tool. CARS was modeled after a well-known pilot evaluation rating instrument, the Cooper-Harper Scale, and has since been used in the evaluation of the User Request Evaluation Tool, developed by MITRE's Center for Advanced Aviation System Development. In this paper, we provide a discussion of the development of CARS and an analysis of the empirical data collected with CARS to examine construct validity. Results of intraclass correlations indicated statistically significant reliability for the CARS. From the subjective workload data that were collected in conjunction with the CARS, it appears that the expected set of workload attributes was correlated with the CARS. As expected, the analysis also showed that CARS was a sensitive indicator of the impact of decision support tools on controller operations. Suggestions for future CARS development and its improvement are also provided.
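For illustration, an intraclass correlation of the kind used to assess CARS reliability can be sketched as a one-way random-effects ICC computed from an ANOVA decomposition; the ratings below are simulated stand-ins, and the paper does not specify which ICC form was used.

```python
# Hedged sketch of an intraclass correlation like the one used to assess
# CARS reliability: one-way random-effects ICC(1,1) from an ANOVA
# decomposition. Ratings are simulated stand-ins, not CARS data.
import numpy as np

rng = np.random.default_rng(4)
n_targets, k_raters = 12, 4
true_scores = rng.normal(6.0, 1.5, size=(n_targets, 1))     # CARS is 1-10
ratings = true_scores + rng.normal(0.0, 0.8, size=(n_targets, k_raters))

grand = ratings.mean()
row_means = ratings.mean(axis=1)
ss_between = k_raters * ((row_means - grand) ** 2).sum()
ss_within = ((ratings - row_means[:, None]) ** 2).sum()
ms_between = ss_between / (n_targets - 1)
ms_within = ss_within / (n_targets * (k_raters - 1))

# Shrout & Fleiss ICC(1,1): (MSB - MSW) / (MSB + (k-1) MSW)
icc_11 = (ms_between - ms_within) / (ms_between + (k_raters - 1) * ms_within)
print(f"ICC(1,1) = {icc_11:.3f}")
```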
Integrated Tools for Future Distributed Engine Control Technologies
NASA Technical Reports Server (NTRS)
Culley, Dennis; Thomas, Randy; Saus, Joseph
2013-01-01
Turbine engines are highly complex mechanical systems that are becoming increasingly dependent on control technologies to achieve system performance and safety metrics. However, the contribution of controls to these measurable system objectives is difficult to quantify due to a lack of tools capable of informing the decision makers. This shortcoming hinders technology insertion in the engine design process. NASA Glenn Research Center is developing a Hardware-in-the-Loop (HIL) platform and analysis tool set that will serve as a focal point for new control technologies, especially those related to the hardware development and integration of distributed engine control. The HIL platform is intended to enable rapid and detailed evaluation of new engine control applications, from conceptual design through hardware development, in order to quantify their impact on engine systems. This paper discusses the complex interactions of the control system, within the context of the larger engine system, and how new control technologies are changing that paradigm. The conceptual design of the new HIL platform is then described as a primary tool to address those interactions and how it will help feed the insertion of new technologies into future engine systems.
Application handbook for a Standardized Control Module (SCM) for DC-DC converters, volume 1
NASA Astrophysics Data System (ADS)
Lee, F. C.; Mahmoud, M. F.; Yu, Y.
1980-04-01
The standardized control module (SCM) was developed for application in the buck, boost and buck/boost DC-DC converters. The SCM used multiple feedback loops to provide improved input line and output load regulation, a stable feedback control system, good dynamic transient response, and adaptive compensation of the control loop for changes in open loop gain and output filter time constants. The necessary modeling and analysis tools to aid the design engineer in the application of the SCM to DC-DC converters were developed. The SCM functional block diagram and the different analysis techniques were examined. The average time domain analysis technique was chosen as the basic analytical tool. The power stage transfer functions were developed for the buck, boost and buck/boost converters. The analog signal and digital signal processor transfer functions were developed for the three DC-DC converter types using the constant on time, constant off time and constant frequency control laws.
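As one hedged example of what the average time domain analysis yields, the ideal continuous-conduction buck power stage has the duty-cycle-to-output transfer function G(s) = Vin / (L C s² + (L/R) s + 1); the component values below are hypothetical, not the handbook's design case.

```python
# Hedged sketch of an averaged-model result: the duty-cycle-to-output
# transfer function of an ideal buck power stage, evaluated with SciPy.
# Component values are hypothetical, not the handbook's design example.
import numpy as np
from scipy import signal

Vin, L, C, R = 28.0, 100e-6, 470e-6, 5.0
G = signal.TransferFunction([Vin], [L * C, L / R, 1.0])

w, mag_db, phase_deg = signal.bode(G, w=np.logspace(2, 5, 200))
fr = w[np.argmax(mag_db)] / (2 * np.pi)
print(f"LC resonance near {fr:.0f} Hz, peak gain {max(mag_db):.1f} dB")
```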
Modeling the Multi-Body System Dynamics of a Flexible Solar Sail Spacecraft
NASA Technical Reports Server (NTRS)
Kim, Young; Stough, Robert; Whorton, Mark
2005-01-01
Solar sail propulsion systems enable a wide range of space missions that are not feasible with current propulsion technology. Hardware concepts and analytical methods have matured through ground development to the point that a flight validation mission is now realizable. Much attention has been given to modeling the structural dynamics of the constituent elements, but to date an integrated system level dynamics analysis has been lacking. Using a multi-body dynamics and control analysis tool called TREETOPS, the coupled dynamics of the sailcraft bus, sail membranes, flexible booms, and control system sensors and actuators of a representative solar sail spacecraft are investigated to assess system level dynamics and control issues. With this tool, scaling issues and parametric trade studies can be performed to study achievable performance, control authority requirements, and control/structure interaction assessments.
NASA Technical Reports Server (NTRS)
Adams, William M., Jr.; Hoadley, Sherwood T.
1993-01-01
This paper discusses the capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. Linear time invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle which illustrate some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.
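One such rational-function option can be sketched for a scalar coefficient: fit tabulated frequency-domain data with a low-order rational function of the Laplace variable (a Roger-type form) by linear least squares. The data, lag root, and term count below are synthetic assumptions, not ISAC's formulation.

```python
# Hedged sketch of a rational-function approximation: fit a scalar unsteady
# aerodynamic coefficient Q(ik), tabulated at reduced frequencies k, with
# the form Q(p) ~ A0 + A1*p + A2*p^2 + A3*p/(p + b). The lag root b is
# fixed and the A's found by least squares; the data are synthetic.
import numpy as np

k = np.linspace(0.05, 1.0, 20)
p = 1j * k                                    # nondimensional Laplace points
Q_data = 0.5 + 1.2 * p + 0.3 * p**2 + 0.8 * p / (p + 0.4)  # synthetic "truth"

b = 0.35                                      # assumed lag root
basis = np.column_stack([np.ones_like(p), p, p**2, p / (p + b)])
# Solve a real least-squares problem by stacking real and imaginary parts
A_fit, *_ = np.linalg.lstsq(
    np.vstack([basis.real, basis.imag]),
    np.concatenate([Q_data.real, Q_data.imag]),
    rcond=None,
)
fit = basis @ A_fit
print("coefficients A0..A3:", np.round(A_fit, 3))
print("max fit error:", np.max(np.abs(fit - Q_data)))
```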
The development and testing of a skin tear risk assessment tool.
Newall, Nelly; Lewin, Gill F; Bulsara, Max K; Carville, Keryln J; Leslie, Gavin D; Roberts, Pam A
2017-02-01
The aim of the present study is to develop a reliable and valid skin tear risk assessment tool. The six characteristics identified in a previous case control study as constituting the best risk model for skin tear development were used to construct a risk assessment tool. The ability of the tool to predict skin tear development was then tested in a prospective study. Between August 2012 and September 2013, 1466 tertiary hospital patients were assessed at admission and followed up for 10 days to see if they developed a skin tear. The predictive validity of the tool was assessed using receiver operating characteristic (ROC) analysis. When the tool was found not to have performed as well as hoped, secondary analyses were performed to determine whether a potentially better performing risk model could be identified. The tool was found to have high sensitivity but low specificity and therefore have inadequate predictive validity. Secondary analysis of the combined data from this and the previous case control study identified an alternative better performing risk model. The tool developed and tested in this study was found to have inadequate predictive validity. The predictive validity of an alternative, more parsimonious model now needs to be tested. © 2015 Medicalhelplines.com Inc and John Wiley & Sons Ltd.
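The predictive-validity assessment can be sketched with a standard ROC computation; the scores and outcomes below are simulated stand-ins for the cohort, chosen only to illustrate reading sensitivity and specificity off a ROC curve.

```python
# Hedged sketch of the predictive-validity check: ROC analysis of a risk
# score against observed skin tear outcomes. Scores and outcomes are
# simulated stand-ins for the 1466-patient cohort.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(5)
n = 1466
outcome = rng.random(n) < 0.05                   # ~5% develop a skin tear
score = rng.normal(2.0, 1.0, n) + 1.0 * outcome  # weakly predictive risk score

auc = roc_auc_score(outcome, score)
fpr, tpr, thresholds = roc_curve(outcome, score)
# Read off a high-sensitivity operating point, as reported for the tool
i = np.argmax(tpr >= 0.9)
print(f"AUC = {auc:.2f}; at sensitivity {tpr[i]:.2f}, specificity = {1 - fpr[i]:.2f}")
```

A high-sensitivity threshold with low specificity, as in this toy output, is exactly the pattern the study reports as inadequate predictive validity.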
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Bennett, Matthew
2006-01-01
The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
Soft System Analysis to Integrate Technology & Human in Controller Workstation
DOT National Transportation Integrated Search
2011-10-16
Computer-based decision support tools (DST), shared information, and other forms of automation are increasingly being planned for use by controllers and pilots to support Air Traffic Management (ATM) and Air Traffic Control (ATC) in the Next ...
Generic trending and analysis system
NASA Technical Reports Server (NTRS)
Keehan, Lori; Reese, Jay
1994-01-01
The Generic Trending and Analysis System (GTAS) is a generic spacecraft performance monitoring tool developed by NASA Code 511 and Loral Aerosys. It is designed to facilitate quick anomaly resolution and trend analysis. Traditionally, the job of off-line analysis has been performed using hardware and software systems developed for real-time spacecraft contacts, supplemented with a collection of tools developed by Flight Operations Team (FOT) members. Since the number of upcoming missions is increasing, NASA can no longer afford to operate in this manner. GTAS improves control center productivity and effectiveness because it provides a generic solution across multiple missions. Thus, GTAS eliminates the need for each individual mission to develop duplicate capabilities. It also allows for more sophisticated tools to be developed because it draws resources from several projects. In addition, the GTAS software system incorporates commercial off-the-shelf (COTS) software packages and reuses components of other NASA-developed systems wherever possible. GTAS has incorporated lessons learned from previous missions by involving the users early in the development process. GTAS users took a proactive role in requirements analysis, design, development, and testing. Because of user involvement, several special tools were designed and are now being developed. GTAS users expressed considerable interest in facilitating data collection for long term trending and analysis. As a result, GTAS provides easy access to large volumes of processed telemetry data directly in the control center. The GTAS archival and retrieval capabilities are supported by the integration of optical disk technology and a COTS relational database management system.
CRCDA—Comprehensive resources for cancer NGS data analysis
Thangam, Manonanthini; Gopal, Ramesh Kumar
2015-01-01
Next generation sequencing (NGS) innovations have set a compelling landmark in life science and changed the direction of research in clinical oncology, given their power to support the diagnosis and treatment of cancer. The aim of our portal, comprehensive resources for cancer NGS data analysis (CRCDA), is to provide a collection of different NGS tools and pipelines under diverse classes, together with cancer pathways and databases and, furthermore, literature information from PubMed. The literature data were constrained to the 18 most common cancer types, such as breast cancer and colon cancer, exhibited in the worldwide population. For convenience, NGS-cancer tools have been categorized into cancer genomics, cancer transcriptomics, cancer epigenomics, quality control, and visualization. Pipelines for variant detection, quality control, and data analysis are listed to provide an out-of-the-box solution for NGS data analysis, which may help researchers to overcome challenges in selecting and configuring individual tools for analysing exome, whole genome, and transcriptome data. An extensive search page was developed that can be queried by using (i) type of data [literature, gene data, and sequence read archive (SRA) data] and (ii) type of cancer (selected based on global incidence and accessibility of data). For each category of analysis, a variety of tools are available, and the biggest challenge is in searching for and using the right tool for the right application. The objective of the work is to collect the tools in each category available at various places and to arrange the tools and other data in a simple and user-friendly manner for biologists and oncologists to find information more easily. To the best of our knowledge, we have collected and presented a comprehensive package of most of the resources available in cancer for NGS data analysis. Given these factors, we believe that this website will be a useful resource to the NGS research community working on cancer. Database URL: http://bioinfo.au-kbc.org.in/ngs/ngshome.html. PMID:26450948
La Porta, F; Giordano, A; Caselli, S; Foti, C; Franchignoni, F
2015-12-01
It is unclear whether the Berg Balance Scale (BBS) is an effective tool for the measurement of early postural control impairments in patients with Parkinson's disease (PD). The aim of this paper was to evaluate the BBS's content validity, internal construct validity, reliability, and targeting in patients with PD within the Rasch analysis framework. Observational, cross-sectional study. Outpatient Rehabilitation Unit. A sample of 285 outpatients with PD. The content validity of the BBS was assessed using standard linking techniques. The BBS was administered by trained physiotherapists, and the data collected then underwent Rasch analysis. Content validity analysis showed a lack of items assessing postural responses to tripping and slips and stability during walking. On Rasch analysis, the BBS failed the requirements of monotonicity, local independence, unidimensionality, and invariance. After rescoring 7 items, grouping locally dependent items into testlets, and deleting the static sitting balance item because it was mistargeted and underdiscriminating, the Rasch-modified BBS for PD (BBS-PD) showed adequate internal construct validity (χ²(24) = 39.693; P = 0.023), including absence of differential item functioning (DIF) across gender and age, and was, as a whole, sufficiently precise for individual person measurement (PSI = 0.894). However, the scale was not well targeted to the sample in view of the prevalence of higher scores. This study demonstrated the internal construct validity and reliability of the BBS-PD as a measurement tool for patients with PD within the Rasch analysis framework. However, the lack of items critical to the assessment of postural control impairments typical of PD negatively affected the targeting, so that a significant percentage of patients were located in the higher ability range of the measurement continuum, where precision of measurement is reduced. These findings suggest that the BBS, even if modified, may not be an effective tool for the measurement of early postural control in patients with PD.
Onwujekwe, Obinna; Malik, El-Fatih Mohamed; Mustafa, Sara Hassan; Mnzava, Abraham
2005-12-15
In order to optimally prioritize and use public and private budgets for equitable malaria vector control, there is a need to determine the level and determinants of consumer demand for different vector control tools. The objective was to determine the demand from people of different socio-economic groups for indoor residual house-spraying (IRHS), insecticide-treated nets (ITNs), larviciding with chemicals (LWC), and space spraying/fogging (SS), and the disease control implications of the results. Ratings and levels of willingness-to-pay (WTP) for the vector control tools were determined using a random cross-sectional sample of 720 households drawn from two states. WTP was elicited using the bidding game. An asset-based socio-economic status (SES) index was used to explore whether WTP was related to the SES of the respondents. IRHS received the highest proportion of highest-preferred ratings (41.0%), followed by ITNs (23.1%). However, ITNs had the highest mean WTP, followed by IRHS, while LWC had the least. The regression analysis showed that SES was positively and statistically significantly related to WTP across the four vector control tools, and that the respondents' ratings of IRHS and ITNs significantly explained their levels of WTP for the two tools. People were willing to pay for all the vector control tools, but demand was related to the SES of the respondents. Hence, it is vital that public policies and financing mechanisms exist to ensure equitable provision and utilisation of vector control tools, as well as protecting the poor from cost-sharing arrangements.
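The bidding game used to elicit WTP is an iterative, auction-style questioning procedure. The following Python sketch shows one simple ascending variant; the actual bid ladder and monetary values used in the study are not given in the abstract, so everything below is illustrative.

```python
def bidding_game(says_yes, start_bid, step, max_rounds=6):
    """Simple ascending bidding game: keep raising the bid while the
    respondent accepts; the last accepted bid approximates WTP."""
    bid, last_yes = start_bid, 0.0
    for _ in range(max_rounds):
        if says_yes(bid):
            last_yes, bid = bid, bid + step
        else:
            break
    return last_yes

# Hypothetical respondent whose true WTP is 12 monetary units
print(bidding_game(lambda b: b <= 12.0, start_bid=5.0, step=2.5))
```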
Psychometric Analysis of the Servicemember Evaluation Tool
to assess psychological resilience. The Naval Center for Combat and Operational Stress Control developed the Servicemember Evaluation Tool (SET) to...vessels on deployment. The goals of this thesis are to evaluate the psychometric properties of the SET on this sample population. Furthermore, this
Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth
NASA Technical Reports Server (NTRS)
Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)
2001-01-01
The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Elastic-plastic fracture analysis, in particular, is a regime that is currently handled empirically for the Space Shuttle External Tank (ET), through simulated service testing of pre-cracked panels.
Two implementations of the Expert System for the Flight Analysis System (ESFAS) project
NASA Technical Reports Server (NTRS)
Wang, Lui
1988-01-01
A comparison is made between the two most sophisticated expert system building tools, the Automated Reasoning Tool (ART) and the Knowledge Engineering Environment (KEE), using the same problem domain (ESFAS). The Expert System for the Flight Analysis System (ESFAS) acts as an intelligent front end for the Flight Analysis System (FAS). FAS is a complex, configuration-controlled set of interrelated processors (FORTRAN routines) that will be used by the Mission Planning and Analysis Div. (MPAD) to design and analyze Shuttle and potential Space Station missions. Implementations of ESFAS are described. The two versions represent very different programming paradigms: ART uses rules and KEE uses objects. Owing to the tools' differing philosophies, the KEE version is implemented using a depth-first traversal algorithm, whereas the ART version uses a user-directed traversal method. Either tool could be used to solve this particular problem.
Fire in longleaf pine stand management: an economic analysis
Rodney L. Busby; Donald G. Hodges
1999-01-01
A simulation analysis of the economics of using prescribed fire as a tool in the management of longleaf pine (Pinus palustris Mill.) plantations was conducted. A management regime using frequent prescribed fire was compared to regimes involving fertilization and chemical release, chemical control, and mechanical control. Determining the...
Simple Sensitivity Analysis for Orion Guidance Navigation and Control
NASA Technical Reports Server (NTRS)
Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar
2013-01-01
The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool ("Critical Factors Tool" or CFT) developed to find the input variables or pairs of variables which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. Input variables such as moments, mass, thrust dispersions, and date of launch were found to be significant factors for the success of various requirements. Examples are shown in this paper, as well as a summary and physics discussion of the EFT-1 driving factors that the tool found.
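The abstract does not spell out the CFT's sensitivity measures, but one common way to estimate how a dispersed input influences success probability is to compare conditional success rates across quantile bins of that input. A minimal Python sketch on synthetic data (the input matrix and success rule are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 5000, 4
X = rng.normal(size=(n, p))                      # dispersed Monte Carlo inputs
# Toy requirement: success depends on variables 0 and 2 plus noise
success = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=n)) < 2.0

for j in range(p):
    lo = success[X[:, j] <= np.quantile(X[:, j], 1 / 3)].mean()
    hi = success[X[:, j] >= np.quantile(X[:, j], 2 / 3)].mean()
    print(f"var {j}: P(success|low)={lo:.3f}  P(success|high)={hi:.3f}")
```

A large gap between the conditional rates flags that variable as a driving factor worth deeper analysis.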
General MACOS Interface for Modeling and Analysis for Controlled Optical Systems
NASA Technical Reports Server (NTRS)
Sigrist, Norbert; Basinger, Scott A.; Redding, David C.
2012-01-01
The General MACOS Interface (GMI) for Modeling and Analysis for Controlled Optical Systems (MACOS) enables the use of MATLAB as a front-end for JPL's critical optical modeling package, MACOS. MACOS is JPL's in-house optical modeling software, which has proven to be a superb tool for advanced systems engineering of optical systems. GMI, coupled with MACOS, allows for seamless interfacing with modeling tools from other disciplines to make possible the integration of dynamics, structures, and thermal models with the addition of control systems for deformable optics and other actuated optics. This software package is designed as a tool for analysts to quickly and easily use MACOS without needing to be an expert at programming MACOS. The strength of MACOS is its ability to interface with various modeling/development platforms, allowing evaluation of system performance with thermal, mechanical, and optical modeling parameter variations. GMI provides an improved means for accessing selected key MACOS functionalities. The main objective of GMI is to marry the vast mathematical and graphical capabilities of MATLAB with the powerful optical analysis engine of MACOS, thereby providing a useful tool to anyone who can program in MATLAB. GMI also improves modeling efficiency by eliminating the need to write an interface function for each task/project, reducing error sources, speeding up user/modeling tasks, and making MACOS well suited for fast prototyping.
Investigation of energy management strategies for photovoltaic systems - An analysis technique
NASA Technical Reports Server (NTRS)
Cull, R. C.; Eltimsahy, A. H.
1982-01-01
Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.
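As a rough illustration of the kind of analysis described (statistical distributions of energy inputs and outputs plus device characteristics, used to compare management strategies), here is a toy Monte Carlo sketch in Python; the PV, load, and battery numbers are invented, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
hours = 24 * 365
pv = np.clip(rng.normal(3.0, 2.0, size=hours), 0.0, None)      # kW PV output
load = np.clip(rng.normal(2.5, 1.0, size=hours), 0.2, None)    # kW demand

def unmet_energy(defer_fraction):
    """Toy strategy: defer a fraction of the load whenever storage is low."""
    soc, cap = 5.0, 10.0               # battery state of charge / capacity, kWh
    unmet = 0.0
    for p, l in zip(pv, load):
        if soc < 0.2 * cap:
            l *= 1.0 - defer_fraction  # energy management action
        net = p - l                    # kWh over one hour
        if net >= 0.0:
            soc = min(cap, soc + net)
        else:
            draw = min(soc, -net)
            soc -= draw
            unmet += -net - draw       # load not served
    return unmet

for f in (0.0, 0.3):
    print(f"defer {f:.0%}: unmet load ≈ {unmet_energy(f):.0f} kWh/yr")
```

Comparing the shortfall (and, with cost data, the economics) across candidate strategies is the essence of the technique the paper describes.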
Implementing a Quantitative Analysis Design Tool for Future Generation Interfaces
2012-03-01
with Remotely Piloted Aircraft (RPA) has resulted in the need for a platform to evaluate interface design. The Vigilant Spirit Control Station (VSCS)...Spirit interface. A modified version of the HCI Index was successfully applied to perform a quantitative analysis of the baseline VSCS interface and...time of the original VSCS interface. These results revealed the effectiveness of the tool and demonstrated in the design of future generation
Integrated verification and testing system (IVTS) for HAL/S programs
NASA Technical Reports Server (NTRS)
Senn, E. H.; Ames, K. R.; Smith, K. A.
1983-01-01
The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.
A Lean Six Sigma approach to the improvement of the selenium analysis method.
Cloete, Bronwyn C; Bester, André
2012-11-02
Reliable results represent the pinnacle of quality for an analytical laboratory, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma represents a form of the scientific method: empirical, inductive and deductive, systematic, data-driven and fact-based. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, in which 11 samples were repetitively analysed and Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, representing both a management discipline and a standardised approach to problem solving and process optimisation.
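The Control phase of DMAIC typically relies on control charts built from replicate measurements such as the CRM runs described above. A minimal Python sketch of Shewhart-style 3-sigma limits, using made-up selenium recovery values:

```python
import numpy as np

# Hypothetical CRM recoveries (%) across 13 replicated runs
crm = np.array([98.2, 101.5, 97.8, 103.1, 99.4, 100.2, 96.9,
                102.4, 98.8, 100.9, 99.1, 101.8, 97.5])

mean, sd = crm.mean(), crm.std(ddof=1)
ucl, lcl = mean + 3 * sd, mean - 3 * sd   # Shewhart-style 3-sigma limits
print(f"centre {mean:.1f}, limits [{lcl:.1f}, {ucl:.1f}]")
print("out of control:", crm[(crm > ucl) | (crm < lcl)])
```

Any future run falling outside the limits signals that the method has drifted and triggers corrective action.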
Automation tools for demonstration of goal directed and self-repairing flight control systems
NASA Technical Reports Server (NTRS)
Agarwal, A. K.
1988-01-01
The coupling of expert systems with control design and analysis techniques is documented to provide a realizable self-repairing flight control system. Key features of such a flight control system are identified, and a limited set of rules for a simple aircraft model is presented.
Proceedings of the Fifth NASA/NSF/DOD Workshop on Aerospace Computational Control
NASA Technical Reports Server (NTRS)
Wette, M. (Editor); Man, G. K. (Editor)
1993-01-01
The Fifth Annual Workshop on Aerospace Computational Control was one in a series of workshops sponsored by NASA, NSF, and the DOD. The purpose of these workshops is to address computational issues in the analysis, design, and testing of flexible multibody control systems for aerospace applications. The intention in holding these workshops is to bring together users, researchers, and developers of computational tools in aerospace systems (spacecraft, space robotics, aerospace transportation vehicles, etc.) for the purpose of exchanging ideas on the state of the art in computational tools and techniques.
Large Angle Transient Dynamics (LATDYN) user's manual
NASA Technical Reports Server (NTRS)
Abrahamson, A. Louis; Chang, Che-Wei; Powell, Michael G.; Wu, Shih-Chin; Bingel, Bradford D.; Theophilos, Paula M.
1991-01-01
A computer code for modeling the large angle transient dynamics (LATDYN) of structures was developed to investigate techniques for analyzing flexible deformation and control/structure interaction problems associated with large angular motions of spacecraft. This type of analysis is beyond the routine capability of conventional analytical tools without simplifying assumptions. In some instances, the motion may be sufficiently slow and the spacecraft (or component) sufficiently rigid to simplify analyses of dynamics and controls by making pseudo-static and/or rigid body assumptions. The LATDYN introduces a new approach to the problem by combining finite element structural analysis, multi-body dynamics, and control system analysis in a single tool. It includes a type of finite element that can deform and rotate through large angles at the same time, and which can be connected to other finite elements either rigidly or through mechanical joints. The LATDYN also provides symbolic capabilities for modeling control systems which are interfaced directly with the finite element structural model. Thus, the nonlinear equations representing the structural model are integrated along with the equations representing sensors, processing, and controls as a coupled system.
The Value of Information in Decision-Analytic Modeling for Malaria Vector Control in East Africa.
Kim, Dohyeong; Brown, Zachary; Anderson, Richard; Mutero, Clifford; Miranda, Marie Lynn; Wiener, Jonathan; Kramer, Randall
2017-02-01
Decision analysis tools and mathematical modeling are increasingly emphasized in malaria control programs worldwide to improve resource allocation and address ongoing challenges with sustainability. However, such tools require substantial scientific evidence, which is costly to acquire. The value of information (VOI) has been proposed as a metric for gauging the value of reduced model uncertainty. We apply this concept to an evidence-based Malaria Decision Analysis Support Tool (MDAST) designed for application in East Africa. In developing MDAST, substantial gaps in the scientific evidence base were identified regarding insecticide resistance in malaria vector control and the effectiveness of alternative mosquito control approaches, including larviciding. We identify four entomological parameters in the model (two for insecticide resistance and two for larviciding) that involve high levels of uncertainty and to which outputs in MDAST are sensitive. We estimate and compare a VOI for combinations of these parameters in evaluating three policy alternatives relative to a status quo policy. We find that having perfect information on the uncertain parameters could improve program net benefits by 5-21%, with the highest VOI associated with jointly eliminating uncertainty about the reproductive speed of malaria-transmitting mosquitoes and the initial efficacy of larviciding at reducing the emergence of new adult mosquitoes. Future research on parameter uncertainty in decision analysis of malaria control policy should investigate the VOI with respect to other aspects of malaria transmission (such as antimalarial resistance), the costs of reducing uncertainty in these parameters, and the extent to which imperfect information about these parameters can improve payoffs. © 2016 Society for Risk Analysis.
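The standard VOI quantity for perfect information is the expected value of perfect information, EVPI = E_theta[max_a NB(a, theta)] - max_a E_theta[NB(a, theta)]. The following Python sketch computes it by Monte Carlo for a toy two-policy problem; the prior and payoffs are invented, not MDAST's.

```python
import numpy as np

rng = np.random.default_rng(7)
theta = rng.normal(0.6, 0.2, 100_000)   # uncertain efficacy parameter (hypothetical prior)

def net_benefit(policy, eff):
    # Toy payoffs: policy B pays off only when efficacy is high
    return {"A": 10.0 + 0.0 * eff, "B": 4.0 + 12.0 * eff}[policy]

expected = {p: net_benefit(p, theta).mean() for p in ("A", "B")}
best_without_info = max(expected.values())
best_with_info = np.maximum(net_benefit("A", theta),
                            net_benefit("B", theta)).mean()
print(f"EVPI ≈ {best_with_info - best_without_info:.3f} (units of net benefit)")
```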
NASA Astrophysics Data System (ADS)
Filippov, A. V.; Tarasov, S. Yu; Podgornyh, O. A.; Shamarin, N. N.; Filippova, E. O.
2017-01-01
Automation of engineering processes requires the development of relevant mathematical support and computer software. Analysis of metal cutting kinematics and tool geometry is a necessary key task at the preproduction stage. This paper focuses on developing a procedure for determining the tool geometry in oblique lathe machining with a peakless round-nose tool, using vector/matrix transformations. Such an approach allows integration into modern mathematical software packages, unlike the traditional analytic description, and is therefore very promising for developing automated control of the preproduction process. A kinematic criterion for applicable tool geometry has been developed from the results of this study. The effect of tool blade inclination and curvature on the geometry-dependent process parameters was evaluated.
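The vector/matrix approach mentioned here amounts to composing rotation matrices for the tool's orientation angles and transforming reference directions between the tool and machine frames. A minimal Python sketch (not the authors' exact formulation; the angles are hypothetical):

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

lam = np.radians(8.0)        # cutting-edge inclination angle (hypothetical)
gamma = np.radians(-5.0)     # rake angle (hypothetical)
R = rot_z(lam) @ rot_x(gamma)                 # composed tool-orientation transform
edge_normal = R @ np.array([0.0, 0.0, 1.0])   # tool reference normal in machine frame
print(np.round(edge_normal, 4))
```

Because the whole construction is matrix algebra, it drops directly into numerical software packages, which is the integration advantage the paper emphasizes.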
Rigidity controllable polishing tool based on magnetorheological effect
NASA Astrophysics Data System (ADS)
Wang, Jia; Wan, Yongjian; Shi, Chunyan
2012-10-01
A stable and predictable material removal function (MRF) plays a crucial role in computer controlled optical surfacing (CCOS). For physical-contact polishing, the stability of the MRF depends on intimate contact between the polishing interface and the workpiece. Rigid laps maintain this contact when polishing spherical surfaces, whose curvature does not vary with position on the surface. Such rigid laps provide a smoothing effect for mid-spatial-frequency errors, but cannot be used on aspherical surfaces because they would destroy the surface figure. Flexible tools such as magnetorheological fluid or air bonnets conform to the surface [1], but they lack rigidity and provide little natural smoothing effect. We present a rigidity-controllable polishing tool that uses a magnetorheological elastomer (MRE) medium [2]. It provides the ability both to conform to an aspheric surface and to maintain a natural smoothing effect; moreover, its rigidity can be controlled by the applied magnetic field. This paper presents the design, analysis, and stiffness-variation mechanism model of such a polishing tool [3].
OISI dynamic end-to-end modeling tool
NASA Astrophysics Data System (ADS)
Kersten, Michael; Weidler, Alexander; Wilhelm, Rainer; Johann, Ulrich A.; Szerdahelyi, Laszlo
2000-07-01
The OISI dynamic end-to-end modeling tool is tailored to end-to-end modeling and dynamic simulation of Earth- and space-based actively controlled optical instruments such as optical stellar interferometers. 'End-to-end modeling' denotes that the overall model comprises not only optical sub-models but also structural, sensor, actuator, controller and disturbance sub-models that influence the optical transmission, so that system-level instrument performance under disturbances and active optics can be simulated. This tool has been developed to support performance analysis and prediction as well as control loop design and fine-tuning for OISI, Germany's preparatory program for optical/infrared spaceborne interferometry initiated in 1994 by Dornier Satellitensysteme GmbH in Friedrichshafen.
System data communication structures for active-control transport aircraft, volume 1
NASA Technical Reports Server (NTRS)
Hopkins, A. L.; Martin, J. H.; Brock, L. D.; Jansson, D. G.; Serben, S.; Smith, T. B.; Hanley, L. D.
1981-01-01
Candidate data communication techniques are identified, including dedicated links, local buses, broadcast buses, multiplex buses, and mesh networks. The design methodology for mesh networks is then discussed, including network topology and node architecture. Several concepts of power distribution are reviewed, including current limiting and mesh networks for power. The technology issues of packaging, transmission media, and lightning are addressed, and, finally, the analysis tools developed to aid in the communication design process are described. There are special tools to analyze the reliability and connectivity of networks and more general reliability analysis tools for all types of systems.
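One of the network analysis tasks mentioned, connectivity reliability, is straightforward to estimate by Monte Carlo: sample which links are up, then test reachability. A small Python sketch on a hypothetical five-node mesh (the topology and link reliability are invented):

```python
import random
from collections import deque

random.seed(0)
links = [(0, 1), (1, 2), (2, 3), (0, 4), (4, 3), (1, 4), (2, 4)]  # toy mesh

def connected(up_links, src=0, dst=3):
    """Breadth-first search over the surviving links."""
    adj = {}
    for a, b in up_links:
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nxt in adj.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

p, trials = 0.95, 20_000   # per-link availability and sample count
ok = sum(connected([l for l in links if random.random() < p]) for _ in range(trials))
print(f"P(src-dst connected) ≈ {ok / trials:.4f}")
```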
Simulation-Based Analysis of Reentry Dynamics for the Sharp Atmospheric Entry Vehicle
NASA Technical Reports Server (NTRS)
Tillier, Clemens Emmanuel
1998-01-01
This thesis describes the analysis of the reentry dynamics of a high-performance lifting atmospheric entry vehicle through numerical simulation tools. The vehicle, named SHARP, is currently being developed by the Thermal Protection Materials and Systems branch of NASA Ames Research Center, Moffett Field, California. The goal of this project is to provide insight into trajectory tradeoffs and vehicle dynamics using simulation tools that are powerful, flexible, user-friendly and inexpensive. Implemented using MATLAB and SIMULINK, these tools are developed with an eye towards further use in the conceptual design of the SHARP vehicle's trajectory and flight control systems. A trajectory simulator is used to quantify the entry capabilities of the vehicle subject to various operational constraints. Using an aerodynamic database computed by NASA and a model of the Earth, the simulator generates the vehicle trajectory in three-dimensional space based on aerodynamic angle inputs. Requirements for entry along the SHARP aerothermal performance constraint are evaluated for different control strategies. The effect of vehicle mass on entry parameters is investigated, and the cross-range capability of the vehicle is evaluated. Trajectory results are presented and interpreted. A six-degree-of-freedom simulator builds on the trajectory simulator and provides attitude simulation for future entry controls development. A Newtonian aerodynamic model including control surfaces and a mass model are developed. A visualization tool for interpreting simulation results is described. Control surfaces are roughly sized. A simple controller is developed to fly the vehicle along its aerothermal performance constraint using aerodynamic flaps for control. This end-to-end demonstration proves the suitability of the 6-DOF simulator for future flight control system development. Finally, issues surrounding real-time simulation with hardware in the loop are discussed.
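A point-mass entry trajectory simulator of the kind described integrates the standard planar entry equations for altitude, speed, and flight-path angle. The Python sketch below uses forward Euler and invented vehicle numbers, so it only illustrates the structure of such a simulator, not SHARP's actual aerodynamics:

```python
import numpy as np

mu, Re = 3.986e14, 6.378e6             # Earth GM and radius (SI units)
m, S, Cd, Cl = 900.0, 3.0, 1.0, 1.5    # hypothetical mass, area, coefficients

def deriv(state):
    h, v, gamma = state                # altitude (m), speed (m/s), flight-path angle (rad)
    r = Re + h
    rho = 1.225 * np.exp(-h / 7200.0)  # simple exponential atmosphere
    q = 0.5 * rho * v * v
    g = mu / r**2
    hdot = v * np.sin(gamma)
    vdot = -q * S * Cd / m - g * np.sin(gamma)
    gdot = q * S * Cl / (m * v) + (v / r - g / v) * np.cos(gamma)
    return np.array([hdot, vdot, gdot])

state, dt = np.array([80_000.0, 7000.0, np.radians(-1.5)]), 0.1
for _ in range(60_000):
    state = state + dt * deriv(state)   # forward Euler, adequate for a sketch
    if state[0] < 25_000.0 or state[1] < 1_000.0:
        break                           # stop before the model leaves its validity range
print(f"end: h = {state[0] / 1000:.1f} km, v = {state[1]:.0f} m/s")
```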
A Graphics System for Pole-Zero Map Analysis.
ERIC Educational Resources Information Center
Beyer, William Fred, III
Computer scientists have developed an interactive, graphical display system for pole-zero map analysis. They designed it for use as an educational tool in teaching introductory courses in automatic control systems. The facilities allow the user to specify a control system and an input function in the form of a pole-zero map and then examine the…
NASA Technical Reports Server (NTRS)
Gupta, Pramod; Schumann, Johann
2004-01-01
High reliability of mission- and safety-critical software systems has been identified by NASA as a high-priority technology challenge. We present an approach for the performance analysis of a neural network (NN) in an advanced adaptive control system. This problem is important in the context of safety-critical applications that require certification, such as flight software in aircraft. We have developed a tool to measure the performance of the NN during operation by calculating a confidence interval (error bar) around the NN's output. Our tool can be used during pre-deployment verification as well as for monitoring the network performance during operation. The tool has been implemented in Simulink, and simulation results on an F-15 aircraft are presented.
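The paper's specific confidence-interval method is not given in the abstract; one generic way to produce an error bar around a learned model's output is the spread of an ensemble of independently trained models. A Python sketch with a polynomial fit standing in for the NN (all data synthetic):

```python
import numpy as np

def fit_predict(x_train, y_train, x_query, seed):
    """Stand-in for one trained network: a cubic fit on resampled noisy data."""
    rng = np.random.default_rng(seed)
    y_noisy = y_train + rng.normal(0.0, 0.05, y_train.size)
    return np.polyval(np.polyfit(x_train, y_noisy, 3), x_query)

x_train = np.linspace(-1.0, 1.0, 40)
y_train = np.sin(2.0 * x_train)
x_query = np.linspace(-1.0, 1.0, 5)

preds = np.array([fit_predict(x_train, y_train, x_query, s) for s in range(20)])
mean = preds.mean(axis=0)
half = 1.96 * preds.std(axis=0, ddof=1)   # ~95% error bar from ensemble spread
for x, m, h in zip(x_query, mean, half):
    print(f"x={x:+.2f}: y ≈ {m:+.3f} ± {h:.3f}")
```

A monitoring tool can then flag operating points where the error bar grows beyond a certified threshold.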
Interpolity exchange of basalt tools facilitated via elite control in Hawaiian archaic states
Kirch, Patrick V.; Mills, Peter R.; Lundblad, Steven P.; Sinton, John; Kahn, Jennifer G.
2012-01-01
Ethnohistoric accounts of late precontact Hawaiian archaic states emphasize the independence of chiefly controlled territories (ahupua‘a) based on an agricultural, staple economy. However, elite control of unevenly distributed resources, such as high-quality volcanic rock for adze production, may have provided an alternative source of economic power. To test this hypothesis we used nondestructive energy-dispersive X-ray fluorescence (ED-XRF) analysis of 328 lithic artifacts from 36 archaeological features in the Kahikinui district, Maui Island, to geochemically characterize the source groups. This process was followed by a limited sampling using destructive wavelength-dispersive X-ray fluorescence (WD-XRF) analysis to more precisely characterize certain nonlocal source groups. Seventeen geochemical groups were defined, eight of which represent extra-Maui Island sources. Although the majority of stone tools were derived from Maui Island sources (71%), a significant quantity (27%) of tools derived from extraisland sources, including the large Mauna Kea quarry on Hawai‘i Island as well as quarries on O‘ahu, Moloka‘i, and Lāna‘i islands. Importantly, tools quarried from extralocal sources are found in the highest frequency in elite residential features and in ritual contexts. These results suggest a significant role for a wealth economy based on the control and distribution of nonagricultural goods and resources during the rise of the Hawaiian archaic states. PMID:22203984
A lumped parameter mathematical model for simulation of subsonic wind tunnels
NASA Technical Reports Server (NTRS)
Krosel, S. M.; Cole, G. L.; Bruton, W. M.; Szuch, J. R.
1986-01-01
Equations for a lumped parameter mathematical model of a subsonic wind tunnel circuit are presented. The equation state variables are internal energy, density, and mass flow rate. The circuit model is structured to allow for integration and analysis of tunnel subsystem models which provide functions such as control of altitude pressure and temperature. Thus the model provides a useful tool for investigating the transient behavior of the tunnel and control requirements. The model was applied to the proposed NASA Lewis Altitude Wind Tunnel (AWT) circuit and included transfer function representations of the tunnel supply/exhaust air and refrigeration subsystems. Both steady state and frequency response data are presented for the circuit model indicating the type of results and accuracy that can be expected from the model. Transient data for closed loop control of the tunnel and its subsystems are also presented, demonstrating the model's use as a control analysis tool.
Passivity and Dissipativity as Design and Analysis Tools for Networked Control Systems
ERIC Educational Resources Information Center
Yu, Han
2012-01-01
In this dissertation, several control problems are studied that arise when passive or dissipative systems are interconnected and controlled over a communication network. Since communication networks can impact the systems' stability and performance, there is a need to extend the results on control of passive or dissipative systems to networked…
Interdisciplinary analysis procedures in the modeling and control of large space-based structures
NASA Technical Reports Server (NTRS)
Cooper, Paul A.; Stockwell, Alan E.; Kim, Zeen C.
1987-01-01
The paper describes a computer software system called the Integrated Multidisciplinary Analysis Tool, IMAT, that has been developed at NASA Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven interactive executive program, IMAT links a relational database to commercial structural and controls analysis codes. The paper describes the procedures followed to analyze a complex satellite structure and control system. The codes used to accomplish the analysis are described, and an example is provided of an application of IMAT to the analysis of a reference space station subject to a rectangular pulse loading at its docking port.
MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.
Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M
2002-05-30
Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of the surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross-compatibility between them is limited. In this report we introduce a free open source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data based on the common data analysis environment MATLAB (version 5.3-6.1, The MathWorks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by additional instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or time windows triggered to some event.
Practical applications of surface analytic tools in tribology
NASA Technical Reports Server (NTRS)
Ferrante, J.
1980-01-01
Many of the currently available, widely used tools for surface analysis are described. Those which have the highest applicability for elemental and/or compound analysis in problems of interest in tribology, and which are truly surface sensitive (that is, sampling less than 10 atomic layers), are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.
NASA Astrophysics Data System (ADS)
Ma, Zhichao; Hu, Leilei; Zhao, Hongwei; Wu, Boda; Peng, Zhenxing; Zhou, Xiaoqin; Zhang, Hongguo; Zhu, Shuai; Xing, Lifeng; Hu, Huang
2010-08-01
Theories and techniques for improving machining accuracy via position control of the diamond tool tip, and for raising the resolution of cutting depth on precision CNC lathes, have attracted intense attention. A new piezo-driven ultra-precision machine tool servo system is designed and tested to improve the manufacturing accuracy of workpieces. The mathematical model of the machine tool servo system is established, and finite element analysis is carried out on the parallel-plate flexure hinges. The output position of the diamond tool tip driven by the machine tool servo system is measured via a contact capacitive displacement sensor. Proportional-integral-derivative (PID) feedback is implemented to accommodate and compensate for dynamic changes owing to cutting forces, as well as the inherent nonlinearity of the piezoelectric stack during the cutting process. With this closed-loop feedback control strategy, the tracking error is limited to 0.8 μm. Experimental results show that the proposed machine tool servo system provides a tool positioning resolution of 12 nm, much finer than the inherent CNC resolution. A stepped shaft of an aluminum specimen with a step increment of cutting depth of 1 μm was machined, and the obtained contour shows that the displacement commands output by the controller are accurately reflected on the machined part in real time.
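The PID compensation described can be sketched in a few lines. Below, a discrete PID loop drives a toy first-order model of the piezo stage toward a 1 µm step; the gains and plant time constant are hypothetical, not the paper's values.

```python
class PID:
    """Discrete PID controller (textbook form) for tool-tip position tracking."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

dt = 1e-3
pid = PID(kp=0.8, ki=4.0, kd=0.001, dt=dt)      # hypothetical gains
pos, tau = 0.0, 0.02                            # stage modelled as a 20 ms first-order lag
for _ in range(3000):                           # 3 s of simulated time
    u = pid.update(setpoint=1.0, measured=pos)  # target: 1 µm step
    pos += dt * (u - pos) / tau                 # first-order plant response
print(f"position after 3 s ≈ {pos:.4f} µm")
```

In the real system the integral term is what removes the steady-state offset caused by cutting forces and piezo hysteresis.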
Development and Integration of Control System Models
NASA Technical Reports Server (NTRS)
Kim, Young K.
1998-01-01
The computer simulation tool, TREETOPS, has been upgraded and used at NASA/MSFC to model various complicated mechanical systems and to perform their dynamics and control analysis with pointing control systems. A TREETOPS model of the Advanced X-ray Astrophysics Facility - Imaging (AXAF-I) dynamics and control system was developed to evaluate the AXAF-I pointing performance in Normal Pointing Mode. An optical model of the Shooting Star Experiment (SSE) was also developed, and its optical performance analysis was done using the MACOS software.
Cornwell, MacIntosh; Vangala, Mahesh; Taing, Len; Herbert, Zachary; Köster, Johannes; Li, Bo; Sun, Hanfei; Li, Taiwen; Zhang, Jian; Qiu, Xintao; Pun, Matthew; Jeselsohn, Rinath; Brown, Myles; Liu, X Shirley; Long, Henry W
2018-04-12
RNA sequencing has become a ubiquitous technology used throughout life sciences as an effective method of measuring RNA abundance quantitatively in tissues and cells. The increase in use of RNA-seq technology has led to the continuous development of new tools for every step of analysis, from alignment to downstream pathway analysis. However, effectively using these analysis tools in a scalable and reproducible way can be challenging, especially for non-experts. Using the workflow management system Snakemake, we have developed a user-friendly, fast, efficient, and comprehensive pipeline for RNA-seq analysis. VIPER (Visualization Pipeline for RNA-seq analysis) is an analysis workflow that combines some of the most popular tools to take RNA-seq analysis from raw sequencing data, through alignment and quality control, into downstream differential expression and pathway analysis. VIPER has been created in a modular fashion to allow for the rapid incorporation of new tools to expand its capabilities. This capacity has already been exploited to include very recently developed tools that explore immune infiltrate and T-cell CDR (Complementarity-Determining Region) reconstruction abilities. The pipeline has been conveniently packaged such that minimal computational skills are required to download and install the dozens of software packages that VIPER uses. VIPER is a comprehensive solution that performs most standard RNA-seq analyses quickly and effectively, with a built-in capacity for customization and expansion.
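The core idea a workflow manager like Snakemake contributes is dependency-driven execution: each output declares its inputs, and steps run only when their products are missing or stale. A toy Python analogue of that model (not VIPER's actual code; the step names are placeholders):

```python
# Toy dependency-driven workflow runner, in the spirit of Snakemake.
def align(inputs, output):
    print(f"aligning {inputs} -> {output}")

def count(inputs, output):
    print(f"counting {inputs} -> {output}")

rules = {
    "sample1.bam":    (["sample1.fastq"], align),
    "sample1.counts": (["sample1.bam"], count),
}

done = {"sample1.fastq"}          # raw data already present

def build(target):
    if target in done:
        return
    inputs, action = rules[target]
    for dep in inputs:            # recursively satisfy dependencies first
        build(dep)
    action(inputs, target)
    done.add(target)

build("sample1.counts")
```

Requesting the final target pulls the whole chain into existence in the right order, which is what makes such pipelines reproducible and restartable.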
Development of the ECLSS Sizing Analysis Tool and ARS Mass Balance Model Using Microsoft Excel
NASA Technical Reports Server (NTRS)
McGlothlin, E. P.; Yeh, H. Y.; Lin, C. H.
1999-01-01
The development of a Microsoft Excel-compatible Environmental Control and Life Support System (ECLSS) sizing analysis "tool" for conceptual design of Mars human exploration missions makes it possible for a user to choose a certain technology in the corresponding subsystem. This tool estimates the mass, volume, and power requirements of every technology in a subsystem and the system as a whole. Furthermore, to verify that a design sized by the ECLSS Sizing Tool meets the mission requirements and integrates properly, mass balance models that solve for component throughputs of such ECLSS systems as the Water Recovery System (WRS) and Air Revitalization System (ARS) must be developed. The ARS Mass Balance Model will be discussed in this paper.
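The sizing roll-up such a tool performs can be illustrated simply: each selected technology contributes mass, volume, and power, which are summed per subsystem and for the system. A Python sketch with placeholder numbers (not NASA data):

```python
# Each technology option maps to (mass kg, volume m^3, power kW); values invented.
technologies = {
    "CO2 removal":    {"4-bed molecular sieve": (200.0, 0.9, 0.6)},
    "O2 generation":  {"water electrolysis":    (150.0, 0.5, 1.2)},
    "water recovery": {"multifiltration":       (120.0, 0.7, 0.3)},
}

selection = {
    "CO2 removal": "4-bed molecular sieve",
    "O2 generation": "water electrolysis",
    "water recovery": "multifiltration",
}

totals = [0.0, 0.0, 0.0]
for subsystem, choice in selection.items():
    for i, value in enumerate(technologies[subsystem][choice]):
        totals[i] += value
print(f"system: mass {totals[0]:.0f} kg, volume {totals[1]:.1f} m^3, power {totals[2]:.1f} kW")
```

A mass balance model such as the ARS model then checks that the component throughputs implied by these selections actually close.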
Terminal - Tactical Separation Assured Flight Environment (T-TSafe)
NASA Technical Reports Server (NTRS)
Verma, Savita Arora; Tang, Huabin; Ballinger, Debbi
2011-01-01
The Tactical Separation Assured Flight Environment (TSAFE) has previously been tested as a conflict detection and resolution tool in the en-route phase of flight. Fast-time simulations of a terminal version of this tool, called Terminal TSAFE (T-TSAFE), have shown promise over current conflict detection tools: it has been shown to have fewer false alerts (as low as 2 per hour) and better conflict-time prediction than Conflict Alert. The tool will be tested in the simulated terminal area of Los Angeles International Airport, in a human-in-the-loop experiment, to identify controller procedures and information requirements. The simulation will include comparisons of T-TSAFE with NASA's version of Conflict Alert. Other variables, such as altitude entry by the controller, which improve T-TSAFE's conflict detection predictions, will also be tested. T-TSAFE integrates features of current conflict detection tools such as the Automated Terminal Proximity Alert, used to alleviate compression errors in the final approach phase. Based on fast-time simulation analysis, the anticipated benefits of T-TSAFE over Conflict Alert include reduced false/missed alerts and increased time to predicted loss of separation. Other metrics that will be used to evaluate the tool's impact on the controller include controller intervention, workload, and situation awareness.
Computer Instructional Aids for Undergraduate Control Education. 1978 Edition.
ERIC Educational Resources Information Center
Volz, Richard A.; And Others
This work represents the development of computer tools for undergraduate students. Emphasis is on automatic control theory using hybrid and digital computation. The routine calculations of control system analysis are presented as students would use them on the University of Michigan's central digital computer and the time-shared graphic terminals…
42: An Open-Source Simulation Tool for Study and Design of Spacecraft Attitude Control Systems
NASA Technical Reports Server (NTRS)
Stoneking, Eric
2018-01-01
Simulation is an important tool in the analysis and design of spacecraft attitude control systems. The speaker will discuss the simulation tool, called simply 42, that he has developed over the years to support his own work as an engineer in the Attitude Control Systems Engineering Branch at NASA Goddard Space Flight Center. 42 was intended from the outset to be high-fidelity and powerful, but also fast and easy to use. 42 is publicly available as open source since 2014. The speaker will describe some of 42's models and features, and discuss its applicability to studies ranging from early concept studies through the design cycle, integration, and operations. He will outline 42's architecture and share some thoughts on simulation development as a long-term project.
A Study on Predictive Analytics Application to Ship Machinery Maintenance
2013-09-01
Looking at the nature of the time series forecasting method , it would be better applied to offline analysis . The application for real- time online...other system attributes in future. Two techniques of statistical analysis , mainly time series models and cumulative sum control charts, are discussed in...statistical tool employed for the two techniques of statistical analysis . Both time series forecasting as well as CUSUM control charts are shown to be
NASA Astrophysics Data System (ADS)
Winiwarter, Susanne; Middleton, Brian; Jones, Barry; Courtney, Paul; Lindmark, Bo; Page, Ken M.; Clark, Alan; Landqvist, Claire
2015-09-01
We demonstrate here a novel use of statistical tools to study intra- and inter-site assay variability of five early drug metabolism and pharmacokinetics in vitro assays over time. Firstly, a tool for process control is presented. It shows the overall assay variability, allows changes due to assay adjustments to be followed, and can additionally highlight other, potentially unexpected variations. Secondly, we define the minimum discriminatory difference/ratio to help projects understand how experimental values measured at different sites at a given time can be compared. Such discriminatory values are calculated for 3-month periods and followed over time for each assay. Again, assay modifications, especially assay harmonization efforts, can be noted. Both the process control tool and the variability estimates are based on the results of control compounds tested every time an assay is run. Variability estimates for a limited set of project compounds were computed as well and found to be comparable. This analysis reinforces the need to consider assay variability in decision making, compound ranking and in silico modeling.
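One common formulation of a minimum discriminatory ratio (the paper defines its own precisely) treats assay results as log-normal: two single measurements differ at roughly 95% confidence when their ratio exceeds exp(1.96 · sqrt(2) · s), where s is the standard deviation of log-scale control results. A Python sketch with invented control-compound data:

```python
import numpy as np

# Hypothetical repeated control-compound results (e.g., intrinsic clearance)
control = np.array([12.1, 9.8, 11.4, 10.6, 13.0, 9.5, 11.9, 10.2, 12.6, 10.9])

s_log = np.std(np.log(control), ddof=1)   # assay SD on the log scale
# Difference of two independent log measurements has SD sqrt(2)*s_log
mdr = np.exp(1.96 * np.sqrt(2) * s_log)   # minimum discriminatory ratio, ~95%
print(f"two compounds need a ratio > {mdr:.2f} to be reliably different")
```

Tracking this quantity per assay over 3-month windows, as the paper does, shows directly how harmonization efforts tighten the ranking resolution available to projects.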
Supporting Scientific Analysis within Collaborative Problem Solving Environments
NASA Technical Reports Server (NTRS)
Watson, Velvin R.; Kwak, Dochan (Technical Monitor)
2000-01-01
Collaborative problem solving environments for scientists should contain the analysis tools the scientists require in addition to the remote collaboration tools used for general communication. Unfortunately, most scientific analysis tools have been designed for a "stand-alone mode" and cannot be easily modified to work well in a collaborative environment. This paper addresses the questions, "What features are desired in a scientific analysis tool contained within a collaborative environment?", "What are the tool design criteria needed to provide these features?", and "What support is required from the architecture to support these design criteria?" First, the features of scientific analysis tools that are important for effective analysis in collaborative environments are listed. Next, several design criteria for developing analysis tools that will provide these features are presented. Then requirements for the architecture to support these design criteria are listed. Some proposed architectures for collaborative problem solving environments are reviewed and their capabilities to support the specified design criteria are discussed. A deficiency in the most popular architecture for remote application sharing, the ITU T.120 architecture, prevents it from supporting highly interactive, dynamic, high resolution graphics. To illustrate that the specified design criteria can provide a highly effective analysis tool within a collaborative problem solving environment, a scientific analysis tool that contains the specified design criteria has been integrated into a collaborative environment and tested for effectiveness. The tests were conducted in collaborations between remote sites in the US and between remote sites on different continents. The tests showed that the tool (a tool for the visual analysis of computer simulations of physics) was highly effective for both synchronous and asynchronous collaborative analyses. The important features provided by the tool (and made possible by the specified design criteria) are: 1. The tool provides highly interactive, dynamic, high resolution, 3D graphics. 2. All remote scientists can view the same dynamic, high resolution, 3D scenes of the analysis as the analysis is being conducted. 3. The responsiveness of the tool is nearly identical to the responsiveness of the tool in a stand-alone mode. 4. The scientists can transfer control of the analysis between themselves. 5. Any analysis session or segment of an analysis session, whether done individually or collaboratively, can be recorded and posted on the Web for other scientists or students to download and play in either a collaborative or individual mode. 6. The scientist or student who downloaded the session can, individually or collaboratively, modify or extend the session with his/her own "what if" analysis of the data and post his/her version of the analysis back onto the Web. 7. The peak network bandwidth used in the collaborative sessions is only 1K bit/second even though the scientists at all sites are viewing high resolution (1280 x 1024 pixels), dynamic, 3D scenes of the analysis. The links between the specified design criteria and these performance features are presented.
2013-09-01
attacker can acquire and use against a wireless infrastructure. Wireless attack tool kits such as the "Raspberry-PI" (shown in Figure 10), and...still use a tool such as the Raspberry-PI to perform attacks against a network from outside the controlled area or even inside the controlled area when considering an insider attack. Figure 10 (from www.howtodocomputing.blogspot.com, n.d.): Wireless-PI is "a collection of pre-configured
1983-09-01
Reyer's research... of "scrounging" from another worker's tool box. As a consequence, control and accounting procedures... This thesis analyzes those attitudes by evaluating the collected data from the AFLMC questionnaire and this research team's telephone interviews.
Critical Consciousness: A Critique and Critical Analysis of the Literature
ERIC Educational Resources Information Center
Jemal, Alexis
2017-01-01
The education system has been heralded as a tool of liberation and simultaneously critiqued as a tool of social control to maintain the oppressive status quo. Critical consciousness (CC), developed by the Brazilian educator, Paulo Freire, advanced an educational pedagogy to liberate the masses from systemic inequity maintained and perpetuated by…
A Cross-National CAI Tool To Support Learning Operations Decision-Making and Market Analysis.
ERIC Educational Resources Information Center
Mockler, Robert J.; Afanasiev, Mikhail Y.; Dologite, Dorothy G.
1999-01-01
Describes bicultural (United States and Russia) development of a computer-aided instruction (CAI) tool to learn management decision-making using information systems technologies. The program has been used with undergraduate and graduate students in both countries; it integrates free and controlled market concepts and combines traditional computer…
Application of spatial technology in malaria research & control: some new insights.
Saxena, Rekha; Nagpal, B N; Srivastava, Aruna; Gupta, S K; Dash, A P
2009-08-01
Geographical Information System (GIS) has emerged as the core of spatial technology, integrating a wide range of datasets available from different sources including Remote Sensing (RS) and the Global Positioning System (GPS). Literature published during the decade 1998-2007 has been compiled and grouped into six categories according to the usage of the technology in malaria epidemiology. Different GIS modules, such as spatial data sources, mapping and geo-processing tools, distance calculation, digital elevation model (DEM), buffer zone, and geo-statistical analysis, have been investigated in detail and illustrated with examples based on the derived results. These GIS tools have contributed immensely to understanding the epidemiological processes of malaria, and the examples drawn show that GIS is now widely used for research and decision making in malaria control. Statistical data analysis is currently the most consistent and established set of tools for analyzing spatial datasets. The desired future development of GIS lies in the utilization of geo-statistical tools which, combined with high-quality data, have the capability to provide new insight into malaria epidemiology and the complexity of its transmission potential in endemic areas.
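Buffer-zone and distance operations of the kind catalogued here are a few lines in modern GIS libraries. A Python sketch using shapely, with hypothetical projected coordinates (metres); the village and breeding-site locations are invented:

```python
from shapely.geometry import Point

village = Point(5000.0, 4200.0)
breeding_sites = [Point(5600.0, 4900.0), Point(9000.0, 1000.0), Point(5150.0, 4350.0)]

risk_zone = village.buffer(2000.0)   # 2 km buffer zone around the village
for i, site in enumerate(breeding_sites):
    d = village.distance(site)
    flag = "inside" if risk_zone.contains(site) else "outside"
    print(f"site {i}: {d:.0f} m away, {flag} the 2 km buffer")
```

Overlaying such buffers on case data is the basic geo-processing step behind many of the vector-control applications the review describes.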
Fast 3D Net Expeditions: Tools for Effective Scientific Collaboration on the World Wide Web
NASA Technical Reports Server (NTRS)
Watson, Val; Chancellor, Marisa K. (Technical Monitor)
1996-01-01
Two new technologies, the FASTexpedition and Remote FAST, have been developed that provide remote, 3D (three dimensional), high resolution, dynamic, interactive viewing of scientific data. The FASTexpedition permits one to access scientific data from the World Wide Web, take guided expeditions through the data, and continue with self-controlled expeditions through the data. Remote FAST permits collaborators at remote sites to simultaneously view an analysis of scientific data being controlled by one of the collaborators. Control can be transferred between sites. These technologies are now being used for remote collaboration in joint university, industry, and NASA projects. Also, NASA Ames Research Center has initiated a project to make scientific data and guided expeditions through the data available as FASTexpeditions on the World Wide Web for educational purposes. Previously, remote visualization of dynamic data was done using video format (transmitting pixel information) such as video conferencing or MPEG (Motion Picture Expert Group) movies on the Internet. The concept for this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool that is currently used is FAST (Flow Analysis Software Toolkit). The advantages of this new technology over using video format are: (1) The visual is much higher in resolution (1280x1024 pixels with 24 bits of color) than typical video format transmitted over the network. (2) The form of the visualization can be controlled interactively (because the viewer is interactively controlling the visualization tool running on his workstation). (3) A rich variety of guided expeditions through the data can be included easily. (4) A capability is provided for other sites to see a visual analysis of one site as the analysis is interactively performed. Control of the analysis can be passed from site to site. (5) The scenes can be viewed in 3D using stereo vision. (6) The network bandwidth for the visualization using this new technology is much smaller than when using video format. (The measured peak bandwidth used was 1 Kbit/sec whereas the measured bandwidth for a small video picture was 500 Kbits/sec.) This talk will illustrate the use of these new technologies and present a proposal for using these technologies to improve science education.
NASA Technical Reports Server (NTRS)
Bentz, Karl F.; Coleman, Robert D.; Dubnik, Kathy; Marshall, William S.; Mcentee, Amy; Na, Sae H.; Patton, Scott G.; West, Michael C.
1987-01-01
Tools useful for operations and maintenance tasks on the lunar surface were determined and designed. Primary constraints are the lunar environment, the astronaut's space suit and the strength limits of the astronaut on the moon. A multipurpose rotary motion tool and a collapsible tool carrier were designed. For the rotary tool, a brushless motor and controls were specified, a material for the housing was chosen, bearings and lubrication were recommended and a planetary reduction gear attachment was designed. The tool carrier was designed primarily for ease of access to the tools and fasteners. A material was selected and structural analysis was performed on the carrier. Recommendations were made about the limitations of human performance and about possible attachments to the torque driver.
NASA Astrophysics Data System (ADS)
Pingel, N.; Liang, Y.; Bindra, A.
2016-12-01
More than 1 million Californians live and work in the floodplains of the Sacramento-San Joaquin Valley where flood risks are among the highest in the nation. In response to this threat to people, property and the environment, the Department of Water Resources (DWR) has been called to action to improve flood risk management. This has transpired through significant advances in development of flood information and tools, analysis, and planning. Senate Bill 5 directed DWR to prepare the Central Valley Flood Protection Plan (CVFPP) and update it every 5 years. A key component of this aggressive planning approach is answering the question: What is the current flood risk, and how would proposed improvements change flood risk throughout the system? Answering this question is a substantial challenge due to the size and complexity of the watershed and flood control system. The watershed is roughly 42,000 sq mi, and flows are controlled by numerous reservoirs, bypasses, and levees. To overcome this challenge, the State invested in development of a comprehensive analysis "tool box" through various DWR programs. Development of the tool box included: collection of hydro-meteorological, topographic, geotechnical, and economic data; development of rainfall-runoff, reservoir operation, hydraulic routing, and flood risk analysis models; and development of specialized applications and computing schemes to accelerate the analysis. With this toolbox, DWR is analyzing flood hazard, flood control system performance, exposure and vulnerability of people and property to flooding, consequence of flooding for specific events, and finally flood risk for a range of CVFPP alternatives. Based on the results, DWR will put forward a State Recommended Plan in the 2017 CVFPP. Further, the value of the analysis tool box extends beyond the CVFPP. It will serve as a foundation for other flood studies for years to come and has already been successfully applied for inundation mapping to support emergency response, reservoir operation analysis, and others.
Effects of a Network-Centric Multi-Modal Communication Tool on a Communication Monitoring Task
2012-03-01
replaced (Nelson, Bolia, Vidulich, & Langhorne, 2004). Communication will continue to be the central tool for Command and Control (C2) operators. However... (Nelson, Bolia, Vidulich, & Langhorne, 2004). The two highest ratings for most potential technologies were data capture/replay tools and chat... analysis of variance (ANOVA). A significant main effect was found for Difficulty, F(1, 13) = 21.11, p < .05; the overall level of detections was
Decision support tool for diagnosing the source of variation
NASA Astrophysics Data System (ADS)
Masood, Ibrahim; Azrul Azhad Haizan, Mohamad; Norbaya Jumali, Siti; Ghazali, Farah Najihah Mohd; Razali, Hazlin Syafinaz Md; Shahir Yahya, Mohd; Azlan, Mohd Azwir bin
2017-08-01
Identifying the source of unnatural variation (SOV) in a manufacturing process is essential for quality control. Shewhart control chart patterns (CCPs) are commonly used to monitor SOV. However, proper interpretation of CCPs and their associated SOV requires a highly skilled industrial practitioner; lack of knowledge in process engineering can lead to erroneous corrective action. The objective of this study is to design the operating procedures of a computerized decision support tool (DST) for process diagnosis. The DST is an embedded tool in a CCP recognition scheme. The design methodology involves analysis of the relationships between geometrical features, manufacturing processes, and CCPs. The DST contains information about CCPs, their possible root-cause errors, and descriptions of SOV phenomena such as process deterioration due to tool bluntness, tool offset, loading error, and changes in material hardness. The DST will be useful to industrial practitioners in effective troubleshooting.
SECIMTools: a suite of metabolomics data analysis tools.
Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M
2018-04-20
Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open-access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for easy use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future data integration for metabolomics studies with other omics data.
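As a hedged illustration of the kind of building blocks such a suite wraps (not the actual SECIMTools API), the sketch below runs a PCA visualization step and a LASSO variable-selection step on a simulated sample-by-feature matrix with scikit-learn; all dimensions and names are invented.

```python
# Illustrative sketch only: mimics the kind of building blocks a metabolomics
# suite wraps (PCA visualization, LASSO variable selection); it does not use
# the actual SECIMTools API. The feature matrix layout is assumed.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 200))              # 40 samples x 200 metabolite features
y = X[:, :3] @ [1.5, -2.0, 1.0] + rng.normal(scale=0.5, size=40)  # 3 true signals

Xs = StandardScaler().fit_transform(X)

# Visualization step: project samples onto the first two principal components.
scores = PCA(n_components=2).fit_transform(Xs)

# Variable-selection step: LASSO keeps only features with nonzero coefficients.
lasso = LassoCV(cv=5).fit(Xs, y)
selected = np.flatnonzero(lasso.coef_)
print("PC scores shape:", scores.shape)
print("features selected by LASSO:", selected[:10])
```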
Analysis of Orbital Lifetime Prediction Parameters in Preparation for Post-Mission Disposal
NASA Astrophysics Data System (ADS)
Choi, Ha-Yeon; Kim, Hae-Dong; Seong, Jae-Dong
2015-12-01
Atmospheric drag is an important source of perturbation for satellites in Low Earth Orbit (LEO), and solar activity is a major factor in changes of atmospheric density. In particular, the orbital lifetime of a satellite varies with changes in solar activity, so care must be taken in predicting the remaining orbital lifetime during preparation for post-mission disposal. In this paper, the System Tool Kit (STK®) Long-term Orbit Propagator is used to analyze how orbital lifetime predictions change with solar activity. In addition, the STK® Lifetime tool is used to analyze the change in orbital lifetime with respect to the solar flux data generation needed for the lifetime calculation, and with respect to control of the drag coefficient. Analysis showed that applying the most recent solar flux file within the Lifetime tool gives a predicted trend closest to the actual orbit. We also examine the effect of the drag coefficient by comparing varying and constant coefficients across solar activity intensities.
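To make the sensitivity concrete, here is a hedged toy sketch (not the STK® Lifetime algorithm): it integrates the circular-orbit decay rate da/dt = -rho*B*sqrt(mu*a) with a crude exponential atmosphere whose density is scaled by an assumed solar-activity multiplier; the ballistic coefficient and density constants are illustrative.

```python
# Toy orbital-decay sketch (not the STK Lifetime algorithm): integrates
# da/dt = -rho * B * sqrt(mu * a) for a circular LEO orbit with a simple
# exponential atmosphere. The solar-activity density multiplier and the
# ballistic coefficient B = Cd*A/m are illustrative assumptions.
import numpy as np

MU = 3.986004418e14          # Earth's gravitational parameter, m^3/s^2
RE = 6_378_137.0             # Earth radius, m

def density(alt_m, solar_scale=1.0):
    # Crude exponential model near 500 km; solar_scale mimics solar activity.
    return solar_scale * 6.0e-13 * np.exp(-(alt_m - 500e3) / 60e3)  # kg/m^3

def lifetime_days(alt0_km, cd=2.2, area=1.0, mass=100.0, solar_scale=1.0):
    B = cd * area / mass
    a, t, dt = RE + alt0_km * 1e3, 0.0, 3600.0
    while a > RE + 100e3:                       # re-entry proxy at 100 km
        a += -density(a - RE, solar_scale) * B * np.sqrt(MU * a) * dt
        t += dt
    return t / 86400.0

for scale in (0.5, 1.0, 2.0):                   # low / mean / high solar activity
    print(f"solar scale {scale}: ~{lifetime_days(500, solar_scale=scale):.0f} days")
```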
Human factors phase IV: risk analysis tool for new train control technology.
DOT National Transportation Integrated Search
2005-01-31
This report covers the theoretical development of the safety state model for railroad operations. Using data from a train control technology experiment, experimental application of the model is demonstrated. A stochastic model of system behavior is d...
Sidhu, Navdeep S; Edwards, Morgan
2018-04-27
We conducted a scoping review of tools designed to add structure to clinical teaching, with a thematic analysis to establish definitional clarity. A total of 6049 citations were screened, 434 reviewed for eligibility, and 230 identified as meeting study inclusion criteria. Eighty-nine names and 51 definitions were identified. Based on a post facto thematic analysis, we propose that these tools be named "deliberate teaching tools" (DTTs) and defined as "frameworks that enable clinicians to have a purposeful and considered approach to teaching encounters by incorporating elements identified with good teaching practice." We identified 46 DTTs in the literature, with 38 (82.6%) originally described for the medical setting. Forty justification articles consisted of 16 feedback surveys, 13 controlled trials, seven pre-post intervention studies with no control group, and four observation studies. Current evidence of efficacy is not entirely conclusive, and many studies contain methodological flaws. Forty-nine clarification articles comprised 12 systematic reviews and 37 narrative reviews. The largest number of DTTs described by any single review was four. A common design theme was identified in approximately three-quarters of DTTs. Applicability of DTTs to specific alternate settings should be considered in context, and appropriately designed justification studies are warranted to demonstrate efficacy.
Simple Sensitivity Analysis for Orion GNC
NASA Technical Reports Server (NTRS)
Pressburger, Tom; Hoelscher, Brian; Martin, Rodney; Sricharan, Kumar
2013-01-01
The performance of Orion flight software, especially its GNC software, is being analyzed by running Monte Carlo simulations of Orion spacecraft flights. The simulated performance is analyzed for conformance with flight requirements, expressed as performance constraints. Flight requirements include guidance (e.g., touchdown distance from target) and control (e.g., control saturation) as well as performance (e.g., heat load constraints). The Monte Carlo simulations disperse hundreds of simulation input variables, for everything from mass properties to date of launch. We describe in this paper a sensitivity analysis tool (Critical Factors Tool or CFT) developed to find the input variables, or pairs of variables, which by themselves significantly influence satisfaction of requirements or significantly affect key performance metrics (e.g., touchdown distance from target). Knowing these factors can inform robustness analysis, can inform where engineering resources are most needed, and could even affect operations. The contributions of this paper include the introduction of novel sensitivity measures, such as estimating success probability, and a technique for determining whether pairs of factors are interacting dependently or independently. Input variables such as moments, mass, thrust dispersions, and date of launch were found to be significant factors for the success of various requirements. Examples are shown in this paper, as well as a summary and physics discussion of EFT-1 driving factors that the tool found.
On-Line Loss of Control Detection Using Wavelets
NASA Technical Reports Server (NTRS)
Brenner, Martin J. (Technical Monitor); Thompson, Peter M.; Klyde, David H.; Bachelder, Edward N.; Rosenthal, Theodore J.
2005-01-01
Wavelet transforms are used for on-line detection of aircraft loss of control. Wavelet transforms are compared with Fourier transform methods and shown to detect changes in the vehicle dynamics more rapidly. This faster response is due to a time window that decreases in length as the frequency increases. New wavelets are defined that further decrease the detection time by skewing the shape of the envelope. The wavelets are used for power spectrum and transfer function estimation. Smoothing is used to trade off the variance of the estimate against detection time. Wavelets are also used as a front end to the eigensystem reconstruction algorithm. Stability metrics are estimated from the frequency responses and models, and it is these metrics that are used for loss of control detection. A MATLAB toolbox was developed for post-processing simulation and flight data using the wavelet analysis methods. A subset of these methods was implemented in real time and named the Loss of Control Analysis Tool Set (LOCATS). A manual control experiment was conducted using a hardware-in-the-loop simulator for a large transport aircraft, in which the real-time performance of LOCATS was demonstrated. The next step is to use these wavelet analysis tools for flight test support.
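As a hedged illustration of the underlying idea (not LOCATS itself), the sketch below uses PyWavelets to compute a continuous wavelet transform of a signal whose dominant frequency changes abruptly; the ridge of maximum coefficient magnitude localizes the change in time, which a single long Fourier window would smear. The Morlet wavelet and scale range are illustrative choices.

```python
# Minimal sketch (not LOCATS): a continuous wavelet transform localizes a
# sudden change in vehicle-like dynamics faster than a long FFT window would.
import numpy as np
import pywt

fs = 200.0
t = np.arange(0, 10, 1 / fs)
# Simulated response: a 2 Hz mode that abruptly shifts to 6 Hz at t = 5 s.
sig = np.where(t < 5, np.sin(2*np.pi*2*t), np.sin(2*np.pi*6*t))

scales = np.arange(4, 128)
coef, freqs = pywt.cwt(sig, scales, "morl", sampling_period=1/fs)

# The ridge of maximum |coefficient| tracks the instantaneous dominant frequency.
ridge = freqs[np.abs(coef).argmax(axis=0)]
print("dominant freq before/after the change: "
      f"{ridge[t < 4.5].mean():.1f} Hz / {ridge[t > 5.5].mean():.1f} Hz")
```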
Trends and Issues in Fuzzy Control and Neuro-Fuzzy Modeling
NASA Technical Reports Server (NTRS)
Chiu, Stephen
1996-01-01
Everyday experience in building and repairing things around the home has taught us the importance of using the right tool for the right job. Although we tend to think of a 'job' in broad terms, such as 'build a bookcase,' we understand well that the 'right job' associated with each 'right tool' is typically a narrowly bounded subtask, such as 'tighten the screws.' Unfortunately, we often lose sight of this principle when solving engineering problems; we treat a broadly defined problem, such as controlling or modeling a system, as a narrow one that has a single 'right tool' (e.g., linear analysis, fuzzy logic, neural network). We need to recognize that a typical real-world problem contains a number of different sub-problems, and that a truly optimal solution (the best combination of cost, performance and feature) is obtained by applying the right tool to the right sub-problem. Here I share some of my perspectives on what constitutes the 'right job' for fuzzy control and describe recent advances in neuro-fuzzy modeling to illustrate and to motivate the synergistic use of different tools.
Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti
2013-01-01
Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can also be integrated with other tools, such as the fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, such as bacterial endotoxins testing, in a formal, clear, and detailed manner; to propose how to conduct the necessary steps; and to provide data templates necessary for documentation and useful for following current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made.
Analysis technique for controlling system wavefront error with active/adaptive optics
NASA Astrophysics Data System (ADS)
Genberg, Victor L.; Michels, Gregory J.
2017-08-01
The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
Integrated modeling of advanced optical systems
NASA Astrophysics Data System (ADS)
Briggs, Hugh C.; Needels, Laura; Levine, B. Martin
1993-02-01
This poster session paper describes an integrated modeling and analysis capability being developed at JPL under funding provided by the JPL Director's Discretionary Fund and the JPL Control/Structure Interaction Program (CSI). The posters briefly summarize the program capabilities and illustrate them with an example problem. The computer programs developed under this effort will provide an unprecedented capability for integrated modeling and design of high performance optical spacecraft. The engineering disciplines supported include structural dynamics, controls, optics and thermodynamics. Such tools are needed in order to evaluate the end-to-end system performance of spacecraft such as OSI, POINTS, and SMMM. This paper illustrates the proof-of-concept tools that have been developed to establish the technology requirements and demonstrate the new features of integrated modeling and design. The current program also includes implementation of a prototype tool based upon the CAESY environment being developed under the NASA Guidance and Control Research and Technology Computational Controls Program. This prototype will be available late in FY-92. The development plan proposes a major software production effort to fabricate, deliver, support and maintain a national-class tool from FY-93 through FY-95.
Multiple comparison analysis testing in ANOVA.
McHugh, Mary L
2011-01-01
The Analysis of Variance (ANOVA) test has long been an important tool for researchers conducting studies on multiple experimental groups and one or more control groups. However, ANOVA cannot provide detailed information on differences among the various study groups, or on complex combinations of study groups. To fully understand group differences in an ANOVA, researchers must conduct tests of the differences between particular pairs of experimental and control groups. Tests conducted on subsets of data tested previously in another analysis are called post hoc tests. The class of post hoc tests that provides this type of detailed information for ANOVA results is called "multiple comparison analysis" tests. The most commonly used multiple comparison analysis statistics include the Tukey, Newman-Keuls, Scheffé, Bonferroni, and Dunnett tests. These statistical tools each have specific uses, advantages, and disadvantages. Some are best used for testing theory while others are useful in generating new theory. Selection of the appropriate post hoc test will provide researchers with the most detailed information while limiting Type 1 errors due to alpha inflation.
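A minimal worked example of one such post hoc test, using the Tukey HSD implementation in statsmodels on simulated control and treatment groups (data and effect sizes are invented):

```python
# Hedged example of one post hoc test discussed above: Tukey's HSD after a
# one-way ANOVA, via statsmodels. All group data are simulated.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
control = rng.normal(10.0, 2.0, 30)
treat_a = rng.normal(12.0, 2.0, 30)
treat_b = rng.normal(10.5, 2.0, 30)

# Omnibus ANOVA first: is there any difference among the groups at all?
f, p = stats.f_oneway(control, treat_a, treat_b)
print(f"ANOVA: F={f:.2f}, p={p:.4f}")

# Post hoc Tukey HSD: which particular pairs differ, with alpha controlled.
values = np.concatenate([control, treat_a, treat_b])
groups = ["control"] * 30 + ["A"] * 30 + ["B"] * 30
print(pairwise_tukeyhsd(values, groups, alpha=0.05))
```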
Operations management tools to be applied for textile
NASA Astrophysics Data System (ADS)
Maralcan, A.; Ilhan, I.
2017-10-01
In this paper, basic concepts of process analysis such as flow time, inventory, bottleneck, labour cost, and utilization are illustrated first. The effect of a bottleneck on business results is especially emphasized. In the next section, tools for productivity measurement (the KPI (Key Performance Indicators) tree, OEE (Overall Equipment Effectiveness), and takt time) are introduced and exemplified. A KPI tree is a diagram on which we can visualize all the variables of an operation that drive financial results through cost and profit. OEE is a tool to measure the potential extra capacity of a piece of equipment or an employee. Takt time is a tool to determine the process flow rate according to customer demand. The KPI tree is studied through the whole process, while OEE is exemplified for a stenter frame machine, which is the most important machine (and usually the bottleneck) and the most expensive investment in a finishing plant. Takt time is exemplified for the quality control department. Finally, quality tools (six sigma, control charts, and jidoka) are introduced. Six sigma is a tool to measure process capability and thereby the probability of a defect. The control chart is a powerful tool to monitor the process. The idea of jidoka (detect, stop, and alert) is about alerting people that there is a problem in the process.
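The two measurement formulas can be made concrete with a short worked sketch (all production numbers are illustrative): OEE = availability x performance x quality, and takt time = available production time / customer demand.

```python
# Worked sketch of the two formulas referenced above, with invented numbers.
planned_time_min = 8 * 60          # one shift of planned production time
downtime_min = 45
ideal_cycle_s = 30                 # ideal seconds per piece
pieces_made = 820
pieces_good = 790
daily_demand = 750

availability = (planned_time_min - downtime_min) / planned_time_min
performance = (ideal_cycle_s * pieces_made) / ((planned_time_min - downtime_min) * 60)
quality = pieces_good / pieces_made
oee = availability * performance * quality
takt_s = planned_time_min * 60 / daily_demand   # seconds available per piece demanded

print(f"OEE = {availability:.2f} x {performance:.2f} x {quality:.2f} = {oee:.2%}")
print(f"takt time = {takt_s:.1f} s/piece")
```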
Real-Time Performance of Mechatronic PZT Module Using Active Vibration Feedback Control.
Aggogeri, Francesco; Borboni, Alberto; Merlo, Angelo; Pellegrini, Nicola; Ricatto, Raffaele
2016-09-25
This paper proposes an innovative mechatronic piezo-actuated module to control vibrations in modern machine tools. Vibrations represent one of the main issues that seriously compromise the quality of the workpiece. The active vibration control (AVC) device is composed of a host part integrated with sensors and actuators synchronized by a regulator; it is able to make a self-assessment and adjust to alterations in the environment. In particular, an innovative smart actuator has been designed and developed to satisfy machining requirements during active vibration control. This study presents the mechatronic model based on the kinematic and dynamic analysis of the AVC device. To ensure real-time performance, an H2-LQG controller has been developed and validated by simulations involving machine tool, PZT actuator, and controller models. The Hardware-in-the-Loop (HIL) architecture is adopted to control and attenuate the vibrations. A set of experimental tests has been performed to validate the AVC module on a commercial machine tool. The feasibility of real-time vibration damping is demonstrated and the simulation accuracy is evaluated.
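For readers unfamiliar with the controller family named above, here is a minimal hedged LQG sketch using the python-control package: LQR state feedback plus a Kalman estimator for a single lightly damped vibration mode. It is not the authors' H2-LQG design; the modal frequency, damping, and noise weights are all assumed.

```python
# Minimal LQG sketch in the spirit of the controller described above (not the
# authors' design): LQR state feedback plus a Kalman estimator for one lightly
# damped vibration mode. All plant numbers are illustrative.
import numpy as np
import control as ct

wn, zeta = 2 * np.pi * 120.0, 0.02     # assumed 120 Hz mode, 2% damping
A = np.array([[0.0, 1.0], [-wn**2, -2 * zeta * wn]])
B = np.array([[0.0], [1.0]])           # PZT force input
C = np.array([[1.0, 0.0]])             # displacement sensor
G = B                                  # process noise enters like the input

# LQR: weight displacement heavily, penalize actuator effort lightly.
K, _, _ = ct.lqr(A, B, np.diag([wn**2, 1.0]), np.array([[1e-6]]))

# Kalman estimator with assumed process/measurement noise intensities.
L, _, _ = ct.lqe(A, G, C, np.array([[1.0]]), np.array([[1e-8]]))

# Combined observer-based feedback dynamics: eigenvalues should be well damped.
Acl = np.block([[A, -B @ K], [L @ C, A - B @ K - L @ C]])
print("closed-loop eigenvalues:", np.linalg.eigvals(Acl).round(1))
```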
Manual-control Analysis Applied to the Money-supply Control Task
NASA Technical Reports Server (NTRS)
Wingrove, R. C.
1984-01-01
The procedure recently implemented by the Federal Reserve Board to control the money supply is formulated as a tracking model of the kind used in the study of manual-control tasks. Using this model, an analysis is made to determine the effect of monetary control on fluctuations in economic output. The results indicate that monetary control can reduce the amplitude of fluctuations at frequencies near the region of historic business cycles. However, with significant time lags in the control loop, monetary control tends to increase the amplitude of fluctuations at higher frequencies. The paper also examines how the investigator or student can use the tools developed in the field of manual-control analysis to study the nature of economic fluctuations and to evaluate different stabilization strategies.
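The core claim, damping at low frequencies but amplification when the loop contains a lag, can be reproduced with a few lines of numerics. This hedged sketch (not the paper's model) evaluates the sensitivity function of an integral controller acting through an assumed pure time delay:

```python
# Numeric illustration (not the paper's model): a controller with integral
# action operating through a pure time lag damps slow fluctuations but
# amplifies faster ones. Gain and lag values are assumed.
import numpy as np

K = 1.0                        # control gain
tau = 0.5                      # time lag in the control loop, years
w = np.logspace(-1, 1.3, 7)    # frequency, rad/year

# Loop transfer L(jw) = K * exp(-jw*tau) / (jw); the sensitivity S = 1/(1+L)
# maps disturbances to output fluctuation: |S| < 1 damps, |S| > 1 amplifies.
Ljw = K * np.exp(-1j * w * tau) / (1j * w)
S = 1.0 / (1.0 + Ljw)
for wi, si in zip(w, np.abs(S)):
    tag = "damped" if si < 1 else "AMPLIFIED"
    print(f"w = {wi:6.2f} rad/yr  |S| = {si:4.2f}  {tag}")
```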
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schlipf, David; Raach, Steffen; Haizmann, Florian
2015-12-14
This paper presents first steps toward an adaptive lidar data processing technique crucial for lidar-assisted control in wind turbines. The prediction time and the quality of the wind preview from lidar measurements depend on several factors and are not constant. If the data processing is not continually adjusted, the benefit of lidar-assisted control cannot be fully exploited, or can even result in harmful control action. An online analysis of the lidar and turbine data is necessary to continually reassess the prediction time and lidar data quality. In this work, a structured process to develop an analysis tool for the prediction time and a new hardware setup for lidar-assisted control are presented. The tool consists of an online estimation of the rotor-effective wind speed from lidar and turbine data and the implementation of an online cross-correlation to determine the time shift between both signals. Further, initial results are presented from an ongoing campaign in which this system was employed to provide lidar preview for feed-forward pitch control.
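The cross-correlation step lends itself to a short hedged sketch (illustrative, not the authors' code): estimate the prediction time as the lag that maximizes the correlation between a simulated lidar preview and the delayed rotor-effective wind speed.

```python
# Sketch of the cross-correlation step described above (illustrative only):
# the prediction time is the lag maximizing the correlation between the lidar
# wind preview and the rotor-effective wind speed.
import numpy as np

rng = np.random.default_rng(2)
fs = 1.0                          # 1 Hz sampling
true_shift_s = 12                 # wind takes 12 s to travel to the rotor
wind = np.convolve(rng.normal(size=600), np.ones(20) / 20, mode="same")

lidar = wind + 0.1 * rng.normal(size=wind.size)            # upstream preview
rotor = np.roll(wind, true_shift_s) + 0.1 * rng.normal(size=wind.size)

lags = np.arange(0, 60)
corr = [np.corrcoef(lidar[:-lag or None], rotor[lag:])[0, 1] for lag in lags]
print("estimated prediction time:", lags[int(np.argmax(corr))] / fs, "s")
```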
Proceedings of the 3rd Annual Conference on Aerospace Computational Control, volume 1
NASA Technical Reports Server (NTRS)
Bernard, Douglas E. (Editor); Man, Guy K. (Editor)
1989-01-01
Conference topics included definition of tool requirements, advanced multibody component representation descriptions, model reduction, parallel computation, real time simulation, control design and analysis software, user interface issues, testing and verification, and applications to spacecraft, robotics, and aircraft.
Association between hospital size and quality improvement for pharmaceutical services.
Nau, David P; Garber, Mathew C; Lipowski, Earlene E; Stevenson, James G
2004-01-15
The relationship between hospital size and quality improvement (QI) for pharmaceutical services was studied. A questionnaire on QI was sent to hospital pharmacy directors in Michigan and Florida in 2002. The questionnaire included items on QI lead-team composition, QI tools, QI training, and QI culture. Usable responses were received from 162 (57%) of 282 pharmacy directors. Pharmacy QI lead teams were present in 57% of institutions, with larger teams in large hospitals (≥300 patients). Only two QI tools were used by a majority of hospitals: root-cause analysis (62%) and flow charts (66%). Small hospitals (<50 patients) were less likely than medium-sized hospitals (50-299 patients) and large hospitals to use several QI tools, including control charts, cause-and-effect diagrams, root-cause analysis, flow charts, and histograms. Large hospitals were more likely than small and medium-sized hospitals to use root-cause analysis and control charts. There was no relationship between hospital size and the frequency with which physician or patient satisfaction with pharmaceutical services was measured. There were no differences in QI training or QI culture across hospital size categories. A survey suggested that a majority of hospital pharmacies in Michigan and Florida have begun to adopt QI techniques but that most are not using rigorous QI tools. Pharmacies in large hospitals had more QI lead-team members and were more likely to use certain QI tools, but there was no relationship between hospital size and satisfaction measurements, QI training, or QI culture.
Laurencikas, E; Sävendahl, L; Jorulf, H
2006-06-01
To assess the value of the metacarpophalangeal pattern profile (MCPP) analysis as a diagnostic tool for differentiating between patients with dyschondrosteosis, Turner syndrome, and hypochondroplasia. Radiographic and clinical data from 135 patients between 1 and 51 years of age were collected and analyzed. The study included 25 patients with hypochondroplasia (HCP), 39 with dyschondrosteosis (LWD), and 71 with Turner syndrome (TS). Hand pattern profiles were calculated and compared with those of 110 normal individuals. Pearson correlation coefficient (r) and multivariate discriminant analysis were used for pattern profile analysis. Pattern variability index, a measure of dysmorphogenesis, was calculated for LWD, TS, HCP, and normal controls. Our results demonstrate that patients with LWD, TS, or HCP have distinct pattern profiles that are significantly different from each other and from those of normal controls. Discriminant analysis yielded correct classification of normal versus abnormal individuals in 84% of cases. Classification of the patients into LWD, TS, and HCP groups was successful in 75%. The correct classification rate was higher (85%) when differentiating two pathological groups at a time. Pattern variability index was not helpful for differential diagnosis of LWD, TS, and HCP. Patients with LWD, TS, or HCP have distinct MCPPs and can be successfully differentiated from each other using advanced MCPP analysis. Discriminant analysis is to be preferred over Pearson correlation coefficient because it is a more sensitive and specific technique. MCPP analysis is a helpful tool for differentiating between syndromes with similar clinical and radiological abnormalities.
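As a hedged sketch of the discriminant-analysis approach the study prefers (simulated profiles, not the authors' data), the example below cross-validates a linear discriminant classifier on invented 19-measurement pattern profiles for the three syndrome groups and normal controls:

```python
# Hedged sketch of the discriminant-analysis approach described above, on
# simulated 19-bone metacarpophalangeal pattern profiles (z-scores). The
# group mean offsets and sample sizes are invented.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
groups = {"normal": 0.0, "LWD": -1.2, "TS": -0.6, "HCP": -0.9}
X = np.vstack([rng.normal(offset, 1.0, size=(40, 19))   # 19 z-scores per hand
               for offset in groups.values()])
y = np.repeat(list(groups), 40)

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5).mean()
print(f"cross-validated classification accuracy: {acc:.0%}")
```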
Water Network Tool for Resilience v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-12-09
WNTR is a Python package designed to simulate and analyze the resilience of water distribution networks. The software includes: pressure-driven and demand-driven hydraulic simulation; water quality simulation to track concentration, trace, and water age; conditional controls to simulate power outages; models to simulate pipe breaks; a wide range of resilience metrics; and analysis and visualization tools.
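A minimal usage sketch, assuming WNTR's documented top-level API and a hypothetical EPANET input file Net3.inp; treat it as illustrative rather than authoritative:

```python
# Minimal WNTR usage sketch (assumed API, hypothetical input file): load a
# network from an EPANET INP file, run a hydraulic simulation, and inspect
# nodal pressures.
import wntr

wn = wntr.network.WaterNetworkModel("Net3.inp")   # build model from INP file
sim = wntr.sim.EpanetSimulator(wn)                # hydraulic simulation
results = sim.run_sim()

pressure = results.node["pressure"]               # time x node DataFrame
print("minimum pressure anywhere in the network:", float(pressure.min().min()))
```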
ChiLin: a comprehensive ChIP-seq and DNase-seq quality control and analysis pipeline.
Qin, Qian; Mei, Shenglin; Wu, Qiu; Sun, Hanfei; Li, Lewyn; Taing, Len; Chen, Sujun; Li, Fugen; Liu, Tao; Zang, Chongzhi; Xu, Han; Chen, Yiwen; Meyer, Clifford A; Zhang, Yong; Brown, Myles; Long, Henry W; Liu, X Shirley
2016-10-03
Transcription factor binding, histone modification, and chromatin accessibility studies are important approaches to understanding the biology of gene regulation. ChIP-seq and DNase-seq have become the standard techniques for studying protein-DNA interactions and chromatin accessibility respectively, and comprehensive quality control (QC) and analysis tools are critical to extracting the most value from these assay types. Although many analysis and QC tools have been reported, few combine ChIP-seq and DNase-seq data analysis and quality control in a unified framework with a comprehensive and unbiased reference of data quality metrics. ChiLin is a computational pipeline that automates the quality control and data analyses of ChIP-seq and DNase-seq data. It is developed using a flexible and modular software framework that can be easily extended and modified. ChiLin is ideal for batch processing of many datasets and is well suited for large collaborative projects involving ChIP-seq and DNase-seq from different designs. ChiLin generates comprehensive quality control reports that include comparisons with historical data derived from over 23,677 public ChIP-seq and DNase-seq samples (11,265 datasets) from eight literature-based classified categories. To the best of our knowledge, this atlas represents the most comprehensive ChIP-seq and DNase-seq related quality metric resource currently available. These historical metrics provide useful heuristic quality references for experiments across all commonly used assay types. Using representative datasets, we demonstrate the versatility of the pipeline by applying it to different assay types of ChIP-seq data. The pipeline software is available open source at https://github.com/cfce/chilin . ChiLin is a scalable and powerful tool to process large batches of ChIP-seq and DNase-seq datasets. The analysis output and quality metrics have been structured into user-friendly directories and reports. We have successfully compiled 23,677 profiles into a comprehensive quality atlas with fine classification for users.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillen, David S.
Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation, to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, where known patterns can be mined for. With a human in the loop, it can also bring in domain knowledge and subject matter expertise. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that exist before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.
Machine Learning: A Crucial Tool for Sensor Design
Zhao, Weixiang; Bhushan, Abhinav; Santamaria, Anthony D.; Simon, Melinda G.; Davis, Cristina E.
2009-01-01
Sensors have been widely used for disease diagnosis, environmental quality monitoring, food quality control, industrial process analysis and control, and other related fields. As a key tool for sensor data analysis, machine learning is becoming a core part of novel sensor design. Dividing a complete machine learning process into three steps: data pre-treatment, feature extraction and dimension reduction, and system modeling, this paper provides a review of the methods that are widely used for each step. For each method, the principles and the key issues that affect modeling results are discussed. After reviewing the potential problems in machine learning processes, this paper gives a summary of current algorithms in this field and provides some feasible directions for future studies. PMID:20191110
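The three steps map naturally onto a scikit-learn pipeline. This hedged sketch (simulated sensor-array data, invented labels) chains pre-treatment (scaling), dimension reduction (PCA), and system modeling (an SVM classifier):

```python
# Sketch mapping the three machine learning steps above onto a scikit-learn
# pipeline: pre-treatment (scaling), dimension reduction (PCA), and system
# modeling (SVM classifier). All data are simulated.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(120, 32))               # 120 samples x 32 sensor channels
y = (X[:, :4].sum(axis=1) > 0).astype(int)   # toy analyte present/absent label

model = make_pipeline(StandardScaler(), PCA(n_components=8), SVC(kernel="rbf"))
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
```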
NASA Technical Reports Server (NTRS)
1981-01-01
The software developed to simulate the ground control point navigation system is described. The Ground Control Point Simulation Program (GCPSIM) is designed as an analysis tool to predict the performance of the navigation system. The system consists of two star trackers, a global positioning system receiver, a gyro package, and a landmark tracker.
Open source cardiology electronic health record development for DIGICARDIAC implementation
NASA Astrophysics Data System (ADS)
Dugarte, Nelson; Medina, Rubén.; Huiracocha, Lourdes; Rojas, Rubén.
2015-12-01
This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under Health Level-7 (HL7) international standards. The novelty of the system is the integration of high-resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools, and telecardiology tools. The acquisition tools manage and control the DIGICARDIAC electrocardiograph functions. The processing tools support HRECG signal analysis, searching for patterns indicative of cardiovascular pathologies. The incorporation of telecardiology tools allows system communication with other health care centers, decreasing access time to patient information. The CEHR system was completely developed using open source software. Preliminary results of process validation showed the system's efficiency.
NASA Astrophysics Data System (ADS)
Krehbiel, C.; Maiersperger, T.; Friesz, A.; Harriman, L.; Quenzer, R.; Impecoven, K.
2016-12-01
Three major obstacles facing big Earth data users are data storage, management, and analysis. As the amount of satellite remote sensing data increases, so does the need for better data storage and management strategies to exploit the plethora of data now available. Standard GIS tools can help big Earth data users who interact with and analyze increasingly large and diverse datasets. In this presentation we highlight how NASA's Land Processes Distributed Active Archive Center (LP DAAC) is tackling these big Earth data challenges. We provide a real-life use case to describe three tools and services provided by the LP DAAC to more efficiently exploit big Earth data in a GIS environment. First, we describe the Open-source Project for a Network Data Access Protocol (OPeNDAP), which requests only the specific data needed, minimizing the amount of data that a user downloads and improving the efficiency of data downloading and processing. Next, we cover the LP DAAC's Application for Extracting and Exploring Analysis Ready Samples (AppEEARS), a web application interface for extracting and analyzing land remote sensing data. From there, we review an ArcPython toolbox that was developed to provide quality control services for land remote sensing data products. Locating and extracting specific subsets of larger big Earth datasets improves data storage and management efficiency for the end user, and quality control services provide a straightforward interpretation of big Earth data. These tools and services benefit the GIS user community by standardizing workflows and improving data storage, management, and analysis tactics.
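The OPeNDAP idea, downloading only the subset you ask for, can be sketched with xarray; the URL and variable name below are placeholders, not a real LP DAAC endpoint:

```python
# Hedged sketch of the OPeNDAP pattern: open a remote dataset lazily and pull
# only a small spatial/temporal subset. The URL and the variable name "NDVI"
# are hypothetical placeholders, not a real LP DAAC endpoint.
import xarray as xr

url = "https://example.gov/opendap/MOD13Q1/ndvi_timeseries"  # hypothetical
ds = xr.open_dataset(url)                 # reads metadata only, no bulk download

# Slice order must match the dataset's coordinate direction (assumed here).
subset = ds["NDVI"].sel(lat=slice(44.0, 43.0),
                        lon=slice(-104.0, -103.0)).isel(time=-1)
print(subset.mean().values)               # only this subset crosses the wire
```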
Software Tools for Developing and Simulating the NASA LaRC CMF Motion Base
NASA Technical Reports Server (NTRS)
Bryant, Richard B., Jr.; Carrelli, David J.
2006-01-01
The NASA Langley Research Center (LaRC) Cockpit Motion Facility (CMF) motion base has provided many design and analysis challenges. In the process of addressing these challenges, a comprehensive suite of software tools was developed. The software tools development began with a detailed MATLAB/Simulink model of the motion base, which was used primarily for safety loads prediction, design of the closed-loop compensator, and development of the motion base safety systems. A Simulink model of the digital control law, from which a portion of the embedded code is directly generated, was later added to this model to form a closed-loop system model. Concurrently, software that runs on a PC was created to display and record motion base parameters. It includes a user interface for controlling time history displays, strip chart displays, data storage, and initializing of function generators used during motion base testing. Finally, a software tool was developed for kinematic analysis and prediction of mechanical clearances for the motion system. These tools work together in an integrated package to support normal operations of the motion base, simulate the end-to-end operation of the motion base system (providing facilities for software-in-the-loop testing), visualize mechanical geometry and sensor data, and support function generator setup and evaluation.
Experimental Evaluation of Verification and Validation Tools on Martian Rover Software
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich
2003-01-01
We report on a study to determine the maturity of different verification and validation (V&V) technologies on a representative example of NASA flight software. The study consisted of a controlled experiment in which three technologies (static analysis, runtime analysis, and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is the first (to the best of our knowledge) controlled experiment to compare formal-methods-based tools to testing on a realistic, industrial-size example, with the emphasis on collecting as much data as possible on the performance of the tools and the participants. The paper includes a description of the Rover code that was analyzed, the tools used, and a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results cannot be generalized, but we believe the study can still serve as a valuable point of reference for future work of this kind. It did confirm our belief that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore, the results of the experiment inspired a novel framework for testing the next generation of the Rover.
Managing Information On Technical Requirements
NASA Technical Reports Server (NTRS)
Mauldin, Lemuel E., III; Hammond, Dana P.
1993-01-01
Technical Requirements Analysis and Control Systems/Initial Operating Capability (TRACS/IOC) computer program provides supplemental software tools for analysis, control, and interchange of project requirements so qualified project members have access to pertinent project information, even if in different locations. Enables users to analyze and control requirements, serves as focal point for project requirements, and integrates system supporting efficient and consistent operations. TRACS/IOC is HyperCard stack for use on Macintosh computers running HyperCard 1.2 or later and Oracle 1.2 or later.
Integrated communication and control systems. I - Analysis
NASA Technical Reports Server (NTRS)
Halevi, Yoram; Ray, Asok
1988-01-01
The paper presents the results of an ICCS analysis focusing on discrete-time control systems subject to time-varying delays. The present analytical technique is applicable to integrated dynamic systems such as those encountered in advanced aircraft, spacecraft, and the real-time control of robots and machine tools via a high-speed network within an autonomous manufacturing environment. The significance of data latency and missynchronization between individual system components in ICCS networks is discussed in view of the time-varying delays.
Manipulator interactive design with interconnected flexible elements
NASA Technical Reports Server (NTRS)
Singh, R. P.; Likins, P. W.
1983-01-01
This paper describes the development of an analysis tool for the interactive design of control systems for manipulators and similar electro-mechanical systems amenable to representation as structures in a topological chain. The chain consists of a series of elastic bodies subject to small deformations and arbitrary displacements. The bodies are connected by hinges which permit kinematic constraints, control, or relative motion with six degrees of freedom. The equations of motion for the chain configuration are derived via Kane's method, extended for application to interconnected flexible bodies with time-varying boundary conditions. A corresponding set of modal coordinates has been selected. The motion equations are imbedded within a simulation that transforms the vector-dyadic equations into scalar form for numerical integration. The simulation also includes a linear, time-invariant controller specified in transfer function format and a set of sensors and actuators that interface between the structure and controller. The simulation is driven by an interactive set-up program, resulting in an easy-to-use analysis tool.
Quality Control System using Simple Implementation of Seven Tools for Batik Textile Manufacturing
NASA Astrophysics Data System (ADS)
Ragil Suryoputro, Muhammad; Sugarindra, Muchamad; Erfaisalsyah, Hendy
2017-06-01
In order to produce better products and mitigate defects, every company must implement a quality control system that is capable and reliable. One approach is a simple implementation of the seven tools of quality control. The case studied in this research was the defect level of xyz grey fabric on shuttle loom 2 at a batik manufacturing company. The seven tools include: flowchart, check sheet, histogram, scatter diagram combined with control charts, Pareto diagram, and fishbone (cause-and-effect) diagram. The check sheet identified the types of defects in the xyz woven grey fabric: warp defects, double warp, broken warp, empty warp, loose warp, ugly edges, thick warp, and rust. Analysis of the control chart indicates that the process is out of control; the chart still shows many outlying points. The scatter diagram shows a positive correlation between the defect percentage and the production quantity. Based on the Pareto diagram, repair priority goes to the dominant defect type, warp (44%), and the histogram shows double warp is also the highest, with a value of 23,635.11 m. In addition, the fishbone diagram traces the causes of double warp and other defect types to materials, methods, machines, measurement, man, and environment. With these results the company can take preventive and corrective action to minimize defects and improve product quality.
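Two of the seven tools lend themselves to a short hedged sketch (defect counts are illustrative, apart from the double-warp figure quoted above): a p-chart for the daily defect fraction and a Pareto ordering of defect quantities.

```python
# Sketch of two of the seven tools named above, with illustrative counts:
# a p-chart for the daily defect fraction, and a Pareto ordering of defects.
import numpy as np

inspected = np.array([500, 520, 480, 510, 505, 490, 515])   # pieces per day
defective = np.array([12, 9, 30, 11, 10, 8, 13])

p = defective / inspected
pbar = defective.sum() / inspected.sum()
sigma = np.sqrt(pbar * (1 - pbar) / inspected)   # per-day control limits
ucl, lcl = pbar + 3 * sigma, np.maximum(pbar - 3 * sigma, 0)
print("days out of control:", np.flatnonzero((p > ucl) | (p < lcl)))

# Pareto ordering: the 23,635 m double-warp figure is from the abstract;
# the other quantities are invented for illustration.
defects = {"double warp": 23635, "broken warp": 9800, "rust": 4100,
           "ugly edges": 2900}                               # meters
for name, qty in sorted(defects.items(), key=lambda kv: -kv[1]):
    print(f"{name:12s} {qty:6d} m  ({qty / sum(defects.values()):.0%})")
```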
Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.
2005-01-01
Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous cross-product terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.
NASA Technical Reports Server (NTRS)
Drysdale, Alan; Thomas, Mark; Fresa, Mark; Wheeler, Ray
1992-01-01
Controlled Ecological Life Support System (CELSS) technology is critical to the Space Exploration Initiative. NASA's Kennedy Space Center has been performing CELSS research for several years, developing data related to CELSS design. We have developed OCAM (Object-oriented CELSS Analysis and Modeling), a CELSS modeling tool, and have used this tool to evaluate CELSS concepts, using this data. In using OCAM, a CELSS is broken down into components, and each component is modeled as a combination of containers, converters, and gates which store, process, and exchange carbon, hydrogen, and oxygen on a daily basis. Multiple crops and plant types can be simulated. Resource recovery options modeled include combustion, leaching, enzyme treatment, aerobic or anaerobic digestion, and mushroom and fish growth. Results include printouts and time-history graphs of total system mass, biomass, carbon dioxide, and oxygen quantities; energy consumption; and manpower requirements. The contributions of mass, energy, and manpower to system cost have been analyzed to compare configurations and determine appropriate research directions.
Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model
Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance
2014-01-01
Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...
Optimization of an Advanced Hybrid Wing Body Concept Using HCDstruct Version 1.2
NASA Technical Reports Server (NTRS)
Quinlan, Jesse R.; Gern, Frank H.
2016-01-01
Hybrid Wing Body (HWB) aircraft concepts continue to be promising candidates for achieving the simultaneous fuel consumption and noise reduction goals set forth by NASA's Environmentally Responsible Aviation (ERA) project. In order to evaluate the projected benefits, improvements in structural analysis at the conceptual design level were necessary; thus, NASA researchers developed the Hybrid wing body Conceptual Design and structural optimization (HCDstruct) tool to perform aeroservoelastic structural optimizations of advanced HWB concepts. In this paper, the authors present substantial updates to the HCDstruct tool and related analysis, including: the addition of four inboard and eight outboard control surfaces and two all-movable tail/rudder assemblies, providing a full aeroservoelastic analysis capability; the implementation of asymmetric load cases for structural sizing applications; and a methodology for minimizing control surface actuation power using NASTRAN SOL 200 and HCDstruct's aeroservoelastic finite-element model (FEM).
Risk analysis and bovine tuberculosis, a re-emerging zoonosis.
Etter, Eric; Donado, Pilar; Jori, Ferran; Caron, Alexandre; Goutard, Flavie; Roger, François
2006-10-01
The spread of immunodeficiency due to AIDS, together with the consequences of poverty for sanitary protection and information at both individual and state levels, has made control of tuberculosis (TB) one of the priorities of World Health Organization programs. The impact of bovine tuberculosis (BTB) on humans is poorly documented. However, BTB remains a major problem for livestock in developing countries, particularly in Africa, and wildlife is responsible for the failure of TB eradication programs. In Africa, the consumption of raw milk and raw meat, and the growth of bushmeat consumption as a cheap source of protein, represent the principal routes of human contamination with BTB. The exploration of these different pathways, using tools such as participatory epidemiology, allows a risk analysis of the impact of BTB on human health in Africa. This analysis represents a management support and decision tool in the study and control of zoonotic BTB.
Airphoto analysis of erosion control practices
NASA Technical Reports Server (NTRS)
Morgan, K. M.; Morris-Jones, D. R.; Lee, G. B.; Kiefer, R. W.
1980-01-01
The Universal Soil Loss Equation (USLE) is a widely accepted tool for erosion prediction and conservation planning. In this study, airphoto analysis of color and color infrared 70 mm photography at a scale of 1:60,000 was used to determine the erosion control practice factor in the USLE. Information about contour tillage, contour strip cropping, and grass waterways was obtained from aerial photography for Pheasant Branch Creek watershed in Dane County, Wisconsin.
Factors Controlling Sediment Load in The Central Anatolia Region of Turkey: Ankara River Basin.
Duru, Umit; Wohl, Ellen; Ahmadi, Mehdi
2017-05-01
Better understanding of the factors controlling sediment load at a catchment scale can facilitate estimation of soil erosion and sediment transport rates. The research summarized here enhances understanding of correlations between potential control variables and suspended sediment loads. The Soil and Water Assessment Tool was used to simulate flow and sediment in the Ankara River basin. Multivariable regression analysis and principal component analysis were then performed between sediment load and the controlling variables. The physical variables were either directly derived from a Digital Elevation Model, taken from field maps, or computed using established equations. The mean observed sediment rate is 6697 ton/year and the mean sediment yield is 21 ton/y/km² at the gage. The Soil and Water Assessment Tool satisfactorily simulated the observed sediment load, with Nash-Sutcliffe efficiency, relative error, and coefficient of determination (R²) values of 0.81, -1.55, and 0.93, respectively. Therefore, parameter values from the physically based model were applied to the multivariable regression analysis as well as the principal component analysis. The results indicate that stream flow, drainage area, and channel width explain most of the variability in sediment load among the catchments. The results imply that siltation management practices in the catchment should focus on stream flow, drainage area, and channel width.
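The regression step can be sketched in a few lines (synthetic data, not the basin datasets): regress sediment load on candidate controls with statsmodels and inspect which coefficients carry the variance.

```python
# Hedged sketch of the multivariable regression step, on synthetic data:
# regress sediment load on candidate physical controls and inspect which
# coefficients are significant. All values and the weak 'slope' control
# are invented for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 60
flow = rng.lognormal(2.0, 0.4, n)        # stream flow
area = rng.lognormal(5.0, 0.5, n)        # drainage area
width = rng.lognormal(1.5, 0.3, n)       # channel width
slope = rng.normal(0.01, 0.003, n)       # a weak control, for contrast

sediment = 3.0 * flow + 0.05 * area + 40.0 * width + rng.normal(0, 30, n)

X = sm.add_constant(np.column_stack([flow, area, width, slope]))
fit = sm.OLS(sediment, X).fit()
print(fit.summary(xname=["const", "flow", "area", "width", "slope"]))
```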
Dust control effectiveness of drywall sanding tools.
Young-Corbett, Deborah E; Nussbaum, Maury A
2009-07-01
In this laboratory study, four drywall sanding tools were evaluated in terms of dust generation rates in the respirable and thoracic size classes. In a repeated measures study design, 16 participants performed simulated drywall finishing tasks with each of four tools: (1) ventilated sander, (2) pole sander, (3) block sander, and (4) wet sponge. Dependent variables of interest were thoracic and respirable breathing zone dust concentrations. Analysis by Friedman's Test revealed that the ventilated drywall sanding tool produced significantly less dust, of both size classes, than did the other three tools. The pole and wet sanders produced significantly less dust of both size classes than did the block sander. The block sander, the most commonly used tool in drywall finishing operations, produced significantly more dust of both size classes than did the other three tools. When compared with the block sander, the other tools offer substantial dust reduction. The ventilated tool reduced respirable concentrations by 88% and thoracic concentrations by 85%. The pole sander reduced respirable concentrations by 58% and thoracic by 50%. The wet sander produced reductions of 60% and 47% in the respirable and thoracic classes, respectively. Wet sponge sanders and pole sanders are effective at reducing breathing-zone dust concentrations; however, based on its superior dust control effectiveness, the ventilated sander is the recommended tool for drywall finishing operations.
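The repeated-measures comparison reported above can be sketched as follows (simulated concentrations, not the study's data), using scipy's Friedman test with each participant measured under every tool:

```python
# Sketch of the repeated-measures analysis described above, on simulated
# dust concentrations (mg/m^3, invented): Friedman's test across the four
# tools, each participant measured under every tool.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n = 16                                    # participants
offset = rng.normal(0, 0.2, n)            # participant-to-participant variation
ventilated = 0.12 + offset + rng.normal(0, 0.05, n)
pole       = 0.42 + offset + rng.normal(0, 0.05, n)
wet        = 0.40 + offset + rng.normal(0, 0.05, n)
block      = 1.00 + offset + rng.normal(0, 0.05, n)

stat, p = stats.friedmanchisquare(ventilated, pole, wet, block)
print(f"Friedman chi-square = {stat:.1f}, p = {p:.2g}")
```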
NASA Astrophysics Data System (ADS)
Cheng, Kai; Niu, Zhi-Chao; Wang, Robin C.; Rakowski, Richard; Bateman, Richard
2017-09-01
Smart machining has tremendous potential and is becoming one of the new generation of high-value precision manufacturing technologies, in line with the advance of Industry 4.0 concepts. This paper presents some innovative design concepts and, in particular, the development of four types of smart cutting tools: a force-based smart cutting tool, a temperature-based internally-cooled cutting tool, a fast tool servo (FTS), and smart collets for ultraprecision and micro manufacturing purposes. Implementation and application perspectives of these smart cutting tools are explored and discussed, particularly for smart machining against a number of industrial application requirements: contamination-free machining, machining of tool-wear-prone Si-based infra-red devices and medical applications, high-speed micro milling, micro drilling, etc. Furthermore, implementation techniques are presented focusing on: (a) the plug-and-produce design principle and the associated smart control algorithms, (b) piezoelectric film and surface acoustic wave transducers to measure cutting forces in process, (c) critical cutting temperature control in real-time machining, (d) in-process calibration through machining trials, (e) FE-based design and analysis of smart cutting tools, and (f) application exemplars of adaptive smart machining.
An experiment with interactive planning models
NASA Technical Reports Server (NTRS)
Beville, J.; Wagner, J. H.; Zannetos, Z. S.
1970-01-01
Experiments on decision making in planning problems are described. Executives were tested in dealing with capital investment and competitive pricing decisions under conditions of uncertainty. A software package, the interactive risk analysis model system, was developed, and two controlled experiments were conducted. It is concluded that planning models can aid management; predicted uses of the models include serving as a central tool, serving as an educational tool, improving consistency in decision making, improving communications, and supporting consensus decision making.
Lira, Luis Henrique; Hirai, Flávio E; Oliveira, Marivaldo; Portellinha, Waldir; Nakano, Eliane Mayumi
2017-01-01
To identify the causes of a diffuse lamellar keratitis (DLK) outbreak using a systematic search tool in a case-control analysis. An Ishikawa diagram was used to guide physicians in determining the potential risk factors involved in this outbreak. Coherence between the occurrences and each possible cause listed in the diagram was verified, and the total number of eyes at risk was used to calculate the proportion of affected eyes. Multivariate analysis was performed using logistic regression to determine the independent effect of the risk factors, after controlling for confounders and testing interactions. All DLK cases were reported in 2007 between June 13 and December 21; during this period, 3,698 procedures were performed. Of the 1,682 flap-related procedures, 204 eyes of 141 individuals presented with DLK. No direct relationship was observed between the occurrence of DLK and the presence of any specific factor; however, flap-lifting enhancements, procedures performed during the morning shift, and non-use of therapeutic contact lenses after the surgery were significantly related to higher occurrence percentages of this condition. The Ishikawa diagram, like most quality tools, is a visualization and knowledge organization tool. This systematization allowed the investigators to thoroughly assess all the possible causes of the DLK outbreak. A clear view of the entire surgical logistics permitted even more rigid management of the main factors involved in the process and, as a result, highlighted factors that deserved attention. The case-control analysis of every factor raised by the Ishikawa diagram indicated that the commonly suspected factors, such as biofilm contamination of the water reservoir in autoclaves, the air-conditioning filter system, glove powder, microkeratome motor oil, and gentian violet markers, were not related to the outbreak.
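A minimal sketch of the multivariate logistic regression step, assuming a binary DLK outcome and indicator variables for the three factors found significant; all data and effect sizes below are invented for illustration:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200  # hypothetical number of eyes at risk
flap_lift = rng.integers(0, 2, n)
morning_shift = rng.integers(0, 2, n)
no_contact_lens = rng.integers(0, 2, n)
# Simulate DLK risk increasing with each factor (invented effect sizes).
logit = -2.0 + 1.2 * flap_lift + 0.8 * morning_shift + 0.9 * no_contact_lens
dlk = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([flap_lift, morning_shift, no_contact_lens]))
fit = sm.Logit(dlk, X).fit(disp=0)
print(np.exp(fit.params[1:]))  # odds ratios for the three risk factors
```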
Controlling the Transport of an Ion: Classical and Quantum Mechanical Solutions
2014-07-09
[Only extraction fragments of this document survive, referencing: control of quantum systems (tools, achievements, and limitations, Christiane P Koch); shortcuts to adiabaticity for an ion in a rotating radially-tight trap (M Palmero); and a feasibility analysis of quantum optimal control. Keywords: coherent control, ion traps, quantum information, optimal control theory.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riddle, F. J.
2003-06-26
The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.
RNA-seq Data: Challenges in and Recommendations for Experimental Design and Analysis.
Williams, Alexander G; Thomas, Sean; Wyman, Stacia K; Holloway, Alisha K
2014-10-01
RNA-seq is widely used to determine differential expression of genes or transcripts as well as identify novel transcripts, identify allele-specific expression, and precisely measure translation of transcripts. Thoughtful experimental design and choice of analysis tools are critical to ensure high-quality data and interpretable results. Important considerations for experimental design include number of replicates, whether to collect paired-end or single-end reads, sequence length, and sequencing depth. Common analysis steps in all RNA-seq experiments include quality control, read alignment, assigning reads to genes or transcripts, and estimating gene or transcript abundance. Our aims are two-fold: to make recommendations for common components of experimental design and assess tool capabilities for each of these steps. We also test tools designed to detect differential expression, since this is the most widespread application of RNA-seq. We hope that these analyses will help guide those who are new to RNA-seq and will generate discussion about remaining needs for tool improvement and development. Copyright © 2014 John Wiley & Sons, Inc.
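To make the normalization and filtering steps concrete, here is a small counts-per-million (CPM) sketch in Python; production pipelines typically rely on dedicated packages (e.g., DESeq2 or edgeR in R/Bioconductor), so this is purely illustrative and the gene and sample names are invented:

```python
import pandas as pd

# Hypothetical gene-by-sample matrix of raw read counts.
counts = pd.DataFrame(
    {"ctrl_1": [500, 0, 30], "ctrl_2": [480, 2, 25], "trt_1": [900, 1, 28]},
    index=["geneA", "geneB", "geneC"],
)
cpm = counts / counts.sum(axis=0) * 1e6   # normalize for library size
keep = (cpm > 1).sum(axis=1) >= 2         # require CPM > 1 in at least 2 samples
print(counts.loc[keep])                   # genes retained for downstream testing
```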
Using a formal requirements management tool for system engineering: first results at ESO
NASA Astrophysics Data System (ADS)
Zamparelli, Michele
2006-06-01
The attention paid to proper requirements analysis and maintenance is growing in modern astronomical undertakings. The increasing degree of complexity that current and future generations of projects have reached requires substantial system engineering effort and the use of all available technology to keep project development under control. One such technology is a tool which helps manage relationships between deliverables at various development stages, and across functional subsystems and disciplines as different as software, mechanics, optics and electronics. The immediate benefits are traceability and the possibility to perform impact analysis. An industrially proven tool for requirements management is presented, together with the first results across some projects at ESO and a cost/benefit analysis of its usage. Experience gathered so far shows that the extensibility and configurability of the tool on the one hand, and its integration with common documentation formats and standards on the other, make it appear a promising solution for even small-scale system development.
Cornelissen, Frans; Cik, Miroslav; Gustin, Emmanuel
2012-04-01
High-content screening has brought new dimensions to cellular assays by generating rich data sets that characterize cell populations in great detail and detect subtle phenotypes. To derive relevant, reliable conclusions from these complex data, it is crucial to have informatics tools supporting quality control, data reduction, and data mining. These tools must reconcile the complexity of advanced analysis methods with the user-friendliness demanded by the user community. After review of existing applications, we realized the possibility of adding innovative new analysis options. Phaedra was developed to support workflows for drug screening and target discovery, interact with several laboratory information management systems, and process data generated by a range of techniques including high-content imaging, multicolor flow cytometry, and traditional high-throughput screening assays. The application is modular and flexible, with an interface that can be tuned to specific user roles. It offers user-friendly data visualization and reduction tools for HCS but also integrates Matlab for custom image analysis and the Konstanz Information Miner (KNIME) framework for data mining. Phaedra features efficient JPEG2000 compression and full drill-down functionality from dose-response curves down to individual cells, with exclusion and annotation options, cell classification, statistical quality controls, and reporting.
Design and Control of Compliant Tensegrity Robots Through Simulation and Hardware Validation
NASA Technical Reports Server (NTRS)
Caluwaerts, Ken; Despraz, Jeremie; Iscen, Atil; Sabelhaus, Andrew P.; Bruce, Jonathan; Schrauwen, Benjamin; Sunspiral, Vytas
2014-01-01
To better understand the role of tensegrity structures in biological systems and their application to robotics, the Dynamic Tensegrity Robotics Lab at NASA Ames Research Center has developed and validated two different software environments for the analysis, simulation, and design of tensegrity robots. These tools, along with new control methodologies and the modular hardware components developed to validate them, are presented as a system for the design of actuated tensegrity structures. As evidenced by their appearance in many biological systems, tensegrity ("tensile-integrity") structures have unique physical properties which make them ideal for interaction with uncertain environments. Yet these characteristics, such as variable structural compliance and global multi-path load distribution through the tension network, make the design and control of bio-inspired tensegrity robots extremely challenging. This work presents the progress made in using these two tools to tackle the design and control challenges. The results of this analysis include multiple novel control approaches for mobility and terrain interaction of spherical tensegrity structures. The current hardware prototype of a six-bar tensegrity, code-named ReCTeR, is presented in the context of this validation.
Looking into the glass: glassware as an alcohol marketing tool, and the implications for policy.
Stead, Martine; Angus, Kathryn; Macdonald, Laura; Bauld, Linda
2014-01-01
To examine how glassware functions as a marketing tool. Content analysis of trade journals. Glassware is used as an integral part of marketing activity to recruit customers, revive brands, build profits and increase consumption. Glassware should be subject to the same control as other forms of marketing. Glasses could be re-engineered to promote safer drinking.
Consequent use of IT tools as a driver for cost reduction and quality improvements
NASA Astrophysics Data System (ADS)
Hein, Stefan; Rapp, Roberto; Feustel, Andreas
2013-10-01
The semiconductor industry expends a great deal of effort on cost reductions and quality improvements. The consequent use of IT tools is one way to support these goals. With the extension of its 150 mm fab to 200 mm, Robert Bosch increased the systematic use of data analysis and Advanced Process Control (APC).
CAESY - COMPUTER AIDED ENGINEERING SYSTEM
NASA Technical Reports Server (NTRS)
Wette, M. R.
1994-01-01
Many developers of software and algorithms for control system design have recognized that current tools have limits in both flexibility and efficiency. Many forces drive the development of new tools, including the desire to make complex system modeling, design, and analysis easier and the need for quicker turnaround time in analysis and design. Other considerations include the desire to make use of advanced computer architectures to help in control system design, adopt new methodologies in control, and integrate design processes (e.g., structure, control, optics). CAESY was developed to provide a means of evaluating methods for dealing with user needs in computer-aided control system design. It is an interpreter for performing engineering calculations and incorporates features of both Ada and MATLAB. It is designed to be reasonably flexible and powerful. CAESY includes internally defined functions and procedures, as well as user-defined ones. Support for matrix calculations is provided in the same manner as in MATLAB. However, the development of CAESY is a research project, and while it provides some features which are not found in commercially sold tools, it does not exhibit the robustness that many commercially developed tools provide. CAESY is written in the C language for use on Sun4 series computers running SunOS 4.1.1 and later. The program is designed to optionally use the LAPACK math library. The LAPACK math routines are available through anonymous ftp from research.att.com. CAESY requires 4Mb of RAM for execution. The standard distribution medium is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. CAESY was developed in 1993 and is a copyrighted work with all copyright vested in NASA.
Piazza, Rocco; Magistroni, Vera; Pirola, Alessandra; Redaelli, Sara; Spinelli, Roberta; Redaelli, Serena; Galbiati, Marta; Valletta, Simona; Giudici, Giovanni; Cazzaniga, Giovanni; Gambacorti-Passerini, Carlo
2013-01-01
Copy number alterations (CNA) are common events in leukaemias and solid tumors. Comparative Genomic Hybridization (CGH) is currently the gold-standard technique for analyzing CNAs; however, CGH analysis requires dedicated instruments and is able to perform only low-resolution Loss of Heterozygosity (LOH) analyses. Here we present CEQer (Comparative Exome Quantification analyzer), a new graphical, event-driven tool for coupled CNA/allelic-imbalance (AI) analysis of exome sequencing data. By using case-control matched exome data, CEQer performs a comparative digital exonic quantification to generate CNA data and couples this information with exome-wide LOH and allelic imbalance detection. These data are used to build mixed statistical/heuristic models allowing the identification of CNA/AI events. To test our tool, we initially used in silico generated data; we then performed whole-exome sequencing of 20 leukemic specimens and corresponding matched controls and analyzed the results using CEQer. Taken globally, these analyses showed that the combined use of comparative digital exon quantification and LOH/AI detection generates very accurate CNA data. We therefore propose CEQer as an efficient, robust and user-friendly graphical tool for the identification of CNA/AI in whole-exome sequencing data.
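The comparative digital exonic quantification at the heart of CEQer can be sketched, in much simplified form, as a per-exon log2 ratio of library-size-normalized case versus control coverage; the names and interpretation thresholds below are illustrative, not CEQer's internals:

```python
import numpy as np

def exon_log2_ratios(case_counts, control_counts):
    """Per-exon log2 ratio of normalized case vs. control read counts.

    Values near 0 are consistent with two copies; sustained runs of
    positive or negative values suggest candidate gains or losses.
    """
    case = np.asarray(case_counts, float)
    ctrl = np.asarray(control_counts, float)
    case_norm = case / case.sum()   # normalize for sequencing depth
    ctrl_norm = ctrl / ctrl.sum()
    pseudo = 1e-9                   # guard against uncovered exons
    return np.log2((case_norm + pseudo) / (ctrl_norm + pseudo))

print(exon_log2_ratios([120, 40, 95, 10], [100, 80, 90, 22]))
```

In CEQer these ratios are further combined with LOH/AI calls and statistical/heuristic models before an event is reported.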
Distributed environmental control
NASA Technical Reports Server (NTRS)
Cleveland, Gary A.
1992-01-01
We present an architecture of distributed, independent control agents designed to work with the Computer Aided System Engineering and Analysis (CASE/A) simulation tool. CASE/A simulates behavior of Environmental Control and Life Support Systems (ECLSS). We describe a lattice of agents capable of distributed sensing and overcoming certain sensor and effector failures. We address how the architecture can achieve the coordinating functions of a hierarchical command structure while maintaining the robustness and flexibility of independent agents. These agents work between the time steps of the CASE/A simulation tool to arrive at command decisions based on the state variables maintained by CASE/A. Control is evaluated according to both effectiveness (e.g., how well temperature was maintained) and resource utilization (the amount of power and materials used).
NASA Astrophysics Data System (ADS)
Wrobel, P. M.; Bogovac, M.; Sghaier, H.; Leani, J. J.; Migliori, A.; Padilla-Alvarez, R.; Czyzycki, M.; Osan, J.; Kaiser, R. B.; Karydas, A. G.
2016-10-01
A new synchrotron beamline end-station for multipurpose X-ray spectrometry applications has recently been commissioned and is currently accessible by end-users at the XRF beamline of Elettra Sincrotrone Trieste. The end-station consists of an ultra-high-vacuum chamber whose main instrument is a seven-axis motorized manipulator for sample and detector positioning, along with different kinds of X-ray detectors and optical cameras. The beamline end-station allows measurements with different X-ray spectrometry techniques such as Microscopic X-Ray Fluorescence analysis (μXRF), Total Reflection X-Ray Fluorescence analysis (TXRF), Grazing Incidence/Exit X-Ray Fluorescence analysis (GI-XRF/GE-XRF), X-Ray Reflectometry (XRR), and X-Ray Absorption Spectroscopy (XAS). A LabVIEW Graphical User Interface (GUI) bound to the Tango control system and consisting of many custom-made software modules is utilized as a user-friendly tool for controlling the entire end-station hardware. The present work describes this advanced Tango and LabVIEW software platform, which exploits in an optimal, synergistic manner the merits and functionality of these well-established programming and equipment-control tools.
Messai, Habib; Farman, Muhammad; Sarraj-Laabidi, Abir; Hammami-Semmar, Asma; Semmar, Nabil
2016-11-17
Olive oils (OOs) show high chemical variability due to several factors of genetic, environmental and anthropic types. Genetic and environmental factors are responsible for natural compositions and polymorphic diversification, resulting in different varietal patterns and phenotypes. Anthropic factors, however, are at the origin of different blend preparations, leading to normative, labelled or adulterated commercial products. Control of complex OO samples requires their (i) characterization by specific markers; (ii) authentication by fingerprint patterns; and (iii) monitoring by traceability analysis. These quality control and management aims require the use of several multivariate statistical tools: highlighting specificity requires ordination methods; checking authentication calls for classification and pattern recognition methods; traceability analysis implies the use of network-based approaches able to separate or extract mixed information and memorized signals from complex matrices. This chapter presents a review of different chemometrics methods applied to the control of OO variability on the basis of measured metabolic and physical-chemical characteristics. The different chemometrics methods are illustrated by study cases on monovarietal and blended OOs originating from different countries. Chemometrics tools offer multiple ways for quantitative evaluation and qualitative control of the complex chemical variability of OO in relation to several intrinsic and extrinsic factors.
Development of a Comprehensive Digital Avionics Curriculum for the Aeronautical Engineer
2006-03-01
[Only extraction fragments of this curriculum document survive: graduates should be able to analyze and design aircraft and missile guidance and control systems, including feedback stabilization schemes and stochastic processes; topics include uncertainty modeling for robust control, robust closed-loop stability and performance, robust H-infinity control, and robustness checks using mu-analysis; further fragments mention controlled feedback (reduces noise) and statistical group response (reduces pressure toward conformity) when used as a tool to study a complex problem.]
Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability
NASA Technical Reports Server (NTRS)
Pak, Chan-gi; Li, Wesley; Lung, Shun-fat
2008-01-01
An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and to leverage existing commercial as well as in-house codes, enabling true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and completely integrated into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities, formulated as an optimization problem, have been successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulation for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.
NASA Astrophysics Data System (ADS)
Vines, Aleksander; Hansen, Morten W.; Korosov, Anton
2017-04-01
Existing international and Norwegian infrastructure projects, e.g., NorDataNet, NMDC and NORMAP, provide open data access through the OPeNDAP protocol following the CF (Climate and Forecast) metadata conventions, designed to promote the processing and sharing of files created with the NetCDF application programming interface (API). This approach is now also being implemented in the Norwegian Sentinel Data Hub (satellittdata.no) to provide satellite EO data to the user community. Simultaneously with providing simplified and unified data access, these projects also seek to use and establish common standards for use and discovery metadata. This in turn allows the development of standardized tools for data search and (subset) streaming over the internet to perform actual scientific analysis. A combination of software tools, which we call a Scientific Platform as a Service (SPaaS), takes advantage of these opportunities to harmonize and streamline the search, retrieval and analysis of integrated satellite and auxiliary observations of the oceans in a seamless system. The SPaaS is a cloud solution for the integration of analysis tools with scientific datasets via an API. The core part of the SPaaS is a distributed metadata catalog storing granular metadata that describes the structure, location and content of available satellite, model, and in situ datasets. The analysis tools include software for visualization (also online), interactive in-depth analysis, and server-based processing chains. The API conveys search requests between system nodes (i.e., interactive and server tools) and provides easy access to the metadata catalog, data repositories, and the tools. The SPaaS components are integrated in virtual machines, whose provisioning and deployment are automated using existing state-of-the-art open-source tools (e.g., Vagrant, Ansible, Docker). The open-source code for the scientific tools and virtual machine configurations is under version control at https://github.com/nansencenter/, and is coupled to an online continuous integration system (e.g., Travis CI).
Controllability analysis and decentralized control of a wet limestone flue gas desulfurization plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perales, A.L.V.; Ortiz, F.J.G.; Ollero, P.
2008-12-15
Presently, decentralized feedback control is the only control strategy used in wet limestone flue gas desulfurization (WLFGD) plants. Proper tuning of this control strategy is becoming an important issue in WLFGD plants because more stringent SO₂ regulations have recently come into force. Controllability analysis is a highly valuable tool for the proper design of control systems, but it has not been applied to WLFGD plants so far. In this paper a decentralized control strategy is designed and applied to a WLFGD pilot plant, taking into account the conclusions of a controllability analysis. The results reveal that good SO₂ control in WLFGD plants can be achieved mainly because the main disturbance of the process is well-aligned with the plant and interactions between control loops are beneficial to SO₂ control.
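The abstract does not state which controllability measures were used, but a standard tool for assessing loop interaction and pairing in a multivariable plant such as a WLFGD unit is the relative gain array (RGA); the sketch below is a generic illustration with invented gains:

```python
import numpy as np

def rga(G):
    """Relative gain array: elementwise product of G and the transpose of inv(G)."""
    G = np.asarray(G, float)
    return G * np.linalg.inv(G).T

# Hypothetical 2x2 steady-state gain matrix: inputs = limestone slurry flow and
# recycle slurry flow; outputs = SO₂ removal and slurry pH (numbers invented).
G = np.array([[0.8, 0.3],
              [0.2, 0.9]])
print(rga(G))  # diagonal elements near 1 favour the diagonal input-output pairing
```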
Kim, Eung-Sam; Ahn, Eun Hyun; Chung, Euiheon; Kim, Deok-Ho
2013-01-01
Nanotechnology-based tools are beginning to emerge as promising platforms for quantitative high-throughput analysis of live cells and tissues. Despite unprecedented progress made over the last decade, a challenge still lies in integrating emerging nanotechnology-based tools into macroscopic biomedical apparatuses for practical purposes in biomedical sciences. In this review, we discuss the recent advances and limitations in the analysis and control of mechanical, biochemical, fluidic, and optical interactions in the interface areas of nanotechnology-based materials and living cells in both in vitro and in vivo settings. PMID:24258011
Toxic release consequence analysis tool (TORCAT) for inherently safer design plant.
Shariff, Azmi Mohd; Zaini, Dzulkarnain
2010-10-15
Many past major accidents involving toxic releases have caused numerous fatalities, such as the methyl isocyanate (MIC) release tragedy in Bhopal, India (1984). One approach is the inherently safer design technique, which applies inherent safety principles to eliminate or minimize accidents rather than control the hazard. This technique is best implemented in the preliminary design stage, where the consequences of a toxic release can be evaluated and the necessary design improvements implemented to eliminate or minimize accidents to as low as reasonably practicable (ALARP) without resorting to a costly protective system. However, no commercial tool with such capability is currently available. This paper reports preliminary findings on the development of a prototype tool for consequence analysis and design improvement via inherent safety principles, built by integrating a process design simulator with a toxic release consequence analysis model. Consequence analyses based on worst-case scenarios during the process flowsheeting stage were conducted as case studies. The preliminary findings show that the toxic release consequence analysis tool (TORCAT) has the capability to eliminate or minimize potential toxic release accidents by adopting inherent safety principles early in the preliminary design stage. Copyright © 2010 Elsevier B.V. All rights reserved.
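TORCAT's internal dispersion model is not disclosed in the abstract; a common starting point for toxic release consequence analysis is a Gaussian plume estimate of ground-level concentration, sketched here with simplified, invented inputs:

```python
import numpy as np

def ground_level_concentration(Q, u, y, H, sigma_y, sigma_z):
    """Gaussian plume concentration (kg/m^3) at a ground-level receptor.

    Q: release rate (kg/s); u: wind speed (m/s); y: crosswind offset (m);
    H: effective release height (m); sigma_y, sigma_z: dispersion coefficients
    (m), which grow with downwind distance and atmospheric stability class.
    """
    lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = 2.0 * np.exp(-H**2 / (2.0 * sigma_z**2))  # ground-reflection term
    return Q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Example: 5 kg/s release, 3 m/s wind, receptor on the plume centreline.
print(ground_level_concentration(Q=5.0, u=3.0, y=0.0, H=10.0, sigma_y=30.0, sigma_z=15.0))
```

Comparing such concentrations against published toxic endpoints at the flowsheeting stage is what allows candidate design changes to be evaluated against the ALARP goal.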
Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam
2016-01-01
Operational faults and behavioural anomalies associated with PLC control processes often take place in manufacturing systems. Real-time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called the PLC Log-Data Analysis Tool (PLAT), that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash-table-based indexing and searching scheme for these purposes. Our experiments show that PLAT is significantly fast, provides real-time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel with the data logging system to identify operational faults and behavioural anomalies effectively.
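A minimal sketch of the hash-table indexing idea, assuming each log record is a (timestamp, signal vector) pair; PLAT's actual data structures are not published in this abstract:

```python
from collections import defaultdict

class SignalIndex:
    """Hash-table index of PLC signal patterns for O(1)-average lookups."""

    def __init__(self):
        # Maps a tuple of signal values to the timestamps where it occurred.
        self._table = defaultdict(list)

    def add(self, timestamp, signals):
        self._table[tuple(signals)].append(timestamp)

    def contains(self, signals):
        return tuple(signals) in self._table

# Build a nominal model from a fault-free log...
nominal = SignalIndex()
nominal.add(0.0, [1, 0, 0])
nominal.add(0.5, [1, 1, 0])

# ...then flag live records whose signal pattern never occurred nominally.
print(nominal.contains([1, 0, 1]))  # False -> candidate behavioural anomaly
```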
ISAC: A tool for aeroservoelastic modeling and analysis
NASA Technical Reports Server (NTRS)
Adams, William M., Jr.; Hoadley, Sherwood Tiffany
1993-01-01
The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules are discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. The resulting linear time-invariant state-space equations of motion are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle, illustrating some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.
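The rational-function options mentioned above are commonly implemented, once the lag roots are fixed, as a linear least-squares fit of Roger's form; the scalar sketch below illustrates that general technique and is not ISAC's code:

```python
import numpy as np

def fit_roger(k, Q, lags):
    """Fit Q(ik) ~ A0 + A1*(ik) + A2*(ik)^2 + sum_j A_{2+j}*(ik)/(ik + b_j).

    k: reduced frequencies; Q: complex unsteady-aerodynamic data at those
    frequencies; lags: fixed positive lag roots b_j. Returns real coefficients.
    """
    k = np.asarray(k, float)
    cols_re = [np.ones_like(k), np.zeros_like(k), -k**2]
    cols_im = [np.zeros_like(k), k, np.zeros_like(k)]
    for b in lags:
        denom = b**2 + k**2
        cols_re.append(k**2 / denom)   # Re{ik/(ik + b)}
        cols_im.append(k * b / denom)  # Im{ik/(ik + b)}
    A = np.vstack([np.column_stack(cols_re), np.column_stack(cols_im)])
    rhs = np.concatenate([np.asarray(Q).real, np.asarray(Q).imag])
    coef, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return coef
```

Each lag term then contributes an augmented state when the fit is realized as linear time-invariant state-space equations of motion.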
Playbook Data Analysis Tool: Collecting Interaction Data from Extremely Remote Users
NASA Technical Reports Server (NTRS)
Kanefsky, Bob; Zheng, Jimin; Deliz, Ivonne; Marquez, Jessica J.; Hillenius, Steven
2017-01-01
Typically, user tests for software tools are conducted in person. At NASA, the users may be located at the bottom of the ocean in a pressurized habitat, above the atmosphere in the International Space Station, or in an isolated capsule on a simulated asteroid mission. The Playbook Data Analysis Tool (P-DAT) is a human-computer interaction (HCI) evaluation tool that the NASA Ames HCI Group has developed to record user interactions with Playbook, the group's existing planning-and-execution software application. Once the remotely collected user interaction data makes its way back to Earth, researchers can use P-DAT for in-depth analysis. Since a critical component of the Playbook project is to understand how to develop more intuitive software tools for astronauts to plan in space, P-DAT helps guide us in the development of additional easy-to-use features for Playbook, informing the design of future crew autonomy tools. P-DAT has demonstrated the capability of discreetly capturing usability data in a manner that is transparent to Playbook's end-users. In our experience, P-DAT data has already shown its utility, revealing potential usability patterns, helping diagnose software bugs, and identifying metrics and events that are pertinent to Playbook usage as well as spaceflight operations. As we continue to develop this analysis tool, P-DAT may yet provide a method for long-duration, unobtrusive human performance collection and evaluation for mission controllers back on Earth and researchers investigating the effects and mitigations related to future human spaceflight performance.
NASA Astrophysics Data System (ADS)
Lukyanov, A. A.; Grigoriev, S. N.; Bobrovskij, I. N.; Melnikov, P. A.; Bobrovskij, N. M.
2017-05-01
As new technology becomes more complex and its reliability requirements increase, the laboriousness of control operations in industrial quality control systems grows significantly. Quality management control is important because it promotes the correct use of production conditions and compliance with the relevant requirements. Digital image processing makes it possible to reach a new technological level of production. Automated interpretation of information, the most complicated step, is the basis for decision-making in the management of production processes. The case of surface analysis of tools used for processing with metalworking fluids (MWF) is more complicated still. The authors suggest a new algorithm for optical inspection of the wear of the cylindrical burnishing tool, which is used in surface plastic deformation without MWF. The main advantage of the proposed algorithm is the possibility of automatic recognition of images of the burnishing tool, with subsequent detection of its boundaries, location of the working surface, and automatic identification of defects and the wear area. Software implementing the algorithm was developed by the authors in the MATLAB programming environment, but it can be implemented using other programming languages.
Development of CCSDS DCT to Support Spacecraft Dynamic Events
NASA Technical Reports Server (NTRS)
Sidhwa, Anahita F
2011-01-01
This report discusses the development of the Consultative Committee for Space Data Systems (CCSDS) Design Control Table (DCT) to support spacecraft dynamic events. The CCSDS DCT is a versatile link calculation tool for analyzing different kinds of radio frequency links. It started out as an Excel-based program and is now being evolved into a Mathematica-based link analysis tool. The Mathematica platform offers a rich set of advanced analysis capabilities and can be easily extended to a web-based architecture. Last year, the CCSDS DCTs for the uplink, downlink, two-way, and ranging models were developed, as well as the corresponding input and output interfaces. Another significant accomplishment is the integration of the NAIF SPICE library into the Mathematica computation platform.
Norén, G Niklas; Bergvall, Tomas; Ryan, Patrick B; Juhlin, Kristina; Schuemie, Martijn J; Madigan, David
2013-10-01
Observational healthcare data offer the potential to identify adverse drug reactions that may be missed by spontaneous reporting. The self-controlled cohort analysis within the Temporal Pattern Discovery framework compares the observed-to-expected ratio of medical outcomes during post-exposure surveillance periods with those during a set of distinct pre-exposure control periods in the same patients. It utilizes an external control group to account for systematic differences between the different time periods, thus combining within- and between-patient confounder adjustment in a single measure. To evaluate the performance of the calibrated self-controlled cohort analysis within Temporal Pattern Discovery as a tool for risk identification in observational healthcare data. Different implementations of the calibrated self-controlled cohort analysis were applied to 399 drug-outcome pairs (165 positive and 234 negative test cases across 4 health outcomes of interest) in 5 real observational databases (four with administrative claims and one with electronic health records). Performance was evaluated on real data through sensitivity/specificity, the area under the receiver operating characteristic curve (AUC), and bias. The calibrated self-controlled cohort analysis achieved good predictive accuracy across the outcomes and databases under study. The optimal design based on this reference set uses a 360-day surveillance period and a single control period 180 days prior to new prescriptions. It achieved an average AUC of 0.75 and an AUC >0.70 in all but one scenario. A design with three separate control periods performed better for the electronic health records database and for acute renal failure across all data sets. The estimates for negative test cases were generally unbiased, but a minor negative bias of up to 0.2 on the RR scale was observed in the configurations using multiple control periods, for acute liver injury and upper gastrointestinal bleeding. The calibrated self-controlled cohort analysis within Temporal Pattern Discovery shows promise as a tool for risk identification; it performs well at discriminating positive from negative test cases. The optimal parameter configuration may vary with the data set and medical outcome of interest.
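Stripped of the external-control calibration, the core observed-to-expected comparison can be sketched as follows; the continuity correction and the counts are illustrative assumptions, not the paper's method or data:

```python
def self_controlled_ratio(obs_post, exp_post, obs_pre, exp_pre):
    """Ratio of post-exposure to pre-exposure observed/expected outcome rates.

    A value well above 1 flags a potential adverse drug reaction signal;
    the +0.5 terms are a simple guard against zero counts.
    """
    oe_post = (obs_post + 0.5) / (exp_post + 0.5)
    oe_pre = (obs_pre + 0.5) / (exp_pre + 0.5)
    return oe_post / oe_pre

# E.g., 12 outcomes observed vs. 6 expected in a 360-day surveillance period,
# 4 observed vs. 5 expected in a 180-day pre-exposure control period.
print(round(self_controlled_ratio(12, 6, 4, 5), 2))
```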
Prioritization of malaria endemic zones using self-organizing maps in the Manipur state of India.
Murty, Upadhyayula Suryanarayana; Srinivasa Rao, Mutheneni; Misra, Sunil
2008-09-01
A huge amount of epidemiological and public health data is now available, and it requires analysis and interpretation with appropriate mathematical tools to support existing methods for controlling mosquitoes and mosquito-borne diseases more effectively; data-mining tools are used to make sense of this chaos. Using data-mining tools, one can develop predictive models, patterns, association rules, and clusters of diseases, which can help decision-makers in controlling the diseases. This paper focuses on the application of data-mining tools, used here for the first time to prioritize the malaria endemic regions of Manipur state, by means of Self-Organizing Maps (SOM). The SOM results (two-dimensional images called Kohonen maps) clearly show the visual classification of malaria endemic zones into high, medium and low in the different districts of Manipur, and are discussed in the paper.
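A SOM clustering along these lines can be reproduced with the open-source MiniSom package; the per-area indicators below are hypothetical stand-ins for the paper's epidemiological variables:

```python
import numpy as np
from minisom import MiniSom  # pip install minisom

# Hypothetical per-area indicators: cases per 1000, vector density index,
# and forest cover fraction (one row per area).
data = np.array([
    [12.1, 3.4, 0.80],
    [ 1.2, 0.6, 0.10],
    [ 8.7, 2.9, 0.70],
    [ 0.4, 0.2, 0.05],
])
data = (data - data.mean(axis=0)) / data.std(axis=0)  # standardize features

som = MiniSom(3, 3, input_len=3, sigma=1.0, learning_rate=0.5, random_seed=1)
som.train_random(data, 500)
# Areas mapping to the same (or neighbouring) nodes on the Kohonen map fall
# into the same endemicity cluster (e.g., high, medium, low).
for i, row in enumerate(data):
    print(f"area {i} -> node {som.winner(row)}")
```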
Muir-Paulik, S A; Johnson, L E A; Kennedy, P; Aden, T; Villanueva, J; Reisdorf, E; Humes, R; Moen, A C
2016-01-01
The 2005 International Health Regulations (IHR 2005) emphasized the importance of laboratory capacity to detect emerging diseases, including novel influenza viruses. To support IHR 2005 requirements and the need to enhance influenza laboratory surveillance capacity, the Association of Public Health Laboratories (APHL) and the Centers for Disease Control and Prevention (CDC) Influenza Division developed the International Influenza Laboratory Capacity Review (Tool). Data from 37 assessments were reviewed and analyzed to verify that the quantitative analysis results accurately depicted a laboratory's capacity and capabilities. Subject matter experts in influenza and laboratory practice used an iterative approach to develop the Tool, incorporating feedback and lessons learnt through piloting and implementation. To systematically analyze assessment data, a quantitative framework for analysis was added to the Tool. The review indicated that changes in scores consistently reflected enhanced or decreased capacity. The review process also validated the utility of adding a quantitative analysis component to the assessments and the benefit of establishing a baseline from which to compare future assessments in a standardized way. Use of the Tool has provided APHL, CDC and each assessed laboratory with a standardized analysis of the laboratory's capacity. The information generated is used to improve laboratory systems for laboratory testing and enhance influenza surveillance globally. We describe the development of the Tool and lessons learnt. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Development and Testing of Control Laws for the Active Aeroelastic Wing Program
NASA Technical Reports Server (NTRS)
Dibley, Ryan P.; Allen, Michael J.; Clarke, Robert; Gera, Joseph; Hodgkinson, John
2005-01-01
The Active Aeroelastic Wing research program was a joint program between the U.S. Air Force Research Laboratory and NASA established to investigate the characteristics of an aeroelastic wing and the technique of using wing twist for roll control. The flight test program employed an F/A-18 aircraft modified by reducing the wing torsional stiffness and adding a custom research flight control system. The research flight control system was optimized to maximize roll rate using only wing surfaces to twist the wing while simultaneously maintaining design load limits, stability margins, and handling qualities. NASA Dryden Flight Research Center developed control laws using the software design tool CONDUIT, which employs multi-objective function optimization to tune selected control system design parameters. Modifications were made to the Active Aeroelastic Wing implementation in this software design tool to incorporate the NASA Dryden Flight Research Center nonlinear F/A-18 simulation for time-history analysis. This paper describes the design process, including how the control law requirements were incorporated into constraints for the optimization in this specific software design tool. Predicted performance is also compared to results from flight.
Dong, Xing; Zhang, Kevin; Ren, Yuan; Wilson, Reda; O'Neil, Mary Elizabeth
2016-01-01
Studying population-based cancer survival by leveraging the high-quality cancer incidence data collected by the Centers for Disease Control and Prevention's National Program of Cancer Registries (NPCR) can offer valuable insight into the cancer burden and impact in the United States. We describe the development and validation of a SAS macro tool that calculates population-based, cancer-site-specific relative survival estimates comparable to those obtained through SEER*Stat. The NPCR relative survival analysis SAS tool (NPCR SAS tool) was developed based on the relative survival method and SAS macros developed by Paul Dickman. NPCR cancer incidence data from 25 states submitted in November 2012 were used, specifically cases diagnosed from 2003 to 2010 with follow-up through 2010. Decennial and annual complete life tables published by the National Center for Health Statistics (NCHS) for 2000 through 2009 were used. To assess comparability between the 2 tools, 5-year relative survival rates were calculated for 25 cancer sites by sex, race, and age group using the NPCR SAS tool and the National Cancer Institute's SEER*Stat 8.1.5 software. A module to create data files for SEER*Stat was also developed for the NPCR SAS tool. Comparison of the results produced by both SAS and SEER*Stat showed comparable and reliable relative survival estimates for NPCR data. For a majority of the sites, the net differences between the NPCR SAS tool and SEER*Stat-produced relative survival estimates ranged from -0.1% to 0.1%. The estimated standard errors were highly comparable between the 2 tools as well. The NPCR SAS tool will allow researchers to accurately estimate cancer 5-year relative survival estimates that are comparable to those produced by SEER*Stat for NPCR data. Comparison of output from the NPCR SAS tool and SEER*Stat provided additional quality control capabilities for evaluating data prior to producing NPCR relative survival estimates.
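In the Dickman framework, relative survival is the observed all-cause survival of the cancer cohort divided by the survival expected from population life tables; a toy interval-based calculation (illustrative only, not the NPCR SAS macro):

```python
import numpy as np

def cumulative_relative_survival(observed_interval_surv, expected_interval_surv):
    """Cumulative relative survival from annual interval survival proportions."""
    obs = np.cumprod(observed_interval_surv)  # cumulative observed survival
    exp = np.cumprod(expected_interval_surv)  # cumulative expected survival
    return obs / exp

# Five annual intervals: cohort survival vs. life-table expected survival.
obs = [0.90, 0.92, 0.94, 0.95, 0.96]
exp = [0.98, 0.98, 0.97, 0.97, 0.97]
print(cumulative_relative_survival(obs, exp))  # last value = 5-year relative survival
```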
Understanding and Improving Knowledge Transactions in Command and Control
2003-06-01
[Only extraction fragments of this report survive: implications for the development of tools to facilitate efficient and effective knowledge exchange; cognitive task analysis (CTA) in support of decision makers; quotes taken from the K-Web cognitive task analysis, the Global 2000 and Global 2001 War Games, and interviews with Carl Vinson K-Web users.]
Paliwoda, Michelle; New, Karen; Bogossian, Fiona
2016-09-01
All newborns are at risk of deterioration as a result of failing to make the transition to extrauterine life. Signs of deterioration can be subtle and easily missed. It has been postulated that the use of an Early Warning Tool may assist clinicians in recognising and responding to signs of deterioration earlier in neonates, thereby preventing a serious adverse event. To examine whether observations from a Standard Observation Tool, applied to three neonatal Early Warning Tools, would hypothetically trigger an escalation of care more frequently than the actual escalation of care under the Standard Observation Tool. A retrospective case-control study. A maternity unit in a tertiary public hospital in Australia. Neonates born in 2013 at greater than or equal to 34(+0) weeks gestation, admitted directly to the maternity ward from their birthing location and whose subsequent deterioration required admission to the neonatal unit, were identified as cases from databases of the study hospital. Each case was matched with three controls, inborn during the same period, who did not experience deterioration and neonatal unit admission. Clinical and physiological data recorded on a Standard Observation Tool, from the time of admission to the maternity ward, for cases and controls were charted onto each of three Early Warning Tools. The primary outcome was whether the tool 'triggered an escalation of care'. Descriptive statistics (n, %, mean and SD) were employed. Cases (n=26) comprised late preterm, early term and post-term neonates and were matched by gestational age group with three controls each (n=78). Overall, the Standard Observation Tool triggered an escalation of care for 92.3% of cases, compared to the Early Warning Tools: New South Wales Health, 80.8%; United Kingdom Newborn Early Warning Chart, 57.7%; and the Australian Capital Territory Neonatal Early Warning Score, 11.5%. Subgroup analysis by gestational age found differences between the tools in hypothetically triggering an escalation of care. The Standard Observation Tool triggered an escalation of care more frequently than the Early Warning Tools, which may be a result of behavioural data that were captured and escalated on the Standard Observation Tool but could not be on the Early Warning Tools. The findings demonstrate that a single tool applied across all gestational age ranges may not be effective in identifying early deterioration, or may over-trigger an escalation of care. Further research is required into the sensitivity and specificity of Early Warning Tools in neonatal sub-populations. Copyright © 2016 Elsevier Ltd. All rights reserved.
Development of the Surface Management System Integrated with CTAS Arrival Tools
NASA Technical Reports Server (NTRS)
Jung, Yoon C.; Jara, Dave
2005-01-01
The Surface Management System (SMS), developed by NASA Ames Research Center in coordination with the Federal Aviation Administration (FAA), is a decision support tool to help tower traffic coordinators and Ground/Local controllers manage and control airport surface traffic in order to increase capacity, efficiency, and flexibility. SMS provides common situation awareness to personnel at various air traffic control facilities such as airport traffic control towers (ATCTs), airline ramp towers, Terminal Radar Approach Control (TRACON), and Air Route Traffic Control Center (ARTCC). SMS also provides a traffic management tool to assist ATCT traffic management coordinators (TMCs) in making decisions such as airport configuration and runway load balancing. Build 1 of the SMS tool was installed and successfully tested at Memphis International Airport (MEM) and received high acceptance scores from ATCT controllers and coordinators, as well as airline ramp controllers. NASA Ames Research Center continues to develop SMS under NASA's Strategic Airspace Usage (SAU) project in order to improve its prediction accuracy and robustness under various modeling uncertainties. This paper reports the recent development effort performed by NASA Ames Research Center: 1) integration of Center TRACON Automation System (CTAS) capability with SMS and 2) an alternative approach to obtaining airline gate information through a publicly available website. Preliminary analysis of air/surface traffic data at the DFW airport has shown significant improvement in predicting airport arrival demand and IN time at the gate. This paper concludes with recommendations for future research and development.
1988-11-01
[Only extraction fragments of this report survive, including table-of-contents entries (ANALYSIS RESTART; TITLE CARD; CONTROL CARDS) and text noting that an effective stress soil model will provide a tool for the analysis of waterfront structures, that understanding the significance of liquefaction is important, and that implementing this effective stress soil model in a finite element computer program would allow analysis of soil and structure together.]
Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application
NASA Technical Reports Server (NTRS)
Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond
2018-01-01
The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbofan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.
Analysis of ChIP-seq Data in R/Bioconductor.
de Santiago, Ines; Carroll, Thomas
2018-01-01
The development of novel high-throughput sequencing methods for ChIP (chromatin immunoprecipitation) has provided a very powerful tool to study gene regulation in multiple conditions at unprecedented resolution and scale. Proactive quality control and appropriate data analysis techniques are of critical importance for extracting the most meaningful results from the data. Over the last few years, an array of R/Bioconductor tools has been developed allowing researchers to process and analyze ChIP-seq data. This chapter provides an overview of the methods available to analyze ChIP-seq data based primarily on software packages from the open-source Bioconductor project. Protocols described in this chapter cover basic steps including data alignment, peak calling, quality control and data visualization, as well as more complex methods such as the identification of differentially bound regions and functional analyses to annotate regulatory regions. The steps in the data analysis process are demonstrated on publicly available data sets and illustrate the computational procedures routinely used for the analysis of ChIP-seq data in R/Bioconductor, from which readers can construct their own analysis pipelines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. Alfonsi; C. Rabiti; D. Mandelli
The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed thermal-hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that provides several capabilities: deriving and actuating the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring and control in the phase space; performing both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and facilitating input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.
Mason, Alexina J; Gomes, Manuel; Grieve, Richard; Ulug, Pinar; Powell, Janet T; Carpenter, James
2017-08-01
The analyses of randomised controlled trials with missing data typically assume that, after conditioning on the observed data, the probability of missing data does not depend on the patient's outcome, and so the data are 'missing at random'. This assumption is usually implausible, for example because patients in relatively poor health may be more likely to drop out. Methodological guidelines recommend that trials undertake sensitivity analysis, best informed by elicited expert opinion, to assess whether conclusions are robust to alternative assumptions about the missing data. A major barrier to implementing these methods in practice is the lack of relevant practical tools for eliciting expert opinion. We develop a new practical tool for eliciting expert opinion and demonstrate its use for randomised controlled trials with missing data. We develop and illustrate our approach with the IMPROVE trial (ISRCTN 48334791), an ongoing multi-centre randomised controlled trial which compares an emergency endovascular strategy versus open repair for patients with ruptured abdominal aortic aneurysm. In the IMPROVE trial, at 3 months post-randomisation, 21% of surviving patients did not complete health-related quality of life questionnaires (assessed by EQ-5D-3L). We address this problem by developing a web-based tool that provides a practical approach for eliciting expert opinion about quality of life differences between patients with missing versus complete data. We show how this expert opinion can define informative priors within a fully Bayesian framework to perform sensitivity analyses that allow the missing data to depend upon unobserved patient characteristics. A total of 26 experts, of 46 asked to participate, completed the elicitation exercise. The elicited quality of life scores were lower on average for the patients with missing versus complete data, but there was considerable uncertainty in these elicited values. The missing at random analysis found that patients randomised to the emergency endovascular strategy versus open repair had higher average (95% credible interval) quality of life scores of 0.062 (-0.005 to 0.130). Our sensitivity analysis that used the elicited expert information as pooled priors found that the gain in average quality of life for the emergency endovascular strategy versus open repair was 0.076 (-0.054 to 0.198). We provide and exemplify a practical tool for eliciting the expert opinion required by recommended approaches to the sensitivity analysis of randomised controlled trials. We show how this approach allows the trial analysis to fully recognise the uncertainty that arises from making alternative, plausible assumptions about the reasons for missing data. This tool can be widely used in the design, analysis and interpretation of future trials, and to facilitate this, materials are available for download.
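The spirit of the sensitivity analysis can be sketched as a simple Monte Carlo in which the quality-of-life shortfall of patients with missing data is drawn from the elicited prior rather than assumed to be zero; all numbers below are invented, and the trial's fully Bayesian model is considerably richer:

```python
import numpy as np

rng = np.random.default_rng(42)

def arm_mean_mnar(observed, n_missing, delta_mean, delta_sd, draws=10_000):
    """Draws of an arm's mean EQ-5D when missing patients are assumed to score
    delta lower than observed ones, with delta ~ Normal(elicited mean, sd)."""
    obs_mean = np.mean(observed)
    deltas = rng.normal(delta_mean, delta_sd, draws)
    n_obs = len(observed)
    return (n_obs * obs_mean + n_missing * (obs_mean - deltas)) / (n_obs + n_missing)

evar = arm_mean_mnar(rng.normal(0.75, 0.20, 100), 25, delta_mean=0.10, delta_sd=0.08)
openr = arm_mean_mnar(rng.normal(0.70, 0.20, 110), 30, delta_mean=0.10, delta_sd=0.08)
diff = evar - openr
print(np.percentile(diff, [2.5, 50, 97.5]))  # treatment difference under MNAR uncertainty
```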
The Management Standards Indicator Tool and evaluation of burnout.
Ravalier, J M; McVicar, A; Munn-Giddings, C
2013-03-01
Psychosocial hazards in the workplace can impact upon employee health. The UK Health and Safety Executive's (HSE) Management Standards Indicator Tool (MSIT) appears to have utility in relation to health impacts, but we were unable to find studies relating it to burnout. To explore the utility of the MSIT in evaluating risk of burnout assessed by the Maslach Burnout Inventory-General Survey (MBI-GS). This was a cross-sectional survey of 128 borough council employees. MSIT data were analysed according to MSIT and MBI-GS threshold scores and by using multivariate linear regression with MBI-GS factors as dependent variables. MSIT factor scores were graded into categories of burnout risk according to published MBI-GS thresholds, and identified the priority workplace concerns as demands, relationships, role and change. These factors also featured as significant independent variables, along with control, in the outcomes of the regression analysis. Exhaustion was associated with demands and control (adjusted R² = 0.331); cynicism was associated with change, role and demands (adjusted R² = 0.429); and professional efficacy was associated with managerial support, role, control and demands (adjusted R² = 0.413). MSIT analysis generally has congruence with MBI-GS assessment of burnout. The identification of control within the regression models, but not as a priority concern in the MSIT analysis, could suggest an issue with the setting of the MSIT thresholds for this factor, but verification requires a much larger study. Incorporation of relationships, role and change into the MSIT, missing from other conventional tools, appeared to add to its validity.
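The regression step reported above (e.g., exhaustion regressed on demands and control) corresponds to an ordinary least-squares fit whose adjusted R² statsmodels reports directly; the data here are simulated placeholders for the survey responses:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 128  # matches the survey size above
df = pd.DataFrame({"demands": rng.normal(3, 1, n), "control": rng.normal(3, 1, n)})
df["exhaustion"] = 1.0 + 0.6 * df["demands"] - 0.4 * df["control"] + rng.normal(0, 1, n)

model = smf.ols("exhaustion ~ demands + control", data=df).fit()
print(model.rsquared_adj)  # analogous to the adjusted R-squared = 0.331 reported above
```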
ERIC Educational Resources Information Center
Mock, Nancy B.; And Others
1993-01-01
The use of case-control methodology as an applied policy/planning research tool in assessing the potential effectiveness of behavioral interventions is studied in connection with diarrhea control in Zaire. Results with 107 matched pairs of children demonstrate the importance of hygiene-related knowledge and the utility of the research approach.…
System analysis tools for an ELT at ESO
NASA Astrophysics Data System (ADS)
Mueller, Michael; Koch, Franz
2006-06-01
Engineering of complex, large-scale systems like the ELT designs currently investigated and developed in Europe and North America requires powerful and sophisticated tools within specific technical disciplines such as mechanics, optics and control engineering. However, even analyzing a certain component of the telescope, like the telescope structure, necessitates a system approach to evaluate the structural effects on the optical performance. This paper shows several software tools developed by the European Southern Observatory (ESO) which focus on the system approach in the analyses. Using modal results of a finite element analysis, the SMI toolbox allows easy generation of structural models of different sizes and levels of accuracy for control design and closed-loop simulations. The optical modeling code BeamWarrior was developed by ESO and Astrium GmbH (Germany) especially for integrated modeling and interfacing with a structural model. Within BeamWarrior, displacements and deformations can be applied in an arbitrary coordinate system, and hence also in the global coordinates of the FE model, avoiding error-prone transformations. In addition to this, a sparse state-space model object was developed for Matlab to gain computational efficiency and reduce memory requirements, exploiting the sparsity pattern of both the structural models and the control architecture. As one result, these tools allow building an integrated model in order to reliably simulate interactions, cross-coupling effects, and system responses, and to evaluate global performance. In order to evaluate disturbance effects on the optical performance in open loop more efficiently, an optical evaluation toolbox was built in the FE software ANSYS which performs Zernike decomposition and best-fit computation of the deformations directly in the FE analysis.
Experimental Evaluation of Verification and Validation Tools on Martian Rover Software
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem
2003-01-01
To achieve its science objectives in deep space exploration, NASA has a need for science platform vehicles to autonomously make control decisions in a time frame that excludes intervention from Earth-based controllers. Round-trip light-time is one significant factor motivating autonomy capability; another factor is the need to reduce ground support operations cost. An unsolved problem potentially impeding the adoption of autonomy capability is the verification and validation of such software systems, which exhibit far more behaviors (and hence distinct execution paths in the software) than is typical in current deep-space platforms. Hence the need for a study to benchmark advanced Verification and Validation (V&V) tools on representative autonomy software. The objective of the study was to assess the maturity of different technologies, to provide data indicative of potential synergies between them, and to identify gaps in the technologies with respect to the challenge of autonomy V&V. The study consisted of two parts: first, a set of relatively independent case studies of different tools on the same autonomy code; second, a carefully controlled experiment with human participants on a subset of these technologies. This paper describes the second part of the study. Overall, nearly four hundred hours of data on human use of three different advanced V&V tools were accumulated, with a control group that used conventional testing methods. The experiment simulated four independent V&V teams debugging three successive versions of an executive controller for a Martian Rover. Defects were carefully seeded into the three versions based on a profile of defects from CVS logs that occurred in the actual development of the executive controller. The rest of the document is structured as follows. In sections 2 and 3, we respectively describe the tools used in the study and the rover software that was analyzed. In section 4 the methodology for the experiment is described; this includes the code preparation, seeding of defects, participant training and experimental setup. Next we give a qualitative overview of how the experiment went from the point of view of each technology: model checking (section 5), static analysis (section 6), runtime analysis (section 7) and testing (section 8). The final section gives some preliminary quantitative results on how the tools compared.
Top-attack modeling and automatic target detection using synthetic FLIR scenery
NASA Astrophysics Data System (ADS)
Weber, Bruce A.; Penn, Joseph A.
2004-09-01
A series of experiments has been performed to verify the utility of algorithmic tools for the modeling and analysis of cold-target signatures in synthetic, top-attack, FLIR video sequences. The tools include: MuSES/CREATION for the creation of synthetic imagery with targets, an ARL target detection algorithm to detect embedded synthetic targets in scenes, and an ARL scoring algorithm, using Receiver-Operating-Characteristic (ROC) curve analysis, to evaluate detector performance. Cold-target detection variability was examined as a function of target emissivity, surrounding clutter type, and target placement in non-obscuring clutter locations. Detector metrics were also individually scored so as to characterize the effect of signature/clutter variations. Results show that, using these tools, a detailed, physically meaningful target detection analysis is possible and that scenario-specific target detectors may be developed by selective choice and/or weighting of detector metrics. However, developing these tools into a reliable predictive capability will require the extension of these results to the modeling and analysis of a large number of data sets configured for a wide range of target and clutter conditions. Finally, these tools should also be useful for the comparison of competing detection algorithms by providing well-defined and controllable target detection scenarios, as well as for the training and testing of expert human observers.
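ROC scoring of a detector reduces to sweeping a threshold across detection scores and tracing detection rate against false-alarm rate. The following is a minimal generic sketch of that computation, not the ARL scoring algorithm itself; the scores and ground-truth labels are synthetic.

```python
import numpy as np

def roc_curve(scores, labels):
    """(false-alarm rate, detection rate) pairs from a threshold sweep."""
    order = np.argsort(-scores)       # descending score order
    labels = labels[order]
    tp = np.cumsum(labels)            # detections above each threshold
    fp = np.cumsum(1 - labels)        # false alarms above each threshold
    return fp / (labels == 0).sum(), tp / labels.sum()

# Synthetic detector output: target chips score higher than clutter on average.
rng = np.random.default_rng(1)
scores = np.concatenate([rng.normal(1.0, 1.0, 50),     # targets
                         rng.normal(0.0, 1.0, 450)])   # clutter
labels = np.concatenate([np.ones(50), np.zeros(450)])

fpr, tpr = roc_curve(scores, labels)
print(f"AUC = {np.trapz(tpr, fpr):.3f}")   # area under the ROC curve
```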
Tool Support for Parametric Analysis of Large Software Simulation Systems
NASA Technical Reports Server (NTRS)
Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony
2008-01-01
The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test cases, automatically generated from models (e.g., UML, Simulink, Stateflow), improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
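n-factor combinatorial variation means every combination of values across any n parameters appears in at least one test case. As a rough illustration for n = 2 (pairwise coverage), the sketch below greedily selects cases from the full factorial until all parameter-value pairs are covered; the parameter names and values are invented, and the real tool's generator is likely more sophisticated.

```python
from itertools import combinations, product

params = {  # hypothetical simulation parameters
    "mass_kg":    [450, 500, 550],
    "ctrl_gain":  [0.5, 1.0, 2.0],
    "sensor_lag": [0.0, 0.1],
}
names = list(params)

# Every (parameter pair, value pair) that must appear in at least one case.
uncovered = {(a, b, va, vb)
             for a, b in combinations(names, 2)
             for va, vb in product(params[a], params[b])}

cases = []
for values in product(*params.values()):      # walk the full factorial
    assign = dict(zip(names, values))
    hits = {(a, b, assign[a], assign[b]) for a, b in combinations(names, 2)}
    if hits & uncovered:                      # keep only cases that add
        cases.append(assign)                  # a not-yet-covered pair
        uncovered -= hits
    if not uncovered:
        break

full = len(list(product(*params.values())))
print(f"{len(cases)} cases cover all pairs (full factorial would be {full})")
```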
Karmakar, Sougata; Pal, Madhu Sudan; Majumdar, Deepti; Majumdar, Dhurjati
2012-01-01
Ergonomic evaluation of visual demands becomes crucial for operators/users when rapid decision making is needed under extreme time constraints, as in the navigation task of a jet aircraft. The research reported here comprises an ergonomic evaluation of the pilot's vision in a jet aircraft in a virtual environment, to demonstrate how the vision analysis tools of digital human modeling software can be used effectively for such a study. Three dynamic digital pilot models, representative of the smallest, average and largest Indian pilot population, were generated from an anthropometric database and interfaced with a digital prototype of the cockpit in Jack software for analysis of vision within and outside the cockpit. Vision analysis tools like view cones, eye view windows, blind spot areas, obscuration zones and reflection zones were employed during the evaluation of visual fields. The vision analysis tools were also used for studying kinematic changes of the pilot's body joints during a simulated gazing activity. From the present study, it can be concluded that the vision analysis tools of digital human modeling software are very effective in evaluating the position and alignment of different displays and controls in a workstation, based upon their priorities within the visual fields and the anthropometry of the targeted users, long before the development of a physical prototype.
Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J
2016-03-01
Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events. While CONTAM has been used to address design and performance of buildings implementing energy conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to apply the tool to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system.
Joint symbolic dynamics for the assessment of cardiovascular and cardiorespiratory interactions
Baumert, Mathias; Javorka, Michal; Kabir, Muammar
2015-01-01
Beat-to-beat variations in heart period provide information on cardiovascular control and are closely linked to variations in arterial pressure and respiration. Joint symbolic analysis of heart period, systolic arterial pressure and respiration allows for a simple description of their shared short-term dynamics that are governed by cardiac baroreflex control and cardiorespiratory coupling. In this review, we discuss methodology and research applications. Studies suggest that analysis of joint symbolic dynamics provides a powerful tool for identifying physiological and pathophysiological changes in cardiovascular and cardiorespiratory control. PMID:25548272
Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems
NASA Technical Reports Server (NTRS)
Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.
2005-01-01
The current standards for handling uncertainty in control systems use interval bounds for the definition of the uncertain parameters. This approach gives no information about the likelihood of system performance, but simply gives the response bounds. When used in design, current methods such as μ-analysis can lead to overly conservative controller designs. With these methods, worst-case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strengths of each in handling probability. A hybrid method is developed using these reliability tools for efficiently propagating probabilistic uncertainty through classical control analysis problems. The method developed is applied to classical response analysis as well as analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach for calculating the mean and variance of responses, as well as their cumulative distribution functions, are shown. Results of the probabilistic analysis of a missile pitch control system, and of a non-collocated mass-spring system, show the added information provided by this hybrid analysis.
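To contrast with interval-bound analysis, a plain Monte Carlo propagation of a probabilistic parameter through a classical step-response analysis can be sketched as below. This is a simple stand-in for the paper's hybrid reliability method, with a made-up second-order plant and an assumed damping-ratio distribution.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(42)
wn = 2.0                                  # natural frequency (rad/s), fixed
zeta = rng.normal(0.3, 0.05, 2000)        # uncertain damping ratio (assumed)
zeta = np.clip(zeta, 0.05, 0.95)

t = np.linspace(0, 15, 600)
overshoots = []
for z in zeta:
    sys = signal.TransferFunction([wn**2], [1, 2 * z * wn, wn**2])
    _, y = signal.step(sys, T=t)
    overshoots.append(y.max() - 1.0)      # fractional overshoot of each sample

overshoots = np.array(overshoots)
print(f"mean overshoot     = {overshoots.mean():.3f}")
print(f"std  overshoot     = {overshoots.std():.3f}")
print(f"P(overshoot > 0.4) = {(overshoots > 0.4).mean():.3f}")
```

Unlike an interval bound, the sampled distribution says how likely the worst-case overshoot actually is.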
Multidisciplinary analysis of actively controlled large flexible spacecraft
NASA Technical Reports Server (NTRS)
Cooper, Paul A.; Young, John W.; Sutter, Thomas R.
1986-01-01
The Control of Flexible Structures (COFS) program has supported the development of an analysis capability at the Langley Research Center called the Integrated Multidisciplinary Analysis Tool (IMAT), which provides an efficient data storage and transfer capability among commercial computer codes to aid in the dynamic analysis of actively controlled structures. IMAT is a system of computer programs which transfers Computer-Aided-Design (CAD) configurations, structural finite element models, material property and stress information, structural and rigid-body dynamic model information, and linear system matrices for control law formulation among various commercial applications programs through a common database. Although general in its formulation, IMAT was developed specifically to aid in the evaluation of actively controlled large flexible spacecraft. A description of the IMAT system and results of an application of the system are given.
Analysis of Interactive Conflict Resolution Tool Usage in a Mixed Equipage Environment
NASA Technical Reports Server (NTRS)
Homola, Jeffrey; Morey, Susan; Cabrall, Christopher; Martin, Lynne; Mercer, Joey; Prevot, Thomas
2013-01-01
A human-in-the-loop simulation was conducted that examined separation assurance concepts in varying levels of traffic density with mixtures of aircraft equipage and automation. This paper's analysis focuses on one of the experimental conditions, in which traffic levels were approximately fifty percent higher than today and approximately fifty percent of the traffic within the test area was equipped with data communications (data comm) capabilities. The other fifty percent of the aircraft required control by voice, much like today. Within this environment, the air traffic controller participants were provided access to tools and automation designed to support the primary task of separation assurance that are currently unavailable. Two tools were selected for analysis in this paper: 1) a pre-probed altitude fly-out menu that provided instant feedback of conflict probe results for a range of altitudes, and 2) an interactive auto resolver that provided on-demand access to an automation-generated conflict resolution trajectory. Although encouraged, use of the support tools was not required; the participants were free to use the tools as they saw fit, and they were also free to accept, reject, or modify the resolutions offered by the automation. This mode of interaction provided a unique opportunity to examine exactly when and how these tools were used, as well as how acceptable the resolutions were. Results showed that the participants used the pre-probed altitude fly-out menu in 14% of conflict cases and preferred to use it in a strategic timeframe on data comm equipped and level-flight aircraft. The interactive auto resolver was also used in a primarily strategic timeframe, on 22% of conflicts, and participants likewise preferred to use it on conflicts involving data comm equipped aircraft. Of the 258 resolutions displayed, 46% were implemented and 54% were not. The auto resolver was rated highly by participants in terms of confidence and preference. Factors such as aircraft equipage, ownership, and location of predicted separation loss appeared to play a role in the decision of controllers to accept or reject the auto resolver's resolutions.
Analysis techniques for multivariate root loci. [a tool in linear control systems
NASA Technical Reports Server (NTRS)
Thompson, P. M.; Stein, G.; Laub, A. J.
1980-01-01
Analysis techniques are developed for the multivariable root locus and the multivariable optimal root locus. The generalized eigenvalue problem is used to compute angles and sensitivities for both types of loci, and an algorithm is presented that determines the asymptotic properties of the optimal root locus.
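The underlying computation is easy to illustrate: for output feedback u = -ky, the root locus traces the eigenvalues of A - kBC as the gain k varies. A minimal numeric sketch with an arbitrary example system, not taken from the paper:

```python
import numpy as np

# Arbitrary two-state example system (illustrative only).
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

for k in [0.0, 0.5, 1.0, 2.0, 5.0]:
    Acl = A - k * (B @ C)             # closed-loop dynamics under u = -k*y
    poles = np.linalg.eigvals(Acl)    # points on the root-locus branches
    print(f"k = {k:4.1f}: poles = {np.round(poles, 3)}")
```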
Computing Linear Mathematical Models Of Aircraft
NASA Technical Reports Server (NTRS)
Duke, Eugene L.; Antoniewicz, Robert F.; Krambeer, Keith D.
1991-01-01
The Derivation and Definition of Linear Aircraft Model (LINEAR) computer program provides the user with a powerful, flexible, standard, documented, and verified software tool for linearization of mathematical models of the aerodynamics of aircraft. Intended for use as a software tool to drive linear analysis of stability and design of control laws for aircraft. Capable of both extracting such linearized engine effects as net thrust, torque, and gyroscopic effects, and including these effects in the linear model of the system. Designed to provide easy selection of the state, control, and observation variables used in a particular model. Also provides the flexibility of allowing alternate formulations of both state and observation equations. Written in FORTRAN.
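The core operation of such a tool — extracting A = ∂f/∂x and B = ∂f/∂u about a trim point — can be sketched with central differences. LINEAR itself is FORTRAN and uses its own formulations; the Python fragment below is a generic illustration, with toy pendulum dynamics standing in for an aircraft model.

```python
import numpy as np

def f(x, u):
    """Toy nonlinear dynamics (damped pendulum with torque input)."""
    theta, omega = x
    return np.array([omega, -9.81 * np.sin(theta) - 0.1 * omega + u[0]])

def linearize(f, x0, u0, eps=1e-6):
    """Central-difference Jacobians A = df/dx, B = df/du at a trim point."""
    n, m = len(x0), len(u0)
    A, B = np.zeros((n, n)), np.zeros((n, m))
    for i in range(n):
        dx = np.zeros(n); dx[i] = eps
        A[:, i] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

A, B = linearize(f, x0=np.array([0.0, 0.0]), u0=np.array([0.0]))
print("A =\n", A.round(4), "\nB =\n", B.round(4))
```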
Update on HCDstruct - A Tool for Hybrid Wing Body Conceptual Design and Structural Optimization
NASA Technical Reports Server (NTRS)
Gern, Frank H.
2015-01-01
HCDstruct is a Matlab® based software tool to rapidly build a finite element model for structural optimization of hybrid wing body (HWB) aircraft at the conceptual design level. The tool uses outputs from a Flight Optimization System (FLOPS) performance analysis together with a conceptual outer mold line of the vehicle, e.g., created by Vehicle Sketch Pad (VSP), to generate a set of MSC Nastran® bulk data files. These files can readily be used to perform a structural optimization and weight estimation using Nastran's® Solution 200 multidisciplinary optimization solver. Initially developed at NASA Langley Research Center to perform increased-fidelity conceptual-level HWB centerbody structural analyses, HCDstruct has grown into a complete HWB structural sizing and weight estimation tool, including a fully flexible aeroelastic loads analysis. Recent upgrades to the tool include the expansion to a full wing-tip-to-wing-tip model for asymmetric analyses like engine-out conditions and dynamic overswings, as well as a fully actuated trailing edge, featuring up to 15 independently actuated control surfaces and twin tails. Several example applications of the HCDstruct tool are presented.
Control and prediction of the course of brewery fermentations by gravimetric analysis.
Kosín, P; Savel, J; Broz, A; Sigler, K
2008-01-01
A simple, fast and cheap test suitable for predicting the course of brewery fermentations based on mass analysis is described and its efficiency is evaluated. Compared to commonly used yeast vitality tests, this analysis takes into account wort composition and other factors that influence fermentation performance. It can be used to predict the shape of the fermentation curve in brewery fermentations and in research and development projects concerning yeast vitality, fermentation conditions and wort composition. It can also be a useful tool for homebrewers to control their fermentations.
Design of a high altitude long endurance flying-wing solar-powered unmanned air vehicle
NASA Astrophysics Data System (ADS)
Alsahlani, A. A.; Johnston, L. J.; Atcliffe, P. A.
2017-06-01
The low-Reynolds number environment of high-altitude flight places severe demands on the aerodynamic design and stability and control of a high altitude, long endurance (HALE) unmanned air vehicle (UAV). The aerodynamic efficiency of a flying-wing configuration makes it an attractive design option for such an application and is investigated in the present work. The proposed configuration has a high-aspect-ratio, swept-wing planform, the wing sweep being necessary to provide an adequate moment arm for outboard longitudinal and lateral control surfaces. A design optimization framework is developed under a MATLAB environment, combining aerodynamic, structural, and stability analysis. Low-order analysis tools are employed to facilitate efficient computations, which is important when there are multiple optimization loops for the various engineering analyses. In particular, a vortex-lattice method is used to compute the wing planform aerodynamics, coupled to a two-dimensional (2D) panel method to derive aerofoil sectional characteristics. Integral boundary-layer methods are coupled to the panel method in order to predict flow separation boundaries during the design iterations. A quasi-analytical method is adapted for application to flying-wing configurations to predict the wing weight and a linear finite-beam element approach is used for structural analysis of the wing-box. Stability is a particular concern in the low-density environment of high-altitude flight for flying-wing aircraft and so provision of adequate directional stability and control power forms part of the optimization process. At present, a modified Genetic Algorithm is used in all of the optimization loops. Each of the low-order engineering analysis tools is validated using higher-order methods to provide confidence in the use of these computationally efficient tools in the present design-optimization framework. This paper includes the results of employing the present optimization tools in the design of a HALE, flying-wing UAV to indicate that this is a viable design configuration option.
Kero, Tanja; Lindsjö, Lars; Sörensen, Jens; Lubberink, Mark
2016-08-01
¹¹C-PIB PET is a promising non-invasive diagnostic tool for cardiac amyloidosis. Semiautomatic analysis of PET data is now available, but it is not known how accurate these methods are for amyloid imaging. The aim of this study was to evaluate the feasibility of one semiautomatic software tool for analysis and visualization of the ¹¹C-PIB left ventricular retention index (RI) in cardiac amyloidosis. Patients with systemic amyloidosis and cardiac involvement (n = 10) and healthy controls (n = 5) were investigated with dynamic ¹¹C-PIB PET. Two observers analyzed the PET studies with semiautomatic software to calculate the left ventricular RI of ¹¹C-PIB and to create parametric images. The mean RI at 15-25 min from the semiautomatic analysis was comparable with the RI based on manual analysis (0.056 vs 0.054 min⁻¹ for amyloidosis patients and 0.024 vs 0.025 min⁻¹ in healthy controls; P = .78), and the correlation was excellent (r = 0.98). Inter-reader reproducibility was also excellent (intraclass correlation coefficient, ICC > 0.98). Parametric polar maps and histograms made visual separation of amyloidosis patients and healthy controls fast and simple. Accurate semiautomatic analysis of cardiac ¹¹C-PIB RI in amyloidosis patients is feasible. Parametric polar maps and histograms make visual interpretation fast and simple.
Artificial intelligence applications in the intensive care unit.
Hanson, C W; Marshall, B E
2001-02-01
To review the history and current applications of artificial intelligence in the intensive care unit. The MEDLINE database, bibliographies of selected articles, and current texts on the subject. The studies that were selected for review used artificial intelligence tools for a variety of intensive care applications, including direct patient care and retrospective database analysis. All literature relevant to the topic was reviewed. Although some of the earliest artificial intelligence (AI) applications were medically oriented, AI has not been widely accepted in medicine. Despite this, patient demographic, clinical, and billing data are increasingly available in an electronic format and therefore susceptible to analysis by intelligent software. Individual AI tools are specifically suited to different tasks, such as waveform analysis or device control. The intensive care environment is particularly suited to the implementation of AI tools because of the wealth of available data and the inherent opportunities for increased efficiency in inpatient care. A variety of new AI tools have become available in recent years that can function as intelligent assistants to clinicians, constantly monitoring electronic data streams for important trends, or adjusting the settings of bedside devices. The integration of these tools into the intensive care unit can be expected to reduce costs and improve patient outcomes.
New risk metrics and mathematical tools for risk analysis: Current and future challenges
NASA Astrophysics Data System (ADS)
Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon
2015-01-01
The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment, and stochastic and kinetic models, which are available in the form of web-based applications (e.g., COMBASE and Microbial Responses Viewer) or introduced into user-friendly software (e.g., Seafood Spoilage Predictor), have advanced the use of information systems in food safety management. Such tools are updateable with new food-pathogen-specific models containing cardinal parameters and multiple dependent variables, including plate counts, concentration of metabolic products, or even expression levels of certain genes. These tools may then further serve as decision-support tools to assist in product logistics, based on their scientifically based and "momentary" expressed spoilage and safety level.
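The bookkeeping that links process-level control measures to the FSO is commonly written as a log10-scale inequality (the ICMSF formulation associated with these metrics); a compact rendering follows, with invented numbers in the worked check:

```latex
% A process meets the Food Safety Objective when, in log10 units,
% initial contamination minus total reductions plus total increases
% stays at or below the FSO:
H_0 \;-\; \sum R \;+\; \sum I \;\le\; \mathrm{FSO}
% H_0: initial hazard level; R: reductions (e.g., thermal steps);
% I: increases (growth, recontamination).
% Illustrative check (values invented): H_0 = 3, \sum R = 6, \sum I = 1
% gives 3 - 6 + 1 = -2 \le \mathrm{FSO} = -1, so the objective is met.
```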
EPICS-based control and data acquisition for the APS slope profiler (Conference Presentation)
NASA Astrophysics Data System (ADS)
Sullivan, Joseph; Assoufid, Lahsen; Qian, Jun; Jemian, Peter R.; Mooney, Tim; Rivers, Mark L.; Goetze, Kurt; Sluiter, Ronald L.; Lang, Keenan
2016-09-01
The motion control, data acquisition and analysis system for the APS Slope Measuring Profiler was implemented using the Experimental Physics and Industrial Control System (EPICS). EPICS was designed as a framework with software tools and applications that provide a software infrastructure for building distributed control systems to operate devices such as particle accelerators, large experiments and major telescopes. EPICS was chosen to implement the APS Slope Measuring Profiler because it is also applicable to single-purpose systems. The control and data handling capability available in the EPICS framework provides the basic functionality needed for high-precision X-ray mirror measurement. Those built-in capabilities include hardware integration of high-performance motion control systems (3-axis gantry and tip-tilt stages), mirror measurement devices (autocollimator, laser spot camera) and temperature sensors. Scanning the mirror and taking measurements were accomplished with an EPICS feature (the sscan record) which synchronizes motor positioning with measurement triggers and data storage. Various mirror scanning modes were automatically configured using EPICS built-in scripting. EPICS tools also provide low-level image processing (areaDetector). Operation screens were created using EPICS-aware GUI screen development tools.
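For a feel of the channel-access layer such a system is built on, the fragment below uses the pyepics Python bindings to step a stage and read back a sensor. The PV names are hypothetical placeholders, and this client-side loop is only a sketch of the idea; the actual profiler delegates this synchronization to the sscan record on the server side.

```python
import time
import epics  # pyepics channel-access bindings

# Hypothetical process variable (PV) names for a gantry axis and sensor.
MOTOR_PV = "SMP:gantry:x"
ANGLE_PV = "SMP:autocollimator:angle"

positions = [i * 0.5 for i in range(21)]    # scan 0..10 mm in 0.5 mm steps
readings = []
for pos in positions:
    epics.caput(MOTOR_PV, pos, wait=True)   # move and block until done
    time.sleep(0.1)                         # settling time before sampling
    readings.append(epics.caget(ANGLE_PV))  # slope reading at this position

for pos, ang in zip(positions, readings):
    print(f"x = {pos:5.2f} mm  slope = {ang}")
```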
NASA Technical Reports Server (NTRS)
Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola
2005-01-01
Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.
NASA Technical Reports Server (NTRS)
Brooks, David E.; Gassman, Holly; Beering, Dave R.; Welch, Arun; Hoder, Douglas J.; Ivancic, William D.
1999-01-01
Transmission Control Protocol (TCP) is the underlying protocol used within the Internet for reliable information transfer. As such, there is great interest in having all implementations of TCP interoperate efficiently. This is particularly important for links exhibiting long bandwidth-delay products. The tools exist to perform TCP analysis at low rates and low delays. However, for extremely high-rate and long-delay links, such as 622 Mbps over geosynchronous satellites, new tools and testing techniques are required. This paper describes the tools and techniques used to analyze and debug various TCP implementations over high-speed, long-delay links.
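The core difficulty on such links is the bandwidth-delay product (BDP): the sender must keep a full BDP of unacknowledged data in flight, far beyond the classic 64 KB TCP window, which is why window scaling is essential. A quick calculation for the 622 Mbps geosynchronous case, assuming a roughly 540 ms round-trip time:

```python
# Bandwidth-delay product for a 622 Mbps geosynchronous satellite link.
rate_bps = 622e6        # link rate
rtt_s = 0.540           # assumed GEO round-trip time (~2 x 270 ms one-way)

bdp_bytes = rate_bps * rtt_s / 8
print(f"BDP = {bdp_bytes / 1e6:.1f} MB in flight")        # ~42 MB

classic_window = 64 * 1024                                # no window scaling
print(f"Utilization with a 64 KB window: "
      f"{classic_window / bdp_bytes:.4%}")                # well under 1%
```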
Environmental control and life support system analysis tools for the Space Station era
NASA Technical Reports Server (NTRS)
Blakely, R. L.; Rowell, L. F.
1984-01-01
This paper describes the concept of a developing emulation, simulation, sizing, and technology assessment program (ESSTAP) which can be used effectively by the various functional disciplines (structures, power, ECLSS, etc.), beginning with the initial system selection and conceptual design processes and continuing on through the mission operation and growth phases of the Space Station, for the purpose of minimizing overall program costs. It will discuss the basic requirements for these tools, as currently envisioned for the Environmental Control and Life Support System (ECLSS), identifying their intended and potential uses and applications, and present examples and the status of several representative tools. The development and applications of a Space Station Atmospheric Revitalization Subsystem (ARS) demonstration model to be used for concept verification will also be discussed.
Material Protection, Accounting, and Control Technologies (MPACT) Advanced Integration Roadmap
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Mike; Cipiti, Ben; Demuth, Scott Francis
2017-01-30
The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy's (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal (Miller, 2015). This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, a distributed test bed that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling, simulation and integration.
NASA Astrophysics Data System (ADS)
Zhao, Fei; Zhang, Chi; Yang, Guilin; Chen, Chinyin
2016-12-01
This paper presents an online estimation method for cutting error based on the analysis of internal sensor readings. The internal sensors of the numerical control (NC) machine tool are used in order to avoid installation problems. A mathematical model for cutting error estimation was proposed to compute the relative position of the cutting point and the tool center point (TCP) from internal sensor readings, based on the cutting theory of gears. In order to verify the effectiveness of the proposed model, it was tested in simulation and experiment on a gear generating grinding process. The cutting error of the gear was estimated, and the factors which induce cutting error were analyzed. The simulation and experiments verify that the proposed approach is an efficient way to estimate the cutting error of the work-piece during the machining process.
NASA Technical Reports Server (NTRS)
Csank, Jeffrey; Zinnecker, Alicia
2014-01-01
Systems analysis involves steady-state simulations of combined components to evaluate the steady-state performance, weight, and cost of a system; dynamic considerations are not included until later in the design process. The Dynamic Systems Analysis task, under NASA's Fixed Wing project, is developing the capability for assessing dynamic issues at earlier stages during systems analysis. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) has been developed to design a controller at a single flight condition (defined by altitude and Mach number) and, ultimately, provide an estimate of the closed-loop performance of the engine model. This tool has been integrated with the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k) engine model to demonstrate the additional information TTECTrA makes available for dynamic systems analysis. This dynamic data can be used to evaluate the trade-off between performance and safety, which could not be done with steady-state systems analysis data. TTECTrA has been designed to integrate with any turbine engine model that is compatible with the MATLAB/Simulink (The MathWorks, Inc.) environment.
MOTIFSIM 2.1: An Enhanced Software Platform for Detecting Similarity in Multiple DNA Motif Data Sets
Huang, Chun-Hsi
2017-01-01
Abstract Finding binding site motifs plays an important role in bioinformatics as it reveals the transcription factors that control the gene expression. The development for motif finders has flourished in the past years with many tools have been introduced to the research community. Although these tools possess exceptional features for detecting motifs, they report different results for an identical data set. Hence, using multiple tools is recommended because motifs reported by several tools are likely biologically significant. However, the results from multiple tools need to be compared for obtaining common significant motifs. MOTIFSIM web tool and command-line tool were developed for this purpose. In this work, we present several technical improvements as well as additional features to further support the motif analysis in our new release MOTIFSIM 2.1. PMID:28632401
Trainer Interventions as Instructional Strategies in Air Traffic Control Training
ERIC Educational Resources Information Center
Koskela, Inka; Palukka, Hannele
2011-01-01
Purpose: This paper aims to identify methods of guidance and supervision used in air traffic control training. It also aims to show how these methods facilitate trainee participation in core work activities. Design/methodology/approach: The paper applies the tools of conversation analysis and ethnomethodology to explore the ways in which trainers…
Carroll, Adam J; Badger, Murray R; Harvey Millar, A
2010-07-14
Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to result in rapid community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas chromatography/mass spectrometry (GC/MS) makes it an obvious target for such development. While a number of web tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems are currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (e.g., metabolite, species, organ/biofluid). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-test, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. MetabolomeExpress (https://www.metabolome-express.org) provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. Transparent sharing of these data will allow researchers to assess data quality and draw their own insights from published metabolomics datasets.
Implementation of statistical process control for proteomic experiments via LC MS/MS.
Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J
2014-04-01
Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool termed Statistical Process Control in Proteomics (SProCoP) has been developed which implements aspects of SPC (e.g., control charts and Pareto analysis) in the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real-time evaluation of chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution) and mass spectrometric performance (targeted peptide ion intensity, and mass measurement accuracy for high-resolving-power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards, enabling the separation of random noise from systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts for evaluating proteomic experiments is illustrated in two case studies.
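The control-chart logic is straightforward to sketch: derive thresholds empirically from baseline QC runs, then flag new runs that fall outside them. A minimal Shewhart-style example, with made-up retention times standing in for the metrics SProCoP tracks:

```python
import numpy as np

# Retention times (min) of a QC peptide: baseline runs, then new runs.
baseline = np.array([22.41, 22.38, 22.45, 22.40, 22.43, 22.39, 22.42, 22.44])
new_runs = np.array([22.41, 22.46, 22.61, 22.40])

mu, sigma = baseline.mean(), baseline.std(ddof=1)  # empirical center and spread
ucl, lcl = mu + 3 * sigma, mu - 3 * sigma          # +/- 3 sigma control limits

for i, rt in enumerate(new_runs, 1):
    status = "OK" if lcl <= rt <= ucl else "OUT OF CONTROL"
    print(f"run {i}: RT = {rt:.2f} min -> {status}")
```

The third run drifts past the upper control limit and is flagged, separating a systematic shift from the random noise captured by the baseline.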
Practical applications of surface analytic tools in tribology
NASA Technical Reports Server (NTRS)
Ferrante, J.
1980-01-01
A brief description of many of the widely used tools is presented. Of this list, those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology, while being truly surface sensitive (that is, probing fewer than 10 atomic layers), are presented. The latter group is critiqued in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under real conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.
MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.
Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y
2017-08-14
Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools available in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations on the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results on different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms across different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available from the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open source and can be downloaded at: https://gitlab.com/rki_bioinformatics .
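The co-occurrence idea can be sketched compactly: retain organisms that enough tools agree on and combine their abundance estimates. The toy merge below assumes each tool emits a taxon-to-relative-abundance table; the support threshold and averaging are invented for illustration and are not MetaMeta's actual scheme.

```python
from collections import defaultdict

# Hypothetical profiles from three tools: taxon -> relative abundance.
profiles = {
    "toolA": {"E. coli": 0.40, "B. subtilis": 0.35, "S. aureus": 0.25},
    "toolB": {"E. coli": 0.45, "B. subtilis": 0.30, "P. putida": 0.25},
    "toolC": {"E. coli": 0.50, "B. subtilis": 0.40, "S. aureus": 0.10},
}
MIN_SUPPORT = 2   # keep taxa reported by at least this many tools

support = defaultdict(list)
for prof in profiles.values():
    for taxon, ab in prof.items():
        support[taxon].append(ab)

merged = {t: sum(vals) / len(profiles)       # average over all tools;
          for t, vals in support.items()     # absence counts as zero
          if len(vals) >= MIN_SUPPORT}
total = sum(merged.values())
merged = {t: ab / total for t, ab in merged.items()}   # renormalize

for taxon, ab in sorted(merged.items(), key=lambda kv: -kv[1]):
    print(f"{taxon:12s} {ab:.3f}  ({len(support[taxon])}/3 tools)")
```

Taxa reported by only one tool (here P. putida) drop out, while the abundances of well-supported taxa are combined across methods.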
SPARTA: Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis.
Johnson, Benjamin K; Scholz, Matthew B; Teal, Tracy K; Abramovitch, Robert B
2016-02-04
Many tools exist in the analysis of bacterial RNA sequencing (RNA-seq) transcriptional profiling experiments to identify differentially expressed genes between experimental conditions. Generally, the workflow includes quality control of reads, mapping to a reference, counting transcript abundance, and statistical tests for differentially expressed genes. In spite of the numerous tools developed for each component of an RNA-seq analysis workflow, easy-to-use bacterially oriented workflow applications to combine multiple tools and automate the process are lacking. With many tools to choose from for each step, the task of identifying a specific tool, adapting the input/output options to the specific use-case, and integrating the tools into a coherent analysis pipeline is not a trivial endeavor, particularly for microbiologists with limited bioinformatics experience. To make bacterial RNA-seq data analysis more accessible, we developed a Simple Program for Automated reference-based bacterial RNA-seq Transcriptome Analysis (SPARTA). SPARTA is a reference-based bacterial RNA-seq analysis workflow application for single-end Illumina reads. SPARTA is turnkey software that simplifies the process of analyzing RNA-seq data sets, making bacterial RNA-seq analysis a routine process that can be undertaken on a personal computer or in the classroom. The easy-to-install, complete workflow processes whole transcriptome shotgun sequencing data files by trimming reads and removing adapters, mapping reads to a reference, counting gene features, calculating differential gene expression, and, importantly, checking for potential batch effects within the data set. SPARTA outputs quality analysis reports, gene feature counts and differential gene expression tables and scatterplots. SPARTA provides an easy-to-use bacterial RNA-seq transcriptional profiling workflow to identify differentially expressed genes between experimental conditions. This software will enable microbiologists with limited bioinformatics experience to analyze their data and integrate next generation sequencing (NGS) technologies into the classroom. The SPARTA software and tutorial are available at sparta.readthedocs.org.
Knowledge-Acquisition Tool For Expert System
NASA Technical Reports Server (NTRS)
Disbrow, James D.; Duke, Eugene L.; Regenie, Victoria A.
1988-01-01
Digital flight-control systems monitored by computer program that evaluates and recommends. Flight-systems engineers for advanced, high-performance aircraft use knowlege-acquisition tool for expert-system flight-status monitor suppling interpretative data. Interpretative function especially important in time-critical, high-stress situations because it facilitates problem identification and corrective strategy. Conditions evaluated and recommendations made by ground-based engineers having essential knowledge for analysis and monitoring of performances of advanced aircraft systems.
A comprehensive quality control workflow for paired tumor-normal NGS experiments.
Schroeder, Christopher M; Hilke, Franz J; Löffler, Markus W; Bitzer, Michael; Lenz, Florian; Sturm, Marc
2017-06-01
Quality control (QC) is an important part of all NGS data analysis stages. Many available tools calculate QC metrics from different analysis steps of single-sample experiments (raw reads, mapped reads and variant lists). Multi-sample experiments, such as the sequencing of tumor-normal pairs, require additional QC metrics to ensure the validity of results. These multi-sample QC metrics still lack standardization. We therefore suggest a new workflow for QC of DNA sequencing of tumor-normal pairs. With this workflow, well-known single-sample QC metrics and additional metrics specific to tumor-normal pairs can be calculated. The segmentation into different tools offers high flexibility and allows reuse for other purposes. All tools produce qcML, a generic XML format for QC of -omics experiments. qcML uses quality metrics defined in an ontology, which was adapted for NGS. All QC tools are implemented in C++ and run both under Linux and Windows. Plotting requires Python 2.7 and matplotlib. The software is available under the GNU General Public License version 2 as part of the ngs-bits project: https://github.com/imgag/ngs-bits.
Piazza, Rocco; Magistroni, Vera; Pirola, Alessandra; Redaelli, Sara; Spinelli, Roberta; Redaelli, Serena; Galbiati, Marta; Valletta, Simona; Giudici, Giovanni; Cazzaniga, Giovanni; Gambacorti-Passerini, Carlo
2013-01-01
Copy number alterations (CNA) are common events occurring in leukaemias and solid tumors. Comparative Genome Hybridization (CGH) is currently the gold standard technique to analyze CNAs; however, CGH analysis requires dedicated instruments and is able to perform only low-resolution Loss of Heterozygosity (LOH) analyses. Here we present CEQer (Comparative Exome Quantification analyzer), a new graphical, event-driven tool for CNA/allelic-imbalance (AI) coupled analysis of exome sequencing data. By using case-control matched exome data, CEQer performs a comparative digital exonic quantification to generate CNA data and couples this information with exome-wide LOH and allelic imbalance detection. These data are used to build mixed statistical/heuristic models allowing the identification of CNA/AI events. To test our tool, we initially used in silico generated data, then we performed whole-exome sequencing from 20 leukemic specimens and corresponding matched controls and we analyzed the results using CEQer. Taken globally, these analyses showed that the combined use of comparative digital exon quantification and LOH/AI allows generating very accurate CNA data. Therefore, we propose CEQer as an efficient, robust and user-friendly graphical tool for the identification of CNA/AI in the context of whole-exome sequencing data. PMID:24124457
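The comparative digital exonic quantification described here can be illustrated in a few lines: normalize matched case/control exon counts, then compare them as log2 ratios, where a sustained shift across a gene's exons suggests a copy-number event. The median normalization and pseudocount below are simplifying assumptions, not CEQer's published algorithm.

```python
import numpy as np

def exon_log_ratios(case_counts, control_counts, pseudocount=0.5):
    """Per-exon copy-number signal from matched case/control exome counts.

    Median normalization assumes most exons are copy-neutral; sustained
    negative log2 ratios across consecutive exons suggest a loss, and
    sustained positive ratios a gain (a simplification of CEQer-style
    comparative quantification).
    """
    c = np.asarray(case_counts, float) + pseudocount
    n = np.asarray(control_counts, float) + pseudocount
    c /= np.median(c)
    n /= np.median(n)
    return np.log2(c / n)

# The last two exons show log2 ratios near -1, i.e. a candidate
# single-copy loss in the case sample:
ratios = exon_log_ratios([100, 99, 101, 50, 48], [100, 101, 98, 102, 100])
```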
Messai, Habib; Farman, Muhammad; Sarraj-Laabidi, Abir; Hammami-Semmar, Asma; Semmar, Nabil
2016-01-01
Background. Olive oils (OOs) show high chemical variability due to several factors of genetic, environmental and anthropic types. Genetic and environmental factors are responsible for natural compositions and polymorphic diversification resulting in different varietal patterns and phenotypes. Anthropic factors, however, are at the origin of different blends’ preparation leading to normative, labelled or adulterated commercial products. Control of complex OO samples requires their (i) characterization by specific markers; (ii) authentication by fingerprint patterns; and (iii) monitoring by traceability analysis. Methods. These quality control and management aims require the use of several multivariate statistical tools: specificity highlighting requires ordination methods; authentication checking calls for classification and pattern recognition methods; traceability analysis implies the use of network-based approaches able to separate or extract mixed information and memorized signals from complex matrices. Results. This chapter presents a review of different chemometrics methods applied for the control of OO variability from measured metabolic and physical-chemical characteristics. The different chemometrics methods are illustrated by different study cases on monovarietal and blended OO originating from different countries. Conclusion. Chemometrics tools offer multiple ways for quantitative evaluations and qualitative control of the complex chemical variability of OO in relation to several intrinsic and extrinsic factors. PMID:28231172
Bunderson, Nathan E.; Bingham, Jeffrey T.; Sohn, M. Hongchul; Ting, Lena H.; Burkholder, Thomas J.
2015-01-01
Neuromusculoskeletal models solve the basic problem of determining how the body moves under the influence of external and internal forces. Existing biomechanical modeling programs often emphasize dynamics with the goal of finding a feed-forward neural program to replicate experimental data or of estimating the force contributions of individual muscles. The computation of rigid-body dynamics, muscle forces, and activation of the muscles are often performed separately. We have developed an intrinsically forward computational platform (Neuromechanic, www.neuromechanic.com) that explicitly represents the interdependencies among rigid body dynamics, frictional contact, muscle mechanics, and neural control modules. This formulation has significant advantages for optimization and forward simulation, particularly with application to neural controllers with feedback or regulatory features. Explicit inclusion of all state dependencies allows calculation of system derivatives with respect to kinematic states as well as muscle and neural control states, thus affording a wealth of analytical tools, including linearization, stability analyses and calculation of initial conditions for forward simulations. In this review, we describe our algorithm for generating state equations and explain how they may be used in integration, linearization and stability analysis tools to provide structural insights into the neural control of movement. PMID:23027632
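A minimal sketch of the linearization machinery described above, assuming a generic first-order state equation dx/dt = f(x): the Jacobian is estimated by central differences and its eigenvalues support stability analysis. The damped-pendulum example is purely illustrative and unrelated to any particular musculoskeletal model.

```python
import numpy as np

def linearize(f, x0, eps=1e-6):
    """Estimate the Jacobian A of dx/dt = f(x) at x0 by central differences,
    so that locally dx/dt ≈ f(x0) + A (x - x0)."""
    x0 = np.asarray(x0, float)
    n = x0.size
    A = np.zeros((n, n))
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        A[:, j] = (f(x0 + dx) - f(x0 - dx)) / (2 * eps)
    return A

# Damped pendulum, state = [angle, angular rate]; eigenvalues with
# negative real parts indicate a stable equilibrium at the origin.
f = lambda x: np.array([x[1], -9.81 * np.sin(x[0]) - 0.5 * x[1]])
eigenvalues = np.linalg.eigvals(linearize(f, [0.0, 0.0]))
```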
Küster, Eberhard; Dorusch, Falk; Vogt, Carsten; Weiss, Holger; Altenburger, Rolf
2004-07-15
Success of groundwater remediation is typically controlled via snapshot analysis of selected chemical substances or physical parameters. Biological parameters, i.e. ecotoxicological assays, are rarely employed. Hence the aim of the study was to develop a bioassay tool which allows on-line monitoring of contaminated groundwater, as well as a toxicity reduction evaluation (TRE) of different remediation techniques in parallel, and which may furthermore be used as an additional tool for process control to supervise remediation techniques in real time. Parallel testing of groundwater remediation techniques was accomplished for short and long time periods by using the energy-dependent luminescence of the bacterium Vibrio fischeri as the biological monitoring parameter. One data point every hour for each remediation technique was generated by an automated biomonitor. The bacteria proved to be highly sensitive to the contaminated groundwater, and the biomonitor showed a long service life despite the highly corrosive groundwater present in Bitterfeld, Germany. The bacterial biomonitor is demonstrated to be a valuable tool for remediation success evaluation. Dose-response relationships were generated for the six quantitatively dominant groundwater contaminants (2-chlorotoluene, 1,2- and 1,4-dichlorobenzene, monochlorobenzene, ethylbenzene and benzene). The concentrations of individual volatile organic chemicals (VOCs) could not explain the observed effects in the bacteria. An expected mixture toxicity was calculated for the six components using the concept of concentration addition. The calculated EC50 for the mixture was still one order of magnitude lower than the observed EC50 of the actual groundwater. The results pointed out that chemical analysis of the six quantitatively dominant substances alone was not able to explain the effects observed with the bacteria. Thus chemical analysis alone may not be an adequate tool for remediation success evaluation in terms of toxicity reduction.
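The concentration-addition calculation used above has a compact closed form: the mixture EC50 is obtained by combining the single-substance EC50s weighted by each component's share of the total concentration. A minimal sketch with hypothetical values (the study's measured EC50s are not reproduced here):

```python
def mixture_ec50(fractions, ec50s):
    """EC50 of a mixture under concentration addition:
    EC50_mix = 1 / sum(p_i / EC50_i), with p_i the concentration
    fraction of component i (fractions must sum to 1)."""
    assert abs(sum(fractions) - 1.0) < 1e-9
    return 1.0 / sum(p / e for p, e in zip(fractions, ec50s))

# Hypothetical single-substance EC50s (mg/L) for a six-component mixture:
ec50_mix = mixture_ec50(
    fractions=[0.30, 0.20, 0.15, 0.15, 0.10, 0.10],
    ec50s=[12.0, 8.0, 5.0, 20.0, 9.0, 15.0])
```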
Chen, Josephine; Zhao, Po; Massaro, Donald; Clerch, Linda B.; Almon, Richard R.; DuBois, Debra C.; Jusko, William J.; Hoffman, Eric P.
2004-01-01
Publicly accessible DNA databases (genome browsers) are rapidly accelerating post-genomic research (see http://www.genome.ucsc.edu/), with integrated genomic DNA, gene structure, EST/splicing and cross-species ortholog data. DNA databases have relatively low dimensionality; the genome is a linear code that anchors all associated data. In contrast, RNA expression and protein databases need to be able to handle very high dimensional data, with time, tissue, cell type and genes as interrelated variables. The high dimensionality of microarray expression profile data, and the lack of a standard experimental platform, have complicated the development of web-accessible databases and analytical tools. We have designed and implemented a public resource of expression profile data containing 1024 human, mouse and rat Affymetrix GeneChip expression profiles, generated in the same laboratory, and subject to the same quality and procedural controls (Public Expression Profiling Resource; PEPR). Our Oracle-based PEPR data warehouse includes a novel time series query analysis tool (SGQT), enabling dynamic generation of graphs and spreadsheets showing the action of any transcript of interest over time. In this report, we demonstrate the utility of this tool using a 27 time point, in vivo muscle regeneration series. This data warehouse and associated analysis tools provide access to multidimensional microarray data through web-based interfaces, both for download of all types of raw data for independent analysis, and also for straightforward gene-based queries. Planned implementations of PEPR will include web-based remote entry of projects adhering to quality control and standard operating procedure (QC/SOP) criteria, and automated output of alternative probe set algorithms for each project (see http://microarray.cnmcresearch.org/pgadatatable.asp). PMID:14681485
Gál, Lukáš; Čeppan, Michal; Reháková, Milena; Dvonka, Vladimír; Tarajčáková, Jarmila; Hanus, Jozef
2013-11-01
A method has been developed for identification of corrosive iron-gall inks in historical drawings and documents. The method is based on target-factor analysis of visible-near infrared fibre optic reflection spectra (VIS-NIR FORS). A set of reference spectra was obtained from model samples of laboratory-prepared inks covering a wide range of mixing ratios of basic ink components deposited on substrates and artificially aged. As criteria for correspondence of a studied spectrum with a reference spectrum, the apparent error in target (AET) and the empirical function SPOIL according to Malinowski were used. The capability of the proposed tool to distinguish corrosive iron-gall inks from bistre and sepia inks was evaluated by use of a set of control samples of bistre, sepia, and iron-gall inks. Examples are presented of analysis of historical drawings from the 15th and 16th centuries and written documents from the 19th century. The results of analysis based on the tool were confirmed by XRF analysis and colorimetric spot analysis.
New Techniques for the Generation and Analysis of Tailored Microbial Systems on Surfaces.
Furst, Ariel L; Smith, Matthew J; Francis, Matthew B
2018-05-17
The interactions between microbes and surfaces provide critically important cues that control the behavior and growth of the cells. As our understanding of complex microbial communities improves, there is a growing need for experimental tools that can establish and control the spatial arrangements of these cells in a range of contexts. Recent improvements in methods to attach bacteria and yeast to nonbiological substrates, combined with an expanding set of techniques available to study these cells, position this field for many new discoveries. Improving methods for controlling the immobilization of bacteria provides powerful experimental tools for testing hypotheses regarding microbiome interactions, studying the transfer of nutrients between bacterial species, and developing microbial communities for green energy production and pollution remediation.
Adaptation of a Control Center Development Environment for Industrial Process Control
NASA Technical Reports Server (NTRS)
Killough, Ronnie L.; Malik, James M.
1994-01-01
In the control center, raw telemetry data is received for storage, display, and analysis. This raw data must be combined and manipulated in various ways by mathematical computations to facilitate analysis, provide diversified fault detection mechanisms, and enhance display readability. A development tool called the Graphical Computation Builder (GCB) has been implemented which provides flight controllers with the capability to implement computations for use in the control center. The GCB provides a language that contains both general programming constructs and language elements specifically tailored for the control center environment. The GCB concept allows staff who are not skilled in computer programming to author and maintain computer programs. The GCB user is isolated from the details of external subsystem interfaces and has access to high-level functions such as matrix operators, trigonometric functions, and unit conversion macros. The GCB provides a high level of feedback during computation development that improves upon the often cryptic errors produced by computer language compilers. An equivalent need can be identified in the industrial data acquisition and process control domain: that of an integrated graphical development tool tailored to the application to hide the operating system, computer language, and data acquisition interface details. The GCB features a modular design which makes it suitable for technology transfer without significant rework. Control center-specific language elements can be replaced by elements specific to industrial process control.
COINGRAD; Control Oriented Interactive Graphical Analysis and Design.
ERIC Educational Resources Information Center
Volz, Richard A.; And Others
The computer is currently a vital tool in engineering analysis and design. With the introduction of moderately priced graphics terminals, it will become even more important in the future as rapid graphic interaction between the engineer and the computer becomes more feasible in computer-aided design (CAD). To provide a vehicle for introducing…
Mobile Learning Projects--A Critical Analysis of the State of the Art
ERIC Educational Resources Information Center
Frohberg, D.; Goth, C.; Schwabe, G.
2009-01-01
This paper provides a critical analysis of Mobile Learning projects published before the end of 2007. The review uses a Mobile Learning framework to evaluate and categorize 102 Mobile Learning projects, and to briefly introduce exemplary projects for each category. All projects were analysed with the criteria: context, tools, control,…
Behavior under the Microscope: Increasing the Resolution of Our Experimental Procedures
ERIC Educational Resources Information Center
Palmer, David C.
2010-01-01
Behavior analysis has exploited conceptual tools whose experimental validity has been amply demonstrated, but their relevance to large-scale and fine-grained behavioral phenomena remains uncertain, because the experimental analysis of these domains faces formidable obstacles of measurement and control. In this essay I suggest that, at least at the…
Darviri, Christina; Alexopoulos, Evangelos C; Artemiadis, Artemios K; Tigani, Xanthi; Kraniotou, Christina; Darvyri, Panagiota; Chrousos, George P
2014-09-24
The main goal of stress management and health promotion programs is to improve health by empowering people to take control over their lives. Daily health-related lifestyle choices are integral targets of these interventions and critical to evaluating their efficacy. To date, concepts such as self-efficacy, self-control and empowerment are assessed by tools that only partially address daily lifestyle choices. The aim of this study is to validate a novel measurement tool, the Healthy Lifestyle and Personal Control Questionnaire (HLPCQ), which aims to assess the concept of empowerment through a constellation of daily activities. Therefore, we performed principal component analysis (PCA) of 26 items that were derived from the qualitative data of several stress management programs conducted by our research team. The PCA resulted in the following five-factor solution: 1) Dietary Healthy Choices, 2) Dietary Harm Avoidance, 3) Daily Routine, 4) Organized Physical Exercise and 5) Social and Mental Balance. All subscales showed satisfactory internal consistency and variance, relative to theoretical score ranges. Subscale scores and the total score were significantly correlated with perceived stress and health locus of control, implying good criterion validity. Associations with sociodemographic data and other variables, such as sleep quality and health assessments, were also found. The HLPCQ is a good tool for assessing the efficacy of future health-promoting interventions to improve individuals' lifestyle and wellbeing.
A model of motor performance during surface penetration: from physics to voluntary control.
Klatzky, Roberta L; Gershon, Pnina; Shivaprabhu, Vikas; Lee, Randy; Wu, Bing; Stetten, George; Swendsen, Robert H
2013-10-01
The act of puncturing a surface with a hand-held tool is a ubiquitous but complex motor behavior that requires precise force control to avoid potentially severe consequences. We present a detailed model of puncture over a time course of approximately 1,000 ms, which is fit to kinematic data from individual punctures, obtained via a simulation with high-fidelity force feedback. The model describes puncture as proceeding from purely physically determined interactions between the surface and tool, through decline of force due to biomechanical viscosity, to cortically mediated voluntary control. When fit to the data, it yields parameters for the inertial mass of the tool/person coupling, time characteristic of force decline, onset of active braking, stopping time and distance, and late oscillatory behavior, all of which the analysis relates to physical variables manipulated in the simulation. While the present data characterize distinct phases of motor performance in a group of healthy young adults, the approach could potentially be extended to quantify the performance of individuals from other populations, e.g., with sensory-motor impairments. Applications to surgical force control devices are also considered.
Ovejero, M C; Pérez Vega-Leal, A; Gallardo, M I; Espino, J M; Selva, A; Cortés-Giraldo, M A; Arráns, R
2017-02-01
The aim of this work is to present a new data acquisition, control, and analysis software system written in LabVIEW. This system has been designed to obtain the dosimetry of a silicon strip detector in polyethylene. It allows full automation of the experiments and data analysis required for the dosimetric characterization of silicon detectors, and it is a useful tool that can be applied in the daily routine checks of a beam accelerator.
This is the first phase of a potentially multi-phase project aimed at identifying scientific methodologies that will lead to the development of innovative analytical tools supporting the analysis of control strategy effectiveness, namely, accountability. Significant reductions i...
Bioimpedance harmonic analysis as a tool to simultaneously assess circulation and nervous control.
Mudraya, I S; Revenko, S V; Nesterov, A V; Gavrilov, I Yu; Kirpatovsky, V I
2011-07-01
Multicycle harmonic (Fourier) analysis of bioimpedance was employed to simultaneously assess circulation and neural activity in visceral (rat urinary bladder) and somatic (human finger) organs. The informative value of the first cardiac harmonic of the bladder impedance as an index of bladder circulation is demonstrated. The individual reactions of normal and obstructive bladders in response to infusion cystometry were recorded. The potency of multicycle harmonic analysis of bioimpedance to assess sympathetic and parasympathetic neural control in urinary bladder is discussed. In the human finger, bioimpedance harmonic analysis revealed three periodic components at the rate of the heart beat, respiration and Mayer wave (0.1 Hz), which were observed under normal conditions and during blood flow arrest in the hand. The revealed spectrum peaks were explained by the changes in systemic blood pressure and in regional vascular tone resulting from neural vasomotor control. During normal respiration and circulation, two side cardiac peaks were revealed in a bioimpedance amplitude spectrum, whose amplitude reflected the depth of amplitude respiratory modulation of the cardiac output. During normal breathing, the peaks corresponding to the second and third cardiac harmonics were split, reflecting frequency respiratory modulation of the heart rate. Multicycle harmonic analysis of bioimpedance is a novel potent tool to examine the interaction between the respiratory and cardiovascular system and to simultaneously assess regional circulation and neural influences in visceral and somatic organs.
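The first step implied here, isolating the cardiac harmonic from an impedance record, can be sketched with a discrete Fourier transform. The sampling rate, heart rate and synthetic two-component signal below are illustrative assumptions, not the recording conditions of the study.

```python
import numpy as np

def harmonic_amplitude(signal, fs, target_hz):
    """Single-sided spectral amplitude of `signal` at the frequency bin
    closest to `target_hz` (fs = sampling rate in Hz)."""
    z = np.asarray(signal, float)
    z = z - z.mean()                      # drop the DC (baseline) component
    spectrum = np.abs(np.fft.rfft(z)) * 2 / z.size
    freqs = np.fft.rfftfreq(z.size, 1 / fs)
    i = np.argmin(np.abs(freqs - target_hz))
    return freqs[i], spectrum[i]

# Synthetic record: cardiac component at 1.2 Hz (72 beats/min) riding on
# a larger respiratory component at 0.25 Hz (15 breaths/min).
t = np.arange(0, 60, 0.01)
z = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 2.0 * np.sin(2 * np.pi * 0.25 * t)
freq, cardiac_amp = harmonic_amplitude(z, fs=100.0, target_hz=1.2)  # amp ≈ 0.5
```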
Statistical process control: separating signal from noise in emergency department operations.
Pimentel, Laura; Barrueto, Fermin
2015-05-01
Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
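The moving range chart favored here has a standard closed form for its limits: the center line is the mean of the individual values, and the control limits sit 2.66 average moving ranges above and below it (2.66 ≈ 3/d2 with d2 = 1.128 for subgroups of two). A minimal sketch with hypothetical ED data:

```python
import numpy as np

def xmr_limits(values):
    """Center line and control limits for an individuals (XmR) chart."""
    x = np.asarray(values, float)
    mr_bar = np.mean(np.abs(np.diff(x)))     # average moving range
    center = x.mean()
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Hypothetical daily median door-to-provider times (minutes); points
# outside the limits signal special-cause variation worth investigating.
times = [28, 31, 25, 34, 29, 52, 30, 27, 33, 26]
lcl, center, ucl = xmr_limits(times)
signals = [t for t in times if not lcl <= t <= ucl]
```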
Integrated dynamic analysis simulation of space stations with controllable solar array
NASA Technical Reports Server (NTRS)
Heinrichs, J. A.; Fee, J. J.
1972-01-01
A methodology is formulated and presented for the integrated structural dynamic analysis of space stations with controllable solar arrays and non-controllable appendages. The structural system flexibility characteristics are considered in the dynamic analysis by a synthesis technique whereby free-free space station modal coordinates and cantilever appendage coordinates are inertially coupled. A digital simulation of this analysis method is described and verified by comparison of interaction load solutions with other methods of solution. Motion equations are simulated for both the zero gravity and artificial gravity (spinning) orbital conditions. Closed loop controlling dynamics for both orientation control of the arrays and attitude control of the space station are provided in the simulation by various generic types of controlling systems. The capability of the simulation as a design tool is demonstrated by utilizing typical space station and solar array structural representations and a specific structural perturbing force. Response and interaction load solutions are presented for this structural configuration and indicate the importance of using an integrated type analysis for the predictions of structural interactions.
NASA Astrophysics Data System (ADS)
Pagliarone, C. E.; Uttaro, S.; Cappelli, L.; Fallone, M.; Kartal, S.
2017-02-01
CAT, Cryogenic Analysis Tools, is a software package developed using LabVIEW and ROOT environments to analyze the performance of large-size cryostats, where many parameters, input, and control variables need to be acquired and studied at the same time. The present paper describes how CAT works and the main improvements achieved in the new version, CAT 2. New Graphical User Interfaces have been developed to make the full package more user-friendly, and a process of resource optimization has been carried out. The offline analysis of the full cryostat performance is available both through the ROOT command-line interface and through the new graphical interfaces.
quantGenius: implementation of a decision support system for qPCR-based gene quantification.
Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina
2017-05-25
Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for control of errors in the final results. Due to several factors that can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing for robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for a reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of user-guided QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists to obtain reliable results which are the basis for biologically meaningful data interpretation.
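The standard-curve computation at the core of this approach reduces to a log-linear fit and back-calculation, followed by reference-gene normalization. The sketch below shows only that arithmetic, with hypothetical dilution-series values; quantGenius's QC-based decision logic is deliberately omitted.

```python
import numpy as np

def fit_standard_curve(log10_quantities, cq_values):
    """Fit Cq = slope * log10(quantity) + intercept from a dilution series;
    amplification efficiency follows as 10**(-1/slope) - 1 (~1.0 = 100%)."""
    slope, intercept = np.polyfit(log10_quantities, cq_values, 1)
    efficiency = 10 ** (-1 / slope) - 1
    return slope, intercept, efficiency

def quantify(cq, slope, intercept):
    """Back-calculate a sample's quantity from its measured Cq."""
    return 10 ** ((cq - intercept) / slope)

# Hypothetical ten-fold dilution series of a qPCR standard:
slope, intercept, eff = fit_standard_curve([1, 2, 3, 4, 5],
                                           [31.2, 27.9, 24.5, 21.2, 17.8])
# Normalize the target gene to a reference gene measured in the same sample:
normalized = quantify(25.0, slope, intercept) / quantify(22.0, slope, intercept)
```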
Gerbig, Stefanie; Stern, Gerold; Brunn, Hubertus E; Düring, Rolf-Alexander; Spengler, Bernhard; Schulz, Sabine
2017-03-01
Direct analysis of fruit and vegetable surfaces is an important tool for in situ detection of food contaminants such as pesticides. We tested three different ways to prepare samples for the qualitative desorption electrospray ionization mass spectrometry (DESI-MS) analysis of 32 pesticides found on nine authentic fruits collected from food control. Best recovery rates for topically applied pesticides (88%) were found by analyzing the surface of a glass slide which had been rubbed against the surface of the food. Pesticide concentration in all samples was at or below the maximum residue level allowed. In addition to the high sensitivity of the method for qualitative analysis, quantitative or, at least, semi-quantitative information is needed in food control. We developed a DESI-MS method for the simultaneous determination of linear calibration curves of multiple pesticides of the same chemical class using normalization to one internal standard (ISTD). The method was first optimized for food extracts and subsequently evaluated for the quantification of pesticides in three authentic food extracts. Next, pesticides and the ISTD were applied directly onto food surfaces, and the corresponding calibration curves were obtained. The determination of linear calibration curves was still feasible, as demonstrated for three different food surfaces. This proof-of-principle method was used to simultaneously quantify two pesticides on an authentic sample, showing that the method developed could serve as a fast and simple preselective tool for disclosure of pesticide regulation violations. Graphical Abstract: Multiple pesticide residues were detected and quantified in situ from an authentic set of food items and extracts in a proof-of-principle study.
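The ISTD-normalized calibration described here amounts to a linear fit of the analyte-to-internal-standard intensity ratio against concentration. The sketch below uses hypothetical intensities and units; it illustrates the normalization idea rather than the authors' validated procedure.

```python
import numpy as np

def istd_calibration(concentrations, analyte_intensity, istd_intensity):
    """Fit a linear calibration on ISTD-normalized signals; dividing by the
    co-applied internal standard compensates for shot-to-shot variability
    in desorption and ionization."""
    response = np.asarray(analyte_intensity, float) / np.asarray(istd_intensity, float)
    slope, intercept = np.polyfit(concentrations, response, 1)
    return slope, intercept

def concentration(analyte_intensity, istd_intensity, slope, intercept):
    """Back-calculate concentration from an unknown's normalized response."""
    return (analyte_intensity / istd_intensity - intercept) / slope

# Hypothetical spiked-surface calibration (ng/cm^2 vs. ion intensities):
slope, intercept = istd_calibration([5, 10, 25, 50],
                                    [1.1e4, 2.0e4, 5.2e4, 9.9e4],
                                    [2.0e4, 2.1e4, 1.9e4, 2.0e4])
c_unknown = concentration(4.0e4, 2.0e4, slope, intercept)
```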
Guidebook for analysis of tether applications
NASA Technical Reports Server (NTRS)
Carroll, J. A.
1985-01-01
This guidebook is intended as a tool to facilitate initial analyses of proposed tether applications in space. Topics discussed include: orbit and orbit transfer equations; orbital perturbations; aerodynamic drag; thermal balance; micrometeoroids; gravity gradient effects; tether control strategies; momentum transfer; orbit transfer by tethered release/rendezvous; impact hazards for tethers; electrodynamic tether principles; and electrodynamic libration control issues.
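As one example of the quick estimates such a guidebook supports, the gravity-gradient tension on a vertically hanging tether in circular orbit is approximately T = 3mω²L for an end mass m offset a distance L from the system's center of mass (valid when L is much smaller than the orbit radius, with ω the orbital rate). A sketch with illustrative numbers:

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def gravity_gradient_tension(mass_kg, offset_m, orbit_radius_m):
    """Tension T = 3 m w^2 L on a hanging-tether end mass, where
    w = sqrt(mu / a^3) is the circular orbital rate."""
    w = math.sqrt(MU_EARTH / orbit_radius_m ** 3)
    return 3 * mass_kg * w ** 2 * offset_m

# 500 kg end mass, 10 km from the center of mass, ~400 km altitude:
tension = gravity_gradient_tension(500.0, 10_000.0, 6.778e6)  # ≈ 19 N
```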
Graeden, Ellie; Kerr, Justin; Sorrell, Erin M.; Katz, Rebecca
2018-01-01
Managing infectious disease requires rapid and effective response to support decision making. The decisions are complex and require understanding of the diseases, disease intervention and control measures, and the disease-relevant characteristics of the local community. Though disease modeling frameworks have been developed to address these questions, the complexity of current models presents a significant barrier to community-level decision makers in using the outputs of the most scientifically robust methods to support pragmatic decisions about implementing a public health response effort, even for endemic diseases with which they are already familiar. Here, we describe the development of an application available on the internet, including from mobile devices, with a simple user interface, to support on-the-ground decision-making for integrating disease control programs, given local conditions and practical constraints. The model upon which the tool is built provides predictive analysis for the effectiveness of integration of schistosomiasis and malaria control, two diseases with extensive geographical and epidemiological overlap, and which result in significant morbidity and mortality in affected regions. Working with data from countries across sub-Saharan Africa and the Middle East, we present a proof-of-principle method and corresponding prototype tool to provide guidance on how to optimize integration of vertical disease control programs. This method and tool demonstrate significant progress in effectively translating the best available scientific models to support practical decision making on the ground with the potential to significantly increase the efficacy and cost-effectiveness of disease control.
Author summary: Designing and implementing effective programs for infectious disease control requires complex decision-making, informed by an understanding of the diseases, the types of disease interventions and control measures available, and the disease-relevant characteristics of the local community. Though disease modeling frameworks have been developed to address these questions and support decision-making, the complexity of current models presents a significant barrier to on-the-ground end users. The picture is further complicated when considering approaches for integration of different disease control programs, where co-infection dynamics, treatment interactions, and other variables must also be taken into account. Here, we describe the development of an application available on the internet with a simple user interface, to support on-the-ground decision-making for integrating disease control, given local conditions and practical constraints. The model upon which the tool is built provides predictive analysis for the effectiveness of integration of schistosomiasis and malaria control, two diseases with extensive geographical and epidemiological overlap. This proof-of-concept method and tool demonstrate significant progress in effectively translating the best available scientific models to support pragmatic decision-making on the ground, with the potential to significantly increase the impact and cost-effectiveness of disease control. PMID:29649260
Scoring Tools for the Analysis of Infant Respiratory Inductive Plethysmography Signals.
Robles-Rubio, Carlos Alejandro; Bertolizio, Gianluca; Brown, Karen A; Kearney, Robert E
2015-01-01
Infants recovering from anesthesia are at risk of life threatening Postoperative Apnea (POA). POA events are rare, and so the study of POA requires the analysis of long cardiorespiratory records. Manual scoring is the preferred method of analysis for these data, but it is limited by low intra- and inter-scorer repeatability. Furthermore, recommended scoring rules do not provide a comprehensive description of the respiratory patterns. This work describes a set of manual scoring tools that address these limitations. These tools include: (i) a set of definitions and scoring rules for 6 mutually exclusive, unique patterns that fully characterize infant respiratory inductive plethysmography (RIP) signals; (ii) RIPScore, a graphical, manual scoring software to apply these rules to infant data; (iii) a library of data segments representing each of the 6 patterns; (iv) a fully automated, interactive formal training protocol to standardize the analysis and establish intra- and inter-scorer repeatability; and (v) a quality control method to monitor scorer ongoing performance over time. To evaluate these tools, three scorers from varied backgrounds were recruited and trained to reach a performance level similar to that of an expert. These scorers used RIPScore to analyze data from infants at risk of POA in two separate, independent instances. Scorers performed with high accuracy and consistency, analyzed data efficiently, had very good intra- and inter-scorer repeatability, and exhibited only minor confusion between patterns. These results indicate that our tools represent an excellent method for the analysis of respiratory patterns in long data records. Although the tools were developed for the study of POA, their use extends to any study of respiratory patterns using RIP (e.g., sleep apnea, extubation readiness). Moreover, by establishing and monitoring scorer repeatability, our tools enable the analysis of large data sets by multiple scorers, which is essential for longitudinal and multicenter studies.
Bringing "Scientific Expeditions" Into the Schools
NASA Technical Reports Server (NTRS)
Watson, Val; Lasinski, T. A. (Technical Monitor)
1995-01-01
Two new technologies, the FASTexpedition and Remote FAST, have been developed that provide remote, 3D, high resolution, dynamic, interactive viewing of scientific data (such as simulations or measurements of fluid dynamics). The FASTexpedition permits one to access scientific data from the World Wide Web, take guided expeditions through the data, and continue with self-controlled expeditions through the data. Remote FAST permits collaborators at remote sites to simultaneously view an analysis of scientific data being controlled by one of the collaborators. Control can be transferred between sites. These technologies are now being used for remote collaboration in joint university, industry, and NASA projects in computational fluid dynamics (CFD) and wind tunnel testing. Also, NASA Ames Research Center has initiated a project to make scientific data and guided expeditions through the data available as FASTexpeditions on the World Wide Web for educational purposes. Previously, remote visualization of dynamic data was done using video format (transmitting pixel information) such as video conferencing or MPEG movies on the Internet. The concept for this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool that is currently used is FAST (Flow Analysis Software Toolkit). The advantages of this new technology over using video format are: 1. The visual is much higher in resolution (1280x1024 pixels with 24 bits of color) than typical video format transmitted over the network. 2. The form of the visualization can be controlled interactively (because the viewer is interactively controlling the visualization tool running on his workstation). 3. A rich variety of guided expeditions through the data can be included easily. 4. A capability is provided for other sites to see a visual analysis of one site as the analysis is interactively performed. Control of the analysis can be passed from site to site. 5. The scenes can be viewed in 3D using stereo vision. 6. The network bandwidth used for the visualization using this new technology is much smaller than when using video format. (The measured peak bandwidth used was 1 Kbit/sec whereas the measured bandwidth for a small video picture was 500 Kbits/sec.)
Application of the user-centred design process according ISO 9241-210 in air traffic control.
König, Christina; Hofmann, Thomas; Bruder, Ralph
2012-01-01
Designing a usable human machine interface for air traffic control is challenging and should follow approved methods. The ISO 9241-210 standard promises high usability of products by integrating future users and following an iterative process. This contribution describes the procedure and first results of the analysis and application of ISO 9241-210 to develop a planning tool for air traffic controllers.
Study of a direct visualization display tool for space applications
NASA Astrophysics Data System (ADS)
Pereira do Carmo, J.; Gordo, P. R.; Martins, M.; Rodrigues, F.; Teodoro, P.
2017-11-01
The study of a Direct Visualization Display Tool (DVDT) for space applications is reported. The review of novel technologies for a compact display tool is described. Several applications for this tool have been identified with the support of ESA astronauts and are presented. A baseline design is proposed. It consists mainly of OLEDs as the image source; a specially designed optical prism as relay optics; a Personal Digital Assistant (PDA), with a data acquisition card, as the control unit; and voice control and a simplified keyboard as interfaces. Optical analysis and the final estimated performance are reported. The system is able to display information (text, pictures and/or video) with SVGA resolution directly to the astronaut using a Field of View (FOV) of 20x14.5 degrees. The image delivery system is a monocular Head Mounted Display (HMD) that weighs less than 100 g. The HMD optical system has an eye pupil of 7 mm and an eye relief distance of 30 mm.
Multi-modal virtual environment research at Armstrong Laboratory
NASA Technical Reports Server (NTRS)
Eggleston, Robert G.
1995-01-01
One mission of the Paul M. Fitts Human Engineering Division of Armstrong Laboratory is to improve the user interface for complex systems through user-centered exploratory development and research activities. In support of this goal, many current projects attempt to advance and exploit user-interface concepts made possible by virtual reality (VR) technologies. Virtual environments may be used as a general purpose interface medium, an alternative display/control method, a data visualization and analysis tool, or a graphically based performance assessment tool. An overview is given of research projects within the division on prototype interface hardware/software development, integrated interface concept development, interface design and evaluation tool development, and user and mission performance evaluation tool development.
NASA Astrophysics Data System (ADS)
Ogruc Ildiz, G.; Arslan, M.; Unsalan, O.; Araujo-Andrade, C.; Kurt, E.; Karatepe, H. T.; Yilmaz, A.; Yalcinkaya, O. B.; Herken, H.
2016-01-01
In this study, a methodology based on Fourier-transform infrared spectroscopy, principal component analysis and partial least squares methods is proposed for the analysis of blood plasma samples in order to identify spectral changes correlated with some biomarkers associated with schizophrenia and bipolarity. Our main goal was to use the spectral information for the calibration of statistical models to discriminate and classify blood plasma samples belonging to bipolar and schizophrenic patients. IR spectra were collected from 30 blood plasma samples from each group: bipolar patients, schizophrenic patients, and a healthy control group. The results obtained from principal component analysis (PCA) show a clear discrimination between the bipolar (BP), schizophrenic (SZ) and control group (CG) blood samples and also make it possible to identify three main regions that show the major differences correlated with both mental disorders (biomarkers). Furthermore, a model for the classification of the blood samples was calibrated using partial least squares discriminant analysis (PLS-DA), allowing the correct classification of BP, SZ and CG samples. The results obtained applying this methodology suggest that it can be used as a complementary diagnostic tool for the detection and discrimination of these mental diseases.
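The PCA/PLS-DA workflow summarized above can be sketched with scikit-learn. Random arrays stand in for the measured IR spectra, and the component counts are illustrative choices rather than the study's calibrated model; PLS-DA is realized, as is common, by regressing one-hot class indicators with PLS.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(90, 600))        # stand-in for spectra (samples x wavenumbers)
classes = np.repeat([0, 1, 2], 30)    # control, bipolar, schizophrenic groups
Y = np.eye(3)[classes]                # one-hot class indicators

Xs = StandardScaler().fit_transform(X)

# Exploratory discrimination: score plot on the first two principal components.
scores = PCA(n_components=2).fit_transform(Xs)

# Classification: PLS regression on the class indicators (PLS-DA); a sample
# is assigned to the class with the largest predicted indicator value.
pls = PLSRegression(n_components=5).fit(Xs, Y)
predicted = np.argmax(pls.predict(Xs), axis=1)
training_accuracy = np.mean(predicted == classes)
```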
Advanced space system analysis software. Technical, user, and programmer guide
NASA Technical Reports Server (NTRS)
Farrell, C. E.; Zimbelman, H. F.
1981-01-01
The LASS computer program provides a tool for interactive preliminary and conceptual design of LSS. Eight program modules were developed, including four automated model geometry generators, an associated mass properties module, an appendage synthesizer module, an RF analysis module, and an orbital transfer analysis module. The existing rigid-body controls analysis module was modified to permit analysis of the effects of solar pressure on orbital performance. A description of each module, user instructions, and programmer information are included.
Mason, Alexina J; Gomes, Manuel; Grieve, Richard; Ulug, Pinar; Powell, Janet T; Carpenter, James
2017-01-01
Background/aims: The analyses of randomised controlled trials with missing data typically assume that, after conditioning on the observed data, the probability of missing data does not depend on the patient’s outcome, and so the data are ‘missing at random’ . This assumption is usually implausible, for example, because patients in relatively poor health may be more likely to drop out. Methodological guidelines recommend that trials require sensitivity analysis, which is best informed by elicited expert opinion, to assess whether conclusions are robust to alternative assumptions about the missing data. A major barrier to implementing these methods in practice is the lack of relevant practical tools for eliciting expert opinion. We develop a new practical tool for eliciting expert opinion and demonstrate its use for randomised controlled trials with missing data. Methods: We develop and illustrate our approach for eliciting expert opinion with the IMPROVE trial (ISRCTN 48334791), an ongoing multi-centre randomised controlled trial which compares an emergency endovascular strategy versus open repair for patients with ruptured abdominal aortic aneurysm. In the IMPROVE trial at 3 months post-randomisation, 21% of surviving patients did not complete health-related quality of life questionnaires (assessed by EQ-5D-3L). We address this problem by developing a web-based tool that provides a practical approach for eliciting expert opinion about quality of life differences between patients with missing versus complete data. We show how this expert opinion can define informative priors within a fully Bayesian framework to perform sensitivity analyses that allow the missing data to depend upon unobserved patient characteristics. Results: A total of 26 experts, of 46 asked to participate, completed the elicitation exercise. The elicited quality of life scores were lower on average for the patients with missing versus complete data, but there was considerable uncertainty in these elicited values. The missing at random analysis found that patients randomised to the emergency endovascular strategy versus open repair had higher average (95% credible interval) quality of life scores of 0.062 (−0.005 to 0.130). Our sensitivity analysis that used the elicited expert information as pooled priors found that the gain in average quality of life for the emergency endovascular strategy versus open repair was 0.076 (−0.054 to 0.198). Conclusion: We provide and exemplify a practical tool for eliciting the expert opinion required by recommended approaches to the sensitivity analyses of randomised controlled trials. We show how this approach allows the trial analysis to fully recognise the uncertainty that arises from making alternative, plausible assumptions about the reasons for missing data. This tool can be widely used in the design, analysis and interpretation of future trials, and to facilitate this, materials are available for download. PMID:28675302
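One way to realize the elicited-opinion sensitivity analysis described above is by Monte Carlo: pool the experts' distributions for the quality-of-life difference between patients with missing and complete data, then propagate draws of that difference through each arm's mean. Everything below — the elicited means and standard deviations, arm means, and missingness shares — is hypothetical, and the equal-weight pooling is a deliberate simplification of the trial's fully Bayesian approach.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical elicited opinions on the mean QoL difference (missing minus
# complete patients), one (mean, sd) pair per expert:
expert_priors = [(-0.10, 0.08), (-0.05, 0.10), (-0.15, 0.12)]

def sample_pooled_delta(n):
    """Draw from the pooled prior: an equal-weight mixture of the experts'
    normal distributions."""
    idx = rng.integers(len(expert_priors), size=n)
    mu = np.array([expert_priors[i][0] for i in idx])
    sd = np.array([expert_priors[i][1] for i in idx])
    return rng.normal(mu, sd)

n_sims = 10_000
obs_mean = {"endovascular": 0.70, "open_repair": 0.64}   # observed-case means
p_missing = {"endovascular": 0.21, "open_repair": 0.21}  # shares with missing data

# Shift each arm's overall mean by its own draw of the elicited difference,
# weighted by the share of patients whose outcomes are missing.
effect = ((obs_mean["endovascular"] + p_missing["endovascular"] * sample_pooled_delta(n_sims))
          - (obs_mean["open_repair"] + p_missing["open_repair"] * sample_pooled_delta(n_sims)))
interval = np.percentile(effect, [2.5, 97.5])
```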
2011-09-20
optimal portfolio point on the efficient frontier, for example, Portfolio B on the chart in Figure A1. Then, by subsequently changing some of the ... optimized portfolio controlling for risk using the IRM methodology and tool suite. Results indicate that both rapid and incremental implementation ... Results of the KVA and SD scenario analysis provided the financial information required to forecast an optimized
An Analysis of the Elements of Collaboration Associated with Top Collaborative Tools
2010-03-01
lets you access your e-mail, calendar, and files from any web browser anywhere in the world. Web-based, www.hotoffice.com. Noodle: Vialect's (parent ... www.taroby.org). Yuuguu: an instant screen sharing, web conferencing, remote support, desktop remote control and messaging tool. Client ... Office, Noodle, Novlet, Revizr, Taroby, and Yuuguu) received all seven NS ratings (see Table 20 below). The overall ratings for the major elements
AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potts, T. Todd; Hylko, James M.; Douglas, Terence A.
2003-02-27
WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and using antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings is collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured into a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment controls in the field.
Quality and Efficiency Improvement Tools for Every Radiologist.
Kudla, Alexei U; Brook, Olga R
2018-06-01
In an era of value-based medicine, data-driven quality improvement is more important than ever to ensure safe and efficient imaging services. Familiarity with high-value tools enables all radiologists to successfully engage in quality and efficiency improvement. In this article, we review the model for improvement, strategies for measurement, and common practical tools with real-life examples that include Run chart, Control chart (Shewhart chart), Fishbone (Cause-and-Effect or Ishikawa) diagram, Pareto chart, 5 Whys, and Root Cause Analysis. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Operational Analysis in the Launch Environment
NASA Technical Reports Server (NTRS)
James, George; Kaouk, Mo; Cao, Tim; Fogt, Vince; Rocha, Rodney; Schultz, Ken; Tucker, Jon-Michael; Rayos, Eli; Bell,Jeff; Alldredge, David;
2012-01-01
The launch environment is a challenging regime to work due to changing system dynamics, changing environmental loading, joint compression loads that cannot be easily applied on the ground, and control effects. Operational testing is one of the few feasible approaches to capture system level dynamics since ground testing cannot reproduce all of these conditions easily. However, the most successful applications of Operational Modal Testing involve systems with good stationarity and long data acquisition times. This paper covers an ongoing effort to understand the launch environment and the utility of current operational modal tools. This work is expected to produce a collection of operational tools that can be applied to non-stationary launch environment, experience dealing with launch data, and an expanding database of flight parameters such as damping. This paper reports on recent efforts to build a software framework for the data processing utilizing existing and specialty tools; understand the limits of current tools; assess a wider variety of current tools; and expand the experience with additional datasets as well as to begin to address issues raised in earlier launch analysis studies.
NASA Technical Reports Server (NTRS)
Lung, Shun-fat; Pak, Chan-gi
2008-01-01
Updating the finite element model using measured data is a challenging problem in the area of structural dynamics. The model updating process requires not only satisfactory correlations between analytical and experimental results, but also the retention of dynamic properties of structures. Accurate rigid body dynamics are important for flight control system design and aeroelastic trim analysis. Minimizing the difference between analytical and experimental results is a type of optimization problem. In this research, a multidisciplinary design, analysis, and optimization (MDAO) tool is introduced to optimize the objective function and constraints such that the mass properties, the natural frequencies, and the mode shapes are matched to the target data as well as the mass matrix being orthogonalized.
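At its core, the updating step is an optimization over model parameters to minimize the mismatch between analytical and measured modal data. The sketch below substitutes a toy closed-form model for the finite element solve and matches natural frequencies only, omitting the mode-shape, mass-property and orthogonality terms the MDAO tool also handles.

```python
import numpy as np
from scipy.optimize import minimize

f_measured = np.array([4.1, 11.8, 23.5])   # hypothetical target frequencies, Hz

def analytical_frequencies(p):
    """Stand-in for the finite element solve: frequencies as a function of a
    stiffness scale p[0] and a mass scale p[1] (f ~ sqrt(k/m))."""
    baseline = np.array([4.0, 12.0, 24.0])
    return baseline * np.sqrt(p[0] / p[1])

def objective(p):
    # Relative squared mismatch between analytical and measured frequencies.
    return np.sum(((analytical_frequencies(p) - f_measured) / f_measured) ** 2)

result = minimize(objective, x0=[1.0, 1.0],
                  bounds=[(0.5, 2.0), (0.5, 2.0)])   # keep updates physical
```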
Fault management for the Space Station Freedom control center
NASA Technical Reports Server (NTRS)
Clark, Colin; Jowers, Steven; Mcnenny, Robert; Culbert, Chris; Kirby, Sarah; Lauritsen, Janet
1992-01-01
This paper describes model based reasoning fault isolation in complex systems using automated digraph analysis. It discusses the use of the digraph representation as the paradigm for modeling physical systems and a method for executing these failure models to provide real-time failure analysis. It also discusses the generality, ease of development and maintenance, complexity management, and susceptibility to verification and validation of digraph failure models. It specifically describes how a NASA-developed digraph evaluation tool and an automated process working with that tool can identify failures in a monitored system when supplied with one or more fault indications. This approach is well suited to commercial applications of real-time failure analysis in complex systems because it is both powerful and cost effective.
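The digraph evaluation idea can be illustrated compactly: represent failure propagation as a directed graph from sources to the indications they can produce, and keep only those sources whose reachable set covers every observed indication. The toy graph below is invented for illustration and is not a model of any NASA system.

```python
# Edges point from a failure source toward the indications it can produce.
digraph = {
    "pump_failure": ["low_pressure", "high_temp"],
    "valve_stuck":  ["low_pressure"],
    "sensor_drift": ["high_temp"],
    "low_pressure": [],
    "high_temp":    [],
}

def reachable(graph, node, seen=None):
    """All nodes reachable from `node` by following edges."""
    seen = set() if seen is None else seen
    for nxt in graph.get(node, []):
        if nxt not in seen:
            seen.add(nxt)
            reachable(graph, nxt, seen)
    return seen

def candidate_sources(graph, indications):
    """Failure sources whose downstream effects cover every indication."""
    return [n for n in graph if set(indications) <= reachable(graph, n)]

# With both indications active, only pump_failure explains them all:
print(candidate_sources(digraph, ["low_pressure", "high_temp"]))
```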
Software technology testbed softpanel prototype
NASA Technical Reports Server (NTRS)
1991-01-01
The following subject areas are covered: analysis of using Ada for the development of real-time control systems for the Space Station; analysis of the functionality of the Application Generator; analysis of the User Support Environment criteria; analysis of the SSE tools and procedures which are to be used for the development of ground/flight software for the Space Station; analysis of the CBATS tutorial (an Ada tutorial package); analysis of Interleaf; analysis of the Integration, Test and Verification process of the Space Station; analysis of the DMS on-orbit flight architecture; and analysis of the simulation architecture.
Software Tools to Support Research on Airport Departure Planning
NASA Technical Reports Server (NTRS)
Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul
2003-01-01
A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.
Goenka, Anu; Jeena, Prakash M; Mlisana, Koleka; Solomon, Tom; Spicer, Kevin; Stephenson, Rebecca; Verma, Arpana; Dhada, Barnesh; Griffiths, Michael J
2018-03-01
Early diagnosis of tuberculous meningitis (TBM) is crucial to achieve optimum outcomes. There is no effective rapid diagnostic test for use in children. We aimed to develop a clinical decision tool to facilitate the early diagnosis of childhood TBM. A retrospective case-control study was performed across 7 hospitals in KwaZulu-Natal, South Africa (2010-2014). We identified the variables most predictive of microbiologically confirmed TBM in children (3 months to 15 years) by univariate analysis. These variables were modelled into a clinical decision tool and its performance was tested on an independent sample group. Of 865 children with suspected TBM, 3% (25) were identified with microbiologically confirmed TBM. Clinical information was retrieved for 22 microbiologically confirmed cases of TBM and compared with 66 controls matched for age, ethnicity, sex and geographical origin. The 9 most predictive variables among the confirmed cases were used to develop a clinical decision tool (CHILD TB LP): altered Consciousness; caregiver HIV infected; Illness length >7 days; Lethargy; focal neurologic Deficit; failure to Thrive; Blood/serum sodium <132 mmol/L; CSF >10 Lymphocytes ×10/L; CSF Protein >0.65 g/L. This tool successfully classified an independent sample of 7 cases and 21 controls with a sensitivity of 100% and a specificity of 90%. The CHILD TB LP decision tool accurately classified microbiologically confirmed TBM. We propose that CHILD TB LP be prospectively evaluated as a novel rapid diagnostic tool for use in the initial evaluation of children with suspected neurologic infection presenting to hospitals in similar settings.
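Read as an algorithm, the tool is a checklist score over the nine named variables. The sketch below implements that reading; the abstract does not state the weights or the classification cutoff, so equal weights and a threshold of 4 are explicit assumptions.

```python
# Hedged sketch of the CHILD TB LP decision tool as a simple checklist score.
# Equal weighting and the cutoff of 4 are assumptions, not study values.
FEATURES = [
    "altered_consciousness", "caregiver_hiv_infected", "illness_over_7_days",
    "lethargy", "focal_neurologic_deficit", "failure_to_thrive",
    "sodium_below_132", "csf_lymphocytes_over_10", "csf_protein_over_0_65",
]

def child_tb_lp_score(findings: dict) -> int:
    """Count how many of the 9 CHILD TB LP features are present."""
    return sum(1 for f in FEATURES if findings.get(f, False))

def classify(findings: dict, cutoff: int = 4) -> str:
    # Cutoff is illustrative, not taken from the study.
    return "suspect TBM" if child_tb_lp_score(findings) >= cutoff else "TBM less likely"

example = {"altered_consciousness": True, "lethargy": True,
           "sodium_below_132": True, "csf_protein_over_0_65": True}
print(classify(example))  # -> "suspect TBM" with the assumed cutoff
```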
Control Systems with Normalized and Covariance Adaptation by Optimal Control Modification
NASA Technical Reports Server (NTRS)
Nguyen, Nhan T. (Inventor); Burken, John J. (Inventor); Hanson, Curtis E. (Inventor)
2016-01-01
Disclosed is a novel adaptive control method and system called optimal control modification with normalization and covariance adjustment. The invention specifically addresses current challenges with adaptive control in these areas: 1) persistent excitation, 2) complex nonlinear input-output mapping, 3) large inputs and persistent learning, and 4) the lack of stability analysis tools for certification. The invention has been subjected to extensive simulation and flight testing. The results substantiate the effectiveness of the invention and demonstrate its technical feasibility for use in modern aircraft flight control systems.
A Petri Net Approach Based Elementary Siphons Supervisor for Flexible Manufacturing Systems
NASA Astrophysics Data System (ADS)
Abdul-Hussin, Mowafak Hassan
2015-05-01
This paper presents an approach to constructing a class of S3PR nets for the modeling, simulation, and control of processes occurring in flexible manufacturing systems (FMS), based on the elementary siphons of a Petri net. Siphons are central to the analysis and control of deadlocks in FMS, and their control is a significant objective. Petri net models support efficient structural analysis of FMSs, and different policies can be implemented that lead to deadlock prevention. We present an effective deadlock-free policy for a special class of Petri nets called S3PR. Petri net structural analysis and reachability graph analysis are used for the analysis and control of the nets. Petri nets have been successfully employed as one of the most powerful tools for modelling FMS; using structural analysis, we show that liveness of such systems can be attributed to the absence of undermarked siphons.
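For readers unfamiliar with siphons, the sketch below checks the defining property on a toy two-place net: a set of places S is a siphon if every transition that deposits tokens into S also withdraws a token from S. The net is hypothetical, and none of the paper's S3PR construction or elementary-siphon supervisor is reproduced.

```python
# Minimal siphon check for a place/transition net given as arc sets.
# A place set S is a siphon if preset(S) is contained in postset(S), i.e.,
# every transition producing into S also consumes from S.
pre = {"t1": {"p1"}, "t2": {"p2"}}     # transition -> input places
post = {"t1": {"p2"}, "t2": {"p1"}}    # transition -> output places

def is_siphon(S):
    S = set(S)
    producers = {t for t, outs in post.items() if outs & S}  # feed S
    consumers = {t for t, ins in pre.items() if ins & S}     # drain S
    return producers <= consumers

print(is_siphon({"p1", "p2"}))  # True: the whole cycle is a siphon
print(is_siphon({"p1"}))        # False: t2 feeds p1 without consuming from {p1}
```

Once a siphon becomes empty it can never regain tokens, which is why undermarked siphons are the structural signature of deadlock that supervisor designs guard against.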
Modeling Constellation Virtual Missions Using the Vdot(Trademark) Process Management Tool
NASA Technical Reports Server (NTRS)
Hardy, Roger; ONeil, Daniel; Sturken, Ian; Nix, Michael; Yanez, Damian
2011-01-01
The authors have identified a software tool suite that will support NASA's Virtual Mission (VM) effort. This is accomplished by transforming a spreadsheet database of mission events, task inputs and outputs, timelines, and organizations into process visualization tools and a Vdot process management model that includes embedded analysis software as well as requirements and information related to data manipulation and transfer. This paper describes the progress to date, the application of the Virtual Mission not only to Constellation but to other architectures, and the pertinence to other aerospace applications. Vdot's intuitive visual interface brings VMs to life by turning static, paper-based processes into active, electronic processes that can be deployed, executed, managed, verified, and continuously improved. A VM can be executed using a computer-based, human-in-the-loop, real-time format, under the direction and control of the NASA VM Manager. Engineers in the various disciplines will not have to be Vdot-proficient but rather can fill out online, Excel-type databases with the mission information discussed above. The authors' tool suite converts this database into several process visualization tools for review and into Microsoft Project, which can be imported directly into Vdot. Many tools can be embedded directly into Vdot, and when the necessary data/information is received from a preceding task, the analysis can be initiated automatically. Other NASA analysis tools are too complex for this process, but Vdot automatically notifies the tool user that the data has been received and analysis can begin. The VM can be simulated from end to end using the authors' tool suite. The planned approach for the Vdot-based process simulation is to generate the process model from a database; other advantages of this semi-automated approach are that the participants can be geographically remote and that, after refining the process models via the human-in-the-loop simulation, the system can evolve into a process management server for the actual process.
Design and thermal analysis of a mold used in the injection of elastomers
NASA Astrophysics Data System (ADS)
Fekiri, Nasser; Canto, Cécile; Madec, Yannick; Mousseau, Pierre; Plot, Christophe; Sarda, Alain
2017-10-01
In the process of injection molding of elastomers, improving the energy efficiency of the tools is a current challenge for industry in terms of energy consumption, productivity and product quality. In the rubber industry, 20% of the energy consumed by capital goods comes from heating processes; more than 50% of heat losses are linked to insufficient control and thermal insulation of molds. The design of the tooling is evolving, in particular, toward reduction of the heated mass and thermal insulation of the molds. In this paper, we present a complex tool composed, on the one hand, of a multi-cavity mold designed by reducing the heated mass and equipped with independent control zones placed closest to each molding cavity and, on the other hand, of a regulated channel block (RCB) which makes it possible to limit the waste of rubber during the injection. The originality of this tool lies in thermally isolating the regulated channel block from the mold and the cavities from each other, in order to better control the temperature field in the material being transformed. We present the design and the instrumentation of the experimental set-up. Experimental measurements allow us to understand the thermal behaviour of the tool and to show the thermal heterogeneities on the surface of the mold and in the various cavities. Injection molding tests with the rubber and a thermal balance on the energy consumption of the tool are carried out.
Investigation of type-I interferon dysregulation by arenaviruses : a multidisciplinary approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kozina, Carol L.; Moorman, Matthew Wallace; Branda, Catherine
2011-09-01
This report provides a detailed overview of the work performed for project number 130781, 'A Systems Biology Approach to Understanding Viral Hemorrhagic Fever Pathogenesis.' We report progress in five key areas: single cell isolation devices and control systems, fluorescent cytokine and transcription factor reporters, on-chip viral infection assays, molecular virology analysis of Arenavirus nucleoprotein structure-function, and development of computational tools to predict virus-host protein interactions. Although a great deal of work remains from that begun here, we have developed several novel single cell analysis tools and knowledge of Arenavirus biology that will facilitate and inform future publications and funding proposals.
Interferometric analysis of polishing surface with a petal tool
NASA Astrophysics Data System (ADS)
Salas-Sánchez, Alfonso; Leal-Cabrera, Irce; Percino Zacarias, Elizabeth; Granados-Agustín, Fermín S.
2011-09-01
In this work, we describe phase-shift interferometric monitoring of a polishing process performed with a petal tool over a spherical surface to obtain a parabolic surface. A commercial polishing machine was used in the process; the purpose of this work is to gain control of the polishing time. For this analysis, we used a Fizeau interferometer from the ZYGO Company for optical shop testing, and the Durango software from Diffraction International Company for data acquisition, simulation, and evaluation of optical surfaces. We started the polishing process with a spherical surface 15.46 cm in diameter, with a radius of curvature of 59.9 cm and an f/# of 1.9.
NASA Astrophysics Data System (ADS)
Galvis, D.; Exposito, C.; Osma, G.; Amado, L.; Ordóñez, G.
2016-07-01
This paper presents an analysis of the hybrid lighting systems of the Electrical Engineering Building at the Industrial University of Santander, which is a green-building pilot for warm tropical conditions. The analysis of the lighting performance of inner spaces is based on lighting curves obtained from characterization of the daylighting systems of these spaces. A computational tool was built in Excel-Visual Basic to simulate the behaviour of the artificial lighting system, considering the artificial control system, user behaviour and solar conditions. The tool also estimates the electrical energy consumption of the lighting system for a day, a month and a year.
Hierarchical Control and Trajectory Planning
NASA Technical Reports Server (NTRS)
Martin, Clyde F.; Horn, P. W.
1994-01-01
Most of the time on this project was spent on the trajectory planning problem. The construction is equivalent to the classical spline construction in the case that the system matrix is nilpotent. If the dimension of the system is n, then the spline of degree 2n-1 is constructed. This gives a new approach to the construction of splines that is more efficient than the usual construction and at the same time allows the construction of a much larger class of splines. All known classes of splines are reconstructed using the approach of linear control theory. As a numerical analysis technique, control theory provides a very good tool for constructing splines. For the purposes of trajectory planning, however, it is quite another story. Enclosed in this document are four reports done under this grant.
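A hedged numerical sketch of the idea for the simplest nilpotent case, n = 2 (a double integrator): the minimum-energy control between two states over a fixed interval yields a position trajectory of degree 2n-1 = 3, i.e., a cubic spline segment. The system, interval, and waypoints are illustrative choices, not the report's examples.

```python
import numpy as np

# Double integrator x' = Ax + Bu; A is nilpotent, so e^{At} has closed form.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
T = 1.0
x0 = np.array([0.0, 0.0])     # initial position, velocity
xf = np.array([1.0, 0.0])     # final position, velocity (illustrative waypoint)

def expA(t):                  # matrix exponential of the nilpotent A
    return np.array([[1.0, t], [0.0, 1.0]])

# Controllability Gramian W(T) = int_0^T e^{As} B B' e^{A's} ds (closed form)
W = np.array([[T**3 / 3, T**2 / 2], [T**2 / 2, T]])
lam = np.linalg.solve(W, xf - expA(T) @ x0)

# Minimum-energy control u*(t) = B' e^{A'(T-t)} W^{-1} (xf - e^{AT} x0)
ts = np.linspace(0.0, T, 200)
u = np.array([(B.T @ expA(T - t).T @ lam).item() for t in ts])

# Integrate the state and check the position trace is numerically a cubic.
x, pos, dt = x0.copy(), [], ts[1] - ts[0]
for ui in u:
    pos.append(x[0])
    x = x + dt * (A @ x + B.flatten() * ui)
residual = np.polyfit(ts, pos, 3, full=True)[1]
print("cubic fit residual:", residual)  # small, up to Euler integration error
```

Stitching such segments across a sequence of waypoints reproduces the classical cubic spline, which is the correspondence the report exploits.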
Human Connectome Project Informatics: quality control, database services, and data visualization
Marcus, Daniel S.; Harms, Michael P.; Snyder, Abraham Z.; Jenkinson, Mark; Wilson, J Anthony; Glasser, Matthew F.; Barch, Deanna M.; Archie, Kevin A.; Burgess, Gregory C.; Ramaratnam, Mohana; Hodge, Michael; Horton, William; Herrick, Rick; Olsen, Timothy; McKay, Michael; House, Matthew; Hileman, Michael; Reid, Erin; Harwell, John; Coalson, Timothy; Schindler, Jon; Elam, Jennifer S.; Curtiss, Sandra W.; Van Essen, David C.
2013-01-01
The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study. PMID:23707591
A System for Fault Management and Fault Consequences Analysis for NASA's Deep Space Habitat
NASA Technical Reports Server (NTRS)
Colombano, Silvano; Spirkovska, Liljana; Baskaran, Vijaykumar; Aaseng, Gordon; McCann, Robert S.; Ossenfort, John; Smith, Irene; Iverson, David L.; Schwabacher, Mark
2013-01-01
NASA's exploration program envisions the utilization of a Deep Space Habitat (DSH) for human exploration of the space environment in the vicinity of Mars and/or asteroids. Communication latencies with ground control of as long as 20+ minutes make it imperative that DSH operations be highly autonomous, as any telemetry-based detection of a systems problem on Earth could well occur too late to assist the crew with the problem. A DSH-based development program has been initiated to develop and test the automation technologies necessary to support highly autonomous DSH operations. One such technology is a fault management tool to support performance monitoring of vehicle systems operations and to assist with real-time decision making in connection with operational anomalies and failures. Toward that end, we are developing the Advanced Caution and Warning System (ACAWS), a tool that combines dynamic and interactive graphical representations of spacecraft systems, systems modeling, automated diagnostic analysis and root cause identification, system and mission impact assessment, and mitigation procedure identification to help spacecraft operators (both flight controllers and crew) understand and respond to anomalies more effectively. In this paper, we describe four major architecture elements of ACAWS: Anomaly Detection, Fault Isolation, System Effects Analysis, and Graphic User Interface (GUI), and how these elements work in concert with each other and with other tools to provide fault management support to both the controllers and crew. We then describe recent evaluations and tests of ACAWS on the DSH testbed. The results of these tests support the feasibility and strength of our approach to failure management automation and enhanced operational autonomy.
Design of Robust Adaptive Unbalance Response Controllers for Rotors with Magnetic Bearings
NASA Technical Reports Server (NTRS)
Knospe, Carl R.; Tamer, Samir M.; Fedigan, Stephen J.
1996-01-01
Experimental results have recently demonstrated that an adaptive open loop control strategy can be highly effective in the suppression of unbalance induced vibration on rotors supported in active magnetic bearings. This algorithm, however, relies upon a predetermined gain matrix. Typically, this matrix is determined by an optimal control formulation resulting in the choice of the pseudo-inverse of the nominal influence coefficient matrix as the gain matrix. This solution may result in problems with stability and performance robustness since the estimated influence coefficient matrix is not equal to the actual influence coefficient matrix. Recently, analysis tools have been developed to examine the robustness of this control algorithm with respect to structured uncertainty. Herein, these tools are extended to produce a design procedure for determining the adaptive law's gain matrix. The resulting control algorithm has a guaranteed convergence rate and steady state performance in spite of the uncertainty in the rotor system. Several examples are presented which demonstrate the effectiveness of this approach and its advantages over the standard optimal control formulation.
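A brief numpy sketch of the adaptive open-loop update described here, with the gain matrix taken as the pseudo-inverse of an estimated influence coefficient matrix; the matrices and the 15% estimation mismatch are illustrative, and the paper's robust gain-design procedure is not reproduced.

```python
import numpy as np

# Synchronous vibration phasors obey x = C u + d: C is the true influence
# coefficient matrix, d the unbalance response, u the correction command.
C = np.array([[1.0 + 0.2j, 0.1j], [0.05 + 0.0j, 0.8 - 0.1j]])  # true plant
C_hat = 1.15 * C                          # imperfect identified estimate
d = np.array([0.5 + 0.3j, -0.2j])         # unbalance-induced vibration

T_gain = np.linalg.pinv(C_hat)            # classical pseudo-inverse gain
u = np.zeros(2, dtype=complex)
for _ in range(20):
    x = C @ u + d                         # measure synchronous vibration
    u = u - T_gain @ x                    # adaptive open-loop update
print("residual vibration magnitudes:", np.abs(C @ u + d))
```

With a perfect estimate the loop converges in one step; mismatch between C and its estimate slows or can destabilize it, which is exactly the robustness question the paper's design procedure addresses.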
System Analysis Applied to Autonomy: Application to Human-Rated Lunar/Mars Landers
NASA Technical Reports Server (NTRS)
Young, Larry A.
2006-01-01
System analysis is an essential technical discipline for the modern design of spacecraft and their associated missions. Specifically, system analysis is a powerful aid in identifying and prioritizing the required technologies needed for mission and/or vehicle development efforts. Maturation of intelligent systems technologies, and their incorporation into spacecraft systems, are dictating the development of new analysis tools, and incorporation of such tools into existing system analysis methodologies, in order to fully capture the trade-offs of autonomy on vehicle and mission success. A "system analysis of autonomy" methodology will be outlined and applied to a set of notional human-rated lunar/Mars lander missions toward answering these questions: 1. what is the optimum level of vehicle autonomy and intelligence required? and 2. what are the specific attributes of an autonomous system implementation essential for a given surface lander mission/application in order to maximize mission success? Future human-rated lunar/Mars landers, though nominally under the control of their crew, will, nonetheless, be highly automated systems. These automated systems will range from mission/flight control functions, to vehicle health monitoring and prognostication, to life-support and other "housekeeping" functions. The optimum degree of autonomy afforded to these spacecraft systems/functions has profound implications from an exploration system architecture standpoint.
CSM digital autopilot testing in support of ASTP experiments control requirements
NASA Technical Reports Server (NTRS)
Rue, D. L.
1975-01-01
Results are presented of CSM digital autopilot (DAP) testing. The testing was performed to demonstrate and evaluate control modes which are currently planned or could be considered for use in support of experiments on the ASTP mission. The testing was performed on the Lockheed Guidance, Navigation, and Control System Functional Simulator (GNCFS). This simulator, which was designed to test the Apollo and Skylab DAP control system, has been used extensively and is a proven tool for CSM DAP analysis.
Joint Control: A Discussion of Recent Research
Palmer, David C
2006-01-01
The discrimination of the onset of joint control is an important interpretive tool in explaining matching behavior and other complex phenomena, but the difficulty of getting experimental control of all relevant variables stands in the way of a definitive experiment. The studies in the present issue of The Analysis of Verbal Behavior illustrate how modest experiments can take their place in a web of interpretation to make a strong case that joint control is a necessary element of such phenomena. PMID:22477357
NASA to Test In-Flight Folding Spanwise Adaptive Wing to Enhance Aircraft Efficiency
2014-10-21
The objectives of testing on PTERA include the development of tools and vetting of system integration, evaluation of vehicle control law, and analysis of SAW airworthiness to examine benefits to in-flight efficiency.
Comparisons of Kinematics and Dynamics Simulation Software Tools
NASA Technical Reports Server (NTRS)
Shiue, Yeu-Sheng Paul
2002-01-01
Kinematic and dynamic analyses for moving bodies are essential to system engineers and designers in the process of design and validations. 3D visualization and motion simulation plus finite element analysis (FEA) give engineers a better way to present ideas and results. Marshall Space Flight Center (MSFC) system engineering researchers are currently using IGRIP from DELMIA Inc. as a kinematic simulation tool for discrete bodies motion simulations. Although IGRIP is an excellent tool for kinematic simulation with some dynamic analysis capabilities in robotic control, explorations of other alternatives with more powerful dynamic analysis and FEA capabilities are necessary. Kinematics analysis will only examine the displacement, velocity, and acceleration of the mechanism without considering effects from masses of components. With dynamic analysis and FEA, effects such as the forces or torques at the joint due to mass and inertia of components can be identified. With keen market competition, ALGOR Mechanical Event Simulation (MES), MSC visualNastran 4D, Unigraphics Motion+, and Pro/MECHANICA were chosen for explorations. In this study, comparisons between software tools were presented in terms of following categories: graphical user interface (GUI), import capability, tutorial availability, ease of use, kinematic simulation capability, dynamic simulation capability, FEA capability, graphical output, technical support, and cost. Propulsion Test Article (PTA) with Fastrac engine model exported from IGRIP and an office chair mechanism were used as examples for simulations.
Modelling and analysis of gene regulatory network using feedback control theory
NASA Astrophysics Data System (ADS)
El-Samad, H.; Khammash, M.
2010-01-01
Molecular pathways are a part of a remarkable hierarchy of regulatory networks that operate at all levels of organisation. These regulatory networks are responsible for much of the biological complexity within the cell. The dynamic character of these pathways and the prevalence of feedback regulation strategies in their operation make them amenable to systematic mathematical analysis using the same tools that have been used with success in analysing and designing engineering control systems. In this article, we aim at establishing this strong connection through various examples where the behaviour exhibited by gene networks is explained in terms of their underlying control strategies. We complement our analysis by a survey of mathematical techniques commonly used to model gene regulatory networks and analyse their dynamic behaviour.
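As a concrete miniature of this control-theoretic view, the sketch below simulates negative autoregulation, a protein repressing its own transcription, as a feedback loop; the Hill-function model and parameter values are generic illustrations, not taken from the article.

```python
from scipy.integrate import solve_ivp

# Negative autoregulation: synthesis is repressed by the protein itself
# (Hill kinetics), balanced by first-order degradation. Parameters are
# arbitrary demonstration values.
alpha, K, n_hill, gamma = 10.0, 1.0, 2.0, 1.0

def autoregulation(t, x):
    protein = x[0]
    production = alpha / (1.0 + (protein / K) ** n_hill)  # repressed synthesis
    return [production - gamma * protein]                 # degradation

sol = solve_ivp(autoregulation, (0.0, 10.0), [0.0])
print("steady-state protein level ~", sol.y[0, -1])  # ~2.0 for these values
```

Viewed as a control system, the repression term is a proportional-like negative feedback: it speeds the approach to steady state and makes that steady state less sensitive to parameter variation, which is the kind of explanation the article develops for real gene networks.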
Tools for Local and Distributed Climate Data Access
NASA Astrophysics Data System (ADS)
Schweitzer, R.; O'Brien, K.; Burger, E. F.; Smith, K. M.; Manke, A. B.; Radhakrishnan, A.; Balaji, V.
2017-12-01
Last year we reported on our efforts to adapt existing tools to facilitate model development. During the lifecycle of a Climate Model Intercomparison Project (CMIP), data must be quality controlled before it can be published and studied. Like previous efforts, the next CMIP6 will produce an unprecedented volume of data. For an institution, modelling group or modeller, the volume of data is unmanageable without tools that organize and automate as many processes as possible. Even if a modelling group has tools for data and metadata management, it often falls on individuals to do the initial quality assessment for a model run with bespoke tools. Using individually crafted tools can lead to interruptions when project personnel change and may result in inconsistencies and duplication of effort across groups. This talk will expand on our experiences using available tools (Ferret/PyFerret, the Live Access Server, the GFDL Curator, the GFDL Model Development Database Interface and the THREDDS Data Server) to seamlessly automate the data assembly process to give users "one-click" access to a rich suite of Web-based analysis and comparison tools. On the surface, it appears that this collection of tools is well suited to the task, but our experience of the last year taught us that the data volume and distributed storage add a number of challenges in adapting the tools for this task. Quality control and initial evaluation add their own set of challenges. We will discuss how we addressed the needs of QC researchers by expanding standard tools to include specialized plots and leveraged the configurability of the tools to add specific user-defined analysis operations so they are available to everyone using the system. We also report on our efforts to overcome some of the technical barriers to wide adoption of the tools by providing pre-built containers that are easily deployed in virtual machine and cloud environments. Finally, we will offer some suggestions for added features, configuration options, and improved robustness that can make future implementations of similar systems operate faster and more reliably. Solving these challenges for data sets distributed narrowly across networks and storage systems points the way to solving similar problems associated with sharing data distributed across institutions and continents.
Data and Tools | Energy Analysis | NREL
NREL develops energy analysis data and tools, including technology and performance analysis tools, energy systems analysis tools, and economic and financial analysis tools.
Intentional defect array wafers: their practical use in semiconductor control and monitoring systems
NASA Astrophysics Data System (ADS)
Emami, Iraj; McIntyre, Michael; Retersdorf, Michael
2003-07-01
In the competitive world of semiconductor manufacturing today, control of the process and manufacturing equipment is paramount to the success of the business. Consistent with the need for rapid development of process technology is a need for development with respect to equipment control, including defect metrology tools. Historical control methods for defect metrology tools included a raw count of defects detected on a characterized production or test wafer, with little or no regard to the attributes of the detected defects. Over time, these characterized wafers degrade with multiple passes on the tools and handling, requiring the tool owner to create and characterize new samples periodically. With the complex engineering software analysis systems used today, there is a strong reliance on the accuracy of defect size, location, and classification in order to provide the best value when correlating in-line data to sort-type data. Intentional Defect Array (IDA) wafers were designed and manufactured at International Sematech (ISMT) in Austin, Texas, and are a product of collaboration between ISMT member companies and suppliers of advanced defect inspection equipment. These wafers provide the user with known defect types and sizes in predetermined locations across the entire wafer. The wafers are designed to incorporate several desired flows and use critical dimensions consistent with current and future technology nodes. This paper briefly describes the design of the IDA wafer and details many practical applications in the control of advanced defect inspection equipment.
Kastberger, G; Kranner, G
2000-02-01
Viscovery SOMine is a software tool for advanced analysis and monitoring of numerical data sets. It was developed for professional use in business, industry, and science and to support dependency analysis, deviation detection, unsupervised clustering, nonlinear regression, data association, pattern recognition, and animated monitoring. Based on the concept of self-organizing maps (SOMs), it employs a robust variant of unsupervised neural networks--namely, Kohonen's Batch-SOM, which is further enhanced with a new scaling technique for speeding up the learning process. This tool provides a powerful means by which to analyze complex data sets without prior statistical knowledge. The data representation contained in the trained SOM is systematically converted to be used in a spectrum of visualization techniques, such as evaluating dependencies between components, investigating geometric properties of the data distribution, searching for clusters, or monitoring new data. We have used this software tool to analyze and visualize multiple influences of the ocellar system on free-flight behavior in giant honeybees. Occlusion of ocelli will affect orienting reactivities in relation to flight target, level of disturbance, and position of the bee in the flight chamber; it will induce phototaxis and make orienting imprecise and dependent on motivational settings. Ocelli permit the adjustment of orienting strategies to environmental demands by enforcing abilities such as centering or flight kinetics and by providing independent control of posture and flight course.
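A compact numpy sketch of batch-SOM training, the algorithm family underlying the tool; the map size, neighborhood schedule, and random data are illustrative, and none of Viscovery SOMine's enhancements (such as its speed-up scaling technique) are reproduced.

```python
import numpy as np

# Batch self-organizing map: assign every sample to its best-matching unit,
# then recompute each unit as a neighborhood-weighted mean of the samples.
rng = np.random.default_rng(1)
data = rng.normal(size=(500, 3))                 # 500 samples, 3 features
grid = np.array([(i, j) for i in range(8) for j in range(8)])  # 8x8 map
weights = rng.normal(size=(64, 3))               # codebook vectors

for sigma in np.linspace(3.0, 0.5, 20):          # shrinking neighborhood
    bmu = np.argmin(((data[:, None, :] - weights) ** 2).sum(-1), axis=1)
    # Gaussian neighborhood between each unit and each sample's best unit
    dist2 = ((grid[:, None, :] - grid[bmu][None, :, :]) ** 2).sum(-1)
    h = np.exp(-dist2 / (2 * sigma ** 2))        # shape (units, samples)
    weights = (h @ data) / h.sum(axis=1, keepdims=True)  # batch update

print("trained codebook shape:", weights.shape)
```

After training, neighboring units on the grid hold similar codebook vectors, which is what makes the map usable for the clustering and dependency visualizations the abstract describes.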
Coupled rotor/airframe vibration analysis
NASA Technical Reports Server (NTRS)
Sopher, R.; Studwell, R. E.; Cassarino, S.; Kottapalli, S. B. R.
1982-01-01
A coupled rotor/airframe vibration analysis developed as a design tool for predicting helicopter vibrations and a research tool to quantify the effects of structural properties, aerodynamic interactions, and vibration reduction devices on vehicle vibration levels is described. The analysis consists of a base program utilizing an impedance matching technique to represent the coupled rotor/airframe dynamics of the system supported by inputs from several external programs supplying sophisticated rotor and airframe aerodynamic and structural dynamic representation. The theoretical background, computer program capabilities and limited correlation results are presented in this report. Correlation results using scale model wind tunnel results show that the analysis can adequately predict trends of vibration variations with airspeed and higher harmonic control effects. Predictions of absolute values of vibration levels were found to be very sensitive to modal characteristics and results were not representative of measured values.
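A toy numpy sketch of frequency-domain impedance (receptance) matching, the general mechanism by which such a base program couples rotor and airframe dynamics at an attachment point; the single-DOF substructure models are assumptions for illustration, not the report's rotor and airframe representations.

```python
import numpy as np

# Point receptance (displacement per unit force) of a 1-DOF substructure.
def receptance(m, c, k, w):
    return 1.0 / (-m * w**2 + 1j * c * w + k)

freqs = np.linspace(1.0, 60.0, 600)                 # rad/s
H_rotor = receptance(5.0, 2.0, 4000.0, freqs)       # rotor hub (illustrative)
H_frame = receptance(50.0, 10.0, 90000.0, freqs)    # airframe attachment

# Rigidly coupling the two points at one DOF: classical receptance coupling,
# equivalent to combining the two receptances "in parallel".
H_coupled = H_rotor * H_frame / (H_rotor + H_frame)
peak = freqs[np.argmax(np.abs(H_coupled))]
print(f"coupled response peaks near {peak:.1f} rad/s")
```

The real analysis does this with full matrices of hub forces and responses supplied by the external rotor and airframe programs, but the matching of impedances at the interface is the same operation.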
NASA Technical Reports Server (NTRS)
Hammond, Dana P.
1991-01-01
The Technical Requirements Analysis and Control Systems (TRACS) software package is described. TRACS offers supplemental tools for the analysis, control, and interchange of project requirements. This package provides the fundamental capability to analyze and control requirements, serves as a focal point for project requirements, and integrates a system that supports efficient and consistent operations. TRACS uses relational database technology (ORACLE) in a stand-alone or distributed environment that can be used to coordinate the activities required to support a project through its entire life cycle. TRACS uses a set of keyword- and mouse-driven screens (HyperCard) which impose adherence through a controlled user interface. The user interface provides an interactive capability to interrogate the database and to display or print project requirement information. TRACS has a limited report capability, but can be extended with PostScript conventions.
Pérez, Dennis; Van der Stuyft, Patrick; Toledo, María Eugenia; Ceballos, Enrique; Fabré, Francisco; Lefèvre, Pierre
2018-01-01
Within the context of a field trial conducted by the Cuban vector control program (AaCP), we assessed acceptability of insecticide-treated curtains (ITCs) and residual insecticide treatment (RIT) with deltamethrin by the community. We also assessed the potential influence of interviewees' risk perceptions for getting dengue and disease severity. We embedded a qualitative study using in-depth interviews in a cluster randomized trial (CRT) testing the effectiveness of ITCs and RIT in Santiago de Cuba. In-depth interviews (N = 38) were conducted four and twelve months after deployment of the tools with people who accepted the tools, who stopped using them and who did not accept the tools. Data analysis was deductive. Main reasons for accepting ITCs at the start of the trial were perceived efficacy and not being harmful to health. Constraints linked to manufacturer instructions were the main reason for not using ITCs. People stopped using the ITCs due to perceived allergy, toxicity and low efficacy. Few heads of households refused RIT, despite noting reasons for rejection such as allergy, health hazard and toxicity. Positive opinions of the vector control program influenced acceptability of both tools. However, frequent insecticide fogging as part of routine AaCP vector control actions diminished perceived efficacy of both tools and, therefore, acceptability. Fifty percent of interviewees did feel at risk for getting dengue and considered dengue a severe disease. However, this did not appear to influence acceptability of ITCs or RIT. Acceptability of ITCs and RIT was linked to acceptability of AaCP routine vector control activities. However, uptake and use were not always an indication of acceptability. Factors leading to acceptability may be best identified using qualitative methods, but more research is needed on the concept of acceptability and its measurement.
Analysis of gene expression profile microarray data in complex regional pain syndrome.
Tan, Wulin; Song, Yiyan; Mo, Chengqiang; Jiang, Shuangjian; Wang, Zhongxing
2017-09-01
The aim of the present study was to predict key genes and proteins associated with complex regional pain syndrome (CRPS) using bioinformatics analysis. The gene expression profiling microarray data, GSE47603, which included peripheral blood samples from 4 patients with CRPS and 5 healthy controls, was obtained from the Gene Expression Omnibus (GEO) database. The differentially expressed genes (DEGs) in CRPS patients compared with healthy controls were identified using the GEO2R online tool. Functional enrichment analysis was then performed using The Database for Annotation, Visualization and Integrated Discovery online tool. Protein-protein interaction (PPI) network analysis was subsequently performed using the Search Tool for the Retrieval of Interacting Genes database and analyzed with Cytoscape software. A total of 257 DEGs were identified, including 243 upregulated genes and 14 downregulated genes. Genes in the human leukocyte antigen (HLA) family were most significantly differentially expressed. Enrichment analysis demonstrated that signaling pathways, including immune response, cell motion, adhesion and angiogenesis, were associated with CRPS. PPI network analysis revealed that key genes, including early region 1A binding protein p300 (EP300), CREB-binding protein (CREBBP), signal transducer and activator of transcription (STAT)3, STAT5A and integrin α M, were associated with CRPS. The results suggest that the immune response may therefore serve an important role in CRPS development. In addition, genes in the HLA family, such as HLA-DQB1 and HLA-DRB1, may present potential biomarkers for the diagnosis of CRPS. Furthermore, EP300, its paralog CREBBP, and the STAT family genes, STAT3 and STAT5, may be important in the development of CRPS.
The EPA Control Strategy Tool (CoST) is a software tool for projecting potential future control scenarios, their effects on emissions and estimated costs. This tool uses the NEI and the Control Measures Dataset as key inputs. CoST outputs are projections of future control scenarios.
Kinematics of mechanical and adhesional micromanipulation under a scanning electron microscope
NASA Astrophysics Data System (ADS)
Saito, Shigeki; Miyazaki, Hideki T.; Sato, Tomomasa; Takahashi, Kunio
2002-11-01
In this paper, the kinematics of mechanical and adhesional micromanipulation using a needle-shaped tool under a scanning electron microscope is analyzed. A mode diagram is derived to indicate the possible micro-object behavior for the specified operational conditions. Based on the diagram, a reasonable method for pick and place operation is proposed. The keys to successful analysis are to introduce adhesional and rolling-resistance factors into the kinematic system consisting of a sphere, a needle-shaped tool, and a substrate, and to consider the time dependence of these factors due to the electron-beam (EB) irradiation. Adhesional force and the lower limit of maximum rolling resistance are evaluated quantitatively in theoretical and experimental ways. This analysis shows that it is possible to control the fracture of either the tool-sphere or substrate-sphere interface of the system selectively by the tool-loading angle and that such a selective fracture of the interfaces enables reliable pick or place operation even under EB irradiation. Although the conventional micromanipulation was not repeatable because the technique was based on an empirically effective method, this analysis should provide us with a guideline to reliable micromanipulation.
NASA Technical Reports Server (NTRS)
Gomez, Ashley Nicole; Martin, Lynne Hazel; Homola, Jeffrey; Morey, Susan; Cabrall, Christopher; Mercer, Joey; Prevot, Thomas
2013-01-01
In a study that introduced ground-based separation assurance automation through a series of envisioned transitional phases of concept maturity, it was found that subjective responses to scales of workload, situation awareness, and acceptability in a post-run questionnaire revealed as-predicted results for three of the four study conditions but not for the Moderate condition. The trend continued for losses of separation (LOS), where the number of LOS events was far greater than expected in the Moderate condition. To offer an account of why the Moderate condition was perceived to be more difficult to manage than predicted, researchers examined the increase in amount and complexity of traffic, the increase in communication load, and the increased complexities resulting from the simulation's mix of aircraft equipage. Further analysis compared the tools presented through the phases, finding that controllers took advantage of the informational properties of the tools presented but shied away from using their decision support capabilities. Taking into account similar findings from other studies, it is suggested that the Moderate condition represented the first step into a "shared control" environment, which requires the controller to use the automation as a decision-making partner rather than just a provider of information. Viewed in this light, the combination of tools offered in the Moderate condition was reviewed and some tradeoffs that may offset the identified complexities were suggested.
McCarthy, Davis J; Campbell, Kieran R; Lun, Aaron T L; Wills, Quin F
2017-04-15
Single-cell RNA sequencing (scRNA-seq) is increasingly used to study gene expression at the level of individual cells. However, preparing raw sequence data for further analysis is not a straightforward process. Biases, artifacts and other sources of unwanted variation are present in the data, requiring substantial time and effort to be spent on pre-processing, quality control (QC) and normalization. We have developed the R/Bioconductor package scater to facilitate rigorous pre-processing, quality control, normalization and visualization of scRNA-seq data. The package provides a convenient, flexible workflow to process raw sequencing reads into a high-quality expression dataset ready for downstream analysis. scater provides a rich suite of plotting tools for single-cell data and a flexible data structure that is compatible with existing tools and can be used as infrastructure for future software development. The open-source code, along with installation instructions, vignettes and case studies, is available through Bioconductor at http://bioconductor.org/packages/scater . davis@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
Analysis and simulation tools for solar array power systems
NASA Astrophysics Data System (ADS)
Pongratananukul, Nattorn
This dissertation presents simulation tools developed specifically for the design of solar array power systems. Contributions are made in several aspects of the system design phases, including solar source modeling, system simulation, and controller verification. A tool to automate the study of solar array configurations using general purpose circuit simulators has been developed based on the modeling of individual solar cells. Hierarchical structure of solar cell elements, including semiconductor properties, allows simulation of electrical properties as well as the evaluation of the impact of environmental conditions. A second developed tool provides a co-simulation platform with the capability to verify the performance of an actual digital controller implemented in programmable hardware such as a DSP processor, while the entire solar array including the DC-DC power converter is modeled in software algorithms running on a computer. This "virtual plant" allows developing and debugging code for the digital controller, and also to improve the control algorithm. One important task in solar arrays is to track the maximum power point on the array in order to maximize the power that can be delivered. Digital controllers implemented with programmable processors are particularly attractive for this task because sophisticated tracking algorithms can be implemented and revised when needed to optimize their performance. The proposed co-simulation tools are thus very valuable in developing and optimizing the control algorithm, before the system is built. Examples that demonstrate the effectiveness of the proposed methodologies are presented. The proposed simulation tools are also valuable in the design of multi-channel arrays. In the specific system that we have designed and tested, the control algorithm is implemented on a single digital signal processor. In each of the channels the maximum power point is tracked individually. In the prototype we built, off-the-shelf commercial DC-DC converters were utilized. At the end, the overall performance of the entire system was evaluated using solar array simulators capable of simulating various I-V characteristics, and also by using an electronic load. Experimental results are presented.
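To make the maximum-power-point tracking task concrete, here is a hedged sketch of a standard perturb-and-observe tracker in Python; the PV curve is a toy model rather than the dissertation's cell-level simulator, and the dissertation does not necessarily implement this particular algorithm.

```python
# Toy PV curve plus a perturb-and-observe MPPT loop: step the operating
# voltage, watch the power, and reverse direction whenever power drops.
def pv_power(v):
    """Crude array I-V characteristic: current rolls off near open circuit."""
    i = max(0.0, 5.0 * (1.0 - (v / 40.0) ** 7))
    return v * i

v, step, direction = 20.0, 0.5, +1.0
p_prev = pv_power(v)
for _ in range(100):
    v += direction * step          # perturb the operating voltage
    p = pv_power(v)
    if p < p_prev:                 # observe: if power fell, reverse direction
        direction = -direction
    p_prev = p
print(f"settled near V = {v:.1f} V, P = {pv_power(v):.1f} W")
```

In a multi-channel array like the prototype described, one such loop runs per channel, which is where a co-simulation platform for debugging the DSP code earns its keep.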
Dynamic analysis of space structures including elastic, multibody, and control behavior
NASA Technical Reports Server (NTRS)
Pinson, Larry; Soosaar, Keto
1989-01-01
The problem is to develop analysis methods, modeling strategies, and simulation tools to predict with assurance the on-orbit performance and integrity of large complex space structures that cannot be verified on the ground. The problem must incorporate large reliable structural models, multi-body flexible dynamics, multi-tier controller interaction, environmental models including 1g and atmosphere, various on-board disturbances, and linkage to mission-level performance codes. All areas are in serious need of work, but the weakest link is multi-body flexible dynamics.
Antiwindup analysis and design approaches for MIMO systems
NASA Technical Reports Server (NTRS)
Marcopoli, Vincent R.; Phillips, Stephen M.
1994-01-01
Performance degradation of multiple-input multiple-output (MIMO) control systems having limited actuators is often handled by augmenting the controller with an antiwindup mechanism, which attempts to maintain system performance when limits are encountered. The goals of this paper are: (1) To develop a method to analyze antiwindup systems to determine precisely what stability and performance degradation is incurred under limited conditions. It is shown that by reformulating limited actuator commands as resulting from multiplicative perturbations to the corresponding controller requests, mu-analysis tools can be utilized to obtain quantitative measures of stability and performance degradation. (2) To propose a linear, time invariant (LTI) criterion on which to base the antiwindup design. These analysis and design methods are illustrated through the evaluation of two competing antiwindup schemes augmenting the controller of a Short Take-Off and Vertical Landing (STOVL) aircraft in transition flight.
Antiwindup analysis and design approaches for MIMO systems
NASA Technical Reports Server (NTRS)
Marcopoli, Vincent R.; Phillips, Stephen M.
1993-01-01
Performance degradation of multiple-input multiple-output (MIMO) control systems having limited actuators is often handled by augmenting the controller with an antiwindup mechanism, which attempts to maintain system performance when limits are encountered. The goals of this paper are: 1) to develop a method to analyze antiwindup systems to determine precisely what stability and performance degradation is incurred under limited conditions. It is shown that by reformulating limited actuator commands as resulting from multiplicative perturbations to the corresponding controller requests, mu-analysis tools can be utilized to obtain quantitative measures of stability and performance degradation. 2) to propose a linear, time invariant (LTI) criterion on which to base the antiwindup design. These analysis and design methods are illustrated through the evaluation of two competing antiwindup schemes augmenting the controller of a Short Take-Off and Vertical Landing (STOVL) aircraft in transition flight.
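As a small numerical illustration of the reformulation these two papers describe, the sketch below writes each saturated command as a multiplicative perturbation of the request, u_sat = (1 + delta) u_cmd with delta in [-1, 0]; the actuator limit and command values are made up, and the mu-analysis built on top of this bound is not shown.

```python
import numpy as np

# A saturated actuator command can be rewritten as a multiplicative
# perturbation of the controller request: u_sat = (1 + delta) * u_cmd.
# The limit u_max and the command sweep below are illustrative.
u_max = 1.0

def saturate(u):
    return np.clip(u, -u_max, u_max)

u_cmd = np.linspace(-3.0, 3.0, 13)        # requested actuator commands
delta = np.zeros_like(u_cmd)
nonzero = u_cmd != 0.0
delta[nonzero] = saturate(u_cmd[nonzero]) / u_cmd[nonzero] - 1.0

for uc, d in zip(u_cmd, delta):
    print(f"u_cmd = {uc:+.2f}  u_sat = {saturate(uc):+.2f}  delta = {d:+.3f}")
# delta always lies in [-1, 0]: saturation attenuates but never amplifies
# or inverts the request, so it can be treated as bounded structured
# uncertainty and handed to mu-analysis tools.
```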
Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.
Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N
2009-10-27
The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However there is absence of predefined rational algorithmic analysis workflows or batch standardized processing to incorporate all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided, heavily depend on the programming skills of the user, whereas in the case of GUI embedded solutions, they do not provide direct support of various raw image analysis formats or a versatile and simultaneously flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports 2 coloured cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma separated values, tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools plus customizable export data formats for seamless integration by other analysis tools or MATLAB, for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with MATLAB Component Runtime. Gene ARMADA provides a highly adaptable, integrative, yet flexible tool which can be used for automated quality control, analysis, annotation and visualization of microarray data, constituting a starting point for further data interpretation and integration with numerous other tools.
VID-R and SCAN: Tools and Methods for the Automated Analysis of Visual Records.
ERIC Educational Resources Information Center
Ekman, Paul; And Others
The VID-R (Visual Information Display and Retrieval) system that enables computer-aided analysis of visual records is composed of a film-to-television chain, two videotape recorders with complete remote control of functions, a video-disc recorder, three high-resolution television monitors, a teletype, a PDP-8, a video and audio interface, three…
ERIC Educational Resources Information Center
Martínez, Marc Lafuente; Valdivia, Ibis M. Álvarez
2016-01-01
This paper presents a higher education experience aimed at explicitly promoting metacognitive processes in a social and collaborative context. Students carried out a debate on an e-forum, and were later asked to collaboratively analyse their own debates. The control group conducted this analysis using text-based tools; the experimental group…
Advanced Productivity Analysis Methods for Air Traffic Control Operations
1976-12-01
Report excerpts cover routine work, surveillance work, and conflict processing work (including potential-conflict recognition, assessment, and resolution decision making with A/N voice communications), types of simulation models, and enabling decision makers to utilize quantitative and dynamic analysis as a tool for decision making.
Control Theoretic Modeling for Uncertain Cultural Attitudes and Unknown Adversarial Intent
2009-02-01
Report excerpts cover social learning, social networks, multiagent systems, and game theory, including analysis of rational social learning and belief propagation in social networks, and a general methodology for predicting social and communication network formation.
SSOAP Toolbox Enhancements and Case Study
Recognizing the need for tools to support the development of sanitary sewer overflow (SSO) control plans, in October 2009 the U.S. Environmental Protection Agency (EPA) released the first version of the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox. This first ve...
Statistical Analysis Tools for Learning in Engineering Laboratories.
ERIC Educational Resources Information Center
Maher, Carolyn A.
1990-01-01
Described are engineering programs that have used automated data acquisition systems to implement data collection and analyze experiments. Applications include a biochemical engineering laboratory, heat transfer performance, engineering materials testing, mechanical system reliability, statistical control laboratory, thermo-fluid laboratory, and a…
Iserbyt, Peter; Byra, Mark
2013-11-01
Research investigating design effects of instructional tools for learning Basic Life Support (BLS) is almost non-existent. To demonstrate that the design of instructional tools matters. The effect of spatial contiguity, a design principle stating that people learn more deeply when words and corresponding pictures are placed close (i.e., integrated) rather than far from each other on a page, was investigated on task cards for learning Cardiopulmonary Resuscitation (CPR) during reciprocal peer learning. A randomized controlled trial. A total of 111 students (mean age: 13 years) constituting six intact classes learned BLS through reciprocal learning with task cards. Task cards combine a picture of the skill with written instructions about how to perform it. In each class, students were randomly assigned to the experimental group or the control. In the control, written instructions were placed under the picture on the task cards. In the experimental group, written instructions were placed close to the corresponding part of the picture on the task cards, reflecting application of the spatial contiguity principle. One-way analysis of variance found significantly better performances in the experimental group for ventilation volumes (P=.03, ηp2=.10) and flow rates (P=.02, ηp2=.10). For chest compression depth, compression frequency, compressions with correct hand placement, and duty cycles, no significant differences were found. This study shows that the design of instructional tools (i.e., task cards) affects student learning. Research-based design of learning tools can enhance BLS and CPR education. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Kraft, R. E.
1996-01-01
A computational method to predict modal reflection coefficients in cylindrical ducts has been developed based on the work of Homicz, Lordi, and Rehm, which uses the Wiener-Hopf method to account for the boundary conditions at the termination of a thin cylindrical pipe. The purpose of this study is to develop a computational routine to predict the reflection coefficients of higher-order acoustic modes impinging on the unflanged termination of a cylindrical duct. This effort was conducted under Task Order 5 of the NASA Lewis LET Program, Active Noise Control of Aircraft Engines: Feasibility Study, and will be used as part of the development of an integrated source noise, acoustic propagation, ANC actuator coupling, and control system algorithm simulation. The reflection coefficient prediction will be incorporated into an existing cylindrical duct modal analysis to account for the reflection of modes from the duct termination. This will provide a more accurate, rapid computational design tool for evaluating the effect of reflected waves on active noise control systems mounted in the duct, as well as a tool for the design of acoustic treatment in inlet ducts. As an active noise control system design tool, the method can be used preliminary to more accurate but more numerically intensive acoustic propagation models such as finite element methods. The resulting computer program has been shown to give reasonable results, some examples of which are presented. Reliable data to use for comparison is scarce, so complete checkout is difficult, and further checkout is needed over a wider range of system parameters. In future efforts the method will be adapted as a subroutine to the GEAE segmented cylindrical duct modal analysis program.
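As background for why higher-order modes matter here, the sketch below computes hard-walled cylindrical-duct cut-on frequencies from zeros of Bessel-function derivatives; the duct radius is an assumed value, and this is standard duct acoustics, not the report's Wiener-Hopf reflection computation.

```python
import numpy as np
from scipy.special import jnp_zeros

# A spinning mode (m, n) in a hard-walled cylindrical duct propagates only
# above its cut-on frequency f_mn = c * j'_mn / (2 * pi * a), where j'_mn
# is the nth zero of the derivative of the Bessel function J_m.
c = 343.0   # speed of sound, m/s
a = 0.5     # duct radius, m (assumed for illustration)

for m in range(3):                       # circumferential order
    zeros = jnp_zeros(m, 2)              # first two radial zeros of Jm'
    for n, jpz in enumerate(zeros, start=1):
        f_cuton = c * jpz / (2 * np.pi * a)
        print(f"mode ({m},{n}): cut-on ~ {f_cuton:.0f} Hz")
```

Modes below cut-on decay exponentially and modes near cut-on reflect strongly at the termination, which is why an accurate reflection-coefficient prediction matters for in-duct active noise control design.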
Liu, Hui-lin; Wan, Xia; Yang, Gong-huan
2013-02-01
To explore the relationship between the strength of tobacco control and the effectiveness of creating smoke-free hospitals, and to summarize the main factors that affect the program of creating smoke-free hospitals. A total of 210 hospitals from 7 provinces/municipalities directly under the central government were enrolled in this study using a stratified random sampling method. Principal component analysis and regression analysis were conducted to analyze the strength of tobacco control and the effectiveness of creating smoke-free hospitals. Two principal components were extracted from the strength-of-tobacco-control index, which respectively reflected the tobacco control policies and efforts, and the willingness and leadership of hospital managers regarding tobacco control. The regression analysis indicated that only the first principal component was significantly correlated with progress in creating smoke-free hospitals (P<0.001), i.e., hospitals with higher scores on the first principal component had better achievements in smoke-free environment creation. Tobacco control policies and efforts are critical in creating smoke-free hospitals. Principal component analysis provides a comprehensive and objective tool for evaluating the creation of smoke-free hospitals.
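A hedged sketch of the study's two-step analysis pipeline, principal component extraction followed by regression on the component scores, using simulated data in place of the 210-hospital survey indicators.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Simulated stand-in for per-hospital tobacco-control indicator scores;
# the real study used survey data from 210 hospitals.
rng = np.random.default_rng(42)
indicators = rng.normal(size=(210, 6))
outcome = indicators[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=210)

pca = PCA(n_components=2)
scores = pca.fit_transform(indicators)           # two principal components
model = LinearRegression().fit(scores, outcome)  # regress outcome on scores
print("explained variance ratios:", pca.explained_variance_ratio_)
print("regression coefficients:", model.coef_)
```

In the study itself, only the first component (policies and efforts) carried a significant regression coefficient; the simulated data here is not expected to reproduce that finding.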
Autonomous control systems - Architecture and fundamental issues
NASA Technical Reports Server (NTRS)
Antsaklis, P. J.; Passino, K. M.; Wang, S. J.
1988-01-01
A hierarchical functional autonomous controller architecture is introduced. In particular, the architecture for the control of future space vehicles is described in detail; it is designed to ensure the autonomous operation of the control system, and it allows interaction with the pilot and crew/ground station and the systems on board the autonomous vehicle. The fundamental issues in autonomous control system modeling and analysis are discussed. It is proposed to utilize a hybrid approach to the modeling and analysis of autonomous systems. This will incorporate conventional control methods based on differential equations and techniques for the analysis of systems described with a symbolic formalism. In this way, the theory of conventional control can be fully utilized. It is stressed that autonomy is the design requirement, and intelligent control methods appear, at present, to offer some of the necessary tools to achieve autonomy. A conventional approach may evolve and replace some or all of the `intelligent' functions. It is shown that in addition to conventional controllers, the autonomous control system incorporates planning, learning, and FDI (fault detection and identification).
Design tool for multiprocessor scheduling and evaluation of iterative dataflow algorithms
NASA Technical Reports Server (NTRS)
Jones, Robert L., III
1995-01-01
A graph-theoretic design process and software tool is defined for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. Graph-search algorithms and analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool applies the design process to a given problem and includes performance optimization through the inclusion of additional precedence constraints among the schedulable tasks.
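One concrete example of such a bound, sketched here under assumptions (the attribute names and the toy graph are illustrative, not taken from the paper): the iteration-period bound of an iterative dataflow graph, i.e. the maximum over directed cycles of cycle compute time divided by the number of delays in the cycle.

```python
# Hedged sketch: iteration-period bound of an iterative dataflow graph.
import networkx as nx

def iteration_period_bound(G):
    """Max over directed cycles of (sum of node times) / (sum of edge delays).
    Cycles with zero delays are skipped; such a graph would deadlock anyway."""
    best = 0.0
    for cycle in nx.simple_cycles(G):
        edges = list(zip(cycle, cycle[1:] + cycle[:1]))
        delays = sum(G.edges[u, v]["delay"] for u, v in edges)
        if delays > 0:
            work = sum(G.nodes[v]["time"] for v in cycle)
            best = max(best, work / delays)
    return best

G = nx.DiGraph()
G.add_node("A", time=2.0); G.add_node("B", time=3.0); G.add_node("C", time=1.0)
G.add_edge("A", "B", delay=0); G.add_edge("B", "C", delay=0)
G.add_edge("C", "A", delay=2)  # two registers close the loop
print(iteration_period_bound(G))  # (2 + 3 + 1) / 2 = 3.0
```

No multiprocessor schedule can iterate faster than this bound, which is why tools of this kind report it alongside scheduling constraints and resource requirements.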
Computer implemented method, and apparatus for controlling a hand-held tool
NASA Technical Reports Server (NTRS)
Wagner, Kenneth William (Inventor); Taylor, James Clayton (Inventor)
1999-01-01
The invention described herein is a computer-implemented method and apparatus for controlling a hand-held tool. In particular, the control governs the speed of a fastener interface mechanism and the torque applied to fasteners by that mechanism, and monitors the operating parameters of the tool. The control is embodied in in-tool software embedded on a processor within the tool, which also communicates with remote software. An operator can run the tool or, through the interaction of the two software components, operate the tool from a remote location, analyze data from a performance history recorded by the tool, and select various torque and speed parameters for each fastener.
Grid Computing and Collaboration Technology in Support of Fusion Energy Sciences
NASA Astrophysics Data System (ADS)
Schissel, D. P.
2004-11-01
The SciDAC Initiative is creating a computational grid designed to advance scientific understanding in fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling, and allowing more efficient use of experimental facilities. The philosophy is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as easy-to-use, network-available services. Access to services is stressed rather than portability. Services share the same basic security infrastructure, so that stakeholders can control their own resources and fair use of those resources is ensured. The collaborative control room is being developed using the open-source Access Grid software, which enables secure group-to-group collaboration with capabilities beyond teleconferencing, including application sharing and control. The ability to effectively integrate off-site scientists into a dynamic control room will be critical to the success of future international projects like ITER. Grid computing, the secure integration of computer systems over high-speed networks to provide on-demand access to data analysis capabilities and related functions, is being deployed as an alternative to traditional resource sharing among institutions. The first grid computational service deployed was the transport code TRANSP, which included tools for run preparation, submission, monitoring and management. This approach saves user sites the laborious effort of maintaining a complex code while at the same time reducing the burden on developers by avoiding the support of a large number of heterogeneous installations. This tutorial will present the philosophy behind an advanced collaborative environment, give specific examples, and discuss its usage beyond FES.
Evaluation of Intersection Traffic Control Measures through Simulation
NASA Astrophysics Data System (ADS)
Asaithambi, Gowri; Sivanandan, R.
2015-12-01
Traffic flow is stochastic in nature owing to randomness in variables such as vehicle arrivals and speeds. Because of this, and because of complex vehicular interactions and manoeuvres, it is extremely difficult to model traffic flow through analytical methods. To study this type of complex traffic system and its vehicle interactions, simulation is considered an effective tool. Application of homogeneous traffic models to heterogeneous traffic may not capture the complex manoeuvres and interactions in such flows. Hence, a microscopic simulation model for heterogeneous traffic was developed using object-oriented concepts. This simulation model acts as a tool for evaluating various control measures at signalized intersections. The present study focuses on the evaluation of a Right Turn Lane (RTL) and a Channelised Left Turn Lane (CLTL). A sensitivity analysis was performed to evaluate the RTL and CLTL by varying the approach volumes, turn proportions and turn lane lengths. The RTL is found to be advantageous only up to certain approach volumes and right-turn proportions, beyond which it is counter-productive. The CLTL is found to be advantageous at lower approach volumes for all turn proportions, signifying its benefits; it is counter-productive for higher approach volumes and lower turn proportions. This study pinpoints the break-even points for various scenarios. The developed simulation model can be used as an intersection lane control tool for enhancing the efficiency of flow at intersections. It can also be employed for scenario analysis and can be valuable to field traffic engineers in implementing vehicle-type-based and lane-based traffic control measures.
Stability analysis using SDSA tool
NASA Astrophysics Data System (ADS)
Goetzendorf-Grabowski, Tomasz; Mieszalski, Dawid; Marcinkiewicz, Ewa
2011-11-01
The SDSA (Simulation and Dynamic Stability Analysis) application is presented as a tool for analysing the dynamic characteristics of an aircraft as early as the conceptual design stage. SDSA is part of the CEASIOM (Computerized Environment for Aircraft Synthesis and Integrated Optimization Methods) software environment, which was developed within the SimSAC (Simulating Aircraft Stability And Control Characteristics for Use in Conceptual Design) project, funded by the European Commission 6th Framework Program. SDSA can also be used as stand-alone software and integrated with other design and optimisation systems using software wrappers. This paper focuses on the main functionalities of SDSA and presents both computational and free-flight experimental results to compare and validate the presented software. Two aircraft are considered, the EADS Ranger 2000 and the Warsaw University-designed PW-6 glider. For the two cases considered here, the SDSA software is shown to be an excellent tool for predicting the dynamic characteristics of an aircraft.
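At the core of any such dynamic-stability tool is an eigenvalue analysis of the linearized equations of motion. The sketch below is a generic illustration assuming a made-up 2x2 short-period state matrix, not data for the Ranger 2000 or the PW-6: each complex eigenvalue pair yields a mode's natural frequency and damping ratio.

```python
# Illustrative modal analysis of a linearized airframe model.
import numpy as np

A = np.array([[-1.0,  0.9],    # toy short-period model, states ~ [w, q]
              [-8.0, -2.2]])

for lam in np.linalg.eigvals(A):
    wn = abs(lam)             # natural frequency (rad/s)
    zeta = -lam.real / wn     # damping ratio; zeta > 0 means a stable mode
    print(f"eigenvalue {lam:.3f}: wn = {wn:.2f} rad/s, zeta = {zeta:.2f}")
```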
Generating community-built tools for data sharing and analysis in environmental networks
Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David
2016-01-01
Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.
Research in Computer Forensics
2002-06-01
systems and how they can aid in the recovery of digital evidence in a forensic analysis. Exposures to hacking techniques and tools in CS3675—Internet...cryptography, access control, authentication, biometrics, actions to be taken during an attack and case studies of hacking and information warfare. ...chat, surfing, instant messaging and hacking with powerful access control and filter capabilities. The monitor can operate in a Prevention mode to
SARDA HITL Preliminary Human Factors Measures and Analyses
NASA Technical Reports Server (NTRS)
Hyashi, Miwa; Dulchinos, Victoria
2012-01-01
Human factors data collected during the SARDA HITL simulation experiment include a variety of subjective measures, including the NASA TLX, questionnaire items regarding situational awareness, advisory usefulness, UI usability, and controller trust. Preliminary analysis of the TLX data indicates that workload may not be adversely affected by use of the advisories; additionally, the controllers' subjective ratings of the advisories may suggest acceptance of the tool.
Modeling Off-Nominal Recovery in NextGen Terminal-Area Operations
NASA Technical Reports Server (NTRS)
Callantine, Todd J.
2011-01-01
Robust schedule-based arrival management requires efficient recovery from off-nominal situations. This paper presents research on modeling off-nominal situations and plans for recovering from them using TRAC, a route/airspace design, fast-time simulation, and analysis tool for studying NextGen trajectory-based operations. The paper provides an overview of a schedule-based arrival-management concept and supporting controller tools, then describes TRAC implementations of methods for constructing off-nominal scenarios, generating trajectory options to meet scheduling constraints, and automatically producing recovery plans.
Yamato, Tie Parma; Maher, Chris; Koes, Bart; Moseley, Anne
2017-06-01
The Physiotherapy Evidence Database (PEDro) scale has been widely used to investigate methodological quality in physiotherapy randomized controlled trials; however, its validity has not been tested for pharmaceutical trials. The aim of this study was to investigate the validity and interrater reliability of the PEDro scale for pharmaceutical trials. The reliability was also examined for the Cochrane Back and Neck (CBN) Group risk of bias tool. This is a secondary analysis of data from a previous study. We considered randomized placebo-controlled trials evaluating any pain medication for chronic spinal pain or osteoarthritis. Convergent validity was evaluated by correlating the PEDro score with the summary score of the CBN risk of bias tool. Construct validity was tested using a linear regression analysis to determine the degree to which the total PEDro score is associated with treatment effect sizes, journal impact factor, and the summary score for the CBN risk of bias tool. Interrater reliability was estimated using the Prevalence and Bias Adjusted Kappa coefficient and 95% confidence interval (CI) for the PEDro scale and the CBN risk of bias tool. Fifty-three trials were included, with 91 treatment effect sizes included in the analyses. The correlation between the PEDro scale and the CBN risk of bias tool was 0.83 (95% CI 0.76-0.88) after adjusting for reliability, indicating strong convergence. The PEDro score was inversely associated with effect sizes, significantly associated with the summary score for the CBN risk of bias tool, and not associated with journal impact factor. The interrater reliability for each item of the PEDro scale and the CBN risk of bias tool was at least substantial for most items (>0.60). The intraclass correlation coefficient for the PEDro score was 0.80 (95% CI 0.68-0.88), and for the CBN risk of bias tool it was 0.81 (95% CI 0.69-0.88). There was evidence for the convergent and construct validity of the PEDro scale when used to evaluate the methodological quality of pharmacological trials. Both risk of bias tools have acceptably high interrater reliability.
Scholma, Jetse; Fuhler, Gwenny M.; Joore, Jos; Hulsman, Marc; Schivo, Stefano; List, Alan F.; Reinders, Marcel J. T.; Peppelenbosch, Maikel P.; Post, Janine N.
2016-01-01
Massive parallel analysis using array technology has become the mainstay for analysis of genomes and transcriptomes. Analogously, the predominance of phosphorylation as a regulator of cellular metabolism has fostered the development of peptide arrays of kinase consensus substrates that allow the charting of cellular phosphorylation events (often called kinome profiling). However, whereas the bioinformatical framework for expression array analysis is well developed, no advanced analysis tools are yet available for kinome profiling. Especially intra-array and inter-array normalization of peptide array phosphorylation remain problematic, due to the absence of "housekeeping" kinases and the obvious fallacy of the assumption that different experimental conditions should exhibit equal amounts of kinase activity. Here we describe the development of analysis tools that reliably quantify phosphorylation of peptide arrays and that allow normalization of the signals obtained. We provide a method for intraslide gradient correction and spot quality control. We describe a novel interarray normalization procedure, named repetitive signal enhancement (RSE), which provides a mathematical approach to limit the false negative results occurring with the use of other normalization procedures. Using in silico and biological experiments we show that employing such protocols yields superior insight into cellular physiology as compared to classical analysis tools for kinome profiling. PMID:27225531
Integrating Flight Dynamics & Control Analysis and Simulation in Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Lawrence, Ben; Berger, Tom; Tischler, Mark B.; Theodore, Colin R.; Elmore, Josh; Gallaher, Andrew; Tobias, Eric L.
2016-01-01
The development of a toolset, SIMPLI-FLYD ('SIMPLIfied FLight dynamics for conceptual Design') is described. SIMPLI-FLYD is a collection of tools that perform flight dynamics and control modeling and analysis of rotorcraft conceptual designs including a capability to evaluate the designs in an X-Plane-based real-time simulation. The establishment of this framework is now facilitating the exploration of this new capability, in terms of modeling fidelity and data requirements, and the investigation of which stability and control and handling qualities requirements are appropriate for conceptual design. Illustrative design variation studies for single main rotor and tiltrotor vehicle configurations show sensitivity of the stability and control characteristics and an approach to highlight potential weight savings by identifying over-design.
NASA Astrophysics Data System (ADS)
Ileana, Ioan; Risteiu, Mircea; Marc, Gheorghe
2016-12-01
This paper is part of our research dedicated to the design of high-power LED lamps. The selected boost-up topology is intended to meet LED-driver manufacturers' trends with respect to efficiency and disturbance constraints. In our work we used modeling and simulation tools to implement scenarios of driver operation in which the controlling functions are exercised (output voltage/current versus input voltage at a fixed switching frequency, input and output electric power transfer versus switching frequency, transient inductor voltage analysis, and transient output capacitor analysis). Some electrical and thermal stress conditions are also analyzed. Based on these aspects, a highly reliable power LED driver has been designed.
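For orientation, the steady-state relations such a boost-converter study exercises are standard (continuous conduction mode, ideal components). The sketch below is a hedged back-of-envelope calculation; the component values and operating point are illustrative, not the paper's design.

```python
# Ideal CCM boost-converter relations: V_out = V_in / (1 - D).
def boost_design(v_in, v_out, f_sw, L, C, i_out):
    D = 1.0 - v_in / v_out            # steady-state duty cycle
    di_L = v_in * D / (L * f_sw)      # inductor current ripple (A, peak-to-peak)
    dv_out = i_out * D / (C * f_sw)   # output voltage ripple (V, peak-to-peak)
    return D, di_L, dv_out

# Example: 24 V in -> 48 V LED string at 0.7 A, 100 kHz, 220 uH, 47 uF
D, di, dv = boost_design(24.0, 48.0, 100e3, 220e-6, 47e-6, 0.7)
print(f"D = {D:.2f}, inductor ripple = {di:.2f} A, output ripple = {dv * 1e3:.1f} mV")
```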
TH-C-18A-08: A Management Tool for CT Dose Monitoring, Analysis, and Protocol Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, J; Chan, F; Newman, B
2014-06-15
Purpose: To develop a customizable tool for enterprise-wide management of CT protocols and analysis of radiation dose information of CT exams for a variety of quality control applications. Methods: All clinical CT protocols implemented on the 11 CT scanners at our institution were extracted in digital format. The original protocols had been preset by our CT management team. A commercial CT dose tracking software (DoseWatch, GE Healthcare, WI) was used to collect exam information (exam date, patient age, etc.), scanning parameters, and radiation doses for all CT exams. We developed a Matlab-based program (MathWorks, MA) with a graphical user interface which allows the user to analyze the scanning protocols together with the actual dose estimates, and to compare the data to national (ACR, AAPM) and internal reference values for CT quality control. Results: The CT protocol review portion of our tool allows the user to look up the scanning and image reconstruction parameters of any protocol on any of the installed CT systems, among about 120 protocols per scanner. In the dose analysis tool, dose information of all CT exams (from 05/2013 to 02/2014) was stratified at the protocol level, and within a protocol down to the series level, i.e. each individual exposure event. This allows numerical and graphical review of dose information for any combination of scanner models, protocols and series. The key functions of the tool include: statistics of CTDI, DLP and SSDE; dose monitoring using user-set CTDI/DLP/SSDE thresholds; look-up of any CT exam dose data; and CT protocol review. Conclusion: Our in-house CT management tool provides radiologists, technologists and administration first-hand, near real-time, enterprise-wide knowledge of CT dose levels of different exam types. Medical physicists use this tool to manage CT protocols and to compare and optimize dose levels across different scanner models. It provides technologists feedback on CT scanning operation, and knowledge of important dose baselines and thresholds.
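As a hedged illustration of the threshold-monitoring step described (not the actual Matlab tool), the sketch below flags exams whose dose metrics exceed user-set protocol thresholds; the column names, protocol names, and threshold values are hypothetical.

```python
# Illustrative dose-threshold flagging over per-exam records.
import pandas as pd

exams = pd.DataFrame({
    "protocol": ["HeadRoutine", "HeadRoutine", "ChestCTA"],
    "CTDIvol":  [55.0, 82.0, 18.0],     # mGy
    "DLP":      [900.0, 1400.0, 550.0]  # mGy*cm
})
thresholds = {"HeadRoutine": {"CTDIvol": 75.0, "DLP": 1000.0},
              "ChestCTA":    {"CTDIvol": 21.0, "DLP": 650.0}}

def over_threshold(row):
    t = thresholds[row["protocol"]]
    return row["CTDIvol"] > t["CTDIvol"] or row["DLP"] > t["DLP"]

exams["flagged"] = exams.apply(over_threshold, axis=1)
print(exams)
```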
NASA Astrophysics Data System (ADS)
Xing, Xi; Rey-de-Castro, Roberto; Rabitz, Herschel
2014-12-01
Optimally shaped femtosecond laser pulses can often be effectively identified in adaptive feedback quantum control experiments, but elucidating the underlying control mechanism can be a difficult task requiring significant additional analysis. We introduce landscape Hessian analysis (LHA) as a practical experimental tool to aid in elucidating control mechanism insights. This technique is applied to the dissociative ionization of CH2BrI using shaped fs laser pulses for optimization of the absolute yields of ionic fragments as well as their ratios for the competing processes of breaking the C-Br and C-I bonds. The experimental results suggest that these nominally complex problems can be reduced to a low-dimensional control space with insights into the control mechanisms. While the optimal yield for some fragments is dominated by a non-resonant intensity-driven process, the optimal generation of other fragments may be explained by a non-resonant process coupled to few-level resonant dynamics. Theoretical analysis and modeling are consistent with the experimental observations.
Spatial Epidemiology of Plasmodium vivax, Afghanistan
Leslie, Toby; Kolaczinski, Kate; Mohsen, Engineer; Mehboob, Najeebullah; Saleheen, Sarah; Khudonazarov, Juma; Freeman, Tim; Clements, Archie; Rowland, Mark; Kolaczinski, Jan
2006-01-01
Plasmodium vivax is endemic to many areas of Afghanistan. Geographic analysis helped highlight areas of malaria risk and clarified ecologic risk factors for transmission. Remote sensing enabled development of a risk map, thereby providing a valuable tool to help guide malaria control strategies. PMID:17176583
SSOAP - A USEPA Toolbox for Sanitary Sewer Overflow Analysis and Control Planning - Presentation
The United States Environmental Protection Agency (USEPA) has identified a need to use proven methodologies to develop computer tools that help communities properly characterize rainfall-derived infiltration and inflow (RDII) into sanitary sewer systems and develop sanitary sewer...
NASA Astrophysics Data System (ADS)
Ozana, Stepan; Pies, Martin; Docekal, Tomas
2016-06-01
REX Control System is a professional advanced tool for the design and implementation of complex control systems that belongs to the softPLC category. It covers the entire process, starting from simulation of the functionality of the application before deployment, through implementation on a real-time target, to analysis, diagnostics and visualization. Basically, it consists of two parts: the development tools and the runtime system. It is also compatible with the Simulink environment, and the way a control algorithm is implemented is very similar. The control scheme is finally compiled (using the RexDraw utility) and uploaded onto a chosen real-time target (using the RexView utility). There is a wide variety of hardware platforms and real-time operating systems supported by REX Control System, such as Windows Embedded, Linux, and Linux/Xenomai deployed on SBC, IPC, PAC, Raspberry Pi and others, with many I/O interfaces. It is a modern system designed for both measurement and control applications, offering many additional functions concerning data archiving, visualization based on HTML5, and communication standards. The paper sums up the possibilities of its use in the educational process, focused on the control of case-study physical models with classical and advanced control algorithms.
Viscous-Inviscid Methods in Unsteady Aerodynamic Analysis of Bio-Inspired Morphing Wings
NASA Astrophysics Data System (ADS)
Dhruv, Akash V.
Flight has been one of the greatest realizations of human imagination, revolutionizing communication and transportation over the years. This has greatly influenced the growth of technology itself, enabling researchers to communicate and share their ideas more effectively, extending the human potential to create more sophisticated systems. While the end product of a sophisticated technology makes our lives easier, its development process presents an array of challenges in itself. In the last decade, scientists and engineers have turned towards bio-inspiration to design more efficient and robust aerodynamic systems to enhance the ability of Unmanned Aerial Vehicles (UAVs) to be operated in cluttered environments, where tight maneuverability and controllability are necessary. Effective use of UAVs in domestic airspace will mark the beginning of a new age in communication and transportation. The design of such complex systems necessitates faster and more effective tools to perform preliminary investigations in design, thereby streamlining the design process. This thesis explores the implementation of numerical panel methods for the aerodynamic analysis of bio-inspired morphing wings. Numerical panel methods were among the earliest computational methods for aerodynamic analysis to be developed. Although the early editions of this method performed only inviscid analysis, the algorithm has matured over the years as a result of contributions made by prominent aerodynamicists. The method discussed in this thesis is influenced by recent advancements in panel methods and incorporates both viscous and inviscid analysis of multi-flap wings. The surface calculation of aerodynamic coefficients makes this method less computationally expensive than the traditional Computational Fluid Dynamics (CFD) solvers available, and it is thus effective when both speed and accuracy are desired. The morphing wing design, which consists of sequential feather-like flaps installed over the upper and lower surfaces of a standard airfoil, proves to be an effective alternative to standard control surfaces by increasing the flight capability of bird-scale UAVs. The results obtained for this wing design under various flight and flap configurations provide insight into its aerodynamic behavior, which enhances maneuverability and controllability. The overall method acts as an important tool for creating an aerodynamic database from which to develop a distributed control system for autonomous operation of the multi-flap morphing wing, supporting the use of viscous-inviscid methods as a tool in rapid aerodynamic analysis.
Eijssen, Lars M T; Goelela, Varshna S; Kelder, Thomas; Adriaens, Michiel E; Evelo, Chris T; Radonjic, Marijana
2015-06-30
Illumina whole-genome expression bead arrays are a widely used platform for transcriptomics. Most of the tools available for the analysis of the resulting data are not easily applicable by less experienced users. ArrayAnalysis.org provides researchers with an easy-to-use and comprehensive interface to the functionality of R and Bioconductor packages for microarray data analysis. As a modular open source project, it allows developers to contribute modules that provide support for additional types of data or extend workflows. To enable data analysis of Illumina bead arrays for a broad user community, we have developed a module for ArrayAnalysis.org that provides a free and user-friendly web interface for quality control and pre-processing for these arrays. This module can be used together with existing modules for statistical and pathway analysis to provide a full workflow for Illumina gene expression data analysis. The module accepts data exported from Illumina's GenomeStudio, and provides the user with quality control plots and normalized data. The outputs are directly linked to the existing statistics module of ArrayAnalysis.org, but can also be downloaded for further downstream analysis in third-party tools. The Illumina bead arrays analysis module is available at http://www.arrayanalysis.org . A user guide, a tutorial demonstrating the analysis of an example dataset, and R scripts are available. The module can be used as a starting point for statistical evaluation and pathway analysis provided on the website or to generate processed input data for a broad range of applications in life sciences research.
A tensor analysis to evaluate the effect of high-pull headgear on Class II malocclusions.
Ngan, P; Scheick, J; Florman, M
1993-03-01
The inaccuracies inherent in cephalometric analysis of treatment effects are well known. The objective of this article is to present a more reliable research tool for the analysis of cephalometric data. Bookstein introduced a dilation function by means of a homogeneous deformation tensor as a method of describing changes in cephalometric data. His article gave an analytic description of the deformation tensor that permits its rapid and highly accurate calculation on a desktop computer. The first part of this article describes the underlying ideas and mathematics. The second part uses the tensor analysis to analyze the cephalometric results of a group of patients treated with a high-pull activator (HPA) to demonstrate the application of this research tool. Eight patients with Class II skeletal open bite malocclusions in the mixed dentition were treated with HPA. A control sample of eight untreated children with Class II malocclusion, obtained from The Ohio State University Growth Study, was used as a comparison group. Lateral cephalograms taken before and at the completion of treatment were traced, digitized, and analyzed with the conventional method and with tensor analysis. The results showed that HPA had little or no effect on maxillary skeletal structures. However, a reduction in growth rate was found with the skeletal triangle S-N-A, indicating a posterior tipping and torquing of the maxillary incisors. The treatment also induced additional deformation on the mandible in a downward and slightly forward direction. Together with the results from the conventional cephalometric analysis, HPA seemed to provide the vertical and rotational control of the maxilla during orthopedic Class II treatment by inhibiting the downward and forward eruptive path of the upper posterior teeth. The newly designed computer software permits rapid analysis of cephalometric data with the tensor analysis on a desktop computer. This tool may be useful in analyzing growth changes for research data.
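A hedged sketch of the computation the tensor method rests on: for a triangle of landmarks, the before-to-after affine (homogeneous) deformation is exactly determined, and its singular values give the principal dilations. The landmark coordinates below are made-up numbers, not study data.

```python
# Deformation tensor of a landmark triangle and its principal dilations.
import numpy as np

before = np.array([[0.0, 0.0], [10.0, 0.0], [4.0, 8.0]])   # e.g. three landmarks
after  = np.array([[0.0, 0.0], [10.4, 0.2], [4.1, 8.9]])

Pc = before - before.mean(axis=0)   # center both configurations
Qc = after - after.mean(axis=0)

# Exact affine map for a triangle: Qc = Pc @ F.T (least squares is exact here)
X, *_ = np.linalg.lstsq(Pc, Qc, rcond=None)
F = X.T

dilations = np.linalg.svd(F, compute_uv=False)  # principal dilations
print("deformation tensor:\n", F)
print("principal dilations:", dilations)        # > 1 growth, < 1 shrinkage
```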
Computational electronics and electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shang, C C
The Computational Electronics and Electromagnetics thrust area serves as the focal point for Engineering R and D activities for developing computer-based design and analysis tools. Representative applications include design of particle accelerator cells and beamline components; design of transmission line components; engineering analysis and design of high-power (optical and microwave) components; photonics and optoelectronics circuit design; electromagnetic susceptibility analysis; and antenna synthesis. The FY-97 effort focuses on development and validation of (1) accelerator design codes; (2) 3-D massively parallel, time-dependent EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; and (5) development of beam control algorithms coupled to beam transport physics codes. These efforts are in association with technology development in the power conversion, nondestructive evaluation, and microtechnology areas. The efforts complement technology development in Lawrence Livermore National programs.
NASA Technical Reports Server (NTRS)
Chatterjee, Sharmista
1993-01-01
Our first goal in this project was to perform a systems analysis of a closed-loop Environmental Control and Life Support System (ECLSS). This entailed developing a model of an existing real system from which to assess the state or performance of that system; systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulator tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. Use is made of the second law of thermodynamics to determine the amount of irreversibility, or energy loss, of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
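A minimal sketch of the second-law bookkeeping this entails, stated under assumptions (steady flow, ideal gas with constant specific heat; the property values are illustrative, not from the ECLSS model): a component's entropy generation follows from the entropy change of the stream minus the entropy carried by heat across the boundary.

```python
# S_gen = m_dot * (s_out - s_in) - Q / T_b for a steady-flow component.
import math

R_AIR, CP_AIR = 287.0, 1005.0  # J/(kg K), ideal-gas air properties

def entropy_generation(m_dot, T_in, T_out, P_in, P_out, Q, T_boundary):
    """Entropy generation rate (W/K); Q is heat added to the fluid (W)."""
    ds = CP_AIR * math.log(T_out / T_in) - R_AIR * math.log(P_out / P_in)
    return m_dot * ds - Q / T_boundary

# Air cooler: 0.1 kg/s from 320 K to 300 K near 1 atm, rejecting 2010 W at 290 K
print(entropy_generation(0.1, 320.0, 300.0, 101e3, 100e3, -2010.0, 290.0), "W/K")
```

A positive result confirms an irreversible component; comparing these rates across components is exactly the ranking the abstract describes.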
MetaDP: a comprehensive web server for disease prediction of 16S rRNA metagenomic datasets.
Xu, Xilin; Wu, Aiping; Zhang, Xinlei; Su, Mingming; Jiang, Taijiao; Yuan, Zhe-Ming
2016-01-01
High-throughput sequencing-based metagenomics has garnered considerable interest in recent years. Numerous methods and tools have been developed for the analysis of metagenomic data. However, it is still a daunting task to install a large number of tools and complete a complicated analysis, especially for researchers with minimal bioinformatics backgrounds. To address this problem, we constructed automated software named MetaDP for 16S rRNA sequencing data analysis, including data quality control, operational taxonomic unit clustering, diversity analysis, and disease risk prediction modeling. Furthermore, a support vector machine-based prediction model for irritable bowel syndrome (IBS) was built by applying MetaDP to microbial 16S sequencing data from 108 children. The success of the IBS prediction model suggests that the platform may also be applied to other diseases related to gut microbes, such as obesity, metabolic syndrome, or intestinal cancer, among others (http://metadp.cn:7001/).
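A hedged sketch of the prediction step such a pipeline automates: an SVM trained and cross-validated on per-sample OTU abundance features. The synthetic counts and labels below stand in for the real 16S profiles; none of this is MetaDP's actual code.

```python
# Illustrative SVM disease-prediction model on synthetic OTU features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.poisson(5.0, size=(108, 200)).astype(float)  # 108 samples x 200 OTUs
y = rng.integers(0, 2, size=108)                     # IBS vs. control labels

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("5-fold CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```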
Integration of EGA secure data access into Galaxy.
Hoogstrate, Youri; Zhang, Chao; Senf, Alexander; Bijlard, Jochem; Hiltemann, Saskia; van Enckevort, David; Repo, Susanna; Heringa, Jaap; Jenster, Guido; J A Fijneman, Remond; Boiten, Jan-Willem; A Meijer, Gerrit; Stubbs, Andrew; Rambla, Jordi; Spalding, Dylan; Abeln, Sanne
2016-01-01
High-throughput molecular profiling techniques are routinely generating vast amounts of data for translational medicine studies. Secure access-controlled systems are needed to manage, store, transfer and distribute these data due to their personally identifiable nature. The European Genome-phenome Archive (EGA) was created to facilitate access to and management of long-term archives of bio-molecular data. Each data provider is responsible for ensuring a Data Access Committee is in place to grant access to data stored in the EGA. Moreover, the transfer of data during upload and download is encrypted. ELIXIR, a European research infrastructure for life-science data, initiated a project (2016 Human Data Implementation Study) to understand and document the ELIXIR requirements for secure management of controlled-access data. As part of this project, a full ecosystem was designed to connect archived raw experimental molecular profiling data with interpreted data and the computational workflows, using the CTMM Translational Research IT (CTMM-TraIT) infrastructure http://www.ctmm-trait.nl as an example. Here we present the first outcomes of this project: a framework to enable the download of EGA data to a Galaxy server in a secure way. Galaxy provides an intuitive user interface for molecular biologists and bioinformaticians to run and design data analysis workflows. More specifically, we developed a tool, ega_download_streamer, that can download data securely from the EGA into a Galaxy server, where they can subsequently be further processed. This tool allows a user, within the browser, to run an entire analysis containing sensitive data from the EGA, and to make this analysis available to other researchers in a reproducible manner, as shown in a proof-of-concept study. The tool ega_download_streamer is available in the Galaxy tool shed: https://toolshed.g2.bx.psu.edu/view/yhoogstrate/ega_download_streamer.
Goch, Caspar J; Stieltjes, Bram; Henze, Romy; Hering, Jan; Poustka, Luise; Meinzer, Hans-Peter; Maier-Hein, Klaus H
2014-05-01
Diagnosis of autism spectrum disorders (ASD) is difficult, as symptoms vary greatly and are difficult to quantify objectively. Recent work has focused on the assessment of non-invasive diffusion tensor imaging-based biomarkers that reflect the microstructural characteristics of neuronal pathways in the brain. While tractography-based approaches typically analyze specific structures of interest, a graph-based large-scale network analysis of the connectome can yield comprehensive measures of larger-scale architectural patterns in the brain. Commonly applied global network indices, however, do not provide any specificity with respect to functional areas or anatomical structures. The aim of this work was to assess the concept of network centrality as a tool to perform locally specific analysis without disregarding the global network architecture, and to compare it to other popular network indices. We create connectome networks from fiber tractographies and parcellations of the human brain and compute global network indices as well as local indices for Wernicke's Area, Broca's Area and the Motor Cortex. Our approach was evaluated on 18 children suffering from ASD and 18 typically developing controls using magnetic resonance imaging-based cortical parcellations in combination with diffusion tensor imaging tractography. We show that the network centrality of Wernicke's area is significantly (p<0.001) reduced in ASD, while the motor cortex, which was used as a control region, did not show significant alterations. This could reflect the reduced capacity for comprehension of language in ASD. Betweenness centrality could potentially be an important metric in the development of future diagnostic tools in the clinical context of ASD diagnosis. Our results further demonstrate the applicability of large-scale network analysis tools in the domain of region-specific analysis, with potential application to many different psychological disorders.
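For concreteness, a hedged sketch of the local index used above: betweenness centrality of one region in a weighted connectome graph. The tiny graph and its edge weights are synthetic, not patient data; converting connection strength to distance before computing shortest paths is a common connectome convention.

```python
# Illustrative betweenness centrality of a region in a toy connectome.
import networkx as nx

G = nx.Graph()
edges = [("Wernicke", "Broca", 3.0), ("Wernicke", "Motor", 1.0),
         ("Broca", "Motor", 2.0), ("Wernicke", "Parietal", 2.5),
         ("Parietal", "Motor", 1.5), ("Broca", "Frontal", 1.0),
         ("Frontal", "Parietal", 0.5)]
for u, v, strength in edges:
    # betweenness uses shortest paths, so strong links become short distances
    G.add_edge(u, v, distance=1.0 / strength)

bc = nx.betweenness_centrality(G, weight="distance", normalized=True)
print("Wernicke:", bc["Wernicke"], " Motor:", bc["Motor"])
```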
Dynamics modeling and loads analysis of an offshore floating wind turbine
NASA Astrophysics Data System (ADS)
Jonkman, Jason Mark
The vast deepwater wind resource represents a potential to use offshore floating wind turbines to power much of the world with renewable energy. Many floating wind turbine concepts have been proposed, but dynamics models, which account for the wind inflow, aerodynamics, elasticity, and controls of the wind turbine, along with the incident waves, sea current, hydrodynamics, and platform and mooring dynamics of the floater, were needed to determine their technical and economic feasibility. This work presents the development of a comprehensive simulation tool for modeling the coupled dynamic response of offshore floating wind turbines, the verification of the simulation tool through model-to-model comparisons, and the application of the simulation tool to an integrated loads analysis for one of the promising system concepts. A fully coupled aero-hydro-servo-elastic simulation tool was developed with enough sophistication to address the limitations of previous frequency- and time-domain studies and to have the features required to perform loads analyses for a variety of wind turbine, support platform, and mooring system configurations. The simulation capability was tested using model-to-model comparisons. The favorable results of all of the verification exercises provided confidence to perform more thorough analyses. The simulation tool was then applied in a preliminary loads analysis of a wind turbine supported by a barge with catenary moorings. A barge platform was chosen because of its simplicity in design, fabrication, and installation. The loads analysis aimed to characterize the dynamic response and to identify potential loads and instabilities resulting from the dynamic couplings between the turbine and the floating barge in the presence of combined wind and wave excitation. The coupling between the wind turbine response and the barge-pitch motion, in particular, produced larger extreme loads in the floating turbine than experienced by an equivalent land-based turbine. Instabilities were also found in the system. The influence of conventional wind turbine blade-pitch control actions on the pitch damping of the floating turbine was also assessed. Design modifications for reducing the platform motions, improving the turbine response, and eliminating the instabilities are suggested. These suggestions are aimed at obtaining cost-effective designs that achieve favorable performance while maintaining structural integrity.
Initial Evaluation of a Conflict Detection Tool in the Terminal Area
NASA Technical Reports Server (NTRS)
Verma, Savita Arora; Tang, Huabin; Ballinger, Deborah S.; Kozon, Thomas E.; Farrahi, Amir Hossein
2012-01-01
Despite the recent economic recession and its adverse impact on air travel, the Federal Aviation Administration (FAA) continues to forecast an increase in air traffic demand that may see traffic double or triple by the year 2025. Increases in air traffic will burden the air traffic management system, and higher levels of safety and efficiency will be required. The air traffic controller's primary task is to ensure separation between aircraft in their airspace and keep the skies safe. As air traffic is forecast to increase in volume and complexity [1], there is an increased likelihood of conflicts between aircraft, which adds risk and inefficiency to air traffic management and increases controller workload. To attenuate these factors, recent ATM research has shown that air- and ground-based automation tools could reduce controller workload, especially if the automation is focused on conflict detection and resolution. Conflict Alert is a short-time-horizon conflict detection tool deployed in the Terminal Radar Approach Control (TRACON), which has limited utility due to the high number of false alerts generated and its use of dead reckoning to predict loss of separation between aircraft. Terminal Tactical Separation Assurance Flight Environment (T-TSAFE) is a short-time-horizon conflict detection tool that uses both flight intent and dead reckoning to detect conflicts. Results of a fast-time simulation experiment indicated that T-TSAFE provided a more effective alert lead time and generated fewer false alerts than Conflict Alert [2]. TSAFE was previously tested in a Human-In-The-Loop (HITL) simulation study that focused on the en route phase of flight [3]. The current study tested the T-TSAFE tool in an HITL simulation study focusing on the terminal environment with current-day operations. The study identified procedures, roles, responsibilities, information requirements and usability, with the help of TRACON controllers who participated in the experiment. Metrics such as alert lead time, alert response time, workload, situation awareness and other measures were statistically analyzed. These metrics were examined from an overall perspective, and comparisons between conditions (altitude resolutions via keyboard entry vs. ADS-B entry) and controller positions (two final approach sectors and two feeder sectors) were also examined. Results of these analyses and controller feedback provided evidence of T-TSAFE's potential as a useful air traffic controller tool. Heuristic analysis also provided information on ways in which the T-TSAFE tool can be improved. Details of the analysis results are presented in the full paper.
Firmware Modification Analysis in Programmable Logic Controllers
2014-03-27
security and operational requirements [18, 19]. Money is a factor for the DOD but not a driving one; with private industry, money is a primary influence... functions in the original firmware. A proof-of-concept experiment demonstrates the functionality of the analysis tool using different firmware versions, including opcode difference, function difference, and call graph comparisons.
Evaluation of verification and testing tools for FORTRAN programs
NASA Technical Reports Server (NTRS)
Smith, K. A.
1980-01-01
Two automated software verification and testing systems were developed for use in the analysis of computer programs. An evaluation of the static analyzer DAVE and the dynamic analyzer PET, which are used in the analysis of FORTRAN programs on Control Data (CDC) computers, is described. Both systems were found to be effective and complementary, and are recommended for use in testing FORTRAN programs.
Cost-effective use of minicomputers to solve structural problems
NASA Technical Reports Server (NTRS)
Storaasli, O. O.; Foster, E. P.
1978-01-01
Minicomputers are receiving increased use throughout the aerospace industry. Until recently, their use focused primarily on process control and numerically controlled tooling applications, while their exposure to and the opportunity for structural calculations has been limited. With the increased availability of this computer hardware, the question arises as to the feasibility and practicality of carrying out comprehensive structural analysis on a minicomputer. This paper presents results on the potential for using minicomputers for structural analysis by (1) selecting a comprehensive, finite-element structural analysis system in use on large mainframe computers; (2) implementing the system on a minicomputer; and (3) comparing the performance of the minicomputers with that of a large mainframe computer for the solution to a wide range of finite element structural analysis problems.
Overview of computational control research at UT Austin
NASA Technical Reports Server (NTRS)
Wie, Bong
1989-01-01
An overview of current research activities at UT Austin is presented, discussing technical issues in the following areas: (1) Computer-Aided Nonlinear Control Design: In this project, the describing function method is employed for the nonlinear control analysis and design of a flexible spacecraft equipped with pulse-modulated reaction jets. The INCA program has been enhanced to allow the numerical calculation of describing functions as well as nonlinear limit cycle analysis in the frequency domain; (2) Robust Linear Quadratic Gaussian (LQG) Compensator Synthesis: Robust control design techniques and software tools are developed for flexible space structures with parameter uncertainty. In particular, an interactive, robust multivariable control design capability is being developed for the INCA program; and (3) LQR-Based Autonomous Control System for the Space Station: In this project, real-time implementation of an LQR-based autonomous control system is investigated for the space station with time-varying inertias and significant multibody dynamic interactions.
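In the spirit of the third project, a minimal LQR sketch: solve the continuous-time algebraic Riccati equation for a double integrator and recover the full-state feedback gain. The plant and weights are a textbook example, not the actual station model.

```python
# Minimal continuous-time LQR design: u = -K x.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [0.0, 0.0]])   # double-integrator plant
B = np.array([[0.0], [1.0]])
Q = np.diag([10.0, 1.0])                  # state weights
R = np.array([[1.0]])                     # control weight

P = solve_continuous_are(A, B, Q, R)      # Riccati solution
K = np.linalg.solve(R, B.T @ P)           # optimal gain K = R^-1 B^T P
print("gain K:", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```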
The SARE tool for rabies control: Current experience in Ethiopia.
Coetzer, A; Kidane, A H; Bekele, M; Hundera, A D; Pieracci, E G; Shiferaw, M L; Wallace, R; Nel, L H
2016-11-01
The Stepwise Approach towards Rabies Elimination (SARE) tool was developed through a joint effort of the Food and Agriculture Organization (FAO) of the United Nations and the Global Alliance for Rabies Control (GARC) to provide a standard mechanism for countries to assess their rabies situation and measure progress in eliminating the disease. Because the African continent has the highest per capita death rate from rabies, and Ethiopia is estimated to have the second largest number of rabies deaths of all African countries, Ethiopia undertook a self-assessment by means of the SARE tool. In February 2016, the Ethiopian government hosted an intersectoral consultative meeting in an effort to assess the progress that has been made towards the control and elimination of canine rabies. The SARE assessment identified a number of critical gaps, including poor intersectoral collaboration and limited availability of and access to dog vaccine, while the existence of a surveillance system for rabies and legislation for outbreak declaration and response were among the strengths identified. The SARE tool enabled key criteria to be prioritized, thereby accelerating the National Strategy and ensuring that Ethiopia will progress rapidly in line with the goals set by the global community for the elimination of human rabies deaths by 2030. Although the analysis showed that Ethiopia is still in the early stages of rabies control (Stage 0.5/5), the country shows great promise in terms of developing a SARE-guided National Rabies Prevention and Control Strategy.
The Litho-Density tool calibration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ellis, D.; Flaum, C.; Marienbach, E.
1983-10-01
The Litho-Density tool (LDT) uses a gamma ray source and two NaI scintillator detectors for borehole measurement of electron density, ρe, and a quantity, Pe, which is related to the photoelectric cross section at 60 keV and therefore to the lithology of the formation. An active stabilization system controls the gains of the two detectors, which permits selective gamma-ray detection. Spectral analysis is performed in the near detector (2 energy windows) and in the detector farther from the source (3 energy windows). This paper describes the results of laboratory measurements undertaken to define the basic tool response. The tool is shown to provide reliable measurements of formation density and lithology under a variety of environmental conditions.
Challenges Facing Design and Analysis Tools
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)
2001-01-01
The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify these effects as they affect mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers, adaptivity for model and solution strategies, control processes for concurrent, distributed computing for uncertainty assessments, and immersive technology. Traditional finite element codes still require fast direct solvers which when coupled to current CPU power enables new insight as a result of high-fidelity modeling. The third area involves decision making by the analyst. This area involves the integration and interrogation of vast amounts of information - some global in character while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools will be described.
Pathfinder 2: In Situ Design Cost Trades (IDCT) Tool
2003-05-01
The Cost Analysis box in the center of the figure is represented as an ICOM box, where inputs, controls, outputs, and mechanisms are denoted on each of the... characteristics (e.g. mission profile, maintenance plan). All of these are received in some form from domain experts. Controls are denoted at the top of the
The value of job analysis, job description and performance.
Wolfe, M N; Coggins, S
1997-01-01
All companies, regardless of size, are faced with the same employment concerns. Efficient personnel management requires the use of three human resource techniques--job analysis, job description and performance appraisal. These techniques and tools are not for large practices only. Small groups can obtain the same benefits by employing these performance control measures. Job analysis allows for the development of a compensation system. Job descriptions summarize the most important duties. Performance appraisals help reward outstanding work.
Murad, André M; Laumann, Raul A; Lima, Thaina de A; Sarmento, Rubia B C; Noronha, Eliane F; Rocha, Thales L; Valadares-Inglis, Maria C; Franco, Octávio L
2006-01-01
Cowpea crops are severely attacked by Callosobruchus maculatus, a coleopteran that at the larval stage penetrates stored seeds and feeds on the cotyledons. Cowpea weevil control could be based on the utilization of bacteria and fungi to reduce pest development. Entomopathogenic fungi, such as Metarhizium anisopliae, are able to control insect pests and are widely applied in biological control. This report evaluated ten M. anisopliae isolates according to their virulence, correlating chitinolytic, proteolytic and alpha-amylolytic activities, as well as a proteomic analysis, by two-dimensional gels, of fungal secretions in response to an induction medium containing C. maculatus shells, indicating novel biotechnological tools capable of improving cowpea crop resistance.
Technology Combination Analysis Tool (TCAT) for Active Debris Removal
NASA Astrophysics Data System (ADS)
Chamot, B.; Richard, M.; Salmon, T.; Pisseloup, A.; Cougnet, C.; Axthelm, R.; Saunder, C.; Dupont, C.; Lequette, L.
2013-08-01
This paper presents the work of the Swiss Space Center EPFL within the CNES-funded OTV-2 study. In order to find the most performant Active Debris Removal (ADR) mission architectures and technologies, a tool was developed to design and compare ADR spacecraft and to plan ADR campaigns to remove large debris. Two types of architectures are considered to be efficient: the Chaser (a single-debris spacecraft) and the Mothership/Kits (a multiple-debris spacecraft). Both are able to perform controlled re-entry. The tool includes modules to optimise the launch dates and the order of capture, to design missions and spacecraft, and to select launch vehicles. The propulsion, power and structure subsystems are sized by the tool using high-level parametric models, whilst the other subsystems are defined by their mass and power consumption. Final results are still under investigation by the consortium, but two concrete examples of the tool's outputs are presented in the paper.
Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.
Scott, Bradley; Wilcock, Anne
2006-01-01
Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. As the setting for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools is complex, and has resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantitate the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.
NASA Astrophysics Data System (ADS)
Pedersen, N. L.
2015-06-01
The strength of a gear is typically defined relative to durability (pitting) and load capacity (tooth-breakage). Tooth-breakage is controlled by the root shape and this gear part can be designed because there is no contact between gear pairs here. The shape of gears is generally defined by different standards, with the ISO standard probably being the most common one. Gears are manufactured using two principally different tools: rack tools and gear tools. In this work, the bending stress of involute teeth is minimized by shape optimization made directly on the final gear. This optimized shape is then used to find the cutting tool (the gear envelope) that can create this optimized gear shape. A simple but sufficiently flexible root parameterization is applied and emphasis is put on the importance of separating the shape parameterization from the finite element analysis of stresses. Large improvements in the stress level are found.
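For context, a hedged illustration of the classical baseline such shape optimization improves on: the Lewis formula for root bending stress. The paper's finite-element-based optimization goes well beyond this, and the numbers below are an arbitrary example, not the paper's data.

```python
# Classical Lewis root bending stress for a spur gear tooth.
def lewis_bending_stress(F_t, b, m, Y):
    """sigma = F_t / (b * m * Y):
    F_t tangential load (N), b face width (mm), m module (mm),
    Y dimensionless Lewis form factor. Result in N/mm^2 = MPa."""
    return F_t / (b * m * Y)

# 2 kN tangential load, 20 mm face width, module 4 mm, Y = 0.32
print(lewis_bending_stress(2000.0, 20.0, 4.0, 0.32), "MPa")
```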
HACCP: Integrating Science and Management through ASTM Standards
From a technical perspective, hazard analysis-critical control point (HACCP) evaluation may be considered a risk management tool suited to a wide range of applications. As one outcome of a symposium convened by American Society for Testing and Materials (ASTM) in August, 2005, th...
NASA Astrophysics Data System (ADS)
Hilliard, Antony
Energy Monitoring and Targeting (M&T) is a well-established business process that develops information about utility energy consumption in a business or institution. While M&T has persisted as a worthwhile energy conservation support activity, it has not been widely adopted. This dissertation explains M&T challenges in terms of diagnosing and controlling energy consumption, informed by a naturalistic field study of M&T work. A Cognitive Work Analysis of M&T identifies structures that diagnosis can search, information flows unsupported in canonical support tools, and opportunities to extend the most popular tool for M&T: Cumulative Sum of Residuals (CUSUM) charts. A design application outlines how CUSUM charts were augmented with a more contemporary statistical change-detection strategy, Recursive Parameter Estimates, modified to better suit the M&T task using Representation Aiding principles. The design was experimentally evaluated in a controlled M&T synthetic task and was shown to significantly improve diagnosis performance.
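The CUSUM-of-residuals chart that the dissertation builds on has a very small computational core: fit a baseline model of consumption against a driver, then cumulatively sum the residuals so that a sustained drift becomes visible. A minimal sketch, with a linear degree-day baseline and illustrative monthly data:

```python
import numpy as np

def cusum_of_residuals(energy, driver):
    """Fit a linear baseline energy = a*driver + b, then cumulatively
    sum the residuals; a sustained drift in the cumulative sum flags a
    change in energy performance."""
    a, b = np.polyfit(driver, energy, 1)
    residuals = energy - (a * driver + b)
    return np.cumsum(residuals)

# Hypothetical monthly data: heating energy (kWh) vs. degree-days.
degree_days = np.array([310, 280, 240, 180, 120, 90, 150, 220, 300])
energy_kwh = np.array([5200, 4700, 4100, 3200, 2300, 1900, 2800, 3900, 5600])
print(cusum_of_residuals(energy_kwh, degree_days))
```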
Model based systems engineering for astronomical projects
NASA Astrophysics Data System (ADS)
Karban, R.; Andolfato, L.; Bristow, P.; Chiozzi, G.; Esselborn, M.; Schilling, M.; Schmid, C.; Sommer, H.; Zamparelli, M.
2014-08-01
Model Based Systems Engineering (MBSE) is an emerging field of systems engineering for which the System Modeling Language (SysML) is a key enabler for descriptive, prescriptive and predictive models. This paper surveys some of the capabilities, expectations and peculiarities of tool-assisted MBSE experienced in real-life astronomical projects. The examples range in depth and scope across a wide spectrum of applications (for example documentation, requirements, analysis, trade studies) and purposes (addressing a particular development need, or accompanying a project throughout many - if not all - of its lifecycle phases, fostering reuse and minimizing ambiguity). From the beginnings of the Active Phasing Experiment, through VLT instrumentation, VLTI infrastructure, and the Telescope Control System for the E-ELT, to Wavefront Control for the E-ELT, we show how stepwise refinements of tools, processes and methods have provided tangible benefits to customary systems engineering activities like requirement flow-down, design trade studies, interface definition, and validation, by means of a variety of approaches (like Model Checking, Simulation, Model Transformation) and methodologies (like OOSEM, State Analysis).
Developing a uniformed assessment tool to evaluate care service needs for disabled persons in Japan.
Takei, Teiji; Takahashi, Hiroshi; Nakatani, Hiroki
2008-05-01
Until recently, the care services for disabled persons have been under rigid control by public sectors in terms of provision and funding in Japan. A reform was introduced in 2003 that brought a rapid increase of utilization of services and serious shortage of financial resources. Under these circumstances, the "Services and Supports for Persons with Disabilities Act" was enacted in 2005, requiring that the care service provision process should be transparent, fair and standardized. The purpose of this study is to develop an objective tool for assessing the need for disability care. In the present study we evaluate 1423 cases of patients receiving care services in 60 municipalities, including all three categories of disabilities (physical, intellectual and mental). Using the data of the total 106 items, we conducted factor analysis and regression analysis to develop an assessment tool for people with disabilities. The data revealed that instrumental activities of daily living (IADL) played an essential role in assessing disability levels. We have developed the uniformed assessment tool that has been utilized to guide the types and quantity of care services throughout Japan.
Modeling and Analysis of Power Processing Systems (MAPPS), initial phase 2
NASA Technical Reports Server (NTRS)
Yu, Y.; Lee, F. C.; Wangenheim, H.; Warren, D.
1977-01-01
The overall objective of the program is to provide the engineering tools to reduce the analysis, design, and development effort, and thus the cost, in achieving the required performance for switching regulators and dc-dc converter systems. The program was both tutorial and application oriented. Various analytical methods were described in detail and supplemented with examples, and those with standardization appeal were reduced to computer-based subprograms. Major program efforts included small- and large-signal control-dependent performance analysis and simulation, control circuit design, power circuit design and optimization, system configuration study, and system performance simulation. Techniques including discrete time domain, conventional frequency domain, Lagrange multipliers, nonlinear programming, and control design synthesis were employed in these efforts. To enhance interactive conversation between the modeling and analysis subprograms and the user, a working prototype of the Data Management Program was also developed to facilitate expansion as future subprogram capabilities increase.
deepTools2: a next generation web server for deep-sequencing data analysis.
Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas
2016-07-08
We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continues to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de. The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command-line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
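As a flavor of command-line usage, the snippet below drives one core deepTools program, bamCoverage, from Python to turn an indexed BAM file into a bigWig coverage track. The file names are hypothetical, and normalization-flag names vary between deepTools 2.x and 3.x, so treat the options as indicative rather than definitive:

```python
import subprocess

# bamCoverage converts an indexed BAM of aligned reads into a coverage
# track -- a typical first step in a deepTools workflow.
subprocess.run(
    ["bamCoverage",
     "-b", "sample.bam",   # hypothetical input; requires a .bai index
     "-o", "sample.bw",    # bigWig output
     "--binSize", "50"],   # coverage resolution in bp
    check=True,            # raise if the tool exits non-zero
)
```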
Experimental evaluation of tool run-out in micro milling
NASA Astrophysics Data System (ADS)
Attanasio, Aldo; Ceretti, Elisabetta
2018-05-01
This paper deals with the micro milling cutting process, focusing on tool run-out measurement. Among the effects of the scale reduction from macro to micro (i.e., size effects), tool run-out plays an important role. This research is aimed at developing an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective, this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving production quality and process stability while reducing tool wear and machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges. The cutting-edge phase measurement is based on force signal analysis. The developed procedure has been tested on data from micro milling experiments performed on a Ti6Al4V sample. The results showed that the procedure can be successfully used for tool run-out estimation.
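The authors' analytical model is not reproduced in the abstract, but the quantities it works from suggest the following simplified sketch (a two-flute geometry is assumed, and all data and the peak-masking heuristic are illustrative, not the paper's):

```python
import numpy as np

def runout_offset(channel_width_um, tool_diameter_um):
    """With run-out, the tool sweeps a circle larger than its nominal
    diameter, so the machined slot is over-width; half the excess
    approximates the run-out offset (simplified two-flute geometry)."""
    return 0.5 * (channel_width_um - tool_diameter_um)

def edge_phase_angle(force, spindle_freq_hz, fs_hz):
    """Estimate the phase between the two cutting-edge engagements from
    the force signal: locate the two largest peaks within one spindle
    revolution and convert their spacing to an angle in degrees."""
    n_rev = int(fs_hz / spindle_freq_hz)   # samples per revolution
    rev = np.asarray(force[:n_rev], dtype=float)
    i1 = int(np.argmax(rev))
    lo, hi = max(0, i1 - n_rev // 8), min(n_rev, i1 + n_rev // 8)
    rev2 = rev.copy()
    rev2[lo:hi] = -np.inf                  # mask the first peak
    i2 = int(np.argmax(rev2))
    return 360.0 * abs(i2 - i1) / n_rev

print(runout_offset(channel_width_um=208.0, tool_diameter_um=200.0))  # 4 um
```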
Birch, Ivan; Vernon, Wesley; Walker, Jeremy; Saxelby, Jai
2013-10-01
Gait analysis from closed circuit camera footage is now commonly used as evidence in criminal trials. The biomechanical analysis of human gait is a well-established science in both clinical and laboratory settings. However, closed circuit camera footage is rarely of the quality of that taken in the more controlled clinical and laboratory environments. The less than ideal quality of much of this footage for use in gait analysis is associated with a range of issues, the combination of which can often render the footage unsuitable for gait analysis. The aim of this work was to develop a tool for assessing the suitability of closed circuit camera footage for the purpose of forensic gait analysis. A Delphi technique was employed with a small sample of expert forensic gait analysis practitioners to identify key quality elements of CCTV footage used in legal proceedings. Five elements of the footage were identified and then subdivided into 15 contributing sub-elements, each of which was scored using a 5-point Likert scale. A Microsoft Excel worksheet was developed to calculate automatically an overall score from the fifteen sub-element scores. Five expert witnesses experienced in using CCTV footage for gait analysis then trialled the prototype tool on current case footage. A repeatability study was also undertaken using standardized CCTV footage. The results showed the tool to be a simple and repeatable means of assessing the suitability of closed circuit camera footage for use in forensic gait analysis. The inappropriate use of poor quality footage could lead to challenges to the practice of forensic gait analysis. All parties involved in criminal proceedings must therefore understand the fitness for purpose of any footage used. The development of this tool could offer a method of achieving this goal, and help to assure the continued role of forensic gait analysis as an aid to the identification process. Copyright © 2013 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
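The paper names neither the five elements nor the combination rule beyond "15 sub-elements on 5-point Likert scales, combined automatically". The sketch below therefore uses hypothetical sub-element names and a plain sum; the published worksheet may group and weight the scores differently:

```python
# Hypothetical grouping: five quality elements, three sub-elements each,
# every sub-element scored 1-5 on a Likert scale.
SUB_ELEMENTS = {
    "picture_quality": ["resolution", "compression", "lighting"],
    "camera_position": ["angle", "distance", "obstruction"],
    "subject_view":    ["gait_cycles", "body_visibility", "clothing"],
    "frame_rate":      ["rate", "dropped_frames", "motion_blur"],
    "provenance":      ["timestamping", "continuity", "chain_of_custody"],
}

def overall_score(scores):
    """Sum the fifteen 1-5 Likert scores (maximum 75); the real
    worksheet computes its overall score automatically from the same
    fifteen inputs."""
    total = 0
    for element, subs in SUB_ELEMENTS.items():
        for sub in subs:
            s = scores[sub]
            if not 1 <= s <= 5:
                raise ValueError(f"{sub}: Likert score must be 1-5")
            total += s
    return total

example = {sub: 3 for subs in SUB_ELEMENTS.values() for sub in subs}
print(overall_score(example))  # 45 of a possible 75
```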
NASA Technical Reports Server (NTRS)
Newsom, Jerry R.
1991-01-01
Control-Structures Interaction (CSI) technology embraces the understanding of the interaction between the spacecraft structure and the control system, and the creation and validation of concepts, techniques, and tools for enabling the interdisciplinary design of an integrated structure and control system, rather than the integration of a structural design and a control system design. The goal of this program is to develop validated CSI technology for integrated design/analysis and qualification of large flexible space systems and precision space structures. A description of the CSI technology program is presented.
Simulation Environment for Orion Launch Abort System Control Design Studies
NASA Technical Reports Server (NTRS)
McMinn, J. Dana; Jackson, E. Bruce; Christhilf, David M.
2007-01-01
The development and use of an interactive environment to perform control system design and analysis of the proposed Crew Exploration Vehicle Launch Abort System is described. The environment, built using a commercial dynamic systems design package, includes use of an open-source configuration control software tool and a collaborative wiki to coordinate between the simulation developers, control law developers and users. A method for switching between multiple candidate control laws and vehicle configurations is described. Aerodynamic models, especially in a development program, change rapidly, so a means for automating the implementation of new aerodynamic models is described.
Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham
2007-01-01
This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen which is sensitive to delay in centrifugation and/or analysis. The results of analyses of the samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable, optimized pre-analytical phase. The aim of the comparison was: (a) to identify those medical practices whose mean/median sample values significantly deviate from those of the control situation in the hospital laboratory due to possible problems in the pre-analytical phase; (b) to aid these laboratories in the process of rectifying these problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed which addresses the above-mentioned problems. It has been tested on serum potassium, which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of the analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has an educational value and can also be adopted for use in other decentralized laboratories.
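The core comparison the PAS tool performs can be sketched as follows. The potassium values are hypothetical, and the choice of a Mann-Whitney test is an assumption on our part, since the paper describes the comparison only at the level of means/medians:

```python
import numpy as np
from scipy.stats import mannwhitneyu

def flag_deviating_practices(practice_results, control_results, alpha=0.01):
    """Compare each sender's serum-potassium results against the on-site
    laboratory (controlled pre-analytical phase); a significant shift
    suggests delay or mishandling before centrifugation."""
    flagged = {}
    control = np.asarray(control_results)
    for practice, values in practice_results.items():
        stat, p = mannwhitneyu(values, control, alternative="two-sided")
        if p < alpha:
            flagged[practice] = (float(np.median(values)), p)
    return flagged

# Hypothetical K+ values (mmol/L): pseudo-hyperkalaemia from delayed
# centrifugation shifts a practice's distribution upward.
control = [4.1, 4.3, 3.9, 4.0, 4.4, 4.2, 4.1, 4.0]
practices = {"practice_A": [4.9, 5.2, 5.0, 4.8, 5.1, 4.7],
             "practice_B": [4.0, 4.2, 4.1, 3.9, 4.3, 4.2]}
print(flag_deviating_practices(practices, control))  # flags practice_A
```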
MTK: An AI tool for model-based reasoning
NASA Technical Reports Server (NTRS)
Erickson, William K.; Rudokas, Mary R.
1988-01-01
A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Office is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control, and trend analysis of the Space Station Thermal Control System (TCS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined here with examples from the thermal system to highlight the motivating factors behind them, followed by an overview of the capabilities of MTK, which was developed to address these issues in a generic fashion.
Introducing MASC: a movie for the assessment of social cognition.
Dziobek, Isabel; Fleck, Stefan; Kalbe, Elke; Rogers, Kimberley; Hassenstab, Jason; Brand, Matthias; Kessler, Josef; Woike, Jan K; Wolf, Oliver T; Convit, Antonio
2006-07-01
In the present study we introduce a sensitive video-based test for the evaluation of subtle mindreading difficulties: the Movie for the Assessment of Social Cognition (MASC). This new mindreading tool involves watching a short film and answering questions referring to the actors' mental states. A group of adults with Asperger syndrome (n = 19) and well-matched control subjects (n = 20) were administered the MASC and three other mindreading tools as part of a broader neuropsychological testing session. Compared to control subjects, Asperger individuals exhibited marked and selective difficulties in social cognition. A Receiver Operating Characteristic (ROC) analysis for the mindreading tests identified the MASC as discriminating the diagnostic groups most accurately. Issues pertaining to the multidimensionality of the social cognition construct are discussed.
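The between-test comparison reported here rests on standard ROC machinery. A minimal sketch with hypothetical error scores (not MASC data) shows the computation, including a Youden-index operating point:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical test scores: higher = more mindreading difficulty.
# y = 1 for the clinical group, 0 for matched controls.
y_true = np.array([1] * 10 + [0] * 10)
scores = np.array([14, 12, 15, 11, 13, 9, 16, 12, 10, 14,
                    6,  7,  5,  8,  9, 4,  7,  6,  8,  5])

auc = roc_auc_score(y_true, scores)
fpr, tpr, thresholds = roc_curve(y_true, scores)
best = np.argmax(tpr - fpr)  # Youden's J picks an operating point
print(f"AUC={auc:.2f}, cutoff={thresholds[best]}, "
      f"sens={tpr[best]:.2f}, spec={1 - fpr[best]:.2f}")
```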
PinAPL-Py: A comprehensive web-application for the analysis of CRISPR/Cas9 screens.
Spahn, Philipp N; Bath, Tyler; Weiss, Ryan J; Kim, Jihoon; Esko, Jeffrey D; Lewis, Nathan E; Harismendy, Olivier
2017-11-20
Large-scale genetic screens using CRISPR/Cas9 technology have emerged as a major tool for functional genomics. As their popularity has increased, experimental biologists frequently acquire large sequencing datasets for which they often do not have an easy analysis option. While a few bioinformatic tools have been developed for this purpose, their utility is still hindered either by limited functionality or by the requirement of bioinformatic expertise. To make sequencing data analysis of CRISPR/Cas9 screens more accessible to a wide range of scientists, we developed a Platform-independent Analysis of Pooled Screens using Python (PinAPL-Py), which is operated as an intuitive web service. PinAPL-Py implements state-of-the-art tools and statistical models, assembled in a comprehensive workflow covering sequence quality control, automated sgRNA sequence extraction, alignment, sgRNA enrichment/depletion analysis and gene ranking. The workflow is set up to use a variety of popular sgRNA libraries as well as custom libraries that can be easily uploaded. Various analysis options are offered, suitable for analyzing a large variety of CRISPR/Cas9 screening experiments. Analysis output includes ranked lists of sgRNAs and genes, and publication-ready plots. PinAPL-Py helps to advance genome-wide screening efforts by combining comprehensive functionality with user-friendly implementation. PinAPL-Py is freely accessible at http://pinapl-py.ucsd.edu with instructions and test datasets.
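The enrichment/depletion step at the heart of such a workflow reduces, in its simplest form, to normalizing read counts and ranking genes by a per-sgRNA statistic. A minimal sketch (gene names and counts are hypothetical; PinAPL-Py's actual statistical models are more sophisticated):

```python
import numpy as np
import pandas as pd

def sgRNA_log2fc(counts_treated, counts_control, pseudocount=1.0):
    """Normalize read counts to counts-per-million, then compute a
    per-sgRNA log2 fold change. Total-count normalization assumes most
    guides are unchanged, which holds for real libraries with
    thousands of sgRNAs (the toy table below is only illustrative)."""
    cpm_t = counts_treated / counts_treated.sum() * 1e6
    cpm_c = counts_control / counts_control.sum() * 1e6
    return np.log2((cpm_t + pseudocount) / (cpm_c + pseudocount))

df = pd.DataFrame({
    "gene":    ["EXT1", "EXT1", "B4GALT7", "B4GALT7", "CTRL"],
    "treated": [850, 920, 40, 55, 210],
    "control": [120, 140, 40, 60, 200],
})
df["log2fc"] = sgRNA_log2fc(df["treated"], df["control"])
# Rank genes by the mean log2FC of their sgRNAs (one simple statistic
# among the several ranking methods such tools offer).
print(df.groupby("gene")["log2fc"].mean().sort_values(ascending=False))
```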
Orion Entry, Descent, and Landing Simulation
NASA Technical Reports Server (NTRS)
Hoelscher, Brian R.
2007-01-01
The Orion Entry, Descent, and Landing simulation was created over the past two years to serve as the primary Crew Exploration Vehicle guidance, navigation, and control (GN&C) design and analysis tool at the National Aeronautics and Space Administration (NASA). The Advanced NASA Technology Architecture for Exploration Studies (ANTARES) simulation is a six-degree-of-freedom tool with a unique, highly flexible design architecture. This paper describes the decision history and motivations that guided the creation of this simulation tool. The capabilities of the models within ANTARES are presented in detail. Special attention is given to features of the highly flexible GN&C architecture and the details of the implemented GN&C algorithms. ANTARES provides a foundation simulation for the Orion Project that has already been successfully used for requirements analysis, system definition analysis, and preliminary GN&C design analysis. ANTARES will find useful application in engineering analysis, mission operations, crew training, avionics-in-the-loop testing, etc. This paper focuses on the entry simulation aspect of ANTARES, which is part of a bigger simulation package supporting the entire mission profile of the Orion vehicle. The unique aspects of entry GN&C design are covered, including how the simulation is being used for Monte Carlo dispersion analysis and for support of linear stability analysis. Sample simulation output from ANTARES is presented in an appendix.
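Monte Carlo dispersion analysis of the kind mentioned wraps the simulation in a loop over randomly perturbed initial conditions and environments. A schematic sketch follows; the linear miss-distance model is a toy stand-in for a full 6-DOF entry run, and every number is illustrative:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def entry_miss_km(v0, gamma0, rho_scale):
    """Toy linear sensitivity model standing in for a full 6-DOF entry
    simulation, which would be called here in practice."""
    return (0.04 * (v0 - 7600.0)          # entry velocity error, m/s
            - 900.0 * (gamma0 + 5.9)      # flight-path angle error, deg
            + 25.0 * (rho_scale - 1.0))   # atmospheric density scaling

# Disperse initial conditions and environment about their nominals.
N = 2000
v0 = rng.normal(7600.0, 15.0, N)
gamma0 = rng.normal(-5.9, 0.05, N)
rho = rng.normal(1.0, 0.03, N)

miss = np.array([entry_miss_km(v, g, r) for v, g, r in zip(v0, gamma0, rho)])
print(f"mean miss {miss.mean():.1f} km, "
      f"99th percentile {np.percentile(np.abs(miss), 99):.1f} km")
```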
Coal gasification systems engineering and analysis. Appendix H: Work breakdown structure
NASA Technical Reports Server (NTRS)
1980-01-01
A work breakdown structure (WBS) is presented which encompasses the multiple facets (hardware, software, services, and other tasks) of the coal gasification program. The WBS is shown to provide the basis for the following: management and control; cost estimating; budgeting and reporting; scheduling activities; organizational structuring; specification tree generation; weight allocation and control; and procurement and contracting activities. It also serves as a tool for program evaluation.
NASA Technical Reports Server (NTRS)
DeSmidt, Hans A.; Smith, Edward C.; Bill, Robert C.; Wang, Kon-Well
2013-01-01
This project develops comprehensive modeling and simulation tools for analysis of variable rotor speed helicopter propulsion system dynamics. The Comprehensive Variable-Speed Rotorcraft Propulsion Modeling (CVSRPM) tool developed in this research is used to investigate coupled rotor/engine/fuel control/gearbox/shaft/clutch/flight control system dynamic interactions for several variable rotor speed mission scenarios. In this investigation, a prototypical two-speed Dual-Clutch Transmission (DCT) is proposed and designed to achieve 50 percent rotor speed variation. The comprehensive modeling tool developed in this study is utilized to analyze the two-speed shift response of both a conventional single rotor helicopter and a tiltrotor drive system. In the tiltrotor system, both a Parallel Shift Control (PSC) strategy and a Sequential Shift Control (SSC) strategy for constant and variable forward speed mission profiles are analyzed. Under the PSC strategy, selecting the clutch shift-rate results in a design tradeoff between transient engine surge margins and clutch frictional power dissipation. In the case of SSC, clutch power dissipation is drastically reduced in exchange for the necessity of disengaging one engine at a time, which requires a multi-DCT drive system topology. In addition to comprehensive simulations, several sections are dedicated to detailed analysis of driveline subsystem components under variable speed operation. In particular, an aeroelastic simulation of a stiff in-plane rotor using nonlinear quasi-steady blade element theory was conducted to investigate variable speed rotor dynamics. It was found that 2/rev and 4/rev flap and lag vibrations were significant during resonance crossings, with 4/rev lagwise loads being directly transferred into drive-system torque disturbances. To capture the clutch engagement dynamics, a nonlinear stick-slip clutch torque model is developed. Also, a transient gas-turbine engine model based on first-principles mean-line compressor and turbine approximations is developed. Finally, an analysis of high-frequency gear dynamics, including the effect of tooth mesh stiffness variation under variable speed operation, is conducted and experimentally validated. Through exploring the interactions between the various subsystems, this investigation provides important insights into the continuing development of variable-speed rotorcraft propulsion systems.
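Of the subsystem models listed, the stick-slip clutch is the simplest to illustrate. Below is a minimal sketch of a Coulomb stick-slip torque law, not the authors' published model; the capacity and thresholds are illustrative:

```python
def clutch_torque(omega_rel, t_demand, t_capacity, eps=1e-3):
    """Minimal stick-slip Coulomb clutch: while slipping, the friction
    torque opposes the relative speed at full capacity; once the plates
    lock (|omega_rel| ~ 0), the clutch transmits the demanded torque,
    saturating at capacity (beyond which it breaks away and slips)."""
    if abs(omega_rel) > eps:  # slip phase
        return -t_capacity if omega_rel > 0 else t_capacity
    # stick phase: transmit demand up to the friction capacity
    return max(-t_capacity, min(t_capacity, t_demand))

print(clutch_torque(omega_rel=12.0, t_demand=300.0, t_capacity=500.0))  # -500
print(clutch_torque(omega_rel=0.0,  t_demand=300.0, t_capacity=500.0))  #  300
```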
Dispersed Fringe Sensing Analysis - DFSA
NASA Technical Reports Server (NTRS)
Sigrist, Norbert; Shi, Fang; Redding, David C.; Basinger, Scott A.; Ohara, Catherine M.; Seo, Byoung-Joon; Bikkannavar, Siddarayappa A.; Spechler, Joshua A.
2012-01-01
Dispersed Fringe Sensing (DFS) is a technique for measuring and phasing segmented telescope mirrors using a dispersed broadband light image. DFS is capable of breaking the monochromatic light ambiguity, measuring absolute piston errors between segments of large segmented primary mirrors to tens of nanometers accuracy over a range of 100 micrometers or more. The DFSA software tool analyzes DFS images to extract DFS-encoded segment piston errors, which can be used to measure piston distances between primary mirror segments of ground and space telescopes. This information is necessary to control mirror segments to establish a smooth, continuous primary figure needed to achieve high optical quality. The DFSA tool is versatile, allowing precise piston measurements from a variety of different optical configurations. DFSA technology may be used for measuring wavefront pistons from sub-apertures defined by adjacent segments (such as the Keck Telescope), or from separated sub-apertures used for testing large optical systems (such as sub-aperture wavefront testing for large primary mirrors using auto-collimating flats). An experimental demonstration of the coarse-phasing technology with verification of DFSA was performed at the Keck Telescope. DFSA includes image processing, wavelength and source spectral calibration, fringe extraction line determination, dispersed fringe analysis, and wavefront piston sign determination. The code is robust against internal optical system aberrations and against spectral variations of the source. In addition to the DFSA tool, the software package contains a simple but sophisticated MATLAB model to generate dispersed fringe images of optical system configurations in order to quickly estimate the coarse phasing performance given the optical and operational design requirements. Combining MATLAB (a high-level language and interactive environment developed by MathWorks), MACOS (JPL's software package for Modeling and Analysis for Controlled Optical Systems), and DFSA provides a unique optical development, modeling and analysis package to study current and future approaches to coarse phasing of controlled segmented optical systems.
Contamination and Surface Preparation Effects on Composite Bonding
NASA Technical Reports Server (NTRS)
Kutscha, Eileen O.; Vahey, Paul G.; Belcher, Marcus A.; VanVoast, Peter J.; Grace, William B.; Blohowiak, Kay Y.; Palmieri, Frank L.; Connell, John W.
2017-01-01
Results presented here demonstrate the effect of several prebond surface contaminants (hydrocarbon, machining fluid, latex, silicone, peel ply residue, release film) on bond quality, as measured by fracture toughness and failure modes of carbon fiber reinforced epoxy substrates bonded in secondary and co-bond configurations with paste and film adhesives. Additionally, the capability of various prebond surface property measurement tools to detect contaminants and potentially predict subsequent bond performance of three different adhesives is also shown. Surface measurement methods included water contact angle, Dyne solution wettability, optically stimulated electron emission spectroscopy, surface free energy, inverse gas chromatography, and Fourier transform infrared spectroscopy with chemometrics analysis. Information will also be provided on the effectiveness of mechanical and energetic surface treatments to recover a bondable surface after contamination. The benefits and drawbacks of the various surface analysis tools to detect contaminants and evaluate prebond surfaces after surface treatment were assessed as well as their ability to correlate to bond performance. Surface analysis tools were also evaluated for their potential use as in-line quality control of adhesive bonding parameters in the manufacturing environment.
KNIME for reproducible cross-domain analysis of life science data.
Fillbrunn, Alexander; Dietz, Christian; Pfeuffer, Julianus; Rahn, René; Landrum, Gregory A; Berthold, Michael R
2017-11-10
Experiments in the life sciences often involve tools from a variety of domains such as mass spectrometry, next generation sequencing, or image processing. Passing the data between those tools often involves complex scripts for controlling data flow, data transformation, and statistical analysis. Such scripts are not only prone to be platform-dependent, they also tend to grow as the experiment progresses and are seldom well documented, a fact that hinders the reproducibility of the experiment. Workflow systems such as KNIME Analytics Platform aim to solve these problems by providing a platform for connecting tools graphically and guaranteeing the same results on different operating systems. As open-source software, KNIME allows scientists and programmers to provide their own extensions to the scientific community. In this review paper we present selected extensions from the life sciences that simplify data exploration, analysis, and visualization and are interoperable due to KNIME's unified data model. Additionally, we name other workflow systems that are commonly used in the life sciences and highlight their similarities and differences to KNIME. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Computer system for scanning tunneling microscope automation
NASA Astrophysics Data System (ADS)
Aguilar, M.; García, A.; Pascual, P. J.; Presa, J.; Santisteban, A.
1987-03-01
A computerized system for the automation of a scanning tunneling microscope is presented. It is based on an IBM personal computer (PC), either an XT or an AT, which performs the control, data acquisition and storage operations, displays the STM "images" in real time, and provides image processing tools for the restoration and analysis of data. It supports different data acquisition and control cards and image display cards. The software has been designed in a modular way to allow the replacement of these cards and other equipment improvements, as well as the inclusion of user routines for data analysis.
Reliability Assessment for Low-cost Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Freeman, Paul Michael
Existing low-cost unmanned aerospace systems are unreliable, and engineers must blend reliability analysis with fault-tolerant control in novel ways. This dissertation introduces the University of Minnesota unmanned aerial vehicle flight research platform, a comprehensive simulation and flight test facility for reliability and fault-tolerance research. An industry-standard reliability assessment technique, the failure modes and effects analysis, is performed for an unmanned aircraft. Particular attention is afforded to the control surface and servo-actuation subsystem. Maintaining effector health is essential for safe flight; failures may lead to loss of control incidents. Failure likelihood, severity, and risk are qualitatively assessed for several effector failure modes. Design changes are recommended to improve aircraft reliability based on this analysis. Most notably, the control surfaces are split, providing independent actuation and dual-redundancy. The simulation models for control surface aerodynamic effects are updated to reflect the split surfaces using a first-principles geometric analysis. The failure modes and effects analysis is extended by using a high-fidelity nonlinear aircraft simulation. A trim state discovery is performed to identify the achievable steady, wings-level flight envelope of the healthy and damaged vehicle. Tolerance of elevator actuator failures is studied using familiar tools from linear systems analysis. This analysis reveals significant inherent performance limitations for candidate adaptive/reconfigurable control algorithms used for the vehicle. Moreover, it demonstrates how these tools can be applied in a design feedback loop to make safety-critical unmanned systems more reliable. Control surface impairments that do occur must be quickly and accurately detected. This dissertation also considers fault detection and identification for an unmanned aerial vehicle using model-based and model-free approaches and applies those algorithms to experimental faulted and unfaulted flight test data. Flight tests are conducted with actuator faults that affect the plant input and sensor faults that affect the vehicle state measurements. A model-based detection strategy is designed and uses robust linear filtering methods to reject exogenous disturbances, e.g. wind, while providing robustness to model variation. A data-driven algorithm is developed to operate exclusively on raw flight test data without physical model knowledge. The fault detection and identification performance of these complementary but different methods is compared. Together, enhanced reliability assessment and multi-pronged fault detection and identification techniques can help to bring about the next generation of reliable low-cost unmanned aircraft.
Mechanical problem-solving strategies in left-brain damaged patients and apraxia of tool use.
Osiurak, François; Jarry, Christophe; Lesourd, Mathieu; Baumard, Josselin; Le Gall, Didier
2013-08-01
Left brain damage (LBD) can impair the ability to use familiar tools (apraxia of tool use) as well as novel tools to solve mechanical problems. Thus far, the emphasis has been placed on quantitative analyses of patients' performance. Nevertheless, the question still to be answered is, what are the strategies employed by those patients when confronted with tool use situations? To answer it, we asked 16 LBD patients and 43 healthy controls to solve mechanical problems by means of several potential tools. To specify the strategies, we recorded the time spent in performing four kinds of action (no manipulation, tool manipulation, box manipulation, and tool-box manipulation) as well as the number of relevant and irrelevant tools grasped. We compared LBD patients' performance with that of controls who encountered difficulties with the task (controls-) or not (controls+). Our results indicated that LBD patients grasped a higher number of irrelevant tools than controls+ and controls-. Concerning time allocation, controls+ and controls- spent significantly more time in performing tool-box manipulation than LBD patients. These results are inconsistent with the possibility that LBD patients could engage in trial-and-error strategies and, rather, suggest that they tend to be perplexed. These findings seem to indicate that the inability to reason about the objects' physical properties might prevent LBD patients from following any problem-solving strategy. Copyright © 2013 Elsevier Ltd. All rights reserved.
Comprehensive benchmarking and ensemble approaches for metagenomic classifiers.
McIntyre, Alexa B R; Ounit, Rachid; Afshinnekoo, Ebrahim; Prill, Robert J; Hénaff, Elizabeth; Alexander, Noah; Minot, Samuel S; Danko, David; Foox, Jonathan; Ahsanuddin, Sofia; Tighe, Scott; Hasan, Nur A; Subramanian, Poorani; Moffat, Kelly; Levy, Shawn; Lonardi, Stefano; Greenfield, Nick; Colwell, Rita R; Rosen, Gail L; Mason, Christopher E
2017-09-21
One of the main challenges in metagenomics is the identification of microorganisms in clinical and environmental samples. While an extensive and heterogeneous set of computational tools is available to classify microorganisms using whole-genome shotgun sequencing data, comprehensive comparisons of these methods are limited. In this study, we use the largest-to-date set of laboratory-generated and simulated controls across 846 species to evaluate the performance of 11 metagenomic classifiers. Tools were characterized on the basis of their ability to identify taxa at the genus, species, and strain levels, quantify relative abundances of taxa, and classify individual reads to the species level. Strikingly, the number of species identified by the 11 tools can differ by over three orders of magnitude on the same datasets. Various strategies can ameliorate taxonomic misclassification, including abundance filtering, ensemble approaches, and tool intersection. Nevertheless, these strategies were often insufficient to completely eliminate false positives from environmental samples, which are especially important where they concern medically relevant species. Overall, pairing tools with different classification strategies (k-mer, alignment, marker) can combine their respective advantages. This study provides positive and negative controls, titrated standards, and a guide for selecting tools for metagenomic analyses by comparing ranges of precision, accuracy, and recall. We show that proper experimental design and analysis parameters can reduce false positives, provide greater resolution of species in complex metagenomic samples, and improve the interpretation of results.
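Of the mitigation strategies evaluated (abundance filtering, ensemble approaches, tool intersection), the intersection rule is the easiest to make concrete. A minimal sketch with hypothetical classifier outputs; the abundance floor and vote threshold are assumptions:

```python
def intersect_calls(calls_by_tool, min_tools=2, min_abundance=1e-4):
    """Keep a species only if at least `min_tools` classifiers report it
    above an abundance floor -- the 'tool intersection' strategy that
    helps suppress false positives."""
    votes = {}
    for tool, abundances in calls_by_tool.items():
        for species, ab in abundances.items():
            if ab >= min_abundance:
                votes.setdefault(species, []).append(tool)
    return {sp: tools for sp, tools in votes.items() if len(tools) >= min_tools}

calls = {
    "kmer_tool":      {"E. coli": 0.30, "B. subtilis": 0.00008},
    "alignment_tool": {"E. coli": 0.28, "S. aureus": 0.0005},
    "marker_tool":    {"E. coli": 0.31, "S. aureus": 0.0004},
}
print(intersect_calls(calls))  # E. coli (3 tools), S. aureus (2 tools)
```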
DOE Office of Scientific and Technical Information (OSTI.GOV)
Satogata, Todd
2013-04-22
The integrated control system (ICS) is responsible for the whole ESS machine and facility: accelerator, target, neutron scattering instruments and conventional facilities. This unified approach keeps the costs of development, maintenance and support relatively low. ESS has selected a standardised, field-proven controls framework, the Experimental Physics and Industrial Control System (EPICS), which was originally developed jointly by Argonne and Los Alamos National Laboratories. Complementing this selection are best practices and experience from similar facilities regarding platform standardisation, control system development and device integration and commissioning. The components of ICS include the control system core, the control boxes, the BLED database management system, and the human-machine interface. The control system core is a set of systems and tools that make it possible for the control system to provide required data, information and services to engineers, operators, physicists and the facility itself. The core components are the timing system that makes possible clock synchronisation across the facility, the machine protection system (MPS) and the personnel protection system (PPS) that prevent damage to the machine and personnel, and a set of control system services. Control boxes are servers that control a collection of equipment (for example a radio frequency cavity). The integrated control system will include many control boxes that can be assigned to one supplier, such as an internal team, a collaborating institute or a commercial vendor. This approach facilitates a clear division of responsibilities and makes integration much easier. A control box is composed of a standardised hardware platform, components, development tools and services. On the top level, it interfaces with the core control system components (timing, MPS, PPS) and with the human-machine interface. At the bottom, it interfaces with the equipment and parts of the facility through a set of analog and digital signals, real-time control loops and other communication buses. The ICS central data management system is named BLED (beam line element databases). BLED is a set of databases, tools and services that is used to store, manage and access data. It holds vital control system configuration and physics-related (lattice) information about the accelerator, target and instruments. It facilitates control system configuration by bringing together direct input-output controller (IOC) configuration and real-time data from proton and neutron beam line models. BLED also simplifies development and speeds up the code-test-debug cycle. The set of tools that access BLED will be tailored to the needs of different categories of users, such as ESS staff physicists, engineers, and operators; external partner laboratories; and visiting experimental instrument users. The human-machine interface is vital to providing a high-quality experience to ICS users. It encompasses a wide array of devices and software tools, from control room screens to engineer terminal windows; from beam physics data tools to post-mortem data analysis tools. It serves users with a wide range of skills from widely varied backgrounds. The Controls Group is developing a set of user profiles to accommodate this diverse range of use-cases and users.
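The choice of EPICS means every device parameter is exposed as a process variable (PV) reachable over Channel Access. A minimal sketch using pyepics, a common Python client for EPICS (the PV names are hypothetical; real ESS naming conventions differ, and a reachable IOC is required for these calls to return data):

```python
import epics  # pyepics client for EPICS Channel Access

# Read a cavity gradient read-back and write its setpoint (MV/m).
cavity_grad = epics.caget("LINAC:CAV01:GradientRB")
epics.caput("LINAC:CAV01:GradientSP", 18.5)

# Monitor a PV with a callback, as an operator display might.
def on_change(pvname=None, value=None, **kw):
    print(f"{pvname} -> {value}")

pv = epics.PV("LINAC:CAV01:GradientRB", callback=on_change)
```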
Clinical risk analysis with failure mode and effect analysis (FMEA) model in a dialysis unit.
Bonfant, Giovanna; Belfanti, Pietro; Paternoster, Giuseppe; Gabrielli, Danila; Gaiter, Alberto M; Manes, Massimo; Molino, Andrea; Pellu, Valentina; Ponzetti, Clemente; Farina, Massimo; Nebiolo, Pier E
2010-01-01
The aim of clinical risk management is to improve the quality of care provided by health care organizations and to assure patients' safety. Failure mode and effect analysis (FMEA) is a tool employed for clinical risk reduction. We applied FMEA to chronic hemodialysis outpatients. FMEA steps: (i) process study: we recorded phases and activities. (ii) Hazard analysis: we listed activity-related failure modes and their effects; described control measures; assigned severity, occurrence and detection scores for each failure mode and calculated the risk priority numbers (RPNs) by multiplying the 3 scores. Total RPN is calculated by adding single failure mode RPN. (iii) Planning: we performed a RPNs prioritization on a priority matrix taking into account the 3 scores, and we analyzed failure modes causes, made recommendations and planned new control measures. (iv) Monitoring: after failure mode elimination or reduction, we compared the resulting RPN with the previous one. Our failure modes with the highest RPN came from communication and organization problems. Two tools have been created to ameliorate information flow: "dialysis agenda" software and nursing datasheets. We scheduled nephrological examinations, and we changed both medical and nursing organization. Total RPN value decreased from 892 to 815 (8.6%) after reorganization. Employing FMEA, we worked on a few critical activities, and we reduced patients' clinical risk. A priority matrix also takes into account the weight of the control measures: we believe this evaluation is quick, because of simple priority selection, and that it decreases action times.
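The RPN arithmetic described above is simple enough to state exactly. A minimal sketch, with hypothetical dialysis-unit failure modes and made-up scores (FMEA scores are commonly on 1-10 scales, though the paper does not publish its tables):

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: the product of the three scores, as in the
    FMEA procedure described above (1-10 scales assumed)."""
    for s in (severity, occurrence, detection):
        if not 1 <= s <= 10:
            raise ValueError("scores are conventionally 1-10")
    return severity * occurrence * detection

failure_modes = {
    # hypothetical failure modes: (severity, occurrence, detection)
    "prescription not communicated to nurse": (8, 5, 6),
    "wrong dialysate concentrate":            (9, 2, 3),
    "vascular access not inspected":          (7, 4, 4),
}
rpns = {fm: rpn(*sod) for fm, sod in failure_modes.items()}
print("total RPN:", sum(rpns.values()))
print("highest priority:", max(rpns, key=rpns.get))
```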
NASA's Cryogenic Fluid Management Technology Project
NASA Technical Reports Server (NTRS)
Tramel, Terri L.; Motil, Susan M.
2008-01-01
The Cryogenic Fluid Management (CFM) Project's primary objective is to develop storage, transfer, and handling technologies for cryogens that will support the enabling of high performance cryogenic propulsion systems, lunar surface systems and economical ground operations. Such technologies can significantly reduce propellant launch mass and required on-orbit margins, reduce or even eliminate propellant tank fluid boil-off losses for long term missions, and simplify vehicle operations. This paper will present the status of the specific technologies that the CFM Project is developing. The two main areas of concentration are analysis models development and CFM hardware development. The project develops analysis tools and models based on thermodynamics, hydrodynamics, and existing flight/test data. These tools assist in the development of pressure/thermal control devices (such as the Thermodynamic Vent System (TVS), and Multi-layer insulation); with the ultimate goal being to develop a mature set of tools and models that can characterize the performance of the pressure/thermal control devices incorporated in the design of an entire CFM system with minimal cryogen loss. The project does hardware development and testing to verify our understanding of the physical principles involved, and to validate the performance of CFM components, subsystems and systems. This database provides information to anchor our analytical models. This paper describes some of the current activities of the NASA's Cryogenic Fluid Management Project.
Validation of the Italian version of the HSE Indicator Tool.
Magnavita, N
2012-06-01
An Italian version of the Health & Safety Executive's (HSE) Management Standards Revised Indicator Tool (MS-RIT) has been used to monitor the working conditions that may lead to stress. The aims were, first, to examine the factor structure of the Italian version of the MS-RIT in comparison with the original UK tool and to investigate its validity and reliability; and second, to study the association between occupational stress and psychological distress. Workers from 17 companies self-completed the MS-RIT and the General Health Questionnaire, used to measure psychological distress, while they waited for their periodic examination at the workplace. Factor analysis was employed to ascertain whether the Italian version maintained the original subdivision into seven scales. Odds ratios were calculated to estimate the risk of impairment associated with exposure to stress at the workplace. In total, 748 workers participated; the response rate was 91%. The factor structure of the Italian MS-RIT corresponded partially to the original UK version. The 'demand', 'control', 'role', 'relationship' and 'colleague-support' scales were equivalent to the UK ones. A principal factor, termed 'elasticity', incorporated the UK 'management-support' and 'change' scales. Reliability analysis of the sub-scales revealed Cronbach's alpha values ranging from 0.75 to 0.86. Our findings confirmed the usefulness of the Italian version of the HSE MS-RIT in stress control.
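The reliability figures quoted (Cronbach's alpha from 0.75 to 0.86) come from the standard internal-consistency formula, sketched below on made-up Likert responses:

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of scale scores.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical responses to a 3-item sub-scale (1-5 Likert).
responses = [[4, 4, 5], [2, 3, 2], [5, 4, 4], [3, 3, 3], [1, 2, 2], [4, 5, 4]]
print(f"alpha = {cronbach_alpha(responses):.2f}")  # ~0.93 for this toy data
```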
NASA Astrophysics Data System (ADS)
Remmele, Steffen; Ritzerfeld, Julia; Nickel, Walter; Hesser, Jürgen
2011-03-01
RNAi-based high-throughput microscopy screens have become an important tool in the biological sciences for deciphering the largely unknown biological functions of human genes. However, manual analysis is impossible for such screens, since the number of image data sets can run into the hundreds of thousands. Reliable automated tools are thus required to analyse fluorescence microscopy image data sets that usually contain two or more reaction channels. The image analysis tool presented here is designed to analyse an RNAi screen investigating the intracellular trafficking and targeting of acylated Src kinases. In this specific screen, a data set consists of three reaction channels and the investigated cells can appear in different phenotypes. The main issues in the image processing task are automatic cell segmentation, which has to be robust and accurate for all phenotypes, and subsequent phenotype classification. The cell segmentation is done in two steps: the cell nuclei are segmented first, and a classifier-enhanced region growing then uses the nuclei as seeds to segment the cells. The classification of the cells is realized by a support vector machine which has to be trained manually using supervised learning. Furthermore, the tool is brightness-invariant, accommodating different staining quality, and it provides a quality control that copes with typical defects during preparation and acquisition. A first version of the tool has already been successfully applied to an RNAi screen containing three hundred thousand image data sets, and the SVM-extended version is designed for additional screens.
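The two-step pipeline (nuclei-seeded segmentation, then SVM phenotype classification) can be skeletonized with standard libraries. A minimal sketch, assuming scikit-image and scikit-learn as stand-ins for the authors' custom implementation; the images and features are synthetic placeholders:

```python
import numpy as np
from skimage.filters import gaussian, threshold_otsu
from skimage.measure import label
from sklearn.svm import SVC

def segment_nuclei(dapi):
    """Step 1 of the two-step segmentation: threshold the smoothed
    nuclei channel and label connected components as seeds."""
    smoothed = gaussian(dapi, sigma=2)
    mask = smoothed > threshold_otsu(smoothed)
    return label(mask)

def classify_cells(features, labels, new_features):
    """Phenotype classification with a supervised SVM trained on
    manually annotated cells; features would be per-cell measurements
    (e.g., intensity statistics from the reaction channels)."""
    clf = SVC(kernel="rbf", gamma="scale")
    clf.fit(features, labels)
    return clf.predict(new_features)

dapi = np.random.default_rng(0).random((256, 256))  # stand-in image
print(f"{segment_nuclei(dapi).max()} candidate nuclei")

features = [[0.2, 0.8], [0.7, 0.1], [0.25, 0.9], [0.65, 0.2]]
labels = [0, 1, 0, 1]  # two toy phenotype classes
print(classify_cells(features, labels, [[0.3, 0.85]]))  # -> [0]
```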
Enabling Efficient Climate Science Workflows in High Performance Computing Environments
NASA Astrophysics Data System (ADS)
Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.
2015-12-01
A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks presents a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large scale simulation and analysis work are commonplace and provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.
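The task-parallel MPI pattern mentioned (independent per-file analyses fanned out over ranks) has a very small core. A sketch with mpi4py, run as e.g. `mpirun -n 8 python analyze.py`; the file names and the per-file routine are placeholders:

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

def analyze(path):
    """Placeholder for a per-file statistical routine (e.g. extremes
    detection); returns some summary value."""
    return len(path)

# Hypothetical list of climate-model output files to analyze.
files = [f"cam5_run_{i:03d}.nc" for i in range(240)]

# Static task partition: each rank processes every size-th file,
# with no inter-rank communication needed during the analysis.
for path in files[rank::size]:
    result = analyze(path)
    print(f"rank {rank}: {path} -> {result}")
```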
Amino Acid profile as a feasible tool for determination of the authenticity of fruit juices.
Asadpoor, Mostafa; Ansarin, Masoud; Nemati, Mahboob
2014-12-01
Fruit juice is a nutrient-rich food product with a direct connection to public health. The purpose of this research was to determine the amino acid profile of juices and provide a quick and accurate indicator for determining their authenticity. The method of analysis was HPLC with a fluorescence detector and pre-column derivatization by o-phthalaldehyde (OPA). Sixty-six samples of fruit juices were analyzed, and fourteen amino acids were identified and determined in the sampled fruit juices. The fruit samples used for this analysis were apple, orange, cherry, pineapple, mango, apricot, pomegranate, peach and grape. The results showed that 32% of the samples tested in this study had a lower concentrate percentage than stated on their labels and/or other possible authenticity problems in the manufacturing process. The following samples showed probable adulteration: four cherry juice samples, two pomegranate juice samples, one mango, three grape, four peach, seven orange, two apple and one apricot juice samples. In general, determining the amount of amino acids and comparing sample amino acid profiles with standard values appears to be a useful indicator for quality control. This method can provide the regulatory agencies with a tool to help produce a healthier juice. The analytical control of fruit juice composition is becoming an important issue, and HPLC can provide an important and essential tool for more accurate research as well as for routine analysis.
Purdue ionomics information management system. An integrated functional genomics platform.
Baxter, Ivan; Ouzzani, Mourad; Orcun, Seza; Kennedy, Brad; Jandhyala, Shrinivas S; Salt, David E
2007-02-01
The advent of high-throughput phenotyping technologies has created a deluge of information that is difficult to deal with without the appropriate data management tools. These data management tools should integrate defined workflow controls for genomic-scale data acquisition and validation, data storage and retrieval, and data analysis, indexed around the genomic information of the organism of interest. To maximize the impact of these large datasets, it is critical that they are rapidly disseminated to the broader research community, allowing open access for data mining and discovery. We describe here a system that incorporates such functionalities developed around the Purdue University high-throughput ionomics phenotyping platform. The Purdue Ionomics Information Management System (PiiMS) provides integrated workflow control, data storage, and analysis to facilitate high-throughput data acquisition, along with integrated tools for data search, retrieval, and visualization for hypothesis development. PiiMS is deployed as a World Wide Web-enabled system, allowing for integration of distributed workflow processes and open access to raw data for analysis by numerous laboratories. PiiMS currently contains data on shoot concentrations of P, Ca, K, Mg, Cu, Fe, Zn, Mn, Co, Ni, B, Se, Mo, Na, As, and Cd in over 60,000 shoot tissue samples of Arabidopsis (Arabidopsis thaliana), including ethyl methanesulfonate, fast-neutron and defined T-DNA mutants, and natural accession and populations of recombinant inbred lines from over 800 separate experiments, representing over 1,000,000 fully quantitative elemental concentrations. PiiMS is accessible at www.purdue.edu/dp/ionomics.
NASA Technical Reports Server (NTRS)
Yeh, H. Y. Jannivine; Brown, Cheryl B.; Jeng, Frank F.; Anderson, Molly; Ewert, Michael K.
2009-01-01
The development of the Advanced Life Support (ALS) Sizing Analysis Tool (ALSSAT) using Microsoft(Registered TradeMark) Excel was initiated by the Crew and Thermal Systems Division (CTSD) of Johnson Space Center (JSC) in 1997 to support the ALS and Exploration Offices in Environmental Control and Life Support System (ECLSS) design and studies. It aids the user in performing detailed sizing of the ECLSS for different combinations of the Exploration Life Support (ELS) regenerative system technologies. This analysis tool assists the user in performing ECLSS preliminary design and trade studies as well as system optimization efficiently and economically. The latest ALSSAT-related publication, in ICES 2004, detailed ALSSAT's development status, including the completion of all six ELS Subsystems (ELSS), namely the Air Management Subsystem, the Biomass Subsystem, the Food Management Subsystem, the Solid Waste Management Subsystem, the Water Management Subsystem, and the Thermal Control Subsystem, and two external interfaces, the Extravehicular Activity and the Human Accommodations. Since 2004, many more regenerative technologies in the ELSS have been implemented in ALSSAT. ALSSAT has also been used for the ELS Research and Technology Development Metric Calculation for FY02 through FY06. It was also used to conduct the Lunar Outpost Metric calculation for FY08 and was integrated as part of a Habitat Model developed at Langley Research Center to support the Constellation program. This paper will give an update on the analysis tool's current development status as well as present the analytical results of one of the trade studies that was performed.
Simultaneous Independent Control of Tool Axial Force and Temperature in Friction Stir Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ross, Kenneth A.; Grant, Glenn J.; Darsell, Jens T.
Maintaining consistent tool depth relative to the part surface is a critical requirement for many friction stir processing (FSP) applications. Force control is often used with the goal of obtaining a constant weld depth. When force control is used, if weld temperature decreases, flow stress increases and the tool is pushed up; if weld temperature increases, flow stress decreases and the tool dives. These variations in tool depth and weld temperature cause various types of weld defects. Robust temperature control for FSP maintains a commanded temperature through control of the spindle axis only. Robust temperature control and force control are completely decoupled in control logic and machine motion. This results in stable temperature, force and tool depth despite the presence of geometric and thermal disturbances. Performance of this control method is presented for various weld paths and alloy systems.
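The decoupling described (temperature error actuates only the spindle, force error only the tool axis) is illustrated by the sketch below. The PI structure, gains and setpoints are assumptions for illustration, not the authors' published controller:

```python
class PI:
    """Textbook PI controller; gains here are illustrative only."""
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt, self.acc = kp, ki, dt, 0.0
    def update(self, error):
        self.acc += error * self.dt
        return self.kp * error + self.ki * self.acc

dt = 0.01
temp_ctrl = PI(kp=2.0, ki=0.5, dt=dt)     # commands spindle speed, rpm
force_ctrl = PI(kp=0.02, ki=0.01, dt=dt)  # commands plunge depth, mm

def control_step(t_meas, t_cmd, f_meas, f_cmd, rpm, depth):
    """One loop iteration: temperature error adjusts ONLY the spindle
    speed, force error adjusts ONLY the tool (plunge) axis; the two
    loops share no state, mirroring the decoupling described above.
    Sign convention: excess force retracts the tool slightly."""
    rpm += temp_ctrl.update(t_cmd - t_meas)
    depth += force_ctrl.update(f_cmd - f_meas)
    return rpm, depth

print(control_step(t_meas=455.0, t_cmd=470.0, f_meas=21.0, f_cmd=20.0,
                   rpm=300.0, depth=4.0))
```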
Precise and Efficient Static Array Bound Checking for Large Embedded C Programs
NASA Technical Reports Server (NTRS)
Venet, Arnaud
2004-01-01
In this paper we describe the design and implementation of a static array-bound checker for a family of embedded programs: the flight control software of recent Mars missions. These codes are large (up to 250 KLOC), pointer intensive, heavily multithreaded and written in an object-oriented style, which makes their analysis very challenging. We designed a tool called C Global Surveyor (CGS) that can analyze the largest code in a couple of hours with a precision of 80%. The scalability and precision of the analyzer are achieved by using an incremental framework in which a pointer analysis and a numerical analysis of array indices mutually refine each other. CGS has been designed so that it can distribute the analysis over several processors in a cluster of machines. To the best of our knowledge this is the first distributed implementation of static analysis algorithms. Throughout the paper we will discuss the scalability setbacks that we encountered during the construction of the tool and their impact on the initial design decisions.
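The numerical half of such an analysis typically abstracts each integer variable by an interval and checks index ranges against array bounds. A toy sketch of the idea follows; it is not CGS's actual algorithm, which couples pointer and numerical analyses incrementally and distributes the work:

```python
class Interval:
    """Abstract value for an integer variable: a closed range."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

def check_access(index, array_len):
    """Classify an array access as safe, a definite error, or a warning
    (the imprecise case a checker tries to minimize)."""
    if 0 <= index.lo and index.hi < array_len:
        return "safe"
    if index.hi < 0 or index.lo >= array_len:
        return "definite out-of-bounds"
    return "possible out-of-bounds (warn)"

i = Interval(0, 9)            # loop counter: for (i = 0; i < 10; ++i)
print(check_access(i, 10))                     # -> safe
print(check_access(i + Interval(1, 1), 10))    # -> possible out-of-bounds
```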
Fan, Dong-Dong; Kuang, Yan-Hui; Dong, Li-Hua; Ye, Xiao; Chen, Liang-Mian; Zhang, Dong; Ma, Zhen-Shan; Wang, Jin-Yu; Zhu, Jing-Jing; Wang, Zhi-Min; Wang, De-Qin; Li, Chu-Yuan
2017-04-01
The aim of this study was to optimize the purification process of Gynostemma pentaphyllum saponins (GPS) based on "adjoint marker" online control technology, with GPS as the testing index. UPLC-QTOF-MS technology was used for qualitative analysis. The "adjoint marker" online control results showed that the end point of sample loading was reached when the UV absorbance of the effluent was equal to half that of the load sample solution, and the absorbance was basically stable once the end point was reached. In the UPLC-QTOF-MS qualitative analysis, 16 saponins were identified from GPS, including 13 known gynostemma saponins and 3 new saponins. The optimized method proved to be simple, scientific, reasonable, easy for online determination and real-time recording, and can be readily applied to mass production and automated manufacturing. The results of the qualitative analysis indicated that the "adjoint marker" online control technology can well retain the main efficacy components of the medicinal material, and provides analysis tools for process control and quality traceability. Copyright© by the Chinese Pharmaceutical Association.
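The loading end-point rule is stated explicitly (stop when effluent UV absorbance reaches half that of the load solution), so it is easy to sketch as on-line logic; the absorbance readings below are hypothetical:

```python
def load_until_breakthrough(absorbance_stream, a_load, ratio=0.5):
    """Stop sample loading once effluent UV absorbance reaches the
    'adjoint marker' end point: `ratio` times the absorbance of the
    load solution (0.5 per the optimized process described above)."""
    for t, a_effluent in absorbance_stream:
        if a_effluent >= ratio * a_load:
            return t  # end-point time: stop loading here
    return None       # end point not yet reached

# Hypothetical (time in min, absorbance in AU) readings at the outlet.
stream = [(0, 0.02), (5, 0.05), (10, 0.12), (15, 0.26), (20, 0.41)]
print(load_until_breakthrough(stream, a_load=0.80))  # -> 20
```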
Thermal sensors to control polymer forming. Challenge and solutions
NASA Astrophysics Data System (ADS)
Lemeunier, F.; Boyard, N.; Sarda, A.; Plot, C.; Lefèvre, N.; Petit, I.; Colomines, G.; Allanic, N.; Bailleul, J. L.
2017-10-01
Thermal sensors have been used for many years to better understand and control material forming processes, especially polymer processing. Due to technical constraints (high pressure, sealing, sensor dimensions…), the thermal measurement is often performed in the tool or close to its surface. Thus, it only gives partial and disturbed information. Reliable information about the heat flux exchanged between the tool and the material during the process would be very helpful to improve control of the process and to favor the development of new materials. In this work, we present several sensors developed in laboratories to study the molding steps in forming processes. The analysis of the obtained thermal measurements (temperature, heat flux) shows the sensitivity threshold that thermal sensors require to detect the rate of thermal reaction on-line. Based on these data, we present new sensor designs which have been patented.
Problems of Mathematical Finance by Stochastic Control Methods
NASA Astrophysics Data System (ADS)
Stettner, Łukasz
The purpose of this paper is to present the main ideas of mathematical finance using stochastic control methods. There is an interplay between stochastic control and mathematics of finance: on the one hand, stochastic control is a powerful tool for studying financial problems; on the other hand, financial applications have stimulated development in several research subareas of stochastic control over the last two decades. We start with the pricing of financial derivatives and the modeling of asset prices, studying the conditions for the absence of arbitrage. Then we consider the pricing of defaultable contingent claims. Investments in bonds lead us to term structure modeling problems. Special attention is devoted to the historical static portfolio analysis known as Markowitz theory. We also briefly sketch dynamic portfolio problems using viscosity solutions to the Hamilton-Jacobi-Bellman equation, the martingale-convex analysis method, or the stochastic maximum principle together with a backward stochastic differential equation. Finally, long-time portfolio analysis for both risk-neutral and risk-sensitive functionals is introduced.
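For orientation, the dynamic portfolio problems sketched above revolve around the Hamilton-Jacobi-Bellman equation. The following is its standard form for a one-dimensional controlled diffusion, a generic textbook formulation rather than an equation quoted from this paper:

```latex
% Controlled diffusion dX_t = b(X_t,u_t) dt + \sigma(X_t,u_t) dW_t with
% value function V(t,x) = sup_u E[ \int_t^T f(X_s,u_s) ds + g(X_T) | X_t = x ].
\[
\partial_t V(t,x) + \sup_{u \in U}\Big\{\, b(x,u)\,\partial_x V(t,x)
  + \tfrac{1}{2}\,\sigma^2(x,u)\,\partial_{xx} V(t,x) + f(x,u) \Big\} = 0,
\qquad V(T,x) = g(x).
\]
```

Viscosity solutions enter precisely because V may not be smooth enough for this equation to hold in the classical sense.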
Reid, Susan A; Callister, Robin; Katekar, Michael G; Treleaven, Julia M
2017-08-01
Cervicogenic dizziness (CGD) is hard to diagnose because there is no objective test. Can a brief assessment tool be derived from the Dizziness Handicap Inventory (DHI) to assist in screening for CGD? Case-control study with split-sample analysis. 86 people with CGD and 86 people with general dizziness completed the DHI as part of the assessment of their dizziness. Descriptive statistics were used to assess how frequently each question on the DHI was answered 'yes' or 'sometimes' by participants with CGD and by participants with general dizziness. The questions that best discriminated between CGD and general dizziness were compiled into a brief assessment tool for CGD. Data from 80 participants (40 from each group) were used to generate a receiver operating characteristic (ROC) curve to establish a cut-off score for the brief assessment tool. Data from the remaining 92 participants were then used to validate the diagnostic ability of the brief assessment tool using that cut-off score. Questions 1, 9 and 11 were the most discriminatory and were combined to form the brief assessment tool. The ROC curve indicated an optimal threshold of 9. The diagnostic ability of the brief assessment tool among the remaining 46 participants from each group was: sensitivity 77% (95% CI 67 to 84), specificity 66% (95% CI 56 to 75), positive likelihood ratio 2.28 (95% CI 1.66 to 3.13), and negative likelihood ratio 0.35 (95% CI 0.23 to 0.53). A brief assessment tool of three questions appears to be helpful in screening for CGD. Copyright © 2017. Published by Elsevier Ltd.
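For readers who want to verify them, the likelihood ratios above follow arithmetically from the quoted sensitivity and specificity. A quick check in Python (point estimates only; the confidence intervals would require the underlying 2x2 counts):

```python
# Likelihood ratios from the abstract's point estimates.
sensitivity = 0.77   # P(score >= 9 | CGD)
specificity = 0.66   # P(score <  9 | general dizziness)

lr_pos = sensitivity / (1 - specificity)   # how much a positive result raises the odds of CGD
lr_neg = (1 - sensitivity) / specificity   # how much a negative result lowers them

print(f"LR+ = {lr_pos:.2f}")  # ~2.26, matching the reported 2.28 up to rounding
print(f"LR- = {lr_neg:.2f}")  # 0.35, as reported
```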
NASA Technical Reports Server (NTRS)
Hoffman, Edward J. (Editor); Lawbaugh, William M. (Editor)
1996-01-01
Papers address the following topics: NASA's project management development process; Better decisions through structural analysis; NASA's commercial technology management system; Today's management techniques and tools; Program control in NASA - needs and opportunities; and Resources for NASA managers.
Capability Portfolio Analysis Tool (CPAT) Verification and Validation Report
2013-01-01
Indexed excerpt from the report's acronym list: BFSB – Battlefield Surveillance Brigade; BFV – Bradley Fighting Vehicle; BMOD – Bradley Modernization; C2 (H) – Command and Control (HBCT); C2 (S…; Infantry Fighting Vehicle (IFV); Fire Integrated Support Team (FIST); Engineer (Eng); Cavalry (CAV); BFV FOV; CDD Block II, 16 Apr 2010; GCV FOV
Modeling and Controls Development of 48V Mild Hybrid Electric Vehicles
The Advanced Light-Duty Powertrain and Hybrid Analysis tool (ALPHA) was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types c...
Molecular electron recollision dynamics in intense circularly polarized laser pulses
NASA Astrophysics Data System (ADS)
Bandrauk, André D.; Yuan, Kai-Jun
2018-04-01
Extreme UV and x-ray table top light sources based on high-order harmonic generation (HHG) are focused now on circular polarization for the generation of circularly polarized attosecond pulses as new tools for controlling electron dynamics, such as charge transfer and migration and the generation of attosecond quantum electron currents for ultrafast magneto-optics. A fundamental electron dynamical process in HHG is laser induced electron recollision with the parent ion, well established theoretically and experimentally for linear polarization. We discuss molecular electron recollision dynamics in circular polarization by theoretical analysis and numerical simulation. The control of the polarization of HHG with circularly polarized ionizing pulses is examined and it is shown that bichromatic circularly polarized pulses enhance recollision dynamics, rendering HHG more efficient, especially in molecules because of their nonspherical symmetry. The polarization of the harmonics is found to be dependent on the compatibility of the rotational symmetry of the net electric field created by combinations of bichromatic circularly polarized pulses with the dynamical symmetry of molecules. We show how the field and molecule symmetry influences the electron recollision trajectories by a time-frequency analysis of harmonics. The results, in principle, offer new unique controllable tools in the study of attosecond molecular electron dynamics.
Optimal Synthesis of Compliant Mechanisms using Subdivision and Commercial FEA (DETC2004-57497)
NASA Technical Reports Server (NTRS)
Hull, Patrick V.; Canfield, Stephen
2004-01-01
The field of distributed-compliance mechanisms has seen significant work in developing suitable topology optimization tools for their design. These optimal design tools have grown out of the techniques of structural optimization. This paper will build on the previous work in topology optimization and compliant mechanism design by proposing an alternative design space parameterization through control points and adding another step to the process, that of subdivision. The control points allow a specific design to be represented as a solid model during the optimization process. The process of subdivision creates an additional number of control points that help smooth the surface (for example a C² continuous surface, depending on the method of subdivision chosen) creating a manufacturable design free of some traditional numerical instabilities. Note that these additional control points do not add to the number of design parameters. This alternative parameterization and description as a solid model effectively and completely separates the design variables from the analysis variables during the optimization procedure. The motivation behind this work is to create an automated design tool from task definition to functional prototype created on a CNC or rapid-prototype machine. This paper will describe the proposed compliant mechanism design process and will demonstrate the procedure on several examples common in the literature.
Cheng, Y Z; Xu, T J; Jin, X X; Tang, D; Wei, T; Sun, Y Y; Meng, F Q; Shi, G; Wang, R X
2012-01-01
Through multiple alignment analysis of mitochondrial tRNA-Thr and tRNA-Phe sequences from 161 fishes, new universal primers specially targeting the entire mitochondrial control region were designed. This new primer set successfully amplified the expected PCR products from various kinds of marine fish species, belonging to various families, and the amplified segments were confirmed to be the control region by sequencing. These primers provide a useful tool to study the control region diversity in economically important fish species, the possible mechanism of control region evolution, and the functions of the conserved motifs in the control region.
Harris, Dale M; Rantalainen, Timo; Muthalib, Makii; Johnson, Liam; Teo, Wei-Peng
2015-01-01
The use of virtual reality games (known as "exergaming") as a neurorehabilitation tool is gaining interest. Therefore, we aimed to collate evidence for the effects of exergaming on the balance and postural control of older adults and people with idiopathic Parkinson's disease (IPD). Six electronic databases were searched, from inception to April 2015, to identify relevant studies. Standardized mean differences (SMDs) and 95% confidence intervals (CI) were used to calculate effect sizes between experimental and control groups. I² statistics were used to determine levels of heterogeneity. 325 older adults and 56 people with IPD were assessed across 11 studies. The results showed that exergaming improved static balance (SMD 1.069, 95% CI 0.563 to 1.576), postural control (SMD 0.826, 95% CI 0.481 to 1.170), and dynamic balance (SMD -0.808, 95% CI -1.192 to -0.424) in healthy older adults. Two IPD studies showed an improvement in static balance (SMD 0.124, 95% CI -0.581 to 0.828) and postural control (SMD 2.576, 95% CI 1.534 to 3.599). Our findings suggest that exergaming might be an appropriate therapeutic tool for improving balance and postural control in older adults, but more large-scale trials are needed to determine if the same is true for people with IPD.
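As a sketch of the effect-size computation behind figures like "SMD 1.069, 95% CI 0.563 to 1.576", the following snippet applies the standard formulas for a standardized mean difference (Hedges' g) and its approximate 95% CI; the group statistics are made-up illustrative numbers, not data from the review.

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Hedges' g) with an approximate 95% CI,
    using the standard formulas (e.g., Borenstein et al.)."""
    s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled                 # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # small-sample correction factor
    g = j * d
    se = math.sqrt((n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2)))
    return g, (g - 1.96 * se, g + 1.96 * se)

# Hypothetical balance scores for an exergaming vs. control comparison:
g, ci = hedges_g(m1=52.0, sd1=6.0, n1=30, m2=46.0, sd2=7.0, n2=30)
print(f"SMD = {g:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```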
Dokras, Anuja; Clifton, Shari; Futterweit, Walter; Wild, Robert
2011-01-01
Polycystic ovary syndrome (PCOS) and depression both have a high prevalence in reproductive-aged women. This study aimed to determine the prevalence of abnormal depression scores in women who meet currently recognized definitions of PCOS compared with women in a well-defined control group. The search was performed in MEDLINE, EMBASE Classic plus EMBASE, PsycINFO, Current Contents-Clinical Medicine and Current Contents-Life Sciences, and Web of Science. The Cochrane software Review Manager 5.0.24 was used to construct forest plots comparing the risk of abnormal depression scores in the PCOS and control groups. Studies with well-defined criteria for women with PCOS and control groups of women without PCOS, with demographic information including age and body mass index (BMI), were included. Of 752 screened articles, 17 met the selection criteria for systematic review and 10 studies were included in the meta-analysis. Data were abstracted independently by three reviewers. All studies were cross-sectional and most used the Rotterdam criteria for the diagnosis of PCOS (n=10). The odds ratio (OR) for abnormal depression scores was 4.03 (95% confidence interval [CI] 2.96-5.5, P<.01) in women with PCOS (n=522) compared with those in the control groups (n=475). A subanalysis showed that the odds for abnormal depression scores were independent of BMI (OR 4.09, 95% CI 2.62-6.41). Several validated tools were used to screen for depression; the most commonly used was the Beck Depression Inventory. The results of our study suggest the need to screen all women with PCOS for depression using validated screening tools. Women with PCOS are at increased risk for abnormal depression scores independent of BMI.
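For illustration, an odds ratio of this kind is computed from a 2x2 table with the standard log (Woolf) method, as sketched below; the counts are hypothetical, since the abstract reports only the pooled result.

```python
import math

def odds_ratio(a, b, c, d):
    """OR and 95% CI for a 2x2 table:
    a = PCOS, abnormal depression score;  b = PCOS, normal score;
    c = control, abnormal score;          d = control, normal score."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of ln(OR), Woolf method
    lo = math.exp(math.log(or_) - 1.96 * se_log_or)
    hi = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, (lo, hi)

# Hypothetical counts of similar magnitude to the pooled samples (522 vs 475):
or_, ci = odds_ratio(a=180, b=342, c=55, d=420)
print(f"OR = {or_:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```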
Bogan, Richard K; Turner, Jo Anne
2007-01-01
Insomnia is the leading sleep disorder in the US; however, diagnosis is often problematic. This pilot study assessed the clinical value of a novel diagnostic insomnia questionnaire. The SleepMed Insomnia Index (SMI) was administered to 543 consecutive patients and 50 normal control subjects during a pilot study. Mean SMI scores were assessed based on subsequent sleep-related diagnoses. The SMI scores for patients with sleep-related disorders were significantly higher than those for the control group (p < 0.001) and highest for the 90 patients comprising the insomnia group. Analysis of the SMI scores from the 90 insomnia patients indicates a high degree of reliability (Cronbach's alpha: 0.7). These data support our clinical experience with this diagnostic tool, which indicates a strong likelihood of disrupted nighttime sleep in patients with high SMI scores. Following further validation, the SMI may prove to be a valuable tool for evaluating sleep disorders, specifically as an aid in the diagnosis of insomnia. The Sleep Matrix is a visual tool that quantifies a sleep complaint by combining scores from the Epworth Sleepiness Scale (ESS) and the SMI. The SMI measures an insomnia component, while the ESS is an accepted measure of daytime sleepiness. The Sleep Matrix visually displays the complexity of the sleep complaint in an effort to differentiate insomnia of differing etiologies from other sleep disorders and to measure treatment outcomes. To pilot test the Sleep Matrix, the tool was administered to 90 patients with insomnia and to 22 normal controls. Plots from the insomnia patients were concentrated in the "insomnia zone", while scores from the normal controls fell in the "normal zone" in the lower left quadrant. Additional research could establish whether the Sleep Matrix can be used to visually aid the clinician in the diagnosis of unknown sleep complaints. PMID:19300579
Interactive cutting path analysis programs
NASA Technical Reports Server (NTRS)
Weiner, J. M.; Williams, D. S.; Colley, S. R.
1975-01-01
The operation of numerically controlled machine tools is interactively simulated. Four programs were developed to graphically display the cutting paths for a Monarch lathe, Cintimatic mill, Strippit sheet metal punch, and the wiring path for a Standard wire wrap machine. These programs run on an IMLAC PDS-ID graphic display system under the DOS-3 disk operating system. The cutting path analysis programs accept input via both paper tape and disk file.
NASA Technical Reports Server (NTRS)
Kong, Suk Bin
2001-01-01
The volatile organic compound (VOC) ethylene was characterized and quantified by GC/FID; levels of 20-50 ppb were detected during the growth stages of radish. SPME proved to be a good analytical tool for this purpose. A low-temperature trapping method using a dry ice/diethyl ether bath and a liquid nitrogen bath is recommended for the sampling process for GC/PID and GC/MS analysis.
Application of Diagnostic Analysis Tools to the Ares I Thrust Vector Control System
NASA Technical Reports Server (NTRS)
Maul, William A.; Melcher, Kevin J.; Chicatelli, Amy K.; Johnson, Stephen B.
2010-01-01
The NASA Ares I Crew Launch Vehicle is being designed to support missions to the International Space Station (ISS), to the Moon, and beyond. The Ares I is undergoing design and development utilizing commercial-off-the-shelf tools and hardware when applicable, along with cutting-edge launch technologies and state-of-the-art design and development. In support of the vehicle's design and development, the Ares Functional Fault Analysis group was tasked to develop an Ares Vehicle Diagnostic Model (AVDM) and to demonstrate the capability of that model to support failure-related analyses and design integration. One important component of the AVDM is the Upper Stage (US) Thrust Vector Control (TVC) diagnostic model, a representation of the failure space of the US TVC subsystem. This paper first presents an overview of the AVDM, its development approach, and the software used to implement the model and conduct diagnostic analysis. It then uses the US TVC diagnostic model to illustrate details of the development, implementation, analysis, and verification processes. Finally, the paper describes how the AVDM model can impact both design and ground operations, and how some of these impacts are being realized during discussions of US TVC diagnostic analyses with US TVC designers.
Software For Graphical Representation Of A Network
NASA Technical Reports Server (NTRS)
Mcallister, R. William; Mclellan, James P.
1993-01-01
System Visualization Tool (SVT) computer program developed to provide systems engineers with means of graphically representing networks. Generates diagrams illustrating structures and states of networks defined by users. Provides systems engineers with powerful tool simplifying analysis of requirements and testing and maintenance of complex software-controlled systems. Employs visual models supporting analysis of chronological sequences of requirements, simulation data, and related software functions. Applied to pneumatic, hydraulic, and propellant-distribution networks. Used to define and view arbitrary configurations of such major hardware components of system as propellant tanks, valves, propellant lines, and engines. Also graphically displays status of each component. Advantage of SVT: utilizes visual cues to represent configuration of each component within network. Written in Turbo Pascal(R), version 5.0.
Risk Management Implementation Tool
NASA Technical Reports Server (NTRS)
Wright, Shayla L.
2004-01-01
Continuous Risk Management (CRM) is a software engineering practice with processes, methods, and tools for managing risk in a project. It provides a controlled environment for practical decision making: continually assessing what could go wrong, determining which risks are important to deal with, implementing strategies to deal with those risks, and measuring the effectiveness of the implemented strategies. Continuous Risk Management provides many training workshops and courses to teach staff how to apply risk management to their various experiments and projects. The steps of the CRM process are identification, analysis, planning, tracking, and control; with these steps and the various methods and tools that go along with them, identifying and dealing with risk is clear-cut. The office that I worked in was the Risk Management Office (RMO). The RMO at NASA works hard to uphold NASA's mission of exploration and advancement of scientific knowledge and technology by defining and reducing program risk. The RMO is one of the divisions that fall under the Safety and Assurance Directorate (SAAD). I worked under Cynthia Calhoun, Flight Software Systems Engineer. My task was to develop a help screen for the Continuous Risk Management Implementation Tool (RMIT). The Risk Management Implementation Tool will be used by many NASA managers to identify, analyze, track, control, and communicate risks in their programs and projects. The RMIT will provide a means for NASA to continuously assess risks. The goals for this tool are to provide a simple means to manage risks, to be used by program and project managers throughout NASA for managing risk, and to support an aggressive approach to advertising and advocating the use of RMIT at each NASA center.
EPA/ECLSS consumables analyses for the Spacelab 1 flight
NASA Technical Reports Server (NTRS)
Steines, G. J.; Pipher, M. D.
1976-01-01
The results of electrical power system (EPS) and environmental control/life support system (ECLSS) consumables analyses for the Spacelab 1 mission are presented. The analyses were performed to assess the capability of the orbiter systems to support the proposed mission and to establish the various nonpropulsive consumables requirements. The EPS analysis was performed using the shuttle electrical power system (SEPS) analysis computer program. The ECLSS analysis was performed using the shuttle environmental consumables requirements evaluation tool (SECRET) program.
NASA Technical Reports Server (NTRS)
Farrell, C. E.; Krauze, L. D.
1983-01-01
NASA's IDEAS computer program is a tool for interactive preliminary design and analysis of Large Space Systems (LSS). Nine analysis modules were either modified or created. These modules include the capabilities of automatic model generation, model mass properties calculation, model area calculation, nonkinematic deployment modeling, rigid-body controls analysis, RF performance prediction, subsystem properties definition, and EOS science sensor selection. For each module, a section is provided that contains technical information, user instructions, and programmer documentation.
NASA Astrophysics Data System (ADS)
Foucher, Johann; Labrosse, Aurelien; Dervillé, Alexandre; Zimmermann, Yann; Bernard, Guilhem; Martinez, Sergio; Grönqvist, Hanna; Baderot, Julien; Pinzan, Florian
2017-03-01
The development and integration of new materials and structures at the nanoscale require multiple parallel characterizations in order to control mostly physico-chemical properties as a function of the application. Among these properties we can list physical properties such as size, shape, specific surface area, aspect ratio, agglomeration/aggregation state, size distribution, surface morphology/topography, structure (including crystallinity and defect structure), and solubility, and chemical properties such as structural formula/molecular structure, composition (including degree of purity, known impurities or additives), phase identity, surface chemistry (composition, charge, tension, reactive sites, physical structure, photocatalytic properties, zeta potential), and hydrophilicity/lipophilicity. Depending on the final material formulation (aerosol, powder, nanostructuration…) and the industrial application (semiconductor, cosmetics, chemistry, automotive…), a fleet of complementary characterization equipment must be used in synergy for accurate process tuning and high production yield. This synergy between equipment, so-called hybrid metrology, consists of using the strengths of each technique to reduce the global uncertainty for better and faster process control. The only way to succeed in this exercise is to use a data fusion methodology. In this paper, we will introduce the work that has been done to create the first generic hybrid metrology software platform dedicated to nanotechnology process control. The first part will be dedicated to modeling the process flow associated with a fleet of metrology tools. The second part will introduce the concept of the entity model, which describes the various parameters that have to be extracted. The entity model is fed with data analysis as a function of the application (automatic or semi-automated analysis). The final part will introduce two ways of performing data fusion on real data coming from imaging (SEM, TEM, AFM) and non-imaging (SAXS) techniques. The first approach is high-level fusion, the art of combining various populations of results from homogeneous or heterogeneous tools, taking into account the precision and repeatability of each of them to obtain a new, more accurate result. The second approach is deep-level fusion, the art of combining raw data from various tools in order to create new raw data. We will introduce a new concept of a virtual tool creator based on deep-level fusion. As a conclusion, we will discuss the implementation of hybrid metrology in the semiconductor environment for advanced process control.
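The "high-level fusion" described above, combining populations of results while taking each tool's precision and repeatability into account, can be illustrated with classical inverse-variance weighting. A minimal sketch under that assumption (the platform's actual fusion machinery is richer than this):

```python
import math

def fuse(measurements):
    """High-level fusion of results from several metrology tools by
    inverse-variance weighting, one common way to combine measurements
    while accounting for each tool's precision.
    measurements: list of (value, standard_uncertainty) pairs."""
    weights = [1.0 / (u * u) for _, u in measurements]
    fused = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    fused_u = math.sqrt(1.0 / sum(weights))   # uncertainty of the fused result
    return fused, fused_u

# Hypothetical linewidth results (nm) for the same feature from three tools:
value, u = fuse([(25.4, 0.8), (24.9, 0.5), (25.1, 0.3)])
print(f"fused CD = {value:.2f} +/- {u:.2f} nm")
```

The most precise tool dominates the fused value, but every tool tightens the combined uncertainty, which is the point of operating the fleet in synergy.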
Recent Methodology in Ginseng Analysis
Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill
2012-01-01
Reflecting the popularity of ginseng in herbal prescriptions and remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to hereafter as ginseng analysis, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis over the past half-decade, including emerging techniques and analytical trends. Ginseng analysis draws on all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112
CONDUIT: A New Multidisciplinary Integration Environment for Flight Control Development
NASA Technical Reports Server (NTRS)
Tischler, Mark B.; Colbourne, Jason D.; Morel, Mark R.; Biezad, Daniel J.; Levine, William S.; Moldoveanu, Veronica
1997-01-01
A state-of-the-art computational facility for aircraft flight control design, evaluation, and integration called CONDUIT (Control Designer's Unified Interface) has been developed. This paper describes the CONDUIT tool and case study applications to complex rotary- and fixed-wing fly-by-wire flight control problems. Control system analysis and design optimization methods are presented, including definition of design specifications and system models within CONDUIT, and the multi-objective function optimization (CONSOL-OPTCAD) used to tune the selected design parameters. Design examples are based on flight test programs for which extensive data are available for validation. CONDUIT is used to analyze baseline control laws against pertinent military handling qualities and control system specifications. In both case studies, CONDUIT successfully exploits trade-offs between forward loop and feedback dynamics to significantly improve the expected handling qualities and minimize the required actuator authority. The CONDUIT system provides a new environment for integrated control system analysis and design, and has potential for significantly reducing the time and cost of control system flight test optimization.
GenomicTools: a computational platform for developing high-throughput analytics in genomics.
Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo
2012-01-15
Recent advances in sequencing technology have resulted in the dramatic increase of sequencing data, which, in turn, requires efficient management of computational resources, such as computing time, memory requirements as well as prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
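As a flavor of the mathematical operations between sets of genomic regions that such a platform implements, here is a minimal Python sketch of one of them, intersection of two sorted interval lists. This illustrates the operation itself, not the GenomicTools API or its C++ internals.

```python
def intersect(a, b):
    """Intersect two sorted lists of half-open genomic intervals (start, end)
    on the same chromosome, in a single linear sweep."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        lo = max(a[i][0], b[j][0])
        hi = min(a[i][1], b[j][1])
        if lo < hi:
            out.append((lo, hi))   # non-empty overlap
        # Advance whichever interval ends first.
        if a[i][1] < b[j][1]:
            i += 1
        else:
            j += 1
    return out

peaks = [(100, 200), (500, 800)]   # e.g., ChIP-seq peaks
genes = [(150, 600), (700, 900)]   # e.g., annotated gene bodies
print(intersect(peaks, genes))     # [(150, 200), (500, 600), (700, 800)]
```

Because the sweep touches each interval once, memory stays proportional to the output, in the spirit of the platform's emphasis on minimizing memory requirements for large datasets.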
DOE Office of Scientific and Technical Information (OSTI.GOV)
W. Davis; P. Roney; T. Carroll
The MDSplus data acquisition system has been used successfully since the 1999 startup of NSTX [National Spherical Torus Experiment] for control, data acquisition, and analysis for diagnostic subsystems. For each plasma "shot" on NSTX, about 75 MB of data are acquired and loaded into MDSplus hierarchical data structures in 2-3 minutes. Physicists adapted to the MDSplus software tools with no real difficulty. Some locally developed tools are described. The support from the developers at MIT [Massachusetts Institute of Technology] was timely and insightful. The use of MDSplus has resulted in a significant cost savings for NSTX.
An analysis of the crossover between local and massive separation on airfoils
NASA Technical Reports Server (NTRS)
Barnett, M.; Carter, J. E.
1987-01-01
Massive separation on airfoils operating at high Reynolds number is an important problem to the aerodynamicist, since its onset generally determines the limiting performance of an airfoil, and it can lead to serious problems related to aircraft control as well as turbomachinery operation. The phenomenon of crossover between local separation and massive separation on realistic airfoil geometries, induced by airfoil thickness, is investigated for low speed (incompressible) flow. The problem is studied both for the asymptotic limit of infinite Reynolds number using triple-deck theory, and for finite Reynolds number using interacting boundary-layer theory. Numerical results are presented which follow the evolution of the flow as it develops from a mildly separated state to one dominated by the massively separated flow structure as the thickness of the airfoil geometry is systematically increased. The effect of turbulence upon the evolution of the flow is considered, and its impact is significant, the principal effect being the suppression of the onset of separation. Finally, the effect of surface suction and injection for boundary-layer control is considered. The approach developed provides a valuable tool for the analysis of boundary-layer separation up to and beyond stall. Another important conclusion is that interacting boundary-layer theory provides an efficient tool for the analysis of the effect of turbulence and boundary-layer control upon separated viscous flow.
[Video-based self-control in surgical teaching. A new tool in a new concept].
Dahmen, U; Sänger, C; Wurst, C; Arlt, J; Wei, W; Dondorf, F; Richter, B; Settmacher, U; Dirsch, O
2013-10-01
Image- and video-based result and process control are essential tools of a new teaching concept for conveying surgical skills. The new teaching concept integrates approved teaching principles and new media. Every performance of an exercise is videotaped and the result photographically recorded, so that the quality of both process and result becomes accessible to analysis by the teacher and the learner. The learners are instructed to perform a criteria-based self-analysis of their own video and image material. The new learning concept has so far been successfully applied in seven rounds of the newly designed modular course "Intensivkurs Chirurgische Techniken" (intensive training in surgical techniques). Result documentation and analysis via digital pictures was completed by almost every student, and the quality of the results was high. Interestingly, result quality did not correlate with the time needed for the exercise. The training success had a lasting effect. The new, admittedly elaborate, concept improves the quality of teaching. In the long run, resources for patient care should be saved by training students according to this concept before they perform tasks in the operating theater; these resources should be allocated to further refining innovative teaching concepts.
NASA Technical Reports Server (NTRS)
Funk, Christie J.; Perry, Boyd, III; Silva, Walter A.; Newman, Brett
2014-01-01
A software program and associated methodology to study gust loading on aircraft exists for a class of geometrically simplified flexible configurations. This program consists of a simple aircraft response model with two rigid and three flexible symmetric degrees-of-freedom and allows for the calculation of various airplane responses due to a discrete one-minus-cosine gust as well as continuous turbulence. Simplifications, assumptions, and opportunities for potential improvements pertaining to the existing software program are first identified; then a revised version of the original software tool is developed with improved methodology to include more complex geometries, additional excitation cases, and additional output data, so as to provide a more useful and precise tool for gust load analysis. To improve the usefulness of the original software program, a wing control surface and a horizontal tail control surface are added, an extended application of the discrete one-minus-cosine gust input is employed, a supplemental continuous turbulence spectrum is implemented, and a capability to animate the total vehicle deformation response to gust inputs is included. These revisions and enhancements are implemented, and an analysis of the results is used to validate the modifications.
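For reference, the discrete one-minus-cosine gust mentioned above has a simple closed form; the sketch below generates the classical profile (exact symbols and scaling conventions vary between gust-load documents, so treat the parameter names as illustrative):

```python
import math

def one_minus_cosine_gust(u_ds, gust_length, s):
    """Discrete '1-cosine' gust velocity at penetration distance s.
    u_ds is the peak (design) gust velocity; gust_length is the distance
    over which the gust builds up and dies away."""
    if 0.0 <= s <= gust_length:
        return 0.5 * u_ds * (1.0 - math.cos(2.0 * math.pi * s / gust_length))
    return 0.0

# Sample a 120 m gust with a 17 m/s peak at a few penetration distances:
for s in (0.0, 30.0, 60.0, 90.0, 120.0):
    print(f"s = {s:5.1f} m   w_g = {one_minus_cosine_gust(17.0, 120.0, s):5.2f} m/s")
```

The velocity rises smoothly from zero to the peak at mid-length and back to zero, which is what makes this input useful for exciting the rigid and flexible response modes without the discontinuities of a sharp-edged gust.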
Fully automated analysis of multi-resolution four-channel micro-array genotyping data
NASA Astrophysics Data System (ADS)
Abbaspour, Mohsen; Abugharbieh, Rafeef; Podder, Mohua; Tebbutt, Scott J.
2006-03-01
We present a fully-automated and robust microarray image analysis system for handling multi-resolution images (down to 3-micron resolution, with sizes up to 80 MB per channel). The system is developed to provide rapid and accurate data extraction for our recently developed microarray analysis and quality control tool (SNP Chart). Currently available commercial microarray image analysis applications are inefficient, due to the considerable user interaction typically required. Four-channel DNA microarray technology is a robust and accurate tool for determining the genotypes of multiple genetic markers in individuals. It plays an important role in the state-of-the-art trend in which traditional medical treatments are to be replaced by personalized genetic medicine, i.e. individualized therapy based on the patient's genetic heritage. However, fast, robust, and precise image processing tools are required for the prospective practical use of microarray-based genetic testing for predicting disease susceptibilities and drug effects in clinical practice, which requires a turn-around timeline compatible with clinical decision-making. In this paper we have developed a fully-automated image analysis platform for the rapid investigation of hundreds of genetic variations across multiple genes. Validation tests indicate very high accuracy levels for genotyping results. Our method achieves a significant reduction in analysis time, from several hours to just a few minutes, and is completely automated, requiring no manual interaction or guidance.
NASA Astrophysics Data System (ADS)
Mosier, Gary E.; Femiano, Michael; Ha, Kong; Bely, Pierre Y.; Burg, Richard; Redding, David C.; Kissil, Andrew; Rakoczy, John; Craig, Larry
1998-08-01
All current concepts for the NGST are innovative designs which present unique systems-level challenges. The goals are to outperform existing observatories at a fraction of the current price/performance ratio. Standard practices for developing systems error budgets, such as the 'root-sum-of-squares' error tree, are insufficient for designs of this complexity. Simulation and optimization are the tools needed for this project; in particular, tools that integrate controls, optics, thermal and structural analysis, and design optimization. This paper describes such an environment, which allows sub-system performance specifications to be analyzed parametrically and includes optimizing metrics that capture the science requirements. The resulting systems-level design trades are greatly facilitated, and significant cost savings can be realized. This modeling environment, built around a tightly integrated combination of commercial off-the-shelf and in-house-developed codes, provides the foundation for linear and non-linear analysis in both the time and frequency domains, statistical analysis, and design optimization. It features an interactive user interface and integrated graphics that allow highly effective, real-time work to be done by multidisciplinary design teams. For the NGST, it has been applied to issues such as pointing control, dynamic isolation of spacecraft disturbances, wavefront sensing and control, on-orbit thermal stability of the optics, and development of systems-level error budgets. In this paper, results are presented from parametric trade studies that assess requirements for pointing control, structural dynamics, reaction wheel dynamic disturbances, and vibration isolation. These studies attempt to define requirements bounds such that the resulting design is optimized at the systems level, without attempting to optimize each subsystem individually. The performance metrics are defined in terms of image quality, specifically centroiding error and RMS wavefront error, which link directly to the science requirements.
NASA Technical Reports Server (NTRS)
Csank, Jeffrey T.; Zinnecker, Alicia M.
2014-01-01
The aircraft engine design process seeks to achieve the best overall system-level performance, weight, and cost for a given engine design. This is achieved by a complex process known as systems analysis, where steady-state simulations are used to identify trade-offs that should be balanced to optimize the system. The steady-state simulations and data on which systems analysis relies may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic Systems Analysis provides the capability for assessing these trade-offs at an earlier stage of the engine design process. The concept of dynamic systems analysis and the type of information available from this analysis are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed. This tool aids a user in the design of a power management controller to regulate thrust, and a transient limiter to protect the engine model from surge at a single flight condition (defined by an altitude and Mach number). Results from simulation of the closed-loop system may be used to estimate the dynamic performance of the model. This enables evaluation of the trade-off between performance and operability, or safety, in the engine, which could not be done with steady-state data alone. A design study is presented to compare the dynamic performance of two different engine models integrated with the TTECTrA software.
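A minimal sketch of the closed-loop structure described above: a PI power-management loop (regulating fan speed as a thrust surrogate) plus a transient limiter that bounds the rate of change of the fuel-flow command. The gains, limits, and variable names are hypothetical, not TTECTrA internals.

```python
def step_controller(n_ref, n_meas, integ, wf_prev, dt,
                    kp=0.002, ki=0.02, wf_rate_max=0.05):
    """One control-loop step; returns (fuel-flow command, integrator state)."""
    err = n_ref - n_meas
    integ += ki * err * dt
    wf_cmd = kp * err + integ          # PI power-management command

    # Transient limiter: cap how fast the fuel flow may change per step.
    max_step = wf_rate_max * dt
    wf_cmd = min(max(wf_cmd, wf_prev - max_step), wf_prev + max_step)
    return wf_cmd, integ

# One step during an acceleration demand (illustrative numbers):
wf, integ = step_controller(n_ref=9500.0, n_meas=9200.0,
                            integ=1.0, wf_prev=1.0, dt=0.015)
print(wf, integ)   # command is rate-limited toward the PI demand
```

Simulating such a loop through a throttle transient is what exposes the performance/operability trade-off: a tighter limit protects surge margin but slows the thrust response, which is exactly the kind of information steady-state data alone cannot provide.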
Relating MBSE to Spacecraft Development: A NASA Pathfinder
NASA Technical Reports Server (NTRS)
Othon, Bill
2016-01-01
The NASA Engineering and Safety Center (NESC) has sponsored a Pathfinder Study to investigate how Model Based Systems Engineering (MBSE) and Model Based Engineering (MBE) techniques can be applied by NASA spacecraft development projects. The objectives of this Pathfinder Study included analyzing both the products of the modeling activity, as well as the process and tool chain through which the spacecraft design activities are executed. Several aspects of MBSE methodology and process were explored. Adoption and consistent use of the MBSE methodology within an existing development environment can be difficult. The Pathfinder Team evaluated the possibility that an "MBSE Template" could be developed as both a teaching tool as well as a baseline from which future NASA projects could leverage. Elements of this template include spacecraft system component libraries, data dictionaries and ontology specifications, as well as software services that do work on the models themselves. The Pathfinder Study also evaluated the tool chain aspects of development. Two chains were considered: 1. The Development tool chain, through which SysML model development was performed and controlled, and 2. The Analysis tool chain, through which both static and dynamic system analysis is performed. Of particular interest was the ability to exchange data between SysML and other engineering tools such as CAD and Dynamic Simulation tools. For this study, the team selected a Mars Lander vehicle as the element to be designed. The paper will discuss what system models were developed, how data was captured and exchanged, and what analyses were conducted.
Traumatic Brain Injury Detection Using Electrophysiological Methods
Rapp, Paul E.; Keyser, David O.; Albano, Alfonso; Hernandez, Rene; Gibson, Douglas B.; Zambon, Robert A.; Hairston, W. David; Hughes, John D.; Krystal, Andrew; Nichols, Andrew S.
2015-01-01
Measuring neuronal activity with electrophysiological methods may be useful in detecting neurological dysfunctions, such as mild traumatic brain injury (mTBI). This approach may be particularly valuable for rapid detection in at-risk populations including military service members and athletes. Electrophysiological methods, such as quantitative electroencephalography (qEEG) and recording event-related potentials (ERPs) may be promising; however, the field is nascent and significant controversy exists on the efficacy and accuracy of the approaches as diagnostic tools. For example, the specific measures derived from an electroencephalogram (EEG) that are most suitable as markers of dysfunction have not been clearly established. A study was conducted to summarize and evaluate the statistical rigor of evidence on the overall utility of qEEG as an mTBI detection tool. The analysis evaluated qEEG measures/parameters that may be most suitable as fieldable diagnostic tools, identified other types of EEG measures and analysis methods of promise, recommended specific measures and analysis methods for further development as mTBI detection tools, identified research gaps in the field, and recommended future research and development thrust areas. The qEEG study group formed the following conclusions: (1) Individual qEEG measures provide limited diagnostic utility for mTBI. However, many measures can be important features of qEEG discriminant functions, which do show significant promise as mTBI detection tools. (2) ERPs offer utility in mTBI detection. In fact, evidence indicates that ERPs can identify abnormalities in cases where EEGs alone are non-disclosing. (3) The standard mathematical procedures used in the characterization of mTBI EEGs should be expanded to incorporate newer methods of analysis including non-linear dynamical analysis, complexity measures, analysis of causal interactions, graph theory, and information dynamics. (4) Reports of high specificity in qEEG evaluations of TBI must be interpreted with care. High specificities have been reported in carefully constructed clinical studies in which healthy controls were compared against a carefully selected TBI population. The published literature indicates, however, that similar abnormalities in qEEG measures are observed in other neuropsychiatric disorders. While it may be possible to distinguish a clinical patient from a healthy control participant with this technology, these measures are unlikely to discriminate between, for example, major depressive disorder, bipolar disorder, or TBI. The specificities observed in these clinical studies may well be lost in real world clinical practice. (5) The absence of specificity does not preclude clinical utility. The possibility of use as a longitudinal measure of treatment response remains. However, efficacy as a longitudinal clinical measure does require acceptable test–retest reliability. To date, very few test–retest reliability studies have been published with qEEG data obtained from TBI patients or from healthy controls. This is a particular concern because high variability is a known characteristic of the injured central nervous system. PMID:25698950
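Conclusion (1), that individual measures are weak but discriminant functions built from many measures show promise, maps onto a standard classification pattern. A minimal sketch with simulated stand-in features follows (not clinical data; the feature names are merely plausible qEEG quantities):

```python
# Sketch of a qEEG "discriminant function": a single classifier built from
# several per-subject EEG-derived features. All numbers are simulated.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 80  # subjects per group

# Columns: relative theta power, alpha/beta ratio, interhemispheric coherence.
controls = rng.normal(loc=[0.18, 1.00, 0.55], scale=0.05, size=(n, 3))
mtbi = rng.normal(loc=[0.24, 1.30, 0.45], scale=0.05, size=(n, 3))

X = np.vstack([controls, mtbi])
y = np.array([0] * n + [1] * n)  # 0 = control, 1 = mTBI

# 5-fold cross-validated accuracy of the discriminant function.
scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

The review's caution about specificity applies directly: a classifier like this can separate two carefully selected groups, yet say nothing about whether the same features would also flag depression or other neuropsychiatric disorders.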
Van der Stuyft, Patrick; Toledo, María Eugenia; Ceballos, Enrique; Fabré, Francisco; Lefèvre, Pierre
2018-01-01
Background: Within the context of a field trial conducted by the Cuban vector control program (AaCP), we assessed community acceptability of insecticide-treated curtains (ITCs) and residual insecticide treatment (RIT) with deltamethrin. We also assessed the potential influence of interviewees' risk perceptions for getting dengue and for disease severity. Methodology/principal findings: We embedded a qualitative study using in-depth interviews in a cluster randomized trial (CRT) testing the effectiveness of ITCs and RIT in Santiago de Cuba. In-depth interviews (N = 38) were conducted four and twelve months after deployment of the tools with people who accepted the tools, people who stopped using them, and people who did not accept the tools. Data analysis was deductive. The main reasons for accepting ITCs at the start of the trial were perceived efficacy and perceived harmlessness to health. Constraints linked to the manufacturer's instructions were the main reason for not using ITCs. People stopped using the ITCs due to perceived allergy, toxicity and low efficacy. Few heads of households refused RIT, despite noting reasons for rejection such as allergy, health hazard and toxicity. Positive opinions of the vector control program influenced acceptability of both tools. However, frequent insecticide fogging as part of routine AaCP vector control actions diminished the perceived efficacy of both tools and, therefore, their acceptability. Fifty percent of interviewees did feel at risk for getting dengue and considered dengue a severe disease; however, this did not appear to influence acceptability of ITCs or RIT. Conclusion/significance: Acceptability of ITCs and RIT was linked to acceptability of routine AaCP vector control activities. However, uptake and use were not always an indication of acceptability. Factors leading to acceptability may be best identified using qualitative methods, but more research is needed on the concept of acceptability and its measurement. PMID:29293501
Kambas, Antonis; Venetsanou, Fotini
2014-07-01
The aim of this study was (a) to develop an assessment tool (the Democritos Movement Screening Tool for Preschool Children - DEMOST-PRE), designed to provide preschool educators, clinicians and researchers with information about assessment and screening of the motor proficiency of children aged 4-6 years, as well as the development and control of movement programmes, and (b) to assess its factorial validity. First, the tool's content and face validity were established and its final structure was determined. Then, the DEMOST-PRE was administered to 435 children (197 girls) aged 48-71 months (M=60.48 months, SD=6.98). The factor analysis conducted revealed two distinct components. The present evidence, combined with the DEMOST-PRE's administrative traits, makes it a promising tool for the assessment of preschool-aged children. Copyright © 2014 Elsevier Ltd. All rights reserved.
Terrestrial Planet Finder Coronagraph and Enabling Technologies
NASA Technical Reports Server (NTRS)
Ford, Virginia G.
2005-01-01
Starlight suppression research is advancing rapidly toward the required contrast ratio. The current analysis of the TPF Coronagraph system indicates that it is feasible to achieve the required stability by using developing technologies: a) wavefront sensing and control (deformable mirrors, control algorithms, and sensing); b) laser metrology. Still needed are: a) property data measured with great precision in the required environments; b) modeling tools that are verified with testbeds.
Development of Analysis Tools for Certification of Flight Control Laws
2009-03-31
Savini, Lara; Tora, Susanna; Di Lorenzo, Alessio; Cioci, Daniela; Monaco, Federica; Polci, Andrea; Orsini, Massimiliano; Calistri, Paolo; Conte, Annamaria
2018-01-01
In recent decades an increasing number of West Nile Disease cases has been observed in equines and humans in the Mediterranean basin, and surveillance systems have been set up in numerous countries to manage and control the disease. The collection, storage and distribution of information on the spread of the disease become important for a shared intervention and control strategy. To this end, a Web Geographic Information System has been developed, making available disease data, climatic and environmental remotely sensed data, and full genome sequences of selected isolated strains. This paper describes the Disease Monitoring Dashboard (DMD) web system application, the tools available for preliminary analysis of climatic and environmental factors, and the other interactive tools for epidemiological analysis. WNV occurrence data are collected from multiple official and unofficial sources. Whole genome sequences and metadata of WNV strains are retrieved from public databases or generated in the framework of the Italian surveillance activities. Climatic and environmental data are provided by the NASA website. The Geographical Information System is composed of an Oracle 10g database and ESRI ArcGIS Server 10.03; the web mapping client application is developed with the ArcGIS API for JavaScript and the Phylocanvas library to facilitate and optimize the mash-up approach. ESRI ArcSDE 10.1 has been used to store spatial data. The DMD application is accessible through a generic web browser at https://netmed.izs.it/networkMediterraneo/. The system collects data through online forms and automated procedures and visualizes data as interactive graphs, maps and tables. The spatial and temporal dynamic visualization of disease events is managed by a time slider that returns results on both the map and the epidemiological curve. Climatic and environmental data can be associated with cases through Python procedures and downloaded as Excel files. The system compiles multiple datasets through user-friendly web tools; it integrates entomological, veterinary and human surveillance, molecular information on pathogens, and environmental and climatic data. The principal result of the DMD development is the transfer and dissemination of knowledge and technologies to develop strategies for integrated prevention and control measures of animal and human diseases.
Self-Directed Learning: A Tool for Lifelong Learning
ERIC Educational Resources Information Center
Boyer, Stefanie L.; Edmondson, Diane R.; Artis, Andrew B.; Fleming, David
2014-01-01
A meta-analytic review of self-directed learning (SDL) research over 30 years, five countries, and across multiple academic disciplines is used to explore its relationships with five key nomologically related constructs for effective workplace learning. The meta-analysis revealed positive relationships between SDL and internal locus of control,…
Gene action analysis by inheritance and QTL mapping of resistance to root-knot nematodes in cotton.
USDA-ARS?s Scientific Manuscript database
Host-plant resistance is highly effective in controlling crop loss from nematode infection. In addition, molecular markers can be powerful tools for marker-assisted selection (MAS), where they reduce laborious greenhouse phenotype evaluation to identify root-knot nematode (RKN) Meloidogyne incognita...
Dogs have been studied for many years as a medical diagnostic tool to detect a pre-clinical disease state by sniffing emissions directly from a human or an in vitro biological sample. Some of the studies report high sensitivity and specificity in blinded case-control studies. How...
MATREX: A Unifying Modeling and Simulation Architecture for Live-Virtual-Constructive Applications
2007-05-23
Deployment Systems Acquisition Operations & Support B C Sustainment FRP Decision Review FOC LRIP/IOT& ECritical Design Review Pre-Systems...CMS2 – Comprehensive Munitions & Sensor Server • CSAT – C4ISR Static Analysis Tool • C4ISR – Command & Control, Communications, Computers
Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona
2012-01-01
Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory authorities, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, the critical quality attributes, sigma process capability, and stability of the process were forecast. The overall study contributes an assessment of the process at the sigma level with respect to the out-of-specification attributes produced. Finally, the study points to areas where the application of quality improvement and quality risk assessment principles can yield six sigma-capable processes. Statistical process control is the most advantageous tool for determining the quality of any production process, and it is new to the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment enables selection of critical quality attributes from among the quality control parameters. Sequential application of normality tests, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process capability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable processes, which are candidates for improvement. Statistical process control thus proves to be an important tool for developing six sigma-capable processes and for continuous quality improvement.
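The capability portion of such a study reduces to a pair of standard indices. A minimal sketch for one tablet critical quality attribute, with illustrative numbers and specification limits (the usual caveat applies: the indices are meaningful only after control charts have shown the process to be stable and approximately normal):

```python
import statistics

def capability(samples, lsl, usl):
    """Process capability indices for one critical quality attribute."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)               # potential capability (spread only)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # actual capability, penalizing off-center
    return cp, cpk, 3 * cpk                      # 3*Cpk = short-term sigma level

# Illustrative tablet weights (mg) against 95-105 mg specification limits:
weights = [99.8, 100.4, 100.1, 99.5, 100.9, 100.2, 99.7, 100.3]
cp, cpk, sigma_level = capability(weights, lsl=95.0, usl=105.0)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, sigma level = {sigma_level:.1f}")
```

A six sigma-capable process in these terms corresponds to Cpk of about 2, which is why the comparative Pareto evaluation singles out the attributes with the lowest Cpk for improvement.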
The Environmental Control and Life Support System (ECLSS) advanced automation project
NASA Technical Reports Server (NTRS)
Dewberry, Brandon S.; Carnes, Ray
1990-01-01
The objective of the environmental control and life support system (ECLSS) Advanced Automation Project is to influence the design of the initial and evolutionary Space Station Freedom Program (SSFP) ECLSS toward a man-made closed environment in which minimal flight and ground manpower is needed. Another objective is capturing ECLSS design and development knowledge for future missions. Our approach has been to (1) analyze the SSFP ECLSS, (2) envision as our goal a fully automated evolutionary environmental control system, an augmentation of the baseline, and (3) document the advanced software systems, hooks, and scars which will be necessary to achieve this goal. From this analysis, prototype software is being developed, and will be tested using air and water recovery simulations and hardware subsystems. In addition, the advanced software is being designed, developed, and tested using an automation software management plan and lifecycle tools. Automated knowledge acquisition, engineering, verification, and testing tools are being used to develop the software. In this way, we can capture ECLSS development knowledge for future use, develop more robust and complex software, provide feedback to the knowledge-based system tool community, and ensure proper visibility of our efforts.
Skoulikidis, N Th; Amaxidis, Y; Bertahas, I; Laschou, S; Gritzalis, K
2006-06-01
Twenty-nine small and mid-sized permanent rivers (thirty-six sites) scattered throughout Greece and equally distributed among three geo-chemical-climatic zones have been investigated on a seasonal basis. Hydrochemical types have been determined and spatio-temporal variations have been interpreted in relation to environmental characteristics and anthropogenic pressures. Multivariate statistical techniques have been used to identify the factors and processes affecting hydrochemical variability and the driving forces that control aquatic composition. It has been shown that the spatial variation of aquatic quality is mainly governed by geological and hydrogeological factors. Owing to geological and climatic variability, the three zones have different hydrochemical characteristics. Temporal hydrological variations in combination with hydrogeological factors control seasonal hydrochemical trends. Respiration processes driven by municipal wastewaters dominate in summer and enhance nutrient, chloride, and sodium concentrations, while nitrate originates primarily from agriculture. Photosynthetic processes dominate in spring. Carbonate chemistry is controlled by hydrogeological factors and biological activity. A possible enrichment of surface waters with nutrients in "pristine" forested catchments is attributed to soil leaching and mineralisation processes. Two management tools have been developed: a nutrient classification system and a tool for rapid prediction of aquatic composition.
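A minimal sketch of one common multivariate technique behind such analyses, principal component analysis on standardized hydrochemical variables; the site-by-variable matrix below is simulated, not the Greek river dataset.

```python
# Hedged sketch of a common multivariate step behind such analyses:
# principal component analysis on standardized hydrochemical variables.
# The site-by-variable matrix is simulated, not the Greek river data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
X = rng.normal(size=(36, 8))   # 36 sites x 8 ions/nutrients (assumed)

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print("PC scores shape:", scores.shape)
print("variance explained:", pca.explained_variance_ratio_)
```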
Tool use and mechanical problem solving in apraxia.
Goldenberg, G; Hagmann, S
1998-07-01
Moorlaas (1928) proposed that apraxic patients can identify objects and can remember the purpose for which they have been made, but do not know the way in which they must be used to achieve that purpose. Knowledge about the use of objects and tools can have two sources: it can be based on retrieval of instructions of use from semantic memory or on a direct inference of function from structure. The ability to infer function from structure enables subjects to use unfamiliar tools and to detect alternative uses of familiar tools. It is the basis of mechanical problem solving. The purpose of the present study was to analyze retrieval of instructions of use, mechanical problem solving, and actual tool use in patients with apraxia due to circumscribed lesions of the left hemisphere. For assessing mechanical problem solving we developed a test of the selection and application of novel tools. Access to instructions of use was tested by pantomime of tool use. Actual tool use was examined for the same familiar tools. Forty-two patients with left brain damage (LBD) and aphasia, 22 patients with right brain damage (RBD), and 22 controls were examined. Only LBD patients differed from controls on all tests. RBD patients had difficulties with the use, but not with the selection, of novel tools. In LBD patients there was a significant correlation between pantomime of tool use and novel tool selection, but there were individual cases who scored in the defective range on one of these tests and normally on the other. Analysis of the LBD patients' lesions suggested that frontal lobe damage does not disturb novel tool selection. Only LBD patients who failed on both pantomime of tool use and novel tool selection committed errors in the actual use of familiar tools. The finding that mechanical problem solving is invariably defective in apraxic patients who commit errors with familiar tools is in good accord with clinical observations, as the gravity of their errors goes beyond what one would expect as a mere sequel of loss of access to instructions of use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ozana, Stepan, E-mail: stepan.ozana@vsb.cz; Pies, Martin, E-mail: martin.pies@vsb.cz; Docekal, Tomas, E-mail: docekalt@email.cz
REX Control System is a professional advanced tool for the design and implementation of complex control systems that belongs to the softPLC category. It covers the entire process, starting from simulation of the application's functionality before deployment, through implementation on a real-time target, to analysis, diagnostics, and visualization. It basically consists of two parts: the development tools and the runtime system. It is also compatible with the Simulink environment, and the way control algorithms are implemented is very similar. The control scheme is finally compiled (using the RexDraw utility) and uploaded to a chosen real-time target (using the RexView utility). A wide variety of hardware platforms and real-time operating systems is supported by REX Control System, for example Windows Embedded and Linux or Linux/Xenomai deployed on SBC, IPC, and PAC platforms, the Raspberry Pi, and others, with many I/O interfaces. It is a modern system designed for both measurement and control applications, offering many additional functions for data archiving, HTML5-based visualization, and communication standards. The paper sums up the possibilities of its use in the educational process, focused on the control of case-study physical models with classical and advanced control algorithms.
NASA Technical Reports Server (NTRS)
Davidson, John B.; Murphy, Patrick C.; Lallman, Frederick J.; Hoffler, Keith D.; Bacon, Barton J.
1998-01-01
This report contains a description of a lateral-directional control law designed for the NASA High-Alpha Research Vehicle (HARV). The HARV is an F/A-18 aircraft modified to include a research flight computer, a spin chute, and thrust vectoring in the pitch and yaw axes. Two separate design tools, CRAFT and Pseudo Controls, were integrated to synthesize the lateral-directional control law. This report contains a description of the lateral-directional control law, analyses, and nonlinear simulation (batch and piloted) results. Linear analysis results include closed-loop eigenvalues, stability margins, robustness to changes in various plant parameters, and servo-elastic frequency responses. Step time responses from nonlinear batch simulation are presented and compared to design guidelines. Piloted simulation task scenarios, task guidelines, and pilot subjective ratings for the various maneuvers are discussed. Linear analysis shows that the control law meets the stability margin guidelines and is robust to stability and control parameter changes. Nonlinear batch simulation analysis shows the control law exhibits good performance and meets most of the design guidelines over the entire angle-of-attack range. This control law (designated NASA-1A) was flight tested during the summer of 1994 at NASA Dryden Flight Research Center.
Advanced techniques in reliability model representation and solution
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.; Nicol, David M.
1992-01-01
The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.
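To make the modeling style concrete, here is a minimal hedged sketch of a transient Markov unreliability computation in the spirit of SURE-style analysis; the three-state model and its rates are invented placeholders, not the flight control system analyzed in the report.

```python
# Hedged sketch: transient unreliability of a tiny Markov model, in the
# spirit of the SURE/ASSIST-style analyses described above. The 3-state
# model (2 processors up -> 1 up -> system failed) and its rates are
# illustrative assumptions only.
import numpy as np
from scipy.linalg import expm

lam = 1e-4   # per-hour failure rate of one processor (assumed)
mu = 1e-2    # per-hour recovery/reconfiguration rate (assumed)

# Generator matrix Q for states [2-up, 1-up, failed]; rows sum to zero
Q = np.array([
    [-2 * lam,      2 * lam,  0.0],
    [      mu, -(mu + lam),   lam],
    [     0.0,          0.0,  0.0],   # failure state is absorbing
])

p0 = np.array([1.0, 0.0, 0.0])        # start with both processors up
for t in (1.0, 10.0, 100.0):          # mission times in hours
    pt = p0 @ expm(Q * t)             # p(t) = p0 * exp(Q t)
    print(f"t={t:6.1f} h  P(system failed) = {pt[2]:.3e}")
```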
NASA Astrophysics Data System (ADS)
Ghani, Jaharah A.; Mohd Rodzi, Mohd Nor Azmi; Zaki Nuawi, Mohd; Othman, Kamal; Rahman, Mohd. Nizam Ab.; Haron, Che Hassan Che; Deros, Baba Md
2011-01-01
Machining is one of the most important manufacturing processes in modern industry, especially for finishing automotive components after primary manufacturing processes such as casting and forging. In this study, the turning parameters of cutting environment (without air, normal air, and chilled air), cutting speed, and feed rate are evaluated using the Taguchi optimization methodology. An L27 (3^13) orthogonal array, signal-to-noise (S/N) ratio, and analysis of variance (ANOVA) are employed to analyze the effect of these turning parameters on the performance of a coated carbide tool. The results show that tool life is affected by the cutting speed, feed rate, and cutting environment, with contributions of 38%, 32%, and 27%, respectively. For surface roughness, the feed rate significantly controls the machined surface produced, with a 77% contribution, followed by the cutting environment at 19%. The cutting speed is found to be insignificant in controlling the machined surface produced. The study shows that the cutting environment factor should be considered in order to achieve longer tool life as well as a good machined surface.
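For reference, the two standard Taguchi S/N formulas used for responses like these are sketched below; the replicate values are invented for illustration, and only the larger-the-better and smaller-the-better formulas come from standard Taguchi practice.

```python
# Hedged sketch: Taguchi signal-to-noise ratios for the two responses
# discussed above. Replicate values are invented; only the S/N formulas
# are standard.
import numpy as np

def sn_larger_the_better(y):
    # For tool life (maximize): S/N = -10 log10(mean(1/y^2))
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(1.0 / y**2))

def sn_smaller_the_better(y):
    # For surface roughness (minimize): S/N = -10 log10(mean(y^2))
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(y**2))

tool_life_min = [18.2, 20.5, 19.1]      # assumed replicates, minutes
roughness_um = [0.82, 0.79, 0.85]       # assumed replicates, micrometres

print(f"S/N tool life: {sn_larger_the_better(tool_life_min):.2f} dB")
print(f"S/N roughness: {sn_smaller_the_better(roughness_um):.2f} dB")
```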
Prospects for engineering dynamic CRISPR-Cas transcriptional circuits to improve bioproduction.
Fontana, Jason; Voje, William E; Zalatan, Jesse G; Carothers, James M
2018-05-08
Dynamic control of gene expression is emerging as an important strategy for controlling flux in metabolic pathways and improving bioproduction of valuable compounds. Integrating dynamic genetic control tools with CRISPR-Cas transcriptional regulation could significantly improve our ability to fine-tune the expression of multiple endogenous and heterologous genes according to the state of the cell. In this mini-review, we combine an analysis of recent literature with examples from our own work to discuss the prospects and challenges of developing dynamically regulated CRISPR-Cas transcriptional control systems for applications in synthetic biology and metabolic engineering.
Man-rated flight software for the F-8 DFBW program
NASA Technical Reports Server (NTRS)
Bairnsfather, R. R.
1976-01-01
The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program assembly control, simulator configuration control, erasable-memory load generation, change procedures and anomaly reporting are discussed. The primary verification tools are described, as well as the program test plans and their implementation on the various simulators. Failure effects analysis and the creation of special failure generating software for testing purposes are described.
The Multiple Control of Verbal Behavior
Michael, Jack; Palmer, David C; Sundberg, Mark L
2011-01-01
Amid the novel terms and original analyses in Skinner's Verbal Behavior, the importance of his discussion of multiple control is easily missed, but multiple control of verbal responses is the rule rather than the exception. In this paper we summarize and illustrate Skinner's analysis of multiple control and introduce the terms convergent multiple control and divergent multiple control. We point out some implications for applied work and discuss examples of the role of multiple control in humor, poetry, problem solving, and recall. Joint control and conditional discrimination are discussed as special cases of multiple control. We suggest that multiple control is a useful analytic tool for interpreting virtually all complex behavior, and we consider the concepts of derived relations and naming as cases in point. PMID:22532752
Using microRNA profiling in urine samples to develop a non-invasive test for bladder cancer.
Mengual, Lourdes; Lozano, Juan José; Ingelmo-Torres, Mercedes; Gazquez, Cristina; Ribal, María José; Alcaraz, Antonio
2013-12-01
Current standard methods used to detect and monitor bladder urothelial cell carcinoma (UCC) are invasive or have low sensitivity. The incorporation into clinical practice of a non-invasive tool for UCC assessment would enormously improve patients' quality of life and outcome. This study aimed to examine the microRNA (miRNA) expression profiles in the urine of UCC patients in order to develop a non-invasive, accurate, and reliable tool to diagnose and provide information on the aggressiveness of the tumor. We performed a global miRNA expression profiling analysis of urinary cells from 40 UCC patients and controls using the TaqMan Human MicroRNA Array, followed by validation of 22 selected potentially diagnostic and prognostic miRNAs in a separate cohort of 277 samples using a miRCURY LNA qPCR system. miRNA-based signatures were developed by multivariate logistic regression analysis and internally cross-validated. In the initial cohort of patients, we identified 40 and 30 aberrantly expressed miRNAs in UCC compared with control urines and in high- compared with low-grade tumors, respectively. Quantification of 22 key miRNAs in an independent cohort resulted in the identification of a six-miRNA diagnostic signature with a sensitivity of 84.8% and specificity of 86.5% (AUC = 0.92) and a two-miRNA prognostic model with a sensitivity of 84.95% and a specificity of 74.14% (AUC = 0.83). Internal cross-validation analysis confirmed the accuracy rates of both models, reinforcing the strength of our findings. Although the data need to be externally validated, miRNA analysis in urine appears to be a valuable tool for the non-invasive assessment of UCC. Copyright © 2013 UICC.
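A minimal sketch of the signature-building step described above, multivariate logistic regression with internal cross-validation; the miRNA expression matrix and labels are simulated, so the numbers will not reproduce the study's AUCs.

```python
# Hedged sketch of the signature-building step: multivariate logistic
# regression with internal cross-validation. The expression matrix and
# labels are simulated, so the AUC will not reproduce the study's 0.92.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples, n_mirnas = 277, 22                 # cohort sizes from the abstract
X = rng.normal(size=(n_samples, n_mirnas))    # simulated miRNA levels
y = rng.integers(0, 2, size=n_samples)        # 1 = UCC, 0 = control (simulated)

model = LogisticRegression(max_iter=1000)
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```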
Visual analysis and exploration of complex corporate shareholder networks
NASA Astrophysics Data System (ADS)
Tekušová, Tatiana; Kohlhammer, Jörn
2008-01-01
The analysis of large corporate shareholder network structures is an important task in the corporate governance, financing, and financial investment domains. In a modern economy, large structures of cross-corporation, cross-border shareholder relationships exist, forming complex networks. These networks are often difficult to analyze with traditional approaches. An efficient visualization of the networks helps to reveal the interdependent shareholding formations and the controlling patterns. In this paper, we propose an effective visualization tool that supports the financial analyst in understanding complex shareholding networks. We develop an interactive visual analysis system by combining state-of-the-art visualization technologies with economic analysis methods. Our system is capable of revealing patterns in large corporate shareholder networks, allows visual identification of the ultimate shareholders, and supports visual analysis of integrated cash flow and control rights. We apply our system to an extensive real-world database of shareholder relationships, showing its usefulness for effective visual analysis.
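One common formulation behind integrated (direct plus indirect) cash-flow rights is sketched below under stated assumptions; the three-firm ownership matrix is invented, and the matrix-series formula is a standard convention rather than the paper's specific method.

```python
# Hedged sketch: integrated (direct + indirect) cash-flow rights in a
# small shareholding network, one common formulation behind this kind
# of analysis. The 3-firm ownership matrix is invented.
import numpy as np

# A[i, j] = fraction of firm j's shares held directly by firm i
A = np.array([
    [0.0, 0.6, 0.1],
    [0.0, 0.0, 0.5],
    [0.0, 0.0, 0.0],
])

# Integrated ownership: T = A + A^2 + ... = A @ inv(I - A)
T = A @ np.linalg.inv(np.eye(3) - A)
print(T)  # firm 0's integrated stake in firm 2: 0.1 + 0.6*0.5 = 0.4
```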
Multi-Disciplinary Design Optimization Using WAVE
NASA Technical Reports Server (NTRS)
Irwin, Keith
2000-01-01
The current preliminary design tools lack the product performance, quality, and cost prediction fidelity required to design Six Sigma products. They are also frequently incompatible with the tools used in detailed design, leading to a great deal of rework and lost or discarded data in the transition from preliminary to detailed design. Thus, enhanced preliminary design tools are needed in order to produce adequate financial returns to the business. To achieve this goal, GEAE has focused on building the preliminary design system around the same geometric 3D solid model that will be used in detailed design. With this approach, the preliminary designer will no longer convert a flowpath sketch into an engine cross section but will, rather, automatically create 3D solid geometry for structural integrity, life, weight, cost, complexity, producibility, and maintainability assessments. Likewise, both the preliminary design and the detailed design can benefit from the use of the same preliminary part-sizing routines. The design analysis tools will also be integrated with the 3D solid model to eliminate manual transfer of data between programs. GEAE has aggressively pursued the computerized control of engineering knowledge for many years. Through its study and validation of 3D CAD programs and processes, GEAE concluded that total system control was not feasible at that time. Prior CAD tools focused exclusively on detail part geometry, and Knowledge-Based Engineering systems concentrated on rules input and data output. A system was needed to bridge the gap between the two and capture the total system. With the introduction of WAVE Engineering from UGS, the possibility of an engineering system control device began to take shape. GEAE decided to investigate the new WAVE functionality to accomplish this task. NASA joined GEAE in funding this validation project through Task Order No. 1. With the validation project complete, a second phase under Task Order No. 2 was established to develop an associative control structure (framework) in the UG WAVE environment enabling multi-disciplinary design of turbine propulsion systems. The capabilities of WAVE were evaluated to assess its use as a rapid optimization and productivity tool. This project also identified future WAVE product enhancements that will make the tool still more beneficial for product development.
HC StratoMineR: A Web-Based Tool for the Rapid Analysis of High-Content Datasets.
Omta, Wienand A; van Heesbeen, Roy G; Pagliero, Romina J; van der Velden, Lieke M; Lelieveld, Daphne; Nellen, Mehdi; Kramer, Maik; Yeong, Marley; Saeidi, Amir M; Medema, Rene H; Spruit, Marco; Brinkkemper, Sjaak; Klumperman, Judith; Egan, David A
2016-10-01
High-content screening (HCS) can generate large multidimensional datasets and, when aligned with the appropriate data mining tools, can yield valuable insights into the mechanism of action of bioactive molecules. However, easy-to-use data mining tools are not widely available, with the result that these datasets are frequently underutilized. Here, we present HC StratoMineR, a web-based tool for high-content data analysis. It is a decision-supportive platform that guides even non-expert users through a high-content data analysis workflow. HC StratoMineR is built using MySQL for storage and querying, PHP as the main programming language, and jQuery for additional user-interface functionality. R is used for statistical calculations, logic, and data visualization. Furthermore, C++ and graphics processing unit (GPU) power are embedded in R via the Rcpp and rpud libraries for computationally intensive operations. We show that we can use HC StratoMineR for the analysis of multivariate data from a high-content siRNA knock-down screen and a small-molecule screen. It can be used to rapidly filter out undesirable data; to select relevant data; and to perform quality control, data reduction, data exploration, morphological hit picking, and data clustering. Our results demonstrate that HC StratoMineR can be used to functionally categorize HCS hits and, thus, provide valuable information for hit prioritization.
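As a hedged illustration of the kind of morphological hit picking such platforms automate, here is a robust z-score sketch; the well-by-feature matrix is simulated and the 3.5 threshold is an assumed convention, not StratoMineR's actual algorithm.

```python
# Hedged sketch: one simple hit-picking rule of the kind such platforms
# automate -- robust z-scores per feature, flagging wells far from the
# plate's typical morphology. Data are simulated; thresholds assumed.
import numpy as np

rng = np.random.default_rng(2)
features = rng.normal(size=(384, 20))         # 384 wells x 20 features
features[7] += 4.0                            # plant one obvious "hit"

median = np.median(features, axis=0)
mad = np.median(np.abs(features - median), axis=0)
z = 0.6745 * (features - median) / mad        # robust z-score per feature

hit_score = np.abs(z).max(axis=1)             # strongest deviation per well
hits = np.where(hit_score > 3.5)[0]
print("candidate hit wells:", hits)
```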
Active edge control in the precessions polishing process for manufacturing large mirror segments
NASA Astrophysics Data System (ADS)
Li, Hongyu; Zhang, Wei; Walker, David; Yu, Gouyo
2014-09-01
The segmentation of the primary mirror is the only promising solution for building the next generation of ground-based telescopes. However, manufacturing segmented mirrors presents its own challenges. The edge mis-figure impacts directly on the telescope's scientific output, and the edge effect significantly dominates the achievable polishing precision. Edge control is therefore regarded as one of the most difficult technical issues in segment production and needs to be addressed urgently. This paper reports an active edge control technique for mirror segment fabrication using the Precessions polishing technique. The strategy requires that a large spot be selected on the bulk area for fast polishing, while a small spot is used for edge figuring. This can be performed by tool lift and by optimizing the dwell time to compensate for non-uniform material removal at the edge zone, which requires accurate and stable edge tool influence functions. To obtain the full tool influence function at the edge, we demonstrated in previous work a novel hybrid-measurement method that uses both simultaneous phase interferometry and profilometry. In this paper, the edge effect under bonnet tool polishing is investigated. The pressure distribution is analyzed by means of finite element analysis (FEA), and the shape of the edge tool influence functions is predicted according to the Preston equation. With this information, the multiple process parameters at the edge zone are optimized. This is demonstrated on a 200 mm cross-corners hexagonal part, with a result of PV less than 200 nm over the entire surface.
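A minimal sketch of the Preston-equation removal estimate invoked above; the Preston coefficient, pressure profile, and spindle speed are placeholder assumptions rather than the FEA-derived values in the paper.

```python
# Hedged sketch of the Preston-equation removal estimate invoked above:
# dz/dt = k * P(r) * V(r). The Preston coefficient, pressure profile and
# spindle speed are placeholder assumptions, not the paper's FEA results.
import numpy as np

k = 5e-9                                  # assumed Preston coefficient
r = np.linspace(0.0, 10.0, 201)           # radial position in the spot, mm
P = 0.2 * np.exp(-(r / 5.0) ** 2)         # assumed bell-shaped pressure, MPa
V = 2.0 * np.pi * (1000.0 / 60.0) * r     # surface speed at 1000 rpm, mm/s

dwell = 5.0                               # dwell time, s
removal = k * P * V * dwell               # removal depth profile (relative)
print(f"peak relative removal at r = {r[np.argmax(removal)]:.1f} mm")
```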
Sonification Prototype for Space Physics
NASA Astrophysics Data System (ADS)
Candey, R. M.; Schertenleib, A. M.; Diaz Merced, W. L.
2005-12-01
As an alternative and adjunct to visual displays, auditory exploration of data via sonification (data controlled sound) and audification (audible playback of data samples) is promising for complex or rapidly/temporally changing visualizations, for data exploration of large datasets (particularly multi-dimensional datasets), and for exploring datasets in frequency rather than spatial dimensions (see also International Conferences on Auditory Display
Hur, Pilwon; Shorter, K Alex; Mehta, Prashant G; Hsiao-Wecksler, Elizabeth T
2012-04-01
In this paper, a novel analysis technique, invariant density analysis (IDA), is introduced. IDA quantifies the steady-state behavior of the postural control system using center of pressure (COP) data collected during quiet standing. IDA relies on the analysis of a reduced-order finite Markov model to characterize the stochastic behavior observed during postural sway. Five IDA parameters characterize the model and offer physiological insight into the long-term dynamical behavior of the postural control system. Two studies were performed to demonstrate the efficacy of IDA. Study 1 showed that multiple short trials can be concatenated to create a dataset suitable for IDA. Study 2 demonstrated that IDA was effective at distinguishing age-related differences in postural control behavior between young, middle-aged, and older adults. These results suggest that the postural control system of young adults converges more quickly to steady-state behavior, while maintaining the COP nearer an overall centroid, than that of either middle-aged or older adults. Additionally, larger entropy values for older adults indicate that their COP follows a more stochastic path, while smaller entropy values for young adults indicate a more deterministic path. These results illustrate the potential of IDA as a quantitative tool for the assessment of the quiet-standing postural control system.
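To make the core computation concrete, here is a hedged sketch of an invariant-density calculation: bin a COP trajectory into finite states, estimate a Markov transition matrix, and take its stationary distribution and entropy. The COP signal, binning, and parameters are assumptions, not the paper's.

```python
# Hedged sketch of an invariant-density computation: bin a COP signal
# into finite states, estimate a Markov transition matrix, then take
# its stationary distribution and entropy. All data and parameters are
# simulated assumptions, not the paper's.
import numpy as np

rng = np.random.default_rng(3)
cop = np.cumsum(rng.normal(scale=0.1, size=5000))   # toy 1-D COP sway
cop -= cop.mean()

n_states = 20
edges = np.linspace(cop.min(), cop.max(), n_states + 1)
states = np.clip(np.digitize(cop, edges) - 1, 0, n_states - 1)

# Row-stochastic transition matrix from the state sequence
T = np.zeros((n_states, n_states))
for a, b in zip(states[:-1], states[1:]):
    T[a, b] += 1
T = T / np.maximum(T.sum(axis=1, keepdims=True), 1)

# Invariant density: left eigenvector of T for the largest eigenvalue
w, v = np.linalg.eig(T.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = np.abs(pi) / np.abs(pi).sum()

entropy = -np.sum(pi[pi > 0] * np.log(pi[pi > 0]))
print(f"invariant-density entropy: {entropy:.2f} nats")
```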
Laurinaviciene, Aida; Plancoulaine, Benoit; Baltrusaityte, Indra; Meskauskas, Raimundas; Besusparis, Justinas; Lesciute-Krilaviciene, Daiva; Raudeliunas, Darius; Iqbal, Yasir; Herlin, Paulette; Laurinavicius, Arvydas
2014-01-01
Digital immunohistochemistry (IHC) is one of the most promising applications brought by new-generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of the laboratory information system with image and statistical analysis tools. Consecutive sections of a TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. The Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), and IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed an association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. Finally, tissue-related factors of IHC staining variance were explored in the individual tissue cores. Our solution enabled monitoring of the staining of IHC multi-tissue controls by means of IA, followed by automated statistical analysis, integrated into the laboratory workflow. We found that, even in consecutive serial tissue sections, tissue-related factors affected the IHC IA results; meanwhile, less intense blue counterstain was associated with a smaller amount of tissue detected by the IA tools.
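The two derived intensity variables defined above amount to a simple transformation, sketched here with invented per-core values rather than the study's measurements.

```python
# Hedged sketch: the two derived intensity variables defined above,
# computed from per-core image-analysis output. The Brown/Blue values
# are invented placeholders, not the study's measurements.
import numpy as np

brown = np.array([142.0, 150.3, 138.9])  # assumed mean DAB intensity per core
blue = np.array([120.4, 118.7, 125.2])   # assumed mean counterstain intensity

mean_brown_blue = (brown + blue) / 2     # absolute staining intensity
diff_brown_blue = brown - blue           # colour balance (DAB vs counterstain)
print(mean_brown_blue, diff_brown_blue)
```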
Evaluation of Correlation of Blood Glucose and Salivary Glucose Level in Known Diabetic Patients.
Gupta, Anjali; Singh, Siddharth Kumar; Padmavathi, B N; Rajan, S Y; Mamatha, G P; Kumar, Sandeep; Roy, Sayak; Sareen, Mohit
2015-05-01
Diabetes mellitus is a chronic, heterogeneous disease in which there is dysregulation of carbohydrate, protein, and lipid metabolism, leading to elevated blood glucose levels. The present study was conducted to evaluate the correlation between blood glucose and salivary glucose levels in known diabetic patients and a control group, and also to evaluate salivary glucose level as a diagnostic tool in diabetic patients. A total of 250 patients were studied, of whom 212 formed the study group and 38 the control group. Across all 250 patients, the correlation between blood glucose and salivary glucose values was evaluated, yielding a Pearson correlation of 0.073. The p-value was 0.247, which is statistically non-significant. Salivary glucose values therefore cannot be considered a diagnostic tool for diabetic individuals.
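The statistical test reported above is a plain Pearson correlation; the sketch below runs it on simulated paired values (n = 250 as in the study, but the data themselves are invented).

```python
# Hedged sketch: the Pearson correlation test reported above, run on
# simulated paired blood/salivary glucose values (n=250 as in the study;
# the data themselves are invented).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
blood = rng.normal(140.0, 40.0, size=250)   # mg/dL, assumed
saliva = rng.normal(4.0, 1.5, size=250)     # mg/dL, assumed

r, p = pearsonr(blood, saliva)
print(f"Pearson r = {r:.3f}, p = {p:.3f}")  # a weak r with p > 0.05 would
                                            # mirror the study's conclusion
```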
1997-01-01
Microbiological safety is achieved by applying good hygienic practices throughout the food chain, "from farm to fork". Governmental food control is traditionally based on inspection of the facilities where foods are handled, and on testing food samples. Testing is usually applied to imported foods, when no information concerning the safety of a consignment is available. The microbiological safety is judged by means of microbiological criteria. Such criteria should, in the context of the WTO/SPS measures, be scientifically justified, and established according to the principles described by the Codex Alimentarius. However, microbiological testing is not a very reliable tool for consumer protection; the emphasis is currently shifting to the application of food safety management tools such as the Hazard Analysis Critical Control Point system (HACCP).
Fabrication de couches minces a memoire de forme et effets de l'irradiation ionique
NASA Astrophysics Data System (ADS)
Goldberg, Florent
1998-09-01
Nickel and titanium, when combined in the right stoichiometric proportion (1:1), can form alloys showing the shape memory effect. Within the scope of this thesis, thin films of such alloys have been successfully produced by sputtering. Precise control of composition is crucial to obtaining the shape memory effect, and a combination of analytical tools that can accurately characterize the behavior of such materials is also required (calorimetric analysis, crystallography, composition analysis, etc.). Rutherford backscattering spectrometry has been used for quantitative composition analysis. Irradiation of the films with light ions (He+) of a few MeV was then shown to lower the characteristic premartensitic transformation temperatures while preserving the shape memory effect. These results open the door to a new field of research, particularly for ion irradiation and its potential use as a tool to modify the thermomechanical behavior of shape-memory thin-film actuators.
Flight test trajectory control analysis
NASA Technical Reports Server (NTRS)
Walker, R.; Gupta, N.
1983-01-01
Recent extensions to optimal control theory applied to meaningful linear models with sufficiently flexible software tools provide powerful techniques for designing flight test trajectory controllers (FTTCs). This report describes the principal steps for systematic development of flight trajectory controllers, which can be summarized as planning, modeling, designing, and validating a trajectory controller. The techniques have been kept as general as possible and should apply to a wide range of problems where quantities must be computed and displayed to a pilot to improve pilot effectiveness and to reduce workload and fatigue. To illustrate the approach, a detailed trajectory guidance law is developed and demonstrated for the F-15 aircraft flying the zoom-and-pushover maneuver.
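As a minimal illustration of the linear-optimal-control machinery such trajectory-controller designs draw on, here is a hedged LQR sketch; the two-state plant model and weighting matrices are generic placeholders, not the F-15 model in the report.

```python
# Hedged sketch: an LQR-style linear-optimal-control computation of the
# kind underlying trajectory-controller design. The 2-state model and
# the Q/R weights are generic placeholders, not the F-15 model.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [0.0, -0.5]])   # assumed simple linear plant
B = np.array([[0.0], [1.0]])
Q = np.diag([10.0, 1.0])                   # state weights (assumed)
R = np.array([[1.0]])                      # control weight (assumed)

P = solve_continuous_are(A, B, Q, R)       # solve the Riccati equation
K = np.linalg.inv(R) @ B.T @ P             # optimal state-feedback gain
print("LQR gain K:", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```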
Effect-directed analysis supporting monitoring of aquatic environments--An in-depth overview.
Brack, Werner; Ait-Aissa, Selim; Burgess, Robert M; Busch, Wibke; Creusot, Nicolas; Di Paolo, Carolina; Escher, Beate I; Mark Hewitt, L; Hilscherova, Klara; Hollender, Juliane; Hollert, Henner; Jonker, Willem; Kool, Jeroen; Lamoree, Marja; Muschket, Matthias; Neumann, Steffen; Rostkowski, Pawel; Ruttkies, Christoph; Schollee, Jennifer; Schymanski, Emma L; Schulze, Tobias; Seiler, Thomas-Benjamin; Tindall, Andrew J; De Aragão Umbuzeiro, Gisela; Vrana, Branislav; Krauss, Martin
2016-02-15
Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone; tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and is attracting increasing interest in water and sediment quality monitoring. Thus, the present paper summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing since they strongly impact prioritization of toxicants in EDA. Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance the throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxicant identification. The latest approaches, tools, software and databases for target-, suspect- and non-target screening as well as unknown identification are discussed together with analytical and toxicological confirmation approaches. A better understanding of the optimal use and combination of EDA tools will help to design efficient and successful toxicant identification studies in the context of quality monitoring in multiply stressed environments. Copyright © 2015 Elsevier B.V. All rights reserved.
Automatic control system generation for robot design validation
NASA Technical Reports Server (NTRS)
Bacon, James A. (Inventor); English, James D. (Inventor)
2012-01-01
The specification and drawings present a new method, system, software product, and apparatus for generating a robotic validation system for a robot design. The robotic validation system is automatically generated by converting the robot design into a generic robotic description using a predetermined format, then generating a control system from the generic robotic description, and finally updating the robot design parameters of the robotic system with an analysis tool that uses both the generic robot description and the control system.
NASA Technical Reports Server (NTRS)
Hattis, Philip D.; Malchow, Harvey L.
1992-01-01
Horizontal takeoff airbreathing-propulsion launch vehicles require near-optimal guidance and control which takes into account performance sensitivities to atmospheric characteristics while satisfying physically-derived operational constraints. A generic trajectory/control analysis tool that deepens insight into these considerations has been applied to two versions of a winged-cone vehicle model. Information that is critical to the design and trajectory of these vehicles is derived, and several unusual characteristics of the airbreathing propulsion model are shown to have potentially substantial effects on vehicle dynamics.
Tool Time: Gender and Students' Use of Tools, Control, and Authority.
ERIC Educational Resources Information Center
Jones, M. Gail; Brader-Araje, Laura; Carboni, Lisa Wilson; Carter, Glenda; Rua, Melissa J.; Banilower, Eric; Hatch, Holly
2000-01-01
Observes 16 students from five elementary science classes to examine how students use tools when constructing new knowledge during science instruction, how control of tools is actualized from pedagogical perspectives, how language and tool accessibility intersect, how gender intersects with tool use, and how competition for resources impacts…
Fighting malaria in Madhya Pradesh (Central India): Are we loosing the battle?
Singh, Neeru; Dash, Aditya P; Thimasarn, Krongthong
2009-01-01
Malaria control in Madhya Pradesh is complex because of vast tracts of forest with tribal settlements. Fifty-four million individuals of various ethnic origins, accounting for 8% of the total population of India, contributed 30% of total malaria cases, 60% of total falciparum cases, and 50% of malaria deaths in the country. Pursuing ambitious goals for controlling tribal malaria, the National Vector Borne Disease Control Programme (NVBDCP) launched the "Enhanced Malaria Control Project" (EMCP), with World Bank assistance, which became effective in September 1997 in eight north Indian states. Under the EMCP, the programme used a broader mix of new interventions, i.e. insecticide-treated bed nets, spraying houses with effective residual insecticides, use of larvivorous fishes, rapid diagnostic tests for prompt diagnosis, treatment of the sick with effective radical treatment, and increased public awareness and IEC. However, the challenge is to scale up these services. A retrospective analysis of data on malaria morbidity and associated mortality reported under the existing surveillance system of Madhya Pradesh (Central India) for the years 1996–2007 was carried out to determine the impact of the EMCP. The analysis revealed that, despite the availability of effective intervention tools for the prevention and control of malaria, falciparum malaria remains uncontrolled and deaths due to malaria have increased. The aim of this epidemiological analysis is to draw lessons applicable to international aid efforts, bureaucracies, policy makers, and programme managers in assessing project performance as a new Global Malaria Action Plan is launched with the ambitious goal of reducing malaria and ultimately eliminating it by scaling up the use of existing tools. PMID:19419588
Bonaccorsi, Gloria; Fila, Enrica; Messina, Carmelo; Maietti, Elisa; Ulivieri, Fabio Massimo; Caudarella, Renata; Greco, Pantaleo; Guglielmi, Giuseppe
2017-10-01
To evaluate (a) the performance of trabecular bone score (TBS) and hip structural analysis (HSA) in predicting the presence of bone fractures in type 2 diabetic postmenopausal women compared to a control group and (b) the fracture prediction ability of TBS versus the Fracture Risk Calculator (FRAX®), as well as whether TBS can improve the fracture prediction ability of FRAX® in diabetic women. Eighty diabetic postmenopausal women were matched with 88 controls without major diseases for age and body mass index. The individual 10-year fracture risk was assessed by the FRAX® tool for Europe-Italy; bone mineral density (BMD) at the lumbar spine, femoral neck, and total hip was evaluated through dual-energy X-ray absorptiometry; TBS measurements were taken using the same region of interest as the BMD measurements; HSA was performed at the proximal femur with the HSA software. Among the variables of interest, the only significant difference between the diabetic and control groups was observed for TBS (median value: 1.215, IQR 1.138-1.285 in controls vs. 1.173, IQR 1.082-1.217 in diabetics; p = 0.002). The prevalence of fractures in diabetic women was almost triple that in controls (13.8 vs. 3.4%; p = 0.02). Receiver operating characteristic (ROC) curve analysis showed that the discriminative power of TBS alone for fracture prediction in diabetic women (AUC = 0.71) was not significantly lower than that of TBS-adjusted FRAX® for major fractures (AUC = 0.74; p = 0.65). In diabetic postmenopausal women, TBS is an excellent tool for identifying fragility fractures.
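A hedged sketch of the ROC AUC comparison underlying the analysis above; the scores and fracture outcomes are simulated, not patient data, so the AUCs will differ from the study's.

```python
# Hedged sketch: comparing two fracture predictors by ROC AUC, as in the
# analysis above. Scores and outcomes are simulated, not patient data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
fractured = rng.integers(0, 2, size=168)          # 80 diabetic + 88 controls
tbs = rng.normal(1.18, 0.08, size=168) - 0.05 * fractured
frax_tbs = rng.normal(8.0, 3.0, size=168) + 2.0 * fractured

print("AUC TBS alone:    ", roc_auc_score(fractured, -tbs))  # low TBS = risk
print("AUC FRAX adj. TBS:", roc_auc_score(fractured, frax_tbs))
```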
Dynamic Systems Analysis for Turbine Based Aero Propulsion Systems
NASA Technical Reports Server (NTRS)
Csank, Jeffrey T.
2016-01-01
The aircraft engine design process seeks to optimize the overall system-level performance, weight, and cost for a given concept. Steady-state simulations and data are used to identify trade-offs that should be balanced to optimize the system in a process known as systems analysis. These systems analysis simulations and data may not adequately capture the true performance trade-offs that exist during transient operation. Dynamic systems analysis provides the capability for assessing these dynamic trade-offs at an earlier stage of the engine design process. The dynamic systems analysis concept, the developed tools, and the potential benefit are presented in this paper. To provide this capability, the Tool for Turbine Engine Closed-loop Transient Analysis (TTECTrA) was developed to provide the user with an estimate of the closed-loop performance (response time) and operability (high-pressure compressor surge margin) for a given engine design and set of control design requirements. TTECTrA, along with engine deterioration information, can be used to develop a more generic relationship between performance and operability that can impact the engine design constraints and potentially lead to a more efficient engine.
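As a hedged illustration of the response-time metric mentioned above, the sketch below extracts a 95% rise time from a closed-loop step response; the first-order model is a generic stand-in, not an engine simulation or TTECTrA itself.

```python
# Hedged sketch: estimating a response-time metric from a closed-loop
# step response, the kind of transient figure TTECTrA reports. The
# first-order lag model here is a stand-in, not an engine model.
import numpy as np
from scipy.signal import TransferFunction, step

sys = TransferFunction([1.0], [1.5, 1.0])   # assumed closed-loop lag
t, y = step(sys, T=np.linspace(0, 10, 1000))

target = 0.95 * y[-1]                       # 95% rise-time criterion
t95 = t[np.argmax(y >= target)]
print(f"time to reach 95% of the final response: {t95:.2f} s")
```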